Answering Questions with a Vector Database
Answer natural language questions with facts pulled from a vector database
The VectorSearchQATool answers questions using information stored in a Vector Database.
Vector Databases
A Vector Database (aka Embedding Index) is just a way to store and search through natural language content.
Steamship contains a built-in vector database that's configured to "just work" with Agents and Tools. By default, it auto-applies OpenAI's embeddings to text and queries.
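For intuition, here is a generic, hand-rolled sketch of the idea (an illustration only, not Steamship's implementation): every piece of text is mapped to a vector, and a search ranks stored text by its vector similarity to the query. The embed function and its made-up vectors below are purely hypothetical stand-ins for a real embedding model such as OpenAI's.
import math
from typing import Dict, List, Tuple

# Hypothetical, hand-written "embeddings"; a real model produces high-dimensional
# vectors where similar meanings land close together.
FAKE_EMBEDDINGS: Dict[str, List[float]] = {
    "The Eiffel Tower is in Paris.": [0.9, 0.1, 0.0],
    "Mount Fuji is in Japan.": [0.1, 0.9, 0.1],
    "Where is the Eiffel Tower?": [0.8, 0.2, 0.1],
}

def embed(text: str) -> List[float]:
    return FAKE_EMBEDDINGS[text]

def cosine(a: List[float], b: List[float]) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b)))

# "Store": embed each document and keep (document, vector) pairs.
index: List[Tuple[str, List[float]]] = [
    (doc, embed(doc)) for doc in ["The Eiffel Tower is in Paris.", "Mount Fuji is in Japan."]
]

# "Search": embed the query and rank documents by similarity.
query = "Where is the Eiffel Tower?"
ranked = sorted(index, key=lambda item: cosine(embed(query), item[1]), reverse=True)
print(ranked[0][0])  # -> "The Eiffel Tower is in Paris."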
This tool works out of the box with Steamship's on-demand Vector Database:
from steamship.agents.tools.question_answering.vector_search_qa_tool import VectorSearchQATool
tool = VectorSearchQATool()
But you can also provide a few useful optional arguments:
ai_description
- Tells your agent how and when to use the tool.
load_docs_count
- Tells your agent how many source documents to load from the Vector DB when answering the question.
For example, if you want your agent to only use this tool to answer questions about France, you might say:
from steamship.agents.tools.question_answering.vector_search_qa_tool import VectorSearchQATool

tool = VectorSearchQATool(
    ai_description=(
        "Used to answer questions about France. "
        "The input is a question about France. "
        "The output is the answer to the question."
    )
)
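Similarly, if answers seem to need more supporting context, you can raise load_docs_count. A minimal sketch, assuming load_docs_count is passed as a constructor keyword (the value 5 here is arbitrary):
from steamship.agents.tools.question_answering.vector_search_qa_tool import VectorSearchQATool

tool = VectorSearchQATool(
    load_docs_count=5,  # load five source documents from the Vector DB per question
)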
For more configuration options, see the source code.
Building an Agent with VectorSearchQATool
This tool can be provided to a FunctionsBasedAgent, which will decide when to use it based on its ai_description.
For example, if the tool's ai_description says the tool is "used to answer questions about historical events", then the agent will tend to use the tool when the user asks a question that matches that use.
In your AgentService, here's how you can provide this tool to a FunctionsBasedAgent:
from steamship.agents.functional import FunctionsBasedAgent
from steamship.agents.llms.openai import OpenAI
from steamship.agents.tools.question_answering.vector_search_qa_tool import VectorSearchQATool

# Create an instance of the tool
vector_qa_tool = VectorSearchQATool()

# Create the AgentService's agent, providing it the tool
self._agent = FunctionsBasedAgent(
    tools=[
        vector_qa_tool,
    ],
    llm=OpenAI(self.client),
)
Deploying an AgentService with VectorSearchQATool
To deploy an agent with this tool, just make sure it's properly wrapped in an AgentService and then follow the normal deployment instructions.
You will get a URL that you can visit to create instances of your agent in the cloud and connect to it from the web & chat apps.
Customizing the Behavior of VectorSearchQATool
Steamship tools are just open-source Python. The source for this tool is available on GitHub. You can customize it by copy-pasting the tool into your own project, changing the code, and then using your new version.
This particular tool works as follows:
- Uses Steamship's managed Vector Index to find source documents relevant to the user question
- Places those source documents into a prompt that gets combined with the user question
- Asks the LLM to answer the question using the provided source documents
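In rough pseudocode, the flow looks like this (an illustrative sketch, not the tool's actual source; search_index and call_llm are hypothetical stand-ins for the Vector Index lookup and the LLM call):
from typing import List

def search_index(question: str, load_docs_count: int) -> List[str]:
    # Hypothetical stand-in for the embedding search against the managed Vector Index.
    return ["Paris is the capital of France.", "France is a country in Western Europe."]

def call_llm(prompt: str) -> str:
    # Hypothetical stand-in for the LLM call that produces the final answer.
    return "Paris"

def answer(question: str) -> str:
    # 1. Find source documents relevant to the user question.
    documents = search_index(question, load_docs_count=2)
    # 2. Place those documents into a prompt combined with the question.
    sources = "\n".join(f"- {doc}" for doc in documents)
    prompt = (
        "Answer the question using only the source documents below.\n\n"
        f"Source documents:\n{sources}\n\n"
        f"Question: {question}\nAnswer:"
    )
    # 3. Ask the LLM to answer using the provided source documents.
    return call_llm(prompt)

print(answer("What is the capital of France?"))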
If you create an interesting customization of this tool, please consider sending us a pull request on GitHub!