1. Setup Models and Credentials

Go to the /models screen and add two models:


2. Setup Datasource

If running locally via Docker, we recommend running `docker compose logs -f` in your terminal during this process to follow along and catch any errors as they occur. For advanced debugging you can also open apps like Qdrant or Airbyte to watch progress as data passes through each system. Click Advanced Debugging for instructions on how to access these UIs.
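A few variations on the log-following command can make debugging easier. This is a sketch; the service name used below is an assumption and may differ in your `docker-compose.yml` (run `docker compose ps` to see the actual names):

```shell
# Follow logs from all services in the stack
docker compose logs -f

# Follow logs for a single service only
# ("webapp" is a hypothetical service name -- check `docker compose ps`)
docker compose logs -f webapp

# Show only the most recent 100 lines, with timestamps
docker compose logs -f --tail 100 --timestamps
```

Filtering to a single service is usually the fastest way to isolate an error once you know which part of the stack is failing.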

Go to the /datasources screen, select New Connection, and add a BigQuery data source:


3. Setup an Agent

Go to the /agents screen and create a new Agent.


4. Create a Chat App

Go to the /apps screen and create a new Chat App.

5. Have a chat!


If you want to make sure the agent always uses the tool, you can update the agent prompt to say, "ALWAYS use the … tool." Otherwise, if you want an agent with a bit more autonomy to decide between multiple tools, you can keep the config light and let it infer which tool is required. For example, in the video we guided the LLM to the correct tool by starting the prompt with "According to Elon…".