nao Tools
Codebase
- Search codebase
- Get content of a file
- Grep tool (search code and data warehouse)
- Search/replace tool
- Read full content of notebooks
- Get dependencies of a dbt model
- Find source yaml of a dbt model
- Find documentation yaml of a dbt model
- Execute dbt commands (run, build, test)
- Create dbt models, sources, documentation, and tests
- Search data warehouse
- Get schema of a table
- Data profiling (count rows, null values, value distribution)
- Get view query definition
- Execute a query on the warehouse
- Generate charts from query results in the chat
- Analyze query results sent from worksheets
- Run terminal commands
- Fix terminal errors in chat
- Execute Git operations
- Run build scripts and tests
MCPs
nao supports MCP (Model Context Protocol), allowing the agent to integrate with external tools and services through a standardized protocol. MCPs enable the agent to access additional capabilities beyond the built-in tools.
What are MCPs?
MCPs provide a standardized way for AI agents to interact with external systems, APIs, and tools. This allows nao to extend its capabilities by connecting to your custom tools and services. Example integrations include:
- Notion: Access and manage Notion pages and databases
- Git: Interact with Git repositories and perform version control operations
- Airbyte: Connect to Airbyte for data integration workflows
- Omni: Integration with Omni tools and services
- Tableau: Access Tableau workbooks and data sources
- Elementary: Data quality and monitoring tools
- Select Star: Data catalog and discovery platform
To add a custom MCP:
- Go to Settings > Agent > MCPs
- Click “Add Custom MCP”
- Configure your MCP connection details directly in the mcp.json file (see the sketch after these steps)
- The agent will have access to your custom MCP tools
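As a rough sketch of what an mcp.json entry can look like, assuming nao follows the mcpServers convention common to MCP clients (the exact schema may differ in your version), with a placeholder server name, package, and API key:

```json
{
  "mcpServers": {
    "my-custom-mcp": {
      "command": "npx",
      "args": ["-y", "my-custom-mcp-server"],
      "env": {
        "MY_SERVICE_API_KEY": "<your-api-key>"
      }
    }
  }
}
```

Here my-custom-mcp-server is a stand-in for the actual MCP server you want to connect to, and the environment variable is only an example of how credentials are typically passed.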
Permissions
You can configure how the agent handles permissions for different actions in Settings > Agent > Chat:
- Query Execution: Choose to always allow, always ask, or use auto-run settings
- Command Execution: Control whether the agent asks before running terminal commands
- Data Sharing: Set default behavior for sharing query results with the LLM (always share, always ask, or use lock toggle)