Supported Databases
The nao CLI supports these database types in nao_config.yaml:
- Athena
- BigQuery
- ClickHouse
- Databricks
- DuckDB
- Fabric
- MSSQL
- Postgres
- Redshift
- Snowflake
- Trino
Common Parameters
Every database entry supports these shared fields:
- name: Friendly connection name
- type: One of athena, bigquery, clickhouse, databricks, duckdb, fabric, mssql, postgres, redshift, snowflake, trino
- include: Optional glob patterns for schema.table values to include
- exclude: Optional glob patterns for schema.table values to exclude
- accessors: Optional list of rendered context files
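As an illustration, these shared fields might combine into an entry like the following. The top-level databases: list and the exact nesting are assumptions for the sketch; check your nao_config.yaml reference for the actual layout:

```yaml
# Hypothetical nao_config.yaml fragment.
# The top-level `databases` key and nesting are assumptions;
# the field names match those described above.
databases:
  - name: warehouse        # friendly connection name
    type: postgres         # one of the supported types
    accessors:             # optional; see the Accessors section
      - columns
      - description
      - preview
      - profiling
```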
Accessors
These are the built-in accessors:
- columns
- description
- preview
- profiling
- indexes (optional; currently used for ClickHouse table/index metadata)
- ai_summary (optional; AI-generated table summary)
If accessors is omitted, nao renders all four standard accessors (columns, description, preview, and profiling) by default.
Use the optional profiling config block to control profiling refresh behavior:
- refresh_policy: Profiling refresh strategy
- interval_days: Optional refresh interval in days
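A profiling block might look like this. The field names come from the list above, but the set of valid refresh_policy values is an assumption here:

```yaml
# Hypothetical fragment; the `interval` value for refresh_policy
# is an assumption, not a documented enum.
profiling:
  refresh_policy: interval   # profiling refresh strategy
  interval_days: 7           # re-profile at most once a week
```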
ai_summary is opt-in. To use it, add ai_summary to accessors and configure llm.annotation_model in nao_config.yaml.
When enabled, nao can render databases/ai_summary.md.j2 and call prompt("...") inside that template to generate LLM-based summaries during nao sync.
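Opting in might look like the sketch below. The accessors and llm.annotation_model keys come from the text above; the model string is only a placeholder:

```yaml
# Hypothetical fragment: enabling ai_summary for one connection.
# The model name is a placeholder, not a recommendation.
llm:
  annotation_model: your-model-name
databases:
  - name: warehouse
    type: snowflake
    accessors:
      - columns
      - description
      - preview
      - profiling
      - ai_summary   # opt-in AI-generated table summaries
```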
Database Parameters
Athena
- profile_name
- aws_access_key_id
- aws_secret_access_key
- aws_session_token
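An Athena entry might use either a named AWS profile or explicit keys. This is a sketch using only the parameters listed above; the nesting is an assumption:

```yaml
# Hypothetical Athena entry; use a named profile OR explicit keys.
databases:
  - name: athena_prod
    type: athena
    profile_name: default
    # Alternative: explicit credentials instead of a profile.
    # aws_access_key_id: ...
    # aws_secret_access_key: ...
    # aws_session_token: ...
```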
BigQuery
- credentials_path
- credentials_json
- sso: true for ADC / browser auth
- partition_filters: map of table_name: SQL filter, used for preview queries on tables that enforce require_partition_filter = TRUE
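A BigQuery entry with a partition filter might look like this. The parameters are from the list above; the table key format and filter SQL are illustrative assumptions:

```yaml
# Hypothetical BigQuery entry; the map key and filter expression
# are examples, not documented values.
databases:
  - name: bq_analytics
    type: bigquery
    credentials_path: /path/to/service-account.json
    # sso: true   # alternative: ADC / browser auth
    partition_filters:
      analytics.events: "event_date >= CURRENT_DATE() - 7"
```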
ClickHouse
Databricks
DuckDB
Fabric
- sql_password (SQL username/password)
- azure_cli (az login token)
- azure_interactive (browser login)
- azure_service_principal (client ID and secret)
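Selecting one of these methods might look like the fragment below. Note that the key name used to pick the method is an assumption; only the method names themselves come from the list above:

```yaml
# Hypothetical Fabric entry; the `auth` key name is an assumption.
databases:
  - name: fabric_dw
    type: fabric
    auth: azure_cli   # reuse the local `az login` token
```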
MSSQL
Postgres
Redshift
Snowflake
- private_key_path
- passphrase
- authenticator
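A Snowflake entry using key-pair authentication might look like this. The parameters come from the list above; the authenticator value and paths are assumptions:

```yaml
# Hypothetical Snowflake entry with key-pair auth;
# the authenticator value is an assumption.
databases:
  - name: snowflake_prod
    type: snowflake
    authenticator: snowflake_jwt
    private_key_path: /path/to/rsa_key.p8
    passphrase: my-key-passphrase   # placeholder value
```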
Trino
Synchronization
Once configured, sync your database schemas with nao sync. This will:
- Connect to each database
- Extract schema information
- Render the configured accessors
- Save the output in databases/
Context Files
After syncing, you’ll see a structure like:
- columns.md
- description.md
- preview.md
- profiling.md
Table Selection
Control which tables are synced with include and exclude.
Use glob patterns on schema.table:
- analytics.orders
- analytics.*
- *.orders
- *_staging
- test_**
nao applies include first, then removes any tables that match exclude.
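For example, keeping only the analytics schema while dropping staging and test tables might look like this (the databases: nesting is an assumption, as above):

```yaml
# Hypothetical fragment: sync only core analytics tables.
databases:
  - name: warehouse
    type: postgres
    include:
      - "analytics.*"    # keep everything in the analytics schema
    exclude:
      - "*_staging"      # drop staging tables
      - "test_*"         # drop test tables
```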
Best Practices
- Start with your core schemas only
- Keep accessors small if token usage matters
- Use include and exclude to avoid temp, backup, and test tables
Context Engineering Principles
Learn how to find the optimal balance between comprehensiveness and efficiency