Analytics API
The Analytics API enables you to create and execute queries against external databases and the tenant's own schema (internal). It provides a secure, read-only query execution environment with connector validation.
Overview
The Analytics module allows you to:
- Connect to External Databases: PostgreSQL, MySQL, Redshift, Elasticsearch, ClickHouse
- Query Internal Tenant Data: Use the tenant's own PostgreSQL schema without storing credentials
- Create Reusable Queries: Define parameterized query templates
- Execute Queries Securely: Read-only execution with dangerous operation blocking
- Use Dynamic Parameters: Jinja2 templating for runtime parameter injection
- Access Secrets Securely: Reference stored credentials without exposing them
- Use Consistent API Envelopes: Success responses follow the platform `status`/`message`/`data` structure
Architecture
Flow
- Query Creation: Define a query with `connection_type` (`external` or `internal`), `query_text`, and (for external) `secret_key`
- Driver Resolution: External uses the secret type slug (e.g., `analytics-postgres`); internal uses the built-in `internal` driver
- Template Rendering: Jinja2 renders the query with parameters and secrets (secrets only for external)
- Secure Execution: Connectors execute in read-only mode; internal forces the tenant `search_path` and blocks cross-schema access
- Result Return: Data is returned with execution metadata
Supported Drivers
| Driver | Secret Type Slug | Use Case | Connection Type |
|---|---|---|---|
| PostgreSQL | analytics-postgres | Relational data, complex queries | external |
| MySQL | analytics-mysql | Web application databases | external |
| Redshift | analytics-redshift | Data warehouse analytics | external |
| Elasticsearch | analytics-elasticsearch | Log analysis, full-text search | external |
| ClickHouse | analytics-clickhouse | Real-time analytics, time-series | external |
| Internal (tenant DB) | none | Internal database connection | internal |
Prerequisites
External analytics queries require secrets to be configured first. Internal analytics queries do not require secrets.
Step 1: Verify Secret Types Exist
Your tenant must have the analytics secret types seeded. These are created automatically when a tenant is provisioned. Verify they exist:
- REST API
- Python
- JavaScript
curl -X GET https://your-site.taruvi.cloud/api/apps/{app_slug}/secrets/types/ \
-H "Authorization: Bearer YOUR_TOKEN"
import requests
response = requests.get(
"https://your-site.taruvi.cloud/api/apps/{app_slug}/secrets/types/",
headers={"Authorization": "Bearer YOUR_TOKEN"}
)
secret_types = response.json()
const response = await fetch(
"https://your-site.taruvi.cloud/api/apps/{app_slug}/secrets/types/",
{
headers: { "Authorization": "Bearer YOUR_TOKEN" }
}
);
const secretTypes = await response.json();
You should see types like:
- `analytics-postgres`
- `analytics-mysql`
- `analytics-redshift`
- `analytics-elasticsearch`
- `analytics-clickhouse`
If missing, contact your platform administrator to run the tenant seeder.
Step 2: Create a Connection Secret (external only)
Create a secret with your database credentials. The secret type determines which database driver will be used.
- REST API
- Python
- JavaScript
curl -X POST https://your-site.taruvi.cloud/api/apps/{app_slug}/secrets/ \
-H "Authorization: Bearer YOUR_TOKEN" \
-H "Content-Type: application/json" \
-d '{
"key": "my-postgres-readonly",
"type_slug": "analytics-postgres",
"value": {
"host": "db.example.com",
"port": 5432,
"database": "analytics_db",
"username": "readonly_user",
"password": "secure_password"
}
}'
import requests
response = requests.post(
"https://your-site.taruvi.cloud/api/apps/{app_slug}/secrets/",
headers={
"Authorization": "Bearer YOUR_TOKEN",
"Content-Type": "application/json"
},
json={
"key": "my-postgres-readonly",
"type_slug": "analytics-postgres",
"value": {
"host": "db.example.com",
"port": 5432,
"database": "analytics_db",
"username": "readonly_user",
"password": "secure_password"
}
}
)
result = response.json()
const response = await fetch(
"https://your-site.taruvi.cloud/api/apps/{app_slug}/secrets/",
{
method: "POST",
headers: {
"Authorization": "Bearer YOUR_TOKEN",
"Content-Type": "application/json"
},
body: JSON.stringify({
key: "my-postgres-readonly",
type_slug: "analytics-postgres",
value: {
host: "db.example.com",
port: 5432,
database: "analytics_db",
username: "readonly_user",
password: "secure_password"
}
})
}
);
const result = await response.json();
Step 3: Create Analytics Query
Now you can create a query that references your secret (external):
- REST API
- Python
- JavaScript
curl -X POST https://your-site.taruvi.cloud/api/apps/{app_slug}/analytics/queries/ \
-H "Authorization: Bearer YOUR_TOKEN" \
-H "Content-Type: application/json" \
-d '{
"name": "Sales Report",
"connection_type": "external",
"secret_key": "my-postgres-readonly",
"query_text": "SELECT * FROM sales WHERE date = '\''{{ date }}'\''"
}'
import requests
response = requests.post(
"https://your-site.taruvi.cloud/api/apps/{app_slug}/analytics/queries/",
headers={
"Authorization": "Bearer YOUR_TOKEN",
"Content-Type": "application/json"
},
json={
"name": "Sales Report",
"connection_type": "external",
"secret_key": "my-postgres-readonly",
"query_text": "SELECT * FROM sales WHERE date = '{{ date }}'"
}
)
result = response.json()
const response = await fetch(
"https://your-site.taruvi.cloud/api/apps/{app_slug}/analytics/queries/",
{
method: "POST",
headers: {
"Authorization": "Bearer YOUR_TOKEN",
"Content-Type": "application/json"
},
body: JSON.stringify({
name: "Sales Report",
connection_type: "external",
secret_key: "my-postgres-readonly",
query_text: "SELECT * FROM sales WHERE date = '{{ date }}'"
})
}
);
const result = await response.json();
For internal (tenant DB) queries, omit `secret_key` and set `connection_type` to `internal`:
- REST API
- Python
- JavaScript
curl -X POST https://your-site.taruvi.cloud/api/apps/{app_slug}/analytics/queries/ \
-H "Authorization: Bearer YOUR_TOKEN" \
-H "Content-Type: application/json" \
-d '{
"name": "Internal Orders",
"connection_type": "internal",
"query_text": "SELECT * FROM orders WHERE status = '\''{{ status }}'\''"
}'
import requests
response = requests.post(
"https://your-site.taruvi.cloud/api/apps/{app_slug}/analytics/queries/",
headers={
"Authorization": "Bearer YOUR_TOKEN",
"Content-Type": "application/json"
},
json={
"name": "Internal Orders",
"connection_type": "internal",
"query_text": "SELECT * FROM orders WHERE status = '{{ status }}'"
}
)
result = response.json()
const response = await fetch(
"https://your-site.taruvi.cloud/api/apps/{app_slug}/analytics/queries/",
{
method: "POST",
headers: {
"Authorization": "Bearer YOUR_TOKEN",
"Content-Type": "application/json"
},
body: JSON.stringify({
name: "Internal Orders",
connection_type: "internal",
query_text: "SELECT * FROM orders WHERE status = '{{ status }}'"
})
}
);
const result = await response.json();
For internal queries, you can reference tables by their logical name (e.g., orders, users) instead of the full physical name (e.g., myapp_orders). The analytics engine automatically resolves logical names to physical names using the app context from the URL. Queries using the full physical name also continue to work.
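The logical-to-physical resolution happens server-side, but its effect can be sketched conceptually. The snippet below is an illustrative assumption, not the platform's actual resolver: it supposes physical names are formed by prefixing the app slug, and rewrites logical names after `FROM`/`JOIN` while letting physical names pass through unchanged.

```python
import re

def resolve_table_names(sql: str, app_slug: str, logical_tables: set) -> str:
    """Conceptual sketch of logical-to-physical table name resolution.

    Assumes physical names are '<app_slug>_<logical>'; the real
    server-side resolver may use a different scheme.
    """
    prefix = app_slug.replace("-", "_")

    def repl(match):
        keyword, name = match.group(1), match.group(2)
        if name in logical_tables:
            return f"{keyword} {prefix}_{name}"
        return match.group(0)  # physical names continue to work as-is

    return re.sub(r"\b(FROM|JOIN)\s+(\w+)", repl, sql, flags=re.IGNORECASE)

# Logical name is rewritten; a full physical name is left untouched.
resolve_table_names("SELECT * FROM orders", "myapp", {"orders"})
resolve_table_names("SELECT * FROM myapp_orders", "myapp", {"orders"})
```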
Common Setup Errors
| Error | Cause | Solution |
|---|---|---|
| Secret not found | Secret key doesn't exist (external) | Create the secret first (Step 2) |
| Secret type not found | Missing `analytics-*` type (external) | Run tenant seeder or contact admin |
| Unsupported analytics datasource | Wrong secret type slug (external) | Use `analytics-postgres`, not `postgres` |
| Driver not supported | Invalid secret type (external) | Check supported drivers table above |
| Tenant schema required | Internal query without tenant context | Call the API with a valid tenant/site |
Secret Configuration
Before creating queries, you must create a secret with the appropriate type. The secret type slug determines which database driver is used.
PostgreSQL / MySQL / Redshift
{
"host": "db.example.com",
"port": 5432,
"database": "analytics_db",
"username": "readonly_user",
"password": "secure_password"
}
| Field | Type | Required | Description |
|---|---|---|---|
| `host` | string | Yes | Database host |
| `port` | integer | Yes | Database port (5432 for PostgreSQL, 3306 for MySQL, 5439 for Redshift) |
| `database` | string | Yes | Database name |
| `username` | string | Yes | Database username |
| `password` | string | Yes | Database password |
Usernames and passwords may contain URI-special characters such as @, :, /, #, ?, and % without any manual encoding. The connection builder handles URL encoding internally, so store credentials exactly as issued by your database.
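To illustrate what the connection builder does internally, the sketch below percent-encodes credentials before assembling a PostgreSQL-style DSN. The function name and DSN format are illustrative assumptions; you do not need to do this yourself, since the platform handles it.

```python
from urllib.parse import quote

def build_postgres_dsn(host: str, port: int, database: str,
                       username: str, password: str) -> str:
    """Illustrative sketch: percent-encode credential characters
    such as @, :, /, #, ?, and % so they cannot break the URI."""
    user = quote(username, safe="")
    pwd = quote(password, safe="")
    return f"postgresql://{user}:{pwd}@{host}:{port}/{database}"

# A password full of URI-special characters is stored as issued;
# encoding happens only when the DSN is assembled.
dsn = build_postgres_dsn("db.example.com", 5432, "analytics_db",
                         "readonly_user", "p@ss:w/rd#1%")
```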
Elasticsearch
{
"host": "https://elasticsearch.example.com",
"port": 9200,
"index": "logs-*",
"username": "elastic",
"password": "secure_password",
"max_result_rows": 10000
}
| Field | Type | Required | Description |
|---|---|---|---|
| `host` | string | Yes | Elasticsearch host URL |
| `port` | integer | No | Port (default: 9200) |
| `index` | string | Yes | Default index pattern |
| `username` | string | No | Basic auth username |
| `password` | string | No | Basic auth password |
| `api_key` | string | No | API key (alternative to username/password) |
| `headers` | object | No | Custom HTTP headers |
| `max_result_rows` | integer | No | Maximum rows to return (default: 10000) |
ClickHouse
{
"host": "clickhouse.example.com",
"port": 8443,
"database": "analytics",
"username": "readonly",
"password": "secure_password",
"secure": true
}
| Field | Type | Required | Description |
|---|---|---|---|
| `host` | string | Yes | ClickHouse host |
| `port` | integer | Yes | Port (default: 8443 for HTTPS) |
| `database` | string | Yes | Database name |
| `username` | string | Yes | Username |
| `password` | string | Yes | Password |
| `secure` | boolean | No | Use HTTPS (default: true) |
Base URL
All Analytics endpoints are scoped to an app:
/api/apps/{app_slug}/analytics/queries/
Response Format
Analytics success responses use the same platform envelope as the rest of the API layer.
Success with data
{
"status": "success",
"message": "Human-readable message",
"data": {}
}
Success with list data
{
"status": "success",
"message": "Human-readable message",
"data": [],
"total": 0
}
Execute response
Query execution returns result rows in data, with execution metadata promoted to top-level fields:
{
"status": "success",
"message": "Query executed successfully",
"data": [],
"total": 0,
"execution_key": "uuid-or-execution-key"
}
Elasticsearch aggregation responses
When an Elasticsearch query contains aggregations, bucket results are flattened into the data array with the aggregation name included on each row:
{
"status": "success",
"message": "Query executed successfully",
"data": [
{"aggregation": "by_status", "key": "Open", "doc_count": 15},
{"aggregation": "by_status", "key": "Closed", "doc_count": 5}
],
"total": 2,
"execution_key": "uuid"
}
Single-value metric aggregations (e.g. avg, sum, cardinality) are surfaced as {"aggregation": "<name>", "value": <n>}. Multiple named aggregations in the same query produce concatenated rows.
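The flattening described above can be sketched client-side for reference. This is an illustrative reimplementation of the response shape, not the platform's actual code; it handles `terms`-style bucket aggregations and single-value metrics.

```python
def flatten_aggregations(aggs: dict) -> list:
    """Flatten Elasticsearch aggregation results into rows matching
    the documented response shape (illustrative sketch)."""
    rows = []
    for name, body in aggs.items():
        if "buckets" in body:  # bucket aggregation (e.g., terms)
            for bucket in body["buckets"]:
                rows.append({"aggregation": name,
                             "key": bucket["key"],
                             "doc_count": bucket["doc_count"]})
        elif "value" in body:  # single-value metric (avg, sum, cardinality)
            rows.append({"aggregation": name, "value": body["value"]})
    return rows

# Multiple named aggregations produce concatenated rows.
rows = flatten_aggregations({
    "by_status": {"buckets": [{"key": "Open", "doc_count": 15},
                              {"key": "Closed", "doc_count": 5}]},
    "avg_age": {"value": 7.5},
})
```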
Elasticsearch total count
For standard search-hit responses, total reflects the real document count from hits.total.value (ES 7+). Integer hits.total (ES 6) is also supported. total is not capped at the number of rows in the current response page.
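Handling both `hits.total` shapes is a small amount of logic; the sketch below shows how a client could read the count from either an ES 7+ object or an ES 6 bare integer. The helper name is illustrative.

```python
def extract_total(hits: dict) -> int:
    """Return the real document count from an Elasticsearch 'hits' object.

    ES 7+ reports {"total": {"value": 1234, "relation": "eq"}};
    ES 6 reports a bare integer {"total": 1234}.
    """
    total = hits.get("total", 0)
    if isinstance(total, dict):
        return int(total.get("value", 0))
    return int(total)

extract_total({"total": {"value": 1234, "relation": "eq"}})  # ES 7+
extract_total({"total": 1234})                               # ES 6
```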
Endpoints
List Queries
- REST API
- Python
- JavaScript
curl -X GET https://your-site.taruvi.cloud/api/apps/{app_slug}/analytics/queries/ \
-H "Authorization: Bearer YOUR_TOKEN"
Response (200 OK):
{
"status": "success",
"message": "Data retrieved successfully",
"data": [
{
"id": "uuid",
"name": "Daily Sales Report",
"slug": "daily-sales-report",
"description": "Aggregates daily sales by region",
"app": "sales-app",
"connection_type": "external",
"secret_key": "sales-db-readonly",
"query_text": "SELECT region, SUM(amount) FROM sales WHERE date = '{{ date }}' GROUP BY region",
"tags": ["finance", "daily-reporting"],
"created_at": "2024-01-15T10:30:00Z",
"updated_at": "2024-01-15T10:30:00Z"
}
],
"total": 1
}
import requests
response = requests.get(
"https://your-site.taruvi.cloud/api/apps/{app_slug}/analytics/queries/",
headers={"Authorization": "Bearer YOUR_TOKEN"}
)
queries = response.json()
const response = await fetch(
"https://your-site.taruvi.cloud/api/apps/{app_slug}/analytics/queries/",
{
headers: { "Authorization": "Bearer YOUR_TOKEN" }
}
);
const queries = await response.json();
Create Query
- REST API
- Python
- JavaScript
curl -X POST https://your-site.taruvi.cloud/api/apps/{app_slug}/analytics/queries/ \
-H "Authorization: Bearer YOUR_TOKEN" \
-H "Content-Type: application/json" \
-d '{
"name": "Daily Sales Report",
"description": "Aggregates daily sales by region",
"connection_type": "external",
"secret_key": "sales-db-readonly",
"query_text": "SELECT region, SUM(amount) as total FROM sales WHERE date = '\''{{ date }}'\'' GROUP BY region",
"tags": ["finance", "daily-reporting"]
}'
import requests
response = requests.post(
"https://your-site.taruvi.cloud/api/apps/{app_slug}/analytics/queries/",
headers={
"Authorization": "Bearer YOUR_TOKEN",
"Content-Type": "application/json"
},
json={
"name": "Daily Sales Report",
"description": "Aggregates daily sales by region",
"connection_type": "external",
"secret_key": "sales-db-readonly",
"query_text": "SELECT region, SUM(amount) as total FROM sales WHERE date = '{{ date }}' GROUP BY region",
"tags": ["finance", "daily-reporting"]
}
)
result = response.json()
const response = await fetch(
"https://your-site.taruvi.cloud/api/apps/{app_slug}/analytics/queries/",
{
method: "POST",
headers: {
"Authorization": "Bearer YOUR_TOKEN",
"Content-Type": "application/json"
},
body: JSON.stringify({
name: "Daily Sales Report",
description: "Aggregates daily sales by region",
connection_type: "external",
secret_key: "sales-db-readonly",
query_text: "SELECT region, SUM(amount) as total FROM sales WHERE date = '{{ date }}' GROUP BY region",
tags: ["finance", "daily-reporting"]
})
}
);
const result = await response.json();
| Field | Type | Required | Description |
|---|---|---|---|
| `name` | string | Yes | Human-readable query name |
| `description` | string | No | Query description |
| `connection_type` | string | No (default `external`) | `external` or `internal` |
| `secret_key` | string | Required for external; must be null/omitted for internal | Reference to the connection secret |
| `query_text` | string | Yes | Query template (SQL, Elasticsearch DSL, or ClickHouse SQL) |
| `tags` | array[string] | No | Tag slugs for organization |
Response (201 Created):
{
"status": "success",
"message": "Query created successfully",
"data": {
"id": "uuid",
"name": "Daily Sales Report",
"slug": "daily-sales-report",
"description": "Aggregates daily sales by region",
"app": "sales-app",
"connection_type": "external",
"secret_key": "sales-db-readonly",
"query_text": "SELECT region, SUM(amount) as total FROM sales WHERE date = '{{ date }}' GROUP BY region",
"tags": ["finance", "daily-reporting"],
"created_at": "2024-01-15T10:30:00Z",
"updated_at": "2024-01-15T10:30:00Z"
}
}
Get Query
- REST API
- Python
- JavaScript
curl -X GET https://your-site.taruvi.cloud/api/apps/{app_slug}/analytics/queries/{slug}/ \
-H "Authorization: Bearer YOUR_TOKEN"
import requests
response = requests.get(
"https://your-site.taruvi.cloud/api/apps/{app_slug}/analytics/queries/{slug}/",
headers={"Authorization": "Bearer YOUR_TOKEN"}
)
query = response.json()
const response = await fetch(
"https://your-site.taruvi.cloud/api/apps/{app_slug}/analytics/queries/{slug}/",
{
headers: { "Authorization": "Bearer YOUR_TOKEN" }
}
);
const query = await response.json();
Response (200 OK):
{
"status": "success",
"message": "",
"data": {
"id": "uuid",
"name": "Daily Sales Report",
"slug": "daily-sales-report",
"description": "Aggregates daily sales by region",
"app": "sales-app",
"connection_type": "external",
"secret_key": "sales-db-readonly",
"query_text": "SELECT region, SUM(amount) as total FROM sales WHERE date = '{{ date }}' GROUP BY region",
"tags": ["finance", "daily-reporting"],
"created_at": "2024-01-15T10:30:00Z",
"updated_at": "2024-01-15T10:30:00Z"
}
}
Update Query
- REST API
- Python
- JavaScript
curl -X PUT https://your-site.taruvi.cloud/api/apps/{app_slug}/analytics/queries/{slug}/ \
-H "Authorization: Bearer YOUR_TOKEN" \
-H "Content-Type: application/json" \
-d '{
"name": "Updated Query Name",
"connection_type": "internal",
"secret_key": null,
"tags": ["internal-reporting"],
"query_text": "SELECT * FROM updated_table WHERE id = '\''{{ id }}'\''"
}'
import requests
response = requests.put(
"https://your-site.taruvi.cloud/api/apps/{app_slug}/analytics/queries/{slug}/",
headers={
"Authorization": "Bearer YOUR_TOKEN",
"Content-Type": "application/json"
},
json={
"name": "Updated Query Name",
"connection_type": "internal",
"secret_key": None,
"tags": ["internal-reporting"],
"query_text": "SELECT * FROM updated_table WHERE id = '{{ id }}'"
}
)
result = response.json()
const response = await fetch(
"https://your-site.taruvi.cloud/api/apps/{app_slug}/analytics/queries/{slug}/",
{
method: "PUT",
headers: {
"Authorization": "Bearer YOUR_TOKEN",
"Content-Type": "application/json"
},
body: JSON.stringify({
name: "Updated Query Name",
connection_type: "internal",
secret_key: null,
tags: ["internal-reporting"],
query_text: "SELECT * FROM updated_table WHERE id = '{{ id }}'"
})
}
);
const result = await response.json();
Any writable query fields supported by the serializer may be updated, including:
- `name`
- `description`
- `connection_type`
- `secret_key`
- `query_text`
- `tags`
Delete Query
- REST API
- Python
- JavaScript
curl -X DELETE https://your-site.taruvi.cloud/api/apps/{app_slug}/analytics/queries/{slug}/ \
-H "Authorization: Bearer YOUR_TOKEN"
Response: Uses standard AppResponse.success envelope (status/message).
import requests
response = requests.delete(
"https://your-site.taruvi.cloud/api/apps/{app_slug}/analytics/queries/{slug}/",
headers={"Authorization": "Bearer YOUR_TOKEN"}
)
result = response.json()
const response = await fetch(
"https://your-site.taruvi.cloud/api/apps/{app_slug}/analytics/queries/{slug}/",
{
method: "DELETE",
headers: { "Authorization": "Bearer YOUR_TOKEN" }
}
);
const result = await response.json();
Execute Query
- REST API
- Python
- JavaScript
curl -X POST https://your-site.taruvi.cloud/api/apps/{app_slug}/analytics/queries/{slug}/execute/ \
-H "Authorization: Bearer YOUR_TOKEN" \
-H "Content-Type: application/json" \
-d '{
"params": {
"date": "2024-01-15",
"region": "North America"
}
}'
Response (200 OK):
{
"status": "success",
"message": "Query executed successfully",
"data": [
{"region": "North America", "total": 150000},
{"region": "Europe", "total": 120000}
],
"total": 2,
"execution_key": "exec_abc123"
}
import requests
response = requests.post(
"https://your-site.taruvi.cloud/api/apps/{app_slug}/analytics/queries/{slug}/execute/",
headers={
"Authorization": "Bearer YOUR_TOKEN",
"Content-Type": "application/json"
},
json={
"params": {
"date": "2024-01-15",
"region": "North America"
}
}
)
result = response.json()
const response = await fetch(
"https://your-site.taruvi.cloud/api/apps/{app_slug}/analytics/queries/{slug}/execute/",
{
method: "POST",
headers: {
"Authorization": "Bearer YOUR_TOKEN",
"Content-Type": "application/json"
},
body: JSON.stringify({
params: {
date: "2024-01-15",
region: "North America"
}
})
}
);
const result = await response.json();
Template Syntax
Query templates use Jinja2 syntax for dynamic parameter injection.
Basic Parameters
Use {{ param_name }} to inject parameters at execution time:
SELECT * FROM users
WHERE status = '{{ status }}'
AND created_at > '{{ start_date }}'
Execute with:
{
"params": {
"status": "active",
"start_date": "2024-01-01"
}
}
String parameter values are automatically escaped for the target driver before Jinja rendering:
- SQL drivers (
postgres,mysql,redshift,clickhouse,internal): single quotes inside string values are doubled (e.g.O'BrienbecomesO''Brien) so values with quotes do not break SQL string literals and cannot be used for injection. - Elasticsearch: string values are JSON-escaped so embedded double quotes, backslashes, and control characters do not break the JSON DSL.
Numeric, boolean, and None parameter values pass through unchanged. This escaping is automatic and requires no changes to query templates.
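The two escaping behaviors can be sketched as small helpers. These are illustrative reimplementations of the documented behavior, not the platform's actual code; the function names are assumptions.

```python
import json

def escape_sql_string(value: str) -> str:
    """Double embedded single quotes so a string value cannot
    terminate the surrounding SQL string literal."""
    return value.replace("'", "''")

def escape_es_string(value: str) -> str:
    """JSON-escape a string for safe embedding in Elasticsearch DSL."""
    return json.dumps(value)[1:-1]  # strip the surrounding double quotes

escape_sql_string("O'Brien")   # quotes are doubled for SQL
escape_es_string('say "hi"')   # quotes are backslash-escaped for JSON
```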
Secret References
Use {{ secret.key }} to reference values from other secrets (external only):
SELECT * FROM users
WHERE api_key = '{{ secret.api_credentials }}'
AND tenant_id = '{{ secret.tenant_config.id }}'
Built-in User Profile
Access the current user's profile with {{ secret.user_profile }}:
SELECT * FROM audit_logs
WHERE user_id = '{{ secret.user_profile.id }}'
Available user profile fields:
- `id`, `username`, `email`
- `first_name`, `last_name`
- `is_staff`, `is_superuser`
Tags
Analytics queries support tags as an array of tag slugs on create and update.
Example:
{
"name": "Orders by Region",
"connection_type": "external",
"secret_key": "sales-db-readonly",
"query_text": "SELECT region, COUNT(*) FROM orders GROUP BY region",
"tags": ["finance", "regional"]
}
Query Examples
SQL Query (PostgreSQL/MySQL)
SELECT
DATE(created_at) as date,
COUNT(*) as orders,
SUM(total_amount) as revenue
FROM orders
WHERE created_at BETWEEN '{{ start_date }}' AND '{{ end_date }}'
AND status = '{{ status }}'
GROUP BY DATE(created_at)
ORDER BY date DESC
LIMIT 100
Elasticsearch Query
{
"_source": ["customer_name", "order_total", "created_at"],
"query": {
"bool": {
"must": [
{"match": {"status": "{{ status }}"}},
{"range": {"created_at": {"gte": "{{ start_date }}", "lte": "{{ end_date }}"}}}
]
}
},
"sort": [{"created_at": "desc"}],
"size": 100
}
ClickHouse Query
SELECT
toDate(timestamp) as date,
count() as events,
uniq(user_id) as unique_users
FROM events
WHERE timestamp >= '{{ start_date }}'
AND event_type = '{{ event_type }}'
GROUP BY date
ORDER BY date DESC
Security
Read-Only Execution
- Internal: the tenant `search_path` is forced to the current schema and the transaction is set `READ ONLY`. Logical table names are automatically resolved to physical names using the app context.
- SQL (external): the session is set to `READ ONLY` mode.
- ClickHouse (external): the `readonly=1` setting is enforced.
- Elasticsearch (external): only the `_search` endpoint is used.
Blocked Operations
The following are blocked for security:
SQL (PostgreSQL, MySQL, Redshift):
- DDL:
CREATE,ALTER,DROP,TRUNCATE - DML:
INSERT,UPDATE,DELETE,MERGE - Admin:
GRANT,REVOKE - Dangerous functions:
pg_sleep,sleep,load_file
ClickHouse:
- External functions: `file`, `s3`, `url`, `remote`, `mysql`, `postgresql`
- System functions: `sleep`, `input`
Elasticsearch:
- Script execution: `script`, `runtime_mappings`
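A keyword blocklist like the SQL one above can be checked with a whole-word scan. The sketch below is an illustrative client-side pre-check, not the platform's actual validator, which runs server-side and may be more thorough.

```python
import re

# Illustrative blocklist mirroring the SQL operations listed above.
SQL_BLOCKED = {
    "CREATE", "ALTER", "DROP", "TRUNCATE",   # DDL
    "INSERT", "UPDATE", "DELETE", "MERGE",   # DML
    "GRANT", "REVOKE",                       # admin
    "PG_SLEEP", "SLEEP", "LOAD_FILE",        # dangerous functions
}

def find_blocked_keywords(query: str) -> list:
    """Return any blocked SQL keywords appearing as whole words."""
    words = set(re.findall(r"[A-Za-z_]+", query.upper()))
    return sorted(words & SQL_BLOCKED)

find_blocked_keywords("SELECT * FROM sales")         # no violations
find_blocked_keywords("DROP TABLE sales; SELECT 1")  # flags DROP
```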
Secret Protection
- Secrets are resolved server-side and never exposed in responses.
- Query results never include secret values.
- Note: secret placeholders are allowed in templates, but validation blocks dangerous use cases (e.g., passing `secret.*` in params). Avoid placing secrets in SELECT output.
Error Handling
| Error Code | Description |
|---|---|
| QueryValidationError | Invalid query syntax or blocked operations |
| QueryExecutionError | Runtime execution failure |
| SecretResolutionError | Secret not found or invalid type |
| ConnectorError | Database connection failure |
Error Responses:
Errors are returned via the standard platform response format with a status and message. Validation failures include field-level errors; execution errors include a message and sanitized details. The payload will not include secrets.
Execution error detail:
When an external provider (Elasticsearch, ClickHouse, PostgreSQL, MySQL, Redshift) returns an error, the response message preserves the provider's actionable detail so callers can diagnose the failure. Examples:
- Elasticsearch: `"Elasticsearch query failed with HTTP 404: index_not_found_exception: no such index [my-index]"`
- ClickHouse: `"Code: 60. DB::Exception: Table default.my_table does not exist"`
- PostgreSQL / MySQL: `"(psycopg2.errors.UndefinedTable) relation \"my_table\" does not exist"`
Connection strings containing credentials are automatically sanitized before the message is returned, so passwords never appear in error responses.
Best Practices
- Use Parameterized Queries: Always use `{{ param }}` instead of string concatenation
- Limit Result Size: Add `LIMIT` clauses to prevent large result sets
- Create Read-Only Database Users: Use dedicated read-only credentials
- Use Descriptive Names: Name queries clearly for easy identification
- Document Queries: Use the description field to explain query purpose
- Test Parameters: Validate parameter values before execution
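The "limit result size" practice can be enforced mechanically before a query template is saved. The helper below is a hypothetical client-side convenience, not part of the platform API: it appends a `LIMIT` clause only when the SQL does not already contain one.

```python
import re

def ensure_limit(sql: str, max_rows: int = 100) -> str:
    """Append a LIMIT clause when the query lacks one
    (hypothetical client-side helper, not a platform feature)."""
    if re.search(r"\bLIMIT\s+\d+\b", sql, flags=re.IGNORECASE):
        return sql  # respect an explicit limit
    return f"{sql.rstrip().rstrip(';')} LIMIT {max_rows}"

ensure_limit("SELECT * FROM orders")          # LIMIT 100 appended
ensure_limit("SELECT * FROM orders LIMIT 5")  # left unchanged
```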