This guide covers best practices for building reliable applications with the Reporting API. These recommendations focus on patterns specific to the Reporting API's architecture — schema discovery, query construction, the Continue Wait pattern, and data freshness.
Note
Golden Rule: Never hard-code measures, dimensions, or segments. Always discover them from /v1/meta.
Bad — hard-coded measures:

```javascript
const MEASURES = ['Orders.net_sales', 'Orders.tips_amount'];
```

Good — discovered at runtime:

```javascript
const metadata = await fetchMetadata();
const measures = metadata.cubes.Orders.measures.map((m) => m.name);
```
Schema can evolve as new measures and dimensions are added. Hard-coded values will break silently when they no longer match the API.
Recommended TTL: 1–24 hours depending on your use case.
- Refresh at application startup
- Refresh periodically (every 1–24 hours)
- Refresh after "measure not found" or similar schema errors
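The refresh strategy above can be sketched as a small cache wrapper. This is a minimal sketch, not part of the API: the fetcher is injected so it can be any function that resolves to the /v1/meta response body, and the `MetadataCache` name, constructor shape, and default TTL are assumptions for illustration.

```javascript
// Sketch of a metadata cache with a configurable TTL (default: 1 hour).
// fetchMetadata is any async function returning the /v1/meta body.
class MetadataCache {
  constructor(fetchMetadata, ttlMs = 60 * 60 * 1000) {
    this.fetchMetadata = fetchMetadata;
    this.ttlMs = ttlMs;
    this.cached = null;
    this.fetchedAt = 0;
  }

  async get() {
    const fresh = this.cached && Date.now() - this.fetchedAt < this.ttlMs;
    if (!fresh) {
      // Cache is empty or expired — fetch and remember when we did.
      this.cached = await this.fetchMetadata();
      this.fetchedAt = Date.now();
    }
    return this.cached;
  }

  // Call this after a "measure not found" or similar schema error
  // so the next get() refetches metadata immediately.
  invalidate() {
    this.cached = null;
  }
}
```

Calling `invalidate()` on schema errors covers the third refresh trigger; startup and periodic refreshes fall out of the TTL check.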
If a requested measure or dimension is missing from metadata, fall back to an alternative rather than failing outright. This makes your integration resilient to schema changes.
Before sending a query to /v1/load, check that every measure, dimension, and segment in the request exists in the current metadata. This catches typos and stale references early and produces clearer error messages than a failed API call.
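A pre-flight validator along these lines might look like the following sketch. It assumes metadata shaped like the /v1/meta examples above (`{ cubes: { Orders: { measures: [...], dimensions: [...], segments: [...] } } }`) with members named in the `Cube.member` convention; the function name and error format are illustrative.

```javascript
// Throw before hitting /v1/load if the query references unknown members.
function validateQuery(query, metadata) {
  // Collect every known member name across all cubes.
  const known = new Set();
  for (const cube of Object.values(metadata.cubes)) {
    for (const kind of ['measures', 'dimensions', 'segments']) {
      for (const member of cube[kind] || []) {
        known.add(member.name);
      }
    }
  }

  const requested = [
    ...(query.measures || []),
    ...(query.dimensions || []),
    ...(query.segments || []),
  ];
  const missing = requested.filter((name) => !known.has(name));
  if (missing.length > 0) {
    throw new Error(`Unknown members: ${missing.join(', ')}`);
  }
}
```

A typo like `Orders.nett_sales` then fails locally with a named member list instead of an opaque API error.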
Bad — manual filtering:
```json
{
  "measures": ["Orders.net_sales"],
  "filters": [
    { "member": "Orders.state", "operator": "equals", "values": ["COMPLETED"] }
  ]
}
```

Good — use the segment:

```json
{
  "measures": ["Orders.net_sales"],
  "segments": ["Orders.closed_checks"]
}
```
Segments encapsulate business logic maintained by Square. Using them ensures your results match Square dashboard reports.
Always include a dateRange in your timeDimensions. Open-ended queries can attempt to return years of data and time out.
```json
{
  "timeDimensions": [
    {
      "dimension": "Orders.sale_timestamp",
      "dateRange": ["2024-01-01", "2024-01-31"]
    }
  ]
}
```
| Use Case | Granularity |
|---|---|
| Intra-day monitoring | hour |
| Daily reports | day |
| Weekly trends | week |
| Monthly analysis | month |
| Quarterly reports | quarter |
| Year-over-year | year |
Avoid using day granularity for a full year of data — use month or quarter to keep result sets manageable.
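For example, a year-long trend query at month granularity returns at most twelve rows per series. The `granularity` field name is assumed from the granularity table above; the measure and dimension names are taken from the earlier examples.

```json
{
  "measures": ["Orders.net_sales"],
  "timeDimensions": [
    {
      "dimension": "Orders.sale_timestamp",
      "granularity": "month",
      "dateRange": ["2024-01-01", "2024-12-31"]
    }
  ]
}
```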
Always set a limit to avoid unexpectedly large result sets, especially when using high-cardinality dimensions like customer_id.
Recommended limits:
- UI display: 10–100
- Export/analysis: 1,000–10,000
- Pagination: 100–500 per page
Use order to sort results meaningfully — chronological for time series, descending by measure for rankings.
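Putting `limit` and `order` together, a top-customers ranking might look like the sketch below. `Orders.customer_id` is assumed from the high-cardinality example above, and the `order` object shape is illustrative.

```json
{
  "measures": ["Orders.net_sales"],
  "dimensions": ["Orders.customer_id"],
  "order": { "Orders.net_sales": "desc" },
  "limit": 100
}
```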
Bad — three separate queries:
const netSales = await query({ measures: ['Orders.net_sales'] }); const tips = await query({ measures: ['Orders.tips_amount'] }); const tax = await query({ measures: ['Orders.sales_tax_amount'] });
Good — single query:
const data = await query({ measures: [ 'Orders.net_sales', 'Orders.tips_amount', 'Orders.sales_tax_amount' ] });
One query is faster and uses fewer API calls.
Each additional dimension multiplies the result set size. Only include dimensions you actually need for your analysis.
The Orders cube has a data freshness of approximately 15 minutes.
- Historical data (yesterday and earlier) won't change — cache it aggressively (24 hours or longer)
- Today's data — cache for 15 minutes to match the cube's refresh cycle
- Metadata (/v1/meta) — cache for 1–24 hours
This simple tiered caching strategy significantly reduces API calls without sacrificing data accuracy.
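The tiered policy for query results can be reduced to a small helper that picks a TTL from a query's end date. This is a sketch under the freshness numbers stated above; the function name and signature are assumptions.

```javascript
// Return a cache TTL in milliseconds for a query whose dateRange ends at
// dateRangeEnd: historical results (ending before today) get 24 hours,
// results that include today get 15 minutes to match the cube's refresh cycle.
function cacheTtlMs(dateRangeEnd, now = new Date()) {
  const endOfRange = new Date(dateRangeEnd);
  const startOfToday = new Date(now.getFullYear(), now.getMonth(), now.getDate());
  const historical = endOfRange < startOfToday;
  return historical ? 24 * 60 * 60 * 1000 : 15 * 60 * 1000;
}
```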
The Reporting API uses a "Continue wait" pattern for queries that take time to compute. Your client must handle this:
```javascript
// url and headers are taken as parameters here (they were free variables
// in the original snippet): the /v1/load endpoint URL and your auth headers.
async function executeWithRetry(url, headers, query, maxAttempts = 10) {
  // Back-off schedule in seconds; later attempts wait longer.
  const delays = [2, 2, 5, 5, 10, 10, 15, 15, 20, 20];
  for (let attempt = 0; attempt < maxAttempts; attempt++) {
    const response = await fetch(url, {
      method: 'POST',
      headers,
      body: JSON.stringify(query),
    });
    const data = await response.json();
    if (data.error === 'Continue wait') {
      // The query is still computing — wait, then re-submit the same query.
      const delay = delays[attempt] || 20;
      await new Promise((resolve) => setTimeout(resolve, delay * 1000));
      continue;
    }
    return data;
  }
  throw new Error('Query timed out after max retries');
}
```
Warning
Without Continue Wait handling, complex queries will appear to fail on the first attempt. This is the most common integration issue.
When a preferred measure or dimension is unavailable in the current metadata, fall back to an alternative rather than crashing. This is especially important during schema transitions when measures may be temporarily renamed or replaced.
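One way to sketch this fallback is a resolver that walks a preference-ordered list of candidate names and returns the first one present in metadata. The candidate names used in the test are illustrative, not real Square measures, and the function name is an assumption.

```javascript
// Return the first candidate measure name that exists in metadata,
// or throw if none do (caller can then surface a clear error).
function resolveMeasure(candidates, metadata) {
  const available = new Set(
    Object.values(metadata.cubes).flatMap((cube) =>
      (cube.measures || []).map((m) => m.name)
    )
  );
  const found = candidates.find((name) => available.has(name));
  if (!found) {
    throw new Error(`None of [${candidates.join(', ')}] found in metadata`);
  }
  return found;
}
```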
Store your Square access token in environment variables — never hard-code tokens in source code. Use separate tokens for sandbox and production environments, and rotate them periodically.
Validate any user-supplied input (date ranges, location IDs) before incorporating it into queries. Enforce reasonable limits on date range spans to prevent abuse.
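A minimal sketch of that validation for date ranges: check the format, the ordering, and a maximum span before building a query. The 366-day cap is an arbitrary example limit, not an API requirement, and the function name is an assumption.

```javascript
// Validate a user-supplied [start, end] date range before querying.
function validateDateRange(start, end, maxDays = 366) {
  const iso = /^\d{4}-\d{2}-\d{2}$/;
  if (!iso.test(start) || !iso.test(end)) {
    throw new Error('Dates must be YYYY-MM-DD');
  }
  const startMs = Date.parse(start);
  const endMs = Date.parse(end);
  if (endMs < startMs) {
    throw new Error('End date precedes start date');
  }
  const days = (endMs - startMs) / (24 * 60 * 60 * 1000);
  if (days > maxDays) {
    throw new Error(`Range exceeds ${maxDays} days`);
  }
  return [start, end];
}
```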
- [ ] Metadata is discovered at runtime from /v1/meta, not hard-coded
- [ ] Metadata cache has an appropriate TTL (1–24 hours)
- [ ] Queries are validated against current metadata before execution
- [ ] Continue Wait retry logic is implemented
- [ ] Explicit date ranges are specified in all queries
- [ ] Orders.closed_checks segment is used for sales reports
- [ ] Historical data is cached aggressively (24+ hours)
- [ ] Result sets are limited to reasonable sizes
- [ ] Access tokens are stored securely in environment variables