Synder Importer API
Import CSV and XLSX accounting data into QuickBooks Online or Xero programmatically.
Base URL: https://importer.synder.com/api/v1
Authenticate every endpoint with a single API token, upload files and run imports, and manage mappings with full CRUD. Set dryRun=true on import endpoints to validate mappings and file parsing without writing to QuickBooks or Xero.
Supported Providers
| Provider | ID | Entities |
|---|---|---|
| QuickBooks Online | intuit | Invoice, Bill, Expense, JournalEntry, Payment, CreditMemo, Estimate, PurchaseOrder, Customer, Vendor, Item, Account, Employee, SalesReceipt, RefundReceipt, Transfer, Deposit, Check, CreditCardCharge, TimeActivity, Class, VendorCredit, Attachable |
| Xero | xero | Invoice, Bill, CreditNote, BankTransaction, Contact, ManualJournal |
Authentication
1 Create an Account
Register at importer.synder.com. Confirm your email.
2 Connect Your Accounting Software
In the web UI, connect QuickBooks Online or Xero via OAuth.
3 Generate an API Key
Go to Account → API Keys, click Generate. Copy your token; it's shown only once.
4 Authenticate Requests
Include your token as a Bearer token in the Authorization header:
curl https://importer.synder.com/api/v1/account \
-H "Authorization: Bearer YOUR_API_TOKEN"
import requests
BASE = "https://importer.synder.com/api/v1"
headers = {"Authorization": "Bearer YOUR_API_TOKEN"}
resp = requests.get(f"{BASE}/account", headers=headers)
print(resp.json())
const BASE = "https://importer.synder.com/api/v1";
const headers = { "Authorization": "Bearer YOUR_API_TOKEN" };
const resp = await fetch(`${BASE}/account`, { headers });
const data = await resp.json();
console.log(data);
200 Response: Account object
{
"id": "5",
"email": "[email protected]",
"firstName": "Jane",
"lastName": "Doe",
"status": "ACTIVE",
"companiesAmount": 2,
"successfulImports": 142,
"subscription": {
"plan": "Professional",
"active": true
}
}
| Field | Type | Description |
|---|---|---|
id | String | Your account ID |
email | String | Account email address |
firstName | String | First name |
lastName | String | Last name |
status | String | ACTIVE, TRIAL, INACTIVE, etc. |
companiesAmount | Integer | Number of connected companies |
successfulImports | Integer | Lifetime count of successful imports |
subscription.plan | String | Subscription plan name (or null for trial) |
subscription.active | Boolean | Whether subscription is currently active |
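Before kicking off imports, a client can sanity-check the Account object returned above. A minimal illustrative sketch (treating TRIAL accounts as importable is an assumption; check your plan's actual limits):

```python
def can_import(account: dict) -> bool:
    """Return True if the account looks ready to run imports.

    Based on the Account object fields documented above; assumes TRIAL
    accounts may import, while other statuses need an active subscription.
    """
    if account.get("status") == "TRIAL":
        return True
    sub = account.get("subscription") or {}
    return account.get("status") == "ACTIVE" and bool(sub.get("active"))

# Using the sample response from this section:
account = {"status": "ACTIVE", "subscription": {"plan": "Professional", "active": True}}
print(can_import(account))  # True
```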
Your First Import
The complete flow from token to imported data:
1 List Companies
curl https://importer.synder.com/api/v1/companies \
-H "Authorization: Bearer $TOKEN"
companies = requests.get(f"{BASE}/companies", headers=headers).json()
company_id = companies[0]["id"]
const companies = await fetch(`${BASE}/companies`, { headers }).then(r => r.json());
const companyId = companies[0].id;
200 Response: Array of companies
[
{
"id": "9",
"companyName": "Acme Corp",
"provider": "intuit",
"status": "ACTIVE",
"companyAddress": "123 Main St, San Francisco, CA",
"settings": {
"dateFormat": "MM/dd/yyyy",
"incrementDocNumber": false,
"sendDocNumberToProvider": true
}
}
]
| Field | Type | Description |
|---|---|---|
id | String | Company ID; use as {companyId} in all endpoints |
companyName | String | Company display name from accounting provider |
provider | String | intuit or xero |
status | String | ACTIVE, INACTIVE, or DISCONNECTED |
companyAddress | String | Company address (may be null) |
settings | Object | Provider-specific settings (dateFormat, incrementDocNumber, etc.) |
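When several companies are connected, pick one that is actually usable; a DISCONNECTED company will fail imports with BAD_REQUEST. An illustrative helper (the name pick_company is not part of the API):

```python
def pick_company(companies, provider=None):
    """Return the first ACTIVE company, optionally filtered by provider.

    Raises if nothing usable is connected, which usually means the OAuth
    connection was revoked (DISCONNECTED) or disabled (INACTIVE).
    """
    for c in companies:
        if c["status"] != "ACTIVE":
            continue
        if provider and c["provider"] != provider:
            continue
        return c
    raise RuntimeError("No ACTIVE company found; reconnect in the web UI")

companies = [
    {"id": "9", "companyName": "Acme Corp", "provider": "intuit", "status": "ACTIVE"},
    {"id": "12", "companyName": "Old Books", "provider": "xero", "status": "DISCONNECTED"},
]
print(pick_company(companies, provider="intuit")["id"])  # 9
```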
2 Get Entity Fields
Check which fields are available for your entity type (e.g., Invoice).
curl https://importer.synder.com/api/v1/companies/{companyId}/entities/Invoice/fields \
-H "Authorization: Bearer $TOKEN"
fields = requests.get(
f"{BASE}/companies/{company_id}/entities/Invoice/fields",
headers=headers
).json()
required = [f for f in fields if f["isRequired"]]
print(f"Required fields: {[f['title'] for f in required]}")
const fields = await fetch(
`${BASE}/companies/${companyId}/entities/Invoice/fields`,
{ headers }
).then(r => r.json());
const required = fields.filter(f => f.isRequired);
console.log("Required:", required.map(f => f.title));
200 Response: Array of fields
[
{
"id": "231",
"title": "DocNumber",
"type": "String",
"isRequired": true,
"isForGrouping": true,
"maxSize": 21,
"orderNumber": 1,
"childElemName": null,
"alternativeTitles": ["Invoice #", "Inv No", "InvoiceNumber"],
"predefinedValues": [],
"enablingGuideUrl": null
},
{
"id": "176",
"title": "CustomerRef.name",
"type": "CustomerRef",
"isRequired": true,
"isForGrouping": false,
"maxSize": 100,
"orderNumber": 3,
"childElemName": null,
"alternativeTitles": ["Customer", "Client"],
"predefinedValues": [],
"enablingGuideUrl": null
}
]
| Field | Type | Description |
|---|---|---|
id | String | Field ID; use as targetFieldId in mappings |
title | String | Field name in the accounting system |
type | String | String, Boolean, Date, Decimal, Integer, DateTime, CustomerRef, Array |
isRequired | Boolean | Must be mapped for the import to succeed |
isForGrouping | Boolean | If true, groups multiple CSV rows into one entity |
maxSize | Integer or null | Maximum character length |
orderNumber | Integer | Suggested display order |
alternativeTitles | Array | Alternative column names accepted by smart mapping |
predefinedValues | Array | Allowed values (e.g., for Boolean fields) |
enablingGuideUrl | String or null | Link to enable this field in the provider |
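Before POSTing a mapping in the next step, you can verify locally that it covers every required field and avoid a VALIDATION_ERROR round-trip. An illustrative helper over the field list above:

```python
def missing_required(fields, mapping_fields):
    """Return titles of required entity fields not covered by a mapping.

    `fields` is the GET .../entities/{name}/fields response; `mapping_fields`
    is the `fields` array you intend to POST to .../mappings.
    """
    mapped_ids = {m["targetFieldId"] for m in mapping_fields}
    return [f["title"] for f in fields if f["isRequired"] and f["id"] not in mapped_ids]

fields = [
    {"id": "231", "title": "DocNumber", "isRequired": True},
    {"id": "176", "title": "CustomerRef.name", "isRequired": True},
    {"id": "234", "title": "Memo", "isRequired": False},
]
mapping_fields = [{"sourceFieldTitle": "Invoice #", "targetFieldId": "231"}]
print(missing_required(fields, mapping_fields))  # ['CustomerRef.name']
```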
3 Create a Mapping
Map your CSV columns to entity fields. All required fields must be mapped. Content-Type: application/json
curl -X POST https://importer.synder.com/api/v1/companies/{companyId}/mappings \
-H "Authorization: Bearer $TOKEN" \
-H "Content-Type: application/json" \
-d '{
"title": "My Invoice Mapping",
"entityName": "Invoice",
"fields": [
{"sourceFieldTitle": "Invoice #", "targetFieldId": "231"},
{"sourceFieldTitle": "Customer", "targetFieldId": "176"},
{"sourceFieldTitle": "Date", "targetFieldId": "179"},
{"sourceFieldTitle": "Item", "targetFieldId": "232"},
{"sourceFieldTitle": "Amount", "targetFieldId": "237"},
{"sourceFieldTitle": null, "targetFieldId": "234", "fixedValue": "Imported via API"}
]
}'
mapping = requests.post(
f"{BASE}/companies/{company_id}/mappings",
headers={**headers, "Content-Type": "application/json"},
json={
"title": "My Invoice Mapping",
"entityName": "Invoice",
"fields": [
{"sourceFieldTitle": "Invoice #", "targetFieldId": "231"},
{"sourceFieldTitle": "Customer", "targetFieldId": "176"},
{"sourceFieldTitle": "Date", "targetFieldId": "179"},
{"sourceFieldTitle": "Item", "targetFieldId": "232"},
{"sourceFieldTitle": "Amount", "targetFieldId": "237"},
{"sourceFieldTitle": None, "targetFieldId": "234",
"fixedValue": "Imported via API"}
]
}
).json()
mapping_id = mapping["id"]
const mapping = await fetch(`${BASE}/companies/${companyId}/mappings`, {
method: "POST",
headers: { ...headers, "Content-Type": "application/json" },
body: JSON.stringify({
title: "My Invoice Mapping",
entityName: "Invoice",
fields: [
{ sourceFieldTitle: "Invoice #", targetFieldId: "231" },
{ sourceFieldTitle: "Customer", targetFieldId: "176" },
{ sourceFieldTitle: "Date", targetFieldId: "179" },
{ sourceFieldTitle: "Item", targetFieldId: "232" },
{ sourceFieldTitle: "Amount", targetFieldId: "237" },
{ sourceFieldTitle: null, targetFieldId: "234",
fixedValue: "Imported via API" }
]
})
}).then(r => r.json());
const mappingId = mapping.id;
201 Response: Created mapping
{
"id": "31",
"title": "My Invoice Mapping",
"entityName": "Invoice",
"active": true,
"createdAt": "2026-03-11T10:30:00Z",
"fields": [
{
"sourceFieldTitle": "Invoice #",
"targetFieldId": "231",
"fixedValue": null
},
{
"sourceFieldTitle": null,
"targetFieldId": "234",
"fixedValue": "Imported via API"
}
]
}
Use fixedValue with "sourceFieldTitle": null to set a constant for every row instead of mapping a CSV column. Useful for memo fields, categories, or tags.
4 Upload & Execute Import
Upload a file and start the import. Content-Type: multipart/form-data
curl -X POST https://importer.synder.com/api/v1/companies/{companyId}/imports \
-H "Authorization: Bearer $TOKEN" \
-H "Idempotency-Key: $(uuidgen)" \
-F "[email protected]" \
-F "mappingId=31"
import uuid
resp = requests.post(
f"{BASE}/companies/{company_id}/imports",
headers={**headers, "Idempotency-Key": str(uuid.uuid4())},
files={"file": open("invoices.csv", "rb")},
data={"mappingId": mapping_id}
).json()
import_id = resp["id"]
print(f"Import {import_id} - Status: {resp['status']}")
// Node 18+ fetch needs a Blob, not a stream (requires: import fs from "fs")
const form = new FormData();
form.append("file", new Blob([fs.readFileSync("invoices.csv")]), "invoices.csv");
form.append("mappingId", mappingId);
const imp = await fetch(`${BASE}/companies/${companyId}/imports`, {
method: "POST",
headers: { ...headers, "Idempotency-Key": crypto.randomUUID() },
body: form
}).then(r => r.json());
const importId = imp.id;
console.log(`Import ${importId} - Status: ${imp.status}`);
Returns 202 Accepted; the import runs asynchronously.
202 Response: Import created
{
"id": "42",
"status": "SCHEDULED",
"mappingId": "31",
"entityName": "Invoice",
"company": {
"id": "9",
"name": "Acme Corp"
}
}
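Because this POST is not naturally idempotent, a timed-out request may or may not have created the import; reusing one Idempotency-Key across retries is what makes retrying safe. A sketch, with a stub submit callable standing in for the real requests.post upload:

```python
import uuid

def submit_with_retries(submit, max_attempts=3):
    """Retry an import submission with ONE Idempotency-Key for all attempts.

    `submit(key)` should perform the POST and raise on network failure.
    Reusing the key means a retry after an ambiguous timeout cannot create
    a duplicate import; the server replays the original response instead.
    """
    key = str(uuid.uuid4())  # generated once, reused on every attempt
    last_err = None
    for _ in range(max_attempts):
        try:
            return submit(key)
        except ConnectionError as e:
            last_err = e
    raise last_err

# Simulated flaky submit: fails once, then succeeds.
calls = []
def fake_submit(key):
    calls.append(key)
    if len(calls) == 1:
        raise ConnectionError("timeout")
    return {"id": "42", "status": "SCHEDULED"}

result = submit_with_retries(fake_submit)
print(result["status"], len(set(calls)))  # SCHEDULED 1  (same key both attempts)
```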
5 Poll for Completion
Call GET .../imports/{id} to check status. Start at 2-second intervals, back off exponentially, and cap at 30 seconds.
# Poll until terminal status
curl https://importer.synder.com/api/v1/companies/{companyId}/imports/{importId} \
-H "Authorization: Bearer $TOKEN"
import time
terminal = {"FINISHED", "FINISHED_WITH_WARNINGS", "FAILED", "CANCELED"}
delay = 2
while True:
    status = requests.get(
        f"{BASE}/companies/{company_id}/imports/{import_id}",
        headers=headers
    ).json()
    print(f"Status: {status['status']} - {status.get('summary', {})}")
    if status["status"] in terminal:
        break
    time.sleep(delay)
    delay = min(delay * 1.5, 30)  # Exponential backoff, cap 30s
const terminal = new Set(["FINISHED","FINISHED_WITH_WARNINGS","FAILED","CANCELED"]);
let delay = 2000;
let status;
do {
await new Promise(r => setTimeout(r, delay));
status = await fetch(
`${BASE}/companies/${companyId}/imports/${importId}`,
{ headers }
).then(r => r.json());
console.log(`Status: ${status.status}`, status.summary);
delay = Math.min(delay * 1.5, 30000);
} while (!terminal.has(status.status));
200 Response: Import with status & summary
{
"id": "42",
"status": "FINISHED",
"entityName": "Invoice",
"mappingId": "31",
"dateCreated": "2026-03-11T10:35:00Z",
"dateFinished": "2026-03-11T10:35:18Z",
"summary": {
"total": 25,
"succeeded": 23,
"failed": 2,
"warnings": 1
}
}
Status Lifecycle
SCHEDULED → IN_PROGRESS → FINISHED
                        → FINISHED_WITH_WARNINGS
                        → FAILED
                        → CANCELED
FINISHED → REVERTING → REVERTED
| Status | Terminal? | Description |
|---|---|---|
SCHEDULED | No | Import queued, will start shortly |
IN_PROGRESS | No | Actively processing rows |
FINISHED | Yes | All rows imported successfully |
FINISHED_WITH_WARNINGS | Yes | Completed but some rows had warnings |
FAILED | Yes | Import failed; check results for errors |
CANCELED | Yes | Canceled by user via cancel endpoint |
REVERTING | No | Reversal in progress |
REVERTED | Yes | All created entities deleted from accounting system |
6 Get Results
# All results
curl ".../imports/{importId}/results" \
-H "Authorization: Bearer $TOKEN"
# Errors only
curl ".../imports/{importId}/results?type=ERROR" \
-H "Authorization: Bearer $TOKEN"
results = requests.get(
f"{BASE}/companies/{company_id}/imports/{import_id}/results",
headers=headers,
params={"type": "ERROR"}
).json()
for row in results["data"]:
    print(f"Row {row['orderNumber']}: {row['text']}")
const results = await fetch(
`${BASE}/companies/${companyId}/imports/${importId}/results?type=ERROR`,
{ headers }
).then(r => r.json());
results.data.forEach(r =>
console.log(`Row ${r.orderNumber}: ${r.text}`)
);
200 Response: Paginated results
{
"data": [
{
"orderNumber": 1,
"type": "INFO",
"text": "Invoice #1001 created successfully",
"objectProviderId": "438",
"objectInfo": "Invoice #1001 for Acme Corp",
"isReverted": false,
"isUpdated": false
},
{
"orderNumber": 5,
"type": "ERROR",
"text": "Customer 'Unknown Corp' not found in QuickBooks",
"objectProviderId": null,
"objectInfo": null,
"isReverted": false,
"isUpdated": false
}
],
"pagination": {
"page": 1,
"perPage": 20,
"total": 25,
"totalPages": 2
}
}
| Field | Type | Description |
|---|---|---|
orderNumber | Integer | Row number from the uploaded file |
type | String | INFO (success), WARNING, or ERROR |
text | String | Human-readable result message |
objectProviderId | String or null | ID of the created entity in the accounting system |
objectInfo | String or null | Human-readable summary of the imported object |
isReverted | Boolean | Whether this row has been reverted |
isUpdated | Boolean | Whether this row updated an existing record |
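A small helper can turn a results page into a quick pass/fail report. An illustrative sketch over the response shape above (the name triage is not part of the API):

```python
from collections import Counter

def triage(results_page):
    """Summarize one page of .../results by type and collect error rows."""
    counts = Counter(r["type"] for r in results_page["data"])
    errors = [(r["orderNumber"], r["text"])
              for r in results_page["data"] if r["type"] == "ERROR"]
    return counts, errors

page = {"data": [
    {"orderNumber": 1, "type": "INFO", "text": "Invoice #1001 created successfully"},
    {"orderNumber": 5, "type": "ERROR", "text": "Customer 'Unknown Corp' not found in QuickBooks"},
]}
counts, errors = triage(page)
print(dict(counts))  # {'INFO': 1, 'ERROR': 1}
print(errors[0][0])  # 5
```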
File Requirements
CSV Files
| Requirement | Details |
|---|---|
| Encoding | UTF-8 (with or without BOM). Other encodings may cause garbled characters. |
| Delimiter | Comma (,). Semicolons and tabs are not supported. |
| Header row | Required. First row must contain column names matching your mapping's sourceFieldTitle values. |
| Max file size | 50 MB |
| Max rows | 100,000 rows per file (trial accounts have lower limits) |
| Quoting | Use double quotes for values containing commas: "Acme, Inc." |
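Python's csv module produces files that satisfy the delimiter and quoting rules above out of the box. A quick illustration (written to an in-memory buffer here; for a real file, open with encoding="utf-8" and newline="" per the csv module docs):

```python
import csv
import io

# csv.writer handles the quoting rule automatically: values containing
# commas are wrapped in double quotes, as the table above requires.
buf = io.StringIO()
writer = csv.writer(buf)  # default delimiter is the required comma
writer.writerow(["Invoice #", "Customer", "Date", "Item", "Amount"])
writer.writerow(["1001", "Acme, Inc.", "2026-03-01", "Consulting", "1500.00"])
data = buf.getvalue()
print('"Acme, Inc."' in data)  # True: the comma-containing value is quoted
```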
XLSX Files
| Requirement | Details |
|---|---|
| Sheet | First sheet used by default. Specify sheetName in the upload to use a different sheet. |
| Header row | Required. First row must contain column names. |
| Max file size | 50 MB |
| Formatting | Dates should be formatted as text or standard date cells, not custom formulas. |
Sample CSV
Download and use this sample to follow the Quick Start tutorial:
Invoice #,Customer,Date,Item,Amount,Memo
1001,Acme Corp,2026-03-01,Consulting,1500.00,Q1 services
1001,Acme Corp,2026-03-01,Travel,350.00,Client site visit
1002,Beta Inc,2026-03-05,Consulting,2000.00,March retainer
1003,Gamma LLC,2026-03-10,Software License,899.99,Annual renewal
The first two rows share Invoice # = 1001. Because DocNumber is a grouping field (isForGrouping: true), they'll be combined into a single invoice with two line items.
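The grouping behavior can be previewed locally: consecutive rows sharing the grouping value collapse into one entity with multiple line items. An illustrative sketch (the real grouping happens server-side during import):

```python
from itertools import groupby

rows = [
    {"Invoice #": "1001", "Item": "Consulting", "Amount": "1500.00"},
    {"Invoice #": "1001", "Item": "Travel", "Amount": "350.00"},
    {"Invoice #": "1002", "Item": "Consulting", "Amount": "2000.00"},
]
# Consecutive rows with the same grouping value become one entity,
# mirroring the server-side behavior described above.
invoices = {
    doc: [r["Item"] for r in grp]
    for doc, grp in groupby(rows, key=lambda r: r["Invoice #"])
}
print(invoices)  # {'1001': ['Consulting', 'Travel'], '1002': ['Consulting']}
```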
Smart Auto-Import
Skip manual mapping: the system matches your CSV headers to entity fields automatically using case-insensitive string matching.
curl -X POST https://importer.synder.com/api/v1/companies/{companyId}/imports/auto \
-H "Authorization: Bearer $TOKEN" \
-F "[email protected]" \
-F "entityName=Invoice" \
-F "dryRun=true"
auto = requests.post(
f"{BASE}/companies/{company_id}/imports/auto",
headers=headers,
files={"file": open("invoices.csv", "rb")},
data={"entityName": "Invoice", "dryRun": "true"}
).json()
print(f"Matched {auto['totalFieldsMapped']}/{auto['totalEntityFields']} fields")
for f in auto["proposedFields"]:
    print(f"  {f['sourceFieldTitle']} -> {f['targetFieldName']}")
200 Response: Auto-mapping result (dryRun=true)
{
"entityName": "Invoice",
"fileHeaders": ["Invoice #", "Customer", "Date", "Item", "Amount", "Memo"],
"proposedFields": [
{"sourceFieldTitle": "Invoice #", "targetFieldId": "231", "targetFieldName": "DocNumber", "confidence": "high"},
{"sourceFieldTitle": "Customer", "targetFieldId": "176", "targetFieldName": "CustomerRef.name", "confidence": "high"},
{"sourceFieldTitle": "Date", "targetFieldId": "179", "targetFieldName": "TxnDate", "confidence": "medium"},
{"sourceFieldTitle": "Amount", "targetFieldId": "237", "targetFieldName": "Line.Amount", "confidence": "medium"}
],
"missingRequired": ["Line.ItemRef.name"],
"totalFieldsMapped": 4,
"totalEntityFields": 12
}
| Parameter | Required | Description |
|---|---|---|
file | Yes | CSV or XLSX file |
entityName | Yes | Entity to import (e.g., Invoice) |
dryRun | No | true (default): preview mapping only. false: import immediately. |
Always run with dryRun=true first to validate the proposed mapping before importing for real.
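One way to act on the preview: import only when no required fields are missing, otherwise fall back to an explicit mapping (or fill the gaps with fixedValue entries). A sketch (auto_import_plan is an illustrative helper, not part of the API):

```python
def auto_import_plan(preview):
    """Decide what to do with a dryRun=true auto-mapping preview.

    Returns ("import", None) when every required field matched, otherwise
    ("manual", missing) so you can build an explicit mapping before
    importing for real.
    """
    missing = preview.get("missingRequired") or []
    if missing:
        return "manual", missing
    return "import", None

# Using the sample preview from this section:
preview = {"missingRequired": ["Line.ItemRef.name"], "totalFieldsMapped": 4}
action, missing = auto_import_plan(preview)
print(action, missing)  # manual ['Line.ItemRef.name']
```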
Complete Example
A single copy-paste script that goes from authentication to results:
"""Import invoices into QuickBooks via Synder Importer API."""
import requests, uuid, time
BASE = "https://importer.synder.com/api/v1"
TOKEN = "YOUR_API_TOKEN"
headers = {"Authorization": f"Bearer {TOKEN}"}
# 1. Get your company
companies = requests.get(f"{BASE}/companies", headers=headers).json()
company = next(c for c in companies if c["status"] == "ACTIVE")
cid = company["id"]
print(f"Using company: {company['companyName']} ({company['provider']})")
# 2. List required fields for Invoice
fields = requests.get(f"{BASE}/companies/{cid}/entities/Invoice/fields", headers=headers).json()
required = [f for f in fields if f["isRequired"]]
print(f"Required fields: {[f['title'] for f in required]}")
# 3. Create a mapping
mapping = requests.post(f"{BASE}/companies/{cid}/mappings",
headers={**headers, "Content-Type": "application/json"},
json={
"title": "Invoice Import",
"entityName": "Invoice",
"fields": [
{"sourceFieldTitle": "Invoice #", "targetFieldId": "231"},
{"sourceFieldTitle": "Customer", "targetFieldId": "176"},
{"sourceFieldTitle": "Date", "targetFieldId": "179"},
{"sourceFieldTitle": "Item", "targetFieldId": "232"},
{"sourceFieldTitle": "Amount", "targetFieldId": "237"},
]
}).json()
print(f"Mapping created: {mapping['id']}")
# 4. Upload file & start import
imp = requests.post(f"{BASE}/companies/{cid}/imports",
headers={**headers, "Idempotency-Key": str(uuid.uuid4())},
files={"file": open("invoices.csv", "rb")},
data={"mappingId": mapping["id"]}
).json()
print(f"Import {imp['id']} started - status: {imp['status']}")
# 5. Poll until done
terminal = {"FINISHED", "FINISHED_WITH_WARNINGS", "FAILED", "CANCELED"}
delay = 2
while True:
    status = requests.get(f"{BASE}/companies/{cid}/imports/{imp['id']}", headers=headers).json()
    print(f"  Status: {status['status']}")
    if status["status"] in terminal:
        break
    time.sleep(delay)
    delay = min(delay * 1.5, 30)
# 6. Show results
results = requests.get(f"{BASE}/companies/{cid}/imports/{imp['id']}/results", headers=headers).json()
for r in results["data"]:
    icon = "✅" if r["type"] == "INFO" else "⚠️" if r["type"] == "WARNING" else "❌"
    print(f"  {icon} Row {r['orderNumber']}: {r['text']}")
s = status["summary"]
print(f"\nDone: {s['succeeded']} succeeded, {s['failed']} failed, {s['warnings']} warnings")
/**
* Import invoices into QuickBooks via Synder Importer API.
* Run: node import.mjs (Node 18+)
*/
import fs from "fs";
import crypto from "crypto";
const BASE = "https://importer.synder.com/api/v1";
const TOKEN = "YOUR_API_TOKEN";
const headers = { Authorization: `Bearer ${TOKEN}` };
async function api(path, opts = {}) {
const res = await fetch(`${BASE}${path}`, { headers, ...opts });
if (!res.ok) {
const err = await res.json();
throw new Error(`${res.status}: ${err.error?.message || JSON.stringify(err)}`);
}
return res.json();
}
// 1. Get your company
const companies = await api("/companies");
const company = companies.find(c => c.status === "ACTIVE");
const cid = company.id;
console.log(`Using company: ${company.companyName} (${company.provider})`);
// 2. List required fields for Invoice
const fields = await api(`/companies/${cid}/entities/Invoice/fields`);
const required = fields.filter(f => f.isRequired);
console.log("Required fields:", required.map(f => f.title));
// 3. Create a mapping
const mapping = await api(`/companies/${cid}/mappings`, {
method: "POST",
headers: { ...headers, "Content-Type": "application/json" },
body: JSON.stringify({
title: "Invoice Import",
entityName: "Invoice",
fields: [
{ sourceFieldTitle: "Invoice #", targetFieldId: "231" },
{ sourceFieldTitle: "Customer", targetFieldId: "176" },
{ sourceFieldTitle: "Date", targetFieldId: "179" },
{ sourceFieldTitle: "Item", targetFieldId: "232" },
{ sourceFieldTitle: "Amount", targetFieldId: "237" },
]
})
});
console.log(`Mapping created: ${mapping.id}`);
// 4. Upload file & start import
const form = new FormData();
form.append("file", new Blob([fs.readFileSync("invoices.csv")]), "invoices.csv");
form.append("mappingId", mapping.id);
const imp = await fetch(`${BASE}/companies/${cid}/imports`, {
method: "POST",
headers: { ...headers, "Idempotency-Key": crypto.randomUUID() },
body: form
}).then(r => r.json());
console.log(`Import ${imp.id} started - status: ${imp.status}`);
// 5. Poll until done
const terminal = new Set(["FINISHED","FINISHED_WITH_WARNINGS","FAILED","CANCELED"]);
let delay = 2000;
let status;
do {
await new Promise(r => setTimeout(r, delay));
status = await api(`/companies/${cid}/imports/${imp.id}`);
console.log(` Status: ${status.status}`);
delay = Math.min(delay * 1.5, 30000);
} while (!terminal.has(status.status));
// 6. Show results
const results = await api(`/companies/${cid}/imports/${imp.id}/results`);
for (const r of results.data) {
  const icon = r.type === "INFO" ? "✅" : r.type === "WARNING" ? "⚠️" : "❌";
  console.log(`  ${icon} Row ${r.orderNumber}: ${r.text}`);
}
const s = status.summary;
console.log(`\nDone: ${s.succeeded} ok, ${s.failed} failed, ${s.warnings} warnings`);
Error Handling
All errors return a consistent JSON envelope:
{
"error": {
"code": "NOT_FOUND",
"message": "Company not found: 999"
}
}
Error Codes
| Code | HTTP | What to Do |
|---|---|---|
UNAUTHORIZED | 401 | Check your Bearer token is valid and not revoked |
NOT_FOUND | 404 | Resource not found; verify companyId, mappingId, importId, or entity name |
BAD_REQUEST | 400 | Invalid request; missing fields, invalid file format, disconnected provider, etc. |
VALIDATION_ERROR | 422 | Validation failed; missing required field mappings, invalid field IDs, etc. |
DUPLICATE_REQUEST | 409 | Same Idempotency-Key used within 24h. Use a new UUID |
CONFLICT | 409 | Import not in valid state for this operation (e.g., cancel a finished import) |
RATE_LIMITED | 429 | Wait for Retry-After header seconds, then retry |
Handling Errors in Code
def api_request(method, path, **kwargs):
    """Make an API request, retrying after rate limits."""
    resp = requests.request(method, f"{BASE}{path}", headers=headers, **kwargs)
    if resp.status_code == 429:
        retry_after = int(resp.headers.get("Retry-After", 60))
        print(f"Rate limited - waiting {retry_after}s")
        time.sleep(retry_after)
        return api_request(method, path, **kwargs)  # Retry after waiting
    if resp.status_code >= 400:
        error = resp.json().get("error", {})
        raise Exception(f"[{error.get('code')}] {error.get('message')}")
    return resp.json()
async function apiRequest(path, opts = {}) {
const resp = await fetch(`${BASE}${path}`, { headers, ...opts });
if (resp.status === 429) {
const retryAfter = parseInt(resp.headers.get("Retry-After") || "60");
console.log(`Rate limited - waiting ${retryAfter}s`);
await new Promise(r => setTimeout(r, retryAfter * 1000));
return apiRequest(path, opts); // Retry after waiting
}
if (!resp.ok) {
const { error } = await resp.json();
throw new Error(`[${error?.code}] ${error?.message}`);
}
return resp.json();
}
Rate Limits
| Tier | Scope | Limit | Window |
|---|---|---|---|
| Standard | Per account | 60 requests | 1 minute (sliding) |
| Import execution | Per company | 30 imports | 1 hour (sliding) |
Every response includes rate limit headers:
X-RateLimit-Limit: 60
X-RateLimit-Remaining: 57
X-RateLimit-Reset: 1773242094
| Header | Description |
|---|---|
X-RateLimit-Limit | Max requests allowed in the current window |
X-RateLimit-Remaining | Requests remaining before throttling |
X-RateLimit-Reset | Unix timestamp when the window resets |
Retry-After | Seconds to wait (only present on 429 responses) |
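Instead of reacting to 429s, a client can throttle proactively from these headers. A sketch (the function name and the min_remaining threshold are illustrative choices, not part of the API):

```python
import time

def throttle_delay(headers, min_remaining=5):
    """Compute how long to sleep based on the rate-limit headers above.

    Returns 0 while plenty of budget remains; once X-RateLimit-Remaining
    drops below `min_remaining`, waits until X-RateLimit-Reset so the
    client never actually hits a 429.
    """
    remaining = int(headers.get("X-RateLimit-Remaining", "1"))
    if remaining >= min_remaining:
        return 0
    reset = int(headers.get("X-RateLimit-Reset", "0"))
    return max(0, reset - int(time.time()))

print(throttle_delay({"X-RateLimit-Remaining": "57"}))  # 0: plenty of budget left
```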
Pagination
List endpoints support page-based pagination with page and perPage parameters.
| Parameter | Default | Max | Description |
|---|---|---|---|
page | 1 | none | Page number (1-indexed) |
perPage | 20 | 100 | Items per page |
curl ".../imports?page=2&perPage=50" -H "Authorization: Bearer $TOKEN"
Response with pagination metadata
{
"data": [ ... ],
"pagination": {
"page": 2,
"perPage": 50,
"total": 234,
"totalPages": 5
}
}
Paginated endpoints: GET .../mappings, GET .../imports, GET .../imports/{id}/results
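A generator that walks every page keeps callers simple. A sketch; fetch_page is a stand-in for a real requests.get with params={"page": page, "perPage": per_page}:

```python
def iter_all(fetch_page, per_page=100):
    """Yield items across all pages of a paginated list endpoint.

    `fetch_page(page, per_page)` should return the documented envelope
    with `data` and `pagination` keys.
    """
    page = 1
    while True:
        body = fetch_page(page, per_page)
        yield from body["data"]
        if page >= body["pagination"]["totalPages"]:
            break
        page += 1

# Stubbed two-page endpoint for illustration:
pages = {
    1: {"data": [1, 2], "pagination": {"totalPages": 2}},
    2: {"data": [3], "pagination": {"totalPages": 2}},
}
print(list(iter_all(lambda p, pp: pages[p])))  # [1, 2, 3]
```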
Tips for AI Agents
- Always send Idempotency-Key on import calls; it prevents accidental duplicate imports if you retry
- Poll with exponential backoff: start at 2s, multiply by 1.5×, cap at 30s
- Use dryRun=true to validate before importing into live accounting data
- Check /results?type=ERROR to quickly surface failures
- For XLSX files, specify sheetName in the upload form data
- Smart auto-mapping works best when CSV headers closely match entity field titles
- Use fixedValue in mappings for constants (e.g., default payment terms, memo fields)
- Grouping fields: fields with isForGrouping: true (like DocNumber) combine multiple CSV rows into one entity (e.g., multi-line invoices with several line items)
- Revert imports: if you made a mistake, POST .../imports/{id}/revert deletes all created entities from the accounting system
- Tolerate new fields: new response fields are added without version bumps, so don't fail on unknown keys