Syncs let you maintain an up-to-date local copy of data from any external API. Nango polls the API on a schedule, detects additions, updates, and deletions, and notifies your backend via webhooks so you can process only what changed.
## When to use syncs
Syncs are the right tool when you need to:
- Store a local copy of external data and keep it up to date (CRM contacts, Drive files, call transcripts).
- Feed a RAG pipeline — embed and index fresh documents as they change.
- Build a search index — re-index records only when they change.
- Trigger workflows on new or modified data without requiring the external API to support webhooks.
- Combine polling with webhooks for a reliable, real-time stream of changes.
For use cases where you only need to read or write data on demand (not continuously), use Actions instead.
## How syncs work
### Sync function runs on a schedule
You define a sync function in TypeScript. Nango runs it for each connection at your chosen frequency (minimum 15 seconds). The function fetches data from the external API, transforms it to your model, and saves it with `nango.batchSave()`.
### Nango detects changes
Nango compares incoming records against the previous snapshot and identifies additions, updates, and deletions automatically.
### Webhook notification
After each sync run, Nango sends a webhook to your backend with a `modifiedAfter` timestamp. You fetch only the changed records.
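For illustration, a sync webhook body might look like the following. The field names shown are the ones consumed in the handler example later on this page; real payloads carry additional fields, so treat this shape as indicative rather than authoritative:

```json
{
  "type": "sync",
  "connectionId": "conn-user-123",
  "providerConfigKey": "salesforce",
  "model": "SalesforceContact",
  "modifiedAfter": "2024-03-04T06:59:51Z"
}
```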
### Your app processes records
You call `nango.listRecords()` (or the REST API) with the cursor or timestamp from the webhook, retrieve the delta, and store or process the records in your system.
## Full sync vs. incremental sync
| Approach | How it works | When to use |
|---|---|---|
| Full sync | Fetches all records on every run. Nango diffs against the previous run to detect changes. | Small datasets or APIs with no way to filter by modification date. |
| Incremental sync | Uses a checkpoint to track progress. Fetches only records modified since the last run. | Large datasets, or any API that supports filtering by date. Recommended. |
Checkpoints are saved with `nango.saveCheckpoint()` and retrieved with `nango.getCheckpoint()`. On the first run, `getCheckpoint()` returns `null`, so you fetch everything. On subsequent runs, you pass the saved timestamp to the API to fetch only the delta.
## Building a sync
The following example syncs contacts from Salesforce incrementally:
```typescript
import { createSync } from 'nango';
import * as z from 'zod';

const SalesforceContact = z.object({
  id: z.string(),
  first_name: z.string(),
  last_name: z.string(),
  email: z.string(),
  account_id: z.string().nullable(),
  last_modified_date: z.string(),
});

export default createSync({
  description: 'Fetches contacts from Salesforce',
  version: '1.0.0',
  frequency: 'every hour',
  autoStart: true,
  trackDeletes: true,
  // Checkpoint schema — enables incremental syncing
  checkpoint: z.object({
    lastModifiedISO: z.string(),
  }),
  models: {
    SalesforceContact: SalesforceContact,
  },
  exec: async (nango) => {
    const checkpoint = await nango.getCheckpoint(); // null on first run
    const query = buildQuery(checkpoint?.lastModifiedISO);
    await fetchAndSaveRecords(nango, query);
  },
});

function buildQuery(lastModifiedISO?: string): string {
  let q = `SELECT Id, FirstName, LastName, Email, AccountId, LastModifiedDate FROM Contact`;
  if (lastModifiedISO) {
    q += ` WHERE LastModifiedDate > ${lastModifiedISO}`;
  }
  return q + ` ORDER BY LastModifiedDate ASC`;
}

async function fetchAndSaveRecords(nango: any, query: string) {
  let endpoint = '/services/data/v53.0/query';
  while (true) {
    const response = await nango.get({
      endpoint,
      // Only the first request takes the SOQL query; subsequent pages use nextRecordsUrl
      params: endpoint === '/services/data/v53.0/query' ? { q: query } : {}
    });

    const records = response.data.records.map((r: any) => ({
      id: r.Id,
      first_name: r.FirstName,
      last_name: r.LastName,
      email: r.Email,
      account_id: r.AccountId,
      last_modified_date: r.LastModifiedDate,
    }));

    // Guard against empty pages: an incremental run with no changes would
    // otherwise crash on records[records.length - 1]
    if (records.length > 0) {
      // Save records to Nango's cache
      await nango.batchSave(records, 'SalesforceContact');
      // Save checkpoint so the next run only fetches newer records
      await nango.saveCheckpoint({
        lastModifiedISO: records[records.length - 1].last_modified_date
      });
    }

    if (response.data.done) break;
    endpoint = response.data.nextRecordsUrl;
  }
}
```
Key utilities used in the function:

- `nango.getCheckpoint()` — retrieves the saved checkpoint; returns `null` on the first run.
- `nango.saveCheckpoint()` — persists progress so the next run resumes from there.
- `nango.batchSave(records, modelName)` — writes records to Nango’s encrypted cache.
- `nango.get({ endpoint })` — makes an authenticated request to the external API.
## Reading synced records

After receiving a webhook notification from Nango, fetch the changed records using the `modifiedAfter` timestamp or a cursor.
### Fetch by timestamp
```typescript
import { Nango } from '@nangohq/node';

const nango = new Nango({ secretKey: process.env.NANGO_SECRET_KEY! });

const result = await nango.listRecords({
  providerConfigKey: 'salesforce',      // from webhook payload
  connectionId: 'conn-user-123',        // from webhook payload
  model: 'SalesforceContact',           // from webhook payload
  modifiedAfter: '2024-03-04T06:59:51Z' // from webhook payload
});

for (const record of result.records) {
  console.log(record._nango_metadata.last_action); // 'ADDED' | 'UPDATED' | 'DELETED'
  console.log(record.first_name, record.last_name);
}
```
Or with cURL:

```bash
curl -G https://api.nango.dev/records \
  --header 'Authorization: Bearer <ENVIRONMENT-SECRET-KEY>' \
  --header 'Provider-Config-Key: salesforce' \
  --header 'Connection-Id: conn-user-123' \
  --data-urlencode 'model=SalesforceContact' \
  --data-urlencode 'modified_after=2024-03-04T06:59:51Z'
```
### Cursor-based synchronization
Webhooks can occasionally be missed. Cursor-based synchronization is more reliable — it tracks exactly how far you’ve consumed the record stream per connection, regardless of missed notifications.
Each record includes a `_nango_metadata.cursor` field. Store the cursor of the last record you processed. On the next fetch, pass that cursor to receive only records modified after it:
```typescript
import { Nango } from '@nangohq/node';

const nango = new Nango({ secretKey: process.env.NANGO_SECRET_KEY! });

// Retrieve the stored cursor for this connection+model combination
const lastCursor = await db.getCursor('conn-user-123', 'SalesforceContact');

const result = await nango.listRecords({
  providerConfigKey: 'salesforce',
  connectionId: 'conn-user-123',
  model: 'SalesforceContact',
  cursor: lastCursor ?? undefined
});

// Process records
for (const record of result.records) {
  await db.upsert(record);
}

// Persist the cursor from the last record for next time
if (result.records.length > 0) {
  const newCursor = result.records[result.records.length - 1]._nango_metadata.cursor;
  await db.setCursor('conn-user-123', 'SalesforceContact', newCursor);
}
```
Or with cURL:

```bash
curl -G https://api.nango.dev/records \
  --header 'Authorization: Bearer <ENVIRONMENT-SECRET-KEY>' \
  --header 'Provider-Config-Key: salesforce' \
  --header 'Connection-Id: conn-user-123' \
  --data-urlencode 'model=SalesforceContact' \
  --data-urlencode 'cursor=<cursor-of-last-fetched-record>'
```
The recommended cursor synchronization loop:

1. Receive a webhook notification from Nango.
2. Look up the stored cursor for this connection and model.
3. Call `listRecords` with that cursor.
4. Process and store the records.
5. Save the cursor from the last record for next time.
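As a minimal sketch of this loop: here `fetchPage` stands in for a single `nango.listRecords` call and `CursorStore` abstracts your own database — both are hypothetical helpers introduced for illustration, not Nango APIs.

```typescript
// A synced record as returned by listRecords (trimmed to the fields the loop needs).
type SyncedRecord = { _nango_metadata: { cursor: string } };

// One page fetch: given the stored cursor (or undefined on first run), return records.
type FetchPage = (cursor?: string) => Promise<SyncedRecord[]>;

// Your own persistence for the per-connection, per-model cursor.
type CursorStore = {
  get(): Promise<string | null>;
  set(cursor: string): Promise<void>;
};

// Drain the record stream: fetch from the stored cursor, process each page,
// then advance the cursor to the page's last record. Stops on an empty page.
async function consumeRecords(
  fetchPage: FetchPage,
  store: CursorStore,
  process: (record: SyncedRecord) => Promise<void>
): Promise<number> {
  let consumed = 0;
  while (true) {
    const records = await fetchPage((await store.get()) ?? undefined);
    if (records.length === 0) return consumed; // fully caught up
    for (const record of records) {
      await process(record);
      consumed += 1;
    }
    // Persisting after every page means a crash mid-drain resumes from the
    // last processed page instead of refetching everything.
    await store.set(records[records.length - 1]._nango_metadata.cursor);
  }
}
```

Because the cursor advances only after a page is fully processed, the loop is safe to re-run at any time: at worst it reprocesses one page, never skips one.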
Every record returned by `listRecords` includes a `_nango_metadata` object automatically populated by Nango:

```json
{
  "id": "003xx000004TmiQAAS",
  "first_name": "Jane",
  "last_name": "Doe",
  "email": "[email protected]",
  "_nango_metadata": {
    "last_action": "ADDED",
    "first_seen_at": "2024-03-04T06:59:51.471Z",
    "last_modified_at": "2024-03-04T06:59:51.471Z",
    "deleted_at": null,
    "cursor": "MjAyNC0wMy0wNFQwNjo1OTo1MS40NzE0NDEtMDU6MDB8fDE1Y2NjODA1..."
  }
}
```
`last_action` is one of `ADDED`, `UPDATED`, or `DELETED`. Use it to decide whether to insert, update, or remove the record from your system.
## Webhooks for real-time triggers
Nango sends a webhook to your backend after each sync run that produces changes. Configure your webhook endpoint in the Nango dashboard, then handle incoming events:
```typescript
// Express handler example
app.post('/webhooks/nango', (req, res) => {
  const { type, connectionId, providerConfigKey, model, modifiedAfter } = req.body;
  if (type === 'sync') {
    // Fetch changed records using modifiedAfter or a stored cursor
    processChangedRecords({ connectionId, providerConfigKey, model, modifiedAfter });
  }
  res.sendStatus(200);
});
```
Nango automatically prunes record payloads not updated for 30 days, and deletes all records for syncs not executed in 60 days. Fetch records promptly after webhook delivery and store them in your own system. Do not use Nango’s cache as your primary long-term data store.
## Polling vs. event-driven
Nango supports both approaches and lets you combine them:
| Approach | How to set up | Best for |
|---|---|---|
| Polling | Set frequency in your sync definition. Nango runs on schedule. | APIs without webhooks, or as a reliability fallback. |
| Event-driven | Configure the external API to send webhooks to Nango. Nango triggers a sync run on receipt. | APIs that support webhooks; minimizes latency. |
| Combined | Use both — webhooks for low latency, polling as a catch-up mechanism. | Production RAG pipelines, real-time search indexes. |
See the real-time syncs guide for instructions on configuring external webhook triggers.