The Problem with Single-Record Anchoring
If you're anchoring records one at a time in a loop, you're making unnecessary API calls:
// DON'T do this for bulk operations!
for (const order of orders) {
  await fetch('/v1/anchor', {
    method: 'POST',
    body: JSON.stringify({ data: order })
  });
  // 1000 orders = 1000 API calls = slow!
}
For 1,000 records, that's 1,000 API calls, 1,000 round trips, and significant latency. There's a better way.
Introducing Batch Anchoring
The new POST /v1/anchor/batch endpoint lets you anchor up to 100 records in a single API call:
// DO this for bulk operations
const response = await fetch('https://api.anchora.io/v1/anchor/batch', {
  method: 'POST',
  headers: {
    'Content-Type': 'application/json',
    'Authorization': `Bearer ${API_KEY}`
  },
  body: JSON.stringify({
    records: orders.map(order => ({
      data: order,
      collection: 'orders'
    })),
    webhookUrl: 'https://your-app.com/webhook/batch'
  })
});
// 100 records = 1 API call = fast!
How It Works: Two Levels of Batching
Anchora uses two levels of batching for maximum efficiency:
Level 1: API Batch (Your Request)
You send up to 100 records per API call. Each record gets queued individually for blockchain anchoring.
Level 2: Merkle Tree Batch (Our Processing)
Every 30 seconds, our worker collects up to 256 queued records, builds a Merkle tree, and anchors them in a single blockchain transaction.
Your API Call                      Anchora Processing
─────────────                      ──────────────────
POST /v1/anchor/batch              Worker (every 30s)
 └─ 100 records                     └─ Collects up to 256 records
       │                                  │
       ▼                                  ▼
     Queue                           Merkle Tree
 (100 QUEUED)                       ┌────┴────┐
       │                          Hash1     Hash2
       │                          /   \     /   \
       │                         R1   R2   R3   R4
       │                                  │
       │                                  ▼
       │                            Blockchain TX
       │                           (1 transaction)
       │                                  │
       └──────────────────────────────────┘
                        │
                        ▼
             256 records = $0.01
             Per record  = $0.000039
API Reference
Request Format
POST /v1/anchor/batch
Content-Type: application/json
Authorization: Bearer dcp_live_xxxxx

{
  "records": [
    {
      "data": { "orderId": "ORD-001", "total": 100.00 },
      "collection": "orders",
      "metadata": { "region": "US" }
    },
    {
      "data": { "orderId": "ORD-002", "total": 250.00 },
      "collection": "orders",
      "metadata": { "region": "EU" }
    },
    {
      "data": { "orderId": "ORD-003", "total": 75.50 },
      "collection": "orders"
    }
  ],
  "webhookUrl": "https://your-app.com/webhook/batch"
}
Request Parameters
| Field | Type | Required | Description |
|---|---|---|---|
| records | array | Yes | Array of records to anchor (max 100) |
| records[].data | object | Yes | The data to anchor (any JSON) |
| records[].collection | string | No | Collection name for organization |
| records[].metadata | object | No | Additional metadata (not anchored) |
| records[].hashOnly | boolean | No | Only store hash, not data |
| webhookUrl | string | No | URL to receive notifications for each record |
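Since the API caps a batch at 100 records and requires data on every record, a client-side pre-check can fail fast before spending a request. validateBatch is a hypothetical helper that mirrors the required fields and limits in the table above; the error messages are our own:

```javascript
// Client-side sanity check for a batch request body.
// Mirrors the documented constraints: records required, max 100,
// each record needs a data object. Hypothetical helper, not SDK code.
function validateBatch(body) {
  const errors = [];
  if (!Array.isArray(body.records) || body.records.length === 0) {
    errors.push('records must be a non-empty array');
  } else if (body.records.length > 100) {
    errors.push('records exceeds the 100-record limit; chunk the dataset');
  } else {
    body.records.forEach((r, i) => {
      if (typeof r.data !== 'object' || r.data === null) {
        errors.push(`records[${i}].data is required and must be an object`);
      }
    });
  }
  return errors; // empty array means the batch is safe to send
}
```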
Response Format
{
  "success": true,
  "message": "Batch anchoring completed",
  "batchId": "batch_1735131234567_abc123",
  "totalRecords": 3,
  "results": [
    {
      "success": true,
      "index": 0,
      "recordId": "rec_xyz1",
      "hash": "abc123def456...",
      "status": "QUEUED"
    },
    {
      "success": true,
      "index": 1,
      "recordId": "rec_xyz2",
      "hash": "def456ghi789...",
      "status": "QUEUED"
    },
    {
      "success": true,
      "index": 2,
      "recordId": "rec_xyz3",
      "hash": "ghi789jkl012...",
      "status": "QUEUED"
    }
  ],
  "summary": {
    "successful": 3,
    "failed": 0
  }
}
Cost Comparison
Here's how batch anchoring combined with Merkle tree batching delivers massive cost savings:
| Records | Traditional (1 TX each) | Anchora Batch | Savings |
|---|---|---|---|
| 100 | $1.00 (100 TX) | $0.004 (via Merkle) | 99.6% |
| 1,000 | $10.00 (1000 TX) | $0.04 (4 Merkle batches) | 99.6% |
| 10,000 | $100.00 (10,000 TX) | $0.40 (40 Merkle batches) | 99.6% |
| 100,000 | $1,000.00 (100,000 TX) | $3.91 (391 Merkle batches) | 99.6% |
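The table follows directly from two figures stated earlier: roughly $0.01 per blockchain transaction and up to 256 records per Merkle batch. A quick sanity check:

```javascript
// Reproduce the cost table: one transaction per Merkle batch of 256
// records, at an assumed $0.01 per transaction (figures from this post).
function batchCost(records, perTx = 0.01, batchSize = 256) {
  const batches = Math.ceil(records / batchSize);
  return { batches, cost: batches * perTx };
}
```

For example, 100,000 records need ceil(100000 / 256) = 391 batches, about $3.91, versus $1,000 at one transaction per record.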
Practical Examples
Example 1: E-commerce Order Anchoring
Anchor all orders from the last hour in a batch:
// Split an array into fixed-size chunks (not built into JavaScript)
function chunkArray(array, size) {
  const chunks = [];
  for (let i = 0; i < array.length; i += size) {
    chunks.push(array.slice(i, i + size));
  }
  return chunks;
}

async function anchorHourlyOrders() {
  // Get unanchored orders from the last hour
  const orders = await db.orders.find({
    anchoredAt: null,
    createdAt: { $gte: new Date(Date.now() - 3600000) }
  });
  if (orders.length === 0) return;

  // Chunk into batches of 100 (API limit)
  const chunks = chunkArray(orders, 100);

  for (const chunk of chunks) {
    const result = await fetch('https://api.anchora.io/v1/anchor/batch', {
      method: 'POST',
      headers: {
        'Content-Type': 'application/json',
        'Authorization': `Bearer ${API_KEY}`
      },
      body: JSON.stringify({
        records: chunk.map(order => ({
          data: {
            orderId: order.id,
            customerId: order.customerId,
            total: order.total,
            items: order.items,
            createdAt: order.createdAt
          },
          collection: 'orders',
          hashOnly: true
        })),
        webhookUrl: 'https://shop.com/webhook/anchora'
      })
    }).then(r => r.json());

    // Update orders with hashes
    for (const item of result.results) {
      if (item.success) {
        await db.orders.updateOne(
          { id: chunk[item.index].id },
          { $set: { 'anchora.hash': item.hash, 'anchora.status': 'QUEUED' }}
        );
      }
    }
    console.log(`Anchored ${result.summary.successful} orders`);
  }
}

// Run every hour
setInterval(anchorHourlyOrders, 3600000);
Example 2: Log File Anchoring
Anchor audit logs for compliance:
async function anchorAuditLogs(logs) {
  const result = await fetch('https://api.anchora.io/v1/anchor/batch', {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      'Authorization': `Bearer ${API_KEY}`
    },
    body: JSON.stringify({
      records: logs.map(log => ({
        data: {
          action: log.action,
          userId: log.userId,
          resource: log.resource,
          timestamp: log.timestamp,
          ipAddress: log.ip,
          details: log.details
        },
        collection: 'audit-logs',
        hashOnly: true // Don't store data, just hash
      }))
    })
  }).then(r => r.json());

  console.log(`Anchored ${result.summary.successful} audit logs`);
  return result;
}
// Usage with stream processing
const logBuffer = [];

logStream.on('data', async (log) => {
  logBuffer.push(log);
  // Anchor when buffer reaches 100
  if (logBuffer.length >= 100) {
    await anchorAuditLogs(logBuffer.splice(0, 100));
  }
});

// Flush any remaining logs when the stream ends
logStream.on('end', async () => {
  if (logBuffer.length > 0) {
    await anchorAuditLogs(logBuffer.splice(0));
  }
});
Example 3: Using the SDK
The SDK provides a cleaner interface for batch operations:
import { AnchoraClient } from '@anchora/sdk';

const anchora = new AnchoraClient({
  apiKey: process.env.ANCHORA_API_KEY
});

// Batch anchor with SDK
const result = await anchora.anchorBatch(
  orders.map(order => ({
    data: order,
    collection: 'orders',
    hashOnly: true
  })),
  {
    webhookUrl: 'https://your-app.com/webhook/batch'
  }
);

console.log('Successful:', result.summary.successful);
console.log('Failed:', result.summary.failed);

// Process results
for (const item of result.results) {
  if (item.success) {
    console.log(`Record ${item.index}: ${item.hash}`);
  } else {
    console.error(`Record ${item.index} failed: ${item.error}`);
  }
}
Error Handling
Batch operations can have partial failures. Always check individual results:
const result = await anchora.anchorBatch(records);

// Check for partial failures
if (result.summary.failed > 0) {
  console.warn(`${result.summary.failed} records failed`);

  // Collect failed records for retry
  const failedRecords = result.results
    .filter(r => !r.success)
    .map(r => ({
      index: r.index,
      error: r.error,
      originalRecord: records[r.index]
    }));

  // Log failures
  for (const failure of failedRecords) {
    console.error(`Record ${failure.index} failed: ${failure.error}`);
  }

  // Optionally retry failed records
  const retryRecords = failedRecords.map(f => f.originalRecord);
  if (retryRecords.length > 0) {
    await anchora.anchorBatch(retryRecords);
  }
}
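When you do retry, back off between attempts so a transient outage doesn't burn through your rate limit. Here's a sketch built on the SDK's anchorBatch call shown above; backoffMs and anchorWithRetry are hypothetical helpers, not part of the SDK:

```javascript
// Exponential backoff delay: 1s, 2s, 4s, ... capped at 30s.
// Hypothetical helper, not part of @anchora/sdk.
const backoffMs = (attempt, base = 1000, cap = 30000) =>
  Math.min(cap, base * 2 ** attempt);

// Retry only the records that failed, up to maxAttempts rounds.
async function anchorWithRetry(anchora, records, maxAttempts = 3) {
  let pending = records;
  for (let attempt = 0; attempt < maxAttempts && pending.length > 0; attempt++) {
    if (attempt > 0) {
      // Wait before each retry round
      await new Promise((resolve) => setTimeout(resolve, backoffMs(attempt - 1)));
    }
    const result = await anchora.anchorBatch(pending);
    // Keep only the records that failed this round
    pending = result.results
      .filter((r) => !r.success)
      .map((r) => pending[r.index]);
  }
  return pending; // records still unanchored after all attempts
}
```

Returning the still-failed records lets the caller decide whether to dead-letter them or alert, instead of retrying forever.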
Best Practices
- Split records into chunks of 100 (API limit) and process them sequentially or in parallel.
- Set a single webhookUrl for the batch. Each record triggers its own webhook when anchored.
- For sensitive data, set hashOnly: true to store only the hash, not the data.
- Always check summary.failed and process individual results for errors.
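For the chunking point, here's a minimal sketch of processing chunks in parallel with a concurrency cap, so a large backlog finishes faster without flooding the API. chunkArray, mapWithConcurrency, and anchorChunk are illustrative names, not SDK functions:

```javascript
// Split an array into fixed-size chunks
function chunkArray(items, size) {
  const chunks = [];
  for (let i = 0; i < items.length; i += size) {
    chunks.push(items.slice(i, i + size));
  }
  return chunks;
}

// Run fn over items with at most `limit` calls in flight at once
async function mapWithConcurrency(items, limit, fn) {
  const results = new Array(items.length);
  let next = 0;
  async function worker() {
    while (next < items.length) {
      const i = next++;
      results[i] = await fn(items[i]);
    }
  }
  await Promise.all(
    Array.from({ length: Math.min(limit, items.length) }, worker)
  );
  return results;
}

// Usage sketch: 1,000 records -> 10 chunks, 3 requests in flight
// const results = await mapWithConcurrency(chunkArray(records, 100), 3, anchorChunk);
```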
Rate Limits
The batch endpoint has these rate limits:
| Limit | Value | Notes |
|---|---|---|
| Records per request | 100 max | Chunk larger datasets |
| Requests per minute | 100 | = 10,000 records/minute |
| Requests per hour | 1,000 | = 100,000 records/hour |
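To stay under the 100 requests/minute limit, a client-side token bucket works well: hold up to a small burst of tokens, refill at the allowed rate, and take one token per request. This is a generic sketch with an injectable clock for testing, not an official Anchora client:

```javascript
// Simple token bucket: capacity = allowed burst, refillPerMs = tokens
// regained per millisecond (100 req/min ~= 1 token per 600 ms).
class TokenBucket {
  constructor(capacity, refillPerMs, now = Date.now) {
    this.capacity = capacity;
    this.tokens = capacity;
    this.refillPerMs = refillPerMs;
    this.now = now;          // injectable clock, handy for tests
    this.last = now();
  }

  // Returns true and consumes a token if a request may be sent now
  tryTake() {
    const t = this.now();
    this.tokens = Math.min(
      this.capacity,
      this.tokens + (t - this.last) * this.refillPerMs
    );
    this.last = t;
    if (this.tokens >= 1) {
      this.tokens -= 1;
      return true;
    }
    return false;
  }
}

// Usage sketch: const bucket = new TokenBucket(5, 1 / 600);
// if (bucket.tryTake()) { /* send the batch request */ }
```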
Summary
The Batch Anchoring API is designed for high-volume applications:
- Up to 100 records per API call
- Merkle tree batching combines 256 records into 1 blockchain transaction
- 99.6% cost reduction vs individual anchoring
- Individual webhooks for each record
- Partial failure handling - check each result
Ready to anchor at scale?
Start batch anchoring today. Free tier includes 10,000 records/month with no credit card required.
Get Your API Key