Batch Operations
Perform multiple create, update, or delete operations in a single request. Batch operations are optimized for high throughput and handle partial failures gracefully.
Your deployment tier determines batch size limits (500-20,000 records per request). Batch operations are processed in chunks for optimal performance.
First, generate an access token using your API key:
curl -X POST https://{EKODB_API_URL}/api/auth/token \
-H "Content-Type: application/json" \
-d '{"api_key": "YOUR_API_KEY"}'
Response:
{
"token": "eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9..."
}
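If you are scripting against the API, a minimal sketch for capturing the token and reusing it in later requests (jq is assumed to be installed):
# Capture the token once; use $TOKEN wherever the examples below show {YOUR_API_TOKEN}
TOKEN=$(curl -s -X POST "https://{EKODB_API_URL}/api/auth/token" \
-H "Content-Type: application/json" \
-d '{"api_key": "YOUR_API_KEY"}' | jq -r '.token')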
Batch Insert
Insert multiple records in a single request.
POST https://{EKODB_API_URL}/api/batch/insert/{collection}
Content-Type: application/json
Authorization: Bearer {YOUR_API_TOKEN}
{
"inserts": [
{
"data": {
"name": "User 1",
"email": "user1@example.com"
}
},
{
"data": {
"name": "User 2",
"email": "user2@example.com"
}
}
]
}
# Response
{
"successful": ["id1", "id2"],
"failed": []
}
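The same request as a curl command (a sketch against a hypothetical users collection; replace the host and token placeholders with your own values):
curl -s -X POST "https://{EKODB_API_URL}/api/batch/insert/users" \
-H "Content-Type: application/json" \
-H "Authorization: Bearer {YOUR_API_TOKEN}" \
-d '{
"inserts": [
{"data": {"name": "User 1", "email": "user1@example.com"}},
{"data": {"name": "User 2", "email": "user2@example.com"}}
]
}'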
Prefer a client library? See examples in Rust, Python, TypeScript, Go, Kotlin, or JavaScript
Want to use the REST API directly? See examples in JavaScript, Python, Go, or Rust
Disable Real-Time Sync (Ripples):
Use bypass_ripple=true to skip real-time data propagation for bulk imports. This significantly improves throughput by avoiding network overhead.
When to bypass ripples:
- Initial data loading (millions of records)
- Database migrations or imports
- Maintenance operations
- Temporary disconnection scenarios
- Bulk data seeding in development
Performance impact: Bypassing ripples can improve batch throughput by 2-5x for large operations.
POST https://{EKODB_API_URL}/api/batch/insert/{collection}?bypass_ripple=true
Content-Type: application/json
Authorization: Bearer {YOUR_API_TOKEN}
{
"inserts": [
{"data": {"name": "Bulk User 1"}},
{"data": {"name": "Bulk User 2"}}
]
}
Or set it in the request body:
{
"bypass_ripple": true,
"inserts": [
{"data": {"name": "User 1"}},
{"data": {"name": "User 2"}}
]
}
After completing a large import with bypass_ripple=true, you can synchronize replicas manually using WAL exports or by temporarily reconfiguring ripples. See Ripples - Data Propagation for synchronization strategies.
Response on Partial Failure:
{
"successful": ["id1", "id2"],
"failed": [
{
"id": null,
"error": "Chunk 3 failed: validation error"
}
]
}
Batch Update
Update multiple records by ID in a single request.
PUT https://{EKODB_API_URL}/api/batch/update/{collection}
Content-Type: application/json
Authorization: Bearer {YOUR_API_TOKEN}
{
"updates": [
{
"id": "record_id_1",
"data": {
"status": "active",
"updated_at": "2024-01-15T10:30:00Z"
}
},
{
"id": "record_id_2",
"data": {
"status": "inactive"
}
}
]
}
# Response
{
"successful": ["record_id_1", "record_id_2"],
"failed": []
}
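The same update as a curl command (a sketch against a hypothetical users collection):
curl -s -X PUT "https://{EKODB_API_URL}/api/batch/update/users" \
-H "Content-Type: application/json" \
-H "Authorization: Bearer {YOUR_API_TOKEN}" \
-d '{
"updates": [
{"id": "record_id_1", "data": {"status": "active"}},
{"id": "record_id_2", "data": {"status": "inactive"}}
]
}'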
Disable Real-Time Sync:
PUT https://{EKODB_API_URL}/api/batch/update/{collection}?bypass_ripple=true
Or in request body:
{
"bypass_ripple": true,
"updates": [...]
}
Response on Partial Failure:
{
"successful": ["record_id_1"],
"failed": [
{
"id": "record_id_2",
"error": "Record not found or update failed"
}
]
}
Batch Delete
Delete multiple records by ID in a single request.
DELETE https://{EKODB_API_URL}/api/batch/delete/{collection}
Content-Type: application/json
Authorization: Bearer {YOUR_API_TOKEN}
{
"deletes": [
{ "id": "record_id_1" },
{ "id": "record_id_2" },
{ "id": "record_id_3" }
]
}
# Response
{
"successful": ["record_id_1", "record_id_2", "record_id_3"],
"failed": []
}
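The same delete as a curl command (a sketch against a hypothetical users collection):
curl -s -X DELETE "https://{EKODB_API_URL}/api/batch/delete/users" \
-H "Content-Type: application/json" \
-H "Authorization: Bearer {YOUR_API_TOKEN}" \
-d '{
"deletes": [
{"id": "record_id_1"},
{"id": "record_id_2"}
]
}'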
Disable Real-Time Sync:
DELETE https://{EKODB_API_URL}/api/batch/delete/{collection}?bypass_ripple=true
Or in request body:
{
"bypass_ripple": true,
"deletes": [...]
}
Response on Partial Failure:
{
"successful": ["record_id_1", "record_id_2"],
"failed": [
{
"id": "record_id_3",
"error": "Record not found or delete failed"
}
]
}
Using Batch Operations Within Transactions
All batch operations can be performed within a transaction by adding the transaction_id query parameter. This ensures atomicity across all batch operations.
Batch Insert in Transaction:
POST https://{EKODB_API_URL}/api/batch/insert/users?transaction_id=tx-001
Content-Type: application/json
Authorization: Bearer {YOUR_API_TOKEN}
{
"inserts": [
{"data": {"name": "User 1", "email": "user1@example.com"}},
{"data": {"name": "User 2", "email": "user2@example.com"}}
]
}
Batch Update in Transaction:
PUT https://{EKODB_API_URL}/api/batch/update/users?transaction_id=tx-001
Content-Type: application/json
Authorization: Bearer {YOUR_API_TOKEN}
{
"updates": [
{"id": "user_1", "data": {"status": "active"}},
{"id": "user_2", "data": {"status": "active"}}
]
}
Batch Delete in Transaction:
DELETE https://{EKODB_API_URL}/api/batch/delete/users?transaction_id=tx-001
Content-Type: application/json
Authorization: Bearer {YOUR_API_TOKEN}
{
"deletes": [
{"id": "user_1"},
{"id": "user_2"}
]
}
Complete Transaction Example:
# Begin transaction
tx_id=$(curl -s -X POST https://{EKODB_API_URL}/api/transactions \
-H "Content-Type: application/json" \
-H "Authorization: Bearer {YOUR_API_TOKEN}" \
-d '{"isolation_level": "Serializable"}' | jq -r '.transaction_id')
# Batch insert in transaction
curl -f -X POST "https://{EKODB_API_URL}/api/batch/insert/users?transaction_id=$tx_id" \
-H "Content-Type: application/json" \
-H "Authorization: Bearer {YOUR_API_TOKEN}" \
-d '{
"inserts": [
{"data": {"name": "User 1", "email": "user1@example.com"}},
{"data": {"name": "User 2", "email": "user2@example.com"}}
]
}'
# Batch update in transaction
curl -f -X PUT "https://{EKODB_API_URL}/api/batch/update/users?transaction_id=$tx_id" \
-H "Content-Type: application/json" \
-H "Authorization: Bearer {YOUR_API_TOKEN}" \
-d '{
"updates": [
{"id": "user_1", "data": {"verified": true}},
{"id": "user_2", "data": {"verified": true}}
]
}'
# Commit or rollback based on the exit code of the last batch call
# (the -f flags above make curl exit non-zero on HTTP error responses)
if [ $? -eq 0 ]; then
curl -X POST https://{EKODB_API_URL}/api/transactions/$tx_id/commit \
-H "Authorization: Bearer {YOUR_API_TOKEN}"
else
curl -X POST https://{EKODB_API_URL}/api/transactions/$tx_id/rollback \
-H "Authorization: Bearer {YOUR_API_TOKEN}"
fi
Using transactions with batch operations ensures that either all operations succeed or all are rolled back together. See Transactions for full transaction management.
Response Status Codes
Batch operations return different status codes based on the result:
- 201 Created - All inserts successful
- 200 OK - All updates/deletes successful
- 207 Multi-Status - Partial success (some succeeded, some failed)
- 400 Bad Request - All operations failed
- 503 Service Unavailable - Server is processing too many concurrent batch operations
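One way to branch on these codes from a shell script is to capture the status with curl's -w option. A sketch, assuming a users collection and a payload file named inserts.json:
# Write the body to a file and capture only the status code
http_code=$(curl -s -o /tmp/batch_response.json -w "%{http_code}" \
-X POST "https://{EKODB_API_URL}/api/batch/insert/users" \
-H "Content-Type: application/json" \
-H "Authorization: Bearer {YOUR_API_TOKEN}" \
-d @inserts.json)
case "$http_code" in
200|201) echo "All operations succeeded" ;;
207) echo "Partial success; inspect the failed array"; jq '.failed' /tmp/batch_response.json ;;
400) echo "All operations failed"; jq '.failed' /tmp/batch_response.json ;;
503) echo "Server busy; retry with a smaller batch after a short delay" ;;
*) echo "Unexpected status: $http_code" ;;
esac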
Error Handling
Batch operations continue processing even if individual records fail:
# Example: Batch insert with validation error
curl -X POST https://{EKODB_API_URL}/api/batch/insert/users \
-H "Content-Type: application/json" \
-H "Authorization: Bearer {YOUR_API_TOKEN}" \
-d '{
"inserts": [
{"data": {"name": "Valid User", "email": "valid@example.com"}},
{"data": {"name": ""}},
{"data": {"name": "Another Valid", "email": "valid2@example.com"}}
]
}'
# Response (207 Multi-Status):
{
"successful": ["id1", "id3"],
"failed": [
{
"id": null,
"error": "Validation error: name cannot be empty"
}
]
}
Best Practices
Choose Appropriate Batch Size
# Small batch (< 100 records) - Fast response
POST /api/batch/insert/users
{"inserts": [...]} # 50 records
# Medium batch (100-1,000 records) - Balanced
POST /api/batch/insert/users
{"inserts": [...]} # 500 records
# Large batch (1,000-10,000 records) - Maximum throughput
POST /api/batch/insert/users
{"inserts": [...]} # 5,000 records
Handle Partial Failures
Always check both successful and failed arrays:
# Check response
response=$(curl -X POST .../api/batch/insert/users ...)
# Parse response
successful_count=$(echo $response | jq '.successful | length')
failed_count=$(echo $response | jq '.failed | length')
if [ $failed_count -gt 0 ]; then
echo "Warning: $failed_count operations failed"
# Log failures or retry
fi
Monitor Performance
# Start with smaller batches to test performance
curl -X POST .../api/batch/insert/users -d '{"inserts": [...]}' # 100 records
# Monitor response time and adjust batch size
# - Fast response (< 1s): Can increase batch size
# - Slow response (> 5s): Reduce batch size
# - 503 errors: Server overloaded, wait and retry with smaller batches
# Use appropriate timeouts for large batches
curl --max-time 60 -X POST .../api/batch/insert/users ...
# Handle rate limiting gracefully
if [ $http_code -eq 503 ]; then
echo "Server busy, waiting 5 seconds..."
sleep 5
# Retry with smaller batch
fi
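Putting the 503 handling together, a retry sketch with simple backoff (payload.json, the attempt count, and the delays are assumptions to adjust for your workload):
# Retry a batch insert up to 5 times, doubling the wait after each 503
attempt=1
max_attempts=5
delay=5
while [ "$attempt" -le "$max_attempts" ]; do
http_code=$(curl -s -o /tmp/batch_response.json -w "%{http_code}" \
--max-time 60 \
-X POST "https://{EKODB_API_URL}/api/batch/insert/users" \
-H "Content-Type: application/json" \
-H "Authorization: Bearer {YOUR_API_TOKEN}" \
-d @payload.json)
if [ "$http_code" != "503" ]; then
break
fi
echo "Server busy (attempt $attempt); waiting ${delay}s before retrying..."
sleep "$delay"
delay=$((delay * 2))
attempt=$((attempt + 1))
done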
Use Transactions for Atomicity
For operations that must succeed or fail together:
# Begin transaction
tx_id=$(curl -X POST .../api/transactions | jq -r '.transaction_id')
# Perform batch operations
curl -X POST .../api/batch/insert/users?transaction_id=$tx_id ...
# Commit or rollback based on results
if [ $success ]; then
curl -X POST .../api/transactions/$tx_id/commit
else
curl -X POST .../api/transactions/$tx_id/rollback
fi
Complete Example
Here's a complete workflow for batch inserting users with error handling:
#!/bin/bash
# Batch insert users (three records shown; a real batch can hold up to your tier's limit)
response=$(curl -s -X POST https://{EKODB_API_URL}/api/batch/insert/users \
-H "Content-Type: application/json" \
-H "Authorization: Bearer {YOUR_API_TOKEN}" \
-d @- << 'EOF'
{
"inserts": [
{"data": {"name": "User 1", "email": "user1@example.com"}},
{"data": {"name": "User 2", "email": "user2@example.com"}},
{"data": {"name": "User 3", "email": "user3@example.com"}}
]
}
EOF
)
# Parse response
successful=$(echo "$response" | jq -r '.successful[]')
failed=$(echo "$response" | jq -r '.failed[]')
# Log results
echo "Successfully inserted: $(echo "$response" | jq '.successful | length') records"
echo "Failed: $(echo "$response" | jq '.failed | length') records"
# Handle failures
if [ "$(echo "$response" | jq '.failed | length')" -gt 0 ]; then
echo "Failed records:"
echo "$response" | jq '.failed'
# Optionally retry failed records
# ... retry logic here ...
fi
Limits
| Tier | Max Batch Size | Max Concurrent Operations |
|---|---|---|
| Free | 500 | 5 |
| Starter | 2,000 | 10 |
| Pro | 10,000 | 50 |
| Enterprise | 20,000 | 100 |
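A small sketch for capping request size from the table above (the tier name is an assumption you set for your own deployment):
TIER="starter"   # one of: free | starter | pro | enterprise
case "$TIER" in
free) MAX_BATCH=500 ;;
starter) MAX_BATCH=2000 ;;
pro) MAX_BATCH=10000 ;;
enterprise) MAX_BATCH=20000 ;;
esac
echo "Cap each batch request at $MAX_BATCH records"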
If you receive a 503 Service Unavailable response, the server is processing too many concurrent batch operations. Wait and retry after a few seconds.
Related Documentation
- Basic Operations - Single record CRUD operations
- Query Expressions - Filter syntax for batch queries
- Transactions - ACID transactions for atomicity
- Authentication - API key and JWT authentication
Example Code
Direct HTTP/REST API Examples
Raw HTTP examples demonstrating the REST API directly:
- JavaScript - batch_operations.js
- Python - batch_operations.py
- Go - batch_operations.go
- Rust - batch_operations.rs
Client Library Examples
Production-ready examples using official client libraries:
- Rust - client_batch_operations.rs
- Python - client_batch_operations.py
- TypeScript - client_batch_operations.ts
- Go - client_batch_operations.go
- Kotlin - ClientBatchOperations.kt
- JavaScript - client_batch_operations.js