Batch Operations
Perform multiple create, update, or delete operations in a single request. Batch operations are optimized for high throughput and handle partial failures gracefully.
Your deployment tier determines batch size limits (500-20,000 records per request). Batch operations are processed in chunks for optimal performance.
Batch Insert
Insert multiple records in a single request.
POST https://{EKODB_API_URL}/api/batch/insert/{collection}
Content-Type: application/json
Authorization: Bearer {YOUR_TOKEN}
{
  "inserts": [
    {
      "data": {
        "name": "User 1",
        "email": "user1@example.com"
      }
    },
    {
      "data": {
        "name": "User 2",
        "email": "user2@example.com"
      }
    }
  ]
}
# Response
{
  "successful": ["id1", "id2"],
  "failed": []
}
Disable Real-Time Sync:
Use bypass_ripple to skip real-time synchronization for bulk imports:
POST https://{EKODB_API_URL}/api/batch/insert/{collection}?bypass_ripple=true
Content-Type: application/json
Authorization: Bearer {YOUR_TOKEN}
{
  "inserts": [
    {"data": {"name": "Bulk User 1"}},
    {"data": {"name": "Bulk User 2"}}
  ]
}
Or set it in the request body:
{
  "bypass_ripple": true,
  "inserts": [
    {"data": {"name": "User 1"}},
    {"data": {"name": "User 2"}}
  ]
}
Response on Partial Failure:
{
  "successful": ["id1", "id2"],
  "failed": [
    {
      "id": null,
      "error": "Chunk 3 failed: validation error"
    }
  ]
}
Batch Update
Update multiple records by ID in a single request.
PUT https://{EKODB_API_URL}/api/batch/update/{collection}
Content-Type: application/json
Authorization: Bearer {YOUR_TOKEN}
{
  "updates": [
    {
      "id": "record_id_1",
      "data": {
        "status": "active",
        "updated_at": "2024-01-15T10:30:00Z"
      }
    },
    {
      "id": "record_id_2",
      "data": {
        "status": "inactive"
      }
    }
  ]
}
# Response
{
  "successful": ["record_id_1", "record_id_2"],
  "failed": []
}
Disable Real-Time Sync:
PUT https://{EKODB_API_URL}/api/batch/update/{collection}?bypass_ripple=true
Or in request body:
{
  "bypass_ripple": true,
  "updates": [...]
}
Response on Partial Failure:
{
  "successful": ["record_id_1"],
  "failed": [
    {
      "id": "record_id_2",
      "error": "Record not found or update failed"
    }
  ]
}
Batch Delete
Delete multiple records by ID in a single request.
DELETE https://{EKODB_API_URL}/api/batch/delete/{collection}
Content-Type: application/json
Authorization: Bearer {YOUR_TOKEN}
{
  "deletes": [
    { "id": "record_id_1" },
    { "id": "record_id_2" },
    { "id": "record_id_3" }
  ]
}
# Response
{
  "successful": ["record_id_1", "record_id_2", "record_id_3"],
  "failed": []
}
Disable Real-Time Sync:
DELETE https://{EKODB_API_URL}/api/batch/delete/{collection}?bypass_ripple=true
Or in request body:
{
  "bypass_ripple": true,
  "deletes": [...]
}
Response on Partial Failure:
{
  "successful": ["record_id_1", "record_id_2"],
  "failed": [
    {
      "id": "record_id_3",
      "error": "Record not found or delete failed"
    }
  ]
}
Using Batch Operations Within Transactions
All batch operations can be performed within a transaction by adding the transaction_id query parameter. This ensures atomicity across all batch operations.
Batch Insert in Transaction:
POST https://{EKODB_API_URL}/api/batch/insert/users?transaction_id=tx-001
Content-Type: application/json
Authorization: Bearer {YOUR_TOKEN}
{
  "inserts": [
    {"data": {"name": "User 1", "email": "user1@example.com"}},
    {"data": {"name": "User 2", "email": "user2@example.com"}}
  ]
}
Batch Update in Transaction:
PUT https://{EKODB_API_URL}/api/batch/update/users?transaction_id=tx-001
Content-Type: application/json
Authorization: Bearer {YOUR_TOKEN}
{
  "updates": [
    {"id": "user_1", "data": {"status": "active"}},
    {"id": "user_2", "data": {"status": "active"}}
  ]
}
Batch Delete in Transaction:
DELETE https://{EKODB_API_URL}/api/batch/delete/users?transaction_id=tx-001
Content-Type: application/json
Authorization: Bearer {YOUR_TOKEN}
{
  "deletes": [
    {"id": "user_1"},
    {"id": "user_2"}
  ]
}
Complete Transaction Example:
# Begin transaction
tx_id=$(curl -s -X POST https://{EKODB_API_URL}/api/transactions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer {YOUR_TOKEN}" \
  -d '{"isolation_level": "Serializable"}' | jq -r '.transaction_id')
# Batch insert in transaction (quote the URL so the shell does not interpret "?")
curl -X POST "https://{EKODB_API_URL}/api/batch/insert/users?transaction_id=$tx_id" \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer {YOUR_TOKEN}" \
  -d '{
    "inserts": [
      {"data": {"name": "User 1", "email": "user1@example.com"}},
      {"data": {"name": "User 2", "email": "user2@example.com"}}
    ]
  }'
# Batch update in transaction (--fail makes curl exit non-zero on HTTP errors)
curl --fail -X PUT "https://{EKODB_API_URL}/api/batch/update/users?transaction_id=$tx_id" \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer {YOUR_TOKEN}" \
  -d '{
    "updates": [
      {"id": "user_1", "data": {"verified": true}},
      {"id": "user_2", "data": {"verified": true}}
    ]
  }'
# Commit or rollback based on the exit code of the last batch operation
if [ $? -eq 0 ]; then
  curl -X POST "https://{EKODB_API_URL}/api/transactions/$tx_id/commit" \
    -H "Authorization: Bearer {YOUR_TOKEN}"
else
  curl -X POST "https://{EKODB_API_URL}/api/transactions/$tx_id/rollback" \
    -H "Authorization: Bearer {YOUR_TOKEN}"
fi
Using transactions with batch operations ensures that either all operations succeed or all are rolled back together. See Transactions for full transaction management.
Response Status Codes
Batch operations return different status codes based on the result:
- 201 Created - All inserts successful
- 200 OK - All updates/deletes successful
- 207 Multi-Status - Partial success (some succeeded, some failed)
- 400 Bad Request - All operations failed
- 503 Service Unavailable - Server is processing too many concurrent batch operations
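A client script can map these codes to follow-up actions. The sketch below is illustrative, not part of the API: handle_batch_status is a hypothetical helper, and the commented curl -w capture shows one common way to obtain the code.

```shell
# Hypothetical helper mapping a batch response status code to a next action.
handle_batch_status() {
  case "$1" in
    200|201) echo "ok" ;;       # every operation succeeded
    207)     echo "partial" ;;  # inspect the "failed" array, retry those records
    400)     echo "failed" ;;   # nothing succeeded; fix the payload before retrying
    503)     echo "retry" ;;    # server busy; back off and resend
    *)       echo "unknown" ;;
  esac
}

# One way to obtain the code: have curl print it with -w while saving the body:
#   http_code=$(curl -s -o response.json -w "%{http_code}" \
#     -X POST "https://{EKODB_API_URL}/api/batch/insert/users" ...)
#   action=$(handle_batch_status "$http_code")
```

Branching on the mapped action (rather than on raw codes scattered through the script) keeps retry and alerting logic in one place.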
Error Handling
Batch operations continue processing even if individual records fail:
# Example: Batch insert with validation error
curl -X POST https://{EKODB_API_URL}/api/batch/insert/users \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer {YOUR_TOKEN}" \
  -d '{
    "inserts": [
      {"data": {"name": "Valid User", "email": "valid@example.com"}},
      {"data": {"name": ""}},
      {"data": {"name": "Another Valid", "email": "valid2@example.com"}}
    ]
  }'
# Response (207 Multi-Status):
{
  "successful": ["id1", "id3"],
  "failed": [
    {
      "id": null,
      "error": "Validation error: name cannot be empty"
    }
  ]
}
Best Practices
Choose Appropriate Batch Size
# Small batch (< 100 records) - Fast response
POST /api/batch/insert/users
{"inserts": [...]} # 50 records
# Medium batch (100-1,000 records) - Balanced
POST /api/batch/insert/users
{"inserts": [...]} # 500 records
# Large batch (1,000-10,000 records) - Maximum throughput
POST /api/batch/insert/users
{"inserts": [...]} # 5,000 records
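When a dataset exceeds your tier's limit, slice it into fixed-size chunks client-side before sending. A minimal pure-shell sketch; the record names, the batch_size of 3, and the commented curl call are illustrative stand-ins (use your tier's limit from the Limits table in practice):

```shell
# Split a list of records into chunks of at most $batch_size and build one
# batch-insert payload per chunk.
batch_size=3                     # e.g. 500 on the Free tier
records=(r1 r2 r3 r4 r5 r6 r7)   # stand-ins for real record names
total=${#records[@]}
chunks=0
for ((i = 0; i < total; i += batch_size)); do
  chunk=("${records[@]:i:batch_size}")
  # Build the JSON payload for this chunk (printf repeats the format per element)
  items=$(printf '{"data":{"name":"%s"}},' "${chunk[@]}")
  payload="{\"inserts\":[${items%,}]}"
  # Each payload would then be POSTed to the batch endpoint:
  #   curl -X POST "https://{EKODB_API_URL}/api/batch/insert/users" \
  #     -H "Content-Type: application/json" -d "$payload"
  chunks=$((chunks + 1))
done
echo "$chunks"   # 7 records at 3 per chunk -> 3 requests
```

For real data files, the same slicing can be done with jq instead of shell arrays.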
Handle Partial Failures
Always check both successful and failed arrays:
# Check response (-s keeps curl's progress output out of the captured JSON)
response=$(curl -s -X POST .../api/batch/insert/users ...)
# Parse response (quote "$response" so jq receives the JSON intact)
successful_count=$(echo "$response" | jq '.successful | length')
failed_count=$(echo "$response" | jq '.failed | length')
if [ "$failed_count" -gt 0 ]; then
  echo "Warning: $failed_count operations failed"
  # Log failures or retry
fi
Monitor Performance
# Start with smaller batches to test performance
curl -X POST .../api/batch/insert/users -d '{"inserts": [...]}' # 100 records
# Monitor response time and adjust batch size
# - Fast response (< 1s): Can increase batch size
# - Slow response (> 5s): Reduce batch size
# - 503 errors: Server overloaded, wait and retry with smaller batches
# Use appropriate timeouts for large batches, and capture the status code
http_code=$(curl -s -o /dev/null -w "%{http_code}" --max-time 60 \
  -X POST .../api/batch/insert/users ...)
# Handle rate limiting gracefully
if [ "$http_code" -eq 503 ]; then
  echo "Server busy, waiting 5 seconds..."
  sleep 5
  # Retry with smaller batch
fi
Use Transactions for Atomicity
For operations that must succeed or fail together:
# Begin transaction
tx_id=$(curl -s -X POST .../api/transactions | jq -r '.transaction_id')
# Perform batch operations; --fail makes curl exit non-zero on HTTP errors
if curl --fail -X POST ".../api/batch/insert/users?transaction_id=$tx_id" ...; then
  curl -X POST ".../api/transactions/$tx_id/commit"
else
  curl -X POST ".../api/transactions/$tx_id/rollback"
fi
Complete Example
Here's a complete workflow for batch inserting users with error handling:
#!/bin/bash
# Batch insert users (three shown; the same pattern scales to thousands per request)
response=$(curl -s -X POST https://{EKODB_API_URL}/api/batch/insert/users \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer {YOUR_TOKEN}" \
  -d @- << 'EOF'
{
  "inserts": [
    {"data": {"name": "User 1", "email": "user1@example.com"}},
    {"data": {"name": "User 2", "email": "user2@example.com"}},
    {"data": {"name": "User 3", "email": "user3@example.com"}}
  ]
}
EOF
)
# Parse response (quote "$response" so jq receives the JSON intact)
successful_count=$(echo "$response" | jq '.successful | length')
failed_count=$(echo "$response" | jq '.failed | length')
# Log results
echo "Successfully inserted: $successful_count records"
echo "Failed: $failed_count records"
# Handle failures
if [ "$failed_count" -gt 0 ]; then
  echo "Failed records:"
  echo "$response" | jq '.failed'
  # Optionally retry failed records
  # ... retry logic here ...
fi
Limits
| Tier | Max Batch Size | Max Concurrent Operations |
|---|---|---|
| Free | 500 | 5 |
| Starter | 2,000 | 10 |
| Pro | 10,000 | 50 |
| Enterprise | 20,000 | 100 |
If you receive a 503 Service Unavailable response, the server is processing too many concurrent batch operations. Wait and retry after a few seconds.
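A sketch of that wait-and-retry loop with exponential backoff; retry_batch, the 5-attempt cap, and the 1-second starting delay are illustrative choices, not API requirements:

```shell
# Run a command that prints an HTTP status code; while it prints 503, back
# off (1s, 2s, 4s, ...) and retry, up to max_retries attempts.
retry_batch() {
  local max_retries=5 delay=1 attempt code
  for ((attempt = 1; attempt <= max_retries; attempt++)); do
    code=$("$@")
    if [ "$code" != "503" ]; then
      echo "$code"
      return 0
    fi
    sleep "$delay"
    delay=$((delay * 2))
  done
  echo "503"
  return 1
}

# In real use the command would be a curl call that prints the status code:
#   retry_batch curl -s -o response.json -w "%{http_code}" \
#     -X POST "https://{EKODB_API_URL}/api/batch/insert/users" ...
```

Combining backoff with a smaller batch size on retry further reduces pressure on an overloaded server.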
Related Documentation
- Basic Operations - Single record CRUD operations
- Transactions - ACID transactions for atomicity
- Authentication - API key and JWT authentication