Overview
The Bulk Data Fetch feature allows you to schedule and process thousands of Financial Information (FI) requests in a single operation. Instead of making individual API calls for each consent, you can submit a batch of consent IDs and let the system handle processing, retries, and status tracking automatically.
This feature is essential for:
- Portfolio Management: Refresh financial data for all customers on a schedule
- Batch Analytics: Process large datasets for insights and reporting
- Periodic Monitoring: Implement recurring data refresh for continuous monitoring use cases
- Migration & Onboarding: Bulk process consents during system migrations
When to Use Bulk vs Single FI Requests
| Scenario | Recommended Approach |
|---|---|
| Single customer data fetch | Single FI Request API |
| Real-time data needs | Single FI Request API |
| 10+ consents to process | Bulk Data Fetch |
| Scheduled/periodic updates | Bulk Data Fetch |
| Analytics pipeline input | Bulk Data Fetch |
| Non-time-sensitive operations | Bulk Data Fetch |
Workflow
The bulk data fetch process follows these stages:
┌─────────────┐     ┌─────────────┐     ┌─────────────┐
│  Schedule   │ --> │   Process   │ --> │   Monitor   │
│   Batch     │     │   Records   │     │   Status    │
└─────────────┘     └─────────────┘     └─────────────┘
       │                   │                   │
       v                   v                   v
  Submit via         System              Poll /bulk/
  JSON API or        processes           status for
  file upload        async               progress
Step 1: Schedule a Batch
Choose one of two methods to submit your batch:
Option A: JSON API (recommended for programmatic integration)
POST /v2/schedule/fi/requests
{
"reference_id": "daily-refresh-2024-11-10",
"consents_list": [
{"consent_id": "uuid-1"},
{"consent_id": "uuid-2"},
{"consent_id": "uuid-3"}
],
"fiDataRangeFrom": "2024-01-01T00:00:00Z",
"fiDataRangeTo": "2024-11-10T23:59:59Z"
}
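The same request from JavaScript might look like the sketch below. It assumes Node.js 18+ (for global fetch); BASE_URL, the bearer-token Authorization header, and the MONEYONE_TOKEN environment variable are placeholders rather than documented values.
// Minimal sketch: schedule a batch via the JSON API.
// BASE_URL and the Authorization header are assumptions, not documented values.
const BASE_URL = "https://api.example.com";
async function scheduleBatch(consentIds, referenceId) {
  const response = await fetch(`${BASE_URL}/v2/schedule/fi/requests`, {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${process.env.MONEYONE_TOKEN}` // assumed auth scheme
    },
    body: JSON.stringify({
      reference_id: referenceId,
      consents_list: consentIds.map((id) => ({ consent_id: id })),
      fiDataRangeFrom: "2024-01-01T00:00:00Z",
      fiDataRangeTo: "2024-11-10T23:59:59Z"
    })
  });
  if (!response.ok) throw new Error(`Schedule failed: HTTP ${response.status}`);
  return response.json();
}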
Option B: File Upload (recommended for spreadsheet-based workflows)
POST /fi/requests/schedule/upload/v3
Content-Type: multipart/form-data
file: consents.csv
reference_id: daily-refresh-2024-11-10
fiDataRangeFrom: 2024-01-01T00:00:00Z
fiDataRangeTo: 2024-11-10T23:59:59Z
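A comparable sketch for the file-upload route, using Node.js 18+ built-ins (FormData, Blob) and the same placeholder BASE_URL and auth header as in Option A; the CSV column layout is not shown here and should be taken from the upload specification.
// Minimal sketch: schedule a batch by uploading a CSV file.
import { readFile } from "node:fs/promises";
async function scheduleBatchFromCsv(csvPath, referenceId) {
  const form = new FormData();
  form.append(
    "file",
    new Blob([await readFile(csvPath)], { type: "text/csv" }),
    "consents.csv"
  );
  form.append("reference_id", referenceId);
  form.append("fiDataRangeFrom", "2024-01-01T00:00:00Z");
  form.append("fiDataRangeTo", "2024-11-10T23:59:59Z");
  const response = await fetch(`${BASE_URL}/fi/requests/schedule/upload/v3`, {
    method: "POST",
    headers: { Authorization: `Bearer ${process.env.MONEYONE_TOKEN}` }, // assumed
    body: form // fetch sets the multipart boundary automatically
  });
  if (!response.ok) throw new Error(`Upload failed: HTTP ${response.status}`);
  return response.json();
}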
Step 2: Monitor Progress
Poll the status endpoint to track processing:
POST /bulk/status
{
"reference_id": "daily-refresh-2024-11-10",
"page": 1,
"limit": 100,
"include_balances": false,
"filters": { "status": ["IN_PROGRESS"] }
}
Response shows progress:
{
  "data": {
    "batch_status": "IN_PROGRESS",
    "summary": {
      "total": 1000,
      "ready": 210,
      "in_progress": 20,
      "completed": 750,
      "failed": 20
    }
  }
}
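A sketch of the corresponding status query, plus a terminal-state check based on the batch statuses listed below; it reuses the placeholder BASE_URL and auth header from Step 1.
// Minimal sketch: query /bulk/status for a batch and read its summary.
async function getBatchStatus(referenceId, page = 1, limit = 100) {
  const response = await fetch(`${BASE_URL}/bulk/status`, {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${process.env.MONEYONE_TOKEN}` // assumed
    },
    body: JSON.stringify({
      reference_id: referenceId,
      page,
      limit,
      include_balances: false
    })
  });
  if (!response.ok) throw new Error(`Status query failed: HTTP ${response.status}`);
  return response.json();
}
// COMPLETED and FAILED are the terminal batch statuses; anything else means keep polling.
function isTerminal(batchStatus) {
  return batchStatus === "COMPLETED" || batchStatus === "FAILED";
}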
Key Concepts
Reference ID
The reference_id is your unique identifier for tracking a batch. Best practices:
- Use meaningful, descriptive names (e.g., portfolio-refresh-2024-11-10)
- Include timestamps for scheduled batches
- Keep it between 5 and 60 characters
- Use only alphanumeric characters, hyphens, and underscores
- Store it for later status queries (a small helper following these rules is sketched below)
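One way to build and sanity-check a reference_id that follows these conventions; the prefix-plus-date pattern is only an illustration.
// Minimal sketch: generate and validate a reference_id.
function buildReferenceId(prefix) {
  const date = new Date().toISOString().slice(0, 10); // e.g. "2024-11-10"
  return `${prefix}-${date}`;
}
// 5-60 characters, alphanumerics, hyphens, and underscores only.
function isValidReferenceId(id) {
  return /^[A-Za-z0-9_-]{5,60}$/.test(id);
}
const referenceId = buildReferenceId("portfolio-refresh");
if (!isValidReferenceId(referenceId)) {
  throw new Error(`Invalid reference_id: ${referenceId}`);
}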
Batch Status
Batches progress through these statuses:
| Status | Description |
|---|---|
| READY | Batch accepted and ready for processing |
| IN_PROGRESS | Records are being processed |
| COMPLETED | All records processed successfully |
| FAILED | Batch-level failure |
Row Status
Individual records within a batch have their own status:
| Status | Description |
|---|---|
| READY | Record validated, ready for processing |
| IN_PROGRESS | FI request sent, awaiting response |
| COMPLETED | Data fetched successfully |
| FAILED | Processing failed |
Retry Eligibility
Failed records are classified by retry eligibility:
| Value | Meaning | Examples |
|---|---|---|
| Yes | Can retry immediately | Network timeout, temporary AA unavailability |
| No | Cannot retry | Consent revoked, consent expired, quota exceeded |
| YesWithModifications | Can retry with changes | Invalid date range (needs adjustment) |
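A sketch of splitting failed records by this value before building a retry batch; the field names on each record (retry_eligibility, consent_id) are assumptions and should be checked against the actual /bulk/status response schema.
// Minimal sketch: group failed records by retry eligibility.
// Field names on `record` are assumptions; verify against the status response.
function partitionFailedRecords(failedRecords) {
  const retryNow = [];
  const retryWithChanges = [];
  const notRetryable = [];
  for (const record of failedRecords) {
    if (record.retry_eligibility === "Yes") {
      retryNow.push(record);
    } else if (record.retry_eligibility === "YesWithModifications") {
      retryWithChanges.push(record); // e.g. adjust the date range before resubmitting
    } else {
      notRetryable.push(record); // consent revoked/expired, quota exceeded, etc.
    }
  }
  return { retryNow, retryWithChanges, notRetryable };
}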
Configuration Options
Global vs Per-Row Configuration
You can set default values that apply to all records, then override them for individual records:
{
"reference_id": "batch-001",
"consents_list": [
{
"consent_id": "uuid-1",
"fiDataRangeTo": "2024-06-30T23:59:59Z" // Override
},
{
"consent_id": "uuid-2" // Uses global defaults
}
],
"fiDataRangeFrom": "2024-01-01T00:00:00Z", // Global default
"fiDataRangeTo": "2024-12-31T23:59:59Z", // Global default
"configId": "analytics-v1" // Global default
}
Date Range Configuration
| Parameter | Description |
|---|---|
| fiDataRangeFrom | Start date for financial data (ISO 8601) |
| fiDataRangeTo | End date for financial data (ISO 8601) |
If not specified, the system uses the date range from the consent artefact.
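If you do set the range explicitly, the values must be ISO 8601 timestamps like the ones above; a rolling window can be computed as in this sketch (the 90-day span is only an example).
// Minimal sketch: build a rolling 90-day fiDataRange in ISO 8601 (UTC).
const MS_PER_DAY = 24 * 60 * 60 * 1000;
const toIso = (d) => d.toISOString().replace(/\.\d{3}Z$/, "Z"); // drop milliseconds
const now = new Date();
const dateRange = {
  fiDataRangeFrom: toIso(new Date(now.getTime() - 90 * MS_PER_DAY)),
  fiDataRangeTo: toIso(now)
};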
Analytics Configuration
The configId parameter specifies which analytics configuration to apply when processing the data. Contact your FinPro administrator to get available configuration IDs.
Best Practices
Batch Size Optimization
- Maximum: 10,000 records per batch
- Recommended: 1,000-5,000 records for optimal performance
- Rationale: Smaller batches complete faster and are easier to troubleshoot
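Given these limits, a minimal chunking helper (building on the scheduleBatch sketch from Step 1) might look like this; the part-number suffix on the reference_id is just a convention, not a requirement.
// Minimal sketch: split a large consent list into recommended-size batches.
const MAX_BATCH_SIZE = 5000; // recommended upper bound; the hard limit is 10,000
function chunkConsents(consentIds, size = MAX_BATCH_SIZE) {
  const batches = [];
  for (let i = 0; i < consentIds.length; i += size) {
    batches.push(consentIds.slice(i, i + size));
  }
  return batches;
}
// Submit each chunk under its own reference_id, e.g. "daily-refresh-2024-11-10-part1".
async function scheduleInChunks(allConsentIds, baseReferenceId) {
  const batches = chunkConsents(allConsentIds);
  for (let i = 0; i < batches.length; i++) {
    await scheduleBatch(batches[i], `${baseReferenceId}-part${i + 1}`);
  }
}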
Polling Strategy
// Recommended polling intervals
const POLLING_INTERVALS = {
small: 10000, // < 100 records: every 10 seconds
medium: 30000, // 100-1000 records: every 30 seconds
large: 60000 // > 1000 records: every 60 seconds
};
function getPollingInterval(totalRecords) {
if (totalRecords < 100) return POLLING_INTERVALS.small;
if (totalRecords < 1000) return POLLING_INTERVALS.medium;
return POLLING_INTERVALS.large;
}
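Putting getPollingInterval together with the getBatchStatus sketch from Step 2, a polling loop might look like this; stopping on a terminal batch_status is an assumption about your workflow rather than an API requirement.
// Minimal sketch: poll until the batch reaches COMPLETED or FAILED.
async function waitForBatch(referenceId, totalRecords) {
  const interval = getPollingInterval(totalRecords);
  for (;;) {
    const { data } = await getBatchStatus(referenceId);
    if (data.batch_status === "COMPLETED" || data.batch_status === "FAILED") {
      return data;
    }
    await new Promise((resolve) => setTimeout(resolve, interval));
  }
}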
Error Handling
- Check batch-level status first: If batch_status is FAILED, check for file parsing or validation errors.
- Review failed records: Use filters to retrieve only failed records:
  {
    "reference_id": "batch-001",
    "filters": { "status": ["FAILED"] }
  }
- Check retry eligibility: Before creating a retry batch, verify records are actually retryable.
- Log reference IDs: Always log the reference_id and batch_id for troubleshooting.
Scheduling Recommendations
- Avoid peak hours: Schedule large batches during off-peak times
- Stagger batches: Don’t submit multiple large batches simultaneously
- Monitor completion: Set up alerts for batches that take longer than expected
Limits and Quotas
| Limit | Value |
|---|---|
| Maximum batch size | 10,000 records |
| Maximum file size (uploads) | 10 MB |
| Reference ID length | 5-60 characters |
| Pagination limit | 1,000 records per page |
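With the 1,000-records-per-page cap in mind, a paging loop over /bulk/status (reusing the getBatchStatus sketch from Step 2) might look like the following; the data.records field name is an assumption to verify against the real response.
// Minimal sketch: page through all records of a batch.
// `data.records` is an assumed field name; adjust to the actual response shape.
async function fetchAllRecords(referenceId) {
  const pageSize = 1000; // maximum page size per the limits above
  const all = [];
  for (let page = 1; ; page++) {
    const { data } = await getBatchStatus(referenceId, page, pageSize);
    const records = data.records ?? [];
    all.push(...records);
    if (records.length < pageSize) break; // last (or empty) page
  }
  return all;
}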