
Overview

The Bulk Data Fetch feature allows you to schedule and process thousands of Financial Information (FI) requests in a single operation. Instead of making individual API calls for each consent, you can submit a batch of consent IDs and let the system handle processing, retries, and status tracking automatically. This feature is essential for:
  • Portfolio Management: Refresh financial data for all customers on a schedule
  • Batch Analytics: Process large datasets for insights and reporting
  • Periodic Monitoring: Implement recurring data refresh for continuous monitoring use cases
  • Migration & Onboarding: Bulk process consents during system migrations

When to Use Bulk vs Single FI Requests

| Scenario | Recommended Approach |
| --- | --- |
| Single customer data fetch | Single FI Request API |
| Real-time data needs | Single FI Request API |
| 10+ consents to process | Bulk Data Fetch |
| Scheduled/periodic updates | Bulk Data Fetch |
| Analytics pipeline input | Bulk Data Fetch |
| Non-time-sensitive operations | Bulk Data Fetch |

Workflow

The bulk data fetch process follows these stages:
┌─────────────┐     ┌─────────────┐     ┌─────────────┐     ┌─────────────┐
│   Schedule  │ --> │   Process   │ --> │   Monitor   │ --> │   Retrieve  │
│    Batch    │     │   Records   │     │   Status    │     │    Data     │
└─────────────┘     └─────────────┘     └─────────────┘     └─────────────┘
      │                   │                   │                   │
      v                   v                   v                   v
  Submit via          System              Poll /bulk/         Use Get FI
  JSON API or         processes           status for          Data APIs
  file upload         async               progress            with session_id

Step 1: Schedule a Batch

Choose one of two methods to submit your batch.

Option A: JSON API (recommended for programmatic integration)
POST /v2/schedule/fi/requests
{
  "reference_id": "daily-refresh-2024-11-10",
  "consents_list": [
    {"consent_id": "uuid-1"},
    {"consent_id": "uuid-2"},
    {"consent_id": "uuid-3"}
  ],
  "fiDataRangeFrom": "2024-01-01T00:00:00Z",
  "fiDataRangeTo": "2024-11-10T23:59:59Z"
}
Option B: File Upload (recommended for spreadsheet-based workflows)
POST /fi/requests/schedule/upload/v3
Content-Type: multipart/form-data

file: consents.csv
reference_id: daily-refresh-2024-11-10
fiDataRangeFrom: 2024-01-01T00:00:00Z
fiDataRangeTo: 2024-11-10T23:59:59Z
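Either endpoint can be called with any HTTP client. Below is a minimal sketch of the file-upload option using Node 18+ fetch and FormData; BASE_URL, API_KEY, and the bearer auth scheme are placeholders for illustration, not part of this API's documented contract.
const fs = require("fs");

const BASE_URL = "https://api.example.com"; // placeholder
const API_KEY = process.env.API_KEY;        // placeholder

async function uploadBatch(filePath, referenceId) {
  const form = new FormData();
  // Field names mirror the multipart request shown above
  form.append("file", new Blob([fs.readFileSync(filePath)]), "consents.csv");
  form.append("reference_id", referenceId);
  form.append("fiDataRangeFrom", "2024-01-01T00:00:00Z");
  form.append("fiDataRangeTo", "2024-11-10T23:59:59Z");

  const res = await fetch(`${BASE_URL}/fi/requests/schedule/upload/v3`, {
    method: "POST",
    headers: { Authorization: `Bearer ${API_KEY}` },
    body: form, // fetch sets the multipart boundary automatically
  });
  if (!res.ok) throw new Error(`Upload failed: HTTP ${res.status}`);
  return res.json();
}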

Step 2: Monitor Progress

Poll the status endpoint to track processing:
POST /bulk/status
{
  "reference_id": "daily-refresh-2024-11-10",
  "page": 1,
  "limit": 100,
  "include_balances": false,
  "filters": { "status": ["IN_PROGRESS"] }
}
Response shows progress:
{
  "data": {
    "batch_status": "IN_PROGRESS",
    "summary": {
      "total": 1000,
      "ready": 230,
      "in_progress": 20,
      "completed": 750,
      "failed": 20,
    }
  }
}
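The summary counts let you derive overall progress client-side. A small helper, assuming (as in this example) that ready, in_progress, completed, and failed add up to total:
function summarizeProgress(summary) {
  // A batch is finished once every record is either completed or failed
  const done = summary.completed + summary.failed;
  return {
    done,
    percent: Math.round((done / summary.total) * 100),
    finished: done === summary.total,
  };
}
// summarizeProgress({ total: 1000, ready: 230, in_progress: 20, completed: 730, failed: 20 })
// -> { done: 750, percent: 75, finished: false }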

Step 3: Retrieve Data

Once records complete, use the session_id from the status response to fetch data:
POST /fi/getdata
{
  "consentId": "uuid-1",
  "sessionId": "session-from-status-response"
}
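To drain a whole batch, iterate over the completed rows from the status response. The per-row shape assumed below (a records array with consent_id, session_id, and status fields) is an illustration — confirm the exact field names against the /bulk/status response reference.
async function fetchCompletedData(records) {
  const results = [];
  for (const row of records) {
    if (row.status !== "COMPLETED") continue; // skip rows that are not done yet
    const res = await fetch(`${BASE_URL}/fi/getdata`, {
      method: "POST",
      headers: {
        "Content-Type": "application/json",
        Authorization: `Bearer ${API_KEY}`, // placeholder auth, as above
      },
      body: JSON.stringify({ consentId: row.consent_id, sessionId: row.session_id }),
    });
    results.push(await res.json());
  }
  return results;
}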

Key Concepts

Reference ID

The reference_id is your unique identifier for tracking a batch. Best practices:
  • Use meaningful, descriptive names (e.g., portfolio-refresh-2024-11-10)
  • Include timestamps for scheduled batches
  • Keep it between 5-60 characters
  • Use only alphanumeric characters, hyphens, and underscores
  • Store it for later status queries
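A hypothetical helper that builds an ID following these rules (the naming pattern is just one convention):
function makeReferenceId(prefix) {
  const stamp = new Date().toISOString().slice(0, 10); // e.g. "2024-11-10"
  const id = `${prefix}-${stamp}`;
  // Enforce the documented constraints: 5-60 chars, alphanumerics, hyphens, underscores
  if (!/^[A-Za-z0-9_-]{5,60}$/.test(id)) {
    throw new Error(`Invalid reference_id: ${id}`);
  }
  return id;
}
// makeReferenceId("portfolio-refresh") -> "portfolio-refresh-2024-11-10"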

Batch Status

Batches progress through these statuses:
| Status | Description |
| --- | --- |
| READY | Batch accepted and ready for processing |
| IN_PROGRESS | Records are being processed |
| COMPLETED | All records processed successfully |
| FAILED | Batch-level failure |

Row Status

Individual records within a batch have their own status:
| Status | Description |
| --- | --- |
| READY | Record validated, ready for processing |
| IN_PROGRESS | FI request sent, awaiting response |
| COMPLETED | Data fetched successfully |
| FAILED | Processing failed |

Retry Eligibility

Failed records are classified by retry eligibility:
| Value | Meaning | Examples |
| --- | --- | --- |
| Yes | Can retry immediately | Network timeout, temporary AA unavailability |
| No | Cannot retry | Consent revoked, consent expired, quota exceeded |
| YesWithModifications | Can retry with changes | Invalid date range (needs adjustment) |
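When constructing a retry batch, filter failed rows on this value first. A sketch follows — the per-row field name retry_eligibility is an assumption; check the status response reference for the exact key.
function buildRetryBatch(failedRows, referenceId) {
  // Only rows marked retryable go into the new batch
  const retryable = failedRows.filter((row) => row.retry_eligibility === "Yes");
  return {
    reference_id: `${referenceId}-retry-1`, // new batches need a fresh reference_id
    consents_list: retryable.map((row) => ({ consent_id: row.consent_id })),
  };
}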

Configuration Options

Global vs Per-Row Configuration

You can set default values that apply to all records, then override them for specific records:
{
  "reference_id": "batch-001",
  "consents_list": [
    {
      "consent_id": "uuid-1",
      "fiDataRangeTo": "2024-06-30T23:59:59Z"  // Override
    },
    {
      "consent_id": "uuid-2"  // Uses global defaults
    }
  ],
  "fiDataRangeFrom": "2024-01-01T00:00:00Z",  // Global default
  "fiDataRangeTo": "2024-12-31T23:59:59Z",    // Global default
  "configId": "analytics-v1"                   // Global default
}

Date Range Configuration

| Parameter | Description |
| --- | --- |
| fiDataRangeFrom | Start date for financial data (ISO 8601) |
| fiDataRangeTo | End date for financial data (ISO 8601) |
If not specified, the system uses the date range from the consent artefact.
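For recurring jobs it is common to compute the window at submission time. An illustrative helper producing a rolling N-day range in ISO 8601 (the 30-day window is just an example):
function rollingRange(days) {
  const to = new Date();
  const from = new Date(to.getTime() - days * 24 * 60 * 60 * 1000);
  return {
    fiDataRangeFrom: from.toISOString(),
    fiDataRangeTo: to.toISOString(),
  };
}
// rollingRange(30) -> { fiDataRangeFrom: "2024-10-11T...", fiDataRangeTo: "2024-11-10T..." }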

Analytics Configuration

The configId parameter specifies which analytics configuration to apply when processing the data. Contact your FinPro administrator to get available configuration IDs.

Best Practices

Batch Size Optimization

  • Maximum: 10,000 records per batch
  • Recommended: 1,000-5,000 records for optimal performance
  • Rationale: Smaller batches complete faster and are easier to troubleshoot
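If a portfolio exceeds these sizes, split it client-side before scheduling, for example:
function chunkConsents(consentIds, size = 5000) {
  const batches = [];
  for (let i = 0; i < consentIds.length; i += size) {
    // Each chunk becomes a ready-made consents_list for one batch
    batches.push(consentIds.slice(i, i + size).map((id) => ({ consent_id: id })));
  }
  return batches;
}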

Polling Strategy

// Recommended polling intervals
const POLLING_INTERVALS = {
  small: 10000,   // < 100 records: every 10 seconds
  medium: 30000,  // 100-1000 records: every 30 seconds
  large: 60000    // > 1000 records: every 60 seconds
};

function getPollingInterval(totalRecords) {
  if (totalRecords < 100) return POLLING_INTERVALS.small;
  if (totalRecords < 1000) return POLLING_INTERVALS.medium;
  return POLLING_INTERVALS.large;
}
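Putting the interval function to work, here is a sketch of a poll loop that runs until the batch reaches a terminal state (the endpoint shape follows Step 2; BASE_URL and API_KEY are placeholders as above):
async function waitForBatch(referenceId, totalRecords) {
  const interval = getPollingInterval(totalRecords);
  for (;;) {
    const res = await fetch(`${BASE_URL}/bulk/status`, {
      method: "POST",
      headers: {
        "Content-Type": "application/json",
        Authorization: `Bearer ${API_KEY}`,
      },
      body: JSON.stringify({ reference_id: referenceId, page: 1, limit: 100 }),
    });
    const { data } = await res.json();
    if (data.batch_status === "COMPLETED" || data.batch_status === "FAILED") {
      return data; // terminal state reached
    }
    await new Promise((resolve) => setTimeout(resolve, interval));
  }
}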

Error Handling

  1. Check batch-level status first: If batch_status is FAILED, check for file parsing or validation errors.
  2. Review failed records: Use filters to retrieve only failed records:
    {
      "reference_id": "batch-001",
      "filters": { "status": ["FAILED"] }
    }
    
  3. Check retry eligibility: Before creating a retry batch, verify records are actually retryable.
  4. Log reference IDs: Always log the reference_id and batch_id for troubleshooting.

Scheduling Recommendations

  • Avoid peak hours: Schedule large batches during off-peak times
  • Stagger batches: Don’t submit multiple large batches simultaneously
  • Monitor completion: Set up alerts for batches that take longer than expected

Limits and Quotas

| Limit | Value |
| --- | --- |
| Maximum batch size | 10,000 records |
| Maximum file size (uploads) | 10 MB |
| Reference ID length | 5-60 characters |
| Pagination limit | 1,000 records per page |