
The Pipeline Connector API enables large-scale data ingestion from external data pipeline platforms and custom connectors. It is designed for high-throughput batch processing of unstructured data.

Overview

The Pipeline Connector API provides:

  • Batch Upsert: Process up to 1,000 records per request
  • Flexible Schema: Accept free-form data structures from any source system
  • File Attachments: Support for file metadata and staged file references
  • Multi-Source Support: Works with multiple data pipeline platforms and custom connectors
  • Rate Limiting: 100 requests per minute per organization
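Because each upsert request accepts at most 1,000 records, a client has to split larger datasets into batches before sending them. A minimal sketch of that batching step (the `chunk_records` helper and record shape are illustrative, not part of the API):

```python
from typing import Dict, Iterator, List

MAX_BATCH_SIZE = 1000  # documented per-request record limit


def chunk_records(records: List[Dict], batch_size: int = MAX_BATCH_SIZE) -> Iterator[List[Dict]]:
    """Yield successive batches of at most `batch_size` records."""
    if batch_size < 1:
        raise ValueError("batch_size must be positive")
    for start in range(0, len(records), batch_size):
        yield records[start:start + batch_size]


# Example: 2,500 records become batches of 1000, 1000, and 500.
batches = list(chunk_records([{"id": i} for i in range(2500)]))
```

Each batch can then be submitted as one upsert request, keeping every request within the documented limit.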

Authentication

Pipeline endpoints require a dedicated Pipeline Connector API Key passed via the X-API-Key header.

curl https://api.aitronos.com/v1/connectors/pipeline/upsert \
  -H "X-API-Key: $PIPELINE_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{...}'
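The same call can be made from Python. A sketch using only the standard library, shown here building (but not sending) the request so the header wiring is visible; the `{"records": [...]}` payload shape is an assumption, since the body is elided in the curl example above:

```python
import json
import os
import urllib.request

API_URL = "https://api.aitronos.com/v1/connectors/pipeline/upsert"


def build_upsert_request(records: list, api_key: str) -> urllib.request.Request:
    """Build a POST request for the upsert endpoint with the required headers.

    NOTE: the payload shape {"records": [...]} is illustrative; consult the
    endpoint reference for the actual request schema.
    """
    body = json.dumps({"records": records}).encode("utf-8")
    return urllib.request.Request(
        API_URL,
        data=body,
        headers={
            "X-API-Key": api_key,            # dedicated Pipeline Connector API key
            "Content-Type": "application/json",
        },
        method="POST",
    )


req = build_upsert_request(
    [{"id": "rec-1", "title": "Example"}],
    os.environ.get("PIPELINE_API_KEY", "test-key"),
)
# Sending is then a single call: urllib.request.urlopen(req)
```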

Use Cases

  • Data Warehouse Integration: Sync data from Snowflake, BigQuery, or Redshift
  • SaaS Application Sync: Ingest data from ClickUp, Jira, Notion, etc.
  • Custom ETL Pipelines: Build custom data ingestion workflows
  • File Processing: Process documents, spreadsheets, and other files
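A custom ETL pipeline also has to respect the 100 requests/minute limit noted above. One way to stay under it client-side is a sliding-window throttle; a hypothetical sketch (the `RateLimiter` class is not part of the API):

```python
import time
from collections import deque


class RateLimiter:
    """Client-side throttle: allow at most `max_calls` within a sliding `period` (seconds)."""

    def __init__(self, max_calls: int = 100, period: float = 60.0):
        self.max_calls = max_calls
        self.period = period
        self._calls: deque = deque()  # timestamps of recent calls

    def acquire(self) -> float:
        """Block until a call is allowed; return the number of seconds slept."""
        now = time.monotonic()
        # Drop timestamps that have fallen out of the sliding window.
        while self._calls and now - self._calls[0] >= self.period:
            self._calls.popleft()
        slept = 0.0
        if len(self._calls) >= self.max_calls:
            slept = self.period - (now - self._calls[0])
            time.sleep(slept)
        self._calls.append(time.monotonic())
        return slept


# Call limiter.acquire() before each upsert request, e.g.:
#   limiter = RateLimiter(max_calls=100, period=60.0)
#   for batch in batches:
#       limiter.acquire()
#       send_batch(batch)  # hypothetical send function
```

Throttling before each request avoids 429-style rejections and keeps backpressure in the client rather than in retry loops.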

Endpoints