# Connectors - Data Pipeline

The Data Pipeline API enables large-scale data ingestion from external data pipeline platforms such as Airbyte, Fivetran, and custom connectors. It is designed for high-throughput batch processing of unstructured data.

## Overview

The Pipeline Connector API provides:

- **Batch Upsert**: Process up to 1,000 records per request
- **Flexible Schema**: Accept free-form data structures from any source system
- **File Attachments**: Support for file metadata and staged file references
- **Multi-Source Support**: Works with Airbyte, Fivetran, and custom pipeline platforms
- **Rate Limiting**: 100 requests per minute per organization

## Authentication

Pipeline endpoints require a dedicated **Pipeline Connector API Key** passed via the `X-API-Key` header.

```bash
curl https://api.aitronos.com/v1/connectors/pipeline/upsert \
  -H "X-API-Key: $PIPELINE_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{...}'
```

## Use Cases

- **Data Warehouse Integration**: Sync data from Snowflake, BigQuery, or Redshift
- **SaaS Application Sync**: Ingest data from ClickUp, Jira, Notion, etc.
- **Custom ETL Pipelines**: Build custom data ingestion workflows
- **File Processing**: Process documents, spreadsheets, and other files

## Endpoints

- [POST /v1/connectors/pipeline/upsert](/docs/api-reference/connectors/upsert) - Batch upsert data records
- [GET /v1/connectors/pipeline/health](/docs/api-reference/connectors/health) - Health check endpoint
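The two documented limits (1,000 records per upsert request, 100 requests per minute per organization) can be respected with client-side batching and pacing. The sketch below assumes this; the `chunk_records` helper and the `{"records": [...]}` payload shape are illustrative assumptions, not part of the documented API, so check the upsert reference for the actual request body.

```python
import json
import time
import urllib.request

UPSERT_URL = "https://api.aitronos.com/v1/connectors/pipeline/upsert"
MAX_BATCH_SIZE = 1000        # documented per-request record limit
MAX_REQUESTS_PER_MIN = 100   # documented per-organization rate limit


def chunk_records(records, size=MAX_BATCH_SIZE):
    """Split an iterable of records into batches no larger than `size`."""
    batch = []
    for record in records:
        batch.append(record)
        if len(batch) == size:
            yield batch
            batch = []
    if batch:
        yield batch


def upsert_all(records, api_key):
    """Send each batch, pacing requests to stay under the rate limit."""
    min_interval = 60.0 / MAX_REQUESTS_PER_MIN  # 0.6 s between requests
    for batch in chunk_records(records):
        req = urllib.request.Request(
            UPSERT_URL,
            # Payload shape is an assumption for illustration only.
            data=json.dumps({"records": batch}).encode(),
            headers={
                "X-API-Key": api_key,
                "Content-Type": "application/json",
            },
        )
        urllib.request.urlopen(req)  # error handling/retries omitted
        time.sleep(min_interval)
```

In practice a production client would also retry on transient failures and back off when the rate limit is exceeded; the fixed sleep here is the simplest way to stay under 100 requests per minute.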