List all scraping jobs for the authenticated user.

## Headers

| Name | Type | Required | Description |
| --- | --- | --- | --- |
| `Authorization` | string | Yes | Bearer token authentication |

## Query Parameters

| Parameter | Type | Default | Description |
| --- | --- | --- | --- |
| `status` | string | null | Filter by status: "pending", "processing", "completed", "failed", "cancelled" |
| `limit` | integer | 50 | Number of jobs to return (1-100) |
| `offset` | integer | 0 | Pagination offset |

## Response

**Status**: `200 OK`

| Field | Type | Description |
| --- | --- | --- |
| `jobs` | array | Array of job objects |
| `total` | integer | Total number of jobs matching the filter |
| `limit` | integer | Limit used in the request |
| `offset` | integer | Offset used in the request |

### Job Object

Each job in the array contains:

| Field | Type | Description |
| --- | --- | --- |
| `job_id` | string | Unique job identifier |
| `status` | string | Job status |
| `url` | string | Target URL |
| `extracted_data` | array \| null | Extracted items (null if not completed) |
| `error_message` | string \| null | Error message if the job failed |
| `metadata` | object \| null | Processing metadata |
| `created_at` | string | Creation timestamp |
| `completed_at` | string \| null | Completion timestamp |

## List All Jobs

```bash
curl -X GET "https://api.aitronos.com/api/v1/scrape/jobs?limit=20&offset=0" \
  -H "X-API-Key: $FREDDY_API_KEY"
```

```python
import os

import requests

api_key = os.environ["FREDDY_API_KEY"]

response = requests.get(
    "https://api.aitronos.com/api/v1/scrape/jobs",
    headers={"X-API-Key": api_key},
    params={"limit": 20, "offset": 0}
)

data = response.json()
print(f"Total jobs: {data['total']}")
for job in data['jobs']:
    print(f"- {job['job_id']}: {job['status']}")
```

```javascript
const axios = require('axios');

const apiKey = process.env.FREDDY_API_KEY;

axios.get('https://api.aitronos.com/api/v1/scrape/jobs', {
  headers: { 'X-API-Key': apiKey },
  params: { limit: 20, offset: 0 }
})
  .then(response => {
    const data = response.data;
    console.log(`Total jobs: ${data.total}`);
    data.jobs.forEach(job => {
      console.log(`- ${job.job_id}: ${job.status}`);
    });
  });
```

**Response** `200 OK`

```json
{
  "jobs": [
    {
      "job_id": "job_abc123",
      "status": "completed",
      "url": "https://example.com/page1",
      "extracted_data": [...],
      "metadata": {...},
      "created_at": "2024-12-16T10:30:00Z",
      "completed_at": "2024-12-16T10:30:02Z"
    },
    {
      "job_id": "job_def456",
      "status": "failed",
      "url": "https://example.com/page2",
      "extracted_data": null,
      "error_message": "Site blocked by anti-bot measures",
      "metadata": {...},
      "created_at": "2024-12-16T10:31:00Z",
      "completed_at": "2024-12-16T10:31:05Z"
    }
  ],
  "total": 45,
  "limit": 20,
  "offset": 0
}
```

## Filter by Status

```bash
curl -X GET "https://api.aitronos.com/api/v1/scrape/jobs?status=completed&limit=20" \
  -H "X-API-Key: $FREDDY_API_KEY"
```

```python
import os

import requests

api_key = os.environ["FREDDY_API_KEY"]

response = requests.get(
    "https://api.aitronos.com/api/v1/scrape/jobs",
    headers={"X-API-Key": api_key},
    params={"status": "completed", "limit": 20}
)
```
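
The same status filter is handy for triaging failures. The sketch below is illustrative (not part of the reference above): it requests jobs with `status=failed` and prints each job's documented `error_message` field, assuming the same `FREDDY_API_KEY` environment variable and `X-API-Key` header used in the examples above.

```python
import os

import requests

api_key = os.environ["FREDDY_API_KEY"]

# Fetch recently failed jobs and print why each one failed.
response = requests.get(
    "https://api.aitronos.com/api/v1/scrape/jobs",
    headers={"X-API-Key": api_key},
    params={"status": "failed", "limit": 20},
)
response.raise_for_status()

for job in response.json()["jobs"]:
    print(f"- {job['job_id']}: {job['error_message']}")
```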
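
## Paginate Through All Jobs

Because `limit` caps each response at 100 jobs, retrieving the full list means walking the collection with `offset`. The following is a minimal pagination sketch built on the documented `jobs`, `total`, `limit`, and `offset` fields; the page size and stopping condition are illustrative choices, not requirements of the API.

```python
import os

import requests

api_key = os.environ["FREDDY_API_KEY"]
base_url = "https://api.aitronos.com/api/v1/scrape/jobs"

all_jobs = []
offset = 0
limit = 100  # maximum page size per the query-parameter table

while True:
    # Request the next page of jobs.
    response = requests.get(
        base_url,
        headers={"X-API-Key": api_key},
        params={"limit": limit, "offset": offset},
    )
    response.raise_for_status()
    data = response.json()

    all_jobs.extend(data["jobs"])
    offset += limit

    # Stop once we've requested as many jobs as the API reports in total.
    if offset >= data["total"]:
        break

print(f"Fetched {len(all_jobs)} jobs")
```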