Lakehouse API
POST /query
Execute SQL queries and retrieve results in streaming JSONL format.
Request:
curl https://api.altertable.ai/query \
  -H "Authorization: Bearer $ALTERTABLE_BASIC_AUTH_TOKEN" \
  -H "Content-Type: application/json" \
  -d '{"statement": "SELECT event, COUNT(*) AS count FROM altertable.main.events GROUP BY event"}'
Request Body:
{"statement": "SELECT ... your SQL query"}
Response Format:
Responses use JSONL (JSON Lines) format for efficient streaming:
- First line: Query metadata (execution time, etc.)
- Second line: Column names and types
- Remaining lines: Result rows, one JSON object per line
Example Response:
{"statement": "SELECT ... your SQL query", "session_id": "1234567890", ...}
[{"name": "event", "type": "VARCHAR"}, {"name": "count", "type": "BIGINT"}]
["signup", 42]
["login", 123]
...
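Because each line is a standalone JSON document, the stream can be consumed line by line without buffering the whole response. A minimal Python sketch of that parsing (the sample body below is a hypothetical response, not output from the live API):

```python
import json

# Hypothetical JSONL response body: metadata, then columns, then rows.
body = """\
{"statement": "SELECT event, COUNT(*) AS count FROM altertable.main.events GROUP BY event", "session_id": "1234567890"}
[{"name": "event", "type": "VARCHAR"}, {"name": "count", "type": "BIGINT"}]
["signup", 42]
["login", 123]
"""

lines = body.splitlines()
metadata = json.loads(lines[0])                       # first line: query metadata
columns = [c["name"] for c in json.loads(lines[1])]   # second line: column names/types
rows = [dict(zip(columns, json.loads(line)))          # remaining lines: result rows
        for line in lines[2:]]
```

With a real HTTP client the same logic applies to the response iterated line by line, so rows can be processed as they arrive.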
POST /upload
Upload data files to create or update tables in your lakehouse. Supports CSV, JSON, and Parquet formats with multiple insertion modes.
Request:
curl "https://api.altertable.ai/upload?catalog=my_catalog&schema=public&table=users&format=csv&mode=create" \
  -H "Authorization: Bearer $ALTERTABLE_BASIC_AUTH_TOKEN" \
  -H "Content-Type: application/octet-stream" \
  --data-binary @data.csv
Query Parameters:
- catalog (required): Name of the catalog to upload data to
- schema (required): Name of the schema within the catalog
- table (required): Name of the table to create or insert into
- format (required): File format - csv, json, or parquet
- mode (required): Upload mode - create, append, upsert, or overwrite
- primary_key (optional): Primary key column name (required for upsert mode)
Request Body:
Binary file data in the specified format:
- CSV: Comma-separated values with header row
- JSON: JSON array of objects or JSONL (one JSON object per line)
- Parquet: Apache Parquet columnar format (most efficient)
Upload Modes:
- create: Create a new table with the uploaded data (fails if the table already exists)
- append: Append the uploaded data to an existing table (preserves existing data)
- upsert: Update existing rows and insert new ones based on the primary key (requires the primary_key parameter)
- overwrite: Drop the existing table and recreate it with the uploaded data (replaces all data)
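The four modes behave like the following local sketch, which models a table as a list of row dicts (the function and variable names are illustrative, not part of the API):

```python
def apply_mode(existing, incoming, mode, primary_key=None):
    """Return the table contents after an upload in the given mode.

    `existing` is the current table as a list of row dicts (None if the
    table does not exist); `incoming` is the uploaded data.
    """
    if mode == "create":
        if existing is not None:
            raise ValueError("table already exists")
        return list(incoming)
    if mode == "overwrite":
        return list(incoming)          # drop and recreate: all old rows replaced
    if mode == "append":
        return list(existing) + list(incoming)   # old rows preserved
    if mode == "upsert":
        if primary_key is None:
            raise ValueError("upsert requires primary_key")
        table = {row[primary_key]: row for row in existing}
        for row in incoming:
            table[row[primary_key]] = row   # update on key match, else insert
        return list(table.values())
    raise ValueError(f"unknown mode: {mode}")
```

For example, upserting `{"id": 1, "name": "Zed"}` into a table that already has a row with `id` 1 replaces that row, while a row with a new `id` is inserted.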
Response:
Returns 200 OK on successful upload. The endpoint accepts files up to 100 GB in size.
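For callers not using curl, the same request can be assembled with Python's standard library. This sketch only builds the request object (the helper name is illustrative); sending it with `urllib.request.urlopen` requires valid credentials:

```python
from urllib.parse import urlencode
from urllib.request import Request

def build_upload_request(token, data, *, catalog, schema, table,
                         format, mode, primary_key=None):
    """Assemble a POST /upload request; `data` is the raw file bytes."""
    params = {"catalog": catalog, "schema": schema, "table": table,
              "format": format, "mode": mode}
    if primary_key is not None:
        params["primary_key"] = primary_key   # only needed for upsert mode
    url = "https://api.altertable.ai/upload?" + urlencode(params)
    return Request(url, data=data, method="POST", headers={
        "Authorization": f"Bearer {token}",
        "Content-Type": "application/octet-stream",
    })
```

Given the 100 GB limit, large files are better streamed from disk (e.g. passing an open binary file object as `data`) than read fully into memory.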
Example: Upload CSV file
curl "https://api.altertable.ai/upload?catalog=my_catalog&schema=public&table=users&format=csv&mode=create" \
  -H "Authorization: Bearer $ALTERTABLE_BASIC_AUTH_TOKEN" \
  -H "Content-Type: application/octet-stream" \
  --data-binary @users.csv
Example: Upload JSON file
curl "https://api.altertable.ai/upload?catalog=my_catalog&schema=public&table=events&format=json&mode=append" \
  -H "Authorization: Bearer $ALTERTABLE_BASIC_AUTH_TOKEN" \
  -H "Content-Type: application/octet-stream" \
  --data-binary @events.json
Example: Upsert with primary key
curl "https://api.altertable.ai/upload?catalog=my_catalog&schema=public&table=users&format=parquet&mode=upsert&primary_key=id" \
  -H "Authorization: Bearer $ALTERTABLE_BASIC_AUTH_TOKEN" \
  -H "Content-Type: application/octet-stream" \
  --data-binary @updates.parquet