COMMAND LINE

Your data.
Your terminal.

Upload files, query datasets, stream logs. All from the command line. Single binary, instant startup, works everywhere.

TERMINAL
$ lite upload products.csv --name products
✓ Dataset created
  Name     products
  Rows     2,847
  Columns  id, name, category, price, rating
  API      https://api.liteio.dev/d/products

$ lite q products category=electronics --sort -price --limit 3
NAME            PRICE   RATING
Pro Headphones  299.00  4.9
USB-C Hub       79.99   4.7
Wireless Mouse  59.99   4.6
INSTALL

One line. Any platform.

Single binary, no runtime dependencies. Built in Go for instant startup.

CURL
curl -fsSL https://api.liteio.dev/install.sh | sh
macOS & Linux
HOMEBREW
brew install liteio/tap/lite
macOS & Linux
NPM
npm install -g @liteio/cli
Any platform with Node.js
GO
go install github.com/go-mizu/lite@latest
Any platform with Go
QUICKSTART

Three commands to a live API.

1

Upload

Drop any CSV, JSON, TSV, or Excel file. Schema detected automatically.

UPLOAD
$ lite upload sales.json --name sales
Uploading sales.json (89 KB)...
✓ Dataset created
  Name     sales
  Rows     12,450
  Columns  order_id (integer), customer (text), product (text), amount (number), date (date), region (text)
  API      https://api.liteio.dev/d/sales
2

Query

Filter, sort, search, paginate. All from the command line.

QUERY
$ lite q sales region=west amount:gt=500 --sort -amount --limit 5
CUSTOMER      PRODUCT      AMOUNT  DATE
Acme Corp     Enterprise   4,200   2026-03-20
DataTech Inc  Pro License  2,800   2026-03-19
CloudBase     Team Plan    1,500   2026-03-18
NetOps Ltd    Enterprise   1,200   2026-03-17
DevHub Co     Pro License  890     2026-03-15
Showing 5 of 1,847 results.
3

Share

Open in the browser, pipe to other tools, or give the URL to your AI.

SHARE
$ lite open sales
Opening https://api.liteio.dev/d/sales in browser...

$ lite export sales --filter region=west --format csv | head -3
order_id,customer,product,amount,date,region
8291,Acme Corp,Enterprise,4200,2026-03-20,west
8274,DataTech Inc,Pro License,2800,2026-03-19,west
COMMANDS

Everything you need. Nothing you don't.

Short aliases for speed. --json on every command for scripts.

DATA
upload <file> --name   Upload file, get API     (alias: up)
query <name>           Filter, sort, search     (alias: q)
schema <name>          Show column types
stats <name>           Column statistics
get <name> <row>       Get single row           (alias: cat)
export <name>          Export CSV/JSON/JSONL
MANAGE
ls                     List datasets
push <name> <file>     Replace dataset
append <name> <file>   Add rows
delete <name>          Delete dataset           (alias: rm)
open <name>            Open in browser
logs [dataset]         Stream access logs
ACCOUNT
login                  Authenticate
logout                 Clear credentials
whoami                 Show identity
config [key] [val]     Get/set config
completion <shell>     Shell completions
version                Print version
QUERY ENGINE

Nine operators. Any combination.

Filter by any column with typed operators. Chain multiple filters with AND. Sort by any field, ascending or descending. Full-text search across all text columns.

Positional shorthand: lite q products price:gt=100
Multi-sort: --sort -price,name sorts by multiple fields
Full-text search: --search "wireless headphones"
field=value        exact match
field:gt=100       greater than
field:gte=100      greater or equal
field:lt=50        less than
field:lte=50       less or equal
field:ne=books     not equal
field:like=wire%   pattern match
field:in=a,b,c     in set
field:null=true    is null
OUTPUT

Human. Machine. You choose.

Colored tables for humans. JSON for scripts. Quiet mode for pipes. Auto-detects when output is piped and switches to machine-friendly format.

--json: on every command, with field selection
--quiet: minimal output (IDs, counts only)
Auto-detect: no color or spinners when piped
TABLE (default)
$ lite ls
NAME      ROWS    COLS  CREATED
products  2,847   6     2h ago
sales     12,450  14    1d ago
users     891     4     3d ago
JSON (--json)
$ lite ls --json --fields name,rows
[{"name":"products","rows":2847},
 {"name":"sales","rows":12450},
 {"name":"users","rows":891}]
OBSERVABILITY

Watch your API live.

Stream access logs in real time. See who's querying your data, what they're filtering, and how fast it responds. Filter by status code or time range.

Real-time streaming: logs appear as requests come in
Filter by status: --status 4xx for errors only
Time range: --since 1h for recent history
LOGS
$ lite logs products
Streaming logs... (Ctrl+C to stop)
12:03:41 GET ?category=electronics&sort=-price 200 23ms
12:03:44 GET /schema 200 8ms
12:04:02 GET ?q=wireless 200 31ms
12:04:15 GET ?format=csv 200 89ms
12:04:28 GET ?page=2&limit=50 200 19ms
12:05:01 GET ?price:gt=1000 200 12ms
12:05:33 GET /stats 200 45ms
SCRIPTING

Pipes. Scripts. CI/CD.

Every command speaks JSON. Meaningful exit codes. Pipe-friendly. Capture dataset IDs, chain queries, integrate with jq, Python, or your CI pipeline.

Exit codes: 0 success, 1 error, 2 usage, 3 auth, 4 not found
LITE_TOKEN: env var for CI authentication
--quiet: just the ID or count, for capture
SCRIPT
#!/bin/bash
# Upload with a name
lite upload data.csv --name products

# Query by name, pipe to jq
lite query products --json | jq '.data | length'

# Upsert: update if exists, create if not
if lite schema products --quiet 2>/dev/null; then
  lite push products data.csv
else
  lite upload data.csv --name products
fi
GITHUB ACTIONS
- name: Upload dataset
  env:
    LITE_TOKEN: ${{ secrets.LITE_TOKEN }}
  run: |
    curl -fsSL api.liteio.dev/install.sh | sh
    lite upload ./data/products.csv --name products
CONFIGURATION

Zero config to start. Full control when you need it.

Auth Flow

Browser-based login. Token stored securely. Override with env var for CI.

1. lite login opens the browser
2. Authenticate in the browser
3. Token is stored in ~/.config/lite/
AUTH PRIORITY
1. --token flag
2. LITE_TOKEN env var
3. Stored credential
4. No auth (public only)

Config Files

Global config in ~/.config/lite/. Optional project config in lite.toml.

~/.config/lite/config.toml
format = "table"
endpoint = "https://api.liteio.dev"
default_limit = 25
no_color = false
lite.toml (project)
[aliases]
prod = "products"
ord = "orders"

# Use short aliases:
# lite query prod price:gt=100
# lite push ord ./data/orders.csv
ENVIRONMENT

Environment variables.

LITE_TOKEN      Auth token (overrides stored credential)
LITE_ENDPOINT   API endpoint (for self-hosted)
LITE_FORMAT     Default output format (table, json, csv)
LITE_NO_COLOR   Disable color output
NO_COLOR        Standard no-color convention
COMPLETIONS

Tab to complete.

Dynamic completions for dataset IDs, column names, config keys, and filter operators.

Bash
lite completion bash > /etc/bash_completion.d/lite
Zsh
lite completion zsh > "${fpath[1]}/_lite"
Fish
lite completion fish > ~/.config/fish/completions/lite.fish
ERRORS

Errors that help.

Every error tells you what went wrong, why, and how to fix it. Exit codes are meaningful for scripts. No stack traces, no guessing.

0 Success
1 General error
2 Usage error
3 Auth error
4 Not found
5 Rate limited
ERROR
$ lite query orders
Error: Dataset not found

No dataset named "orders".
Run lite ls to see your datasets.
ERROR
$ lite upload huge.csv
Error: File too large (24 MB)

Maximum file size is 10 MB.
Split the file or compress it.

Install in 5 seconds. Upload in 10.

One line to install. One command to turn any file into a live API.

curl -fsSL https://api.liteio.dev/install.sh | sh
# CLI

lite: upload files, query data, manage datasets from your terminal. Single binary. No runtime dependencies. Works without login for public datasets.

## Install
```bash
# macOS / Linux
curl -fsSL https://api.liteio.dev/install.sh | sh

# Homebrew
brew install liteio/tap/lite

# npm
npm install -g @liteio/cli

# Go
go install github.com/go-mizu/lite@latest
```
## Quick Start
```bash
# Upload a file (pick a name, it becomes your endpoint)
lite upload products.csv --name products

# Query by name with filters and sort
lite query products --filter category=electronics --sort -price

# Full-text search
lite query products --search "wireless headphones"

# Show schema
lite schema products

# Open in browser
lite open products
```
## Authentication
```bash
# Login (opens browser)
lite login

# Check identity
lite whoami

# Logout
lite logout
```
Auth token is stored in `~/.config/lite/credentials.json`. Override with the `LITE_TOKEN` env var or the `--token` flag.

Priority: `--token` flag > `LITE_TOKEN` env var > stored credential > no auth (public only).

## Commands

### Upload
```bash
lite upload <file> --name <name>
lite up <file> --name <name>      # alias
```
| Flag | Description |
| --- | --- |
| `--name <name>` | Choose a name for your dataset (becomes the API path) |
| `--private` | Require auth to read |
| `--schema <file>` | JSON column type overrides |
| `--delimiter <char>` | Force CSV delimiter |
| `--no-header` | First row is data, not headers |
| `--format csv\|tsv\|json\|jsonl\|xlsx` | Force format |
| `--json` | JSON output |
| `--quiet` | Print only dataset ID |
Supported formats: CSV, TSV, JSON, JSON Lines, Excel (.xlsx). Auto-detected from extension.

### List Datasets
```bash
lite ls
```
| Flag | Description |
| --- | --- |
| `--json` | JSON output |
| `--quiet` | Print only IDs |
| `--fields <cols>` | Select JSON fields |
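Because `--quiet` prints one ID per line, listing composes naturally with shell loops. A minimal sketch; the `each_dataset` helper name is ours, not part of the CLI:

```bash
# Run a command once per dataset, appending each ID from `lite ls --quiet`
each_dataset() {
  lite ls --quiet | while read -r id; do
    "$@" "$id"
  done
}

# e.g.: each_dataset lite schema
```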
### Query
```bash
lite query <name> [filters...]
lite q <name> [filters...]      # alias
```
| Flag | Description |
| --- | --- |
| `--filter <field:op=val>` | Filter (repeatable). Ops: `gt`, `gte`, `lt`, `lte`, `ne`, `like`, `in`, `null` |
| `--sort <fields>` | Sort; prefix `-` for descending. Comma-separated |
| `--search <query>` | Full-text search across text columns |
| `--limit <n>` | Max rows (default 25) |
| `--page <n>` | Page number |
| `--offset <n>` | Skip rows |
| `--columns <cols>` | Select columns |
| `--format table\|json\|csv\|jsonl` | Output format |
| `--json` | Alias for `--format json` |
| `--no-header` | Omit table header |
| `--wide` | Don't truncate columns |
Filter shorthand:
```bash
# Positional filters after dataset name
lite q products category=electronics price:gt=50 rating:gte=4
```
### Schema
```bash
lite schema <name>
```
Shows column names, types, nullability, and row count.

### Stats
```bash
lite stats <name>
```
Shows min, max, avg, null count, unique count, and top values per column.

### Get Row
```bash
lite get <name> <row_number>
lite cat <name> <row_number>    # alias
```
### Delete
```bash
lite delete <name>
lite rm <name>    # alias
```
Requires confirmation. Use `--yes` to skip in scripts.

### Push (Replace)
```bash
lite push <name> <file>
```
Replace entire dataset. Same ID, same URL.

### Append
```bash
lite append <name> <file>
```
Add rows to existing dataset.

### Export
```bash
lite export <name> [flags]
```
| Flag | Description |
| --- | --- |
| `--format csv\|json\|jsonl` | Export format (default: csv) |
| `-o <file>` | Output file (default: stdout) |
| `--filter`, `--sort`, `--search` | Apply before export |
### Logs
```bash
lite logs [dataset]
```
Stream access logs in real time.
| Flag | Description |
| --- | --- |
| `--since <duration>` | Show logs from (e.g., `1h`, `30m`) |
| `--status <code>` | Filter by status (e.g., `4xx`, `200`) |
| `--json` | JSON output |
| `--no-follow` | Don't stream; print and exit |
### Open
```bash
lite open <name>
```
Open dataset API URL in default browser.

## Output Formats

Default: human-readable tables with color. Auto-sized to terminal width.

`--json`: machine-readable JSON on every command. Supports `--fields` for field selection.

`--quiet`: minimal output (just IDs or counts). For scripting.

Pipe detection: when stdout is not a TTY, color and spinners are disabled automatically. Query output defaults to JSON when piped.

## Configuration

### Config File

`~/.config/lite/config.toml`:
```toml
format = "table"
endpoint = "https://api.liteio.dev"
default_limit = 25
no_color = false
editor = "vim"
```
### Commands
```bash
lite config                 # Show all
lite config <key>           # Get value
lite config <key> <value>   # Set value
lite config --reset         # Reset to defaults
```
### Project Config

Optional `lite.toml` in the project directory:
```toml
[aliases]
prod = "products"
ord = "orders"

# Use short aliases:
# lite query prod --filter price:gt=100
# lite push ord ./data/orders.csv
```
### Environment Variables
| Variable | Description |
| --- | --- |
| `LITE_TOKEN` | Auth token |
| `LITE_ENDPOINT` | API endpoint |
| `LITE_FORMAT` | Default output format |
| `LITE_NO_COLOR` | Disable color |
| `NO_COLOR` | Standard no-color convention |
## Shell Completions
```bash
# Bash
lite completion bash > /etc/bash_completion.d/lite

# Zsh
lite completion zsh > "${fpath[1]}/_lite"

# Fish
lite completion fish > ~/.config/fish/completions/lite.fish
```
Dynamic completions for dataset IDs, column names, and config keys.

## CI/CD
```yaml
# GitHub Actions
- name: Upload dataset
  env:
    LITE_TOKEN: ${{ secrets.LITE_TOKEN }}
  run: |
    curl -fsSL https://api.liteio.dev/install.sh | sh
    lite upload ./data/products.csv --name products
```
Scripting patterns:
```bash
# Upload with a name
lite upload data.csv --name products

# Query by name, pipe to jq
lite query products --json | jq '.data | length'

# Upsert: update if exists, create if not
if lite schema products --quiet 2>/dev/null; then
  lite push products data.csv
else
  lite upload data.csv --name products
fi

# Export for another tool
lite export products --filter status=active --format csv | python analyze.py
```
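To illustrate the JSON shape without hitting the API, here is a jq pipeline over the literal output shown for `lite ls --json --fields name,rows` earlier in this document (the JSON is copied from that example, not fetched live):

```bash
# Sample output of `lite ls --json --fields name,rows`, copied from above
json='[{"name":"products","rows":2847},{"name":"sales","rows":12450},{"name":"users","rows":891}]'

# Total rows across all datasets
printf '%s' "$json" | jq '[.[].rows] | add'

# Names of datasets with more than 1,000 rows
printf '%s' "$json" | jq -r '.[] | select(.rows > 1000) | .name'
```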
## Exit Codes
| Code | Meaning |
| --- | --- |
| 0 | Success |
| 1 | General error |
| 2 | Usage error (bad arguments) |
| 3 | Auth error |
| 4 | Not found |
| 5 | Rate limited |
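These codes make error handling in scripts a simple `case`. A sketch; the `try_query` wrapper name and its messages are ours, not part of the CLI:

```bash
# Branch on lite's documented exit codes
try_query() {
  lite query "$1" --json
  rc=$?
  case $rc in
    0) ;;
    3) echo "auth error: run 'lite login'" >&2 ;;
    4) echo "dataset '$1' not found: run 'lite ls'" >&2 ;;
    5) echo "rate limited: retry later" >&2 ;;
    *) echo "lite failed (exit $rc)" >&2 ;;
  esac
  return $rc
}
```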
## Links

- API Reference: Full HTTP API
- Source: CLI source code
- Releases: Download binaries