Upload files, query datasets, stream logs. All from the command line.
Single binary, no runtime dependencies. Built in Go for instant startup. Works everywhere.
```shell
curl -fsSL https://api.liteio.dev/install.sh | sh
brew install liteio/tap/lite
npm install -g @liteio/cli
go install github.com/go-mizu/lite@latest
```

Drop any CSV, JSON, TSV, or Excel file. Schema detected automatically.
Filter, sort, search, paginate. All from the command line.
Open in the browser, pipe to other tools, or give the URL to your AI.
Short aliases for speed. --json on every command for scripts.
| Command | Alias | Description |
|---|---|---|
| `upload <file> --name` | `up` | Upload file, get API |
| `query <name>` | `q` | Filter, sort, search |
| `schema <name>` | | Show column types |
| `stats <name>` | | Column statistics |
| `get <name> <row>` | `cat` | Get single row |
| `export <name>` | | Export CSV/JSON/JSONL |
| `ls` | | List datasets |
| `push <name> <file>` | | Replace dataset |
| `append <name> <file>` | | Add rows |
| `delete <name>` | `rm` | Delete dataset |
| `open <name>` | | Open in browser |
| `logs [dataset]` | | Stream access logs |
| `login` | | Authenticate |
| `logout` | | Clear credentials |
| `whoami` | | Show identity |
| `config [key] [val]` | | Get/set config |
| `completion <shell>` | | Shell completions |
| `version` | | Print version |

Filter by any column with typed operators. Chain multiple filters with AND. Sort by any field, ascending or descending. Full-text search across all text columns.
```shell
lite q products price:gt=100
```

`--sort -price,name` for multi-field sorting. `--search "wireless headphones"` for full-text search.

| Filter | Meaning |
|---|---|
| `field=value` | exact match |
| `field:gt=100` | greater than |
| `field:gte=100` | greater or equal |
| `field:lt=50` | less than |
| `field:lte=50` | less or equal |
| `field:ne=books` | not equal |
| `field:like=wire%` | pattern match |
| `field:in=a,b,c` | in set |
| `field:null=true` | is null |

Colored tables for humans. JSON for scripts. Quiet mode for pipes. Auto-detects when output is piped and switches to a machine-friendly format.
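Putting the operators together: filters chain with AND, sorts take multiple fields, and everything composes on one command line. A hedged sketch — the dataset name `products` and the columns `price`, `category`, and `name` are illustrative:

```shell
# Products between 50 and 200, excluding books, priciest first.
lite q products price:gte=50 price:lt=200 category:ne=books \
  --sort -price,name --limit 10
```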
Stream access logs in real time. See who's querying your data, what they're filtering, and how fast it responds. Filter by status code or time range.
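For instance, a hedged sketch (the dataset name `orders` is illustrative):

```shell
# Stream only recent error responses for one dataset.
lite logs orders --since 1h --status 4xx
```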
`--status 4xx` for errors only. `--since 1h` for recent history.

Every command speaks JSON. Meaningful exit codes. Pipe-friendly. Capture dataset IDs, chain queries, integrate with jq, Python, or your CI pipeline.
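A sketch of such a pipeline — the file name, dataset name, and column names (`region`, `revenue`) are illustrative, and it assumes `lite` is installed and authenticated:

```shell
# Upload and capture only the dataset ID (per --quiet), then query it
# as JSON and pull one field out with jq.
id=$(lite upload sales.csv --name sales --quiet)
lite q "$id" region=EU --sort -revenue --json | jq '.[0].revenue'
```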
Browser-based login. Token stored securely. Override with env var for CI.
`lite login` opens the browser. Token stored in `~/.config/lite/`.

`--token` flag. `LITE_TOKEN` env var.

Global config in `~/.config/lite/`. Optional project config in `lite.toml`.
| Variable | Description |
|---|---|
| `LITE_TOKEN` | Auth token (overrides stored credential) |
| `LITE_ENDPOINT` | API endpoint (for self-hosted) |
| `LITE_FORMAT` | Default output format (table, json, csv) |
| `LITE_NO_COLOR` | Disable color output |
| `NO_COLOR` | Standard no-color convention |
Dynamic completions for dataset IDs, column names, config keys, and filter operators.
lite completion bash > /etc/bash_completion.d/lite
lite completion zsh > "${fpath[1]}/_lite"
lite completion fish > ~/.config/fish/completions/lite.fish
Every error tells you what went wrong, why, and how to fix it. Exit codes are meaningful for scripts. No stack traces, no guessing.
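Because exit codes are meaningful, scripts can branch on them. A sketch — the dataset name is illustrative, and the source only promises that failures exit nonzero:

```shell
# Fetch one row; on failure, report the exit code instead of guessing.
lite get orders 42 --json > row.json
rc=$?
if [ "$rc" -ne 0 ]; then
  echo "lookup failed (exit $rc)" >&2
fi
```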
One line to install. One command to turn any file into a live API.
```shell
curl -fsSL https://api.liteio.dev/install.sh | sh
```

lite: upload files, query data, manage datasets from your terminal.
Single binary. No runtime dependencies. Works without login for public datasets.
## Install
## Authentication

Token stored in `~/.config/lite/credentials.json`. Override with the `LITE_TOKEN` env var or the `--token` flag.

Priority: `--token` flag > `LITE_TOKEN` env var > stored credential > no auth (public only).
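In CI, this priority order means an exported token wins over any stored credential, so no browser login is needed. A sketch — the secret variable name `CI_LITE_TOKEN` and the dataset/file names are assumptions:

```shell
# Non-interactive auth for CI pipelines.
export LITE_TOKEN="$CI_LITE_TOKEN"
lite whoami
lite push sales latest.csv --json
```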
## Commands
### Upload
| Flag | Description |
|---|---|
| `--name <name>` | Choose a name for your dataset (becomes the API path) |
| `--private` | Require auth to read |
| `--schema <file>` | JSON column type overrides |
| `--delimiter <char>` | Force CSV delimiter |
| `--no-header` | First row is data, not headers |
| `--json` | JSON output |
| `--quiet` | Print only dataset ID |

### List

| Flag | Description |
|---|---|
| `--json` | JSON output |
| `--quiet` | Print only IDs |
| `--fields <cols>` | Select JSON fields |

### Query

| Flag | Description |
|---|---|
| `--filter <field:op=val>` | Filter (repeatable). Ops: gt, gte, lt, lte, ne, like, in, null |
| `--sort <fields>` | Sort, prefix `-` for desc. Comma-separated |
| `--search <query>` | Full-text search across text columns |
| `--limit <n>` | Max rows (default 25) |
| `--page <n>` | Page number |
| `--offset <n>` | Skip rows |
| `--columns <cols>` | Select columns |
| `--json` | Alias for `--format json` |
| `--no-header` | Omit table header |
| `--wide` | Don't truncate columns |

### Delete

`--yes` to skip the confirmation prompt in scripts.
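Putting the flags above together into a small workflow (file, dataset, and column names are illustrative):

```shell
# Upload privately, capture the ID, query, then delete without a prompt.
id=$(lite upload orders.csv --name orders --private --quiet)
lite q orders status=shipped --limit 5 --wide
lite delete orders --yes
```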
### Push (Replace)
### Export

| Flag | Description |
|---|---|
| `-o <file>` | Output file (default: stdout) |
| `--filter, --sort, --search` | Apply before export |

### Logs

| Flag | Description |
|---|---|
| `--since <duration>` | Show logs from (e.g., 1h, 30m) |
| `--status <code>` | Filter by status (e.g., 4xx, 200) |
| `--json` | JSON output |
| `--no-follow` | Don't stream, print and exit |

### Output

`--json`: Machine-readable JSON on every command. Supports `--fields` for field selection.
`--quiet`: Minimal output (just IDs or counts). For scripting.
Pipe detection: When stdout is not a TTY, color and spinners are disabled automatically. Query output defaults to JSON when piped.
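This pairs naturally with jq. The sample JSON below is illustrative, not real lite output — it only mimics the array-of-rows shape a piped query would emit:

```shell
# Select rows from a JSON array of row objects, as a piped query returns.
echo '[{"name":"widget","price":120},{"name":"cable","price":15}]' \
  | jq -r '.[] | select(.price > 100) | .name'
```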
## Configuration
### Config File
`~/.config/lite/config.toml`
lite.toml in project directory:
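A minimal sketch of what such a file might contain, assuming the keys mirror the documented environment variables (`endpoint` ↔ `LITE_ENDPOINT`, `format` ↔ `LITE_FORMAT`; both key names are assumptions):

```toml
# Hypothetical lite.toml — key names are assumptions, not documented.
endpoint = "https://api.liteio.dev"
format = "json"
```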
### Environment Variables

| Variable | Description |
|---|---|
| `LITE_TOKEN` | Auth token |
| `LITE_ENDPOINT` | API endpoint |
| `LITE_FORMAT` | Default output format |
| `LITE_NO_COLOR` | Disable color |
| `NO_COLOR` | Standard no-color convention |