Install
```bash
npx skills add https://github.com/vercel-labs/agent-skills --skill vercel-react-best-practices
```

Works with Paperclip
How Analyzing Data fits into a Paperclip company.
Analyzing Data drops into any Paperclip agent that handles this kind of work. Assign it to a specialist inside a pre-configured PaperclipOrg company and the skill becomes available on every heartbeat — no prompt engineering, no tool wiring.
SaaS Factory (paired pack): pre-configured AI company — 18 agents, 18 skills, one-time purchase ($27, was $59).

Source file
SKILL.md (107 lines)
---
name: analyzing-data
description: Queries the data warehouse and answers business questions about data. Handles questions requiring database/warehouse queries including "who uses X", "how many Y", "show me Z", "find customers", "what is the count", data lookups, metrics, trends, or SQL analysis.
---

# Data Analysis

Answer business questions by querying the data warehouse. The kernel auto-starts on the first `exec` call.

**All CLI commands below are relative to this skill's directory.** Before running any `scripts/cli.py` command, `cd` to the directory containing this file.

## Workflow

1. **Pattern lookup** — Check for a cached query strategy:
   ```bash
   uv run scripts/cli.py pattern lookup "<user's question>"
   ```
   If a pattern exists, follow its strategy. Record the outcome after executing:
   ```bash
   uv run scripts/cli.py pattern record <name> --success  # or --failure
   ```

2. **Concept lookup** — Find known table mappings:
   ```bash
   uv run scripts/cli.py concept lookup <concept>
   ```

3. **Table discovery** — If the cache misses, search the codebase (`Grep pattern="<concept>" glob="**/*.sql"`) or query `INFORMATION_SCHEMA`. See [reference/discovery-warehouse.md](reference/discovery-warehouse.md).

4. **Execute query**:
   ```bash
   uv run scripts/cli.py exec "df = run_sql('SELECT ...')"
   uv run scripts/cli.py exec "print(df)"
   ```

5. **Cache learnings** — Always cache before presenting results:
   ```bash
   # Cache concept → table mapping
   uv run scripts/cli.py concept learn <concept> <TABLE> -k <KEY_COL>
   # Cache query strategy (if discovery was needed)
   uv run scripts/cli.py pattern learn <name> -q "question" -s "step" -t "TABLE" -g "gotcha"
   ```

6. **Present findings** to the user.

## Kernel Functions

| Function | Returns |
|----------|---------|
| `run_sql(query, limit=100)` | Polars DataFrame |
| `run_sql_pandas(query, limit=100)` | Pandas DataFrame |

`pl` (Polars) and `pd` (Pandas) are pre-imported.
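To make the learn/lookup/record loop in steps 1 and 5 concrete, here is a minimal in-memory sketch of a pattern cache. This is a hypothetical model for illustration only, not the actual `scripts/cli.py` implementation; the field names mirror the `-q`/`-s`/`-t`/`-g` flags above, and the keyword-overlap matching is an assumption.

```python
from __future__ import annotations
from dataclasses import dataclass


@dataclass
class Pattern:
    """A cached query strategy, as recorded by `pattern learn`."""
    name: str
    question: str   # -q: the kind of question this answers
    strategy: str   # -s: the steps to follow
    table: str      # -t: the warehouse table involved
    gotcha: str     # -g: a pitfall found during discovery
    successes: int = 0
    failures: int = 0


class PatternCache:
    """In-memory model of the pattern-cache workflow (illustrative only)."""

    def __init__(self) -> None:
        self._patterns: dict[str, Pattern] = {}

    def learn(self, name: str, question: str, strategy: str,
              table: str, gotcha: str = "") -> None:
        self._patterns[name] = Pattern(name, question, strategy, table, gotcha)

    def lookup(self, user_question: str) -> Pattern | None:
        # Naive keyword overlap; the real CLI presumably matches more robustly.
        words = set(user_question.lower().split())
        best, score = None, 0
        for p in self._patterns.values():
            overlap = len(words & set(p.question.lower().split()))
            if overlap > score:
                best, score = p, overlap
        return best

    def record(self, name: str, success: bool) -> None:
        # Track outcomes so unreliable strategies can be spotted and deleted.
        p = self._patterns[name]
        if success:
            p.successes += 1
        else:
            p.failures += 1


cache = PatternCache()
cache.learn("active-users", "how many active users",
            "query USERS filtered by status", "ANALYTICS.USERS",
            gotcha="status column is lowercase")
hit = cache.lookup("how many users are active this month")
cache.record("active-users", success=True)
```

The success/failure counters are what make step 5 worthwhile: strategies that keep failing can be pruned with `pattern delete` rather than being retried forever.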
## CLI Reference

### Kernel

```bash
uv run scripts/cli.py warehouse list   # List warehouses
uv run scripts/cli.py start [-w name]  # Start kernel (with optional warehouse)
uv run scripts/cli.py exec "..."       # Execute Python code
uv run scripts/cli.py status           # Kernel status
uv run scripts/cli.py restart          # Restart kernel
uv run scripts/cli.py stop             # Stop kernel
uv run scripts/cli.py install <pkg>    # Install package
```

### Concept Cache

```bash
uv run scripts/cli.py concept lookup <name>                      # Look up
uv run scripts/cli.py concept learn <name> <TABLE> -k <KEY_COL>  # Learn
uv run scripts/cli.py concept list                               # List all
uv run scripts/cli.py concept import -p /path/to/warehouse.md    # Bulk import
```

### Pattern Cache

```bash
uv run scripts/cli.py pattern lookup "question"                                      # Look up
uv run scripts/cli.py pattern learn <name> -q "..." -s "..." -t "TABLE" -g "gotcha"  # Learn
uv run scripts/cli.py pattern record <name> --success                                # Record outcome
uv run scripts/cli.py pattern list                                                   # List all
uv run scripts/cli.py pattern delete <name>                                          # Delete
```

### Table Schema Cache

```bash
uv run scripts/cli.py table lookup <TABLE>            # Look up schema
uv run scripts/cli.py table cache <TABLE> -c '[...]'  # Cache schema
uv run scripts/cli.py table list                      # List cached
uv run scripts/cli.py table delete <TABLE>            # Delete
```

### Cache Management

```bash
uv run scripts/cli.py cache status                # Stats
uv run scripts/cli.py cache clear [--stale-only]  # Clear
```

## References

- [reference/discovery-warehouse.md](reference/discovery-warehouse.md) — Large table handling, warehouse exploration, INFORMATION_SCHEMA queries
- [reference/common-patterns.md](reference/common-patterns.md) — SQL templates for trends, comparisons, top-N, distributions, cohorts
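As an illustration of the kind of top-N template that reference/common-patterns.md covers, here is a hedged sketch run against an in-memory SQLite table rather than the real warehouse. The `orders` table and its contents are invented for the example; in practice the same SQL shape would be passed to `run_sql(...)` via `exec`.

```python
import sqlite3

# Hypothetical orders table standing in for a warehouse table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (customer TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?)",
    [("acme", 120.0), ("acme", 80.0), ("globex", 300.0), ("initech", 50.0)],
)

# Top-N template: aggregate the metric, order by it descending, limit to N.
top_n = conn.execute(
    """
    SELECT customer, SUM(amount) AS total
    FROM orders
    GROUP BY customer
    ORDER BY total DESC
    LIMIT 2
    """
).fetchall()
print(top_n)  # [('globex', 300.0), ('acme', 200.0)]
```

The same skeleton (aggregate, order, limit) adapts to trends and distributions by swapping the `GROUP BY` key for a date bucket or a category column.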