
Commit 604977e

ankurdotb and claude authored
chore: add Claude Code project configuration (#408)
* chore: add Claude Code configuration with CLAUDE.md, hooks, skills, and agents

  Set up Claude Code project configuration including CLAUDE.md with architecture docs, auto-format and file protection hooks, a db-migrate skill for Drizzle workflow, and an api-documenter subagent.

* fix: correct Claude Code hooks schema to use nested hooks array with jq stdin parsing

  The hooks were using an invalid flat command format and a nonexistent $CLAUDE_FILE_PATH env var. Fixed to use the required nested hooks array structure and parse file_path from stdin JSON via jq.

* revert: remove unrelated Drizzle migration snapshot changes

Co-authored-by: Claude Opus 4.6 <noreply@anthropic.com>
1 parent abd241a commit 604977e

4 files changed

Lines changed: 107 additions & 0 deletions

File tree

.claude/agents/api-documenter.md

Lines changed: 22 additions & 0 deletions
@@ -0,0 +1,22 @@
---
name: api-documenter
description: Generate REST API documentation by tracing route handlers
---

# API Documenter

Analyze the API routes and handlers to generate endpoint documentation.

## Instructions

1. Read `src/index.ts` to identify all registered routes
2. Read each handler in `src/handlers/` to understand request parameters and response shapes
3. Read helpers in `src/helpers/` for business logic details where relevant
4. Read types in `src/types/` for data structures
5. For each endpoint, document:
   - HTTP method and path
   - URL parameters (if any)
   - Query parameters (if any)
   - Response format and shape
   - Example response values
6. Output the documentation in a clear format

.claude/settings.json

Lines changed: 26 additions & 0 deletions
@@ -0,0 +1,26 @@
{
	"hooks": {
		"PreToolUse": [
			{
				"matcher": "Edit|Write",
				"hooks": [
					{
						"type": "command",
						"command": "FILE=$(jq -r '.tool_input.file_path') && echo \"$FILE\" | grep -qE '(package-lock\\.json|src/database/migrations/)' && echo 'BLOCK: Do not edit generated files (lock files, migrations)' && exit 1 || exit 0"
					}
				]
			}
		],
		"PostToolUse": [
			{
				"matcher": "Edit|Write",
				"hooks": [
					{
						"type": "command",
						"command": "jq -r '.tool_input.file_path' | xargs npx prettier --write 2>/dev/null || true"
					}
				]
			}
		]
	}
}
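A quick way to sanity-check the PreToolUse guard is to pipe it the same stdin JSON shape Claude Code sends to hooks (an object carrying `tool_input.file_path`, which is what the command's `jq` filter reads). This is a local sketch, not part of the commit; it assumes `jq` is on `PATH`:

```shell
#!/bin/sh
# Local sketch (not part of this commit): run the guard command from
# settings.json against hand-made hook input. Requires jq.
guard='FILE=$(jq -r ".tool_input.file_path") && echo "$FILE" | grep -qE "(package-lock\.json|src/database/migrations/)" && echo "BLOCK: Do not edit generated files (lock files, migrations)" && exit 1 || exit 0'

echo '{"tool_input":{"file_path":"package-lock.json"}}' | sh -c "$guard"
echo "lock file -> exit $?"        # exit 1, so the edit is blocked

echo '{"tool_input":{"file_path":"src/index.ts"}}' | sh -c "$guard"
echo "source file -> exit $?"      # exit 0, so the edit proceeds
```

The `|| exit 0` tail matters: without it, a non-matching `grep -q` would make the whole chain exit non-zero and block every edit.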

.claude/skills/db-migrate/SKILL.md

Lines changed: 17 additions & 0 deletions
@@ -0,0 +1,17 @@
---
name: db-migrate
description: Generate and review Drizzle ORM database migrations after schema changes
disable-model-invocation: true
---

# Database Migration Workflow

Run the full Drizzle ORM migration workflow after schema changes.

## Steps

1. Review changes in `src/database/schema.ts` to understand what changed
2. Run `npm run db:generate` to generate a new migration SQL file
3. Read the newly generated SQL file in `src/database/migrations/` and review it for correctness
4. Report the migration SQL to the user for confirmation before proceeding
5. Only run `npm run db:migrate` if the user explicitly confirms

CLAUDE.md

Lines changed: 42 additions & 0 deletions
@@ -0,0 +1,42 @@
# CLAUDE.md

This file provides guidance to Claude Code (claude.ai/code) when working with code in this repository.

## Commands

- `npm run dev` — Local dev server (staging env, port 8787)
- `npm run build` — Dry-run deploy to generate dist
- `npm run format` — Prettier format all files
- `npm run lint` — ESLint check (config at `.github/linters/eslint.config.mjs`)
- `npm run lint:fix` — ESLint autofix
- `npm run db:generate` — Generate Drizzle migration from schema changes
- `npm run db:migrate` — Push schema to database
- `npm run db:seed` — Seed reference data (denoms, operation types)

## Architecture

Cloudflare Worker for the cheqd blockchain network. Two entry points in `src/index.ts`:

1. **`fetch`** — HTTP API via itty-router. Serves supply data, account balances, and identity analytics.
2. **`scheduled`** — Hourly cron trigger that updates cached circulating supply balances and syncs identity data from BigDipper GraphQL into PostgreSQL.

### Data flow

- **Supply/balance endpoints** — Handlers in `src/handlers/` call external APIs (`src/api/`) via helpers. The Cosmos SDK REST API (`REST_API`) provides account data; BigDipper GraphQL (`GRAPHQL_API`) provides total supply and identity transactions.
- **Circulating supply** — Watchlist addresses are stored in Cloudflare KV, grouped by `group_N:` prefix. The hourly cron processes one group per hour (24 groups = 24 hours), updating each address's cached balance breakdown. The circulating supply endpoint subtracts all watchlist balances from total supply.
- **Identity analytics sync** — `SyncService` in `src/helpers/identity.ts` incrementally syncs DID and resource transactions from BigDipper into PostgreSQL (via Hyperdrive). It tracks the last block height to avoid re-processing, with composite-key deduplication (txHash + operationType + entityId).
- **Analytics queries** — `src/handlers/analytics.ts` queries the PostgreSQL tables with filtering (date range, operation type, denom, feePayer, didId, success) and pagination. Supports CSV export.

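The one-group-per-hour scheme in the circulating supply bullet amounts to a small mapping from the cron's hour to a KV list prefix. A hedged sketch only: the real logic lives in the Worker's TypeScript (not shown in this commit), and the assumption that the group number equals the current UTC hour is mine, as is the `pick_group` name:

```shell
#!/bin/sh
# Illustrative only: map an hour stamp ("00".."23", as from `date -u +%H`)
# to a `group_N:` KV prefix. Which hour maps to which group is an assumption.
pick_group() {
	h=$1
	echo "group_${h#0}:"   # drop a leading zero so "09" -> "group_9:"
}

pick_group "$(date -u +%H)"   # prefix for the current cron run
```

With 24 groups and an hourly trigger, every watchlist address gets its cached balance refreshed once per day.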
### Database

PostgreSQL accessed through Cloudflare Hyperdrive. Schema in `src/database/schema.ts` mirrors tables for mainnet and testnet (e.g., `did_mainnet`/`did_testnet`, `resource_mainnet`/`resource_testnet`). Each network has its own enum types, denom lookup table, and operation types lookup table. The `TABLES` map in `src/helpers/identity.ts` selects the correct table set by network.

### Environment

All env vars and bindings are typed in `src/worker-types.d.ts` as the global `Env` interface. Key bindings: `HYPERDRIVE` (PostgreSQL connection pooler), `CIRCULATING_SUPPLY_WATCHLIST` (KV namespace). Secrets (`WEBHOOK_URL`) are set via `wrangler secret put`, not in config files.

## Conventions

- Prettier: tabs, single quotes, 120 char width, trailing commas (es5)
- Conventional commits — semantic-release automates versioning and changelogs
- All token amounts are converted from lowest denom (`ncheq`) to main denom (`CHEQ`) using `TOKEN_EXPONENT` (10^9)
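The denom conversion in the last bullet is plain division by `TOKEN_EXPONENT`. A minimal sketch, assuming only what the bullet states (10^9 `ncheq` per `CHEQ`); the real conversion lives in the Worker's TypeScript helpers, and `ncheq_to_cheq` is a made-up name for illustration:

```shell
#!/bin/sh
# Hypothetical helper: convert an ncheq amount to CHEQ by dividing by 10^9,
# per the TOKEN_EXPONENT convention above. Uses awk for the float math.
ncheq_to_cheq() {
	awk -v n="$1" 'BEGIN { printf "%.9f\n", n / 1e9 }'
}

ncheq_to_cheq 2500000000    # prints 2.500000000
```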

0 commit comments
