
feat: transform ChittyProof to dual-export library + service#2

Open
chitcommit wants to merge 12 commits into main from feature/service-transform

Conversation


@chitcommit chitcommit commented Mar 2, 2026

Summary

Transforms ChittyProof from a library-only package into a dual-export project: an SDK barrel export (src/index.js) for in-process consumers plus a Hono Worker (src/worker.js) deployed at proof.chitty.cc for HTTP consumers.

  • Auth middleware: Shared-secret Bearer token (CHITTY_AUTH_SERVICE_TOKEN)
  • 6 HTTP endpoints: /health, /api/v1/status, /api/v1/verify, /api/v1/canonicalize, /api/v1/hash, /api/v1/validate
  • 32/32 tests passing (10 original library + 22 new route/middleware/integration)
  • Compliance triad updated: CHARTER.md, CHITTY.md, CLAUDE.md all consistent
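For HTTP consumers, the service is called with the shared-secret Bearer token. The sketch below only builds the request (it does not send it); the endpoint path and the `{ bundle, ecdsa }` payload shape come from this PR, while `SERVICE_TOKEN` and the example bundle are placeholders.

```javascript
// Hypothetical client sketch: SERVICE_TOKEN stands in for the value of
// CHITTY_AUTH_SERVICE_TOKEN; the bundle contents are illustrative only.
const SERVICE_TOKEN = "example-token";

function buildVerifyRequest(bundle) {
  return new Request("https://proof.chitty.cc/api/v1/verify", {
    method: "POST",
    headers: {
      "Authorization": `Bearer ${SERVICE_TOKEN}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({ bundle, ecdsa: false }),
  });
}

const req = buildVerifyRequest({ fact: { status: "sealed" } });
console.log(req.method, req.headers.get("authorization"));
```

In-process consumers skip HTTP entirely and import the same functions from the SDK barrel instead.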

Changes

  • package.json — added hono, wrangler, scripts (dev, deploy), main entry
  • wrangler.toml — Worker config with KV binding, tail consumer, route
  • src/index.js — SDK barrel re-exporting 13 functions from both lib modules
  • src/worker.js — Hono app wiring routes + auth middleware
  • src/middleware/auth.js — Bearer token middleware
  • src/routes/ — health, verify, canonicalize, hash, validate handlers
  • tests/ — 22 new tests (middleware, routes, integration)
  • Docs — CHARTER.md, CHITTY.md, CLAUDE.md updated for dual-export

Pre-deployment steps

  • Create KV namespace: wrangler kv namespace create PROOF_KEY_CACHE
  • Set secret: wrangler secret put CHITTY_AUTH_SERVICE_TOKEN
  • Optional: wrangler secret put CHITTYCERT_JWKS_URL (fallback for cert.chitty.cc)
  • Register service at register.chitty.cc
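Putting the steps above together, the Worker config presumably looks roughly like this (a sketch inferred from the review findings, not the file itself — `name` and `main` are assumptions, and `TBD` must be replaced with the namespace id returned by the KV create command):

```toml
# Hypothetical wrangler.toml shape; [[routes]] (array-of-tables) and the
# PROOF_KEY_CACHE binding come from the review comments below.
name = "chittyproof"
main = "src/worker.js"

[[routes]]
pattern = "proof.chitty.cc/*"
zone_name = "chitty.cc"

[[kv_namespaces]]
binding = "PROOF_KEY_CACHE"
id = "TBD"  # replace after `wrangler kv namespace create PROOF_KEY_CACHE`
```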

Test plan

  • All 32 tests pass (npm test)
  • 10 original library tests unchanged (no regressions)
  • Auth middleware: no header → 401, wrong token → 401, valid → pass-through
  • Each route: valid input → 200, invalid input → 400
  • Worker integration: 404 fallback, error handler → 400
  • CI pipeline validates on push
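The three auth cases in the test plan can be checked in isolation with a simplified stand-in for the middleware's decision logic (the real implementation is in src/middleware/auth.js and runs inside Hono):

```javascript
// Simplified sketch of the auth decision only — header parsing and
// token comparison — returning 401 on rejection, null on pass-through.
function authStatus(authorizationHeader, expected) {
  const token =
    authorizationHeader && authorizationHeader.startsWith("Bearer ")
      ? authorizationHeader.slice(7)
      : null;
  if (!token || token !== expected) return 401; // reject
  return null; // pass through to the route handler
}

console.log(
  authStatus(undefined, "secret"),      // no header  -> 401
  authStatus("Bearer wrong", "secret"), // wrong token -> 401
  authStatus("Bearer secret", "secret") // valid token -> null (pass)
);
```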

🤖 Generated with Claude Code

Summary by CodeRabbit

  • New Features

    • ChittyProof available as both an installable SDK and a hosted HTTP service at proof.chitty.cc
    • Added API endpoints for verify, canonicalize, hash, validate, plus health/status probes
    • Bearer token authentication protects authenticated endpoints
  • Tests

    • Expanded test coverage to 32 passing tests
  • Chores

    • Added Cloudflare Workers deployment configuration and CI workflow

chitcommit and others added 11 commits March 1, 2026 21:13
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
CHARTER.md: artifact type, HTTP endpoints table, expanded dependencies
CHITTY.md: architecture, stack, endpoints, expanded consumers
CLAUDE.md: full project structure, dual-export architecture, new commands

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>


coderabbitai bot commented Mar 2, 2026


📥 Commits

Reviewing files that changed from the base of the PR and between 2c8e2e9 and ce6d9fd.

📒 Files selected for processing (1)
  • .github/workflows/ci.yml

📝 Walkthrough

ChittyProof transforms from a library-only project to a dual-export model with both SDK imports and HTTP service endpoints at proof.chitty.cc. The changes introduce Hono-based HTTP routes, Bearer token authentication middleware, comprehensive test coverage, and Cloudflare Worker configuration.

Changes

| Cohort / File(s) | Summary |
|------------------|---------|
| **Documentation & Charter Updates**<br>`CHARTER.md`, `CHITTY.md`, `CLAUDE.md` | Updated project scope from library-only to library + service dual-export model. Added HTTP endpoints section (health, status, verify, canonicalize, hash, validate), new dependencies (ChittyAuth, Hono, ChittyTrack, ChittyBeacon), and consumer updates. Expanded test coverage reference from 10/10 to 32/32 vitest. |
| **Project Configuration**<br>`package.json`, `wrangler.toml` | Added main entry point to src/worker.js, dev and deploy scripts, and new dependencies (hono ^4.12.3, wrangler ^4.69.0). Added Cloudflare Worker configuration with KV namespace binding, environment variables, and route pattern for proof.chitty.cc. |
| **SDK Public Exports**<br>`src/index.js` | New module exporting the public API barrel with 13 re-exported functions from library modules (canonicalization, verification, encoding utilities). |
| **HTTP Service Layer**<br>`src/worker.js`, `src/middleware/auth.js`, `src/routes/*` | Introduced Hono HTTP application with global error handling and route registration. Added Bearer token authentication middleware with 401 error handling. Implemented 6 route modules: health (GET endpoints), verify (hash integrity + optional ECDSA), canonicalize, hash, and validate (schema validation). All routes require authentication except health. |
| **Comprehensive Test Coverage**<br>`tests/middleware/auth.test.js`, `tests/routes/*.test.js`, `tests/worker.test.js` | Added 9 new test files covering authentication middleware, all HTTP routes, and worker integration. Tests validate request/response handling, error cases, authentication enforcement, and proper routing. |

Estimated code review effort

🎯 4 (Complex) | ⏱️ ~60 minutes

Poem

🐰 A hop, skip, and bound through the HTTP!
From library alone to service with a grip,
With Bearer tokens guarding each cryptic call,
And health checks dancing in the hallway's hall,
ChittyProof now serves all, endpoints and all! 🔐

🚥 Pre-merge checks | ✅ 2 | ❌ 1

❌ Failed checks (1 warning)

| Check name | Status | Explanation | Resolution |
|------------|--------|-------------|------------|
| Docstring Coverage | ⚠️ Warning | Docstring coverage is 7.14%, below the required threshold of 80.00%. | Write docstrings for the functions missing them to satisfy the coverage threshold. |

✅ Passed checks (2 passed)

| Check name | Status | Explanation |
|------------|--------|-------------|
| Description Check | ✅ Passed | Check skipped because CodeRabbit's high-level summary is enabled. |
| Title check | ✅ Passed | The title 'feat: transform ChittyProof to dual-export library + service' accurately and concisely summarizes the primary change: converting the project from a library-only package into a dual-export system supporting both SDK imports and HTTP service endpoints. |




@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 6

Caution

Some comments are outside the diff and can’t be posted inline due to platform limitations.

⚠️ Outside diff range comments (1)
package.json (1)

6-10: ⚠️ Potential issue | 🟠 Major

Export surface is misaligned with the dual-export contract.

Line 6 points main to src/worker.js, and Line 9 maps the package root import to src/lib/chittyproof-v2-canonical.js. This bypasses src/index.js, so the documented SDK barrel isn't actually exposed from @chitty/chittyproof. Additionally, there is no ./worker export defined.

Suggested manifest fix
-  "main": "src/worker.js",
+  "main": "src/index.js",
   "exports": {
     ".": {
-      "import": "./src/lib/chittyproof-v2-canonical.js"
+      "import": "./src/index.js"
     },
+    "./worker": {
+      "import": "./src/worker.js"
+    },
     "./verify-ecdsa": {
       "import": "./src/lib/chittyproof-verify-ecdsa.js"
     },
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@package.json` around lines 6 - 10, The package manifest currently points
"main" at src/worker.js and maps the package root import to
src/lib/chittyproof-v2-canonical.js, which hides the SDK barrel in src/index.js
and fails to export the worker; update package.json so the package root (".")
import/require maps to ./src/index.js (exposing the documented SDK barrel),
change "main" to point at src/index.js (or to remain compatible ensure it points
to ./src/index.js), and add an explicit "./worker" export entry that maps to
./src/worker.js (include both "import" and "require" variants if you support
ESM/CJS) so consumers can import the worker.
🧹 Nitpick comments (8)
src/routes/validate.js (1)

64-81: Add defensive checks for schema structure access.

Accessing schema.properties.fact.required etc. assumes the schema has this exact structure. If the schema file is malformed or missing these paths, this will throw at runtime.

🛡️ Proposed defensive access
     if (bundle.fact && typeof bundle.fact === "object") {
-      errors.push(...validateRequired(bundle.fact, schema.properties.fact.required, "fact"));
+      const factRequired = schema.properties?.fact?.required ?? [];
+      errors.push(...validateRequired(bundle.fact, factRequired, "fact"));
       errors.push(...validateConst(bundle.fact, "status", "sealed", "fact"));
     }

Apply similar pattern for ledger, chain, score, and pillars sections.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@src/routes/validate.js` around lines 64 - 81, The code assumes
schema.properties and per-property entries exist; update the bundle validation
blocks (the sections that call validateRequired and validateConst for
bundle.fact, bundle.ledger, bundle.chain, bundle.score, bundle.pillars) to
defensively read required arrays and const checks: first ensure schema &&
schema.properties && schema.properties.<prop> exist and that
schema.properties.<prop>.required is an array (otherwise use an empty array),
and only call validateConst for "status" when schema.properties.fact and its
const expectation exist; pass the safe fallback arrays/values into
validateRequired/validateConst to avoid runtime throws when the schema is
malformed or missing paths.
src/routes/verify.js (1)

33-40: Consider including ECDSA result even when hash verification fails.

Currently, ECDSA verification is skipped when hashResult.ok is false. This is reasonable for performance, but consumers might want to know if the signature itself is structurally valid even when the hash doesn't match. Consider adding an option or documenting this behavior.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@src/routes/verify.js` around lines 33 - 40, The current logic skips calling
verifyECDSA when hashResult.ok is false, so result.ecdsa is omitted; update the
flow in the handler around hashResult.ok and the ecdsa flag (the block that
calls verifyECDSA(bundle, c.env) and sets result.ecdsa) to always perform
verifyECDSA or make it conditional on a new option (e.g.,
includeEcdsaOnHashFail) so consumers can request ECDSA structural verification
even if the hash fails; ensure verifyECDSA is invoked with the same parameters
(bundle, c.env) and its return (ok, reason, kid) is assigned to result.ecdsa
regardless of hashResult.ok when the option is enabled.
src/middleware/auth.js (1)

19-21: Consider using timing-safe comparison for token validation.

The direct === comparison is susceptible to timing attacks. While exploitability is low for service-to-service auth, using a constant-time comparison is a security best practice for secrets.

🔒 Proposed timing-safe comparison
+import { timingSafeEqual } from "node:crypto";
+
+function safeCompare(a, b) {
+  if (typeof a !== "string" || typeof b !== "string") return false;
+  const bufA = Buffer.from(a);
+  const bufB = Buffer.from(b);
+  if (bufA.length !== bufB.length) return false;
+  return timingSafeEqual(bufA, bufB);
+}
+
 export async function bearerAuth(c, next) {
   const expected = c.env.CHITTY_AUTH_SERVICE_TOKEN;
   if (!expected) {
     return c.json({ error: "unauthorized", message: "auth not configured" }, 401);
   }

   const header = c.req.header("authorization");
   const token = header?.startsWith("Bearer ") ? header.slice(7) : null;

-  if (!token || token !== expected) {
+  if (!token || !safeCompare(token, expected)) {
     return c.json({ error: "unauthorized" }, 401);
   }

   await next();
 }
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@src/middleware/auth.js` around lines 19 - 21, The token check in
src/middleware/auth.js currently uses a plain equality (if (!token || token !==
expected) ...) which is vulnerable to timing attacks; replace that branch with a
timing-safe comparison using Node's crypto.timingSafeEqual (or an equivalent
constant-time function): ensure both token and expected are non-null, convert
them to Buffers of equal length (reject if lengths differ) and then call
crypto.timingSafeEqual(bufferToken, bufferExpected); keep the early null/empty
check and return c.json({ error: "unauthorized" }, 401) on mismatch.
tests/worker.test.js (1)

26-48: Consider extending integration auth checks to the other protected endpoints.

/api/v1/canonicalize, /api/v1/hash, and /api/v1/validate are also guarded in src/worker.js; adding 401/authorized-path checks here would harden end-to-end coverage.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@tests/worker.test.js` around lines 26 - 48, Add the same auth coverage you
have for /api/v1/verify to the other protected routes by adding tests in
tests/worker.test.js that call POST /api/v1/canonicalize, /api/v1/hash, and
/api/v1/validate using the existing app.request helper and ENV: for each route
add a request without an Authorization header expecting status 401, and a
request with Authorization: "Bearer test-token-123" (and minimal valid JSON body
similar to the verify test, e.g. { bundle: {}, ecdsa: false } or the
route-appropriate payload) expecting a non-401 response (e.g. 400 for invalid
payload), mirroring the pattern used in the "POST /api/v1/verify" tests.
tests/routes/canonicalize.test.js (1)

25-34: Add an invalid-JSON test to cover the explicit parser error branch.

The route has a dedicated invalid JSON 400 path, but this suite only covers missing payload.

✅ Suggested additional test
 describe("POST /api/v1/canonicalize", () => {
+  it("returns 400 for invalid JSON", async () => {
+    const app = makeApp();
+    const res = await app.request("/api/v1/canonicalize", {
+      method: "POST",
+      headers: { "Content-Type": "application/json" },
+      body: "{invalid",
+    });
+    expect(res.status).toBe(400);
+    const body = await res.json();
+    expect(body.error).toBe("bad_request");
+  });
+
   it("returns 400 when payload is missing", async () => {
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@tests/routes/canonicalize.test.js` around lines 25 - 34, Add a new test case
in tests/routes/canonicalize.test.js to exercise the explicit "invalid JSON"
parser error branch for the POST /api/v1/canonicalize route: create a test
(e.g., it("returns 400 for invalid JSON payload", ...)) that uses makeApp(),
sends a POST to "/api/v1/canonicalize" with header "Content-Type:
application/json" but a malformed JSON body (e.g., "{invalid:") and asserts
res.status is 400 (and optionally checks the error message/body). This will
ensure the route's invalid-JSON handling is covered alongside the existing
missing-payload test.
tests/routes/verify.test.js (1)

13-59: Add one test that exercises the default ecdsa=true branch.

Current tests force ecdsa: false, so the verifyECDSA execution path is untested.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@tests/routes/verify.test.js` around lines 13 - 59, Add a test in
tests/routes/verify.test.js that posts a valid bundle without the ecdsa flag so
the server takes the default ecdsa=true path: create the bundle with
makeFactProofBundle(), set bundle.signature.signed_fields_hash = await
hashSignedPayload(bundle), call makeApp() and POST to "/api/v1/verify" with body
JSON.stringify({ bundle }) (omit ecdsa), then assert res.status === 200,
body.hash_ok === true and body.ecdsa === true to exercise the verifyECDSA
branch.
tests/routes/hash.test.js (1)

16-29: Use at least one fixed test vector instead of deriving expected output from the same implementation.

Lines 16-28 compute expectedHash via hashSignedPayload, which can mask shared regressions between route and library.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@tests/routes/hash.test.js` around lines 16 - 29, The test currently computes
expectedHash by calling hashSignedPayload(bundle), which can hide regressions;
replace that dynamic computation with at least one fixed test vector: hardcode a
known good hash string (e.g., "sha256:<64-hex-chars>") for the given bundle
payload, keep the request using makeApp() to POST to "/api/v1/hash" with the
same bundle, and assert res.status is 200, body.signed_fields_hash equals the
hardcoded value and still matches the /^sha256:[0-9a-f]{64}$/ regex; use the
hashSignedPayload function only to generate the fixed value once externally (or
from a trusted implementation) and embed that literal as expectedHash in the
test so the test asserts against a fixed vector rather than deriving expected
output from the same implementation.
tests/routes/validate.test.js (1)

61-70: Assert the 400 error payload, not just status.

On Line 69 you only assert 400. Add body assertions to lock the API contract (error + message) and catch regressions earlier.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@tests/routes/validate.test.js` around lines 61 - 70, Update the "returns 400
when bundle is missing" test to assert the JSON error payload in addition to the
status: after calling app.request("/api/v1/validate", ...) parse the response
body (e.g., await res.json()) and assert it contains the expected error shape
(an "error" field and a "message" field) and appropriate values/strings
describing the missing bundle; keep the existing status check and use the same
test block name to lock the API contract for the validate route.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.

Inline comments:
In `@CHITTY.md`:
- Around line 47-55: The table starting with the header row "| Method | Path |
Auth | Purpose |" under the "### Endpoints" section (and the other table around
the second occurrence noted) needs surrounding blank lines to satisfy
markdownlint MD058; insert a single blank line immediately before the table and
a single blank line immediately after the table (so there is an empty line
between the preceding heading/paragraph and the table, and between the table and
the following content) for both tables to fix the lint warnings.

In `@CLAUDE.md`:
- Around line 85-86: The auth middleware's current token check uses plain
equality (token !== expected) which is vulnerable to timing attacks; update the
middleware (in src/middleware/auth.js, the auth middleware function handling
variables token and expected) to perform a constant-time comparison using
crypto.timingSafeEqual: import Node's crypto, convert token and expected to
Buffers, ensure they are the same length (if not, compare against a dummy buffer
of the same length to avoid short-circuit timing differences) and replace the
simple equality check with the timingSafeEqual-based check so unauthorized
responses still return the same way when the comparison fails.

In `@src/routes/validate.js`:
- Around line 10-20: The module currently uses readFileSync at top-level to
populate the schema variable (readFileSync + import.meta.url), which fails in
Cloudflare Workers; replace that file-system read with a direct import of the
exported schema module (the package export mapped to "./schema") or load it from
an asset binding/KV at runtime and assign to the schema variable. Locate the
schema variable and the readFileSync usage in this file (schema, readFileSync,
import.meta.url) and change to either: a static import of the JSON module or a
runtime fetch from the Worker asset/KV binding, ensuring schema is available
before validation code runs.

In `@src/worker.js`:
- Around line 22-24: The global error handler app.onError should not leak raw
exception messages or coerce every error to 400; change it to inspect the thrown
error (use instanceof HTTPException or check for a numeric err.status) and for
HTTPException return c.json({ error: "error", message: err.message },
err.status) otherwise return a generic 500 response like c.json({ error:
"internal_server_error", message: "Internal server error" }, 500); ensure
library errors are wrapped as HTTPException with appropriate status in the
places they are thrown (or caught and rethrown as HTTPException) and handle
expected client validation errors inside route handlers instead of relying on
the global handler.

In `@wrangler.toml`:
- Around line 20-22: The kv namespace entry uses a placeholder id ("TBD") which
will prevent deployment; create the KV namespace for the binding PROOF_KEY_CACHE
in your Cloudflare account (or retrieve the existing namespace ID) and replace
the id value under the [[kv_namespaces]] block so id is the actual namespace
GUID for PROOF_KEY_CACHE; ensure the binding name PROOF_KEY_CACHE matches the
code that references it.
- Around line 12-14: The TOML currently uses a single-table header [routes]
which is incorrect for Wrangler route bindings; replace the header with an
array-of-tables header [[routes]] (or use [route] if you intentionally want the
singular form) so the route entry (pattern = "proof.chitty.cc/*", zone_name =
"chitty.cc") is parsed correctly; update the header from [routes] to [[routes]]
to fix Wrangler config parsing.


ℹ️ Review info

Configuration used: defaults

Review profile: CHILL

Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between d40ae46 and 2c8e2e9.

⛔ Files ignored due to path filters (1)
  • package-lock.json is excluded by !**/package-lock.json
📒 Files selected for processing (20)
  • CHARTER.md
  • CHITTY.md
  • CLAUDE.md
  • package.json
  • src/index.js
  • src/middleware/auth.js
  • src/routes/canonicalize.js
  • src/routes/hash.js
  • src/routes/health.js
  • src/routes/validate.js
  • src/routes/verify.js
  • src/worker.js
  • tests/middleware/auth.test.js
  • tests/routes/canonicalize.test.js
  • tests/routes/hash.test.js
  • tests/routes/health.test.js
  • tests/routes/validate.test.js
  • tests/routes/verify.test.js
  • tests/worker.test.js
  • wrangler.toml

Comment on lines +47 to +55
### Endpoints
| Method | Path | Auth | Purpose |
|--------|------|------|---------|
| GET | `/health` | None | Health probe |
| GET | `/api/v1/status` | None | Service metadata |
| POST | `/api/v1/verify` | Bearer | Hash + ECDSA verification |
| POST | `/api/v1/canonicalize` | Bearer | Deterministic JSON |
| POST | `/api/v1/hash` | Bearer | SHA-256 of canonical payload |
| POST | `/api/v1/validate` | Bearer | FACT v2 schema validation |

⚠️ Potential issue | 🟡 Minor

Fix markdownlint MD058 around tables.

Line 48 and Line 75 start tables without required surrounding blank lines, which will keep lint noisy.

Suggested markdown fix
 ### Endpoints
+
 | Method | Path | Auth | Purpose |
 |--------|------|------|---------|
 | GET | `/health` | None | Health probe |
 | GET | `/api/v1/status` | None | Service metadata |
 | POST | `/api/v1/verify` | Bearer | Hash + ECDSA verification |
 | POST | `/api/v1/canonicalize` | Bearer | Deterministic JSON |
 | POST | `/api/v1/hash` | Bearer | SHA-256 of canonical payload |
 | POST | `/api/v1/validate` | Bearer | FACT v2 schema validation |
+
 ...
 ### Dependencies
+
 | Service | Purpose |
 |---------|---------|
 | ChittyCert | JWKS public key hosting for signature verification |
 | ChittyAuth | Shared-secret Bearer token validation |
 | ChittyTrack | Log/trace aggregation (tail_consumers) |
 | ChittyBeacon | Health monitoring (probes /health) |
+

Also applies to: 74-81

🧰 Tools
🪛 markdownlint-cli2 (0.21.0)

[warning] 48-48: Tables should be surrounded by blank lines

(MD058, blanks-around-tables)


Comment on lines +85 to +86
- Routes are thin wrappers around library functions — no business logic in route handlers
- Auth middleware uses simple string comparison against `env.CHITTY_AUTH_SERVICE_TOKEN`

⚠️ Potential issue | 🟠 Major

🧩 Analysis chain

🏁 Script executed:

#!/bin/bash
set -euo pipefail

# First, check if the file exists
if [ -f "src/middleware/auth.js" ]; then
  echo "=== Inspecting auth middleware implementation ==="
  cat -n src/middleware/auth.js
else
  echo "File src/middleware/auth.js not found. Searching for auth-related files..."
  fd auth . -type f 2>/dev/null | head -20
fi

Repository: chittyapps/chittyproof

Length of output: 932


🏁 Script executed:

#!/bin/bash
# Search for token comparison patterns
echo "=== Searching for token comparison patterns ==="
rg -n 'CHITTY_AUTH_SERVICE_TOKEN|token.*===|Bearer|crypto' --type-list | head -5
rg -n 'CHITTY_AUTH_SERVICE_TOKEN|token.*===|Bearer' -i 2>/dev/null || echo "No matches found"

Repository: chittyapps/chittyproof

Length of output: 9552


🏁 Script executed:

#!/bin/bash
# Check CLAUDE.md to confirm documentation
echo "=== Checking CLAUDE.md lines 85-86 ==="
sed -n '80,90p' CLAUDE.md

Repository: chittyapps/chittyproof

Length of output: 677


Use constant-time comparison for bearer-token validation.

The middleware at line 19 of `src/middleware/auth.js` uses `token !== expected` for secret validation. This simple string comparison is vulnerable to timing side-channel attacks that could leak information about token validity or length. Use `crypto.timingSafeEqual()` instead; note that it throws when the buffers differ in length, so guard the lengths first:

const a = Buffer.from(token ?? "");
const b = Buffer.from(expected);
if (a.length !== b.length || !crypto.timingSafeEqual(a, b)) {
  return c.json({ error: "unauthorized" }, 401);
}
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@CLAUDE.md` around lines 85 - 86, The auth middleware's current token check
uses plain equality (token !== expected) which is vulnerable to timing attacks;
update the middleware (in src/middleware/auth.js, the auth middleware function
handling variables token and expected) to perform a constant-time comparison
using crypto.timingSafeEqual: import Node's crypto, convert token and expected
to Buffers, ensure they are the same length (if not, compare against a dummy
buffer of the same length to avoid short-circuit timing differences) and replace
the simple equality check with the timingSafeEqual-based check so unauthorized
responses still return the same way when the comparison fails.
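The dummy-buffer approach described in the prompt above can be sketched as a small helper. This is an illustration, not the repo's actual middleware: `tokensMatch` is a hypothetical name, and it assumes Node's `crypto` module is available (e.g., via the Workers `nodejs_compat` flag).

```javascript
import { timingSafeEqual } from "node:crypto";

// Constant-time token comparison that avoids timingSafeEqual's
// length-mismatch throw and avoids short-circuiting on length.
export function tokensMatch(token, expected) {
  const a = Buffer.from(token ?? "");
  const b = Buffer.from(expected ?? "");
  if (a.length !== b.length) {
    // Burn a comparison against a same-length dummy so a length
    // mismatch takes roughly the same time as a byte mismatch.
    timingSafeEqual(a, Buffer.alloc(a.length));
    return false;
  }
  return timingSafeEqual(a, b);
}
```

The middleware would then gate on `!tokensMatch(token, expected)` and return the same 401 body as before.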

Comment on lines +10 to +20
import { readFileSync } from "node:fs";

const schema = JSON.parse(
readFileSync(
new URL(
"../../etc/authority/schema/chittyproof-v2-fact-bundle.schema.json",
import.meta.url,
),
"utf8",
),
);

⚠️ Potential issue | 🔴 Critical

🧩 Analysis chain

🏁 Script executed:

#!/bin/bash
# List root files to understand project type
git ls-files | head -20

Repository: chittyapps/chittyproof

Length of output: 626


🏁 Script executed:

#!/bin/bash
# Check for Cloudflare/Workers specific config
fd -e toml -e json -t f | grep -E "(wrangler|worker|cloudflare)" | head -10

Repository: chittyapps/chittyproof

Length of output: 79


🏁 Script executed:

#!/bin/bash
# Verify schema file exists
find . -name "*chittyproof*" -o -name "*schema.json" | head -10

Repository: chittyapps/chittyproof

Length of output: 293


🏁 Script executed:

#!/bin/bash
# Check package.json for project type/scripts
find . -name "package.json" -type f | head -5

Repository: chittyapps/chittyproof

Length of output: 80


🏁 Script executed:

#!/bin/bash
cat wrangler.toml

Repository: chittyapps/chittyproof

Length of output: 444


🏁 Script executed:

#!/bin/bash
cat package.json

Repository: chittyapps/chittyproof

Length of output: 787


🏁 Script executed:

#!/bin/bash
cat src/routes/validate.js

Repository: chittyapps/chittyproof

Length of output: 2819


Replace `readFileSync` with an imported schema or an asset binding.

In Cloudflare Workers, using `readFileSync` with `import.meta.url` at module initialization is unreliable. `import.meta.url` resolves to the bundled worker location at runtime, not the original filesystem, so the relative path `../../etc/authority/schema/...` fails in production. Since package.json already exports the schema as a module export (`"./schema": "..."`), import the schema directly instead of reading it from the filesystem. Alternatively, use an asset binding or KV namespace to serve the schema.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@src/routes/validate.js` around lines 10 - 20, The module currently uses
readFileSync at top-level to populate the schema variable (readFileSync +
import.meta.url), which fails in Cloudflare Workers; replace that file-system
read with a direct import of the exported schema module (the package export
mapped to "./schema") or load it from an asset binding/KV at runtime and assign
to the schema variable. Locate the schema variable and the readFileSync usage in
this file (schema, readFileSync, import.meta.url) and change to either: a static
import of the JSON module or a runtime fetch from the Worker asset/KV binding,
ensuring schema is available before validation code runs.
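One possible shape for the KV alternative mentioned above is a lazy loader, sketched here under assumptions: the binding name `PROOF_KEY_CACHE` comes from wrangler.toml, while `loadSchema` and the `fact-bundle-schema` key are hypothetical.

```javascript
// Lazily fetch and cache the schema from KV instead of calling
// readFileSync at module init, so nothing touches the filesystem
// inside the Worker runtime.
let schemaPromise;

export function loadSchema(env) {
  // Cache the in-flight/resolved promise across requests in this isolate,
  // so KV is hit at most once per isolate lifetime.
  schemaPromise ??= env.PROOF_KEY_CACHE.get("fact-bundle-schema", "json");
  return schemaPromise;
}
```

A route handler would `await loadSchema(c.env)` before validating; the static-import route (the package's `"./schema"` export) remains simpler when the bundler can inline the JSON.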

Comment on lines +22 to +24
app.onError((err, c) => {
return c.json({ error: "bad_request", message: err.message }, 400);
});

⚠️ Potential issue | 🟠 Major

🧩 Analysis chain

🏁 Script executed:

# First, let's check if src/worker.js exists and examine the code around lines 22-24
git ls-files src/worker.js

Repository: chittyapps/chittyproof

Length of output: 79


🏁 Script executed:

# Read the src/worker.js file to examine lines 22-24
wc -l src/worker.js && echo "---" && cat -n src/worker.js | head -40

Repository: chittyapps/chittyproof

Length of output: 1498


🏁 Script executed:

# Let's look at the routes to understand what errors might be thrown
fd -e js routes/

Repository: chittyapps/chittyproof

Length of output: 48


🏁 Script executed:

# Let's check the directory structure and find route files
git ls-files src/routes/ | head -20

Repository: chittyapps/chittyproof

Length of output: 176


🏁 Script executed:

# Let's examine one of the route files to see how errors are thrown
wc -l src/routes/verify.js && echo "---" && cat -n src/routes/verify.js | head -60

Repository: chittyapps/chittyproof

Length of output: 1633


🏁 Script executed:

# Let's check the lib functions to see what errors they might throw
wc -l src/lib/chittyproof-v2-canonical.js && echo "---" && head -80 src/lib/chittyproof-v2-canonical.js

Repository: chittyapps/chittyproof

Length of output: 2252


🏁 Script executed:

# Search for throw statements in lib directory to see what errors can be thrown
rg "throw new Error" src/lib/

Repository: chittyapps/chittyproof

Length of output: 845


🏁 Script executed:

# Let's check the verify-ecdsa file to understand those errors better
cat -n src/lib/chittyproof-verify-ecdsa.js | head -120

Repository: chittyapps/chittyproof

Length of output: 4022


🏁 Script executed:

# Continue reading to find the "Unknown kid" error
cat -n src/lib/chittyproof-verify-ecdsa.js | sed -n '120,180p'

Repository: chittyapps/chittyproof

Length of output: 2311


🌐 Web query:

Hono error handling err.status property support

💡 Result:

In Hono, app.onError((err, c) => ...) receives a generic error, so err.status is not guaranteed to exist (e.g., plain throw new Error() won’t have it). Hono’s docs show onError as an error hook that just gives you err and c, without promising HTTP fields like status. [1]

err.status is supported when the thrown error is a Hono HTTPException (and potentially by some ecosystem libraries that throw their own “HTTP-like” exceptions). Hono’s HTTPException is designed for this use-case: it carries a status and provides getResponse() to produce the corresponding Response. [2]

Typical pattern:

import { HTTPException } from 'hono/http-exception'

app.onError((err, c) => {
  if (err instanceof HTTPException) {
    // ok: err.status exists
    return err.getResponse() // or: return c.text(err.message, err.status)
  }
  return c.text('Internal Server Error', 500)
})

So: err.status is not a general Hono error feature; it’s specifically available when you throw/catch HTTPException (or another custom exception type that defines status). [2]

Sources:
[1] Hono app.onError API docs (error handling) (honojs.com)
[2] Hono HTTPException docs (status + getResponse()) (hono.ubitools.com)


Do not expose raw exception messages or force all exceptions to 400.

Line 23 currently returns internal error text directly (e.g., "Unknown kid: xyz"), and lines 22–24 classify every thrown error as bad_request (400), including real server faults and internal errors. This weakens error semantics and can leak implementation details.

The suggested fix in the diff won't work as proposed because Hono's generic error handler doesn't guarantee err.status exists (it's only available for HTTPException). Instead, either:

  • Wrap library errors in Hono's HTTPException with appropriate status codes, or
  • Catch and handle expected client errors (400) in route handlers separately, letting unhandled errors default to 500.
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@src/worker.js` around lines 22 - 24, The global error handler app.onError
should not leak raw exception messages or coerce every error to 400; change it
to inspect the thrown error (use instanceof HTTPException or check for a numeric
err.status) and for HTTPException return c.json({ error: "error", message:
err.message }, err.status) otherwise return a generic 500 response like c.json({
error: "internal_server_error", message: "Internal server error" }, 500); ensure
library errors are wrapped as HTTPException with appropriate status in the
places they are thrown (or caught and rethrown as HTTPException) and handle
expected client validation errors inside route handlers instead of relying on
the global handler.
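The status-aware split can be illustrated with a framework-free helper. This is a sketch: `HttpError` is a hypothetical stand-in for Hono's `HTTPException`, and the response bodies mirror the shapes suggested above.

```javascript
// Hypothetical stand-in for Hono's HTTPException: an error carrying
// an HTTP status code.
class HttpError extends Error {
  constructor(status, message) {
    super(message);
    this.status = status;
  }
}

// Map a thrown error to a response: known HTTP errors keep their status
// and message; anything else collapses to a generic 500 so internal
// details (e.g. "Unknown kid: xyz") are never echoed to the client.
function toErrorResponse(err) {
  if (err instanceof HttpError) {
    return { status: err.status, body: { error: "error", message: err.message } };
  }
  return {
    status: 500,
    body: { error: "internal_server_error", message: "Internal server error" },
  };
}
```

Inside `app.onError` this becomes `const { status, body } = toErrorResponse(err); return c.json(body, status);`, with library errors rethrown as `HTTPException` at their origin.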

Comment on lines +12 to +14
[routes]
pattern = "proof.chitty.cc/*"
zone_name = "chitty.cc"

⚠️ Potential issue | 🔴 Critical

🧩 Analysis chain

🌐 Web query:

Wrangler routes configuration syntax [[routes]] vs [routes] 2025

💡 Result:

In Wrangler (Cloudflare Workers) configs, routes is an array, so in TOML you define it as an array of tables:

  • Correct (multiple routes):

    [[routes]]
    pattern = "subdomain.example.com/*"
    zone_name = "example.com"
    
    [[routes]]
    pattern = "subdomain-two.example.com/example"
    zone_id = "<YOUR_ZONE_ID>"

    This matches the official docs’ TOML examples for routes (array) using [[routes]]. [1], [2]

  • Not correct for routes: [routes]
    In TOML, [routes] defines a single table, not an array, so it does not represent routes: Route[] as Wrangler documents it. Wrangler’s docs show [[routes]], not [routes], for routes. [1], [2]

If you only need one route, Wrangler also supports the singular key:

  • Single route (simple string):

    route = "example.com/*"

    [1]

  • Single route (as an object/table):

    [route]
    pattern = "example.org/*"
    zone_name = "example.org"

    [3]

Sources: Cloudflare Wrangler configuration docs (routes/types) [1], Cloudflare Workers “Routes” doc (Wrangler examples) [2], sample Wrangler config showing [route] [3].


Change [routes] to [[routes]].

The Wrangler configuration uses [routes] which defines a single table in TOML, but Wrangler expects [[routes]] (array of tables) for route binding. Alternatively, if this is a single route, use [route] (singular) instead.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@wrangler.toml` around lines 12 - 14, The TOML currently uses a single-table
header [routes] which is incorrect for Wrangler route bindings; replace the
header with an array-of-tables header [[routes]] (or use [route] if you
intentionally want the singular form) so the route entry (pattern =
"proof.chitty.cc/*", zone_name = "chitty.cc") is parsed correctly; update the
header from [routes] to [[routes]] to fix Wrangler config parsing.
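Applied to this config, the corrected block would read (same pattern and zone as the original, only the header changes):

```toml
[[routes]]
pattern = "proof.chitty.cc/*"
zone_name = "chitty.cc"
```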

Comment on lines +20 to +22
[[kv_namespaces]]
binding = "PROOF_KEY_CACHE"
id = "TBD"

⚠️ Potential issue | 🟡 Minor

KV namespace ID placeholder will block deployment.

The id = "TBD" placeholder must be replaced with a real KV namespace ID before deployment. Wrangler will fail to deploy with this value.

Per the PR checklist, ensure the PROOF_KEY_CACHE KV namespace is created and the actual ID is substituted here.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@wrangler.toml` around lines 20 - 22, The kv namespace entry uses a
placeholder id ("TBD") which will prevent deployment; create the KV namespace
for the binding PROOF_KEY_CACHE in your Cloudflare account (or retrieve the
existing namespace ID) and replace the id value under the [[kv_namespaces]]
block so id is the actual namespace GUID for PROOF_KEY_CACHE; ensure the binding
name PROOF_KEY_CACHE matches the code that references it.

Runs npm test on push to main and on pull requests.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

@coderabbitai coderabbitai bot left a comment


🧹 Nitpick comments (1)
.github/workflows/ci.yml (1)

1-26: LGTM! Clean and functional CI workflow.

The workflow correctly:

  • Triggers on push to main and pull requests targeting main
  • Uses current action versions (@v4)
  • Leverages npm caching for faster builds
  • Uses npm ci for reproducible installs

One optional enhancement to consider: adding a job timeout to prevent hung workflows from consuming runner minutes indefinitely.

💡 Optional: Add job timeout
 jobs:
   test:
     runs-on: ubuntu-latest
+    timeout-minutes: 10
     steps:
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In @.github/workflows/ci.yml around lines 1 - 26, The CI job "test" currently
has no timeout, so add a job-level timeout to prevent hung runs from consuming
runner minutes; in the jobs.test definition add a timeout-minutes field (e.g.,
timeout-minutes: 60 or another suitable value) so the "test" job will be
automatically canceled after the configured time.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.

Nitpick comments:
In @.github/workflows/ci.yml:
- Around line 1-26: The CI job "test" currently has no timeout, so add a
job-level timeout to prevent hung runs from consuming runner minutes; in the
jobs.test definition add a timeout-minutes field (e.g., timeout-minutes: 60 or
another suitable value) so the "test" job will be automatically canceled after
the configured time.

ℹ️ Review info

Configuration used: defaults

Review profile: CHILL

Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between 2c8e2e9 and ce6d9fd.

📒 Files selected for processing (1)
  • .github/workflows/ci.yml
