refactor(cli): migrate inference-config.js to TypeScript #1265
Conversation
…ript modules

Move ~210 lines of pure, side-effect-free functions from the 3,800-line onboard.js into five typed TypeScript modules under src/lib/:

- gateway-state.ts: gateway/sandbox state classification
- validation.ts: failure classification, API key validation, model ID checks
- url-utils.ts: URL normalization, text compaction, env formatting
- build-context.ts: Docker build context filtering, recovery hints
- dashboard.ts: dashboard URL resolution and construction

Infrastructure:

- tsconfig.src.json compiles src/ → dist/ as CJS (dist/ already gitignored)
- tsconfig.cli.json updated to type-check src/
- npm run build:cli added to package.json
- Pre-commit test-cli hook builds TS and includes dist/lib/ in coverage

onboard.js imports from compiled dist/lib/ output. All 542 CLI tests pass. No user-facing behavior changes.

Closes #1237. Relates to #924 (shell consolidation).

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
56 new tests across three co-located test files:

- src/lib/validation.test.ts: classifyValidationFailure, classifyApplyFailure, classifySandboxCreateFailure, validateNvidiaApiKeyValue, isSafeModelId
- src/lib/dashboard.test.ts: resolveDashboardForwardTarget, buildControlUiUrls
- src/lib/url-utils.test.ts: compactText, stripEndpointSuffix, normalizeProviderBaseUrl, isLoopbackHostname, formatEnvAssignment, parsePolicyPresetEnv

vitest.config.ts updated to include src/**/*.test.ts in the CLI project.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
The existing callsite in printDashboard calls buildControlUiUrls() with no arguments when no token is available.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
The tsc-js pre-push hook (jsconfig.json) type-checks bin/lib/onboard.js, which requires dist/lib/ to exist. CI was not running npm run build:cli before the prek checks, causing TS2307 "Cannot find module" errors.

- Add npm run build:cli step to .github/actions/basic-checks
- Update tsc-js hook to run build:cli before tsc

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
…law into cv/extract-onboard-pure-fns
## Summary

- Convert `bin/lib/preflight.js` (357 lines) to `src/lib/preflight.ts` with full type definitions
- Typed interfaces for all opts objects and return types: `PortProbeResult`, `MemoryInfo`, `SwapResult`, `CheckPortOpts`, `GetMemoryInfoOpts`, `EnsureSwapOpts`
- Extract `parseLsofLines` helper to reduce duplication in `checkPortAvailable`
- Incorporate #1227 fix: `sudo` → `sudo -n` (non-interactive) for the lsof retry
- `bin/lib/preflight.js` becomes a thin re-export shim — existing consumers unaffected
- Co-locate tests: `test/preflight.test.js` → `src/lib/preflight.test.ts`
- Add real net probe tests (EADDRINUSE detection on occupied ports)
- Fix all co-located test imports to use `dist/` paths for coverage attribution
- Add targeted dashboard/validation branch tests to maintain the ratchet

Stacked on #1240. Not touched by any #924 blocker PR.

## Test plan

- [x] 612 CLI tests pass (601 existing + 11 new)
- [x] `tsc -p tsconfig.src.json` compiles cleanly
- [x] `tsc -p tsconfig.cli.json` type-checks cleanly
- [x] `tsc -p jsconfig.json` type-checks cleanly (the pre-push check that caught the union issue)
- [x] Coverage ratchet passes

Relates to #924 (shell consolidation). Supersedes #1227.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

---------

Co-authored-by: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
Convert bin/lib/inference-config.js (143 lines) to src/lib/inference-config.ts with typed interfaces for ProviderSelectionConfig and GatewayInference. All 3 exported functions are pure — straightforward full conversion. Reduces switch/case duplication via a shared base object. Co-locates tests.

Relates to #924 (shell consolidation).

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
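The shared-base-object pattern the commit describes can be sketched as follows. This is a minimal illustration, not the module's actual code: the field values and provider labels are assumptions, and only the general shape (common fields hoisted into one object, switch arms supplying just the differences) comes from the commit message.

```javascript
// Sketch: reduce switch/case duplication with a shared base object.
// Field values and labels here are hypothetical stand-ins.
function getProviderSelectionConfig(provider) {
  // Everything common to all providers lives in one place...
  const base = {
    endpointType: "custom",
    profile: "default",
    provider,
  };
  // ...so each switch arm only adds what actually differs.
  switch (provider) {
    case "ollama-local":
      return { ...base, providerLabel: "Ollama (local)" };
    case "openai-api":
      return { ...base, providerLabel: "OpenAI API" };
    default:
      return { ...base, providerLabel: provider };
  }
}
```

Editing `base` then changes every provider's config at once, which is also why the later review asks for full-object test assertions on those shared fields.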
No actionable comments were generated in the recent review. 🎉

ℹ️ Recent review info — Configuration used: Path: .coderabbit.yaml · Review profile: CHILL · Plan: Pro

📒 Files selected for processing (2)
🚧 Files skipped from review as they are similar to previous changes (2)

📝 Walkthrough: Replaced the binary-level inline inference config with a re-export.

Estimated code review effort: 🎯 3 (Moderate) | ⏱️ ~22 minutes

🚥 Pre-merge checks: ✅ 2 passed | ❌ 1 failed (1 warning)
🧹 Nitpick comments (2)

src/lib/inference-config.ts (1)

118-122: Unreachable null return path.

`resolvedModel` will always be a truthy string because the ternary on line 120 always returns either `model` (when provided and truthy), `DEFAULT_OLLAMA_MODEL`, or `DEFAULT_CLOUD_MODEL` — all of which are non-empty strings. The conditional `resolvedModel ? ... : null` on line 121 will never evaluate to `null`.

This isn't a bug (the function still works correctly), but the dead code path and the `| null` return type are misleading. Consider simplifying if the intent is to always return a qualified model string.

♻️ Suggested simplification

```diff
-export function getOpenClawPrimaryModel(provider: string, model?: string): string | null {
-  const resolvedModel =
-    model || (provider === "ollama-local" ? DEFAULT_OLLAMA_MODEL : DEFAULT_CLOUD_MODEL);
-  return resolvedModel ? `${MANAGED_PROVIDER_ID}/${resolvedModel}` : null;
+export function getOpenClawPrimaryModel(provider: string, model?: string): string {
+  const resolvedModel =
+    model || (provider === "ollama-local" ? DEFAULT_OLLAMA_MODEL : DEFAULT_CLOUD_MODEL);
+  return `${MANAGED_PROVIDER_ID}/${resolvedModel}`;
 }
```

🤖 Prompt for AI Agents

Verify each finding against the current code and only fix it if needed. In `@src/lib/inference-config.ts` around lines 118-122: the function getOpenClawPrimaryModel has an unreachable null path because resolvedModel is always a non-empty string; update the function to always return a qualified model string by changing the return type from string | null to string and return `${MANAGED_PROVIDER_ID}/${resolvedModel}` directly (remove the ternary that returns null), keeping the resolution logic using model, DEFAULT_OLLAMA_MODEL and DEFAULT_CLOUD_MODEL unchanged.

src/lib/inference-config.test.ts (1)

6-17: Tests import from compiled `dist/` — build is configured but worth clarifying locally.

Importing from `../../dist/lib/inference-config` requires the `dist/` directory to exist. CI handles this correctly via prek pre-push hooks, which run `npm run build:cli` before `npx vitest run`. The workflow is documented in CONTRIBUTING.md: use `npx prek run --all-files` for a complete local check, or run `npm run build:cli && npm test` explicitly if testing in isolation.

The comment "for correct coverage attribution" explains the pattern but could be clearer for contributors unfamiliar with the build structure. Consider expanding the comment to note that `npm run build:cli` must precede test execution.

🤖 Prompt for AI Agents

Verify each finding against the current code and only fix it if needed. In `@src/lib/inference-config.test.ts` around lines 6-17: the test currently imports from "../../dist/lib/inference-config", which requires the compiled dist/ to exist; update the top comment in src/lib/inference-config.test.ts (the block above the import list referencing CLOUD_MODEL_OPTIONS, DEFAULT_OLLAMA_MODEL, DEFAULT_ROUTE_CREDENTIAL_ENV, DEFAULT_ROUTE_PROFILE, INFERENCE_ROUTE_URL, MANAGED_PROVIDER_ID, getOpenClawPrimaryModel, getProviderSelectionConfig, parseGatewayInference) to explicitly state that contributors must run the build before running tests locally, e.g. "run npm run build:cli before npm test" or use "npx prek run --all-files", so it's clear how to produce dist/ for correct coverage attribution.
ℹ️ Review info

⚙️ Run configuration — Configuration used: Path: .coderabbit.yaml · Review profile: CHILL · Plan: Pro · Run ID: 63f96e66-33f7-4795-87cd-7efb117d8004

📒 Files selected for processing (3): `bin/lib/inference-config.js`, `src/lib/inference-config.test.ts`, `src/lib/inference-config.ts`
…el return type

- Add npm run build:cli to the prepare npm hook so dist/ is built after npm install on fresh checkouts (fixes MODULE_NOT_FOUND on first run)
- Remove unreachable null return path from getOpenClawPrimaryModel — resolvedModel is always a non-empty string

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
Actionable comments posted: 1
🧹 Nitpick comments (2)

src/lib/inference-config.ts (2)

9-10: Add type annotation to avoid `any` leakage.

The `require` call returns an untyped object, so `DEFAULT_OLLAMA_MODEL` is implicitly `any`. This allows type errors to slip through where it's used (lines 109, 120).

♻️ Proposed fix to add explicit typing

```diff
 // eslint-disable-next-line @typescript-eslint/no-require-imports
-const { DEFAULT_OLLAMA_MODEL } = require("../../bin/lib/local-inference");
+const { DEFAULT_OLLAMA_MODEL } = require("../../bin/lib/local-inference") as {
+  DEFAULT_OLLAMA_MODEL: string;
+};
```

🤖 Prompt for AI Agents

Verify each finding against the current code and only fix it if needed. In `@src/lib/inference-config.ts` around lines 9-10: the require call for DEFAULT_OLLAMA_MODEL is untyped and yields an implicit any; replace it with a typed import or add an explicit type annotation so DEFAULT_OLLAMA_MODEL is strongly typed (e.g. declare its type as string or the appropriate union/interface) to prevent any leakage where it's used. Concretely: change the require to a typed ES import or annotate the destructured constant like `const { DEFAULT_OLLAMA_MODEL }: { DEFAULT_OLLAMA_MODEL: string } = require(...)` (or the correct type) and update any downstream typings if needed so callers at the usage sites no longer see any.

47-53: Nit: `as const` is unnecessary here.

The `as const` assertion on `"custom"` doesn't affect the final type since the value is spread into a larger object where `endpointType` is inferred as `string` anyway. It's harmless but could be removed for clarity.

🤖 Prompt for AI Agents

Verify each finding against the current code and only fix it if needed. In `@src/lib/inference-config.ts` around lines 47-53: remove the unnecessary TypeScript const assertion on the string literal in the base object: change the `endpointType: "custom" as const` entry inside the `base` object to simply `endpointType: "custom"`, leaving the other properties (`endpointUrl: INFERENCE_ROUTE_URL`, `ncpPartner`, `profile: DEFAULT_ROUTE_PROFILE`, `provider`) unchanged so the `base` variable continues to behave the same without the redundant `as const`.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@package.json`:
- Line 18: The prepare script currently silences failures for "npm run
build:cli"; update the package.json "prepare" script so the npm run build:cli
invocation does not include "2>/dev/null || true" (i.e., remove the stderr
redirection and the "|| true" that masks errors) so the build will fail fast if
build:cli fails; keep the rest of the prepare logic (the npm install step and
prek handling) intact and ensure the change targets the "prepare" script string
containing "npm run build:cli".
ℹ️ Review info

⚙️ Run configuration — Configuration used: Path: .coderabbit.yaml · Review profile: CHILL · Plan: Pro · Run ID: 6aedc712-bcd4-4875-a1fe-0f18230eb1a7

📒 Files selected for processing (2): `package.json`, `src/lib/inference-config.ts`
ericksoa
left a comment
Two things to address before merge:

1. prepare script silences build:cli failures (package.json)

CodeRabbit flagged this and I agree — `2>/dev/null || true` on build:cli means that if the build fails (e.g., devDeps stripped by `--omit=dev`), the shim at `bin/lib/inference-config.js` will hit MODULE_NOT_FOUND at runtime with no earlier signal. Can you either remove the error suppression or add a guard in the shim (e.g., try/catch with a helpful error message)?

2. Test coverage was weakened, not just migrated (inference-config.test.ts)

The switch from full-object assertions to `expect.objectContaining` with only `model` + `providerLabel` dropped coverage on the structural fields (`endpointType`, `endpointUrl`, `ncpPartner`, `profile`, `credentialEnv`). Those are the ones most likely to break if someone edits the base object. Could you keep at least one full-object assertion per provider category (one hosted, one local) alongside the shorter ones?
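A shim guard along the lines the reviewer suggests could be sketched like this. It is a hypothetical illustration, not code from the PR: the `loadCompiled` helper name, the `dist/` path in the usage comment, and the error wording are all assumptions.

```javascript
// Hypothetical guard for a thin CJS shim such as bin/lib/inference-config.js.
// Wraps the require of the compiled dist/ output so a missing build fails
// with an actionable message instead of a bare MODULE_NOT_FOUND.
function loadCompiled(load) {
  try {
    return load();
  } catch (err) {
    if (err && err.code === "MODULE_NOT_FOUND") {
      throw new Error(
        "Compiled output not found in dist/lib/. Run `npm run build:cli` " +
          "(devDependencies must be installed; did you use --omit=dev?)"
      );
    }
    throw err; // unrelated failures propagate unchanged
  }
}

// In the shim this would be used roughly as:
//   module.exports = loadCompiled(() => require("../../dist/lib/inference-config"));
```

The `err.code` check keeps the guard narrow: genuine compile-time or runtime errors in the compiled module still surface as themselves.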
…ertions

Address review feedback from ericksoa:

1. prepare hook: only run build:cli when tsc is available, but let it fail loudly if it does (no more 2>/dev/null || true suppression)
2. Restore full-object toEqual assertions for one hosted (openai-api) and one local (vllm-local) provider to catch structural regressions in the shared base object fields

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
Thanks @ericksoa — both addressed in 6dc2ed1:

1. This tolerates fresh clones before devDeps are installed (no tsc → skip), but surfaces real build failures (tsc present + compile error → hard fail).
2. Test assertions: restored full-object `toEqual` assertions for one hosted and one local provider.
ericksoa
left a comment
Both items addressed. LGTM.
The inference-config migration (#1265) moved the implementation to src/lib/inference-config.ts. Resolve the merge conflict by keeping the thin shim in bin/lib/ and applying the qwen removal to the new TS source file. Signed-off-by: Aaron Erickson <aerickson@nvidia.com>
## Summary

- Convert `bin/lib/inference-config.js` (143 lines) to `src/lib/inference-config.ts`
- Typed interfaces: `ProviderSelectionConfig`, `GatewayInference`
- All 3 exports are pure — straightforward full conversion
- Co-locate tests: `test/inference-config.test.js` → `src/lib/inference-config.test.ts`

Stacked on #1240. 614 CLI tests pass. Coverage ratchet passes.

Relates to #924 (shell consolidation).

🤖 Generated with [Claude Code](https://claude.com/claude-code)

<!-- This is an auto-generated comment: release notes by coderabbit.ai -->

## Summary by CodeRabbit

* **Refactor**
  * Centralized inference configuration, provider selection, and gateway output parsing into a single consolidated module for more consistent behavior.
* **Tests**
  * Updated tests to match the consolidated module and improved assertion patterns.
* **Chores**
  * Install/prep process now includes a CLI build step during project setup (silently handled).

<!-- end of auto-generated comment: release notes by coderabbit.ai -->

---------

Co-authored-by: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
…eScript (#1298)

Move CJS implementations in `bin/lib/` to TypeScript in `src/lib/`, compiled via `tsconfig.src.json` to `dist/lib/`. Replaces the inline CJS with thin re-export shims following the pattern from #1262, #1265.

- Add `src/lib/resolve-openshell.ts` with a typed DI options interface
- Add `src/lib/version.ts` preserving the existing getVersion() string API (git describe → .version file → package.json fallback chain)
- Add `src/lib/chat-filter.ts` preserving the existing array-based API
- Add co-located tests importing from `../../dist/lib/` for coverage
- Replace `bin/lib/` full implementations with thin shims
- Use platform-neutral paths in version tests (node:path join)

No API changes — all consumers (bin/nemoclaw.js, scripts/telegram-bridge.js) continue to work without modification.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
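The getVersion() fallback chain mentioned in the commit can be sketched as follows. Only the ordering (git describe, then .version file, then package.json) comes from the commit message; the source-function names and injection style are assumptions for illustration.

```javascript
// Sketch of a version-resolution fallback chain. Each injected source
// returns a version string or null/undefined; the first truthy result wins.
function getVersion({ fromGitDescribe, fromVersionFile, fromPackageJson }) {
  return fromGitDescribe() || fromVersionFile() || fromPackageJson();
}

// Example: git describe fails (e.g. not a git checkout), so the
// .version file is consulted next and its value is returned.
const v = getVersion({
  fromGitDescribe: () => null,
  fromVersionFile: () => "1.2.3",
  fromPackageJson: () => "0.0.0",
});
```

Injecting the sources as functions is what makes the chain trivially testable without touching git, the filesystem, or package.json.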
Summary

- Convert `bin/lib/inference-config.js` (143 lines) to `src/lib/inference-config.ts`
- Typed interfaces: `ProviderSelectionConfig`, `GatewayInference`
- Co-locate tests: `test/inference-config.test.js` → `src/lib/inference-config.test.ts`

Stacked on #1240. 614 CLI tests pass. Coverage ratchet passes.

Relates to #924 (shell consolidation).

🤖 Generated with Claude Code