Analysis, Local Studio, and deployment decision layer for local-first Edge AI inference validation. (Python; updated May 14, 2026)
Multi-repository entrypoint for the InferEdge local-first Edge AI inference validation pipeline.
Build provenance and handoff layer for ONNX-to-edge artifacts in the InferEdge validation pipeline.
Optional deterministic diagnosis evidence layer for explaining risky Edge AI inference outputs.
C++ runtime execution and Jetson/ONNX Runtime evidence export layer for Edge AI inference validation.
Jetson Orin Nano internal Edge AI evidence lab: environment baselines, PyTorch/ONNX Runtime/TensorRT comparison, FastAPI serving, Whisper audio, and InferEdge exports.