Edge AI · Triton inference insights
SPARROW runs MegaDetector + a species classifier on-device via NVIDIA Triton on a Pi-class compute board. This page surfaces where the edge models are weakest, where they're slowest, and which cameras' labels human curators should review first. Detector: md_v5b.0.0.pt · classifier: speciesnet-east-africa-v1 · format v1.5.
Overall confidence distribution
Histogram of top-1 species-classifier confidence across every animal detection. Bars left of the dashed line are flagged low-confidence and feed the curator queue.
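The binning and flagging step is simple enough to sketch. Below is a minimal version (a hypothetical helper, not SPARROW's actual code), assuming top-1 confidences arrive as a flat list of floats:

```python
import numpy as np

LOW_CONF_FLOOR = 0.6  # detections below this line feed the curator queue

def confidence_histogram(confidences, bins=20):
    """Bucket top-1 classifier confidences into equal-width bins over
    [0, 1] and count how many fall below the low-confidence floor."""
    conf = np.asarray(confidences, dtype=float)
    counts, edges = np.histogram(conf, bins=bins, range=(0.0, 1.0))
    low_conf = int((conf < LOW_CONF_FLOOR).sum())
    return counts, edges, low_conf
```

The dashed line in the chart corresponds to `LOW_CONF_FLOOR`; everything in bins left of it is what lands in the review queue.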
Confidence by species
Per-camera inference latency
Hourly Triton p50 samples from system_metrics, aggregated to per-camera percentiles. All values are in milliseconds. The top of the table (highest p95) often correlates with thermal throttling or a failing SD card.
| Camera | mean (ms) | p50 (ms) | p90 (ms) | p95 (ms) | max (ms) |
|---|---|---|---|---|---|
| UG-BFE-CAM-02 | 77.7 | 77.7 | 104.6 | 107.8 | 109.9 |
| UG-BFE-CAM-05 | 78.1 | 78.9 | 104.1 | 107.5 | 110.0 |
| UG-ZWR-CAM-05 | 76.9 | 75.2 | 104.6 | 107.3 | 110.0 |
| UG-MFR-CAM-03 | 78.9 | 79.8 | 104.7 | 107.3 | 110.0 |
| UG-ZWR-CAM-01 | 77.9 | 78.4 | 103.0 | 106.9 | 110.0 |
| UG-MFR-CAM-01 | 78.6 | 79.0 | 104.0 | 106.9 | 110.0 |
| UG-MFR-CAM-02 | 78.0 | 78.5 | 103.8 | 106.9 | 109.8 |
| UG-MFR-CAM-04 | 77.1 | 76.8 | 103.5 | 106.9 | 109.9 |
| UG-ZWR-CAM-03 | 77.2 | 76.5 | 103.2 | 106.8 | 109.9 |
| UG-BFE-CAM-03 | 77.5 | 77.3 | 103.3 | 106.7 | 109.5 |
| UG-MFR-CAM-05 | 77.0 | 78.2 | 104.1 | 106.7 | 109.8 |
| UG-ZWR-CAM-06 | 77.4 | 78.2 | 102.4 | 106.3 | 109.9 |
| UG-ZWR-CAM-04 | 77.1 | 76.9 | 103.5 | 106.2 | 109.9 |
| UG-ZWR-CAM-02 | 77.3 | 77.9 | 103.1 | 105.8 | 109.9 |
| UG-BFE-CAM-04 | 77.0 | 77.9 | 101.9 | 105.7 | 109.9 |
| UG-BFE-CAM-01 | 77.7 | 80.2 | 102.5 | 105.5 | 109.9 |
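The aggregation behind this table can be sketched as follows, assuming the hourly samples are available as `(camera_id, latency_ms)` pairs (the field names here are illustrative, not the actual system_metrics schema):

```python
from collections import defaultdict

import numpy as np

def latency_summary(samples):
    """samples: iterable of (camera_id, latency_ms) pairs.
    Returns {camera: {'mean', 'p50', 'p90', 'p95', 'max'}},
    ordered by p95 descending so the worst cameras sort first."""
    by_cam = defaultdict(list)
    for cam, ms in samples:
        by_cam[cam].append(ms)
    rows = {}
    for cam, vals in by_cam.items():
        a = np.asarray(vals, dtype=float)
        rows[cam] = {
            "mean": float(a.mean()),
            "p50": float(np.percentile(a, 50)),
            "p90": float(np.percentile(a, 90)),
            "p95": float(np.percentile(a, 95)),
            "max": float(a.max()),
        }
    return dict(sorted(rows.items(), key=lambda kv: kv[1]["p95"], reverse=True))
```

Sorting on p95 rather than mean keeps the focus on tail latency, which is what thermal throttling actually distorts.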
Model uncertainty heatmap — verify these cameras first
Percentage of detections at each camera that fell below the 0.6 confidence floor. Sorted descending by percentage; the top entries are where human-in-the-loop verification has the highest yield.
| Camera | Site | Total | Low-conf | % |
|---|---|---|---|---|
| UG-ZWR-CAM-05 | UG-ZWR | 1,278 | 169 | 13.2% |
| UG-MFR-CAM-02 | UG-MFR | 1,264 | 164 | 13.0% |
| UG-BFE-CAM-04 | UG-BFE | 1,329 | 165 | 12.4% |
| UG-MFR-CAM-03 | UG-MFR | 1,448 | 179 | 12.4% |
| UG-MFR-CAM-05 | UG-MFR | 1,490 | 183 | 12.3% |
| UG-BFE-CAM-02 | UG-BFE | 1,209 | 147 | 12.2% |
| UG-ZWR-CAM-03 | UG-ZWR | 1,457 | 177 | 12.1% |
| UG-ZWR-CAM-06 | UG-ZWR | 1,084 | 131 | 12.1% |
| UG-BFE-CAM-03 | UG-BFE | 1,403 | 168 | 12.0% |
| UG-BFE-CAM-01 | UG-BFE | 1,098 | 128 | 11.7% |
| UG-ZWR-CAM-04 | UG-ZWR | 1,485 | 173 | 11.6% |
| UG-ZWR-CAM-01 | UG-ZWR | 1,391 | 160 | 11.5% |
| UG-MFR-CAM-04 | UG-MFR | 1,343 | 155 | 11.5% |
| UG-ZWR-CAM-02 | UG-ZWR | 1,442 | 157 | 10.9% |
| UG-MFR-CAM-01 | UG-MFR | 1,327 | 142 | 10.7% |
| UG-BFE-CAM-05 | UG-BFE | 1,281 | 134 | 10.5% |
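The ranking above reduces to a per-camera low-confidence rate. A minimal sketch, assuming detections come in as `(camera_id, top1_confidence)` pairs (hypothetical shape, not the production record format):

```python
def uncertainty_ranking(detections, floor=0.6):
    """detections: iterable of (camera_id, top1_confidence) pairs.
    Returns [(camera, total, low_conf, pct)] sorted by pct descending,
    i.e. the order curators should work the queue in."""
    totals, lows = {}, {}
    for cam, conf in detections:
        totals[cam] = totals.get(cam, 0) + 1
        if conf < floor:
            lows[cam] = lows.get(cam, 0) + 1
    rows = [
        (cam, n, lows.get(cam, 0), 100.0 * lows.get(cam, 0) / n)
        for cam, n in totals.items()
    ]
    return sorted(rows, key=lambda r: r[3], reverse=True)
```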
Next steps
- Wire the top of the uncertainty table into a verification queue: a keyboard-driven review UI that shows the bbox-cropped image and lets the curator accept, reject, or relabel. The verified subset becomes the next round's training data.
- If a single camera consistently produces high p95 latency (≥ 110 ms), schedule a maintenance visit; the SoC is likely throttling or the SD card is failing.
- If a single species sits at the bottom of the confidence-by-species list, it's a candidate for a targeted fine-tune on programme-collected images.
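The verification queue in the first bullet could be seeded with something as small as a min-heap keyed on confidence, so the least-confident detections (highest expected label-correction yield) surface first. The names below are illustrative, not an existing SPARROW API:

```python
import heapq
from dataclasses import dataclass, field

@dataclass(order=True)
class ReviewItem:
    priority: float                       # lower = reviewed sooner
    detection_id: str = field(compare=False)
    camera: str = field(compare=False)
    confidence: float = field(compare=False)

def build_queue(detections, floor=0.6):
    """detections: iterable of (detection_id, camera, confidence).
    Only sub-floor detections enter the queue; popping yields the
    least-confident detection first."""
    heap = [
        ReviewItem(conf, det_id, cam, conf)
        for det_id, cam, conf in detections
        if conf < floor
    ]
    heapq.heapify(heap)
    return heap
```

A review UI would pop items from this heap, show the bbox-cropped image, and write the curator's accept/reject/relabel decision back alongside the detection record.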