
WIP - Vision-Based Wildlife & Hazard Detection (Capstone w/ SAIC)

Multi-modal detection plus an interactive HCI dashboard for ranger safety, translatable to security/defense scenarios.

[Hero image: vision-based multi-modal detection dashboard]
2025 · Stakeholder liaison · Vision/ML integration lead
Computer Vision · Bioacoustics · Sensor Fusion · Geospatial · Dashboard UX

Overview

We prototype a ranger-facing dashboard that fuses simulated sensors (vision, motion, vibration, audio) to surface animals, people, and hazards with geo-context, timestamps, and short media clips. Weekly reviews with SAIC keep requirements aligned with real defense workflows.

Problem

Vast areas are hard to monitor manually. Park teams need timely, trustworthy alerts that balance false positives with coverage and give enough context to act.

Solution

Detection models run on incoming signals and publish events to the dashboard. Operators see location, time, severity, and media snippets, plus seasonal insights and predictive elements based on history.
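As a sketch of the events the dashboard consumes, something like the payload below could carry location, time, severity, and a media reference. Field names and the severity rule are illustrative assumptions, not the project's actual schema:

```python
from dataclasses import dataclass

@dataclass
class DetectionEvent:
    """Hypothetical detection event published to the dashboard."""
    kind: str            # "animal", "person", or "hazard"
    label: str           # e.g. "black_bear"
    confidence: float    # model score in [0, 1]
    lat: float
    lon: float
    timestamp: float     # Unix seconds
    media_url: str = ""  # short clip or snapshot for the operator

    def severity(self) -> str:
        """Map kind + confidence to a coarse operator-facing severity."""
        if self.kind == "hazard" or self.confidence >= 0.9:
            return "high"
        return "medium" if self.confidence >= 0.6 else "low"
```

Keeping severity a derived property (rather than a stored field) means threshold changes from review tooling apply retroactively to stored events.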

Features

Core detections

  • Animals/people via vision; hazards on trails; seasonal patterns
  • Acoustic cues (e.g., bird calls) to corroborate presence/behavior
  • Motion/vibration signals for anomaly confirmation
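One minimal way to use a second modality for confirmation, as the list above describes, is to require two signals within a short time window. The 30-second window and function name are illustrative assumptions:

```python
WINDOW_S = 30.0  # illustrative corroboration window, not a tuned value

def corroborated(vision_ts, audio_ts, window_s=WINDOW_S):
    """True if any vision detection has an acoustic cue within window_s.

    vision_ts / audio_ts: lists of Unix timestamps from each modality.
    """
    return any(abs(v - a) <= window_s for v in vision_ts for a in audio_ts)
```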

Dashboard & HCI

  • Map with tracks, landmarks, feeding grounds, timestamps
  • Media panel (photos, short clips, playback controls)
  • Seasonal and 24/7 views; filter by species/zone/severity
  • Predictive elements from detection history

Operations

  • Weekly syncs with SAIC stakeholders (requirements & reviews)
  • Nature-themed UX for the class project; the architecture maps to defense use cases

Data & Models

Object detection and classification for camera feeds; bioacoustic cues for birds and other animals; temporal models for migration and seasonality; multi-signal fusion to boost precision and recall.
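The fusion idea can be sketched with a weighted noisy-OR over per-modality confidences: a detection seen by several modalities ends up with higher fused confidence than any single modality alone. The weighting scheme here is an assumption for illustration, not the capstone's actual fusion model:

```python
def fuse_scores(scores, weights=None):
    """Combine per-modality confidences with a weighted noisy-OR.

    scores:  dict modality -> probability in [0, 1]
    weights: dict modality -> trust weight in [0, 1] (default 1.0)
    """
    weights = weights or {}
    p_none = 1.0  # probability that no modality truly detected anything
    for modality, p in scores.items():
        w = weights.get(modality, 1.0)
        p_none *= 1.0 - w * p
    return 1.0 - p_none
```

For example, vision at 0.7 and audio at 0.6 fuse to 1 - (0.3 × 0.4) = 0.88, higher than either alone, which is the precision/recall benefit the text refers to.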

Architecture

Ingestion (simulated sensors) → detection services (CV/audio) → event store → dashboard API → operator UI. Alerts include media snippets and geo-context; review tooling calibrates thresholds.
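The event-store stage in the pipeline above can be approximated by a small publish/subscribe object; detection services publish, the dashboard API subscribes. Class and method names are illustrative stand-ins, not the capstone's actual components:

```python
from collections import defaultdict

class EventStore:
    """Minimal in-memory sketch of the event store stage."""

    def __init__(self):
        self.events = []                    # full history for replay/review
        self.subscribers = defaultdict(list)  # kind -> callbacks

    def subscribe(self, kind, fn):
        """Register a callback for events of a given kind."""
        self.subscribers[kind].append(fn)

    def publish(self, event):
        """Store the event and notify subscribers for its kind."""
        self.events.append(event)
        for fn in self.subscribers[event.get("kind", "")]:
            fn(event)
```

Retaining the full event list alongside live notification is what lets review tooling replay history when calibrating thresholds.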

HCI / UX Notes

Clear severities, consistent focus/affordances, and explainable indicators reduce alert fatigue. Anchoring events to maps/timestamps builds trust for decisions.

Status & Next Steps

  • Status: Active capstone; iterative sprints with stakeholder feedback
  • Tune thresholds + multi-signal fusion
  • Evaluate false positives via review tooling
  • Export incident timelines for reporting
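Threshold tuning and false-positive evaluation from the list above can be sketched as a precision/recall sweep over reviewed detections. The input format (confidence, reviewer-confirmed flag) is an assumed shape for the review tooling's output:

```python
def sweep_thresholds(scored, thresholds):
    """Precision/recall at each candidate threshold.

    scored: list of (confidence, is_true_positive) pairs from review.
    Returns {threshold: (precision, recall)}.
    """
    total_pos = sum(1 for _, ok in scored if ok)
    out = {}
    for t in thresholds:
        flagged = [(c, ok) for c, ok in scored if c >= t]
        tp = sum(1 for _, ok in flagged if ok)
        precision = tp / len(flagged) if flagged else 1.0
        recall = tp / total_pos if total_pos else 1.0
        out[t] = (precision, recall)
    return out
```

Raising the threshold trades recall for precision; the sweep makes that trade-off explicit so operators can pick an alert-fatigue-appropriate operating point.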

Website created by Brighton Young. Please contact me at brightonyoung.dev@gmail.com.