Case study
Copper Vision Mood Classifier
Copper Vision is a personal ML project: a computer-vision pipeline that detects my son Copper and classifies his mood from image features. The model outputs an on-screen label (for example Relaxed, Curious/Annoyed, or Content) with a live bounding-box overlay.
The brief
Challenge
Translate subtle behavioral cues into consistent mood labels while keeping inference understandable and visually clear in real scenes.
Approach
What we made
I followed a supervised ML workflow: I trained on labeled examples, tuned class boundaries through iteration, and integrated model inference into a vision overlay that displays both the detection and the predicted mood.
- Built a practical mood-label inference flow with bounding-box visualization.
- Used iterative class definitions to distinguish similar behavior states.
- Implemented with a Python CV/ML stack including OpenCV, NumPy, Pandas, scikit-learn, and PyTorch.
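To illustrate the supervised portion of the workflow, here is a minimal scikit-learn sketch. The feature vectors, class boundaries, and mood names are synthetic stand-ins, not the project's real data or model:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Synthetic stand-in: 300 examples of 8-dimensional image features,
# each labeled with one of three hypothetical mood classes.
X = rng.normal(size=(300, 8))
y = rng.integers(0, 3, size=300)
X[y == 1] += 1.5  # give one class some separable structure
classes = ["Relaxed", "Curious/Annoyed", "Content"]

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0)

# Fit a classifier on the labeled examples.
clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_train, y_train)

# Predict a mood label for one held-out feature vector.
pred = classes[clf.predict(X_test[:1])[0]]
```

At inference time, the predicted label is what gets rendered next to the bounding box on the camera output.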
Outcome
Results
Delivered a working end-to-end prototype that can classify and display Copper's likely mood directly on the camera output.
Gallery
Visual snapshots