Type ✦ Software and Hardware
Year 2023      
                        

Wavelength
Our project aims to visually encapsulate human emotions through immersive art, blending embedded hardware with creative expression via facial recognition and sentiment analysis.

Leveraging the machine learning capabilities of the Arduino Portenta H7 for image processing, we wanted to reinterpret the literal world into an artistic vision of flashing lights and color. This project also celebrates our shared passion for merging computing and creativity, a bond that first connected us three years ago.

This project utilizes the camera on board the Arduino Portenta H7 to recognize emotion on an individual's face, then process that emotion to display it as a color on an artistic light matrix display controlled separately by an Arduino Uno [Treehacks 2023].







Featured on Arduino’s Official Account




How was this built?

We began by sourcing a dataset of thousands of facial images labeled with six emotions: happy, sad, angry, surprised, neutral, and fearful. Using Edge Impulse, we created a transfer learning-based image classifier model to categorize these emotions.


The accompanying images show the emotion classes used for the display and a hard case for the Portenta H7, fabricated in Stanford’s Product Realization Lab.

The model was optimized for the Arduino Portenta H7 + Vision Shield and deployed to the board.

We then wrote a script to identify the most likely emotion for a given image and output its corresponding class ID over the serial console. The Portenta was connected to a computer, which relayed this serial data to an Arduino Uno driving a NeoPixel light matrix. Finally, we programmed the Uno to map each emotion to a specific color based on color psychology, smoothly transitioning between colors as new emotional data arrived, creating a visually expressive display.
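A minimal sketch of the computer-side relay described above, assuming the Portenta prints one class ID per line over serial; the port names, baud rate, and label order here are illustrative, not our exact configuration:

```python
# Class labels in model output order (order shown here is illustrative)
EMOTIONS = ["happy", "sad", "angry", "surprised", "neutral", "fearful"]

def parse_class_id(line: bytes):
    """Parse a serial line like b'3\r\n' into a class ID, or None if malformed."""
    text = line.decode(errors="ignore").strip()
    if text.isdigit() and int(text) < len(EMOTIONS):
        return int(text)
    return None

def relay(portenta_port="/dev/ttyACM0", uno_port="/dev/ttyACM1", baud=115200):
    """Forward class IDs from the Portenta to the Uno, one byte per reading."""
    import serial  # pyserial; imported lazily so the parser works without it
    with serial.Serial(portenta_port, baud, timeout=1) as src, \
         serial.Serial(uno_port, baud, timeout=1) as dst:
        while True:
            class_id = parse_class_id(src.readline())
            if class_id is not None:
                dst.write(bytes([class_id]))  # Uno maps this ID to a color
```

The parser rejects anything that is not a valid class index, so line noise on the serial link cannot push a bogus color to the matrix.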

Figuring out the chain from the Portenta H7 to the computer, to the Uno, and out to the NeoPixel matrix was difficult to navigate at first, but we worked it out through trial and error. A major issue we ran into was the Portenta failing to connect to the serial port, which we solved by re-flashing the firmware and changing how we uploaded the code to the board. A minor issue was time pressure on the 3D printer when making our Arduino cover: we could not use power tools to clean it up, so it fell short of our aesthetic expectations.
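The Uno firmware's color logic, mapping each emotion to a color and easing between colors, can be illustrated in Python; the specific palette below is an illustrative stand-in for our color-psychology choices, not the exact values we shipped:

```python
# Illustrative emotion-ID -> RGB palette (loosely based on color psychology)
PALETTE = {
    0: (255, 215, 0),    # happy     -> warm yellow
    1: (0, 0, 255),      # sad       -> blue
    2: (255, 0, 0),      # angry     -> red
    3: (255, 140, 0),    # surprised -> orange
    4: (255, 255, 255),  # neutral   -> white
    5: (128, 0, 128),    # fearful   -> purple
}

def lerp_color(a, b, t):
    """Linearly interpolate between two RGB colors; t runs from 0.0 to 1.0."""
    return tuple(int(ca + (cb - ca) * t) for ca, cb in zip(a, b))

def transition(from_id, to_id, steps=10):
    """Build a smooth sequence of colors from one emotion's color to another's."""
    a, b = PALETTE[from_id], PALETTE[to_id]
    return [lerp_color(a, b, i / (steps - 1)) for i in range(steps)]
```

On the actual Uno, the same interpolation runs in the Arduino sketch, with each intermediate color pushed to the NeoPixel matrix with a short delay to create the fade.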

What did we learn?
The biggest thing we learned was how to use the Arduino Portenta. Before this hackathon, neither of us had ever seen one. Now we know how to build a machine learning model for the Portenta in Edge Impulse, connect it to OpenMV, and interface from the Portenta to a computer and from the computer to the LED grid. We also learned a lot about NeoPixel interfacing with Arduino, facial recognition and detection algorithms, and serial communication.
What’s next?
We are hoping to expand and classify more emotions in the future: six human emotions are not enough to encapsulate how humans express themselves to one another. Combining different design patterns with more colors on the LED pixel matrix is on the radar as well. We also hope to train a more accurate model that beats our current 40% accuracy, although most emotion-detection models top out at relatively low accuracy (55-60% at most).




Built with
  • arduino
  • edge-impulse
  • portenta
  • pyserial
  • python

Try it here - GitHub Repo