I am a driven and ambitious computer science student currently pursuing a first-class honours (1.1) degree at IADT. With a passion for innovation and technology, I am already making waves in the industry, working full-time as a technical specialist at Microsoft's MTC. In this role, I oversee client experiences as a project manager, ensuring that projects are delivered on time and on budget, while also providing technical expertise to clients.
In summary, I am a highly motivated and accomplished computer science student and professional, making a name for myself in the industry. With a strong technical background, I am poised for continued success in my career and academic pursuits.
This video demonstrates the functionality of the application. It goes into great depth, explaining the code line by line. It also contains a brief discussion of the technologies used throughout the development of the application and details the steps for submitting an image for object detection.
My thesis incorporates facial feature detection using a Haar classifier to identify specific facial landmarks such as the eyes, nose, and mouth. The system generates a confidence rating for each emotion detected using the emotional facial recognition system, presented as a linear spectrum for greater transparency and accuracy. Custom parameters are also included to tailor the system to different use cases and environments. To enable manipulation and storage of data, each instance of emotion detection is converted to a JPEG image format, allowing for easy tracking, graphing, and analysis of results. This system aims to provide a highly customizable and accurate emotional facial recognition solution.
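For illustration, here is a minimal sketch of how a single detection could be saved in JPEG form, assuming an OpenCV BGR frame and a bounding box from the Haar classifier; the helper name and file-naming scheme are illustrative rather than the exact thesis code.

```python
import time
import cv2

def save_detection(frame, box, emotion, confidence):
    """Crop the detected face region and persist it as a JPEG."""
    x, y, w, h = box
    crop = frame[y:y + h, x:x + w]                        # face region only
    filename = f"{emotion}_{confidence:.0f}_{int(time.time())}.jpg"
    cv2.imwrite(filename, crop)                           # format chosen by the .jpg extension
    return filename
```

Encoding the emotion label and confidence into the filename keeps each saved frame traceable back to the detection that produced it.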
Graphing and File Saving:
To aid in data analysis, my thesis project includes a system for graphing and saving data. The system allows for real-time graphing of the emotional expressions detected by the facial recognition algorithm. Additionally, the system logs all data and saves it as CSV, Excel, and PKL files for later manipulation and analysis. The file-saving system was implemented using the Python Pandas library, which allows for easy manipulation of data and integration with other analysis tools. Overall, this system provides a valuable tool for analyzing emotional expression trends over time and can help in understanding the effectiveness of emotion recognition algorithms in various contexts.
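A minimal sketch of that logging layer is shown below, assuming detections arrive as timestamped emotion scores held in a pandas DataFrame; the file names and column names are illustrative (writing the Excel file also assumes openpyxl is installed).

```python
import pandas as pd

log = pd.DataFrame(columns=["timestamp", "emotion", "confidence"])

def record(emotion, confidence):
    """Append one detection and persist the log in all three formats."""
    global log
    row = {"timestamp": pd.Timestamp.now(), "emotion": emotion, "confidence": confidence}
    log = pd.concat([log, pd.DataFrame([row])], ignore_index=True)
    log.to_csv("emotion_log.csv", index=False)      # CSV for spreadsheets and other tools
    log.to_excel("emotion_log.xlsx", index=False)   # Excel (requires openpyxl)
    log.to_pickle("emotion_log.pkl")                # PKL for fast reload back into Python

record("happy", 87)
```

Keeping everything in a single DataFrame also makes the graphing step straightforward, since the same object can be passed directly to plotting libraries such as Matplotlib.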
The physical component of my thesis is a Smart Mirror that integrates an infrared touch frame and a see-through mirror acrylic sheet with a 32-inch monitor. This Smart Mirror displays the emotion analysis results generated by the Facial Recognition System on a real-time graph, allowing users to track their emotions over time. Its compact design, which includes a wooden frame, makes it easy to install in different settings. The Smart Mirror is a novel and innovative addition to the system, providing users with an intuitive and interactive way of accessing and visualizing their emotional data.
Here I go into great detail on how the mirror was made, its functionality, and its results.
The Raspberry Pi and infrared frame were integral components in the creation of the smart mirror, which was designed as a physical complement to the software system developed for the thesis. The infrared touch frame allowed for user interaction with the mirror, while the Raspberry Pi served as the computing platform. Building the frame from scratch required a significant amount of woodworking expertise, which I had to learn and apply during the process. The final product was a result of careful planning, design, and execution, all based on the objectives of the thesis project.
In my thesis project, I initially implemented Azure Face API for facial recognition and feature detection. However, during the development phase, I found that many of the API functions that I was using became deprecated. Therefore, I had to switch to alternative libraries and implement my own algorithms to achieve accurate facial detection and feature recognition. While the use of Azure Face API would have provided a convenient solution, the deprecated functions made it necessary to find alternative approaches. Despite this setback, the experience allowed me to gain a deeper understanding of facial recognition algorithms and how they can be used in real-world applications.
The code recognizes emotions and faces in a webcam video stream using OpenCV. It loads pre-trained face, eye, and smile detection models from OpenCV and then opens a connection to the default webcam. It captures frames from the video stream, displays them, and processes each detected face. For each face, it draws a rectangle around it, crops the face region from the frame, and then detects eyes and smiles in the face region using the eye and smile detection models.
If both eyes and a smile are detected, it draws rectangles around the eyes and the smile, and then searches for crow's feet, or wrinkles, around the outer corners of the eyes. If crow's feet are detected, it draws a green circle around the eye. The code has now been modified to calculate a confidence percentage based on the presence of a smile and crow's feet around the eyes. The confidence percentage is appended to the text displayed next to the recognized face, which is obtained from the name parameter passed to the recognize_emotion_and_face() function.
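The sketch below captures the structure of that loop, using the standard Haar cascade files bundled with OpenCV; the crow's-feet wrinkle search is simplified to an eye check here, and the function name and 50/50 scoring split follow the description above rather than the exact thesis code.

```python
import cv2

# Pre-trained Haar cascades shipped with OpenCV
face_cascade = cv2.CascadeClassifier(cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
eye_cascade = cv2.CascadeClassifier(cv2.data.haarcascades + "haarcascade_eye.xml")
smile_cascade = cv2.CascadeClassifier(cv2.data.haarcascades + "haarcascade_smile.xml")

def recognize_emotion_and_face(frame, name="User"):
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = face_cascade.detectMultiScale(gray, scaleFactor=1.3, minNeighbors=5)
    for (x, y, w, h) in faces:
        cv2.rectangle(frame, (x, y), (x + w, y + h), (255, 0, 0), 2)   # face box
        roi_gray = gray[y:y + h, x:x + w]
        roi_color = frame[y:y + h, x:x + w]
        eyes = eye_cascade.detectMultiScale(roi_gray, 1.1, 10)
        smiles = smile_cascade.detectMultiScale(roi_gray, 1.7, 20)
        confidence = 0
        if len(smiles) > 0:
            confidence += 50                                           # smile cue
            for (sx, sy, sw, sh) in smiles:
                cv2.rectangle(roi_color, (sx, sy), (sx + sw, sy + sh), (0, 0, 255), 2)
        if len(eyes) >= 2:
            confidence += 50                                           # stand-in for the crow's-feet check
            for (ex, ey, ew, eh) in eyes:
                cv2.circle(roi_color, (ex + ew // 2, ey + eh // 2), ew // 2, (0, 255, 0), 2)
        cv2.putText(frame, f"{name}: {confidence}%", (x, y - 10),
                    cv2.FONT_HERSHEY_SIMPLEX, 0.7, (0, 255, 0), 2)
    return frame

cap = cv2.VideoCapture(0)                                              # default webcam
while True:
    ok, frame = cap.read()
    if not ok:
        break
    cv2.imshow("EFRS", recognize_emotion_and_face(frame))
    if cv2.waitKey(1) & 0xFF == ord("q"):                              # press q to quit
        break
cap.release()
cv2.destroyAllWindows()
```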
This thesis project aims to develop an Emotional Facial Recognition System (EFRS) using Artificial Intelligence (AI) and Machine Learning (ML) techniques. The EFRS utilizes image processing techniques and facial recognition algorithms to detect emotions from facial expressions. The project focuses on developing a modular architecture that enables integration with various facial recognition algorithms and AI models. In addition, the project involves designing and building a physical component, a smart mirror, to provide a user-friendly interface for the system. The project also includes various testing methodologies, including functional testing, unit testing, and user testing, to ensure the system's reliability, accuracy, and usability. The system also tracks, graphs, and saves the data into CSV, Excel, and PKL files for later analysis and manipulation. Overall, this thesis project aims to provide an effective and efficient system for detecting emotions in real time, which can have numerous applications in various fields, such as healthcare, education, and marketing.
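As one illustration of the unit-testing layer, the sketch below exercises the confidence calculation in isolation; compute_confidence and its 50/50 scoring split are hypothetical helpers standing in for the thesis code.

```python
import unittest

def compute_confidence(smile_detected, crows_feet_detected):
    """Combine the two facial cues into a 0-100 confidence percentage."""
    return (50 if smile_detected else 0) + (50 if crows_feet_detected else 0)

class TestConfidence(unittest.TestCase):
    def test_no_cues(self):
        self.assertEqual(compute_confidence(False, False), 0)

    def test_smile_only(self):
        self.assertEqual(compute_confidence(True, False), 50)

    def test_both_cues(self):
        self.assertEqual(compute_confidence(True, True), 100)

if __name__ == "__main__":
    unittest.main()
```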