Personal Programming Project at FIEA: Dev Diary #4

Updated: Jun 30, 2019

Here I have a menu system that I added to my biometric tool. We have a playtest ID textbox for user tests, labels describing whether or not the biometrics are connected, and a few buttons: Start, Reset, and Quit. As one may guess, the Start button kicks off the data capture and the Quit button exits the application.

The Reset button is a bit more involved. I added it for a couple of reasons, mainly tied to how the heartbeat sensor was behaving. If the sensor went out of range or the user took it off, it had a hard time reconnecting. Also, if the experience was started and no heartbeat sensor was found initially, the system wouldn't keep checking; the API would settle for a "no connection found" error. That wasn't okay with me, so I dove into the API and learned how I could initiate a reset for the system. After some time getting more familiar with the API, I ended up exposing an object in a script where I grab the heartbeat component. With that component, I initiate a rescan for the heartbeat sensor in Reset's OnClick event. Then, if the sensor is found, I resend the device information to the API, which jump-starts the system.

Once the system started, I laid out some of the feedback on the screen and added a key bind to stop the capture and return to the main menu. At first I tried using a button on this screen for stopping, but for some reason the button would only fire once no matter what I did, so I settled for a key binding instead.
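The Reset flow boils down to "rescan, and if a sensor turns up, resend its device info." Here's a rough Python sketch of that logic; the `HeartbeatAPI` class and its `rescan`/`send_device_info` methods are stand-ins I made up for illustration, not the actual sensor SDK:

```python
# Hypothetical stand-in for the heartbeat sensor SDK -- names are illustrative.
class HeartbeatAPI:
    def __init__(self):
        self.connected = False
        self._devices = ["HR-Sensor-01"]  # pretend one sensor is back in range

    def rescan(self):
        # Actively re-scan instead of settling for "no connection found"
        return list(self._devices)

    def send_device_info(self, device_id):
        # Re-sending the device info jump-starts the capture system
        self.connected = True
        return self.connected


def on_reset_clicked(api):
    """Mirrors the Reset button's OnClick: rescan, then reconnect if found."""
    devices = api.rescan()
    if devices:
        return api.send_device_info(devices[0])
    return False
```

The key design point is that Reset is the user's escape hatch: rather than restarting the whole application when the sensor drops out, one click re-runs discovery and re-registers the device.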

Now that I had the menu system in place, I decided to work on optimizing and expanding my Excel report. Originally, the facial coding system just spit the numbers into the Excel sheet with no real organization or graphs. As seen below, this is what it looked like before:

But now, I've changed my logic to separate the emotions into different columns, and I dynamically create graphs for each emotion! Then, with the playtest ID, I can build an output folder of all my different reports.
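A minimal sketch of how a report like that could be generated with openpyxl. The emotion names, column layout, and `reports/<playtest id>.xlsx` path are my illustration, not the tool's actual format:

```python
# Sketch: one column per emotion, one dynamically created chart per column,
# one output file per playtest ID. Assumes openpyxl is installed.
import os
from openpyxl import Workbook
from openpyxl.chart import LineChart, Reference

EMOTIONS = ["Joy", "Anger", "Surprise", "Fear"]  # hypothetical emotion set

def write_report(playtest_id, samples, path=None):
    """samples: list of dicts mapping emotion name -> score for one frame."""
    wb = Workbook()
    ws = wb.active
    ws.append(["Frame"] + EMOTIONS)  # header row: one column per emotion
    for frame, sample in enumerate(samples, start=1):
        ws.append([frame] + [sample.get(e, 0.0) for e in EMOTIONS])

    # Dynamically create one line chart per emotion column
    for col, emotion in enumerate(EMOTIONS, start=2):
        chart = LineChart()
        chart.title = emotion
        data = Reference(ws, min_col=col, min_row=1, max_row=len(samples) + 1)
        chart.add_data(data, titles_from_data=True)
        ws.add_chart(chart, f"G{2 + (col - 2) * 16}")  # stack charts vertically

    # One report file per playtest ID
    if path is None:
        os.makedirs("reports", exist_ok=True)
        path = os.path.join("reports", f"{playtest_id}.xlsx")
    wb.save(path)
    return path
```

Keying the filename off the playtest ID is what makes the per-session output folder work: every run drops its own self-contained spreadsheet with the graphs already embedded.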

This project is really coming together! For fun, I decided to create a horror game that uses these biometrics (which I'll describe in a separate developer diary). But I will at least say that I presented the game in front of longtime industry veterans from companies like EA, and they LOVED IT. They were super impressed with the experience, and pulling up the Excel sheet with the data points at the end really drove it home. I think I gained some industry connections that day, haha.

I continued my research into CHI Play papers from 2018 dealing with biometrics. It's really cool learning about what others did before me and how I can offer more research to this community. This time I looked into how facial expression feedback was used in "Neural Network Based Facial Expression Analysis of Game Events: A Cautionary Tale." This study found an ambiguity in mapping facial expressions to simple emotions, because emotions can be much more complex: just because someone smiles when Mario falls off a ledge doesn't necessarily mean they are happy. The results of this study are in line with previous work on physiological measurements of player affect, and better results were obtained by averaging over several game event instances. All the more reason for me to combine both emotion- and physiology-based feedback for more accurate results. If all goes well, perhaps this project and research will find its way into next year's CHI Play. :)

I did not get to work on the EyeX this week in trying to capture the eye diameter. But after I give my big midterm status update to the faculty and my classmates this week, I'm going to find time to see what I can do with the EyeX software. Until next week my friends!
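The averaging idea from that paper is simple enough to sketch. The event name and scores below are made up; the point is just that a single expression reading at one event instance is noisy, while the mean over all instances of that event is more stable:

```python
# Toy sketch: average each emotion score over all instances of the same
# game event, per the paper's finding. Events and scores are made up.
from collections import defaultdict

def average_by_event(readings):
    """readings: list of (event_name, score) tuples from the facial coder."""
    buckets = defaultdict(list)
    for event, score in readings:
        buckets[event].append(score)
    return {event: sum(scores) / len(scores) for event, scores in buckets.items()}

readings = [
    ("mario_falls", 0.75),  # a smile here may not mean "happy"...
    ("mario_falls", 0.25),
    ("mario_falls", 0.50),  # ...but averaging over instances smooths the noise
]
print(average_by_event(readings))  # -> {'mario_falls': 0.5}
```

The same aggregation would apply just as well to the heartbeat stream, which is part of why combining the two signals per event seems promising.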
