Welcome to my first dev diary entry for my Personal Programming Project!
In my third semester at the Florida Interactive Entertainment Academy at the University of Central Florida, the programmers were tasked with coming up with our own PPP (Personal Programming Project) that would span 11 weeks. We needed to present to the class, and more importantly our instructors, what our project would be and its timeline. The instructors would then give feedback on our proposals and suggest directions for the project. As soon as the assignment was announced, I started thinking about my project, and once I settled on an idea I began working on it right away.
The Proposal:
After some consideration, I found myself really interested in the idea of biofeedback-enhanced experiences and metrics that would quantify them. I once wrote a game design document for a game design class that outlined a first-person survival horror biofeedback-enhanced experience, much like the biofeedback-enhanced adventure thriller Nevermind but with some twists. The document was well received by industry veterans who encouraged me to bring the idea to fruition. The independent game company Flying Mollusk, the developers of Nevermind, also showed me that the technology is there and that it works. But I only have 11 weeks, and something with the scope of Nevermind would simply be impossible with that limited time and those resources. So I asked myself: what if I could capture just a small part of that experience? That led me to Jesse Schell's concept of the interest curve: the idea that a well-paced experience, plotted on a graph, should rise and fall like a roller coaster. With biometrics, I'm thinking I could show exactly what makes a good gaming experience: the right engagement factors, well paced. I would capture data and see whether I can successfully produce part of the interest curve, and identify experiences that don't produce the curve. So, my purpose: quantify the interest curve (and possibly other reactions) through biometrics to learn and showcase how to craft these experiences.
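To make the "quantify the interest curve" goal concrete, here is a toy sketch (my own illustration, not anything from Schell's book or my actual tool) of one measurable property of a roller-coaster-shaped curve: interest should rise and fall, with each local peak higher than the last.

```python
# Toy check of one interest-curve property: successive local peaks escalate.
# "Interest" samples here are made-up (time, value) pairs.

def local_peaks(samples):
    """Return (time, value) pairs that are higher than both neighbors."""
    return [
        samples[i]
        for i in range(1, len(samples) - 1)
        if samples[i][1] > samples[i - 1][1] and samples[i][1] > samples[i + 1][1]
    ]

def peaks_escalate(samples):
    """True if each local peak is higher than the one before it."""
    peaks = local_peaks(samples)
    return all(b[1] > a[1] for a, b in zip(peaks, peaks[1:]))

# A roller-coaster-shaped session: rising peaks with valleys between them.
session = [(0, 0.2), (1, 0.5), (2, 0.3), (3, 0.7), (4, 0.4), (5, 0.9), (6, 0.6)]
print(peaks_escalate(session))  # True for this escalating example
```

A real version would of course derive the "interest" values from the biometric signals rather than hand-written numbers, which is exactly the hard part this project is about.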
I planned to propose creating a short biofeedback-enhanced first-person horror experience. I would use a standard webcam for emotional feedback, along with a Bluetooth heartbeat sensor for physiological feedback, and would try any other biometrics that seemed doable. I would use the metrics both for research and for dynamic gameplay: the game would be altered based on the feedback. At the end of the experience, I would then have the data to see whether I had successfully created the curve. If not, I would keep working on the experience until I could make it happen. Since the Affdex facial coding SDK has only been used with Unity, I decided this would best be created in the Unity game engine. I also thought it would be cool if players could compare their results on a high-score board.
In the week before I proposed the project to the class, I went ahead and took some first steps. I contacted Affectiva and their partner iMotions to learn more about their technologies. They told me that a department at my college actually had a license for iMotions. I have yet to find out which one, but in the meantime I do have the SDK to work with. Then I contacted Flying Mollusk, the creators of Nevermind. They were very happy to hear about my project and have been in touch with all kinds of advice and direction; I look forward to staying in contact with them throughout this process. Next, I wrote the short story and gameplay elements for my game and created some basic starting scripts, and a friend started white-boxing the level. Lastly, with much enthusiasm, I prepared my proposal, which would most certainly begin with the words, "As game developers, we often have to think about our target audience…"
The Feedback:
The day finally came when I proposed this idea to my class in the form of a PowerPoint, with some early prototypes to show off. I think my excitement for the project really showed in my preparation for the talk. Maybe a little too excited, because I talked for about the full 15 minutes when I was supposed to leave room for feedback in that time. Regardless, the feedback inevitably came. Although the instructors liked the idea of using biometrics to quantify things like the interest curve, they did not like my approach at all, and rightly so. They made a good argument that my first step should be proving that these tools could even quantify something like interest. How would I know I wasn't building my experience on faulty data? I need to first create the tool, then test it on already-built experiences: something that would take many, many weeks to create, test, and prove. As bummed as I was to hear this, given my excitement for this cool game idea, I had to face the truth that they were right. This meant changing my tasks for the project in the hope that the new path would be a more successful one. So here I go!
The New Direction:
After the feedback, it was time to really restructure this project, starting with the goal and timeline. My ultimate goal of quantifying Jesse Schell's interest curve (and other information) through biometrics stayed the same, but my approach is now completely different. I will provide the new timeline below, but in a nutshell: I will devote a great deal of time to creating and refining a biometric measurement tool, which I will then use in many different tests. While building the system I will also explore different biometric sensors and note what works and what doesn't. Between user tests, I will analyze all the feedback I get and keep refining and expanding the tool. If I can polish it enough, maybe I can even extend it for others to use, or for my own future video game projects.

So far, I was able to get the Affdex facial coding SDK working in Unity and started digging into its library. I experimented with hiding and focusing on certain features of the face, and researched which emotions would be relevant. I also created an ANT+ BASICS account, which is apparently what I needed to get my MIO Link heartbeat sensor to transmit data to my computer through "device profile" settings. The annoying thing is that making the profile wasn't enough: the account had to be confirmed through an email sent out the next business day. And since it was a four-day weekend, with Monday being Memorial Day, the "next business day" was four days away. But tomorrow is finally the day, so I hope this heartbeat can be found in the computerized world! I feel I've made substantial progress in the project's new direction. Let's hope my first 1-on-1 with my instructor this week goes well with what is essentially a new proposal, and that this week's tasks run smoothly.
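One problem the unified tool will have to solve is that the heart-rate sensor and the facial-coding SDK deliver samples on their own clocks and at different rates. Here is a hedged Python sketch of aligning the two streams on a shared timeline; the numbers and names are my own placeholders, not the actual Affdex or ANT+ APIs.

```python
# Sketch: pair each facial-expression timestamp with the nearest
# heart-rate reading, so both signals can be analyzed on one timeline.
import bisect

def nearest_sample(timestamps, values, t):
    """Return the value whose (sorted) timestamp is closest to time t."""
    i = bisect.bisect_left(timestamps, t)
    candidates = [j for j in (i - 1, i) if 0 <= j < len(timestamps)]
    best = min(candidates, key=lambda j: abs(timestamps[j] - t))
    return values[best]

# Heart rate arrives ~1 Hz; facial scores arrive several times per second.
hr_times, hr_values = [0.0, 1.0, 2.0, 3.0], [72, 75, 90, 88]
face_times = [0.0, 0.4, 0.8, 1.2, 1.6, 2.0]
joined = [(t, nearest_sample(hr_times, hr_values, t)) for t in face_times]
print(joined)  # each facial timestamp paired with the nearest heart-rate reading
```

A production version would likely interpolate between readings rather than snap to the nearest one, but nearest-neighbor matching is enough to get both signals onto one graph.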
The New Timeline:
Week 1 - 5/20
-Set up account for ANT+ BASICS (confirmation on the next business day, so the Tuesday after the holiday)
-Re-think and re-write proposal timeline
-Create the first major blog entry covering everything that has happened so far with the project and setting up the reader for what is ahead
Week 2 - 5/27
-Research which expressions/heart rates are important for interest and tension, and how to interpret these factors
-Get the heartbeat sensor set up and working with the engine
-Get eye tracking set up and working
-Next blog entry
Week 3 - 6/3
-Research the possibility of a Galvanic Skin Response sensor for this project; if successful, set it up
-Research the possibility of an Electroencephalogram for this project; if successful, set it up
-Unify all these tools into one Unity project
-Next blog entry
Week 4 - 6/10
-Start the Gather Metrics script that will collect the needed metrics from these tools
-Write the export script for the metrics
-Next blog entry
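The Week 4 export script could be very simple if the gathered metrics are just timestamped rows. A sketch in Python (the real version would be a Unity/C# script, and these field names are placeholders):

```python
# Sketch: dump timestamped biometric rows to CSV for later graphing.
import csv
import io

def export_metrics(rows, stream):
    """Write timestamped biometric rows to a CSV stream."""
    writer = csv.DictWriter(stream, fieldnames=["time", "heart_rate", "joy", "fear"])
    writer.writeheader()
    writer.writerows(rows)

rows = [
    {"time": 0.0, "heart_rate": 72, "joy": 0.1, "fear": 0.0},
    {"time": 1.0, "heart_rate": 75, "joy": 0.0, "fear": 0.4},
]
buf = io.StringIO()  # a real run would write to a file instead
export_metrics(rows, buf)
print(buf.getvalue())
```

Keeping the export format this plain means the graphs in Week 5 can be built in any tool that reads CSV.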
Week 5 - 6/17
-Create graphs that will use the data
-Write the high-score board
-Test video recording latency against the stored metrics/timestamps
-Test the metric system on myself with game demos
-Plan the first live player tests that will use the biometric system
-Next blog entry
Week 6 - 6/24
Play Test 1 & Status Update
-Game 1 for the live player test: The Last of Us (this would need a webcam separate from the laptop's, since the player would be focused on a TV instead of the laptop)
-Game 2: Assassin's Creed Odyssey
-Get feedback from players through both biometrics and their written feedback (which I would read later)
-Next blog entry
Week 7 - 7/1
-Analyze data from all the play sessions with the created graphs and look for patterns
-Compare to Jesse Schell's interest curve concept
-Tweak scripts for any wrong interpretations/calculations of the data
-Write a hypothesis based on the biometrics and specific moments I noticed, and see how closely it matches the players' written feedback
-Next blog entry
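One likely first step in the Week 7 analysis: raw biometric readings are noisy, so the series probably needs smoothing before any interest-curve patterns can be spotted. A minimal sketch, assuming a simple trailing moving average (the window size is a guess to tune against real sessions):

```python
# Sketch: smooth a noisy biometric series with a trailing moving average
# before looking for interest-curve shapes in the graphs.

def moving_average(values, window=3):
    """Average each value with up to (window - 1) preceding values."""
    out = []
    for i in range(len(values)):
        chunk = values[max(0, i - window + 1) : i + 1]
        out.append(sum(chunk) / len(chunk))
    return out

noisy_heart_rate = [70, 90, 72, 95, 74, 98]
print(moving_average(noisy_heart_rate))  # jitter damped, upward trend preserved
```

A larger window smooths more aggressively but also blurs exactly the short spikes (a jump scare, a sudden kill) that this project cares about, so the right value can only come from comparing the smoothed graphs against the players' written feedback.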
Week 8 - 7/8
Play Test 2 (players can choose games from a list, but each must play a classic in addition)
-Game 1: Tetris, Outrun, Pacman, etc.
-Game 2: Overwatch
-Game 3: Portal
-Game 4: Doom
-Game 5: Psychonauts
-Get feedback from players through both biometrics and their written feedback (which I would read later)
-Next blog entry
Week 9 - 7/15
-Analyze data from all the play sessions with the created graphs and look for patterns
-Compare to Jesse Schell's interest curve concept
-Tweak scripts for any wrong interpretations/calculations of the data
-Write a hypothesis based on the biometrics and specific moments I noticed, and see how closely it matches the players' written feedback
-Next blog entry
Week 10 - 7/22
Play Test 3 (Student Projects)
-Game 1: Malediction
-Game 2: Studio Scrapbot
-Game 3: Snowmad
-Game 4: Divine Beast
-Game 5: One of my personal games
-Get feedback from players through both biometrics and their written feedback (which I would read later)
-Next blog entry
Week 11 - 7/29
-Soak testing
-Finalize menus
-Practice/present the presentation