When we started designing a game played by blowing into the microphone, we had a few immediate questions we needed to answer:
What exactly is the interaction?
What will the player be doing and how will they be doing that?
How do we get players to blow? How do we get them to engage with the core mechanic?
Starting with the last question: how do we get players to blow and engage with the core mechanic? In early prototypes, we asked players to blow away a dandelion and some fog. These were intentional choices: to encourage players to blow, we introduced a number of objects with real-world metaphors that signify blowing. Dandelions have seeds that can be blown away; fog hangs in the air and visibly reacts to our breath.
There are 3D dandelions in the game environment, and a dandelion that appears on the UI. We tested the dandelion UI very early on as part of our calibration (telling players to "blow the dandelion away"), which proved extremely effective and which players really liked.
Fog was a little trickier. Early playtests showed that people knew the fog could be blown away, but it was very hard to communicate interaction distance: no matter how much bigger we made the fog's interaction radius, players always seemed to be just barely outside of it. We addressed this by making the dandelion UI the main driver of blowing interactions. In calibration, we tell players that when the dandelion pops up, their blowing will have an impact on the game. Pairing the in-game 3D objects that signify blowing with interaction UI was well received by players and gave us a standard for interactions.
For objects that must be blown on (e.g. a dandelion or fog that is a puzzle element and progresses the game state), the dandelion UI appears when they can be interacted with. For optional interactions (e.g. a pinwheel, or blowing at the Wisp at any time), no dandelion UI appears, but every such object must still have a real-world metaphor for something that reacts to blowing or the wind.
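That standard can be sketched as a simple range check: the dandelion UI appears only when the player is within the interaction radius of a *required* blowable object, while optional objects never trigger it. This is a minimal illustration, not the game's actual code; the names and radii here are hypothetical.

```python
import math
from dataclasses import dataclass

@dataclass
class Blowable:
    x: float
    y: float
    required: bool   # True for puzzle objects (fog, dandelion), False for optional ones (pinwheel)
    radius: float    # interaction radius

def show_dandelion_ui(player_x, player_y, objects):
    """Return True if the dandelion UI should appear: the player is within
    range of at least one required blowable object."""
    for obj in objects:
        if not obj.required:
            continue  # optional objects never show the dandelion UI
        if math.hypot(obj.x - player_x, obj.y - player_y) <= obj.radius:
            return True
    return False

# Hypothetical layout: fog blocks progress, a pinwheel is optional decoration.
fog = Blowable(x=0.0, y=0.0, required=True, radius=5.0)
pinwheel = Blowable(x=20.0, y=0.0, required=False, radius=5.0)

print(show_dandelion_ui(3.0, 0.0, [fog, pinwheel]))   # inside fog range -> True
print(show_dandelion_ui(18.0, 0.0, [fog, pinwheel]))  # only near the pinwheel -> False
```

Keeping the check on the `required` flag is what lets optional objects (pinwheels, the Wisp) still react to blowing without promising the player a game-state change.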
Initial fog prototype and UI appearing to tell players the fog can be cleared.
UI was particularly difficult for us to tackle. We faced a few issues early on:
Players did not know when they could blow, when they could not blow, and when they needed to blow.
Players did not know when their blowing was affecting the world; in other words, they could not tell what output their blowing input was causing.
We experimented with a couple of types of UI. The first was a constant HUD element in the bottom right of the screen that showed a dandelion being blown away whenever a blow was detected (similar to One Hand Clapping, where the developers show a volume meter in the bottom left displaying the sound input the microphone is receiving at all times). It would bounce by default when the microphone picked up sound, then play a special "blowaway" animation near an object that needed to be blown on. Playtesting showed that players didn't like this UI: it still didn't really show them when the game was receiving blow input, and it didn't communicate when they needed to blow.
We next looked at Before Your Eyes. In Before Your Eyes, blinking advances time, but only when the metronome UI appears. Players can blink freely at any other point in the game; the game shows a blink animation, and occasionally a special interaction happens, but the game will not move on to the next scene. We implemented something similar, and around this time we also polished up our calibration sequence. Previously it was just a simple screen that asked you to blow the dandelion away; we added some tutorialization in calibration that tells players: "when you see this dandelion, your blowing will create a change in the world". Players received this change very well and reported that it was very clear when they needed to blow. The downside was that some players became less exploratory and blew less often to see what changes it would cause. The introduction of pinwheels (3D blowable objects in the game world) helped alleviate this issue.
The dandelion only appears when there's an object that must be blown away to continue on in the game. The player can still blow even when the dandelion UI is not present. The Wisp will spin and there will be the blowing camera and VFX, but the game state will not change otherwise.
Calibration, first implementation.
Calibration, second implementation.
Calibration, third implementation.
The very first interaction we built was the basic dandelion. Players would encounter a cliff they couldn't get to the top of, blow a dandelion which would trigger an updraft, press 'E' to enter the breeze, float to the top, and be able to proceed.
Our very FIRST playable with functional microphone detection and gameplay!
With our basic interactions and microphone working, we began prototyping gameplay that built towards our experience goal. Each flower in the game has a unique mechanic triggered by the player engaging in a matching breathing pattern. We had two types of blowing/breathing interactions: continuous interactions require players to maintain the breathing pattern to sustain an effect, while instance interactions require players to perform the pattern once, for a set period of time, for the mechanic to execute.
The chrysanthemum, as the tutorial flower, didn't require any pattern. Just natural breathing. This interaction is continuous.
The petunia required a big angry blow (like the Big Bad Wolf blowing the house down). This interaction is a single instance.
The forget-me-not required a long, held, quiet blow. This interaction is a single instance.
The calendula (before it was cut) required calm, steady breathing (box breathing). This interaction is continuous.
The lily-of-the-valley required a quiet exhale. This interaction is a single instance.
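The continuous/instance split above can be sketched as two small update loops; this is an illustrative sketch, not the project's implementation, and the class names and frame timing are assumptions.

```python
class ContinuousInteraction:
    """Effect holds only while the breathing pattern is maintained
    (e.g. the chrysanthemum's natural breathing)."""
    def __init__(self):
        self.active = False

    def update(self, pattern_detected: bool) -> bool:
        self.active = pattern_detected  # effect tracks the input every frame
        return self.active

class InstanceInteraction:
    """Effect fires once after the pattern is held for `hold_time` seconds
    (e.g. the forget-me-not's long, quiet blow)."""
    def __init__(self, hold_time: float):
        self.hold_time = hold_time
        self.elapsed = 0.0
        self.fired = False

    def update(self, pattern_detected: bool, dt: float) -> bool:
        if self.fired:
            return True  # already executed; stays done
        self.elapsed = self.elapsed + dt if pattern_detected else 0.0
        if self.elapsed >= self.hold_time:
            self.fired = True
        return self.fired

# Usage: the continuous effect drops the moment the pattern stops,
# while the instance effect latches once the hold time is reached.
cont = ContinuousInteraction()
print(cont.update(True), cont.update(False))       # True False
inst = InstanceInteraction(hold_time=1.0)
print([inst.update(True, 0.5) for _ in range(3)])  # [False, True, True]
```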
For example, the calendula flower would make the Wisp heavy, but could also slow down objects around the Wisp if the player breathed slowly and calmly. The prototype had two parts. In the first, the Wisp falls and the player has to huff and puff to keep their friend afloat. Players were stressed and scared watching the Wisp fall; the gravity and blow counter-forces were tuned so the player would always fail eventually, falling into a deep pit. The second half taught players that breathing calmly could slow down surrounding objects, allowing the Wisp to pass (e.g. an avalanche of rocks that can be slowed so the Wisp passes through unharmed). This prototype was highly successful, and players displayed real distress when they realized they couldn't keep the Wisp afloat.
Calendula prototype. The Wisp is heavy carrying the flower and will fall down. The player must blow to keep the Wisp afloat.
The unique flower and breathing mechanics were a difficult challenge, from both an engineering and a design perspective. We had no idea how we would detect the continuous interactions for the chrysanthemum and calendula: having the microphone continuously listen for specific patterns would make false positives (the pattern was detected but the player didn't actually do anything) a real problem, and if people breathed very quietly, the microphone would not pick them up at all. Blowing is a little easier to detect because it actually creates noise.
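The asymmetry described above, where blowing is detectable but quiet breathing often is not, comes down to signal loudness. A common way to gate on loudness is an RMS envelope threshold per audio frame; the sketch below is a generic illustration with made-up synthetic samples and a hypothetical threshold, not the project's detector.

```python
import math
import random

def rms(samples):
    """Root-mean-square amplitude of one audio frame."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

def detect_blow(frame, threshold=0.05):
    """A blow into the mic produces noise well above the idle floor, so it
    clears the threshold; quiet breathing often stays below it and is simply
    missed, which is one reason to lean on design rather than perfect detection."""
    return rms(frame) >= threshold

# Synthetic stand-ins for mic frames (normalized samples in [-1, 1]).
random.seed(0)
quiet_breath = [random.uniform(-0.01, 0.01) for _ in range(512)]  # near-silent
blow = [random.uniform(-0.5, 0.5) for _ in range(512)]            # noisy turbulence

print(detect_blow(quiet_breath), detect_blow(blow))  # False True
```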
One of the project's advisors, Zach Lower, the lead designer of One Hand Clapping, gave us some advice: "Are we using the response of the game to get people to sing or make sounds a certain way, or are we using the way people make sounds to show a response from the game?" In essence, he was asking whether we wanted to build a very robust, accurate system that reacts to the way the player behaves, or a loose one that encourages players to physically do what we want without strictly checking for it. Our goal was the latter: as long as players engage with the game in the intended way, our experience goals are met.
This meant two things:
For design, the main focus would be building experiences that encourage players to breathe and blow the way we want them to.
For engineering, the system didn't need to be perfect, but it did need reliable ways to detect the interactions.
Design prototyped the chrysanthemum first. The initial implementation was that it would automatically clear fog as long as players were holding the chrysanthemum and breathing in time with the flower. Players enjoyed and engaged with the mechanic; one player even reported being disappointed when they found out the system wasn't actually listening for whether they were doing the pattern correctly.
Original chrysanthemum mechanic and UI. Players needed to breathe regularly. If they were doing so and holding the chrysanthemum, fog would clear automatically (instead of needing to be blown away by the player).
Design prototyped many different types of UI to pair with the interactions. However, there were too many interactions, and players were still confused about a lot of things:
"I wish the game let me know if I was doing the interaction correctly or not."
"Is the game reacting to me or should I be breathing in time with the UI?"
"Is my breathing actually having an effect on the fog?"
"I missed the UI I couldn't see it or read the text..."
"Is it the flower doing the interaction or is it me...?"
"How long do I have to repeat the pattern? Do I have to at all or is it still singular blows to clear objects?"
"What is the UI telling me to do exactly..."
Changes to our flower mechanics.
Unfortunately, almost all of this had to be simplified: the feedback from players was too mixed, and it was not within scope to address it all.
We simplified all of the interactions to be instance-driven, interactable only when a UI element like the dandelion appears. There are now only two forms of complex blowing interaction in the game: a single blow (dandelion UI) and a held blow until an action completes (forget-me-not/pinwheel UI).
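The two remaining interaction forms reduce to very little logic, which is part of why the simplification worked. A minimal sketch, assuming per-frame boolean blow detections (the function names are illustrative, not from the project):

```python
def single_blow(blow_frames):
    """Dandelion UI: one detected blow completes the interaction."""
    return any(blow_frames)

def held_blow(blow_frames, frames_needed):
    """Forget-me-not/pinwheel UI: the blow must be held continuously;
    progress resets whenever the blow stops before the action completes."""
    progress = 0
    for blowing in blow_frames:
        progress = progress + 1 if blowing else 0
        if progress >= frames_needed:
            return True
    return False

print(single_blow([False, True, False]))        # True
print(held_blow([True, True, False, True], 3))  # gap resets progress -> False
print(held_blow([True, True, True], 3))         # held long enough -> True
```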
Interaction UI prototype.
Interaction UI prototype.
Speech vs. blowing detection has always been our greatest challenge. It's not uncommon for players to get through the game just by talking or making nonsense noises. We want players to be able to play the game without their speech impacting it, so as a stretch goal we are investigating speech detection and how we can cancel it.
Video by: Julia Wang.
Early speech vs. blow detection implementation.
Engineering has researched a couple of approaches and has run many tests studying the sound waves of different interactions. To gather this data, we built a microphone sandbox and asked players to interact with objects while we studied how they used the microphone and whether they understood the interactions.
The data charts below show five testers who were each asked to perform four different interactions:
A short blow (like blowing out a candle).
A long blow (like an exhale).
Reading a sentence ("the quick brown fox jumps over the lazy dog").
A scream.
Engineering is still researching and testing different implementations.
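Since the implementation is still open, here is one commonly used heuristic for separating blowing from speech, purely as an illustration and not the team's chosen approach: blowing directly on a microphone concentrates energy at very low frequencies (turbulence on the diaphragm), while speech spreads energy into harmonics higher up. The sketch computes the fraction of spectral energy below a cutoff with a naive DFT; the cutoff, threshold, and synthetic test signals are all assumptions.

```python
import math

def low_band_ratio(frame, sample_rate=16000, cutoff_hz=300):
    """Fraction of spectral energy below cutoff_hz, via a naive DFT.
    A crude discriminator only: real speech and blows overlap far more
    than these clean synthetic signals do."""
    n = len(frame)
    total = low = 0.0
    for k in range(1, n // 2):
        re = sum(frame[t] * math.cos(2 * math.pi * k * t / n) for t in range(n))
        im = sum(frame[t] * math.sin(2 * math.pi * k * t / n) for t in range(n))
        power = re * re + im * im
        total += power
        if k * sample_rate / n < cutoff_hz:
            low += power
    return low / total if total else 0.0

def looks_like_blow(frame, threshold=0.6):
    return low_band_ratio(frame) >= threshold

# Hypothetical stand-ins: a "blow" made of low-frequency components only,
# and "speech" with a 187.5 Hz fundamental plus harmonics above the cutoff.
sr, n = 16000, 256
blow = [sum(math.sin(2 * math.pi * f * t / sr) for f in (62.5, 125.0, 187.5))
        for t in range(n)]
speech = [sum(math.sin(2 * math.pi * 187.5 * h * t / sr) for h in (1, 2, 3, 4, 5))
          for t in range(n)]

print(looks_like_blow(blow), looks_like_blow(speech))  # True False
```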