House Of Ignotus
A horror-themed puzzle-solving game with a heavy emphasis on psychological horror. Play as a quantum physics research assistant searching for his missing professor, and solve puzzles to escape the mansion.
Stuff I Did
Firstly, when the project was being discussed, one of the most important principles of the Final Year Project was doing what we each wanted, so I chose to become the game's audio programmer. We also discussed the important tools the game would need. I was weighing two middleware options, FMOD Studio and Wwise, and we decided on FMOD Studio because its UI is friendlier and easier to use.
Another key decision was using Git for all of the game's commits so that everything lives in the cloud. An FMOD Studio project, however, only works on one PC by default: other people get no sound unless they install FMOD Studio and sync their progress there. So some of the programmers and I discussed building our own Unreal Engine from source with the FMOD and NVIDIA Flow plugins integrated. We then added a Git submodule specifically for FMOD, so all of the BGM and SFX are stored there and stay in sync for every developer who pulls the latest master.
Using FMOD was easy, but the hard part was making sure everything played perfectly. I had trouble with FMOD in the first few days, but after more research and documentation work I came to understand most of its features, including setting up buses, snapshots, mixers, syncing, parameters, and more. The image below shows how many of these functions I used to make sure the audio plays exactly as I programmed it to.
According to the rules set by my Tech Lead and Tech Advisor, managers are very important in our project, as the codebase depends on them heavily. Development flows much more easily when the game has managers, and it is easy for other developers on the team to do their work by reading the class diagrams, documents, and the managers we produce for them. I was therefore also tasked with writing the SoundManager, which gives everyone easy access to play or test a sound simply by using a variable of type Name.
On Event BeginPlay, the first thing I do is check whether the SoundManager variable is available in the level. I also store the SoundManager variable in the Game Instance, so it can be accessed from anywhere in the project and is safer to use every time.
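The BeginPlay check above can be sketched in plain C++ as a check-then-cache access pattern; the class and member names below are illustrative stand-ins, not the project's actual Blueprint classes.

```cpp
#include <cassert>
#include <memory>

// Hypothetical stand-ins for the project's SoundManager and Game Instance.
struct SoundManager {
    bool initialized = true;
};

struct GameInstance {
    std::shared_ptr<SoundManager> Sound;

    // BeginPlay-style check: create the manager if the level did not
    // already provide one, so every later caller can rely on it existing.
    SoundManager& GetSoundManager() {
        if (!Sound) {
            Sound = std::make_shared<SoundManager>();
        }
        return *Sound;
    }
};
```

Because the manager lives on the Game Instance, repeated calls return the same object instead of spawning duplicates per level.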
The SoundManager is tied to the props and items in the game, with variables for when they are picked up, dropped, swapped, and so on. It also includes the Make Noise function, which fires when a prop or item makes noise so that the monster can learn the player's location. I built the whole SoundManager using Get (by ref) with the currently picked-up prop or item type, so that everything stays connected and sounds can be played easily without changing much of the code.
I did this by reusing a function one of my teammates wrote, which lets me play a sound based on the Name variable type without changing anything in the code. Everything is tied together so that sounds won't overlap one another either. There is also a "Check Distort MR" boolean I made for the area space effect that distorts sound specifically in the music room.
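The play-by-name idea can be sketched as a simple name-to-event lookup; the event paths and registration API below are assumptions for illustration, not the teammate's actual function.

```cpp
#include <cassert>
#include <string>
#include <unordered_map>

// Sketch of playing sounds by a Name key, as the SoundManager does.
class SoundBank {
public:
    void Register(const std::string& name, const std::string& eventPath) {
        events[name] = eventPath;
    }

    // Returns the FMOD-style event path to start, or empty if unknown,
    // so callers never have to touch event paths directly.
    std::string PlaySound(const std::string& name) const {
        auto it = events.find(name);
        return it != events.end() ? it->second : std::string{};
    }

private:
    std::unordered_map<std::string, std::string> events;
};
```

With this shape, adding a new sound means registering one entry; no calling code changes, which matches the "without changing anything in the code" goal above.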
As the game progresses, if the player does or fails certain things, their sanity level drops. When it drops, it is also my task to play a sound based on the sanity level the player is at, as an indication of which sanity state they are in. In the figure image, I use a master collection connected to the "Play Sound" function to play the right sound according to the Name variable.
I built the Play Sound function around the custom Name variable set up in the figure image above. The function works as intended, and because the SoundManager is stored in the Game Instance it can easily be reached from anywhere in the project. Sounds can also be played by custom events tied to a prop's or item's animation; for example, the music box sound effect is triggered by the music box animation.
I set up the music box inside FMOD by mixing two tracks: audio track 1 has the original version and audio track 2 has the distorted version. The distorted version only plays when the "DistanceMB" parameter value indicates the box is near props or monsters. In the second figure image there is a volume automation I set up so that the two tracks don't interfere with each other when the music box plays; this makes the transition from the original version to the distorted version smooth.
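The volume automation driven by DistanceMB can be sketched as a distance-based crossfade. The equal-power curve and the range value below are assumptions; the actual curve lives in the FMOD project's automation.

```cpp
#include <cassert>
#include <cmath>
#include <utility>

// As distance drops below `range`, the distorted track fades in and the
// original fades out. An equal-power (sin/cos) curve keeps total
// loudness roughly constant during the transition.
// Returns {originalGain, distortedGain}.
std::pair<float, float> MusicBoxGains(float distance, float range) {
    float t = distance >= range ? 0.0f : 1.0f - distance / range;  // 0..1
    const float halfPi = 1.5707963f;
    float distorted = std::sin(t * halfPi);
    float original  = std::cos(t * halfPi);
    return {original, distorted};
}
```

At or beyond the range only the original track is audible; right next to a prop or monster the distorted track takes over completely.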
I set up the music box so that its music only starts from the animation blueprint, which calls the custom event that plays the music box. That call also produces a return value, which is used to pause the music box when the item is pocketed. Using FMOD makes it possible to play and pause the music box without extra work to get the behaviour designed for it. While the music box is in use, it also emits sound through the Make Noise function, which calls the monster over from a distance.
It took a lot of trial and error to get the music box to pause when pocketed. What I did is: when the music box's state is set to powered off, the music box is paused using FMOD's "Set Paused". One problem came up during testing: after restarting the game, the music box would no longer produce music. I solved this by stopping the music box event when the game restarts, so it won't interfere or cause similar problems again.
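The lifecycle described above can be sketched as a small state machine: pocketing pauses the event (mirroring FMOD's Set Paused), and a game restart stops it outright so a stale paused instance can't block the next run. The struct and method names are illustrative.

```cpp
#include <cassert>

// Sketch of the music box event lifecycle.
struct MusicBoxEvent {
    bool started = false;
    bool paused  = false;

    void Start()           { started = true; paused = false; }
    void SetPaused(bool p) { if (started) paused = p; }
    void Stop()            { started = false; paused = false; }

    void OnPocketed()    { SetPaused(true); }
    void OnGameRestart() { Stop(); }  // the fix for the restart bug
    bool Audible() const { return started && !paused; }
};
```

Without the Stop() on restart, the event would survive into the new session still paused, which matches the "no music after restarting" symptom described above.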
I play the music box based on distance, as this method doesn't take much processing power. I made "IsNearProps" and "IsNearMonster" pure functions, and using the music box's return value I can easily change the parameter. Adding a distorted alpha makes the transition smoother, so the player's immersion isn't disturbed when they come near the puzzle props or monsters.
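The "distorted alpha" can be sketched as easing a value toward a target each tick instead of snapping it when IsNearProps/IsNearMonster flips. The rate and tick length below are assumptions for illustration.

```cpp
#include <cassert>
#include <cmath>

// Eases the distortion alpha toward 1 when near a prop/monster and
// toward 0 otherwise, moving at most `rate * dt` per tick so the
// DistanceMB parameter never jumps audibly.
float StepDistortAlpha(float alpha, bool nearTarget, float dt, float rate = 2.0f) {
    float target = nearTarget ? 1.0f : 0.0f;
    float step = rate * dt;
    if (std::fabs(target - alpha) <= step) {
        return target;  // close enough: snap to target exactly
    }
    return alpha + (target > alpha ? step : -step);
}
```

Feeding the smoothed alpha into the event parameter each tick produces the gradual crossfade rather than an abrupt switch.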
The purpose of the Technical Art Documentation is to help the artists with the general parts of development, such as installing the custom Unreal Engine build the team uses, using source control, and following the asset naming convention that the artists and programmers set up and agreed on together. It gives both artists and programmers a single compiled, agreed-upon reference for the naming convention and other art-related matters.
To help the artists become familiar with Unreal Engine as quickly as possible, the Technical Art Documentation was compiled to come in handy during development. A pipeline was set up for them following standard Unreal Engine practice, which is convenient for the artists to use. One example, shown in the image above, is the Content Import Guide, which covers how to import an FBX from the source files into the engine. The steps need to be done in this order so that no data is lost and no file-locking problems occur when saving in the engine. It also lets the artists look up which pipeline to follow if they forget the steps.
The purpose of these documents is to show the team how to set up the sublevels inside the game and how the sublevels communicate with each other. This is important because sublevels are used for level loading and for optimizing the game.
I wrote the Game Audio Direction Documentation so that I would have something to follow for how the sound and music should play inside the game: a set of rules I set for myself so the audio doesn't stray from the horror genre the team designed for.
To produce 3D sound, I first used REAPER to submix the audio from stereo to 7.1 surround. The process is not hard, but the submix takes a long time because every step requires precise attention to detail. I also used other software, such as Nuendo and Audacity, to make the sounds better. In REAPER, to convert to 7.1 surround sound, I used the ReaSurround plugin set to 8 inputs and outputs so the submix could be done easily. After that the tracks are cut, trimmed, and polished, then finally rendered and exported to the designated folder for use in FMOD.
In FMOD, after the file was imported, I changed the input and output to support 7.1 surround sound, and finally built it into the engine. The result shows convincingly how 3D sound should work inside a game and, most importantly, doesn't degrade the FPS.
I use reverbs to make the sounds feel to the player like they are in a mansion. With reverbs I can reproduce realistic situations, such as long corridor hallways having longer reverb tails as the player walks through them.
I set this up using snapshots, which send the SFX group buses to a reverb bus with different kinds of reverb. Each room or hall has its own reverb, which helps the player feel like they are in a mansion. The values can easily be changed inside FMOD itself and, most importantly, the reverbs won't overlap with each other.
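The per-room routing can be sketched as a lookup from a room name to its reverb snapshot. The room names, snapshot paths, and decay times below are illustrative assumptions, not the project's actual values.

```cpp
#include <cassert>
#include <string>
#include <unordered_map>

// Sketch of per-room reverb selection: each audio volume maps its room
// to a snapshot that routes the SFX buses into a reverb bus.
struct ReverbPreset {
    std::string snapshot;
    float decaySeconds;
};

ReverbPreset ReverbForRoom(const std::string& room) {
    static const std::unordered_map<std::string, ReverbPreset> presets = {
        {"LivingRoom",  {"snapshot:/Reverb/LivingRoom", 1.2f}},
        {"LongHallway", {"snapshot:/Reverb/LongHallway", 2.5f}},  // longer tail
    };
    auto it = presets.find(room);
    if (it != presets.end()) return it->second;
    return {"snapshot:/Reverb/Default", 0.8f};  // fallback preset
}
```

Keeping the values in one table mirrors how the numbers can be tuned inside FMOD without touching game code, and activating one snapshot per room is what keeps the reverbs from overlapping.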
Setting reverb values from FMOD Studio is quite easy, as the code is very similar to what UE4 produces natively, so the FMOD API can be used for it as well. Each room or hall has its own audio volume, which can be given its own reverb effect. For example, the living room has its own reverb setting that I implemented based on the value set up in FMOD Studio.
The music room, as designed, has a special sound effect when entered. When the player walks, it sounds like they are walking on glass or running through mud. The effect also applies when the player drops an item, picks one up, and so on. It only happens while the room is fogged; once the puzzle is solved, the sound returns to normal.
In House Of Ignotus, the game relies heavily on music and sound to produce the ambience of fear it needs. It gives the player the sense that something is different in the music as they progress through the game. We call this adaptive music: music that adapts to whatever situation the player is in.
This function is placed on Event BeginPlay to play the introduction music when the player starts the game. Using this function, which FMOD provides in UE4, makes it easier to control multiple events. Since the sound is BGM, playing it on the Actor Transform means it plays in 2D.
Since the function is called from a trigger box, all BGM is played adaptively: based on the situation the player is in, the matching BGM is played. When the music plays in the normal situation, it returns a Sanity event instance. That value is then used to change the BGM according to the player's situation.
Using the returned sanity event instance, we can change the BGM according to what is set up in FMOD, from within Unreal Engine itself. For example, if the sanity suddenly drops from level 0 (normal) to level 1 (anxious), the return value changes accordingly and plays the level 1 event instance, producing the adaptive music we wanted. The figure image also includes the other situations the player can be in, such as when the monster is in alert mode or chasing mode, where the BGM changes accordingly as well.
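The adaptive-music decision above can be sketched as a small priority function. The assumption that monster state overrides sanity, and the track names themselves, are illustrative; only the level 0 = normal, level 1 = anxious mapping comes from the text.

```cpp
#include <cassert>
#include <string>

// Sketch of choosing the BGM from the player's situation: the monster's
// state takes priority, otherwise the sanity level decides the track.
enum class MonsterState { Idle, Alert, Chasing };

std::string PickBGM(int sanityLevel, MonsterState monster) {
    if (monster == MonsterState::Chasing) return "BGM_Chase";
    if (monster == MonsterState::Alert)   return "BGM_Alert";
    return sanityLevel >= 1 ? "BGM_Anxious" : "BGM_Normal";
}
```

Centralizing the decision like this means the trigger boxes and sanity system only report state, and one place decides which event instance actually plays.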