Mind the Gap – Unity Developments and Updates to GDD and Script

During the week I have been making some amendments and updates to my script and Game Design Document for ‘Mind the Gap’, while also working more on my storyboard in Tiltbrush.

Mind the Gap Unity Developments

I have spent some more time working on the environment build for Mind the Gap, continuing to use the train model I had previously found on Turbosquid. The train is now set up with two carriages joined together and multiple spotlights along each carriage to create the underground effect.

The whole environment is contained inside solid black walls to simulate the train being underground, hopefully keeping the outside in darkness for the whole experience. Two lights have also been set up outside of the train and are animated on a loop to continuously pass the train on either side, giving the impression of passing lights in the tunnel.
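To illustrate the idea, here is a rough sketch of how the passing tunnel lights could also be driven from a small script instead of baked animation clips. The direction, speed and distance values are placeholders rather than my actual settings.

```csharp
using UnityEngine;

// Hypothetical sketch: moves a tunnel light along the length of the train
// and wraps it back to the start, giving the impression of lights
// repeatedly passing the carriage windows.
public class PassingTunnelLight : MonoBehaviour
{
    public Vector3 direction = Vector3.back; // direction the light appears to travel
    public float speed = 15f;                // metres per second
    public float travelDistance = 40f;       // distance covered before looping

    private Vector3 startPosition;

    void Start()
    {
        startPosition = transform.position;
    }

    void Update()
    {
        // Loop the offset so the light snaps back once it has passed the train.
        float offset = (Time.time * speed) % travelDistance;
        transform.position = startPosition + direction.normalized * offset;
    }
}
```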

Some of the assets and objects are now also set up in the train carriage, such as multiple luggage items which are grabbable by the player for the ‘Test of Strength’. As you can see from the image below, I have also used Unity models to create some hand rails as part of the ‘Test of Speed’ sequence for the player to complete.

The image below also shows the pressure pad that the user will need to find and place an item on, which will unlock the second carriage in the experience.
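As a rough illustration of the mechanic, the sketch below shows one way the pressure pad could be wired up with a trigger collider. The door object, tag name and 'disable the door' behaviour are placeholder assumptions rather than my final setup.

```csharp
using UnityEngine;

// Placeholder sketch of the pressure pad: when any grabbable item tagged
// "PuzzleItem" enters the pad's trigger collider, the door blocking the
// second carriage is opened (here simply disabled).
[RequireComponent(typeof(Collider))]
public class PressurePad : MonoBehaviour
{
    public GameObject carriageDoor;           // assumed door object for carriage two
    public string requiredTag = "PuzzleItem"; // assumed tag on the placeable items

    void OnTriggerEnter(Collider other)
    {
        // The pad's collider must be set to 'Is Trigger', and the items need
        // rigidbodies (which grabbable objects usually have) for this to fire.
        if (other.CompareTag(requiredTag) && carriageDoor != null)
        {
            carriageDoor.SetActive(false);    // swap for a door animation later
        }
    }
}
```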

Unity Next Steps

The next steps for the Unity environment are to finalise all objects and stages in the experience so that everything is in place, even with little interactivity at this stage. I also need to find a way to create a train driver's carriage at the front of the train for the final stage of the experience. At the start of the story the player will also see multiple NPC characters sat around the carriage, so I think it would be great to find some models and perhaps add some animations to bring a sense of immersion to the starting environment.

Updates to my Script

After receiving some feedback on the third draft of my script, I went back to refine and finalise it into a more polished state. Most of the feedback was to clarify the meaning or reasoning behind some story elements, while also including more detail and clarification on some of the level design and environment that players will be engaging with.

AR Ceramics – Recording Sound, Vase Animations and Modelling

During this week I have been experimenting further with my concept for the AR experience in Vuforia. Using the model I had previously created in Unity, and after receiving some feedback and input from my tutors, I started to experiment with a larger quantity of the model to achieve full coverage of the vase.

Using an approximate sphere model based on the actual size of the vase, I started to move, rotate and plot all of the different flower models around the vase. This took a lot longer than expected because each model had to be carefully positioned and rotated to give a curved effect around the object, but I was eventually able to achieve this.
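The placement could also be done programmatically instead of by hand; the sketch below spawns copies of a flower prefab in a ring around the vase's vertical axis and rotates each one to face outwards. The prefab, count and radius are placeholder values, not my actual scene setup.

```csharp
using UnityEngine;

// Hypothetical helper: spawns flowers in a ring around the vase's vertical
// axis, rotating each one outwards so they follow the curved surface.
public class FlowerRingPlacer : MonoBehaviour
{
    public GameObject flowerPrefab;  // the Blender flower model (assumed prefab)
    public int count = 12;           // how many flowers around the vase
    public float radius = 0.12f;     // approximate radius of the vase in metres
    public float height = 0.05f;     // height on the vase where the ring sits

    void Start()
    {
        for (int i = 0; i < count; i++)
        {
            float angle = i * Mathf.PI * 2f / count;
            Vector3 localPos = new Vector3(Mathf.Cos(angle) * radius, height,
                                           Mathf.Sin(angle) * radius);

            // Face each flower away from the vase's centre line.
            Quaternion rotation = Quaternion.LookRotation(
                new Vector3(localPos.x, 0f, localPos.z).normalized, Vector3.up);

            GameObject flower = Instantiate(flowerPrefab, transform);
            flower.transform.localPosition = localPos;
            flower.transform.localRotation = rotation;
        }
    }
}
```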

The images below show the effect live through my webcam, which luckily did work successfully; if you rotate the vase it gives quite a convincing impression that the flowers are moving around the curve with the vase, which is exactly the effect I wanted.

Blooming Animation

While the models work correctly through Vuforia and are displayed around the image target, I also tried some of the animation options in Unity. This meant I could test a concept of the flowers changing size, rotating and offsetting from one another to create a type of pattern. After experimenting with these different animations, I feel confident that this concept will work for the final experience.

Exporting my Blender Animation (Problems)

Alternative Flower Model

Although the initial flower tutorial I followed did create a nice flower model which I felt would suit this project well, I decided to try a different tutorial based more on a lily pad, to see if I would have more success exporting the animation. The screenshots below show some of the steps I followed in creating the flower model.

One aspect of this tutorial I quite enjoyed was the in-depth look at the shader section of Blender, creating multiple colour options including shadows and textures, which turned out very nicely overall.

Unfortunately I still had some trouble trying to export my flower model: the animation works inside Blender, but however I export the model, only the central petal seems to animate and the outer petals don't move at all. I will need to spend some time this week finding a way to export my models so I can experiment further in Vuforia.

Recording Sound

The main theme of this concept is of course flowers and plants, so sound effects of this nature would make sense for the AR experience. I started going outside to record, and luckily there is an ecology park nearby where I hope to be able to capture some different sound effects. I also want to record some water droplet sound effects now that I have the option of using the lily pad model animation, which I think will suit my AR experience.

Next Steps

The next steps for this augmented reality experience are to achieve a successfully exported blooming model from Blender, and to find and print a better, more visually appealing image target to place on the vase. As a stretch goal it would be interesting to have another image target option with a different animation effect or plant/flower.

Stutter – Credits, Navigation, Hand Models and Difficulty Transitions

Hand Models

One of the main aspects of this experience was of course adding some hand models so the player can see where their hands are in the virtual space, but this is something I left until later. I researched ways to add hand models to the XR Interaction Toolkit online, which didn't seem too difficult overall, because you can attach and resize imported objects on each controller so that they appear in the scene. At the moment the hands are static models with no motion or animations, but I am hoping that gesture or grab animations won't take too long to implement into my gameplay.
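In my project the hand meshes are simply assigned to each controller in the inspector, but the sketch below illustrates the underlying idea of parenting a static hand model to a tracked controller transform. The prefab and offsets are placeholders.

```csharp
using UnityEngine;

// Hypothetical sketch: parents a static hand mesh under a controller
// transform so it follows the tracked controller. In practice the XR
// Interaction Toolkit's controller model settings do this for you; this
// only shows the idea with placeholder offsets.
public class StaticHandModel : MonoBehaviour
{
    public Transform controller;       // tracked controller transform (assumed)
    public GameObject handMeshPrefab;  // imported hand model (assumed)
    public Vector3 positionOffset;     // tweak so the mesh lines up with the grip
    public Vector3 rotationOffset;

    void Start()
    {
        GameObject hand = Instantiate(handMeshPrefab, controller);
        hand.transform.localPosition = positionOffset;
        hand.transform.localRotation = Quaternion.Euler(rotationOffset);
    }
}
```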

Adding Navigation

Navigation is another aspect where I originally wanted the experience to only work if the user physically stepped around the environment, but of course I needed to think about the accessibility of the experience and how some people might play sitting down. Researching further into the XR Interaction Toolkit, it was not too difficult to add quite a simple form of navigation to the experience.

I have now added a continuous move provider, based on a locomotion system, so that the player can move around the environment at a relatively fast pace using the left controller joystick, which gives them the opportunity to further explore the environment they are playing in. I then included a continuous turn provider, which allows the user to turn the camera left and right using the right controller. This is currently set to a continuous motion, which after a few attempts gave me slight motion sickness, so I want to research this further and have set turns that instantly rotate the player in fixed increments of, for example, 60 degrees.
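As a sketch of what those set turns could look like, the script below rotates the rig by a fixed increment whenever the right joystick is flicked past a dead zone. The rig reference, increment and dead zone are placeholder values, and the XR Interaction Toolkit's own snap turn provider may well be the easier route in practice.

```csharp
using UnityEngine;
using UnityEngine.XR;

// Rough sketch of snap turning (an alternative to the continuous turn
// provider): flicking the right joystick left or right rotates the XR rig
// instantly by a fixed increment, e.g. 60 degrees.
public class SimpleSnapTurn : MonoBehaviour
{
    public Transform xrRig;            // the rig/origin to rotate (assumed)
    public float turnIncrement = 60f;  // degrees per snap
    public float deadZone = 0.7f;      // how far the stick must be pushed

    private bool stickReleased = true; // allow one turn per flick

    void Update()
    {
        InputDevice rightHand = InputDevices.GetDeviceAtXRNode(XRNode.RightHand);
        if (!rightHand.TryGetFeatureValue(CommonUsages.primary2DAxis, out Vector2 axis))
            return;

        if (Mathf.Abs(axis.x) < deadZone)
        {
            stickReleased = true;      // wait for the stick to return to centre
        }
        else if (stickReleased)
        {
            stickReleased = false;
            xrRig.Rotate(0f, Mathf.Sign(axis.x) * turnIncrement, 0f);
        }
    }
}
```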

Boundary Walls

After adding navigation it became apparent that the boundary walls I was originally going to add to the experience now have a clear purpose: stopping the player from leaving the environment. It was quite simple to add walls with a collider and rigidbody but no mesh, creating an invisible boundary that the player cannot cross.
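For illustration, the sketch below builds the same kind of invisible boundary at runtime: four box colliders with no mesh renderer arranged around the play area. The sizes are placeholder values.

```csharp
using UnityEngine;

// Minimal sketch: builds four invisible boundary walls around the play area
// at runtime. Each wall is a BoxCollider with no mesh renderer, so the
// player collides with it but never sees it.
public class BoundaryWalls : MonoBehaviour
{
    public float areaSize = 10f;   // width/length of the playable square
    public float wallHeight = 3f;
    public float wallThickness = 0.2f;

    void Start()
    {
        CreateWall(new Vector3(0f, wallHeight / 2f, areaSize / 2f),
                   new Vector3(areaSize, wallHeight, wallThickness));
        CreateWall(new Vector3(0f, wallHeight / 2f, -areaSize / 2f),
                   new Vector3(areaSize, wallHeight, wallThickness));
        CreateWall(new Vector3(areaSize / 2f, wallHeight / 2f, 0f),
                   new Vector3(wallThickness, wallHeight, areaSize));
        CreateWall(new Vector3(-areaSize / 2f, wallHeight / 2f, 0f),
                   new Vector3(wallThickness, wallHeight, areaSize));
    }

    void CreateWall(Vector3 localPosition, Vector3 size)
    {
        // No MeshRenderer is added, so the wall is an invisible collider.
        var wall = new GameObject("BoundaryWall");
        wall.transform.SetParent(transform, false);
        wall.transform.localPosition = localPosition;
        wall.AddComponent<BoxCollider>().size = size;
    }
}
```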

Introduction Screen

After having our lesson on ethics in virtual reality, I knew it was extremely important to include some information or a disclaimer screen to explain and justify the experience to the user. I also felt that including a title screen helps to round off the whole game as a full experience with a clear start and a clear end. To create this I decided to use a series of animations to keep it simple, so that the title ‘STUTTER’ appears, followed by the first disclaimer and then the tutorial information for how the player can start the experience.

Outro Screen

The outro screen was made in a very similar way to the introduction screen, through a series of animations all timed to fade in and out after one another. It starts by displaying a message of information about speech impediments and stuttering, followed by a message that I think is important to show players to wrap up their understanding of the full experience.

Typography/Word Updates

After some initial viewings of my VR experience, multiple individuals told me that the typography I had chosen was quite difficult to read and asked if I had done this on purpose. After quite a few people mentioned this, I knew I would need to change it, so I went back into my original Illustrator file to change the fonts to something much more legible, as you can see from the images below.

Difficulty Transitions

For this project I wanted to avoid having different levels and transitions, to keep the scope and time frame achievable within the given deadline. This meant that the transitions between my difficulty ‘layers’ would be one of the most important aspects, and something I knew I would need to leave until everything else in the experience was completed.

This week I focused on how the transitions could take place by fading in and out the different sections, or ‘parent’ objects, into which I have separated each difficulty, disclaimer and conversation. To start this process I created a canvas that is triggered using an animation to create a fade-to-black effect, blocking the player's view while the different sections load in and out of play. I then identified each section as a game object and used StartCoroutine and an IEnumerator to time the transition effect between each difficulty level.
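The sketch below outlines the same transition structure with placeholder names: fade a full-screen canvas to black, swap the active ‘parent’ object, then fade back in. Here the fade is driven by a CanvasGroup's alpha rather than my Animator setup, but the coroutine timing is the idea.

```csharp
using System.Collections;
using UnityEngine;

// Sketch of the transition idea (object names are placeholders): fade a
// full-screen canvas to black, swap the active difficulty 'parent' object,
// then fade back in.
public class DifficultyTransition : MonoBehaviour
{
    public CanvasGroup fadeCanvas;   // full-screen black image with a CanvasGroup
    public float fadeDuration = 1f;
    public float holdDuration = 0.5f;

    public void SwitchTo(GameObject current, GameObject next)
    {
        StartCoroutine(Transition(current, next));
    }

    private IEnumerator Transition(GameObject current, GameObject next)
    {
        yield return Fade(0f, 1f);          // fade to black
        current.SetActive(false);           // unload the finished difficulty
        next.SetActive(true);               // load the next difficulty
        yield return new WaitForSeconds(holdDuration);
        yield return Fade(1f, 0f);          // fade back in
    }

    private IEnumerator Fade(float from, float to)
    {
        for (float t = 0f; t < fadeDuration; t += Time.deltaTime)
        {
            fadeCanvas.alpha = Mathf.Lerp(from, to, t / fadeDuration);
            yield return null;
        }
        fadeCanvas.alpha = to;
    }
}
```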

Each of the ‘CorrectWord’ boxes was then assigned a counter, so that every time a correct word was placed in a box the counter would go up by one. This meant that if all three words were correct in the easy difficulty, the counter would reach 3. The transition was then triggered by an if statement: once the counter reached 3, the fade animation would start and the next difficulty level would load in. Each difficulty was assigned its own target counter value, and luckily this achieved the effect with great success. I still need to play around with the timings and lengths to see what works best, but I'm confident I can find a good length of time between levels.
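And a matching sketch of the counter check, again with placeholder names: each ‘CorrectWord’ box would call WordPlaced() when its word snaps in, and once the target count for the current difficulty is reached the transition from the previous sketch is triggered.

```csharp
using UnityEngine;

// Placeholder sketch of the counter check: each 'CorrectWord' box calls
// WordPlaced() when its word snaps in, and once the count reaches the
// target for the current difficulty the transition is triggered.
public class CorrectWordCounter : MonoBehaviour
{
    public int wordsRequired = 3;             // 3 easy, 4 medium, 6 hard
    public DifficultyTransition transition;   // sketch from the previous snippet
    public GameObject currentDifficulty;
    public GameObject nextDifficulty;

    private int correctWords;
    private bool triggered;

    public void WordPlaced()
    {
        correctWords++;
        if (!triggered && correctWords >= wordsRequired)
        {
            triggered = true;
            transition.SwitchTo(currentDifficulty, nextDifficulty);
        }
    }
}
```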

Next Steps

The next steps for this experience are to start some user testing to see whether all of the gameplay mechanics, and the overall understanding of the experience, work for the player. I will also need to go back over the sound and audio for the whole experience and make sure that the transition timings and difficulty lengths work together to make a satisfying game.

Mind the Gap – Finalising the Storyboard and Game Design Document

Over the past week I have tried to finalise most of the main aspects of this project: the script, the Game Design Document and the storyboard for the full experience.

Game Design Document:

I have now developed a full Game Design Document for the ‘Mind the Gap’ experience, including details of dialogue, sound and environment, and a step-by-step breakdown of each section and puzzle for the user.

Storyboard Updates:

I have now further developed the storyboard and have all of the scenes and areas of my experience drawn out in Tiltbrush. The first scene below is where the player will start, showing multiple NPC characters sat around the train carriage environment.

This scene shows the first test, the ‘Test of Hearts’, in which the player will need to choose one of four items located and highlighted around the carriage. The chosen item will then need to be placed on the highlighted pressure pad on the left-hand side of the door to gain access to the next carriage and continue the experience.

The last scene of the experience is where the player will meet the antagonist, who is posing as the train driver. I did not have a smaller model for this area, so it was hand drawn in Tiltbrush, which I may edit later as it does not fit the aesthetic of the rest of the storyboard.

AR Ceramics – Unity Development and Camera Tests

Now that I have looked at ways to develop a flower model and animation for my AR project, I wanted to test how successfully these models would work in action on a curved surface such as a vase. I tried some initial tests with the notebook image target I was using last week, and this again seemed to work fine through my webcam, with the model showing up in high quality, which is reassuring for what I had created in Blender.

Full disclaimer: the environment I was testing all of my assets in was not very well lit, but it worked well enough for my initial tests. I found a small vase I could use for the tests, which may not be the final object for the project but works well as a proof of concept to see how I can map everything onto a curved surface.

I then printed a small QR code to use for these tests, as I would like a more appealing image target on the final product. The QR code was taped to the vase to act as a simple trigger for the experience, and once again I made a new image target using the flower model I had previously created. Luckily (although the lighting did not help) the image target was recognised and the flower did appear on the side of the vase, which I was extremely happy about, and I was excited to move forward testing this idea.

After this I started to move the flower models outside of the QR code and into different positions on the X, Y and Z axes to see how everything would interact. Each model was given a different rotation, and all had animations making them spin in place to see how movement would work. Luckily this also worked successfully overall, and through trial and error I could get the models to look like they were balanced on top of, or growing out of, the vase. This worked even in my low-light testing conditions, so I am fairly confident it will work a lot better in a well-lit environment, or perhaps through a mobile device with a flashlight.

I also wanted to do a small test to see what the flowers could look like actually blooming from the vase, so I tried a simple animation of the object's scale going from zero to full size. Again this worked very well as a proof of concept, as shown in the videos of the process below.
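As a rough sketch of that blooming test, the script below scales a flower from zero up to its full size over a few seconds whenever it is enabled (which, as I understand it, happens when Vuforia finds the image target and activates the target's children). The duration is a placeholder.

```csharp
using UnityEngine;

// Sketch of the 'bloom' effect used in the test: the flower's scale grows
// from zero to its full size once the object is enabled.
public class BloomOnAppear : MonoBehaviour
{
    public float bloomDuration = 3f;

    private Vector3 fullScale;
    private float elapsed;

    void OnEnable()
    {
        fullScale = transform.localScale;        // remember the authored size
        transform.localScale = Vector3.zero;     // start fully 'closed'
        elapsed = 0f;
    }

    void Update()
    {
        if (elapsed >= bloomDuration) return;
        elapsed += Time.deltaTime;
        // SmoothStep gives a softer ease-in/ease-out than a linear scale-up.
        float t = Mathf.SmoothStep(0f, 1f, elapsed / bloomDuration);
        transform.localScale = fullScale * t;
    }
}
```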

Next Steps

I am happy with how the initial tests worked for these models. The next stages will be to develop better models and animations to trigger on the vase, and another great addition would be interactive buttons that appear on the ceramics once they are scanned by a mobile device. This could also be developed further with multiple vases showing different animations or models, demonstrating how the idea could be used in an exhibition with multiple ceramic pieces for visitors to experiment with.

Stutter – Developing Medium and Hard Difficulty Levels

Now that I have managed to get most of the first difficulty working and fully playable in the Unity project, I felt happy to start setting up the other difficulties. The easy difficulty acts as a great framework, letting me replicate what I had already created and scripted in Unity, but at a greater scale for the other sections.

Medium Difficulty

For the medium difficulty, based on my script, the user will need to create a four-word sentence, so there are now four placeholders on the table in front of the player. For some added difficulty, four new words have been added to float around the player, and the speed has been moderately increased to add a slight sense of urgency, making it harder for the player to spot and grab the words quickly. This is of course how I want the mind to represent some of the difficulty of being put into situations people find hard to speak in.

Hard Difficulty

The hard difficulty is quite a sudden jump compared to the easy and medium difficulties, with the user now having to create a six-word sentence. There are now twenty-four different words floating around the user and spinning at a fairly fast rate, which will make it quite difficult to find and grab the words in time while the audio becomes far more intense.

Tutorial (Starting Area)

One thing I also needed was a tutorial area that the player can stand in for as long as they want before starting the full experience. I created this with only the word ‘Hello’ floating around the player, so it was very simple to build based on what I had previously made for the easy difficulty.
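For illustration, a floating word like the tutorial ‘Hello’ could be driven by something like the placeholder sketch below, which orbits the word around the player at a set radius with a gentle bob.

```csharp
using UnityEngine;

// Placeholder sketch: orbits a word around the player at a fixed radius,
// with a gentle up-and-down bob, keeping it turned towards the centre.
public class FloatingWord : MonoBehaviour
{
    public Transform centre;          // the player / camera rig (assumed)
    public float orbitRadius = 2f;
    public float orbitSpeed = 20f;    // degrees per second
    public float baseHeight = 1.5f;   // rough eye height
    public float bobHeight = 0.2f;
    public float bobSpeed = 1.5f;

    private float angle;

    void Update()
    {
        angle += orbitSpeed * Time.deltaTime;
        float rad = angle * Mathf.Deg2Rad;
        float bob = Mathf.Sin(Time.time * bobSpeed) * bobHeight;

        transform.position = centre.position + new Vector3(
            Mathf.Cos(rad) * orbitRadius, baseHeight + bob, Mathf.Sin(rad) * orbitRadius);

        // Orient the word towards the player; flip the direction if the
        // text renders mirrored with your text component.
        transform.rotation = Quaternion.LookRotation(transform.position - centre.position);
    }
}
```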

How to End the Experience

This concept for a virtual reality experience could technically go on for a long time if you really wanted it to, with a massive mind full of different words and large sentence and typographic structures to create in a given time. However, this experience needs to be short, so I feel it could be ended in a way that reflects how I have personally not been able to finish conversations due to the anxiety of certain conversations and social situations.

It was suggested to me that the last word the user needs to place could simply never snap into place, making it impossible for the user to finish the sentence. This is a good representation of how difficult it can sometimes be for us to speak and how we back ourselves into a corner and can't finish our sentences. This can carry on for a short while and eventually fade out, leaving the player in a dark space to show that they have given up trying to speak. A short piece of text can then be shown at the end of the experience explaining the situation, raising awareness and offering a way of understanding how some people struggle with speaking.

“Stuttering is a speech disorder in which the flow of speech is disrupted by involuntary repetitions and prolongations of sounds, syllables, words or phrases, as well as involuntary silent pauses or blocks in which the person who stutters is unable to produce sound. Almost 70 million people worldwide stutter, under 1% of the world's population; some receive therapy, but there is no cure for the disorder at present.”

“Patience and understanding without judgement can show a small act of kindness in someone's day.”

The Ethics of this experience

I felt that this experience might present some ethical challenges due to the somewhat sensitive subject matter I am conveying to the user. While I personally have a stutter, I know that people deal with speech impediments in very different ways, and of course there are different variations and intensities of speech disorders. So I want to add a disclaimer at the start of the experience to explain to the user what this is and how it is an abstract work of fiction:

Disclaimer – “The following is a work of fiction based on the personal interpretation of one individual, and should not be taken as a definitive representation of the human mind or of a speech impediment such as a stutter. Each person can experience a stutter in many different ways, with different levels of severity in different situations and stages of life.”

AR Ceramics – Initial Unity Development and Animations

Creating The Flowers

For this idea I want the flowers to bloom out of the vase as though they are growing out of the imagery, or out of the blank canvas on display. I have always wanted to try some 3D modelling in Blender or Maya, so I decided this would be a great opportunity to follow some tutorials online and see what I could create.

I found multiple tutorials, but one felt like exactly what I needed; in the images below you can see the steps I followed to create a single flower with multiple petals, which I thought would work well for this piece.

Blooming Animation

After following the steps to create a flower model, I also wanted to create some animations for the piece, and this tutorial showed me how to achieve that. Using a cylinder, the flower is essentially squashed into a closed, petal-like bud to set up the blooming effect, and by adding some of the physics options you can turn this into quite a beautiful flowing motion which will add to the vase greatly.

Modelling the flower was very interesting, but I did struggle with some of the animation effects and with exporting it as a usable file for Unity. I feel confident in what I have learned in Blender this week, but I will need to develop this further for future Unity tests and development to make this project successful.

Mind the Gap – Storyboard Developments (Tiltbrush)

After practising with Tiltbrush, I wanted to delve deeper into creating my storyboard and actually fill the full train carriage with more of the puzzles, prompts and direction that the player will follow in the experience. Over the past week I was able to draw the storyboard for the three main puzzles of the experience, with the starting and ending areas to be worked on later.

Puzzle 02 – Test of Wits

At the start of the carriage I drew out the second user task, in which the player will need to press four buttons around this area of the carriage in the correct sequence. I have indicated this by drawing the four buttons and, using the highlight brush, circling each one to show its location, as well as drawing the order in which they need to be pressed.

Puzzle 03 – Test of Strength

In the middle section of the carriage I have drawn out the Test of Strength by drawing multiple blocks that are blocking the passenger's way. I have then drawn multiple arrows in different directions to show that each object needs to be moved for the player to progress further. This is also indicated by the text prompt I have written above, shown as the player enters this section of the experience.

Puzzle 04 – Test of Speed

At the end of the carriage I have drawn out the Test of Speed, in which the player will need to pull down multiple hand rails within a given time in order to progress to the end of the carriage. I have indicated this by drawing the hand rails and highlighting each one with a downwards arrow, showing that each one needs to be pulled down by the player. Of course, this is further highlighted by some text at the start of this section briefly explaining what the player will need to do next.

Storyboard Next Steps

I feel more confident now in using Tiltbrush and know what I need to do to complete my full storyboard. I will need to finish the first train carriage, showing it in both states: one containing random NPC characters and one without NPCs, with the objects scattered around the environment. I will also need to create or find an asset for the driver's carriage showing the ‘Antagonist’, along with the elements of dialogue and the ending of the experience. Luckily I should be able to import some assets and objects from my initial Unity build.

Stutter – Finalising the first difficulty in Unity

After more development and finalising my ideas for how I want users to experience this ‘mind of a stutterer’, I wanted to finish everything for the first level before continuing with the other difficulties. I spent this time polishing and refining the level so that it works in a playable state, which can hopefully be replicated on a larger scale when I make the other difficulties within the experience.

This required a fair amount of trial and error to check that all of the words would snap into place correctly for just three word options. Of course, this will need to be scaled up with each subsequent difficulty or ‘level’.
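As a simplified sketch of the snapping idea (the XR Interaction Toolkit's socket interactors can also handle this kind of snapping), the placeholder below checks whether a released word is close enough to its slot, locks it into place and notifies the counter sketch from earlier.

```csharp
using UnityEngine;

// Simplified, hypothetical sketch of the snapping behaviour: when a grabbed
// word is released close enough to its placeholder, it snaps into position
// and reports itself as correct. Names and thresholds are placeholders.
public class WordSnapZone : MonoBehaviour
{
    public string expectedWord = "Hello";       // word this placeholder accepts
    public float snapDistance = 0.15f;          // metres
    public CorrectWordCounter counter;          // counter sketch from earlier

    private bool filled;

    public bool TrySnap(Transform word, string wordText)
    {
        if (filled || wordText != expectedWord) return false;
        if (Vector3.Distance(word.position, transform.position) > snapDistance) return false;

        // Lock the word onto the placeholder and count it as correct.
        word.SetPositionAndRotation(transform.position, transform.rotation);
        filled = true;
        counter.WordPlaced();
        return true;
    }
}
```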

AR Experience – Initial Concept and Ideas / Vuforia Tests

Initial Idea: Augmented Reality Vase

I have previously worked with a ceramics artist called Yuling, whom I visited at her workshop in Tangshan, China back in 2019; she is a fantastic sculptor and artist whose works I still have on display. She was also very interested in ways she could modernise her exhibitions using digital technology, and I always thought it would be great to use AR not necessarily to enhance her work but to accompany it as an additional piece.

I felt that this would be a great idea to revisit, to see if I could create an AR experience that works with ceramic pieces. The idea would be to use a vase, or an image target on the vase, to unlock an additional digital art piece layered on top. So you could have a plain vase on display, but when you scan it with your phone you see animations of flowers or of the imagery already on display with the piece.

Adding Interactivity?

Of course, an AR experience can be triggered simply by observing an object with your mobile device, at which point the experience begins, but what if this could be enhanced further so that the user can actually ‘discover’ or ‘choose’ which animations on the vase are triggered first, and when?

Downloading Vuforia

After coming up with some initial ideas for my augmented reality experience, I needed to work out the basics of using Vuforia. Using the guide that was given to me, I installed and imported the package into Unity; I had some problems at the start with different versions of the software not working together, but I eventually got it all running. As you can see from the images below, I ran some basic tests using an image from my work notebook as an ‘image target’ and a simple wallet model I have from Turbosquid as the augmented object.

As the images below show, after a few calibrations and settings changes I was eventually able to get the AR experience working through my webcam, so that when I held my notepad up to the camera the wallet model appeared on top. I am happy that this is all working now, which means I can progress to the next step of creating my own AR animations and models for this experience.