CP&E – Second Research Iteration testing

Trying out Footstep sound effects

One idea we had for the user test iteration, to see how people react to changes in the XR rig, was to add footstep sound effects to give the user a sense of speed and movement while playing the scene, which I attempted using the code shown below:
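A minimal sketch of the kind of footstep script I was experimenting with is below (the AudioSource reference, step interval and speed threshold are placeholder values for illustration, not the exact code from the project):

```csharp
using UnityEngine;

// Sketch of the footstep idea: play a step sound at intervals while the
// XR rig is moving. Attach to the rig root alongside an AudioSource.
public class FootstepSounds : MonoBehaviour
{
    public AudioSource audioSource;   // AudioSource with the footstep clip assigned
    public float stepInterval = 0.5f; // seconds between steps while moving (placeholder)
    public float minSpeed = 0.1f;     // ignore tiny positional jitter (placeholder)

    private Vector3 lastPosition;
    private float stepTimer;

    void Start()
    {
        lastPosition = transform.position;
    }

    void Update()
    {
        // Estimate horizontal speed from how far the rig moved this frame.
        Vector3 delta = transform.position - lastPosition;
        delta.y = 0f;
        float speed = delta.magnitude / Time.deltaTime;
        lastPosition = transform.position;

        if (speed > minSpeed)
        {
            stepTimer += Time.deltaTime;
            if (stepTimer >= stepInterval)
            {
                audioSource.Play(); // one footstep sound per interval
                stepTimer = 0f;
            }
        }
        else
        {
            stepTimer = 0f; // reset when the player stops moving
        }
    }
}
```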

However, through further research and general testing I found that this method does not work very well: it can confuse the user, since it is not their own feet making the sound, and it is not really necessary overall, so we decided not to move forward with it.

We knew we would need to develop more tests and perhaps find other ways to make updates to the XR rig.

Questions for our Second Iteration:

  1. Did you notice any differences?
  2. If yes, please explain your answer
  3. I felt like I was a child
  4. I felt like I was an old person
  5. Do you have any further comments you would like to add?

CP&E – Creating and Finalising the 1990s Apartment Layout, Movement Tests for second research iteration

Finalising some Interior Design and Decoration

During this week I went through my scene to make sure I was filling up the space, making it interesting for the user, and adding any small details or additions that I could. It was also important to include story elements, such as the room with the painting, which I wanted to make sure contained a lot of assets.

Creating the Painting Interaction

I initially thought the painting interaction would be quite tricky, but using colliders to set objects active and inactive worked a charm. I could create separate faded objects that assemble into one whole object, like a painting, and have pieces on the floor collide with these elements to reveal the fixed painting.


The script I used to make this system work was quite simple: when the painting piece the player picks up collides with its faded version, that piece disappears and the correct version is revealed, eventually leading to a full painting.
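A simplified sketch of this piece-reveal script is below (the tag name and object references are placeholders; it assumes the faded placeholder has a trigger collider and the fixed version starts inactive):

```csharp
using UnityEngine;

// Sketch of the painting-piece reveal: attached to the faded placeholder piece.
// When the matching grabbed piece enters the trigger, the grabbed piece and the
// faded version disappear and the fixed version of that piece is revealed.
public class PaintingPieceSlot : MonoBehaviour
{
    public string pieceTag = "PaintingPiece01"; // tag of the matching grabbed piece (hypothetical)
    public GameObject fixedPiece;               // correct version of the piece, starts inactive

    private void OnTriggerEnter(Collider other)
    {
        if (other.CompareTag(pieceTag))
        {
            other.gameObject.SetActive(false); // hide the piece the player placed
            fixedPiece.SetActive(true);        // reveal the correct version
            gameObject.SetActive(false);       // hide the faded placeholder
        }
    }
}
```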

Creating the Photo Interaction

The photo frame interaction was created in a very similar manner to the painting interaction, except this time the pieces are linked to the photo frames and use very similar code to create the same effect. Sound effects were also added so that they play when those objects collide and are placed.

One element of these particular assets that I found tricky was the box collider, which had a strange centre of origin that needed to be reset for each part of the object. Luckily my lecturers helped me solve this issue, and it is something I will definitely need to remember for future assets.

Phone Answer Machine

The phone answering machine was quite easy to create in the end and mainly required an audio source linked to the object that could be triggered when interacted with. I will also attempt to add spatial sound to this element to try and create a more interesting and immersive effect for the user.
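A rough sketch of that answering machine setup is shown below (the PlayMessage method is a hypothetical name that would be wired up to the phone interactable's select event in the Inspector):

```csharp
using UnityEngine;

// Sketch of the answering machine: an AudioSource on the phone plays the
// recorded message when the interaction event fires. PlayMessage can be
// hooked up to the XR interactable's Select Entered event in the Inspector.
[RequireComponent(typeof(AudioSource))]
public class AnsweringMachine : MonoBehaviour
{
    public AudioClip messageClip; // the recorded answering machine message

    private AudioSource audioSource;

    void Awake()
    {
        audioSource = GetComponent<AudioSource>();
        audioSource.spatialBlend = 1f; // fully 3D so the sound comes from the phone itself
    }

    public void PlayMessage()
    {
        if (!audioSource.isPlaying)
        {
            audioSource.PlayOneShot(messageClip);
        }
    }
}
```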

Photo Flash Scene Change Effect

For the scene transition I used a very similar approach to some of my previous projects, which used a canvas and an animation to create a fade-out effect. In this case I used a yellow flash, like a camera, with a linked camera-flash sound file, to end the scene.
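A simplified sketch of that flash-and-change sequence is below (the "Flash" animator trigger, scene name and timing are placeholders):

```csharp
using System.Collections;
using UnityEngine;
using UnityEngine.SceneManagement;

// Sketch of the camera-flash ending: trigger the yellow flash animation on the
// canvas, play the camera sound, then load the next scene once the flash has
// covered the player's view.
public class PhotoFlashTransition : MonoBehaviour
{
    public Animator flashAnimator;            // Animator on the full-screen canvas image
    public AudioSource cameraSound;           // camera shutter/flash sound
    public string nextScene = "CreditsScene"; // placeholder scene name
    public float flashDuration = 1.5f;        // placeholder timing

    public void EndScene()
    {
        StartCoroutine(FlashAndLoad());
    }

    private IEnumerator FlashAndLoad()
    {
        flashAnimator.SetTrigger("Flash"); // plays the yellow flash animation
        cameraSound.Play();
        yield return new WaitForSeconds(flashDuration);
        SceneManager.LoadScene(nextScene);
    }
}
```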

CP&E – Environment/Apartment Layout and Updates

Updates to the Apartment Layout

After looking at the apartment further I decided to try some alternative layouts, looking at different apartment floorplans online to see if I could create something a bit different and perhaps more interesting for the player to move around and navigate.

CP&E – Research Iteration 01 Results and Script Developments

Final Developments to the first user research iteration

To put together the user research iteration, the rooms were divided amongst the group, which included Lin creating the 1980s apartment.

After all of the different environments were merged into one scene, I looked into how we could arrange the layout and the best way to switch between the scenes for the user testing. To avoid the user being able to see the other apartments outside the windows, I decided to stack each one on top of another so it wouldn't be possible to see the other flats in the environment. This would hopefully not be too distracting.

As you can see from the image below, four XR rigs were then put in place so that the user would always start in the same position within each environment. The only difference between the XR rigs is the Y coordinate, and each rig is labelled for its apartment or time period so we can turn the different rigs on and off as we please. For the research test we simply planned to show the users the different environments and then gather their feedback on each one, showing the environments in a random rather than chronological order so this hopefully wouldn't influence their answers.
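A minimal sketch of how the labelled rigs could be toggled for each test is shown below (the array of rig objects and the method name are placeholders):

```csharp
using UnityEngine;

// Sketch of switching between the four labelled XR rigs during testing:
// only the rig for the current apartment/time period is active at any time.
public class RigSwitcher : MonoBehaviour
{
    public GameObject[] rigs; // e.g. the 1980s, 1990s, 2020s and future rigs (placeholder names)

    // Enable the rig at the given index and disable the rest.
    public void ActivateRig(int index)
    {
        for (int i = 0; i < rigs.Length; i++)
        {
            rigs[i].SetActive(i == index);
        }
    }
}
```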

Finding test participants

After discussing with our teachers how we should go about this first user research test, we decided it would be best not to test too many participants, so we settled on finding up to six users. Ideally it would have been more beneficial to get users from different age brackets, but we were limited by time constraints and the demographic we could get hold of. For the actual test we were able to assemble six people aged between 22 and 26 to look at the environments.

Overall it was great to see how differently the users perceived the environments, though it is a shame that these users will not have seen a 1980s environment first hand, unlike older participants. It is still interesting to see how users interpret the environments and what their idea of an older or newer apartment is.

Each participant was tested over the course of a week, with some using the VR headset in university classrooms at LCC while others were tested at their homes using an Oculus Rift S or Oculus Quest 2 headset. Each person was then asked to fill out our demographic questionnaire and our research iteration questionnaire.

Analysing Test Results

After we had completed all of the tests with six different individuals, we were able to analyse their results and answers. Overall it was quite interesting to see which time period each person thought each environment was, as well as their general opinions of the layout, furnishings and decoration of the apartments.

[Forms response chart: "Which general time period/decade did you feel you were in for each of the different environments?"]

Overall for the first question we had quite a positive result: nearly everyone thought Environment A was the 1980s, Environment B was the future and Environment C was the 1990s. The only stand-out was Environment D, which most thought was the 2020s but which had a couple of answers that did not match. The results were mostly what we wanted, so we would not need to make many major changes to the environments in terms of the time periods.

[Forms response chart: "Did you notice any particular changes in each of the environments? (Changes to the furniture, decorations, technology and colour etc.)" – 6 responses]

Luckily all users had noticed differences in the environments, which shows that they pay attention to their surroundings and the differences they see. Most of the users noticed the more obvious changes, such as technology, colours and general furnishings that had moved around or were in different styles.

[Forms response chart: "Did anything feel out of place in any of the time periods/environments you visited based on the time period? (Certain pieces of furniture, decorations and placements etc.)" – 6 responses]

However, there were some elements which users felt seemed out of place in the environments, such as aspects they believed to be historically incorrect or that didn't match the surroundings, which is feedback we can definitely use to improve our environments.

CP&E – Script Ideas, Development and Content Cohesion

We have spent some time further developing the script and narrative of the whole experience, writing different parts of the script, including dialogue for certain characters, deciding which NPCs we can use, and working out how to start and end certain sections of the narrative while keeping it all cohesive.

We also continued to develop and refine other scenes based on our story ideas, such as making the young-age scene include a birthday party and showing how the mother could be present without needing other characters in the scene, by setting it before the party begins that day.

We also wrote down some additional notes and ideas after speaking with our teacher about how we could refine and improve elements of the narrative as a whole.

Collaborative Unit – Final Build Updates and Changes, Animation sequencing and Interactions/Triggers

Problems with FBX File Types

Unfortunately, after a number of technical issues and needing to learn new methods of exporting FBX files with animations for Unity, we came across some further issues that we needed to resolve in the time we had left. Due to some file name changes and some animations not linking to the original base models, I had to find a solution to get these working in the scene. So I layered the different changed versions of the models on top of one another so they could be set active and inactive at different times, depending on which animation was ready to play.

Animation Importing and Sequencing

All of the models and animations were imported into the Unity project and placed into the appropriate areas of the scene.

The script to trigger the animations became very complex and, I admit, could have been done a lot better if I had used scripted events properly, but I chose this approach because I knew it would work in the time I had left to finish these scenes. In the images below you can see the long list of public items in the script, such as the different animations, animation triggers, dialogue boxes and the different versions of the seagull models.

After all of the public items were set up, it was simply a long, ordered list of which animations and objects would trigger and how long to wait in between. The different versions of the seagull models were set active and inactive at certain points, and each animation would play for a set number of seconds until the next one was ready to start.
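A simplified sketch of that timed, ordered approach is below (the object references, trigger names and wait times are placeholders rather than the full list used in the project):

```csharp
using System.Collections;
using UnityEngine;

// Sketch of the timed sequence: public references are swapped active/inactive
// and animation triggers fired one after another, with WaitForSeconds
// controlling how long each step plays.
public class BeachSequence : MonoBehaviour
{
    public GameObject seagullIdleModel;   // base model with the idle animation
    public GameObject seagullActionModel; // re-exported model version for the next animation
    public Animator seagullAnimator;
    public GameObject dialogueBubble01;

    public void StartSequence()
    {
        StartCoroutine(PlaySequence());
    }

    private IEnumerator PlaySequence()
    {
        // Step 1: first interaction animation and its speech bubble.
        seagullAnimator.SetTrigger("BInt01");
        dialogueBubble01.SetActive(true);
        yield return new WaitForSeconds(5f);
        dialogueBubble01.SetActive(false);

        // Step 2: swap to the alternate model version for the next animation.
        seagullIdleModel.SetActive(false);
        seagullActionModel.SetActive(true);
        yield return new WaitForSeconds(4f);

        // ...and so on down the ordered list for the rest of the scene.
    }
}
```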

Adding a Navigation Guide

Due to our lack of a tutorial and general player guidance in this version of the project, I wanted to add some form of guidance for the player in this prototype. To do this I added a small faded orange marker which moves up and down, a bit like a waypoint icon, to hopefully give the player a sense of direction. There is one placed on the beach next to the seagulls and another that spawns after the first seagull dialogue to guide the player to the pier for the next section.
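A minimal sketch of the bobbing marker behaviour is shown below (the height and speed values are placeholders):

```csharp
using UnityEngine;

// Sketch of the navigation marker: the faded orange icon bobs up and down
// around its start position to catch the player's eye.
public class GuideMarker : MonoBehaviour
{
    public float bobHeight = 0.25f; // how far it moves above/below the start point (placeholder)
    public float bobSpeed = 2f;     // how fast it bobs (placeholder)

    private Vector3 startPosition;

    void Start()
    {
        startPosition = transform.position;
    }

    void Update()
    {
        float offset = Mathf.Sin(Time.time * bobSpeed) * bobHeight;
        transform.position = startPosition + Vector3.up * offset;
    }
}
```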

Creating the Pier Scene

The pier scene is created in a very similar way to the beach scene: when the player enters next to the ice cream cart they trigger a collider which spawns the pier seagull models, and the animation sequence and dialogue bubbles then play in the scene. This scene is also triggered 20 seconds after the ice-cream-eating mini game that was created by Yiran and Lin.

Adding the Crying Child

In the script, the pier scene features a crying child after the player eats all of the ice cream, and we had the sound arts students create a sound file specifically for this. Due to time constraints the animation students couldn't model this character in time, so I found and edited a model from the asset store which looked and worked great with a simple crying animation we could loop. However, the materials and textures somehow went strange during export and we accidentally made a small devil child, as you can see from the images below. It still works, but it also looks like a Little Sister from BioShock… arguably terrifying, but we had to keep it in for the moment because of how funny it was in the scene.

Adding the Speech Bubbles in Sequence

Now that all of the animations and their triggers for the seagulls were in place, I needed to add synced speech bubbles containing the dialogue, which show and disappear at certain points. The speech bubbles for each dialogue scene are set up using animators that trigger the bubbles at certain timestamps, mostly synced to the dialogue in the scene. These bubbles also appear in the pier scene. The only unfortunate thing is that, due to how these are set up, I did not know the best way to add interactivity, because they despawn after each short dialogue scene, so this is something I would need to rethink in a future version.

Updates to the credit scene

Yiran found a perfect new skybox for the credits scene which makes it look much nicer, and it also works well because it shows a sunset, an end of the day for the seagulls, at the end of the experience. She sent me the skybox and I applied it to our final version of the experience.

Removing the City Scene

After we merged the scenes and the different assets/scripts we had all worked on, we found that the only problem was loading the city scene. Some of the team had been using the OVR platform instead of the XR rig, and when we tried to swap in the correct version, as you can see in the image below, it was just a grey screen and it wouldn't work in a VR headset. We were limited for time and couldn't resolve this issue, so unfortunately it ended up on the cutting room floor for the time being, unless we revisit this project in the future, as we hope to with a more polished, complete version.

Additions to the Environment

I thought it would be nice to add a few more little elements around the environment, and to make use of some of the test models that were given to us by the animation students, so I found it really funny to have a few of them sat in the beach chairs around the area.

Creating the Final Build

After trying to achieve as much as we could before our final crit and deadline, we were in a good place to produce a fully playable build. While there are of course five scenes in the Unity project for this experience, we knew we would need to cut the city scene for now, so we sequenced it all so that after the sky scene the player goes straight to the credits instead, as you can see in the screenshot below. Overall the final build worked great with minimal issues and we were extremely happy that it was fully playable from start to finish.

Collaborative Unit – Finalising before final crit presentation

The Introduction Scene

After creating the original basis for the introduction scene, I wanted to develop it further to include the opening lines from the script written by Lin: 'Water goes in, water goes out' and 'Oh, it's my feet'. To simulate this I fixed the XR rig in place so the player cannot walk or move around within this scene, then created a white sphere and added the seagull feet, so if the player looks down they might feel like they have the body of a seagull, which I thought would be quite funny for the player.

As an introduction for the player, similar to the credits, a series of different texts appears on screen. We have now created a couple of texts which explain to the player how to play the game, including what the triggers and grips do, with some small instructions on how to grab and eat objects, just to show the player what they can do during the experience. This was again created using a series of PNG files that are coloured in Unity and shown in sequence using the Animator.

The Credits Scene

After mostly completing the general framework for the credits and introduction scenes, I felt that the island could look much better and more interesting, so I went back to the asset stores and found some new island models and assets which I felt made the scene more welcoming and more interesting for the player to look around during these short sequences.

Adding Audio to the Sky Scene

Now that the sky scene is set up with the player standing on top of the seagull, I began to add the audio dialogue for the scene. The dialogue is told in three parts, so three speech bubbles appear over time, which the player can also grab with their controllers to throw and move around.

Sky Scene Boundary Borders and Wing Animations

When testing the sky scene I realised that there would definitely need to be some boundary walls to stop the player from falling off the seagull while moving, so these were simply created using box colliders with no mesh renderers, making them invisible during gameplay.

To add some movement and life to the scene, I thought it would be great to have a slight animation that makes the wings flap up and down in a loop to simulate the seagull actually flying through the sky. This was done simply with an animation that rotates each wing up and down back to the same position so that it loops smoothly throughout the scene.

Importing the Animation Progress

Syncing Up the Animations

After receiving some of the initial animations for the beach scene, I started to look into how they could be organised in one animator that could be triggered during gameplay. Initially I have the idle animation as the default in the animator, and then public strings labelled 'BInt01' etc., which to me means Beach Interaction 1; they are all labelled in this way. This way, in the script I can time the events so that animation 1 plays, loops back to the idle animation after completing, and then the next string can be triggered, and so on until all have been completed.

Triggering Animations

Now that some animations were set up and working in the scene, I had to find a way to trigger them when the player walks over to the seagulls. I set up another box collider around the seagulls to act as a trigger, so that when the player enters the collider and is detected by the tag check, the animation on the seagulls plays. The plan is to have the idle animation as the default, always running on loop, with the animation scenes playing on entering the trigger. This method will be used for all of the animations throughout the experience, playing one after the other or on a certain event or trigger caused by the player.
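A simplified sketch of this trigger setup is below (the 'Player' tag and the trigger name are assumptions for illustration):

```csharp
using UnityEngine;

// Sketch of the trigger setup: a box collider (Is Trigger) around the seagulls
// fires the next animation trigger when the player's rig enters it. The idle
// animation stays as the animator's default state and each interaction state
// transitions back to idle when it finishes.
public class SeagullTrigger : MonoBehaviour
{
    public Animator seagullAnimator;
    public string animationTrigger = "BInt01"; // e.g. Beach Interaction 1
    public string playerTag = "Player";        // tag on the XR rig (assumption)

    private bool hasPlayed;

    private void OnTriggerEnter(Collider other)
    {
        if (!hasPlayed && other.CompareTag(playerTag))
        {
            seagullAnimator.SetTrigger(animationTrigger);
            hasPlayed = true; // only fire this interaction once
        }
    }
}
```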

Scene Management

The general framework was set up for each scene to transition to the next, with some transitions timed and others triggered. However, when we initially tried this you could see a slight stutter, and it felt strange to have the scene change instantly, so we knew we would need a scene transition effect. One of the initial ideas was an iris wipe, but we weren't quite able to figure out the best way to achieve it, so instead I used a black canvas that appears in front of the player's main camera and grows and shrinks from the middle of their field of view to create a wipe transition effect.

This effect is triggered at certain points in the experience, causing the wipe to appear for a few seconds; then the scene changes and the same canvas plays in reverse in the new scene, wiping away to reveal the new scene the player has entered. Currently this transition takes up to 10 seconds, which may be a long time, but it works in the grand scheme of things at this stage.
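A rough sketch of the growing wipe before the scene change is shown below (the image reference and duration are placeholders; a matching script in the new scene would shrink the canvas back down in reverse):

```csharp
using System.Collections;
using UnityEngine;
using UnityEngine.SceneManagement;

// Sketch of the wipe transition: a black UI image in front of the camera is
// scaled up from the centre of the player's view, then the next scene is loaded.
public class WipeTransition : MonoBehaviour
{
    public RectTransform wipeImage; // full-screen black image on a camera-facing canvas
    public float wipeDuration = 2f; // placeholder timing

    public void TransitionTo(string sceneName)
    {
        StartCoroutine(Wipe(sceneName));
    }

    private IEnumerator Wipe(string sceneName)
    {
        // Grow the black image from nothing until it covers the view.
        for (float t = 0f; t < wipeDuration; t += Time.deltaTime)
        {
            wipeImage.localScale = Vector3.one * (t / wipeDuration);
            yield return null;
        }
        wipeImage.localScale = Vector3.one; // make sure the view is fully covered
        SceneManager.LoadScene(sceneName);
    }
}
```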

CP&E – Merging Environments and Preparing User Test Iteration

Everyone in our group had spent the past week finding assets that could be added to the different time-period scenes, from furniture and appliances to other decorations.

Everyone in the group then added their assets to the different scenes, and we spent some time merging them into one singular scene, with each apartment containing its assets ready for the user testing stage of the experience. Some more time will need to be spent finalising our choices and making sure we have enough changes for the differences to be noticeable to the user in our testing stage.

CP&E – Populating the 1990s Apartment for research iteration 01

I had chosen to work on the 1990s version of the apartment, so now that everyone in the group has an apartment prefab to work on, I can focus solely on the 1990s apartment, finding assets and doing research for this particular time period.

Museum of the Home Inspiration

A couple of weeks ago our group went to the Museum of the Home, which I covered fully in one of my previous blog posts. One of the great things about visiting the museum was that it had an apartment from 1990s Britain, so I took as many pictures as I could to inspire my choice of assets for my environment's interior design.

Other research of 1990s British Apartments

Finding Assets and Models

After my research and the photos taken of the 1990s apartment, I started to look on the Unity Asset Store as well as other websites such as turbosquid, Sketchfab and freemodel to find the closest resemblance to 1990s furniture that I could, to try and recreate this style.

I started by finding the more obvious assets such as a sofa, chairs, lamps, rugs and other elements for the living room. I then looked for electronic appliances for the kitchen, like a fridge, microwave, toaster and kettle. One very stand-out asset is the television set, along with a 1990s computer; both have of course greatly changed and evolved over time, so they should stand out a lot to the user.

Populating the Environment

Now that I had assembled a folder of all the different assets, I started to populate the environment with them, which overall was quite interesting to do. While these assets work well for a research iteration test in which users notice or view objects from different time periods, I would need to find a lot more assets and decorations to fill out the environment and truly immerse the player in that time period. Lighting and textures will also be very important for creating the general tone and mood of the environment, and we still need to decide between creating a skybox, having outside assets, or simply putting blinds or curtains over the windows.

Collaborative Unit – Resolving Animation Issues, Adding Interactivity and Scene Management

Working Model and Animation Import

After a very long week of trying and failing to add the models with animations, earlier this week we finally had some success after the animation students changed some elements of the rig and mesh and learned new ways to export the model for Unity. We eventually had the model mostly working in a Unity scene, as shown in the images below:

The only slight issue we still have seems to be with the wings, but the rest of the body is working and looks great for now; if we have time we can try to resolve this, but at the moment it will be fine for our short concept!

We then spent time exporting test animations and, after some trial and error, we were able to get it all working; the images below show the animation files appearing in the scene and having an effect on the model:

The image below shows one of the final models fully working and animated. The only remaining issue is the wings looking a bit jagged, but in the grand scheme of things we can overlook this for now while we develop the rest of the project and add in the rest of the animations.

Finalising navigation and Teleportation

Initially, last week, I added a full teleportation area that the player could move around, but we wanted specific points that the player could navigate to around the beach, so I started to add anchors at the different areas the player could teleport to, as shown in the image below:

However, after some tests and moving the anchor points around, I realised that this did not work very well and felt very restrictive for the player, so we decided to go back to the full teleportation area. The player also has access to the continuous turn provider to reposition themselves after they teleport.

However, after further discussions and tests with the group, Yiran had the great idea that we could also have continuous movement in the experience, so that this is how the seagull walks and the teleportation is how the seagull flies, which we felt was genius for this experience. So now the player has the option to teleport around the beach using the right trigger or to walk around using the left joystick, which after some tests worked extremely well. We just need to make sure to explain this in the tutorial instructions for the player.

In the story the player needs to move from the beach to the pier, so, having included the idea of walking and flying, I decided the player should fly to the pier after being prompted to. I have added a larger teleportation anchor on the pier, so if the player points their teleportation ray at it they will appear on the pier fence.

This was then also replicated for the ice cream stand, so if the player points their controller at the stand they will 'fly' over to it; this is where they start the mini game to eat the ice cream and the child starts to cry. Some boundaries will need to be added to stop the player from falling off the pier or the ice cream cart during the experience.

Creating the Credits

I also created the credits scene for the end of the experience. I made it as a separate scene and kept it fairly simple, so the player stands stationary on a small island and, if they look around, will see both seagulls sat behind them. The credits were created as simple PNGs from Illustrator that go through the full list of credits: us in VR, the 3D animation students, the BA Sound Arts students, and my friend Alex who provided one of the voices for the game. The transition for the credits is a simple fade in and fade out animation for the different sprites in the scene.

Creating the Menu Scene

After creating the credits I copied the same scene, which I plan to use as a base for the menu. I still need to research creating a VR menu with a button to start the experience, but in this scene we wanted the 'Water goes in, water goes out' dialogue, so I've placed the XR rig in a certain position with the seagull feet below the player, so they can see them and feel like a seagull in this moment, while also serving this point of the narrative.

Adding Eating Interactivity

One very simple interaction in this game is eating certain foods in the environment. I found a 'fries' model on turbosquid and added it to the scene, ready for that particular animation, and I separated it so that one fry is interactive and can be grabbed out of the packet by pressing the inner grip button.

A collider was then added to the player camera, so the player will always have a collider on their head which acts as the player's 'mouth' during the game. When they lift the fry towards their head, it disappears and an eating noise plays.

This was achieved simply by attaching an audio source with the eating sound effect to the head collider and adding a tag to the 'food' assets, so whenever they collide the food is destroyed and the sound effect plays. This worked very well during testing.
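A simplified sketch of that head 'mouth' collider script is below (the "Food" tag name is an assumption for illustration):

```csharp
using UnityEngine;

// Sketch of the 'mouth' collider: attached to the trigger collider on the
// player's head camera. Anything tagged as food that touches it is destroyed
// and the eating sound is played.
[RequireComponent(typeof(AudioSource))]
public class MouthCollider : MonoBehaviour
{
    public AudioClip eatingSound;

    private AudioSource audioSource;

    void Awake()
    {
        audioSource = GetComponent<AudioSource>();
    }

    private void OnTriggerEnter(Collider other)
    {
        if (other.CompareTag("Food")) // tag name is an assumption
        {
            Destroy(other.gameObject);            // the fry disappears
            audioSource.PlayOneShot(eatingSound); // munching sound effect
        }
    }
}
```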

Adding Hitting Interactivity

One of the funny elements in our game is slapstick comedy and being able to hit the other seagulls with certain objects, or even with the speech bubbles, so I added some code so that whenever the player grabs certain objects and hits the seagulls with them, a sound effect plays. This also works very well: whenever they collide you feel like you're hitting them, and this will hopefully be adapted further to trigger a hit animation on collision as well, to add to the comedy factor of this interaction.

Teleportation Sky Scene

After mostly finishing the teleportation for the beach and pier scenes, I wanted to add it to the sky scene. This one does not have continuous movement but three teleportation anchors that the player can move between if they choose, or they can simply stay on the seagull's back during the short time they spend in this environment.