AR Ceramics – Exporting/Animation Issues and Finalising the Augmented Experience

I continued to have the exporting/animation issues I had over the Christmas break and was unfortunately not able to find a solution to this problem in time. To meet the deadline, I chose to use the original model I had created in Blender and animate it in Unity.

Image Target

I spent some time trying to choose a more visually appealing image target, and the images below show some of the options. I wanted to use a lotus flower style image, but unfortunately none of these images worked too well as an image target during general tests.

So after testing these different images, I went back to one of the original cartoon/drawn images of a pink lotus flower, shown in the bottom-right image above.

Model Updates

I went back to the original model that I used during some of my initial tests and changed the colour to better match the pink of the image target. Using this model, I measured the size of the new image target and created a 'Vase' sphere to work out exactly where the flowers should bloom, which took some trial and error to get the placement correct.

Updated Animation

After placing the models, I started to experiment with how to create the flower blooming animation in Unity, which resulted in the video capture below:

Arranging Around the Vase

After creating the animation, I went back to perfecting the positions of all the flower models around the vase and seeing how the animation works as a whole, which overall turned out very well.

Adding Interactivity

It was suggested that I could add a bit more to this AR piece by experimenting with some interactivity, even something as simple as pressing the screen to trigger an animation, effect or model. I started by adding a canvas button which, when pressed, could trigger an animation, so initially I added the standard button into the scene, as you can see in the image below.

I then went into the options for the button canvas, removed the text and made the button transparent so that it would fill the whole screen. This way the user can press anywhere on their mobile device and trigger the animation.
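A minimal sketch of how this full-screen tap trigger can be wired up in Unity (the component and the "Bloom" trigger name are illustrative, not the project's actual identifiers):

```csharp
using UnityEngine;
using UnityEngine.UI;

public class TapToBloom : MonoBehaviour
{
    public Button fullScreenButton;  // transparent Button stretched over the whole canvas
    public Animator flowerAnimator;  // Animator on the flower models

    void Start()
    {
        // Any tap/click anywhere on the screen fires the bloom animation.
        fullScreenButton.onClick.AddListener(
            () => flowerAnimator.SetTrigger("Bloom"));
    }
}
```

Because the button has no text and a fully transparent image, it still receives pointer events while staying invisible to the user.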

Colour Change

After successfully triggering an animation with the button, I decided to add a little more by having an altered blooming animation with a colour change. When the image target is initially scanned, the flower appears in the same pink colour as all the others, but when you tap your finger or click the screen it triggers a new animation and 'grows a green flower' instead, as you can see from the images below.
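One way to sketch this two-state bloom: swap the material and fire an alternate animation on tap (the "GrowGreen" trigger and material names here are assumptions for illustration):

```csharp
using UnityEngine;

public class ColourBloom : MonoBehaviour
{
    public Animator animator;
    public Renderer flowerRenderer;
    public Material greenMaterial;  // swapped in when the alternate animation plays

    // Hooked up to the full-screen button's OnClick event.
    public void OnScreenTapped()
    {
        flowerRenderer.material = greenMaterial;
        animator.SetTrigger("GrowGreen");  // alternate 'green flower' bloom
    }
}
```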

Adding Sound

After trying to record some sound effects, I found that the recording equipment I had wasn't working too well, so I wanted to record a different sound effect. The main premise of this piece is flowers growing and blooming, so I felt that a pouring water or watering sound effect would work well, and I went out and recorded these exact sounds.

Exporting to Mobile

After finishing all of the updates for my AR Ceramics project, I tested exporting the project for mobile, both through an emulator and on my own Android mobile device. The results are shown in the video capture below:

Image target size issues

The project successfully works through the emulator using my webcam on my PC, and it also works when using a large A4 version of the image target on my mobile device. However, my phone's camera could not focus well enough to detect the smaller image target on the vase, which is unfortunate for the testing phase; this may just be an issue with my particular phone. If I had more time I would test this on different Android devices, or try a larger vase with a larger image target, to see whether a phone would pick up the image better in future versions.

Stutter – Polishing, Environment Updates, Playable Build, Testing and Video Capture

Over the Christmas break I spent some time testing and hunting for any final bugs or issues, but there were not too many that needed fixing, and overall it plays how I wanted it to at this point of the project.

Environment Updates

Overall I am very happy with the environment; it looks very typographic, sepia-toned and dark in nature, as I originally pictured it, but I wanted to add a last few details to finish it off. I started adding some extra books and pages around the environment as final touches to the experience, as seen in the images below.

Playable Build

After making the final changes to the project, I was able to successfully export it as a playable build, which thankfully works without any known bugs or issues and plays easily through an Oculus headset (Meta now…). The gameplay capture videos below are from the playable build file.

Gameplay Capture

Gameplay Commentary

User Testing

Some of my friends were polite enough to try out and test the final build of the game, and I asked them to give some of their thoughts, which are written below:

Tester 01: “Firstly I really enjoyed playing the shutter experience. The darkness of the environment with the trees created a very good sense of claustrophobia. The gameplay is simple yet very effective and how it ramps up difficulty in each section is well done making the final level quite stressful compared to the first 2. An improvement I’d suggest is a recentre button as the table height was very low even when sitting. Also I noticed in the introduction level if you drop “Hello” it starts rotating around a different point and can be lost through the floor but I don’t believe this happens in other levels”

Tester 02: “Overall I really enjoyed the game but felt the position of the table was a little low and I did not like the font at the end. I get that it’s trying to be harder but don’t like the changing fonts”

Tester 03: “The VR experience you’ve made is really interesting – big fan of how the audio itself stutters as you’re trying to grab the words. I like the environment too.”

AR Ceramics – Blender 3D Modelling and Animation Issues

During this past week I have been trying to export animated models based on tutorials that I followed online, but unfortunately I have continued to have issues with the software crashing, models not showing, or animations not working in Unity.

Models not exporting correctly

Some of the methods that I previously found online had issues where the animation wouldn't bake, or would only partially bake, resulting in models frozen partway through the animation, appearing like the image below:

Model/Plane property Issues

Using some other methods in Blender to create flower models, such as using certain faces of a UV sphere, also resulted in the models being recognised as planes in Unity, meaning that one side is completely see-through while the other side works correctly, which resulted in the image below for some models:

Animations not working

After a while, some models would export correctly as static models, but any animations that I created in Blender wouldn't translate correctly into Unity, or at least wouldn't work with my current knowledge of both pieces of software, meaning that the animations did not work or only partly worked for certain areas or aspects of each model.

Next Steps

Over the Christmas break I will need to continue to find a solution to my exporting/animation problems with the models, or find a different approach to continue developing this idea for my AR Ceramics project.

Stutter – Polishing Some Bugs/Gameplay Issues

Boundary Area

I initially placed the boundary area around the entire environment, but I realised that some of the walls were not working correctly and it was still possible for the player to walk off the edge of the environment. After checking through again, I found that it was a simple issue where the walls had a box collider instead of a mesh collider. Once this was updated, the boundary walls worked correctly.
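In my case the fix was done in the editor, but swapping the colliders could equally be sketched in code, assuming each wall object starts with the wrong BoxCollider:

```csharp
using UnityEngine;

public class WallColliderFix : MonoBehaviour
{
    void Awake()
    {
        // Replace the inaccurate BoxCollider with a MeshCollider that
        // matches the wall geometry exactly.
        var box = GetComponent<BoxCollider>();
        if (box != null) Destroy(box);
        if (GetComponent<MeshCollider>() == null)
            gameObject.AddComponent<MeshCollider>();
    }
}
```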

Hard Difficulty Fixes

After doing some playthrough tests of the experience, I realised that the third word, 'We', in the hard difficulty stage wasn't snapping like the other words, but after a few attempts to put it in place the counter would still increment and eventually trigger the transition to the credits.

After going through all of the words and scripts for this bug, I realised it was a simple 'Tag' problem where one of the words was incorrectly labelled. Thankfully, after changing this, the rest of the experience worked correctly.
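A minimal sketch of the kind of tag-based check involved (the tag string, field names and shared counter here are illustrative, not my actual script):

```csharp
using UnityEngine;

public class CorrectWordZone : MonoBehaviour
{
    public string expectedTag = "CorrectWord";  // must match the word's Tag exactly
    public static int counter;                  // shared correct-word counter

    void OnTriggerEnter(Collider other)
    {
        // A single mislabelled tag here makes the word silently fail to
        // register, which is exactly the bug described above.
        if (other.CompareTag(expectedTag))
            counter++;
    }
}
```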

Instruction Updates

After trying the experience again, I realised that the instructions at the start don't really explain the controls to the user. A casual player may not realise straight away how to control the game, so I have now added instructions on how to use the triggers to grab and what each controller joystick does for character movement around the environment.

Stutter – Final Crit Presentation and Feedback

This week we had our final crit presentations, where we presented our current progress with each project and had a chance to receive feedback and guidance on where to take the project next before the final deadline.

Stutter Draft Gameplay Capture

The video below shows the current iteration of Stutter and its gameplay, which I presented to the class during the presentation, showing the general gameplay, environment and sound, and discussing any changes and updates I'd like to make for the rest of the project's duration.

During the presentation I also displayed my current progress with the game design document, world layout, player journey and the environment in Unity.

Feedback and next steps

After presenting to my class and teachers, I received some very helpful and positive feedback regarding the VR section of this experience. The main feedback was to possibly add a few more elements to the environment, to work on the sound mixing, and possibly to make some changes to the word rotation animations and the fonts being used. I will of course also need to work on having a fully playable, polished build with minimal bugs for user testing and further feedback and suggestions.

AR Ceramics – Final Crit Presentation and Feedback

This week we had our final crit presentations, where we presented our current progress with each project and had a chance to receive feedback and guidance on where to take the project next before the final deadline.

Blender Animation

The video below shows my current Blender model and the animation that I created for this AR piece. However, I was only able to show it in Blender, as I was having some issues exporting the model and animation for use in Unity, which I will need to look into to continue this project further.

AR Ceramics Test footage

The video below shows my current experimentation with the AR Ceramics piece, using my original flower models and animating them simply in Unity to demonstrate how this would work in AR, using a QR code as the target for the whole experience.

Feedback and next steps

After presenting the current progress of my AR project, including the current models, animations, tests and game design document, the feedback was very positive, and it was agreed that it would be great to have the Blender model fully working in Unity so I can develop this idea and concept further into a refined and polished experience. I will also need to start testing the experience on a mobile device using Vuforia.

Stutter – Game Design Document and Player Journey Updates

After spending a majority of my time on this project developing the experience in Unity, I spent this past week going back and writing up the game design document and making updates to the player journey for this experience.

Game Design Document

We were given a template/guide on how to create the game design document, and using this guide I was able to develop and write a full GDD for 'Stutter'. This document explains the story, gameplay, influences, mechanics, what sets this piece apart, and the required assets. The player journey and world layout were also included in this document.

The Player Journey V02

Initially my player journey only focused on displaying the correct and incorrect words for the player, but I have now tidied up the journey and also included a world/environment layout to fully show and explain what you will be doing during the entirety of the experience.

AR Ceramics – Recording Sound, Vase Animations and Modelling

During this week I have been further experimenting with my concept for the AR experience in Vuforia. Using the model that I had previously created in Unity, and after receiving some feedback and input from my tutors, I started to experiment with a larger quantity of the model to achieve full coverage of the vase.

Using an approximate sphere model based on the actual size of the vase, I started to move, rotate and plot all of the different flower models around it. This took me a lot longer than expected, due to having to carefully position and rotate all of the models to give a curved effect around the object, but I was eventually able to achieve this.
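The hand-plotting above could also be automated. A sketch of the idea, placing flowers evenly around a circle that approximates the vase's circumference and facing each one outwards (the prefab, radius and count are illustrative assumptions):

```csharp
using UnityEngine;

public class VaseArranger : MonoBehaviour
{
    public GameObject flowerPrefab;
    public float radius = 0.1f;  // approximate vase radius in metres
    public int count = 8;

    void Start()
    {
        for (int i = 0; i < count; i++)
        {
            // Evenly spaced angles around the vase.
            float angle = i * Mathf.PI * 2f / count;
            Vector3 offset = new Vector3(Mathf.Cos(angle), 0f, Mathf.Sin(angle)) * radius;
            var flower = Instantiate(flowerPrefab,
                transform.position + offset, Quaternion.identity, transform);
            // Face each flower outwards so it follows the vase's curve.
            flower.transform.rotation = Quaternion.LookRotation(offset.normalized);
        }
    }
}
```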

The images below show the effect live through my webcam, which luckily did work successfully; if you rotate the vase around, it gives quite a convincing effect of the flowers moving around the curve with the vase, which is exactly what I wanted.

Blooming Animation

While the models work correctly through Vuforia by being displayed around the image target, I also tried out some of the animation options in Unity. This meant I could have the flowers changing size, rotating, and offsetting from one another to create a type of pattern. After experimenting with these different animations, I feel confident that this concept will work for the final experience.

Exporting my Blender Animation (problems)

Alternative Flower Model

Although the initial flower tutorial I followed did create a nice flower model which I felt would suit this project well, I decided to try out a different tutorial, based more on a lily pad, to see if I would have more success exporting the animation. The screenshots below show some of the steps I followed in creating the flower model.

One aspect of this tutorial I quite enjoyed was the in-depth look into the shader section of Blender, creating multiple colour options including shadows and textures, which turned out very nicely overall.

Unfortunately I still had some trouble trying to export my flower model: the animation works in Blender, but however I export the model, only the central petal seems to have any animation and the outer petals don't move. I will need to spend some time this week finding ways to export my models so I can experiment further in Vuforia.

Recording Sound

The main theme of this concept is of course flowers and plants, so sound effects of this nature would make sense for the AR experience. I started to go around outside, and luckily I am near an ecology park where I will hopefully be able to record some different sound effects. I also want to record some water droplet sound effects, now that I have the option of using the lily pad model animation, which I think will suit my AR experience.

Next Steps

The next steps for this augmented reality experience are to achieve a successfully exported blooming model from Blender, and to find and print a better, more visually appealing image target to place on the vase. As a stretch goal, it would be interesting to have another image target option with a different animation effect or plant/flower.

Stutter – Credits, Navigation, Hand Models and Difficulty Transitions

Hand Models

One of the main aspects of this experience was of course to add some hand models so the player can see where their hands are in virtual space, but this is something I left until later. I started to research ways to add hand models with the XR Interaction Toolkit, which didn't seem too difficult overall, because you can attach and resize imported objects on each controller, which will then appear on screen. At the moment the hands are static models without any motion or animations, but I am hoping that gesture or grab animations won't take too long to implement into my gameplay.

Adding Navigation

Navigation is another aspect where originally I wanted the experience to only work if the user could physically step around the environment, but of course I needed to think about the accessibility of the experience and how some people might play sitting down. Researching further into the XR Interaction Toolkit, it was also not too difficult to add quite a simple form of navigation to the experience.

I have now added a continuous move provider, based on a locomotion system, so that using the left-hand controller joystick the player can move around the environment at a relatively fast pace, which gives the opportunity to further explore the environment they are playing in. I then included a continuous turn provider, which allows the user to turn the camera left and right using the right-hand controller. This is currently set to a continuous motion, which after a few attempts gave me slight motion sickness, so I want to research this further to have set turns that will instantly turn the player in 60-degree increments, for example.
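The swap from smooth to snap turning could be sketched roughly as below, using the XR Interaction Toolkit's action-based turn providers (exact component names vary between toolkit versions, so treat this as an assumption-laden sketch):

```csharp
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

public class TurnSetup : MonoBehaviour
{
    public ActionBasedContinuousTurnProvider continuousTurn;
    public ActionBasedSnapTurnProvider snapTurn;

    // Disable the smooth turn that caused motion sickness and enable
    // instant fixed-increment turns instead.
    public void UseSnapTurn()
    {
        continuousTurn.enabled = false;
        snapTurn.enabled = true;
        snapTurn.turnAmount = 60f;  // instant 60-degree increments
    }
}
```

Snap turning is a common comfort option precisely because continuous camera rotation is one of the biggest motion-sickness triggers in VR.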

Boundary Walls

After adding navigation, it became apparent to me that the boundary walls I originally planned to add now had a clear use in stopping the player falling out of the environment. It was quite simple to add walls with a collider and rigidbody but no mesh, creating an invisible boundary that the player cannot cross to leave the environment.

Introduction Screen

After having our lesson/class on ethics in virtual reality, I knew that it was extremely important to include some information or a disclaimer screen to explain and justify the experience to the user. I also felt that including a title screen helps round off the whole game as a full experience with a clear start and a clear end. To create this, I decided to use a series of animations to keep it simple, so that the title 'STUTTER' appears, followed by the first disclaimer and then the tutorial information for how the player can start the experience.

Outro Screen

The outro screen was made in a very similar way to the introduction screen, through a series of animations all timed to fade in and out after one another. It starts by displaying a message of information about speech impediments and stuttering, followed by a message that I think is important to show players to wrap up their understanding of the full experience.

Typography/Word Updates

After some initial viewings of my VR experience, multiple people told me that the typography I had chosen was quite difficult to read, and asked if I had done this on purpose. After quite a few people had mentioned this, I knew I would need to change the typography, so I went back into my original Illustrator file and changed the fonts to something much more legible, as you can see from the images below.

Difficulty Transitions

For this project I wanted to avoid having different levels and transitions, to keep the scope and time frame achievable within the given deadline. This meant that transitions between my difficulty 'layers' would be one of the most important aspects, and something I knew I would need to leave until last, after everything else in the experience was completed.

This week I focused on ways to handle the transitions by fading the different sections, or 'Parent' objects, in and out; these are the objects into which I have separated each difficulty, disclaimer and conversation. To start this process, I created a canvas that would be triggered using an animation to create a fade-to-black effect, blocking the player's view while different sections load in and out of play. This was then achieved by identifying each section as a game object and using 'StartCoroutine' and 'IEnumerator' so I could time the transition effect between each difficulty level.

Each of the 'CorrectWord' boxes was then assigned a counter, so that every time a correct word was shown in the box, the counter would go up by 1. This meant that if all three words were correct in the easy difficulty, the counter would reach 3. The transition was then triggered by an if statement: if the counter was at 3, the animation fade would start and the next difficulty level would load in. Each difficulty was assigned its own counter target, and so on, which luckily achieved this effect with great success. I still need to play around with the timings and lengths to see what works best, but I'm confident that over time I can find a good length of time between levels.
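The counter-plus-coroutine pattern described above can be sketched roughly like this (the "FadeToBlack" trigger, field names and timings are illustrative assumptions, not my exact script):

```csharp
using System.Collections;
using UnityEngine;

public class DifficultyManager : MonoBehaviour
{
    public Animator fadeAnimator;    // canvas animation that blocks the player's view
    public GameObject easy, medium;  // 'Parent' objects holding each difficulty
    public float fadeTime = 1.5f;    // transition length, still being tuned

    int counter;

    // Called by each 'CorrectWord' box when a correct word snaps in place.
    public void WordPlacedCorrectly()
    {
        counter++;
        if (counter == 3)  // all three easy words placed
            StartCoroutine(Transition(easy, medium));
    }

    IEnumerator Transition(GameObject current, GameObject next)
    {
        fadeAnimator.SetTrigger("FadeToBlack");
        yield return new WaitForSeconds(fadeTime);  // swap while the screen is black
        current.SetActive(false);
        next.SetActive(true);
    }
}
```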

Next Steps

The next steps for this experience are to start some user testing to see if all of the gameplay mechanics and the understanding of the experience work for the user or player. I will also need to go back over and check the sound and audio for the whole experience, and make sure that the timings for transitions and difficulty lengths work together to make a satisfying game for the player.

AR Ceramics – Unity Development and Camera Tests

Now that I have looked at ways to develop a flower model and animation for my AR project, I wanted to test how successfully these models would work in action on a curved surface such as a vase. I tried some initial tests with the image target of the notebook I was using last week, and this again seemed to work fine on my webcam, with the model showing up in high quality, which is reassuring for what I had created in Blender.

Full disclaimer: I must admit that the environment I was testing all of my assets in was not very well lit, but it worked overall for my initial tests. I found a small vase I could use for the tests, which may not be the final object that I use for the project, but it will work great as a proof of concept to see how I can model everything onto a curved surface.

I then printed a small QR code to use for these tests, as I would like a more appealing image target on the final product. The QR code was taped to the vase to act as a simple trigger for the experience, and once again, using the flower model I had previously created, I made a new image target. Luckily (although the lighting did not help), the image target was successful and the flower did appear on the side of the vase, which I was extremely happy about, and I was excited to move forward testing out this idea.

After this, I started to move the flower models outside of the QR code and into different areas on the X, Y and Z axes to see how it would all interact. Each model was given a different rotation, and all had animations making them spin within the area to see how movement would work. Luckily, after testing, this also worked successfully overall, and through trial and error I could get the models to look like they are balanced on top of, or growing from, the vase. This worked even in my low-light testing conditions, so I was fairly confident it would work a lot better in a well-lit environment, or perhaps through a mobile device with a flashlight.

I also wanted to do a small test to see what it could look like with the flowers actually blooming from the vase, so I tried a simple animation of the objects' scale going from 0 to full size. Again, this worked very well as a proof of concept, as shown in the videos below.
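The scale-up bloom could equally be done in a short script rather than an animation clip. A minimal sketch (duration value is an assumption), growing each flower from zero back to its original scale:

```csharp
using System.Collections;
using UnityEngine;

public class SimpleBloom : MonoBehaviour
{
    public float duration = 2f;  // seconds from zero to full size

    IEnumerator Start()
    {
        Vector3 fullSize = transform.localScale;
        transform.localScale = Vector3.zero;
        for (float t = 0f; t < duration; t += Time.deltaTime)
        {
            // Interpolate the scale a little further each frame.
            transform.localScale = Vector3.Lerp(Vector3.zero, fullSize, t / duration);
            yield return null;
        }
        transform.localScale = fullSize;  // snap to the exact final size
    }
}
```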

Next Steps

I am happy with how the initial tests worked for these models, and I think the next stages will be to develop better models and animations to be triggered on the vase. Another great addition would be interactive buttons which appear on the ceramics after they are scanned by a mobile device. This could also be developed so that I have multiple vases with different animations or models appearing, to show how this could be used in an exhibition with multiple ceramic pieces for users to experiment with during the experience.