Now that I have looked at ways to develop a flower model and animation for my AR project, I wanted to test how successfully these models would work in action on a curved surface such as a vase. I ran some initial tests with the image target of the notebook I was using last week, and this again seemed to work fine on my webcam, with the model showing up at a high quality, which is reassuring for what I had created in Blender.

Full disclaimer: I must admit that the environment I was testing all of my assets in was not very well lit, but it worked overall for my initial tests. I found a small vase I could use for the tests; it may not be the final object I use for the project, but it works well as a proof of concept to see how I can map everything onto a curved surface.

I then printed a small QR code to use for these tests, as I would like a more appealing image target on the final product. The QR code was taped to the vase to act as a simple trigger for the experience, and once again using the flower model I had previously created, I made a new image target. Luckily (although the lighting did not help), the image target was successful and the flower did appear on the side of the vase, which I was extremely happy about, and I was excited to move forward with testing out this idea.
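For reference, a printable code like this only takes a few lines of Python to generate. The sketch below is just an illustration of that step, assuming the third-party `qrcode` package; the encoded text and filename are placeholders, not my actual marker.

```python
# Minimal sketch: generate a printable QR code image to tape to the vase.
# Assumes the third-party "qrcode" package (pip install qrcode[pil]);
# the payload text and filename are placeholders, not my actual marker.
import qrcode

img = qrcode.make("flower-vase-test")  # any short string works as the payload
img.save("vase_marker.png")            # print small and tape to the ceramic
```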


After this I started to move the flower models outside of the QR code and into different areas along the X, Y and Z axes to see how it would all interact. Each model was placed at a different rotation, and all had animations making them spin in place so I could see how movement would work. Luckily, after testing this, it also worked successfully overall, and through trial and error I could get the models to look like they were balanced on top of, or growing out of, the vase. This worked even in the low light conditions during testing, so I was fairly confident it would work a lot better in a well lit environment or perhaps through a mobile device with a flashlight.
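For anyone curious, this kind of placement and spin can be set up with keyframes through Blender's Python API. The sketch below is only a rough illustration of the idea, assuming a flower object named "Flower"; the offsets, angles and frame numbers are placeholders rather than my actual scene.

```python
# Rough Blender (bpy) sketch: scatter copies of a flower around the marker
# and keyframe a continuous spin on each one. Object name, offsets and
# frame range are assumptions for illustration, not my actual setup.
import bpy
import math

source = bpy.data.objects["Flower"]  # assumed name of the imported flower model
offsets = [(0.2, 0.0, 0.1), (-0.15, 0.1, 0.25), (0.0, -0.2, 0.35)]  # X, Y, Z offsets

for i, (x, y, z) in enumerate(offsets):
    copy = source.copy()                         # duplicate, sharing the mesh data
    bpy.context.collection.objects.link(copy)

    copy.location = (x, y, z)
    copy.rotation_euler = (0.0, 0.0, math.radians(30 * i))  # start at a different angle
    copy.keyframe_insert(data_path="rotation_euler", frame=1)

    copy.rotation_euler[2] += math.radians(360)  # one full spin over the clip
    copy.keyframe_insert(data_path="rotation_euler", frame=120)
```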


I also wanted to do a small test to see what it could look like with the flowers actually blooming from the vase, so I tried a simple animation of the object's scale going from 0 to full size to show this effect. Again, this worked very well as a proof of concept, as shown in the videos of the process below.
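The bloom effect itself is essentially just two scale keyframes. A minimal Blender Python sketch of that idea is below; the object name and frame numbers are assumptions for illustration, not my exact animation.

```python
# Minimal sketch of the "bloom" test: keyframe the flower's scale from 0 to
# full size so it appears to grow out of the vase. Object name and frame
# numbers are placeholders for illustration.
import bpy

flower = bpy.data.objects["Flower"]  # assumed name of the flower model

flower.scale = (0.0, 0.0, 0.0)       # start invisible
flower.keyframe_insert(data_path="scale", frame=1)

flower.scale = (1.0, 1.0, 1.0)       # grow to full size (~2 seconds at 24 fps)
flower.keyframe_insert(data_path="scale", frame=48)
```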
Next Steps
I am happy with how the initial tests worked for these models, and I think the next stages will be to develop better models and animations to be triggered on the vase, as well as to add interactive buttons which appear on the ceramics once they are scanned by a mobile device. This could also be developed so that I have multiple vases with different animations or models appearing, to show how this could be used in an exhibition with multiple ceramic pieces for users to experiment with during the experience.