Sarah

Phase 4 - Augmented Reality

Updated: May 5, 2019


AR was implemented in the game design for 'Enchanted Kin' because it brings the magic theme to life. The AR smoke is a result of gameplay, demonstrating the success of a spell. Making sure the AR has a clear purpose is essential, as we do not want to create unnecessary additions that don't bring anything to the game. This has been questioned, but we believe it is something unique to the narrative and game combination of 'Enchanted Kin'. I found an article looking at AR bringing stories to life (https://econsultancy.com/how-publishers-are-using-augmented-reality-to-bring-stories-to-life/). AR creates additional visuals, which stretch the viewer's imagination beyond the written narrative or illustrations. The article also examines how AR particularly engages 8-10 year olds, as they enjoy gamification, and how AR can help stories come to life by engaging readers with participation activities. In our game, the AR's purpose is to visualise the spell result; in the context of 'Enchanted Kin', it enables the family to be actively involved with a visual narrative element whilst still engaging in a communal/family activity.


For the past few days I've been working on creating a style for the smoke assets that will be used in AR. We discussed how important it is to have "life-like" graphics for our AR assets to "keep the magic alive". We found this by looking at the types of games our audience of 8-12 year olds are playing; sophisticated, convincing graphics are key features of those. With this in mind, I think a smoke effect from a bubbling cauldron needs to be a good-quality render, with varying densities to show depth.


I have looked at a few varying references of smoke and its behaviour in cauldrons, and created some observational drawings to understand these behaviours further. Art is definitely not my calling, so I advise you to take these observations with a pinch of salt! (I may ask Rubi to help me next semester to improve my most basic art skills.)



From these drawings I found that the smoke fills outwards, almost like an explosion, and rolls around the top of the cauldron. There is also an initial burst of height before it falls downwards. At the tips, the smoke becomes wispy, flowing away in a spiralling manner.


I then began to assess some options for creating this digitally as animation sequences. I've previously created some smoke assets in After Effects, but because these will be used in AR we think it would be more appropriate to create them in 3D software. I have used Cinema 4D before and looked into how I could create such looks with it. This led me to another piece of software, Houdini, which specialises in special effects. Applying a billowy smoke effect to an object in Houdini creates a moving form which can be manipulated. This can then be exported into Cinema 4D and the external plugin Octane Render to be rendered with colour and a foggy, wispy look.


I followed a tutorial to understand how this could be achieved. I used the free version of Houdini to create the form, but had a problem with exporting the asset as VDB files for use in Octane Render. When exporting I was ending up with a single file, as opposed to a folder of individual frames in VDB format, even though I had closely followed the tutorial's file-naming step. It took a while for me to realise, but I eventually found that in the file name I had used a lowercase "f" instead of the capital "F" that was needed. Houdini's variables are case-sensitive: "$F" expands to the current frame number (so a name like "smoke.$F.vdb" writes "smoke.1.vdb", "smoke.2.vdb" and so on), whereas with the lowercase version the name never changes, so every frame overwrites the same single file. This fixed the problem.


I was mistaken in Cinema 4D / Octane Render: I believed the asset looked pixelated in the render viewer because it hadn't been rendered yet. However, after 26 hours of rendering for a 5 second animation, this wasn't the case (*cries internally*). The clip was still pixelated. On further inspection, in the comments of a YouTube tutorial, I found that Houdini Apprentice (the free version of the software for students and professionals) limits the resolution of your exports, resulting in the pixelation of the smoke. Unfortunately there's no paid student version of the software, and the next step up, Houdini Indie, is considerably expensive for a few assets at this stage. This leaves me with two options: talk to Adam about getting an educational licence of the software, or use different special-effects software for the initial billowy smoke creation and manipulation. I will explore both.


The 26-hour pixelated render


I went on to create a smoke simulation in Blender. Blender is open source and has its own renderer, Cycles. I found it a lot harder to navigate than Houdini, but I used one tutorial to become acquainted with it and an additional tutorial on creating a smoke simulation. This didn't give quite the same effect as Houdini, but I'm happy with the outcome. I think I've got a better rotation on the smoke, though, with it billowing round and downwards after the initial height. I couldn't quite figure out how to stop the smoke emitter shape from appearing in the render. I also think I need to spend more time figuring out wispy smoke areas in Blender; I could layer lighter-density areas on top to see if that creates the effect I am after. However, after completing the tutorial and render, I came to realise that it isn't possible to export smoke simulations from Blender to Unity. This is rather frustrating, as it means I'll have to create another smoke simulation. This time I'm going to use Unity's particle system, as there is no reason for it not to work from there.



My final exploration of software was with Unity itself. For this simulation I created particles with an explosion-like movement. I watched this tutorial and learnt some parameters for working with particle systems in Unity, particularly in reference to the expanding smoke (https://www.youtube.com/watch?v=BHdbBtGgfb8). I found Unity the most difficult for controlling the vertical movement of the particles; however, it achieved the exploding look quite organically, which is something I couldn't suss out with the other software. I also had trouble applying colour, so I ended up with this crazy, in-your-face green that moves through a gradient I created. I duplicated the particle system for use in the AR prototype, recolouring the duplicate in an opposing red to indicate a fail. For the AR animations in the game I think it would be best to use Unity particle systems, as there is a lot more for me to explore with them, particularly in terms of the initial upward movement and colour. I did find overall that the quality of the particle system was not great. I would prefer to create a better render in another 3D package, but this serves the function of working with Vuforia. I should probably explore lighting in this regard; I know that in Cinema 4D I would apply global illumination to the render. I will explore this next semester.
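Since I'll be coming back to these parameters next semester, here is a minimal sketch of how the same kind of settings could also be driven from a script rather than the Inspector. The class name and all the values here are illustrative assumptions, not the exact numbers I dialled in for my prototype:

    using UnityEngine;

    // Hedged sketch: configuring a smoke-puff particle system from code.
    // Values are illustrative, not the settings used in my prototype.
    [RequireComponent(typeof(ParticleSystem))]
    public class SmokePuff : MonoBehaviour
    {
        void Start()
        {
            ParticleSystem ps = GetComponent<ParticleSystem>();

            var main = ps.main;
            main.startSpeed = 3f;         // initial outward "explosion" burst
            main.startLifetime = 2f;
            main.gravityModifier = 0.2f;  // pulls the smoke back down after the burst

            // Fade from an opaque green to transparent over each particle's lifetime.
            var col = ps.colorOverLifetime;
            col.enabled = true;
            var grad = new Gradient();
            grad.SetKeys(
                new[] { new GradientColorKey(Color.green, 0f),
                        new GradientColorKey(Color.green, 1f) },
                new[] { new GradientAlphaKey(1f, 0f),
                        new GradientAlphaKey(0f, 1f) });
            col.color = new ParticleSystem.MinMaxGradient(grad);
        }
    }

The red "fail" duplicate would then only need the gradient's colour keys swapped.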



Vuforia AR prototype and testing - Model Object Targets


To create the AR prototype I will be using Vuforia, Unity's built-in AR plug-in. According to the Vuforia website, I can attach any Unity-accepted asset to it. The plugin works through Vuforia's Model Target: it uses the CAD model of my 3D-printed cauldron (the STL print file) to understand the object for tracking. It comes with a handy script that displays your 3D object as an outline, so you can line the physical model up for easy tracking; the outline turns off upon a successful track. I therefore used this as a test to see whether I had correctly implemented the Model Target. Vuforia uses the webcam to track the object within Unity, which required a lot of moving about by me!



Custom AR experience - Applying the particle system to the model target


(This is the guide I used to help me implement the model target system: https://library.vuforia.com/content/vuforia-library/en/articles/Solution/introduction-model-targets-unity.html)


I then had trouble applying the code to turn on the particle system upon tracking of the Model Target; I questioned whether I had to write a script for this. However, Vuforia's guide to Model Targets simply stated: "To customise your AR experience, place custom content as: child objects of the ModelTarget Game Object. Align your content relative to the preview model." I naively thought they would tell you if you needed to write a script to trigger play of the AR object. I went on to find out how to play a particle simulation with C#, and found that I needed to use 'public void Play(bool withChildren);'. This made sense, as the particle effect is a child of the model object, but I thought you applied this script to the particle system itself. Due to my complete lack of knowledge of the Unity UI, I did not know where to apply this code. I asked my friend Michael, who works with Unity, and he showed me that you create a new script and then add it as a component to the parent object; this made me realise that it needs to be applied to the Model Target. However, understanding what to write around that small bit of code in order for it to function was another story. Michael then helped me to create the script with the line of code I had found.

This in turn became the most important line of code for my project:


gameObject.GetComponentInChildren<ParticleSystem>().Play();


With this line of code in the script applied to the Model Target, the detection of the object resulted in the particle system playing, demonstrating an in-game success as well as a personal one for me. As I didn't write this line of code myself, I made sure I understood it so that I can learn from it and recreate it in the future. I understand that "gameObject" is the object the script is attached to (the Model Target, via the add component section). "GetComponentInChildren<ParticleSystem>" tells the "gameObject" to search itself and its child objects for a component of the type inside the "< >", in this case the "ParticleSystem". Finally, ".Play()" tells the retrieved "ParticleSystem" to start playing.
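I can't reproduce Michael's exact wrapper code here, but a minimal sketch of what a script like ours might look like, assuming the ITrackableEventHandler pattern documented for the Vuforia version I'm using (the class name and structure are my reconstruction, not his original):

    using UnityEngine;
    using Vuforia;

    // Hedged sketch of the script added to the Model Target. The wrapper code
    // is my reconstruction, not the original script Michael helped me write.
    public class PlaySmokeOnDetect : MonoBehaviour, ITrackableEventHandler
    {
        void Start()
        {
            // Ask the Model Target's TrackableBehaviour to notify this script
            // whenever the tracking state changes.
            GetComponent<TrackableBehaviour>().RegisterTrackableEventHandler(this);
        }

        public void OnTrackableStateChanged(TrackableBehaviour.Status previousStatus,
                                            TrackableBehaviour.Status newStatus)
        {
            // When the cauldron is found by the camera, play the child smoke.
            if (newStatus == TrackableBehaviour.Status.DETECTED ||
                newStatus == TrackableBehaviour.Status.TRACKED)
            {
                gameObject.GetComponentInChildren<ParticleSystem>().Play();
            }
        }
    }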

By understanding each part of this code I am learning C# syntax; this is essential for my progress in this project, as I will also need to code the success system.


Testing this prototype


I was worried about testing this, as I have continually had problems using the webcam for tracking. It would be a lot more helpful if, with Vuforia in Unity, you could use your phone camera for playing the AR scene; moving a small object in front of an integrated webcam is far more difficult. However, this will probably never change, as Apple would definitely not allow it: phone-camera access like that is a benefit of being part of their £79 developer programme, with access to their ARKit 2 SDKs.


- Whoops, I definitely got too involved and critical there -


Here's a video of me getting frustrated trying to catch the Model Target with the cauldron.

(Trust me to pick a plugin that requires me to be on camera all the time for testing! Sorry for never looking put together!)



Applying a UI for a simulated playtest


For the playtest I have created an app that manually simulates the functionality of the success rates. The idea is that, after the players cast a spell with the NFC cards and Arduino, I will select the outcome of the spell for them (at random), then give them the phone to reveal the spell outcome. I achieved this by creating 3 scenes, 2 of which were the success and failure particle-system model targets. The 3rd was a UI: I simply created buttons that take you to the corresponding model target. I also threw in the concept logo to take the app back to the home screen.
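The wiring behind those buttons is tiny. Here is a sketch of the kind of menu script involved, assuming each scene has been added to Unity's Build Settings; the class and scene names are placeholders, not the names in my project:

    using UnityEngine;
    using UnityEngine.SceneManagement;

    // Hedged sketch of the menu script; scene names are placeholders.
    public class MenuController : MonoBehaviour
    {
        // Hooked up to the success/failure buttons via each Button's OnClick event.
        public void ShowSuccess() { SceneManager.LoadScene("SuccessSmoke"); }
        public void ShowFailure() { SceneManager.LoadScene("FailSmoke"); }

        // Hooked up to the concept logo to return to the home screen.
        public void BackToMenu() { SceneManager.LoadScene("Menu"); }
    }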


Building the app for iOS, via Xcode


I watched some tutorials online about how to put a development app onto an iOS device. I learnt that I needed to build the app and open it in Xcode. I would also need to join the free developer programme to get a developer key to put the build on my device. As part of the programme, when you go to build your app you sign a certificate, which awards it a provisioning profile; this is what finally lets you put the app on your own device. I've found out through tutorials that you are limited to 10 builds a week, so I know I will have to be conservative with the number of builds for testing next semester. I have now created a successful build on my phone that can be used for playtesting.



Playtest


I used the AR simulation app to discover more about the interactivity of AR in the game. It was used in a playtest to get responses about how it functioned as a gameplay element. Because I was assessing the interactivity, and due to time constraints, I tested the prototype on a group of adults.


The feedback from this playtest was:

  • They think it'll run more smoothly when everything is integrated together (i.e. the Arduino sends data to the phone)

  • The input of the cards is easy (into the cauldron)

  • Should use different angles for the AR, as you can't predict where the players will be sitting. (Could this be possible by creating multiple model targets?)

  • The mystic quality sum keeps it interesting and keeps people on their toes; slightly strategic.

  • It's fun

  • Adults like the cards. They think they're aesthetically pleasing due to their layout and colour.

  • They like physically interacting with the cauldron

  • Adults think that the tactile input and AR make you feel like you're magic.

  • Like the 'puff' of the AR smoke - consider using sound effects to further this experience