Sarah

Development 2

Updated: Jun 4, 2019


Game development


After reviewing Rubi's work on the narrative and exploring Alexa games (Interaction 2), we have found that the narrative model produced would not hold the attention of our audience. We want to keep the narrative driving the story, but the descriptive nature of the game has to be pulled back. This has seen me take over Rubi's role on the narrative, as we have established that the narrative is the driving force of interaction when developing voice games.


Keeping core elements of Rubi's creation, I have developed a system that uses the environmental locations from the original narrative as levels. In addition to spell casting, exploration has been added as a gameplay element. This gives further purpose to the players' interaction with Alexa, as they discover more through the VUI. It also enables players to find the ingredient cards, as opposed to originally starting with a full deck. The spells will continue to prompt the next level; however, it is the player's decision when they would like to do this.


It's unfortunate to lose the script that Rubi has been developing, as it is fully produced and she has worked hard on it. However, this change was vital to ensure the enjoyability of the game. It adds further interest beyond the tactility of the cauldron (and Arduino) and simple Alexa interaction. By giving the game this purpose, the players are put further into their roles as characters, creating an immersive experience.



Plan

From semester 1 to 2, the game has developed considerably. We started off with a lot of components that didn't quite work together: not only were they difficult to link up, but the tactile interaction was bothersome and not user friendly. This led us to move into a new realm, creating an Alexa-based Enchanted Kin.


Our aim is to link the two components, Alexa and Arduino-NFC.


The proposed components and tasks:


  • Alexa: narration, using the Twine plugin to create a story.


  • Arduino & Unity; Get them communicating and co-operating, curating a model that compares two strings (NFC chips in playing cards) to audio output.


  • Cards: curate strings and consider production.


  • Cauldron: case the Arduino safely, whilst still being able to scan.


Event: Southampton Science and Engineering Day, 16th March


For the Science and Engineering Day I needed to create an output from the NFC to display to the participants. James suggested how we could use the NFC tags to control a game, which in turn created the idea for the simple witch maze game. The NFC tags on Rubi's cards were turned into directional cards: the players hold the cards over the NFC antenna on the Arduino, which moves the witch around the maze, with the aim of collecting her cat and cauldron.



In order for this to work we had to create piping from the Arduino to Unity to control the game. This saw James create a script that piped the Arduino's serial string into Unity, utilising the "System.IO.Ports" namespace to control the port to which the Arduino was connected. Unfortunately, through testing we discovered that this has recently become dysfunctional on macOS, which means that to display the game or use the function I must use a Windows laptop or PC. The original read script is left on the Arduino, and this PortReader script tells Unity to read from it and thus move the character.
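The PortReader itself runs inside Unity, but the piping concept can be sketched standalone in Python, assuming the pyserial package and a placeholder port name; this is an illustration of the idea, not James's actual script:

```python
# Minimal sketch of the Arduino-to-game piping, assuming pyserial
# is installed and the Arduino's read script prints one command
# per line. The port name is a placeholder for whatever the OS
# assigns (e.g. COM3 on Windows).
import serial

PORT = "COM3"      # hypothetical; use the port the Arduino is on
BAUD = 9600        # must match Serial.begin() in the Arduino sketch

with serial.Serial(PORT, BAUD, timeout=1) as ser:
    while True:
        line = ser.readline().decode("utf-8").strip()
        if line:                             # ignore empty timeout reads
            print("Arduino sent:", line)     # hand this string to the game
```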



Rubi adapted some of the playing cards from Enchanted Kin into directional cards to be used during the event. I simply wrote to them with my NFC Arduino write script, using the strings "U", "D", "L" and "R". This was accompanied in Unity by a GameControl script, which instructed that if one of those strings was present, the witch would move in the direction of the card.
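The mapping the GameControl script performs is roughly the following. This is a Python sketch only: the actual script is in Unity, and the function names and step size here are illustrative, not taken from it.

```python
# Sketch of the GameControl mapping: each card's NFC string moves
# the witch one grid step. Names and step size are illustrative.
DIRECTIONS = {
    "U": (0, 1),    # up
    "D": (0, -1),   # down
    "L": (-1, 0),   # left
    "R": (1, 0),    # right
}

def move_witch(position, card_string):
    """Return the witch's new (x, y) after a card is scanned."""
    dx, dy = DIRECTIONS.get(card_string, (0, 0))  # unknown card: no move
    return (position[0] + dx, position[1] + dy)

print(move_witch((2, 3), "U"))  # -> (2, 4)
```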





This build demonstrated the use of NFC as a controller for the day, with the piping being appropriate to adapt for further use in Enchanted Kin.



Initial interaction playtest build


For a playtest before we broke up for the Easter break, I wanted to fully implement the interaction model I have been designing, then assess its ease of use and how well players understand it. I needed to mock something up, that both Rubi and I could use, in a short amount of time. I knew that I could create a UI system that took a narrator to the respective narrative passage, like a text adventure. However, doing this through Unity would take a very long time and a great deal of scripting, so I opted to transfer the interaction model onto separate website pages, hyperlinked just like a UI system. This proved more difficult for producing prompt outcomes for unavailable commands, such as not looking in the right direction or not using the correct key terms. For the purpose of the build I added a random theme song to set the scene; however, I do intend to develop the sound further.




This is a sample of me interacting with the build. With this build it can be hard to know what to say when there is so much text on a page. I tried to create a hierarchy; however, the sheer volume of text makes this difficult.


This model can be found at: https://sarahpartington.wixsite.com/playtest



Alexa skill research


I've found that in order to build Alexa skills you have to use their online ASK (Alexa Skills Kit) console; without previous coding experience, this would be how I would achieve the desired outcome of creating an Alexa game skill. Alternatively, with guidance, there is an SDK for Python which could be used. Interestingly, looking at the functionality and a visual backend diagram, the Arduino is likely to work as a Smart Home Skill, as it would need to receive commands from the Alexa as well as deliver outputs.
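To get a feel for the Python SDK route, a minimal skill looks something like the sketch below. This assumes the ask-sdk-core package; the invocation greeting is a placeholder rather than anything from our actual skill.

```python
# Minimal Alexa skill sketch using the ask-sdk-core package.
# The speech text is a placeholder, not the real Enchanted Kin skill.
from ask_sdk_core.skill_builder import SkillBuilder
from ask_sdk_core.dispatch_components import AbstractRequestHandler
from ask_sdk_core.utils import is_request_type

class LaunchRequestHandler(AbstractRequestHandler):
    """Runs when the player opens the skill by its invocation name."""
    def can_handle(self, handler_input):
        return is_request_type("LaunchRequest")(handler_input)

    def handle(self, handler_input):
        speech = "Welcome to Enchanted Kin. The forest awaits."
        return handler_input.response_builder.speak(speech).ask(
            "Where would you like to go?").response

sb = SkillBuilder()
sb.add_request_handler(LaunchRequestHandler())

# When the code is hosted on AWS Lambda, this is the entry point.
lambda_handler = sb.lambda_handler()
```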



Smart Home Skill functionality

Over the past few weeks I have continued to try to get to grips with building and coding for Alexa. I've found that the documentation for Alexa development is still growing, and previous coding knowledge would be a real advantage. This is not something I have experience in, so I was delighted to find a few Twitch live streams from Amazon @ GDC that could potentially help me out. A main takeaway was that, by hosting my code via their Lambda service, I would be able to link the Alexa voice exploration to the input from the Arduino, applying the Alexa Smart Home trigger to deliver interaction from the Arduino to the Alexa (a conceptual sketch of what that entry point might look like follows below). I believe this would require me to feed the Python piping of the NFC Arduino into the scripting of the Alexa skill. I think this will be quite challenging given the timeframe of the project, so I will continue to look for additional ways to achieve this linking.


(Of particular note: https://www.twitch.tv/videos/398447907?filter=archives&sort=time & https://www.twitch.tv/videos/398499572?filter=archives&sort=time)
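To make the Smart Home trigger idea a little more concrete: unlike a custom skill, a Smart Home skill's Lambda receives raw JSON "directives". The sketch below is purely conceptual, with the actual Arduino forwarding left as placeholders; none of it comes from a working build.

```python
# Conceptual sketch of a Smart Home skill's Lambda entry point.
# Smart Home skills receive JSON "directives"; the idea here is
# that a directive could be forwarded on to the Arduino side.
# The forwarding itself is a placeholder, not a working bridge.
def lambda_handler(event, context):
    directive = event["directive"]
    namespace = directive["header"]["namespace"]
    name = directive["header"]["name"]

    if namespace == "Alexa.Discovery":
        # Would respond with the 'cauldron' as a discoverable device.
        pass
    else:
        # Would translate the directive into a command for the
        # Arduino (e.g. via a message queue) and acknowledge it.
        pass

    print("Received directive:", namespace, name)  # visible in the logs
```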


This stream also helped me to discover that Enchanted Kin is defined as a "voice-paired game", which means that it requires an additional element and cannot be played with the Alexa alone. It also made me think further about the interaction model of the game: it requires speech and tactile input from the player to produce audio output from the Alexa. This is what has proven difficult in trying to work out how to bring the 'cauldron' (the NFC-Arduino input) into the game.


Interaction between player, Alexa and enchanted cauldron

Learning Alexa skill building in San Francisco


The stream led me to make a rather last-minute decision that I felt I needed to take advantage of. Less than two weeks later, I found myself on a plane to San Francisco to take part in an Alexa Games Hackathon. This was one of the first games hackathons that Amazon had hosted, and nothing like it was available in the UK. I wanted to learn first hand, to understand the framework and jargon, to help me develop Enchanted Kin.


The hackathon included a workshop to get us to grips with designing for Alexa games and the new features they have implemented. I found the interaction model worksheet particularly useful: it enabled you to break voice commands and outputs down to their key sources, by how they are defined and used in the Alexa Building Console. This was quite a surreal experience, as I found myself amongst experienced Alexa developers as well as technology industry CEOs. It was great to talk to them about the project and hear about what they're doing with the Alexa.


The hackathon itself was quite a journey. Paired with alumnus Tom Endersby, we were tasked with creating an Alexa game that fulfilled three criteria: ensure skill quality, identify engaging content, and choose a monetisation strategy. We produced a few design ideas and were keen to implement a two-player battle game that used two branching decision trees, like Enchanted Kin. However, we soon discovered that our idea of a straightforward game on the Alexa was much more difficult than we imagined. It would require us to understand how to utilise Alexa's memory, and despite our enthusiasm we didn't think we would be able to achieve this in the time remaining.


With the help of Alexa solutions architect Rob, we decided to develop a simple fact-based skill, as games are very much an advanced skill type for the Alexa. We took inspiration from home and hoped that our British charm would win us some bonus points. We managed to create the fact skill pretty promptly and started working on applying place slots to questions, e.g. "Recommend me an Indian restaurant in Winchester", which was going well. In line with the judging criteria, we opted to propose the one-time purchases monetisation strategy. This really conflicted with the ethos we have developed over our time on the course, so we put forward a more ethically inclined proposal: to use it as a ticket purchasing service for local attractions such as Winchester Cathedral.
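As an illustration of how a place slot is read in the Python SDK (our skill was built through the console, so the intent name "RecommendIntent" and slot name "place" here are hypothetical):

```python
# Sketch of slot handling with ask-sdk-core. The intent name
# "RecommendIntent" and slot name "place" are hypothetical.
from ask_sdk_core.dispatch_components import AbstractRequestHandler
from ask_sdk_core.utils import is_intent_name

class RecommendIntentHandler(AbstractRequestHandler):
    """Answers e.g. 'Recommend me an Indian restaurant in Winchester'."""
    def can_handle(self, handler_input):
        return is_intent_name("RecommendIntent")(handler_input)

    def handle(self, handler_input):
        slots = handler_input.request_envelope.request.intent.slots
        slot = slots.get("place")
        place = slot.value if slot and slot.value else "Winchester"  # fallback
        speech = f"Here is a recommendation for {place}."
        return handler_input.response_builder.speak(speech).response
```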



We submitted our skill and prepared a presentation for everyone. Tom was tasked with explaining the skill, whilst I was in charge of the testing panel to demonstrate it, as well as speaking about the monetisation strategy. But rarely do things go according to plan: as we tried to present, the skill would not invoke. We asked for a moment but, briefly looking over our code, we could not figure out what was wrong. We carried on explaining the intention of our skill and finished our presentation.


This was a really significant moment for me. I started the academic year unable to talk in front of audiences; I would panic and consistently rely on extensive note-taking for fear of exactly what happened at the hackathon. But I took this in my stride, stayed composed and took it with a pinch of salt. I feel like this, along with talking at IBM, has pushed me and made me better at talking publicly about my work.


We ended up winning an honorary award for travelling so far and for failing so gracefully (Rob vouched for our build, saying "it worked when they submitted it!"). So despite the skill's failure, I feel like I learned a lot from the hackathon: experience with the developer console, practice at skill building, public speaking and networking.




Bugs in code


I was really determined to get the skill working again when I got home. I managed to get the invocation for the skill working and giving a fact, but unfortunately the utterances do not work. I have looked over the code and copied it directly from the GitHub repository; however, I think it has something to do with the return of the GetNewFactIntent. I believe this is unique to the specific script for the fact skill, as I have seen that other models handle their intents separately from their launch intent.
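For comparison, this is the separate-handler pattern I have seen other models use, sketched in the Python SDK with a placeholder fact list: the launch greets the player, while GetNewFactIntent does the fact pull on its own.

```python
# Sketch of the separate-handler pattern: GetNewFactIntent has its
# own handler rather than being folded into the launch logic.
# The facts list is a placeholder.
import random
from ask_sdk_core.dispatch_components import AbstractRequestHandler
from ask_sdk_core.utils import is_intent_name, is_request_type

FACTS = ["Winchester was once the capital of England."]

class LaunchHandler(AbstractRequestHandler):
    """Greets the player; does not return a fact itself."""
    def can_handle(self, handler_input):
        return is_request_type("LaunchRequest")(handler_input)

    def handle(self, handler_input):
        return handler_input.response_builder.speak(
            "Ask me for a fact.").ask("Say: tell me a fact.").response

class GetNewFactIntentHandler(AbstractRequestHandler):
    """Handles the fact intent on its own, separately from launch."""
    def can_handle(self, handler_input):
        return is_intent_name("GetNewFactIntent")(handler_input)

    def handle(self, handler_input):
        return handler_input.response_builder.speak(
            random.choice(FACTS)).response
```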


In terms of Enchanted Kin, a different model is required to create the interaction needed, as the fact skill purely makes randomised pulls from a prepared repository. The model most aligned to the game is the decision tree interaction, which moves players through a choice-based system. From this I hope to be able to implement the decision tree interaction model for the choices in Enchanted Kin.
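To show what that might look like in data, here is a small sketch of a decision tree as plain Python. The passages and choices are invented placeholders, not Rubi's script.

```python
# Sketch of a decision-tree interaction model as plain data: each
# node holds the narration to speak and the choices that lead on.
# Node names and text are invented placeholders.
TREE = {
    "clearing": {
        "speech": "You stand in a moonlit clearing. Paths lead north and east.",
        "choices": {"go north": "cave", "go east": "river"},
    },
    "cave": {
        "speech": "Inside the cave you find a glowing ingredient card.",
        "choices": {"go back": "clearing"},
    },
    "river": {
        "speech": "The river whispers of a spell yet to be cast.",
        "choices": {"go back": "clearing"},
    },
}

def step(node, utterance):
    """Return the next node for a player's utterance, or stay put."""
    return TREE[node]["choices"].get(utterance, node)

current = "clearing"
current = step(current, "go north")
print(TREE[current]["speech"])  # -> the cave narration
```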



Moving away from the Alexa Skills Kit and Development service


Since returning home, I have struggled with the development of the Alexa skill. Upon further research, the interaction model for Enchanted Kin falls under the adventure game skill type, which is an advanced skill for experienced Alexa developers because of the sheer quantity of choices available to the player. There is additionally a Unity SDK that could link a Unity build of the interaction up to the Amazon Alexa services. However, this is very experimental, and looking around at the types of projects utilising it, they tend to be purely controller-based and on screen, with Alexa serving as a companion or as the controller for the game itself. My problem lies with the Arduino NFC function: on its own it could work as a Smart Home Skill, but the problem comes from the need to communicate to and from the board itself. To ease the weight of the project we could consider condensing the interaction model, giving fewer choices. Ultimately though, our playtests have shown that the exploring and investigating interaction builds up well to the spell casting sections, so we would be losing a lot by removing it.


Although this is potentially possible, there are also problems with getting the spell casting Arduino joined up to the Amazon Alexa services. Considering the problems we've had getting the Arduino Python piping to work on macOS, I don't think it is feasible for me to develop in this framework at this time. I will continue to work on the interaction of the game and come up with other potential solutions.


Reflection


A lot has been going on this past month in terms of gameplay development. The design of the interaction model is developing well, and I am pleased with how players interacted with it during the latest playtest. However, the development of the NFC-to-audio output has taken a bad turn, and I will have to work out solutions to this over the next month. I will also continue developing more levels, applying the interaction model, and hopefully creating an end interaction for the game.