Our project is working rapidly in a series of two-week ‘sprints’, which I’ve written about previously in ‘Working in an agile manner’. I thought I’d bundle a few of these sprints together so folks can see what we’ve been up to; they all relate to the technology phase of the project from my point of view.
User testing (informal)
After our initial testing at the museum takeover day, it was an exciting morning on the 5th Feb for Gail and me as we were unleashed on the first properly usable ‘beta’ prototype. Laura and Jake took notes. We headed out onto the first-floor balcony gallery, as shown in the photo above, and fired up the app.
- Would the app know where we were? Yes, it did! After we chose our characters, the game instructed us to head for the Egypt Gallery on the ground floor.
- Immediately upon seeing the map, I tried to press the ‘ok i got it’ button, which didn’t work; I needed to tap closer to the map. Valuable feedback, noted by Jake, who said, “mmmm, interesting, we didn’t ever do that in our testing”.
- When we got to the gallery, the app kicked back into life and told us that we’d arrived in the right location, all thanks to the iBeacon technology. We got to play our first game: trying to spot the broken object. We had one minute to dash around the gallery and locate it. I found it, or so I thought. It turns out we have several broken-nosed head objects, but in my book we won that task. I really like that the app acts almost as a guide but disappears during the actual task, so we could enjoy the gallery.
- Upon completing the game, we were ready for our next challenge: off to the Birds and Mammals gallery on the first floor using the wayfinding feature of the app, which seemed to work but then dropped out (noted by the ever-watchful eyes of Jake and Laura). When we arrived at the gallery, it was mostly under wraps due to a refurbishment. Luckily, our iBeacon remained safely tucked away on a high-level pillar. Phew! I took the liberty of jumping over the barriers to ensure the app at least knew we’d made it to the gallery. At this point it crashed, for reasons I’ll leave to Aardman to figure out.
- After the app was restarted, we were sent to the second floor to play two more of the challenges.
My first thought is that I’m very confident the use of sensors, particularly the location-aware type, is going to be critical to the service in the years to come. The iBeacon technology clearly works. Laura and Jake have just written about the details of the iBeacons themselves and the hurdles that needed to be overcome.
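For readers curious how this kind of location awareness hangs together: each iBeacon broadcasts a shared UUID plus a (major, minor) identifier pair, and the app ranges nearby beacons to decide which gallery you’re standing in. Below is a minimal, hypothetical sketch of that arrival check; the gallery names, identifier values and distance threshold are illustrative assumptions, not Aardman’s actual implementation.

```python
# Hypothetical sketch: confirming arrival at a gallery from ranged iBeacons.
# All identifiers and distances below are made up for illustration.

# Each gallery's beacon is keyed by its (major, minor) identifier pair,
# broadcast under a shared proximity UUID.
GALLERY_BEACONS = {
    (1, 1): "Egypt Gallery",
    (1, 2): "Birds and Mammals Gallery",
}

def arrived_at(target_gallery, ranged_beacons, max_distance_m=5.0):
    """Return True if a beacon for the target gallery is ranged close enough.

    ranged_beacons: list of (major, minor, estimated_distance_m) tuples,
    as a device's beacon-ranging API might report them.
    """
    for major, minor, distance in ranged_beacons:
        gallery = GALLERY_BEACONS.get((major, minor))
        if gallery == target_gallery and distance <= max_distance_m:
            return True
    return False

# Example: standing a couple of metres from the Egypt Gallery beacon,
# with the Birds and Mammals beacon ranged far away on another floor.
print(arrived_at("Egypt Gallery", [(1, 1, 2.3), (1, 2, 40.0)]))  # True
```

In practice the distance estimate from a beacon is noisy (it is inferred from signal strength), which is one likely reason the wayfinding appeared to drop out; real apps smooth readings over several seconds before declaring arrival.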
Using the app for the first time was genuinely exciting, and despite some small issues, Aardman have pulled magic out of the bag with the games, the visual look and the user experience.
Although it is very tempting to test the app with the public, I still feel we have one or two major bugs to stomp before handing it over to the general visitor. I think that if great storytelling folks like Aardman can master the opportunities of this type of sensor, we’re in for some transformational ways of engaging. Onwards.
This blog post acts as ‘milestone 3’ evidence (Doc 3.1, 3.2 & 3.3).