A few months ago, when Apple announced ARKit, a new augmented reality framework, it was difficult not to be impressed by the accuracy of the tracking and the cohesiveness of the whole experience. Since then, some very interesting demo projects have been cropping up.
In August, we had just finished a few months of back-to-back sprinting, so it was the perfect time to indulge in a creative break. We were really excited about the possibilities of AR and so spent a couple of weeks exploring potential uses for AR in Health-related apps.
As anyone familiar with Pokémon Go will understand, AR has the potential health benefit of encouraging people to increase their physical activity. We’d love to see this taken further in AR applications for physiotherapy and rehabilitation. AR also has the very interesting potential to measure and record data about a person’s interactions with the world. We developed two prototypes to explore how this potential could be useful in a health-related context.
After lots of sketching, we eventually decided to take two ideas forward into prototypes: LifeTagger and ShapeSolver (full disclosure: we also considered LifeTggr and ShapeSlvr as names).
LifeTagger arose from thinking about how AR could be used to augment memory and add stories to your personal possessions, a bit like a distributed life story. To use the prototype, you first find an object that is important to you, tag it, and add some information to it. You can then return to that tag later and view its associated information.
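To make the mechanic concrete, here is a minimal sketch of how tag placement might work in ARKit: the tapped point is hit-tested against detected feature points and an ARAnchor is dropped at the result. The `handleTap` handler and `tagNotes` store are illustrative names of ours, not the prototype’s actual code.

```swift
import UIKit
import ARKit

// A minimal sketch of tag placement. `tagNotes` and `handleTap`
// are illustrative names, not the prototype's actual code.
class TagViewController: UIViewController {
    @IBOutlet var sceneView: ARSCNView!

    // Maps an anchor's identifier to the note the user attached to it.
    var tagNotes = [UUID: String]()

    @objc func handleTap(_ gesture: UITapGestureRecognizer) {
        let point = gesture.location(in: sceneView)
        // Hit-test against detected feature points so the tag sticks
        // to a real-world surface rather than floating in mid-air.
        guard let hit = sceneView.hitTest(point, types: .featurePoint).first else { return }
        let anchor = ARAnchor(transform: hit.worldTransform)
        tagNotes[anchor.identifier] = "Grandma's teapot, bought in 1974"
        sceneView.session.add(anchor: anchor)
    }
}
```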
This approach could be useful in a few different situations, such as reminiscence therapy, where family photos and memorabilia might be tagged with an associated story. These tags could also help people with memory impairments live more independently by reminding them how to perform household tasks. Lastly, we could imagine these tags being used in a more formal dementia care setting, supporting people with dementia in communicating important parts of their identities or preferences to carers.
ShapeSolver evolved from similar themes that we are currently exploring as part of CognitionKit. We wanted to develop some simple AR games that might help us understand how people solve problems. ShapeSolver asks you to find an open space and then gives you a puzzle. None of the puzzles can be solved by standing still so you have to move around!
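As a purely hypothetical sketch of that mechanic, puzzle pieces could be scattered in a ring around the session origin so that no single viewpoint reveals them all:

```swift
import SceneKit

// A hypothetical sketch of the "no solving from one spot" mechanic:
// scatter the puzzle pieces in a ring around the session origin so
// no single viewpoint reveals all of them.
func scatter(_ pieces: [SCNNode], around root: SCNNode, radius: Float = 1.5) {
    for (index, piece) in pieces.enumerated() {
        let angle = Float(index) / Float(pieces.count) * 2 * Float.pi
        piece.position = SCNVector3(radius * cos(angle), 0, radius * sin(angle))
        root.addChildNode(piece)
    }
}
```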
We speculated that different people would have different strategies for solving each problem. Information about how people move around the objects could be used for understanding a few different cognitive processes such as spatial reasoning, memory and executive function. Or it could even be used for cognitive training or rehabilitation.
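One way such movement data might be captured, sketched here under the assumption that the AR view’s session delegate is free for the purpose, is to log the camera position on every frame (`MovementLogger` is our illustrative name):

```swift
import ARKit

// A sketch of capturing a movement trace: log the camera's position on
// every frame via ARSessionDelegate. `MovementLogger` is hypothetical;
// the resulting path could later be analysed for solving strategies.
final class MovementLogger: NSObject, ARSessionDelegate {
    private(set) var path = [(time: TimeInterval, position: simd_float3)]()

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        // The camera's translation lives in the last column of its transform.
        let t = frame.camera.transform.columns.3
        path.append((frame.timestamp, simd_float3(t.x, t.y, t.z)))
    }
}
```

Assigning an instance of this to `sceneView.session.delegate` would accumulate a path for later analysis.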
Whatever the use case, the most important thing is that it’s fun to play!
Toying with the existing AR demos created a fair amount of buzz around the office. But witnessing people’s reactions to our own explorations was definitely the most fun part of the process.
Two observations in particular stood out:
Having seen the variety of sophisticated ARKit demos out there, we initially expected to spend a lot of time getting to grips with the new APIs before we could try out even the most basic features. In reality, the bulk of our preparation was taken up by installing and setting up the beta versions of High Sierra, Xcode 9 and iOS 11. Some brushing up on matrices and 3D geometry is, of course, necessary. But starting with the Xcode AR boilerplate project makes placing custom objects on the screen surprisingly easy. You quickly get a feel for the basic possibilities of AR!
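For illustration, here is roughly all it takes inside the template’s ViewController, which already provides the `sceneView` outlet and session setup (the box and its placement are our own example, not part of the template):

```swift
// Inside the Xcode AR template's ViewController. Positions are in
// metres, relative to where the device was when tracking started.
override func viewDidLoad() {
    super.viewDidLoad()
    sceneView.scene = SCNScene()  // start from an empty scene instead of the demo ship

    let box = SCNBox(width: 0.1, height: 0.1, length: 0.1, chamferRadius: 0.01)
    box.firstMaterial?.diffuse.contents = UIColor.orange

    let node = SCNNode(geometry: box)
    node.position = SCNVector3(0, 0, -0.5)  // half a metre in front of the starting camera
    sceneView.scene.rootNode.addChildNode(node)
}
```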
Creating 3D models to occupy our AR scenes had a similarly low barrier to entry. In this case, though, having somebody on the team already familiar with 3D modelling applications will likely save a lot of time during prototyping. We found it faster to create our ShapeSolver objects in Blender and import the DAE models into Xcode rather than attempt a programmatic approach or use the (somewhat buggy) built-in SceneKit editor in Xcode.
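A sketch of that import step, where the file name `puzzle.dae` and node name `puzzle` are placeholders for whatever your Blender export produces:

```swift
import SceneKit

// A sketch of pulling a Blender-exported model into the scene.
func loadPuzzleNode() -> SCNNode? {
    guard let scene = SCNScene(named: "art.scnassets/puzzle.dae") else { return nil }
    // Fetch the named node; fall back to wrapping the whole scene graph
    // if node names weren't preserved on export.
    if let node = scene.rootNode.childNode(withName: "puzzle", recursively: true) {
        return node
    }
    let wrapper = SCNNode()
    for child in scene.rootNode.childNodes { wrapper.addChildNode(child) }
    return wrapper
}
```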
Note: When importing 3D models, it’s helpful to think about size. We spent about 30 minutes wondering why our object wouldn’t show up in the scene, only to realise we had been sitting inside it all along (it was huge).
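One cheap guard against this, assuming the imported node’s bounding box reflects its geometry, is to normalise the model’s scale right after loading (the `normalise` helper is our own):

```swift
import SceneKit

// One way to avoid sitting inside a building-sized import: measure the
// node's bounding box after loading and scale it down to a sensible size.
func normalise(_ node: SCNNode, toMaxDimension target: Float = 0.3) {
    let (minBounds, maxBounds) = node.boundingBox
    let largest = max(maxBounds.x - minBounds.x,
                      maxBounds.y - minBounds.y,
                      maxBounds.z - minBounds.z)
    guard largest > 0 else { return }
    let factor = target / largest
    node.scale = SCNVector3(factor, factor, factor)
}
```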
To be sure, providing real value beyond these initial prototypes is more technically challenging. The concept behind LifeTagger, for example, relies heavily on the premise that the tags you create will still appear in the same physical space the next time you launch the app. Persisting an ARKit scene, however, is currently far from trivial: it involves machine learning, image recognition, and storing GPS and device coordinates so that the app can work backwards to where earlier tags should appear relative to the device’s new origin point. At least until something like the AR Cloud exists.
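A sketch of the per-tag record this approach would need to persist: the anchor’s transform relative to that session’s origin, plus a GPS fix and compass heading at capture time. `PersistedTag` is hypothetical, not ARKit API.

```swift
import ARKit
import CoreLocation

// Hypothetical record of one tag: the anchor's transform relative to
// its session's origin, plus GPS and heading at capture time, so a
// later session can estimate where the tag belongs.
struct PersistedTag: Codable {
    let note: String
    let transform: [Float]   // the 4x4 world transform, flattened column by column
    let latitude: Double
    let longitude: Double
    let heading: Double      // degrees from true north when the tag was created

    init(note: String, anchor: ARAnchor, location: CLLocation, heading: CLHeading) {
        self.note = note
        let m = anchor.transform
        self.transform = [m.columns.0, m.columns.1, m.columns.2, m.columns.3]
            .flatMap { [$0.x, $0.y, $0.z, $0.w] }
        self.latitude = location.coordinate.latitude
        self.longitude = location.coordinate.longitude
        self.heading = heading.trueHeading
    }
}
```

Recovering a tag’s position in a fresh session from this record is exactly the hard part described above, since every ARKit session starts with a brand-new origin.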
The release of ARKit can be likened to the early days of the App Store, when developers tested the waters with wacky, gimmicky apps whose value lay more in entertainment than utility. Our own explorations fell largely into the former category, but we look forward to experimenting further with AR and feel confident that we will see some genuinely useful applications in healthcare very soon.
(Thank you to Danielle for letting us film her while she solved puzzles on ShapeSolver.)