Wearable devices have been in the news a lot recently with the launch of Google Glass to the UK consumer market and an explosion of other form factors such as smart watches and fitness bands that monitor user biometrics, alongside connected home devices such as thermostats.
They are marketed on the promise of truly personalised experiences, delivering contextual information paired with personal metrics in a visually engaging way.
I recently attended Augmented World Expo in Silicon Valley, California, one of the largest augmented reality (AR) conferences in the world. It features applications of AR technology from the cutting edge of industry and research. Put simply, augmented reality is a technology that overlays computer-generated visuals onto the real world through a device's camera – bringing your surroundings to life and drawing on sensors such as location and heart rate to provide additional information.
Connecting to the environment
One of the emerging trends focused on new modes of delivery for interacting with the physical world, providing the user with richer contextual content. Instead of current human-computer interaction, where a touchscreen device acts as an intermediary tool for consuming or interacting with information, we are gradually moving towards human-world interaction, where the user is much more connected to the environments surrounding them.
The rise of wearable devices could potentially make this a realistic goal by blurring the lines between computing and human environments that have traditionally been separate.
The growing convergence between the Internet of Things (controllable physical objects connected to the internet with sensors to adapt to the changing conditions around them), AR and wearable devices provides an opportunity to afford the user a more immersive and interactive experience.
Applications in medical teaching
One of the wearable devices showcased was the futuristically named SpaceGlasses, developed by Meta-view and backed by Steve Mann, the person most people consider to be the father of wearable computing. In the demonstration, a professor from Stanford University's medical school talked about the expense universities face in buying simulated patient mannequins, which often cost tens of thousands of pounds each.
For a fraction of the price, the glasses use virtual 3D models to mimic the same physical user experience, providing haptic (touch) feedback and tracking gestures and interactions. This can include listening to a heartbeat with a stethoscope and applying different virtual patient scenarios, such as cardiac arrest, then using gestures to deliver an adrenaline shot.
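The mechanics behind a demo like this are easier to picture with a small example. The sketch below is purely illustrative and is not Meta-view's implementation: it models a virtual patient as a tiny state machine whose vital signs change when a scenario is loaded or a gesture-triggered intervention is applied. The class and method names are my own assumptions.

```java
// Hypothetical sketch of a virtual patient whose vital signs respond to
// scenarios and interventions. Names and values are illustrative only.
public class VirtualPatient {

    /** Simplified scenarios a tutor might load into the simulation. */
    public enum Scenario { STABLE, CARDIAC_ARREST }

    private Scenario scenario = Scenario.STABLE;
    private int heartRate = 72; // beats per minute

    /** Switch the simulated patient into a new clinical scenario. */
    public void applyScenario(Scenario next) {
        scenario = next;
        heartRate = (next == Scenario.CARDIAC_ARREST) ? 0 : 72;
    }

    /** A gesture-triggered intervention, e.g. delivering adrenaline. */
    public void administerAdrenaline() {
        if (scenario == Scenario.CARDIAC_ARREST) {
            scenario = Scenario.STABLE;
            heartRate = 110; // elevated, but a rhythm has been restored
        }
    }

    /** What a virtual stethoscope would report back to the learner. */
    public int getHeartRate() {
        return heartRate;
    }
}
```

The point of such a model is that the expensive part of the mannequin – a body that reacts believably to what the student does – becomes data and logic, with the glasses supplying the visuals and haptics on top.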
Haptic responses in scenarios like these can help prepare students for life-or-death situations without the risk of harming a real patient, and reinforce existing clinical skills.
In another aspect of healthcare teaching, Evena Medical is coupling wearable devices with patient data and imaging to guide precise placement when inserting an IV. This could have a massive impact on the training of medical students and the continued development of existing medical professionals.
Google Glass in practice
Over the last couple of months I have managed to work with a pair of Google Glass to do some initial research into potential use cases in education. Although there are currently some serious limitations on their use in an AR capacity (battery life, overheating and so on), I wanted to see if I could demonstrate a simple application where they could add to the learner experience rather than replicate what is already available through traditional support.
The concept of the connected world is very popular at the moment, with sensors in our devices and wearables assisting in linear practical tasks. Augmented assets could potentially complement the physical environments we work in, simplifying complex technical processes. With this in mind, I acquired an old PC tower and set about building an AR experience focused on the removal of the riser card cage.
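Conceptually, an experience like this boils down to an ordered list of steps, each pairing a physical action with an augmented overlay displayed over the tracked hardware. The sketch below is only an illustration of that idea; the step wording and asset names are hypothetical stand-ins, not the actual files used in my build.

```java
import java.util.Arrays;
import java.util.List;

// Illustrative model of an AR-guided linear task: each step pairs a
// real-world action with the overlay asset shown on the tracked PC tower.
public class RiserCageRemoval {

    /** One instructional step in the linear task. */
    public static class Step {
        public final String instruction;   // text shown to the learner
        public final String overlayAsset;  // animation/model to display
        public Step(String instruction, String overlayAsset) {
            this.instruction = instruction;
            this.overlayAsset = overlayAsset;
        }
    }

    /** The sequence the AR layer walks through, in order. */
    public static final List<Step> STEPS = Arrays.asList(
            new Step("Power down and unplug the tower", "power_warning.anim"),
            new Step("Remove the side panel screws", "side_panel.anim"),
            new Step("Release the riser card cage latch", "cage_latch.anim"),
            new Step("Lift the riser card cage clear", "cage_lift.anim"));
}
```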
It was relatively quick to build the assets for the learning tool; most of the time was spent putting together the 3D CAD tracking model and the animation in the 3D software. A working version can be seen in the video below.
The major issue with Google Glass and third-party apps at the moment is the lack of navigation afforded to the user: put simply, there is no way of interacting with the AR environment dynamically. Other vendors have provided workarounds by allowing users to interact using a trackpad, although this is obviously not ideal.
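To give a concrete picture of what such a trackpad workaround could look like on Glass itself, here is a minimal sketch using the Glass GDK's touchpad GestureDetector to step through a linear AR sequence. The advanceStep() and previousStep() hooks are hypothetical placeholders for whatever the AR layer exposes, not a real API.

```java
import android.app.Activity;
import android.content.Context;
import android.os.Bundle;
import android.view.MotionEvent;

import com.google.android.glass.touchpad.Gesture;
import com.google.android.glass.touchpad.GestureDetector;

// Sketch: mapping Glass touchpad gestures to navigation through the
// AR-guided task, as a stand-in for in-scene interaction.
public class RiserCageActivity extends Activity {

    private GestureDetector gestureDetector;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        gestureDetector = createGestureDetector(this);
    }

    private GestureDetector createGestureDetector(Context context) {
        GestureDetector detector = new GestureDetector(context);
        detector.setBaseListener(new GestureDetector.BaseListener() {
            @Override
            public boolean onGesture(Gesture gesture) {
                if (gesture == Gesture.TAP) {
                    advanceStep();   // move to the next overlay
                    return true;
                } else if (gesture == Gesture.SWIPE_DOWN) {
                    previousStep();  // step back in the sequence
                    return true;
                }
                return false;
            }
        });
        return detector;
    }

    /** Glass routes touchpad events through onGenericMotionEvent. */
    @Override
    public boolean onGenericMotionEvent(MotionEvent event) {
        return gestureDetector.onMotionEvent(event);
    }

    private void advanceStep()  { /* hypothetical hook into the AR layer */ }
    private void previousStep() { /* hypothetical hook into the AR layer */ }
}
```

Tapping the side of the headset to advance a step works, but it keeps one hand off the task, which is exactly what a hands-free device is supposed to avoid.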
Although 2014 is a long way from being the year of the wearable, it can perhaps provide a glimpse into how these new additions to the consumer market can benefit instructional learning and educational support in the future.