Pepper the robot
©Jake Curtis via SoftBank Robotics

The rise of the robot in education - Digifest speaker and Jisc futurist, Martin Hamilton

Martin Hamilton

Drones, robots and driverless cars – we’re living in an era of science fiction made real, says Jisc’s futurist, Martin Hamilton. But how comfortable are we with artificial intelligence in the classroom?

Let’s spend a moment thinking about science fiction: humanoid robots you could have a conversation with. The ship’s computer from Star Trek, or maybe Orac from Blake’s 7. Robot soldiers that can climb stairs. KITT, the self-driving car from Knight Rider.

For Generation X folk like me, these things were always some distance in the future. It was a little disappointing when the year 2000 came and went, and we were still waiting for our jetpacks and holidays on the moon. But have you noticed that all this has changed recently?

The future is here


Now you can buy the equivalent of Orac, the Amazon Echo Dot, for around £50. And there are now over 10,000 third-party “skills” that extend the capabilities of Alexa, Amazon’s virtual assistant.

Here are just a few of the things you can do with Alexa: turn lights on and off in your house; turn the kettle on; control your smart thermostat; listen to an internet radio station; check the weather forecast; check the travel along your commute route – and search for all sorts of information.


Now about those humanoid robots – if you’ve been to our Digifest or Connect More events, you might recall seeing the extremely cute Nao robot.

Nao has a bigger brother known as Pepper, who we are introducing to delegates at this year’s Digifest. Pepper is four feet high, and glides around on a wheeled base. And yes, you can have a conversation with “him”. What’s more, Pepper was designed to recognise emotion through analysing facial expressions, tone of voice and body movements.

Meet Pepper 

Around 10,000 Pepper robots have been sold to date, and you’re likely to encounter him working as a greeter in shops. Pepper’s parent company SoftBank recently launched the world’s first mobile phone store staffed entirely by robots.

Driverless transport

Self-driving cars are starting to find their way onto our roads too – some 90,000 Tesla electric cars have been shipped with Autopilot, which gives the car limited autonomy, and the company recently stated that all new models ordered with Autopilot would be capable of full autonomy. Teslas with Autopilot can already park themselves and come to fetch you, just like KITT.

We might not have our jetpacks yet, but there are a lot of companies working on human-scale drones, such as China’s Ehang. Ehang have made a single-passenger driverless drone, which will be going into service as an air taxi in Dubai this summer. Crucially, the Ehang drone flies itself, which should mean that the human passenger does not require a flying licence.

Trending now

Are there any common threads in these developments?

Firstly, there’s the massive computing power that’s now on tap through cloud computing services such as Microsoft Azure and Amazon Web Services. Secondly, there’s the artificial intelligence (AI) and machine learning that underpin Alexa and Autopilot. Thirdly, we are now working with data on an industrial scale, thanks to the proliferation of smartphones, tablets and apps.

These three trends have come together to make all sorts of new things possible. One of my personal favourites is the use of AI in Google Photos to find categories of picture automatically – for instance, pictures with a particular person’s face in them, or pictures of cats.

Just like when you first use Alexa, this feels distinctly like magic. However, much of the underlying technology has been made open source, so you can get under the hood and tinker if you want to. Google’s TensorFlow is probably the key piece of software for that kind of exploration.

The next generation

And now we’re starting to see the next phase, which is when the AI goes off and creates something of its own. Google’s Deep Dream project takes this approach to create surreal and psychedelic art by enhancing particular aspects of an image file on request, eg “make it more cat-like”. We’ve also seen generative poetry, music and other creative arts.

At this year’s Digifest I’m taking a look at the rise of the robots and artificial intelligence, and what it could mean for teachers and learners. We’re already starting to see online learning apps that take an AI-based adaptive approach, acting as a coach and mentor by reinforcing key concepts with which the learner is struggling. I suspect we’ll quickly come to see AI-based assessors, careers advisers, and find AI in all kinds of other job roles.

For educators looking to exploit the potential of AI, this aspect throws up a lot of interesting questions. I picture my own kids going to visit the robot careers adviser, perhaps a descendant of Pepper, and questioning its recommendations.

Today’s AIs are driven to a large extent by what we could call pattern recognition – here are a million pictures, and half of them are pictures of cats. Now here’s a new picture – how likely is it to be a cat? If we swap cats for careers, then the best that Pepper could probably say is “kids like you mostly went on to college”, or “you look like someone who likes to do hands-on work”.
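That “how likely is it to be a cat?” question can be caricatured in a few lines of Python. To be clear, this is a deliberately toy sketch, not how Google Photos or Pepper actually works: the two “features” per picture are invented for illustration, whereas real systems learn thousands of features automatically from millions of labelled images.

```python
import math

# Toy "how cat-like is this picture?" scorer. Each picture is reduced
# to two invented features (say, ear pointiness and whisker density).
cats     = [(0.90, 0.80), (0.80, 0.90), (0.95, 0.85)]  # labelled cat pictures
not_cats = [(0.10, 0.20), (0.20, 0.10), (0.15, 0.25)]  # labelled non-cat pictures

def centroid(points):
    """Average feature vector of a labelled group."""
    return tuple(sum(c) / len(points) for c in zip(*points))

def cat_likelihood(picture, cat_centre, other_centre, sharpness=5.0):
    """Closer to the cat centroid than the non-cat one -> score near 1."""
    gap = math.dist(picture, cat_centre) - math.dist(picture, other_centre)
    return 1.0 / (1.0 + math.exp(sharpness * gap))

cat_centre, other_centre = centroid(cats), centroid(not_cats)
print(cat_likelihood((0.85, 0.90), cat_centre, other_centre))  # high score: cat-like
print(cat_likelihood((0.10, 0.15), cat_centre, other_centre))  # low score: not cat-like
```

The scorer never answers “why” a picture is cat-like – it only measures distance to the examples it was shown, which is exactly the limitation the careers-advice thought experiment runs into.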

How will we respond if Pepper says “give it up – you’re never going to amount to much”? Or to put it another way – what do we do when the computer says no?

But before we get too carried away, it’s important to note that today’s AIs are much better at recognising patterns than they are at coming up with stuff themselves. The state of the art is a technique called Generative Adversarial Networks (GANs). GANs work by pitting a neural network that generates stuff, such as cat pictures, against another neural network, known as the discriminator, which scores its results – how cat-like is this picture, compared with that million-image dataset?
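The adversarial tug-of-war can be sketched with one number per side. This is a hedged caricature, not a real GAN – actual GANs pit two neural networks against each other and train both by gradient descent on millions of images – but the shape of the game is the same: the generator adjusts itself to fool the discriminator, while the discriminator adjusts itself to keep telling real from fake.

```python
import random

random.seed(0)
REAL_MEAN = 5.0   # "real data": genuine samples cluster around this value
gen_mean  = 0.0   # generator's one parameter: where its fakes cluster
boundary  = 2.5   # discriminator's one parameter: call a sample "real" if above this
lr = 0.05         # how big a step each player takes per round

for step in range(2000):
    real = REAL_MEAN + random.gauss(0, 0.5)   # a genuine sample
    fake = gen_mean  + random.gauss(0, 0.5)   # the generator's attempt
    # Discriminator: drift the boundary towards the midpoint of the latest
    # real and fake samples, separating the two as best it can.
    boundary += lr * ((real + fake) / 2 - boundary)
    # Generator: nudge its output towards the "real" side of the boundary.
    gen_mean += lr if fake <= boundary else -lr

# At equilibrium the fakes sit on top of the real data and the
# discriminator can no longer tell them apart.
print(round(gen_mean, 1))  # settles near 5.0
```

The equilibrium illustrates the “black box” point below: the trained generator ends up producing convincing output, but nothing in the process records a reason why.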

The black box nature of this process makes it hard to ask perhaps the most fundamental question of all – “why?” And if you cast your mind back to those old science fiction tropes, this was often the point at which the rogue computer exploded in a puff of logic!

Find out more at Digifest


This is one of a series of features on topics that will be covered at this year's Digifest, which takes place on 14-15 March 2017.  

Martin will be giving his talk, Loving the alien – robots and AI in education, on the morning of day one of Digifest. Full details of all this year's sessions can be found in the Digifest 2017 programme.

Digifest attendees will have a chance to meet Pepper the robot, as well as sample the latest technology in development, in the Digi Lab.

Join the conversation on Twitter using #digifest17.