It’s definitely not a bird. Nor is it a plane. The garish orange piece of plastic, small enough to hold in the palm of a hand, could pass for a missing limb of a toy tyrannosaurus. It may not look all that impressive, but it’s notable for two reasons. One is that the monster arm has emerged from a 3D printer. The other is that it is, in fact, the first ever object made from thought.
This milestone was reached with little fanfare last month at the Santiago MakerSpace, a technology and design studio in the Chilean capital. The toy limb’s shape was determined according to the wishes of its designer, as gleaned from a headset picking up his brainwaves. The man in question was George Laskowsky, Chief Technical Officer of Thinker Thing, the Chilean start-up developing the mind-controlled 3D printing system.
Engineers and designers have been using 3D printers for more than two decades. More recently, prices have tumbled and desktop devices are increasingly being pitched at consumers. The touted possibilities appear to be endless – from bones to buildings to burritos – leading some observers to predict revolutionary consequences such as the eventual demise of the factory. Because 3D printers build objects layer by layer from materials such as plastic or metal dust, a key advantage is the comparative freedom they give designers. Yet the design software is not easy to master, especially if you are four years old and haven’t yet learnt to hold a pencil properly.
“What is the point of these printers if my son cannot design his own toy?” says Bryan Salt, CEO of Thinker Thing. “I realised that while there were a lot of people talking about the hardware of the printer, no-one really seemed to be talking about how to actually use it.” In theory, 3D printers could help unleash our inner creativity, freeing us from the constraints of traditional production methods. In practice, however, those unwilling or unable to plough through the software instruction manual could be left downloading ready-made models designed by others.
That’s where Emotional Evolutionary Design (EED), the software that allows Thinker Thing to interpret its users’ thoughts, comes in. Its current role is to power the Monster Dreamer Project, which will allow users to design their own fantastical creatures using the power of thought. Chilean children will get the first opportunity to try it out during a tour of the country’s schools at the end of this month.
When those children sit in front of a computer running Monster Dreamer, they will be presented with a series of different body shapes in bubbles. These will mutate randomly, with built-in rules preventing them from becoming too abstract. The children’s reactions to the changes will be picked up by an Emotiv EPOC headset, a $300 electroencephalography (EEG) device that detects the electrical signals from brain cell interactions using fourteen sensors on the scalp. Because different brain states, such as excitement or boredom, generate specific patterns of brain activity, the computer can identify the shapes associated with positive emotional responses. The favoured shapes will grow bigger on the screen, while the others shrink. The biggest shapes are combined to generate a body part, and the process is repeated for different body parts until the monster is complete. The final result should be a unique 3D model, ready to be printed as a solid object.
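Thinker Thing has not published the EED code, but the loop described above resembles a simple interactive evolutionary algorithm. Below is a minimal Python sketch of that idea: the `read_excitement()` function is a hypothetical stand-in for the headset reading, and the mutation and selection rules are illustrative assumptions, not the company’s actual method.

```python
import random

def read_excitement(shape):
    """Placeholder for the headset reading: return a 0-1 'excitement'
    score recorded while the user looks at this shape. A real system
    would poll the Emotiv EPOC here."""
    return random.random()

def mutate(shape):
    """Randomly perturb a shape's parameters, clamped to bounds that
    stop it becoming too abstract (the article's 'built-in rules')."""
    return {k: min(1.0, max(0.0, v + random.uniform(-0.1, 0.1)))
            for k, v in shape.items()}

def evolve_body_part(population, generations=10):
    """Grow the shapes the user responds to, shrink the rest, and
    return the top scorer as the finished body part."""
    for _ in range(generations):
        population = [mutate(s) for s in population]
        scored = sorted(((read_excitement(s), s) for s in population),
                        key=lambda pair: pair[0], reverse=True)
        # Favoured shapes 'grow': the top half survives and is duplicated
        survivors = [s for _, s in scored[:len(scored) // 2]]
        population = survivors + survivors
    return population[0]

# Each candidate shape reduced to a bag of numeric parameters
initial = [{"length": random.random(), "curve": random.random()}
           for _ in range(8)]
print(evolve_body_part(initial))
```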
Second nature
Design steered by emotional responses is based on the notion that most people are better at critiquing a design than they are at thinking of new ideas from scratch, especially if they have no training. “One of the biggest bottlenecks right now with 3D printing is content,” says Professor Hod Lipson, director of the Creative Machines Lab at Cornell University, in Ithaca, New York State. “We have iPods with no music. We have machines that can make almost anything but we do not have a lot of things to make with them.”
Lipson’s lab is also working on evolving 3D models with the mind. EndlessForms, created by two of Lipson’s students, is a website that mimics nature’s way of creating new designs in small steps. At the start of the process users are presented with 15 three-dimensional shapes. Clicking on any two will combine them and produce 15 new shapes based on those choices. If you wanted to make a cat, you might click on one shape with the semblance of a muzzle and another with two pointed, ear-like triangles on top. The computer would then offer up a series of new shapes that more closely resemble the cat you have in mind, and so on until the model reaches the desired shape.
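EndlessForms generates its shapes with far richer machinery than this, but the click-two, get-15 loop itself can be sketched in a few lines of Python. The shape representation and the combine rule here are simplified stand-ins for illustration, not the site’s actual algorithm.

```python
import random

def crossover(parent_a, parent_b):
    """Combine two chosen shapes: each parameter is inherited from
    one parent or the other, then lightly mutated."""
    child = {}
    for key in parent_a:
        child[key] = random.choice([parent_a[key], parent_b[key]])
        child[key] += random.gauss(0, 0.05)  # small mutation step
    return child

def next_generation(parent_a, parent_b, size=15):
    """Produce the next grid of 15 shapes from the two clicked ones."""
    return [crossover(parent_a, parent_b) for _ in range(size)]

# A 'shape' reduced to a few numeric traits for illustration
muzzle = {"width": 0.8, "height": 0.3, "point": 0.1}
ears   = {"width": 0.2, "height": 0.9, "point": 0.9}

grid = next_generation(muzzle, ears)
print(grid[0])
# The user clicks two of these, and the loop repeats until the
# model looks like the cat they have in mind.
```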
To reduce the time spent clicking, the researchers came to the same conclusion as the Thinker Thing team – feeding users’ thoughts back into the computer directly could make the process quicker. So, last year, the team used Emotiv EPOC headsets to read users’ brain signals and thereby determine their reactions. But then they ran into a problem. “At some point we were thinking it was only measuring the level of sweat because we were actually trying so hard to feel happy or sad about something,” says Lipson. No matter how many times they tried, the scientists could not find a reliable signal to use from the headset.
The problem with cheap consumer headsets is that the signals they are trying to pick up are faint to begin with. The skull dampens the small electrical impulses from the brain’s neurons, and electrical signals from nearby facial muscles can overpower them. Some sceptics argue that consumer EEG devices are not really measuring thoughts at all.
Others argue that for applications requiring only basic feedback, such as a yes or a no, the readings they generate can be accurate enough. “If they are simple positive or negative emotions, it can be 100%,” says Dr Olga Sourina, head of the Cognitive Human Computer Interaction Lab at Nanyang Technological University, Singapore. “When you need to differentiate between more emotions, like anger or fear, they can be less accurate.” Sourina has spent nine years working on improving the ability of computers to recognise emotions from EEG.
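Sourina’s methods are far more sophisticated, but the “simple positive or negative” case can be illustrated with standard signal-processing tools: band-pass the raw EEG to reduce muscle artefacts, then compare band power against a calibrated threshold. This Python sketch uses SciPy; the frequency band, the threshold, and the positive-means-relaxed reading are illustrative assumptions, not her published parameters.

```python
import numpy as np
from scipy.signal import butter, filtfilt

FS = 128  # the Emotiv EPOC samples at 128 Hz per channel

def bandpass(signal, low, high, fs=FS, order=4):
    """Band-pass filter; cutting frequencies outside the band removes
    much of the facial-muscle (EMG) contamination mentioned above."""
    b, a = butter(order, [low, high], btype="band", fs=fs)
    return filtfilt(b, a, signal)

def band_power(signal):
    """Mean squared amplitude as a crude power estimate."""
    return np.mean(signal ** 2)

def classify_valence(raw, threshold):
    """Crude positive/negative call: strong alpha-band (8-12 Hz)
    power is read, very roughly, as a relaxed or positive state.
    Real emotion recognition uses many channels and trained
    classifiers rather than a single threshold."""
    alpha = bandpass(raw, 8.0, 12.0)
    return "positive" if band_power(alpha) > threshold else "negative"

# One second of fake single-channel EEG for illustration
raw = np.random.randn(FS)
print(classify_valence(raw, threshold=0.5))
```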
With some of the limitations of consumer EEG technology in mind, Lipson and colleagues decided to monitor thoughts via the eyeballs instead. Eye tracking can identify the shapes that attract the most attention, and that signal can be used to steer the design process. The snag is that, without an EEG headset, it is not possible to tell whether someone is looking at a shape because they find it strange or because they find it beautiful. Even so, the team’s research so far suggests that participants still feel they have reached the desired design by the end of the process. There is some way to go, but in the future a combination of brain scanning and eye tracking may be preferred to the trusty old mouse when it comes to designing 3D objects.
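One common way to turn gaze data into a choice is a dwell-time rule: whichever shape holds the eye longest wins. The sketch below assumes a stream of (shape, seconds) fixation samples from a hypothetical eye tracker; it captures the idea described above rather than Lipson’s published method.

```python
from collections import defaultdict

def pick_by_attention(fixations):
    """Total up how long the user's gaze dwelt on each shape and
    return the one that held attention longest. Note the caveat in
    the text: a long look can mean 'beautiful' or merely 'strange'."""
    dwell = defaultdict(float)
    for shape_id, duration in fixations:
        dwell[shape_id] += duration
    return max(dwell, key=dwell.get)

# (shape_id, seconds) pairs from a hypothetical eye tracker
samples = [("cat_v1", 0.4), ("cat_v2", 1.7), ("cat_v1", 0.3),
           ("cat_v3", 0.9), ("cat_v2", 0.8)]
print(pick_by_attention(samples))  # -> "cat_v2"
```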