Social Robot Design.

Cornell University. 2016 — 




Problem: Robot research platforms are either prohibitively expensive, overly anthropomorphic, or not customisable enough.


Commonly used robots in current HRI research.


Solution: An open-source, expressive robot to be used for research in people's homes and in our lab.

Core Challenge: Developing a form that could engage people socially without leveraging anthropomorphism.



Motion study of a form concept, used to explore the expressive nature of an idea for a physical form. By having a bend in the neck, this form is able to move back and forth more quickly and with less motor rotation.


Key Skills: Rapid iteration using Cinema 4D, SolidWorks, Python.





One of the biggest design challenges in most Human-Robot Interaction (HRI) research labs is finding an appropriate robot platform for researchers to use in their studies. Most labs are made up of technically minded engineers and human-focused social scientists. When looking to research the effects of new behaviours, algorithms or systems, often the only choice is to use one of the expensive, off-the-shelf products available on the market (e.g. Keepon, Nao, Pepper). This means new interactions are limited by what can be designed into these existing form factors, with only limited access to the hardware platforms themselves.

This has the effect of limiting the generalisability of scientific research, and I believe it is a choke point for the robotics innovation pipeline. When so many of the findings published today use either heavily anthropomorphised products already on the market (e.g. the Nao robot), products with limited access to the backend software (e.g. Jibo), or low-fidelity robots prototyped over a weekend in a research lab, it creates a gap in the explorable knowledge for the field of HRI. The goal for this project was to design a non-anthropomorphic, open-source, low-cost, expressive social robot platform that could be widely used. My goal was not only to have it used by the 6+ researchers in my lab, but also by those in other departments and institutions, and ultimately to support field research with end users exploring potential use cases for social robots in the home.


Initial concept for expressive, collaborative robot.


The robot will be used in studies of group language learning, problem-solving mediation, humour, resource distribution and other research in the Robots In Groups lab. Because this is such an emerging field, it also allows me to design new methods for how these devices should be ideated and iterated upon. These are methods I have been teaching in two classes at Cornell in the Spring and Fall of 2017.




Mood board for new robot design.



The new robot has to be close enough to what participants expect when they imagine a robot, and similar enough to existing robots, that research done with it can be generalised (supernormal). It also has to appear new and advanced, so that when researchers imbue it with their leading-edge theories and algorithms there is no discrepancy in physical or system performance (technical). Participants need to feel comfortable with the device, and it should not appear intimidating (friendly). I believe we can look to early technological examples of now-common household goods for inspiration (nostalgic).




Initial sketches to explore different rough shapes for the physical form and how it might move.





First operational prototype, used for testing movement designs that were prototyped in animation software.


I’ve found physical prototyping to be invaluable: when testing interactive designs, no feedback is as valuable as feedback from a physical version of the design. For this reason, the next phase focused on building a prototype that the animations could be mapped onto (above). This also raised a lot of tough questions about wiring, mechanisms and weight distribution, key challenges that set the design of these kinds of devices apart from what an industrial designer traditionally works on. Working fluidly between CAD, code and the workshop allowed a number of iterations on the mechanisms to reduce stress on the motors and allow more agility when implementing motions. It was also important to stay within the bounds of a (relatively) cheap motor, the Dynamixel AX-12, which retails for $49, in order to keep the overall cost of the project down.
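To give a feel for this pipeline, below is a minimal sketch of how a movement designed in animation software might be played back on a single AX-12. It assumes the ROBOTIS dynamixel_sdk Python library, a serial adapter on /dev/ttyUSB0, a motor at ID 1, and a hypothetical keyframe list exported from the animation tool; the register addresses come from the AX-12 (Protocol 1.0) control table.

```python
import time
from dynamixel_sdk import PortHandler, PacketHandler  # ROBOTIS Dynamixel SDK

# AX-12 control table addresses (Protocol 1.0)
ADDR_TORQUE_ENABLE = 24  # 1 byte
ADDR_GOAL_POSITION = 30  # 2 bytes; 0-1023 spans roughly 300 degrees

DXL_ID   = 1               # motor ID on the bus (assumption)
PORT     = '/dev/ttyUSB0'  # serial adapter, e.g. a U2D2 (assumption)
BAUDRATE = 1000000         # AX-12 factory default

port = PortHandler(PORT)
packet = PacketHandler(1.0)
port.openPort()
port.setBaudRate(BAUDRATE)
packet.write1ByteTxRx(port, DXL_ID, ADDR_TORQUE_ENABLE, 1)

def degrees_to_ticks(deg):
    """Map an animation angle in degrees (-150..150) to AX-12 ticks (0..1023)."""
    return int(round((deg + 150.0) * 1023.0 / 300.0))

# Hypothetical keyframes exported from the animation software:
# (time in seconds, joint angle in degrees).
keyframes = [(0.0, 0.0), (0.4, 25.0), (0.8, -10.0), (1.2, 0.0)]

start = time.time()
for t, angle in keyframes:
    # Naive playback: wait until the keyframe's time, then command the pose.
    time.sleep(max(0.0, t - (time.time() - start)))
    packet.write2ByteTxRx(port, DXL_ID, ADDR_GOAL_POSITION,
                          degrees_to_ticks(angle))

port.closePort()
```

A fuller version would interpolate between keyframes and synchronise several motors at once; this only shows the basic mapping from an animation curve to the motor's goal-position register.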


Highlighting key components.


With components in place and the robot operational, integration was the final major design hurdle. It’s one thing to design how a set of motors, a speaker or a screen can seem ‘happy’, but how does that change if all three are working in harmony? Does this vary slightly each time it occurs? And how does this change over time? These are the core UX questions that fuel my interest in this design space.
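As a thought experiment on that question, here is a minimal sketch of one possible approach: a single expression call fans out to all three channels at once, with slight random variation so repeated expressions are never quite identical. Every channel function here is a hypothetical stand-in, not the robot's actual code.

```python
import random
import threading

# Hypothetical output channels; stand-ins for the robot's motor,
# audio and screen subsystems.
def play_motion(gesture, scale):
    print(f"motors: {gesture} at {scale:.2f}x amplitude")

def play_sound(clip, pitch):
    print(f"speaker: {clip} shifted {pitch:+.2f} semitones")

def show_face(animation, speed):
    print(f"screen: {animation} at {speed:.2f}x speed")

def express(emotion):
    """Trigger all three channels together, jittered slightly so the
    expression reads as alive rather than canned."""
    jitter = lambda: random.uniform(0.9, 1.1)
    channels = [
        threading.Thread(target=play_motion, args=(f"{emotion}_gesture", jitter())),
        threading.Thread(target=play_sound,  args=(f"{emotion}_clip", random.uniform(-1, 1))),
        threading.Thread(target=show_face,   args=(f"{emotion}_face", jitter())),
    ]
    for c in channels:   # fire all channels in parallel
        c.start()
    for c in channels:   # wait for the expression to finish
        c.join()

express("happy")
```

How much variation reads as lifelike, and how it should drift over repeated interactions, is exactly the kind of open question the studies below are meant to probe.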

We know from 20+ years of research that robots activate a stronger social response in humans than a toaster does, but a weaker one than a pet. How designers work with this medium is an open question, and one that fascinates me.



Robot platform as of December 2017.


This work is ongoing, and the robot is currently being used in research in the lab. Usability tests are being run to verify whether the design intentions were realised in the final design. The robot is also being used in research into how people learn a new language, which allows practical testing of how well the form conveys expressive behaviours.

As this work is ongoing and always evolving, please feel free to reach out to see where the project currently stands, or check out the GitHub repo below:


GitHub repo.




Character Actor: Design and Evaluation of Expressive Robot Car Seat Motion
H. Tennent, D. Moore, W. Ju. Published in IMWUT, December 2017.


Part of the process I have been developing is published in the paper above, “Character Actor: Design and Evaluation of Expressive Robot Car Seat Motion”, and it now forms part of a scientific body of knowledge that is actively shaping how these devices are designed and built in industry, schools and research labs.

Click here to read the paper.



© Hamish Tennent 2017