January 4, 2015
To enable teletactics, that is, to allow a user to feel remote objects, I built a number of prototypes that could provide feedback. Some prototypes are handheld; others strap onto the user's arm. In this post I will discuss the various prototypes and the corresponding design process. The prototypes are built on the Arduino platform.
I started with a toilet paper roll with a laser and a switch strapped to it. Starting the prototype this way allowed me to focus on implementing the Sphero ball movement dynamics when following paths drawn with the laser pointer.
Next, we wanted to explore egocentric (“relative to self”) and exocentric (“relative to the world view”) motions of the Sphero balls. The laser pointer allows for exocentric motion: the path the ball follows is relative to the larger space. However, if we used a joystick to control a Sphero, it would move relative to itself. Therefore, in the next prototype I added a joystick. I also added an LED capped with a ping pong ball to soften the light. The LED color is meant to correspond to the Sphero ball the user is currently interacting with.
Following this, I started exploring feedback mechanisms. When a user pointed towards a Sphero ball, I wanted to provide on-demand feedback. To do this, I built a cardboard expansion-contraction mechanism using a pin-and-slot joint attached to a servo motor. The controller would expand when pointed towards a Sphero and contract when pointed away. I also wanted to use this characteristic to select a Sphero for interaction, so I placed a force sensor on one side of the mechanism: when the mechanism was expanded in the presence of a Sphero, the user could squeeze it to select that ball. The mechanism would compress based on the amount of force applied by the user.
To make the prototype more reliable, we decided to 3D print it. My new design is seen below. All the functional aspects are the same, but this design is more ergonomic to hold. The small cylinder at the front holds the laser, the sphere holds the LED, and the hole past the sphere is where the joystick is placed.
This video demonstrates the interactions. As you will hear, we also played with various sounds as feedback mechanisms.
Here is a photo gallery of the final design.
We also wanted to investigate prototypes that could be worn by the user. The interaction paradigm would be similar to the previous prototype: expansion of the device would occur in the presence of a Sphero, and selection would happen through squeezing. With these prototypes we played with three levels of feedback:
We used a balloon as a shape replica of the Sphero. To allow for expansion and contraction of the balloon, I built a pneumatic network of compressors and an air release valve.
I tried a number of techniques to detect the user's squeeze. The most memorable one used a pressure sensor to detect the squeeze and to regulate the balloon's size. This wasn't very effective: the user had to squeeze very quickly for any air pressure change to be detected. It was also difficult to use pressure as the measure of the balloon's size, since the pressure in the balloon didn't change much as the balloon's shell expanded.
In another prototype, I used a force sensor to detect the squeeze and switches to control the balloon's size. The downside of this approach was that the user was wearing this large, cage-like device on their hand all the time. We wanted the user to have their hands free when they were not interacting.
Using a bend sensor between a double-walled balloon was the most effective way to detect the squeeze and regulate the balloon's size while keeping the user's hand free when not interacting. This was the sleekest prototype and the final chosen technique.
Here is a photo gallery of the final prototype design. All the designs were modelled in 3D and printed.
Two sections of software interact directly with the handheld or worn hardware: the loop in the application running on my desktop that takes in image frames from the ceiling-mounted tracking camera, and the code running on the Arduino for each prototype. The state machines on the Arduino side mainly deal with the low-level aspects of the device, such as controlling the compressors, air release valve, and LED when the appropriate transitions are triggered (i.e. a squeeze is detected).
The state machine used in correspondence with the vision system running on my desktop focuses on high-level aspects: where the user is pointing, the type of interaction, and what motion commands should be sent to the Sphero balls.