On June 1st, 2012, a team from Lehigh University visited Spring Garden Elementary School in Bethlehem, Pennsylvania, to meet with 23 kindergarten students from Miss Murnin's class. Our goal was to provide a positive and exciting experience for the students, showing them that science and engineering can be a fun and rewarding pursuit.
In preparation for our visit, we developed a new robotics platform based on Android smartphones. Modern phones offer a unique architecture for interacting with the physical world, including voice, GPS, a camera, and a host of low-level sensors. They also provide robust communication and computational ability. Our hypothesis was that phones could serve as the foundation for a powerful, low-cost robotics platform that is interesting to adults and children alike.
This website provides links to the hardware platform and software that we used to build the robots, as well as pictures of our visit and a summary suitable for use in future outreach activities. We found the experience to be enriching and rewarding both for ourselves and the kindergarteners, and would be happy to provide support and advice to any Computer Science professionals considering K-12 outreach activities. We are also building a schedule for visits during the 2012-2013 academic year. If you are in the Lehigh Valley area and would like a hands-on robot presentation in your classroom, please contact Professor Spear.
The hour-long visit consisted of an initial "circle time", where we discussed the roles of scientists and engineers, four hands-on activity stations, and a final gathering. Each hands-on activity featured a different robot with which the students could interact.
In the first activity, students learned about how our robots were constructed. The robot in this station was built "upside down", so that the circuits and wires were visible. This allowed students to see how the parts get connected together, and provided an opportunity to talk about electricity, motor control, and the relationship between the phone/tablet and the rest of the robot. Students had the chance to connect wires, and then to use a tablet to drive the robot.
In the second activity, students talked about how a robot could help respond to natural disasters. This activity featured a robot that connected wirelessly (via Bluetooth) to a smartphone. The smartphone interface allowed the students to control the robot and have it take pictures, which were then transmitted back to the phone. In addition to playing with the robot, students discussed how robots could be used to explore dangerous environments without putting any people in harm's way.
In the third activity, students considered how a robot could be taught specific paths and patterns, so that it could perform repetitive tasks to assist individuals with health-related limitations. Students drew shapes, patterns, and paths on the robot's screen, and the robot then re-created each path by moving around the room, following the directions that had been drawn.
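One way to turn a drawn path into robot motion is dead reckoning: treat the drawing as a polyline, and for each segment, rotate to the segment's heading and then drive forward by its length. The sketch below is a hypothetical illustration of that idea, not the actual Car-Bot code; the point format, command strings, and starting heading are all assumptions.

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical converter from a drawn polyline to drive commands.
// The real Car-Bot software may represent paths and commands differently.
public class PathPlayer {
    /**
     * points = {x0, y0, x1, y1, ...} in screen coordinates.
     * Returns commands such as "rotate 90.0" (degrees) and "forward 100.0" (pixels).
     */
    public static List<String> toCommands(double[] points) {
        List<String> cmds = new ArrayList<>();
        double heading = 0.0; // assumption: the robot starts facing along +x
        for (int i = 0; i + 3 < points.length; i += 2) {
            double dx = points[i + 2] - points[i];
            double dy = points[i + 3] - points[i + 1];
            double target = Math.toDegrees(Math.atan2(dy, dx));
            double turn = target - heading;
            // Normalize to (-180, 180] so the robot takes the shorter rotation.
            while (turn > 180) turn -= 360;
            while (turn <= -180) turn += 360;
            if (turn != 0) cmds.add("rotate " + turn);
            cmds.add("forward " + Math.hypot(dx, dy));
            heading = target;
        }
        return cmds;
    }
}
```

For an L-shaped drawing, this emits a forward move, a single 90-degree rotation, and a second forward move, which is the minimal command sequence for that path.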
In the fourth activity, students interacted with a semi-autonomous robot. They would show the robot a balloon, and then the robot would move in order to follow the balloon. This activity provided a chance to think about how robots could be taught to recognize shapes and objects. It was also a fun way to play with the robots!
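Following an object like a balloon can be reduced to a simple steering rule: once a vision library (JavaCV, in our robots' case) reports where the balloon appears in the camera frame, rotate until it is centered, drive forward until it is close, then stop. The rule below is an illustrative sketch, not the code our robot ran; the frame width, tolerance, and size threshold are invented values.

```java
// Hypothetical steering rule for balloon following. The detection step
// (finding blobX and blobArea in a camera frame) is assumed to be done
// by a vision library and is not shown here.
public class BalloonFollower {
    static final int FRAME_WIDTH = 640;       // assumed camera frame width, pixels
    static final int CENTER_TOLERANCE = 60;   // pixels: "close enough to centered"
    static final double NEAR_AREA = 20000.0;  // blob area (pixels^2) at which to stop

    /** Pick one of the robot's drive commands from the blob's position and size. */
    public static String steer(double blobX, double blobArea) {
        double offset = blobX - FRAME_WIDTH / 2.0;
        if (offset < -CENTER_TOLERANCE) return "rotate_ccw"; // balloon is to the left
        if (offset > CENTER_TOLERANCE) return "rotate_cw";   // balloon is to the right
        if (blobArea < NEAR_AREA) return "forward";          // centered but far away
        return "stop";                                       // centered and close
    }
}
```

Because the rule is evaluated on every frame, the robot continuously re-centers itself as the student moves the balloon.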
As we wrapped up the day, we had one more activity, based on speech. The robot presented an ear on the screen. When the ear was pressed, the robot would listen to the student, and then it would try to repeat the student's phrase in its own voice. As a hidden surprise, the robot was trained to recognize a specific keyword, and then to "dance" if it heard a student say that word.
The wiring schematic can be found here. Note that the schematic does not include powering the Arduino. We used a separate 9V battery for this purpose.
The parts used in building our robots are as follows. Note that the mention of particular vendors and products is in no way meant to be an endorsement. Rather, we provide this information only to simplify the task of recreating our work.
Finally, you will need an Android device that supports the Android Open Accessory Development Kit (ADK). We used the ASUS Transformer Prime tablet and the HTC Droid Incredible 2, since we had these products on hand.
There are two principal software components in our current robot platform. The first is a small "sketch" that runs on the Arduino microcontroller. Its purpose is to receive commands from the serial port, and to use those commands to perform transitions along a simple state machine comprising seven behaviors: stop, move forward, move backward, rotate clockwise, rotate counterclockwise, turn left, and turn right. Whenever a command arrives, the Arduino software transitions to the appropriate state. At all times, it also sends signals to two servo motors based on the current state, in order to maintain the appropriate movement by the robot. This software can be found in the Lehigh University Car-Bot repository, under the "arduino" folder.
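The state machine described above can be modeled as follows. The actual sketch is Arduino C and drives the servos directly; this plain-Java model only illustrates the logic, and the one-character command codes and wheel-speed values are assumptions rather than the protocol our robots use.

```java
// Plain-Java model of the Arduino sketch's command-driven state machine.
// The seven states come from the text; the command characters ('f', 'b', ...)
// and the wheel-speed values are illustrative assumptions.
public class CarBotStateMachine {
    enum State { STOP, FORWARD, BACKWARD, ROTATE_CW, ROTATE_CCW, TURN_LEFT, TURN_RIGHT }

    private State state = State.STOP;

    /** Transition on a one-character command, as if read from the serial port. */
    public void onCommand(char c) {
        switch (c) {
            case 's': state = State.STOP; break;
            case 'f': state = State.FORWARD; break;
            case 'b': state = State.BACKWARD; break;
            case 'c': state = State.ROTATE_CW; break;
            case 'w': state = State.ROTATE_CCW; break;
            case 'l': state = State.TURN_LEFT; break;
            case 'r': state = State.TURN_RIGHT; break;
            default: break; // ignore unrecognized commands, keep current state
        }
    }

    /** Left/right wheel speeds in [-1, 1] derived from the current state. */
    public double[] wheelSpeeds() {
        switch (state) {
            case FORWARD:    return new double[] { 1.0, 1.0 };
            case BACKWARD:   return new double[] { -1.0, -1.0 };
            case ROTATE_CW:  return new double[] { 1.0, -1.0 };  // spin in place
            case ROTATE_CCW: return new double[] { -1.0, 1.0 };
            case TURN_LEFT:  return new double[] { 0.5, 1.0 };   // arc while moving
            case TURN_RIGHT: return new double[] { 1.0, 0.5 };
            default:         return new double[] { 0.0, 0.0 };
        }
    }

    public State state() { return state; }
}
```

The key property, matching the description above, is that motor output depends only on the current state: a command changes the state once, and the same servo signals are then sent continuously until the next command arrives.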
The second component is the software that runs on the phone. This is more complex, consisting of 3000 lines of Java code (excluding comments and whitespace). This software manages Bluetooth connections between phones, handles communication between the phone and the Arduino, and also provides all interfaces and logic needed to perform the four activities described above. It can be found in the Lehigh University Car-Bot repository, under the "controller-project" folder. Note that this software currently relies on the JavaCV computer vision package. Depending on your hardware, you may need to re-build JavaCV, rather than use the compiled version in our repository.
We would like to note that there are several ways in which the interface to our software could be improved. We are currently revising the software, and are open to suggestions on how to improve it. We also suggest that before using our software, you send us a message, so that we can provide advice about interface peculiarities, particularly those that arise when trying to get a single device to communicate over both Bluetooth and USB simultaneously.
First and foremost, we would like to thank Miss Murnin for inviting us to her class, and Mr. Smith, Principal of Spring Garden, for his support. We also thank Miss Murnin for taking the photos of the visit that appear on this website.
This project would not have been possible without equipment support from Google, and financial support received through the P.C. Rossin Assistant Professorship award. Tablet devices used in this project were purchased through National Science Foundation Grant CNS-1016828.
We would also like to thank Greg Besack and Dan Phillips, whose final project in CSE398 provided a starting point for some of our software.