We recently visited beautiful Laurel View Village in Davidsville near Johnstown. Present were their staff along with representatives of the Johnstown School District and the Johnstown Christian School. Everyone was excited about Romibo and the prospect of running a program with the residents there.
Just submitted the app for review. If accepted, it will be ready for distribution on the Apple App Store!
In addition to minor interface tweaks, I have spent the past few days working on the nub views (driving, head tilting, and emotion). The emotion nub replaces the old buttons and allows smooth transitions between colors. The head and tilt nubs also move more smoothly now, thanks to animations that return them to their start positions. I also optimized images for Retina displays, such as the silhouette image that opens the child view. Finally, I revised the phrase buttons to fix grammar and to accommodate more than one line of text.
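As a rough sketch of the kind of blending behind those smooth color transitions, here is a hypothetical `lerpColor` helper in plain C++ (the actual app animates between `UIColor`s; `Color` and `lerpColor` are stand-ins for illustration):

```cpp
#include <cassert>
#include <cmath>

// Simple RGB color; a hypothetical stand-in for the app's UIColor values.
struct Color { double r, g, b; };

// Linearly interpolate between two emotion colors as the nub moves,
// giving smooth transitions instead of the discrete jumps of buttons.
Color lerpColor(const Color& from, const Color& to, double t) {
    t = std::fmin(1.0, std::fmax(0.0, t));  // clamp progress to [0, 1]
    return { from.r + (to.r - from.r) * t,
             from.g + (to.g - from.g) * t,
             from.b + (to.b - from.b) * t };
}
```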
The fiber-optic cables on Romibo’s antennae are not very effective because they are rather dim and face backward. In the new interface I created, the iPad app changes color based on Romibo’s emotion. It conforms more closely to the iOS Human Interface Guidelines and the iOS 7 “flat” aesthetic, and it uses more consistent elements across the app for a smoother user experience. The mood buttons control the main view’s background color, and the interface elements on top are partially transparent so they adapt to the new colors. See the screenshots below (the old interface is on the left, and the new one is on the right).
Continuing from a couple of days ago, I have finished fine-tuning the accelerometer control and resolved an issue with constraining the driving nub’s location. Now, if operators need to take their eyes off the screen, they can still drive Romibo by tilting the iPad in the direction they want the robot to move. See the pictures below!
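One common way to constrain a joystick-style nub is to project any touch point that leaves the circular track back onto its rim. This is a minimal sketch of that idea, not the app’s actual code (`Point` and `constrainToCircle` are hypothetical names):

```cpp
#include <cassert>
#include <cmath>

struct Point { double x, y; };

// Clamp the driving nub's touch point to a circle of radius r around the
// pad's center, so the nub cannot be dragged outside its track.
Point constrainToCircle(Point p, Point center, double r) {
    double dx = p.x - center.x, dy = p.y - center.y;
    double dist = std::sqrt(dx * dx + dy * dy);
    if (dist <= r) return p;                 // already inside the track
    double scale = r / dist;                 // project back onto the rim
    return { center.x + dx * scale, center.y + dy * scale };
}
```

The direction of the drag is preserved; only its magnitude is capped, so the robot still turns toward wherever the finger points.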
The latest improvement I’m working on is an easier way to steer the robot using the iPad’s accelerometer. It is a challenging (and time-consuming) task! It took me several hours to compare different options and learn how to use the CoreMotion framework to read data from the internal accelerometer. The trickiest part was animating the DrivingNub based on the X and Y readings from the accelerometer, but it is working and can drive the robot. Once it has been fine-tuned, I will post some pictures.
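The core of tilt steering is mapping the accelerometer’s per-axis readings (CoreMotion reports roughly -1 to 1 g per axis) to a nub offset on screen. A plausible sketch of that mapping, with a dead zone so the robot holds still when the iPad is near level (`tiltToNubOffset` and its parameters are hypothetical, not the app’s actual code):

```cpp
#include <cassert>
#include <cmath>

struct Offset { double x, y; };

// Map per-axis accelerometer readings (about -1..1 g) to a nub offset in
// points. Tilts smaller than deadZone are ignored so the robot stays still
// when the iPad is held roughly level.
Offset tiltToNubOffset(double gx, double gy, double maxOffset, double deadZone) {
    auto shape = [&](double g) {
        if (std::fabs(g) < deadZone) return 0.0;        // ignore tiny tilts
        double v = std::fmin(1.0, std::fmax(-1.0, g));  // clamp to [-1, 1]
        return v * maxOffset;                           // scale to screen points
    };
    return { shape(gx), shape(gy) };
}
```

Animating the DrivingNub to that offset then reuses the same drive logic as dragging it by hand.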
First of all, Suvir is WAY too modest. We have a media superstar as an intern! We’re confident he can do great things for the Romibo robot.
Romibo has great potential, especially for open-source development by folks like Suvir who are curious and willing to try new things with technology (it sure doesn’t hurt to have experience with iPad and Arduino development either!). He has already solved some of the problems we posed to him: chirping and wiggling when you “pet” Romibo, and locking out the drive motors. Can’t wait to see what he comes up with next!
Set Romibo on a table? Save it from accidentally falling off with the new “lockout” functionality in the iPad app, which disables all driving control within the app. I implemented this setting using NSUserDefaults and a UISlider in the Configuration view. See the picture below!
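The lockout itself can be as simple as gating every drive command behind the persisted flag. A minimal sketch of that gating, with a plain bool standing in for the value read from NSUserDefaults (`DriveCommand` and `applyLockout` are hypothetical names):

```cpp
#include <cassert>

// Left/right motor speeds for one drive command.
struct DriveCommand { int left, right; };

// If the lockout setting is on, every command from the app becomes a stop,
// so Romibo cannot be driven off the table. The flag would be read from
// NSUserDefaults in the real app.
DriveCommand applyLockout(DriveCommand cmd, bool lockoutEnabled) {
    if (lockoutEnabled) return {0, 0};  // disable all driving from the app
    return cmd;
}
```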
I finished yesterday’s task today. It took a while to get the sound file working, but Romibo now chirps occasionally when you pet him. (The chirp is really loud, though; I will work on a way to change the volume.) I also added and fine-tuned a couple of head-movement functions in the RomiboRobot library. A new one is wiggle(), a slight movement of each head motor that gives another response to petting.
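A wiggle can be expressed as a short sequence of servo angles: nudge the head off center one way, then the other, for a few cycles, and settle back to neutral. This is a guess at the shape of such a motion, not the actual RomiboRobot code (`wiggleAngles` is a hypothetical helper; the real wiggle() drives the head motors directly):

```cpp
#include <cassert>
#include <vector>

// Build the sequence of servo angles for a wiggle: alternate a few degrees
// either side of center for `cycles` cycles, then return to neutral. The
// firmware would step a head servo through these angles with short delays.
std::vector<int> wiggleAngles(int center, int amplitude, int cycles) {
    std::vector<int> seq;
    for (int i = 0; i < cycles; ++i) {
        seq.push_back(center + amplitude);   // tilt one way...
        seq.push_back(center - amplitude);   // ...then the other
    }
    seq.push_back(center);                   // settle back to neutral
    return seq;
}
```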
Mr. McLaughlin, QoLT Lead Ambassador, lent me an old smartphone, which I cleaned up today. It would be interesting to see whether its camera could be used for image processing and face tracking. I have experience with OpenCV in iOS development, so I did some research into using it on an Android phone like the one he gave me. The phone does not have a front-facing camera, so tracking a user’s face from the front would not be feasible, but introducing a smartphone opens up a lot of possibilities. I also started taking a look at the iPad application and cleaning up its interface.
Today, I “got my feet wet” with Romibo programming. It took a couple of hours to set up the IDE, skim through the libraries, and write a few small programs. My first task was to make Romibo “wiggle”, blink, and “chirp” when you pet it. Using the three touch sensors on its frame, I was able to detect when a user touched Romibo’s head, and I altered a demo program to make it bob its head and sway. Now I am working on adding a chirp sound to the SD card inside Romibo and playing it at random when the switches are triggered. From what I’ve seen so far, it seems like a great platform for me to learn on and improve!
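Playing the chirp only some of the time makes the response feel organic rather than mechanical. One way to sketch that is a simple probability gate on the touch-switch state (`shouldChirp` and `chancePercent` are hypothetical; the real sketch would trigger Romibo’s SD-card audio playback when this returns true):

```cpp
#include <cassert>
#include <cstdlib>

// When any of the three touch switches is active, chirp with probability
// chancePercent, so repeated petting doesn't trigger a sound every time.
bool shouldChirp(bool touchActive, int chancePercent) {
    if (!touchActive) return false;
    return (std::rand() % 100) < chancePercent;  // chirp chancePercent% of the time
}
```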