NINA is a large-scale experimental robotics platform. Her main function at the moment is to explore applications for robotic social machines. A "social machine" is a term I coined for robots or computer characters that interact with humans socially, like a chatbot, but can also show expression visually. I'm trying to push the idea of social machines to benefit people with communication difficulties, such as those with Asperger Syndrome or "high-functioning" autism. It's understood that robots and computer characters like chatbots can goof up, however, so to optimize social machines I propose that products be connected to servers where hundreds of people constantly collaborate on improving them, tuning them to work with members of the "high-functioning" autism spectrum.
NINA is a little taller than four feet and weighs about 50 lbs. Her frame is constructed from aluminum and fastened together with nuts and bolts. Her treads are Lynxmotion tracks with AndyMark Inc. aluminum hubs.
Her motors are 12 V with a max speed of 95 RPM and over 100 ft-lbs of stall torque: small and lightweight, but still powerful enough to move her around without being too fast and dangerous. Her motor driver is a Sabertooth 2X25. For articulation, she is equipped with Hitec large-scale, standard, and micro servos, as well as ServoCity power servos of various ranges. Her servos give her a total of 13 degrees of freedom (1 at the hips, 6 in the arm, 2 in the neck, 4 for the eyebrows/eyelids). She uses a Lynxmotion Serial Servo Controller (the SSC-32) for servo control.
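Both boards are driven over plain serial. As a minimal sketch of what those commands look like, the helpers below build an SSC-32 ASCII move command ("#&lt;channel&gt;P&lt;pulse µs&gt;T&lt;time ms&gt;") and a Sabertooth packetized-serial frame (address, command, value, 7-bit checksum); the function names and the port in the usage note are my own illustrations, not NINA's actual code.

```python
def ssc32_move(channel, pulse_us, time_ms=None):
    """Build an SSC-32 move command, e.g. '#5P1500T1000\r'.

    channel  : servo channel (0-31)
    pulse_us : target pulse width in microseconds (~500-2500)
    time_ms  : optional time for the move, for smooth motion
    """
    cmd = "#%dP%d" % (channel, pulse_us)
    if time_ms is not None:
        cmd += "T%d" % time_ms
    return cmd + "\r"

def sabertooth_packet(address, command, value):
    """Build a Sabertooth packetized-serial frame.

    The checksum is the low 7 bits of the sum of the other three bytes.
    """
    checksum = (address + command + value) & 0x7F
    return bytes([address, command, value, checksum])
```

In use, the strings/bytes would be written to the board's serial port, e.g. `serial.Serial("COM3", 115200).write(ssc32_move(5, 1500, 1000).encode())` with pyserial (port name assumed).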
For vision, she uses two Cyber Snipa Spotter webcams. She is capable of detecting faces only right now, but will someday be able to range-find via stereo vision, among many other algorithms, using OpenCV. She is also being programmed to track and follow faces (it works, but needs a little fine-tuning). I may equip her with a third, smaller camera to make face detection easier. She recognizes voice commands and can synthesize speech using the Windows Speech Recognition engine and Cepstral's Robin computer voice. She can hold a very limited conversation about various topics like music, animals, jokes, and whatnot, while moving in various gestures to accompany her speech.
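The face-following part boils down to turning a detected face's position in the frame into small pan/tilt corrections for the neck servos. Below is a hypothetical helper for that step; the gain and deadband values are illustrative, not NINA's tuned numbers, and the bounding box would come from an OpenCV face detector.

```python
def face_to_servo_step(face_x, face_y, frame_w, frame_h,
                       gain=0.1, deadband=0.05):
    """Map a face center (pixels) to pan/tilt corrections in [-1, 1].

    The error is the face's offset from frame center, normalized to
    half the frame size. A small deadband keeps the neck from
    twitching when the face is already roughly centered.
    """
    ex = (face_x - frame_w / 2) / (frame_w / 2)
    ey = (face_y - frame_h / 2) / (frame_h / 2)
    pan = -gain * ex if abs(ex) > deadband else 0.0
    tilt = -gain * ey if abs(ey) > deadband else 0.0
    return pan, tilt
```

Each camera frame, the corrections would be added to the current neck servo positions, nudging the head toward the face.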
Sensors, besides her optics, include Sharp IR range finders, Parallax PING))) ultrasonic range finders, and Parallax PIR sensors. She uses an SC84 servo controller board for sensors (any port on this board can be turned into a sensor port, and it has accommodations for 36 ADCs). Most of the sensors are not mounted yet, but more will be mounted with her next refit.
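Reading the Sharp IR rangers means converting a raw ADC value into a distance. A sketch of that conversion, assuming a 10-bit ADC, a 5 V reference, and the commonly used power-law fit for the Sharp GP2D12 (d ≈ 27.86 · V^-1.15, an approximation, not a datasheet formula):

```python
def sharp_ir_cm(adc_value, adc_max=1023, vref=5.0):
    """Convert a raw ADC reading from a Sharp GP2D12 IR ranger to cm.

    Uses a widely cited approximate power-law fit of the sensor's
    voltage-vs-distance curve. Returns None when the voltage is below
    the sensor's reliable range (beyond roughly 80 cm).
    """
    v = adc_value * vref / adc_max
    if v <= 0.4:
        return None
    return 27.86 * v ** -1.15
```

The PING))) ultrasonic sensors work differently (echo pulse timing rather than an analog voltage), so they would need their own conversion.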
The NINA robot uses NiMH battery packs to power her motors, servos, and sensors. Someday, however, I may switch to LiFe packs to decrease the payload further.
For logic and processing, NINA currently runs off my laptop, but I plan to give her a Mini-ITX computer system for on-board computing (I use a BIG laptop).
NINA has performed and demonstrated her abilities twice, on Halloween, in the past couple of years. She is mastering shaking hands and waving hello/good-bye, but needs more fine-tuning with expressions. I need to learn more about servo-interpolated movement, as the current interpolation algorithm I wrote is a very crude one.
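One standard improvement over crude linear stepping is easing: ramping the servo speed up and back down so moves start and stop smoothly instead of jerking. A minimal sketch using a cosine ease-in/ease-out curve (this is a generic technique, not the interpolation NINA currently uses):

```python
import math

def ease_positions(start, end, steps):
    """Yield pulse widths from start to end along a cosine easing curve.

    The curve (1 - cos(pi*t)) / 2 goes from 0 to 1 with zero slope at
    both ends, so the servo accelerates and decelerates gradually.
    """
    for i in range(steps + 1):
        t = i / steps
        s = (1 - math.cos(math.pi * t)) / 2
        yield start + (end - start) * s
```

Feeding each yielded position to the SSC-32 at a fixed tick rate produces a smooth gesture; the SSC-32's own timed-move command achieves a similar effect in hardware for single moves.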
She is programmed in Python. The central Python script connects everything: the Lynxmotion SSC-32, OpenCV, Windows Speech Recognition and voice synthesis, the motor driver, and the sensors.
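The glue between recognized speech and gestures can be as simple as a keyword-to-handler table. The sketch below is a hypothetical illustration of that pattern (the names and phrases are mine, not NINA's command set):

```python
def make_dispatcher(handlers):
    """Return a function that routes a recognized phrase to a handler.

    handlers maps a keyword (e.g. "wave") to a zero-argument callable
    that performs the matching gesture or reply.
    """
    def dispatch(phrase):
        lowered = phrase.lower()
        for keyword, handler in handlers.items():
            if keyword in lowered:
                return handler()
        return None  # no keyword matched; ignore the phrase
    return dispatch
```

A real script would register handlers that send SSC-32 commands and queue synthesized speech, then call `dispatch` on each phrase the recognition engine returns.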
NINA's come a long way, and she still has a ways to go. In any regard, I will be excited to find out how far she goes and what she will be capable of.