- Open Lidar
Hello. I am new to robotics. I have heard that there are many hackspaces and groups around Detroit, MI. Which ones are the best for a new guy?
Just wondering if you have or could post your pic source somewhere?
Awesome robot! Intriguing idea about the tendons.
How well does it walk? Any videos yet?
I chose to emulate the J5/Wall-E look for NINA because it's popular with many people, and hopefully it will help people feel comfortable around NINA, since they know robots like Johnny-5 and Wall-E as compassionate characters.
Johnny-5 was a really interesting person for me as a kid. I related to him because in many ways Johnny-5 showed traits of Asperger Syndrome! He didn't always understand many important social concepts, he often echoed words and phrases as he heard them, like, "Fredrick, wait here. I'll be right back after these words!" Many with AS have a particular obsession that they are interested in. For Johnny-5, it was "Input!" Also, he had a hard time making friends, and was frightened of rejection and loneliness--in some ways I felt the same way as a kid, even though J5 and I both had a handful of loyal friends. The robot Johnny-5 kind of taught me it was okay to be funny and to laugh at yourself if you goof up, and that it was important to surround yourself with loyal friends who do accept you.
I like that NINA kinda looks like Johnny 5, always did like the triangle treads too.
Thanks, Bot-thoughts! Interesting thoughts on such a captcha system. Very intriguing.
NINA still has a long way to go, but her application as a social machine shows promise.
I'm actually going to try and get her on display with me as her operator at a local Autism conference in February where I'm going to push the idea of social machines and other forms of robotic autism therapy (like Keepon).
A couple of reasons why NINA would not make an "ideal" social machine: 1) she's a little too big and heavy and might be intimidating. 2) Her servos make a lot of noise. I have Asperger syndrome myself and servo gears don't bother me particularly, but another friend of mine who also has Asperger syndrome once told me to turn off one of my robots because she couldn't take it. (Certain sensory input can become unbearable to those of us on the autism spectrum.) NINA, however, is the only social machine I have.
If I can push the idea of social machines far enough, however, I propose these robots should either be smaller and quieter than NINA, or, for special applications, some social machines could also be animated computer characters. The best part about an actual robot social machine is that it's tangible and more interactive for some things.
I'm eager to prepare NINA for her day at the Autism Conference. I don't know if anyone on the spectrum will be there, besides Temple Grandin herself, so just to be on the safe side I'll have a function in her Python programming to home all joints to rest position for deactivation of her servos. She'll still be able to see and talk and listen--she'll just be immobilized for the sensory safety and security of those on the spectrum who might be bothered by servo gears. I'll upload another video once the whole refit is complete.
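A "sensory-safe" mode like that could be sketched roughly as below. This is a hypothetical illustration only: the joint names, rest angles, and the servo/power interfaces are all made up, since NINA's actual code isn't posted here.

```python
# Hypothetical sketch of a sensory-safe mode: home every joint to a rest
# pose, then cut servo power so the gears go quiet while speech and vision
# keep running. Joint names and angles are illustrative, not NINA's real ones.

REST_POSE = {
    "head_pan": 90, "head_tilt": 90,   # degrees; neutral positions
    "left_arm": 45, "right_arm": 45,
}

def sensory_safe_mode(servo_angles, power_off):
    """Command all joints to their rest positions, then de-energize."""
    for joint, angle in REST_POSE.items():
        servo_angles[joint] = angle      # drive each joint home
    power_off()                          # servos off: no more gear noise

# Minimal stand-ins so the sketch runs without hardware:
servo_angles = {}
powered = [True]
sensory_safe_mode(servo_angles, lambda: powered.__setitem__(0, False))
```

On real hardware you would replace the dictionary writes with your servo controller's position commands and add a short settling delay before cutting power.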
That's fantastic! I have an LG Ally (cheapo Android) and was thinking about using it as a robot brain but chickened out for now.
Wow, that is a truly impressive project! Interesting idea about collaborative human optimization of robot interaction. One interesting idea is that of Google's game for identifying pictures... people play a game while simultaneously helping to categorize pictures for search. Sort of a distributed human computing concept. Similarly, one of the captcha systems out there also serves to convert scanned words to text with human help. Maybe one could create a captcha system that includes your robot's expressions and humans can categorize them while simultaneously providing robot feedback (e.g. does intent match expression).
Would love to hear much more about this robot... and eager to hear of your progress with it!
Sorry, I forgot to hit reply before responding to your comment. :p
Very grateful for your words.
That's precisely what I feel about robots too. They should help us where we need them. Already robots are saving lives in search and rescue missions, and they're helping us learn about the outer planets. Robots like ASIMO are being researched to see how they can help the elderly or disabled.
I don't give robots credit for intelligence either. Computers will get faster and more powerful, but as long as their architecture remains dictated by transistors and relays (like a metropolis of dominoes), robots have no real intelligence. Social machines will not truly think and feel and be friends with their operators or users, but they can certainly act that way with the right programming. And as long as humans like to pretend robots like social machines are alive, they can still benefit somewhat.
Here's an example. Dolphin therapy is great, but I read an article once about a facility where children are so sick that they can't safely leave or travel from the facility where they reside. So the caring people who look after these children commissioned the construction of a robot dolphin to look, feel, sound, move, and act like a real one. The robot was given to the facility itself. The children benefit from this robot the same way they would from "real" dolphin therapy. It was a beautiful article.
Great project! Start getting the robots ready to walk among us and help us. This is definitely the type of ongoing research project that is necessary to find out the best way we, humans and robots, can interact. (BTW, I am not giving robots credit for intelligence, but both groups will end up reacting in some form.)
I used a Mini-ITX board but now Springy's Stamp-based (plus RC). Her basic program has her following and grasping a 38KHz Pringles can. :-) The "machine intelligence" theory is from David Heiserman's "How to Build Your Own Self-Programming Robot" (TAB 1979). You can buy it on Amazon. Thanks!
I've seen several videos of Keepon! How is this little robot progressing as far as its mission to assist those with autism spectrum disorder?
I understand that Keepon helps those on the autism spectrum make eye contact and focus better, I believe.
Reading about Keepon last year inspired me to explore robotic autism therapy for those on the "high functioning" end of the spectrum--like Asperger syndrome. These people struggle with social skills, for instance, and have a hard time making friends. Hence, it's easy for a lot of people with Asperger syndrome to be "happiest" when they're by themselves. I did hear that interacting with a computer is easier and more comfortable for them than interacting with people. So I wondered, what if robots or animated computer characters could provide therapy or assistance in making the transition from interacting with a computer to interacting comfortably with people, while teaching social skills and important non-tangible concepts along the way? A robot or computer character that can be social with people? A robot that could heal and strengthen psychologically by being a "friend?" Even if it's all an artificial thing, it's the benefit of the people that matters. It helps--and it's fun--to believe that an artificially intelligent robot is alive and talking to you and caring.
I'm exploring this idea with my experimental robotic platform NINA.
BTW, I have Asperger syndrome myself, so I love the thought of combining my interest in robotics with therapeutic activity for my diagnosis.
Looks very cool!
What mini ITX board did you use? What's your experience with them? I'm thinking about giving my robot, NINA, a mini ITX.
I'm interested to learn more about this alpha level intelligence theory. What sort of robot behaviors encompass that? Perhaps I'll stop by the library and see if I can find anything.
Please remove the waynegramlich account from this web site. I keep getting spammed from this site and every attempt to shut down the spam fails. I would delete the account myself, but there does not seem to be any way to do so.
My Johnny 5 Toy Robot can be found at:
If you've ever wanted a toy version of Johnny 5 then here's your chance!
Please check it out!
Plus, as your skill level permits, it can be converted with motors and electronics!
Just a note, I've now wrapped the hacking work that the community has created into an ROS driver:
Both ToF and triangulation are LIDAR. The concept is to shoot a specific beam of light from a laser (though technically you don't have to use a laser; you could use something like a pinhole-focused LED, but the range would be very short), then use the return from that specific beam to generate the results. Time of flight has been the most common approach until now because high-speed CMOS sensors were not available in the 1990s, while precise clocks and sensors that could register a hit fast enough were. Later, instead of using a clock, you could actually read the charge level of the recharging laser-pulse capacitor at the moment the sensor registered the hit and derive a distance reading, since moving electrons and light aren't too far off in speed.
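The time-of-flight math itself is just one line: the pulse travels out and back, so range is half the round-trip time multiplied by the speed of light. A minimal sketch:

```python
# Time-of-flight ranging: the beam travels to the target and back, so
# distance is half the round-trip time times the speed of light.

C = 299_792_458.0  # speed of light in vacuum, m/s

def tof_distance(round_trip_seconds):
    """Range in meters from a measured round-trip time in seconds."""
    return C * round_trip_seconds / 2.0

# A 10 ns round trip corresponds to roughly 1.5 m of range:
d = tof_distance(10e-9)
```

Note the timing precision this demands: resolving 1 cm of range means resolving about 67 picoseconds of round-trip time, which is why ToF units needed such fast clocks and detectors.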
BUT now CMOS sensors of sufficient speed and with global shutter make triangulation possible. (Global shutter is a way to "gate" the picture by recording all pixels simultaneously, so there's no reading in lines while the scene could be changing as you read.) Triangulation is less sensitive to beam scatter than ToF, and triangulation can actually be more accurate for sub-mm measurement.
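Triangulation replaces the fast clock with geometry: the laser and the sensor sit a fixed baseline apart, and the farther the target, the smaller the lateral offset of the laser dot on the sensor. A sketch with illustrative numbers (not any particular unit's actual optics):

```python
# Laser triangulation by similar triangles: a laser emitter and an imaging
# sensor are separated by a fixed baseline. The dot's lateral offset on the
# sensor shrinks as range grows. Numbers below are illustrative only.

def triangulation_distance(baseline_m, focal_len_m, dot_offset_m):
    """Range by similar triangles: distance = baseline * focal / offset."""
    return baseline_m * focal_len_m / dot_offset_m

# 5 cm baseline, 10 mm focal length, dot lands 0.1 mm off-axis -> 5 m:
d = triangulation_distance(0.05, 0.010, 0.0001)
```

The inverse relationship (offset proportional to 1/distance) is also why triangulation excels at short, sub-mm-accurate ranges but degrades quickly at long range.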
Okay, the simple answer: triangulation is a recognized method for LIght Detection And Ranging.
If you are referring to how the NEATO does its "magic" as far as LIDAR, read this paper:
IEEE International Conference on Robotics and Automation (ICRA 2008)
Issue Date: 19-23 May 2008
On page(s): 3002 - 3008
Location: Pasadena, CA
Print ISBN: 978-1-4244-1646-2
INSPEC Accession Number: 10014692
Digital Object Identifier: 10.1109/ROBOT.2008.4543666
Date of Current Version: 13 June 2008
It lays out the internals, the math, and how it was done. I am trying to find a source for the sensor. The sensor cited was a 752x16 linear CMOS that had global shutter and could be read quickly. It was selected because it was inexpensive in 2008 and fast enough to gather 4000 data points/second.
Here is some information out of the IEEE 2008 paper that lays out the NEATO design, "...The relative weight of fs is determined by the image sensor, so we first decide on it. The image sensor should have a short exposure time to improve ambient light rejection (Section II.E), and a large number of pixels for resolution of x. We chose a global-shutter CMOS sensor with 752 pixels of resolution and a minimum shutter time of 35μs. Each pixel is 6μm, and we expect to be able to resolve the laser dot to within 0.1 pixel or better." The paper lays out the math and the major variables.
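To get a feel for what those quoted numbers mean for accuracy, here is a rough sketch of how triangulation range resolution falls off with distance. The 6 μm pixel and 0.1-pixel dot resolution come from the quoted paper; the baseline and focal length are illustrative guesses, not values from the paper.

```python
# Rough range-resolution check using the sensor figures quoted above
# (6 um pixels, dot resolved to 0.1 pixel). Baseline and focal length
# below are illustrative assumptions, not the paper's actual design values.

PIXEL = 6e-6            # m, pixel pitch per the quoted paper
DOT_RES = 0.1 * PIXEL   # smallest resolvable dot movement on the sensor

def range_error(distance_m, baseline_m, focal_m, offset_res_m=DOT_RES):
    """Triangulation range error grows roughly quadratically with range:
    delta_d ~= d**2 * delta_x / (focal * baseline)."""
    return distance_m**2 * offset_res_m / (focal_m * baseline_m)

# With an assumed 5 cm baseline and 10 mm focal length:
e1 = range_error(1.0, 0.05, 0.010)   # error at 1 m (about a millimeter)
e5 = range_error(5.0, 0.05, 0.010)   # error at 5 m: 25x worse
```

This quadratic falloff is the practical trade-off the paper is working against, and it's why the sensor's sub-pixel dot resolution matters so much.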
The original sensor selected was by "Photonics in Motion". Unfortunately they were purchased, and I can't find the same type of sensor at the purchaser's site.
The paper lays out the math, making flexible software a possibility, and shows that a CMOS sensor with global shutter existed in 2008. Reading the paper is exciting. Lidar is available in several different configurations at costs far lower than a $400 vacuum. If people in this group and others can crack two issues (finding a sensor and actually getting units delivered, and finding usable slip rings), then 360°/10 Hz lidar is achievable.
Great work Xevel! Go ahead and post on the RoboDynamics blog post about the SLAM prize. As far as I know, nobody has claimed it yet. You might want to add some narration to the video to explain what's going on.
jbot: Thanks. In that one pic you're talking about, the robot is posed. The friction of the RC servo gears could hold the robot up, which is nice since standing then took very little power. The PCBs inside the RC servos were replaced by digital servo controllers of my own design, which could detect the floor and thus maintain a level walk over uneven terrain.
borah: My design was not hard to do. The golden parts of the leg are simple brass strips from the hobby shop, bent with pliers and drilled. I used socket head cap screws to put it together. Some of the servos are attached to each other by opening the servo cases and drilling a bit; it's not hard, and I explain this on my website. The base is made from PCB board and then painted gold and red. The CPU is on the underside of the robot. Simplicity is the key!
Lynxmotion has some good kits for making this style of hexapod so it's the easiest way to get up and running.
My digital servo design inspired openservo, a much more complete and better open-source digital servo controller. Check out http://www.openservo.org/
You can get more info on my website http://www.colinmackenzie.net/