So, we dusted off some of our oldest and most advanced robots and software for a photo session. Here is a summary of the work that you will soon be able to see in detail on our site, which in a way sums up the working areas of the R.E.A.L Group at UACh:
Part of our work covers Artificial Intelligence (A.I.), especially Neural Networks; we have a couple of papers applying ANNs to Domotics, among other things...
Some time ago, this kind of work was just called "Virtual Reality", but now, when you mix data from the real world into your virtual world, it is called "Augmented Reality"... So yes, we do have some work in this area too, again around Domotics and Environmental Algorithms: real data from devices such as lamps, doors/windows and Audio/Video equipment (DVD players, TV sets, etc.) is combined with virtual information in a simulated environment. Both worlds can work together or separately...
This one is our "BigFoot", an all-terrain vehicle. It has a 2D LASER range scanner that, with the addition of a couple of accelerometers, can build 3D point cloud maps of the surroundings. This allows the robot to navigate autonomously and map the areas it covers.
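For the curious, here is a minimal C# sketch of the idea (not the actual BigFoot code): each 2D scan lives in the scanner's own plane, and the tilt reported by the accelerometers rotates those points into 3D. The names, the axis convention and the single pitch angle are illustrative assumptions only.

```csharp
using System;
using System.Collections.Generic;

struct Point3D
{
    public double X, Y, Z;
    public Point3D(double x, double y, double z) { X = x; Y = y; Z = z; }
}

static class ScanToCloud
{
    // ranges: distances from one 2D LASER sweep (metres)
    // startAngle/angleStep: the scanner's angular sweep (radians)
    // pitch: tilt of the scan plane measured by the accelerometers (radians)
    public static List<Point3D> Project(double[] ranges, double startAngle,
                                        double angleStep, double pitch)
    {
        var cloud = new List<Point3D>();
        for (int i = 0; i < ranges.Length; i++)
        {
            double a = startAngle + i * angleStep;
            // point expressed in the scanner's own plane
            double px = ranges[i] * Math.Cos(a);
            double py = ranges[i] * Math.Sin(a);
            // rotate that plane about the Y axis by the measured pitch
            cloud.Add(new Point3D(px * Math.Cos(pitch), py, -px * Math.Sin(pitch)));
        }
        return cloud;
    }
}
```

Accumulating these small clouds while the platform moves (and the pitch changes) is what produces the full 3D map of the surroundings.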
This one is called "HAL", a recursive acronym for "HAL with Augmented Logics". But what does "HAL" stand for? Again, "HAL with Augmented Logics", so the "H" is always "HAL"... what a mess :-P It's one of those ways we nerds try to look clever :-S ... The reason is that with every iteration we want to add more knowledge about itself to HAL. It is currently at version 0003, which means we have another 8997 versions to go before our final goal of a fully self-conscious robot ;-)
With our 2D LASER range finder plus a couple of accelerometers, we can build 3D point cloud maps; this one is the head of our friend Patricio Palma, who sacrificed his head for science :-)
These three robot platforms, built from simple netbooks, bring robotics technology closer to any engineer, especially students: a simple, cheap netbook can be used as a robot vehicle.
We have worked on the idea of using a common PC as a robot since 2003, when it was still a pretty new idea; honestly, back then we were the second or third project developing it (at least publicly). Other companies took the idea more seriously and today even sell PC-based robotic platforms, like the PcBOT 914 from WhiteRobotics, but we came long before them with the PcBot name and the basic idea behind it... This one is our first version (third iteration) and one of our oldest robots; that's why we also call it "Matusalen" :-)
PcBot v1 came to life back in 2003 with its first iteration, when an IBM PC or compatible computer had become cheap enough to be dedicated to use as a robot. Before that, the mere idea of using a PC as a robot was crazy, given the cost and delicate components of the computers of those days; back then you had to develop special circuits to build a robot.
Following the same idea, when basic notebooks became cheap enough in 2005-200, we created our PcBot v2, or "NoteBot".
This one is pretty new: we built the basic chassis at the end of last year and finished it a couple of weeks ago. Rhexalion is a weird mix of the RHex hexapod robot concept from Boston Dynamics and some of the main ideas of the Dandelion rover design. Rhexalion is a bio-inspired design for rough terrain that performs the alternating tripod walking gait, but with just 4 wheel-legs instead of the 6 that most hexapod robots use. So far the tests are pretty good, but there is still a lot of room for improvement...
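Just to give a feel for the gait, here is a toy C# sketch of alternating-pair scheduling with four wheel-legs: the two diagonal pairs spin half a cycle out of phase, the same spirit as RHex's alternating tripod. The leg numbering and phase convention are illustrative only, not Rhexalion's real controller.

```csharp
using System;

static class AlternatingPairGait
{
    // legs 0 and 3 form one diagonal pair, legs 1 and 2 the other;
    // the two pairs run half a rotation out of phase
    static readonly double[] PhaseOffset = { 0.0, Math.PI, Math.PI, 0.0 };

    // returns the commanded rotation angle (radians) for a wheel-leg
    // at time t, given the duration of one full gait cycle
    public static double LegAngle(int leg, double t, double cycleTime)
    {
        double basePhase = 2.0 * Math.PI * (t % cycleTime) / cycleTime;
        return (basePhase + PhaseOffset[leg]) % (2.0 * Math.PI);
    }
}
```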

For about 3 years now, we have had a Bioloid Expert Kit biped robot platform. First we focused on bringing Linux-compatible software to this robot (you know, we love Linux, and the software from Robotis is Windows-only), and since last year we have been working on forward and inverse kinematics to improve the robot's biped walk. It is a very hard task and we are moving forward slowly, but it is a very interesting area in which we also want to apply A.I. and other techniques.
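As a flavour of the kinematics work, here is a minimal C# sketch of planar two-link inverse kinematics (hip plus knee), the kind of calculation involved in placing a foot. The link lengths, axes and the knee-forward solution branch are illustrative assumptions, not the Bioloid code itself.

```csharp
using System;

static class LegIK
{
    // l1: thigh length, l2: shin length; (x, y) is the desired foot
    // position relative to the hip. Angles are returned in radians.
    public static void Solve(double l1, double l2, double x, double y,
                             out double hip, out double knee)
    {
        // law of cosines gives the knee angle
        double c = (x * x + y * y - l1 * l1 - l2 * l2) / (2.0 * l1 * l2);
        c = Math.Max(-1.0, Math.Min(1.0, c));   // clamp: out of reach => fully stretched
        knee = Math.Acos(c);                     // knee-forward solution branch
        // hip angle = direction to the foot minus the offset the bent knee introduces
        hip = Math.Atan2(y, x)
            - Math.Atan2(l2 * Math.Sin(knee), l1 + l2 * Math.Cos(knee));
    }
}
```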
This Domotic Scene is also a mix between our Robotics and Automation SDK, called "monoBOTICS", and the Icarus Scene Engine from PointScape. The idea is that with simple drag-and-drop techniques you can create your office or home environment, and with a few settings you can configure your domotics or automation hardware. This then lets you, for instance, click on your virtual lamp and turn your real lamp ON or OFF, and the same for TV sets, DVD players, heaters and so on...
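Conceptually, the link between the two worlds is just a mapping from virtual scene objects to real devices. Here is an illustrative C# sketch; IDevice, the binding class and the method names are made up for the example and are not the actual monoBOTICS API.

```csharp
using System.Collections.Generic;

interface IDevice
{
    bool IsOn { get; }
    void TurnOn();
    void TurnOff();
}

class SceneBinding
{
    // maps an object id in the virtual scene to a real device
    readonly Dictionary<string, IDevice> map = new Dictionary<string, IDevice>();

    public void Bind(string virtualObjectId, IDevice device)
    {
        map[virtualObjectId] = device;
    }

    // called when the user clicks an object in the simulated environment
    public void OnVirtualObjectClicked(string virtualObjectId)
    {
        IDevice device;
        if (!map.TryGetValue(virtualObjectId, out device)) return;
        if (device.IsOn) device.TurnOff();
        else device.TurnOn();
    }
}
```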
Finally, also with the help of Icarus from PointScape, we can easily set up some VR hardware, like our custom DataGlove for research and development, focusing on the research you want to do (sign language, augmented reality or whatever) instead of spending time creating the virtual representation of the hand or figuring out how to access the incoming data...
Cheers!
UPDATE!!!
We almost forgot these two... On the left is a shot from the 3DSharpView software; the cool thing is that it uses mono-C# plus OpenNI with the Kinect depth camera to get the RGB and depth data, so it is cross-platform software working in a high-level language for map reconstruction, object analysis and recognition, among other things.
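To give an idea of what can be done with the depth data, here is a small C# sketch that projects one depth frame into 3D points with the pinhole camera model. The intrinsic values are typical figures quoted for the Kinect depth camera, and the frame is assumed to arrive as a flat array of millimetre readings; this is not the actual 3DSharpView code.

```csharp
using System.Collections.Generic;

static class DepthToPoints
{
    // assumed/illustrative Kinect depth intrinsics (pixels)
    const double Fx = 594.2, Fy = 591.0;
    const double Cx = 320.0, Cy = 240.0;

    // depth: row-major depth frame in millimetres, width x height pixels
    public static List<double[]> ToCloud(ushort[] depth, int width, int height)
    {
        var cloud = new List<double[]>();
        for (int v = 0; v < height; v++)
            for (int u = 0; u < width; u++)
            {
                ushort d = depth[v * width + u];
                if (d == 0) continue;        // 0 means "no reading" at this pixel
                double z = d / 1000.0;       // metres
                double x = (u - Cx) * z / Fx;
                double y = (v - Cy) * z / Fy;
                cloud.Add(new double[] { x, y, z });
            }
        return cloud;
    }
}
```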
This one is a cool adaptation of a power wheelchair: with a simple PC interface and a basic netbook, you can control the chair by keyboard, mouse pad, Wiimote and face recognition.
The first two are simple, but with a traditional Wiimote things get interesting. As a wireless device it is a very comfortable, lightweight and configurable gadget for controlling the chair. Users can adjust the movements that are most comfortable for them to operate the chair, and the sensitivity of the Wiimote can also be adjusted. They can attach the device to an arm, a leg or any other part of the body they can move easily.
The face recognition is kind of tricky, since depth information is not easy to get with a simple 2-megapixel netbook camera; the trick is rather to set the proper amount of face movement (up, down, left and right) required to produce a chair movement. It is a very early implementation, but it has proven useful in certain scenarios, and there is a lot of room for improvements and future development.
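A rough C# sketch of that face-movement mapping: the detected face centre is compared against a calibrated rest position, and only displacements beyond a dead zone produce a command. The thresholds, command names and axis conventions are illustrative only, not the actual wheelchair software.

```csharp
using System;

enum ChairCommand { Stop, Forward, Backward, Left, Right }

class FaceDrive
{
    readonly double restX, restY;   // face centre at calibration time (pixels)
    readonly double deadZone;       // minimum displacement before the chair moves

    public FaceDrive(double restX, double restY, double deadZone)
    {
        this.restX = restX; this.restY = restY; this.deadZone = deadZone;
    }

    // faceX/faceY: current face centre from the webcam face detector
    public ChairCommand Update(double faceX, double faceY)
    {
        double dx = faceX - restX;
        double dy = faceY - restY;   // image Y grows downwards, so "face down" gives dy > 0
        if (Math.Abs(dx) < deadZone && Math.Abs(dy) < deadZone)
            return ChairCommand.Stop;
        if (Math.Abs(dx) > Math.Abs(dy))
            return dx > 0 ? ChairCommand.Right : ChairCommand.Left;
        return dy > 0 ? ChairCommand.Backward : ChairCommand.Forward;
    }
}
```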
A picture of the gang...