October 6th, 2013
First SkyNet came for the jellyfish…
[This] is a team of unmanned swimming robots designed to scour an area and grind up all the jellyfish they find. And they've got the chops (literally) to suck up jellyfish at a rate of 900 kilograms – nearly 2,000 pounds – an hour.
The invention comes from the Korea Advanced Institute of Science and Technology. Engineer Hyeon Myeong and colleagues developed it to help clear fishing waters of jellyfish blooms. […]
October 2nd, 2012
Cooperative Quadrocopter Ball Throwing and Catching:
OK, so now it's cute and amusing and quietly impressive. A decade from now, when the AI-driven quadrocopter is employing those same subroutines to hunt down the remnants of the human resistance … not so much.
August 20th, 2011
Never mind Wall-E, they've got 790! And a DRD! And a Smash Mash robot!
For the record, I have to lodge an objection to the inclusion of a Dalek and a Cyberman in a collection of robots. You can argue that a Cyberman qualifies because the brain is apparently just being used as a fancy CPU, but a Dalek is unquestionably a battle suit being driven by a little mutated Kaled, so it definitely doesn't qualify as a robot IMHO. Details like this matter, dammit!
August 9th, 2011
Matt Jones has posted a summary of a recent talk he gave pulling together his thoughts about The Robot-Readable World.
Robot-Readable World is a pot to put things in, something that I first started putting things in back in 2007 or so.
At Interesting back then, I drew a parallel between the Apple Newton's sophisticated, complicated hand-writing recognition and the Palm Pilot's approach of getting humans to learn a new way to write, i.e. Graffiti.
The connection I was trying to make was that there is a deliberate design approach that makes use of the plasticity and adaptability of humans to meet computers (more than) half way.
Connecting this to computer vision and robotics I said something like:
"What if, instead of designing computers and robots that relate to what we can see, we meet them half-way – covering our environment with markers, codes and RFIDs, making a robot-readable world"
The entire post is packed with fascinating ideas and links to other writings on the topic, including one to this terrific BLDGBLOG post on The New Robot Domesticity that I happened upon earlier the same day I read Matt Jones' piece.
We live in interesting times.
March 29th, 2011
Quadrocopter Ball Juggling. Apparently the quadrocopters themselves aren't doing the motion tracking – that's being handled by the infrastructure of the arena, with course corrections presumably being sent to the quadrocopters on the fly (so to speak.)
Even if the 'copters aren't fully autonomous, it's still an impressive sight.
One day, a descendant of one of those quadrocopters is going to be equipped with live munitions and hooked up to a powerful AI by some ambitious soldier, at which point we'd best hope that the damned thing decides to emulate a Culture Mind rather than, say, SkyNet.
We should be nice to the quadrocopters, in the hope that they'll tell their grandchildren to keep some of the apes around…
January 28th, 2011
Long Exposure Pictures Of Robots Cleaning. Fascinating to see how differently the three robots tackle the job. The Roomba may take longest – and have the action that least closely resembles the approach a human would take to vacuuming that space – but I find its pattern by far the most appealing of the three.
I should probably never consider buying a robot cleaner, because it certainly wouldn't save me any time. For the first several months – at an absolute minimum – I'd be observing it in action instead of leaving it to get on with the cleaning; moving furniture around between cleaning sessions in an attempt to pose it a new problem, trying to work out the algorithm it was using, making predictions about which way it'd turn next.
[Via Russell Davies, via Interconnected. Both posts are well worth a read in their own right.]
January 22nd, 2011
James Auger and Jimmy Loizeau have been thinking about how to build a [better|bigger] mouse trap:
The Mouse Trap Coffee-table Robot:
A mechanised iris is built into the top of a coffee table. This is attached to an infra red motion sensor. Crumbs and food debris left on the table attract mice, who gain access to the tabletop via a hole built into one oversize leg. Their motion activates the iris and the mouse falls into the microbial fuel cell housed under the table. This generates the energy to power the iris motor, sensor and an LED graphic display on the front of the tabletop.
October 9th, 2010
Reading this …
In its self-taught exploration of Internet English, NELL is 87 percent correct. And the more it learns, the more accurate it will become. According to a paper called "Toward an Architecture for Never-Ending Language Learning," NELL has two tasks: to read, and to learn from that reading — to "learn to read better each day than the day before…go[ing] back to yesterday's text sources and extract[ing] more information more accurately."
… and this …
The US National Nuclear Security Administration recently announced that it has started using autonomous robot vehicles to patrol the vast desert surrounding its Nevada National Security Site (NNSS). The 1360+ square miles of territory is home to millions of tons of low grade nuclear waste, as well as Cold War Era nuclear weapons, and cutting edge nuclear testing research. Guarding those precious nuclear materials is the Mobile Detection Assessment Response System (MDARS) robot, which is essentially a camera on a mini-Hummer. The MDARS can roam and scout the desert on its own, alerting a remote operator when it encounters something that shouldn't be there (two headed coyote?).
… I can only concur with MeFi commenter The Whelk:
Have these people never seen a sci-fi movie ever?
July 28th, 2010
A cry for funding if ever I heard one:
[…] Writing in IEEE Computer, Professor Noel Sharkey, from the University of Sheffield's Department of Computer Science, along with former Crimewatch presenter Nick Ross and Senior Interpol Advisor, Marc Goodman, warn of a coming robot crime wave in which military and police robots could be open to abuse from criminals.
Professor Sharkey urges fellow scientists and engineers working in robotics to be mindful of crime prevention and build in components in the software to assist with forensic analysis. He and his co-authors call for the police to consider building information databases that could track and trace robot crime, similar to our current fingerprint database system.
Professor Sharkey said: "Robots could assist a vast range of crime from drugs vending to assault and murder to voyeurism and burglary. Robots can't even be detected by the passive IR alarm systems in most of our houses. More pressing though, is the danger that criminals or terrorists will hack into armed military or police robots and pose a threat to life."
"The new crime wave might be 10 or 20 years away, but we should have no doubt it is coming. Robots will be used for crimes because they offer two elements that have always promoted crime: temptation and opportunity. We must act quickly and decisively to head off a pandemic of robot crime."
[Via Kevan Davis]
July 12th, 2009
Courtesy of a Making Light comment thread: When Calvins collide!
John Calvin and Susan Calvin, that is.
April 11th, 2009
Tweenbots are human-dependent robots that navigate the city with the help of pedestrians they encounter. Rolling at a constant speed, in a straight line, Tweenbots have a destination displayed on a flag, and rely on people they meet to read this flag and to aim them in the right direction to reach their goal.
My favourite read of the week, by a mile.