Monday, November 28, 2011

Kinect 2 rumored for next Xbox will read lips, sense your emotion, judge you

When Kinect launched last year, its potential was already compromised. Instead of 640x480 color plus depth, we got something closer to 320x240 plus depth (at 30fps), half or less of what the hardware is actually capable of. The culprit is the USB 2.0 controller interface, which should deliver 35MB/s. I say "should" because it has to share that bandwidth with the console's other USB devices, which lowers it to 5/16MB/s. Microsoft did release a patch that seemed to help the issue, but it is very secretive about how much of the improvement comes from data compression.
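
As a sanity check on those numbers, here's a back-of-the-envelope estimate of why full-resolution color plus depth at 30fps would strain a shared USB 2.0 budget while the halved depth stream fits. Note this assumes 8-bit color and 16-bit depth per pixel for simplicity; the Kinect's actual pixel packing differs.

```python
# Rough per-stream bandwidth: pixels * bytes-per-pixel * frames-per-second.
def stream_mb_per_s(width, height, bytes_per_px, fps):
    return width * height * bytes_per_px * fps / 1e6

# Assumed formats: 1-byte color, 2-byte depth (illustrative, not exact).
full = stream_mb_per_s(640, 480, 1, 30) + stream_mb_per_s(640, 480, 2, 30)
half = stream_mb_per_s(640, 480, 1, 30) + stream_mb_per_s(320, 240, 2, 30)

print(f"640x480 color + 640x480 depth: {full:.1f} MB/s")
print(f"640x480 color + 320x240 depth: {half:.1f} MB/s")
```

Even with these simplified assumptions, the full-resolution pair lands well above a bandwidth budget that must be shared with other devices, while quartering the depth resolution brings it down to roughly half.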

With that in mind, Eurogamer reports that Kinect 2 is rumored to have higher-resolution image and IR sensors, so it would naturally need a much faster bandwidth pipeline from the start. Eurogamer's source states that the accuracy of Kinect 2 is "so powerful it will enable games to lip read, detect when players are angry, and determine in which direction they are facing."

Also rumored lately are two future Xbox SKUs. One is being referred to as XboxTV, with stripped-down capabilities and a Kinect gaming portal (why not just reconfigure a 360?); the other is a hardcore games machine called Infinity or Loop, which would be backward compatible. Eurogamer also cites a past Wii-like controller concept, also named Loop. A reconfigured Xbox controller made to be used in single- or dual-handed form would make plenty of sense to me, and would really open up hybrid gameplay with Kinect and controller(s) in tandem. The new consoles are rumored to be unveiled at E3 next summer, with some speculating a winter 2012 release and others suggesting winter 2013.

Whether or not this Kinect 2 is the same as the upcoming Kinect for Windows remains to be seen, but one would assume some sort of consolidation would keep costs down.

Friday, November 25, 2011

DisplAir combines projection, fog machine, IR sensor, magic

A Russian start-up has turned a sci-fi interface into reality. Astrakhan-based DisplAir has demonstrated a GUI projected onto a curtain of fog, with an IR camera tracking hand movements in a Kinect-style interface. The IR camera can pick out hand movements as small as 1cm, making it more sensitive than the Xbox-based device, even though there seems to be more lag. Hopefully the forthcoming Kinect for PC, with its zoom feature, will do as well or better.

I'm not sure why exactly this is any more efficient than a touch screen display, but it is indeed much cooler.

Source: Electronista

Wednesday, November 23, 2011

Kinect for Windows gets new hardware, ready for its close-up

"Since announcing a few weeks ago that the Kinect for Windows commercial program will launch in early 2012, we’ve been asked whether there will also be new Kinect hardware especially for Windows. The answer is yes; building on the existing Kinect for Xbox 360 device, we have optimized certain hardware components and made firmware adjustments which better enable PC-centric scenarios. Coupled with the numerous upgrades and improvements our team is making to the Software Development Kit (SDK) and runtime, the new hardware delivers features and functionality that Windows developers and Microsoft customers have been asking for.
Simple changes include shortening the USB cable to ensure reliability across a broad range of computers and the inclusion of a small dongle to improve coexistence with other USB peripherals. Of particular interest to developers will be the new firmware which enables the depth camera to see objects as close as 50 centimeters in front of the device without losing accuracy or precision, with graceful degradation down to 40 centimeters. 'Near Mode' will enable a whole new class of 'close up' applications, beyond the living room scenarios for Kinect for Xbox 360. This is one of the most requested features from the many developers and companies participating in our Kinect for Windows pilot program and folks commenting on our forums, and we’re pleased to deliver this, and more, at launch"
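
To make that Near Mode envelope concrete, here's a toy Python sketch that buckets a depth reading using the figures quoted above. This is not the actual SDK API; the function name and category labels are made up for illustration.

```python
# Illustrative only: classify a depth reading against the quoted Near Mode
# envelope -- full accuracy from 50 cm out, graceful degradation down to 40 cm.
def classify_near_mode_depth(cm):
    if cm < 40:
        return "too close"  # below the degraded floor, no reliable reading
    if cm < 50:
        return "degraded"   # usable, but with reduced accuracy/precision
    return "accurate"       # within the fully supported range

print(classify_near_mode_depth(45))
```

The interesting design point is the middle band: rather than a hard cutoff at 50cm, the firmware reportedly keeps returning (less precise) data down to 40cm, which is friendlier for desktop-distance applications.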
Click here for the rest of the story at: MSDN

Thursday, November 17, 2011

Wearable camera + HMD + trainer = remote human assist

It's augmented reality, or AR, for the workplace. I can't find another term, so I made up "remote human assist," or RHA. I'll leave the rest up to Japan's Institute of Advanced Industrial Science and Technology, which came up with this AR concept.

I'll try to make this as uncomplicated as possible. You have two humans, each wearing a GoPro HD camera mounted on a helmet. Both also wear an HMD, or Head Mounted Display, showing the action as video. Basically, the trainer is viewing the trainee's line of sight. The trainer then demonstrates a hand movement showing where the trainee is supposed to put his own hand, walking him through a task. The trainer's colored glove is superimposed over the trainee's real-time image.
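
That superimposition step can be sketched as a simple color-key composite. This is my own illustrative Python, assuming the glove is a single known color, and is not AIST's actual pipeline.

```python
# Wherever the trainer's frame shows the colored glove (here, pure green),
# copy that pixel over the trainee's live image; everywhere else the
# trainee's own view shows through.
GLOVE = (0, 255, 0)  # assumed marker color

def superimpose(trainee_frame, trainer_frame):
    return [
        [t_px if t_px == GLOVE else s_px
         for s_px, t_px in zip(s_row, t_row)]
        for s_row, t_row in zip(trainee_frame, trainer_frame)
    ]

# Tiny 2x3 example: the glove occupies the middle column of the trainer view.
trainee = [[(10, 10, 10)] * 3 for _ in range(2)]
trainer = [[(0, 0, 0), GLOVE, (0, 0, 0)] for _ in range(2)]
out = superimpose(trainee, trainer)
```

A real system would key on a color range rather than an exact value, and would need the two camera views registered to each other first; the one-color match here just shows the idea.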

This could come in handy if, say, your airline pilot gets sick from eating the fish and you have to land the plane yourself... or you walk into a situation where you have only minutes to disarm a bomb and one of these units happens to be lying there. Seriously though, I guess there are times when a trainer can't be following a trainee, or multiple trainees, all of the time. This appears to be designed for hostile environments where, um, the trainee's life is more important? OK, I think you get the picture.

Video after the break:

Tuesday, November 15, 2011

Robots command humans to do their bidding. Welcome to the future

Researchers have completed tests in which robots command blindfolded victims, er, humans hooked up to electrodes to perform tasks for them. Getting a ball into a basket is just the beginning. They hope to use this for rehabilitation, for people who can't get certain muscles to work properly on their own. That should hold them over at least until the robot apocalypse.

Video after the break:

Friday, November 11, 2011

Honda's ASIMO robot is not just smarter, but fitter, happier, more productive than ever before

Honda's AWESOME-O ASIMO robot has gone through some changes recently. New behavior-control technology means the robot can now make decisions on its own, independent of human input. Honda says ASIMO can choose its next move by combining long-term and short-term sensor data, including recognizing human gestures and speech, thanks to its onboard set of visual and auditory sensors.

The robot, having recently shed 13.23 lbs (6 kg), can now run at a maximum of 5.9 mph (9 km/hr) and jump up and down indefinitely. ASIMO also now has independent finger movement control, making him able to handle much more delicate tasks.

Videos after the jump

Don't call it K-9. NSK develops robotic guide dog. Now with Kinect-O-Vision!

Japanese manufacturer NSK, maker of robot-related parts including ball bearings, has announced that it is developing a quadrupedal robot that could serve as a guide dog for the blind. Now working closely with the UEC, they are hard at work on a prototype, which uses a Microsoft Kinect sensor to detect stairs and other obstacles. New features include voice recognition, so that the robot can easily be commanded to start, stop, and move up and down stairs.

Sunday, November 6, 2011

Microsoft promotes "Kinect Effect"

What is almost frightening is that if this video had been posted a year and a half ago, most people would have considered it a piece of concept-future from Microsoft, one with only a toe dipped in the waters of reality. It's quite obvious now that this video only scratches the surface of what can be done right at this moment, in entertainment as well as in science and practical applications. It seems the future needs to start picking up the pace a bit, because we are catching up with it very quickly.