Tuesday, December 20, 2011

IBM touts mind control as next big thing in interface


IBM seems to think that talking to your iPhone or Xbox through Siri or Kinect is so this year. The next big thing, in their view, is reading your mind: they predict that within five years, the hardware to do it will be small enough and cheap enough to become mainstream.

Typing looks like it has a real chance of being supplanted by speech recognition pretty rapidly. Still, I'm not so sure I like the idea of bypassing natural human gesture entirely just to type faster, or of changing the channel by simply thinking about it. My, how far we've come since The Clapper.

Link: IBM Research News

Friday, December 2, 2011

Kinect interface to be built into smart TVs?

This article over at The Daily, which I seem to have missed last week, states that some sort of Kinect TV is in the works. It also mentions Sony as one of the manufacturers. That seems right in line with the next logical step for Kinect. As for Kinect in a Sony TV, I'll believe it when I see it.

Monday, November 28, 2011

Kinect 2 rumored for next Xbox will read lips, sense your emotion, judge you


When Kinect launched last year, its potential was already compromised. Instead of 640x480+depth, we got something more like 320x240+depth (at 30fps), half or less of what the hardware is actually capable of. The culprit is the USB 2.0 controller interface, which should provide about 35MB/s. I say "should" because Kinect has to share that bandwidth with the other USB devices on board, which cuts the share actually available to it to roughly 15-16MB/s. Microsoft did release a patch that seemed to help, but it has been very secretive about how much of the improvement comes from data compression.
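To see why that matters, here is a back-of-the-envelope bandwidth estimate. The resolutions and frame rate come from the paragraph above; the bit depths (8-bit color, 11-bit depth) are my own assumptions for the raw, uncompressed streams, not official figures:

```python
# Rough Kinect bandwidth estimate. Bit depths are assumptions, not official specs.
FPS = 30
MB = 1024 * 1024

def stream_mb_per_s(width, height, bits_per_pixel, fps=FPS):
    """Raw (uncompressed) bandwidth of one video stream in MB/s."""
    return width * height * bits_per_pixel / 8 * fps / MB

full    = stream_mb_per_s(640, 480, 8) + stream_mb_per_s(640, 480, 11)  # ~20.9 MB/s
reduced = stream_mb_per_s(320, 240, 8) + stream_mb_per_s(320, 240, 11)  # ~5.2 MB/s

print(f"640x480 color+depth: {full:.1f} MB/s")    # blows past Kinect's slice of USB 2.0
print(f"320x240 color+depth: {reduced:.1f} MB/s") # fits with headroom to spare
```

Even with generous assumptions, the full-resolution pair overruns the shared-bus budget, which is presumably where the compression Microsoft won't talk about comes in.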

With that in mind, Kinect 2 is rumored over at Eurogamer to have higher-resolution image and IR sensors, so it would naturally need a much higher-bandwidth pipeline from the start. Eurogamer's source claims the new sensor will be "so powerful it will enable games to lip read, detect when players are angry, and determine in which direction they are facing."

Also rumored lately are two future Xbox SKUs. One is being referred to as XboxTV, with stripped-down capabilities and a Kinect gaming portal (why not just reconfigure a 360?), and the other is a hardcore games machine called Infinity or Loop, which would be backward compatible. Eurogamer also cites a past Wii-like controller concept, likewise named Loop. A reconfigured Xbox controller designed to be used in single- or dual-handed forms would make plenty of sense to me; it would really open up hybrid gameplay with Kinect and controller(s) working in tandem. The new consoles are rumored to be unveiled at E3 next summer, with some speculating a winter 2012 release and others suggesting winter 2013.

Whether or not this Kinect 2 is the same device as the upcoming Kinect for Windows remains to be seen, but one would assume some sort of consolidation would keep costs down.

Friday, November 25, 2011

DisplAir combines projection, fog machine, IR sensor, magic

A Russian start-up has turned a sci-fi interface into reality. Astrakhan-based DisplAir has demonstrated a GUI projected onto a curtain of fog, with an IR camera tracking hand movements, Kinect-style. The IR camera can pick out hand movements as small as 1cm, making it more sensitive than the Xbox-based device, even though there appears to be more lag. Hopefully the forthcoming Kinect for PC, with its zoom feature, will do as well or better.

I'm not sure why exactly this is any more efficient than a touch screen display, but it is indeed much cooler.




Source: Electronista

Wednesday, November 23, 2011

Kinect for Windows gets new hardware, ready for its close-up


"Since announcing a few weeks ago that the Kinect for Windows commercial program will launch in early 2012, we’ve been asked whether there will also be new Kinect hardware especially for Windows. The answer is yes; building on the existing Kinect for Xbox 360 device, we have optimized certain hardware components and made firmware adjustments which better enable PC-centric scenarios. Coupled with the numerous upgrades and improvements our team is making to the Software Development Kit (SDK) and runtime, the new hardware delivers features and functionality that Windows developers and Microsoft customers have been asking for.
Simple changes include shortening the USB cable to ensure reliability across a broad range of computers and the inclusion of a small dongle to improve coexistence with other USB peripherals.  Of particular interest to developers will be the new firmware which enables the depth camera to see objects as close as 50 centimeters in front of the device without losing accuracy or precision, with graceful degradation down to 40 centimeters.  “Near Mode” will enable a whole new class of “close up” applications, beyond the living room scenarios for Kinect for Xbox 360. This is one of the most requested features from the many developers and companies participating in our Kinect for Windows pilot program and folks commenting on our forums, and we’re pleased to deliver this, and more, at launch"
Click here for the rest of the story at: MSDN
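To make those Near Mode numbers concrete, here's a toy sketch of how an application might bucket a depth reading using only the distances quoted above (the 50 cm and 40 cm thresholds come from Microsoft's post; the categories and the code are mine, not part of the SDK):

```python
def classify_near_mode_reading(distance_m):
    """Bucket a depth reading using the distances quoted in the announcement above.

    Toy illustration only: the 0.5 m / 0.4 m thresholds come from Microsoft's
    post, but the categories and this function are mine, not part of the SDK.
    """
    if distance_m >= 0.5:
        return "full accuracy"          # Near Mode sees this without losing precision
    if distance_m >= 0.4:
        return "graceful degradation"   # still usable, accuracy falls off
    return "too close"                  # below the stated minimum

for d in (0.80, 0.55, 0.45, 0.30):
    print(f"{d:.2f} m -> {classify_near_mode_reading(d)}")
```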

Thursday, November 17, 2011

Wearable camera + HMD + trainer = remote human assist


It's augmented reality, or AR, for the workplace. I can't find an existing term for this particular flavor, so I made one up: remote human assist, or RHA. I'll leave the rest up to Japan's Institute of Advanced Industrial Science and Technology (AIST), who came up with this AR concept.

I'll try to make this as uncomplicated as possible. Two people each wear a helmet-mounted GoPro HD camera and an HMD (head-mounted display) showing live video. The trainer sees the trainee's line of sight, then demonstrates with a hand gesture where the trainee is supposed to put his own hand, walking him through the task. The trainer's colored glove is superimposed over the trainee's real-time image.
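The superimposing step is, at heart, chroma keying: find the trainer's brightly colored glove in their camera feed and paste those pixels over the trainee's view. A minimal OpenCV sketch of the idea, assuming a green glove and same-sized frames (my illustration, not AIST's actual pipeline):

```python
import cv2
import numpy as np

def overlay_glove(trainee_frame, trainer_frame, lo=(40, 80, 80), hi=(80, 255, 255)):
    """Paste the trainer's colored glove onto the trainee's camera view.

    Sketch only: assumes a green glove, same-sized BGR frames, and a simple
    HSV threshold. AIST's actual pipeline is not described in this post.
    """
    hsv = cv2.cvtColor(trainer_frame, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, np.array(lo, np.uint8), np.array(hi, np.uint8))  # glove-colored pixels
    mask = cv2.medianBlur(mask, 5)                                           # remove speckle
    out = trainee_frame.copy()
    out[mask > 0] = trainer_frame[mask > 0]                                  # superimpose the glove
    return out
```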

This could come in handy if, say, your airline pilot gets sick from eating the fish and you have to land the plane yourself... or you walk into a situation where you have only minutes to disarm a bomb and one of these units happens to be lying there. Seriously though, I guess there are times when a trainer can't follow a trainee, or multiple trainees, around all of the time. This appears to be designed for hostile environments where, um, the trainee's life is more important? OK, I think you get the picture.

Video after the break:

Tuesday, November 15, 2011

Robots command humans to do their bidding. Welcome to the future


Researchers have completed tests in which robots command blindfolded victims, er, humans, hooked up to electrodes to perform tasks for them. Getting a ball into a basket is just the beginning. They hope to use this for rehabilitation, for people who can't get certain muscles to work properly. That should hold them over at least until the robot apocalypse.

Video after the break:

Friday, November 11, 2011

Honda's ASIMO robot is not just smarter, but fitter, happier, more productive than ever before


Honda's AWESOME-O ASIMO robot has gone through some changes recently. New behavior control technology means that the robot can now make decisions on its own, independent of human input. Honda says that ASIMO can choose its next move by combining long-term and short-term sensor data, including recognizing human gestures and speech, thanks to its onboard set of visual and auditory sensors.

The robot, having recently shed 13.23 lbs (6 kg), can now run at a maximum of 5.6 mph (9 km/h) and jump up and down continuously. ASIMO also now has independent finger control, letting it handle much more delicate tasks.

Videos after the jump

Don't call it K-9. NSK develops robotic guide dog. Now with Kinect-O-Vision!



Japanese manufacturer NSK, maker of robot-related parts including ball bearings, has announced that it is developing a quadrupedal robot that could serve as a guide dog for the blind. Now working closely with the University of Electro-Communications (UEC), NSK is hard at work on a prototype that uses a Microsoft Kinect sensor to detect stairs and other obstacles. New features include voice recognition, so the robot can be commanded to start, stop, and move up and down stairs.

Sunday, November 6, 2011

Microsoft promotes "Kinect Effect"






What is almost frightening is that if this video had been posted a year and a half ago, most people would have considered it a concept piece from Microsoft, one with only a toe dipped in the waters of reality. It's quite obvious now that the video only scratches the surface of what can be done right at this moment, in entertainment as well as in science and practical applications. The future needs to start picking up the pace, because we're catching up with it very quickly.

Friday, October 21, 2011

HoloDesk shows more brilliant R&D from Microsoft Research

Microsoft Research keeps one-upping themselves. Again, none of the component technologies are particularly new. It's as if Microsoft uses its research arm to brainstorm combinations of existing tech until the right mix produces something practical and natural. Once they have the right set of pieces and it proves to be a viable product, they whittle it down into a finished one (like Kinect). There are many practical applications for this format, and it could also make quite a training tool.



So what makes up HoloDesk? Let's see: there's a partially reflective mirror that you can see through but that also reflects images from a display mounted above it. A Kinect-like camera handles hand tracking to follow your movements and face tracking to position the holographic effect where it needs to be, based on your relative line of sight.
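That face-tracking step is essentially head-coupled perspective: given where your eyes are, work out where on the see-through mirror a virtual point has to be drawn so it appears to float in your hands. A toy version of that geometry, with the mirror treated as the plane z = 0 (my simplification, not Microsoft Research's actual calibration):

```python
import numpy as np

def draw_position_on_mirror(head, virtual_point, mirror_z=0.0):
    """Where on the mirror plane (z = mirror_z) a virtual point must be rendered
    so that, from the tracked head position, it appears to sit behind the glass.

    Toy geometry: positions are 3-D coordinates in metres, and the half-silvered
    mirror is idealized as the flat plane z = mirror_z. The real HoloDesk
    calibration is far more involved.
    """
    head = np.asarray(head, dtype=float)
    virtual_point = np.asarray(virtual_point, dtype=float)
    ray = virtual_point - head
    t = (mirror_z - head[2]) / ray[2]   # where the eye-to-object ray crosses the mirror plane
    return head + t * ray

# Example: eye 40 cm above the mirror, virtual cube 10 cm below it
print(draw_position_on_mirror(head=(0.0, 0.0, 0.4), virtual_point=(0.05, 0.05, -0.1)))
```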

I really like this new Microsoft. They seem to have a heart and soul, and they believe in bringing cool stuff out of science fiction into real life. Gone, it seems, is the drab, boring Microsoft of old. The change has probably been under way since around the time they designed the Xbox 360, but it didn't take hold across the company as a whole until much more recently. Good timing, too. With the rise of smartphones and now tablets, they are becoming the ones having to fight for market share. Still, they are on course to a much more appealing brand identity. They have always been a software company, but something sparked in them with the original Xbox: the realization that you need new technology to grow the software. A lot of their newest innovations seem to strike a harmony between the two.

Wednesday, October 19, 2011

Microsoft Research turns your arm, notepad into an interactive display

Combining Kinect and a pico projector worn on your shoulder, Microsoft does exactly what the title suggests with OmniTouch.




I've seen this kind of automatic display realignment calibration done before; in fact, it was back in 2007 from Johnny Lee, who also worked for Microsoft for a stint. His experimental design, though, came before pico projectors existed and before Kinect (which he had a hand in developing) was available. So now you get portability (the pico projector) in conjunction with a touch interface (the Kinect-like short-range camera/sensor). I would expect this technology to need to get much smaller before it is taken seriously, and that is very likely to happen, since this version uses mostly off-the-shelf parts.
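The realignment itself boils down to a perspective warp: once the depth camera locates the four corners of a surface (a palm, a notepad), the UI image is warped so it lands squarely on that surface in projector space. A rough OpenCV sketch of just that step, with invented corner coordinates (OmniTouch's actual pipeline isn't detailed here):

```python
import cv2
import numpy as np

def warp_ui_onto_surface(ui_image, surface_corners, projector_size):
    """Warp a flat UI image so it lands on a tracked surface in projector space.

    Sketch only: surface_corners are the four detected corners of the target
    surface (palm, notepad, wall patch) in projector pixel coordinates, ordered
    top-left, top-right, bottom-right, bottom-left. These numbers and this
    helper are invented for illustration, not OmniTouch's actual code.
    """
    h, w = ui_image.shape[:2]
    src = np.float32([[0, 0], [w, 0], [w, h], [0, h]])
    dst = np.float32(surface_corners)
    homography = cv2.getPerspectiveTransform(src, dst)          # maps UI corners onto the surface
    return cv2.warpPerspective(ui_image, homography, projector_size)

# e.g. a 320x240 UI warped onto made-up corners inside a 1024x768 projector frame
ui = np.full((240, 320, 3), 255, np.uint8)
frame = warp_ui_onto_surface(ui, [(400, 200), (700, 220), (690, 480), (410, 460)], (1024, 768))
```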


Saturday, October 15, 2011

Piloting a plane from inside an egg never looked so good. Barco offers complete half-sphere sim display.


This is unprecedented realism. Thirteen 10-megapixel Barco LCoS projectors project onto the outside of a 3.4-meter-diameter acrylic sphere, and the realism happens inside the bowl, where the pilot sits free of any other visible obstructions. The result is a complete 360-degree view of the world. The physical motion-simulation machinery would be separate, and I would assume that any lateral rotation would need its axis centered on the center of the dish.

More pics and video after the jump

Monday, June 20, 2011

Ubisoft's Rocksmith Teaches You Guitar Through A Video Game Format

"Introducing the Rocksmith bundle! The bundle will include the game along with an Epiphone Les Paul Junior guitar and the Rocksmith Real Tone Cable™. Available at launch and priced at $199.99!"

So with this game, you can use any real guitar and just buy the game with the special cable for only $80. Don't have a guitar? Get the bundle that includes the Epiphone Les Paul Junior (it's full size, just not full strength) for only $120 more! It seems the game will be available for Xbox 360, PS3, and PC. I LOVE THIS!!! This is exactly where Rock Band 3 wanted to go, but it maybe bit off more than it could chew by making its Pro Modes compatible with the existing game format. This is where Guitar Hero players can put their skills to real-world use! I've been asking for this ever since the original GH from Harmonix. I just hope the game is actually good.

I really hope that the future of video game development becomes more open to the idea of learning real skills through fun interactive software. It's as if that idea imploded in the '90s, save for maybe the LeapFrog-type stuff for the little ones.

Friday, June 17, 2011

Microsoft Delivers Kinect SDK For PC

"We've been waiting for confirmation on yesterday's rumor, about Microsoft's motion-sensing Xbox 360 peripheral coming to PCs, and now we have it. MS has just now released a software development kit (SDK) for Windows that will allow .Net developers to write Kinecting apps in C++, C#, or VB. We spoke with some developer representatives from the company to get the full details, including just what you can and can't do with this big bundle of libraries. Follow us after the break for all the info.
At this point the SDK is effectively a straight-port of the same libs that are currently available to Xbox 360 developers. Built on XNA, the Kinect library is standalone, so you won't necessarily need to rely on DirectX being present. The SDK gives full access to everything the peripheral has to offer, including both cameras (VGA and depth-sensing) and the full microphone array. The former can identify up to six individuals or track the full skeletons for two, while the latter can handle advanced echo-cancellation and even sound triangulation."
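To give a flavor of what skeleton tracking gets you as an app developer, here's a tiny, self-contained example of the classic pattern: map a tracked hand joint onto a screen cursor. The joint data itself would come from the SDK (in C#, C++, or VB); this helper and its "reach" parameter are just my illustration:

```python
def hand_to_cursor(hand_xyz, screen_w=1920, screen_h=1080, reach_m=0.6):
    """Map a tracked hand joint (metres, relative to the shoulder) to screen pixels.

    Toy example of the app-side logic skeleton tracking enables. The joint
    positions themselves come from the SDK (C#/C++/VB); this helper and the
    0.6 m "reach" window are my own illustration, not part of the SDK.
    """
    x, y, _ = hand_xyz
    u = min(max((x + reach_m / 2) / reach_m, 0.0), 1.0)   # left/right reach -> 0..1
    v = min(max((reach_m / 2 - y) / reach_m, 0.0), 1.0)   # up/down reach -> 0..1 (screen y grows downward)
    return int(u * (screen_w - 1)), int(v * (screen_h - 1))

print(hand_to_cursor((0.15, 0.10, 0.4)))   # hand a little right of and above the shoulder
```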

Thursday, May 26, 2011

Cinema Turns Toward High Speed 3D For Greater Immersion


It was announced last month that Peter Jackson's upcoming two-part telling of The Hobbit would not only be shot in 3D with Red's new Epic digital cameras, but would also include a feature that no other major studio production has offered in nearly a century: an increased frame rate. In this case, double the usual 24 fps!

Fellow director James Cameron has been championing higher frame rates since well before Avatar was released, and I could not help imagining "what if?" while watching Avatar for myself. Of course, his comment about "fighting battles one at a time" is well understood; the movie was truly something no one had ever seen before. Still, the first thing I noticed during that epic was that even though it often gave off that "you are there" feeling, when the action got heavy or I was forced to change depths, I had to blink repeatedly to get my eyes to refocus. That took me out of the moment multiple times, and eventually I had to learn to just let it go. I believe this is the #1 cause of eyestrain in 3D cinema. While the jury is out on making 3D movies that every single person can enjoy equally, I have a feeling this will be the breakthrough that turns more than a few 3D doubters into converts. Cameron shot an action scene at 24 fps, 48 fps, and 60 fps and showed it off at CinemaCon in March to give the media a look and get the industry on board with higher frame rates. Those who saw the demonstration seem to have been generally impressed.

more after the jump

Tuesday, May 17, 2011

Move Over Microsoft Avatars, Time For Surrogates?

Conceivably Tech recently published an article revealing a Microsoft patent filed on January 28, 2011. It seems the new software patent would bring object recognition and a real-time, body-scanned replica of the user to gaming, rather than the family-friendly Avatar. What's not known is whether the patent is intended for the current version of Kinect or a future version. The abstract reads:
“A depth image of a scene may be received, observed, or captured by a device. The depth image may then be analyzed to determine whether the depth image includes a human target. For example, the depth image may include one or more targets including a human target and non-human targets. Each of the targets may be flood filled and compared to a pattern to determine whether the target may be a human target. If one or more of the targets in the depth image includes a human target, the human target may be scanned. A skeletal model of the human target may then be generated based on the scan.”
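The flood fill mentioned in the abstract is a standard way to split a depth image into candidate targets: grow regions of neighboring pixels whose depths are similar, then test each region against a human-shaped pattern. A simplified, illustrative sketch of just that segmentation step (my reading of the abstract, not Microsoft's implementation):

```python
from collections import deque
import numpy as np

def flood_fill_targets(depth, tolerance=50, min_pixels=200):
    """Group a depth image (2-D array of millimetre values) into candidate targets.

    Simplified reading of the abstract: neighbouring pixels whose depths differ
    by no more than `tolerance` mm join the same target; tiny blobs are marked
    -1 and ignored. Deciding which targets are human is a separate step.
    """
    h, w = depth.shape
    labels = np.zeros((h, w), dtype=int)
    next_label = 0
    for sy in range(h):
        for sx in range(w):
            if labels[sy, sx]:
                continue
            next_label += 1
            labels[sy, sx] = next_label
            queue, blob = deque([(sy, sx)]), [(sy, sx)]
            while queue:
                y, x = queue.popleft()
                for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
                    if (0 <= ny < h and 0 <= nx < w and not labels[ny, nx]
                            and abs(int(depth[ny, nx]) - int(depth[y, x])) <= tolerance):
                        labels[ny, nx] = next_label
                        blob.append((ny, nx))
                        queue.append((ny, nx))
            if len(blob) < min_pixels:
                for y, x in blob:
                    labels[y, x] = -1   # too small to be a real target
    return labels
```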
Article from Conceivably Tech and more patent images after the break

Microsoft Makes Exploring The Known Universe Easy With Kinect

This story from last month didn't seem to get much recognition, so I thought I would mention it because I just can't stop thinking about how cool it is. Engadget broke it last month HERE. "During their day two keynote at MIX11, Microsoft showed off its Worldwide Telescope project powered by Kinect. In collaboration with NASA, the Worldwide Telescope project allows you to explore high-resolution photos and 3D renders of space and beyond."

Sometimes Microsoft really surprises me with how far its reach extends into technology and research, well beyond Windows and even the Xbox brand. Video and quote from Engadget: