Kinect

Changing the way we design: Kinected Conference

Kinected Conference is a project by MIT students that aims to make the videoconferencing screen a more useful tool by integrating it with Kinect motion detection. The students demo their system in the video below; the video quality is rough and some of the features are clearly still in development, but it is fascinating technology nonetheless.

Kinected Conference from Lining (Lizzie) Yao on Vimeo.

They are developing four applications, "Talking to Focus", "Freezing Former Frames", "Privacy Zone", and "Spacial Augmenting Reality", each of which could be beneficial to our industry.

"Talking to Focus" - This feature makes video conferencing better. Especially when involving multiple locations, each with multiple participants. Only the people currently speaking are in focus; above their image are word balloons that could contain a variety of useful information, links to important documents, etc. ("Click the link above my head to download the specs I just mentioned...") One of the major limitations of video conferencing is attention tends to focus on individuals based on their screen position, rather than activity, as opposed to face-to-face where the eye gravitates towards the most active individual; this feature directly addresses that problem.

"Freezing Former Frames" - In addition to augmenting the impact of "Talking to Focus", this feature allows participants to "pause" just themselves while they pull up important documents, step out of the room momentarily, or even just get over a sneezing attack without disrupting the conversation. Imagine how useful that would be in a sales call!

"Privacy Zone" - In the video this feature was the least successful of the ones demonstrated, but the idea is there and it will only improve. Essentially this puts up a curtain behind the speakers. Making a call from home, or haven't had time to clean the office? No problem. On the trade show floor and don't want the traffic to be a distraction? Solved. But beyond that, this introduces greenscreen capabilities. Instead of a blank white wall, throw up the building plans; literally walk the team through the designs, and use Kinect's gesture recognition to navigate, zoom, and highlight key features.

"Spacial Augmenting Reality" - This has the most direct implications for construction. First of all, notice that the 3-D spacial recognition is good enough the system can measure length. This now becomes an inspection tool; point the camera at the wall panel, and everyone sees how the actual dimensions compare to the specified dimensions. Heck, this could probably create an as-built overlay for BIM, allowing for direct visual comparison. Second, imagine each of those blocks they were moving around the table was linked to a BIM element. The software will probably quickly evolve to the point the blocks are not even needed; move your hand to the on-screen location of an object, make the appropriate gesture, and manipulate as needed.

What really blows me away about this, though, is that it's based on consumer-level, widely available video game technology. I could walk down the street, drop $200, and set this up in my office. We are on the verge of complete design tool transparency, allowing us to interact directly with our designs.

I'm excited to see what happens next!

The Future of Design Tools

One of the most memorable parts of the recent Iron Man movies was the interactive holographic tool Tony uses to design his armor (if you have not seen the movie, watch this). This level of motion-based computing has become a sci-fi staple, and it is easy to understand why. How great would it be to design buildings the same way painters and sculptors create art?

Now watch this:



The researchers are using a motion-detecting video game controller, the kind of hardware many still dismiss as a "toy", to control a game so complex that it normally requires specialized keyboards and mice. The complexity of World of Warcraft's controls is equivalent to the complexity of BIM design tools: a set library of functions that can be performed on selectable objects in an interactive, changeable environment. Granted, the controls are rough and hardly ready for "serious players", but that is just a matter of refinement.
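The general pattern behind a hack like this is simple, even if the skeleton tracking underneath is not: a recognizer labels each motion, and a thin shim translates those labels into the key presses the game already understands. A toy sketch of that mapping layer (the gesture names are invented for illustration):

    # Gesture labels are whatever the recognizer emits; keys are the game's own bindings.
    GESTURE_TO_KEY = {
        "lean_left":    "a",       # strafe left
        "lean_right":   "d",       # strafe right
        "step_forward": "w",       # run forward
        "raise_right":  "1",       # hotkey 1
        "push_both":    "escape",  # open menu
    }

    def translate(gesture_stream):
        """Turn a stream of recognized gestures into key presses."""
        for gesture in gesture_stream:
            key = GESTURE_TO_KEY.get(gesture)
            if key is not None:
                yield key  # a real shim would inject this as a synthetic key event

    print(list(translate(["step_forward", "raise_right", "lean_left"])))
    # -> ['w', '1', 'a']

Swap the right-hand column for BIM commands instead of game hotkeys and the same shim drives a design tool.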

Consumer Product Analyst James McQuivey has this to say about the Kinect:
...Kinect for Xbox 360 will usher us into a new era Forrester has entitled the Era of Experience. This is an era in which we will revolutionize the digital home and everything that goes along with it: TV, internet, interactivity, apps, communication. It will affect just about everything you do in your home. Yes, that, too.

I've just completed a very in-depth report for Forrester that explains in detail why Kinect represents the shape of things to come. I show that Kinect is to multitouch user interfaces what the mouse was to DOS. It is a transformative change in the user experience, the interposition of a new and dramatically natural way to interact -- not just with TV, not just with computers -- but with every machine that we will conceive of in the future. This permits us entry to the Era of Experience, the next phase of human economic development.

Meanwhile, IBM is working on Star Wars-style holographic phone calls. They predict that within 5 years the technology will be compact enough to fit in a cell phone. Desktop versions of the technology should easily handle BIM; the models are already designed for simple computer rendering, and are generally more static than a talking human face.

In other words, it might not be long before we can design buildings with a few waves of the hand. McQuivey points out that manufacturers will need to join the Era of Experience too. Designers will want to handle virtual models of your product, fit them together to build fantastical structures, and see what happens when they fall down. The companies that succeed in this era will be those that can provide that experience.

As a closing thought, the "Era of Experience" idea extends beyond sales tools and computers. The idea of the "Experience Economy" has been around since the mid-1990s; I first encountered it as an explanation of why coffee costs so much more at Starbucks than at Denny's. Even deeper than that, though, is the understanding that the experience your clients have with you is part of their experience with your product. Providing a positive experience, which goes above and beyond just providing good customer service, is more important than even the coolest high-tech design tools.

UPDATE: Check out this page for a gallery of amazing Kinect hacks: http://openkinect.org/wiki/Gallery

[H/T ReadWriteWeb]

Kinect-type hardware for all PCs!

In a recent post I discussed the potential impact of Kinect's motion-detecting controller on the design process; today, I found this update:
PrimeSense, the leader in sensing and recognition technologies, and ASUS, a leading enterprise in the new digital era, announced today that PrimeSense Immersive Natural Interaction™ solutions will be embedded in WAVI Xtion, a next generation user interface device developed by ASUS to extend PC usage to the living room. WAVI Xtion is scheduled to be commercially available during Q2 2011 and released worldwide in phases. 
There is also a software development kit for designers who want to create 3D-sensing applications for distribution through an online App Store.

This is the next big breakthrough. Not just for the construction industry, but for the way we use computers. And, since computer tech is getting so small and light, potentially the way we interact with all our tools and devices.

One of the products we considered for our annual Top Ten list was the Norton Trinity "Intelligent Door Closer". It did not make the final cut, but only because we realized we were more excited about the implications of this type of technology than about the actual product. Trinity is a door closer with a self-powered on-board computer that monitors room temperature and adjusts the door closure rate to compensate. In and of itself this is a very cool advancement, but the bigger story is the computerization of such a small, background piece of equipment. This is not a computer hooked up to a door so it can open, close, and lock remotely; this is a computer in the actual door.
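Norton does not publish the Trinity's control law, so treat the following only as an illustration of the kind of tiny feedback loop such a device might run (all numbers invented): the controller estimates how much the temperature has slowed the hydraulic sweep and trims the valve so the closing time stays on target.

    def valve_correction(room_temp_c, target_close_s=6.0,
                         nominal_temp_c=20.0, drift_s_per_deg=0.15):
        """Illustrative only, with invented numbers: colder fluid is more viscous,
        so the uncompensated sweep takes longer; return the factor by which the
        controller should speed the sweep back up to hit the target closing time."""
        uncompensated = target_close_s + drift_s_per_deg * (nominal_temp_c - room_temp_c)
        return uncompensated / target_close_s

    print(round(valve_correction(20.0), 2))  # 1.0  - on target at nominal temperature
    print(round(valve_correction(5.0), 2))   # 1.38 - cold room: run the sweep ~1.4x faster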

Now combine that with Kinect-style motion controls, and an ever more sophisticated library of gesture recognition.

Imagine a sink that can see you pull your hands away in shock from scalding water, and adjust the temperature to compensate.

Imagine lamps that increase the lighting when you pick up a book, then dim it when you lie down to sleep.

Imagine phase-change windows that become opaque or translucent based on your gesture.

Imagine...
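The common pattern in all of these is one shared gesture vocabulary with many small devices subscribing to the gestures they care about. A toy sketch of that wiring (device behaviors and gesture names invented for illustration):

    from typing import Callable, Dict, List

    _handlers: Dict[str, List[Callable[[], None]]] = {}

    def on_gesture(name: str, action: Callable[[], None]) -> None:
        """Register a device action against a recognized gesture."""
        _handlers.setdefault(name, []).append(action)

    def dispatch(name: str) -> None:
        """Called by the gesture recognizer each time it labels a motion."""
        for action in _handlers.get(name, []):
            action()

    # The sink, the lamp, and the window each subscribe to what they care about.
    on_gesture("hands_recoil",    lambda: print("sink: dropping water temperature"))
    on_gesture("book_pickup",     lambda: print("lamp: raising light level"))
    on_gesture("lie_down",        lambda: print("lamp: dimming for sleep"))
    on_gesture("swipe_at_window", lambda: print("window: switching to opaque"))

    dispatch("book_pickup")   # -> lamp: raising light level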

[H/T ReadWriteWeb]