Sunday, May 27, 2012

I want the computer from Minority Report

“Remember, remember the Fifth of November?”


Oops. Wrong movie. I meant this one:

Fun movie. And a really cool computer UI:


So how close is this to reality? Maybe we’re almost there (but even if we don’t NEED the ravin’ glow-finger gloves, we might still WANT them).

Nintendo started the popularization of motion control with the Wiimote. Sony has jumped on the bandwagon as well with its karaoke-mic-styled PlayStation Move controller. And Microsoft, of all people, took the next step with its Kinect system of full-body optical recognition. For almost a year, this system has been available for Windows 7 computers (see http://bit.ly/tIg1U0 for the M$ vision for Kinect).

A couple of other recent developments raise some interesting possibilities. One company (http://on.mktw.net/Jh6918) claims to have developed a system that will track all 10 fingers with pinpoint accuracy. Soon, all the techniques the iPhone has taught us could be available in thin air.

An odder approach is to put a sensor on your shoes so that your hand gestures can be captured and used to control a smartphone in your pocket – “eyes-free interaction,” as they call it (http://cnet.co/Je1JJs).

“How ‘bout the power to kill a yak, from 200 yards away … with mind bullets! That’s telekinesis, Kyle!”
Of course, all of this optical recognition technology is nice, but it will be obsolete once we can simply control our devices with our minds. Some, like Peter Bentley, worry that we will go beyond using mind control just to help people with injuries like quadriplegia and start putting it into daily use (http://huff.to/LjYww4). While he may be right about the current state of the technology (I don’t think I’d like a buggy computer chip surgically implanted in my brain), as things advance even further, popular demand may trump all of these concerns. And then we might truly see Homo cyborgis.

Monday, May 14, 2012

Read the code, find the way, do the work - Simple!

In the past few weeks, through our posts, we have introduced you to some ideas about how people may work with computers in the future. Today we discuss how these interface changes are helping humans at large and making their lives easier.
Have you ever seen a huge warehouse where workers go in and bring new orders out to the delivery dock? It takes time and effort for them to reach the required items and carry them out. An order may consist of more than one item, and many workers may be required to complete it. Alternatively, if only one person is working on your order, the time to complete it might be unacceptably long. As technology evolved, small pickup vans were used in warehouses to reduce the workers’ workload. Though this significantly cut the physical effort required, time was still an issue for long and complex orders. Many stores strove to fix this delivery problem, but there was no good way to speed up order completion. Increasing the number of workers did not help, as it increased both costs and management complexity.
To fix this problem, the most practical solution has been to bring more machines into the process.
In doing this, the big questions are:
1. Do we have to direct the machine every time to get the required material?
2. Will the machine just help find the item while the worker still picks it up?
3. Will the machine be as big as a human?
The answer to all these questions is now a simple “No”.
We started this blog with “learn to talk to machines.” The next big step is making the machines talk to each other as well.
What languages do the machines speak?  
One important method is called client-server communication. This is a kind of boss-subordinate relationship: the server computer is the boss, and the clients are the machines programmed to do what the boss says.
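To make the boss-subordinate idea concrete, here is a toy sketch in Python (this is our own illustration, not any real warehouse protocol – the port number, robot name, and “FETCH shelf-42” message are all made up). The server waits for a robot to report in, then hands it an order:

```python
import socket
import threading

HOST, PORT = "127.0.0.1", 50007  # hypothetical local test address
ready = threading.Event()        # lets the client wait until the boss is listening

def server() -> None:
    """The 'boss': waits for a robot to report in, then issues an order."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
        srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        srv.bind((HOST, PORT))
        srv.listen(1)
        ready.set()                       # signal that the server is up
        conn, _addr = srv.accept()
        with conn:
            request = conn.recv(1024).decode()  # e.g. "READY robot-7"
            conn.sendall(b"FETCH shelf-42")     # the boss gives an order

def client() -> str:
    """The 'subordinate': reports ready and receives its next task."""
    ready.wait()                          # don't connect before the boss is up
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as cli:
        cli.connect((HOST, PORT))
        cli.sendall(b"READY robot-7")
        return cli.recv(1024).decode()

t = threading.Thread(target=server)
t.start()
task = client()
t.join()
print(task)  # FETCH shelf-42
```

The point of the pattern is that the robots never need to know about each other: each one only ever talks to the boss, which keeps the whole system simple to coordinate.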
Another important aspect is the barcode. With this form of coded communication, the robots navigate their path to the destination and back.
Kiva robots work in warehouses with a central computer that keeps track of all the robots, while the robots read barcodes on the floor to navigate. Now a worker can work on multiple orders at once. The robots have increased the accuracy of order delivery, reduced completion time, and made parallel order processing possible. Watch the robots in action!
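A rough way to picture the floor-barcode scheme: treat the floor as a grid of cells, let the central computer plan a route around obstacles, and have the robot simply follow the list of cells, confirming each barcode as it drives over it. The sketch below is our own toy model (breadth-first search on a grid), not Kiva’s actual software; the grid size, blocked cells, and coordinates are invented for illustration:

```python
from collections import deque

def plan_path(start, goal, width, height, blocked=frozenset()):
    """Breadth-first search over the floor grid: returns a shortest
    list of (x, y) cells from start to goal, avoiding blocked cells."""
    frontier = deque([start])
    came_from = {start: None}  # also doubles as the 'visited' set
    while frontier:
        cell = frontier.popleft()
        if cell == goal:
            break
        x, y = cell
        for nxt in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if (0 <= nxt[0] < width and 0 <= nxt[1] < height
                    and nxt not in blocked and nxt not in came_from):
                came_from[nxt] = cell
                frontier.append(nxt)
    # walk backwards from the goal to recover the path
    path, cell = [], goal
    while cell is not None:
        path.append(cell)
        cell = came_from[cell]
    return path[::-1]

# The central computer routes a robot from its charger at (0, 0) to a
# shelf at (3, 2), steering around two occupied cells.
route = plan_path(start=(0, 0), goal=(3, 2), width=5, height=4,
                  blocked={(1, 0), (1, 1)})
print(route)
```

Each `(x, y)` pair stands for one floor barcode; the robot just drives cell to cell, and the central computer re-plans if a cell becomes blocked. This separation (planning at the server, dumb following at the robot) is what lets one computer coordinate hundreds of robots.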
Machines can work tirelessly; they don’t need lunch or coffee breaks; just enough time to charge.

Thursday, May 3, 2012

Transforming through technology: innovation in medicine.


Many people may say that human contact is essential, especially when treating the sick. Are you willing to be cured by a robot? Would you take a pill that can run a battery of complicated tests? In “Medicine’s future? There’s an app for that,” Daniel Kraft explains the future of medicine, claiming robots and other digital interfaces will soon replace doctors and traditional ways of healing.

The question is: as technology keeps developing, are we losing the essence of humanity by attenuating our contact with each other? Medical students of the future will study anatomy and surgery with machines; Jack Choi, for example, has just created a virtual dissection table. What about the importance of enhancing students’ preparation with real organisms? To what extent is the person shown on the virtual dissection table similar enough to a real human body?