Jeff Han's touch screen interface works by frustrated total internal reflection: fingers pressing on the screen scatter light that would otherwise stay trapped inside the acrylic surface, and a camera picks up those bright spots. This allows for several neat computer-human interactions. First, the screen is multi-touch, meaning that several fingers can touch the screen at the same time. Second, the screen can react differently depending on how much pressure is applied. During Jeff Han's presentation to TED he also indicated that the screen could potentially register the proximity of the hands to the screen.
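As a rough illustration (not Han's actual pipeline), this kind of sensing boils down to finding bright blobs in a camera frame: each blob is a touch, and blob area can serve as a crude pressure proxy, since a harder press flattens more of the fingertip against the glass. Here is a minimal sketch in Python; the function name, threshold value, and toy frame are all my own assumptions:

```python
# Sketch of FTIR-style touch detection: threshold a grayscale camera
# frame, then flood-fill connected bright regions into "touches".
# Blob area is a crude pressure proxy (harder press -> bigger blob).
# The threshold and all names here are illustrative assumptions.

def find_touches(frame, threshold=128):
    """Return a list of (centroid_row, centroid_col, area), one per blob."""
    rows, cols = len(frame), len(frame[0])
    seen = [[False] * cols for _ in range(rows)]
    touches = []
    for r in range(rows):
        for c in range(cols):
            if frame[r][c] >= threshold and not seen[r][c]:
                # Flood fill one connected blob of bright pixels.
                stack, pixels = [(r, c)], []
                seen[r][c] = True
                while stack:
                    y, x = stack.pop()
                    pixels.append((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and frame[ny][nx] >= threshold
                                and not seen[ny][nx]):
                            seen[ny][nx] = True
                            stack.append((ny, nx))
                area = len(pixels)
                cy = sum(p[0] for p in pixels) / area
                cx = sum(p[1] for p in pixels) / area
                touches.append((cy, cx, area))
    return touches

# Two fingers pressing: two separate bright regions in a toy 6x8 frame.
frame = [
    [0,   0,   0, 0, 0,   0,   0, 0],
    [0, 200, 220, 0, 0,   0,   0, 0],
    [0, 210, 230, 0, 0, 180,   0, 0],
    [0,   0,   0, 0, 0, 190, 185, 0],
    [0,   0,   0, 0, 0,   0,   0, 0],
    [0,   0,   0, 0, 0,   0,   0, 0],
]
print(find_touches(frame))  # two touches; the first blob is larger
```

Because the camera sees every blob in the frame at once, multi-touch falls out of the approach for free, which is part of what made the demo so striking.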
During the presentation some of the more obvious uses of the multi-touch screen are demonstrated. I think the really important impact of this technology is that it will give people far richer ways to explore and manipulate data, and in doing so change how they think about interacting with it in the first place.
One unfortunate aside is that Jeff is currently producing these screens for military vendors. Hopefully the influx of cash from those contracts will enable him to branch out into more civilian uses.