Apple has been granted a patent for a multi-touch display that can sense when and where a finger is near the screen. The patent was one of 13 granted to Apple, and revealed on the eve of Wednesday's expected announcement of a multi-touch Apple tablet.
The patents were awarded by the US Patent & Trademark Office and reported by Patently Apple, a website that tracks the company's patent applications and awards as indications of the company's innovations and technology investments.
Patent number 7,653,883, originally filed in Q3 2005, apparently covers several different techniques that can be used singly or in combination to sense a nearby object such as a finger or stylus. The techniques include optical shadow, capacitive, inductive, and electric-field sensing, among others, using one or more sensors.
You can find the full filing by entering the patent number into the U.S. Patent and Trademark Office's search engine. The Patently Apple story suggests that the core of the patent may actually concern the control of the proximity sensing technology and the linking of that capability to the visual user interface. The new patent is one of a battery of Apple filings around display technology, including one that makes LCD pixels "touch sensitive," eliminating several layers in current screen technology to yield brighter, thinner and lighter displays for the company's laptops and mobile devices, such as the iPhone.
Today, touch-sensing components sit atop the layers that form a liquid crystal display (LCD) screen. In effect, Apple's invention aims to make the LCD pixels themselves "touch sensitive" by eliminating those additional layers. By doing so, the screen becomes thinner, somewhat lighter, and brighter.
One of the patent's diagrams contains several drawings and a series of steps that show a finger entering the "proximity sensing field" over a specific feature (such as a button) on the UI, displayed on a tablet-like device's screen. The system then displays and enables a "particular GUI element," such as a virtual keyboard. After typing, the fingers move away from the screen, and the system disables the element and removes it from the screen. It's hard to tell from the simple drawings, but they hint that proximity sensing may give Apple the opportunity to create a much more responsive and fluid mobile user experience, with GUI elements seeming to float over each other. The GUI evolves by becoming less obvious, less of an intermediary, than the current system of windows and controls manipulated by mouse or keyboard. The multi-touch UI of the iPhone and iPod Touch was the first commercially successful step toward this new way of interacting with a mobile computer.
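The hover-in/hover-out sequence shown in the patent drawings can be sketched as a simple state machine. This is only an illustrative sketch of the described behavior, not Apple's implementation; the class and method names here are invented for the example.

```python
# Illustrative sketch of the proximity-driven GUI flow in the patent
# drawings: a GUI element (a virtual keyboard) appears when a finger
# enters the proximity field over a text field, and disappears when
# the finger withdraws. All names are hypothetical.

class ProximityUI:
    """Tracks whether a hover-triggered GUI element is on screen."""

    def __init__(self):
        self.keyboard_visible = False

    def on_proximity_event(self, finger_near, over_text_field):
        """Handle one proximity-sensor reading."""
        if finger_near and over_text_field and not self.keyboard_visible:
            # Finger hovers over the feature: display and enable the element.
            self.keyboard_visible = True
        elif not finger_near and self.keyboard_visible:
            # Finger moves away: disable the element and remove it.
            self.keyboard_visible = False
        return self.keyboard_visible


ui = ProximityUI()
ui.on_proximity_event(finger_near=True, over_text_field=True)   # keyboard shown
ui.on_proximity_event(finger_near=False, over_text_field=False)  # keyboard hidden
```

In a real system the sensor events would arrive continuously and the transitions would likely be debounced and animated, but the core idea in the drawings reduces to this enter-field/leave-field toggle.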
The approach is similar to that advocated by former New York University scientist Jeff Han, who demonstrated a highly sophisticated multi-touch screen at the 2006 annual TED conference, captured in this YouTube video. During the demonstration, Han repeatedly talks about the conventional computer interface, including keyboards both physical and virtual, "going away" to be replaced by a direct, almost tactile manipulation of data and images. Han is founder of privately-held Perceptive Pixel, whose Multi-Touch Wall product is perhaps best known for its use in 2008 by cable news network CNN.
The other Apple patent grants include:
— Automatically determining the bandwidth available in a wireless channel to optimize throughput, especially for multimedia traffic.
— A color management system to accurately display original colors across different screens, and optimize them for a given display.
— A multimedia/videoconferencing system.