Wednesday, 15 September 2010 21:50

How Context-Aware Computing Will Make Gadgets Smarter

Small always-on handheld devices equipped with low-power sensors could signal a new class of “context-aware” gadgets that are more like personal companions.

Such devices would anticipate your moods, be aware of your feelings and make suggestions based on them, says Intel.

“Context-aware computing is poised to fundamentally change how we interact with our devices,” Justin Rattner, Intel’s CTO, told attendees at the company’s developer conference.

“Future devices will learn about you, your day, where you are and where you are going to know what you want,” he added. “They will know your likes and dislikes.”

Context-aware computing is different from the simple sensor-based applications seen on smartphones today. For instance, consumers today go to an app like Yelp and search for restaurants nearby or by cuisine and price. A context-aware device would have a similar feature that would know what restaurants you have picked in the past, how you liked the food and then make suggestions for restaurants nearby based on those preferences. Additionally, it would be integrated into maps and other programs on the device.
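
As a rough illustration of that idea, here is a minimal Python sketch that ranks nearby restaurants by how the user has rated each cuisine in the past. The Restaurant class, the ratings history and the coordinates are hypothetical stand-ins for illustration, not anything Intel described.

from dataclasses import dataclass
from math import radians, sin, cos, asin, sqrt

@dataclass
class Restaurant:
    name: str
    cuisine: str
    lat: float
    lon: float

def distance_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points (haversine formula)."""
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 6371 * 2 * asin(sqrt(a))

def suggest(restaurants, past_ratings, here, radius_km=2.0):
    """Rank nearby restaurants by the user's average past rating for each cuisine."""
    prefs = {}
    for cuisine, rating in past_ratings:
        prefs.setdefault(cuisine, []).append(rating)
    prefs = {c: sum(r) / len(r) for c, r in prefs.items()}
    nearby = [r for r in restaurants
              if distance_km(here[0], here[1], r.lat, r.lon) <= radius_km]
    return sorted(nearby, key=lambda r: prefs.get(r.cuisine, 0), reverse=True)

# The device already knows the user loved Thai food and is standing downtown.
places = [Restaurant("Thai Basil", "thai", 37.78, -122.41),
          Restaurant("Burger Hut", "burgers", 37.78, -122.40)]
history = [("thai", 5), ("burgers", 2)]
print([r.name for r in suggest(places, history, here=(37.78, -122.41))])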

Researchers have been working for more than two decades on making computers more in tune with their users. That means computers would sense and react to the environment around them. Done right, such devices would be so in sync with their owners that they would feel like a natural extension of them.

“The most profound technologies are those that disappear,” Mark Weiser, chief scientist at Xerox PARC and father of the term “ubiquitous computing,” wrote in 1991 about context awareness in machines. “They are those that weave themselves into the fabric of everyday life.”

Making this possible on PCs has proved to be challenging, says Rattner. But the rise of smartphones and GPS-powered personal devices could change that.

“We now have the infrastructure needed to make context-aware computing possible,” says Rattner.

The next step is smarter sensors, say Intel researchers. Today, while smartphones come equipped with accelerometers and digital compasses, the data gathered from these sensors is used only for extremely basic applications.

“Accelerometers now are used to flip the UI,” says Lama Nachman, a researcher at Intel. “But you can go beyond that to start sensing human gait and user behavior.”

For instance, sensors attached to a TV remote control can collect data on how the remote is held by different users and build profiles based on that. Such a remote, of which Intel showed a prototype at the conference, could identify who’s holding the remote and offer recommendations for TV shows based on that.
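
Intel did not detail how the prototype works, but the general approach can be sketched as a nearest-profile match: compare a live grip reading against stored per-user grip profiles and surface that viewer’s preferences. The feature vectors, user names and favorites below are invented purely for illustration.

import numpy as np

# Stored "grip profiles": average feature vectors from past sensor readings
# (e.g. tilt angles and pressure distribution along the remote).
profiles = {
    "alice": np.array([0.8, 0.1, 0.3]),
    "bob":   np.array([0.2, 0.7, 0.9]),
}
favorites = {"alice": "nature documentaries", "bob": "football"}

def identify_user(reading, profiles):
    """Nearest-centroid match: pick the profile closest to the live reading."""
    return min(profiles, key=lambda user: np.linalg.norm(reading - profiles[user]))

live_reading = np.array([0.75, 0.15, 0.35])   # simulated snapshot from the remote
user = identify_user(live_reading, profiles)
print(f"{user} is holding the remote; suggest {favorites[user]}")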

Overall, context-aware devices will have to use a combination of “hard-sensing,” or raw physical data about a user (such as where you are), and “soft-sensing” information about the user, such as preferences and social networks, to anticipate needs and make recommendations. Together, these two streams of data create the cognitive framework for managing context.
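
One way to picture that combination, as an assumed sketch rather than Intel’s design, is a context record that carries both kinds of signal and a simple rule that fuses them into a suggestion.

from dataclasses import dataclass, field

@dataclass
class Context:
    # Hard-sensing: raw physical measurements from the device.
    location: tuple        # (lat, lon) from GPS
    moving: bool           # derived from the accelerometer
    hour: int              # local time of day
    # Soft-sensing: learned or declared information about the user.
    preferences: dict = field(default_factory=dict)   # e.g. {"podcast": "technology"}
    calendar_free: bool = True

def recommend(ctx: Context) -> str:
    """Toy fusion rule: hard signals narrow the situation, soft signals pick the content."""
    if ctx.moving and 7 <= ctx.hour < 10:
        return f"Commute detected: queue a {ctx.preferences.get('podcast', 'news')} podcast."
    if not ctx.moving and ctx.hour >= 20 and ctx.calendar_free:
        return f"Evening at home: recommend a {ctx.preferences.get('tv', 'drama')} show."
    return "No suggestion right now."

print(recommend(Context(location=(37.78, -122.41), moving=False, hour=21,
                        preferences={"tv": "nature documentary"})))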

On the hardware side, context-aware computing will call for extremely energy-efficient sensors and devices. Devices will also have to change their behavior, says Rattner.

“We can’t let devices go to sleep and wake them up when we need them,” he says. “We will need to keep the sensory aspects on them up and running at all times and do it at minimum power.”
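
In software terms, this amounts to duty-cycling: a cheap, always-on sensing loop runs continuously and only wakes the power-hungry processing when something interesting happens. The Python simulation below is a hypothetical sketch of that pattern; the threshold and the stand-in functions are assumptions, not Intel’s implementation.

import random
import time

WAKE_THRESHOLD = 0.5   # assumed activity level that justifies full processing

def read_low_power_sensor():
    """Stand-in for a cheap, always-on sensor read (e.g. accelerometer magnitude)."""
    return random.random()

def run_full_pipeline(sample):
    """Stand-in for the expensive analysis that normally stays asleep."""
    print(f"Woke the main processor for sample {sample:.2f}")

def sensor_hub_loop(iterations=10, interval_s=0.1):
    """Duty cycle: keep only the cheap sensing running; wake the rest on demand."""
    for _ in range(iterations):
        sample = read_low_power_sensor()
        if sample > WAKE_THRESHOLD:
            run_full_pipeline(sample)
        time.sleep(interval_s)   # the hub idles between samples to save power

sensor_hub_loop()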

So far, context-aware computing hasn’t found commercial success, says Intel. But as phones get smarter and tablets become popular, the company hopes users will have a device where apps disappear and become part of the gadget’s intelligence.

Photo: Intel CTO Justin Rattner holds up a prototype sensor that could help enable context-aware computing in devices. (Priya Ganapati/Wired.com)

Author: Priya Ganapati
