For the media day at Intel's Developer Forum (IDF) in San Francisco, Intel researchers and scientists put away the transistor tech talk briefly and formed a panel with science fiction authors to discuss in creative detail some of the things that the company has been researching. Primarily, the focus was on the relationship between people and their computers in the future, which Intel says will be shaped by ubiquitous computing and new methods of interaction as chips get small enough to be installed in anything.
The sci-fi authors were originally paired up with Intel researchers to get an idea of the future technologies that they are working on, and to incorporate those technologies into their own stories for an IDF book titled The Tomorrow Project Anthology (the third in the series). Among the authors on the panel was Karl Schroeder, who said that Intel's lab experiments were "used as tools" to write the stories. The ideas behind the stories in the book range from the dark side of data collecting to the way all things in our environment, including animals, could one day interact with computer interfaces so that we can learn more about them.
Some real-world examples from Intel's Labs session followed, showcasing what the future might hold for the way we use processing power and computer interfaces. All of the concepts rely on heavy data collection and analysis in order to try to enrich the way we work and play. We also saw a working demo of Intel's Display without Boundaries project, which essentially allows any object to become an interactive screen.
The example we saw projected images onto a curved bowl, and you could spin those images as if they were on a roulette wheel. Feedback from your hand gestures and position was given to the computer through an Xbox 360 Kinect sensor. Microsoft has long been involved in surface touch and object detection technology (through PixelSense), and Intel itself first highlighted its Oasis object detection technology back in 2010, but Display without Boundaries isn't confined to one particular surface or shape (images can be wrapped around corners, too), and it shows that Intel is thinking well beyond the square PCs of today when it comes to how we will physically use computers 10 or 20 years from now.
Beyond new interfaces, the future will also have a heavy focus on sensors, along with advanced algorithms to crunch all the data collected by those always-on sensors in real time. Several working examples of future concepts were shown off by their respective scientists and researchers during Intel's Labs session. These are technology concepts catering to both personal use and business environments, building on features already available in today's smartphones and laptops.
We saw a system designed to be a recommendation engine for a smartphone, based on various data streams from a person and their group of friends. The concept is to use location sensors, services like Foursquare, and other data to help aid decision making in various circumstances. In one example, the system would analyse data from a group of friends and make an informed decision about where the group should eat, and not simply by averaging everyone's interests. Intel wants the technology to think more naturally: in the real world, you might pick a place because you know one particular friend likes it (even if the others don't), or you might want to go somewhere you've never been before based on how many times you've visited your usual spots.
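Intel didn't share implementation details, but the kind of group heuristic described above can be sketched as a simple scoring function. Everything here (the function names, weights, and novelty bonus) is a hypothetical illustration of the idea, not Intel's actual system:

```python
# Hypothetical sketch of a group recommendation heuristic: rather than
# averaging everyone's ratings, it lets a single friend's strong preference
# carry a venue, and adds a novelty bonus for places the group rarely visits.
def score_venue(venue, ratings, visit_counts, novelty_weight=2.0):
    """ratings: {friend: rating 0-5 or None}; visit_counts: {venue: visits}."""
    known = [r for r in ratings.values() if r is not None]
    if not known:
        base = 0.0
    else:
        # A strong single advocate matters more than the group average.
        base = 0.7 * max(known) + 0.3 * (sum(known) / len(known))
    # Reward venues the group has visited rarely, mimicking "somewhere new".
    novelty = novelty_weight / (1 + visit_counts.get(venue, 0))
    return base + novelty

def recommend(venues, ratings_by_venue, visit_counts):
    """Pick the venue with the highest combined preference + novelty score."""
    return max(venues, key=lambda v: score_venue(v, ratings_by_venue[v], visit_counts))
```

With weights like these, a venue the group has never tried can beat a well-worn favourite, which is the "think more naturally" behaviour the researchers described.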
For business users, one very specific example we saw was a solution based on sensors that can detect who is nearby in order to automate things such as meeting and collaboration software. Computers can be detected via Bluetooth, and audio sensors can analyse vocal patterns, all in order to ascertain whether certain users are in range, allowing those users to start collaborating immediately, if they want to.
To tackle the problem of deciding which communication method to use when you want to contact someone, a group of researchers is working on a system that analyses location and accelerometer data, as well as other signals such as computer activity and calendar information, to work out whether a person is busy working, in a meeting, or currently on the move. It then suggests the best way to contact them, based on the collected sensor data, so as not to disturb them.
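The core of such a system is a small decision layer that fuses coarse signals into a suggestion. The signals and the rule ordering below are illustrative assumptions, a minimal sketch rather than the researchers' actual logic:

```python
# Hypothetical sketch of the "best way to reach someone" idea: fuse a few
# coarse availability signals (calendar, motion, recent computer activity)
# into a single suggested contact method, least intrusive option first.
def suggest_contact_method(in_meeting, moving, typing_recently):
    if in_meeting:
        return "email"            # don't interrupt a meeting
    if moving:
        return "text message"     # walking or commuting: short async message
    if typing_recently:
        return "instant message"  # at the desk and active: quick IM is fine
    return "phone call"           # idle and reachable: a call won't disturb
```

A production system would presumably weight these signals probabilistically rather than as hard rules, but the ordering captures the stated goal: reach people without disturbing them.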
Finally, interactive shopping shelves were on display, sporting a collection of sensors (motion, temperature), NFC, a camera, online connectivity, and touchscreens to aid both shoppers and staff. The touchscreens could display pertinent product information, such as allergy warnings, as well as live pricing (which could allow store workers to change product prices more efficiently). Furthermore, customers could leave feedback, such as a rating, for products, and even view Twitter streams about products right there on the shelf. The shelves would also be able to detect when stock needs replenishing.
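Intel didn't detail how the shelves detect low stock, but one plausible approach is weight sensing per shelf slot. The sketch below is an assumption for illustration (slot names, unit weights, and the threshold are all hypothetical):

```python
# Hypothetical sketch of a smart shelf's restock logic: each slot reports a
# weight reading; dividing by the known per-item weight estimates how many
# items remain, and slots below a threshold are flagged for staff.
def restock_alerts(readings, unit_weights, min_items=3):
    """readings: {slot: grams on shelf}; unit_weights: {slot: grams per item}."""
    alerts = []
    for slot, grams in readings.items():
        items = int(grams // unit_weights[slot])  # estimated items remaining
        if items < min_items:
            alerts.append((slot, items))
    return alerts
```

The same alert list could feed the live-pricing screens, so staff see restock flags on the shelf itself rather than in a back-office system.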
All of these concepts are still in the research stage, and we were shown very raw examples of what can be accomplished. With the amount of data collection that's possible, and with very smart people creating algorithms to organise it all, the future of computing will be equal parts scary and exciting, and perhaps a lot more hands-on (and voice-on) than today's tablets and smartphones allow.
The author of this article attended IDF as a guest of Intel.