After a less-than-stellar year in which it announced 12,000 layoffs, a major deviation from its CPU roadmap, and a complete withdrawal from the smartphone SoC race, Intel’s vision for a connected, compute-intensive post-PC future is finally starting to take shape. At the company’s annual developer conference in San Francisco this week, the focus shifted from conventional devices like PCs and laptops to other perceived growth areas.
Intel will spend the next year trying to build a strong foundation for emerging categories such as virtual reality, drones, autonomous vehicles, and home robots, as well as custom “smart” devices for increasingly specific niches, which it hopes will become part of the fabric of our daily lives. Simultaneously, it wants to solidify its position in the data center, where it expects massive growth as sensor-laden devices generate huge amounts of data to interpret and use. This involves not only processing all that data, but also moving it from devices to the cloud and back to the user fast enough to be useful.
One of the biggest takeaways for Intel this year was its RealSense 3D camera initiative. First promoted as a “perceptual computing” enhancement for PCs, tablets, and laptops, RealSense seemed like a gimmick without many useful real-world applications. It was compared to Microsoft’s Kinect – fun in specific scenarios, but largely useless. Intel suggested it could replace traditional input devices and make computers more human-like, but few device makers tried to make that happen. Over the past year, the initiative has moved away from the paradigm of manipulating a user interface and has come into its own, fueling security features such as Windows Hello, allowing drones to understand their surroundings and avoid collisions, and powering all kinds of interactive robots.
Of course, there was the big news about Project Alloy, an ecosystem designed to enable “mixed reality” experiences, which Intel defines as similar to virtual reality but incorporating real-world elements rather than trying to replace them. Motion tracking is handled by RealSense – the headset is opaque, and real-world objects are captured and fed into the virtual environment, unlike augmented reality, which overlays digital content on what you see with your own eyes. Then there was Joule, an all-new development platform designed specifically for RealSense apps, and Euclid, a kind of plug-and-play PC with RealSense built into its candy-bar-sized body.
In the near future, a new, slimmer RealSense module with increased range and sensitivity will be available. It will be easier to integrate into consumer products and has a real chance of becoming widespread. Demonstrations at the show included a robotic companion for children with diabetes, safety glasses that can detect whether engineers are using the wrong parts when working on sensitive equipment, a projector that can turn any table into an interactive game, a head-up display that riders can wear with their helmets, and an educational modular robot construction kit.
While some of these projects are frivolous and some may never make it out of the proof-of-concept stage, the common thread was that no matter how small the niche, if there is a need, Intel wants to fill it. A few of these projects might end up resonating with people, or at least planting the seed of an idea in the minds of other participants.
Either way, we’ll have more devices in our lives, or at least devices that do more things. This is in addition to the devices and environmental controls around us, our clothes, our cars, the tools we use at work, the smart city infrastructure we pass by, and more. All of them will generate data, and much of that data will be analyzed using some kind of artificial intelligence. Intel wants to be in this space too, not only addressing all of this with a new generation of Xeon Phis, but also connecting it all with incredibly fast, low-latency silicon photonics between servers and 5G out to the endpoints where it’s needed.
We finally seem to be at a point where IoT stops being a buzzword and becomes tangible and relatable. It all comes together – RealSense can be used for object and pattern recognition, but that only works when there are huge datasets to learn from. The data is generated by the cameras, sent to a huge data center, processed by artificial intelligence, and returned in a useful form. Some applications don’t need this to happen quickly, but for autonomous vehicles, medical robots, or even utilities, milliseconds matter. Intel wants us to know that it is working across the entire chain.
While last year’s IDF was flashy but somewhat lacking in direction, attendees at this year’s show could walk away with a sense of where the company is headed – and it’s not PCs. The PC isn’t quite in the rearview mirror yet, but we can expect computing hardware to fade in overall importance as a much larger picture of connected devices and services emerges around us.
Disclosure: The IDF correspondent’s flights and hotel were sponsored by Intel.