Intel AI boss: It’s time to move from brute-force to more-efficient computing

The common narrative of artificial intelligence is that it finally took off in recent years when there was enough data — from mega repositories like Google’s — and enough computing power through racks of servers equipped with fast processors and GPUs. That’s not incorrect, but it’s too simplistic to describe the future of machine learning and other forms of AI. That was the message from Amir Khosrowshahi, CTO of Intel’s AI Products Group, at VentureBeat’s Transform 2018 conference outside San Francisco today.

The challenge now is optimizing the whole process: Better algorithms require less computing and can draw accurate inferences from less data, said Khosrowshahi, co-founder of AI company Nervana Systems, which Intel acquired in August 2016.

And no matter how good the hardware (which Intel is happy to sell), optimizing how all the pieces work together is critical. “What’s most important is actually the surface on the top,” said Khosrowshahi, “the systems-level integration of software, [hard drives], interconnects, figuring out latency [delays] between racks of servers.”

As an example, he named Taboola, which provides AI-driven, customized content recommendations on media sites. Those recommendations need to be generated in under 100 milliseconds to meet user expectations. “You have to recommend accurately and under very stringent latencies,” said Khosrowshahi. “So it’s a systems-level problem.”

Intel worked with Taboola to streamline its whole system, not just improving algorithms but optimizing nuts-and-bolts issues like fetching data from memory. “That was actually the biggest bottleneck,” said Khosrowshahi. “It’s not machine learning. It’s getting the data in.”
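
To make the point concrete, here is a minimal, hypothetical sketch (not Taboola’s or Intel’s actual pipeline) of a recommender that times its data-fetch step separately from its model step. The fetch_features stand-in and its 2 ms delay are invented for illustration, but in systems like the one Khosrowshahi describes it is often that column, not the math, that eats the 100-millisecond budget.

```python
import time
import numpy as np

rng = np.random.default_rng(0)

def fetch_features(item_ids):
    """Stand-in for pulling candidate feature vectors out of memory or storage.
    The artificial 2 ms sleep models the data-access cost that can dominate
    the end-to-end latency budget."""
    time.sleep(0.002)
    return rng.standard_normal((len(item_ids), 128)).astype(np.float32)

def score(user_vector, item_matrix):
    """Cheap model step: rank candidates by dot-product affinity."""
    return item_matrix @ user_vector

def recommend(user_vector, candidate_ids, top_k=5):
    t0 = time.perf_counter()
    items = fetch_features(candidate_ids)                   # getting the data in
    t1 = time.perf_counter()
    best = np.argsort(-score(user_vector, items))[:top_k]   # the machine learning
    t2 = time.perf_counter()
    print(f"fetch: {(t1 - t0) * 1e3:.1f} ms, model: {(t2 - t1) * 1e3:.1f} ms")
    return [candidate_ids[i] for i in best]

user = rng.standard_normal(128).astype(np.float32)
print(recommend(user, list(range(1_000))))
```

Profiling the two stages separately, as this toy example does, is the kind of systems-level measurement that points optimization work at memory and interconnects rather than at the model itself.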

The recent growth spurt in AI has been one of brute-force computing, a paradigm set by companies that have brute force to spare: Amazon, Facebook, and Google. The thinking was that you needed massive amounts of data to draw conclusions, a notion Khosrowshahi considers outdated. All this power is rotting data scientists’ brains, he said, making them lazy about writing efficient algorithms. “The current models are not good at exploiting data,” he said. “Do you really need to see 100 million cats to learn how to identify cats?”

“Our models are improving dramatically, and performance is improving quite a bit,” he continued. “The algorithms are getting better at inferring data efficiently.”

Another Intel AI customer is Chinese ecommerce giant JD.com. It uses image recognition and comparison to help recommend products. “They know the customers’ proclivities from images they clicked on in the past,” said Khosrowshahi. If someone has clicked on a particular purse, for instance, JD.com can recommend other purses with similar styling in the future. “So we worked on this problem with [JD.com] of solving this algorithm more efficiently and at scale,” he said.
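
A rough sketch of that kind of image-similarity recommendation might look like the following, with invented product names and random vectors standing in for whatever embeddings JD.com actually computes from its images: candidates are ranked by cosine similarity to the embedding of the product a customer clicked.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical catalog: each product image has already been mapped to an
# embedding vector by some image-recognition model (random here, purely
# for illustration).
catalog_ids = [f"purse_{i}" for i in range(10_000)]
catalog_embeddings = rng.standard_normal((len(catalog_ids), 256)).astype(np.float32)
catalog_embeddings /= np.linalg.norm(catalog_embeddings, axis=1, keepdims=True)

def similar_items(clicked_index, top_k=5):
    """Rank the catalog by cosine similarity to the clicked product's embedding."""
    query = catalog_embeddings[clicked_index]
    scores = catalog_embeddings @ query        # cosine similarity (unit-norm vectors)
    best = np.argsort(-scores)[1 : top_k + 1]  # skip the clicked item itself
    return [(catalog_ids[i], float(scores[i])) for i in best]

print(similar_items(clicked_index=42))
```

At JD.com’s scale, the brute-force matrix product here would give way to an approximate nearest-neighbor index, which is the sort of “more efficiently and at scale” engineering Khosrowshahi is describing.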

Better algorithms aren’t just for giant companies. In fact, they are an equalizer between the giants and smaller players with modest means. “We’re going to see more and more examples of better models, with better performance, that use less data, less computation,” said Khosrowshahi. “And this is a really good, healthy sign because … organizations that don’t have all this data can actually produce things that are really valuable.”
