Intel Snaps Up Nervana to Jump-Start AI


Intel last week announced the acquisition of startup Nervana, in a bid to enhance the company's capabilities in artificial intelligence and deep learning.

Nervana jumped out of the gate at its 2014 launch with a robust platform for deep learning, a framework called "Nervana Neon," and the Nervana Engine -- a groundbreaking ASIC scheduled for introduction by the first quarter of 2017.

Nervana's IP and expertise in accelerating deep learning algorithms will expand Intel's capabilities in the field of artificial intelligence, according to Diane Bryant, general manager of Intel's Data Center Group.

"We will apply Nervana's software expertise to further optimize the Intel Math Kernel Library and its integration into industry standard frameworks," she said.

The Nervana Engine and the company's silicon expertise will advance Intel's AI portfolio, Bryant added, enhancing deep learning performance and lowering the total cost of ownership of its Intel Xeon and Intel Xeon Phi processors.

"Nervana has developed a unique capability to process data more quickly across various environments, ranging from open source to highly customized computing systems," observed Jeff Kaplan, managing director of ThinkStrategies.

"As in many M&A situations, Intel's acquisition is aimed at quickly obtaining technological innovations, along with the talented team that developed the new solution," he told the E-Commerce Times.

That talent pool includes Nervana CEO Naveen Rao, a former Qualcomm researcher with a PhD from Brown University, and CTO Amir Khosrowshahi, among others. The firm has raised more than US$28 million in funding from investors.

Nervana earlier this year introduced its Nervana Cloud platform, which the company has described as 10 times faster than competing AI cloud platforms. It allows organizations of various sizes to build and deploy deep learning solutions without having to make large investments in machine learning or data teams.

Nervana Cloud customers include Blue River Technologies, which uses agricultural robots along with the Nervana Cloud to boost crop yields, and Paradigm, an oil and gas software developer that uses the cloud platform to help identify subsurface faults embedded in 3D seismic images.

The new Nervana Engine chip will be able to handle large volumes of data at speeds that its competitors cannot match, according to the company, through incorporation of a technology called "high-bandwidth memory," which combines 32 GB of on-chip storage with memory access speeds of 8 terabits per second.
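To put those two figures in perspective, a quick back-of-envelope calculation (using only the capacity and bandwidth numbers the company cites, not any measured performance data) shows how long a full pass over the chip's on-chip memory would take at the claimed access speed:

```python
# Hypothetical arithmetic based on the article's stated specs:
# 32 GB of on-chip storage read at 8 terabits per second.
BITS_PER_GB = 8e9                    # decimal gigabyte, in bits
capacity_bits = 32 * BITS_PER_GB     # 32 GB of on-chip storage
bandwidth_bps = 8e12                 # 8 Tb/s claimed access speed

# Time to stream the entire memory contents once.
sweep_seconds = capacity_bits / bandwidth_bps
print(f"Full-memory sweep: {sweep_seconds * 1000:.0f} ms")  # 32 ms
```

At the claimed bandwidth, the engine could stream its entire 32 GB memory in roughly 32 milliseconds, which illustrates why high memory bandwidth matters for deep learning workloads that repeatedly sweep large weight matrices.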

Intel doesn't have much of a machine learning business at the moment, so it needed to acquire Nervana in order to catch up with the competition in the segment, said Paul Teich, principal analyst at Tirias Research.

"They have invested in software, including some open source work," he told the E-Commerce Times, "but anyone doing AI and machine learning R&D or deploying deep learning at scale will use hardware accelerators."

Nvidia's GPUs are currently the most favored accelerators in the business, "which has to rankle Intel," Teich said.

"Xeon Phi simply didn't move in the right direction. This is reminiscent of Intel putting a lot of investment into a new Itanium 64-bit architecture instead of adding 64-bit instructions to their X86 cores, which AMD eventually did for them," he pointed out.

Deep learning takes a "fundamentally different approach than big or little x86 cores," explained Teich. "Intel's software folks saw deep learning coming, but their silicon architecture and design teams did not."

David Jones is a freelance writer based in Essex County, New Jersey. He has written for Reuters, Bloomberg, Crain's New York Business and The New York Times.
