Webinar: AI Gets Edgy: How New Chips and Code Are Pushing Artificial Intelligence From the Centralized Cloud
July 20, 2016
VP of Research
Artificial intelligence (AI) has long depended on power-hungry, high-performance processors and large datasets for training neural networks, both of which presupposed centralized computing architectures. Today, more capable chips are letting AI escape centralized, cloud-based systems and move out to devices at the edge of the network. Among incumbents, Intel paid a whopping $16.7 billion for FPGA maker Altera; Google has built a custom AI chip, the Tensor Processing Unit, and is working with Movidius to put deep learning on a USB stick; and Nvidia has invested some $2 billion so far in its Tesla line of graphics processors for machine vision and other AI workloads. At the same time, startups like krtkl, KnuEdge, Nervana, and China's Horizon Robotics, along with research efforts such as MIT's Eyeriss, are developing new chips to bring AI to robots, self-driving cars, and other devices in the Internet of Things that must operate independently over intermittent network connections: an IoT without the Internet.
Register to attend
July 20, 2016, 11:00 AM - 12:00 PM EDT