AI Devices that Walk, Roll and Fly -- and Tacos -- Draw Developers to NVIDIA HQ
By Lynette Farinas, Staff Writer, NVIDIA
At an event that spanned everything from robots to the software and semiconductors that power them, our autonomous machines team on Thursday hosted the first public meetup at NVIDIA's new headquarters building.
The evening event drew hundreds of developers to the heart of Silicon Valley to nosh on tacos, network and listen to talks on NVIDIA's Jetson embedded computing platform, our new TensorRT 3 inference software and the Redtail AI framework for autonomous mobile robotics.
The gathering comes amidst an AI boom -- unleashed by the parallel processing power of NVIDIA GPUs. Welcoming developers to the event, NVIDIA's Jesse Clayton explained how this revolution has given vast numbers of people access to image recognition, voice recognition and real-time translation services that can go beyond what humans are capable of.
The next phase: bringing the power of NVIDIA's parallel computing platform to devices with a limited power budget. That's where Jetson comes in. "Deep learning is very computationally intensive; it's a parallel computing problem, that's why researchers turn to GPUs," Clayton said.
Spreading parallel computing power to what technologists call the "edge" -- or the world outside of data centers and desktops -- is key to building devices with the intelligence to tackle last-mile delivery challenges, manage crops to feed a hungry planet, inspect infrastructure such as roads and bridges, and even perform surgery, Clayton explained.
The attendees at Thursday's gathering were eager to get started. Alvise Memo, a software engineer at Palo Alto-based startup Aquifi, explained he already relies on NVIDIA GPUs for machine learning -- and is eager to put that knowledge to work on Jetson.
A few highlights from the meetup:
The post AI Devices that Walk, Roll and Fly -- and Tacos -- Draw Developers to NVIDIA HQ appeared first on The Official NVIDIA Blog.