Why Low-Power Neural Network Accelerators Matter

Alasdair Allan
Mar 12, 2018


Depending on the state of current technology, the computer industry oscillates between building thin and thick client architectures. Either the bulk of our compute power and storage is hidden away in racks of sometimes distant servers, or it sits in a mass of distributed systems closer to home.

While the cloud is currently a good architectural solution, at least for most of our problems, I won’t be particularly surprised to see those preaching it as the last solution we’ll ever need proved wrong, yet again, as things begin to tip back towards much more distributed systems.

Interestingly, however, and perhaps for the first time, it might not just be the form factor of our computing that’s changing; it might well be our entire model of how computers are programmed, because deep learning is eating software.

“…it’s a radical change in how we build software. Instead of writing and maintaining intricate, layered tangles of logic, the developer has to become a teacher, a curator of training data and an analyst of results. This is very, very different than the programming I was taught in school, but what gets me most excited is that it should be far more accessible than traditional coding, once the tooling catches up.” — Pete Warden, Google
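To make that shift concrete, here’s a minimal sketch of the workflow Warden describes, written in Keras against entirely synthetic data (the feature shapes, labels, and model are placeholders of my own, not anything from a real project): the developer curates labelled examples, trains against them, and analyses the results.

```python
import numpy as np
import tensorflow as tf

# The "curated training data": 1,000 labelled examples of a toy signal.
x_train = np.random.rand(1000, 10).astype("float32")
y_train = (x_train.sum(axis=1) > 5.0).astype("float32")

# A tiny network stands in for the "intricate, layered tangles of logic".
model = tf.keras.Sequential([
    tf.keras.Input(shape=(10,)),
    tf.keras.layers.Dense(16, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy",
              metrics=["accuracy"])

# "Teaching" replaces writing logic; analysing results replaces debugging.
model.fit(x_train, y_train, epochs=5, verbose=0)
loss, accuracy = model.evaluate(x_train, y_train, verbose=0)
print(f"accuracy after training: {accuracy:.2f}")
```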

The arrival of low-cost, and relatively low-powered, hardware is driving this change, and over the last year or so we’ve seen an increasing movement of machine learning away from the cloud, and towards the edge.

This sort of hardware, deployed alongside openly licensed training data that allows real comparative benchmarking, shows that we’ve quickly reached the point where hardware at the edge is now almost cheap enough to throw away. That’s why I see the increasing diversity in deep-learning hardware targeting low-powered embedded devices as so important.
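The speech commands data that Warden benchmarks against later in this piece is exactly this kind of openly licensed training data. As a rough sketch (assuming you have the tensorflow_datasets package installed; the dataset name below is the one in its catalogue), pulling it down for your own comparative benchmarking is a one-liner:

```python
import tensorflow_datasets as tfds

# Google's openly licensed Speech Commands dataset: one-second utterances
# of short words, released so keyword spotters can be benchmarked against
# each other on a common footing.
ds, info = tfds.load("speech_commands", split="train", with_info=True)
print(info.splits["train"].num_examples, "labelled utterances")
print(info.features["label"].names)  # the keyword vocabulary
```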

We’re entering an age of capable computing. The computing available at the edge is now “good enough” for most of the things we need it to do.

These ideas are almost the antithesis of conventional wisdom, and if you talk to folks at Intel or most of the other big manufacturers, there is a lot of pushback against the notion that cheap, low-powered computing is the way forward. After many years in which Moore’s Law defined their business model, changing that model is going to be challenging. But I’m not alone here, and even front-end development is being disrupted by the emergence of deep learning.

GAP8 chip. (📷: GreenWaves Technologies)

Like Warden, I’ve preordered the new GreenWaves Developer Kit. Their development board is built around the GAP8 chip, which is based on the open source RISC-V processor architecture, and it will sit alongside the Movidius hardware that’s already on my desk, although the board has a rather different form factor to Intel’s Movidius hardware.

The GreenWaves GAPDUINO GAP8 development board has Arduino-compatible headers, which should let you use Arduino shields, along with a connector for an external camera module. It breaks out all the necessary I/O, including a USB-to-serial and JTAG bridge that lets you program the on-board flash and debug GAP8 applications.

“GreenWaves have just announced a new device and shared some numbers using the [speech commands dataset] as a benchmark… They’re showing power usage numbers of just a few milliwatts for an always-on keyword spotter, which is starting to approach the coin-battery-for-a-year target I think will open up a whole new world of uses.” — Pete Warden, Google
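For a sense of scale, a keyword spotter in this power class is a genuinely tiny network. The sketch below is my own illustration rather than GreenWaves’ actual model: a small convolutional net over MFCC frames (49 frames of 40 coefficients is a common preprocessing for the speech commands dataset) that comes in well under two thousand parameters. Quantised down to eight-bit weights, something this size is plausibly small enough to run always-on on milliwatt-class silicon like the GAP8.

```python
import tensorflow as tf

NUM_KEYWORDS = 12  # the dataset's usual split: 10 words + "silence" + "unknown"

# Input: roughly one second of audio as 49 frames of 40 MFCC features each.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(49, 40, 1)),
    tf.keras.layers.Conv2D(8, (3, 3), activation="relu"),
    tf.keras.layers.MaxPooling2D((2, 2)),
    tf.keras.layers.Conv2D(16, (3, 3), activation="relu"),
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(NUM_KEYWORDS, activation="softmax"),
])
model.summary()  # well under 2,000 parameters in total
```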

I think it’s likely that the next generation of computing will be distributed, more specialised, and designed around machine learning. I also think boards like this, along with other developer-focused products like Google’s AIY Projects Vision Kit (built around Intel’s Movidius chip), will prove important in prototyping the hardware for that next generation of computing. It’s going to be interesting to see where things go over the next year or so, but I’d confidently predict we’ll see more edge-based computing.
