The Internet of Things might have less Internet than we thought?

“Privacy was never dead, it just went away for a while…”

Alasdair Allan

--

This is the transcript of the keynote talk I gave at QCon London in March 2020, in which I talked about machine learning, edge computing, and privacy.

Talking on the keynote stage on the first day of QCon London 2020. (📷: Danilo Teodoro)

Machine learning is traditionally associated with heavy-duty, power-hungry processors. It’s something done on big servers. Even if the sensors, cameras, and microphones taking the data are themselves local, the compute that controls them is far away. The processes that make decisions are all in the cloud.

But this is now changing, and that change is happening remarkably quickly and for a whole bunch of different reasons.

The nature of what we do as developers means that we often obsess about now and next, rather than taking the time to put things into proper historical context, and I think that’s a mistake.

So every once in a while it’s worth taking a step back and looking at history, then deciding whether we want to repeat our mistakes, and our triumphs, just one more time, or whether we should be doing something different this time around.

Because we’ve been here before, back in 2011. It was, by any measure, the dawn of the Big Data era. There were new tools appearing, new levers to move the world, and we all got rather excited.

However, at the endless succession of big data conferences that followed, I mostly didn’t talk about big data. Instead I talked about machine learning, and small data. Distributed data.

Nobody was all that excited about machine learning back then; it’s funny how things turn out. Because almost a decade later, it turns out that machine learning and small data systems might well be about to take over.

Now, for anyone who’s been around a while this isn’t going to be a surprise, because throughout the history of the industry, depending on the state of our technology, we seem to oscillate…

--