
Close to the Edge: Why edge computing is going to be way more important than AI


If you’re coming up for air whilst writing your “10 Ways You Can Optimise Users with Threads” guides for LinkedIn, stop. Even if it’s just for a moment, and look around you. What’s powering all these cage fights and bragging rights? Data, of course. But data is only as good as its owners, or its nourishment. In this fast-food era of data, the technologies supporting the communication of zeroes, ones, strings (and more) are also only as good as we are.



More Moore’s Law


A couple of years ago I sat in my home office watching a Zoom lecture where a guy, no more than twenty minutes away from me at ETH Zurich, talked to the audience about edge and quantum computing. Yeah, I do enjoy a good lecture in between bouts of k-drama and late-80s techno rave binges. During that awful pandemic, the era of ever-increasing connectivity and data-driven applications, edge computing (and quantum computing) had emerged as a revolutionary technology that promised to transform the way we process and analyse data. In a forum that ran alongside the lecture there were the usual naysayers calling out edge computing as too early and quantum computing as Moore’s Law bunkum, but I was hooked, mostly because as a kid I would read my dad’s quantum physics books like they were the Weekly World News.

So when artificial intelligence came to us in a nice ChatGPT-shaped box, we were enthralled. It could do everything we ever wanted, and yet, as of today, it’s flawed and you know it: it’s an unreliable witness sometimes, and slightly old-fashioned in its (our) opinions. AI suddenly started doing everything we couldn’t, largely because we couldn’t be bothered. And today? Even less so. But AI has nestled nicely into our lives; it’s the perfect gift to our fast-food data desires, even if it’s limited, and it’s the perfect friend to IoT, isn’t it? Well, not quite.


With the proliferation of Internet of Things (IoT) devices and the exponential growth of data generated at the network edge, traditional cloud computing architectures faced challenges around latency, bandwidth constraints and data privacy. That’s what the guy at the lecture started by telling us. But what I heard was something totally different. What I heard was the possibility that edge computing could bring to digital transformation and to shaping the future, and that, far from competing with AI, it is a fundamental component of it. Complementary today, maybe, but necessary in the future? Absolutely. So I want to demystify edge computing and show you why this is all you’ll be thinking about soon. Get your LinkedIn guides at the ready!



Instant Karma


Really simply: edge computing is about decentralising computing resources (CPU, memory, network, storage and possibly graphics) by bringing data processing and storage closer to the source of data generation. Where cloud computing transmits data to a remote data centre to process and analyse it, edge computing does that work before the data ever jumps out into the cloud, by performing the task at the edge, usually the edge of the network. See what they did there? It can be done on the IoT devices themselves or on cute little edge servers.
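
To make that concrete, here is a minimal Python sketch of the idea: a loop that reads a local sensor, makes the decision on the device, and only sends a small batch summary up to the cloud. Everything in it (read_sensor, send_to_cloud, the 34 °C alert threshold) is invented for illustration; it’s the shape of the thing, not a real edge framework.

```python
import random
import statistics
import time

def read_sensor() -> float:
    # Stand-in for a real local sensor read (say, temperature in degrees C).
    return random.uniform(18.0, 35.0)

def send_to_cloud(payload: dict) -> None:
    # Stand-in for an upload to a remote data centre.
    print("uploading:", payload)

def edge_loop(samples_per_batch: int = 60, batches: int = 3) -> None:
    window = []
    while batches > 0:
        window.append(read_sensor())
        if len(window) == samples_per_batch:
            send_to_cloud({
                "mean": round(statistics.mean(window), 2),
                "max": round(max(window), 2),
                "alert": max(window) > 34.0,   # the decision is made here, at the edge
            })
            window.clear()
            batches -= 1
        time.sleep(0.01)

edge_loop()
```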


Why would anyone bother their arse to have all the data analysed and processed so close to everything? In the past we’ve used cloud computing for volume because it’s cost-effective, but that doesn’t mitigate risks; in some cases cloud computing has actually increased the risks around managing demand and load, because those cloud computers belong to someone else.


One of the main advantages of edge computing is its ability to minimise latency. Latency, or as we gamers call it, lag, is the time delay between sending data and getting a response back, and it’s a critical factor in many real-time applications; in games we know this on both sides of the server. By processing data locally, edge computing significantly reduces the round-trip time between the data source and the computing resource. This low-lag environment enables time-sensitive applications like autonomous vehicles, industrial automation and AR (yes, absolutely AR) to function seamlessly, ensuring rapid decision-making and enhanced user experiences.
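
For a sense of scale, here’s a back-of-envelope budget. The numbers are assumptions picked for illustration (roughly 80 ms round trip to a regional data centre versus 5 ms to an edge server on the local network), not benchmarks.

```python
# Toy latency budget: all figures are illustrative assumptions, not measurements.
cloud_round_trip_ms = 80    # device -> regional data centre -> device
edge_round_trip_ms = 5      # device -> edge server on the local network -> device
processing_ms = 10          # the same analysis, wherever it happens to run

print("via the cloud:", cloud_round_trip_ms + processing_ms, "ms")  # 90 ms
print("via the edge: ", edge_round_trip_ms + processing_ms, "ms")   # 15 ms

# A camera feed at 30 frames per second leaves roughly 33 ms per frame to react;
# in this toy budget only the edge path fits inside that window.
```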


Edge computing also has the ability to alleviate the strain (or should that be drain?) on network bandwidth. I hope OpenAI is reading this! With the explosive growth of IoT (and AI) devices, sending all that generated data to centralised cloud servers can overwhelm network infrastructure. By processing data at the network edge, the edge computer can filter and aggregate the data, reducing the volume of information sent up to the cloud. Cool, eh? In the middle of a cost-of-living crisis, even your working capital is going to take a hit if you’re a business owner. Large-scale projects would actually benefit in the long term by adopting edge computing right now.
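
Here’s roughly what that filtering and aggregation looks like. The readings, the anomaly threshold and the JSON payloads are all made up; the point is simply the size difference between shipping everything and shipping a summary.

```python
import json
import random
import statistics

# Pretend readings: ten minutes of one-per-second sensor values.
readings = [round(random.uniform(18.0, 35.0), 2) for _ in range(600)]

# Cloud-only approach: ship every raw reading upstream.
raw_payload = json.dumps(readings)

# Edge approach: keep the raw data local, send a small summary plus any anomalies.
summary_payload = json.dumps({
    "mean": round(statistics.mean(readings), 2),
    "max": max(readings),
    "anomalies": [r for r in readings if r > 34.0],
})

print(len(raw_payload), "bytes raw vs", len(summary_payload), "bytes summarised")
```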



Your Stuff is Yours


My fave subject (aka why-don’t-you-follow-Heidi-Saas-Debbie-Reynolds-and-Jackie-McGuire): data privacy and security! Woo hoo! Listen, it’s a simple one: use your edge network to keep sensitive information closer. That’s the tweet. OK, I’ll extend it a little and say that I live in a country that has banking secrecy, so cloud computing is never used for any information pertaining to individuals’ bank balances. The same goes for health data, perhaps? For me, definitely. And look, let’s all get on the same page about Threads: I don’t care about another Meta product; the data brokerage is too much to bear. I want to be in control of who contacts me, how I’m seen and who knows what I’m doing or where I’m buying. You can opt out of most of this stuff by simply using providers like BitsAboutMe. I’m just sayin’: adding yet another Meta data mining source to your life doesn’t help anyone but Meta.


Back to the subject! Edge computing addresses data privacy and security concerns by keeping sensitive information closer to its source. Sensitive data sent over any network has to be handled in a compliant way. By processing data locally, edge computing allows organisations to retain greater control over their data, reducing the risk of data breaches and making it easier to comply with regulations. Data is a currency; it’s something we should care more about. Rant over.
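
A toy illustration of the principle (not compliance advice): the edge node strips or pseudonymises the identifying fields before anything leaves the building, so whatever lives in the cloud only ever sees what it needs. The field names and values are invented.

```python
import hashlib

def prepare_for_cloud(record: dict) -> dict:
    # Runs on the edge node: drop the directly identifying fields and
    # pseudonymise the account reference before anything is uploaded.
    return {
        "account_ref": hashlib.sha256(record["account_id"].encode()).hexdigest()[:12],
        "balance_band": record["balance_band"],
    }

record = {
    "account_id": "CH-000123",   # invented example data
    "name": "A. Person",
    "balance_band": "B",
}
print(prepare_for_cloud(record))  # only the hashed reference and the band leave the edge
```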



Let’s Work Together


Edge computing can actually complement cloud computing. There, I said it. The best cloud computing tool at your fingertips that I’ve used has to be AWS, by a country mile. The cloud provides virtually limitless storage and computational power, but some applications require real-time processing and immediate decision-making, which may not be feasible in a centralised cloud environment. Offloading those data processing tasks to the network edge is where edge computing’s flexibility earns its keep.


Implementing edge computing solutions comes with its own set of challenges. The distributed nature of edge computing requires some solid networking infrastructure to connect edge devices and servers effectively. Expensive. Additionally, managing and securing a large number of edge devices dispersed across geographically diverse locations can be complex. And expensive. Organisations must also carefully consider the balance between local processing and cloud offloading, ensuring an optimal allocation of resources to achieve the desired performance and cost efficiency. So a hybrid environment of cloud and edge might be the better fit, with broader appeal for businesses wanting to do it all. Web3 gaming, for example, might hugely benefit from systems where real-time data either needs to be parsed for the blockchain or is being coupled and decoupled in micro feature updates.
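
For a feel of what that hybrid balance could look like, here is a deliberately naive routing policy. The thresholds are invented and a real deployment would also weigh cost, data gravity and regulation, but it shows the shape of the decision.

```python
def route(task: dict) -> str:
    """Decide where a job should run in a hybrid edge/cloud setup (toy policy)."""
    if task["max_latency_ms"] <= 50:
        return "edge"    # real-time work stays next to the data source
    if task["cpu_hours"] >= 1:
        return "cloud"   # heavy batch jobs go where compute is elastic
    if task["data_gb"] >= 1:
        return "edge"    # moving lots of data can cost more than processing it locally
    return "cloud"

print(route({"max_latency_ms": 20, "cpu_hours": 0.1, "data_gb": 0.01}))   # edge
print(route({"max_latency_ms": 60000, "cpu_hours": 8, "data_gb": 0.2}))   # cloud
print(route({"max_latency_ms": 500, "cpu_hours": 0.2, "data_gb": 5}))     # edge
```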



Go Further Without Moving


Distributed architecture is everyone’s friend. Bringing processing capabilities closer to the data source using edge computing is a big mood. This decentralised approach allows for faster and more efficient processing by leveraging local resources. It also means failures get identified faster, letting us fail quickly and iterate, because technical failures in individual edge devices or servers do not necessarily disrupt the entire network. Fab use cases! AI, on the other hand, typically relies on centralised cloud infrastructure, which can introduce single points of failure and limit scalability in certain scenarios. I bet you didn’t think about that when you were dumpster diving through your ChatGPT service. Yeah, that’s why you pay for it.


The era of digital transformation is now, and it’s been a golden age spanning at least the last five years without edge computing ever coming into vogue. But like that guy who gave that lecture on a wet evening during the pandemic, I can see the light. Edge computing will play a pivotal role in enabling innovative applications and technologies; it has to. Readying ourselves for whatever happens next in our world, from pandemics to climate change, is something the last few years have taught us to do. And technology is advancing towards convenience at such a revolutionary pace that it’s hard not to bring edge computing into the conversation as we plan our world. From smart cities and autonomous vehicles to intelligent manufacturing and immersive virtual reality experiences, edge computing’s ability to process data locally and deliver that low-lag currency will continue to unlock new possibilities for us. Additionally, the convergence of edge computing with artificial intelligence and machine learning will facilitate real-time decision-making, predictive analytics and personalised experiences. A lot of us have been waiting for the switch to flip for years now, and I’m ready. I’m here for it. Let’s go!







