How the cloud is turning into fog

Igor Bergman
Oct 20, 2021

As the cloud continues to grow, so do the options for managing it, and for developing software and solutions for use within it.

Gartner recently released impressive new figures for global cloud growth: this year, public cloud services spending will hit US $396 billion and is expected to grow to US $482 billion next year.

As Gartner also notes, trends within the overall ‘cloud’ are changing. For example, it points to regional cloud ecosystems, cloud infrastructure and platform services, and edge computing as developments that will accelerate cloud uptake in the coming years.

What this all means is that modern cloud service models are changing and evolving. As they do, the role of, and options around, the ‘fog’ increase. In fog computing, processing happens between endpoints and a gateway that transmits to the in-house server. This differs from on-premises computing, which happens entirely on an in-house server.

I think the value proposition for the ‘fog’ is straightforward: a very simple and understandable story. That, I believe, is why it’s becoming so successful.

But what exactly is the ‘fog’, and where does it play?

Planning for the fog

The word ‘fog’ is a good one to describe this addition to the evolving ‘cloud stack’. Commonly, cloud computing architecture is designed in such a way that it’s very centralized, with a huge data center that can be anywhere in the world, perhaps thousands of miles away from the client servers and users that are being managed, and the services that are being provided.

In comparison, fog is a distributed architecture, with perhaps hundreds of thousands of nodes connecting to local devices, across multiple services from different clouds, all working together to deliver efficient and fast user experiences.

It acts as an intermediate layer between the cloud’s servers themselves and the enterprise.

As a use case example, a modern factory would have a large number of machines or devices connected to local servers (edge processors), which would in turn be connected to the cloud.

If you were to have hundreds of thousands of devices connected directly to the cloud (as you would if you were a global manufacturing company), you would rapidly run into problems around data processing, computing, latency, connectivity, and all of the associated expense.

Once you have the ‘fog layer’, you process the very high volumes of data locally. This creates a new level of redundancy if the connection to the cloud is broken.
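As a minimal sketch of that idea (hypothetical class and method names, not any vendor's API), a fog node in this factory example might aggregate raw device readings locally and forward only compact summaries upstream, so the high-volume data never has to cross the network to the cloud:

```python
from statistics import mean

class FogNode:
    """Hypothetical fog-layer node: keeps raw, high-volume device readings
    local and sends only small aggregated summaries up to the cloud."""

    def __init__(self, batch_size=100):
        self.batch_size = batch_size
        self.buffer = []          # raw readings, held locally
        self.summaries_sent = []  # stand-in for the uplink to the cloud

    def ingest(self, reading):
        # Raw data stays in the fog layer until a batch is complete.
        self.buffer.append(reading)
        if len(self.buffer) >= self.batch_size:
            self._flush()

    def _flush(self):
        # Only a small summary record crosses the network.
        summary = {
            "count": len(self.buffer),
            "mean": mean(self.buffer),
            "max": max(self.buffer),
        }
        self.summaries_sent.append(summary)
        self.buffer = []

node = FogNode(batch_size=100)
for i in range(250):
    node.ingest(float(i % 50))

# 250 raw readings resulted in only 2 summaries sent upstream;
# the remaining 50 readings are still buffered locally.
print(len(node.summaries_sent), len(node.buffer))
```

The point of the sketch is the ratio: hundreds of raw readings become a handful of summary records on the wire, which is where the latency and cost savings of the fog layer come from.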

This is exciting for developers because ‘fog’ allows them to focus on creating new value built on their specialist expertise, and their special knowledge of their markets and their users’ needs, without having to replicate the cloud — something the cloud services providers already do very well. Developers can stop trying to replicate Google, AWS, Azure or Alibaba, and focus instead on building distributed services in new ways that take advantage of the fog’s inherent advantages — a distributed architecture that works with the cloud but does not rely on the cloud.

For users, they get the security and reliability that the fog can provide, along with the freedom and flexibility to work with different mixes of cloud providers, and different cloud configurations.

This is especially important when they want to move from one cloud service provider to another, securely and seamlessly, or add new providers alongside well-established partners (for example, when breaking into new geographic markets).

An evolving cloud

I’m sometimes asked if, having moved successfully to the cloud over the past 20 years, enterprises are now being asked to bring their processing back in-house.

I don’t think that’s the case.

In the early days of the cloud, the number of devices being connected directly to it was in the tens of thousands — a manageable number that didn’t cause any special problems or pose any particular risks. It was also the case that, back then, our reliance on the cloud wasn’t as large as it is today.

Now, the number of devices ultimately connected to the cloud is in the millions, with over 90% of all enterprises based in the cloud. The fog is therefore a sensible evolution of a very successful centralized approach.

Cloud and fog strategies

For something that, at the operational level, is complex, the benefits of the fog are surprisingly obvious. It doesn’t require a technically detailed white paper for a CEO or a board to recognize that keeping data traffic between devices inside an enterprise will lower costs, improve efficiency, and deliver robust data analysis faster. CEOs who support IT teams in building this type of infrastructure will set their companies up for a competitive advantage.

The cost benefits improve further when you consider that a large proportion of the required processing can, in many cases, be done locally, in the fog. Companies that do this well gain real-time data processing and analysis, because data is processed closer to the source rather than in an external data center or cloud, which reduces lag time.

The more globally distributed your customers, the more efficient fog becomes.

Being more secure in the fog

There are also significant security advantages with the fog. The cloud service providers are very good at security, but the cloud can still be a single point of failure. If you rely on it without a fog implementation between your devices and the cloud itself, you face additional risks if the connection suddenly drops, or is compromised in some other way.

With the fog, the business can continue even if data isn’t flowing to and from the cloud. If one site is disconnected from the cloud, predictive and proactive analytics can start working on whether that site can be brought back online within the service level agreements (SLAs) in place. If not, the cloud can start to re-route data and connections, or simply wait until the connection is restored. Meanwhile, operations continue within that site or enterprise, because of the edge. Business losses are minimized, compared to what they might have been.
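That buffer-and-retry behaviour can be sketched in a few lines (again, a toy illustration with invented names, not a specific product's API): the fog side keeps accepting data while the cloud link is down, and drains the backlog once the connection is restored.

```python
class CloudLink:
    """Toy stand-in for a cloud connection that can drop and recover."""
    def __init__(self):
        self.up = True
        self.received = []

    def send(self, item):
        if not self.up:
            raise ConnectionError("cloud link is down")
        self.received.append(item)

class FogUplink:
    """Fog-side uplink: local operations continue even when the cloud
    is unreachable; pending data is queued and drained on reconnect."""
    def __init__(self, link):
        self.link = link
        self.pending = []

    def publish(self, item):
        self.pending.append(item)
        self.drain()

    def drain(self):
        while self.pending:
            try:
                self.link.send(self.pending[0])
            except ConnectionError:
                return  # keep the backlog; business continues locally
            self.pending.pop(0)

link = CloudLink()
uplink = FogUplink(link)

uplink.publish("reading-1")   # delivered immediately
link.up = False               # cloud connection drops
uplink.publish("reading-2")   # queued locally, nothing is lost
uplink.publish("reading-3")
link.up = True                # connection restored
uplink.drain()                # backlog flushed to the cloud

print(link.received)          # all three readings arrive, in order
```

Real fog platforms add durability, ordering guarantees, and the SLA-aware re-routing described above, but the core continuity property is the same: losing the cloud link pauses the uplink, not the business.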

As for managing the fog, software architectures that are open, containerized, scalable, globalized, and available 24x7, such as Lenovo’s platform, make it straightforward to manage both the devices connected to a fog implementation and the network of edge servers.

As companies focus on customers, they will need innovative and efficient ways of processing data, in cloud ecosystems that are flexible and scalable, yet remain secure and don’t constrain transactions.

The ‘fog’ therefore helps enterprises react faster, have better performance, reduce costs, improve security and survive any breaks in the connection to the Internet.

The fog is an exciting option for software developers considering the architecture of their solutions, how they deploy those solutions, and how they build in the services that support them.

I believe that, as part of the broader cloud ecosystem, across public cloud and private clouds, as well as multiple cloud service providers, the fog will become an important contributor to the continued growth of cloud services.

And this will not just apply to enterprises running their businesses in the cloud: it will also apply to software development teams creating new solutions for the cloud.
