
Edge Computing vs. Cloud Computing: What It Means and Why It Matters

Edge computing and cloud computing are related but distinct technologies. Understanding both is crucial for making the best use of either.

Back when computers were room-size, multi-ton behemoths, office workers used terminals to do their jobs. Terminals were little more than a screen and a keyboard: a way to connect to the computing power of the enormous central computer.

As components got smaller and processors got faster and cheaper, personal computers entered the workplace. Each individual device had all the processing power an end user needed.

Now, we’ve nearly come full circle: Most work applications are cloud-based, and our computers are terminals that connect to the cloud (which is really an array of powerful computers). The only difference is that individual devices and local servers now have considerable computing power of their own. We have “edge” computing (the local devices) and “cloud” computing (the remote servers we access via the internet).

Each of these methods of getting work done has its place in a modern office. With the right edge and cloud strategy, you can increase efficiency, free up bandwidth, and reduce lag. Here’s what you need to know.

Cloud computing

What is cloud computing?

Cloud computing is a technique for hosting data and applications on remote, virtual “cloud” servers and databases, allowing users to access them from anywhere via the internet.

How does cloud computing work?

Cloud computing uses a network of remote, interconnected servers to provide services over the internet. These servers pool their computational resources to serve many users at once.

When a user needs services from the cloud, they connect via the internet, and the provider’s system automatically allocates and delivers the necessary resources on demand.
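
To make that flow concrete, here is a minimal sketch in Python using boto3, the AWS SDK, as one example of a cloud provider’s client library. The bucket and file names are hypothetical, and running it would require an AWS account with credentials configured; the point is simply that the client asks for a service and the provider handles the allocation.

```python
import boto3  # AWS SDK for Python; one example of a cloud provider's client library

# The client connects to the provider over the internet. Which physical
# server in the provider's data centers handles the request is the
# provider's concern, not ours.
s3 = boto3.client("s3")

# Upload a local file to cloud storage ("team-shared-bucket" is a
# hypothetical bucket name). The provider allocates whatever storage
# the request needs behind the scenes.
s3.upload_file("report.csv", "team-shared-bucket", "reports/report.csv")

# Any teammate with access can now read the same object from anywhere.
response = s3.get_object(Bucket="team-shared-bucket", Key="reports/report.csv")
print(response["Body"].read()[:100])
```

The same pattern holds for compute, databases, and other services: the user states a requirement, and the provider’s system provisions the resources.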

What are the advantages of cloud computing?

The main advantages are elastic scalability (capacity grows and shrinks with demand), access from anywhere with an internet connection, and lower upfront costs, since organizations pay for computing as a service instead of buying and maintaining their own hardware.

What are the disadvantages of cloud computing?

Cloud computing depends on a reliable internet connection, and every request must travel to a remote data center and back, which adds latency. Organizations also give up some control over where their data lives and how it is secured.

When should I use cloud computing?

Cloud computing is most useful for users who need to access their data remotely, have fluctuating workload requirements, or need to rapidly deploy digital applications and services cost-effectively.

For example, cloud computing is extremely helpful for remote collaboration: it gives every member of a team a shared place to store and exchange data, no matter where they are.

According to Learning and Development Expert David Rivers in his course Introduction to Cloud Computing for IT Pros, there are three primary use cases for cloud computing.


Edge computing

What is edge computing?

Edge computing is a technique for processing and storing data closer to its original “source”, at or near the “edge” of the data network.

How does edge computing work?

Edge devices process and prepare data locally, reducing the latency and bandwidth required to send that data to a central server for storage and analysis.

Edge servers can also filter the data they collect and prioritize what they send first, providing access to especially time-sensitive information in near real time.
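
Here is a minimal sketch, in Python, of that local process-filter-summarize loop. The read_sensor and send_to_cloud functions are hypothetical stand-ins for a real sensor driver and a real network call; the pattern to notice is that urgent readings go out immediately, while only a compact summary, rather than every raw sample, crosses the network.

```python
import random
import statistics

def read_sensor() -> float:
    """Stand-in for a real sensor driver; returns a simulated temperature."""
    return random.gauss(mu=40.0, sigma=2.0)

def send_to_cloud(payload: dict, urgent: bool = False) -> None:
    """Stand-in for an HTTPS call to the central server."""
    channel = "PRIORITY" if urgent else "batch"
    print(f"[{channel}] {payload}")

window = []
for _ in range(300):  # e.g., five minutes of readings at one per second
    reading = read_sensor()

    # Time-sensitive data is prioritized and forwarded immediately.
    if reading > 45.0:  # assumed alert threshold, for illustration
        send_to_cloud({"alert": "overheat", "value": round(reading, 2)}, urgent=True)

    window.append(reading)  # everything else is processed locally

# Only a small summary crosses the network, not 300 raw readings.
send_to_cloud({
    "samples": len(window),
    "mean": round(statistics.mean(window), 2),
    "max": round(max(window), 2),
})
```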


"When you think of edge computing, it's not just defining something that exists on one system. It truly defines an architecture that exists between many systems that are configured in such a way that edge computing is the way in which they're approaching the processing system."

- David Linthicum in Understanding Edge Computing in a Cloud Computing World

What are the advantages of edge computing?

Because data is processed at or near its source, edge computing offers lower latency, consumes less network bandwidth, and can keep working even when the connection to a central server is slow or unreliable. Keeping sensitive data local can also simplify privacy and compliance.

What are the disadvantages of edge computing?

Edge devices have limited processing power and storage compared with a cloud data center, and a fleet of distributed devices is harder to secure, monitor, and update than a single centralized system.

When should I use edge computing?

Edge computing is most useful in scenarios where low latency and real-time data processing are critical.

For example, a healthcare device connected to an edge server could provide real-time updates on a patient’s health status, allowing doctors to take immediate action whenever necessary.
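
A hedged sketch of that scenario, assuming a hypothetical read_heart_rate driver and log_to_cloud uploader, along with illustrative safe-range thresholds: the safety-critical check runs on the device itself, so the alarm never waits on a network round-trip, while routine readings still flow to the patient’s central record.

```python
import collections
import random

def read_heart_rate() -> int:
    """Stand-in for the real monitor hardware; returns a simulated reading."""
    return random.randint(45, 130)

def sound_local_alarm(bpm: int) -> None:
    """Acts immediately at the bedside; no network round-trip involved."""
    print(f"ALARM: heart rate {bpm} bpm outside safe range")

def log_to_cloud(readings: list) -> None:
    """Hypothetical background upload to the patient's central record."""
    print(f"uploading {len(readings)} readings")

SAFE_RANGE = range(50, 120)  # assumed thresholds, for illustration only
buffer = collections.deque(maxlen=60)

for _ in range(600):  # ten minutes of readings at one per second
    bpm = read_heart_rate()
    buffer.append(bpm)

    # The time-critical decision is made on the edge device itself.
    if bpm not in SAFE_RANGE:
        sound_local_alarm(bpm)

    # Routine data goes to the cloud once a minute, not on every beat.
    if len(buffer) == buffer.maxlen:
        log_to_cloud(list(buffer))
        buffer.clear()
```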


When to use cloud computing vs. edge computing

In short: reach for cloud computing when remote access, fluctuating capacity, or rapid, cost-effective deployment matters most, and reach for edge computing when low latency and real-time processing are critical. Often, though, the answer is not either/or.

Using both cloud computing and edge computing: the edge-cloud continuum

Many data applications may be best served by applying both edge and cloud computing technologies in tandem.

The “edge-cloud continuum” (or “edge-cloud spectrum”) refers to any configuration that combines cloud and edge computing technologies to deliver these kinds of data services and applications.

For example, an organization providing software applications on IoT devices could use an end-to-end stack spanning cloud and edge to optimize bandwidth and response times across a very large service area.
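
As a toy illustration of that split (and of the symbiotic relationship Linthicum describes below), here is a hedged Python sketch that divides one workload into a device-side part, which reduces raw data locally, and a cloud-side part, which aggregates results across the fleet. All names are hypothetical.

```python
from dataclasses import dataclass
from typing import Dict, List

# --- The part of the application that lives on the device (the edge) ---
@dataclass
class Summary:
    device_id: str
    mean_latency_ms: float

def summarize_on_device(device_id: str, samples: List[float]) -> Summary:
    """Runs on each IoT device: reduces raw samples to one small record."""
    return Summary(device_id, sum(samples) / len(samples))

# --- The part of the application that lives in the cloud ---------------
def aggregate_in_cloud(summaries: List[Summary]) -> Dict[str, float]:
    """Runs centrally: combines the summaries from the whole fleet."""
    return {s.device_id: s.mean_latency_ms for s in summaries}

# Each device does its own local reduction...
fleet = [
    summarize_on_device("sensor-a", [12.0, 15.0, 11.0]),
    summarize_on_device("sensor-b", [30.0, 28.0, 35.0]),
]

# ...and the cloud sees only one compact record per device.
print(aggregate_in_cloud(fleet))
```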


"You really can't have an edge based system if you don't have some sort of centralized processing. Typically, that will reside on a public cloud. Many systems start within the cloud and move to the edge. We may separate the application into a part that exists in the cloud and a part that exists in the device. So in essence, it's a symbiotic relationship."

- David Linthicum in Understanding Edge Computing in a Cloud Computing World

To equip your entire organization with everything it needs to make the most of both cloud and edge processing, try a demo of LinkedIn Learning today.