Compared to the previous generation, today’s startups are increasingly cloud-centric. The dotcom generation had to bear the cost and complexity of deploying, managing, and scaling its own servers, networks, and data centers.
In contrast, today’s generation grew up in the just-in-time, pay-for-what-you-need, and scale-up-on-demand world that is cloud native.
Also see: Why the Future of Computing is at the Edge
Edge Benefits Beat Out the Cloud
But over the last two years, businesses have increasingly opted for edge-enabled, serverless infrastructures. This means there are no servers to manage, no locations to spin up, and, most importantly, no cloud computing contracts to analyze.
With edge-enabled, serverless infrastructures, businesses benefit from faster and more stable API performance, a decreased need for infrastructure support, and lower annual spend.
As a software practitioner for more than two decades, I have been through more “paradigm shifts” in computing than I can count. But I can confidently say this: The future of computing for an entire generation of companies will be “edge-native,” and the traditional cloud is the platform that will lose.
Also see: Top Edge Computing Companies
What is Edge Computing?
One of the main problems with any paradigm shift is the flood of new technologies that emerge in its early stages. The same has been true for edge computing, with “edge-compute” solutions on offer from companies running new infrastructures, from telecommunications providers, and even from the cloud-computing companies themselves.
When we talk about edge computing here, we mean the ability to run code at the network edge, specifically on the infrastructure of content delivery network (CDN) providers.
CDNs have been around since the beginning of the Internet. The major players (Akamai, Limelight, Cloudflare, Verizon Edgecast, and Fastly) have long helped companies deliver content quickly to end users by maintaining large, globally distributed caches of servers.
In the old model, these providers simply stored data for companies, ensuring fast response times as customers visited websites or downloaded software, because the content sat on a server as close to the customer as possible.
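That routing idea can be sketched as a lookup over points of presence (PoPs): send each request to the PoP with the lowest latency to the user and serve cached content from there. The PoP names and latency figures below are hypothetical illustrations, not real measurements.

```typescript
// Minimal sketch of CDN request routing: serve each request from the
// point of presence (PoP) with the lowest estimated latency to the user.
// All PoP names and latency numbers are hypothetical illustrations.

interface Pop {
  name: string;
  latencyMs: Map<string, number>; // estimated latency per user region
}

const pops: Pop[] = [
  { name: "us-east", latencyMs: new Map([["NYC", 5], ["London", 80], ["Tokyo", 160]]) },
  { name: "eu-west", latencyMs: new Map([["NYC", 75], ["London", 6], ["Tokyo", 220]]) },
  { name: "ap-ne",   latencyMs: new Map([["NYC", 170], ["London", 210], ["Tokyo", 4]]) },
];

// Pick the PoP with the lowest estimated latency for a user's region.
function nearestPop(userRegion: string): string {
  let best = pops[0];
  for (const pop of pops) {
    const cur = pop.latencyMs.get(userRegion) ?? Infinity;
    const bestLat = best.latencyMs.get(userRegion) ?? Infinity;
    if (cur < bestLat) best = pop;
  }
  return best.name;
}
```

In practice, CDNs route with anycast or DNS-based steering rather than an explicit table, but the effect is the same: the request lands on the closest copy of the content.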
Also see: Will Edge Computing Devour the Cloud?
Two Features that Differentiate Edge from Cloud
Programmable Server Resources and CDN
One change is that these server resources and the content delivery network itself are now programmable. This allows companies to move core API services off of centralized cloud servers and onto the existing globally distributed networks that the CDNs operate.
With edge solutions, companies that could only run servers in limited locations now have the ability to run APIs at a much larger scale, increasing user response speeds and the company’s global footprint.
Code Is Automatically Run on the Nearest Server
The second major change is how the code itself is deployed. With cloud computing, you rent a server and run your code on it. With edge computing, you simply deploy your code to the platform, and it automatically runs on whichever server is nearest to the user.
This idea, called serverless compute, is also offered by the cloud providers (AWS Lambda, Google Cloud Functions, and Azure Functions). But on the edge platforms, these functions run across a global fleet of servers with zero management overhead.
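As a sketch, an edge function is just a request handler with no server of its own: the platform invokes it on whichever PoP is nearest. The request and response shapes below are a framework-agnostic illustration, not any platform’s actual API; Cloudflare Workers, Fastly Compute, and Lambda@Edge each define their own handler signatures.

```typescript
// Framework-agnostic sketch of a serverless edge handler. The shapes
// here are illustrative only; real edge platforms each define their
// own handler signatures.

interface EdgeRequest {
  path: string;
  pop: string; // region of the PoP the platform invoked us on (hypothetical field)
}

interface EdgeResponse {
  status: number;
  body: string;
}

// The developer deploys only this function; there is no server to manage.
// The platform runs it on the PoP nearest the user.
function handleRequest(req: EdgeRequest): EdgeResponse {
  if (req.path === "/api/ping") {
    return { status: 200, body: `pong from ${req.pop}` };
  }
  return { status: 404, body: "not found" };
}
```

The deployment unit is the function itself: the platform, not the developer, decides where and how many copies of it run.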
The Economics of Edge Computing
Shifts in the technology landscape are dictated by the economics on offer, not just the innovation behind the product.
When cloud computing came to market, the economic advantage was instantly obvious: users could swap high upfront investments in their own server and network infrastructure for on-demand leases of compute power paid with a credit card, with zero upfront cost. Adoption was driven by economics, not technology.
With edge computing, we are seeing similar economic staying power. Cloud-computing companies buy server and data center infrastructure in bulk and resell it as capacity. Edge platforms, like CDNs, use edge computing to extract additional value from existing infrastructure, which in turn lowers the cost of providing compute services to customers.
CDNs are fundamentally simple servers: they hold copies of data (storage/memory), look up requests for that data (CPU), and return the data to the user (network). They also have plenty of idle CPU cycles throughout the day, since retrieving and transmitting cached data takes far less CPU power than running a full database engine.
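That retrieve-and-return loop is essentially a keyed cache lookup, with a trip back to the origin only on a miss. A minimal sketch, with the origin stubbed out as a plain function for illustration:

```typescript
// Minimal sketch of a CDN edge cache: return cached content if present,
// otherwise fetch it from the origin and store a copy. The origin is
// stubbed as a plain function here for illustration.

const cache = new Map<string, string>();
let originHits = 0; // counts slow-path trips back to the origin

// Stand-in for a request to the origin server (hypothetical).
function fetchFromOrigin(url: string): string {
  originHits++;
  return `content of ${url}`;
}

function serve(url: string): string {
  const cached = cache.get(url); // fast path: storage plus a little CPU
  if (cached !== undefined) return cached;
  const body = fetchFromOrigin(url); // slow path: only on a cache miss
  cache.set(url, body);
  return body;
}
```

Once the cache is warm, almost every request is the cheap path, which is exactly why the CPUs sit mostly idle and can be resold as serverless compute.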
With serverless models, CDN providers can further monetize that existing capacity, with meaningful economic impacts downstream.
As companies expand globally, cloud-based bandwidth costs will only increase, whereas such costs are not even a factor in edge-native pricing. In addition, edge-native solutions dramatically lower management costs (no servers to monitor), scale rapidly at a global level (code runs near users automatically), and simplify billing.
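The bandwidth point can be made concrete with a back-of-the-envelope comparison. The per-gigabyte and flat-fee figures below are purely hypothetical placeholders, not any provider’s actual rates; the shape of the curves is what matters.

```typescript
// Back-of-the-envelope cost comparison. All prices are hypothetical
// placeholders, not any provider's actual rates.

const CLOUD_EGRESS_PER_GB = 0.09; // hypothetical metered cloud egress price
const EDGE_FLAT_MONTHLY = 500;    // hypothetical flat edge platform fee
const EDGE_EGRESS_PER_GB = 0;     // bandwidth bundled into edge pricing

// Monthly cost as a function of gigabytes served to users.
function cloudCost(gbServed: number): number {
  return gbServed * CLOUD_EGRESS_PER_GB;
}

function edgeCost(gbServed: number): number {
  return EDGE_FLAT_MONTHLY + gbServed * EDGE_EGRESS_PER_GB;
}

// Cloud egress grows linearly with traffic, while the edge model stays
// flat, so past a break-even volume the edge is strictly cheaper.
```

Under these placeholder numbers, a company serving tens of terabytes a month pays thousands of dollars in cloud egress but a flat fee at the edge; the exact break-even point depends entirely on real contract pricing.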
For startups looking to offer low-cost solutions on a global scale, the simplicity and economics of the edge-native model are compelling. As the next generation of startups comes of age, we expect to see many adopt a cloud-free model.
We’ve seen the future of the cloud—and it lives on the edge. For businesses, this is a faster, more scalable, and dramatically cheaper solution to modern computing needs.
Also see: Why Cloud Means Cloud Native
About the Author:
Jake Loveless, CEO, Edgemesh
This UrIoTNews article is syndicated from Google News.