Edge computing is ushering in a new era of technology. Operating seamlessly with the cloud, the edge allows organizations to meet the challenge of this ever-more-connected world, where we are all reliant on applications that must function anywhere and everywhere. With the edge, the benefits of the cloud—lower costs, increased agility, ability to scale up and down as needed, faster innovation—can be extended from the cloud to the precise locations where they are needed most.
This ability to deliver computing power where it is required has never been more important. It also has significant implications for accelerating the deployment of IoT, as well as our ability to truly reap the full benefits of artificial intelligence (AI), machine learning (ML), virtual reality (VR), and augmented reality (AR). Any app or industry that depends on lower latency, higher availability, and greater privacy—from self-driving cars, robot-assisted surgery, and smart factories to predictive maintenance and multiplayer online gaming—stands to benefit. These all require vast amounts of data to be available, on demand, regardless of geography.
The combination of the cloud, hybrid, and edge is already meeting the needs of today’s organizations, such as the demand for low latency, caching, local data processing, and even data residency requirements with a localized on-premises/on-site component. The applications are nearly endless: smart highways, SaaS web applications, medical imaging, and streaming services like Hulu and Netflix that must serve customers immediately, on demand, with no latency issues.
This is where life at the edge enables the best of both worlds. Edge computing provides businesses with the same infrastructure, services, APIs, and tools they use in the cloud, where they need it the most. Behind the scenes, this weaving together of cloud, hybrid, and edge pulls together a great many moving parts and can be intimidatingly complex for developers and IT administrators who try to make it happen on their own. This is why AWS creates a consistent, secure experience across all environments, so its customers can continue to innovate.
Couchbase, one of the most dynamic companies pioneering these processes, is a next-generation database company and an AWS customer. Couchbase uses the AWS services it needs in AWS Regions and AWS Local Zones to bring the data and analytics of digital-forward companies closer to their end users. Over the past decade, Couchbase has been at the forefront of the edge computing revolution, rethinking the way data is distributed and, in the process, creating one of the world’s most ambitious and robust systems for managing information.
Companies in almost every industry have built mission-critical apps and services using Couchbase’s technology to provide the highest guarantees of speed, availability, and data governance for customers.
Here, Wayne Carter, Couchbase’s vice president of engineering, discusses the ways the edge expands opportunities for innovation, how it enables systems that follow users as they skip between as many as 40 different networks a day, and why “latency is the enemy” but victory is within sight.
How has edge computing changed during the past decade?
Wayne Carter: When I started at Couchbase in 2013, the main edge devices were mobile phones connecting to self-managed data centers at the edge. Now the major cloud computing platforms extend all the way out to the edge, enabling enterprises to benefit from edge computing without needing to manage the highly distributed infrastructure. This means that instead of having to run all your own systems, there are now infrastructure companies like AWS and database companies like Couchbase to which you can outsource just about everything. Running workloads and accessing data on edge locations is a lot easier and less expensive today, opening it up to an entirely new range of businesses.
You’ve said that the edge creates new ways to think about data. How so?
The edge allows us to decentralize data and put it close to every user so it’s available incredibly quickly, whether that’s on a mobile phone, a point-of-sale system, a package scanner, or the kind of isolated digital system you might find on an airplane. By combining the power of the edge and the cloud you guarantee machines not only have access to the data they need, but that the data they collect ultimately makes it back out to the rest of the system. Taken together, we see significant improvements for digital products and experiences.
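The edge-first pattern Carter describes can be sketched in a few lines. The code below is a deliberately simplified, hypothetical illustration (it is not Couchbase's actual API): an edge node serves reads and writes from a local store for speed, queues every change, and later flushes that queue to a cloud store so the data "makes it back out to the rest of the system."

```python
# Illustrative sketch of edge-first storage with deferred cloud sync.
# (Hypothetical names; not Couchbase's real interface.)
class EdgeNode:
    def __init__(self):
        self.local = {}    # local store: reads and writes are immediate
        self.pending = []  # changes not yet replicated to the cloud

    def write(self, key, value):
        self.local[key] = value            # serve the user right away
        self.pending.append((key, value))  # remember to sync later

    def read(self, key):
        return self.local.get(key)         # no round trip to the cloud

    def sync(self, cloud):
        """Flush queued changes to the cloud store, e.g. when back online."""
        while self.pending:
            key, value = self.pending.pop(0)
            cloud[key] = value

# Usage: a package scanner records a scan while offline, then syncs.
cloud_db = {}
scanner = EdgeNode()
scanner.write("pkg-001", "delivered")
assert scanner.read("pkg-001") == "delivered"  # instant local read
scanner.sync(cloud_db)
assert cloud_db["pkg-001"] == "delivered"      # data reached the cloud
```

A production system would also handle conflicts and retries, but the core idea is the same: the device never waits on the network to do its job.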
Couchbase and AWS both offer services that help companies manage their edge computing needs. What do companies get by turning to these providers instead of doing this themselves?
Most companies want to free their developers from operational work so they can focus on higher-value projects. When you reduce the operational responsibility of your application teams, they can iterate faster and focus on building applications on top of these new, faster, more available systems. You remove the costs of running your own hardware, and you get rid of the complexity of maintaining hardware and keeping the underlying software stack and tool base healthy and up to date. That’s a major boon for application teams. It all allows them to focus on their core product, iterate faster, and deploy to the cloud and edge in minutes, all on systems that are maintained and scale automatically.
For consumers, latency can show up as things like a spinning circle that pops up when downloading a movie at home. But apart from inconvenience, what is the true impact of latency?
Latency is a measure of the response time of a network: the time from when a user makes a request for data to when the data is delivered, typically measured in milliseconds. A quarter-second difference in response time sounds minor, but if you’re doing five million data transactions in a day, 250 milliseconds on each one adds up to roughly 14 days of cumulative delay, which might as well be considered lost revenue. So yeah, latency is the enemy. It can be a major drag on your bottom line. Getting down to single-digit-millisecond latency is the goal.
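Carter's back-of-the-envelope arithmetic checks out. The sketch below just multiplies it through, using the figures from the interview (five million transactions, 250 ms each):

```python
# Cumulative cost of 250 ms of extra latency across 5 million
# daily transactions, expressed in days of delay.
transactions_per_day = 5_000_000
extra_latency_s = 0.250  # 250 milliseconds per transaction

total_delay_s = transactions_per_day * extra_latency_s  # 1,250,000 s
total_delay_days = total_delay_s / 86_400               # 86,400 s per day

print(f"{total_delay_days:.1f} days of cumulative delay")  # ~14.5 days
```

In other words, one day of traffic accumulates about two weeks' worth of waiting, which is why shaving a quarter second per request matters at scale.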
What are the best ways to keep latency low?
Generally speaking, the closer the compute and database are to the user, the faster the response. AWS has a comprehensive offering of edge services for different scenarios. Putting an AWS Outposts rack or server on-site can reduce latency. AWS Local Zones provide very fast, low-latency servers in major metropolitan areas where businesses might not have space to run their own data center. AWS Wavelength is built into 5G cellular systems, which also puts the compute close to the user.
An important element of edge computing is creating a consistent experience from the edge to the cloud. What’s an example of how that plays out?
When it comes to network complexity, databases, and edge, one industry that comes immediately to mind is air travel. Airlines run a whole raft of parallel processes—very different things—sometimes next to one another, sometimes across the country, sometimes on a plane that’s 30,000 feet over the middle of the Atlantic. They’re managing ticketing, customer service, maintenance systems, in-flight entertainment, AI chatbots, and point-of-sale systems both onboard and in the terminal. Plus, they’re feeding data to almost all their employees in real time, while their employees move through as many as 40 networks a day. It all has to work seamlessly while remaining in compliance with access, control, and regulatory rules. And you need to know you’re moving your data securely and responsibly. It’s a lot! Offloading the enormous cost and operational complexity of running those systems has proven extremely attractive to airlines.
Do you have a favorite piece of the edge infrastructure, in terms of functionality or what it enables?
It’s usually not one single edge offering, but how a combination of edge services works together to deliver a truly distributed edge environment. For instance, when you have both Wavelength and Local Zones in an area, you have two separate options for single-digit–millisecond latency, which gives you the seamless connection between the AWS cloud and your edge systems, so the same code can run in either or both places. I also love the idea of combining AWS Private 5G and the Snow family devices. You could literally put a smart factory in the middle of the wilderness without reliable connectivity to the internet and have it all running on your own 5G network with massive local compute running on AWS. That might sound like an extreme example, but it’s actually not so different from the everyday situation on airplanes and cruise ships. And even if you’re just a DIY guy on a farm, getting this kind of full stack from Amazon and Couchbase is not cost-prohibitive. It’s a world-class system, it’s available off the shelf, and it’s pay as you go.
Last question. Can you talk about the importance of your cloud provider solution?
The right cloud provider was critical to our success. We chose AWS because they had optionality that wasn’t available from anyone else. They provide a continuum of services from the cloud all the way to the edge, which can include a number of different locations and environments. With AWS, our customers can use the same database to manage their edge applications and cloud applications, which gives them an amazing consistency for their applications wherever they’re running. So, for example, when those airline employees do their 40-network hopscotch—moving on and off networks, joining the European cloud, then joining the North American cloud, moving on and off private networks—all that connectivity is resolved automatically. The applications just ride on top of the network, and the database is absorbing all the interruptions in the network so you’re not affected by them. Combining AWS’s hybrid edge solutions with Couchbase’s database products enables a new class of applications that are faster and more available than ever before. And that’s the ball game. The number one thing you want for your company is for it to be up and for it to be fast.