What Developers Need to Know About Edge Computing – DevOps.com

More and more companies are living “on the edge” as they move from “I’ll think about edge computing” to “I don’t think our business can optimally grow and compete without edge computing.” Developers will play a key role in helping organizations use the edge, but their effectiveness will depend on having edge-aligned processes, products and people in place.

Edge computing takes place near the physical location of either the user or a data source. It’s one way that organizations can use and distribute a common pool of resources across multiple locations. The edge model provides users with faster, more reliable services while expanding the potential of hybrid cloud computing for organizations. Customers and businesses alike will benefit from the way in which edge computing optimizes the use and application of artificial intelligence and machine learning by running data-intensive applications closer to the data source.

While it is in many ways a question of architecture, applications are key to effectively exploiting the edge. Here are some of the things developers—and their organizations—need to know.

Edge computing and cloud computing are not mutually exclusive. The best solutions will leverage both platforms. For example, centralized computing might be used for storage and compute-intensive workloads, while edge computing would be used to exploit data at the source, in near-real-time. Infrastructure and company culture need to be in place to support this, but developers must be thinking in terms of building applications that can cross and bridge computing platforms.
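As an illustrative sketch of that split (the placement rule, names and thresholds here are hypothetical, not from the article), an application might decide per workload whether to process data at the edge or hand it off to a central cloud:

```python
from dataclasses import dataclass

@dataclass
class Workload:
    name: str
    latency_budget_ms: int   # how quickly a result is needed
    data_volume_mb: float    # how much raw data the job touches

def placement(w: Workload) -> str:
    """Toy placement rule: latency-sensitive jobs stay at the edge,
    near the data source; everything else centralizes in the cloud."""
    if w.latency_budget_ms <= 100:
        return "edge"
    return "cloud"

print(placement(Workload("anomaly-detect", 50, 2.0)))      # edge
print(placement(Workload("model-training", 60000, 5000)))  # cloud
```

The point is not the rule itself but that the application is written to run on either platform, so placement becomes a policy decision rather than a rewrite.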

Consistent tooling is essential. Developing for legacy IoT meant device-specific work that required specialized tools and skills. The modern edge, in contrast, is part of a technology continuum, and developers need tools that they can use across that continuum, including:


  • Provider/enterprise core: The “non-edge” tiers, owned and operated by public cloud providers, telco service providers or large enterprises.
  • Service provider edge: Tiers located between the core or regional data centers and the last-mile access.
  • End user premises edge: Tiers on the end-user side of the last-mile access; these can include the enterprise or the consumer edge.
  • Device edge: Standalone (non-clustered) systems that directly connect sensors/actuators via non-internet protocols. This represents the far edge of the network.
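One way to make that continuum concrete in code (the names below are illustrative, not a standard API) is to model the four tiers as an enumeration that deployment tooling can target:

```python
from enum import Enum

class EdgeTier(Enum):
    """The four tiers described above, from core to far edge."""
    PROVIDER_CORE = "provider/enterprise core"
    SERVICE_PROVIDER_EDGE = "service provider edge"
    END_USER_PREMISES_EDGE = "end user premises edge"
    DEVICE_EDGE = "device edge"

def is_clustered(tier: EdgeTier) -> bool:
    """Per the list above, only the device edge is standalone (non-clustered)."""
    return tier is not EdgeTier.DEVICE_EDGE

print(is_clustered(EdgeTier.DEVICE_EDGE))    # False
print(is_clustered(EdgeTier.PROVIDER_CORE))  # True
```

A shared vocabulary like this is one small example of the consistent tooling the article calls for: the same names mean the same things from core to far edge.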

Containerization enables flexibility and scale. Linux container technology wasn’t built for the edge, but the model fits edge requirements for modularity, segregation and immutability perfectly.
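As a minimal sketch of that fit (the base image, file names and user are assumptions for illustration, not a recommendation), a small edge service can be packaged as an immutable container image and deployed unchanged to any tier:

```dockerfile
# Build a small, immutable image for a hypothetical edge service.
FROM python:3.12-slim

# Copy only the application; the image itself is the unit of deployment.
WORKDIR /app
COPY sensor_service.py .

# Run as an unprivileged user for segregation on shared edge hosts.
RUN useradd --no-create-home svc
USER svc

CMD ["python", "sensor_service.py"]
```

Because the image is immutable, the same artifact that passes testing in the core can be pushed to thousands of edge sites without drift.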

Keep an “open” mind. Open APIs open up new opportunities. Developers using open APIs can access real-time data programmatically, making it possible to build applications that can leverage data as it’s “born.” Likewise, a focus on open source development tools and platforms will help to ensure that organizations can continue to leverage the edge as the model evolves. 
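A minimal sketch of consuming data as it is “born” (the event format and field names are hypothetical): many open streaming APIs deliver newline-delimited JSON, which can be processed record by record instead of in batches:

```python
import json
from typing import Iterable, Iterator

def events(lines: Iterable[str]) -> Iterator[dict]:
    """Parse newline-delimited JSON events, skipping blank lines."""
    for line in lines:
        line = line.strip()
        if line:
            yield json.loads(line)

# In practice `lines` would come from an HTTP streaming response;
# here we simulate it with an in-memory sample.
sample = '{"sensor": "t1", "temp_c": 21.5}\n\n{"sensor": "t2", "temp_c": 19.0}\n'
readings = [e["temp_c"] for e in events(sample.splitlines())]
print(readings)  # [21.5, 19.0]
```

Because the generator yields each event as soon as its line arrives, downstream logic can react in near-real-time rather than waiting for a complete payload.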

What’s different is the devices. Developing for the edge is much like developing for any other platform, except that developers must write consistent, secure code that can be deployed unchanged across all device types.
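One common way to keep the same code running across device types (a sketch; the variable name and default are made up for illustration) is to externalize anything device-specific into configuration rather than branching per device in code:

```python
import os

def sampling_interval_s(env=os.environ) -> float:
    """Read a device-specific sampling interval from the environment,
    falling back to a safe default so the same image runs anywhere."""
    return float(env.get("SAMPLE_INTERVAL_S", "5.0"))

# A constrained device edge node might override the default...
print(sampling_interval_s({"SAMPLE_INTERVAL_S": "0.5"}))  # 0.5
# ...while an unconfigured node still behaves sensibly.
print(sampling_interval_s({}))  # 5.0
```

Keeping device differences in configuration rather than code is what lets a single, consistent artifact be copied to every device type.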

Edge computing may not start with developers, but developers will play a huge role in organizations’ ability to leverage the edge. A focus on flexibility, openness and the future will help ensure that developers build applications that fully exploit and expand the edge advantage. 
