By Deepak Syal
Edge computing is a critical component of today’s IT infrastructure network design. However, there are a few fundamental obstacles to overcome before data processing can move to the network’s edge. Thankfully, firms across the world are working to solve these issues and make edge computing more efficient, reliable, and user-friendly. If we look at the kinds of patents firms are filing, we can readily see how they are solving these problems in novel ways. Patents provide a window into the inner workings of a corporation, allowing us to observe what R&D is under way. Below, we identify five main obstacles and one inventive solution for each.
1) Security And Encryption
The internet of things (IoT) is well-known for its security flaws. Because of the enormous volume of data exchanged between the data centre and the edge, each device connected to an edge server is another potentially vulnerable endpoint. The Zhejiang Geely Group, a Chinese multinational corporation, is attempting to make the network more secure with its invention.
A blockchain-based edge computing security encryption technique is the subject of its patent. Using this technology, a vehicle (acting as an edge data centre) authenticates a request through a multi-level, blockchain-based verification system and then triggers the corresponding action in the steering edge node.
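The patent's internals are not public in this article, but the core idea of blockchain-style, multi-level verification can be sketched as a hash chain: each authentication event is recorded in a block linked to its predecessor's hash, so any tampering invalidates every subsequent level of the check. All names and structures below are illustrative, not taken from the patent.

```python
import hashlib

def block_hash(prev_hash: str, payload: str) -> str:
    """Hash a payload together with the previous block's hash."""
    return hashlib.sha256((prev_hash + payload).encode()).hexdigest()

def build_chain(events):
    """Record each authentication event as a block linked to its predecessor."""
    chain = [("genesis", block_hash("", "genesis"))]
    for event in events:
        chain.append((event, block_hash(chain[-1][1], event)))
    return chain

def verify_chain(chain):
    """Multi-level check: every block must hash correctly against its predecessor."""
    prev = ""
    for payload, digest in chain:
        if block_hash(prev, payload) != digest:
            return False
        prev = digest
    return True
```

A forged request fails verification because its recomputed hash no longer matches the recorded chain.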
2) Distributed Computing
Most conventional server deployments consist of scattered modules positioned far apart from one another. Edge computing, by contrast, moves systems closer to where processing happens. This creates a problem: the central business server must now take the edge servers into account during computation.
So, how is the industry dealing with this problem?
Beijing Zhixin Microelectronics Technology, a subsidiary of the State Grid Information and Communication Industry Group, has submitted a patent that proposes breaking down the whole edge server into several routes with route arranging devices at the edge centre. When a connection request is received, a suitable route is sought (by matching source and destination node criteria), and if none is identified, a new route is constructed based on the service and bandwidth needs.
3) Latency
The delay induced by data transmission is known as latency. It matters most in situations where we cannot afford to wait even a moment, such as self-driving automobiles, whose response time is crucial to their success.
Latency can be minimised if computation occurs entirely close to the data or entirely in the data centre. In most cases, however, latency difficulties arise precisely because distributed computing requires processing on both sides.
Samsung has devised a solution to this problem. Its patent proposes estimating the predicted latency of both the edge nodes and the core by evaluating a network architecture consisting of an edge data centre and edge nodes. The difference in latency between the two can then be calculated and the edge transfer procedure improved accordingly, optimising the real-time behaviour of self-driving automobiles.
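A simplified sketch of that decision rule: estimate the latency to the edge node and to the core, take the difference, and steer the transfer to whichever side is meaningfully faster. The function name and threshold below are illustrative assumptions, not details from the patent.

```python
def choose_offload_target(edge_latency_ms: float, core_latency_ms: float,
                          threshold_ms: float = 5.0) -> str:
    """Pick where to run a task based on the estimated latency difference."""
    difference = core_latency_ms - edge_latency_ms
    if difference > threshold_ms:
        return "edge"   # the edge node is meaningfully faster
    return "core"       # otherwise keep the work in the data centre
```

With a 5 ms threshold, a 10 ms edge against a 50 ms core offloads to the edge, while near-equal latencies keep the work in the core.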
4) Operational Constraints
Because edge receivers sit at varying distances from the data centre, troubleshooting and repairing any fault in the framework requires significant logistical and manual effort, which raises maintenance costs.
Intel, on the other hand, has found a way around these costs. Its invention addresses such operational constraints and helps avoid expensive maintenance, logistics, and repair expenses.
5) Data Accumulation
Data is a valuable corporate asset, but gathering large amounts of data at the edge is a liability in and of itself. NTT Communications, a Japanese telecommunications corporation, has filed a patent to solve the data buildup problem.
The patent proposes a data retrieval server with an industrial information table that stores edge IDs. Each edge ID corresponds to an edge server controlled and managed by a company and is linked to an industry ID (identifying the industry) and an ID for the firm or organisation within that industry. In this way, an edge network can quickly partition, store, and access large quantities of data.
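The table-lookup idea can be sketched as a mapping from edge IDs to industry and organisation IDs, queried by either key to retrieve the relevant edge servers. All IDs and field names below are hypothetical examples, not values from the patent.

```python
# Hypothetical industrial information table: each edge ID is linked to the
# industry and the organisation that owns the corresponding edge server.
EDGE_TABLE = {
    "edge-001": {"industry_id": "manufacturing", "org_id": "acme-corp"},
    "edge-002": {"industry_id": "manufacturing", "org_id": "beta-ltd"},
    "edge-003": {"industry_id": "logistics",     "org_id": "acme-corp"},
}

def edges_for(industry_id=None, org_id=None):
    """Retrieve the edge servers belonging to an industry and/or organisation."""
    return sorted(
        edge for edge, info in EDGE_TABLE.items()
        if (industry_id is None or info["industry_id"] == industry_id)
        and (org_id is None or info["org_id"] == org_id)
    )
```

Filtering by either key lets the retrieval server quickly narrow a large pool of edge data to the servers of one industry or one firm.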
Other edge computing difficulties include network bandwidth, the limits of mobile systems, and more. Furthermore, a new architecture is needed to make better use of edge computing capabilities, one that accommodates distributed computing within practical bandwidth and logistics constraints. Accordingly, major corporations such as IBM, Intel, Amazon, Google, Huawei, and others have already created “edge” portfolios. As more internet-connected gadgets enter the market, however, inventive companies are likely only scratching the surface of what edge computing can do.
(The author is Director and Co-founder GreyB. Views expressed are personal and do not reflect the official position or policy of the FinancialExpress.com.)