The IT industry has always had some interesting standoffs, with the pendulum swinging from one side to the other; the latest is cloud versus edge, or distributed versus centralized architecture. Cloud is well defined and is a centralized approach: everything runs and is stored in the cloud, in a small number of locations. These centralized architectures are the easiest to maintain, since everything is managed from a single place and they are "simpler." However, they come with drawbacks that the industry is trying to solve to meet evolving application needs.

This is where the edge comes into play. 

But the simplicity of the centralized architecture is, in turn, its main disadvantage: centralized systems are fragile, since any problem that affects the central location can cause chaos throughout the system. Distributed architectures, by contrast, are more resilient, because the system's information is replicated across many nodes that maintain a consistent, shared view of it.

This same property gives distributed architectures a higher level of security, since a successful malicious attack would have to compromise many nodes at the same time. Because the information is distributed among nodes across the edge (wherever it happens to be), a legitimate change is propagated to the rest of the nodes in the system, which verify and accept the new information; an illegitimate change, on the other hand, is detected by the other nodes and not validated. This consensus between nodes protects the system from deliberate attacks and from accidental corruption of information. The edge brings new challenges, but it also brings benefits from the application performance and security points of view.
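As a rough illustration of the idea (not a production consensus protocol), here is a minimal sketch of majority-vote validation among nodes; the function names and vote representation are hypothetical, and real systems rely on protocols such as Raft or Paxos:

```python
# Hypothetical, simplified sketch of node-level consensus: a proposed change
# is accepted only when a quorum (simple majority) of independent nodes
# verifies it against their own copy of the data.

def reach_consensus(votes: list[bool]) -> bool:
    """Return True when a majority of nodes verified the proposed change."""
    quorum = len(votes) // 2 + 1
    return sum(votes) >= quorum

# Each node independently verifies the proposed change.
legitimate = [True, True, True, True, True]      # all nodes verify the change
tampered   = [True, False, False, False, False]  # only the compromised node accepts

print(reach_consensus(legitimate))  # True  -> change is propagated
print(reach_consensus(tampered))    # False -> change is rejected
```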

Edge is wherever the application needs it to be. It can be on a customer premises, in an access network, in a metro area network, at the provider edge, or even at the cloud edge that hosts modern multi-container applications. The benefit of such an approach is that each container runs where its compute, latency, and memory requirements are best met: at one or more edge locations where the latency and jitter requirements are satisfied, and in the cloud where the compute and memory requirements are satisfied. Because the business-process application orchestrates these microservices into a desired function, it will be distributed over many locations, and the system will be more complex to maintain. Vendors must build tools that simplify these complexities, as this will increase the development velocity of new industrial applications and bring economic benefits to society as a whole. A placement sketch follows below.
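To make the placement idea concrete, here is a minimal, hypothetical sketch; the site names, constraint fields, and first-fit selection rule are illustrative assumptions, not any vendor's orchestration API. For each microservice it picks the first location, ordered from edge to cloud, that meets its latency and memory requirements:

```python
from dataclasses import dataclass

@dataclass
class Site:
    name: str
    latency_ms: float   # round-trip latency to the end device
    memory_gb: int      # memory available for workloads

@dataclass
class Microservice:
    name: str
    max_latency_ms: float
    min_memory_gb: int

# Hypothetical locations along the edge-to-cloud continuum.
sites = [
    Site("customer-premises", latency_ms=2,  memory_gb=8),
    Site("access-network",    latency_ms=5,  memory_gb=16),
    Site("metro-edge",        latency_ms=12, memory_gb=64),
    Site("cloud-region",      latency_ms=40, memory_gb=512),
]

def place(service: Microservice) -> Site | None:
    """Return the first site (edge first, cloud last) that satisfies
    the service's latency and memory requirements."""
    for site in sites:
        if (site.latency_ms <= service.max_latency_ms
                and site.memory_gb >= service.min_memory_gb):
            return site
    return None

# Latency-critical inference lands at the edge; memory-hungry training lands in the cloud.
print(place(Microservice("vision-inference", max_latency_ms=10,  min_memory_gb=4)).name)
print(place(Microservice("model-training",   max_latency_ms=100, min_memory_gb=256)).name)
```

In practice an orchestrator would weigh many more factors (cost, data locality, jitter, availability), but the principle is the same: each container runs where its requirements are best met.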

This versatility means edge computing can be used wherever enterprises are looking to improve productivity and bring automation to their systems. Examples include smart grids that better manage energy consumption in manufacturing and let enterprises analyze and detect changes in production lines before a failure occurs. Another example is healthcare facilities, where technology needs to run faster and more efficiently to provide optimized patient monitoring, streamlined service, and stronger protection of patient data.

Rather than pitting edge against cloud, we should look at how to combine edge and cloud into a high-performing and secure system, exploiting the benefits of each while minimizing the disadvantages of each.