How To Build The Optimal Edge Computing Infrastructure For Your Business


New research defines standard models for deploying edge infrastructure

Physical infrastructure is key to any edge computing strategy. The power, cooling, and enclosure equipment, together with the compute it supports, provide the foundation on which applications run and enable countless edge use cases.

Making the right physical infrastructure choice is important at the edge, given that many deployments are in locations where additional support and protection are required. Navigating edge infrastructure is further complicated by the broad and varied definitions of the edge itself.

These factors make it challenging for the 49 per cent of enterprises exploring edge computing deployments, who must decide how best to use existing infrastructure and where to make new investments.

Vertiv, a global provider of critical digital infrastructure and continuity solutions, recently released the results of an in-depth research project to identify edge infrastructure models, helping organisations move toward a more standardised approach to edge computing deployments with the aim of reducing costs and deployment times.

The report, Edge Archetypes 2.0: Deployment-Ready Edge Infrastructure Models, builds on the edge archetypes research and taxonomy it introduced to the industry in 2018. The new research further categorises edge sites based on factors including location and external environment, number of racks, power requirements and availability, site tenancy, passive infrastructure, edge infrastructure provider, and number of sites to be deployed. Based on these factors, the research identifies four edge infrastructure models:

Device Edge: The compute is at the end-device itself, either built into the device or in a standalone form that is directly attached to the device, such as AR/VR devices or smart traffic lights.

Micro Edge: A small, standalone solution that can range in size from one or two servers up to four racks. It can be deployed at the enterprise’s own site or at a telco site; common use cases include real-time inventory management and network closets in educational facilities.

Distributed Edge Data Centre: This could be within an on-premises data centre (either a pre-existing enterprise data centre or network room, or a new standalone facility). It could also be a small, distributed data centre or colocation facility located on the telco network or at a regional site. Distributed Edge Data Centres are currently common in manufacturing, telecommunications, healthcare, and smart city applications.

Regional Edge Data Centre: A data centre facility located outside core data centre hubs. As it is typically purpose-built to host compute infrastructure, it shares many features of hyperscale data centres: a conditioned and controlled environment, high security, and high reliability. This model is common for retail applications and serves as an intermediary data processing site.

The introduction of edge archetypes three years ago advanced the understanding of the edge. It was the first formal attempt – using information gathered across the industry – to group edge applications in a way that would help organisations avoid reinventing the wheel with every edge deployment.

Since then, other organisations and industry bodies have been working in parallel – and often with Vertiv as a collaborator – to create standard processes and technologies to advance the understanding and effectiveness of the edge. These latest edge infrastructure models represent the logical next step.

“As the edge matures and edge sites proliferate and become more sophisticated, creating edge infrastructure models is a necessary step toward standardised equipment and design that can increase efficiency and reduce costs and deployment timelines,” said Martin Olsen, global vice president, edge strategy and transformation for Vertiv.

“Edge sites will continue to require some customisation to meet users’ specific needs, but these models streamline many fundamental choices and introduce some much-needed repeatability into edge environments. This research is especially useful for specifiers, such as channel partners and IT management professionals,” Olsen added.

The research clarifies that edge sites will require refinements based on factors that may include environment, use case, legacy equipment, security and maintenance, enterprise data centre operations, and communications capabilities. These adjustments are possible within the framework of the edge infrastructure models, however, and do not diminish the benefits of standardisation the models provide.

“By adopting the four infrastructure models, edge players across the ecosystem can derive an array of benefits, including accelerating go-to-market and expediting the deployment of sites,” said Dalia Adib, director, consulting and edge computing practice lead, STL Partners. “The edge market is experiencing growth, and this can only be bolstered by introducing some level of standardisation to the language we use for describing the edge.”
