Hyperconverged infrastructure increasingly lives life on the edge
02 November 2021 | Curated Article shared by Nilesh Roy
:: Introduction ::
'Edge' IT infrastructure continues to build mindshare and deployment consideration as organizations seek to expand knowledge and business differentiation beyond the walls of core datacenters and major public cloud regions. IT leaders face a wealth of edge options - each with its own set of advantages and challenges - as they determine how best to outfit remote locations with IT resources. Increasingly, one of these options is hyperconverged infrastructure (HCI), which brings inherent benefits to the edge ecosystem such as deployment ease and a common management platform across clouds. However, research shows that HCI's consistent, predictable experience is driving edge deployments for additional reasons, including security.
:: My take ::
// HCI platforms have enjoyed steady growth in remote and edge locations in recent years, with the COVID-19 pandemic delivering a sizeable boost to those deployments over the last 18 months. Although this tracks closely with an overall shift of IT resources beyond core enterprise datacenters and centralized public clouds and toward the edge, HCI's role in this migration is notable because it signifies a commitment to use a core, on-premises datacenter technology for a highly strategic deployment location - rather than leaning solely on public cloud. However, our data shows this is not merely about extending existing HCI infrastructure to the edge or dropping HCI 'silos' into edge locations. On the contrary, HCI's steady innovation in hybrid IT, together with its consistent, predictable operations, has ultimately ushered the technology into a prime spot within the edge landscape. Core datacenters will continue to be HCI's primary deployment location for the foreseeable future, but the technology's expansion at the edge, with optimization for analytics and other edge-centric workloads, will solidify its role as a go-to edge deployment option. //
:: Taking the edge off remote deployments ::
Despite its natural benefits for edge environments, HCI has thus far thrived primarily in core datacenters, where IT teams turn to the platforms to accelerate infrastructure modernization. However, a gradual shift of deployments is underway to locations outside of core datacenters, including regional and colocation datacenters, remote office/branch office (ROBO) sites and edge environments. According to a recent research report, 67% of HCI customers currently deploy HCI in core datacenters, down from 73% in 2020. Meanwhile, 33% of HCI customers deploy in ROBO sites (up from just 19% in 2020) and 15% deploy at the edge (up from 9% in 2020).
Although we have witnessed a consistent increase in HCI deployments outside of core datacenters in recent years, the COVID-19 pandemic has accelerated this trend. According to a recent study, 55% of HCI customers said they were spending more on HCI as a result of COVID-19, significantly more than the percentage of customers spending more on servers (35%) and stand-alone storage (37%). For many organizations, COVID-19 accelerated existing modernization investment strategies, and a portion of this increased HCI spending was dedicated to replacing legacy infrastructure in core datacenters. But for other organizations faced with providing IT resources to a newly remote workforce, HCI provided (and continues to provide) a viable option to outfit more remote and edge locations with quickly deployable, easily managed infrastructure that is closer to the employees.
An indication that expansion beyond core datacenters will continue is shown in another study, in which 75% of organizations expect a significantly increased reliance on remote work moving ahead, compared with pre-2020 levels. In theory, HCI deployed in core datacenters should provide resources to remote workers as effectively as HCI deployed in remote datacenters or ROBO environments. However, HCI customers indicate that networking and ease of scaling (along with rich data services) require the most improvement in their current HCI platforms - and both challenges are significantly more prevalent in larger organizations. On the other hand, smaller greenfield HCI deployments at remote locations can be easier to configure and manage because they do not sit directly within the core datacenter's networking ecosystem. Workloads deployed on remote HCI installations can be limited to those required for those locations or for a select group of remote employees who need to access those resources (rather than joining a larger, mixed-workload environment in the core datacenter). Further, steady development of hybrid capabilities in recent years has eased the process of deploying HCI nodes and clusters on public cloud platforms, which provides a dynamic extension to remote or edge HCI resources that might be limited due to space and/or staffing constraints.
Security also plays a major role in decisions to deploy HCI in ROBO and edge environments. Edge environments are increasingly attractive targets for bad actors seeking to exploit weaknesses inherent in locations that lack enterprise-grade infrastructure security. Air-gapped infrastructure can prevent network-based intrusions at the edge, but those deployments create major challenges and can minimize the potential benefits of deploying at the edge in the first place. For example, IoT installations in smart cities, retail establishments, oil & gas sites and other locations succeed only if the data generated in those locations can be regularly accessed and analyzed - often in real time. Some analysis may occur on-site, but deeper analysis typically occurs in the core datacenter or on public cloud infrastructure (either of which requires network connectivity, unless the organization physically transports the data). Further, network access is a necessity if infrastructure is deployed in remote locations to reduce application latency for employees in those regions.
Organizations that extend their core IT infrastructure to edge locations face the same challenges that are characteristic of legacy three-tier infrastructure, including complexity and management difficulty. Meanwhile, HCI is comparatively easier to deploy and manage, while offering automated resource provisioning that decreases or even eliminates the need for on-premises IT staff. While ease of deployment, simplified management and automated provisioning are all critical to ensuring a secure application deployment platform, perhaps most important is HCI's consistency - its typically stable, predictable experience helps to limit infrastructure-related surprises that can introduce security issues. Further, HCI provides a common management platform across core, edge and public cloud environments, which also helps to reduce complexity and ease the overall security process. This common experience also eases disaster recovery, which is especially crucial in edge environments that can be more prone to outages than core datacenters.
:: Navigating the edge infrastructure journey ::
While HCI's myriad benefits align well with the unique requirements of edge and other remote locations, there are challenges. For example, while the perceived relative ease of securing the environment might spur organizations to deploy HCI at the edge, the execution is non-trivial. According to our 2021 study, 58% of organizations that deploy HCI at the edge (compared with just 35% that deploy in core datacenters) have encountered security integration challenges when migrating or deploying applications on the platforms. Similarly, edge HCI deployers are also far more likely than core datacenter deployers to encounter issues with migration of non-virtualized workloads onto HCI platforms.
However, as edge infrastructure demand escalates, I expect that edge-specific HCI iterations will proliferate. To date, vendors have released edge-specific enhancements to their existing core platforms, including multi-cluster management, smaller profiles, decreased minimum cluster size requirements, ruggedized designs and cloud integration. Moving ahead, and to coincide with the rising interest I have observed among existing HCI customers for workload-specific HCI platforms, I expect to see edge-specific releases designed to optimize edge activities, such as first-pass data analytics; for example, this might include HCI edge clusters optimized (or certified, depending on the software) for Hadoop, Spark or SAP HANA. Further, HCI's strong support for Kubernetes and cloud-native initiatives (including application development, which is currently one of the top deployed workloads on HCI) will allow organizations to use distributed application architectures.
As this data collection and pre-processing ramps up in edge locations, organizations will be able to reduce the data load in their core datacenters and generate just-in-time analytics. The rise of edge analytics could ultimately strain the storage capacity of HCI deployments, but continued innovation in disaggregated technologies will enable organizations to more easily offload data to stand-alone storage arrays in core datacenters for more intensive processing (and longer-term storage), which complements the ability to deploy clusters on public cloud services. The potential value behind such implementations is immense, particularly for organizations struggling to extend monolithic datacenter infrastructure to edge locations.
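To make the first-pass edge analytics idea concrete, here is a minimal, hypothetical sketch of the pattern described above: raw telemetry is aggregated on the edge cluster so that only a compact summary (rather than every raw reading) is shipped back to the core datacenter. The sensor names, fields and threshold are invented for illustration; they are not from the article or any specific product.

```python
from statistics import mean

# Hypothetical raw sensor readings collected at an edge site.
# Field names and the alert threshold are invented for this sketch.
raw_readings = [
    {"sensor": "pump-1", "temp_c": 71.2},
    {"sensor": "pump-1", "temp_c": 69.8},
    {"sensor": "pump-2", "temp_c": 88.5},
    {"sensor": "pump-2", "temp_c": 90.1},
]

def first_pass_summary(readings, alert_threshold_c=85.0):
    """Aggregate per-sensor statistics at the edge so only a compact
    summary, not every raw reading, is sent to the core datacenter."""
    by_sensor = {}
    for r in readings:
        by_sensor.setdefault(r["sensor"], []).append(r["temp_c"])
    summary = []
    for sensor, temps in sorted(by_sensor.items()):
        summary.append({
            "sensor": sensor,
            "samples": len(temps),
            "avg_temp_c": round(mean(temps), 1),
            "alert": max(temps) > alert_threshold_c,
        })
    return summary

# The edge cluster would transmit this small payload upstream,
# while raw readings stay local (or are offloaded later in bulk).
payload = first_pass_summary(raw_readings)
print(payload)
```

The payload here is two short records instead of four raw samples; at realistic ingest rates the same pattern cuts the data volume shipped to the core by orders of magnitude, which is the data-load reduction the paragraph above describes.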
#NileshRoy #02November2021 #HyperConvergedInfrastructure #HCI #Storage #innovation #computing #compute #consistency #edge #edgecomputing #datacenter #monolithic #Hadoop #Spark #SAPHANA #DisaggregatedTechnologies