HPE, Pure Storage unveil enterprise storage products

News
Mar 14, 2024 | 4 mins
Data Center | Enterprise Storage | Generative AI

New storage arrays and services are targeted at high-performance workloads, which means AI.


Hewlett Packard Enterprise and Pure Storage each introduced new storage-related products and services targeted at high-performance workloads – which means AI.

HPE expanded its HPE GreenLake for File Storage offering with the addition of a new high-density, all-flash storage array, which the company said is aimed at large-scale enterprise AI and data lake workloads.

The new HPE product is an upgrade to the HPE Alletra Storage MP array introduced last year, with four times the capacity and twice the system performance per rack unit of the previous generation, according to HPE. The vendor also says the upgrade offers up to 2.3 times the performance density and up to 2.3 times the capacity of competitive offerings.

True to its name, the updated HPE GreenLake for File Storage line is aimed strictly at file-based data. HPE has a separate product line for scale-out block storage, HPE GreenLake for Block Storage, which it released last month.

HPE GreenLake for File Storage is built to allow enterprises to scale performance and capacity independently. “Performance at scale limitations can be an issue for AI storage, particularly legacy scale-out NAS solutions,” said David Yu of HPE storage product marketing, in a blog post about the upgraded platform. “These solutions can scale capacity, but they can’t scale performance linearly to match that capacity.”

It’s designed to deliver sustained performance “that spans all the stages of AI – from data aggregation, data preparation, and training and tuning to inferencing,” Yu wrote. It achieves this by using a disaggregated, shared-everything, modular architecture that enables independent scaling of performance and capacity.
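To make that distinction concrete, here is a minimal Python sketch of the scaling model (purely conceptual, not HPE's code; the node counts and per-unit figures are invented for illustration):

```python
# Conceptual sketch, not HPE code: in a disaggregated, shared-everything
# design, the performance tier (controllers) and the capacity tier
# (flash enclosures) scale as independent pools, unlike traditional
# scale-out NAS where both grow in lockstep. All figures are invented.
from dataclasses import dataclass

@dataclass
class DisaggregatedCluster:
    perf_nodes: int = 2            # stateless nodes serving file I/O
    capacity_shelves: int = 2      # all-flash enclosures holding data
    gbps_per_node: float = 10.0    # assumed throughput per node, GB/s
    tb_per_shelf: float = 500.0    # assumed raw capacity per shelf, TB

    @property
    def throughput_gbps(self) -> float:
        return self.perf_nodes * self.gbps_per_node

    @property
    def capacity_tb(self) -> float:
        return self.capacity_shelves * self.tb_per_shelf

cluster = DisaggregatedCluster()
cluster.capacity_shelves += 4      # grow capacity for a data lake...
# ...without adding performance nodes; throughput is unchanged.
print(cluster.throughput_gbps, cluster.capacity_tb)   # 20.0 3000.0
```

In a lockstep scale-out design, adding those four shelves would have meant buying four more controllers as well; here the two pools move independently.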

Pure Storage boosts self-service management options

Pure Storage is unveiling new self-service capabilities across its Pure1 storage management platform and Evergreen portfolio, as well as changes to its partner program.

For starters, Pure has updated the upgrade technology for ActiveCluster deployments of its Purity operating system: self-service, autonomous upgrades to the operating environment keep the paired arrays in sync on the latest OS release.

Pure has also updated and streamlined its ransomware anomaly-detection capabilities. In the event of an anomaly, Pure1 now recommends snapshots from which customers can restore their affected data, either locally or remotely, instead of having to review the snapshot catalog manually.
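The selection logic can be pictured as picking the newest snapshot that predates the detected anomaly. Here is a short, purely illustrative Python sketch; the catalog, names, and timestamps are hypothetical, and this is not Pure1's API:

```python
# Illustrative sketch only -- not Pure1's actual API or logic. Given a
# snapshot catalog and the timestamp of a detected anomaly, recommend
# the most recent snapshot taken before the anomaly as a restore point.
from datetime import datetime

snapshots = [  # hypothetical catalog: (snapshot name, creation time)
    ("vol1.snap-0400", datetime(2024, 3, 13, 4, 0)),
    ("vol1.snap-1200", datetime(2024, 3, 13, 12, 0)),
    ("vol1.snap-2000", datetime(2024, 3, 13, 20, 0)),
]

anomaly_detected_at = datetime(2024, 3, 13, 14, 30)

def recommend_restore_point(catalog, anomaly_time):
    clean = [(name, ts) for name, ts in catalog if ts < anomaly_time]
    if not clean:
        raise LookupError("no snapshot predates the anomaly")
    return max(clean, key=lambda s: s[1])  # latest clean snapshot

print(recommend_restore_point(snapshots, anomaly_detected_at))
# ('vol1.snap-1200', datetime.datetime(2024, 3, 13, 12, 0))
```

The point of automating this step is simply to spare administrators from paging through a large snapshot catalog by hand during an incident.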

Pure recently launched a disaster recovery as a service (DRaaS) offering and is already releasing an update, DRaaS 1.1, which lets customers deploy disaster recovery in their data center in 15 minutes or less. Organizations running virtual machines in a VMware environment can now use Pure1 to deploy a self-service disaster recovery environment in the AWS cloud.

Pure Storage also launched a new Auto-Enroll feature that lets organizations automatically enroll the VMs in their data center for added protection.
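In principle, auto-enrollment amounts to scanning the VM inventory and adding anything that matches a protection policy. A hedged Python sketch, with invented names and tag-based logic rather than Pure's actual mechanism:

```python
# Hypothetical illustration of auto-enrollment -- not Pure's code.
# Any VM carrying the "protect" tag in the inventory is added to the
# DR plan automatically, so new workloads are covered without manual
# enrollment steps.
inventory = [  # invented VM inventory: (vm name, tags)
    ("web-01", {"protect"}),
    ("build-runner", set()),
    ("db-primary", {"protect", "pci"}),
]

dr_plan: list[str] = []

def auto_enroll(vms, plan):
    for name, tags in vms:
        if "protect" in tags and name not in plan:
            plan.append(name)

auto_enroll(inventory, dr_plan)
print(dr_plan)  # ['web-01', 'db-primary']
```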

In addition to the new software and services, Pure has updated its partner program in a number of ways. The updates include: a new pricing model framework with programmatic discounting based on partner type, tier, and deal registrations; automation tools that allow partners to create independent configurations and quotes; simplified auditing and reporting capabilities; and a new partner intelligence dashboard that provides visibility into current and pipeline engagements, as well as guided proactive recommendations.

Storage is a vital, if overlooked, element in AI, especially generative AI. Large language models are built on data, and unstructured data at that. Companies are pulling terabytes of data from data lakes, all of which must be processed before training and inferencing can begin. So, while GPUs get all the attention in the AI arms race, storage is just as important.
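Some back-of-the-envelope arithmetic shows why. Assuming a 10 TB unstructured corpus (an invented figure for illustration), the time for one full pass over the data falls straight out of sustained storage throughput:

```python
# Back-of-the-envelope arithmetic: time to read a corpus once at
# different sustained storage throughputs. The 10 TB corpus size is
# an assumption for illustration, not a figure from the article.
corpus_tb = 10
for gbps in (1, 5, 20):                      # sustained reads, GB/s
    hours = corpus_tb * 1000 / gbps / 3600   # 1 TB = 1000 GB
    print(f"{gbps:>2} GB/s -> {hours:.1f} hours per full pass")
```

At 1 GB/s a single pass takes nearly three hours; at 20 GB/s it takes minutes. Closing that gap, without idling expensive GPUs while they wait for data, is exactly what performance-dense arrays are pitched at.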

Andy Patrizio is a freelance journalist based in Southern California who has covered the computer industry for 20 years and has built every x86 PC he’s ever owned, laptops not included.

The opinions expressed in this blog are those of the author and do not necessarily represent those of ITworld, Network World, its parent, subsidiary or affiliated companies.