Storage Architectures for the Hyperscale Data Era

Presented by Dan Duperron, Senior Technical Architect, Quantum

About this talk

High-performance computing generates huge volumes of data, mostly unstructured files and objects that must be processed quickly, then preserved and protected for years, even decades. Applications such as AI model training, machine learning, deep learning, and other forms of data analytics are accelerating data growth; many organizations now manage billions of files and exabytes of capacity. The challenge is compounded because data must be moved and managed across its lifecycle, often spanning different storage and cloud platforms at different stages of the data pipeline.

Quantum is the #1 provider of cold storage infrastructure to the world’s largest hyperscalers, and we are helping HPC organizations build and manage private clouds using a unique architecture designed to reduce costs, improve cost predictability, strengthen cybersecurity, and reduce emissions. Watch this short presentation to learn how hyperscale and cloud-native storage technologies can help you accelerate your data pipeline, simplify data management, and cost-effectively access inactive data for decades.
About Quantum

Quantum delivers end-to-end data management solutions designed for the AI era. From high-performance ingest that powers AI applications and demanding data-intensive workloads, to massive, durable data lakes that fuel AI models, Quantum delivers the most comprehensive and cost-efficient solutions. Leading organizations in life sciences, government, media and entertainment, research, and industrial technology trust Quantum with their most valuable asset – their data. Quantum is listed on Nasdaq (QMCO). For more information, visit www.quantum.com.