Storage refreshes are something every data center goes through, but NAS refreshes are the most painful. It is not just the migration of data from the old NAS to the new NAS but also the tedious process of making sure security and access controls are set correctly. And each refresh gets more difficult, because unstructured data keeps growing. It is time to get off the NAS treadmill and consider the cloud for NAS services.
Cloud-based NAS services are now a reality thanks to new hybrid cloud models. Users benefit from local flash performance, while organizations benefit from cost-effective cloud storage and break the endless storage refresh cycle.
Recorded Mar 29 2018, 61 mins
Organizations, application owners and users all have much higher expectations of IT than ever before. They expect IT to recover real-time data instantaneously and to recall aged data very quickly. These expectations stretch backup architectures at both ends: rapid primary storage recovery and fast access to data residing on long-term storage repositories.
Join Storage Switzerland, Veeam and NetApp for our webinar as we discuss the new service level objectives IT needs to prepare for and how the backup software and hardware architecture needs to change to meet these new demands.
Join our panel of data protection experts as we discuss:
- The new service level objectives
- Why current data protection infrastructures fall short
- The three steps you need to take to modernize your data protection architecture
IT professionals are hearing a lot about composable infrastructures, and many vendors claim to offer a solution, but only a few have truly composable solutions ready for deployment today. The goal of composability is to bring the cloud-like infrastructure experience to the on-premises data center so that mission-critical workloads can benefit. Join Storage Switzerland and Kaminario for our next 15-minute webinar. During this webinar, listen in as our panel of experts discusses what composable infrastructures are, how they have evolved and, most importantly, what to look for in a composable solution.
Register to receive an exclusive copy of Storage Switzerland’s latest eBook “Developing a Cloud-Like Experience for Mission Critical Data.”
Composing Infrastructure for Elastic, Hadoop, Kafka and Cassandra to Drive Down Cloud Data Center Costs
Hyperscale applications like Elastic, Hadoop, Kafka and Cassandra typically use a shared-nothing design where each node in the compute cluster operates on its own data. To maximize storage I/O performance, hyperscale architectures keep data local to the compute node processing the job. The problem is that the organization loses the efficiency of a shared storage infrastructure. As the hyperscale architecture scales, overprovisioned and underutilized compute, GPU and storage resources cost the organization money.
Join Storage Switzerland and DriveScale for our 15-minute webinar to learn how composable infrastructure can provide both high performance and high efficiency.
In 15 minutes learn:
- The challenges facing hyperscale architectures
- The true cost of underutilized compute and storage
- Why fast Ethernet networks are good for your data
- How a composable architecture brings scale, performance and cost efficiency to your data center
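The cost of underutilized resources called out above can be made concrete with a back-of-the-envelope calculation. The node count and utilization figure below are illustrative assumptions, not measured data:

```python
def stranded_capacity_tb(nodes, tb_per_node, avg_utilization):
    """Capacity paid for but sitting idle when storage is locked
    inside each shared-nothing node instead of pooled."""
    total = nodes * tb_per_node
    return total * (1 - avg_utilization)

# A hypothetical 100-node cluster with 20 TB per node, each node
# averaging 45% full: more than half the purchased capacity is idle.
idle = stranded_capacity_tb(nodes=100, tb_per_node=20, avg_utilization=0.45)
print(f"{idle:.0f} TB stranded")  # → 1100 TB stranded
```

Composable infrastructure attacks exactly this gap by letting nodes draw from a shared pool over a fast Ethernet fabric instead of overprovisioning each server.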
Most All-Flash Arrays were bought in the last few years and have not come anywhere close to end of life, yet most vendors are now shipping NVMe All-Flash Arrays, which offer better performance. As enticing as these new systems might be, IT planners need to proceed with caution to make sure that the organization has workloads that can justify the move to NVMe, as well as ensuring that the rest of the storage and network infrastructure is up to the task.
In this 15-minute webinar, Storage Switzerland and Virtual Instruments take an objective look at NVMe and help you map out a strategy for the transition.
Many organizations need to continue to run virtualized applications on-premises but are also looking to replace and consolidate aging server infrastructure. At the same time, these organizations want to leverage cloud services for data protection, high availability and disaster recovery, as well as a centralized hub to track and manage what is happening across multiple data centers. The challenge is that most infrastructure solutions don't offer an efficient approach to demanding scale-up and scale-out applications while also effectively integrating cloud services for on-premises and cloud efficiency. Instead of creating a simple-to-manage hybrid cloud, the organization is forced to make trade-offs.
During the webinar, “Simplifying the Large-Scale Hybrid Cloud”, Storage Switzerland and Axellio discuss how Microsoft Azure Stack HCI and Axellio’s FabricXpress Servers can deliver new levels of consolidation in the enterprise. Learn how to intelligently leverage Azure to simplify operations like data protection, business continuity, and data center operations – while deploying less infrastructure and less software for your demanding on-premises workloads.
- How to Design Hybrid Cloud for Consolidation of Enterprise Workloads & Infrastructure
- What Cloud Services Does On-premises HCI Need?
- What is Microsoft Azure Stack HCI and how does Axellio leverage this for greater simplicity?
For most organizations, unstructured data is their biggest problem. The need to store all this data is forcing IT to expand primary storage capacities at an alarming rate. Unstructured data not only threatens to break IT budgets but also forces organizations to build new data centers. Organizations are looking to hybrid cloud storage for relief, but the cloud has several potential roadblocks that prevent many organizations from leveraging it.
Join Storage Switzerland and infiniteIO for a 15-minute Friday Webinar on May 10 at 1:00 pm ET / 10:00 am PT. In 15 minutes learn:
– The Goals of a Hybrid Cloud Storage Project
– What a Hybrid Cloud Storage Architecture Looks Like
– The Roadblocks to Hybrid Cloud Storage Success
– How to Overcome those Roadblocks
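A hybrid cloud storage architecture of the kind described above typically tiers by access age: cold files migrate to cloud object storage while active files stay on the primary NAS. A minimal sketch of that placement decision, assuming a simple last-access-age policy (the function name and threshold are illustrative, not any vendor's API):

```python
import time

SECONDS_PER_DAY = 86400

def select_cold_files(files, cold_after_days, now=None):
    """Return the paths whose last access is older than the threshold.

    `files` is an iterable of (path, last_access_epoch) pairs; files
    selected here would be candidates for migration to cloud object
    storage, leaving only metadata behind on the primary NAS.
    """
    now = time.time() if now is None else now
    cutoff = now - cold_after_days * SECONDS_PER_DAY
    return [path for path, atime in files if atime < cutoff]

# Example: with a 90-day policy, only the long-untouched file is tiered.
now = 1_700_000_000
files = [
    ("/projects/active/model.bin", now - 2 * SECONDS_PER_DAY),
    ("/projects/archive/2015-results.csv", now - 400 * SECONDS_PER_DAY),
]
print(select_cold_files(files, cold_after_days=90, now=now))
# → ['/projects/archive/2015-results.csv']
```

The hard parts in practice, and the roadblocks the webinar covers, are around everything this sketch omits: keeping metadata access fast, recalling tiered files transparently and avoiding cloud egress surprises.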
All-flash arrays are no longer one size fits all. There is a clear line of demarcation between standard all-flash and extreme flash, but most suppliers try to squeeze all of an organization's problems into a single system architecture because they typically only sell one type of architecture. Traditional all-flash vendors are adding NVMe drives internally to their systems, but they are not updating their system internals to keep up with the technology. Other vendors are trying to create an end-to-end NVMe experience without the full set of enterprise services, or trying to layer on third-party storage software that reduces performance and forces IT to rethink its network architecture because of required forklift upgrades or replacements.
The truth is that most organizations need both high performance and extreme performance. Workloads like Oracle, MS-SQL and analytics all benefit from extreme performance. Tier 2 applications and other types of unstructured data may have their performance demands met by a high-performance flash array. In both cases, the hardware design and the quality of the software are critical determining factors in how well the solutions will meet the organization's demands.
Join Storage Switzerland and Violin Systems for our live webinar “Deciding Between High Performance and EXTREME Performance” as we discuss differences between the two performance levels and what to look for to make sure the system selected lives up to expectations.
HubStor’s CEO, Geoff Bourgeois, will be joining Storage Switzerland Senior Analyst Krista Macomber live at 1 p.m. EST on Monday May 6 to talk about how his company is approaching this problem, positioning continuous data protection and integrated backup and archive as the foundation of a modern data management strategy. Don’t miss this chance to hear Mr. Bourgeois’ take on the state of data protection and management, where the market is heading, and how HubStor uniquely fits in.
Splunk is one of the most broadly adopted analytics applications on the market today. The primary reason for its popularity is that it analyzes data organizations already have, like logs and metadata generated from IT operations, enterprise security and various lines of business. Splunk's popularity, though, has led organizations to want to keep more and more of this data so they can improve their decision-making capabilities even further. The desire to keep more data creates complexity and cost in the underlying infrastructure as IT tries to scale the Splunk environment to meet ever-growing demand.
Join Storage Switzerland and SwiftStack, a Splunk technology partner, for our webinar where our panel of experts will discuss the value of having Splunk analyze larger datasets while providing insight into overcoming infrastructure cost and complexity challenges through Splunk enhancements like SmartStore.
Key Webinar Takeaways:
- The Value of Increasing Splunk Dataset Size
- The Challenges Created by Large Splunk Datasets
- How Splunk SmartStore Enables Splunk Scalability
- The Requirements of a SmartStore-enabled Storage Infrastructure
- How Splunk and SwiftStack Work Together to Make Large Datasets more Manageable
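For readers unfamiliar with how SmartStore decouples compute from capacity: warm index buckets move to an S3-compatible object store, while indexers keep only a local cache of hot and recently accessed data. A minimal indexes.conf sketch, per Splunk's documented SmartStore settings (the bucket name and endpoint below are placeholders):

```ini
[volume:remote_store]
storageType = remote
path = s3://example-smartstore-bucket
remote.s3.endpoint = https://s3.us-east-1.amazonaws.com

# Point indexes at the remote volume; hot buckets stay local until
# they roll to warm, then upload to the object store.
[default]
remotePath = volume:remote_store/$_index_name
```

This is why a SmartStore-enabled object store (such as SwiftStack) becomes the system of record for the bulk of the dataset, making its throughput and durability characteristics central to Splunk scalability.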
As organizations and their employees become more distributed, a case can be made that endpoints (laptops, tablets and smartphones) are some of the most critical pieces of technology in the organization. The problem is most organizations haven’t revisited their endpoint data strategy in years. In fact, some still don’t have a formal strategy in place.
In this webinar Storage Switzerland and Carbonite discuss the increased importance of endpoints and why organizations need to upgrade their strategy to make sure these devices are protected, secure and in compliance. In light of increasing legal requirements and potential penalties, organizations can no longer afford to ignore this critical issue.
In this live webinar you'll learn why:
• Endpoint backup needs to be upgraded to keep pace with new threats
• Laptop backup by itself is not enough
• Security and loss prevention are as important as protection
• The endpoint strategy must help with compliance
Modern applications process large segments of data, often in parallel. These applications need to match patterns across a multitude of objects and vectors or regularly update multiple indices as they change over time. Storage I/O and network latency quickly become a hindrance that prevents these applications from reaching their full potential.
Most IT planners assume that the only way to address storage performance problems is to buy faster storage systems and eliminate network latency. These approaches only partially address the issues while being less efficient and more expensive. Join Storage Switzerland and NGD Systems for a 15-minute live webinar to discuss an alternative approach leveraging computational storage.
Watch to learn:
• Modern Application IO Profiles
• The problems with typical IO workarounds
• How computational storage addresses these issues
• The use cases for computational storage
It is 2019, why are we still backing up production storage? The simple answer is that most production storage systems can provide some level of data protection but they can’t fully replace backup. It's time to consider what backup independence means. The technology is there for production storage to replace backup but most organizations haven’t fully leveraged it. Join Storage Switzerland and ClearSky Data for our live webinar to learn how to design a self-protecting production storage infrastructure that not only protects data but replaces backup.
Join us to learn:
• The difference between data protection and backup
• Why most production systems can't replace backup
• How to architect a production system that is fully protected and achieves backup independence
Artificial Intelligence, Machine Learning and High Velocity Analytic workloads are going mainstream. Enterprises of all types and sizes want to seize the opportunity their data presents. As these workloads move from development to production, organizations face a significant challenge with the supporting storage architecture. At the heart of the problem is the file system the organization will use to store the information. It needs to be fast, scalable, durable and cloud-ready.
In this webinar Storage Switzerland and WekaIO will review the various file system options available to organizations looking to create storage architectures for AI, ML and analytics.
In this webinar you’ll learn which file system is best for your modern workload:
• Traditional NAS (including scale-out NAS)
• Legacy Parallel File Systems
• Modern Parallel File Systems
Most storage consolidation strategies fail because they attempt to consolidate to a single piece of storage hardware. To successfully consolidate storage, IT professionals need to look at consolidation strategies that worked. Server consolidation was VMware’s first use case. It was successful because instead of consolidating hardware, VMware consolidated the environment under a single hypervisor (ESXi) and console (vCenter) but still provided organizations with hardware flexibility. A successful storage consolidation strategy needs to follow a similar formula by providing a single software solution that controls a variety of storage hardware, but that software also has to extract maximum performance and value from each hardware platform on which it sits.
Join Storage Switzerland and StorOne for our webinar in which we discuss how to design a storage consolidation strategy for today, the future and the cloud.
In this webinar learn:
- The problems with a fragmented approach to storage
- Why storage fragmentation promises to get worse because of AI, ML, and the Cloud
- Why consolidating to a single storage system won’t work
- Why hyperconverged architectures fall short
- Why Software Defined Storage falls short
- Why the organization needs a Storage Hypervisor
For over a decade, Network Attached Storage (NAS) was the go-to file storage device for organizations needing to store large amounts of unstructured data. But unstructured data is changing. While large-file use cases are still prevalent, small-file use cases are becoming more dominant. Workloads like artificial intelligence, analytics and IoT are typically driven by millions, if not billions, of small files.
Object Storage is often hailed as the heir apparent but is it? Can file systems be redesigned to continue to support traditional NAS workloads while also supporting modern, small file and high velocity workloads? Join Storage Switzerland and Qumulo for our live webinar, “NAS vs. Object — Can NAS Make a Comeback,” to learn the state of unstructured data storage and if NAS file systems can provide a superior alternative to object storage.
Join us for our live event to learn:
• Why traditional NAS solutions fall short
• Why object storage systems haven't replaced NAS
• How to bridge the gap by modernizing NAS file systems
• Live Q&A with file system and NAS experts
Organizations are moving to the cloud, but according to a recent Osterman Research study, only 14% of companies have completed that transformation. The study clearly identifies data storage as an area where IT can easily accelerate its cloud transformation journey. Potentially more than any other component, intelligently moving data to the cloud has the opportunity to significantly lower on-premises storage costs without the threat of impacting day-to-day operations.
Join Storage Switzerland, HubStor and Osterman Research for our live webinar where we’ll discuss the results of the Osterman Research study, what it means for IT, and how IT can take advantage of that research to leverage the cloud to alleviate data management and data protection concerns.
All pre-registrants will receive our exclusive eBook, “Understanding the Difference Between Data Protection and Data Management.” Sign up now and get your copy today.
The cloud seems like a logical destination for backup data. It is by definition off-site, and the organization no longer needs to worry about allocating valuable floor space to secondary data storage. The problem is that most cloud backup solutions fall short of delivering enterprise class data protection.
Most cloud backup solutions are too complicated to set up and upgrade, don't provide complete platform support, don't provide flexible recovery options, can't meet enterprise RTO/RPO requirements and don't provide a class of support that enables organizations to lower their operational expenses.
In this live webinar, Storage Switzerland and Carbonite discuss the five critical capabilities that enterprises looking to move to cloud backup need to make sure their solution has.
Join us for this live event to learn:
1. Why switch to the cloud for enterprise backup?
2. The five critical capabilities enterprises MUST have in cloud backup solutions: what they are and why enterprises need them
3. Why and where most solutions miss the mark
4. How Carbonite Server delivers the five critical capabilities
Designing architectures that back up primary storage and provide rapid recoveries is a challenging task that most IT professionals face. It is even more challenging in the face of rapidly growing data sets, increasing demand for shorter recovery times and new threats like ransomware. The cost for IT to design, implement, maintain and upgrade these infrastructures can consume a big part of the IT budget. Additionally, the time required for each of these steps is something most IT teams simply don't have.
As a result, many organizations are looking for ways to move to an infrastructure-less architecture where most of the physical hardware, as well as the software intelligence, is in the cloud. The goal is to move data protection from a very unpredictable CAPEX cost to a normalized OPEX cost. An increasing number of cloud solutions claim to be infrastructure-less, but many force the organization to give up capabilities it has come to count on or lock it into long-term cloud relationships.
Attendees to the webinar will learn the challenges of maintaining an on-premises infrastructure, why current cloud solutions fall short and what IT really needs from an infrastructure-less solution.
Backup software is continuously improving. Solutions like Veeam Backup & Replication deliver instant recoveries, enabling virtual machine volumes to be instantiated directly on the backup device without having to wait for data to transfer back to primary storage. These solutions can also move older backups to higher-capacity, lower-cost object storage or cloud storage systems. Delivering meaningful performance during instant recovery without exceeding the backup storage budget requires IT to rethink its backup storage architecture.
Modern backup processes need high-performance, low-capacity systems to deliver instant recovery; high-capacity, moderate-performance systems to store backup data long term; and software to manage data placement for the most appropriate recovery performance without breaking the budget.
Edge Computing, often referred to as Distributed Cloud, is becoming a requirement for a set of new application use cases revolving around IoT, AI/ML and other new technologies requiring ultra-low latency, data thinning to reduce bandwidth costs, autonomy and privacy. These drivers mean that data center operators need to rethink their networking infrastructure as they plan for and deploy edge compute, which will result in an explosion of mini and micro data centers. New approaches leveraging open networking, next-generation SDN fabrics and network automation are required to simplify, manage and troubleshoot increasingly distributed network infrastructures.
Join Storage Switzerland's Lead Analyst, George Crump, and Pluribus Networks' Mike Capuano for our on-demand Lightboard webinar, "Using Next Generation Network Fabric for Your Distributed Cloud."
Storage Switzerland - experts on storage, server virtualization, cloud
Tune into Storage Switzerland's channel to learn from this analyst firm focused on storage, virtualization and the cloud. Storage Switzerland’s goal is to provide unbiased evaluations and interview content on sponsoring and non-sponsoring companies through articles, public events and product reviews.