Tackling the Storage Challenges of Rapid Data Growth
As data continues to grow at an alarming rate, IT departments need to be as smart about how they store their data as about how much storage they purchase. In this panel, experts from HGST and Code42 discuss how data growth is affecting the storage industry with regard to cold storage, HDDs, backup and archiving, along with best practices for developing a comprehensive storage strategy.
This week on White Space, we look back at the news from DCD Converged conference in London. We’ve also brought back a special guest - Cole Crawford, CEO of Vapor IO and purveyor of unusual rack arrangements.
We discuss various ways to reuse server heat and discover that Coca-Cola is apparently using the Internet of Things to develop new flavors of the sugary drink.
Peter looks at the reasons behind the Telecity outage in the UK - but this outage has nothing on the recent data center fire in Azerbaijan, which left almost the entire country without access to the Internet.
Also mentioned: the news that CA Technologies is getting out of the DCIM business, the reinvention of liquid cooling company Iceotope, and the fact that the US government has just discovered another 2,000 data centers it didn’t know it had.
Amazon Web Services has provided companies of all sizes with on-demand, elastic IT infrastructure services in the cloud, totaling more than 50 services. This series gives you an opportunity to ask an AWS expert questions live on any topic, and also to hear an overview of the subset of services and tools that AWS offers in big data and analytics.
Either submit questions during the session or ahead of time to firstname.lastname@example.org. In the email please include your NAME, COMPANY and JOB TITLE.
This time, staff from the Professional Services department will present an EMC service that helps develop a strategy for evolving a company's data storage IT infrastructure in line with its business priorities and plans several years ahead. The service includes an assessment of existing storage systems and the development of an optimal, vendor-neutral strategy.
IT organizations face rising challenges to protect more data and applications against growing data security threats as they deploy encryption at vastly larger scales and across cloud and hybrid environments. By moving past silo-constrained encryption and deploying encryption centrally as an IT service, uniformly and at scale across the enterprise, your organization can benefit from unmatched coverage, whether you are securing databases, applications, file servers, or storage in the traditional data center, in virtualized environments, or in the cloud, and as data moves between these environments. Complemented by centralized key management, your organization can apply data protection where, when, and how it needs it, according to the unique needs of your business. Join us on November 25th to learn how to unshare your data while sharing the IT services that keep it secure, efficiently and effectively, in the cloud and across your entire infrastructure.
This tutorial covers technologies introduced by influential papers on the Google File System, BigTable, Amazon Dynamo, and Apache Hadoop. In addition, parallel, scale-out, distributed, and P2P approaches, including Lustre, PVFS, and pNFS, along with several proprietary ones, are presented.
The tutorial also covers key features that matter at large scale, to help you understand and differentiate industry vendors' offerings.
Analysts have advised for many years that all businesses are becoming software businesses, and one of the key software platforms organisations invest in to support this is SAP's Enterprise Solutions. With this transformation, the availability of these mission-critical enterprise services becomes more important every day.
During this webinar, you can learn how:
• SAP and SUSE have collaborated for 16 years to make Linux the preferred operating system upon which SAP applications run.
• SUSE can help you unleash the power of Open Source whilst providing the most performant and resilient platform available for your SAP applications.
• Highly-Available SAP HANA, the core of SAP's next-generation business suite S/4HANA, is ONLY available on SUSE Linux.
Securelinx, one of SUSE’s Platinum Partners, which specialises in the deployment, configuration and support of Linux, is sponsoring this webinar.
In this webinar, AWS solutions architect Denis Batalov will answer pre-submitted questions on DevOps: AWS services and best practices for developing and operating software in the AWS cloud. Questions are particularly welcome on AWS services such as CloudFormation, Elastic Beanstalk, OpsWorks, as well as CodeCommit, CodeDeploy, CodePipeline and EC2 Container Service.
The virtualization wave is beginning to stall as companies confront application performance problems that can no longer be addressed effectively, even in the short term, by the expensive deployment of silicon storage, brute force caching, or complex log structuring schemes. Simply put, hypervisor-based computing has hit the performance wall established decades ago when the industry shifted from multi-processor parallel computing to unicore/serial bus server computing.
Join industry analyst Jon Toigo and DataCore in this presentation where you will learn how your business can benefit from our Adaptive Parallel I/O software by:
- Harnessing the untapped power of today's multi-core processing systems and efficient CPU memory to create a new class of storage servers and hyper-converged systems
- Enabling order of magnitude improvements in I/O throughput
- Reducing the cost per I/O significantly
- Increasing the number of virtual machines that an individual server can host without application performance slowdowns
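As a loose illustration of the parallel I/O idea described above (a generic sketch, not DataCore's Adaptive Parallel I/O software), the example below dispatches file reads across a thread pool so that several I/O requests are in flight at once on a multi-core system, instead of being serialized one at a time:

```python
# Toy sketch: issue reads in parallel from a worker pool so multiple I/O
# requests are outstanding at once. Illustrative only; file names and sizes
# are invented, and this is not DataCore's actual implementation.
import os
import tempfile
from concurrent.futures import ThreadPoolExecutor

def read_file(path):
    with open(path, "rb") as f:
        return f.read()

# Create a few sample files to read back.
tmpdir = tempfile.mkdtemp()
paths = []
for i in range(8):
    p = os.path.join(tmpdir, f"chunk{i}.bin")
    with open(p, "wb") as f:
        f.write(bytes([i]) * 1024)
    paths.append(p)

# Up to 4 reads in flight at a time; results keep input order.
with ThreadPoolExecutor(max_workers=4) as pool:
    results = list(pool.map(read_file, paths))

assert all(len(r) == 1024 for r in results)
print(f"read {len(results)} chunks in parallel")
```

With spinning disks, overlapping requests like this mostly hides seek latency; with flash and multi-core CPUs, it is what lets the hardware's much higher concurrency actually be used.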
The AWS Command Line Interface (CLI) tools provide an easy-to-use command-line interface that lets you build powerful automation scripts. In this webinar you will discover advanced techniques that open up new scenarios for using the AWS CLI. We will show you how to filter and transform service responses with JMESPath, how to chain scripts and commands to build complex automations, and explore new features and functionality.
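To make the JMESPath filtering idea concrete, the sketch below reproduces in plain Python what a CLI query such as `aws ec2 describe-instances --query 'Reservations[].Instances[?State.Name==`running`].InstanceId[]'` would extract. The sample response data is invented for illustration:

```python
# Filtering a describe-instances-shaped response, equivalent in spirit to the
# AWS CLI's JMESPath --query option. Sample data is invented for illustration.
sample_response = {
    "Reservations": [
        {"Instances": [
            {"InstanceId": "i-0aaa", "State": {"Name": "running"}},
            {"InstanceId": "i-0bbb", "State": {"Name": "stopped"}},
        ]},
        {"Instances": [
            {"InstanceId": "i-0ccc", "State": {"Name": "running"}},
        ]},
    ]
}

def running_instance_ids(response):
    """Flatten all reservations and keep only the IDs of running instances."""
    return [
        inst["InstanceId"]
        for res in response["Reservations"]
        for inst in res["Instances"]
        if inst["State"]["Name"] == "running"
    ]

print(running_instance_ids(sample_response))  # → ['i-0aaa', 'i-0ccc']
```

Doing the filtering server-side of the pipe, via `--query`, keeps shell scripts short and avoids piping full JSON responses through external tools.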
Dave Minturn, Storage Architect, Intel; J Metz, SNIA Board Member, R&D Engineer, Office of the CTO, Cisco
Non-Volatile Memory Express (NVMe) has piqued the interest of many people in the storage world. Building on a robust, efficient, and highly flexible transport protocol for SSDs, flash, and future non-volatile memory storage devices, the NVM Express group is working on extending these advantages over a networked fabric.
This live Webcast will explain not only what NVMe over Fabrics is, but also how it works. We’ll be exploring:
• Key terms and concepts
• Differences between NVMe-based fabrics and SCSI-based fabrics
• Practical examples of NVMe over Fabrics solutions
• Important future considerations
Come join us as we discuss the next iteration of NVMe.
Russ Fellows, Senior Partner & Analyst, Evaluator Group
Deploying Solid-State for Virtualized Environments: Use Cases for All Flash, Hybrid and Alternative Storage Implementations
This session dives into common use cases for all-flash and hybrid storage systems in virtualized environments at mid-size and large enterprises. Russ will focus on actual deployments, giving listeners an opportunity to learn how to build a solid business case for solid-state based on findings from enterprise firms, as well as on hands-on performance testing of multiple systems in head-to-head comparisons, providing practical information.
Review the options and architectures that are best suited for server and desktop virtualization
Understand when and where to deploy solid-state or hybrid storage to maximize your IT budget and ROI.
The bottleneck in flash storage is often the interface. SAS/SATA interfaces were designed specifically for hard disk drives, not for flash media. For example, flash storage can support many more simultaneous I/O operations. The solution is to use a different interface, one that offers higher throughput and is more directly accessible from the CPU. Historically, leveraging such an interface and extracting optimal performance from the flash media has meant leaving the confines of the SCSI protocol via customized, proprietary drivers. The result is complexity and slow innovation.
Join Storage Switzerland and OCZ, a Toshiba Group Company, for a live webinar “How NVMe Will Change Flash Storage”. In this webinar we will answer these questions:
- What is NVMe?
- Why is NVMe Flash superior to SAS/SATA SSDs?
- Is NVMe Flash superior to proprietary PCIe flash drives?
- How do you get started with NVMe?
How do YOU choose the ideal data storage solution for your virtual server environment? Do you need Flash/SSD?
This webinar focuses on common questions that storage buyers ask when looking to purchase storage for their virtualized server environment, and on solutions that can increase productivity while saving on costs.
Hubert Yoshida, CTO, Hitachi Data Systems, Greg Knieriemen, technology evangelist, Hitachi Data Systems, Adrian De Luca, CTO
Innovative technology companies that quickly capitalize on business opportunities and satisfy the demands of today’s empowered consumer have caused a wave of disruption. In 2016, businesses will turn to IT for solutions that keep them competitive. Chief information officers will invest in faster delivery of applications and analytics, and will transform IT by leveraging the third platform (social, mobile, analytics and cloud) to reduce infrastructure expenses. Learn how to avoid distractions and remain focused on the IT trends that matter in 2016, and gain the knowledge to help accelerate your IT transformation and success.
J Metz, R&D Engineer for the Office of the CTO, Cisco; Alex McDonald, SNIA-Ethernet Storage Forum Vice Chair, NetApp
When we talk about “Storage” in the context of data centers, it can mean different things to different people. Someone who is developing applications will have a very different perspective than, say, someone who is responsible for managing that data on some form of media. Moreover, someone who is responsible for transporting data from one place to another has their own view that is related to, and yet different from, the previous two.
Add in virtualization and layers of abstraction, from file systems to storage protocols, and things can get very confusing very quickly. Pretty soon people don’t even know the right questions to ask!
How do applications and workloads get their information? What happens when you need more of it? Or faster access to it? Or need to move it far away? This webinar will take a step back and look at “storage” with a “big picture” approach, examining the whole and attempting to fill in some of the blanks for you. We’ll be talking about:
- Applications and RAM
- Servers and Disks
- Networks and Storage Types
- Storage and Distances
- Tools of the Trade/Offs
The goal of the webinar is not to make specific recommendations, but to equip viewers with information that helps them ask the relevant questions, as well as gain keener insight into the consequences of storage choices.
Backup is not the most fashionable of functions within the datacentre, but it is one that every organisation faces. Nevertheless, many firms, particularly at the mid and larger end of the scale, find it challenging to sustain high performance and service quality while controlling costs.
The emergence of cloud infrastructure offerings such as AWS, Microsoft Azure and private cloud has only increased the appeal of abandoning the traditional labour-intensive, multi-vendor approach in favour of a service-led utility model.
This raises the question – are you better off continuing with an in-house solution under your control or deploying a cloud service?
The webinar will explore 5 challenges that need to be addressed by Enterprise customers looking to take advantage of the opportunity Cloud presents.
You will learn about:
* The impact on restore
* Handling bandwidth constraints
* Delivering to service levels in the cloud
* Assessing security and compliance commitments
* The true costs and the impact of change
Hyperscale clouds get all of the press, but is it really a case of “go big or go home”? The giant cloud providers offer great prices on bulk storage but the cloud can act as much more than just a backup destination. There’s a lot of value residing within the individual cloud offerings of the data protection vendors, with an entire spectrum of services ranging from standard backup storage to Managed Archiving to Disaster Recovery as a Service. And more.
Join us on this panel discussion to learn about what’s available and how to take full advantage of the range of cloud services offered so that you can precisely match vendors’ offerings to your business needs.
Arun Taneja, Founder & Consulting Analyst; Taneja Group (Moderator)
Christophe Bertrand, Vice President of Product Marketing; Arcserve
Austin McChord, CEO & Founder; Datto
Abhijit Dinkar, CTO, Dell Data Protection; Dell
Mark Campbell, CTO; Unitrends
Fred Pinkett, Senior Director of Product Marketing
File storage is an ever-growing, costly, complex problem. File sprawl is out of control, forcing organizations to build storage environments with file servers or NAS, backup servers and media, disaster recovery sites, WAN optimization, replication, remote storage, and mobile solutions. Adding insult to injury, these solutions run out of capacity and require expensive upgrades and painful migrations.
Cloud NAS does away with all of this by providing local NAS controllers that cache active files from a cloud-based file system. This maintains the same user experience and performance, and adds collaboration and mobile access, while securely eliminating separate backup, DR and remote data access solutions. Learn how the cloud enables file storage with unlimited scale that saves companies 40-60% of their storage costs and eliminates refreshes and migrations forever.
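The caching mechanism behind a cloud NAS controller can be sketched in miniature: the authoritative copy of every file lives in the cloud file system, while a small local cache holds only the most recently used (active) files. The class and names below are invented for illustration; real products add write-back, snapshots, and global file locking.

```python
# Conceptual sketch of a cloud NAS edge cache: cloud store is the file system
# of record; a small LRU cache keeps active files local for NAS-like speed.
# Invented for illustration; not any vendor's actual implementation.
from collections import OrderedDict

class EdgeCache:
    def __init__(self, cloud_store, capacity=2):
        self.cloud = cloud_store       # full file system of record
        self.cache = OrderedDict()     # local cache of active files (LRU order)
        self.capacity = capacity

    def read(self, path):
        if path in self.cache:         # cache hit: serve locally
            self.cache.move_to_end(path)
            return self.cache[path]
        data = self.cloud[path]        # cache miss: fetch from the cloud
        self.cache[path] = data
        if len(self.cache) > self.capacity:
            self.cache.popitem(last=False)  # evict the least recently used file
        return data

cloud = {"/a.txt": b"alpha", "/b.txt": b"beta", "/c.txt": b"gamma"}
nas = EdgeCache(cloud)
nas.read("/a.txt"); nas.read("/b.txt"); nas.read("/c.txt")
print(sorted(nas.cache))  # /a.txt was evicted → ['/b.txt', '/c.txt']
```

Because every file is always in the cloud, losing or replacing the local controller loses only cached copies, which is what lets a design like this stand in for separate backup and DR tiers.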
Web-based companies figured out a way to make high-capacity storage infrastructures using low-cost, commodity hardware. While this concept makes sense for large ‘hyper-scale’ companies, it’s not easily implemented by traditional IT organizations. The Open Storage Platform (OSP) is a model that the industry is using to define the hyper-scalers’ approach and provide a roadmap for building highly scalable, economical storage environments.
Join the Evaluator Group for this informative webinar to learn about OSP and what it can do for corporate IT. On the webinar we will:
• Discuss the challenges of implementing hyper-scale infrastructures
• Describe the Open Storage Platform and how it addresses these challenges
• Explain how companies can use OSP to deploy their own private clouds
Join us for our Storage Update. Each update covers the latest news, provides technical education, gives our opinion on the latest trends in storage, updates you on the latest briefings we've attended, and, of course, takes questions and comments from our audience. We'll cover everything you need to know in 30 minutes, but we will leave the line open for an hour to take questions. No registration is required; simply tune in to get your 30-minute storage update, or hang around for the full hour for the interactive Q&A.
John Peluso, Senior Vice President of Product Strategy, AvePoint
With the increase in cloud computing and easy-to-use file-sharing systems in the workplace, it has become increasingly difficult for IT departments to keep track of data and maintain a secure environment. When staff need to access or share data quickly, they no longer need to rely on IT to provide the tools to do so. Why would they go through the red tape of IT procurement, provisioning, testing, and security when they can find a solution themselves in a matter of seconds?
Join John Peluso, Senior Vice President of Product Strategy at AvePoint, as he presents how organisations have decided to call a truce and provide self-service provisioning and management. In this webcast, he will discuss:
- The dangers of having silos of information that IT and the business are unaware of – disconnected from the centralized servers and storage of the data center or even approved cloud services
- How all of this information may be absent from aggregated capacity, secured content, usage, and other reporting at higher levels, which can complicate business decisions.
- What an organisation can do to proactively manage any rogue IT by providing self-service to end users
Nearly 90 percent of today’s enterprises will be pursuing a hybrid cloud solution in the next 12 months. While these hybrid cloud architectures bring numerous benefits like flexibility, efficiency, and cost savings, they also introduce a multitude of challenges.
Service uptime and recoverability can become unpredictable in these fragmented, multi-vendor, hybrid cloud environments.
Attend this webcast and learn how to:
· Make hybrid cloud services safe, predictable and available.
George Crump, Founder and Lead Analyst, Storage Switzerland
As disaster recovery pressures mount, businesses of all sizes are looking to the cloud to solve their recovery challenges. But can the cloud be counted on when the business needs it most? Join George Crump, Lead Analyst with Storage Switzerland, in this interactive webinar as he covers the different types of cloud backup and disaster recovery solutions available, and provides an economic model to help businesses determine whether the cloud is worth it. As always, we will leave plenty of time at the end of the webinar for questions and answers.
All attendees will receive an exclusive copy of George’s latest white paper “Cloud DR Economics - Can You Cost Justify it?”. This paper is not available anywhere else; you must register to get your copy.
Scott Jeschonek, Director of Product Management, Cloud at Avere Systems
With datasets continuing to grow and cloud computing getting easier to set up and use, one of the biggest challenges facing IT departments responsible for Apache Hadoop environments is how to efficiently support data storage. The main choice for moving data to cloud-based Hadoop deployments is HDFS, but the associated efficiencies and processes have not yet reached a level that is sustainable, repeatable, and affordable for IT organizations.
In this 30-minute webinar, speaker Scott Jeschonek will look at current best practices for running HDFS and Hadoop entirely in the cloud. Next, he will highlight the limitations of this approach by contrasting classic on-premises Hadoop with Hadoop in the cloud, suggesting that the two are not the same. Finally, he will propose a new approach that enables the use of Hadoop compute clusters while data is kept at its on-premises source instead of being loaded into the cloud. Participants will be able to consider a "Hybrid Hadoop" workflow that provides the economics needed for replication, more flexibility in data source location, and the efficiency of eliminating the load process before cloud compute jobs run.
Your world will be hybrid for the foreseeable future. You will combine a mix of on-premises private cloud and public cloud infrastructure to meet ever-changing business demands. But how do you build a storage architecture designed to support this hybrid world?
This session will provide insight on how software-defined storage facilitates hybrid cloud architectures. We’ll present four scenarios and case studies where companies are thinking differently about “cloud storage.”
Specifically, you’ll learn how you can use software-defined storage to:
1. Build an OpenStack “landing pad” to onboard applications born in public clouds like AWS.
2. Create hybrid AWS-based applications with on-premises infrastructure that migrates older, colder data to the public cloud.
3. Treat public clouds as “DR sites” where applications are automatically protected across data centers and clouds.
4. Store data on private, hosted infrastructure that bursts to the public cloud for compute.
Karna Bojjireddy, Product Manager, Security Softlayer, IBM Cloud. Shawn Mullen, Cloud Architect, IBM Cloud
Security is the number-one inhibitor for most organizations looking to move to the cloud. Privileged user access, data location, data residency, boundary control, regulatory compliance, data segregation, and IT support are some of the key security concerns. SoftLayer is the first cloud company to offer Intel® Trusted Execution Technology (Intel® TXT) as an additional method to secure customers' infrastructure, down to the chip level. Organizations can take advantage of all that cloud has to offer for a growing set of workloads and data sets, confident that their data is secure and meets their security and compliance needs.
Jason Leiva, Solutions Architect & Eric Bassier, Sr Director, Datacenter Products
Boost VM performance and reduce backup costs with Veeam 9 and Quantum tiered backup storage. From deduplication, to hybrid storage, to tape, to cloud, learn how Quantum’s portfolio of tiered backup storage works with Veeam 9 to boost VM performance while reducing backup costs.
The hottest topics for storage and infrastructure professionals
The Enterprise Storage channel has the most up-to-date, relevant content for storage and infrastructure professionals. As data centers evolve with big data, cloud computing and virtualization, organizations are going to need to know how to make their storage more efficient. Join this channel to find out how you can use the most current technology to satisfy your business and storage needs.