Case Study: Open, Scalable, Shared Storage for Geospatial Data
Join Esther Spanjer, Director of Business Development EMEIA at Western Digital and Janusz Bak, CTO of Open-E for this webinar.
They will discuss the challenges that Aviation Accounting Center LLC, an engineering company specializing in geospatial data processing, was facing when planning the expansion of its IT infrastructure. The company's existing standalone servers did not provide the capacity, availability, and performance needed for storing and accessing its geospatial data. Esther and Janusz will walk you through the proposed solution and how it met the customer's scalability, capacity, throughput, connectivity, and high-availability requirements.
Recorded: Jun 12, 2018 (34 mins)
Chris Marsh – Western Digital Marketing; Nir Paikowsky – VP OEM Enablement, ScaleMP, Inc.
SAP HANA data sets are growing to multiple terabytes. But growing SAP HANA data sets don't have to mean significant computing infrastructure costs to scale HANA for cloud or on-premises deployments. From this webinar, you'll learn:
- Scale TBs of system memory for SAP HANA data sets at a better TCO
- How this works for HANA on-premises deployments and HANA cloud infrastructure/services
- How to expand HANA server memory in a cost-effective manner vs. DRAM configs
- Overview on HANA system memory configs for: single instance, multi-tenancy, non-production environments, etc.
- How a better HANA compute infrastructure TCO can be achieved by using the Ultrastar® DC ME200 Memory Extension Drive to scale HANA memory pools with fewer nodes and at reduced DRAM cost
Intended audience: Cloud Architects, SAP HANA managers, SAP BASIS Admins, “HANA-as-a-Service” providers
Matthew Morris, Sr. Solutions Manager, Data Center Systems, Western Digital
There is no slowing down the amount of data that companies are collecting, managing, and trying to decipher to understand how to shape their business, both today and for the future. With all of this data, there are growing constraints on how it can be stored and leveraged for actionable insights, and there are a number of technology considerations and questions, including, for example:
>Is it better to store in the cloud or deploy a hybrid cloud initiative?
>What’s the difference between scaling out and scaling up? Which should we consider?
>How do you manage your data growth with Oracle or Microsoft SQL Server?
In this webinar, we will answer those questions while also sharing how IntelliFlash NVMe™ simplifies data management challenges. We will dive into the traditional infrastructure elements, such as capacity, performance, and consumption of data, and how Oracle and SQL Server databases have developed scale-out and scale-up approaches. We will provide guidance on how to work with these topologies. Through various use-cases, we will share the impact that these architectures can have on infrastructures, specifically how NVMe and/or all-flash arrays enable a flexible, scalable infrastructure that accelerates time to insights.
Massive data growth, increasing application density, and compliance requirements are driving organizations to deploy more efficient data protection solutions. As a result, backup and storage admins are leveraging snapshots, in-line deduplication, and new NVMe flash technology to drive 24x7 operations and continuous data protection, and to improve their RTO and RPO SLAs with an on-premises cloud.
We invite you to join us for this webinar to learn how:
>Modernizing your backup and recovery operations for virtual and physical infrastructure can transform your business
>NVMe unlocks the true potential of solid-state flash technology
>Flash and Cloud Object Storage Systems from Western Digital integrate with Veeam and Rubrik stacks to accelerate your storage and data protection services
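The in-line deduplication mentioned above rests on a simple principle: split incoming data into chunks, fingerprint each chunk, and store only the chunks that haven't been seen before. A minimal Python sketch of that principle follows; fixed-size 4 KiB chunks and SHA-256 fingerprints are illustrative choices here, not how any particular product implements it:

```python
import hashlib

CHUNK_SIZE = 4096  # fixed-size chunks; real systems often use variable-size chunking


def dedup_store(data: bytes, store: dict) -> list:
    """Split data into chunks, keep only unseen chunks in `store`,
    and return the list of chunk fingerprints (the 'recipe')."""
    recipe = []
    for i in range(0, len(data), CHUNK_SIZE):
        chunk = data[i:i + CHUNK_SIZE]
        digest = hashlib.sha256(chunk).hexdigest()
        if digest not in store:  # store each unique chunk only once
            store[digest] = chunk
        recipe.append(digest)
    return recipe


def rehydrate(recipe: list, store: dict) -> bytes:
    """Reassemble the original data from its recipe of fingerprints."""
    return b"".join(store[d] for d in recipe)
```

Writing the same data a second time adds no new chunks to the store, which is where the capacity savings come from.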
Matt Hebb, Product Marketing, Data Center Systems, Western Digital
As you likely know, the reason companies deployed virtualization in the beginning was to get more out of their hardware – and make their lives a little easier – and it is the very same reason they are deploying it today. As a result, virtualized servers are the de facto standard in today's data centers. While it's true that virtualization lowers IT infrastructure costs and allows enterprises to respond to new business needs quickly, it also creates an additional layer of infrastructure complexity – complexity that the IT team has to manage across both the ESX and storage array layers.
We get it – it’s a double-edged sword. Virtualization is meant to lower costs and allow quicker response time, but when the infrastructure becomes complex, the resources required to manage it could increase and offset the savings you were hoping to achieve. Join us for this webinar where we’ll address how IT can more easily manage their virtual environment.
In this webinar, we'll introduce you to the IntelliFlash vCenter Plug-in and share how the product offers visibility into storage datastores, the ability to self-provision VMs, and a consistent view of VM-level performance metrics. In addition to better understanding the IntelliFlash vCenter Plug-in, you will learn:
>How to simplify your VMware administrative tasks
>How to take full advantage of All-Flash Arrays to create and deploy denser, faster VMs
Erik Ottem, Sr. Director, Data Center Systems at Western Digital Corporation
Virtualization has been around for quite a while, so now might be a good time to review what you're doing from a data perspective. Classic hardware virtualization has been the key to consolidation efficiencies and savings. Now data virtualization can provide similar benefits in a different way, and a modern data strategy should consider its benefits and costs. Some people are surprised to find out that hardware virtualization is not the end of the story – there is data virtualization as well. In this webinar we'll take a look at how the right data strategy can improve the way you provide value to business operations by making data more available, reducing silos, and reducing the amount of management required.
Join Erik Ottem, Senior Director, Data Center Systems at Western Digital Corporation, as he shares tips on how to get the most from a virtualized environment from a data perspective.
In this webinar you’ll learn:
- Data Strategy: Why organizations must start with a data strategy; the business value of building a
data strategy and aligning infrastructure planning with that strategy
- Shifting from Hardware Virtualization to Data Virtualization: Why viewing your virtualization strategy
from a different lens is imperative to driving a competitive business advantage
- Cost Savings: How implementing a data strategy can help you spend less on both building and
managing data infrastructure
- Virtualization-Worthy Data: How to identify what data is better housed in a virtual environment;
including identifying the single source of truth (Golden Copy) and managing data quality
- Case Study: we’ll take a look at how data virtualization can work in the real world and the benefits that
have been realized.
Mark Miquelon; Director, Product Management, DCS; Jeff Nicholson, Senior Technologist, Engineering, DCS
The scale and diversity of data that is produced today is forcing an evolution of data center application development and deployment, and as a result, we are beginning to see more applications deployed in containers than virtual machines. While valuable, there are numerous IT challenges that cannot be addressed simply by deploying a container. Organizations are constantly facing new and changing performance and agility requirements while needing to scale up and down on-demand. Today's data center, even with the advent of new technologies like containers and virtual machines, is too rigid to efficiently meet the demands of the velocity, scale, and variety of today's data. As a result, many companies are considering how they can "compose" – build, tear down, and re-deploy – IT resources on-the-fly. An open, non-proprietary API that enables this composability is a critical element in meeting this challenge.
Join us to learn:
>Why organizations are moving from conventional server and storage implementations to composable infrastructure
>What differentiates composable from open composable infrastructures and the benefits of an open composable infrastructure
>How Western Digital’s E3000 and F3000 Fabric Attached Devices enable an open composable infrastructure
>Benefits and how-to create and deploy virtual systems, then tear them down for re-use by using an open composable API
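To make the idea of an open composable API concrete, here is a minimal Python sketch of how a client might build requests to attach and detach a fabric-attached device over REST. The endpoint host, paths, and payload fields below are hypothetical, chosen only to illustrate the compose/tear-down pattern; consult Western Digital's Open Composable API documentation for the real interface:

```python
import json

# Hypothetical fabric-manager endpoint, for illustration only.
BASE = "https://fabric-mgr.example.com/api"


def compose_request(host_id: str, device_id: str) -> tuple:
    """Build (but do not send) an HTTP request that would attach a
    fabric-attached NVMe-oF device to a host."""
    url = f"{BASE}/hosts/{host_id}/attachments"
    payload = json.dumps({"device": device_id, "protocol": "nvme-of"})
    return ("POST", url, payload)


def decompose_request(host_id: str, device_id: str) -> tuple:
    """Build the matching detach request, returning the device
    to the free pool for re-use."""
    url = f"{BASE}/hosts/{host_id}/attachments/{device_id}"
    return ("DELETE", url, None)
```

The compose/decompose symmetry is the point: because resources are attached and released through an API rather than cabling, a virtual system can be built, torn down, and its devices re-used without touching hardware.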
Erik Ottem, Sr. Director, Western Digital; Thomas Davenport, Harvard Business Review; Mark Peters, Sr. Analyst, ESG Global
Data is integral to your company’s success. Having the right strategy for your data is as important as the right strategy for your company. Creating an overall data strategy, as well as the supporting workload specific strategies can save money and improve effectiveness.
In this webinar you’ll learn:
How a data strategy can impact your business, from HBR author Thomas Davenport
How a data strategy can support business acceleration from ESG Global senior analyst Mark Peters
How ActiveScale can support a golden copy, single source of truth strategy
How IntelliFlash can support data acceleration that can accelerate your business
When to buy a storage product or a service to implement your business strategy
In the '90s, if you had current storage technology you were well served. But now, over two decades later, there are a multitude of enterprise storage alternatives, and sifting through the options to identify what is necessary and what will drive actionable insights can be daunting. How do you decide whether to use on-premises storage or the public cloud? Should you use both? If you're struggling with how to better manage the explosion of data and the proliferation of storage options, this webinar is for you.
Join Erik Ottem, senior director, Data Center Systems at Western Digital Corporation, as he shares tips on deciding where to put your organization’s data and why.
In this webinar you'll learn how to weigh the considerations for cloud or on-prem storage when evaluating your data strategy, including:
- Data Security: What are the tradeoffs between your shop and the public cloud
- Cost: When considering cost, what workloads are better on-premises
compared to public cloud
- Golden Copy: Why a single source of truth/multiple versions of truth is a key
consideration to support decision making
- Hybrid Cloud: How to determine if combination of on-prem and public cloud
might be the right option
- Data Strategy: Why your choices – regardless of workload – must align to a
greater data strategy
As organizations do their best to gain value from their unique data assets and unlock critical information, the way data interacts is changing. Nowhere is this revolution having a greater impact than in the data center.
Join IDG and Western Digital for this exclusive webinar to learn why Open, Scalable, Disaggregated and Extensible are the keywords for the next-gen data center.
Phil Bullinger, Western Digital’s Senior VP and General Manager for Data Center Systems and Freeform Dynamics’ Tony Lock
Enterprises of all sizes have unprecedented opportunities to drive value from data, delivering better customer experiences and increasing business value, both in terms of service delivery and embracing technology trends such as machine learning and AI. That’s the top line: in practical terms this means knowing how to architect technology infrastructure - network, storage and processing — to deliver performance and scale.
From a storage perspective, the option of going all-flash is now familiar; meanwhile, new connectivity mechanisms such as NVMe-oF™ are looking to change the game once again. Evolving storage architectures are driving a move away from monolithic infrastructure and towards a composable, open architecture model. The benefits are profound, in terms of delivering a dynamic response to the needs of changing workloads, and hitting the target in terms of massively scalable service delivery.
So, how can organizations prepare? In this webinar, with Phil Bullinger, Western Digital’s Senior Vice President and General Manager for Data Center Systems and Freeform Dynamics’ Tony Lock, we look at:
- The current state of play - the growing need for performance, both in terms of data
management and service delivery
- The technology opportunity - how all-flash and NVMe-oF™ are changing how we think
about data infrastructure
- Open, composable infrastructure - what this means, its benefits, and the scenarios where it applies
- Starting the journey - how to move from current, monolithic models towards an open, composable architecture
If you’re an architect looking to design performance around machine learning and AI, or are looking for a more general view on how you can build an infrastructure fit for the future, tune in.
Erik Ottem, Sr. Director, Data Center Systems, Western Digital and JuneAn Lanigan, Global Head of Enterprise Data Mgmt.
The world is awash in data, and now there are technologies that can capture and store unprecedented volumes of data. Saving all this data may not be affordable, so how do you decide what to keep or discard? Data Strategies are the latest way to make sure your business needs and data infrastructure are aligned for maximum value.
Join Erik Ottem, senior director, Data Center Systems at Western Digital, as he interviews JuneAn Lanigan, Global Head of Enterprise Data Management at Western Digital, discussing the evolution of data and how developing a data strategy is critical to both managing diverse datasets and aligning data to drive actionable business insights.
In this unique Q&A-style webinar, you'll hear from someone who is in the data trenches every day, with their perspective on:
- How data has changed over the years – types of data and its impact on business
- What data is critical to ongoing operations and how must it be protected
- Western Digital's experience building and deploying a Big Data Platform
- How to manage data quality issues that may impact analytics workloads
- Futureproofing your data infrastructure for maximum ROI
Register today to hear from someone, like you, who has watched the world of data change and who has effectively built and deployed a data strategy to drive business impact.
Dave Montgomery, Director of Platforms, Western Digital
As discussed in our May webinar, NVMe-oF 101, today's dynamic data requirements call for a next-generation infrastructure that aligns storage resources with workloads without jeopardizing performance or capacity, or incurring unwarranted expense – and this is where Non-Volatile Memory Express over Fabrics (NVMe-oF) comes into play.
To continue our NVMe-oF discussion, in this follow-up webinar we'll dive into Western Digital's OpenFlex™ line of NVMe-oF products. We'll share how OpenFlex can address these issues today – with dynamic control of the blend of storage resources that each application needs – enabling both flexibility and composability of flash- and disk-based resources.
In this webinar, you will learn:
>How composability and NVMe-oF address the dynamic nature of today’s IT application environment (recap)
>The challenges in using server-based NVMe devices
>The architecture of the OpenFlex product line
>How Western Digital’s OpenFlex product line accelerates workloads and improves IT asset ROI
With more and more connected devices, data – specifically file data, or unstructured data – is everywhere. Structured or unstructured, the bottom line is that everyone simply wants to understand the business value of their data. To unlock its full potential, it is important to create a data strategy for files. Regardless of data type, files go through a lifecycle: the data might be used heavily at first, then less often, then perhaps archived. An effective data strategy for files can yield important benefits, like improved storage performance, better file access, and lower costs, but more importantly, it can help unlock true business intelligence.
In this webinar you’ll learn about:
>Data trends: Structured vs. Unstructured
>The wide variety of file types
>Useful strategies to manage file data, including how to repurpose files for additional value, maintain performance for active files, and how to lower cost for less active files
>How to extract business intelligence from files
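The lifecycle described above – heavily used at first, then less often, then archived – can be acted on mechanically. Here is a minimal Python sketch that classifies files by last-access age so that hot files stay on fast storage while cold files become archive candidates; the 30- and 180-day thresholds are illustrative assumptions, not recommendations from any product:

```python
import os
import time

# Illustrative thresholds: tune these to your own access patterns.
HOT_DAYS, WARM_DAYS = 30, 180


def classify(path: str, now: float = None) -> str:
    """Classify a file as hot/warm/cold by its last-access time."""
    now = time.time() if now is None else now
    age_days = (now - os.stat(path).st_atime) / 86400
    if age_days <= HOT_DAYS:
        return "hot"    # keep on the performance (e.g. NVMe) tier
    if age_days <= WARM_DAYS:
        return "warm"   # candidate for a capacity tier
    return "cold"       # candidate for archive/object storage


def tier_report(directory: str) -> dict:
    """Count files in each tier under `directory`."""
    counts = {"hot": 0, "warm": 0, "cold": 0}
    for root, _, files in os.walk(directory):
        for name in files:
            counts[classify(os.path.join(root, name))] += 1
    return counts
```

A report like this is the starting point for the repurpose/maintain/lower-cost decisions the webinar describes: it tells you how much of your file estate is actually active before you commit to moving anything.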
For those of us who spend our working lives around data centers, we know a good deal about the incredible technology inside storage devices, whether they are spinning hard drives or SSDs. But we tend not to think so much about the storage enclosures. After all, they just hold the drives and provide connectivity, power, and cooling, right? But actually, the enclosures have some very significant attributes that can influence performance, reliability, and cost – and this is especially true at scale, where we must maximize storage density.
In this webinar, you will learn:
- About today’s density challenges at scale
- How storage enclosures influence performance, reliability and cost
- Why holistic engineering – from Silicon to Systems – is important and how it drives
innovation in the data center
It's hard work to uncover real business results through analytics. It's common to spend more time curating data than improving your algorithms. When diving into analytics for business value, you have to roll up your sleeves and prepare to put in the work. A data strategy for analytics may include finding real data sets from many sources that can be used to train your machine as quickly and effectively as possible, but one thing is imperative: the scalability and manageability of data are critical to the success of your analytics project.
In this webinar you’ll learn:
>Why an analytics strategy is paramount in driving business results
>Questions you should ask before building a strategy for analytics
>How to identify the different data needs for typical analytic workloads
Rajeev Sharma, Senior Product Manager, Data Center Systems
Businesses across the globe are grappling with how to not only capture and preserve their data, but more importantly, how to transform that data into actionable insights. After all, data is the currency of the modern economy, and the speed with which an organization can unlock the possibilities of its data is paramount to its success. As a result, many organizations are turning to NVMe – a protocol that is disrupting all-flash array architectures – to provide higher performance and improve their data's time-to-value. But transitioning an infrastructure to NVMe may bring up questions, specifically about how this will impact existing FC SAN and iSCSI environments.
In this webinar, you will learn:
- Why organizations are considering NVMe, and benefits of NVMe-based storage
- How to address the challenges of transitioning to NVMe-based storage
- How to leverage NVMe-based solutions in existing FC and iSCSI storage
- How to effectively consolidate storage silos without any trade-offs in performance
- How Western Digital's IntelliFlash N-Series enables high performance and low
latency for the most demanding workloads – your mission-critical applications
Dave Montgomery, Director, Storage Platforms Marketing, Western Digital
Every IT shop is trying to balance two different objectives: 1) increase capacity; 2) deliver the requisite performance against a variety of workloads. Indiscriminately spending money that doesn't align capacity and storage performance with the workloads' needs is wasteful. However, connecting storage capacity and performance across the network with workloads is an easy way to allocate storage resources cost-effectively, as needed.
To address today’s dynamic data requirements, we must consider that end users need a next-generation infrastructure that aligns storage resources with workloads without jeopardizing performance, capacity, or incurring unwarranted expense. Non-Volatile Memory Express over Fabrics (NVMe-oF) is a storage protocol that aims to simplify the interconnection of computer memory, storage, and networking, while improving performance of the storage infrastructure and related applications, to meet IT departments’ objectives and drive business value.
In this webinar, you will learn:
• What is NVMe-oF
• The business value of NVMe-oF
• The role that NVMe-oF will play in transforming data centers of the future
Manfred Berger, Sr. Mgr. Business Development Platforms EMEIA, Western Digital; Michel Portelli, Sr. Dir. EMEA Marketing, DataCore
Software-defined storage (SDS) is a growing trend in the data center, as IT managers move away from traditional Tier-1 OEM solutions in an effort to increase flexibility and reduce cost. But putting together an SDS solution from various hardware and software components can be challenging. Enter the pre-tested reference architectures from Western Digital and DataCore, which take on the storage solutions of Tier-1 OEM vendors by providing equally high performance and world-class reliability – at a much more attractive price point, and with much greater flexibility in their tiering capability.
Join Manfred Berger from Western Digital’s Business Development team and Michel Portelli, Senior Director EMEA Marketing at DataCore Software for this webinar, in which you will learn more about:
1. DataCore SDS Software Solution
2. Western Digital hardware JBOD and server platforms
3. Western Digital and DataCore's joint storage solutions to take on the SDS market
Erik Weaver, Global Director, M&E Market Development, Western Digital; Tridib Chakravarty, CEO, StorageDNA
In part-two of our DNAFabric series, we will dig deeper into how abstraction, combined with Tableau, enables a new generation of data insights. To empower decision-makers to derive actionable insights from large data sets, we will highlight the power of laying a foundation that supports next-generation artificial intelligence and machine learning applications to both see and understand data.
David Ridgeway, Senior Manager of Product Marketing, Data Center Systems, Western Digital
Accelerating time to insights is the new currency and lifeline for your business. To tap into these insights, today's businesses need an advanced storage option that delivers performance, simplicity, and economics. Success requires you to take advantage of the latest high-performance storage constructs – flash memory, NVMe standards, and fast data storage – to eliminate bottlenecks, management complexity, and the high cost of traditional storage solutions.
In this webinar you will learn how NVMe enables higher performance all-flash arrays (AFAs) with the requisite enterprise-class management and high efficiency for cost-effective on-premises storage. Even if you’re considering your first AFA, this webinar will provide the information you need when deploying next-generation NVMe AFA solutions:
-Learn about the benefits that NVMe AFAs bring to the enterprise, such as fast data acceleration and accelerated multi-site disaster recovery
-Discover five key features to look for in next-generation AFAs
-Find out how CIOs are using NVMe All-Flash storage to support mission-critical applications
-Discover, with support from the Forrester Total Economic Impact Study, how flash storage helps consolidate multiple tiers of storage to save on power, cooling, and rack space at a lower total cost
Western Digital Corporation is an industry-leading provider of storage technologies and solutions that enable people to create, leverage, experience and preserve data. The company addresses ever-changing market needs by providing a full portfolio of compelling, high-quality storage solutions with customer-focused innovation, high efficiency, flexibility and speed. Our products are marketed under the HGST, SanDisk and WD brands to OEMs, distributors, resellers, cloud infrastructure providers and consumers.
Case Study: Open, Scalable, Shared Storage for Geospatial Data
Esther Spanjer, Director, Enterprise Business Development EMEAI; Iro Maragkou, Sales Operations Manager, Open-E
34 mins