
Storage

  • How to Save on Oracle Licensing Fees by Replatforming with Dell EMC: Bart Sjerps | Bart Sjerps - Principal Systems Engineer | Jan 24 2017 11:00 am UTC | 60 mins
    Database license fees drive over 80% of total system cost. Many organizations virtualize their applications, but Oracle is often an exception for a variety of reasons.
    You will learn why re-platforming Oracle databases on better hardware can drive down TCO by a significant amount. Bart will also cover technical challenges and benefits, as well as myths and facts about licensing Oracle on VMware, and how to deal with Oracle license audits and still stay compliant.
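    To make the licensing arithmetic concrete, here is a back-of-the-envelope sketch of why core count dominates Oracle license cost and why replatforming onto fewer, faster cores can cut TCO. The per-core list price, core factor, and core counts are assumptions for illustration only, not figures from the webinar.

    # Rough illustration of per-core Oracle licensing economics.
    # All prices and core counts are assumptions, not webinar figures.
    LICENSE_PER_CORE = 47_500      # assumed list price per processor license
    CORE_FACTOR = 0.5              # assumed x86 core factor

    def license_cost(cores: int) -> float:
        """Licenses required scale with physical cores times the core factor."""
        return cores * CORE_FACTOR * LICENSE_PER_CORE

    old_platform_cores = 32        # slower CPUs, more cores needed to meet the SLA
    new_platform_cores = 16        # faster per-core hardware after replatforming

    savings = license_cost(old_platform_cores) - license_cost(new_platform_cores)
    print(f"Old platform licenses: ${license_cost(old_platform_cores):,.0f}")
    print(f"New platform licenses: ${license_cost(new_platform_cores):,.0f}")
    print(f"License savings:       ${savings:,.0f}")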
  • Datacenters and Storage Models of the Future | Randy Kerns | Jan 24 2017 4:00 pm UTC | 60 mins
    Every new or recycled idea that comes along is, according to some, going to be the 'only way things are done in the future.' This narrow view has been offered in the area of storing and managing information since modern computing began. The reality is much messier: different approaches to handling information coexist, and the processing and management of information mostly rely on unrelated solutions. This BrightTALK session will look at how datacenters are evolving and how information is handled, including the storage technologies employed, and will discuss several different approaches.
  • 2017 Trends in Cloud | William Fellows, Founder and Research Vice President - Cloud, and Melanie Posey, Research Vice President for VotE - Cloud | Jan 24 2017 4:00 pm UTC | 60 mins
    Digital transformation is a business imperative, not an option. Cloud is powering this change: It provides the new style of IT that supports the information, process and platform transformations required to link technology and information assets with marketing and customer experience to deliver new or enhanced digital processes.

    Join William Fellows, Founder and Research Vice President for Cloud, as he discusses what’s coming this year in cloud, and how the industry’s major battleground in 2017 will be how, from where and at what price cloud transformation services are delivered to customers.
  • Introduction to Swordfish: Scalable Storage Management | Richelle Ahlvers, Chair, SNIA Scalable Storage Mgt Technical Work Group; Principal Storage Management Architect, Broadcom | Jan 24 2017 6:00 pm UTC | 75 mins
    The SNIA Swordfish™ specification helps to provide a unified approach for the management of storage and servers in hyperscale and cloud infrastructure environments, making it easier for IT administrators to integrate scalable solutions into their data centers. Created by SNIA’s Scalable Storage Management Technical Work Group (SSM TWG), SNIA Swordfish builds on the Distributed Management Task Force’s (DMTF’s) Redfish specification using the same easy-to-use RESTful methods and lightweight JavaScript Object Notation (JSON) formatting. This session will present an overview of SNIA Swordfish including the new functionality added in version 1.1 released in January 2017.
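    As a taste of the RESTful, JSON-based model described above, the minimal sketch below queries a Redfish/Swordfish service with plain HTTPS GETs using Python. The host, credentials, and the StorageServices path are illustrative assumptions; consult the Swordfish specification for the exact resource model a given implementation exposes.

    import requests

    BASE = "https://storage-controller.example.com"   # hypothetical management endpoint
    session = requests.Session()
    session.auth = ("admin", "password")               # placeholder credentials
    session.verify = False                             # lab-only; verify certificates in production

    # The service root is common to Redfish and Swordfish.
    root = session.get(f"{BASE}/redfish/v1/").json()
    print("Service:", root.get("Name"), root.get("RedfishVersion"))

    # Walk an assumed Swordfish storage collection and print member URIs.
    services = session.get(f"{BASE}/redfish/v1/StorageServices").json()
    for member in services.get("Members", []):
        print("Storage service:", member.get("@odata.id"))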
  • Future-Proof Your Private Cloud with Software-Defined Storage | Rob Whiteley, VP of Marketing, Hedvig Inc. | Jan 24 2017 7:00 pm UTC | 45 mins
    Many large organizations either can’t or won’t go all-in on the public cloud. Regulatory compliance, legacy infrastructure, and costs drive these organizations to build private clouds. Do you invest in VMs or containers? Do you use OpenStack, Kubernetes, or Mesos to orchestrate your cloud? How will you bridge your private cloud with AWS, Microsoft Azure, or Google Compute Cloud? You need a flexible architecture that not only embraces all of the above, but also facilitates frictionless movement among them.

    Specifically, you’ll learn how you can use software-defined storage to:
    1. Support any hypervisor, container, or cloud platform – simultaneously.
    2. Plug into any cloud orchestration tool to automate your private cloud.
    3. Integrate public cloud – and not just bolt it on – when you’re ready.
    4. Create self-service storage and sandboxes for DevOps environments.
  • Solving Enterprise Business Challenges Through Scale-Out Storage & Big Compute | Michael Basilyan, Google Cloud Platform; Scott Jeschonek, Avere Systems; Jason Stowe, Cycle Computing | Jan 24 2017 8:00 pm UTC | 60 mins
    Google Cloud Platform, Avere Systems, and Cycle Computing experts will share best practices for advancing solutions to big challenges faced by enterprises with growing compute and storage needs. In this “best practices” webinar, you’ll hear how these companies are working to improve results that drive businesses forward through scalability, performance, and ease of management.

    In this webinar, you will learn:
    - How enterprises are using Google Cloud Platform to gain compute and storage capacity on demand
    - Best practices for efficient use of cloud compute and storage resources
    - How to overcome the need for file systems within a hybrid cloud environment
    - How to eliminate latency between cloud and data center architectures
    - How to best manage simulation, analytics, and big data workloads in dynamic environments
    - The market dynamics drawing companies to new storage models over the next several years

    In just 60 minutes, you’ll gain a foundation for building infrastructure that supports ongoing demand growth, and you’ll have ample opportunity to put questions directly to the presenters.
  • “The Best Way To Predict Your Future Is To Create It.” | Peter McCallum | Jan 24 2017 9:00 pm UTC | 45 mins
    More than ever before, businesses are requiring IT to become less about infrastructure and more about the availability and mobility of data. Yesterday’s IT shops were defined by the technology they chose to use rather than the capabilities they could deliver to the business and its workforce. End users have become more savvy and less willing to accept the limitations of traditional, static services. Technology has to adapt to the timing, location and value of data on a global basis, without restrictions. Based on these trends, FreeStor grew from roots in static infrastructure to support the widest interoperability across storage platforms, protocols, hypervisors, clouds, and applications in the industry. See how FreeStor is helping to drive the goal of “infrastructure anywhere” for seamless delivery of data wherever users need to be.
  • Leveraging Memory to Increase Disk Performance | Petros Koutoupis, Lead Linux Systems Developer, IBM Cloud Object Storage (Cleversafe) | Jan 24 2017 10:00 pm UTC | 60 mins
    The future is now. Memory prices are dropping drastically and companies are investing heavily in memory. That doesn't mean spinning magnetic disks will disappear anytime soon: their densities continue to rise and their prices remain significantly lower than memory's. This presentation dives into the methods one can employ to increase the performance, and in turn the value, of this slower and aging data storage technology.
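    One representative technique in this space is keeping a RAM-resident read cache in front of the slow device, sketched minimally below. The device path, block size, and cache capacity are assumptions for illustration, not material from the talk.

    from collections import OrderedDict

    BLOCK_SIZE = 4096   # assumed block size

    class CachedBlockReader:
        """Serve repeat block reads from memory, falling back to the disk on a miss."""

        def __init__(self, path, capacity_blocks=1024):
            self._f = open(path, "rb")
            self._cache = OrderedDict()          # block number -> bytes, in LRU order
            self._capacity = capacity_blocks

        def read_block(self, block_no):
            if block_no in self._cache:          # cache hit: served from RAM
                self._cache.move_to_end(block_no)
                return self._cache[block_no]
            self._f.seek(block_no * BLOCK_SIZE)  # cache miss: go to the disk
            data = self._f.read(BLOCK_SIZE)
            self._cache[block_no] = data
            if len(self._cache) > self._capacity:
                self._cache.popitem(last=False)  # evict the least recently used block
            return data

    # reader = CachedBlockReader("/dev/sdb")     # hypothetical device
    # hot_block = reader.read_block(0)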
  • Enterprise Applications and Persistent Storage with Docker Containers | Andrew Sullivan, Technical Marketing Engineer, NetApp | Jan 25 2017 5:00 pm UTC | 45 mins
    Docker® containers have emerged to provide the agility that development teams need while delivering the stability and reliability required by IT operations. Developers can bring existing applications into the Docker ecosystem while building new applications using microservice design principles. IT-operations teams can benefit from reduced complexity and faster deployment of containerized applications.

    Typically, Docker containers do not retain data after being restarted or retired. A container’s ephemeral nature presents challenges for containerizing many common enterprise applications, from databases to home-grown tools, where data must be retained or shared across multiple Docker hosts. Docker and NetApp solve these issues and more with the NetApp® Docker Volume Plugin (nDVP).

    The nDVP adds persistent storage capabilities to Docker containers, giving developers the ability to persist data in containerized applications while benefiting from enterprise storage features such as transparent scalability, high availability, encryption, data deduplication, and seamless replication.
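    For a sense of what driver-backed persistent volumes look like from the developer side, here is a minimal sketch using the Docker SDK for Python. The driver name "netapp", the volume name, and the image are assumptions for illustration; the actual nDVP driver name depends on how the plugin is installed and configured.

    import docker

    client = docker.from_env()

    # Ask the volume plugin (rather than the local host) to provision storage.
    vol = client.volumes.create(name="pgdata", driver="netapp")  # driver name assumed

    # Mount the external volume so the database survives container restarts.
    container = client.containers.run(
        "postgres:9.6",
        detach=True,
        environment={"POSTGRES_PASSWORD": "example"},
        volumes={vol.name: {"bind": "/var/lib/postgresql/data", "mode": "rw"}},
    )
    print("Started", container.short_id, "with persistent volume", vol.name)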
  • How To Save 50% Or More On Storage Costs | Mark Pastor, Quantum Corporation; Krishna Subramanian, Komprise | Jan 25 2017 6:00 pm UTC | 60 mins
    For most organizations today, over half of their data is not active, yet it consumes expensive storage and data protection resources. Quantum and Komprise have teamed up to show you how to learn more about your data and how to store it more intelligently: keeping it protected and fully accessible, but at much lower cost than the traditional approach.

    In 15 minutes you will be able to see how much money you can save on your next storage purchase, and we will even provide you with a return on investment (ROI) analysis for your internal discussions; a back-of-the-envelope sketch of the savings math appears below.

    Join Quantum and Komprise for this joint webinar and learn:

    • How you can get analytics across your storage to know how data is being used
    • How to cut costs and see the ROI of implementing a simple active archive
    • How this strategy can help you leverage public cloud storage if desired

    We will give you a live demo so you can see how easy it is.
    Join us and qualify for a free analytics assessment in your environment!
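    As a rough illustration of the arithmetic behind the "50% or more" claim, the sketch below compares keeping everything on primary storage with moving cold data to a cheaper archive tier. All capacities and per-TB prices are assumptions for illustration only.

    total_tb = 500
    cold_fraction = 0.60              # assumed share of data that is rarely accessed
    primary_cost_per_tb = 400         # assumed $/TB/year for primary storage plus protection
    archive_cost_per_tb = 60          # assumed $/TB/year for an active-archive or cloud tier

    before = total_tb * primary_cost_per_tb
    after = (total_tb * (1 - cold_fraction) * primary_cost_per_tb
             + total_tb * cold_fraction * archive_cost_per_tb)

    print(f"All data on primary storage: ${before:,.0f}/year")
    print(f"Cold data archived:          ${after:,.0f}/year")
    print(f"Savings:                     {100 * (before - after) / before:.0f}%")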
  • How to Manage, Orchestrate and Analyze Copy Data | Alex Infanzon, Application Solutions Manager, Pure Storage; Prashant Jagannathan, Technical Manager, Catalogic | Jan 25 2017 7:00 pm UTC | 60 mins
    Oracle database sprawl comes with significant cost and complexity. The complexity is the result of the proliferation of database copies created for a range of uses, including disaster recovery, provisioning of development and testing infrastructure, quality assurance (QA), and DevOps in private or hybrid clouds.

    Join this webinar to learn how the Catalogic Software® ECX™ Copy Data Management platform, deployed in conjunction with the Pure Storage® FlashArray™, allows organizations to manage, orchestrate and analyze Copy Data. The solution provides full lifecycle management of your Copy Data through automated workflows that streamline the creation, management and use of Oracle database copies.
  • Discover Space-Based Global Cloud Storage Network & Telecom Backbone | Scott Sobhani, CEO, Cloud Constellation Corporation, Spacebelt | Jan 25 2017 10:00 pm UTC | 60 mins
    Join Spacebelt and the Cloud Constellation Corporation for a technical deep-dive into connecting your cloud storage and telecommunications networks within your data center.

    The space-based data center infrastructure comprises interconnected satellites, forming an exclusive cloud storage network ring to serve enterprises and governments everywhere. This is what the data center of the future looks like, and it includes an autonomous global telecom backbone, fast delivery of data stored in orbit to anywhere on the planet, ultra-secure point-to-point connectivity, and more.

    With the ever-increasing influx of data and big data, cloud storage models and data centers must continue to shift and adapt to these demands. Learn about the latest and greatest data center connectivity solutions and technologies on the market today.
  • Spark Versus Hadoop: Which Technology Should I Choose for My Big Data Project? | Marc Royer - SE & Big Data Specialist - Dell EMC | Jan 26 2017 11:00 am UTC | 30 mins
    Despite some similarities, Hadoop and Spark are often regarded as the same technology. During this webcast, Marc Royer, SE and Big Data Specialist at Dell EMC, will walk you through the differences between these tools so that you can choose the right one for your Big Data project.

    Over these 30 minutes you will discover, in particular:

    - Why and how organizations are turning to Big Data to innovate
    - Spark and Hadoop, and what makes each tool distinct
    - Some use cases for these technologies
    - A methodology for making your Big Data project a success, and the right questions to ask when choosing the appropriate technology
  • Hyperconverged Systems: What's Awaiting Data Centres in 2017 | Mike Beevor, Sr. Technical Marketing Engineer at Pivot3 | Jan 26 2017 3:00 pm UTC | 60 mins
    It's no longer a secret that hyperconverged data centres are gaining ground in 2017, so it is imperative for data centre architects to understand this technology to stay ahead of the curve. Join this presentation to learn key architectural concepts in HCI and why they matter.

    Among other topics discussed, you will discover:

    • Why inline services offer better outcomes
    • Why policy-based management is a must-have for effective data centre consolidation
    • Why modularity and scalability are critical in your next-generation data centre
  • 2017 Trends in Software | Nick Patience, Founder and Research Vice President - Software | Jan 26 2017 4:00 pm UTC | 60 mins
    Digital transformation is real, and it's happening, although there is still a very long way to go. We believe it is an inescapable truth that every business is becoming a digital business, controlled by software, which is the manifestation of these digital transformations. Businesses must react, driven by the imperatives of improving intelligence, agility and customer-centricity, with the ultimate goal of survival in a digital world.

    Join Nick Patience, Founder and Research Vice President for Software, as he dives into the three business imperatives behind digital transformation and what specifically will be transformed in 2017.
  • Build Self-Service Dev/Test Environments with Google Cloud Platform & CloudBolt | Bernard Sanders, CTO and Cofounder, CloudBolt; Chris Sells, Product Management Lead, Google Cloud Developer Tools | Jan 26 2017 5:00 pm UTC | 30 mins
    Enterprise IT organizations looking to get a foothold into public cloud need solutions that deliver the efficiency and innovation promised by cloud platforms—without throwing away the investments they've made in on-prem technology, policies, and skills.

    Creating self-service Development & Test (or Dev/Test) environments is one of the first and most popular cloud workloads enterprise IT organizations explore in their journey to cloud. Google Cloud and CloudBolt have partnered to create a hybrid cloud solution for this important workload that's easy to set up, and can be fully tailored to your environment. CloudBolt is a cloud management platform that integrates your on-prem virtualization and private cloud resources with the public cloud, serving as a bridge between your existing infrastructure and Google Cloud Platform.

    With these hybrid solutions, users can rapidly provision the resources they need through an intuitive self-service portal. IT organizations maintain full control over system configuration, usage quotas, and cost transparency. By delivering a self-service, fully auditable alternative to shadow IT, CloudBolt and Google Cloud let you tap the latest cloud innovations from Google directly from the technology you're already familiar with on-prem.
  • Keeping Your Firm's Critical Applications Running while Improving Performance | Alan Porter, DataCore Solution Architect, and Alan Kerr, Stable Path Solution | Jan 26 2017 6:00 pm UTC | 60 mins
    Application performance, availability and security are key issues that IT departments in law firms struggle with today. It is imperative that you keep your firm's critical applications running fast and always available for your attorneys. Is there a cost-effective and manageable solution?

    Join our webinar on January 26th to hear how law firms like yours are taking advantage of existing storage infrastructure to:

    • Improve application performance
    • Ensure document management and email systems operate 24/7
    • Reduce both CapEx and OpEx
    • Enable a hassle-free experience with zero downtime


    Hear real case studies on how law firms like yours are deploying Lenovo Converged Server SAN appliances powered by DataCore to reduce cost and increase productivity. As the world leader in price/performance, Lenovo and DataCore together allow you to run more workloads with better application performance and availability.
  • All-Flash Storage Performance & Consolidation for Your Mission-Critical Apps | Alex I., App. Solutions Manager, Pure Storage; Gaetan C. & Sunil M., Head of Product Marketing & Dir. of Product at Cohesity | Jan 26 2017 7:00 pm UTC | 60 mins
    When it comes to repurposing and accessing data for test/development and file sharing services, how do you ensure data protection and instant, reliable recovery while maintaining speed and efficiency? The best of both worlds is all-flash storage performance for mission-critical applications, as well as flash-driven consolidation of your secondary storage needs. With everything from seamless snapshot integration to safe and secure storage, your data and your application teams will thank you.
  • Why Hyperconverged Systems will be the Foundation of Your Next-Gen Data Center | George Wagner, Sr Product Marketing Manager at Pivot3 | Jan 26 2017 7:00 pm UTC | 60 mins
    Hyperconverged Infrastructures (HCI) offer the potential for simplicity, agility and improved economics, but making sure that you implement the correct architectural approaches is vital. In this session, we will explore the architectural options of HCI, and provide insights into how you can apply this technology to develop the foundation for your next generation data center today.

    Join us for this presentation to learn:

    • Key architectural concepts in HCI and why they matter
    • Why inline services offer better outcomes
    • Why policy-based management is a must-have for effective data center consolidation
    • Why modularity and scalability are critical in your next-generation data center
  • How to deploy Big Data technology faster | Ashim Bose, Director, Analytics & Data Management Portfolio, and Jan Jonak, Offering Manager, Analytics Platform Services, HPE | Jan 26 2017 7:00 pm UTC | 60 mins
    While the proliferation of analytic technologies has created exciting new ways to harness Big Data like never before, for many it has also created a quagmire of complexity that is slow to deploy, runs inefficiently, isn’t integrated with other systems, and cannot be scaled. There is a solution; it is possible to create next-generation applications with ease and efficiency.

    Join us for this presentation to learn:

    • How to jump-start Big Data technology deployments by using a standard analytics platform
    • How to implement an analytics platform that scales with your needs and simplifies the user experience
    • Where analytics deployments have failed to deliver ROI
    • Lessons learned from other companies on the journey to enterprise-grade analytics
    • What consumption models are available and which are most advantageous
  • Four Reasons Why Your Backup Hardware Will Break by 2020 | George Crump, Storage Switzerland; Doug Soltesz, Cloudian | Jan 26 2017 8:00 pm UTC | 45 mins
    While backup software vendors continue to innovate, hardware vendors have been resting on their deduplication laurels. In the meantime, the amount of data that organizations store continues to grow at an alarming pace and the backup and disaster recovery expectations of users are higher than ever. Most backup solutions today simply will not be able to keep pace with these realities. If organizations don't act now to address the weaknesses in their backup hardware, they will not be able to meet organizational demands by 2020. In this webinar, Cloudian and Storage Switzerland will discuss three areas where IT professionals need to expect more from their backup hardware and where they should demand less.

    Four Reasons Why Backup Hardware Will Break by 2020:

    1. Not Cost Effective Enough
    2. Not Scalable Enough
    3. Only Good for Backups - Not Enough Use Cases
    4. Too Much Deduplication
  • The Cloud and Object Storage Platform of the Future | Tony Barbagallo, VP of Product, Caringo | Jan 26 2017 9:00 pm UTC | 45 mins
    The data center of 2020 will look nothing like the data center of today. The software-defined data center will continue to gain momentum with object storage playing a major role. Learn how object storage is now enabling far more than just cost-effective data storage and is being used for hybrid cloud, data management, filer and database optimization.