    • A DNA-based Archival Storage System
      Luis Henrique Ceze, University of Washington | Recorded: Jun 13 2016 10:45 pm UTC | 37 mins
    • Demand for data storage is growing exponentially, but the capacity of existing storage media is not keeping up. Using DNA to archive data is an attractive possibility because it is extremely dense, with a raw limit of 1 exabyte/mm3 (10^9 GB/mm3), and long-lasting, with an observed half-life of over 500 years.

      This work presents an architecture for a DNA-based archival storage system. It is structured as a key-value store, and leverages common biochemical techniques to provide random access. We also propose a new encoding scheme that offers controllable redundancy, trading off reliability for density. We demonstrate feasibility, random access, and robustness of the proposed encoding with wet lab experiments. Finally, we highlight trends in biotechnology that indicate the impending practicality of DNA storage.
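      Since DNA has a four-letter alphabet, each nucleotide can in principle carry two bits. The sketch below is a minimal illustration of that density idea only; it is not the encoding scheme from the talk, which additionally uses addressing primers for random access and controllable redundancy for reliability.

```python
# Minimal sketch: map binary data to a DNA strand at 2 bits per base.
# This is NOT the talk's encoding scheme, just an illustration of why
# a four-letter alphabet yields such high raw density.
NUCLEOTIDES = "ACGT"

def bytes_to_dna(data: bytes) -> str:
    """Encode each byte as four nucleotides (2 bits per base)."""
    out = []
    for b in data:
        for shift in (6, 4, 2, 0):
            out.append(NUCLEOTIDES[(b >> shift) & 0b11])
    return "".join(out)

def dna_to_bytes(strand: str) -> bytes:
    """Decode a strand produced by bytes_to_dna."""
    out = bytearray()
    for i in range(0, len(strand), 4):
        b = 0
        for ch in strand[i:i + 4]:
            b = (b << 2) | NUCLEOTIDES.index(ch)
        out.append(b)
    return bytes(out)
```

      A real scheme must also avoid homopolymer runs and extreme GC content, which is one reason practical encodings trade some density for reliability.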

    • Cohesity and HPE Join Forces: Simple Data Protection for the Cloud Era
      Marty Lans, HPE & Gaetan Castelein, Cohesity | Recorded: Jun 1 2017 6:00 pm UTC | 28 mins
    • Struggling with data protection? You're not alone. Many storage admins are faced with the challenge of protecting more apps, while supporting tighter business requirements. Unfortunately, legacy data protection solutions were designed more than 10 years ago and haven't kept up with today's requirements.

      Cohesity and HPE provide a joint solution to simplify data protection, by combining the efficiency of the Cohesity software-defined platform with the power of HP DL360 servers.

      Learn how our joint solution allows you to:
      -Simplify data protection by converging all your backup infrastructure on one web-scale platform, including target storage, backup software, replication, DR and cloud tiering.
      -Simplify management with a single UI and policy-based automation.
      -Accelerate your recovery points and recovery times while cutting data protection costs by 50%.
      -Integrate with all the leading public clouds for archival, tiering and replication.

    • The Bifurcation of the Flash Market
      George Crump, Storage Switzerland | Recorded: Nov 30 2016 6:00 pm UTC | 62 mins
    • High Performance or Capacity - Making the Right Choice

      The flash market started out monolithically. Flash was a single media type (high performance, high endurance SLC flash). Flash systems also had a single purpose of accelerating the response time of high-end databases. But now there are several flash options. Users can choose between high performance flash or highly dense, medium performance flash systems. At the same time, high capacity hard disk drives are making a case to be the archival storage medium of choice. How does an IT professional choose?

    • Storage Management: Windows Server and Microsoft Azure
      Anil Desai, Independent IT Consultant / Writer | Recorded: Nov 15 2016 6:00 pm UTC | 63 mins
    • Organizations have more options than ever when it comes to deciding how and where to store their data. In an ideal world, low-cost high-speed storage would be nearly infinite. Practicality, however, demands that IT groups determine how best to leverage their own storage (including local, NAS and SAN options), and how cloud storage can fit into the overall architecture.

      This presentation will start with recommendations for classifying storage requirements based on various needs, ranging from lower-cost, long-term data archival to highly-available, fault-tolerant, geo-replicated architectures, along with the vast sea of data that's located in between these requirements. The focus will be on the many different ways organizations can leverage existing and new features in the Windows Server platform and the many available storage-related services in the Microsoft Azure cloud.

      The presentation will also cover building a private cloud architecture in your own datacenter, using the Microsoft Azure Stack, System Center, and related OS and cloud options.

      Join us and save your seat today!

    • Object Storage: A Powerful Platform for 21st Century Data Management
      Steven Hill | Recorded: May 18 2016 5:00 pm UTC | 56 mins
    • After years of being dismissed as an archival storage platform, Object-Based Storage is re-emerging as the technology of choice for massive cloud storage platforms and media delivery systems, as well as for enterprises that are finally becoming aware of the power and flexibility that a metadata-rich storage environment can provide. Object Storage is proving to be the multi-tool of the modern storage industry, offering the ability to deliver high-performance file, block and object-based front-end capabilities while providing exceptional data protection, analytics and policy management in the background. In this program we’re going to discuss the increasing use of object storage technology in the data center and how this versatile technology can modernize your storage environment from end to end.

    • Maximize your Investment in Hyperconverged Infrastructure
      Paul Pindell, F5 - Bryan Pelham, Illumio - Chris Wahl, Rubrik - Brad Peterson, Workspot - Mike Penrod, Nutanix | Recorded: Jul 26 2016 4:30 pm UTC | 50 mins
    • Featuring speakers from F5, Illumio, Nutanix, Rubrik, and Workspot. Compare and evaluate 4 leading hyperconverged platform-optimized solutions that expand the capabilities of the Nutanix enterprise cloud platform: F5 application delivery, Illumio adaptive security, Rubrik data protection, and Workspot VDI.

      Learn how:

      • Workspot's cloud-native, infinitely and instantly scalable orchestration architecture (aka VDI 2.0) enables enterprise-class VDI deployment in hours, in which you can use all your existing infrastructure (apps, desktops and data).

      • Rubrik eliminates backup pain with automation, instant recovery, unlimited replication, and data archival at infinite scale -- with zero complexity. 

      • Visualization 2.0 from Illumio shows you a live, interactive map of all of your application traffic across your data centers and clouds, and identifies applications for secure migration to the Nutanix platform.

      • F5 delivers your mission critical applications on an enterprise cloud that uniquely delivers the agility, pay-as-you-grow consumption, and operational simplicity of the public cloud without sacrificing the predictability, security, and control of on-premises infrastructure.

    • Why Going Paperless is Easier Said Than Done
      Produced by: Custom Strategies | Underwritten by: Veritas | Recorded: Oct 22 2015 6:00 pm UTC | 57 mins
    • Government Business Council Report: 63% of Feds Lack Confidence in Agency's Digital Records

      Digital Viewcast to Explore How Agencies Can Use Information Governance to Streamline Records Management and Comply With White House Directives

      In 2011 President Obama issued the Presidential Records Management Directive (PRMD) requiring federal agencies to shift from “print and file” paper-based records management to digital archives that are easily retrievable and usable by December 2016. This digital event will highlight the recent study conducted by Government Business Council regarding various government agencies’ progress in modernizing their information management systems.

      Watch this digital event to learn:

      The costs of an ad hoc approach to information governance

      Federal employees’ confidence in the reliability of their agency’s information management tools

      Whether federal employees believe their agency is taking a strategic approach to information governance

      Agencies’ progress in implementing archival and eDiscovery tools ahead of the December 2016 deadline

    • Building native ETL/ELT on Hadoop without manual coding
      John Mullis, Oracle & Ankur Gupta, Hortonworks | Recorded: Oct 6 2015 10:00 am UTC | 58 mins
    • Data professionals tend to see Hadoop as an extension of the data warehouse architecture and not a replacement; however, it can reduce the overhead on expensive data warehouses by moving some of the data and processing to Hadoop. The Big Data framework has been extended beyond the warehouse to incorporate operational use cases such as customer insight 360, real-time offers, monetisation, and data archival. Generating value from big data requires the right tools to move and prepare data to effectively discover new insights. In order to operationalize those insights, new data must integrate securely with existing data, infrastructure, applications, and processes.

      In this webinar you will see how Oracle and Hortonworks have made it possible for you to accelerate your Big Data Integration without having to learn MapReduce, Spark, Pig or Oozie code. In fact, Oracle is the only vendor that can automatically generate Spark, HiveQL and Pig transformations from a single mapping, which allows our customers to focus on business value and the overall architecture rather than multiple programming languages.

    • Manage Data Deluge with Turnkey Storage & Infinite Scalability
      Vince Curella, Intel | Don Frame, Lenovo | Paul Turner, Cloudian | Recorded: Oct 15 2015 4:00 pm UTC | 44 mins
    • Join Vince Curella, Intel North American Service Providers Technical Sales Manager, Don Frame, Lenovo North America's Director of Enterprise Systems Group Brand Management, and Paul Turner, Cloudian CMO and Technology Evangelist, on October 15th as they introduce new “turnkey” pre-certified storage appliances that take the guesswork out of configuring the right server/storage combination. The combined Intel, Lenovo, and Cloudian turnkey system solves data deluge management challenges by speeding up deployment times and mitigating the risk of application downtime or performance problems that can occur from misconfigured systems.

      Intel, Lenovo, and Cloudian are revolutionizing leading-edge petabyte scale computing for enterprise and STaaS customers—now they do it together—with modern solutions that offer scale out architecture, hybrid cloud tiering, S3 compatibility, and multi-data center multi-tenancy features.

      Webinar discussion topics will include:
      •Simplifying backup and archival for big data
      •Enabling in-place smart data analytics
      •Centralizing and controlling remote office backup
      •Scale-out architecture to handle large cloud deployments
      •S3 compatibility to enable private and public cloud environments
      •Enterprise file synchronization and sharing for the Internet of Everything

      ***Webinar starts at 9am PDT | 11am CDT | Noon EDT***

    • Drop Backup Pains and Get Protection Gains with Virtual Tape Libraries
      Mel Beckman and Dell’s Marc Mombourquette | Recorded: Apr 8 2015 4:50 pm UTC | 61 mins
    • Between rapidly growing data volume and stricter protection requirements, you might be finding that your tape archives are simply too slow to keep up. Fortunately, a solution has emerged: the Virtual Tape Library (VTL).

      Before you choose a VTL, however, it’s vital that you know all the major considerations. See what they are in a live webcast led by acclaimed IT technologist Mel Beckman and Dell’s Marc Mombourquette. Don’t miss out on your chance to hear them demystify VTL technology and explain its inner workings. Sign up today and discover:

      • How VTL works and why it's so effective
      • Secrets of deduplication
      • The importance of encryption at rest and key management
      • VTL interconnect technologies
      • Shopping for feeds, speeds, and features
      • Backend archival strategies
      • Real-world solutions
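      Deduplication, mentioned above, is the key space-saving technique behind VTLs: identical blocks are stored once and referenced by a content hash. The sketch below is a minimal, generic illustration (fixed-size chunking for simplicity; real appliances typically use variable-size, content-defined chunking), not any vendor's implementation.

```python
# Minimal sketch of block-level deduplication: store each unique chunk
# once, keyed by its SHA-256 digest, plus a "recipe" of digests that
# reconstructs the original stream. Fixed-size chunks for simplicity.
import hashlib

def dedupe(data: bytes, chunk_size: int = 4096):
    """Return (store, recipe): unique chunks keyed by digest, and the
    ordered digest list needed to rebuild the original data."""
    store, recipe = {}, []
    for i in range(0, len(data), chunk_size):
        chunk = data[i:i + chunk_size]
        digest = hashlib.sha256(chunk).hexdigest()
        store.setdefault(digest, chunk)  # keep only the first copy
        recipe.append(digest)
    return store, recipe

def rehydrate(store, recipe) -> bytes:
    """Reassemble the original stream from the chunk store and recipe."""
    return b"".join(store[d] for d in recipe)
```

      On backup streams full of repeated data (nightly fulls of mostly unchanged files), the store holds far fewer bytes than the logical stream it represents, which is where the VTL capacity savings come from.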

    • DB2 Tech Talk: Data Archiving Best Practices: Improve Performance, Reduce Costs
      Serge Rielau (host) and Eric Naiburg (presenter) | Recorded: Mar 30 2012 5:55 pm UTC | 50 mins
    • When it comes to data, more isn’t always better. Businesses demand that applications perform at their best, yet system complexity and data volumes continue to grow. In this technical session, we will discuss ways to improve performance by:
      · Reducing the volume of data in your production database so that SLAs for mission-critical applications are easier to meet
      · Controlling transactional database and warehouse data growth
      · Archiving and storing historical transactions securely and cost-effectively while still maintaining universal access
      · Ensuring access to historical data for compliance, queries and reporting
      · Ensuring compliance readiness

      This is an interesting, eye-opening look at the implications of being unable to access certain historical data, as well as of retaining too much data.

    • How the Software-Defined Data Center Helps Ease the Epidemic Crisis
      Skip Snow, Senior Healthcare Analyst, Forrester, Gil Vidals, CEO, VM Racks & Eric Rife, Sales Engineer Director, Nexenta | Recorded: Dec 16 2014 4:00 pm UTC | 56 mins
    • Global health crises are making headlines daily and the medical industry’s ability to respond effectively depends on rapid access to data storage for archival and analysis. Data management has always been a healthcare challenge; today’s data stores are growing exponentially, and the requirement for responsiveness is accelerating.

      Join this presentation to hear from industry expert Skip Snow of Forrester Research on the big trends in healthcare data management and Eric Rife, subject matter expert from Nexenta on the compelling Software Defined Storage solutions to meet these requirements. VM Racks CEO Gil Vidals will continue the conversation by showcasing how SDS helps meet HIPAA compliance and healthcare’s unique requirements.

      Attend to learn more about:
      - The unique challenges of data management in healthcare, the importance of communication across the continuum of care, and why infrastructure is key
      - Why storage is increasingly burdensome to healthcare organizations, how to drive down the complexity and cost of solutions, and the positive impact on response time
      - How Software Defined Storage solutions help healthcare organizations get to solutions faster, with hardware that is easier to procure
      - Why HIPAA compliance hosting provider VM Racks chose SDS to support delivery of rapid, reliable, cloud-based healthcare solutions to its public sector customers

    • What's new in Red Hat's Inktank Ceph Enterprise 1.2?
      Neil Levine, Director of Product Management, Red Hat | Recorded: Jul 30 2014 5:00 pm UTC | 56 mins
    • Red Hat’s Inktank Ceph Enterprise 1.2 is the solution. Join our free webinar Wednesday, July 30, and learn how this new release couples a slate of powerful features with the tools needed to confidently run production Ceph clusters at scale, delivering new levels of flexibility and cost advantage for enterprises like yours that seek to store and manage the full spectrum of data, from “hot” mission-critical data to “cold” archival data.

      During this event, you’ll learn about features that include:

      **Erasure coding: Meet your cold storage and archiving needs at a reduced cost per gig.

      **Cache tiering: Move “hot” data onto high-performance media when that data becomes active and “cold” data to lower cost media when that data becomes inactive.

      **Calamari: Monitor the performance of your cluster as well as manage storage pools and change configuration settings with this newly open-sourced Ceph management platform.
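      The idea behind erasure coding can be shown with the simplest possible code: store k data chunks plus parity so that a lost chunk can be rebuilt from the survivors. The sketch below uses a single XOR parity chunk, which tolerates exactly one loss; Ceph's erasure-coded pools use stronger Reed-Solomon-style codes (k data + m coding chunks), so this is an illustration of the principle, not Ceph's implementation.

```python
# Minimal sketch of erasure coding with one XOR parity chunk.
# Any single missing chunk (data or parity) can be rebuilt by
# XOR-ing the surviving chunks. All chunks must be equal length.
from functools import reduce

def xor(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

def encode(chunks):
    """Return the data chunks plus one XOR parity chunk."""
    return list(chunks) + [reduce(xor, chunks)]

def recover(stored, lost_index):
    """Rebuild the chunk at lost_index by XOR-ing all the others."""
    survivors = [c for i, c in enumerate(stored) if i != lost_index]
    return reduce(xor, survivors)
```

      Compared with triple replication (3x raw capacity), a k=8, m=3 erasure-coded pool stores the same data at about 1.4x raw capacity while tolerating three losses, which is the "reduced cost per gig" the cold-storage feature above refers to.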

    • Using Auto-Classification to Improve Your Information Governance Practices
      Mark Diamond (President of Contoural), David Gould (HP) | Recorded: Apr 2 2013 5:00 pm UTC | 63 mins
    • Information Governance is an essential element to your compliance planning and execution. With evolving regulatory demands and increased litigation, the imperative to gain control over business content has never been more critical. Experts know that managing the retention and disposition of business information reduces litigation risk and legal discovery costs. But with the best of plans, there are challenges to face and decisions to make. Add in the maturation of technology and security issues, and the challenges seem to grow exponentially.

      Governance is still lacking in many organizations: around 85% of users still identify records manually, are often unclear which content is valuable and which is not, and as a result fear the regulatory impact of deleting information. New auto-classification technologies can take this burden off the end user by providing automatic identification, classification, retrieval, archival, and disposal of electronic business records according to governance policies. During this webinar we will discuss how to improve your governance practices with auto-classification technologies. Join us for tips and insights on:

      - Understanding and identifying the risks and costs of discoverable information
      - Quantifying the business benefits of Information Governance practices and Auto-Classification
      - How Auto-Classification works and can seamlessly fit into your organization

    • 3 Steps To Use The Cloud To Eliminate Storage Administration Headaches
      Storage Switzerland, IBM PureSystems, TwinStrata | Recorded: Sep 27 2012 5:00 pm UTC | 59 mins
    • From Storage Management To Storage Value

      Today, the data center is moving from virtualization to cloud computing, with an eye toward reducing IT complexity and costs while increasing flexibility and business responsiveness. In reality, 60-70% of IT budgets are devoted to maintaining existing infrastructure, not to creating business value. IT personnel are stretched too thin to truly respond to the needs of the business and spend most of their time just keeping up. This is particularly true when it comes to the storage infrastructure. Countless hours and millions of dollars are spent on the “must-haves” of provisioning, performance tuning, expansion, backup, archival, and disaster recovery as a necessary cost of doing business.

      In this webinar, experts from Storage Switzerland, IBM and TwinStrata will share their insight on steps that IT can take to reduce the amount of time spent on the storage must-haves so that they can instead focus on initiatives devoted to business value.

    • DB2 Tech Talk: Compression Comparison DB2 vs. Oracle Database and The Rest
      Serge Rielau (host), Chris Eaton (presenter) | Recorded: Sep 27 2012 4:30 pm UTC | 85 mins
    • Running the data storage environment can be a costly proposition. In addition to escalating costs for storage resources, there are additional costs associated with managing, powering and cooling the systems.

      In response to customers’ concerns about these storage-related costs, IBM Information Management has been delivering advanced capabilities within the DB2 for Linux, UNIX and Windows product family to reduce storage requirements and improve database performance by focusing on various forms of in-database and in-memory compression.

      With an eye toward current industry capabilities, this Tech Talk delves into the benefits and advantages of the DB2 solution in the following areas:
      • database compression
      • backup and archival compression
      • how and when to use file system and storage deduplication technology

      Along with these features and capabilities native to DB2, we will also compare these to Oracle and the rest of the database industry to show you why DB2 has a clear advantage when it comes to storage optimization.
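      The core reason compression cuts both storage and backup costs is that operational data is highly repetitive: the same column values, codes, and timestamps recur across millions of rows. The sketch below demonstrates the effect generically with Python's zlib (a dictionary-based DEFLATE compressor); it is not DB2's row or backup compression, and the sample row data is invented for illustration.

```python
# Generic illustration of why repetitive tabular data compresses well.
# This uses zlib (DEFLATE), not DB2's row/backup compression; the row
# contents below are synthetic.
import zlib

row = b"2012-09-27,ORDER,PENDING,USD,1099.00\n"
table = row * 10_000          # a highly repetitive "table" dump

compressed = zlib.compress(table, level=6)
ratio = len(table) / len(compressed)
# Repetitive data routinely compresses by an order of magnitude or more,
# which is what shrinks both the production database and its backups.
```

      Deduplication (mentioned in the third bullet) exploits the same redundancy at a coarser granularity, which is why stacking compression on top of already-deduplicated storage yields diminishing returns.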
