Demand for data storage is growing exponentially, but the capacity of existing storage media is not keeping up. Using DNA to archive data is an attractive possibility because it is extremely dense, with a raw limit of 1 exabyte/mm3 (10^9 GB/mm3), and long-lasting, with observed half-life of over 500 years.
This work presents an architecture for a DNA-based archival storage system. It is structured as a key-value store, and leverages common biochemical techniques to provide random access. We also propose a new encoding scheme that offers controllable redundancy, trading off reliability for density. We demonstrate feasibility, random access, and robustness of the proposed encoding with wet lab experiments. Finally, we highlight trends in biotechnology that indicate the impending practicality of DNA storage.
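For intuition about how bits become molecules, here is a minimal sketch of the basic idea: binary data can be mapped two bits at a time onto the four nucleotides. This is an illustrative toy encoding only, not the paper's actual scheme, which adds addressing primers for random access, controllable redundancy, and constraints such as avoiding long homopolymer runs.

```python
# Illustrative sketch only: a naive 2-bits-per-nucleotide mapping.
# The paper's actual encoding adds addressing, redundancy, and
# sequence constraints; none of that is modeled here.

BITS_TO_BASE = {"00": "A", "01": "C", "10": "G", "11": "T"}
BASE_TO_BITS = {b: bits for bits, b in BITS_TO_BASE.items()}

def encode(data: bytes) -> str:
    """Map each pair of bits in `data` to one DNA base."""
    bitstring = "".join(f"{byte:08b}" for byte in data)
    return "".join(BITS_TO_BASE[bitstring[i:i + 2]]
                   for i in range(0, len(bitstring), 2))

def decode(strand: str) -> bytes:
    """Recover the original bytes from a DNA strand."""
    bitstring = "".join(BASE_TO_BITS[base] for base in strand)
    return bytes(int(bitstring[i:i + 8], 2)
                 for i in range(0, len(bitstring), 8))

if __name__ == "__main__":
    payload = b"hello"
    strand = encode(payload)
    print(strand)                  # CGGACGCCCGTACGTACGTT
    assert decode(strand) == payload
```

At two bits per base, a byte costs four nucleotides; real systems give up some of that density to buy error tolerance and addressability, which is exactly the redundancy/density tradeoff the proposed encoding makes controllable.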
In this video, Storage Switzerland and NTP Software discuss the surprising role that data archival can play in minimizing exposure to ransomware attacks like WannaCry and data theft incidents like WikiLeaks.
Struggling with data protection? You're not alone. Many storage admins are faced with the challenge of protecting more apps, while supporting tighter business requirements. Unfortunately, legacy data protection solutions were designed more than 10 years ago and haven't kept up with today's requirements.
Cohesity and HPE provide a joint solution to simplify data protection by combining the efficiency of the Cohesity software-defined platform with the power of HPE DL360 servers.
Learn how our joint solution allows you to:
-Simplify data protection by converging all your backup infrastructure on one web-scale platform, including target storage, backup software, replication, DR and cloud tiering.
-Simplify management with a single UI and policy-based automation.
-Improve your recovery point and recovery time objectives while cutting data protection costs by 50%.
-Integrate with all the leading public clouds for archival, tiering and replication.
High Performance or Capacity - Making the Right Choice
The flash market started out monolithically. Flash was a single media type (high performance, high endurance SLC flash). Flash systems also had a single purpose of accelerating the response time of high-end databases. But now there are several flash options. Users can choose between high performance flash or highly dense, medium performance flash systems. At the same time, high capacity hard disk drives are making a case to be the archival storage medium of choice. How does an IT professional choose?
Organizations have more options than ever when it comes to deciding how and where to store their data. In an ideal world, low-cost high-speed storage would be nearly infinite. Practicality, however, demands that IT groups determine how best to leverage their own storage (including local, NAS and SAN options), and how cloud storage can fit into the overall architecture.
This presentation will start with recommendations for classifying storage requirements based on various needs, ranging from lower-cost, long-term data archival to highly-available, fault-tolerant, geo-replicated architectures, along with the vast sea of data that's located in between these requirements. The focus will be on the many different ways organizations can leverage existing and new features in the Windows Server platform and the many available storage-related services in the Microsoft Azure cloud.
Also covered will be information about building a private cloud architecture in your own datacenter, using the Microsoft Azure Stack, System Center, and related OS and cloud options.
Join us and save your seat today!
After years of being dismissed as an archival storage platform, Object-Based Storage is re-emerging as the technology of choice for massive cloud storage platforms and media delivery systems, as well as for enterprises that are finally becoming aware of the power and flexibility that a metadata-rich storage environment can provide. Object Storage is proving to be the multi-tool of the modern storage industry, offering the ability to deliver high performance file, block and object-based front end capabilities while providing exceptional data protection, analytics and policy management in the background. In this program we’re going to discuss the increasing use of object storage technology in the data center and how this versatile technology can modernize your storage environment from end to end.
Featuring speakers from F5, Illumio, Nutanix, Rubrik, and Workspot. Compare and evaluate 4 leading hyperconverged platform-optimized solutions that expand the capabilities of the Nutanix enterprise cloud platform: F5 application delivery, Illumio adaptive security, Rubrik data protection, and Workspot VDI.
• Workspot's cloud-native, infinitely and instantly scalable orchestration architecture (aka VDI 2.0) enables enterprise-class VDI deployment in hours while letting you use all your existing infrastructure (apps, desktops and data).
• Rubrik eliminates backup pain with automation, instant recovery, unlimited replication, and data archival at infinite scale -- with zero complexity.
• Visualization 2.0 from Illumio shows you a live, interactive map of all of your application traffic across your data centers and clouds, and identifies applications for secure migration to the Nutanix platform.
• F5 delivers your mission critical applications on an enterprise cloud that uniquely delivers the agility, pay-as-you-grow consumption, and operational simplicity of the public cloud without sacrificing the predictability, security, and control of on-premises infrastructure.
Government Business Council Report: 63% of Feds Lack Confidence in Agency's Digital Records
Digital Viewcast to Explore How Agencies Can Use Information Governance to Streamline Records Management and Comply With White House Directives
In 2011 President Obama issued the Presidential Records Management Directive (PRMD) requiring federal agencies to shift, by December 2016, from “print and file” paper-based records management to digital archives that are easily retrievable and usable. This digital event will highlight the recent study conducted by Government Business Council on various government agencies’ progress in modernizing their information management systems.
Watch this digital event to learn:
- The costs of an ad hoc approach to information governance
- Federal employees’ confidence in the reliability of their agency’s information management tools
- Whether federal employees believe their agency is taking a strategic approach to information governance
- Agencies’ progress in implementing archival and eDiscovery tools ahead of the December 2016 deadline
Data professionals tend to see Hadoop as an extension of the data warehouse architecture rather than a replacement; however, it can reduce the load on expensive data warehouses by moving some of the data and processing to Hadoop. The Big Data framework has been extended beyond the warehouse to incorporate operational use cases such as customer insight 360, real-time offers, monetisation, and data archival. Generating value from big data requires the right tools to move and prepare data so that new insights can be discovered effectively. To operationalize those insights, new data must integrate securely with existing data, infrastructure, applications, and processes.
In this webinar you will see how Oracle and Hortonworks have made it possible to accelerate your Big Data integration without having to learn MapReduce, Spark, Pig or Oozie code. In fact, Oracle is the only vendor that can automatically generate Spark, HiveQL and Pig transformations from a single mapping, which allows customers to focus on business value and the overall architecture rather than multiple programming languages.
Join Vince Curella, Intel North American Service Providers Technical Sales Manager; Don Frame, Lenovo North America's Director of Enterprise Systems Group Brand Management; and Paul Turner, Cloudian CMO and Technology Evangelist, on October 15th as they introduce new “turnkey” pre-certified storage appliances that take the guesswork out of configuring the right server/storage combination. The combined Intel, Lenovo, and Cloudian turnkey system solves data deluge management challenges by speeding up deployment times and mitigating the risk of application downtime or performance problems that can result from misconfigured systems.
Intel, Lenovo, and Cloudian are revolutionizing leading-edge petabyte scale computing for enterprise and STaaS customers—now they do it together—with modern solutions that offer scale out architecture, hybrid cloud tiering, S3 compatibility, and multi-data center multi-tenancy features.
Webinar discussion topics will include:
•Simplifying backup and archival for big data
•Enabling in-place smart data analytics
•Centralizing and controlling remote office backup
•Scale out architecture to handle large cloud deployments
•S3 compatibility to enable private and public cloud environments
•Enterprise file synchronization and sharing for the Internet of Everything
***Webinar starts at 9am PDT | 11am CDT | Noon EDT***
Between rapidly growing data volume and stricter protection requirements, you might be finding that your tape archives are simply too slow to keep up. Fortunately, a solution has emerged: the Virtual Tape Library (VTL).
Before you choose a VTL, however, it’s vital that you know all the major considerations. See what they are in a live webcast led by acclaimed IT technologist Mel Beckman and Dell’s Marc Mombourquette. Don’t miss out on your chance to hear them demystify VTL technology and explain its inner workings. Sign up today and discover:
• How VTL works and the advantages it offers over physical tape
• Secrets of deduplication
• The importance of encryption at rest and key management
• VTL interconnect technologies
• Shopping for feeds, speeds, and features
• Backend archival strategies
• Real-world solutions
When it comes to data, more isn’t always better. Businesses demand that applications perform at their best, yet system complexity and data volumes continue to grow. In this technical session, we will discuss ways to improve performance by:
· Reducing the volume of data in your production database so that SLAs for mission-critical applications are easier to meet
· Controlling transactional database and warehouse data growth
· Archiving and storing historical transactions securely and cost-effectively while still maintaining universal access
· Ensuring access to historical data for compliance, queries and reporting
· Ensuring compliance readiness
This is an interesting, eye-opening look at the implications of being unable to access certain historical data, as well as of retaining too much of it.
Global health crises are making headlines daily and the medical industry’s ability to respond effectively depends on rapid access to data storage for archival and analysis. Data management has always been a healthcare challenge; today’s data stores are growing exponentially, and the requirement for responsiveness is accelerating.
Join this presentation to hear from industry expert Skip Snow of Forrester Research on the big trends in healthcare data management and Eric Rife, subject matter expert from Nexenta on the compelling Software Defined Storage solutions to meet these requirements. VM Racks CEO Gil Vidals will continue the conversation by showcasing how SDS helps meet HIPAA compliance and healthcare’s unique requirements.
Attend to learn more about:
- The unique challenges of data management in healthcare, the importance of communication across the continuum of care, and why infrastructure is key
- Why storage is increasingly burdensome to healthcare organizations, how to drive down the complexity and cost of solutions, and the positive impact on response time
- How Software Defined Storage solutions help healthcare organizations get to solutions faster, with hardware that is easier to procure
- Why HIPAA compliance hosting provider VM Racks chose SDS to support delivery of rapid, reliable, cloud-based healthcare solutions to its public sector customers
Red Hat’s Inktank Ceph Enterprise 1.2 is the solution. Join our free webinar Wednesday, July 30, and learn how this new release couples a slate of powerful features with the tools needed to confidently run production Ceph clusters at scale, delivering new levels of flexibility and cost advantages for enterprises like yours seeking to store and manage the full spectrum of data, from “hot” mission-critical data to “cold” archival data.
During this event, you’ll learn about features that include:
**Erasure coding: Meet your cold storage and archiving needs at a reduced cost per gigabyte (see the sketch after this list).
**Cache tiering: Move “hot” data onto high-performance media when that data becomes active and “cold” data to lower cost media when that data becomes inactive.
**Calamari: Monitor the performance of your cluster as well as manage storage pools and change configuration settings with this newly open-sourced Ceph management platform.
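For a rough sense of why erasure coding cuts cold-storage costs, the sketch below is a simplified illustration only, not Ceph's actual erasure-code plugin or placement logic: it shows the raw-capacity overhead of a hypothetical k+m profile versus 3-way replication, and the m=1 special case where the coded chunk is simply the XOR parity of the data chunks. The k/m values and chunk contents are made-up examples.

```python
# Simplified illustration of the erasure-coding tradeoff, not Ceph's
# actual implementation. The k/m values and chunk data are hypothetical.

def storage_overhead(k: int, m: int) -> float:
    """Raw capacity consumed per byte of user data for a k+m profile."""
    return (k + m) / k

print(storage_overhead(4, 2))   # 1.5x raw per usable byte, vs 3.0x for 3-way replication

def xor_parity(chunks: list[bytes]) -> bytes:
    """Single coded chunk (the m=1 case): XOR of all data chunks."""
    parity = bytearray(len(chunks[0]))
    for chunk in chunks:
        for i, b in enumerate(chunk):
            parity[i] ^= b
    return bytes(parity)

def recover(surviving: list[bytes], parity: bytes) -> bytes:
    """Rebuild the single lost data chunk from the survivors plus parity."""
    return xor_parity(surviving + [parity])

data = [b"AAAA", b"BBBB", b"CCCC", b"DDDD"]   # k = 4 data chunks
p = xor_parity(data)                          # m = 1 parity chunk
assert recover([data[0], data[1], data[3]], p) == data[2]  # chunk 2 lost, rebuilt
```

With more coded chunks (m > 1), schemes such as Reed-Solomon tolerate multiple losses at the same (k+m)/k overhead, which is why erasure-coded pools suit cold, archival data while cache tiering keeps hot data on faster replicated media.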
As more applications that derive value from new types of data are created using Apache Hadoop, the enterprise "Data Lake" forms with Hadoop as a shared service. While these Data Lakes are important, a broader life-cycle needs to be considered that spans development, test, production, and archival. If you have already deployed Hadoop on-premise, this session will also provide an overview of the key scenarios and benefits of joining your on-premise Hadoop implementation with the cloud through backup/archive, dev/test, or bursting. Learn how you can get the benefits of on-premise Hadoop that can seamlessly scale with the power of the cloud.
Information Governance is an essential element to your compliance planning and execution. With evolving regulatory demands and increased litigation, the imperative to gain control over business content has never been more critical. Experts know that managing the retention and disposition of business information reduces litigation risk and legal discovery costs. But with the best of plans, there are challenges to face and decisions to make. Add in the maturation of technology and security issues, and the challenges seem to grow exponentially.
Governance is still lacking in many organizations: around 85% of users still identify records manually, are unclear about which content is and is not valuable, and as a result fear the regulatory impact of deleting information. New auto-classification technologies can take this burden off the end user, providing automatic identification, classification, retrieval, archival, and disposal of electronic business records according to governance policies. During this webinar we will discuss how to improve your governance practices with auto-classification technologies. Join us for tips and insights on:
- Understanding and Identifying the risks and costs of discoverable information
- Quantifying the business benefits of Information Governance practices and Auto-Classification
- How Auto-Classification works and can seamlessly fit into your organization
From Storage Management To Storage Value
Today, the data center is moving from virtualization to cloud computing, with an eye toward reducing IT complexity and costs while increasing flexibility and business responsiveness. In reality, 60-70% of IT budgets are devoted to maintaining existing infrastructure rather than to creating business value. IT personnel are stretched too thin to truly respond to the needs of the business and spend most of their time just keeping up. This is particularly true when it comes to the storage infrastructure. Countless hours and millions of dollars are spent on the “must-haves” of provisioning, performance tuning, expansion, backup, archival, and disaster recovery as a necessary cost of doing business.
In this webinar, experts from Storage Switzerland, IBM and TwinStrata will share their insight on steps IT can take to reduce the amount of time spent on the storage must-haves so that they can instead focus on initiatives devoted to business value.
Running the data storage environment can be a costly proposition. In addition to escalating costs for storage resources, there are additional costs associated with managing, powering and cooling the systems.
In response to customers' concerns about these storage-related costs, IBM Information Management has been delivering advanced capabilities within the DB2 for Linux, UNIX and Windows product family to reduce storage requirements and improve database performance by focusing on various forms of in-database and in-memory compression.
With an eye toward current industry capabilities, this Tech Talk delves into the benefits and advantages of the DB2 solution in the following areas:
• database compression
• backup and archival compression
• how and when to use file system and storage deduplication technology
Along with these features and capabilities native to DB2, we will also compare these to Oracle and the rest of the database industry to show you why DB2 has a clear advantage when it comes to storage optimization.