The storage community on BrightTALK is made up of thousands of storage and IT professionals. Find relevant webinars and videos on storage architecture, cloud storage, storage virtualization and more presented by recognized thought leaders. Join the conversation by participating in live webinars and round table discussions.
Learn from leading storage technology analyst firm Taneja Group how to resolve these issues, with a rich set of actionable insights delivered by Tintri Analytics, the SaaS-based analytics platform. These insights are based on workloads’ key characteristics and the end-user experience of performing several common tasks.
Hyperconverged infrastructure is now being broadly adopted, yet many organizations are blindly deploying these new shiny toys without clear line of sight to an optimal storage configuration.
In this webinar, Luke Pruen, Technical Services Director at StorMagic, presents real-world data from a wide variety of customers, clearing up how best to configure for the perfect balance of cost, efficiency and performance.
Learn more about StorMagic SvSAN on our website: http://stormagic.com/svsan/
CloudPhysics takes the complexity out of your virtual infrastructure inventory, performance, and configuration analysis, making the evaluation nearly effortless. Learn how to use the On Premise IT Cost Calculator to derive the value of your current workloads and use the Cost Calculator for Microsoft Azure to make an informed comparison of the two scenarios.
Most DevOps journeys involve several phases of maturity. Research shows that the inflection point where organizations begin to see maximum value is when they implement tight integration deploying their code to their infrastructure. Success at this level is the last barrier to at-will deployment. Storage is capable of more than simply being where we read and write data.
In this session we’ll discuss the role and value extensible storage infrastructure has in accelerating software development activities, improving code quality, revealing multiple deployment options through automated testing, and supporting continuous integration efforts. All of this will be described using tools common in DevOps organizations.
High speed and autonomous transportation, remote surgical procedures, and mixed reality experiences are just a few of the upcoming Internet of Things (IoT) technological game changers that require high speed and low latency communications infrastructure. 5G Radio Access Network (RAN) technologies continue to evolve to meet the complex and continually evolving network requirements of IoT applications.
This webcast will explore the latest advances in RAN, including:
• Which technologies will see the earliest deployments?
• How important will the 3.5 GHz band networks, also known as the Citizens Broadband Radio Service (CBRS), be in future 5G applications?
• Will mmWave networks change the game in the last mile?
• How far will the industry advance in real deployments in the next year or two?
• Where are companies participating in this ecosystem investing their money?
• Are there concerns for Return on Investment (ROI) that companies are trying to resolve?
Brian Daly, Director - Core & Government/Regulatory Standards, AT&T Access Architecture & Devices
Adam Drobot, Chairman, OpenTechWorks, Inc.
Steven Glapa, VP of Marketing, Tarana Wireless
Enterprises and service provider customers have more storage, networking, and cloud technology choices at their disposal than ever before, and aggressive technology adopters will gain a performance and agility edge over their rivals in 2017. All Flash Arrays, Object Storage and Disaster Recovery to Cloud/Disaster Recovery as a Service are key technologies transforming the storage landscape. These new innovations account for a relatively small amount of storage revenue today, but will gradually displace traditional disk based storage systems, ultimately opening up opportunities for new innovators. Meanwhile, the pace of change in networking technology is delivering the speed, capacity and agility demanded by cloud, IoT, social media and content delivery, and provides the fabric that makes digital transformation possible.
Join Al Sadowski, Research Vice President for Infrastructure, as he dives into what’s accelerating the virtualization of network applications and how traditional vendors will need to transform their “innovation supply chain” to produce product equivalents to highly scalable cloud infrastructure.
The deck has been stacked against storage buyers for decades, but a change is coming. Forget the endless refresh treadmill of monolithic storage systems, and licensing models that are not just inflexible, but also force you to pay again and again for the same technology. Join this webinar to learn how to liberate your storage budget and pay for storage in the way that suits your business rather than the vendors, with 451 Research senior analyst Tim Stammers and NetApp SolidFire’s Vice President & General Manager, Dave Wright.
Server virtualization was supposed to consolidate and simplify IT infrastructure in data centers. But that only “sort of happened”. Companies do have fewer servers, but they never hit the consolidation ratios they expected. Why? In one word, performance.
Surveys show that 61% of companies have experienced slow applications after server virtualization with 77% pointing to I/O problems as the culprit.
Now, with hyperconverged infrastructure, companies have another opportunity to fulfill their vision of consolidating and reducing the complexity of their infrastructure. But this will only happen if their applications get the I/O performance they need.
Join us for this webinar where we will show you how to get industry leading I/O response times and the best price/performance so you can reduce and simplify your infrastructure.
The next wave of cloud computing is coming, and it is going to take the shape of serving distinct enterprise needs in an on-demand fashion. It is normal for enterprise IT departments to provision or procure IaaS, PaaS or SaaS services now. That in turn means that a larger galaxy of business needs and operational complexities will be tackled as the next way to monetize cloud infrastructure. The trickle of specialized providers doing boutique services such as disaster recovery as a service, compliance as a service and database as a service could well become a flood.
Join Kelly Morgan, Research Vice President for Services as she uncovers what service providers will need to change in order to gain a competitive advantage in 2017, and what the sources of demand for multi-tenant datacenters will be in the coming year.
Cloud, Mobile, Big Data, Social Media - the pillars to our new digital society and a springboard for innovation by businesses of all shapes and sizes. But so what? It’s all just hype, right? Should you be starting to do things differently? Join us for this webinar if you want a reality check...
As technological advances continue to create new demands for Media & Entertainment organizations, the ability to efficiently store, manage, and curate data will be of paramount importance.
The newest generation of intelligent storage systems is key to gaining real-time visibility into the data footprint, usage patterns and much more. These real-time analytics drive innovation and ensure that business processes are carried out smoothly and reliably.
Watch our slightly fictitious story about "Really Big Studios" as told by Mike Bott - Mike has deep experience creating, growing and supporting data storage systems for many of the industry’s studios, post production and special effects shops.
Over the last decade the IT stack has shifted from a hardware-centric to a software-centric viewpoint, beginning with servers, then networking, and now storage. In this session we’ll explain the key characteristics of the SDS technologies on the market today and explore where and how ScaleIO fits in. We will discuss some key ScaleIO technological concepts and explore the benefits of ScaleIO for storage lifecycle simplification and overall TCO. We will also go through a live demo to show some of these features in action.
Quantum's StorNext 5.4 brings new levels of performance, efficiency, flexibility and compatibility to media storage and data management. Enhancements include more tiering options, support for integrated applications, and even faster network connectivity.
Business demands on IT infrastructure are increasing while the pace of change accelerates. In order to support business requirements, companies must understand the value that new storage technologies may provide. Storage technologies are emerging that provide benefits beyond those already delivered by Flash and SSDs. In order to evaluate and plan for these new technologies, it is important for companies to know how to evaluate their business needs in light of these emerging technologies. This session will cover the economic and performance benefits of non-volatile, memory-based storage technologies, along with new access methods known as NVMe.
What you will learn:
- Performance and economic considerations for storage
- Emerging storage technologies, including NVMe
- Real-world test results of NVMe by Evaluator Group and others
- Evaluator Group recommendations for NVMe planning and deployments
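As a rough intuition for why NVMe’s lower latency matters, Little’s Law relates sustained IOPS to queue depth and per-request latency. The sketch below is illustrative only; the latency and queue-depth figures are assumptions for the example, not results from the session.

```python
# Rough model via Little's Law: sustained IOPS ~= queue_depth / latency.
# This is an idealized upper bound that ignores device and bus limits.

def estimated_iops(queue_depth: int, latency_s: float) -> float:
    """Little's Law: concurrency = throughput * latency, so
    throughput = concurrency / latency."""
    return queue_depth / latency_s

# Illustrative per-request latencies (assumptions, not measurements):
sata_ssd = estimated_iops(queue_depth=32, latency_s=100e-6)  # ~100 microseconds
nvme_ssd = estimated_iops(queue_depth=32, latency_s=20e-6)   # ~20 microseconds

print(f"SATA-class SSD: ~{sata_ssd:,.0f} IOPS")
print(f"NVMe-class SSD: ~{nvme_ssd:,.0f} IOPS")
```

At the same queue depth, a 5x latency reduction yields a 5x IOPS bound, which is why access method matters as much as the media itself.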
With no room for data loss, how do you scale business without scaling costs?
Please join us for an interactive panel discussion with our customers and partners. Together we will explore ways organizations, like cloud service provider OvationData, are solving Big Data challenges with an innovative mix of technologies and scalable infrastructures that deliver needed results.
In this webinar we will discuss:
- How to break the economic barriers of storage
- How companies should evaluate new storage technologies
- Advice on how to fight PB growth challenges
- Plus more!
You will hear real-world examples and ideas spanning flash to object storage, with scalable performance, high data durability and boundless capacity, all at breakthrough costs.
Sportvision, the media company behind the iconic 1st & Ten® yellow line in football broadcasts, offers a broad range of rich sports entertainment and analysis products that rely on vast amounts of video and metadata.
When struggling with storage capacity and data management at scale, the company turned to Qumulo to gain the necessary visibility into data usage and performance.
Watch this video case study and learn how Qumulo helps Sportvision:
- Gain unmatched real-time visibility into data and usage at the file-level
- Receive insight into performance consumption to ensure adequate IOPS
- Scale incrementally, allowing the company to simply grow storage capacity and performance by adding more nodes
Scalability is a must-have for any IT environment. This is especially the case where storage is concerned. Between all the files, photos, and videos, the average firm has more unstructured data than it knows what to do with. Clearly, the importance of scaling to meet increasing storage demands can’t be debated. What can be, and has been, argued at great length is how to go about it. Should we scale up, or scale out?
Should you stick to the tried and true method of just adding disks to your arrays ("scale up") or look at software-based systems that can cluster multiple storage servers together over the network ("scale out")? Who's actually using these solutions and when does it make sense to go with one over the other?
Watch this video and get clear on the pros and cons of these two distributed storage solutions.
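The scale-up versus scale-out trade-off can be made concrete with a toy capacity-planning model. All numbers below (disk sizes, node sizes, throughput figures) are illustrative assumptions; the point is that scale-up is bottlenecked by a single controller, while scale-out grows capacity and throughput with every node.

```python
# Toy comparison (all figures are illustrative assumptions):
# scale-up adds disks behind one controller whose throughput is fixed;
# scale-out adds whole nodes that each bring capacity AND throughput.

def scale_up(disks: int, tb_per_disk: int = 8, controller_mbps: int = 2000):
    capacity_tb = disks * tb_per_disk
    throughput_mbps = controller_mbps      # bottleneck: one controller
    return capacity_tb, throughput_mbps

def scale_out(nodes: int, tb_per_node: int = 48, node_mbps: int = 1000):
    capacity_tb = nodes * tb_per_node
    throughput_mbps = nodes * node_mbps    # grows with every node added
    return capacity_tb, throughput_mbps

for n in (4, 8, 16):
    up = scale_up(disks=n * 6)
    out = scale_out(nodes=n)
    print(f"{n:>2} units  scale-up: {up[0]:>4} TB @ {up[1]:>5} MB/s"
          f"   scale-out: {out[0]:>4} TB @ {out[1]:>5} MB/s")
```

At equal capacity, the scale-out cluster’s aggregate throughput keeps climbing while the scale-up array plateaus, which is the crux of the debate the video walks through.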
During this presentation we will talk about the newest family of midrange arrays and see how they have fared on the market in the months since launch. We will also discuss the state of the SSD market and what new functionality can be expected after the release of the Falcon microcode.
Join independent analyst and blogger, Gabe Knuth, as he discusses the current state of storage, how we got there and the benefits of flash storage with the VP and Chief Architect at Pure Storage, Vaughn Stewart. Tune in now and learn what sets one storage vendor apart from its competitors.
The Scientific Computing and Imaging Institute (SCI) at the University of Utah is on a mission to help us better see the world. Their visualizations range in scope from as large as our solar system to as small as a brain cell. Running such intensive research projects requires a powerful storage solution.
Watch this short customer video to see how SCI is using Qumulo to power their visualizations, image analysis, and scientific computing.
Splunk does an excellent job of managing data, moving data between tiers, which it calls buckets. But assigning Splunk to data management tasks takes compute power away from the primary objective — rapid data analysis. In this live webinar Storage Switzerland and Tegile will discuss how the default Splunk architecture works today, what the challenges with that architecture are, and how to design a storage architecture for Splunk that is both high performance and cost effective.
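To illustrate the kind of tiering decision at stake, here is a minimal, hypothetical age-based policy in the spirit of Splunk’s hot/warm/cold buckets. The thresholds and bucket names are assumptions for the sketch, not Splunk defaults; Splunk’s actual rollover rules also consider size and index settings.

```python
from datetime import date

# Hypothetical age thresholds (days) for each tier -- assumptions only.
TIERS = [("hot", 7), ("warm", 90), ("cold", float("inf"))]

def tier_for(last_write: date, today: date) -> str:
    """Assign a bucket to a tier based purely on data age."""
    age_days = (today - last_write).days
    for name, max_age in TIERS:
        if age_days <= max_age:
            return name
    return "cold"

today = date(2017, 3, 1)
buckets = {
    "db_2017_02_27": date(2017, 2, 27),
    "db_2017_01_05": date(2017, 1, 5),
    "db_2016_06_10": date(2016, 6, 10),
}
for name, last_write in buckets.items():
    print(name, "->", tier_for(last_write, today))
```

Offloading this kind of movement to the storage layer, rather than spending indexer CPU on it, is the design question the webinar takes up.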
Gone are the days when proprietary hardware was our only option in the data center. As companies sought technologies that would make use of their existing infrastructure, Open Source quickly became the staple solution for innovation and flexibility within the data center. If you’re looking for ways to explore an open source strategy for the first time, or to take your existing strategy to the next level, this webinar is for you!
Join Nithya Ruff, Director of Open Source Office and board member at the Linux Foundation, as she shares why you need an open source strategy, and how you can grow your own. We look forward to collaborating with you!
Have you considered a move to converged infrastructure, but need to be convinced with solid return on investment (ROI) data? Respected industry analyst IDC interviewed global enterprises that use and rely on Hitachi Unified Compute Platform (UCP) to calculate its true business value. IDC found that each organization using this converged infrastructure earned an average return of 360% or about $32.7 million over a five-year period.
The comprehensive independent analysis includes a breakdown of business benefits in four key areas and shows how a unified compute platform:
•Increases business productivity with improved application performance, reduced provision time and faster time to market.
•Improves risk mitigation and user productivity with a high level of resiliency, performance and efficiency.
•Boosts IT staff productivity with decreased IT time to set up new infrastructure and launch new applications.
•Reduces IT infrastructure costs for the provision and deployment of compute, storage and networking resources.
Security is changing. New innovations are making defenses such as anti-malware more of a reality than ever before, while yesterday’s incumbents are being remade through new acquisitions as well as selloffs of their legacy security businesses. So what’s next for security? For one thing, the scale of IoT security risk has finally been revealed – while the threat of ransom may provide attackers an alternative if malware becomes less viable. Will security’s innovations be ready enough, soon enough to protect enterprises from today’s emerging threats?
Join this session with Scott Crawford, Research Director for Information Security, to find out why 451 Research has rated all 6 of 2017’s key security trends as “high impact.”
As data footprints in research organizations increase into the petabyte scale, so does the need to understand and visualize how that data is being used.
While Qumulo Core data-aware scale-out NAS gives you powerful insights into your data, you can take this one step further by building custom visualizations using our programmable REST API.
Watch this video to learn how to:
- Understand the usage patterns of your data to best manage your storage environment
- Create interesting visualizations of your organization’s data using Qumulo Core’s programmable REST API
- Use the Qumulo API with common data science tools such as Python’s Jupyter Notebook
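As a taste of what building on a storage analytics REST API looks like, the sketch below ranks directories by capacity from a JSON payload. The payload shape and field names here are hypothetical stand-ins; consult the actual Qumulo REST API documentation for real endpoints and response formats.

```python
import json

# Hypothetical stand-in for a capacity-by-path analytics response.
# Field names ("files", "capacity_usage") are assumptions for the sketch.
sample = json.loads("""
{
  "files": [
    {"path": "/projects/render", "capacity_usage": 734003200},
    {"path": "/projects/sim",    "capacity_usage": 157286400},
    {"path": "/home",            "capacity_usage": 52428800}
  ]
}
""")

def top_consumers(payload: dict, n: int = 2):
    """Return the n largest directories as (path, MiB) pairs."""
    ranked = sorted(payload["files"],
                    key=lambda f: f["capacity_usage"], reverse=True)
    return [(f["path"], f["capacity_usage"] / 2**20) for f in ranked[:n]]

for path, mib in top_consumers(sample):
    print(f"{path}: {mib:,.0f} MiB")
```

In practice you would fetch the payload over HTTPS (for example with the `requests` library) and feed the result into a Jupyter Notebook for charting; the aggregation logic stays the same.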
Database license fees drive over 80% of total system cost. Many organizations virtualize their applications, but Oracle is often an exception for a variety of reasons.
You will learn why re-platforming Oracle databases on better hardware can drive down TCO by a significant amount. Bart will also cover technical challenges and benefits, as well as myths and facts about licensing Oracle on VMware, and how to deal with Oracle license audits and still stay compliant.
Every new or recycled idea that comes along is going to be the ‘only way things are done in the future’ according to some. This narrow view has been offered in the area of storing and managing information since modern computing began. The reality is much messier. Different approaches for handling information are used, and in practice the processing and management of information rely on mostly unrelated solutions. This BrightTALK session will look at evolving datacenters and how information is handled, including the storage technologies employed. Different approaches will be discussed as part of the presentation.
Digital transformation is a business imperative, not an option. Cloud is powering this change: It provides the new style of IT that supports the information, process and platform transformations required to link technology and information assets with marketing and customer experience to deliver new or enhanced digital processes.
Join William Fellows, Founder and Research Vice President for Cloud, as he discusses what’s coming this year in cloud, and how the industry’s major battleground in 2017 will be how, from where and at what price cloud transformation services are delivered to customers.
Many large organizations either can’t or won’t go all-in on the public cloud. Regulatory compliance, legacy infrastructure, and costs drive these organizations to build private clouds. Do you invest in VMs or containers? Do you use OpenStack, Kubernetes, or Mesos to orchestrate your cloud? How will you bridge your private cloud with AWS, Microsoft Azure, or Google Compute Cloud? You need a flexible architecture that not only embraces all of the above, but also facilitates frictionless movement among them.
Specifically, you’ll learn how you can use software-defined storage to:
1. Support any hypervisor, container, or cloud platform – simultaneously.
2. Plug into any cloud orchestration tool to automate your private cloud.
3. Integrate public cloud – and not just bolt it on – when you’re ready.
4. Create self-service storage and sandboxes for DevOps environments.
Google Cloud Platform, Avere Systems, and Cycle Computing experts will share best practices for advancing solutions to big challenges faced by enterprises with growing compute and storage needs. In this “best practices” webinar, you’ll hear how these companies are working to improve results that drive businesses forward through scalability, performance, and ease of management.
In this webinar, you will learn:
- How enterprises are using Google Cloud Platform to gain compute and storage capacity on demand
- Best practices for efficient use of cloud compute and storage resources
- How to overcome the need for file systems within a hybrid cloud environment
- How to eliminate latency between cloud and data center architectures
- How to best manage simulation, analytics, and big data workloads in dynamic environments
- Market dynamics drawing companies to new storage models over the next several years
In just 60 minutes, you’ll be presented with a foundation for building infrastructure that supports ongoing demand growth, and you’ll have ample opportunity to ask the presenters direct questions.
More so than ever before, businesses are requiring IT to become less about infrastructure and more about availability and mobility of data. Yesterday’s IT shops were defined by the technology they chose to use rather than capabilities they could deliver to the business and its workforce. End users have become more savvy and less willing to accept limitations of traditional, static services. Technology has to adapt to timing, location and value of data on a global basis without restrictions. Based on these trends, FreeStor grew from roots in static infrastructure to now support the widest interoperability across storage platforms, protocols, hypervisors, clouds, and applications in the industry. See how FreeStor is helping to drive the goal of “infrastructure anywhere” for seamless delivery of data anywhere users need to be.
The future is now. Memory prices are dropping drastically and companies are investing heavily in the technology. That doesn’t mean spinning magnetic disks will disappear anytime in the near future. Their densities continue to rise and their prices are significantly cheaper than memory. Although disks operate at slower speeds, this presentation dives into the methods one can employ to increase the performance, and in turn the value, of this slower, aging data storage technology.
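One of the classic methods for extracting more performance from slow spinning disks is to put a small, fast cache (RAM or flash) in front of them, so hot blocks are served at memory speed. A minimal LRU cache sketch, with illustrative block numbers, shows the idea:

```python
from collections import OrderedDict

# Minimal LRU read cache in front of a slow tier (illustrative sketch).
class LRUCache:
    def __init__(self, capacity: int):
        self.capacity = capacity
        self.store = OrderedDict()   # insertion order tracks recency
        self.hits = self.misses = 0

    def get(self, block: int) -> bool:
        """Return True on a cache hit; on a miss, fill the cache
        (read-allocate) and evict the least recently used block."""
        if block in self.store:
            self.store.move_to_end(block)     # mark most recently used
            self.hits += 1
            return True
        self.misses += 1
        self.store[block] = True              # fill on miss
        if len(self.store) > self.capacity:
            self.store.popitem(last=False)    # evict LRU block
        return False

cache = LRUCache(capacity=3)
for block in [1, 2, 3, 1, 2, 4, 1]:           # illustrative access pattern
    cache.get(block)
print(f"hits={cache.hits} misses={cache.misses}")
```

Even a small cache turns a large fraction of a skewed workload into fast-tier hits, which is exactly how hybrid arrays wring more value out of spinning media.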
Docker® containers have emerged to provide the agility that development teams need while delivering the stability and reliability required by IT operations. Developers can bring existing applications into the Docker ecosystem while building new applications using microservice design principles. IT-operations teams can benefit from reduced complexity and faster deployment of containerized applications.
Typically, Docker containers do not retain data after being restarted or retired. A container’s ephemeral nature presents challenges for containerizing many common Enterprise applications, from databases to home-grown tools, where data must be retained or when a container must share its data across multiple Docker hosts. Docker and NetApp solve these issues and more with the NetApp® Docker Volume Plugin (nDVP).
The nDVP adds persistent storage capabilities to Docker containers, which gives developers the ability to enable data persistence in applications that use Docker container technology along with enterprise storage features such as transparent scalability, high availability, encryption, data deduplication, and seamless replication.
For most organizations today, over half of their data is inactive, yet it consumes expensive storage and data protection resources. Quantum and Komprise have teamed up to show you how to learn more about your data and identify how you can store it more intelligently: keeping it protected and fully accessible, but at much lower cost than the traditional approach.
In 15 minutes you will be able to see how much money you can save on your next storage purchase, and we will even provide you a return on investment (ROI) analysis for your internal discussions.
Join Quantum and Komprise for this joint webinar and learn:
• How you can get analytics across your storage to know how data is being used
• How to cut costs and see the ROI of implementing a simple active archive
• How this strategy can help you leverage public cloud storage if desired
We will give you a live demo so you can see how easy it is.
Join us and qualify for a free analytics assessment in your environment!
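A back-of-the-envelope version of the savings math behind an active archive can be sketched as follows. The per-TB prices and the inactive-data ratio below are illustrative assumptions, not Quantum or Komprise figures; the assessment offered above would replace them with data from your own environment.

```python
# Toy active-archive savings model. All prices and ratios are
# illustrative assumptions for the sketch.

def annual_cost(total_tb: float, inactive_ratio: float,
                primary_per_tb: float, archive_per_tb: float) -> dict:
    """Compare keeping everything on primary storage vs. tiering
    inactive data to a cheaper archive tier."""
    inactive_tb = total_tb * inactive_ratio
    before = total_tb * primary_per_tb
    after = ((total_tb - inactive_tb) * primary_per_tb
             + inactive_tb * archive_per_tb)
    return {"before": before, "after": after, "savings": before - after}

result = annual_cost(total_tb=500, inactive_ratio=0.6,
                     primary_per_tb=400, archive_per_tb=50)
print(result)
```

With 60% of a 500 TB estate tiered to an archive at one eighth the cost per TB, the model cuts the annual bill roughly in half; the live demo makes the same case with real analytics.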
Oracle database sprawl comes with significant cost and complexity. The complexity is the result of the proliferation of database copies created for a range of uses, including disaster recovery, provisioning of development and testing infrastructure, quality assurance (QA), and DevOps in private or hybrid clouds, to name a few.
Join this webinar to learn how Catalogic Software’s® ECX™ Copy Data Management platform, deployed in conjunction with the Pure Storage® FlashArray™, allows organizations to manage, orchestrate and analyze Copy Data. The solution provides full lifecycle management of your Copy Data through automated workflows that allow you to streamline the creation, management and use of Oracle database copies.
Join Spacebelt and the Cloud Constellation Corporation for a technical deep-dive into connecting your cloud storage and telecommunications networks within your data center.
The Space-Based data center infrastructure is composed of interconnected satellites, forming an exclusive cloud storage network ring to serve enterprises and governments everywhere. This is what the data center of the future looks like, and it includes an autonomous global telecom backbone, fast delivery of data stored in orbit to anywhere on the planet, ultra-secure point-to-point connectivity, and more.
With the ever-increasing influx of data and big data, cloud storage models and data centers need to continue to shift and adapt to these demands. Learn about the latest and greatest data center connectivity solutions and technologies on the market today.
Despite some similarities, Hadoop and Spark are often considered to be the same technology. During this webcast, Marc Royer, SE and Big Data Specialist at Dell EMC, will guide you through the differences between these tools so that you can choose the right one for your Big Data project.
In these 30 minutes you will discover:
- Why and how organizations are turning to Big Data to innovate
- Spark and Hadoop, and the specific characteristics of each tool
- A few use cases for these technologies
- A methodology for making your Big Data project a success, and the right questions to ask when choosing the appropriate technology
It's no longer a secret that hyperconverged data centres are gaining ground in 2017, hence it is imperative for data centre architects to understand this technology to stay ahead of the curve. Join this presentation to learn key architectural concepts in HCI and why they matter.
Among other topics discussed, you will discover:
· Why inline services offer better outcomes
· Why policy-based management is a must-have for effective data centre consolidation
· Why modularity and scalability are critical in your next generation data centre
Digital transformation is real, and it's happening, although there is still a very long way to go. We believe it is an inescapable truth that every business is becoming a digital business, controlled by software, which is the manifestation of these digital transformations. Businesses must react, driven by the imperatives of improving intelligence, agility and their customer-centricity, with the ultimate goal of survival in a digital world.
Join Nick Patience, Founder and Research Vice President for Software, as he dives into the three business imperatives behind digital transformation and what specifically will be transformed in 2017.
Enterprise IT organizations looking to get a foothold into public cloud need solutions that deliver the efficiency and innovation promised by cloud platforms—without throwing away the investments they've made in on-prem technology, policies, and skills.
Creating self-service Development & Test (or Dev/Test) environments is one of the first and most popular cloud workloads enterprise IT organizations explore in their journey to cloud. Google Cloud and CloudBolt have partnered to create a hybrid cloud solution for this important workload that's easy to set up, and can be fully tailored to your environment. CloudBolt is a cloud management platform that integrates your on-prem virtualization and private cloud resources with the public cloud, serving as a bridge between your existing infrastructure and Google Cloud Platform.
With these hybrid solutions, users can rapidly provision the resources they need through an intuitive self-service portal. IT organizations maintain full control over system configuration, usage quotas, and cost transparency. By delivering a self-service, fully auditable alternative to shadow IT, CloudBolt and Google Cloud let you tap the latest cloud innovations from Google directly from the technology you're already familiar with on-prem.
Application performance, availability and security are key issues IT departments in law firms struggle with today. It is imperative that you keep your firm's critical applications running fast and always available for your attorneys. Is there a cost effective and manageable solution?
Join our webinar on January 26th to hear how law firms like yours are taking advantage of existing storage infrastructure to:
• Improve application performance
• Ensure document management and email systems operate 24/7
• Reduce both CapEx and OpEx
• Enable a hassle-free experience with zero downtime
Hear real case studies on how law firms like yours are deploying Lenovo Converged Server SAN appliances powered by DataCore to reduce cost and increase productivity. As the world leader in price/performance, Lenovo and DataCore together allow you to run more workloads, with better applications performance and availability.
When it comes to repurposing and accessing data for test/development and file sharing services, how do you ensure data protection and instant, reliable recovery while maintaining speed and efficiency? The best of both worlds is all-flash storage performance for mission-critical applications, as well as flash-driven consolidation of your secondary storage needs. With everything from seamless snapshot integration to safe and secure storage, your data and your application teams will thank you.
Hyperconverged Infrastructures (HCI) offer the potential for simplicity, agility and improved economics, but making sure that you implement the correct architectural approaches is vital. In this session, we will explore the architectural options of HCI, and provide insights into how you can apply this technology to develop the foundation for your next generation data center today.
Join us for this presentation to learn:
· Key architectural concepts in HCI and why they matter
· Why inline services offer better outcomes
· Why policy-based management is a must-have for effective data center consolidation
· The criticality of modularity and scalability in your next generation data center
While the proliferation of analytic technologies has created exciting new ways to harness Big Data like never before, for many it has also created a quagmire of complexity that is slow to deploy, runs inefficiently, isn’t integrated with other systems, and cannot be scaled. There is a solution; it is possible to create next-generation applications with ease and efficiency.
Join us for this presentation to learn:
• How to jump start Big Data technologies deployments by using a standard analytic platform
• How to implement an analytics platform that can scale based on your needs and simplifies your user experience
• Where analytics deployments have failed to deliver ROI
• Lessons learned from other companies on the journey to enterprise-grade analytics
• What consumption models are available and which are most advantageous
While backup software vendors continue to innovate, hardware vendors have been resting on their deduplication laurels. In the meantime, the amount of data that organizations store continues to grow at an alarming pace and the backup and disaster recovery expectations of users are higher than ever. Most backup solutions today simply will not be able to keep pace with these realities. If organizations don't act now to address the weaknesses in their backup hardware, they will not be able to meet organizational demands by 2020. In this webinar, Cloudian and Storage Switzerland will discuss three areas where IT professionals need to expect more from their backup hardware and where they should demand less.
Four Reasons Why Backup Hardware Will Break by 2020:
1. Not Cost Effective Enough
2. Not Scalable Enough
3. Only Good for Backups - Not Enough Use Cases
4. Too Much Deduplication
The data center of 2020 will look nothing like the data center of today. The software-defined data center will continue to gain momentum with object storage playing a major role. Learn how object storage is now enabling far more than just cost-effective data storage and is being used for hybrid cloud, data management, filer and database optimization.