The data center management community focuses on the holistic management and optimization of the data center. From technologies such as virtualization and cloud computing to data center design, colocation, energy efficiency and monitoring, the BrightTALK data center management community provides the most up-to-date and engaging content from industry experts to better your infrastructure and operations. Engage with a community of your peers and industry experts by asking questions, rating presentations and participating in polls during webinars, all while you gain insight that will help you transform your infrastructure into a next generation data center.
Would you like to cut complexity across all phases of app development and deployment?
Join us for this straightforward discussion on how CA Application Lifecycle Conductor reduces risk through a single source-of-truth. CA Application Lifecycle Conductor automates and manages the software development lifecycles that span mobile-to-mainframe environments — from the initial service desk ticket to the deployment of the application in production.
Join Rose Sakach, Sr. Principal Product Manager, and Vaughn Marshall, Director, Product Management, as they outline CA Application Lifecycle Conductor’s many benefits. Discover how you can:
• Create one view and traceability for the application development lifecycle
• Identify the potential time savings for project managers, release managers and compliance managers
• Determine which customer segments would benefit the most from adopting CA ALC
Are you ready to simplify application lifecycle management—from mobile to mainframe?
With the average company experiencing unplanned downtime 13 times a year, the costs associated with continuing to invest in a legacy backup solution can be extensive. For this reason, more customers are switching to Veeam® and Quantum than ever before. Update to a modern data center and achieve Availability for the Always-On Enterprise™ with Veeam coupled with Quantum’s tiered storage, which increases performance, reduces bandwidth requirements and executes best practices for data protection.
After a record-setting year in 2015, where will the tech M&A market go in 2016? What trends that pushed M&A spending to its highest level since the Internet Bubble burst will continue to drive deals and which ones will wind down? What other sectors are likely to see the most activity this year? And most importantly, what valuations will be handed out in deals over the coming year? Drawing on data and views from across 451 Research, the Tech M&A Outlook webinar maps many of the major developments in the IT landscape (IoT, Big Data, cloud computing) to how those influence corporate acquisition strategies. Join us for a look ahead to what we expect for tech M&A in 2016.
Overhead power distribution in your data center offers many attractive possibilities, but is not without its challenges.
Join Starline's Director of Marketing, Mark Swift; CPI’s Senior Data Center Consultant, Steve Bornfield; and University of Florida's Joe Keena for an exploration of the options and some of the pitfalls, supported by real-life examples from the field.
The Chef DevOps Journey Assessment, or Dojo, is a tool used to assess both the current and desired state of your organization's DevOps progress. The goal of the exercise is to expose opportunities for continuous improvement by defining a baseline of where you are on your journey and the desired state you’re trying to reach. After we compile the results, patterns will emerge that can be used to visualize goals and generate custom guidelines for success.
In this webinar, Justin Redd, Customer Engineering Lead, and Thomas Enochs, VP Customer Success, will explain:
- Why we created the Dojo and how it can help your organization accelerate DevOps adoption
- What the DevOps Journey Assessment is and how it’s conducted
- Who in your organization should attend the Dojo exercise
- How to get it
Hear ad-tech leader Sharethrough talk about delivering self-service access to powerful analytics using Snowflake and Looker. Sharethrough Director of Analytics, Joseph Bates will join Snowflake VP of Product and Marketing, Jon Bock and Looker Alliances Analyst, Erin Franz, to chat about the benefits of combining Snowflake’s cloud data warehouse with Looker’s data analytics platform.
You’ll hear how Sharethrough, a software company that powers in-feed, native ads for premium publishers & brand marketers, has been able to crunch data 300x faster even while giving more users access to analytics on their data.
In this webinar you’ll learn about the power of combining Snowflake’s elastic architecture and ability to bring together diverse data with Looker’s capabilities that allow users to join and model that data to see the full picture of their business.
Among the benefits you’ll hear about:
- In-database scale and performance: how Looker’s direct database interface takes advantage of Snowflake’s power and flexibility
- Real-time insights: why bringing together diverse data without data movement enables results that are always up to date
- Integrated access to diverse data: how Looker can take direct advantage of Snowflake’s native support for both structured and semi-structured data (like JSON), even making joins across diverse data possible
- Self-service access: how Snowflake and Looker make all of your data available to data users in a consistent, user-friendly way without the burden of infrastructure, tuning, and manual optimization
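As a toy illustration of the structured-plus-semi-structured join described above, the Python sketch below flattens nested JSON events and joins them to a structured table. All names and sample records are hypothetical; Snowflake performs this natively in SQL over its semi-structured data support, not in client code like this.

```python
import json

# Structured table: campaign metadata (hypothetical sample data)
campaigns = [
    {"campaign_id": 1, "name": "spring_launch"},
    {"campaign_id": 2, "name": "holiday_push"},
]

# Semi-structured event log, as it might arrive in JSON
raw_events = [
    '{"campaign_id": 1, "metrics": {"impressions": 1200, "clicks": 34}}',
    '{"campaign_id": 2, "metrics": {"impressions": 800, "clicks": 21}}',
]

def flatten(event_json):
    """Flatten one JSON event into a flat record, akin to
    addressing nested fields directly in a semi-structured column."""
    e = json.loads(event_json)
    return {
        "campaign_id": e["campaign_id"],
        "impressions": e["metrics"]["impressions"],
        "clicks": e["metrics"]["clicks"],
    }

def join_on_campaign(campaigns, events):
    """Join structured rows with the flattened semi-structured rows."""
    by_id = {c["campaign_id"]: c["name"] for c in campaigns}
    return [
        {"name": by_id[ev["campaign_id"]], **ev}
        for ev in (flatten(e) for e in events)
    ]
```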
The goal of Virtual Desktop Infrastructure (VDI) is to lower the operational costs of supporting an organization’s desktops and laptops. A VDI project should make it easier for IT to support users, keep the organization safe from viruses and better protect its data assets. The key to a successful VDI project is user adoption. But while many organizations consider using an All-Flash Array (AFA) to deliver the performance that will lead to user adoption, the problem is AFAs drive up the cost per desktop, increase the vulnerability of the architecture and encourage additional storage silos.
Join Storage Switzerland and Cloudistics for an informative webinar that will provide an alternative approach that meets users’ performance expectations while leveraging existing – and often already paid for – storage hardware and does not introduce new silos of storage.
In this webinar, we will discuss:
• the joint effort between AT&T, Broadcom and ONF to support ON.Lab’s Central Office Re-architected as Datacenter (CORD) Project
• the specific use case of an SDN-based leaf-spine fabric built with bare-metal OCP hardware (using Broadcom switching silicon) and open source switch software, built on top of Broadcom-provided SDN reference software, the OpenFlow Data Plane Abstraction (OF-DPA).
The goal of the joint effort is to provide a pure SDN-based fabric implementing the following:
• standard L2 switching within racks,
• L3 forwarding across racks using MPLS labels between leaf and spine switches, with support for QinQ and IP multicast.
The entire fabric will be controlled by an ONOS controller cluster, and Broadcom’s OF-DPA 2.0 software will be used to program the OpenFlow rules in all the white-box switches.
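The two forwarding behaviors can be sketched as a toy Python model: an L2 rule matching a destination MAC within the rack, and a cross-rack rule that pushes an MPLS label toward a spine switch. Field and function names here are purely illustrative and do not correspond to actual OF-DPA structures.

```python
# Toy model of the two forwarding behaviors: L2 switching within a
# rack, and MPLS-labeled forwarding across racks via the spine.

def intra_rack_rule(dst_mac, out_port):
    """Standard L2 rule: match destination MAC, forward to a rack port."""
    return {"match": {"eth_dst": dst_mac},
            "actions": [{"output": out_port}]}

def leaf_to_spine_rule(dst_rack_label, spine_port):
    """Cross-rack rule: push an MPLS label identifying the
    destination rack and send toward a spine switch."""
    return {"match": {"ip_dst_rack": dst_rack_label},
            "actions": [{"push_mpls": dst_rack_label},
                        {"output": spine_port}]}
```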
Healthcare privacy and data protection regulations are among the most stringent of any industry. Join this Webcast to learn how healthcare organizations can securely share health data across different cloud services. Hear experts explore how Encrypted Objects and Delegated Access Control Extensions to the Cloud Data Management Interface (CDMI) standard permit objects to move freely and securely between clouds and clients with enhanced security and auditability.
•Protecting health data from alteration or disclosure
•How Cloud Encrypted Objects work
•How Delegated Access Control works
•CDMI for Electronic Medical Records (EMR) applications
•Healthcare use cases for securely sharing data in the cloud
Learn about Quantum’s exciting new high-performance workflow storage system - Xcellis!
Xcellis Workflow Storage is the latest innovation in the continuing development of Quantum StorNext appliances. It is a high-performance storage family engineered to optimize workflows, streamline operations and empower organizations. Quantum Xcellis Storage Systems solve the world’s hardest data management problems in media & entertainment, video surveillance, life sciences, geospatial and government.
Benefits of Xcellis…
• Enhance workflows, boost efficiency, and improve productivity.
• Scale capacity and performance in parallel or independently.
• Combine SAN performance with NAS connectivity to promote collaboration and support workflows.
Xcellis is for companies that depend on rapid, reliable access to content to deliver the products, services, and intelligence that drive their business.
For more information visit: www.quantum.com/Xcellis
Corporations increasingly rely on their enterprise services bus (ESB) as the communication center to link multiple IT systems, applications and data. Unfortunately, when something goes wrong in the ESB it can have a cascading effect, impacting critical applications. Determining the root cause of the problem is a challenge for most IT organizations, since ESBs serve as a ‘black box’, offering little insight into the issue.
Now, you can assess and resolve performance issues for applications that communicate across your ESB, before they affect your users. Join Richard Nikula, Vice President, Product Development and Support at Nastel, to learn how Nastel AutoPilot for CA Cross-Enterprise APM lets you analyze application behavior in real-time in production, test and development environments.
Join Simon Robinson, Research Director, Voice of the Enterprise: Storage for an interactive webinar highlighting his analysis of customer behaviors and perceptions as derived from our latest survey of over 700 enterprise IT buyers in the Q4 2015 Voice of the Enterprise: Storage study.
During the webinar, you will learn:
• Where are storage budgets headed in 2016?
• What are the key customer pain points and objectives for storage in 2016?
• How will storage spending be impacted by the transition to cloud?
• Which storage technologies are customers looking to adopt in 2016?
• Which storage vendors do users regard as most strategic today and in the future?
Learn how to protect your business against localized disaster, eliminate capital expense, and answer the cloud mandate with Quantum’s newest cloud offering – Q-Cloud Protect for AWS. This webinar will explain what this new offering is, and how it enables you to leverage the public cloud for disaster recovery with a solution that plugs into your existing backup environment.
As we enter 2016, hyper-converged infrastructure appliances are gaining steam as a means to achieve agile IT resource allocation in software-defined data centers. However, many planners are discovering that fast-deploy server/storage appliances often fall short in terms of hosted VM performance. Perceived application performance is the make-or-break of modern IT service strategy and faster is definitely better. The challenges to achieving stellar application performance in virtualized hyper-converged infrastructure are many, but I/O latency is ultimately the first and often the most important hurdle to overcome. Most solutions offered by storage and hypervisor software vendors fall short of tackling the latency challenge, since doing so really requires more than just memory caching, log structuring or I/O pipelining. What is needed is parallel I/O processing.
In this webinar, you will learn how to:
- Obtain up to 300x reduction in latency and up to 15x improvement in data throughput – even with commodity server and storage gear
- Harness the untapped power of multi-core processors to create a parallel engine for processing I/O
- Reduce your infrastructure costs while learning more about a hyper-converged solution that achieved $0.08 per SPC-1 IOPS™
The results have been documented in a just-published report from the Storage Performance Council that should be required reading for any IT planner.
Join us as industry analyst, Jon Toigo walks through the benchmark results to describe the best performing and lowest cost solution for achieving application performance in the market today.
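The core idea of parallel I/O — servicing many requests concurrently across cores instead of through one serial queue — can be sketched in a few lines. This is a conceptual illustration only, not the vendor’s storage engine; the handler and request objects are placeholders.

```python
# Fan I/O requests across worker threads so multiple cores can
# service requests concurrently, rather than queueing them serially.
from concurrent.futures import ThreadPoolExecutor

def serve_io(requests, handler, workers=4):
    """Process I/O requests across a thread pool; results are
    returned in submission order."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(handler, requests))
```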
DevOps addresses inefficiencies that result from keeping operations and development in separate silos. By connecting development and operations, enterprise IT departments can begin to break down the walls.
DevOps defines a set of roles and responsibilities focused on reducing risk in IT deployments and projects. The result is maximized automation, elimination of human error, increased consistency, and less time spent on outages and on the error detection and prevention work brought about by unstable environments.
In this webinar, ODCA president, Gunnar Menzel, will share perspectives on the DevOps concept, focusing on key challenges it can help resolve and the benefits it can provide.
It’s 2016. If you were born after 1980, you probably don’t remember a time without the Internet and the Web. You reach for your smartphone as your go-to device to answer questions, order pizza, find the score of a football game, and check the address of your child’s next soccer game.
It should come as no surprise that you reach for your smartphone or use your browser at work to search for an answer when you have a technical issue. The bottom line is that more and more business users are bypassing the service desk and the self-service portal because they believe they can get answers more quickly on the web.
Could this be the beginning of the end for IT Self Service as we know it? Join George Spalding, EVP, Pink Elephant and Jim Blayney, Sr. Product Marketing Mgr. CA Technologies, as they tackle the thorny and divisive issue of whether it’s actually more productive for users to get their own solutions from the web.
Random numbers and the entropy they contain are the foundation of secure cryptography. Yet providing sufficient random numbers and entropy, with assurance that they cannot be known, monitored, controlled or manipulated by others, has proven remarkably difficult. How do you know if your random numbers are truly random, and how can you ensure that your security system is secure?
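One quick first check you can run on a random source is a Shannon entropy estimate. A high score cannot prove randomness, but a low one flags gross failures; a minimal sketch in Python:

```python
import math
from collections import Counter

def shannon_entropy_bits_per_byte(data: bytes) -> float:
    """Estimate the Shannon entropy of a byte sample, in bits per
    byte. A good entropy source approaches 8.0; markedly lower
    values indicate bias or structure in the output."""
    if not data:
        return 0.0
    counts = Counter(data)
    n = len(data)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())
```

A constant stream scores 0.0 and a uniform spread over all 256 byte values scores 8.0; real sources should be sampled at length and sit near the top of that range.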
Join EXFO’s Data Center team for an overview of the challenges inherent to 40G/100G and the ultimate 400G fiber-optic network migration, and to gain insight into industry best practices for physical-layer and service-layer testing and monitoring.
This webinar covers the following topics:
•How to anticipate and address the challenges inherent to multifiber push-on (MPO)-based fiber-optic network infrastructures
•How to assess the performance of a future-proof permanent link for a future high-speed line rate
•The importance of connector insertion loss (IL) and return loss (RL) inside the data center
•How to plan a trouble-free data center interconnect (DCI)
What is a WAF (web application firewall) and how can it help defend your AWS workloads? In this webinar, you’ll learn how to get started with the new AWS WAF service and where it fits in your security strategy. You’ll see how AWS WAF works with Trend Micro’s Deep Security to provide a strong, layered defense for your web applications.
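Conceptually, a WAF inspects each request against a set of match rules and allows or blocks it. The toy sketch below illustrates that decision flow in Python; real AWS WAF rules are configured as managed resources rather than inline code, and these patterns are simplistic examples, not production signatures.

```python
import re

# Toy WAF-style rule set: each rule is a name plus a pattern to
# look for in the request URI or query string.
RULES = [
    ("sql_injection", re.compile(r"('|--|\bunion\b\s+\bselect\b)", re.I)),
    ("path_traversal", re.compile(r"\.\./")),
]

def inspect(uri: str, query: str) -> str:
    """Return 'BLOCK:<rule>' for the first matching rule, else 'ALLOW'."""
    for name, pattern in RULES:
        if pattern.search(uri) or pattern.search(query):
            return f"BLOCK:{name}"
    return "ALLOW"
```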
NVMe (Non-Volatile Memory Express) over Fabrics is of tremendous interest among storage vendors, flash manufacturers, and cloud and Web 2.0 customers. Because it offers efficient remote and shared access to a new generation of flash and other non-volatile memory storage, it requires fast, low-latency networks, and the first version of the specification is expected to take advantage of RDMA (Remote Direct Memory Access) support in the transport protocol.
Many customers and vendors are now familiar with the advantages and concepts of NVMe over Fabrics but are not familiar with the specific protocols that support it. This live webcast will explore and compare the Ethernet RDMA protocols and transports that support NVMe over Fabrics and the infrastructure needed to use them. You’ll hear:
•Why NVMe Over Fabrics requires a low-latency network
•How the NVMe protocol is mapped to the network transport
•How RDMA-capable protocols work
•Comparing available Ethernet RDMA transports: iWARP and RoCE
•Infrastructure required to support RDMA over Ethernet
•Congestion management methods
A movement is underway. Businesses are awakening to a new era of the digital enterprise, one that requires companies to find new ways of delivering services built for the digital era. Success in this new era requires a digital industrialization strategy in which datacenters become the core asset of the business, enabling a transformation from an infrastructure tightly coupled with the business to a modern infrastructure that enables any business.
Digital industrialization is a continuous cycle that organizations can use to turn IT infrastructure from a cost into an asset by standardizing on one set of technologies and economics across facilities, hardware, software, and operations; consolidating datacenters; abstracting functionality; automating operations and governing it all to ensure security, integrity, and compliance.
This presentation will go through why digital industrialization is needed, what the benefits are and how the Ericsson Cloud portfolio facilitates it.
Financial advisors espouse that proper asset allocation during times of market volatility can help us sleep better. There is a parallel in the IT world. With volatility driven by technology advancement, virtual & cloud environments, and consumer demand for the newest applications and hardware, a good night’s sleep for an Asset Manager requires properly managed and optimally allocated hardware and software assets in a constantly changing environment.
This session explores this intimidating world, common pitfalls, prescriptive actions and what the latest technology can do to make sure your assets, licenses and infrastructure are optimally aligned to drive wealth in IT…and let the Asset Manager sleep well without the fear of negative audit findings and exorbitant fines.
Join us for a live Big Data Analytics customer case study webcast featuring Dana Gardner, a leading IT industry analyst at Interarbor Solutions, as he interviews Procera Networks executive, Cam Cullen.
Learn how Procera Networks dealt with massive data volume challenges to provide network performance benefits to its global users, powered by HPE Vertica. HPE Vertica is the industry’s first comprehensive, scalable, open, and secure platform for Big Data Analytics.
For the Data Scientist, Data Science is complex; for the average business user it is a mystical art form that promises a lot but often under-delivers against expectation. For many established companies, the result has been a lack of investment in an area that is, for others, quickly becoming a source of competitive advantage.
Helping the business understand the value of Big Data and Analytics, whilst also helping translate their business requirements and expectations, is a critical foundational step of the Data Analytics Lifecycle that can lead to greater investment from the business and greater profit for the organization. By way of customer examples, this presentation discusses the importance of engaging the business early and the importance of being able to tell an engaging story about the ‘Art of the Possible’.
Join us for a live webinar featuring Dr. Jim Metzler, Distinguished Research Fellow at Ashton Metzler & Associates, as he discusses the future of the WAN and why now is the time to rethink the architecture in order to evolve/accelerate your business.
Why take the traditional approach to branch office expansion by relying on providers that can’t meet the urgency of your business needs? A new era has come where complexity, cost, flexibility and time-to-market are no longer a hurdle. Welcome to the world of the Software-Defined WAN.
Join this webinar featuring Jim Metzler, industry specialist to discover:
•Why it’s time to re-architect your WAN
•How today’s WAN infrastructure is failing IT
•New market dynamics that the traditional WAN cannot support
•A step-by-step approach to evolving your WAN with minimal disruption
In the traditional world of EDW, ETL pipelines are a troublesome bottleneck when preparing data for use in the data warehouse. ETL pipelines are notoriously expensive and brittle, so as companies move to Hadoop they look forward to getting rid of the ETL infrastructure.
But is it that simple? Some companies are finding that in order to move data between clusters for backup or aggregation purposes, they are building systems that look an awful lot like ETL.
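The ETL pattern in question is simple to state even when the infrastructure around it is not: extract raw records, transform them into a clean shape, load them into a target. A minimal sketch, with in-memory lists standing in for real sources and warehouse tables:

```python
def extract(source):
    """Pull raw records from the source system."""
    return list(source)

def transform(records):
    """Clean and reshape: drop incomplete rows, normalize fields."""
    return [
        {"user": r["user"].strip().lower(), "amount": float(r["amount"])}
        for r in records
        if r.get("user") and r.get("amount") is not None
    ]

def load(records, sink):
    """Append transformed records to the target store; return count."""
    sink.extend(records)
    return len(records)
```

Systems that move data between clusters for backup or aggregation end up re-implementing exactly these three stages, which is the point the paragraph above makes.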
The presentation will cover RAN evolution, associated trends and benefits, and the move away from traditional, purpose-built, distributed base stations into a new era of fully virtualized RAN (vRAN), based on the ASOCS® virtual Base Station (vBS) solution and its collaboration with Intel®.
Growing concern over climate change, increasing utility prices and the availability of sustainable energy like solar are driving increased demand for conservation, pricing transparency and cost savings. Smart energy solutions provide real-time visibility into consumption and billing data, helping consumers conserve resources, while energy and utility companies are better able to balance production to meet actual demand, reducing brownouts and other potential issues. At a time when energy utilities play an increasingly important part in our everyday lives, smart grid technologies introduce new security challenges that must be addressed.
In this webinar we will discuss:
•The threat landscape
•Addressing security challenges in critical infrastructure with cryptography and strong authentication
•Compliance with NERC CIP Version 5
•The road ahead
Unstructured data growth, the Internet of Things, application modernization and big data analytics are forcing organizations to provide more capabilities at a lower cost. In this climate, understanding how unstructured data is exploding and why object storage is becoming the solution of choice is essential.
Join Dennis Jannot as he explores why web-scale private cloud is the ideal object storage option – allowing you to build a global content repository, take Hadoop to your data across the globe, store IoT data or archive inactive content with cloud economics.
As the switch to the cloud blurs the boundaries of the traditional enterprise security perimeter, organizations are challenged with maintaining visibility into access events and enforcing consistent access controls across their data estate, which spans on-premises, virtualized and cloud-based applications. Applying uniform authentication policies on different types of applications may often require diverse integration methods, such as the RADIUS and SAML 2.0 industry standards, as well as custom agents or APIs. In this webinar, you will learn how Office 365, AWS Workspaces and SFDC—as well as other SaaS applications—can be easily secured using SafeNet Authentication Service.
Transforming your application portfolio can help your business reduce costs & invest in innovation. Modernize. Migrate. Retire. Join us for this informative discussion to learn which options are the best match to produce quick wins for your business.
Are you struggling to consume, derive insight from, and act on the increasing volume of data generated from millions of connected devices and emerging applications? Join us for this Webinar to learn how HPE Vertica “Excavator,” the new release of the industry’s leading Big Data analytics platform, integrates with Kafka to quickly ingest and analyze high-speed streaming data, from various sources, including Internet of Things applications, and provides enhanced SQL analytics and performance to Hadoop.
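The ingestion pattern being described — draining a stream in micro-batches and committing each batch to the warehouse — can be sketched as follows. An in-memory queue stands in for a Kafka topic here; this is a conceptual illustration, not the actual Vertica-Kafka loader API.

```python
from collections import deque

def ingest(stream: deque, batch_size: int, table: list) -> int:
    """Drain the stream in fixed-size micro-batches, committing
    each batch to the table; return the number of batches."""
    batches = 0
    while stream:
        batch = [stream.popleft()
                 for _ in range(min(batch_size, len(stream)))]
        table.extend(batch)   # stands in for a bulk COPY into the table
        batches += 1
    return batches
```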
Join our upcoming webinar to learn how HPE Vertica “Excavator” offers the most advanced analytics with enhanced security and enterprise-class performance reliability for the most demanding data-driven organizations.
Please join Teradici and Europe’s leading IT management and services provider, Bechtle, to learn how their customer’s successful Virtual Desktop Infrastructure (VDI) deployment improved enterprise mobility, endpoint manageability, and engineering collaboration.
Specifically, this webinar will share how their customer CSD Management’s PCoIP Hardware Accelerator deployment with VMware Horizon View has:
•Effectively delivered complex engineering, video, and visualization applications with stellar performance and responsiveness from any location;
•Realized business benefits including a significant reduction in CPU cycles and bandwidth savings through PCoIP hardware acceleration by dynamically offloading end-points; and
•Enabled their engineers to work remotely with efficient access to more than 300 high-performance applications.
One of the biggest challenges you will face as you move to the cloud is keeping your users productive while protecting your agency data. Your users' identities will live in your datacenter as well as in the cloud, so how you protect those identities and maintain your security processes is vitally important. The way people access applications and resources is changing. This is why the user's identity is crucial to protecting your data and applications.
Our discussion of hybrid identity will cover:
1. Options for synchronizing identities to the cloud
2. Self-service capabilities for your users, including password management, group management and single sign-on
3. How to configure single sign-on to SaaS applications
4. Automating identity management across different repositories in your datacenter
Please join Nikolay Yamakawa, Senior Analyst, Voice of the Enterprise: Servers/Converged Infrastructure for a deep dive into our 2016 outlook for customer spending across servers and converged infrastructure. During this interactive webinar we will also discuss adoption patterns and operational processes as gleaned from over 780 online surveys and dozens of interviews with senior IT buyers as a part of 451 Research’s Q4 2015 Voice of the Enterprise: Servers and Converged Infrastructure study.
How To Break “The Cycle” and Move To Hyperconvergence
In this webinar, Storage Switzerland's George Crump and SimpliVity's Adam Sekora compare and contrast the suitability of SANs vs. hyperconverged architectures; examine the benefits of consolidating and reducing the number of discrete IT devices in lieu of hyperconverged infrastructure; and discuss the merits of simplified IT and its impact on technology refresh initiatives.
How can global organizations transform their IT to meet the demands of the cloud age? During this webinar, 451 Research will outline how two enterprises changed the way their IT was designed, developed and delivered to adopt a truly digital infrastructure - and demonstrate how it can be applied in your organization.
The Cloud Maturity Model (CMM) is one of the most widely utilized tools published by the Open Data Center Alliance. Gain a deeper knowledge of the CMM and the best-practices that have shaped this visionary tool over the past five years.
The objective of the CMM is to help enterprises:
- Evaluate where their IT organization stands in its ability to adopt and integrate cloud services
- Benchmark their IT organization against other industry adopters of cloud
- Build a custom roadmap toward more effective Hybrid IT – integrating cloud services to improve, not just change, their IT offering, aligned to their specific needs and objectives
Public- and private-sector organizations have used the ODCA’s CMM to guide wide-scale implementations including the selection of cloud solutions and services. Many forward-thinking vendors integrate ODCA best practices into product and service roadmaps to support open standards and interoperability.
Hear from the primary contributors to version 3.0 of the Cloud Maturity Model. These technology and business executives represent top, global enterprise IT organizations who are on the leading-edge of cloud adoption and organizational maturity.