The challenge for any organisation is to decide which initiatives in the ‘green’ agenda will provide the maximum impact. Unless you understand the relative energy consumption of your ICT (for example, user devices, back-office server rooms and data centres), you may end up tackling your carbon load based on ‘gut instinct’ rather than hard data.
This webcast offers a practical overview of the challenges faced by any business leader, and shows how modelling your energy use across all silos in your business, including the ICT estate, can not only improve energy efficiency but also reduce your carbon load, cut costs and sharpen your competitive edge.
Recorded Apr 20 2010 · 47 mins
Michael Elliott, Enterprise Cloud Strategist & Cloud Evangelist, NetApp
Tips for Recognizing Key Data Center Trends and When to Incorporate SDDC into Your Data Center Architecture
In the evolutionary journey of the enterprise data center, key inflection points drive adoption of new technologies such as virtualization, converged infrastructure, cloud, and containers. As business demands drive faster IT acceleration and innovation, new architectures have emerged, giving rise to the Software-Defined Data Center (SDDC). Join this webinar to gain knowledge and understanding of the key trends affecting today’s data center, the components of an SDDC, and what you need to know to align your data center plans with the future of your business.
J Metz, Cisco, Alex McDonald, NetApp, John Kim, Mellanox, Chad Hintz, Cisco
Welcome to this first part of the webcast series, where we’re going to take an irreverent, yet still informative, look at the parts of a storage solution in Data Center architectures. We’re going to start with the very basics – The Naming of the Parts. We’ll break down the entire storage picture and identify the places where most of the confusion falls. Join us in this first webcast – Part Chartreuse – where we’ll learn:
•What an initiator is
•What a target is
•What a storage controller is
•What a RAID is, and what a RAID controller is
•What a Volume Manager is
•What a Storage Stack is
With these fundamental parts, we’ll be able to place them into a context so that you can understand how all these pieces fit together to form a Data Center storage environment.
Oh, and why are the parts named after colors, instead of numbered? Because there is no order to these webcasts. Each is a standalone seminar on understanding some of the elements of storage systems that can help you learn about technology without admitting that you were faking it the whole time! If you are looking for a starting point – the absolute beginning place – start with this one. We’ll be using these terms in all the other presentations.
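As a rough mental model (not taken from the webcast itself, and simplified to one common block-storage arrangement), the parts named above can be pictured as layers in an I/O path. This toy Python sketch is purely illustrative:

```python
# Toy model of where the named parts sit in a typical block-storage I/O path,
# ordered from the application down to the physical media. The one-line roles
# are simplified summaries, not formal definitions.
STORAGE_STACK = [
    ("application",        "issues reads and writes against a file or block device"),
    ("volume manager",     "aggregates block devices into logical volumes (e.g. LVM)"),
    ("initiator",          "client side of the transport; sends I/O requests (e.g. an iSCSI initiator)"),
    ("target",             "server side of the transport; receives and services those requests"),
    ("storage controller", "front end of the array; handles caching, tiering, provisioning"),
    ("RAID controller",    "stripes or mirrors data across drives for performance and redundancy"),
    ("physical drives",    "the HDDs or SSDs that ultimately hold the bits"),
]

def describe(stack):
    """Render the stack top-down so the layering is easy to see."""
    return "\n".join(f"{i}. {name}: {role}" for i, (name, role) in enumerate(stack, 1))

print(describe(STORAGE_STACK))
```

Real deployments vary (a DAS setup has no initiator/target pair, and software RAID folds the RAID controller into the host), but the ordering above is a useful default picture to carry into the rest of the series.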
Join Commvault experts to understand the best practice considerations of leveraging public cloud disaster recovery services when protecting your Software-Defined Data Centre (SDDC).
As awareness of the potential benefits of a Software-Defined Data Centre has begun to resonate with CxOs, the value and importance of disaster recovery provisioning has been challenged by the notion that clustering and high availability could be sufficient to accommodate recovery needs (i.e. that no DR provisioning is required).
In this webinar we'll look at the significant risks and pitfalls that a 'no DR' strategy can pose, and learn about the five-step programme to optimise your chances of recovery when working with public cloud providers.
In the era of data explosion driven by Cloud-Mobile convergence and the Internet of Things, traditional architectures and storage systems will not be sufficient to support the transition of enterprises to cognitive analytics. Ever-increasing data rates and the demand to reduce time to insight will require an integrated approach to data ingest, processing and storage that reduces end-to-end latency, delivers much higher throughput, improves resource utilization, simplifies manageability, and uses considerably less energy to handle highly diversified analytics. Yet next-generation storage systems must also be smart about data content and application context in order to further improve application performance and user experience. A new software-defined storage system architecture offers the ability to tackle such challenges. It features seamless end-to-end data service with scalable performance, intelligent manageability, high energy efficiency, and enhanced user experience.
Camberley Bates, Managing Director and Senior Analyst, The Evaluator Group
Since the ’90s, the storage architectures of SAN and NAS have been well understood and deployed with a focus on efficiency. With cloud-like applications, the massive scale of data and analytics, and the arrival of solid state and HPC-type applications in the data center, these architectures are changing rapidly. It is a time of incredible change and opportunity for business and the IT staff that supports it. Welcome to the new world of Enterprise Data Storage.
Jeff Kato, Taneja Group; Brian Biles, Datrium; Patrick Osborne, HPE; Kevin Fernandez, Nutanix
Join us for a fast-paced and informative 60-minute roundtable as we discuss one of the newest trends in storage: disaggregation of traditional storage functions. A major trend within IT is to leverage server and server-side resources to the maximum extent possible. Hyper-scale architectures have led to the commoditization of servers, and flash technology is now ubiquitous and often most affordable as a server-side component. Underutilized compute resources exist in many datacenters, as growth in CPU power has outpaced other infrastructure elements. One current hot trend, software-defined storage, advocates colocating all storage functions on the server side, but it also relies on local, directly attached storage to create a shared pool. That limits the server’s flexibility in terms of form factor and compute scalability.
Now some vendors are exploring a new, optimally balanced approach. New forms of storage are emerging that first smartly modularize storage functions, and then intelligently host components in different layers of the infrastructure. With the help of a lively panel of experts we will unpack this topic and explore how their innovative approach to intelligently distributing storage functions can bring about better customer business outcomes.
Jeff Kato, Senior Analyst & Consultant, Taneja Group
Brian Biles, Founder & CEO, Datrium
Patrick Osborne, Senior Director of Product Management and Marketing, HPE
Kevin Fernandez, Director of World Wide Technical Marketing, Nutanix
Sushant Rao, Sr. Director of Product Marketing, DataCore Software
Server virtualization was supposed to consolidate and simplify IT infrastructure in data centers. But that only “sort of happened”: companies do have fewer servers, but they never hit the consolidation ratios they expected. Why? In one word: performance.
Surveys show that 61% of companies have experienced slow applications after server virtualization with 77% pointing to I/O problems as the culprit.
Now, with hyper-converged, companies have another opportunity to fulfill their vision of consolidating and reducing the complexity of their infrastructure. But this will only happen if their applications get the I/O performance they need.
Join us for this webinar where we will show you how to get industry leading I/O response times and the best price/performance so you can reduce and simplify your infrastructure.
As organizations move into the "3rd platform", they’ll need to discover new ways to support solutions like Private Cloud and Real-Time Analytics. The traditional disk-based storage they're depending on is not keeping up. Flash is transforming datacenter infrastructure performance, capacity and cost to address these challenges.
Join Chris Tsilipounidakis from Tegile as he examines the ways of transforming to a Flash datacenter to support various, mixed application workloads in an ever-changing ecosystem.
The first rule of data analytics for fast-growing companies? Measure all things. When putting in place a robust data analytics strategy to go from measurement to insight, you’ve got lots of options for tools -- from databases and data warehouse options to new “big data” tools such as Hadoop, Spark, and their related components. But tools are nothing if you don’t know how to put them to use.
We’re going to get some real talk from practitioners in the trenches and learn how people are bringing together new big data technologies in the cloud to deliver a truly world class data analytics solution. One such practitioner is Celtra, a fast-growing provider of creative technology for data-driven digital display advertising. We’re going to sit down with the Director of Engineering, Analytics at Celtra to learn how they built a high-performance data processing pipeline using Spark + a cloud data warehouse, enabling them to process over 2 billion analytics events per day in support of dashboards, applications, and ad hoc analytics.
In this webinar you’ll learn how to:
* Build a simpler, faster solution to support your data analytics
* Support diverse reporting and ad hoc analytics in one system
* Take advantage of the cloud for flexibility, scaling, and simplicity
* Evan Schuman, Moderator, VentureBeat
* Grega Kešpret, Director of Engineering, Analytics, Celtra
* Jon Bock, VP of Marketing and Products, Snowflake
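As a toy illustration of the kind of per-event rollup such a pipeline computes (plain Python standing in for Spark here; the event shapes and field names are hypothetical, not Celtra’s actual schema):

```python
from collections import Counter

# Hypothetical ad-analytics events, like those a creative-technology
# pipeline might ingest. Field names are made up for illustration.
events = [
    {"type": "impression", "creative_id": "c1"},
    {"type": "click",      "creative_id": "c1"},
    {"type": "impression", "creative_id": "c2"},
    {"type": "impression", "creative_id": "c1"},
]

# Count events per (type, creative) pair -- the kind of aggregation a
# Spark job would compute at scale before loading a warehouse table
# that dashboards and ad hoc queries then read.
rollup = Counter((e["type"], e["creative_id"]) for e in events)
print(rollup[("impression", "c1")])  # → 2
```

At 2 billion events per day the same grouping logic runs distributed across a cluster, but the shape of the computation is the same.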
Register today and learn how the top SaaS strategies can streamline your business.
Mike Matchett, Sr. Analyst & Consultant, Taneja Group
Come join Senior Analyst Mike Matchett's lively discussion about the concerns and challenges coming when the Internet of Things crashes into our enterprise datacenters. We think big data today is big, but future IoT data streams promise to swamp everything from servers to storage. And today's new big data applications will still need to become more real-time, more agile, distributed and even more scalable. What's coming down the road, and how should we start planning for the future today?
This webcast will go for 30 minutes, followed by a 15 minute Q & A session, where the audience is welcome to ask questions.
Nick Serrecchia, Systems Engineer at Veeam and Terry Grulke, Sr. Technical Advisor at Quantum
With the average company experiencing unplanned downtime 13 times a year, the costs associated with continuing to invest in a legacy backup solution can be extensive. For this reason, more customers are switching to Veeam® and Quantum than ever before. Update to a modern data center and achieve Availability for the Always-On Enterprise™ with Veeam coupled with Quantum’s tiered storage, which increases performance, reduces bandwidth requirements and executes best practices for data protection.
The Internet of Things is becoming a reality, and as a result companies of all shapes and sizes are implementing digital transformation initiatives. This digital transformation begins with a modern data center, built on converged infrastructure, that provides a simple and cost-effective process to both deploy and run IT – supporting core business and next-generation applications.
Carrick Carpenter, Director of Delivery, Cloud Computing - Healthcare, and Ozan Talu, Director of Private Cloud Services
Cloud computing is a growing force in healthcare and, while many organizations understand the opportunity that the cloud offers, why and how to get there is widely debated. As providers evaluate the pros and cons of cloud based solutions, several adoption strategies are emerging. Taking the right approach is critical to determining future readiness as healthcare becomes more information-driven and connected, and moves towards collaborative care models and payment reform. This workshop will examine key applications of cloud computing in healthcare (including hosting, security/privacy and medical image archiving), highlight change management strategies from a technical/operational/process perspective, and identify the pros and cons of different cloud models including public vs. private. The workshop will be divided into vignettes that include didactic presentations and real-world case studies with interactive discussions.
Walfrido Zafarana, Product Application Manager, Emerson Network Power
The data centre is mission-critical to many businesses. An efficient chilled water system guaranteeing continuous cooling availability is fundamental to achieving a low overall data centre PUE, so it is important to clearly understand the different technologies available for your data centre application.
Join the Emerson Network Power Critical Advantage Webcast for all of the answers.
The webcast will provide insight into:
• The advantages of Chilled Water systems
• The different solutions available according to your data centre’s internal conditions: air-cooled, free cooling, adiabatic
• How to achieve utmost efficiency at the data centre system level with iCOM™ Control
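For reference, the PUE metric mentioned above is simply total facility power divided by the power delivered to IT equipment. A minimal sketch (the example figures are illustrative, not from the webcast):

```python
def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Power Usage Effectiveness: total facility power / IT equipment power.

    1.0 is the theoretical ideal (every watt reaches the IT load); real
    sites score higher because cooling, power conversion and lighting all
    draw power on top of the IT load.
    """
    if it_equipment_kw <= 0:
        raise ValueError("IT load must be positive")
    return total_facility_kw / it_equipment_kw

# A site drawing 1500 kW in total to support a 1000 kW IT load:
print(pue(1500, 1000))  # → 1.5
```

An efficient chilled water plant lowers the numerator without touching the IT load, which is exactly how better cooling shows up as a lower PUE.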
Patrick Grillo, Senior Director, Security Solutions, Fortinet
The Data Center is not an island. It is part of a complex ecosystem, working and evolving together for the overall benefit of the enterprise.
Data Center security can no longer be treated as an island. It must integrate and interact with the overall enterprise ecosystem and security infrastructure to provide a real-time, effective security posture.
This webinar will present a high level view as to the importance of deploying an integrated, end-to-end enterprise security platform for achieving data center security.
Because sometimes, the best data center security solution has nothing to do with the data center!
Jose Ruiz, VP Engineering Operations, Compass Datacenters
As has often been reported, human error is one of the largest factors in data center outages. Since estimates of the average cost of an outage now exceed $740,000, the ability to reduce or eliminate human-caused outages can make a substantial impact on an organization’s bottom line. In this presentation, Jose Ruiz, VP of Engineering Operations for Compass Datacenters, will present a case study on how the introduction of wearable technology has substantially enhanced one customer’s operational performance.
John Mao, Director of Business Development at Stratoscale
While public cloud adoption has been on the rise, most companies are still bound by strict requirements around security, privacy, and data sovereignty. For those companies, the idea of owning a private cloud is appealing. The reason is simple: private clouds provide benefits similar to public clouds, such as self-service, automation, orchestration, and "one throat to choke". But how do mainstream IT organizations reap these same benefits? Come join us as we explore how new software-defined paradigms help transform your dreams of a private cloud into a reality.
Jabez Tan, Senior Analyst, Data Centres, Structure Research
What are the top data centre colocation trends for 2016? How have past predictions played out so far? Singapore and Hong Kong have stood out as the top two data centre markets in the Asia Pacific region. We take a quantitative deep dive into data centre supply and revenue generation for each market, and how much revenue is being generated from colocation services.
Best practices for achieving an efficient data center
With today’s pressures on lowering our carbon footprint and cost constraints within organizations, IT departments are increasingly in the front line to formulate and enact an IT strategy that greatly improves energy efficiency and the overall performance of data centers.
This channel will cover the strategic issues on ‘going green’ as well as practical tips and techniques for busy IT professionals to manage their data centers. Channel discussion topics will include:
- Data center efficiency, monitoring and infrastructure management
- Data center design, facilities management and convergence
- Cooling technologies and thermal management
- And much more