Turn Your Data Center into Your Very Own Public Cloud

Companies are actively looking to transform their highly virtualized data centers into dynamic on-premises private clouds that offer the same benefits as the public cloud. Dedicated infrastructure offerings fail to address the growing demands and higher service levels of today’s applications: they lack the advanced capabilities, such as elastic scaling, flexibility, management automation, and usage-based tracking, needed to deliver IT as a Service (ITaaS) to tenants. A modern, dynamic data center requires a modern software-defined infrastructure, one that can be dynamically provisioned across a pool of compute, network, and storage resources. Such an infrastructure reduces or eliminates the planning and complexity associated with application sizing, provides far more flexibility, and allows for future-proofing and investment protection.
For companies weighing the public cloud against staying with or building out their own data center, three primary factors drive the decision: business needs, data security, and total cost of operations (CAPEX/OPEX).
Recorded: Dec 7 2016 56 mins
Presented by
Sanjay Jagad, Sr. Manager, Product Marketing, Cohodata; Suzy Visvanathan, Director, Product Management, Cohodata
Presentation preview: Turn Your Data Center into Your Very Own Public Cloud

  • WekaIO: 5 Reasons Why NAS Can’t Support Deep Learning Workloads Aug 14 2018 4:00 pm UTC 60 mins
    Barbara Murphy, VP of Marketing, WekaIO and David Hiatt, Director of Product Marketing, WekaIO
    Legacy storage systems, like NAS, were architected when spinning disk and slower networking technologies were the industry standard. In this webinar, we’ll present five reasons why NAS can’t keep pace with the I/O demands of new deep learning workloads. To support these workloads, the data processing layer has to have immediate access to, and a constant supply of, data. Here NAS falls short, because data gets bottlenecked between compute and storage. WekaIO Matrix™ is a next-generation shared, distributed file system that virtualizes SSDs into one logical pool of fast storage and presents a global namespace to host applications. Matrix was written from scratch to leverage the benefits of standard Intel x86 architecture combined with NVMe. The result is an easy-to-deploy, easy-to-manage storage architecture that is a radical departure from legacy NAS systems. Because Matrix is optimized for flash, the solution is ideal for deep learning and high-performance computing workloads.
  • Solving the Jevons Paradox of Performance with Lenovo and Intel Optane Jul 19 2018 4:00 pm UTC 60 mins
    Tony Harvey, Senior Product Manager, Lenovo; Vivek Sarathy, Enterprise Architect, Intel
    Jevons paradox states that when a resource is used more efficiently, consumption of that resource actually rises due to lower prices and/or increased availability. Nowhere in the data center is that more true than in server performance. Storage has always been a bottleneck in compute systems because of the vast performance gap between the storage subsystem and the CPU and memory. Together, Lenovo, Intel, and VMware address the storage performance bottleneck in hyperconverged systems by bringing Intel Optane technology to Lenovo vSAN ReadyNodes. Join us to hear how these new ReadyNodes from Lenovo will help you increase application performance and VM density while still improving end-user response times.
  • Red Hat OpenShift Container Platform Reference Architecture using Intel® Select Jul 17 2018 4:00 pm UTC 60 mins
    Dave Cain, Senior Architect, Red Hat; Seema Mehta, Enterprise Solutions Manager, Intel
    Deploying Intel® Xeon® Platinum or Gold processors with an Intel® Select Solution delivers verified solution performance ready to meet high quality thresholds for data protection and resiliency, while allowing more time and resources for innovative enhancements specific to workloads and use cases.

    The current solution, using Red Hat’s OpenShift Container Platform and Container-Native Storage, co-locates storage and applications and provides resilient, scale-out, enterprise-caliber storage to workloads on OpenShift, enabling rapid deployment of highly available stateful and stateless applications on bare-metal x86 systems.
  • Simplifying Virtual SAN Infrastructure Deployments Jun 28 2018 4:00 pm UTC 60 mins
    Sim Upadhyayula, Sr. Director, Solutions, Supermicro; Christine McMonigal, Director of Hyperconverged Marketing, Intel
    Supermicro and Intel are pleased to introduce two new Intel® Select Solutions for VMware vSAN from Supermicro. These vSAN ReadyNode-certified configurations combine the latest generation of Intel® Xeon® Scalable processors and Intel® Optane™ technology, addressing the IOPS and latency requirements of the most demanding enterprise applications. These hardware platforms have been optimized to support the latest hyperconverged technology from VMware, providing excellent performance and TCO to enterprise and SMB customers alike. Join us to learn how your organization can benefit from these innovative new solutions.
  • Accelerating Genomic Data Sharing in Clouds & On Premise Jun 27 2018 3:00 pm UTC 60 mins
    David Labrosse, NetApp Global Alliances Mgr.; Dan Greenfield, PetaGene, CEO
    Many of the answers locked inside human DNA are discoverable and actionable, and the goal of precision medicine is within reach. However, we need to face the looming tsunami of data: the latest sequencers generate two or more terabytes daily. This enormous data management challenge is impacting both biomedical researchers and IT leaders around the globe who are searching for efficient, effective solutions.
    Learn how data compression and unified data management are increasing efficiency, and how the move from IT silos to a common storage platform is enabling seamless data sharing, on premises and in the cloud. Find out how your institution can achieve faster genomic data sharing, paving an accelerated path to improved patient care.
  • Wiwynn: Manage Software Defined Composable DC w/ Intel® Rack Scale Design Recorded: Jun 5 2018 46 mins
    Ethan SL Yang, Deputy Manager, Wiwynn Corporation
    Wiwynn® Cluster Manager with Intel® Rack Scale Design is system software that makes data centers easier to manage, with features such as composable infrastructure resource planning, mass firmware and OS deployment, and real-time rack-level visual monitoring. For advanced SDDC capabilities, we will introduce the new telemetry feature of Intel® RSD 2.2, which provides extended visibility into CPU, memory, and I/O utilization without in-band agents and unlocks new possibilities for diagnostics, workload balancing, and power/cooling optimization. The NVMe over Fabrics feature in Intel® RSD 2.3, which enables NVMe resource pooling at scale over Ethernet, will also be demonstrated and is launching soon.
  • Efficiently Deploying Supermicro Disaggregated Storage w/Intel “Ruler” Recorded: May 31 2018 49 mins
    Mike Scriber, Sr. Director, Server Solution Mgmt, Supermicro and JonMichael Hands, Product Line Manager, Intel
    Join Intel and Supermicro as we delve into the innovative Intel “Ruler” form factor NVMe SSDs and discuss how both companies are delivering efficient disaggregated storage solutions to the market. In this technical webinar, you’ll learn all about the new Intel® SSD DC P4500 in the “Ruler” form factor, how it delivers value for data center customers in disaggregated storage use cases, and the roadmap of products being delivered in high volume today.
  • Iguazio Presents: Delivering In-Memory Database Performance on Intel Flash Recorded: May 23 2018 60 mins
    Ori Modai, VP R&D, Iguazio
    Using flash as a substitute for memory in databases can substantially increase density, reduce costs, and improve durability. However, most databases access flash through traditional storage and file system layers, utilizing only a fraction of flash’s potential speed. As a result, companies keep using in-memory databases with expensive memory and limited data capacity and resiliency. Ori Modai will explain how to build real-time intelligent apps that run across large data sets and reach in-memory database performance with Intel’s NVMe flash. He will describe the challenges posed by traditional database and storage architectures and present a new database design that maximizes the value of flash, reaching 20x lower costs and 100x higher density. The session will include a technical deep dive followed by real-world customer use cases in the fields of IoT edge and online AI applications.
  • Quanta-LIVE Webinar: From Silo to Hyper-converged Infrastructure Recorded: May 23 2018 43 mins
    Wendy Wu, Biz Dev Team Lead, QCT & Trista Huang, Biz Dev Team, QCT
    Enterprises around the world strive to stay ahead and want to restructure their data centers to support new ways of doing business. However, this transformation journey presents many challenges. In this webinar, Wendy Wu, business development team lead for VMware solutions at QCT (Quanta Cloud Technology), will explain the challenges you may face and how you can leverage QCT’s and Intel’s knowledge and efforts to address them and find the fastest path to your data center transformation.
  • HPE: Accelerate your SQL Server Performance with HPE and Intel Recorded: May 8 2018 50 mins
    Kannan Mani, HPE; Chris Tackett, Intel; Maurice De Vidts, HPE
    Business-critical application performance and cost control have typically been challenges for many customers. Intel® recognized these challenges and developed the Intel Select Solutions program to simplify solution component selection and give customers confidence that the result will be workload-optimized deployments. Hewlett Packard Enterprise has followed the Intel reference design to create tested solution configurations that meet Intel Select performance standards while providing predictable outcomes for Microsoft® SQL Server 2016 OLTP database workloads.
  • Taking Performance from Good to Great: Intel NVMe Flash & Pivot3 Advanced QoS Recorded: May 2 2018 58 mins
    George Wagner, Sr. Product Mgr, Pivot3; Keith Hageman, Sr. Tech Mktg Engineer, Pivot3
    Low-latency NVMe SSD performance eliminates storage bottlenecks for demanding data center applications such as databases, big data, virtual servers, virtual desktops, and business intelligence. Pairing this performance with Quality of Service (QoS) enables enterprises to run multiple, mixed application workloads, including latency- and queue-depth-sensitive applications, on hyperconverged infrastructure (HCI). Join the webinar to see how Intel NVMe flash and Pivot3 advanced QoS enable enterprises to better achieve strategic business objectives and realize savings that go straight to the bottom line.
  • Excelero: Designing Elastic Storage Architectures Leveraging Distributed NVMe Recorded: Apr 24 2018 47 mins
    Josh Goldenhar, Vice President Products, Excelero & Yaniv Romem, CTO, Excelero
    Tech giants have redefined infrastructure design for web-scale applications, leveraging standard servers and shared-nothing architectures to ensure maximum operational efficiency and flexibility. Enterprises and service providers are seeking to optimize their infrastructures in the same way. For storage, this means deploying unified scale-out storage infrastructures built on Software-Defined Storage (SDS). SDS, integrated with Software-Defined Networking, virtualization, and newer architectures such as containers and their orchestration platforms, completes the Software-Defined Data Center.

    In this webinar you will learn how to build elastic storage architectures with client-side services, leveraging the latest generations of Intel® NVMe flash and software-defined storage, with no need for proprietary arrays. Your network becomes your storage!
  • SAP HANA Runs Better on NetApp Recorded: Apr 19 2018 55 mins
    Nils Bauer, SAP Competence Center Manager, NetApp & Bernd Herth, Senior Technical Marketing Engineer, NetApp
    The NetApp solution for SAP HANA delivers validated configurations that help you complete SAP projects in less time, with fewer resources, and with lower delivery risk. Join us for this webinar and learn:
    • Rapid backup and recovery with no impact on production performance, plus time savings of more than 95%, mean your HANA environment is fully available to power your business
    • Efficient system copies mean no more working nights to copy data and no impact on production, so your developers get the tools they need to accelerate SAP projects
    • Cloud integration with NetApp means you can move workloads to the Cloud and back and be prepared for future deployment models
    • Disaster recovery becomes simplified, plus you can use your disaster recovery site for testing purposes
    • The SnapCenter plug-in for HANA enables admins to efficiently manage data protection requirements, by enabling self-service through an enterprise protection framework
  • Stratoscale: Reduce Operational Complexity-Deliver Database Services On-Prem Recorded: Apr 17 2018 54 mins
    Omer Kushmaro, Product Manager, Stratoscale
    Managing databases involves not just deployment, but also complex operational aspects such as storage, backup, failure detection, scaling, configuration management and upgrades. In this webinar we’ll discuss how you can:
    ● Simplify the entire database operations lifecycle and easily set up, operate, and scale databases
    ● Support users by offering an easy-to-use and self-service cloud-based consumption model in any enterprise environment, while maintaining full control over infrastructure and policies
  • Assessment to Action: Creating a Cloud Roadmap Recorded: Apr 4 2018 59 mins
    David LaBrosse, Healthcare Strategic Partner Manager, NetApp
    Today’s cloud choices can be confusing. Explore new and emerging models for connecting to, running on or even building your own cloud.

    Learn the benefits and limitations of each, and hear how healthcare organizations are using different options to support their strategic goals. Leave the webinar with concrete tools for assessing your needs, vetting cloud providers, identifying cloud-ready workloads and implementing a plan to take your organization to the next step.
  • Supermicro Turbo Charges Solutions with NVMe Over Fabrics & Intel® Recorded: Mar 22 2018 62 mins
    Ray Pang, Director of Technical Marketing and Solutions, Supermicro and Scott Garee, Product Manager Supermicro RSD
    Supermicro’s innovative Rack Scale Design (Supermicro RSD) solutions bring the superior performance of NVMe-based network storage to your applications. Supermicro RSD solutions simplify the complexity of scaled NVMe over Fabrics (NVMe-oF) deployments, encapsulating them in a service-based model for NVMe target, client, and fabric management (for context, a minimal host-side connection sketch follows these listings).
  • Integrated Data Protection with NetApp and Commvault Recorded: Mar 21 2018 46 mins
    Glenn Miller, Sr. Product Manager, NetApp; Charlie Patterson, Alliance Architect, Commvault
    In January this year we introduced the value proposition NetApp and Commvault bring to the Next Generation Data Center. In this webinar we will drill down a little further, highlighting the tight integration between NetApp ONTAP and Commvault IntelliSnap for NetApp and what it means for customers. We will show how we’ve combined the benefits of a traditional backup product with high-speed snapshot technology, all within a simple management interface, to eliminate complexity and accelerate protection and recovery.
  • Architecting a Complete Data Infrastructure for AI and Deep Learning Recorded: Mar 14 2018 60 mins
    Santosh Rao, Senior Technical Director, NetApp
    Enterprises are eager to take advantage of artificial intelligence technologies such as deep learning to introduce new services and enhance insights from company data. As data science teams move past proof of concept and begin to operationalize deep learning, it becomes necessary to focus on the creation of a complete data architecture that eliminates bottlenecks to facilitate faster model iteration.

    Designing a data architecture involves thinking holistically about the data pipeline, from data ingest and edge analytics, to data prep and training in the core data center, to archiving in the cloud. It is critical to understand performance requirements and data services needed, but one should also consider future extensibility and supportability as deep learning hardware and cloud approaches evolve over time.

    This session will examine all the factors involved in architecting a data pipeline for deep learning, focusing on data management and the hybrid cloud. Careful infrastructure planning can smooth the flow of data through your deep learning pipeline, shorten time to deployment, and thus maximize competitive differentiation.
  • NetApp for SAP – A powerful Partnership for Your Digital Transformation Journey Recorded: Mar 6 2018 57 mins
    Mike McNamara, Sr. Manager, Product Marketing, NetApp; Roland Wartenberg, Sr. Director, Global Alliances, NetApp
    “Data is the new oil,” said SAP CEO Bill McDermott at SAPPHIRE 2017. As the market leader in enterprise application software, SAP is at the center of today’s business and technology revolution. NetApp, an SAP Global Technology Partner for more than 15 years, is the data authority in the hybrid cloud.

    NetApp and SAP work together to deliver uninterrupted real-time data access, multitenancy, and accelerated project ramp-up to propel our customers’ business forward. A Data Fabric powered by NetApp technology optimizes the value of SAP data and helps companies worldwide access enterprise applications and data where they fit best—across on-premises, hosted, and public-cloud environments.

    But the world is changing. Digital transformation driven by the internet of things, machine learning, and artificial intelligence will change the way businesses are run, today and tomorrow.

    Do you want to learn how NetApp helps customers transform their SAP environments to take advantage of hybrid cloud capabilities? Do you want to learn how NetApp’s solutions and products can help you accelerate projects, automate processes, and save time on your journey to the cloud? Do you want to learn how NetApp can help you create a comprehensive end-to-end data management strategy?

    In this webcast we will not only answer these questions but also show how customers have taken advantage of today’s NetApp SAP software solutions to make their businesses ready for tomorrow’s challenges in the world of digital transformation.
  • Modernize Data Protection with NetApp ONTAP and Veeam Recorded: Feb 7 2018 57 mins
    Glenn Miller, Sr. Product Manager Data Protect, NetApp; Colm Keegan, Sr. Alliance Product Marketing Manager, Veeam Software
    According to ESG, nearly 70% of senior IT Decision Makers believe that their IT environment is more complex than it was two years ago.* In this webinar, learn how to modernize your data protection by deploying NetApp ONTAP storage with Veeam Availability solutions to simplify IT, reduce costs, mitigate risk and increase business agility.

    *Source: ESG Master Survey Results, 2018 IT Spending Intentions Survey, December 2017
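
For context on the NVMe over Fabrics management model described in the Supermicro RSD webinar above, here is a minimal host-side sketch of discovering and connecting to an NVMe-oF target with the standard Linux nvme-cli tool, driven from Python. It assumes a Linux initiator with nvme-cli installed, root privileges, and an RDMA- or TCP-capable fabric; the target address, port, and subsystem NQN are hypothetical placeholders, and this is not the Supermicro RSD API.

    # Host-side NVMe-oF sketch (assumes Linux, nvme-cli installed, root privileges).
    # The address, port, and subsystem NQN below are hypothetical placeholders.
    import subprocess

    TARGET_ADDR = "192.0.2.10"                # placeholder fabric target IP
    TARGET_PORT = "4420"                      # conventional NVMe-oF service port
    TARGET_NQN = "nqn.2018-06.example:pool0"  # placeholder subsystem NQN
    TRANSPORT = "rdma"                        # or "tcp" where the kernel supports it

    def run(args):
        """Run an nvme-cli command and return its stdout."""
        result = subprocess.run(args, check=True, capture_output=True, text=True)
        return result.stdout

    # Discover subsystems exported by the target over the chosen transport.
    print(run(["nvme", "discover", "-t", TRANSPORT, "-a", TARGET_ADDR, "-s", TARGET_PORT]))

    # Connect to one subsystem; it then appears to the host as a local /dev/nvmeXnY device.
    run(["nvme", "connect", "-t", TRANSPORT, "-n", TARGET_NQN, "-a", TARGET_ADDR, "-s", TARGET_PORT])

    # List the NVMe devices now visible to the host.
    print(run(["nvme", "list"]))

    # Tear down the connection when finished.
    run(["nvme", "disconnect", "-n", TARGET_NQN])

In a managed, rack-scale deployment such as the one described in that webinar, steps like these are typically issued by the management and composition software rather than run by hand on each host.
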
Weekly webcasts targeted at an enterprise audience
This channel provides technical information on SDI solutions from members of the Intel Builders programs.

The Intel Builders programs bring together a broad ecosystem of solution providers to drive data center innovation through accelerated development and deployment of tools and documentation for cloud, storage, and network solutions. Leveraging Intel architecture, the ecosystem works to optimize solutions targeted at customer requirements. Members develop reference architectures, solution briefs, and deployment guides to help accelerate deployment of these solutions.

Learn more at https://builders.intel.com.
