The Importance of Very Fast, Highly Scalable Storage for Today’s HPC
Today, data drives discovery. And discoveries are key to creating sustained advantages. The better your critical workflows can create and access data, the better you'll be able to discover new, innovative solutions to important problems, or to create entirely new products. More than ever before, data-intensive applications need the sustained performance and virtually unlimited scalability that only parallel storage software delivers.
Designed for maximum performance and scale, storage solutions powered by Lustre software deliver the performance at scale to meet today’s storage requirements. As the most widely used parallel storage system for HPC, Lustre-powered storage is the ideal storage foundation.
But scalable performance storage by itself only solves half the problem. Today’s users expect storage solutions that deliver sustained performance, scale upward to near limitless capacities, and are simple to install and manage. Intel® Enterprise Edition for Lustre* software combines the straight-line speed and scale of Lustre with the bottom-line need for lowered management complexity and cost.
As the recognized leader in the development and support of the Lustre file system, Intel has the expertise to make storage solutions for data-intensive applications faster, smarter, and easier.
In the traditional world of EDW, ETL pipelines are a troublesome bottleneck when preparing data for use in the data warehouse. ETL pipelines are notoriously expensive and brittle, so as companies move to Hadoop they look forward to getting rid of the ETL infrastructure.
But is it that simple? Some companies are finding that in order to move data between clusters for backup or aggregation purposes, they are building systems that look an awful lot like ETL.
Join us for a live webinar featuring Dr. Jim Metzler, Distinguished Research Fellow at Ashton Metzler & Associates, as he discusses the future of the WAN and why now is the time to rethink the architecture in order to evolve/accelerate your business.
Why take the traditional approach to branch office expansion by relying on providers that can’t meet the urgency of your business needs? A new era has come where complexity, cost, flexibility and time-to-market are no longer a hurdle. Welcome to the world of the Software-Defined WAN.
Join this webinar featuring Jim Metzler, industry specialist to discover:
• Why it’s time to re-architect your WAN
• How today’s WAN infrastructure is failing IT
• New market dynamics that the traditional WAN cannot support
• A step-by-step approach to evolving your WAN with minimal disruption
For the Data Scientist, Data Science is complex; for the average business user it is a mystical art form that promises a lot, but often under delivers against expectation. For many established companies the result of this has been a lack of investment in an area that is, for others, quickly becoming an area of competitive advantage.
Helping the business understand the value of Big Data and Analytics, whilst also helping translate their business requirements and expectations, is a critical foundational step of the Data Analytics Lifecycle that can lead to greater investment from the business and greater profit for the organization. By way of customer examples, this presentation discusses the importance of engaging the business early and the importance of being able to tell an engaging story about the ‘Art of the Possible’.
New video! Manage your vSphere environment with a powerful solution that leverages the cloud, always-on analytics, and data science to optimize visibility, insight and control. Five minutes to activate, and you can increase utilization, boost performance, diagnose, troubleshoot and fix issues—even proactively—ten times faster. Eliminate capacity shortfalls and drill down to root causes, hazards and more.
Join us for a live Big Data Analytics customer case study webcast featuring Dana Gardner, a leading IT industry analyst at Interarbor Solutions, as he interviews Procera Networks executive, Cam Cullen.
Learn how Procera Networks dealt with massive data volume challenges to provide network performance benefits to its global users, powered by HPE Vertica. HPE Vertica is the industry’s first comprehensive, scalable, open, and secure platform for Big Data Analytics.
Financial advisors espouse that proper asset allocation during times of market volatility can help us sleep better. There is a parallel in the IT world. With volatility driven by technology advancement, virtual & cloud environments, and consumer demand for the newest applications and hardware, a good night’s sleep for an Asset Manager requires properly managed and optimally allocated hardware and software assets in a constantly changing environment.
This session explores this intimidating world, common pitfalls, prescriptive actions and what the latest technology can do to make sure your assets, licenses and infrastructure are optimally aligned to drive wealth in IT…and let the Asset Manager sleep well without the fear of negative audit findings and exorbitant fines.
A movement is underway. Businesses are awakening to a new era of the digital enterprise, requiring companies to find new ways of delivering their services built for the digital era. Success in this new era requires a digital industrialization strategy in which datacenters become the core asset of the business enabling a transformation from an infrastructure that is tightly coupled with the business to a modern infrastructure that enables any business.
Digital industrialization is a continuous cycle that organizations can use to turn IT infrastructure from a cost into an asset by standardizing on one set of technologies and economics across facilities, hardware, software, and operations; consolidating datacenters; abstracting functionality; automating operations and governing it all to ensure security, integrity, and compliance.
This presentation will go through why digital industrialization is needed, what the benefits are and how the Ericsson Cloud portfolio facilitates it.
Would you like to cut complexity across all phases of app development and deployment?
Join us for this straightforward discussion on how CA Application Lifecycle Conductor reduces risk through a single source of truth. CA Application Lifecycle Conductor automates and manages the software development lifecycles that span mobile-to-mainframe environments — from the initial service desk ticket to the deployment of the application in production.
Join Rose Sakach, Sr. Principal Product Manager, and Vaughn Marshall, Director, Product Management as they outline CA Application Lifecycle Conductor’s many benefits. Discover how you can:
• Create one view and traceability for the application development lifecycle
• Identify the potential time savings for project managers, release managers and compliance managers
• Determine which customer segments would benefit the most from adopting CA ALC
Are you ready to simplify application lifecycle management—from mobile to mainframe?
With the average company experiencing unplanned downtime 13 times a year, the costs associated with continuing to invest in a legacy backup solution can be extensive. For this reason, more customers are switching to Veeam® and Quantum than ever before. Update to a modern data center and achieve Availability for the Always-On Enterprise™ with Veeam coupled with Quantum’s tiered storage, which increases performance, reduces bandwidth requirements, and executes best practices for data protection.
After a record-setting year in 2015, where will the tech M&A market go in 2016? What trends that pushed M&A spending to its highest level since the Internet Bubble burst will continue to drive deals and which ones will wind down? What other sectors are likely to see the most activity this year? And most importantly, what valuations will be handed out in deals over the coming year? Drawing on data and views from across 451 Research, the Tech M&A Outlook webinar maps many of the major developments in the IT landscape (IoT, Big Data, cloud computing) to how those influence corporate acquisition strategies. Join us for a look ahead to what we expect for tech M&A in 2016.
Fred Magnotta, Engineering Manager; Soila Kavulya, Research Scientist; Anahita Bhilwandiwalla, Software Engineer
Spark is a central tool within the Trusted Analytics Platform (TAP). Join this discussion with the engineering team behind integrating Spark capabilities into TAP. We’ll discuss how data scientists benefit as these two open-source projects inherit improvements, hardware acceleration, and contributions from each other. Our team will review upcoming features to this powerful workflow and engage in an open Q&A with attendees.
Fred Magnotta (Engineering Manager), Szymon Bultrowicz (Platform Engineer) and Chuck Freedman (Chief Developer Advocate)
Trusted Analytics Platform (TAP) engineering and support teams present an overview of the open-source project’s capabilities to deploy cloud-native applications on advanced analytics at scale. In this recurring session, the team will demonstrate TAP, give an overview of features, outline the project and documentation on GitHub, and hold a live Q&A with attendees. Learn more about TAP: http://trustedanalytics.org
Fred Smith, VP, Cloudian; Eric Kern, CTO EBG Professional Services, Lenovo; Christine McMonigal, ISV Ecosystem Manager, Intel
Intel, Lenovo, and Cloudian are revolutionizing leading-edge petabyte scale computing for enterprise and STaaS customers—now they do it together—with modern solutions that offer scale out architecture, hybrid cloud tiering, S3 compatibility, and multi-datacenter multi-tenancy features.
With scores of major wireless carriers now rolling out LTE service across the globe, they need ways to offer high-bandwidth services to their subscribers efficiently and effectively while fiercely guarding the security of their advanced 4G networks.
In this webinar, Artesyn will show how its MaxCore™ platform leverages the Advanced Encryption Standard (AES) cryptography instruction set of the Intel Xeon-D and the advanced switching and high speed interfaces of the Intel Ethernet Multi-host Controller FM10000 (formerly Red Rock Canyon) to provide a highly efficient and cost effective security gateway solution.
For example, firewall performance of up to 600Gbps can be achieved by combining four 100GE smart IO cards based on FM10000 with 22 Intel Xeon-Ds populating the balance of MaxCore™’s 15 PCIe slots. The result is a system with 15x the density of a comparable rack-mount server implementation. In addition, by using low-power multicore processors, the system can cut energy consumption by up to 70%.
Jason Tsai, Senior Product Manager, QCT; Jen-Yao Chung, Associate VP, QCT; Fred Smith, VP, Cloudian
The storage market is going through accelerated change to address the deluge of digital data and an ever-escalating business imperative to access and analyze this data in real time. Back-up, video streaming, IoT, archive, and big data analytics are just a few examples. QCT and its partners are leading the response to this critical need in the marketplace by designing industry standard, simple-to-manage and hyper-scalable storage infrastructure and solutions. QCT will introduce its newest storage server, the QuantaGrid SD1Q-1ULH powered by Intel Xeon D-1500 processor family, and show how it brings great efficiency and benefits to enterprises on Cloudian HyperStore and Ceph object storage solutions.
Jonathan Stern, Storage Applications Engineer at Intel Corporation
A brief introduction to the Intel® Intelligent Storage Acceleration Library (ISA-L), a freely licensed set of storage-domain algorithms implemented by a lean, mean team of bare-knuckle assembly coders. Leveraging in-depth knowledge of processor internals, the ISA-L team is able to see substantial improvements in performance of hashing, encryption, compression, CRC, RAID, and erasure coding efficiency over comparable open source alternatives.
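To make the erasure-coding piece concrete, here is a minimal sketch of the simplest case of the idea: single-parity (RAID-5-style) protection, where one parity block is the bytewise XOR of the data blocks. This is plain Python for illustration only; ISA-L itself is a C library that accelerates the general Reed-Solomon case with vectorized Galois-field arithmetic, and none of its actual API is shown here.

```python
from functools import reduce

def xor_parity(blocks):
    """Compute one parity block as the bytewise XOR of equal-length data
    blocks -- the degenerate single-parity case of the erasure codes that
    libraries like ISA-L accelerate in hand-tuned assembly."""
    return bytes(reduce(lambda a, b: a ^ b, col) for col in zip(*blocks))

def recover(surviving_blocks, parity):
    """Rebuild one lost data block: XOR the survivors with the parity."""
    return xor_parity(surviving_blocks + [parity])

data = [b"AAAA", b"BBBB", b"CCCC"]
parity = xor_parity(data)
# Lose the middle block, then reconstruct it from the rest plus parity.
assert recover([data[0], data[2]], parity) == b"BBBB"
```

The XOR trick works because each parity byte is A^B^C, so XORing the survivors back in cancels them and leaves the lost block; real erasure codes generalize this to tolerate multiple simultaneous failures.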
Jonathan Stern, Storage Applications Engineer at Intel Corporation
An introduction to the Storage Performance Development Kit (SPDK), an extension of the Data Plane Development Kit (DPDK) into a storage context. We cover the components offered within SPDK and have a brief discussion of real-world customer use cases.
Raghu Yeluri, Principal Engineer and Lead Security Solutions Architect, Intel Corporation
This webinar will explore Intel’s focus on container security, as well as Trusted Docker Containers and what Intel is doing to enable them. We will also discuss how to deploy Trusted VMs and Trusted Docker Containers transparently and seamlessly in OpenStack.
Rob Strechay, Director Software-defined Storage & Hyper-Converged, HP & Dave Cohen, Systems Architect Storage Division, Intel
As if you didn’t know, today’s data center is undergoing a revolution, in large part due to two new advances: software-defined storage and hyper-convergence. Combined, the two advances offer simplicity of deployment and greater flexibility, as well as new operational efficiencies, reducing the need for IT staff specialization and creating time for bigger projects. Learn the hidden secret behind an HP hyper-converged data center, and discover how these two technologies make it possible for you to set up a virtualized server environment that can handle the hyper-growth that today’s business world imposes. And that’s just a small slice of what you’ll learn in this webinar.
Mark Henderson, Storage Technology Marketing Engineer, Intel & Patrick Osborne, Sr. Dir. Product Management & Marketing, HP
Once the decision to transition from spinning disks is made, you’ll want to have a robust recovery system put in place that matches the speed, scale and efficiency that Flash gives you. In this webinar, you’ll see how you can achieve 17 times faster backup than in the past with 5 times faster restore at significantly reduced cost. But that’s not all. You’ll also discover more about E2E Data Protection, Flat Backup Snapshot Management, Remote Copy and Asynchronous Replication.
Rich Salz, President, OpenSSL Foundation w/ Brian Will, Software Architecture & Edward Pullin, Product Marketing Mgr., Intel
SSL/TLS is a technology widely used for creating secure communications channels. It is the workhorse for enabling Enterprise and Internet security. OpenSSL has become such an integral part of the mechanics of the way we communicate that we barely notice it secures our enterprises, clouds, sites, transactions, and network access. More than 35% of enterprise and Internet traffic uses SSL/TLS encryption, and this usage continues to grow at more than 20% per year. Some vendors supporting SSL/TLS have resorted to taking OpenSSL and customizing it, forking it, or simply contributing optimizations to these key open-source crypto frameworks.
In this webinar, Intel and the OpenSSL Foundation discuss how OpenSSL may be used to increase security performance for your Enterprise, Cloud, Hyper-Converged System, or CDN while improving your product efficiency with integrations of technologies such as Intel® QuickAssist Technology.
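As a small point of reference for how applications consume OpenSSL, the sketch below builds a client-side TLS context using Python's `ssl` module, which is a thin wrapper over the system's OpenSSL build. This only illustrates everyday TLS configuration; offload integrations such as Intel QuickAssist are wired in at the OpenSSL engine/provider level and are not shown.

```python
import ssl

# Build a client-side TLS context with secure defaults. Python's ssl module
# wraps the OpenSSL library discussed above, so the context below reflects
# the linked OpenSSL build's defaults.
ctx = ssl.create_default_context()
ctx.minimum_version = ssl.TLSVersion.TLSv1_2   # refuse legacy protocol versions

print(ssl.OPENSSL_VERSION)                     # which OpenSSL build is linked in
print(ctx.verify_mode == ssl.CERT_REQUIRED)    # default contexts verify the peer
```

Raising `minimum_version` is the usual first hardening step; certificate verification is already on by default in contexts created this way.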
Sandra Rivera, VP, Data Center Group & General Manager, Network Platforms Group at Intel
In this webinar, Sandra will discuss the newly announced Intel® Network Builders Fast Track. This is the company’s next step in working with the networking industry to accelerate innovation in the network ecosystem. Through a combination of market development activities and Intel Capital investments, Intel plans to help drive integration of solutions for deployment, accelerate interoperability across stack layers and across networks, and accelerate adoption of standards-based technologies using Intel architecture with trials and deployments with industry-leading service providers.
Sandra will explain what areas Intel will invest in to help drive integration of solutions for deployment, ensure interoperability across stack layers and across networks, and accelerate optimization of standards-based technologies using Intel architecture in areas such as NFV/SDN, 5G, and mobile edge computing, with the goal of creating a faster path to deployment. Sandra will also provide insight into the programs Intel may invest in, such as the Solution Blueprint Program to develop use cases, the Intel® Network Builders University to drive technical education for the ecosystem, and the creation of interoperability centers for compliance testing and performance tuning.
Curt Aubley, VP & CTO of Intel’s Data Center Group and General Manager of the Innovation, Pathfinding, & Architecture Group
Today's data centers face new and unexpected threats every day – from corporate espionage artists to terrorists to natural disasters. While data center security has always been a top priority, with the growing use of social media, mobile technologies, analytics and cloud computing, a resilient Data Center is more important than ever – and managing risk in these changing times creates new challenges and opportunities for success. In this audio webinar, Intel's Curt Aubley will explore how the data center security landscape is evolving, and discuss Intel’s holistic approach to defending the data center.
Vish Mulchand, Director, Product Management, HP Storage
Now that Flash storage is affordable, on par with spinning disk, IT departments everywhere can realize the amazing transformation All-Flash storage can provide. It’s definitely a game changer. Take a look at spinning disk vs. flash, trends, and the vision of an all-flash data center. Right here.
Rajkumar Jalan, CTO of A10 Networks, Sab Gosal, Network Security Segment Mgr, Intel & Matt Jonkman, President, OISF
Network functions virtualization (NFV) and cloud computing are the next generation of network-based computing, since they can deliver both software and hardware as on-demand resources. While this represents a transformation towards a much-needed flexible software-driven infrastructure, one of the significant concerns with this transformation is security.
In this web seminar, we discuss network security in a virtualized environment focusing on these key points:
• How to virtualize security applications, such as IPS/IDS, in a way that scales linearly while maintaining performance.
• Real-world use-case examples of security virtualization highlighting the challenges and corresponding solutions.
• A review of virtual machine architectures based on Intel® Architecture coupled with key enabling technologies, such as Hyperscan pattern matching, used to drive the scale and performance of industry-leading IPS/IDS use cases such as Suricata.
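The multi-pattern matching the last bullet refers to can be sketched in miniature: an IDS engine compiles many signatures and scans each payload against all of them at once. Hyperscan does this by compiling thousands of regexes into a single SIMD-scanned automaton; the toy below uses Python's `re` alternation to show the same idea conceptually, with made-up signatures, and none of Hyperscan's actual API.

```python
import re

# Toy multi-pattern scanner in the spirit of an IDS signature engine.
# The two signatures below are illustrative inventions, not real rules.
signatures = {
    "sql_injection": r"union\s+select",
    "path_traversal": r"\.\./\.\./",
}
# Combine all signatures into one pattern so a payload is scanned in a
# single pass -- conceptually what Hyperscan does, minus the acceleration.
scanner = re.compile(
    "|".join(f"(?P<{name}>{pat})" for name, pat in signatures.items()),
    re.IGNORECASE,
)

def scan(payload: str):
    """Return the names of all signatures that matched in the payload."""
    return [m.lastgroup for m in scanner.finditer(payload)]

print(scan("GET /a?q=1 UNION SELECT passwd"))   # -> ['sql_injection']
```

A production engine differs mainly in scale: it must sustain this scan at line rate across thousands of rules, which is where pattern-matching libraries like Hyperscan earn their keep.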
Data Plane Development Kit (DPDK) is a mature, community-driven software framework for accelerating packet-processing workloads using general-purpose processors. It can provide performance and flexibility benefits in both purpose-built and virtual form factors.
After introducing DPDK concepts, this webinar will take you through several Enterprise and Telecom Cloud case studies depicting DPDK deployments. It will also introduce you to the DPDK open source community and walk you through the DPDK Roadmap.
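The core DPDK idea worth previewing is the poll-mode, burst-oriented receive loop: rather than taking an interrupt per packet, a dedicated core busy-polls a NIC queue and drains it in bursts (via `rte_eth_rx_burst` in the real C API). The sketch below simulates that pattern with an in-memory queue; the queue and burst size are stand-ins, not DPDK calls.

```python
from collections import deque

# Conceptual sketch of a DPDK-style poll-mode receive loop: pull packets
# out of a queue in bursts, amortizing per-call overhead across up to
# BURST_SIZE packets instead of paying a per-packet interrupt cost.
BURST_SIZE = 32

def rx_burst(queue: deque, max_pkts: int = BURST_SIZE) -> list:
    """Dequeue up to max_pkts packets in one call."""
    burst = []
    while queue and len(burst) < max_pkts:
        burst.append(queue.popleft())
    return burst

rx_queue = deque(f"pkt{i}" for i in range(40))
first = rx_burst(rx_queue)    # a full burst of 32 packets
second = rx_burst(rx_queue)   # the remaining 8
print(len(first), len(second))   # -> 32 8
```

In a real DPDK application this loop runs continuously on a pinned core, trading CPU cycles for consistently low, interrupt-free packet latency.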
Intel Chief Data Scientist Bob Rogers discusses data analytics trends and issues in this panel discussion with data science industry leaders, including David Edwards, vice president and fellow at Cerner Corp.; Dr. Don Fraynd, CEO at TeacherMatch; and Dr. Andreas Weigend, Stanford professor and director of the Social Data Lab.
Parviz Peiravi, Principal Architect for Big Data, Intel; Anant Chintamaneni, VP of Products, BlueData
Big Data analysis is having an impact on every industry today. Industry leaders are capitalizing on these new business insights to drive competitive advantage. Apache Hadoop is the most common Big Data framework, but the technology is evolving rapidly – and one of the latest innovations is Apache Spark.
So what is Apache Spark and what real-world business problems will it help solve? Join Big Data experts from Intel and BlueData for an in-depth look at Apache Spark and learn:
- Real-world use cases and applications for Big Data analytics with Apache Spark
- How to leverage the power of Spark for iterative algorithms such as machine learning
- Deployment strategies for Spark, leveraging your on-premises data center infrastructure
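The second bullet, iterative algorithms, is worth a concrete picture: machine-learning loops like gradient descent re-read the same dataset on every iteration, which is exactly the access pattern Spark's in-memory caching rewards and disk-bound MapReduce-style engines penalize. The plain-Python sketch below shows the iterative shape only; the data and learning rate are illustrative, and no Spark API is used.

```python
# Gradient descent fitting y ~ w*x by repeatedly scanning the same dataset.
# In Spark, `data` would be a cached RDD/DataFrame so each of the 200 passes
# reads from memory instead of re-reading from disk.
data = [(1.0, 2.1), (2.0, 3.9), (3.0, 6.2)]   # (x, y) pairs, y roughly 2x

w, lr = 0.0, 0.05
for _ in range(200):                          # each pass scans the full dataset
    grad = sum(2 * (w * x - y) * x for x, y in data) / len(data)
    w -= lr * grad

print(round(w, 2))   # converges near 2.0
```

The takeaway is structural, not numerical: 200 cheap in-memory scans are fine, but 200 disk scans of a terabyte-scale dataset are not, which is why caching-friendly engines like Spark dominate iterative ML workloads.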
A rapid rate of change complicates every facet of datacenter management, and server-centric compute models are too cumbersome for today’s highly variable workloads. Is it possible to optimize resources and operations in such dynamic environments? In this presentation, learn how to replace manual, hardware-defined application provisioning and management with a highly automated, software-defined resource model and orchestration layer that enables flexibility, simplified on-demand capital efficiency, and lower TCO. Find out how to compose more agile pools of datacenter resources, and simultaneously drive up IT efficiency, optimize energy requirements, increase datacenter resilience, and strengthen disaster recovery plans.