Michael Basilyan, Google Cloud Platform; Scott Jeschonek, Avere Systems; Jason Stowe, Cycle Computing
Google Cloud Platform, Avere Systems, and Cycle Computing experts will share best practices for advancing solutions to big challenges faced by enterprises with growing compute and storage needs. In this “best practices” webinar, you’ll hear how these companies are working to improve results that drive businesses forward through scalability, performance, and ease of management.
In this webinar, you will learn:
- How enterprises are using Google Cloud Platform to gain compute and storage capacity on-demand
- Best practices for efficient use of cloud compute and storage resources
- How to overcome the need for file systems within a hybrid cloud environment
- How to eliminate latency between cloud and data center architectures
- How to best manage simulation, analytics, and big data workloads in dynamic environments
- Market dynamics drawing companies to new storage models over the next several years
In just 60 minutes, you’ll gain a foundation for building infrastructure that supports ongoing demand growth, with ample opportunity to ask the presenters direct questions.
Damien Bataille (Eight VFX), Philippe Chotard (Eight VFX), Aaron Wetherold (Avere), Jeff Kember (Google)
In a cluttered digital world, advertising commercials need to captivate and stimulate to deliver results for brands. To do this, visual effects techniques are bringing jaw-dropping experiences to 60-second spots on screens from the biggest to the smallest. With fast-changing production pipelines and unpredictable rendering workloads, Eight VFX shopped for alternatives to rental nodes to meet peak demands. After considering a number of caching options, bandwidth optimization, and even in-house scripting, they designed a one-click cloud access solution that led to measurable, compelling value.
In this webinar, you’ll hear how Eight VFX produces acclaimed, high-end commercials for some of the world’s biggest brands using modern cloud computing resources with instant access to unlimited cores.
George Crump (Storage Switzerland), Jeff Tabor (Avere Systems)
Object storage promises many things: unlimited scalability in both capacity and file count, low-cost but highly redundant capacity, and excellent connectivity to legacy NAS (network-attached storage). But despite these promises, object storage has not caught on in the enterprise the way it has in the cloud. It seems that, for the enterprise, object storage just isn’t a good fit. The problem is that most object storage systems’ starting capacity is too large. And while connectivity to legacy NAS systems is available, seamless integration is not. Can object storage be sized so that it is a better fit for the enterprise?
Aaron Wetherold, Solutions Architect, Avere Systems
Learn what cloud bursting is and when you should use it to run your applications in the cloud using existing NAS protocols. Improve app performance with cloud compute resources while keeping data on-premises.
Aaron Black (Inova Translational Medicine Institute), Jeff Tabor (Avere), Mark Johnston (AWS)
The Inova Translational Medicine Institute (ITMI) applies genomic and clinical information from individuals to develop personalized healthcare for patients. It is a division of Inova Center for Personalized Health (ICPH), which connects researchers, clinicians and consumers to integrate genomic research for patient care, prevention and wellness.
ITMI’s Informatics team saw the potential risk of rapid data set growth causing slowed research and increased demand for compute and storage resources. In this webinar, Aaron Black, Director of Informatics for ITMI, will share his innovative hybrid cloud solution that uses a high performance caching layer to couple existing on-premises compute and storage with cost-efficient cloud-based resources.
Jeff Tabor, Senior Director of Product Management and Marketing for Avere Systems, and Mark Johnston, Director of Business Development for Amazon Web Services, also discuss how ITMI supports large genomic and clinical data sets within their companies.
- Discover the objectives of ITMI’s clinical research programs and how it collaborates with other organizations on studies and leverages predictive analytics that can guide patient care
- Learn how ITMI developed and executed a hybrid cloud infrastructure to manage massive research data growth and improve scalability to optimize cost effectiveness and deliver on its mission
- Discover how ITMI has eliminated the redundancy of storing petabyte-scale research data in both cloud and on-premises storage while managing access to that data for research compute using both cloud and owned resources
- Gain insight into technologies and services essential to the topology design and delivery with discussion among supporting panelists
Chris Dagdigian, co-founder and CTO of BioTeam, Inc., delivers a candid assessment of the best, the worthwhile, and the most overhyped information technologies (IT) for life sciences. He’ll cover what has changed (or not) in the last year regarding infrastructure, storage, computing, and networks. This presentation, designed for those responsible for information management and support in research institutes, will help you understand which technologies will help you build and support data-intensive science.
This webinar is sponsored by Avere Systems and is being presented in cooperation with AIRI (airi.org).
We'll also highlight exciting content to be unveiled at the 2nd Annual Converged IT Summit coming to San Diego in October. You won't want to miss it!
William Fearnley, IDC Financial Insights & Scott Jeschonek, Avere Systems
Within the financial services industry, middle office analytics and simulations continue to grow in volume and complexity. Massive compute and storage demands strain IT resources. While new technologies promise speed and scalability, evaluating this unique middle office environment requires a look at compliance, risk, and pricing analytics to determine potential gains and losses. In this webinar, IDC Financial Insights Research Director Bill Fearnley looks at current middle office IT workflows supporting analytics, backtesting, and financial modeling, and evaluates a hybrid cloud infrastructure to support growing demands.
In this webinar, you’ll:
· Hear an IDC Analyst’s view on the current financial services IT environment
· Learn about common challenges and approaches to combat growing strain on compute and storage infrastructure
· Join in a discussion about the viability of enabling cloud services to expand compute and storage capacity
· Gain guidance on how large hedge funds and investment banks are overcoming inherent cloud challenges like latency, data accessibility, and cost management
Mike Requa (Cycle Computing), Scott Jeschonek (Avere)
As cloud computing becomes more widely used, managing workloads becomes more complex. Sometimes this requires using more than one cloud service provider. Instead of trying to manage all of these different providers individually, cloud orchestration offers a more time- and cost-effective solution. By keeping data management under control with workflow automation, organizations can instead focus on larger objectives.
This demo video provides an overview of an easy cloud orchestration solution, integrating Avere's high-performance vFXT Edge filers into the CycleCloud orchestration technology offered by Cycle Computing.
Facing build or lease options for their rendering farm and storage, RVX, a growing special effects studio in Iceland, needed to factor high-performance demand and environmental impact into their cost analysis. As they weighed their options, a plan formed with the help of two providers.
Rui Gomes, chief technology officer at RVX, had challenging projects ahead that demanded seamless access to storage resources to render films like 'Everest'. The studio's needs were quickly outpacing the capacity of its in-house data centre, and moving to a cloud service was not an option because the content needed to remain in a controlled environment. He faced the decision to grow what he owned or look at colocation options that could handle his high performance computing (HPC) needs for complex rendering workflows. In the end, he was able to design a solution that checked all of the boxes — scalable, accessible and fast, with the bonus of an environmentally friendly footprint. Next steps: deliver powerful, exciting virtual reality (VR) experiences using the same infrastructure.
In this webinar, Gomes and his selected partners walk through his evaluation process, talk about outcomes, and discuss new opportunities. You’ll learn:
- How Gomes compared options, prioritized objectives, and evaluated costs
- About new opportunities in virtual reality using the same infrastructure
- How distance of the co-located infrastructure became a non-issue even with high performance demands
- Important factors in choosing a colocation partner when considering calculated cost benefit and enterprise environmental impact
Join Avere Systems and Carahsoft for a complimentary webcast to learn how to modernize government data centers to gain performance for users, access both compute and storage capacity in the cloud, and protect existing infrastructure investments and IT resources.
Hear how flexibility is key to extending resources and meeting the needs of constituents and partners to deliver government services quickly and thoroughly. The right tools can be implemented quickly and easily without steep learning curves or additional human resources.
In this webcast, you’ll learn:
- Why flexibility is important to the modern federal data center
- How to get immediate performance gains from your existing infrastructure
- How to be ready to add cloud compute and storage resources to meet growing demands
- How to protect existing resources and minimize resource strain while gaining modern flexibility
Mike Nalls (NIA), Jonathan Bingham (Google), Greg Mazzu (Avere), Doug Sainato (Onix)
How the NIA took Parkinson's Research to the Cloud
Join life science and genomic industry leaders -- the Statistical Genetics Group Lead for the National Institute on Aging and the co-founder of Google Genomics -- and learn how researchers use the cloud to securely store, process, explore and share biological datasets. Presenters will describe the process of building a workflow to support a recent study that processed nearly 200 TB of data for 6,500 exomes in just 3.5 weeks, compared to months on local infrastructure.
During this webinar you will:
- Hear firsthand how the National Institute on Aging used Google Genomics to aggregate and process local datasets gathered from constituents across the globe to support Parkinson’s research, creating a high-quality dataset that is securely accessible and will power a number of future studies into the biological underpinnings of the disease
- Learn how to leverage the cloud to gather global research datasets, overcome compute availability resource limitations, and maintain strict data access controls for large scale projects.
- Learn how Google Genomics is helping scientists in cancer genomics, autism, and large patient cohort analyses
- Be introduced to cloud bursting with secure datasets and learn how to gain performance and flexibility in workflows accessing GATK or Galaxy clusters
- Understand the value an authorized Google partner can bring in facilitating project onboarding, setup, procurement, billing, reporting, and support
Learn how you can eliminate bottlenecks when running financial models, risk analyses, and portfolio balancing applications. Improve your storage performance and increase your analysis potential with the power of a caching layer.
Brian Bashaw, Technical Lead - HGST & Jeff Tabor, Sr. Director - Avere Systems
Creating a digital archive that is both accessible and cost effective may seem like an impossible task. While public and private clouds may offer cost-effective, scalable storage that is perfect for protecting assets, planning for responsive accessibility and flexibility can be challenging. To evaluate these hybrid storage models, you must understand object storage and options for file access.
In this webinar, you’ll discover:
• The fundamentals of cloud archives, including terminology and core technology
• Why hybrid clouds are a smart option for petabyte-scale archives, and how to describe their overall use and economic value to others
• What digital data is appropriate for cloud archiving
• How to transition from a legacy storage environment to a public/private hybrid cloud model
• How to enable the active archive to monetize digital assets
For these valuable archives, moving to cloud storage is a big decision, but one that can come with big rewards, like cost efficiency, scalability, and accessibility. These industry experts will provide education, use case examples, and most importantly, answer your questions.
Rick Friedman, VP, Cycle Computing & Scott Jeschonek, Director, Avere Systems
While cloud computing offers virtually unlimited capacity, harnessing that capacity in an efficient, cost effective fashion can be cumbersome and difficult at the workload level. At the organizational level, it can quickly become chaos.
You must make choices around cloud deployment, and these choices could have a long-lasting impact on your organization. It is important to understand your options and avoid incomplete, complicated, locked-in scenarios. Data management and placement challenges make the ability to automate workflows and processes across multiple clouds a requirement.
In this webinar, you will:
• Learn how to leverage cloud services as part of an overall computation approach
• Understand data management in a cloud-based world
• Hear what options you have to orchestrate HPC in the cloud
• Learn how cloud orchestration works to automate and align computing with specific goals and objectives
• See an example of an orchestrated HPC workload using on-premises data
From computational research to financial back testing, and research simulations to IoT processing frameworks, decisions made now will not only impact future manageability, but also your sanity.
Join Storage Switzerland Lead Analyst George Crump and Avere Systems Director Chris Bowen for a live webinar on March 17th at 1:00pm ET, "4 Ways to Improve NetApp Storage Performance Without Replacing It". In this webinar, George and Chris will discuss why NAS storage performance is so critical, how to balance storage performance and storage capacity, and four ways to improve storage performance without replacing your existing NAS system.
People in analytical roles are demanding more and more compute and storage to get their jobs done. Instead of building out infrastructure for a few employees or a department, systems engineers and IT managers can find value in creating a compute stack in the cloud to meet the fluctuating demand of their clients.
In this 45-minute webinar, you’ll learn:
- How to identify the right analytical workloads
- How to create a scalable compute environment using the cloud for analysts in under 10 minutes
- How to best manage costs associated with the cloud compute stack
- How to create dedicated client stacks with their own scratch space as well as general access to reference data
Health systems departments, research & development departments, and business analyst groups all face these challenging, compute-intensive use cases in silos. By learning how to quickly build a flexible workflow that can be scaled up and down (or off) instantly, you can support business objectives while efficiently managing costs.
Philip Bourne (National Institutes of Health), Wilfred Justin (Amazon Web Services), Jeff Tabor (Avere Systems)
Agencies continue to expand how they use big data technologies to achieve their missions. As big data demands continue to grow, the cloud is quickly becoming the platform of choice, offering security, efficiency, and lower costs for supporting mission-critical applications.
In this webinar, Philip Bourne from the National Institutes of Health (NIH) shows how the agency is improving and expanding its big data initiatives by using the cloud. He discusses the NIH's approach to incorporating cloud into its infrastructure, including best practices and new strategies for leveraging the power of the cloud. He also looks at how the cloud supports the FAIR data principles by making it easier to interact with and access valuable data.
Sara Hebert, Brennan Chapman, Aaron Wetherold, Jeff Kember
Moonbot Shoots for the Cloud to Meet Deadlines and Manage Costs
Facing deadlines for Academy Award submissions, Moonbot Studios was short on rendering capacity while working on Taking Flight, its newest animated short film, and other important projects. As a small studio with a matching budget, the team did what it does best—it got creative and solved the problem with what they first called “magic.”
In this webinar, the Moonbot team will tell its tale of moving its rendering to Google Compute Engine and how it defied networking odds by caching data close to the animators with an Avere vFXT. Hear Moonbot’s pipeline supervisor explain how they turned cloud data center distance into a non-issue, met deadlines, and gained quantitative benefits that sparked energy in this small team of creative aviators.
In this session, you will learn:
• What drove Moonbot Studios to move to the cloud
• How they moved complex renders to Google Compute Engine, overcoming data access roadblocks
• Measurable results, including speed, economics, flexibility, and creative freedom
The Moonbot Studios flight to the cloud will be supported by Google Cloud Platform and Avere Systems for a complete overview of how the technologies help bring new ideas to life.
Storage Flexibility for Demanding Enterprise Architectures
As cloud storage options move into every data center, understanding how to keep options open and easily move data between providers, both on premises and in the cloud, is important to demanding enterprise environments.