
Three Steps to Modern Media Asset Management with Active Archive

Digital media assets are extremely valuable, giving organizations powerful tools to reach and engage audiences. But keeping this growing collection of large files manageable over time can challenge even the most seasoned IT professionals.

In this webinar, we'll look at the workflow in place at large global broadcasters, then discuss the best practices identified while building a modern media asset management system with active archive support. Operations directors, program managers, systems architects, and media technology professionals will learn:

- How the right tools can bring order to "big data" assets, with implementation methodologies that save media professionals valuable time
- How private cloud archive reduces cost while improving accessibility and security
- How to create a high performance active archive that masks network latency between media asset management software and cloud archives

This presentation will arm you with a complete toolkit for evaluating a comprehensive, modern solution for content creators' "big data" digital media files.
Recorded Dec 3 2015 62 mins
Presented by
Dave Clack - CatDV, Nancy Bennis - Cleversafe, Bernie Behn - Avere Systems

  • Accelerating NGS Workloads with Scientific Cloud Computing Mar 29 2017 6:00 pm UTC 45 mins
    Scott Jeschonek, Director of Cloud Products, Avere Systems
    Next-generation sequencing (NGS) technology has brought an influx of large-scale data and ever-growing demands for compute, resulting in a challenge for those in science informatics. To overcome these issues, laboratories need to be able to support researchers while managing data and the infrastructure needed to deliver mission-critical results and ultimately, positive patient outcomes.

    At the NGS Data Analysis and Informatics Conference in February, attendees spoke about these challenges at length. This webinar, built from content shared at the event, will review potential approaches to cost-effective data growth using cloud compute and storage. Moving genomics workloads to the cloud can provide scale and affordability while optimizing investments in existing resources. Enabling researchers to collaborate easily while reducing latency to these resources is also key to usability.

    In this webinar, you’ll learn how to:
    - Increase computation potential
    - Optimize your physical infrastructure
    - Diagram common use cases to meet specific research goals
    - Increase flexibility by limiting data gravity
  • Solving Enterprise Business Challenges Through Scale-Out Storage & Big Compute Recorded: Mar 15 2017 59 mins
    Michael Basilyan, Google Cloud Platform; Scott Jeschonek, Avere Systems; Rob Futrick, Cycle Computing
    Google Cloud Platform, Avere Systems, and Cycle Computing experts will share best practices for advancing solutions to big challenges faced by enterprises with growing compute and storage needs. In this “best practices” webinar, you’ll hear how these companies are working to improve results that drive businesses forward through scalability, performance, and ease of management.

    In this webinar, you will learn:
    - How enterprises are using Google Cloud Platform to gain big compute and storage capacity on demand
    - Best practices for efficient use of cloud compute and storage resources
    - How to overcome the need for file systems within a hybrid cloud environment
    - How to eliminate latency between cloud and data center architectures
    - How to manage simulation, analytics, and big data workloads in dynamic environments
    - Which market dynamics are drawing companies to new storage models over the next several years

    In just 60 minutes, you’ll gain a foundation for building infrastructure that supports ongoing demand growth, with ample opportunity to ask the presenters direct questions.
  • Cloud Bursting 101: What to Do When Computing Demand Exceeds Capacity Recorded: Feb 16 2017 47 mins
    Bernie Behn, Principal Engineer & Damian Hasak, Solutions Architect Lead
    Deploying applications locally and bursting them to the cloud for compute may seem difficult, especially when working with high-performance, critical information. However, using cloud bursting to offset peaks in demand can bring big benefits and kudos from organizational leaders always looking to do more with less.

    After this short webinar, you’ll be ready to:

    - Explain what cloud bursting is and what workloads it is best for
    - Identify efficiencies in applying cloud bursting to high-performance applications
    - Understand how cloud computing services access your data and consume it during burst cycles
    - Share three real-world use cases of companies leveraging cloud bursting for measurable efficiencies
    You’ll also see a demonstration of cloud bursting in action.

    Presenters will build an actionable framework in just thirty minutes and then take questions.
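The webinar above describes cloud bursting conceptually: run workloads locally until demand exceeds capacity, then overflow to cloud compute. As a minimal illustration only (not Avere's implementation; `LOCAL_CAPACITY` and `dispatch` are hypothetical names), a burst scheduler can be sketched as filling local slots first and sending the remainder to the cloud:

```python
# Toy sketch of a cloud-bursting dispatch decision.
# All names here are illustrative, not part of any vendor API.

LOCAL_CAPACITY = 4  # concurrent jobs the on-premises cluster can run


def dispatch(jobs, local_capacity=LOCAL_CAPACITY):
    """Split a batch of jobs between local nodes and cloud burst capacity.

    Local slots are filled first; the overflow "bursts" to cloud instances.
    Returns (local_jobs, cloud_jobs).
    """
    local = jobs[:local_capacity]
    cloud = jobs[local_capacity:]
    return local, cloud


if __name__ == "__main__":
    jobs = [f"render-{i}" for i in range(7)]
    local, cloud = dispatch(jobs)
    print(f"run locally:   {local}")
    print(f"burst to cloud: {cloud}")
```

When demand stays within local capacity, the cloud list is simply empty and no burst occurs, which is why the approach only pays for cloud resources during peaks.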
  • Cloud Computing & Digital Media: Scale Peak Demands with Cloud Access Solutions Recorded: Nov 30 2016 60 mins
    Damien Bataille (Eight VFX), Philippe Chotard (Eight VFX), Aaron Wetherold (Avere), Jeff Kember (Google)
    In a cluttered digital world, advertising commercials need to captivate and stimulate to deliver results for brands. To do this, visual effects techniques are bringing jaw-dropping experiences to 60-second spots on everything from the biggest screens to the smallest. With fast-changing production pipelines and unpredictable rendering workloads, Eight VFX shopped for alternatives to rental nodes to meet peak demands. After considering a number of caching options, bandwidth optimization, and even in-house scripting, they designed a one-click cloud access solution that led to measurable, compelling value.

    In this webinar, you’ll hear how Eight VFX produces honored, high-end commercials for some of the world’s biggest brands using modern cloud computing resources with instant access to unlimited cores.
  • Sizing Up Object Storage for the Enterprise Recorded: Nov 29 2016 61 mins
    George Crump (Storage Switzerland), Jeff Tabor (Avere Systems)
    Object storage promises many things: unlimited scalability in both capacity and file count, low-cost but highly redundant capacity, and excellent connectivity to legacy NAS (network-attached storage). But despite these promises, object storage has not caught on in the enterprise like it has in the cloud. It seems that, for the enterprise, object storage just isn’t a good fit. The problem is that the starting capacity of most object storage systems is too large. And while connectivity to legacy NAS systems is available, seamless integration is not. Can object storage be sized so that it is a better fit for the enterprise?
  • When to Enable Cloud Bursting Recorded: Nov 7 2016 3 mins
    Aaron Wetherold, Solutions Architect, Avere Systems
    Learn what cloud bursting is and when you should use it to run your applications in the cloud using existing NAS protocols. Improve app performance with cloud compute resources while keeping data on-premises.
  • What is a Tiered Cloud File System? Recorded: Nov 7 2016 4 mins
    Jeff Tabor, Dir Product Management and Product Marketing, Avere Systems
    Get the answer to this question, and learn how tiered file systems can be used in the cloud to increase storage performance.
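A tiered file system increases storage performance by keeping recently used data on a fast "hot" tier and demoting cold data to cheaper, slower cloud storage. The following is a minimal sketch of that idea only, assuming a simple LRU eviction policy; the class and tier names are hypothetical and not Avere's actual design:

```python
from collections import OrderedDict

# Illustrative two-tier store: a small hot tier caches recently used
# files and evicts least-recently-used entries to a cold (cloud) tier.
class TieredStore:
    def __init__(self, hot_capacity=2):
        self.hot = OrderedDict()   # fast tier: most recently used files
        self.cold = {}             # slow tier: everything evicted
        self.hot_capacity = hot_capacity

    def write(self, name, data):
        self.hot[name] = data
        self.hot.move_to_end(name)     # mark as most recently used
        self._evict()

    def read(self, name):
        if name in self.hot:           # hot hit: served from fast tier
            self.hot.move_to_end(name)
            return self.hot[name]
        data = self.cold.pop(name)     # cold miss: fetch and promote
        self.write(name, data)
        return data

    def _evict(self):
        while len(self.hot) > self.hot_capacity:
            name, data = self.hot.popitem(last=False)  # LRU entry out
            self.cold[name] = data                     # demote to cloud
```

Because the working set usually fits in the hot tier, most reads never touch the slow tier, which is how a caching layer masks cloud latency.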
  • Hybrid Cloud Outcomes: Transforming Research IT to Support Precision Medicine Recorded: Oct 13 2016 63 mins
    Aaron Black (Inova Translational Medicine Institute), Jeff Tabor (Avere), Mark Johnston (AWS)
    The Inova Translational Medicine Institute (ITMI) applies genomic and clinical information from individuals to develop personalized healthcare for patients. It is a division of Inova Center for Personalized Health (ICPH), which connects researchers, clinicians and consumers to integrate genomic research for patient care, prevention and wellness.

    ITMI’s Informatics team saw the potential risk of rapid data set growth causing slowed research and increased demand for compute and storage resources. In this webinar, Aaron Black, Director of Informatics for ITMI, will share his innovative hybrid cloud solution that uses a high performance caching layer to couple existing on-premises compute and storage with cost-efficient cloud-based resources.

    Jeff Tabor, Senior Director of Product Management and Marketing for Avere Systems, and Mark Johnston, Director of Business Development for Amazon Web Services, also discuss how ITMI supports large genomic and clinical data sets within their companies.

    Learning Objectives:

    - Discover the objectives of ITMI clinical research programs and how they collaborate with other organizations for studies and to leverage predictive analytics that can guide patient care
    - Learn how ITMI developed and executed a hybrid cloud infrastructure to manage massive research data growth and improve scalability to optimize cost effectiveness and deliver on its mission
    - Discover how ITMI has eliminated redundancies of storing petabyte-scale research data between cloud and on-premises storage while managing access to that data for research compute using both cloud and owned resources
    - Gain insight into technologies and services essential to the topology design and delivery with discussion among supporting panelists
  • HPC Trends in the Trenches Update Recorded: Sep 8 2016 63 mins
    Chris Dagdigian, BioTeam
    Chris Dagdigian, co-founder and CTO of BioTeam, Inc., delivers a candid assessment of the best, the worthwhile, and the most overhyped information technologies (IT) for life sciences. He’ll cover what has changed (or not) in the last year regarding infrastructure, storage, computing, and networks. This presentation, designed for those responsible for information management and support in research institutes, will help you understand which technologies will help you build and support data-intensive science.

    This webinar is sponsored by Avere Systems and is being presented in cooperation with AIRI (airi.org).

    We'll also highlight exciting content to be unveiled at the 2nd Annual Converged IT Summit coming to San Diego in October. You won't want to miss it!
  • Efficiency & Scale in Middle Office Financial Services Recorded: Aug 24 2016 59 mins
    William Fearnley, IDC Financial Insights & Scott Jeschonek, Avere Systems
    Within the financial services industry, middle office analytics and simulations continue to grow in volume and complexity. Massive compute and storage demands cause strain on IT resources. While new technologies promise speed and scalability, evaluating this unique middle office environment requires a look at compliance, risk, and pricing analytics to determine potential gains and losses. In this webinar, IDC – Financial Insights Research Director, Bill Fearnley, looks at current middle office IT workflows supporting analytics, backtesting and financial modeling and evaluates a hybrid cloud infrastructure to support growing demands.

    In this webinar, you’ll:

    - Hear an IDC analyst’s view on the current financial services IT environment
    - Learn of common challenges and approaches to combat growing strain on compute and storage infrastructure
    - Join in a discussion about the viability of enabling cloud services to expand compute and storage capacity
    - Gain guidance on how large hedge funds and investment banks are overcoming inherent cloud challenges like latency, data accessibility, and cost management
  • Cloud Power: Data Access and Orchestration Recorded: Aug 23 2016 29 mins
    Mike Requa (Cycle Computing), Scott Jeschonek (Avere)
    As cloud computing becomes more widely used, managing workloads grows more complex, sometimes requiring more than one cloud service provider. Instead of trying to manage all of these different providers separately, cloud orchestration offers a more time- and cost-effective solution. By keeping data management under control with workflow automation, organizations can instead focus on larger objectives.

    This demo video provides an overview of an easy cloud orchestration solution, integrating Avere's high-performance vFXT Edge filers into the CycleCloud orchestration technology offered by Cycle Computing.
  • Transparent Footprints: Optimizing HPC Workloads with Colocated Infrastructure Recorded: Aug 3 2016 55 mins
    Rui Gomes (RVX), Leigh Huntridge (Avere), and Jorge Balcells (Verne Global)
    Facing build or lease options for their rendering farm and storage, RVX, a growing special effects studio in Iceland, needed to factor high-performance demand and environmental impact into their cost analysis. As they weighed their options, a plan formed with the help of two providers.

    Rui Gomes, chief technology officer at RVX, had challenging projects ahead that demanded seamless access to storage resources to render films like 'Everest'. RVX's needs were quickly outpacing the capacity of its in-house data centre, and moving to a cloud service was not an option because the content needed to remain in a controlled environment. Gomes faced the decision to grow what he owned or look at colocation options that could handle his high performance computing (HPC) needs for complex rendering workflows. In the end, he was able to design a solution that checked all of the boxes: scalable, accessible, and fast, with the bonus of an environmentally friendly footprint. Next steps: deliver powerful, exciting virtual reality (VR) experiences using the same infrastructure.

    In this webinar, Gomes and his selected partners walk through his evaluation process, talk about outcomes, and discuss new opportunities. You’ll learn:

    - How Gomes compared options, prioritized objectives, and evaluated costs
    - About new opportunities in virtual reality using the same infrastructure
    - How distance of the co-located infrastructure became a non-issue even with high performance demands
    - Important factors in choosing a colocation partner when considering calculated cost benefit and enterprise environmental impact
  • Modernizing the Government Data Center Recorded: Jun 23 2016 50 mins
    Scott Jeschonek, Avere Systems
    Join Avere Systems and Carahsoft for a complimentary webcast to learn how to modernize government data centers to gain performance for users, access both compute and storage capacity in the cloud, and protect existing infrastructure investments and IT resources.

    Hear how flexibility is key to extending resources and meeting the needs of constituents and partners to deliver government services quickly and thoroughly. The right tools can be implemented quickly and easily without steep learning curves or additional human resources.

    In this webcast, you’ll learn:
    - Why flexibility is important to the modern federal data center
    - How to get immediate performance gains from your existing infrastructure
    - How to be ready to add cloud compute and storage resources to meet growing demands
    - How to protect existing resources and minimize resource strain while gaining modern flexibility
  • Cloud Computing for Batch Operations in Financial Services Recorded: Jun 9 2016 5 mins
    Scott Jeschonek, Avere Systems
    Discover how to best use cloud bursting for batch operations. This video also covers topics like: data gravity, when to use cloud bursting, and using cloud cache in batch processing workflows.
  • Using Cloud Computing for Faster Discovery Recorded: Jun 1 2016 47 mins
    Mike Nalls (NIA), Jonathan Bingham (Google), Greg Mazzu (Avere), Doug Sainato (Onix)
    How the NIA took Parkinson's Research to the Cloud

    Join life science and genomic industry leaders -- the Statistical Genetics Group Lead for the National Institute on Aging and the co-founder of Google Genomics -- and learn how researchers use the cloud to securely store, process, explore and share biological datasets. Presenters will describe the process of building a workflow to support a recent study that processed nearly 200 TB of data for 6,500 exomes in just 3.5 weeks, compared to months on local infrastructure.

    During this webinar you will:

    - Hear firsthand how the National Institute on Aging used Google Genomics to aggregate and process local datasets gathered from constituents across the globe to support Parkinson’s research, and created a high-quality dataset that is securely accessible and will power a number of future studies into biological underpinnings of the disease.

    - Learn how to leverage the cloud to gather global research datasets, overcome compute availability resource limitations, and maintain strict data access controls for large scale projects.

    - Learn how Google Genomics is helping scientists in cancer genomics, autism, and large patient cohort analyses

    - Be introduced to cloud bursting with secure datasets and learn how to gain performance and flexibility in workflows accessing GATK or Galaxy clusters

    - Understand the value an authorized Google partner can bring in terms of facilitating project onboarding, setup, procurement, billing, reporting and support.
  • Accelerating Financial Simulations and Risk Models Recorded: May 18 2016 7 mins
    Scott Jeschonek, Avere Systems
    Learn how you can eliminate bottlenecks when running financial models, risk analyses, and portfolio balancing applications. Improve your storage performance and increase your analysis potential with the power of a caching layer.
  • Active Archive: Build Accessible, Responsive Archives with Cloud Scale Economics Recorded: May 4 2016 62 mins
    Brian Bashaw, Technical Lead - HGST & Jeff Tabor, Sr. Director - Avere Systems
    Creating a digital archive that is both accessible and cost effective may seem like an impossible task. While public and private clouds may offer cost-effective scalable storage that is perfect for protecting assets, planning for responsive accessibility, and flexibility can be challenging. To evaluate these hybrid storage models, you must understand object storage and options for file access.

    In this webinar, you’ll discover:
    • The fundamentals of cloud archives, including terminology and core technology
    • Why hybrid clouds are a smart option for petabyte-scale archives, and how to describe the overall use and economic value to others
    • What digital data is appropriate for cloud archiving
    • How to transition from a legacy storage environment to a public/private hybrid cloud model
    • How to enable the active archive to monetize digital assets

    For these valuable archives, moving to cloud storage is a big decision, but one that can come with big rewards, like cost efficiency, scalability, and accessibility. These industry experts will provide education, use case examples, and most importantly, answer your questions.
  • Deliver Best-in-Class HPC Cloud Solutions Without Losing Your Mind Recorded: Apr 13 2016 61 mins
    Rick Friedman, VP, Cycle Computing & Scott Jeschonek, Director, Avere Systems
    While cloud computing offers virtually unlimited capacity, harnessing that capacity in an efficient, cost effective fashion can be cumbersome and difficult at the workload level. At the organizational level, it can quickly become chaos.

    You must make choices around cloud deployment, and these choices could have a long-lasting impact on your organization. It is important to understand your options and avoid incomplete, complicated, locked-in scenarios. Data management and placement challenges make having the ability to automate workflows and processes across multiple clouds a requirement.

    In this webinar, you will:

    • Learn how to leverage cloud services as part of an overall computation approach
    • Understand data management in a cloud-based world
    • Hear what options you have to orchestrate HPC in the cloud
    • Learn how cloud orchestration works to automate and align computing with specific goals and objectives
    • See an example of an orchestrated HPC workload using on-premises data

    From computational research to financial back testing, and research simulations to IoT processing frameworks, decisions made now will not only impact future manageability, but also your sanity.
  • 4 Ways to Improve NetApp Storage Performance Without Replacing It Recorded: Mar 17 2016 59 mins
    George Crump, Storage Switzerland
    Join Storage Switzerland Lead Analyst George Crump and Avere Systems Director Chris Bowen for a live webinar on March 17th at 1:00pm ET, "4 Ways to Improve NetApp Storage Performance Without Replacing It". In this webinar, George and Chris will discuss why NAS storage performance is so critical, how to balance storage performance and storage capacity, and four ways to improve storage performance without replacing your existing NAS system.
  • Building a Just-in-Time Application Stack for Analysts Recorded: Feb 17 2016 49 mins
    Scott Jeschonek
    People in analytical roles are demanding more and more compute and storage to get their jobs done. Instead of building out infrastructure for a few employees or a department, systems engineers and IT managers can find value in creating a compute stack in the cloud to meet the fluctuating demand of their clients.

    In this 45-minute webinar, you’ll learn:

    - How to identify the right analytical workloads
    - How to create a scalable compute environment using the cloud for analysts in under 10 minutes
    - How to best manage costs associated with the cloud compute stack
    - How to create dedicated client stacks with their own scratch space as well as general access to reference data

    Health systems departments, research & development departments, and business analyst groups all face silos of these challenging, compute-intensive use cases. By learning how to quickly build this flexible workflow that can be scaled up and down (or off) instantly, you can support business objectives while efficiently managing costs.
Storage Flexibility for Demanding Enterprise Architectures
As cloud storage options move into every data center, understanding how to keep options open and easily move data between providers, both on premises and in the cloud, is important to demanding enterprise environments.

  • Title: Three Steps to Modern Media Asset Management with Active Archive
  • Live at: Dec 3 2015 4:00 pm
  • Presented by: Dave Clack - CatDV, Nancy Bennis - Cleversafe, Bernie Behn - Avere Systems