
Meeting Demand for Capacity-driven Data with Object Storage

This 1-hour webinar discusses how organizations can meet the demand for capacity-driven data with object storage.

Today’s enterprise data requirements are clearly dividing into a need for latency-sensitive and capacity-driven solutions, as organizations store and exploit data from existing and increasingly machine-generated sources. This webinar looks at how enterprises meet the demand for capacity-driven data with object storage solutions from the major and upcoming solution vendors. During the webinar you will learn:

• Factors driving the adoption of object storage
• Critical features to look out for in object storage solutions
• Analysis of vendor offerings available in the market today
• Gigaom’s assessment of the market leaders and followers

Join Gigaom Research and Hitachi Data Systems (HDS) for this free expert webinar.
Recorded May 15 2019 65 mins
Presented by
Chris Evans, Scott Baker
Presentation preview: Meeting Demand for Capacity-driven Data with Object Storage

  • Perfect Partners: DevOps and Hybrid Cloud Jul 11 2019 3:00 pm UTC 62 mins
    Jon Collins, Chris Merz, Ingo Fuchs
    This free 1-hour webinar from GigaOm Research brings together experts in DevOps and Hybrid Cloud, featuring GigaOm analyst Jon Collins and special guests from NetApp: Ingo Fuchs, Chief Technologist, Cloud and DevOps, and Chris Merz, Principal Technologist, DevOps. The discussion will focus on delivering DevOps and the challenges faced when adopting this culture.

    Not only do organizations find it challenging to scale DevOps practices across the business, it’s also hard to keep pace with the fast, real-time needs of application development. As companies face the reality of the pace required for software development, they’re simultaneously challenged to balance quality and security, governance and management, complexity and collaboration, and they require an agile storage strategy to allow them to pivot as needs change.

    Should such organizations just go back to waterfall models and single-server, three-layer architectures? Of course not. The right approach allows them to pick and choose the right infrastructure, whether it be in the cloud, or on-premises, to deliver the right solutions for their business needs - all without affecting their developer experience. In this webinar, we talk to NetApp, itself a large enterprise using both DevOps and hybrid multi-cloud strategy, about lessons learned from helping customers towards this goal.

    In this 1-hour webinar, attendees will discover:

    • The challenges faced by organizations looking to adopt DevOps
    • What this means in terms of symptoms, costs, efficiency and effectiveness impacts
    • Best practices for supporting software development
    • What tools, technologies, and processes enable DevOps to exist, no matter your storage strategy
    • How to choose the right infrastructure for your strategy
  • What’s Best for Analytics & Data Science: Cloud, On-Premises or Hybrid? Jun 26 2019 4:00 pm UTC 61 mins
    Andrew Brust, Mathias Golombek, Helena Schwenk
    The cloud will inevitably be a component of your customers’ and prospects’ data strategies, not to mention your own. But how does this impact analytics and data science? That question is especially important since the journey to the cloud has many stops, and most companies won’t move all their data to the cloud immediately.

    It’s a heterogeneous, hybrid world out there: some data must stay on-premises, while some data is born in the cloud and should stay there. Other data allows discretion around migration and can be left in place. But when it comes to both analytics and data science, the work should encompass all data. Yet, how is that achieved with a single platform, when data isn’t centralized?

    The good news is solutions and architectures exist to achieve this mission. To learn more about applying analytics and harnessing data science across all data in the cloud, on-premises or otherwise, plan to attend this free 1-hour webinar from GigaOm Research. The webinar features GigaOm analyst Andrew Brust and special guests Mathias Golombek, CTO, and Helena Schwenk, AR and Market Insights Manager, from Exasol, a leader in in-memory analytic databases.

    In this 1-hour webinar, we will explore:

    - Why the cloud together with analytics and data science are such a good match
    - What hybrid cloud is, what its practical implications are, and how to make it work
    - Considerations for cloud analytics performance and scale
    - Using the cloud for AI and data science
  • Building a Sustainable Cloud Strategy in Multi-Cloud Environments Jun 26 2019 2:00 pm UTC 57 mins
    David Linthicum, Enrico Signoretti, Jim Donovan, Nathan Goulding
    Data is now created and consumed from more sources than ever, and IoT and edge computing only accelerate the trend. The challenge is no longer how to make cloud computing central to the IT strategy, but how to avoid lock-in and keep costs at bay while giving access to an increasing number of applications, devices and users — all dispersed across the globe on different clouds and networks.

    With the great migration to the cloud fully underway, organizations are moving from cloud-first strategies to multi-cloud, and they seek solutions that allow them to access and process data quickly at reasonable cost, no matter where data is created or consumed.

    The goals are to build a tailored cloud made of best-of-breed “Cloud 2.0” solutions while removing any form of lock-in, and to outperform what is currently available from established market vendors in terms of both performance and cost. But how can you build a new cloud model such as this one? A host of disruptive Cloud 2.0 companies provide alternative cloud strategies built on best-of-breed solutions, including Wasabi and Packet, our featured guests in this webinar.

    This one-hour GigaOm webinar is moderated by GigaOm analyst David Linthicum and co-presented with Enrico Signoretti and special guests Jim Donovan, SVP of Product at Wasabi, and Nathan Goulding, SVP of Engineering at Packet. We’ll discuss how to overcome the limits and lock-ins imposed by traditional approaches, and how to lay the next-gen infrastructure for today’s and tomorrow’s applications and data. We will look specifically at cloud strategies and how to execute them.
  • DevOps 2020: Strategies to Increase DevOps Performance Across the Enterprise Jun 25 2019 6:00 pm UTC 59 mins
    Jon Collins, Baruch Sadogursky
    This free 1-hour webinar from GigaOm Research brings together experts in DevOps, featuring GigaOm analyst Jon Collins and a special guest from JFrog, Baruch Sadogursky, Head of Developer Relations, discussing how to set a pragmatic strategy for scaling DevOps that enables your organization to increase efficiency and productivity, without hampering delivery. Explore how to bring under management the assets — code and binaries — used in the development pipeline, building a competency that delivers and distributes right to the edge.

    If you’ve embraced DevOps as a way of delivering better software, faster and cheaper into deployment and management, you may be looking to move from a single success story to broader use, or perhaps you are ready to take its principles to a whole new area of development and operations, such as sensor-based IoT. In each case, the challenge has become clear: how can the enterprise align different development groups without becoming prescriptive or stifling innovation?

    In this 1-hour webinar, you will discover:

    - What challenges enterprises face as they look to scale their use of DevOps across the business and out to the edge
    - What stages to work through to ensure that decision-makers balance the needs of efficiency and productivity with standardization and governance
    - Where tools and technologies help at each stage, and how roles, responsibilities, and processes evolve along the way
    - Where you can start delivering higher levels of efficiency without impacting productivity or creating new overheads

    So, if you are looking to broaden your use of DevOps and want to know where to start, or if you are already on the journey and dealing with the challenges, attend this webinar and bring your questions with you.
  • Leveraging Desktop-as-a-Service Platforms to Simplify Multicloud Jun 25 2019 4:00 pm UTC 58 mins
    David Linthicum, Carsten Puls, Simon Gibson
    Led by GigaOm Analyst David Linthicum, and co-presented with Simon Gibson and special guest, Carsten Puls, Senior Director, Frame at Nutanix, this free 1-hour GigaOm Research webinar focuses on the value of desktop-as-a-service platforms to simplify multicloud and remove complexities. As the world of IT, including cloud, becomes more complex, private and public clouds are increasingly expected to replace traditional computing and eventually make things easier. However, for most enterprises, the cloud is now an additive to existing traditional systems, and that adds complexity and reduces business value.

    Interfaces, the ways that users interact with the infrastructure, are key to removing this complexity. In this approach, user interfaces are virtualized, in short, abstracted away from the complexity of the underlying environment, including clouds. The result is a world where IT is simpler, and thus more productive for the business. So where to start? Begin by addressing complexity head-on. In this 1-hour webinar, you will discover:

    • The concept of desktop-as-a-service including how it works, what it is, and where to get it.
    • How things have become complex, and how to measure complexity for your enterprise.
    • How to understand the business value of dealing with complexity.
    • Steps toward mapping out a course of action.
  • Automating AI in the Data-Driven Enterprise Webinar Jun 25 2019 2:00 pm UTC 60 mins
    Andrew Brust, Kurt Muehmel
    Artificial intelligence (AI) and machine learning (ML) are exciting, growing in adoption and more applicable to business contexts than ever before. The problem is, a lot of AI/ML work to date has relied upon rare and expensive data scientists. These practitioners take data sets, then experiment with different frameworks, algorithms and parameter values to create the best predictive models possible. This bespoke approach can be fascinating, but it just won’t scale sufficiently to bring AI and ML into the Enterprise mainstream.

    Why perpetuate an AI process that is so manual when heuristics – and even ML itself – can help automate data cleansing, algorithm selection and parameter tuning? Yes, AI automation can help organizations without data scientists do AI and ML. But beyond that, even orgs with strong data science teams can use AI automation to remove tedium from, and increase accuracy in, their work. It’s win-win, even if it removes some of AI’s mystique.
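    The automation described here can be sketched with off-the-shelf tooling. As an illustrative example (not from the webinar; the dataset, models and parameter values are placeholders), scikit-learn's GridSearchCV can automate both algorithm selection and parameter tuning in a single search:

```python
# Illustrative sketch: automating algorithm selection and parameter
# tuning with scikit-learn. The dataset and candidate models are
# placeholders, not anything specific to the webinar.
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import GridSearchCV
from sklearn.pipeline import Pipeline
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)

# A one-step pipeline whose "model" step is itself a search dimension,
# so the grid search chooses between algorithms as well as parameters.
pipeline = Pipeline([("model", LogisticRegression(max_iter=1000))])
param_grid = [
    {"model": [LogisticRegression(max_iter=1000)],
     "model__C": [0.1, 1.0, 10.0]},
    {"model": [DecisionTreeClassifier()],
     "model__max_depth": [2, 4, 8]},
]

# Cross-validated search over every algorithm/parameter combination.
search = GridSearchCV(pipeline, param_grid, cv=5)
search.fit(X, y)

best_model = search.best_estimator_.named_steps["model"]
print(type(best_model).__name__, round(search.best_score_, 3))
```

    Full AutoML platforms go further (automated data cleansing, feature engineering, ensembling), but the core idea is the same: replace manual trial-and-error with a systematic, cross-validated search.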

    Want to learn more about automating AI, and making it actionable and accessible to all your data workers? Join us for this free 1-hour webinar from GigaOm Research. The webinar features GigaOm analyst Andrew Brust with Kurt Muehmel, VP of Sales Engineering at Dataiku, a leader in AI for the Enterprise.

    In this 1-hour webinar, you will discover:

    • Why bespoke AI work is fast becoming unsustainable
    • The science and power behind automated machine learning (AutoML)
    • How to leverage AI automation while combating bias, drift and other AI/ML pitfalls
    Register now to join GigaOm Research and Dataiku for this free expert webinar.
  • Five Secrets to On-Prem Kubernetes for Real World Enterprise IT Needs Recorded: Jun 21 2019 56 mins
    David Linthicum, John Mao
    This free 1-hour webinar from GigaOm Research brings together experts in Kubernetes on-prem ops success, featuring GigaOm Analyst David Linthicum and special guest John Mao, VP Business Development from Stratoscale.

    You don’t have to read all of the analyst surveys to understand that Kubernetes usage is accelerating in 2019. Indeed, while the growth in the cloud is exceptional, the growth in leveraging Kubernetes on premises is just as impressive.

    However, many enterprises with a need to deploy Kubernetes in their data center are left out in the cold. Tools that focus on Kubernetes on-prem don’t provide the value IT is expecting, and thus their Kubernetes on-prem ops are at risk of failing.

    Enter new tools and approaches that can ensure success. Moreover, enter new guidance that allows leaders to approach the problem armed with the right knowledge.

    Indeed, five core secrets exist for approaching on-prem ops: deployment, monitoring, security, upgrading, and of course scaling.

    In this 1-hour webinar, we will explore:

    • Why approaches to Kubernetes success are changing.
    • Emerging best practices and technology.
    • Five secrets of Kubernetes container orchestration that most enterprises and cloud providers need for success
    • Must-know keys to enabling the technology successfully
  • On-Ramps to Cloud Roadways: Cloud Data Migration at Scale Recorded: Jun 20 2019 62 mins
    Andrew Brust, Jagane Sundar
    Cloud adoption is in full swing, no longer dominated by hype and just a few early adopters. In the world of data analytics, migrating data from on-premises distributed storage systems to cloud object storage is the mission. Once the data has landed there, multiple data services can query and analyze it. The payoff can be huge, but the devil’s in the details.

    While the problem of migrating applications to the cloud has largely been solved, migrating data is much less straightforward. Today’s enterprise data volumes can be quite large, resulting in long-running data migration processes. Enterprises can’t just hit pause on their businesses, though. So how can they maintain operational continuity while conducting lengthy migrations?

    To learn more about cloud data migration challenges and solutions, join us for this free 1-hour webinar from GigaOm Research. The webinar features GigaOm analyst Andrew Brust and special guest Jagane Sundar from WANdisco, a leader in cloud data movement and hybrid data lake solutions.

    In this 1-hour webinar, you will discover:

    • How to migrate Hadoop clusters at scale
    • The realities of cloud data migration management
    • The multi-data-engine benefits of cloud data lakes
    Register now to join GigaOm and WANdisco for this free expert webinar.

    Who Should Attend:

    CTOs
    Chief Data Officers
    Cloud Architects
    Data Engineers
    Database Administrators (DBAs)
  • Perfect Partners: DevOps and Hybrid Cloud Recorded: Jun 20 2019 62 mins
    Jon Collins, Chris Merz, Ingo Fuchs
    This free 1-hour webinar from GigaOm Research brings together experts in DevOps and Hybrid Cloud, featuring GigaOm analyst Jon Collins and special guests from NetApp: Ingo Fuchs, Chief Technologist, Cloud and DevOps, and Chris Merz, Principal Technologist, DevOps. The discussion will focus on delivering DevOps and the challenges faced when adopting this culture.

    Not only do organizations find it challenging to scale DevOps practices across the business, it’s also hard to keep pace with the fast, real-time needs of application development. As companies face the reality of the pace required for software development, they’re simultaneously challenged to balance quality and security, governance and management, complexity and collaboration, and they require an agile storage strategy to allow them to pivot as needs change.

    Should such organizations just go back to waterfall models and single-server, three-layer architectures? Of course not. The right approach allows them to pick and choose the right infrastructure, whether it be in the cloud, or on-premises, to deliver the right solutions for their business needs - all without affecting their developer experience. In this webinar, we talk to NetApp, itself a large enterprise using both DevOps and hybrid multi-cloud strategy, about lessons learned from helping customers towards this goal.

    In this 1-hour webinar, attendees will discover:

    • The challenges faced by organizations looking to adopt DevOps
    • What this means in terms of symptoms, costs, efficiency and effectiveness impacts
    • Best practices for supporting software development
    • What tools, technologies, and processes enable DevOps to exist, no matter your storage strategy
    • How to choose the right infrastructure for your strategy
  • Strategies for the Data Warehouse in the Enterprise Today Recorded: Jun 20 2019 61 mins
    William McKnight, Paige Roberts
    This free 1-hour webinar from GigaOm Research brings together experts in modern uses of data warehousing. Featuring GigaOm Analyst William McKnight and a special guest from Vertica, Paige Roberts, the presentation will focus on strategies enterprises should be undertaking today for evolving data warehouse ecosystems.

    Despite the allure and utility of other analytic data constructs, enterprises continue to invest the most in, and get tremendous benefit from, great data warehouses. The data warehouse is still the key component of the actionable analytic future. However, the data warehouse of today is used differently, can accomplish new things, and requires different technical and business strategies than before to get the most out of it.

    In this webinar, McKnight dives into the current state of data warehouse initiatives. He discusses how implementation realities of the last two decades have demonstrated the importance of the data warehouse, and why enterprise leaders must think differently about data warehousing as a whole.

    In this 1-hour webinar, attendees will discover:

    • The state of the data warehouse today and strategies to maximize the return on data warehouse investments of 2019 and beyond.
    • The flavors of the data warehouse and the criteria for ensuring your data warehouse(s) is up to standard
    • Key considerations when building a new data warehouse, evolving an existing data warehouse, or contemplating a re-platforming.
    • The continued relevance of data warehousing today
    • How to know when to put your data warehouse in contain mode
    • Ideas for data platform selection for a data warehouse
  • Deploying Hyper-Converged Infrastructure (HCI) to Modernize IT Recorded: Jun 20 2019 60 mins
    Mayank Gupta, Phil Sellers, Ray Lucchesi, Steve Ginsberg
    This free 1-hour webinar from GigaOm Research brings together experts in hyper-converged architecture (HCI) and data center infrastructure and administration to explore how enterprise leaders are using HCI to modernize data center infrastructure and prepare it for public cloud co-existence.

    Featuring Mayank Gupta, Product Marketing Lead, Core HCI, at Nutanix, the discussion will be moderated by GigaOm Analyst Ray Lucchesi and includes GigaOm analysts Phil Sellers and Steve Ginsberg. Many data center managers are struggling with infrastructure sprawl, adding servers and storage to address never-ending application requirements. But for many, this is a symptom rather than a solution. Resource utilization is typically abysmal in most non-virtualized data centers. Modernizing a data center with HCI has the potential to drastically improve utilization, reduce complexity and provide a platform that can better co-exist with public cloud.

    In this 1-hour webinar, the team examines:

    • How employing HCI software-defined infrastructure can modernize the data center
    • How the use of HCI can increase resource utilization and reduce complexity across the data center.
    • How to build HCI in the data center and public cloud to co-exist with and support IT application needs.

    Participants will come away with a better appreciation of how HCI can help modernize the data center while at the same time improving utilization, reducing complexity and bringing about a data center better able to co-exist with public cloud.
  • How Google’s New Hybrid-Cloud Strategy Advancements Impact Stateful Workloads Recorded: Jun 19 2019 58 mins
    David Linthicum, Radhesh Menon
    This free 1-hour webinar from GigaOm Research brings together experts in cloud computing, storage, containers, and hybrid cloud to discuss how Google’s recent hybrid-cloud strategy Anthos announcement at Google Next ’19 changes production outlook for your IT and DevOps teams.

    Featuring GigaOm Analyst David Linthicum and a special guest from Robin.io, Radhesh Menon, this webinar will explore Google’s hybrid-cloud strategy, Anthos, announced at Google Next, and how that technology can change your outlook on both development and your ability to modernize stateful applications.

    Storage for stateful containerized applications is a fundamental building block of applications on hybrid clouds, leveraging containers, container orchestration, and other parts of that emerging ecosystem. While the focus for container-based applications has been stateless, the ability to maintain state both within and between containerized applications has been a core requirement for DevOps and IT. Attend this webinar to take the mystery out of running stateful applications in containers, and to learn how managing them with Kubernetes on Google Cloud can enable a hybrid- and multi-cloud strategy through Google’s new solution, Anthos.

    In this 1-hour webinar, attendees will:

    • Better understand the state of the technology, and what container native storage solutions bring to modern IT.
    • Understand Google’s hybrid and multi-cloud strategy, and explore newly announced solutions that may change the game in how you build, deploy, and operate containers.
    • Learn how to hide complexity, while providing and maintaining richness of capability.
  • The “Frontline-First” Approach to Digitization: What It Is and Why It’s Better? Recorded: Jun 19 2019 53 mins
    Stowe Boyd, Kevin O'Donnell
    This free 1-hour webinar from GigaOm Research brings together experts in Digital Transformation, featuring GigaOm Stowe Boyd and a special guest from Nitro, Kevin O’Donnell, Director of Product. The discussion will focus on the “Frontline-First” approach to improving workflow management.

    Too often, enterprise change initiatives focus on technology, infrastructure, and process without considering the people and workflows that are the foundation of a company’s productivity.

    Those working on the frontline — in close contact with customers and a changing competitive marketplace — are best positioned and most likely to push for new ways to save time and increase efficiency so long as management lowers barriers to making improvements.

    In this 1-hour webinar, you will discover:

    • Benefits of empowering both frontline and back office workers with the tools to improve their paper-based workflows
    • Power of user analytics when attempting to drive measurable and lasting change
    • Success stories powering this new ‘Frontline-First’ approach to workflow improvement
  • Creating A Winning Digital Transformation Strategy Recorded: Jun 19 2019 58 mins
    Art Langer, David Linthicum, EJ Bodner
    This free 1-hour webinar from GigaOm Research brings together experts in Digital Transformation, featuring GigaOm David Linthicum and special guest from Nutanix, Dr. Arthur Langer. The discussion will focus on the stepwise process of understanding where you are in employing digital, where you need to go, and the specifics to drive success the first time.

    “Digital transformation” is certainly the buzzword of 2019, but most don’t know what it is, why it is important, and how to do it right. By investing in digital experience transformation, enterprises can discover valuable customer behavior insights and execute personalized experiences to engage, delight and retain customers.

    In this webinar, we’ll take the mystery out of what it takes to create and execute on a digital transformation strategy, including what’s important, and what needs to be avoided. This will be a candid discussion with industry experts. We’ll explore what technology to leverage, best practices, and approaches.

    In this 1-hour webinar, you will discover:

    • Reference technical architecture of digital transformation, from 5G to IoT.
    • Social architecture, or how people are important to this transformation strategy.
    • How to sell digital initiatives to the board of directors, and other stakeholders.
    • Core metrics and continuous integration that keep priorities straight.
  • Unstructured Data Management: A New Way of Fighting Data Recorded: Jun 18 2019 62 mins
    Enrico Signoretti, Krishna Subramanian
    The fight against data growth and consolidation was lost a long time ago. Several factors contribute to the increasing amount of data we store in storage systems: users’ desire to keep everything they create, organizational policies, new types of rich documents, new applications, and demanding regulations are just some of the culprits.

    Many organizations try to eliminate storage silos and consolidate them in large repositories while applying conservative policies to save capacity. It’s time to take a different approach and think about managing data correctly to get the most out of it. Unstructured data, if not correctly managed, could become a huge pain point from both finance and technical perspectives. It is time to think differently and transform this challenge into an opportunity. Let us show you how to get more efficiency and value out of your data!

    Join this free 1-hour webinar from GigaOm Research that is bringing together experts in data storage and cloud computing, featuring GigaOm Enrico Signoretti and special guest from Komprise, Krishna Subramanian.

    In this 1-hour webinar, you will discover:

    • The reality beyond data growth and what to expect from the future
    • Why data storage consolidation failed
    • Why cloud isn’t a solution if not managed correctly
    • How to preserve and improve the user experience while changing how your infrastructure works
    • How to understand the value of stored data to take advantage of it
    • How data management is evolving and what you should expect in 12-to-18 months.
  • Hyper-Converged Infrastructure: Maximizing TCO and ROI Recorded: Jun 18 2019 58 mins
    Dave Demlow, Dusty Koenkenberg, Enrico Signoretti, Jody Harper
    This free 1-hour webinar from GigaOm Research brings together IT infrastructure experts, featuring GigaOm Enrico Signoretti and special guests from Scale Computing, Dave Demlow, and the Detroit Symphony Orchestra, Jody Harper. The presentation and discussion will focus on the benefits of hyper-converged infrastructure (HCI) from the point of view of end users (those that have already adopted it) and an analysis of different approaches to HCI.

    End users struggle to keep their infrastructure simple, flexible, and scalable while keeping costs at bay. Hyperconvergence is a solution that promises it all, but there are also trade-offs and different approaches to solving the same problem. What is the best solution for your organization?

    Identifying the right solution is not easy, and it depends on several factors including system architecture, simplicity, usability, support, ease of use, and more.

    In this 1-hour webinar, you will discover:

    • Different types of HCI
    • What you can really do with HCI
    • The benefits of end-to-end solutions
    • HCI for centralized and distributed organizations
    • Key criteria to evaluate an HCI infrastructure
    • Data protection and Disaster Recovery options
    • Edge-Core-Cloud
    • The role of Cloud to improve infrastructure TCO

    This webinar is designed to inform and educate end users who are considering innovative solutions for their infrastructure strategy, giving both technical and business perspectives on the covered topics.
  • Performance, Capacity and Lower TCO. Can a single storage solution offer it all? Recorded: Jun 14 2019 58 mins
    Enrico Signoretti, Greg Kleiman, Jason Oblinger, Scott Hudson
    The storage industry is facing challenging times. Today’s CxOs, IT specialists, and project managers adopting a two-tier solution have much to consider. On one side sits exponential data growth, longer retentions, and capacity-driven applications. On the other side sit applications requiring rapid data access, where minimal latency is key to business growth. This trend is quickly accelerating as new applications enable IoT and leverage artificial intelligence, demanding seamless integration with the cloud.

    The traditional classification of primary and secondary storage no longer applies. Associating structured data with the former and unstructured data with the latter is limiting for modern applications and computing paradigms. Primary and secondary storage as we knew them impose too many constraints, leading to a rise in infrastructure complexity and costs and rendering the strategy unsustainable. IT leaders today require solutions that automate regular tasks to simplify infrastructure and reduce expenses.

    During this one-hour webinar, GigaOm analyst Enrico Signoretti and special guests Greg Kleiman from Datrium, as well as Jason Oblinger, Sr. Systems Administrator from Next Generation Films, Inc. and Scott Hudson, System Administrator for ScienceLogic, will analyze several aspects of two-tier storage strategies including encryption, introduce Datrium’s innovative solution, and provide testimonies from end users that have adopted it.

    During this webinar, attendees will learn:

    • Why a two-tier storage strategy
    • Different types of tier integration
    • The benefits of an integrated two-tier storage infrastructure
    • Introduction to the Datrium DVX platform and cloud services
    • How the Datrium solution ecosystem fits in a two-tier storage strategy
    • The benefits of Datrium for the people in your organization
    • A successful two-tier storage strategy implementation with Datrium
  • Using Value Stream Management to Optimize DevOps Workflow Recorded: Jun 14 2019 61 mins
    Jon Collins, Karan Malhi, Michael Baldani
    While DevOps is acknowledged as delivering on an organization’s innovation goals, it is not prescriptive in terms of workflows or activities, nor indeed toolsets. And, as many organizations are discovering, simply ‘doing’ DevOps, or employing continuous integration and delivery tools and frameworks, does not always lead to innovation nirvana. As development, testing, and deployment processes are automated, attention turns to ensuring that the end-to-end process delivers effective results, with maximum efficiency.

    Value Stream Management is a discipline that focuses on the value delivered at each step. It defines the optimum stream for a given scenario, with best results and at a reduced cost. Delivering on this theory can depend on where organizations are in the DevOps journey. Less mature organizations may prefer more prescriptive solutions with the right kinds of guardrails in place, offering the best way to solve a particular kind of problem, such as developing a mobile app or deploying a cloud-based scalable solution. Other, more sophisticated enterprises may look at how to connect teams, tools, and applications for a single view across the software delivery process.

    This free one-hour webinar from GigaOm Research brings together leading experts in DevOps, featuring GigaOm Principal Analyst Jon Collins and Karan Malhi, Director of Product at CloudBees. In this webinar, we look at how Value Stream Management offers a roadmap towards better, faster software delivery and operational excellence.

    We consider:
    •The importance of visibility and insight across all value streams to effectively measure and manage DevOps performance.
    •How to access key benchmark metrics and track performance as a basis to learn, improve, and deliver better results at reduced costs.
    •Practical examples and real-world case studies to show how Value Stream Management can be part of an organization's innovation journey.
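The benchmark metrics mentioned above typically include things like deployment frequency and lead time from commit to deploy. A minimal, hypothetical sketch of computing them from deployment records (the data and field names are invented for illustration):

```python
from datetime import datetime
from statistics import mean

# Hypothetical deployment records: commit timestamp and deploy timestamp.
deployments = [
    {"committed": datetime(2019, 6, 1, 9, 0), "deployed": datetime(2019, 6, 1, 17, 0)},
    {"committed": datetime(2019, 6, 3, 10, 0), "deployed": datetime(2019, 6, 4, 10, 0)},
]

# Lead time per change, in hours (commit to running in production).
lead_times_h = [
    (d["deployed"] - d["committed"]).total_seconds() / 3600 for d in deployments
]

print(f"deployments this period: {len(deployments)}")
print(f"mean lead time: {mean(lead_times_h):.1f} h")  # (8 + 24) / 2 = 16.0 h
```

Tracking such metrics over time is what turns a value stream from a diagram into something an organization can measure and improve.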
  • Boosting Data Science Project Success: Development is Not Enough Recorded: Jun 14 2019 59 mins
    Andrew Brust, Michael Nixon
    For enterprise data science environments, development platforms aren’t enough. The data itself, and the infrastructure that serves it, are foundational to data science work. Implementing them well is a strategic necessity.

    Fast, governed access to the right data is critical to data science project success. The data must be cleansed and of high quality, with proper role-based access enforced. The infrastructure must manage workloads well, letting data scientists explore, query, and shape the data they need to build the best predictive models.

    Join us for this free one-hour webinar with GigaOm Research and Snowflake, focused on data warehousing built for the cloud, to review infrastructure strategies that boost data science production.

    You will learn:
    •How to support demanding data science projects without hampering other data teams
    •Strategies to secure data easily and effectively, while keeping authorized access agile and available
    •How to accelerate query performance cost-effectively for data access technologies such as Python, Spark SQL, and others
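Role-based access, as described above, pairs roles with explicit privilege grants on datasets. The following is a deliberately simplified, hypothetical sketch of that model (real data warehouses express this through GRANT statements rather than in-memory tables):

```python
# Hypothetical grant table: (role, dataset) -> set of allowed privileges.
GRANTS = {
    ("data_scientist", "features"): {"SELECT"},
    ("data_engineer", "features"): {"SELECT", "INSERT"},
}

def authorized(role: str, dataset: str, privilege: str) -> bool:
    """Check whether a role holds a given privilege on a dataset."""
    return privilege in GRANTS.get((role, dataset), set())

print(authorized("data_scientist", "features", "SELECT"))  # True: read access granted
print(authorized("data_scientist", "features", "INSERT"))  # False: write not granted
```

Keeping grants explicit like this is what lets access stay agile for authorized users while remaining auditable.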
  • AI Operations: It Can’t Be Just an Afterthought Recorded: Jun 13 2019 63 mins
    Andrew Brust, Adnan Khaleel, Dr. Sambit Bhattacharya
    In the worlds of machine learning (ML) and deep learning (DL), operations and deployment are subjects that often fall by the wayside. And the split reality between everyday on-premises Artificial Intelligence (AI) work and the industry’s fascination with more aspirational cloud-based AI work only makes matters worse.

    For adoption of AI/ML/DL to be actionable for enterprise customers, the full spectrum of on-premises and cloud-based work needs to be accommodated. Deployment and operations across environments need to be consistent. On-premises provisioning and deployment should feel cloud-like in ease of use, and hybrid scenarios need to be handled robustly. Installation and management of frameworks and models need to be handled too.

    Join us for this free 1-hour webinar from GigaOm Research to explore these matters. The webinar features GigaOm analyst Andrew Brust and special guests Adnan Khaleel from Dell EMC and Professor Sambit Bhattacharya of Fayetteville State University, a customer of Bright Computing. This webinar is sponsored by Dell EMC, NVIDIA, and Bright Computing.

    In this 1-hour webinar, attendees will discover:

    ●How cross-premises AI deployment is both necessary and achievable
    ●What “AI Ops” looks like today, and where it’s going
    ●The sweet spot of ML/DL training workloads between data center and cloud
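The "sweet spot" between data center and cloud usually comes down to a placement decision per training job. A hypothetical sketch of such a rule, using invented criteria (data locality and free on-premises GPU capacity) purely for illustration:

```python
# Hypothetical placement rule: prefer on-premises GPUs when the training
# data already lives there and capacity is available; otherwise burst to cloud.
def place_job(data_on_prem: bool, gpus_free_on_prem: int, gpus_needed: int) -> str:
    if data_on_prem and gpus_free_on_prem >= gpus_needed:
        return "on_prem"
    return "cloud"

print(place_job(True, 8, 4))   # data local, capacity available -> on_prem
print(place_job(True, 2, 4))   # capacity exhausted -> burst to cloud
print(place_job(False, 8, 4))  # data already in cloud -> cloud
```

Consistent deployment tooling across both targets is what makes a rule like this operationally viable rather than a source of divergence.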
Market Research
Emerging market research on topics including, but not limited to, emerging technologies, application development, and CxO strategy.

  • Title: Meeting Demand for Capacity-driven Data with Object Storage
  • Live at: May 15 2019 5:00 pm
  • Presented by: Chris Evans, Scott Baker