
Integrating the Enterprise with a Streaming Data Approach

Streaming and real-time data have high business value, but that value can decay rapidly if the data is not processed quickly. If the value is not realized within a certain window of time, it is lost, and the decision or action it should have triggered never occurs. Streaming data - whether from sensors, devices, applications, or events - needs special attention because a sudden price change, a critical threshold met, a rapidly changing sensor reading, or a blip in a log file can all be of immense value, but only if the alert arrives in time.

In this webinar, we will review the landscape of streaming data and message queueing technology, and introduce and demonstrate a method for an organization to assess and benchmark—for its own current and future uses and workloads—the technologies currently available. We will also reveal the results of our own execution of the OpenMessaging benchmark on workloads for two of the platforms: Apache Kafka and Apache Pulsar.

What Will Be Discussed:

- The Evolution of Queuing, Messaging, and Streaming
- Today’s Technology Landscape
- Assessing Performance: The OpenMessaging Benchmark
- Considerations for Your Evaluation
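The time-sensitivity point above can be illustrated with a minimal, self-contained sketch (plain Python with hypothetical values; a real deployment would consume from a platform such as Kafka or Pulsar): a rolling-window threshold check that raises an alert while the reading is still actionable.

```python
from collections import deque

def threshold_alerts(readings, window=3, limit=100.0):
    """Yield (index, value) whenever the rolling mean of the last
    `window` readings exceeds `limit` -- a toy stand-in for the
    'critical threshold met' alerts described above."""
    buf = deque(maxlen=window)
    for i, value in enumerate(readings):
        buf.append(value)
        if len(buf) == window and sum(buf) / window > limit:
            yield i, value

# A sudden spike trips the alert as soon as the window mean crosses the limit.
stream = [90, 95, 92, 130, 140, 150, 91]
alerts = list(threshold_alerts(stream, window=3, limit=100.0))
```

In a streaming platform the same logic would run continuously against the topic, which is exactly why end-to-end latency (the focus of the OpenMessaging benchmark) determines whether the alert is still worth acting on.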
Recorded May 20 2019 54 mins
Presented by
William McKnight, Jon Bock

  • Performance, Capacity and Lower TCO. Can a single storage solution offer it all? Recorded: Jun 14 2019 58 mins
    Enrico Signoretti, Greg Kleiman, Jason Oblinger, Scott Hudson
    The storage industry is facing challenging times. Today’s CxOs, IT specialists, and project managers adopting a two-tier solution have much to consider. On one side sits exponential data growth, longer retention, and capacity-driven applications. On the other side are applications requiring rapid data access, where minimal latency is key to business growth. This trend is quickly accelerating as new applications enable IoT and leverage artificial intelligence, demanding seamless integration with the cloud.

    The traditional classification of primary and secondary storage no longer applies. Associating structured data to the first and unstructured data to the latter is limiting with modern applications and computing paradigms. Primary and secondary storage as we knew them, impose too many constraints leading to a rise in infrastructure complexity and costs, rendering the strategy unsustainable. IT leaders today require solutions that automate regular tasks to simplify infrastructure and reduce expenses.

    During this one-hour webinar, GigaOm analyst Enrico Signoretti and special guests Greg Kleiman from Datrium, as well as Jason Oblinger, Sr. Systems Administrator from Next Generation Films, Inc. and Scott Hudson, System Administrator for ScienceLogic, will analyze several aspects of two-tier storage strategies including encryption, introduce Datrium’s innovative solution, and provide testimonies from end users that have adopted it.

    During this webinar, attendees will learn:

    •Why a two-tier storage strategy
    •Different types of tier integration
    •The benefits of an integrated two-tier storage infrastructure
    •Introduction to Datrium DVX platform and cloud services
    •How the Datrium solution ecosystem fits in a two-tier storage strategy
    •The benefits of Datrium for the people in your organization
    •A successful two-tier storage strategy implementation with Datrium
  • Using Value Stream Management to Optimize DevOps Workflow Recorded: Jun 14 2019 61 mins
    Jon Collins, Karan Malhi, Michael Baldani
    While DevOps is acknowledged as delivering on an organization’s innovation goals, it is not prescriptive in terms of workflows or activities, nor indeed toolsets. And, as many organizations are discovering, simply ‘doing’ DevOps, or employing continuous integration and delivery tools and frameworks, does not always lead to innovation nirvana. As development, testing, and deployment processes are automated, attention turns to ensuring that the end-to-end process delivers effective results, with maximum efficiency.

    Value Stream Management is a discipline that focuses on the value delivered at each step. It defines the optimum stream for a given scenario, with best results and at a reduced cost. Delivering on this theory can depend on where organizations are on the DevOps journey. Less mature organizations may prefer more prescriptive solutions with the right kinds of guardrails in place, offering the best way to solve a particular kind of problem, such as developing a mobile app or deploying a cloud-based scalable solution. Other, more sophisticated enterprises, may look at how to connect teams, tools, and applications for a single view across the software delivery process.

    This free one-hour webinar from GigaOm Research brings together leading experts in DevOps, featuring GigaOm Principal Analyst, Jon Collins and Karan Malhi, Director of Product from CloudBees. In this webinar, we look at how Value Stream Management offers a roadmap towards better, faster software delivery and operational excellence.

    We consider:
    •The importance of visibility and insight across all value streams to effectively measure and manage DevOps performance.
    •How to access key benchmark metrics and track performance as a basis to learn, improve, and deliver better results at reduced costs.
    •Practical examples and real-world case studies to show how Value Stream Management can be part of an organization's innovation journey.
  • Boosting Data Science Project Success: Development is Not Enough Recorded: Jun 14 2019 59 mins
    Andrew Brust, Michael Nixon
    For enterprise data science environments, development platforms aren’t enough. The data itself, and the infrastructure that serves it, are foundational to data science work. Implementing them well is a strategic necessity.

    Fast, governed access to the right data is critical to data science project success. The data must be cleansed and of high quality, with proper role-based access enforced. The infrastructure must manage workloads well, letting data scientists explore, query and shape the data they need, to build the best predictive models.

    Join us for this free one-hour webinar with GigaOm Research and Snowflake, focused on data warehousing built for the cloud, to review infrastructure strategies that boost data science production.

    You will learn:
    •How to support demanding data science projects without hampering other data teams
    •Strategies to secure data easily and effectively, while keeping authorized access agile and available
    •How to accelerate query performance cost-effectively for data access technologies such as Python, Spark SQL, and others
  • AI Operations: It Can’t Be Just an Afterthought Recorded: Jun 13 2019 63 mins
    Andrew Brust, Adnan Khaleel, Dr. Sambit Bhattacharya
    In the worlds of machine learning (ML) and deep learning (DL), operations and deployment is a subject that often falls by the wayside. And the split reality between everyday on-premises Artificial Intelligence (AI) work and the industry’s fascination with more aspirational cloud-based AI work only makes matters worse.

    For adoption of AI/ML/DL to be actionable for enterprise customers, the full spectrum of on-premises and cloud-based work needs to be accommodated. Deployment and operations across environments need to be consistent. On-premises provisioning and deployment should feel cloud-like in ease of use, and hybrid scenarios need to be handled robustly. Installation and management of frameworks and models need to be handled too.

    Join us for this free 1-hour webinar, from GigaOm Research, to explore these matters. The Webinar features GigaOm analyst Andrew Brust and special guests, Adnan Khaleel from Dell EMC, and Professor Sambit Bhattacharya of Fayetteville State University, a customer of Bright Computing. This webinar is sponsored by Dell EMC, NVIDIA, and Bright Computing.

    In this 1-hour webinar, attendees discover:

    ●How cross-premises AI deployment is both necessary and achievable
    ●What “AI Ops” looks like today, and where it’s going
    ●The sweet spot of ML/DL training workloads between data center and cloud
  • Building a Solid Foundation for Advanced Analytics Maturity Recorded: Jun 13 2019 60 mins
    Andrew Brust, Steve Mahoney
    As the best of big data analytics and business intelligence coalesce, organizations now have their sights set on AI for predictive and prescriptive analytics. But how can they get there?

    Effective predictive and prescriptive analytics will put an organization in an advanced analytics maturity level, but a solid foundation at the more basic levels has to come first. Earlier analytics maturity phases, including descriptive and diagnostic/root cause analytics, and intermediate levels, such as operational analytics, must be addressed robustly before further phases can be approached successfully.

    Understanding the capacity your organization has to leverage analytics, and designing your analytics to match that, may be the best way forward, and this requires introspection and planning. Join us for this free 1-hour webinar from GigaOm Research to discuss how to assess, shore up, and advance your company’s analytics maturity. The webinar features GigaOm analyst Andrew Brust and special guest Steve Mahoney, Product Director from Looker, a leading provider of modern analytics platform software.

    In this webinar, you will learn:
    - What a rational analytics maturity model looks like and where your organization lies within it
    - How to strengthen the integrity of your current analytics maturity level
    - How to map out your path to advanced analytics maturity and approach practical adoption of AI and machine learning
  • Modernizing Manufacturing with Mobile and IoT Recorded: Jun 13 2019 59 mins
    Bob Egan, John Gibson, Luis Llamas, Ryan Adams
    Digital Transformation is both a business and IT mandate across many business sectors. Nowhere is this more evident than in the modernization of the manufacturing sector. At the core of this transformation are mobile and IoT systems, driving innovation to achieve velocity through visibility by instrumenting the warehouse, the assembly line, and the finished-goods supply chain. In fact, without architecting with IoT and mobile as core tenets of this modernization, AI and robotics initiatives may well miss their mark.

    In this webinar, we will discuss three key issues:

    •What does velocity through visibility look like, how is it evolving and what is the impact?
    •What are the critical issues in increasing efficiency, improving quality, and gaining granular control?
    •How are leading entities modernizing manufacturing with mobile and IoT?
  • Re-crossing the DevOps Wall of Confusion: How Operational Data Can Enhance and A Recorded: Jun 12 2019 60 mins
    Jon Collins, Mark Herring
    Data, data everywhere, but not a drop to drink? The amount of information we now have on operational aspects of IT is astounding, creating the potential for data analytics, fault identification, and diagnosis. However, many organizations struggle to link information and insights back into the fast-moving development lifecycle.

    In this webinar, GigaOm analyst Jon Collins speaks to Time Series Platform vendor InfluxData about how organizations can not only monitor and control infrastructure assets across the private and public cloud, but also start to derive the benefits of data analytics across workflows, feeding infrastructure information to developers and decision makers alike.

    This webinar is aimed at anyone looking to make better use of operational data to feed decisions, from those working at the front line of software delivery and deployment, to senior executives and strategists. We won’t only be looking at the theory, but we’ll also be drilling into practices learned across a number of verticals and specific clients, taking examples from business-as-usual to disaster recovery.

    If you are looking to accelerate and automate your DevOps and continuous delivery practices through better use of data, or simply want to know how to deliver operational information to the development point of need, register today.
  • Speed and Scale: Advanced Analytics with Machine Learning Recorded: Jun 12 2019 61 mins
    Andrew Brust, Ryan Adams, Deepsha Menghani, Mark Balkenende
    Artificial Intelligence and Machine Learning (ML) can turn massive amounts of data into deep insights that drive revenue and decrease costs. But ML’s not an island – in fact, it’s carried out most successfully when paired with advanced analytics. To facilitate the best analytics work, enterprises need the right platforms and tools to load data, prepare it, ensure high-quality and integrate with corporate data governance processes.

    How can you get all that working harmoniously, especially in the cloud? It takes the right tools, strategy, and workflow, but it can be done. Join us for this free 1-hour webinar, from GigaOm Research, to find out how. The webinar features GigaOm analyst Andrew Brust, Deepsha Menghani, Product Marketing Manager at Microsoft, and a special guest from Talend.

    In this 1-hour webinar, you will learn how:
    •Data analytics, data quality and data governance can be tightly intertwined with data science
    •Technologies like Apache Spark can serve both your data engineering and machine learning needs
    •Cloud services can be combined with open source software and analytics ecosystem tools for maximum benefit
  • Strategies for Moving from Application to Enterprise Customer Master Data Manage Recorded: Jun 12 2019 63 mins
    William McKnight, Leticia Barcia
    Many organizations start their Master Data Management (MDM) journey in support of a specific sponsor business need. Often hard-fought concessions are made to allow MDM techniques to support data needs of a particular function. Now, since almost every application needs your customer master, it’s time to expand the scope of MDM, to replace substandard methods of accessing customer data in the enterprise and to establish the MDM hub for use by new applications across all customer business processes.

    Crossing the chasm to the enterprise customer 360 can be daunting. During this 1-Hour Webinar, GigaOm analyst William McKnight will share some strategies for moving from application to enterprise customer MDM and will discuss the ruggedization that must exist in MDM to be ready.

    During this 1-Hour Webinar we will examine:

    •How MDM is established with TCO
    •The MDM Maturity Model (single application MDM is low)
    •Elements in MDM necessary for takeoff:
    •A data quality program (prove data quality according to beloved peers)
    •DaaS SLAs – how do they work with the hub?
    •Operational integration (including data lake) AND value-added customer analytical elements (perhaps gleaned from transactional data)
    •Syndicated data take-on
    •Graph capabilities
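    As a toy illustration of the hub-consolidation idea (not Gigaom's or any vendor's method; the match rule and field names are hypothetical), duplicate customer records can be collapsed into a golden record with a simple survivorship policy:

```python
def merge_customer_records(records):
    """Collapse duplicate customer records into golden records.
    Match key: normalized email (a hypothetical rule). Survivorship:
    the latest non-empty value for each field wins."""
    golden = {}
    for rec in sorted(records, key=lambda r: r["updated"]):
        key = rec["email"].strip().lower()
        merged = golden.setdefault(key, {})
        for field, value in rec.items():
            if value:  # later non-empty values overwrite earlier ones
                merged[field] = value
    return list(golden.values())

records = [
    {"email": "Ann@Example.com", "phone": "", "city": "Austin", "updated": 1},
    {"email": "ann@example.com", "phone": "555-0100", "city": "", "updated": 2},
]
golden = merge_customer_records(records)
```

    Production MDM hubs layer probabilistic matching, stewardship workflow, and data-quality scoring on top of this basic match-and-survive pattern.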
  • Modern Data Warehouse – Enterprise Data Curation for the Artificial Intelligence Recorded: May 23 2019 59 mins
    William McKnight, Kuber Sharma
    This free 1-hour webinar from GigaOm Research brings together experts in AI and data analytics, featuring GigaOm analyst William McKnight and a special guest from Microsoft. The discussion will focus on the promise AI holds for organizations of every industry and size, and on how to overcome today's challenges of preparing for AI in the organization and planning AI applications.

    The foundation for AI is data. You must have enough data to analyze to build models. Your data determines the depth of AI you can achieve -- for example, statistical modeling, machine learning, or deep learning -- and its accuracy. The increased availability of data is the single biggest contributor to the uptake of AI where it is thriving. Indeed, data’s highest use in the organization soon will be training algorithms. AI is providing a powerful foundation for impending competitive advantage and business disruption.

    In this 1-hour webinar, you will discover:

    •AI’s impending effect on the world
    •Data’s new highest use: training AI algorithms
    •Know & change behavior
    •Data collection
    •Corporate Skill Requirements

    You’ll learn how organizations need to be thinking about AI and the data for AI.
  • Why the Data Warehouse is Back (And Why it Never Really Went Away) Recorded: May 23 2019 62 mins
    Andrew Brust, Ross Perez
    The rise in Enterprise data volumes and the increasing use of semi-structured data gave rise to Big Data and NoSQL platforms. But the conventional data warehousing model never went away. And with innovations in cloud object storage and compute capabilities, the data warehouse model has come out of the shadows and back into the spotlight.

    Data silos were a problem even in the old days, but the challenge they pose today is acute. Some organizations, still wary of older storage costs and cost models, are conservative in the data they preserve. Others tend towards the opposite extreme, saving data in cloud object storage with such abandon that they engender impenetrable repositories that form huge silos of their own.

    Since data warehouses have always sought to integrate siloed data, their role – in everything from analytics to machine learning – is more pivotal now than ever. But how can today’s cloud data warehouse platforms address both the old silos and the new? What can they do with semi-structured data? How can they integrate with data lakes and/or purify data swamps? And can they enable analytics on data and platforms where doing so had been an afterthought, at best?

    To get the answers, join us for this free 1-hour webinar from GigaOm Research. The Webinar features GigaOm analyst Andrew Brust and special guest, Ross Perez from Snowflake, a leader in cloud-native data warehousing.

    In this 1-hour webinar, you will learn:

    •How cloud data warehouses can scale both storage and compute, independently and elastically, to meet variable workloads
    •Distinct approaches for working with semi-structured data from structured data platforms
    •Why the equation for data warehouse and data lake doesn’t sum to zero
    •Whether the familiar relational/SQL paradigm can coexist with Big Data analytics and fluid, interactive performance
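    To make the semi-structured point concrete: cloud data warehouses expose SQL directly over JSON-like documents. The following is a hedged local analogue using SQLite's JSON1 functions as a stand-in (the table and fields are hypothetical, and each warehouse has its own syntax for path extraction):

```python
import json
import sqlite3

# In-memory table holding raw JSON documents, one per row.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (doc TEXT)")
docs = [
    {"user": "ann", "device": {"os": "linux"}, "ms": 120},
    {"user": "bob", "device": {"os": "macos"}, "ms": 340},
]
conn.executemany("INSERT INTO events VALUES (?)",
                 [(json.dumps(d),) for d in docs])

# SQL over semi-structured data: filter and project by JSON path.
rows = conn.execute(
    "SELECT json_extract(doc, '$.user'), json_extract(doc, '$.device.os') "
    "FROM events WHERE json_extract(doc, '$.ms') > 200"
).fetchall()
```

    The appeal of the warehouse model is that this same relational/SQL paradigm extends to nested documents without a separate NoSQL engine.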
  • Cloud Data Warehousing: Explanations, Trends and Best Practices Recorded: May 22 2019 64 mins
    Andrew Brust, Kevin Petrie
    The popularity of cloud computing is at this point well-understood, but mixing the cloud model with data warehousing can generate unexpected synergies.

    While data lakes provide processing, economics and scalability, the need for structured data and a familiar query language matters. These considerations, along with the economics and architecture of cloud storage, have injected new scalability and cost-effectiveness into the ol’ reliable data warehouse model.

    But what are the best steps to get started with cloud data warehousing? How can you set up the right data pipelines to make your cloud data warehouse complete, authoritative and as close to real-time as possible? And can cloud innovations benefit on-premises data warehouse implementations too?

    To find out, join us for this free 1-hour webinar from GigaOm Research. The Webinar features GigaOm analyst Andrew Brust and special guest, Kevin Petrie from Attunity, a leader in data integration and ingest for Big Data and data warehouse solutions.

    In this 1-hour webinar, you will learn about:
    - The rise of the data warehouse in modern data pipelines
    - How the lake can feed data into the warehouse, for harmonious coexistence
    - How data warehousing fits into multi-cloud architectures
    - Trends in the cloud data warehouse market
  • Analytics for Action: How it All Comes Together Recorded: May 22 2019 63 mins
    Andrew Brust, Ira Cohen
    Analytics and Machine Learning are exciting, and the technologies around each of them are innovative. But most of these products provide building blocks, leaving a lot of work to the customer – work that may or may not go so well. It’s great to have the “dots,” but they have to be connected.

    And this isn’t just about integration. While implementing analytics in service of AI is laudable, what’s even more interesting is the opposite: using AI to automate and drive analytics. Ultimately, solutions that lead customers to action, rather than just giving them raw tools to derive insights, are what matters most. Add in forecasting and corresponding preparatory actions, and things get really interesting.

    In this market of loosely federated, open source analytics and machine learning technologies, is such an integrated, pragmatic solution feasible? Join us for this free Webinar to find out. GigaOm analyst Andrew Brust will be your host facilitating a discussion with Ira Cohen, Co-founder and Chief Data Scientist at Anodot. Cohen and Brust will shine a light on what today’s analytics and ML technologies are capable of, and contrast that with what’s on the market in ready-to-run form. By the end of the Webinar, you’ll understand what’s possible, what’s available and what may be in store in the future.

    In this 1-hour webinar, you will discover:

    • Why insights alone aren’t enough
    • Why automated analytics succeeds where manual analytics may fail
    • How streaming data processing, analytics and machine learning can be used together to maximum advantage
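    As a minimal sketch of the "automated analytics" idea (a generic trailing z-score check, not Anodot's algorithm; the window and threshold values are illustrative):

```python
import statistics

def zscore_anomalies(series, window=5, threshold=3.0):
    """Flag indices whose deviation from the trailing window's mean,
    in standard deviations, exceeds the threshold -- the kind of
    baseline-and-deviate check that replaces manual dashboard-watching."""
    flagged = []
    for i in range(window, len(series)):
        hist = series[i - window:i]
        mu = statistics.mean(hist)
        sd = statistics.pstdev(hist) or 1e-9  # guard against a flat window
        if abs(series[i] - mu) / sd > threshold:
            flagged.append(i)
    return flagged

series = [10, 11, 10, 12, 11, 10, 50, 11]
anomalies = zscore_anomalies(series, window=5, threshold=3.0)
```

    Automated systems extend this basic idea with seasonality-aware baselines and forecasting, so alerts point at genuinely unexpected behavior rather than normal variation.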
  • AI for the Enterprise: Actionable and On-Premises Recorded: May 21 2019 65 mins
    Andrew Brust, Adnan Khaleel
    There’s a lot of talk about AI in the Enterprise, but how can the corporate customer really get AI done? Most of the buzz is around AI in the cloud. But if an Enterprise customer has most of its data on-premises, is the chatter even relevant?

    The good news is there are lots of ways to do effective data science work on-premises. All the perceived accoutrements of cloud AI are there, too: open source frameworks, cluster-based distributed processing, GPU-based infrastructure and automated provisioning of the development environment. You no longer have to choose between defying data gravity to work in the cloud and withstanding arduous manual setup and update maintenance to operate on-premises.

    Join us for this free 1-hour webinar from GigaOm Research. The Webinar features GigaOm analyst Andrew Brust and special guest, xxx from Dell EMC.

    In this 1-hour webinar, you will discover:

    - How machine learning and deep learning can be conducted on-premises with ease
    - Taking advantage of hardware acceleration with GPUs, FPGAs and beyond
    - Automated management of sophisticated AI hardware and software stacks, right in your own data center
  • Data Lake Transformation: Merging BI, Knowledge Graphs and Search Recorded: May 21 2019 59 mins
    Andrew Brust, Giovanni Tummarello
    The number of innovative products and technologies in the analytics world is staggering. Unfortunately, so is the effort and expertise required to use them together effectively. We’ve got core analytics, big data streams, graph and even search technology. They’re all great, but each one is its own island of skills and tooling, with its own unique ecosystem.

    What’s needed is not just a way to integrate these technologies, but to use them in a cohesive way that weaves together paradigms and creates opportunities where before there were silos. And it all must be done while leaving data where it resides: no heavy ETL into new graph formats; queries and aggregates pushed down to the databases and infrastructure where the data lives, delivering both aggregate and detailed views; a query experience based on a fusion of search, semantic reasoning, and analytics; relationships mapped through observation, even when the links are not explicitly known; and varied visualization techniques for understanding the data, depending on its detail level, structure, and context.

    Join GigaOm’s Andrew Brust, and special guest Giovanni Tummarello (Chief Product Officer and Co-founder) from Siren, for this free Webinar. You’ll discover how to bring that motley crew of feeds, files and tables euphemistically called a data lake into a discovered, navigable whole that lets you derive real knowledge and insight from data assets across your organization.

    Join us for this Webinar and learn how:

    Core analytics, search and graph technology can be used together
    Doing so greatly reduces the risk of analytics project failure
    Combining technologies can elegantly provide complementary perspectives, rather than force awkward context switches
    The Nirvana of data insight is achieved through technology synergy, rather than supremacy of a single approach
  • The Modern Data Warehouse – Enterprise Data Curation for the AI Future Recorded: May 20 2019 59 mins
    William McKnight, Kuber Sharma
    This free 1-hour webinar from GigaOm Research brings together experts in AI and data analytics, featuring GigaOm analyst William McKnight and a special guest from Microsoft. The discussion will focus on the promise AI holds for organizations of every industry and size, and on how to overcome today's challenges of preparing for AI in the organization and planning AI applications.

    The foundation for AI is data. You must have enough data to analyze to build models. Your data determines the depth of AI you can achieve -- for example, statistical modeling, machine learning, or deep learning -- and its accuracy. The increased availability of data is the single biggest contributor to the uptake of AI where it is thriving. Indeed, data’s highest use in the organization soon will be training algorithms. AI is providing a powerful foundation for impending competitive advantage and business disruption.

    In this 1-hour webinar, you will discover:

    •AI’s impending effect on the world
    •Data’s new highest use: training AI algorithms
    •Know & change behavior
    •Data collection
    •Corporate Skill Requirements

    You’ll learn how organizations need to be thinking about AI and the data for AI.
  • Integrating the Enterprise with a Streaming Data Approach Recorded: May 20 2019 54 mins
    William McKnight, Jon Bock
    Streaming and real-time data have high business value, but that value can decay rapidly if the data is not processed quickly. If the value is not realized within a certain window of time, it is lost, and the decision or action it should have triggered never occurs. Streaming data - whether from sensors, devices, applications, or events - needs special attention because a sudden price change, a critical threshold met, a rapidly changing sensor reading, or a blip in a log file can all be of immense value, but only if the alert arrives in time.

    In this webinar, we will review the landscape of streaming data and message queueing technology, and introduce and demonstrate a method for an organization to assess and benchmark—for its own current and future uses and workloads—the technologies currently available. We will also reveal the results of our own execution of the OpenMessaging benchmark on workloads for two of the platforms: Apache Kafka and Apache Pulsar.

    What Will Be Discussed:

    - The Evolution of Queuing, Messaging, and Streaming
    - Today’s Technology Landscape
    - Assessing Performance: The OpenMessaging Benchmark
    - Considerations for Your Evaluation
  • Guide for Enterprises: Strategies and Options to Consider When Modernizing Data Recorded: May 17 2019 59 mins
    William McKnight, Ross Perez
    This free 1-hour Gigaom Research webinar will present the findings of a recently completed report on moving enterprise databases written by Gigaom analyst William McKnight, "A Guide for Enterprises: Strategies and Options to Consider when Modernizing Data Architecture."

    Competitive advantage with data cannot be accomplished without an intense focus on the many and growing technical bases that can be used to store, view, and manage data. Data technology and data science have progressed along with the importance of data, and it is imperative to raise your company’s data foundation so that data can be cultivated as an asset.

    This talk will help an organization understand the value of modernizing the data architecture and how to frame a modernization effort that delivers analysis capabilities, diverse yet connected data and key performance measures.

    What Will Be Discussed:

    •What Does Modern Data Architecture Look Like
    •What to Evaluate to Decide That It Is Time to Modernize the Data Warehouse Database
    •Strategies for Modernizing the Data Warehouse Database

    Join Gigaom Research and our sponsor Snowflake Computing for “Modernizing Data Warehousing”, for this free expert webinar.
  • Modern Data Engineering in the Cloud Recorded: May 16 2019 56 mins
    Andrew Brust, Brian Dirking, Mike Destein
    Data engineering, the discipline of integrating, conforming, and readying data for downstream analysis, has been with us for many years, but it has new relevance and criticality today. Data engineering has to support analytics, machine learning, and maintain data quality; and it must ensure data privacy, security, and protection of sensitive data, for compliance with GDPR and other regulatory frameworks.

    A great data engineering platform must support full-fledged and operationalized data pipelines, be cloud-capable, and run on modern, distributed data execution platforms like Apache Spark. Finally, a modern data engineering platform must support savvy business analysts and other “citizen data engineers” – in addition to more technical database engineers, operators, and administrators.

    That’s a long list of requirements, but it is readily attainable with today’s technology. To learn more, join speakers from GigaOm, Talend and Databricks for this free 1-hour webinar from GigaOm Research. The webinar features GigaOm analyst Andrew Brust, Mike Destein from our sponsor Talend, a company focused on data engineering and data management, and Brian Dirking from Databricks, focused on Apache Spark-based machine learning and data engineering.

    In this 1-hour webinar, you will discover:
    •How modern data engineering platforms and cloud-based data processing services can work hand-in-hand
    •Why data engineering platforms must serve coders, architects, and analysts
    •How to facilitate self-service analytics and meet your data quality, privacy, security, and protection needs
    Register now to join GigaOm Research, Talend and Databricks for this free expert webinar.
  • Meeting Demand for Capacity-driven Data with Object Storage Recorded: May 15 2019 65 mins
    Chris Evans, Scott Baker
    This 1-hour webinar will discuss the ability for organizations to meet demand for capacity-driven data with object storage.

    Today’s enterprise data requirements are clearly dividing into a need for latency-sensitive and capacity-driven solutions, as organizations store and exploit data from existing and, increasingly, machine-generated sources. This webinar looks at how enterprises meet the demand for capacity-driven data with object storage solutions from the major and upcoming solution vendors. During the webinar you will learn:

    •Factors driving the adoption of object storage
    •Critical features to look out for in object storage solutions
    •Analysis of vendor offerings available in the market today
    •Gigaom’s assessment of the market leaders and followers

    Join Gigaom Research and Hitachi Data Systems (HDS) for this free expert webinar.
Market Research
Emerging market research for topics including but not limited to emerging technologies, application development, and CxO strategy.

  • Title: Integrating the Enterprise with a Streaming Data Approach
  • Live at: May 20 2019 2:00 pm
  • Presented by: William McKnight, Jon Bock