
Lessons from Customers: Implementing IoT Projects with Maximum Success

This free 1-hour webinar from GigaOm Research brings together leading minds in the Internet of Things (IoT), featuring GigaOm analyst Andrew Brust, joined by guests from IoT powerhouse Hitachi Vantara, which now includes the team behind Pentaho Data Integration and Analytics. The roundtable discussion will focus on the results of a comprehensive IoT research survey conducted by GigaOm Research and commissioned by Hitachi Vantara.

In this 1-hour webinar, you will discover:

- How Enterprise customers are using IoT and where they are in their planning, evaluation and deployment of IoT solutions
- The scope of today’s IoT initiatives, in terms of use cases and mission criticality
- Who the IoT leaders, funders and champions are in companies of different size, across different industries
- Recommendations for IT and data professionals to stay ahead of the Big Data and IoT landscape

You’ll learn about factors and metrics germane to IoT initiative success, the architectures necessary for that success and the best approaches to IoT technology and partners.

Register now to join GigaOm Research and Hitachi Vantara for this free expert webinar.
Recorded Apr 29 2019 61 mins
Presented by
Andrew Brust, Jeanne Ford

  • Modern Data Warehouse – Enterprise Data Curation for the Artificial Intelligence Future Recorded: May 23 2019 59 mins
    William McKnight, Kuber Sharma
    This free 1-hour webinar from GigaOm Research brings together experts in AI and data analytics, featuring GigaOm analyst William McKnight and a special guest from Microsoft. The discussion will focus on the promise AI holds for organizations of every industry and size, and on how to overcome today’s challenges of preparing the organization for AI and planning AI applications.

    The foundation for AI is data. You must have enough data to analyze to build models. Your data determines the depth of AI you can achieve -- for example, statistical modeling, machine learning, or deep learning -- and its accuracy. The increased availability of data is the single biggest contributor to the uptake in AI where it is thriving. Indeed, data’s highest use in the organization soon will be training algorithms. AI is providing a powerful foundation for impending competitive advantage and business disruption.

    In this 1-hour webinar, you will discover:

    • AI’s impending effect on the world
    • Data’s new highest use: training AI algorithms
    • Know & change behavior
    • Data collection
    • Corporate Skill Requirements

    You’ll learn how organizations need to be thinking about AI and the data for AI.
  • Why the Data Warehouse is Back (And Why it Never Really Went Away) Recorded: May 23 2019 62 mins
    Andrew Brust, Ross Perez
    The rise in Enterprise data volumes and the increasing use of semi-structured data gave rise to Big Data and NoSQL platforms. But the conventional data warehousing model never went away. And with innovations in cloud object storage and compute capabilities, the data warehouse model has come out of the shadows and back into the spotlight.

    Data silos were a problem even in the old days, but the challenge they pose today is acute. Some organizations, still wary of older storage costs and cost models, are conservative in the data they preserve. Others tend towards the opposite extreme, saving data in cloud object storage with such abandon that they engender impenetrable repositories that form huge silos of their own.

    Since data warehouses have always sought to integrate siloed data, their role – in everything from analytics to machine learning – is more pivotal now than ever. But how can today’s cloud data warehouse platforms address both the old silos and the new? What can they do with semi-structured data? How can they integrate with data lakes and/or purify data swamps? And can they enable analytics on data and platforms where doing so had been an afterthought, at best?

    To get the answers, join us for this free 1-hour webinar from GigaOm Research. The Webinar features GigaOm analyst Andrew Brust and special guest, Ross Perez from Snowflake, a leader in cloud-native data warehousing.

    In this 1-hour webinar, you will learn:

    • How cloud data warehouses can scale both storage and compute, independently and elastically, to meet variable workloads
    • Distinct approaches for working with semi-structured data on structured data platforms (a brief flattening sketch follows this list)
    • Why the equation for data warehouse and data lake doesn’t sum to zero
    • Whether the familiar relational/SQL paradigm can coexist with Big Data analytics and fluid, interactive performance
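    One of the distinct approaches mentioned above is flattening nested JSON into relational-style rows at load time; the minimal sketch below illustrates that idea with the Python standard library only (the field names are hypothetical). Cloud warehouses such as Snowflake can alternatively keep the raw document in a semi-structured column and query paths inside it with SQL, which is not shown here.

    # Minimal sketch: flattening a nested JSON event into flat rows suitable for a
    # structured warehouse table. Field names are illustrative; standard library only.
    import json

    raw_event = '''
    {
      "order_id": 1001,
      "customer": {"id": 7, "region": "EMEA"},
      "items": [{"sku": "A-100", "qty": 2}, {"sku": "B-200", "qty": 1}]
    }
    '''

    def flatten_order(event: dict):
        """Yield one flat row per line item, repeating order- and customer-level fields."""
        for item in event["items"]:
            yield {
                "order_id": event["order_id"],
                "customer_id": event["customer"]["id"],
                "customer_region": event["customer"]["region"],
                "sku": item["sku"],
                "qty": item["qty"],
            }

    for row in flatten_order(json.loads(raw_event)):
        print(row)  # each dict maps directly onto a column set in a structured table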
  • Cloud Data Warehousing: Explanations, Trends and Best Practices Recorded: May 22 2019 64 mins
    Andrew Brust, Kevin Petrie
    The popularity of cloud computing is at this point well-understood, but mixing the cloud model with data warehousing can generate unexpected synergies.

    While data lakes provide processing, economics and scalability, the need for structured data and a familiar query language still matters. These considerations, along with the economics and architecture of cloud storage, have injected new scalability and cost-effectiveness into the ol’ reliable data warehouse model.

    But what are the best steps to get started with cloud data warehousing? How can you set up the right data pipelines to make your cloud data warehouse complete, authoritative and as close to real-time as possible? And can cloud innovations benefit on-premises data warehouse implementations too?

    To find out, join us for this free 1-hour webinar from GigaOm Research. The Webinar features GigaOm analyst Andrew Brust and special guest, Kevin Petrie from Attunity, a leader in data integration and ingest for Big Data and data warehouse solutions.

    In this 1-hour webinar, you will learn about:
    - The rise of the data warehouse in modern data pipelines
    - How the lake can feed data into the warehouse, for harmonious coexistence
    - How data warehousing fits into multi-cloud architectures
    - Trends in the cloud data warehouse market
  • Analytics for Action: How it All Comes Together Recorded: May 22 2019 63 mins
    Andrew Brust, Ira Cohen
    Analytics and Machine Learning are exciting, and the technologies around each of them are innovative. But most of these products provide building blocks, leaving a lot of work to the customer – work that may or may not go so well. It’s great to have the “dots,” but they have to be connected.

    And this isn’t just about integration. While implementing analytics in service of AI is laudable, what’s even more interesting is the opposite: using AI to automate and drive analytics. Ultimately, solutions that lead customers to action, rather than just giving them raw tools to derive insights, are what matters most. Add in forecasting and corresponding preparatory actions, and things get really interesting.

    In this market of loosely federated, open source analytics and machine learning technologies, is such an integrated, pragmatic solution feasible? Join us for this free Webinar to find out. GigaOm analyst Andrew Brust will be your host facilitating a discussion with Ira Cohen, Co-founder and Chief Data Scientist at Anodot. Cohen and Brust will shine a light on what today’s analytics and ML technologies are capable of, and contrast that with what’s on the market in ready-to-run form. By the end of the Webinar, you’ll understand what’s possible, what’s available and what may be in store in the future.

    In this 1-hour webinar, you will discover:

    • Why insights alone aren’t enough
    • Why automated analytics succeeds where manual analytics may fail
    • How streaming data processing, analytics and machine learning can be used together to maximum advantage
  • AI for the Enterprise: Actionable and On-Premises Recorded: May 21 2019 65 mins
    Andrew Brust, Adnan Khaleel
    There’s a lot of talk about AI in the Enterprise, but how can the corporate customer really get AI done? Most of the buzz is around AI in the cloud. But if an Enterprise customer has most of its data on-premises, is the chatter even relevant?

    The good news is there are lots of ways to do effective data science work on-premises. All the perceived accoutrements of cloud AI are there, too: open source frameworks, cluster-based distributed processing, GPU-based infrastructure and automated provisioning of the development environment. You no longer have to choose between defying data gravity to work in the cloud and withstanding arduous manual setup and update maintenance to operate on-premises.

    Join us for this free 1-hour webinar from GigaOm Research. The Webinar features GigaOm analyst Andrew Brust and special guest, xxx from Dell EMC.

    In this 1-hour webinar, you will discover:

    - How machine learning and deep learning can be conducted on-premises with ease (a brief device-selection sketch follows this list)
    - Taking advantage of hardware acceleration with GPUs, FPGAs and beyond
    - Automated management of sophisticated AI hardware and software stacks, right in your own data center
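    As a small illustration of the first item above, the sketch below (assuming the open-source PyTorch framework is installed locally; the model is purely a placeholder) checks for an on-premises CUDA-capable GPU and falls back to CPU, then moves a tiny model and batch onto the selected device.

    # Minimal sketch: selecting local GPU acceleration for deep learning on-premises.
    # Assumes PyTorch is installed (pip install torch); the model is illustrative only.
    import torch

    # Use a CUDA-capable GPU in the local node if present, otherwise fall back to CPU.
    device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
    print(f"training on: {device}")

    model = torch.nn.Linear(16, 1).to(device)   # tiny placeholder model
    batch = torch.randn(32, 16, device=device)  # batch of 32 random feature vectors
    predictions = model(batch)
    print(predictions.shape)                    # torch.Size([32, 1])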
  • Data Lake Transformation: Merging BI, Knowledge Graphs and Search Recorded: May 21 2019 59 mins
    Andrew Brust, Giovanni Tummarello
    The number of innovative products and technologies in the analytics world is staggering. Unfortunately, so is the effort and expertise required to use them together effectively. We’ve got core analytics, big data streams, graph and even search technology. They’re all great, but each one is its own island of skills and tooling, with its own unique ecosystem.

    What’s needed is not just a way to integrate these technologies, but to use them in a cohesive way that weaves paradigms together and creates opportunities where before there were silos. And it all must be done while leaving data where it resides: no heavy ETL into new graph formats; queries and aggregations pushed down to the databases and infrastructure where the data lives, delivering both aggregate and detailed views; a query experience based on a fusion of search, semantic reasoning and analytics; relationships mapped through observation, even when the links are not explicitly known; and varied visualization techniques for understanding the data, depending on its detail level, structure and context.

    Join GigaOm’s Andrew Brust, and special guest Giovanni Tummarello (Chief Product Officer and Co-founder) from Siren, for this free Webinar. You’ll discover how to bring that motley crew of feeds, files and tables euphemistically called a data lake into a discovered, navigable whole that lets you derive real knowledge and insight from data assets across your organization.

    Join us for this Webinar and learn how:

    Core analytics, search and graph technology can be used together
    Doing so greatly reduces the risk of analytics project failure
    Combining technologies can elegantly provide complementary perspectives, rather than force awkward context switches
    The Nirvana of data insight is achieved through technology synergy, rather than supremacy of a single approach
  • The Modern Data Warehouse – Enterprise Data Curation for the AI Future Recorded: May 20 2019 59 mins
    William McKnight, Kuber Sharma
    This free 1-hour webinar from GigaOm Research brings together experts in AI and data analytics, featuring GigaOm analyst William McKnight and a special guest from Microsoft. The discussion will focus on the promise AI holds for organizations of every industry and size, and on how to overcome today’s challenges of preparing the organization for AI and planning AI applications.

    The foundation for AI is data. You must have enough data to analyze to build models. Your data determines the depth of AI you can achieve -- for example, statistical modeling, machine learning, or deep learning -- and its accuracy. The increased availability of data is the single biggest contributor to the uptake in AI where it is thriving. Indeed, data’s highest use in the organization soon will be training algorithms. AI is providing a powerful foundation for impending competitive advantage and business disruption.

    In this 1-hour webinar, you will discover:

    • AI’s impending effect on the world
    • Data’s new highest use: training AI algorithms
    • Know & change behavior
    • Data collection
    • Corporate Skill Requirements

    You’ll learn how organizations need to be thinking about AI and the data for AI.
  • Integrating the Enterprise with a Streaming Data Approach Recorded: May 20 2019 54 mins
    William McKnight, Jon Bock
    Streaming and real-time data have high business value, but that value can decay rapidly if the data is not processed quickly. If the value is not realized within a certain window of time, it is lost and the decision or action that was needed never occurs. Streaming data - whether from sensors, devices, applications, or events - needs special attention because a sudden price change, a critical threshold met, a sensor reading changing rapidly, or a blip in a log file can all be of immense value, but only if the alert arrives in time.

    In this webinar, we will review the landscape of streaming data and message queueing technology, and introduce and demonstrate a method for an organization to assess and benchmark—for its own current and future uses and workloads—the technologies currently available. We will also reveal the results of our own execution of the OpenMessaging benchmark on workloads for two of the platforms: Apache Kafka and Apache Pulsar. (A brief produce-and-consume sketch follows the list below.)

    What Will Be Discussed:

    - The Evolution of Queuing, Messaging, and Streaming
    - Today’s Technology Landscape
    - Assessing Performance: The OpenMessaging Benchmark
    - Considerations for Your Evaluation
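    As a minimal illustration of the streaming model discussed above (assuming a Kafka broker at localhost:9092 and the kafka-python client; the topic name and alert threshold are hypothetical), the sketch below produces a few sensor readings and consumes them, raising an alert while the data is still timely.

    # Minimal sketch: producing and consuming streaming sensor readings with Apache Kafka.
    # Assumes a broker at localhost:9092 and kafka-python (pip install kafka-python).
    import json
    import time

    from kafka import KafkaConsumer, KafkaProducer

    producer = KafkaProducer(
        bootstrap_servers="localhost:9092",
        value_serializer=lambda v: json.dumps(v).encode("utf-8"),
    )

    # Publish a few readings; in practice these arrive continuously from devices or logs.
    for i in range(5):
        producer.send("sensor-readings", {"sensor_id": i, "value": 20.0 + i, "ts": time.time()})
    producer.flush()

    consumer = KafkaConsumer(
        "sensor-readings",
        bootstrap_servers="localhost:9092",
        auto_offset_reset="earliest",
        consumer_timeout_ms=5000,  # stop iterating if nothing new arrives for 5 seconds
        value_deserializer=lambda b: json.loads(b.decode("utf-8")),
    )

    # React to each message as it arrives -- the window in which streaming data still has value.
    for message in consumer:
        reading = message.value
        if reading["value"] > 22.0:  # illustrative alert threshold
            print(f"ALERT: sensor {reading['sensor_id']} reported {reading['value']}")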
  • Guide for Enterprises: Strategies and Options to Consider When Modernizing Data Recorded: May 17 2019 59 mins
    William McKnight, Ross Perez
    This free 1-hour GigaOm Research webinar will present the findings of a recently completed report on moving enterprise databases, written by GigaOm analyst William McKnight: "A Guide for Enterprises: Strategies and Options to Consider when Modernizing Data Architecture."

    Competitive advantage with data cannot be achieved without an intense focus on the many and growing technical bases that can be used to store, view and manage data. Data technology and data science have progressed along with the importance of data, and it is imperative to raise your company’s data foundation so that data can be cultivated as an asset.

    This talk will help an organization understand the value of modernizing the data architecture and how to frame a modernization effort that delivers analysis capabilities, diverse yet connected data and key performance measures.

    What Will Be Discussed:

    • What Does Modern Data Architecture Look Like
    • What to Evaluate to Decide That It Is Time to Modernize the Data Warehouse Database
    • Strategies for Modernizing the Data Warehouse Database

    Join GigaOm Research and our sponsor Snowflake Computing for “Modernizing Data Warehousing,” a free expert webinar.
  • Modern Data Engineering in the Cloud Recorded: May 16 2019 56 mins
    Andrew Brust, Brian Dirking, Mike Destein
    Data engineering, the discipline of integrating, conforming, and readying data for downstream analysis, has been with us for many years, but it has new relevance and criticality today. Data engineering has to support analytics and machine learning, maintain data quality, and ensure data privacy, security, and protection of sensitive data for compliance with GDPR and other regulatory frameworks.

    A great data engineering platform must support full-fledged and operationalized data pipelines, be cloud-capable, and run on modern, distributed data execution platforms like Apache Spark. Finally, a modern data engineering platform must support savvy business analysts and other “citizen data engineers” – in addition to the more technically oriented database engineers, operators, and administrators.

    That’s a long list of requirements, but it is readily attainable with today’s technology. To learn more, join speakers from GigaOm, Talend and Databricks for this free 1-hour webinar from GigaOm Research. The webinar features GigaOm analyst Andrew Brust, Mike Destein from our sponsor Talend, a company focused on data engineering and data management, and Brian Dirking from Databricks, focused on Apache Spark-based machine learning and data engineering.

    In this 1-hour webinar, you will discover:
    • How modern data engineering platforms and cloud-based data processing services can work hand-in-hand
    • Why data engineering platforms must serve coders, architects, and analysts
    • How to facilitate self-service analytics and meet your data quality, privacy, security, and protection needs
    Register now to join GigaOm Research, Talend and Databricks for this free expert webinar.
  • Meeting Demand for Capacity-driven Data with Object Storage Recorded: May 15 2019 65 mins
    Chris Evans, Scott Baker
    This 1-hour webinar will discuss how organizations can meet the demand for capacity-driven data with object storage.

    Today’s enterprise data requirements are clearly dividing into a need for latency-sensitive and capacity-driven solutions, as organizations store and exploit data from existing and, increasingly, machine-generated sources. This webinar looks at how enterprises meet the demand for capacity-driven data with object storage solutions from the major and upcoming solution vendors. During the webinar you will learn:

    • Factors driving the adoption of object storage
    • Critical features to look out for in object storage solutions
    • Analysis of vendor offerings available in the market today
    • GigaOm’s assessment of the market leaders and followers

    Join GigaOm Research and Hitachi Data Systems (HDS) for this free expert webinar.
  • Sector Roadmap for Cloud Analytic Databases: Selecting a Data Platform in 2017 Recorded: May 14 2019 58 mins
    William McKnight, Ben Book, Jon Bock, Melanie Marks
    This free one-hour webinar will present the findings of a recently completed Sector Roadmap for Cloud Analytic Databases.

    The cloud is proving immensely useful in providing elastic, measurable, on-demand, self-service resources to organizations. The uptake in 2016 has been phenomenal, continuing the biggest transformation that technology professionals will experience in their careers.

    Just about any software, including databases, can be placed in a public cloud these days by simply utilizing cloud resources as an extended data center. This may solve an immediate pressing problem, but the opportunities missed without true cloud integration are huge.

    Some relational databases have undergone significant cloud-related development in their latest releases. Those were the focus of this Sector Roadmap, along with the databases built native for the cloud.

    The methodology will be reviewed, along with the disruption vectors (criteria prominent in a cloud analytic database selection), and the key takeaways, all with a view to help the attendee select their cloud analytic database in 2017.

    Join GigaOm Research and our sponsor Snowflake Computing for “Sector Roadmap for Cloud Analytic Databases: Selecting a Data Platform in 2017”, a free expert webinar on March 9.

    What Will Be Discussed:
    - Data architecture for 2017
    - Data platform selection in 2017
    - Major criteria often overlooked in database selection
    - Why a tight integration with the cloud is imperative (“born in the cloud”)
    - Other key takeaways from the study
  • Making AI Work in Production, Not in Isolation Recorded: May 14 2019 63 mins
    Andrew Brust, Jon Richter
    There’s a ton of momentum around machine learning and AI today, but there are important logistics to be worked out. Despite fears and bold proclamations that AI will replace humans, its best application today is in serving people, to make them more productive. Today’s AI platforms need to support that use case, but how well do they do so?

    Next is the overwhelming fragmentation of AI tools and technologies. There is a range of machine learning and deep learning frameworks and libraries with which to build models. The result is that companies are getting distracted by these disparate technologies, diluting their focus on pragmatic adoption of AI. There is also a decision point around using trained models from the public cloud providers: which platforms should you be on, and is there any way to mix, match and compare them?

    Abstraction layers help here, not just across libraries or cloud-based cognitive services, but for using them in combination, and testing which is most effective. Plus, once that’s done, and the models are built and/or selected, there’s the issue of deploying them to, and using them in, production. What’s the best way to achieve that operationalization?

    There are a lot of questions here. Join us for this free 1-hour webinar from GigaOm Research to get to some of the answers. The Webinar features GigaOm analyst Andrew Brust and our special guest, Jon Richter from CognitiveScale, a company specializing in augmented intelligence.

    In this 1-hour webinar, you will discover:

    • What’s involved in building AI that makes all your people more productive
    • How to experiment with models from different libraries and cloud platforms, efficiently and efficaciously
    • Why production deployment and use of machine learning models is no mere detail – it’s the critical link in making AI work at scale, beyond the scope of mere proof-of-concept projects
    • How to maximize sharing and unification across programming languages, tools, and frameworks
  • Navigating the Value of Hybrid and Multi-Cloud Strategies in the Enterprise Recorded: May 13 2019 60 mins
    David Linthicum, Tim Crawford, Greg White
    Join Tim Crawford, Head of Research at GigaOm Research, GigaOm analyst David Linthicum, and Greg White from Nutanix product marketing for this free expert webinar.

    Cloud computing continues to gain adoption by enterprises as they leverage two cloud-based methodologies as part of their overall cloud strategy. The two methodologies are hybrid and multi-cloud. While some have confused the two as one and the same, they are very different in terms of their approach and ultimately, their value to the enterprise.

    How do these methodologies differ and how should enterprises think about each when considering their overall cloud-based strategy?

    In this webinar, we will delve into these two methodologies to understand how they differ, the challenges that enterprises are facing with each and how enterprises are gaining value from each approach.

    Join us for an insightful discussion on enterprise use of hybrid and multi-cloud.

    This 1-hour webinar will give attendees insights into:

    - How one should think about hybrid versus multi-cloud

    - Whether enterprises are using one approach, the other or both

    - What is driving enterprises to use hybrid cloud

    - With considerable discussion around cloud lock-in, is multi-cloud just a form of cloud arbitrage

    - What is driving enterprises to use multi-cloud

    - How enterprises are dealing with legacy applications versus cloud-native applications with each approach

    - Where automation and orchestration tools play a role in each approach for each classification of workload
  • The Importance of Telematics in an IoT World Recorded: May 13 2019 63 mins
    Tom Crawford, Gianfranco Giannella, Karin Riedel, Wolfram Jost
    This 1-hour webinar will discuss the importance of telematics in the insurance industry.

    The insurance industry is experiencing disruptive forces from several directions. At the same time, competition is heating up and creating pressure in ways not previously seen. Telematics provides an innovative way for insurance companies to change the value proposition for their business and customers.

    In this webinar, we discuss the disruption factors facing the industry and how the integration of telematics changes the landscape. Telematics is not new but has historically been difficult to leverage. Today, telematics is much easier to use and provides defining opportunities for the insurance industry.
  • DevOps Beyond the Controlled Environment Recorded: May 10 2019 59 mins
    Jon Collins, Nigel Kersten
    Nearly all of today’s enterprise organizations are doing something related to DevOps. All too frequently, however, best practices remain confined to a minority. It is no coincidence, therefore, that organizations are asking themselves how they can scale and broaden the use of DevOps.

    In this webinar, Jon Collins, principal analyst and DevOps lead at GigaOm, speaks to Nigel Kersten, VP of Engineering at Puppet, about the attendant set of challenges that face these organizations, and what should be in the kit bag of anyone looking to scale their DevOps initiatives.

    This 1-hour webinar will give attendees insights into:

    - How to move from automating your own services to services you don’t run and are not the consumer of?
    - What roles, characters, and archetypes are needed or will emerge, and how to best leverage them?
    - How to educate, involve, and inspire stakeholders at all levels, from the board to the front line?

    Register now to join GigaOm Research and Puppet for this free expert webinar.
  • Autonomous Analytics in the real world: Giving Your Data a Voice Recorded: May 10 2019 60 mins
    Ira Cohen, Sam Grossberg, Victor Em, William McKnight
    If your data could tell you what's most important, wouldn't you want to listen? Data should be screaming out what it is learning and what to do with it. Autonomous analytics does this by self-learning and self-directing to give you the most accurate and interesting insights that you didn't know existed.

    The use of analytics is not just to analyze the past; it is to optimize in real time and to predict the future. No matter where you are in your analytics and AI journey, you can take advantage of autonomous analytics for real-world use cases like anomaly detection, root cause investigation and prediction. Learn how to control your market with Autonomous Analytics in this informative webinar led by industry leader William McKnight. (A brief anomaly-detection sketch follows the questions below.)

    Also, Foursquare, the leading provider of location intelligence technology to hundreds of thousands of developers in eCommerce, gaming, social, and other online services, will share how they moved from a manual, reactive process to automated, proactive application services management with Autonomous Analytics. We will discuss their challenges, analytics journey, solutions, and customer benefits.

    Questions to Be Discussed:

    · What is Autonomous Analytics and what is the value of AI to the Enterprise?
    · How can AI be delivered to support real time and predictive enterprise business initiatives?
    · What are some real use cases and best practices to consider?
    · How Does Knowledge of Outliers Drive Business Results?
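    For a concrete picture of the anomaly-detection use case referenced above, here is a minimal sketch using a simple rolling z-score; it is a generic illustration only, not a description of Anodot’s method.

    # Minimal sketch: flagging anomalies in a metric stream with a rolling z-score.
    from collections import deque
    from statistics import mean, stdev

    def detect_anomalies(values, window=20, threshold=3.0):
        """Yield (index, value) pairs deviating from the recent window by more than `threshold` sigmas."""
        recent = deque(maxlen=window)
        for i, v in enumerate(values):
            if len(recent) >= window:
                mu, sigma = mean(recent), stdev(recent)
                if sigma > 0 and abs(v - mu) / sigma > threshold:
                    yield i, v
            recent.append(v)

    # Example: a mostly flat metric with one spike injected at position 40.
    series = [10.0 + 0.1 * (i % 5) for i in range(60)]
    series[40] = 25.0
    for idx, val in detect_anomalies(series):
        print(f"anomaly at t={idx}: value={val}")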
  • Simplifying Modern Work Management: Leveraging Artificial Intelligence Recorded: May 9 2019 54 mins
    Tim Crawford, Chris O'Neal, Kevin Ellington
    This free 1-hour webinar from GigaOm Research brings together leading minds in project management, featuring GigaOm Head of Research and CIO Tim Crawford, joined by Workfront thought leaders in a roundtable discussion on how to transform collaboration, resource, and work management.

    In today’s world, we are often overwhelmed with the volume of work, processes and projects. How do you ensure that the right focus is placed on the most critical areas? If you work in a regulated industry, are there processes that must be maintained to ensure compliance? Are there newer methods using artificial intelligence that can help accelerate insights?

    Whether it is project management or compliance requirements, ensuring what you focus on is critical to success. Effective collaboration and resource management are key components.

    This webinar dives into ways to tease apart the overwhelming complexity of processes facing enterprises today. We take a look at how enterprises are changing the way they work and the outcomes they are realizing as part of the changes.

    In this webinar, the speakers will discuss and uncover:

    What is ‘digital work’?
    What are some of the key workflows that hinder enterprises today?
    How do these challenges often show up?
    How is Artificial Intelligence changing the way that digital work is performed?
    How are these challenges evolving over time?
    Why is automation critical in today’s digital workflow?
    What are the challenges faced by regulatory and compliance requirements?
    What are strategies to address the ever-changing digital work?
    Who should attend:
  • Using Your Whole Data Lake: How the Operational Facilitates the Predictive Recorded: May 9 2019 64 mins
    Andrew Brust, Monte Zweben
    Many companies in the corporate world have attempted to set up their first data lake. Maybe they bought a Hadoop distribution, and perhaps they spent significant time, money and effort connecting their CRM, HR, ERP and marketing systems to it. And now that these companies have well-crafted, centralized data repositories, in many cases…they just sit there.

    But maybe data lakes fall into disuse because they’re not being looked at for what they are. Most companies see data lakes as auxiliary data warehouses. And, sure, you can use any number of query technologies against the data in your lake to gain business insights. But consider that data lakes can – and should – also serve as the foundation for operational, real-time corporate applications that embed AI and predictive analytics.

    These two uses of data lakes -- for (a) operational applications as well as for (b) insights and predictive analysis -- aren’t mutually exclusive, either. With the right architecture, one can dovetail gracefully into the other. But what database technologies can query and analyze, build machine learning models, and power microservices and applications directly on the data lake?

    Join us for this free 1-hour webinar from GigaOm Research. The Webinar features GigaOm analyst Andrew Brust, and Splice Machine CEO and Co-Founder, Monte Zweben. The discussion will explore how to leverage data lakes as the underpinning of application platforms, driving efficient operations, and predictive analytics that support real-time decisions.

    In this 1-hour webinar, you will discover:
    • Why data latency is the enemy and data currency is key to digital transformation success
    • Why operational database workloads, analytics and construction of predictive models should not be segregated activities
    • How operational databases can support continually trained predictive models
  • Are Paper Records the Low-hanging Fruit of GDPR? Recorded: May 8 2019 59 mins
    Jon Collins, Adam Grainger, Nick Reeve, Stephen O'Riodan
    Did you know that 50% of data breaches are paper-based?

    The GDPR date may have passed—but for many organizations, the journey toward compliance is only beginning. Much of the focus has been on what is commonly perceived as “customer data,” such as CRM databases and email lists. Yet GDPR isn’t prescriptive about the form of Personally Identifiable Information (PII), meaning the regulation also applies to unstructured documents, email, and even the contents of an organization’s filing cabinets. 

    In this webinar, we consider the importance of ensuring all forms of data are incorporated into the GDPR strategy. We’ll discuss how GDPR presents an opportunity to eliminate non-electronic (paper) records and how this transition to digital can be low-hanging fruit for managing documents more securely, effectively and cost-efficiently.

    Whether you’re still on the GDPR starting blocks or are strategizing about forward-thinking steps, join GigaOm’s GDPR lead analyst Jon Collins as he speaks with Adam Grainger, Director of Information Technology at Baker Tilly, and experts from Nitro, a leader in document productivity and workflows.

    This 1-hour webinar will give you insight into: 

    - The breadth of challenges GDPR presents — how it covers all managed information about people
    - The paper-based challenge — how too much information is locked in insecure, paper-based records
    - The GDPR opportunity — how you can use it to better set budget and customer-oriented priorities
    - The route to successful GDPR — the different stages on the journey and the potential for quick wins
    - Ultimately, how a paperless environment can be a more compliant environment
