    • 5 Traps to Avoid and 5 Ways to Succeed with Big Data Analytics
      Hal Lavender, Chief Architect, Cognizant; Thomas Dinsmore, BI & Big Data Expert; Josh Klahr, VP Product Management, AtScale. Recorded: Dec 20 2017, 6:00 pm UTC, 59 mins
    • When it comes to Big Data Analytics, do you know if you are on the right track to succeed in 2017?

      Is Hadoop where you should place your bet? Is Big Data in the Cloud a viable choice? Can you leverage your traditional Big Data investment, and dip your toe in modern Data Lakes too? How are peer and competitor enterprises thinking about BI on Big Data?

      Come learn the 5 traps to avoid and 5 best practices to adopt that leading enterprises use in their Big Data strategies to drive real, measurable business value.

      In this session you’ll hear from Hal Lavender, Chief Architect of Cognizant Technologies, Thomas Dinsmore, Big Data Analytics expert and author of ‘Disruptive Analytics: Charting Your Strategy for Next-Generation Business Analytics’, along with Josh Klahr, VP of Product, as they share real-world approaches and achievements from innovative enterprises across the globe.

      Join this session to learn…

      - Why leading enterprises are choosing Cloud for Big Data in 2017
      - How 75% of enterprises plan to drive value from their Big Data
      - How you can deliver business user access along with security and governance controls

    • Data Preparation Done Right
      Davinder Mundy, Specialist Big Data Technologies, Informatica. Recorded: May 9 2018, 9:00 am UTC, 45 mins
How do you avoid your enterprise data lake turning into a so-called data swamp? The explosion of structured, unstructured and streaming data can be overwhelming for data lake users, and make the lake unmanageable for IT. Without scalable, repeatable, and intelligent mechanisms for cataloguing and curating data, the advantages of data lakes diminish. The key to solving the problem of data swamps is Informatica’s metadata-driven approach, which leverages intelligent methods to automatically discover, profile and infer relationships about data assets, enabling business analysts and citizen integrators to quickly find, understand and prepare the data they are looking for.
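
      The abstract doesn't describe Informatica's algorithms, but the general idea of profiling columns and inferring relationships can be sketched in a few lines. This is an illustrative toy, not the vendor's implementation; the datasets and column names are hypothetical.

```python
# A minimal sketch (NOT Informatica's implementation) of profiling two
# datasets and inferring a candidate join relationship by value overlap.
def profile(rows, column):
    """Collect distinct values and a null count for one column."""
    values = {r[column] for r in rows if r.get(column) is not None}
    nulls = sum(1 for r in rows if r.get(column) is None)
    return {"distinct": values, "nulls": nulls}

def infer_join_candidates(left, right, threshold=0.8):
    """Flag column pairs whose value sets overlap enough to suggest a key."""
    candidates = []
    for lcol, lprof in left.items():
        for rcol, rprof in right.items():
            if not lprof["distinct"] or not rprof["distinct"]:
                continue
            overlap = len(lprof["distinct"] & rprof["distinct"])
            score = overlap / min(len(lprof["distinct"]), len(rprof["distinct"]))
            if score >= threshold:
                candidates.append((lcol, rcol, round(score, 2)))
    return candidates

orders = [{"cust_id": 1, "total": 9.5}, {"cust_id": 2, "total": 3.0}]
customers = [{"id": 1, "name": "Ada"}, {"id": 2, "name": "Grace"}]
left = {c: profile(orders, c) for c in ("cust_id", "total")}
right = {c: profile(customers, c) for c in ("id", "name")}
print(infer_join_candidates(left, right))  # cust_id <-> id is flagged
```

      A real catalog would profile at scale and weigh data types, name similarity and cardinality as well, but the overlap heuristic conveys how relationships can be inferred rather than hand-declared.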

    • Logtrust Real-time Big Data Analytics
      Logtrust Big Data Analytics. Recorded: Jul 7 2017, 3:30 pm UTC, 4 mins
    • No Code, Low Code Big Data Analytics from Simple Search to Complex Event Processing.

      Logtrust is designed for fast data exploration and interaction with real-time visualizations on complex data streams and historical data at rest such as:

      - Machine behavior during attacks
      - Network traffic flow analytics
      - Firewall events
      - Application performance metrics
      - Real-time threat hunting and cyber security
      - IoT analytics

      Explore petabytes of data with Logtrust without worrying about storage costs or indexers, analyze billions of events per day with ultra-low-latency queries, and experience unique real-time performance on trillions of events, with over 150,000 ingest EPS per core, 1,000,000 search EPS per core, and 65,000 complex event processing EPS per core.
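
      Taking the quoted per-core figures at face value (they are the vendor's numbers, not independently verified), a quick back-of-envelope calculation shows how they translate into daily volumes:

```python
# Back-of-envelope check of the quoted per-core throughput figures:
# events per day for one core, and cores needed for a target volume.
SECONDS_PER_DAY = 24 * 60 * 60  # 86,400

ingest_eps_per_core = 150_000
daily_ingest_per_core = ingest_eps_per_core * SECONDS_PER_DAY
print(f"{daily_ingest_per_core:,}")  # 12,960,000,000 -> ~13 billion events/day/core

# Cores needed to sustain a hypothetical 1 trillion events/day:
target_daily_events = 1_000_000_000_000
cores = -(-target_daily_events // daily_ingest_per_core)  # ceiling division
print(cores)  # 78
```

      In other words, at the claimed ingest rate a single core handles on the order of 13 billion events per day, which is consistent with the "billions of events per day" framing above.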

      Live Data Exploration
      Logtrust data is always fresh with real-time data updates in their native formats. Slice and dice subsets of data at any point in time for exploration and deep forensics on real-time data streams.

      Powerful Data Exploration & Analytics
      Accelerate time-to-insight and build rich visualizations with simple point-and-click. Empower your team to quickly harness insights and make faster, smarter decisions. Optionally, use a single compact, expressive SQL language (LINQ) and create reusable, callable queries for more complex event processing operations.
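
      The abstract doesn't show Logtrust's LINQ syntax, but the underlying idea of reusable, callable queries is language-agnostic and can be sketched in plain Python: small query steps compose into named pipelines that can be re-run on any stream.

```python
# The reusable-query idea in plain Python (this is NOT Logtrust's LINQ
# syntax): small query functions compose into named, callable pipelines.
def where(pred):
    return lambda events: (e for e in events if pred(e))

def select(fields):
    return lambda events: ({f: e[f] for f in fields} for e in events)

def pipeline(*steps):
    def run(events):
        for step in steps:
            events = step(events)
        return list(events)
    return run

# A named, reusable query: failed logins projected to two fields.
failed_logins = pipeline(
    where(lambda e: e["action"] == "login" and not e["ok"]),
    select(["user", "ts"]),
)

events = [
    {"user": "ada", "ts": 1, "action": "login", "ok": False},
    {"user": "bob", "ts": 2, "action": "login", "ok": True},
]
print(failed_logins(events))  # [{'user': 'ada', 'ts': 1}]
```

      Because `failed_logins` is just a value, it can be reused across dashboards or embedded as one stage of a larger event-processing pipeline.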

    • The Data Lake for Agile Ingest, Discovery, & Analytics in Big Data Environments
      Kirk Borne, Principal Data Scientist, Booz Allen Hamilton. Recorded: Mar 27 2018, 9:00 pm UTC, 58 mins
As data analytics becomes more embedded within organizations as an enterprise business practice, the methods and principles of agile processes must also be employed.

      Agile includes DataOps, which refers to the tight coupling of data science model-building and model deployment. Agile can also refer to the rapid integration of new data sets into your big data environment for "zero-day" discovery, insights, and actionable intelligence.

      The Data Lake is an advantageous approach to implementing an agile data environment, primarily because of its focus on "schema-on-read", thereby skipping the laborious, time-consuming, and fragile process of database modeling, refactoring, and re-indexing every time a new data set is ingested.
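
      Schema-on-read can be made concrete with a small sketch (illustrative only, not any specific product): raw records land untouched, and each consumer applies its own schema at read time, so a newly ingested field requires no remodeling or re-indexing.

```python
# Schema-on-read in miniature: raw JSON lines land as-is; a consumer
# projects and coerces only the fields it cares about when it reads.
import json

raw_landing = [
    '{"ts": "2018-03-27", "user": "ada", "ms": "912"}',
    '{"ts": "2018-03-28", "user": "bob", "ms": "107", "region": "eu"}',  # new field, no remodel
]

def read_with_schema(lines, schema):
    """Apply this consumer's schema at read time, ignoring unknown fields."""
    for line in lines:
        rec = json.loads(line)
        yield {name: cast(rec[name]) for name, cast in schema.items() if name in rec}

latency_view = list(read_with_schema(raw_landing, {"user": str, "ms": int}))
print(latency_view)  # [{'user': 'ada', 'ms': 912}, {'user': 'bob', 'ms': 107}]
```

      The second record carries a field the first lacks, yet ingesting it required no change anywhere: only consumers who want `region` would add it to their schema.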

      Another huge advantage of the data lake approach is the ability to annotate data sets and data granules with intelligent, searchable, reusable, flexible, user-generated, semantic, and contextual metatags. This tag layer makes your data "smart" -- and that makes your agile big data environment smart also!
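
      One way the tag layer described above can work (an illustrative sketch; the asset paths and tags are hypothetical) is a reverse index from user-generated semantic tags to data assets, so any combination of tags is searchable:

```python
# A toy tag layer: semantic tags attached to data assets, with a
# reverse index so any tag combination is searchable.
from collections import defaultdict

tag_index = defaultdict(set)

def tag(asset, *tags):
    """Attach user-generated, reusable tags to a data asset."""
    for t in tags:
        tag_index[t].add(asset)

def search(*tags):
    """Assets carrying ALL of the given tags."""
    sets = [tag_index[t] for t in tags]
    return set.intersection(*sets) if sets else set()

tag("s3://lake/clickstream/2018-03/", "clickstream", "pii", "raw")
tag("s3://lake/sensors/plant-7/", "iot", "raw")
tag("s3://lake/clickstream/curated/", "clickstream", "curated")

print(search("clickstream", "raw"))  # {'s3://lake/clickstream/2018-03/'}
```

      Because tags are searchable metadata rather than fixed schema, new contextual tags (e.g. a project name or a sensitivity label) can be layered on at any time without touching the underlying data.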

    • What's Ahead in Big Data and Analytics
      Paul Nelson, Leena Joshi, and Balaji Mohanam. Recorded: Dec 12 2017, 11:20 pm UTC, 61 mins
    • We have come a long way since the term "Big Data" swept the business world off its feet as the next frontier for innovation, competition and productivity. Hadoop, NoSQL and Spark have become members of the enterprise IT landscape, data lakes have evolved as a real strategy and migration to the cloud has accelerated across service and deployment models.

      On the road ahead, the demand for real-time analytics will continue to skyrocket alongside growth in IoT, machine learning, and cognitive applications. Meeting the speed and scalability requirements of these types of workloads requires more flexible and efficient data management processes – both on-premises and in the cloud. Flexible deployment and integration options will become a must-have for projects.

      Finally, the need for data governance and security is intensifying as businesses adopt new approaches to expand their data storage and access via data lakes and self-service analytics programs. As data, along with its sources and users, continues to proliferate, so do the risks and responsibilities of ensuring its quality and protection.

      Join us to watch the replay of "What's Ahead in Big Data and Analytics" to get real direction and practical advice on the challenges and opportunities to tackle in 2018.

    • 5 Ways to Fuel Your Big Data Analytics in 2018
      John Morrell, Senior Director Product Marketing, Datameer. Recorded: Dec 14 2017, 7:00 pm UTC, 49 mins
The big data analytics market has undergone continuous transformation since its inception, continuing in 2017 with new innovations and a strong move to the cloud. But from the customer's point of view, the world should be getting simpler, not more complex, and customers expect products to make deployments faster and easier.

      Instead of complex, “piece together your own architecture” approaches, 2018 will be a year in which customers can really focus on what’s important – the data and analytics – and not the underlying technologies that support them, whether on-premises, in the cloud, or hybrid.

      In this session, John will explore five ways in which modern big data platforms will enable you to:

      -Accelerate your big data initiatives
      -Get more value from your data lakes
      -Drive faster, more innovative analytics

    • Powering Real-Time Big Data Analytics with a Next-Gen GPU Database
      Matt Aslett, Research Director, Data Platforms & Analytics, 451 Research; Dipti Borkar, VP Product Marketing, Kinetica. Recorded: Nov 1 2017, 5:00 pm UTC, 52 mins
Freed from the constraints of storage, network and memory, many big data analytics systems now routinely reveal themselves to be compute bound. To compensate, big data analytic systems often sprawl horizontally (300-node Spark or NoSQL clusters are not unusual!) to bring in enough compute for the task at hand, and high system complexity and crushing operational costs often result. As the world shifts from physical to virtual assets and methods of engagement, there is an increasing need for systems of intelligence to live alongside the more traditional systems of record and systems of analysis. New approaches to data processing are required to support the real-time processing that drives these systems of intelligence.

      Join 451 Research and Kinetica to learn:
      • The business and technical trends driving widespread interest in real-time analytics
      • Why systems of analysis need to be transformed and augmented with systems of intelligence that bring new approaches to data processing
      • How a new class of solution – a GPU-accelerated, scale-out, in-memory database – can bring you orders of magnitude more compute power, a significantly smaller hardware footprint, and unrivaled analytic capabilities
      • How other companies in a variety of industries, such as financial services, entertainment, pharmaceuticals, and oil and gas, benefit from augmenting their legacy systems with a modern analytics database

    • Building a multi-purpose Data Lake for Increased Business Agility
      Alba Fernández-Arias, Sales Engineering, Denodo. Recorded: Mar 27 2018, 9:00 am UTC, 39 mins
The data contained in the data lake is too valuable to restrict its use to just data scientists. The investment in a data lake becomes more worthwhile if the target audience can be enlarged without hindering the original users. However, this is not the case today: most data lakes are single-purpose. Also, the physical nature of data lakes has potential disadvantages and limitations that weaken the benefits and can even kill a data lake project entirely.

      A multi-purpose data lake allows broader and greater use of the data lake investment without diminishing its potential value for data science or making it a less flexible environment. Multi-purpose data lakes are data delivery environments architected to support a broad range of users, from traditional self-service BI users to sophisticated data scientists.

      Attend this session to learn:

      * The challenges of a physical data lake
      * How to create an architecture that makes a physical data lake more flexible
      * How to drive the adoption of the data lake by a larger audience

    • Cloud Data Synergy, from Data Lakes and Data Warehouse to ML and AI
      GigaOm, Qubole & Snowflake. Recorded: Mar 19 2018, 10:35 pm UTC, 63 mins
    • This 1-hour webinar from GigaOm Research brings together leading minds in cloud data analytics, featuring GigaOm analyst Andrew Brust, joined by guests from cloud big data platform pioneer Qubole and cloud data warehouse juggernaut Snowflake Computing. The roundtable discussion will focus on enabling Enterprise ML and AI by bringing together data from different platforms, with efficiency and common sense.

      In this 1-hour webinar, you will discover:

      - How the elasticity and storage economics of the cloud have made AI, ML and data analytics on high-volume data feasible, using a variety of technologies
      - Why the key to success in this new world of analytics is integrating platforms, so they can work together and share data
      - How this enables building accurate, business-critical machine learning models and produces the data-driven insights that customers need and the industry has promised
      - How to make the lake, the warehouse, ML and AI technologies and the cloud work together, technically and strategically
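
      The integration point the panel discusses can be sketched in miniature: event data resident in the lake joined with a curated dimension from the warehouse to produce one training table. The datasets, keys and columns below are hypothetical, and real pipelines would use the platforms' own connectors rather than in-memory dicts.

```python
# A toy join of lake-resident events with a warehouse dimension into a
# single training table (all names here are hypothetical).
lake_events = [  # e.g. raw clickstream parked in cloud object storage
    {"cust": 1, "clicks": 40},
    {"cust": 2, "clicks": 5},
]
warehouse_dim = {  # e.g. curated customer dimension in the warehouse
    1: {"segment": "pro", "churned": 0},
    2: {"segment": "free", "churned": 1},
}

# Enrich each event with its warehouse attributes, keyed on customer id.
training = [
    {**e, **warehouse_dim[e["cust"]]}
    for e in lake_events
    if e["cust"] in warehouse_dim
]
print(training)
```

      The point is that once both platforms expose data over a shared key, building the labeled table an ML model needs is a plain join, which is what makes integrating the platforms the "key to success" rather than any single engine.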

      Register now to join GigaOm Research, Qubole and Snowflake for this free expert webinar.

    • How to Bring BI into the Big Data World
      Claudia Imhoff, renowned analyst & Founder, Boulder BI Brain Trust; Ajay Anand, VP Products & Marketing, Kyvos Insights. Recorded: May 25 2018, 5:00 pm UTC, 63 mins
    • Business intelligence (BI) has been at the forefront of business decision-making for more than two decades. Then along came Big Data and it was thought that traditional BI technologies could never handle the volumes and performance issues associated with this unusual source of data.

      So what do you do? Cast aside this critical form of analysis? Hardly a good answer. The better answer is to look for BI technologies that can keep up with Big Data, provide the same level of performance regardless of the volume or velocity of the data being analyzed, yet give the BI-savvy business users the familiar interface and multi-dimensionality they have come to know and love.

      This webinar will present the findings from a recent survey of Big Data and the challenges and value many organizations have received from their implementations. In addition, the survey will supply a fascinating look into what Big Data technologies are most commonly used, the types of workloads supported, the most important capabilities for these platforms, the value and operational insights derived from the analytics performed in the environment, and the common use cases.

      Attendees will also learn about a new BI technology built to handle Big Data queries with superior levels of scalability, performance and support for concurrent users. BI on Big Data platforms enables organizations to provide self-service, interactive analytics on big data for all of their users across the enterprise.
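
      The abstract doesn't name the vendor's technique, but a common approach behind interactive BI on big data is pre-aggregation: detail rows are rolled up once, by dimension, into a compact cube that dashboard queries scan instead of the raw data. A generic sketch:

```python
# Pre-aggregation in miniature (a generic sketch, not Kyvos's method):
# roll up detail rows into a small cube keyed by dimensions, so
# interactive queries touch the cube instead of the raw data.
from collections import defaultdict

detail = [  # billions of rows in practice; four here for illustration
    {"region": "EMEA", "month": "2018-05", "sales": 10.0},
    {"region": "EMEA", "month": "2018-05", "sales": 5.0},
    {"region": "APAC", "month": "2018-05", "sales": 7.0},
    {"region": "EMEA", "month": "2018-04", "sales": 2.0},
]

cube = defaultdict(float)
for row in detail:  # one pass at build time
    cube[(row["region"], row["month"])] += row["sales"]

# A dashboard query is now a lookup, independent of detail volume.
print(cube[("EMEA", "2018-05")])  # 15.0
```

      Because the cube's size depends on dimension cardinality rather than row count, query latency stays flat as the underlying data grows, which is how such platforms keep performance steady regardless of volume.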

      Yes, now you CAN have BI on Big Data platforms!
