    • Philips Wellcentive Cuts Hadoop Troubleshooting from Months to Hours
      Geovanie Marquez, Hadoop Architect at Philips Wellcentive | Recorded: Dec 6 2016 7:00 pm UTC | 48 mins
    • Philips Wellcentive, a SaaS health management and data analytics company, relies on a nightly MapReduce job to process and analyze data for its entire patient population, from birth to the current day. The job assesses a number of different characteristics and powers the analytics that physician organizations need to deliver better services. When this job began to fail repeatedly, the Hadoop team spent months trying to identify the root cause using existing monitoring tools, but was unable to come up with an explanation for the job failures and slowdowns.

      Join our webinar to hear more about why existing Hadoop monitoring tools were insufficient to diagnose the root cause of Philips Wellcentive’s problems and how Pepperdata helped them to significantly improve their Big Data operations. The webinar will cover the different approaches that Philips Wellcentive took to rectify their missed SLAs, and how Pepperdata ultimately helped them quickly troubleshoot their performance problems and ensure their jobs complete on time.

    • Simplifying Hadoop Big Data Solutions
      Armando Acosta, Hadoop and Big Data Subject Matter Expert, Dell | Recorded: Jan 22 2015 6:00 pm UTC | 44 mins
    • From Getting Started with Hadoop to Advanced Solutions

      Big data has become the next frontier for innovation, competition and productivity. To capitalize on the opportunity, your organization needs to find the solution that allows you to collect, manage, store and analyze data so you can use that data to your advantage.

      Because identifying, procuring and integrating enterprise IT can be a complex endeavor, hear how your organization can simplify the process no matter where you are in your big data implementation:

      - Get started with a big data Hadoop solution
      - Scale an existing big data solution
      - Upgrade to interactive analytics

      Join us to learn how you can implement cost-effective solutions for collecting, managing and analyzing data in order to turn big data into valuable business insights.

    • Getting Started with Hadoop
      Armando Acosta, Hadoop Product Manager, Dell | Recorded: Dec 3 2014 5:00 pm UTC | 38 mins
    • Organizations continue to recognize the importance of data as the new currency and as a competitive differentiator. Data is being created and consumed at rates never before seen. This is not unique to a single industry; it is affecting all vertical markets. With organizations struggling to understand and adopt Big Data solutions, we want to offer an easy starting point to begin a complete Big Data solution.

      We simplify procurement and deployment by offering an all-in-one solution that bundles the hardware, software (including Cloudera Basic Edition) and services needed to deliver a Hadoop cluster, so organizations can start their Big Data Hadoop journey with a proof of concept and begin working with big data.

      - Introduces a fully-supported Hadoop proof of concept into organizations to allow users to begin to develop expertise
      - Incorporates full support from the experts to take the first steps with Hadoop
      - Includes one week of professional services to help get started
      - Ideal for proof of concept
      - Delivers an entry-level system that serves as a first step in a solution that scales to a larger production system

      This proof of concept solution features a Hadoop bundle built for fast deployment and ease of use, at an aggressive starting price point.

    • HDFS TDE: Native Encryption in Hadoop
      Alberto Romero, Senior Hadoop Technical Architect, Hortonworks | Recorded: Apr 22 2015 9:00 am UTC | 47 mins
    • HDFS Transparent Data Encryption was added in HDFS 2.6, and it finally provides a solution for encrypting data at a higher level than the OS while remaining native and transparent to Hadoop. It aims to cover the gap left by the privacy and security regulations that many industries face, without having to introduce a third-party solution into the mix. Encryption at the HDFS level gives an optimal context for defining policies relevant to the industry, while remaining transparent to the applications running on Hadoop.

      Join this webinar to learn:

      - Where HDFS Transparent Encryption sits within the Hadoop security framework
      - An introduction to the technical details, including how to create encryption keys and encryption zones
      - Interaction with the Hadoop Key Management Server (KMS) and the encryption/decryption data flow
      - Future work in Hadoop security in general, and encryption in particular
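      The key-and-zone workflow above can be sketched with the standard Hadoop CLI. The key name and path below are illustrative, and the commands assume a cluster where a Hadoop KMS is already configured as the key provider:

```shell
# Illustrative key name and path; assumes a running cluster with a
# configured Hadoop KMS (hadoop.security.key.provider.path).

# 1. Create an encryption key in the KMS.
hadoop key create patientDataKey

# 2. Create an empty directory and mark it as an encryption zone
#    protected by that key (run as the HDFS superuser).
hdfs dfs -mkdir /secure
hdfs crypto -createZone -keyName patientDataKey -path /secure

# 3. Verify the zone was created.
hdfs crypto -listZones

# Files written under /secure are now encrypted transparently;
# clients read and write them like any other HDFS files.
hdfs dfs -put records.csv /secure/
```

      Because the encryption and decryption happen in the HDFS client against keys served by the KMS, applications on the cluster need no code changes to work with data inside an encryption zone.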
