The Modern Data Architecture for Advanced Business Intelligence with Hortonworks

Join Hortonworks and MicroStrategy to:

- Discuss the modern architecture for Business Intelligence
- Learn how our joint solution helps enterprises store, process, and analyze vast amounts of structured and unstructured data
- Discover what new benefits Hadoop 2.0 offers and how the MicroStrategy Analytics platform leverages those new features
Recorded: Nov 7 2013 55 mins
Presented by
Anurag Tandon, Director of Product Marketing, MicroStrategy; John Kreisa, VP Strategic Marketing, Hortonworks

  • Using Advanced Analytics for Better Customer Experience Feb 14 2017 6:00 pm UTC 60 mins
    Hortonworks and Pivotal
    Innovative mobile operators need to mine the vast troves of unstructured data now available to them to help develop compelling customer experiences and uncover new revenue opportunities. In this webinar, you’ll learn how HDB’s in-database analytics enable advanced use cases in network operations, customer care, and marketing for better customer experience. Join us, and get started on your advanced analytics journey today!
  • Providing Modern Healthcare Using Hadoop and Data Virtualisation Recorded: Jan 17 2017 64 mins
    Mark Pritchard, Denodo and Sean Roberts, Hortonworks
    Vizient needed a unified view of their accounting and financial data marts so that business users could discover the information they need in a self-service manner and provide excellent service to their members. Vizient selected the Hortonworks Data Platform and the Denodo Data Virtualization Platform to unify their distributed data sets in a data lake, while providing an abstraction layer that gives end users easy, self-service access to information.

    During this webinar, you will learn:

    1) The role, use, and benefits of Hortonworks Data Platform in the Modern Data Architecture.
    2) How Hadoop and data virtualisation simplify data management and self-service data discovery.
    3) What data virtualisation is and how it can simplify big data projects, including best practices for using Hadoop with data virtualisation.

    About Vizient
    Vizient, Inc. is the largest nationwide network of community-owned health care systems and their physicians in the US. Vizient™ combines the strengths of VHA, University HealthSystem Consortium (UHC), Novation and MedAssets SCM and Sg2, trusted leaders focused on solving health care's most pressing challenges. Vizient delivers brilliant resources and powerful data-driven insights to healthcare organizations.
  • How to run Hortonworks Data Cloud on Amazon Web Services Recorded: Dec 15 2016 60 mins
    Sean Roberts, Partner Solutions Engineer, Hortonworks
    Hortonworks Data Cloud for Amazon Web Services is a new product offering from Hortonworks that is delivered and sold via the AWS Marketplace. It allows you to start analyzing and processing vast amounts of data quickly. Powered by the Hortonworks Data Platform, Hortonworks Data Cloud is an easy-to-use and cost-effective solution for handling big data use cases with Apache Hadoop, Hive, and Spark.

    Join us on Dec. 15, 2016 to learn more about the product and to see a live demo by Sean Roberts. You’ll see how to quickly deploy Apache Spark and Apache Hive clusters for processing and analyzing data in the cloud.
  • Partnerworks Office Hours: Dynamic Security & Data Governance in HDP 2.5 Recorded: Dec 13 2016 93 mins
    Srikanth Venkat, Senior Director of Security, and Sean Roberts, Partner Engineering EMEA, Hortonworks
    The integration of Apache Ranger and Apache Atlas is driving the creation of classification-based access security policies in Hadoop. The Atlas project is a data governance and metadata framework for the Hadoop platform that includes contributions from end users and vendors. The Ranger project provides central security policy administration, auditing, authorization, authentication, and data encryption for Hadoop ecosystem projects such as HDFS, Hive, HBase, Storm, Solr, Kafka, NiFi and YARN. When Atlas and Ranger are used together, system administrators can define flexible metadata tag-based security policies to protect data in real time.

    In this month’s Partnerworks Office Hours we will discuss the Security & Governance capabilities of HDP 2.5 with the integration of Apache Ranger & Apache Atlas. As usual, we will leave plenty of time for your questions & further discussion.

    Topics covered:

    - Classification-based Policies
    - Location-based Policies
    - Time-based Policies
    - Prohibition-based Policies
    - Hive Row Level Security & Dynamic Data Masking
    - Broader Atlas eco-system support
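    As a rough illustration of the classification-based policies covered above, a tag-based Ranger policy is ordinarily expressed as a JSON body submitted to the Ranger admin REST API. The sketch below is a hypothetical example: the service name, tag value, and group are illustrative assumptions, not details from the webinar.

    ```python
    import json

    # Hypothetical sketch of a classification-based (tag-based) Ranger policy.
    # It matches any resource that Atlas has classified with the "PII" tag and
    # denies Hive SELECT to members of the "public" group.
    policy = {
        "service": "tag_service",  # assumed name of the Ranger tag service
        "name": "deny-select-on-PII",
        "isEnabled": True,
        # Resource match is by Atlas classification (tag), not by path or table
        "resources": {"tag": {"values": ["PII"], "isExcludes": False}},
        "denyPolicyItems": [
            {
                "groups": ["public"],
                "accesses": [{"type": "hive:select", "isAllowed": True}],
            }
        ],
    }

    body = json.dumps(policy, indent=2)
    print(body)  # this JSON would be POSTed to the Ranger admin policy endpoint
    ```

    Because the policy references a tag rather than a specific table or path, any new data set that Atlas classifies as "PII" is protected immediately, with no policy change.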
  • How Universities are Using Data to Transform Education Recorded: Dec 8 2016 27 mins
    Hortonworks and Microsoft
    Student performance data is increasingly being captured as part of software-based and online classroom exercises and testing. This data can be augmented with behavioral data captured from sources such as social media, student-professor meeting notes, blogs, student surveys, and so forth to discover new insights to improve student learning. The results transcend traditional IT departments to focus on issues like retention, research, and the delivery of content and courses through new modalities.
    Hortonworks is partnering with Microsoft to show you how the Hortonworks Data Platform (HDP) running on the Microsoft stack enables you to develop a “single view of a student”. 
  • How to Get Started with Hortonworks Data Cloud for AWS Recorded: Dec 7 2016 57 mins
    Jeff Sposetti, Senior Director of Product Management, Cloud & Operations
    Hortonworks Data Cloud for Amazon Web Services is a new product offering from Hortonworks that is delivered and sold via the AWS Marketplace. It allows you to start analyzing and processing vast amounts of data quickly. Powered by the Hortonworks Data Platform, Hortonworks Data Cloud is an easy-to-use and cost-effective solution for handling big data use cases with Apache Hadoop, Hive, and Spark.

    Join us on Dec. 7, 2016 to learn more about the product and to see a live demo by Jeff Sposetti, Senior Director of Product Management. You’ll see how to quickly deploy Apache Spark and Apache Hive clusters for processing and analyzing data in the cloud.

    The first 50 attendees will get access to $100 cloud credit for AWS!
  • How to use Apache Zeppelin with Hortonworks HDB Recorded: Dec 6 2016 26 mins
    Hortonworks; Pivotal
    Part five in a five-part series, this webcast will be a demonstration of the integration of Apache Zeppelin and Pivotal HDB. Apache Zeppelin is a web-based notebook that enables interactive data analytics. You can make beautiful data-driven, interactive and collaborative documents with SQL, Scala and more. This webinar will demonstrate the configuration of the psql interpreter and the basic operations of Apache Zeppelin when used in conjunction with Hortonworks HDB.
  • The Path to a Modern Data Architecture in Financial Services Recorded: Dec 1 2016 60 mins
    Vamsi Chemitiganti, GM - Financial Services at Hortonworks and Lee Phillips, Product Marketing at Attivio
    Delivering Data-Driven Applications at the Speed of Business: Global Banking AML use case.

    Chief Data Officers in financial services have unique challenges: they need to establish an effective data ecosystem under strict governance and regulatory requirements. They need to build the data-driven applications that enable risk and compliance initiatives to run efficiently. In this webinar, we will discuss the case of a global banking leader and the anti-money laundering solution they built on the data lake. With a single platform to aggregate structured and unstructured information essential to determine and document AML case disposition, they reduced mean time for case resolution by 75%. They have a roadmap for building over 150 data-driven applications on the same search-based data discovery platform so they can mitigate risks and seize opportunities, at the speed of business.

    Hosted by Hortonworks and Attivio
  • How to Build IoT Intelligence from Geographically Distributed Sensor Data Recorded: Dec 1 2016 55 mins
    Hellmar Becker, Solutions Engineer, Hortonworks
    Apache MiNiFi is designed to make it practical to enable data collection from the second it is born, ideal for IoT scenarios where there are a large number of connected devices or a need for a smaller and more streamlined footprint than Apache NiFi. Join us as we share a use case and demo of Apache MiNiFi, and how it can enable edge data collection from Raspberry Pis attached to weatherproof antennas distributed all over the world.
  • Delivering a Flexible IT Infrastructure for Analytics on IBM Power Systems Recorded: Nov 30 2016 53 mins
    IBM and Hortonworks
    Customers are preparing themselves to analyze and manage an increasing quantity of structured and unstructured data. Business leaders introduce new analytical workloads faster than IT departments can handle. Legacy IT infrastructure needs to evolve to deliver operational improvements and cost containment, while increasing flexibility to meet future requirements. By providing HDP on IBM Power Systems, Hortonworks and IBM are giving customers more choice in selecting the architectural platform that is right for them. In this webinar, we’ll discuss some of the challenges of deploying big data platforms, and how solutions built with HDP on IBM Power Systems can offer tangible benefits and the flexibility to accommodate changing needs.
  • Key Considerations for IT Leaders to Optimize Data Architectures for IoT & Cloud Recorded: Nov 29 2016 61 mins
    Jennifer Riggins, Dave Russell (Hortonworks), Michael Bironneau (Open Energi), Alex Montgomery (Microsoft)
    Rapid data growth from a wide range of new data sources is significantly outpacing organizations’ abilities to manage data with existing systems. Today’s data architectures and IT budgets are straining under the pressure. In response, the center of gravity in the data architecture is shifting from structured transactional systems to cloud-based modern data architectures and applications, with Hadoop at its core.

    Join this live and on-demand video panel as they discuss how the landscape is changing and offer insights into how organizations are successfully navigating this shift to capture new business opportunities while driving cost out.
  • Real-Time Data Ingestion and Streaming Analytics with Hortonworks & Oracle Recorded: Nov 24 2016 56 mins
    Sean Roberts, Hortonworks; John Mullis, Oracle
    Organizations today are looking to exploit modern data architectures that combine the power and scale of big data Hadoop platforms with operational data from their transactional systems. In order to react to situations in an agile manner in real time, low-latency access to data is essential.

    Hortonworks and Oracle can provide comprehensive solutions that allow organisations to respond rapidly to data events.

    During this webinar we will cover:

    - How Oracle GoldenGate empowers organisations to capture, route, and deliver transactional data from Oracle and non-Oracle databases for ingestion in real time to HDP®.

    - How GoldenGate complements HDF by providing optimised delivery to Hadoop targets such as HDFS, Hive, HBase, NoSQL and Kafka, to support customers with their real-time big data analytics initiatives.

    - How Oracle GoldenGate enables use cases for analysis of data in motion

    Attend this webinar to learn how Hortonworks and Oracle can help you with your real-time big data analytics and streaming initiatives!
  • The Role of Big Data in Trade Surveillance & Market Compliance Recorded: Nov 22 2016 59 mins
    Paul Isherwood at Lloyds Banking Group; Vamsi K Chemitiganti at Hortonworks & Shant Hovsepian at Arcadia Data
    Today’s European financial markets hardly resemble the ones from 15 years ago. The high speed of electronic trading, the explosion in trading volumes, the diverse range of instrument classes and a proliferation of trading venues pose massive challenges. With all this complexity, market abuse patterns have also become more egregious. Banks are now shelling out millions of euros in fines for market abuse violations. In response to this complex world, European regulators have been hard at work.

    In this webinar, we will discuss how compliance teams are fighting back with Big Data and trying to stay out of regulatory hot water. Rapid response to suspect trades means compliance teams need to access and visualize trade patterns, real time and historic data, and be able to efficiently perform trade reconstruction at any point in time.

    Join Hortonworks and Arcadia Data for this live webinar on 22 November at 14:00 GMT, where we’ll cover the use case at a Top 25 Global Bank who now has deep forensic analysis of trade activity.
  • Are you Mature Enough to Handle this Big Data? Recorded: Nov 22 2016 57 mins
    Abhas Ricky, Head of Customer Innovation Strategy EMEA, Hortonworks
    The fourth industrial revolution is here, and competing to succeed in the 4.0 ‘digital’ world means making the right decisions based on data-driven pointers in order to successfully implement your strategy. As we work with the entire stack of Fortune 100 organizations, we often see companies, particularly those operating across complex lines of business or looking to break into new markets or asset classes, struggle to answer two questions:

    ‘How to innovate’ using the rich trove of data that they already have
    ‘Which systems and processes to renovate’ to get the best value for their investment

    The Hortonworks Big Data Maturity Scorecard fills this knowledge gap. It gives you a better understanding of the opportunities unique to your business, and helps you understand how the maturity of your organization enables or inhibits your ability to strategically pursue big-data-enabled business transformation programs aligned to your business goals. Join this webinar to learn more.
  • Partnerworks Office Hours: What's New in HDP 2.5 Recorded: Nov 17 2016 75 mins
    Sean Roberts, Partner Solutions Engineer, Hortonworks
    To better serve you, our partners, we are pleased to announce the monthly Partnerworks Office Hours.
    In each session we will cover a topic and leave plenty of time for discussion.

    This first session is scheduled for 15:00 GMT (10:00 am US Eastern):

    What’s New in HDP 2.5, providing a brief overview including:
    - Dynamic Security Policies with Apache Atlas & Ranger Integration
    - Apache Zeppelin Notebook & Spark
    - Real-Time Applications: Storm, HBase & Phoenix
    - Streamlined Operations: Apache Ambari
    - Interactive Query: Hive with LLAP (Technical Preview)
  • How to get started with Apache MADlib on Hortonworks HDB Recorded: Nov 16 2016 31 mins
    Hortonworks; Pivotal
    Part four in a five-part series, this webcast will be a demonstration of the installation of Apache MADlib (incubating) into Hortonworks HDB. MADlib is an open-source library for scalable in-database analytics that provides data-parallel implementations of mathematical, statistical and machine learning methods for structured and unstructured data. This webinar will demonstrate the installation procedures, as well as some basic machine learning algorithms to verify the install.
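    To give a flavour of what "in-database" means here, a MADlib model is trained entirely with SQL. The sketch below composes a call to MADlib's linregr_train() linear-regression trainer; the table and column names ("houses", "price", and so on) are hypothetical, not from the webinar.

    ```python
    # Sketch: composing a MADlib linear-regression training call such as might
    # be run against HDB. madlib.linregr_train takes a source table, an output
    # table, the dependent column, and an expression for the independent values.
    def linregr_train_sql(source_table, out_table, dependent, independents):
        """Build the SQL that trains a linear model inside the database."""
        # ARRAY[1, ...] prepends a constant term so the model fits an intercept
        indep_expr = "ARRAY[1, " + ", ".join(independents) + "]"
        return (
            f"SELECT madlib.linregr_train('{source_table}', '{out_table}', "
            f"'{dependent}', '{indep_expr}');"
        )

    sql = linregr_train_sql("houses", "houses_linregr", "price", ["tax", "bath", "size"])
    print(sql)
    ```

    The resulting statement runs inside the database, so the training data never leaves the cluster; the fitted coefficients land in the output table.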
  • How Verizon is Solving Big Data Problems with Interactive BI Recorded: Nov 15 2016 65 mins
    Hortonworks; Verizon; Kyvos
    With increasing data volumes and data sources, enterprises are outgrowing their traditional BI solutions and struggling to make use of the data collected on their new data platforms. Frequently, data engineers will resort to old habits of shifting data sets between repositories so that data can be analyzed using older methods. Benefits of attending:

    - Learn how to deal with the complexity of big data at rest and in motion
    - The differences between traditional OLAP and the more modern OLAP on Hadoop
    - How to put together a Hadoop architecture for self-service interactive BI
  • Scaling real time streaming architectures with HDF and Dell EMC Isilon Recorded: Nov 10 2016 41 mins
    Hortonworks; Dell EMC
    Streaming Analytics are the new normal. Customers are exploring use cases that have quickly transitioned from batch to near real time. Hortonworks Data Flow / Apache NiFi and Isilon provide a robust scalable architecture to enable real time streaming architectures. Explore our use cases and demo on how Hortonworks Data Flow and Isilon can empower your business for real time success.
  • How Johnson Controls Moved From Proof of Concept to a Global Big Data Solution Recorded: Nov 9 2016 56 mins
    Johnson Controls, Hortonworks and RCG
    Johnson Controls delivers best-in-class building technologies and energy storage. In their quest to continually improve operations, they implemented a modern data architecture based on Hadoop. They started with a small, successful proof of concept and recognized the need to make more of their data accessible to more teams. Johnson Controls was able to successfully integrate big data technology into more of their operations, ultimately supporting their goals to create better products, reduce waste, and increase profitability.

    Join this webinar where our guest speaker from Johnson Controls shares their journey from a small Hadoop POC to a global production-ready implementation that includes security and governance.

    You will hear:
    • What steps JCI took throughout the process
    • What framework and technologies they used
    • How they engaged executives for full corporate buy-in
    • Issues they encountered and solved
    • How JCI was able to provide a secure, reliable platform to consolidate global data
    • Their next steps to complete the global integration

    Speakers: Tim Derrico, Manager of Architecture and Strategy - Big Data, Johnson Controls
    Thomas Clarke, Managing Principal, RCG Global Services
    Eric Thorsen, VP Retail / Industry Solutions, Hortonworks

    Co-hosted by Hortonworks and RCG Global Services
  • Apache Kafka, Storm and NiFi: Better Together Recorded: Nov 8 2016 56 mins
    Bryan Bende, Haimo Liu, Hortonworks
    Apache NiFi, Storm and Kafka augment each other in modern enterprise architectures. NiFi provides a coding-free solution to get many different formats and protocols in and out of Kafka, and complements Kafka with full audit trails and interactive command and control. Storm complements NiFi with the capability to handle complex event processing.

    Join us to learn how Apache NiFi, Storm and Kafka can augment each other for creating a new dataplane connecting multiple systems within your enterprise with ease, speed and increased productivity.
Powering the future of data
Founded in 2011 by 24 engineers from the original Yahoo! Hadoop team, Hortonworks has amassed more Hadoop experience under one roof than any other organization in the world. Our team works every day to enhance the Hadoop core, creating new code to improve the open product we have stewarded since its inception. Simply put, we are your best choice to support you in your Hadoop journey.

  • Title: The Modern Data Architecture for Advanced Business Intelligence with Hortonworks
  • Live at: Nov 7 2013 11:55 pm
  • Presented by: Anurag Tandon, Director of Product Marketing, MicroStrategy; John Kreisa, VP Strategic Marketing, Hortonworks