Join us for this next segment of “Under the Hood,” focusing on the Database Designer feature of HPE Vertica.
Learn how the schema designs created by Database Designer provide optimal query performance for your most challenging analytic workloads. Database Designer uses smart strategies to create efficient schema designs that can be deployed, changed and re-deployed by almost anyone, even those without advanced database knowledge.
On Monday, 5 December, Euroclear Bank Settlement went live on the Taskize platform. Taskize is a new service that will help the financial services industry make work flow by enabling clients, colleagues and counterparties to address manual interventions efficiently, intelligently and securely.
In this 30-minute webinar, Luigi Bearzatto, Euroclear, joins Taskize Limited's John O'Hara and Philip Slavin to provide more detail on Taskize, its features, and the Euroclear-related offer that can be made available to you.
Earlier this year, the open source community delivered the Stinger Initiative to improve speed, scale, and SQL semantics in Apache Hive. Now Stinger.next is underway to build on those initial successes.
Join this 30-minute webinar with Hortonworks co-founder Alan Gates and Hortonworks Hive product manager Raj Baines to discuss SQL queries in HDP 2.2: ACID transactions and the cost-based optimizer. You will also hear about the road ahead for the Stinger.next initiative.
Owen O’Malley and Carter Shanklin host the second of our seven "Discover HDP 2.1" webinars. They discuss the Stinger Initiative and the improvements to Apache Hive that are included in HDP 2.1: faster queries with Hive on Tez, new SQL semantics, and more.
Join Rodney Landrum, Consulting Services Manager at Ntirety, a division of HOSTING, as he demonstrates his favorite new features of the latest Microsoft SQL Server 2016 Service Pack 1.
During the webinar, Rodney will touch on the following:
• A demo of his favorite new features in SQL Server 2016 and SP1 including:
o Query Store
o Database Cloning
o Dynamic Data Masking
o Create or Alter
• A review of Enterprise features that are now available in standard edition
• New information in Dynamic Management Views and the SQL Error Log that will make your DBA's job easier.
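As a rough illustration of one of the features above: Dynamic Data Masking hides sensitive column values from non-privileged users (in T-SQL it is configured with `MASKED WITH (FUNCTION = 'email()')` and similar masking functions). The Python sketch below approximates the behavior of the default email mask, which exposes only the first character and a fixed ".com" suffix; the function name is ours, not part of SQL Server:

```python
def mask_email(email: str) -> str:
    """Approximate SQL Server's default email() dynamic data mask:
    expose the first character of the address plus a constant
    'XXX@XXXX.com' suffix, hiding the rest of the value."""
    if not email:
        return email
    return email[0] + "XXX@XXXX.com"


# For example, "rodney@example.org" masks to "rXXX@XXXX.com".
```

Note this is only a conceptual sketch: in SQL Server the real masking happens in the engine at query time, so the underlying data is unchanged and privileged users still see the full values.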
IBM has taken query tuning to a new level with IBM Data Studio. More detail is available than ever before. However, the tool does take some getting used to, especially for folks who are used to a green-screen-based query tuning experience. This presentation introduces you to IBM Data Studio and gets you started tuning queries.
Part three in a five-part series, this webcast will demonstrate the integration of Hortonworks HDB and Apache Hadoop YARN. YARN provides global resource management for HDB for cluster-level hardware efficiency, while the in-database resource queues and operators provide database- and query-level resource management for workload prioritization and query optimization. This webinar will focus on demonstrating the installation process and discuss the various YARN and HDB parameters and best-practice settings.
Analysing big data quickly and efficiently requires a data warehouse optimised to handle and scale for large datasets. Amazon Redshift is a fast, petabyte-scale data warehouse that makes it simple to analyse big data at a fraction of the cost of traditional data warehouses. By following a few best practices, you can take advantage of Amazon Redshift’s columnar technology and parallel processing capabilities to minimise I/O and deliver high throughput and query performance. This webinar will cover techniques to load data efficiently, design optimal schemas, and tune query and database performance.
• Get an inside look at Amazon Redshift's columnar technology and parallel processing capabilities
• Learn how to migrate from existing data warehouses, optimise schemas and load data efficiently
• Learn best practices for managing workload, tuning your queries and using Amazon Redshift's interleaved sorting features
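To give a flavour of the schema-design and loading techniques the session covers, here is a small Python sketch that builds the two relevant Redshift statements: a `CREATE TABLE` with a distribution key and sort key, and a `COPY` that bulk-loads from Amazon S3 (the recommended loading path, since COPY ingests files in parallel across slices). Table, bucket, and IAM role names are illustrative placeholders, not from the webinar:

```python
def redshift_ddl(table, columns, distkey, sortkeys):
    """Build a CREATE TABLE statement with a distribution key (to
    co-locate joined rows on the same slice) and a compound sort key
    (to let the optimiser skip blocks and minimise I/O)."""
    cols = ",\n  ".join(f"{name} {ctype}" for name, ctype in columns)
    sort = ", ".join(sortkeys)
    return (f"CREATE TABLE {table} (\n  {cols}\n)\n"
            f"DISTKEY({distkey})\nSORTKEY({sort});")


def redshift_copy(table, s3_path, iam_role):
    """Build a COPY statement for parallel bulk loading from S3."""
    return (f"COPY {table}\nFROM '{s3_path}'\n"
            f"IAM_ROLE '{iam_role}'\nFORMAT AS CSV;")


# Hypothetical usage: an events table distributed on user_id and
# sorted on event_time, loaded from a CSV prefix in S3.
ddl = redshift_ddl(
    "events",
    [("user_id", "BIGINT"), ("event_time", "TIMESTAMP")],
    distkey="user_id",
    sortkeys=["event_time"],
)
load = redshift_copy(
    "events",
    "s3://my-bucket/events/",
    "arn:aws:iam::123456789012:role/RedshiftCopyRole",
)
```

Choosing a distribution key that matches your most common join column, and a sort key that matches your most common filter column, is the core of the "optimise schemas" advice above.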
Who Should Attend:
• Data Warehouse Developers, Big Data Architects, BI Managers and Data Engineers
Other sessions on the AWS Big Data Webinar Day - 28 September:
10:00 - 11:00 GMT | Getting Started with Big Data on AWS
Register » https://www.brighttalk.com/webcast/9019/221047?utm_campaign=Brighttalk
11:15 - 12:15 GMT | Architectural Patterns for Big Data on AWS
Register » https://www.brighttalk.com/webcast/9019/221063?utm_campaign=Brighttalk
12:30 - 13:30 GMT | Building Big Data Solutions with Amazon EMR and Amazon Redshift
Register » https://www.brighttalk.com/webcast/9019/221145?utm_campaign=Brighttalk
From game consoles to supercomputers and now the datacenter, GPUs are permeating more and more of the computing ecosystem. Boasting order-of-magnitude performance improvements on key tasks and major total-cost-of-ownership advantages, these once-specialized chips are writing a new chapter in enterprise computing. The MapD Manifesto talk will provide an overview of why GPUs are so well suited for analytics and visualization. Further, the talk will address the software optimizations MapD has employed to harness the parallel processing power of GPUs. Finally, it will discuss some of the potentially destructive analytical behaviors that result from having to wait minutes or hours for queries to complete. There will be live product demonstrations, and ample time will be left for Q&A.
We will show you how Kudu makes it easier for you to perform both real-time monitoring and ad hoc analytic queries on the same set of data.
Infectious Media runs on data. But, as an ad-tech company that records hundreds of thousands of web events per second, they have to deal with data at a scale not seen by most companies. You cannot make decisions with data when people need to write SQL by hand, only for queries to take 10-20 minutes to return. Infectious Media made the switch to Google BigQuery and Looker, and now every member of every team can get the data they need in seconds.
Infectious Media will share:
- Why they chose their current stack
- Why faster data means happier customers
- Advantages and practical implications of storing and processing that much data
Learn How to Store and Query Time Series Data in NoSQL and Other Use Cases
Zoomdata, developers of the world’s fastest big data exploration, visualization & analytics platform, lets business users see and interact with data in all new ways.
Designed mobile- and touch-first, its patented micro-query architecture delivers results on billions of records in seconds and gives users a single plane of access for bridging old data and new data.
Zoomdata is backed by Accel Partners, B7, Columbus Nova Technology Partners, NEA and Razors Edge Ventures.
Disk I/O for a virtualized SQL Server database takes a mysterious journey. In this session we will take the mystery out of virtual disk I/O. Using common infrastructure tools, we will explore the path of a disk I/O and share repeatable methods that can be used to quickly identify the cause of a performance bottleneck. Along the way we will explore SQL Server, the virtual machine, the hypervisor, virtual switches, physical switches, and the storage. We will also show the power of application analytics, using the SentryOne solution to analyze SQL Server and Tintri I/O performance, including the impact on the server from the virtual host all the way down to the individual SQL queries impacting performance.
What you will learn in this webinar:
-Learn how Big Data is not just about Hadoop, but the wide range of new and existing frameworks inside and outside your enterprise.
-Learn how Zoomdata can query across multiple data sources to bring a single view of data across disparate data sources.
-See how business users can combine multiple sources without waiting for a data architect to set it up.
-See how the power of Apache Spark enables Zoomdata Fusion at Big Data scale.
-Learn how to access Zoomdata Fusion and more cutting-edge features in the Zoomdata Early Access Program.
The Apache® Spark™ compute engine has gone viral – not only is it the most active Apache big data open source project, but it is also the fastest growing big data analytics workload, on and off Hadoop. The major reason behind Spark’s popularity with developers and enterprises is its flexibility to support a wide range of workloads including SQL query, machine learning, streaming, and graph analysis.
This webinar features Ovum analyst Tony Baer, who will explain the real-world benefits to practitioners and enterprises when they build a technology stack based on a unified approach with Apache Spark.
This webinar will cover:
Findings around the growth of Spark and diverse applications using machine learning and streaming.
The advantages of using Spark to unify all workloads, rather than stitching together many specialized engines like Presto, Storm, MapReduce, Pig, and others.
Use case examples that illustrate the flexibility of Spark in supporting various workloads.
Part two in a five-part series, this webcast will demonstrate the Pivotal Extension Framework (PXF), an extensible framework that allows Hortonworks HDB to query external system data. This is useful both for loading data and for querying data in place when it doesn’t need to reside within the database instance. PXF includes built-in connectors for accessing data in HDFS files, Hive tables via the catalog, and HBase tables.
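To sketch what querying external data through PXF looks like, the Python helper below builds a `CREATE EXTERNAL TABLE` statement of the general shape PXF uses, where the `LOCATION` clause points at a PXF URI with a connector profile. The host, port, path, and profile name here are illustrative assumptions; check your HDB release's PXF documentation for the exact URI format and available profiles:

```python
def pxf_external_table(table, columns, pxf_host, path, profile,
                       delimiter=","):
    """Build a CREATE EXTERNAL TABLE statement that reads external
    data through PXF: the database streams rows from the external
    system at query time instead of loading them first."""
    cols = ", ".join(f"{name} {ctype}" for name, ctype in columns)
    return (f"CREATE EXTERNAL TABLE {table} ({cols})\n"
            f"LOCATION ('pxf://{pxf_host}/{path}?PROFILE={profile}')\n"
            f"FORMAT 'TEXT' (DELIMITER '{delimiter}');")


# Hypothetical usage: expose delimited HDFS files as a queryable table.
ddl = pxf_external_table(
    "sales_ext",
    [("id", "INT"), ("amount", "FLOAT8")],
    pxf_host="namenode:51200",
    path="data/sales",
    profile="HdfsTextSimple",
)
```

Once such a table is defined, ordinary SQL against `sales_ext` reads the HDFS files directly, which is how PXF avoids loading data that does not need to live inside the database.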
To better serve you, our partners, we are pleased to announce the monthly Partnerworks Office Hours.
In each session we will cover a topic and leave plenty of time for discussion.
The first session is scheduled for 15:00 GMT (10:00am US/East):
What’s New in HDP 2.5, providing a brief overview including:
- Dynamic Security Policies with Apache Atlas & Ranger Integration
- Apache Zeppelin Notebook & Spark
- Real-Time Applications: Storm, HBase & Phoenix
- Streamlined Operations: Apache Ambari
- Interactive Query: Hive with LLAP (Technical Preview)
What if you could directly ask questions of your data and the software could respond with a selection, filter, or new visualization? In this DSC webinar, the Tableau Research team explains natural language queries and how they are (already) helping you visualize your data.
In this webinar, explore:
• What natural language processing (NLP) is
• Examples of how NLP is already used in Tableau's geocoding and map search
• A demonstration of some research prototypes that you might see in the not too distant future.
Come participate in the discussion and tell us what things you would say to or ask your visualization!
This webinar is designed to help decision makers and senior marketing individuals unlock the power of their employee workforce through social media advocacy. The session will cover the benefits, challenges and best practices of employee advocacy. By debunking the associated myths, we aim to give you a clear set of practical guidelines that will form the basis of your employee advocacy strategy. Sign up for the 45-minute webinar, where we will:
- optimise your current activities with helpful dos and don'ts
- show how social media reach can be significantly improved using impressions as a key metric
- provide advice on establishing and responding to new engagements
- provide a strategy to generate new social media leads
Join us for this interactive webinar that will allow attendees to participate in polls and a live Q&A session, where our presenter will answer your queries on-demand.
A brief introduction to the SS8 BreachDetect solution. Featuring a simple, visual UI, SS8 BreachDetect allows analysts to quickly view automated threat alerts, drill deeper into what triggered those alerts, and create custom queries and dashboards to investigate suspicious network activity and devices of interest.
Turbo-Charge BI on Hadoop: The Time is Now
Want to turn your Hadoop cluster into a super-powerful, analytics data warehouse? Need to run BI queries on Hadoop at top speed?
Watch this recording of a live best practice session. You'll see how leading companies are super-charging their BI on Hadoop by combining the power of Tableau with the scale of Impala, and accelerating it all with AtScale. In this session, leaders from Cloudera and Tableau share a real-world perspective on how to:
- Get super-fast performance from BI queries on Hadoop
- Deliver powerful self-service visualization directly on Hadoop
- Leverage existing BI and Hadoop investments to deliver more value to more users
The world of commercial banking moves swiftly. B2B clients have complex needs and offer great opportunity for banks that can move fast, resolve queries quickly and provide a premium service. If relationship managers aren’t anticipating and responding to their clients’ every need, then business can easily be taken elsewhere. However, with hundreds of clients to manage at once, it is often impossible to keep them all happy.
For one of the largest commercial banks in the UK, Tableau provided the perfect solution: client dashboards that help relationship managers, product partners, and service and operational staff easily access and act on client feedback, review product opportunities, and keep up to date with client industry news.
Creating a threat intelligence strategy is essential for a company to identify and prioritize threats effectively. Curating the necessary relevant data for this strategy, however, can be incredibly time consuming and resource intensive.
In this webinar, Greg Reith, Threat Intelligence Analyst at T-Mobile, will discuss how to use real-time threat intelligence from Recorded Future to create a forward-looking strategy, including:
• Identifying and analyzing hard-to-find threat data from the entire web including content in multiple languages.
• Gaining relevant intelligence effectively from large volumes of threat data with smart automation, alerts, and queries.
• Discovering trends and patterns that are useful in developing a forward-looking shift in strategy from multiple perspectives.
Find out how you can cut the time needed to collect the necessary information for building an effective threat intelligence strategy by more than a factor of four.