NoSQL, Hadoop and MapReduce: Building a Modern Data Infrastructure that Works
In a whirlwind of big data tools like MapReduce, NoSQL, Hadoop, and their cousins and brothers, it’s difficult to understand the stack you need to make your data as useful as possible. How do you decide which tools to use, and once you do decide, how do you make the jump?
Join this roundtable led by big data infrastructure experts to:
*Understand the ingredients of a modern data infrastructure
*Learn how to assess your needs
*Make a blueprint for building a modern data architecture that works for you
Recorded Aug 21, 2013 · 60 mins
Heather Kreger, Gopal Indurkhya, Manav Gupta, Christine Ouyang from the Cloud Standards Customer Council
Using analytics reveals patterns, trends and associations in data that help an organization understand the behavior of the people and systems that drive its operation. Big data technology increases the amount and variety of data that can be processed by analytics, providing a foundation for visualizations and insights that can significantly improve business operations.
In this webinar, the Cloud Standards Customer Council will discuss how to support big data and analytics capabilities using cloud computing. The speakers will walk through a cloud reference architecture and cover the various considerations and best practices for building big data and analytics solutions in the cloud.
Financial data is both the most intimate and most powerful data we have about ourselves. It should not be kept in a silo but made openly available to third parties; this openness is a prerequisite for true innovation. However, security and data protection are crucial. Banks and third-party providers have to work together to provide the infrastructure required to innovate.
Join this webinar where we will discuss:
-The power of APIs -- how to integrate banking data and financial sources quickly and easily
-What developers need to know about banking APIs and how to foster new services in the FinTech space
-PSD2 Post-Brexit -- what now?
-Will traditional banks be replaced by FinTech banks one day?
-Which is the biggest challenge: market education, technical issues, or regulation?
The basics of data cleaning are remarkably simple, yet few take the time to get organized from the start.
If you want to get the most out of your data, you're going to need to treat it with respect. By getting prepared and following a few simple rules, your data cleaning processes can be simple, fast and effective.
The Practical Data Cleaning webinar is a thorough introduction to the basics of data cleaning and takes you through:
• Data Collection
• Data Cleaning
• Data Classification
• Data Integrity
• Working Smarter, Not Harder
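The steps above can be sketched in a few lines of code. The following is a minimal, purely illustrative Python example (the field names and validity rule are assumptions, not taken from the webinar) showing cleaning, a basic integrity check, and de-duplication on raw records:

```python
def clean_rows(rows):
    """Normalize, validate, and de-duplicate raw records: a toy sketch
    of the webinar's themes (cleaning, integrity, working smarter)."""
    seen = set()
    cleaned = []
    for row in rows:
        # Cleaning: strip stray whitespace and normalize case.
        name = row["name"].strip().title()
        email = row["email"].strip().lower()
        # Integrity: drop records that fail a basic sanity check.
        if "@" not in email:
            continue
        # De-duplication: key on the normalized email address.
        if email in seen:
            continue
        seen.add(email)
        cleaned.append({"name": name, "email": email})
    return cleaned

raw = [
    {"name": "  ada lovelace ", "email": "Ada@Example.com "},
    {"name": "Ada Lovelace",    "email": "ada@example.com"},   # duplicate
    {"name": "no-email",        "email": "not-an-address"},    # invalid
]
print(clean_rows(raw))
```

Normalizing *before* de-duplicating is the point: `Ada@Example.com` and `ada@example.com` only collapse into one record because the key is cleaned first.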
Harmeen Birk, Director; Hristiyan Nedkov, Business Analyst, Tableau
The world of commercial banking moves swiftly. B2B clients have complex needs and offer great opportunity for banks that can move fast, resolve queries quickly and provide a premium service. If relationship managers aren’t anticipating and responding to their clients’ every need, then business can easily be taken elsewhere. However, with hundreds of clients to manage at once, it is often impossible to keep them all happy.
For one of the largest commercial banks in the UK, Tableau provided the perfect solution to create client dashboards to help relationship managers, product partners, and service and operational staff to all easily access and take action on client feedback, review product opportunities and keep up to date with client industry news.
It’s the Big Bang of big data; or where it all begins. The foundational first step in data analysis is data preparation. With the potential to derail everything that follows it, it’s imperative that the right measures are taken well in advance of any discovery analysis.
Join these discussions to learn best practices in proper data profiling, modeling, transformations and normalization for all data sets.
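As a small illustration of two of the practices named above, profiling and normalization, here is a stdlib-only Python sketch (the column values are invented for the example):

```python
from statistics import mean, stdev

def profile(values):
    """Data profiling: summary statistics for a numeric column."""
    return {"min": min(values), "max": max(values),
            "mean": mean(values), "stdev": stdev(values)}

def z_normalize(values):
    """Normalization: rescale a column to zero mean and unit variance,
    so differently scaled columns become comparable downstream."""
    m, s = mean(values), stdev(values)
    return [(v - m) / s for v in values]

col = [10.0, 12.0, 11.0, 13.0, 9.0]
print(profile(col))
print(z_normalize(col))
```

Profiling first tells you whether normalization is even safe to apply (a zero standard deviation, for instance, would make the rescaling divide by zero).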
It's no secret that brands today are investing more money than ever in marketing technologies to gain better insight into customer preferences and behavior. We rely heavily on these systems to collect data and deliver the best experience for engaging and converting visitors. However, when this data isn’t shared between your vendors, you may be overlooking a big opportunity to maximize return on your investments.
This webinar will explore 3 ways to power intelligent marketing campaigns by leveraging real-time data you already have, as well as examine a more practical take on how to extract value from connecting data across your entire technology stack, allowing more timely, relevant, and meaningful interactions with the customer.
Ina Yulo (BrightTALK), Steve Tigar (Money Dashboard), Dan Scholey (Moneyhub)
Data visualisation is a discipline that uses graphs and charts to communicate large volumes of data in easily digestible formats.
When it comes to personal finance, data visualization has been used to create useful dashboards where users can keep track of their spending, income, and budgeting.
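The aggregation behind such a dashboard is simple. Here is a hedged Python sketch (the categories and amounts are invented) of the per-category spending totals a personal-finance dashboard would chart:

```python
from collections import defaultdict

def spending_by_category(transactions):
    """Roll transactions up into the per-category totals that a
    personal-finance dashboard would render as a bar or pie chart."""
    totals = defaultdict(float)
    for t in transactions:
        totals[t["category"]] += t["amount"]
    return dict(totals)

txns = [
    {"category": "groceries", "amount": 54.20},
    {"category": "transport", "amount": 12.00},
    {"category": "groceries", "amount": 23.80},
]
print(spending_by_category(txns))
```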
Join this session where we will discuss:
-Why is data visualization so useful when it comes to personal finance?
-What are the best data viz tools/apps for personal finance?
-What are customers missing from banks that personal finance fintechs are able to provide?
-What are the best practice tips for using dashboards and apps to improve personal finance?
-What are some common mistakes people make when managing their personal finances?
-What are some common misconceptions of data visualization?
Three-quarters of Americans believe that control over their personal data is very important, but only 9% believe they have this control. Up until now, data governance and protection have been a low priority for brands, but the long-term impact of a data breach can lead to a loss of consumer confidence – not to mention massive financial implications. How do you balance the opportunity to provide the best customer experience with the increasing responsibilities in data privacy and security?
In this webinar, we’ll discuss five industry best practices for building an effective data governance plan. From the vendors you choose to work with, to the policies and practices in place today, learn how to make sense of the current legal landscape and how Tealium’s solutions allow you to provide these safeguards to your customers.
Ronald van Loon, Director Business Development, Adversitement
Many companies nowadays run their business through multiple channels, so to get insight into customer behavior they may perceive a need to focus on creating an omni-channel view. The focus is often primarily on data collection, but using the data for visualization and analytics is just as important.
It will facilitate the use of BI tools by stakeholders to get the right insights. But are all tools suitable for all people? What are the best practices, and how should you organize your teams to get the best results?
In this webinar, Ronald van Loon will:
• Elaborate on the challenges
• Show how a new approach contributes to meeting them
• Discuss several case studies and their results
What’s the truth about using predictive analytics – the possibilities and reality; and dare you use it? Do you have the technical ability to implement it, and the tools to do something in response to the predictions?
In this webinar we’ll look at the full spectrum of technology and benefits, then tear it down into something we can actually use now, that’s not scary and delivers measurable value to you and your customers.
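In the spirit of "something we can actually use now, that's not scary", here is a minimal predictive model in plain Python: an ordinary least-squares trend fit with extrapolation. The sales figures are invented for illustration; real predictive analytics would use richer features and a proper library.

```python
def linear_forecast(ys, steps_ahead=1):
    """Fit y = a + b*x by least squares over equally spaced points
    and extrapolate steps_ahead points past the end of the series."""
    n = len(ys)
    xs = range(n)
    x_mean = sum(xs) / n
    y_mean = sum(ys) / n
    b = (sum((x - x_mean) * (y - y_mean) for x, y in zip(xs, ys))
         / sum((x - x_mean) ** 2 for x in xs))
    a = y_mean - b * x_mean
    return a + b * (n - 1 + steps_ahead)

monthly_sales = [100, 110, 120, 130]   # a perfectly linear trend
print(linear_forecast(monthly_sales))  # forecast for next month
```

The second question in the abstract, having the tools to *act* on a prediction, is the harder half: a forecast like this only delivers value once it is wired into a decision, such as adjusting next month's inventory.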
Gartner predicts that “analytics will be pervasive … for decisions and actions across the business.” Sounds like analytics nirvana with instant access for any analysis you want to do, in other words self-service BI. Is this dream or reality?
Join this webinar to find out how clouds like AWS or Azure are moving the industry close to this nirvana today through simple assembly of cloud services combined with the appropriate consumption model of these services.
We will demonstrate how easy it is to provision your high-end SAP HANA database right next to your BI analytics tier.
Maybe we are closer to this nirvana than you think?
The “Old” world of BI, with its IT centric solutions, OLAP based reporting, and limited ad-hoc querying, has a lot of shortcomings that inhibit self-service BI. Yet, with increasing data complexity has come a new age of BI that is focused on taking strides to provide faster, more data driven and integrated solutions to try and empower the business user.
Join Ani Manian, Head of Product Strategy at Sisense, as he explains the old and new trends in data analytics, and how you can make sure you benefit from a more business-centric world. You’ll learn how to set up meaningful KPIs, model data according to specific business needs, and work interactively with business users to prototype relevant reports.
Ina Yulo, Tom Berthon (Senior Product Owner, Growth team), Kathryn Birch (Customer Success Manager), BrightTALK
Data is everywhere, but unfortunately, most business users don't have the time to sort through a bunch of stats. They just want to see the bigger picture--so why not give it to them?
Self-service analytics has enabled business users to access and interpret data without needing any statistics background. When coupled with data visualization techniques, we've seen that not only are business users more encouraged to make more data-driven decisions, but they are also able to do this without needing help from their BI or IT teams.
Join this session where we will discuss:
-How to encourage people from all different teams to embrace self-service analytics
-Why data visualization plays a huge role in the success of a data-driven culture
-Top things to take note of when creating dashboards that can easily communicate and add value
-How to teach users to get the most from their data and generate actionable insights
David Burden - CEO, Daden Limited, an immersive learning and visualisation company
There are a number of exciting ways in which Virtual and Augmented Reality can be used to visualise data.
This presentation will look at three main approaches - the "virtual command centre", the "data space" and "annotated reality" - considering the pros and cons of each, how they can be used with the current generation of VR/AR headsets, and how such systems may develop in the future.
With today’s advancements in data discovery and data visualization, not everyone needs to have a degree in data analytics to navigate business-critical data on their own. However, creating and implementing self-service analytics is no easy task.
Join us to learn about one of the many tools available to help your organisation put the analytical power into the hands of the business experts on the ground.
Join this webinar where we will cover some of the features and ideas for ensuring data accuracy, which is vital for business users. We will also discuss the forthcoming EU privacy regulations and how to make sure your setup is ready for this shift (GDPR comes into force in 2018).
Finally, we will present some ideas on whether to use a ready-made off-the-shelf solution or invest in creating your own. It seems that the whole build-vs-rent discussion has reached a tipping point this year, and there may be some surprising answers.
Shreyas Shah, Principal Data center Architect, Xilinx
In the cloud computing era, data growth is exponential. Every day billions of photos are shared and large amounts of new data are created in multiple formats. Within this cloud of data, the relevant data with real monetary value is small. To extract the valuable data, big data analytics frameworks like Spark are used, which can run on top of a variety of file systems and databases. To accelerate Spark by 10-1000x, customers are creating solutions such as log-file accelerators, storage-layer accelerators, MLlib (one of the Spark libraries) accelerators, and SQL accelerators.
FPGAs (Field Programmable Gate Arrays) are an ideal fit for these types of accelerators, where the workloads are constantly changing. For example, they can accelerate different algorithms on different data depending on the end user and the time of day, while keeping the same hardware.
This webinar will describe the role of FPGAs in Spark accelerators and present Spark accelerator use cases.
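To make the "log-file accelerator" use case concrete, here is the shape of job such an accelerator targets, written as a plain-Python map/reduce over web-server log lines. This is an illustration of the workload, not Spark or FPGA code; the log format is invented for the example.

```python
from collections import Counter

def count_status_codes(log_lines):
    """Map: extract the trailing HTTP status code from each log line.
    Reduce: count occurrences per code. Parsing-and-aggregating jobs
    with this shape are what log-file accelerators offload."""
    codes = (line.split()[-1] for line in log_lines if line.strip())
    return Counter(codes)

logs = [
    "GET /index.html 200",
    "GET /missing 404",
    "POST /api/orders 200",
]
print(count_status_codes(logs))
```

Each line is parsed independently, which is exactly the data parallelism that lets the same job scale across Spark executors or be pushed down into fixed-function hardware.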
A Case Study presented by Kurt Jackson, Platform Lead, Autodesk
Companies such as Autodesk are fast replacing once tried-and-true physical data warehouses with logical data warehouses/data lakes. Why? Because they are able to accomplish the same results in one-sixth of the time and with one-quarter of the resources.
In this webinar, Autodesk’s Platform Lead, Kurt Jackson, will describe how they designed a modern fast data architecture as a single unified logical data warehouse/data lake using data virtualization and contemporary big data analytics engines like Spark.
A logical data warehouse/data lake is a virtual abstraction layer over the physical data warehouse, big data repositories, cloud, and other enterprise applications. It unifies both structured and unstructured data in real time to power analytical and operational use cases.
Attend and Learn:
-Why logical data warehouse/ data lakes are the bedrock of modern data architecture
-How you can build a logical data warehouse using data virtualization
-How to create a single, unified enterprise-wide access and governance point for any data used within the company
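The "virtual abstraction layer" idea can be sketched in a few lines. This toy Python example (source names and rows are invented; real data virtualization platforms do far more, including query pushdown and caching) shows the essential move: one access point that federates live sources at query time instead of copying data into a central store.

```python
class LogicalDataWarehouse:
    """A toy virtual abstraction layer: queries are answered by
    pulling from the underlying sources at request time."""

    def __init__(self, sources):
        self.sources = sources  # name -> callable returning rows

    def query(self, predicate):
        # Single, unified access point over every registered source.
        for name, fetch in self.sources.items():
            for row in fetch():
                if predicate(row):
                    yield {**row, "_source": name}

warehouse_rows = lambda: [{"customer": "acme", "orders": 12}]
lake_rows = lambda: [{"customer": "acme", "clicks": 3400},
                     {"customer": "globex", "clicks": 120}]

ldw = LogicalDataWarehouse({"edw": warehouse_rows, "lake": lake_rows})
acme = list(ldw.query(lambda r: r["customer"] == "acme"))
print(acme)
```

Because every query flows through one layer, governance (access control, auditing, masking) can be enforced at that single point, which is the third bullet's argument.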
Andy Kirk, Data Visualization specialist and Editor, VisualisingData.com
In this talk Andy Kirk will shine a light on some of the most discussed and debated aspects of data visualisation design. The aim of the talk is to expose some of the myths about data visualisation and reinforce some of the truths in order to offer practitioners, professionals and part-time enthusiasts alike greater clarity about this increasingly popular discipline.
Viewers will come away with a greater understanding of the rights and the wrongs in data visualisation as well as an awareness of the aspects of this activity that must remain tagged with the elusive notion of ‘it depends’. Along the way Andy will exhibit some of the best examples and techniques from across the field.
Everyone loves to visualize data, but sometimes data visualizations are the wrong tools for the job. Learn to avoid common pitfalls and see how to make fast, easy, and accurate data visualizations part of your analytics mix, so everyone can make informed decisions.
Managing and analyzing data to inform business decisions
Data is the foundation of any organization and therefore, it is paramount that it is managed and maintained as a valuable resource.
Subscribe to this channel to learn best practices and emerging trends in a variety of topics including data governance, analysis, quality management, warehousing, business intelligence, ERP, CRM, big data and more.
A Single View of the Citizen (Master Data Management, or a 360º view) allows all operational systems and business processes to share reliable, consistent data.
Such projects have a direct impact on the quality of the services citizens receive, as well as on the optimization of internal processes, with a consequent reduction in costs. For example, a single view of the taxpayer makes it possible to eliminate duplicate refunds to taxpayers and to detect fraud.
This strategic vision is gaining ground in the public sector as its institutions strive to unify information about citizens, processes and systems.
But such projects are generally approached from a very technological angle, at large scale and high cost, with results only in the long term, which is why organizations are reluctant to undertake them.
In this webcast, to which Information Builders is pleased to invite you, you will see how to approach an MDM project from a different perspective.
Learn how to reduce costs and drive efficiencies with Microsoft Azure!
The partnership between Microsoft and SAP benefits our enterprise customers every day. With over 50% of our enterprise customers running SAP for their core ERP systems, Microsoft and SAP have made significant investments helping our joint customers realize the benefits of our collaboration.
Join this webcast to learn how to reduce costs and drive efficiencies with Microsoft's roadmap for HANA on Azure. You'll discover why these customers have realized significant productivity gains, as well as 30-50% cost savings, from our investments in 3 key areas:
•Productivity: Full integration from SAP NetWeaver to Microsoft Office and Outlook with Single Sign-On
•Analytics: Rich visualizations in Microsoft Excel and Power BI solutions through native connectors from SAP Business Warehouse (BW) and HANA
•Infrastructure: Over 400 customers are taking advantage of the Azure (Microsoft Public Cloud) certification for all NetWeaver-based databases and HANA for production and non-production systems
From game consoles to supercomputers and now the datacenter, GPUs are permeating more and more of the computing ecosystem. Boasting order-of-magnitude performance improvements on key tasks and massive total-cost-of-ownership advantages, these once-specialized chips are writing a new chapter in enterprise computing.
Sam Madden, professor at MIT's Computer Science and AI Lab, will cover the evolution of compute and why it impacts databases along the dimensions of speed, memory and scalability. Madden will detail those insights with a case study from Cambridge Telematics where he is a founder and the Chief Scientist.
The evolution of compute, particularly GPUs, needs software to harness its promise. The second part of the webinar will feature one of Sam’s former students, founder and CEO of MapD, Todd Mostak.
Todd will address what software optimizations MapD has employed to harness the parallel processing power of GPUs: LLVM, backend rendering, and memory management. There will be a live product demonstration using the 1.1B row NYC taxi/limo/uber dataset.
The financial crisis brought about huge losses to a large number of financial institutions and exposed many internal inefficiencies in how risk and data were managed within and across divisions. Today, things have changed dramatically. Despite the many regulatory changes over the years, marketplace lenders have been extremely successful implementing predictive modeling and data aggregation to assess consumers’ and small businesses’ financial health.
New lending technologies allow lenders to improve customer satisfaction by creating a personalized and unique customer experience. By taking into consideration personal, transactional, application, and product-selection data from financial institutions, lenders are able to leverage this data for other opportunities such as cross-selling credit products during the loan-application process. These innovations ultimately provide borrowers with an intuitive and simplified borrowing process.
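The predictive scoring behind such lending decisions can be sketched with a logistic model. The weights, bias and features below are invented purely for illustration; a real credit model would be trained on data and subject to regulatory validation.

```python
from math import exp

def default_probability(features, weights, bias):
    """Minimal logistic scoring sketch: a weighted sum of borrower
    features squashed into a probability of default."""
    z = bias + sum(w * x for w, x in zip(weights, features))
    return 1.0 / (1.0 + exp(-z))

# Hypothetical features: [debt-to-income ratio, late payments last year]
weights, bias = [2.0, 0.8], -3.0
low_risk = default_probability([0.1, 0], weights, bias)
high_risk = default_probability([0.9, 4], weights, bias)
print(low_risk, high_risk)
```

The "alternative data sets" the agenda mentions simply extend the feature vector, e.g. with transactional or application data, without changing this basic shape.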
In this webinar you’ll:
* Get insight into the trends and opportunities driving change in the lending and credit risk management industries
* Explore more accurate predictive ratings models based on alternative data sets
* Discover the future of commercial retail lending and credit risk management
* Learn how to improve credit decisions, collections, and portfolio management using new technologies and data analytics
* Spencer Robinson, Head of Strategy, Kabbage
* Sherif Hassan, Founding CEO, Herio Capital
* Saurabh Sharma, Founder and CEO, Indus Insights
* Terry McKeown, Practice Manager, Credit Analytics, Envestnet | Yodlee
* Evan Schuman, Moderator, VentureBeat
Join us on September 22nd at 10 am PT/1 pm ET for an exciting FREE discussion on the trends and opportunities driving change in lending and credit risk management.
Hadoop didn’t disrupt the data center. The exploding amounts of data did. But, let’s face it, if you can’t move your data to Hadoop, then you can’t use it in Hadoop. Join the experts from Hortonworks, the #1 leader in Hadoop development, and Attunity, a leading data management software provider, for a webinar where you’ll learn:
-How to ingest your most valuable data into Hadoop using Attunity Replicate
-About how customers are using Hortonworks DataFlow (HDF) powered by Apache NiFi
-How to combine the real-time change data capture (CDC) technology with connected data platforms from Hortonworks
We will discuss how Attunity Replicate and Hortonworks Data Flow (HDF) work together to move data into Hadoop. And, there will be a live question & answer session.
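To illustrate what change data capture produces, here is a hedged Python sketch. Production CDC tools such as Attunity Replicate read the database transaction log rather than diffing snapshots as this toy does, but the emitted change stream has the same shape: a sequence of insert/update/delete events suitable for ingestion into Hadoop.

```python
def capture_changes(before, after):
    """Snapshot-diff sketch of CDC: compare two keyed snapshots of a
    table and emit one event per inserted, updated or deleted row."""
    changes = []
    for key, row in after.items():
        if key not in before:
            changes.append(("insert", key, row))
        elif before[key] != row:
            changes.append(("update", key, row))
    for key in before:
        if key not in after:
            changes.append(("delete", key, before[key]))
    return changes

before = {1: {"name": "ada"}, 2: {"name": "bob"}}
after  = {1: {"name": "ada lovelace"}, 3: {"name": "eve"}}
print(capture_changes(before, after))
```

Shipping only these deltas, instead of reloading whole tables, is what makes continuous, real-time ingestion into Hadoop practical.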
Hadoop is a wonderful framework that can drive insight into a range of problems, but administrators can be blind to performance, utilization and cost issues of cluster deployments. There’s risk in sizing initial deployments properly, as well as determining optimal capacity over time. Without sufficient visibility, job performance can suffer and costs can escalate. Keeping a critical resource running effectively and being able to plan for the future requires intelligent decision-making capabilities that many Hadoop deployments lack.
This webinar will look into common problems that affect clusters and ways to detect issues early and head off long-term problems. We’ll dig into capacity management and ways to deal with ever-increasing demands for Hadoop resources.
• Understand initial deployment challenges
• Handling cluster growth
• Working to meet increasing pressure on storage and job density
• Planning for capacity growth and extension to cloud
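The last bullet, planning for capacity growth, reduces to a compounding calculation in its simplest form. A minimal Python sketch, with the growth rate and cluster sizes invented for illustration:

```python
def months_until_full(used_tb, capacity_tb, monthly_growth):
    """Project when a cluster fills up under compound data growth:
    the simplest capacity-planning check behind the list above."""
    months = 0
    while used_tb < capacity_tb:
        used_tb *= (1 + monthly_growth)
        months += 1
    return months

# 40 TB used of 100 TB, data growing 10% per month:
print(months_until_full(used_tb=40, capacity_tb=100, monthly_growth=0.10))
```

Even this crude projection makes the planning point: at 10% monthly growth, a cluster that looks less than half full runs out of headroom in well under a year, so procurement or cloud extension has to start early.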
Presented by IoT expert Bill Roberts, this webinar explores how the Industrial Internet of Things (IIoT) is having a dramatic impact on how manufacturers and service providers operate. From the factory floor to a distributed energy generation facility to anywhere a critical piece of equipment operates, machine data holds enormous potential to drive high levels of operational efficiency and unlock new service innovations.
This blending of operational technology (OT) with information technology (IT) has created challenges around the data, the analytics lifecycle, and a rethinking of personnel roles. A robust IoT analytics lifecycle can play a critical role in helping organizations separate the relevant signals from the noise in IIoT data, gain critical insights in real time, and drive appropriate, timely action to realize the promised value of IIoT. This session will explore these topics and offer insights on pursuing a successful industrial IoT strategy that leverages a robust analytics lifecycle.
Gartner predicts there will be 250 million connected vehicles by 2020. While automotive manufacturers are on track to drive connected vehicles implementation, are they poised to leverage the trillion-dollar opportunity from the gold-mine that is “sensor data”?
Research from Morgan Stanley suggests that automotive manufacturers can save $488 billion by using predictive maintenance. By assessing in advance which equipment needs maintenance, automotive manufacturers can better plan maintenance work, smoothly convert abrupt "unplanned outages" into shorter and fewer "planned outages", and spend less time on damage control because equipment issues are detected before they actually occur. The result? Lower operational costs, increased machine lifetime and asset performance.
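At its simplest, "detecting issues before they occur" means flagging sensor drift against a recent baseline. A hedged Python sketch (the window, threshold and vibration values are illustrative, not from any real maintenance model):

```python
from statistics import mean, stdev

def needs_maintenance(readings, window=5, sigma=3.0):
    """Flag a machine for planned maintenance when its latest sensor
    reading drifts more than `sigma` standard deviations away from
    the baseline of the preceding `window` readings."""
    baseline = readings[-window - 1:-1]
    m, s = mean(baseline), stdev(baseline)
    return abs(readings[-1] - m) > sigma * s

vibration = [1.0, 1.1, 0.9, 1.0, 1.05, 4.2]  # last reading spikes
print(needs_maintenance(vibration))
```

A rule this simple is where the savings argument starts: catching the spike turns a surprise breakdown into a scheduled, shorter planned outage.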
Join this webinar to learn how automotive manufacturers can:
· Transform connected vehicles into a revenue generating programme
· Reduce costs with equipment insights from engineering data
· Decrease downtime probabilities & boost production quality, safety and efficiencies
· Automate data science workflow with meta-learning enabling you to dramatically reduce the manual data science effort
· Learn how a leading automotive manufacturer achieved a 10% increase in operational efficiency with automated predictive maintenance