In this webinar, Vinaya Thummala and Sarah Moen will share insights on the data governance program at American Family Insurance. They will discuss the value of high-quality data to their program and what it has meant to their success. By leveraging Informatica Data Quality, they have been able to ensure that their data is reliable, consistent, and clean. In this webinar, you will:
•Hear an overview of the Information Quality Management Program at American Family Insurance
•Learn how data governance activities are supported by data quality policies, standards and processes to continuously monitor and improve the reliability of business data
•Understand the effective use of data quality in projects: check and ensure the quality of all data, including existing data, transformed data and newly created data
•Discover their Data Quality as a Service Model: Data quality services deployment at Enterprise level for technical expertise, shared infrastructure, and different engagement models to meet all business needs
Big Data Analytics success has been constrained by the difficulty in accessing siloed data and by the traditional IT approach of gathering requirements, designing and building extracts to turn data into valuable data assets. As IT organizations are backlogged with servicing business requests, business analysts and data scientists are looking for alternative methods to discover relevant data, share data with colleagues across divisions or geographies and prepare data assets for actionable insights.
In this deep dive, you will have the opportunity to learn about new features of Informatica Big Data Management 10.1 and Informatica’s latest innovation, Intelligent Data Lake, which brings self-service efficiency to business analysts and data scientists by combining semantic search, data discovery and data preparation for interactive analysis, all while governing data assets.
EU: General Data Protection Regulation (GDPR)
The clock is ticking and it’s time to act: Europe’s most demanding and far-reaching data security regulation to date has been published. This webinar from Informatica’s Data Security Team will examine the key requirements of GDPR and look at how new Data Intelligence capabilities can put you in control of your sensitive data assets.
During the webinar there will be a demonstration of Informatica’s Secure@Source technology, which swept the board at the recent Info Security awards. The demonstration will show how it is now possible, within a single view, to find and track all GDPR-qualifying data across the organisation, to see who is using it and what security controls are in place to protect it. This ability to visualise the movement and control of sensitive data will be the critical key to an effective GDPR strategy. This is not a regulation that anyone can afford to ignore, and for those who get it wrong the compensation and regulatory fines could be on a par with the current cost of PPI to the banks! Join us and get ahead of the game.
In financial services, the top big data analytics use cases include customer analytics (understanding the customer journey using data from all customer interaction channels, and predicting and preventing customer churn) and fraud and compliance. The financial and corporate benefits of these use cases range from improved customer retention to hundreds of millions of dollars in incremental revenue and the protection of shareholder value.
In this webinar, learn from big data analytics experts:
- Top 3 use cases in financial services
- The importance of applying the appropriate technologies
- The data driven insights that will give companies a competitive edge
Located at the intersection of pharmaceutical manufacturers and healthcare providers, AmerisourceBergen Corporation (ABC) is a $147B global pharmaceutical services and distribution company. The company’s success rests on knowing its diversified customer mix and serving them with innovative programs and solutions.
Formed in 2001, AmerisourceBergen brought a group of companies together under one banner. Join us to hear how the company designed a data strategy to view business interactions holistically across multiple business groups and serve customers as a single company. Pete Stormer, Director of Data Management at ABC, will talk about their experience, best practices, and lessons learned supporting the journey to one ABC.
The webinar will also cover the following topics:
• The evolution of MDM as a strategic platform to deliver trusted data to business users
• Best practices for master data management, data quality, data governance and reference data
• Defining the enterprise strategy for data management in complex business environments
Data science is a domain that promises to convert available data into actionable insights. This can translate into huge wins for organizations, both in financial terms (higher revenues, reduced costs) and in terms of better services for customers, with more tailored products and a personalized, improved customer experience.
But how do you get from the initial intuitions of statisticians and scientists to results for customers? What is the best way to translate those solutions into production-level APIs and services? How do you assess the quality of data-driven algorithms? These are very concrete concerns for anyone who wishes to operationalize data science into data-driven products.
This webinar will describe a number of techniques and patterns to monitor and deploy data models and to stay in control of predictive, data-driven services.
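One widely used monitoring pattern of the kind the webinar describes is checking whether the distribution of a deployed model's scores has drifted away from what was seen at training time. Below is a minimal, self-contained Python sketch using the Population Stability Index (PSI); the bin count, alert threshold, and sample data are illustrative assumptions, not a prescribed standard.

```python
# Monitor a deployed model by comparing live score distributions against a
# training-time baseline with the Population Stability Index (PSI).
import math

def psi(expected, actual, bins=10):
    """Population Stability Index between two score samples in [0, 1]."""
    edges = [i / bins for i in range(bins + 1)]

    def frac(sample, lo, hi):
        # Fraction of the sample falling in [lo, hi); the last bin is closed.
        count = sum(1 for x in sample if lo <= x < hi or (hi == 1.0 and x == 1.0))
        return max(count / len(sample), 1e-6)  # floor to avoid log(0)

    total = 0.0
    for lo, hi in zip(edges, edges[1:]):
        e, a = frac(expected, lo, hi), frac(actual, lo, hi)
        total += (a - e) * math.log(a / e)
    return total

# Hypothetical data: scores seen at training time vs. shifted live scores.
baseline = [i / 99 for i in range(100)]
live = [min(1.0, 0.2 + i / 99) for i in range(100)]

drift = psi(baseline, live)
if drift > 0.2:  # 0.2 is a commonly quoted alerting threshold (assumption)
    print(f"ALERT: score drift PSI={drift:.3f}")
```

In a production service this check would run on a schedule against logged predictions, paging the team rather than printing.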
Data in the real-world is almost always dirty, incomplete, scattered or inconsistent. For data scientists, 'janitor work' is a key hurdle to data insights.
Whether you use big data for analytics or data science, with increasing variety and velocity of big data, the data pre-processing step can be the most time-consuming step in your data pipeline.
Featuring engineering concepts and practical examples in Python and R, this webinar will focus on technical considerations and data engineering techniques to optimise data preparation to get the most value from your big data pipeline.
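To make the pre-processing step above concrete, here is a small, self-contained Python sketch of two routine data-prep operations: imputing missing numeric values with the column median, then min-max scaling. The column names and records are invented for illustration; a real pipeline would typically do this with pandas or R at much larger scale.

```python
# Median imputation followed by min-max normalization on a list of records.
from statistics import median

rows = [
    {"age": 34, "income": 52000},
    {"age": None, "income": 61000},  # missing value to impute
    {"age": 45, "income": None},
    {"age": 29, "income": 48000},
]

def clean(rows, cols):
    cleaned = [dict(r) for r in rows]  # copy; leave the input untouched
    for col in cols:
        present = [r[col] for r in cleaned if r[col] is not None]
        fill = median(present)
        for r in cleaned:
            if r[col] is None:
                r[col] = fill
        lo = min(r[col] for r in cleaned)
        hi = max(r[col] for r in cleaned)
        for r in cleaned:
            # Scale to [0, 1]; constant columns collapse to 0.0.
            r[col] = (r[col] - lo) / (hi - lo) if hi > lo else 0.0
    return cleaned

print(clean(rows, ["age", "income"]))
```

Even in this toy form, the ordering matters: imputing before scaling means the fill value influences the min and max, which is one of the design decisions a data engineer has to make explicitly.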
Dr. Nicholas Marko, Chief Data Officer & Director of Neurosurgical Oncology of Geisinger Health System, will discuss, from the Healthcare Provider perspective, the reasons that it is important for health insurers to have updated Provider data sets, including:
•Trends causing Provider networks to become more complex.
•Pain points around poor Provider data and how they impact the relationship (financial, networking) between payers and providers.
•Other drivers relevant for payers to master their Provider data and better manage Physicians, such as Provider hierarchies.
•Ways that the payer-provider relationship is improved with a Provider Master Data Management solution.
•How an investment in a Provider solution will pay off for the insurer.
Sharp Healthcare, San Diego's health care leader, recently launched a data governance program from the ground up to ensure that they can meet the needs of the ‘Patient of the Future’. Through the first six months of their data governance program, they have learned a lot about what to do, how to maintain focus, and how to build the team around the cause. Join Lori Kvasnicka, Director of Data Governance at Sharp Healthcare, along with Susan Wilson, Enterprise Information Management Practice Leader from Informatica, as they tell you about the Sharp Healthcare journey and illustrate the hurdles that you may face as you begin your data governance plans.
During the webinar, you will learn:
• How to build your strategic and tactical plans with Informatica
• How to get your team organized (and stay organized!)
• How to get through phase one of your program and transition into phase two
• What it means to reach your goals and how to keep the momentum going
This Informatica data governance webinar will share best practices, critical insights from a successful journey, and will get you thinking about how to kick start your data governance program.
Data preparation is occasionally referred to as data plumbing, suggesting that it is dirty grunt work that is an unglamorous necessity. We will see how this point of view entirely misses the discovery potential, value, and “thrill of the chase” that data preparation can offer. Data prep includes the all-important first step in any analytics project: “Know Thy Data” (one of the fundamental principles of data science). We discuss data profiling, data transformations and normalization, feature selection and engineering, and other data prep approaches that empower you to begin learning from your data right out of the starting gate on your analytics project. Learning from data (i.e., data science) doesn’t begin after data prep, but right from the start.
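The "Know Thy Data" step usually starts with a quick profile of each column. A minimal Python sketch of that idea follows; the column names and sample records are hypothetical, and real profiling tools report far more (patterns, domains, correlations).

```python
# A tiny data profiler: per-column missing rate, distinct count, and range.
records = [
    {"country": "US", "amount": 120.0},
    {"country": "DE", "amount": None},
    {"country": "US", "amount": 80.5},
    {"country": None, "amount": 95.0},
]

def profile(records):
    report = {}
    for col in records[0].keys():
        values = [r[col] for r in records]
        present = [v for v in values if v is not None]
        numeric = [v for v in present if isinstance(v, (int, float))]
        report[col] = {
            "missing_rate": 1 - len(present) / len(values),
            "distinct": len(set(present)),
            "min": min(numeric) if numeric else None,
            "max": max(numeric) if numeric else None,
        }
    return report

for col, stats in profile(records).items():
    print(col, stats)
```

A glance at such a report already surfaces the surprises (missing countries, suspicious ranges) that make data prep a discovery activity rather than plumbing.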
Certain types of data are commonly analyzed: sales, customer service, web and marketing analytics are closely monitored by most companies these days. But there are sources of data inside our companies that no one is thinking about that can be just as valuable, if not more so. How do we identify these, and how do we do something useful with them once we have? Brian Lange from Datascope shares his thoughts, plus a couple of examples from his work with Procter & Gamble and others.
The digitalization of marketing and the emergence of new applications are generating an ever-growing flow of data of all shapes and sizes. Exploiting this information makes it possible to better target, acquire and satisfy customers, yet too few marketing departments manage to take advantage of it.
On 13 December at 11:00, join our Intelligent Data Lake webinar and discover, through concrete implementation examples, how to:
- Visualize the end-to-end customer journey to identify the most effective programs and optimize the marketing mix;
- Develop effective strategies to better target and acquire new customers;
- Take advantage of big data across every touchpoint to create business value.
Informatica Intelligent Data Lake
Informatica’s Intelligent Data Lake solution delivers self-service data preparation, combining automation, integration and data quality management with the governance and security capabilities required to derive real business value from all types of data while managing risk.
Reference data is data defined and maintained outside the organisation that uses it. Traditionally these have been small datasets, such as a country list. But in parallel with the rise of big data generally, the reference data used for both operational and analytical purposes is also growing in volume, velocity and variety.
This webinar takes you through the different kinds of big reference data, the challenges and the opportunities in exploiting large sets of reference data from the outside.
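A classic, small-scale instance of externally maintained reference data is a country code list. The sketch below, in Python, shows the basic pattern of validating operational records against such a list; the order records are invented, and the reference set is a tiny subset of ISO 3166-1 alpha-2 codes.

```python
# Validate operational records against an external reference dataset.
# A small illustrative subset of ISO 3166-1 alpha-2 country codes:
ISO_COUNTRIES = {"US": "United States", "DE": "Germany",
                 "FR": "France", "DK": "Denmark"}

orders = [
    {"id": 1, "country": "DE"},
    {"id": 2, "country": "XX"},  # code not in the reference set
    {"id": 3, "country": "DK"},
]

def validate(orders, reference):
    """Split records into those matching the reference data and rejects."""
    valid = [o for o in orders if o["country"] in reference]
    rejected = [o for o in orders if o["country"] not in reference]
    return valid, rejected

valid, rejected = validate(orders, ISO_COUNTRIES)
print(f"{len(valid)} valid, {len(rejected)} rejected:",
      [o["id"] for o in rejected])
```

The challenge the webinar addresses is what happens when the reference set is no longer four countries but millions of externally sourced, frequently changing entries.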
Hadoop is not just for play anymore. Companies that are turning petabytes into profit have realized that Big Data Management is the foundation for successful Big Data projects.
Informatica Big Data Management delivers the industry’s first and most comprehensive solution to natively ingest, integrate, clean, govern, and secure big data workloads in Hadoop.
In this webinar you’ll learn, through in-depth product demos, about new features that help you increase productivity, scale and optimize performance, and manage metadata, such as:
• Dynamic Mappings – enables mass ingestion & agile data integration with mapping templates, parameters and rules
• Smarter Execution Optimization – higher performance with pushdown to DB, auto-partitioning and runtime job execution optimization
• Blaze – high performance execution engine on YARN for complex batch processing
• Live Data Map – Universal metadata catalog for users to easily search and discover data properties, patterns, domain, lineage and relationships
Register today for this deep dive and demo.
Successful data governance and management programs require ongoing goodwill from stakeholders spread across the organization. This translates to a lot of communications – formal and informal, long and short, planned and unplanned. Wasted time? Hardly! Done right, they can result in stakeholders who become vocal advocates for the budget, resources, and tools you require to deliver the value they need.
The secret is stitching together your discussions with carefully crafted micro-conversations – often called “elevator speeches” – that enforce your key messages while intriguing your listeners. A winning elevator speech pulls them into your story and prompts them to ask for more information. It shows that you understand them, and it provides a service, since it may help them connect the dots between the message they’ve just heard and similar information they’ve heard in colleague introductions, conference room presentations, hallway conversations, status reports, and even pitches for tools, projects, and executive attention.
There’s a formula for crafting a winning elevator speech. And it’s more than the conventional advice to be short and natural-sounding. Learn the 7 qualities of powerful elevator speeches for data governance programs, and how to apply them to pitches for support, money, attention, participation, and tools.
Join this webinar to learn what is new in Data Integration Hub 10 and how Data Integration Hub’s unique publish/subscribe data integration has been extended to big data, the cloud and remote offices. A centrally managed data integration hub can contribute to efficiency, agility and visibility in analytics and application modernization projects, including big data and cloud based SaaS applications and analytics.
Government agencies all over the world are faced with increasing pressure to make better decisions faster to address rapidly changing operational environments and emerging risks of many kinds. While there is an abundance of data for completing the information puzzle, it is a significant challenge to address massive volumes of disparate data from diverse sources such as surveillance cameras, broadcast media, sensors, social media, transactions, etc., and to effectively automate analytics processes to accelerate the delivery of comprehensive and relevant insights that matter.
Join our webinar to see how HPE IDOL, a unified analytics platform for text and rich media data, can help you break down data silos, monitor real-time data feeds, uncover coveted trends, patterns and relationships, and streamline workflows. Discover how IDOL powers real-life applications in defense, transportation security and law enforcement.
Your data and how you use it can be your organization’s biggest and most sustainable competitive advantage. The challenge is to meet the growing needs of your business strategy by evolving your data warehouse environment beyond traditional business intelligence and to support new analytics use cases.
This has been made even harder by the explosion in data volume and complexity, and the fact that 50% or more of your data may be coming from outside of your organization. This means even less understanding and control over the structure and quality of data being used for analytics.
In this webinar, we will be discussing how to successfully augment your proven data warehouse environment with newer technologies that will enhance your data warehouse without replacing it. Some of these opportunities will include:
• Accelerating data warehouses with data warehouse appliances
• Upgrading data warehouses with real-time data
• Offloading and enhancing data warehouses with big data
The key to a successful data warehouse modernization is to build an architecture that preserves your current data warehouse investment, incorporates new data and new analytics technology, and fuels your analytics with clean, complete and timely data. We will also talk about best practices for data warehouse modernization, architecture, and pitfalls to avoid in order to deliver analytics results that will truly be a competitive advantage for your organization.