Join us for a technical deep dive into the latest version of Informatica Data Quality. In this webinar you will learn about the latest self-service functions we’ve added to IDQ 10.2. Our team of experts will also showcase our latest advancements in hybrid data management. Having data that you can trust, that’s clean and reliable, is simply essential, whether that data is on-premises, in the cloud, big data or traditional data. With the latest advancements in IDQ 10.2, we’ll show you how you can trust your data for analytics, reporting, governance, and more. This is one webinar you won’t want to miss.
In this webinar, Vinaya Thummala and Sarah Moen will share insights on the data governance program at American Family Insurance. They will discuss the value of high quality data to their program and what it has meant to their success. By leveraging Informatica Data Quality, they have been able to ensure that their data is reliable, consistent, and clean. In this webinar, you will:
• Hear an overview of the Information Quality Management Program at American Family Insurance
• Learn how data governance activities are supported by data quality policies, standards and processes to continuously monitor and improve the reliability of business data
• Understand the effective use of data quality in projects: check and ensure the quality of all data, including existing data, transformed data and newly created data
• Discover their Data Quality as a Service model: data quality services deployed at the enterprise level, with technical expertise, shared infrastructure, and different engagement models to meet all business needs
These days consumers interact with companies across multiple channels—from walking into a physical location to engaging with your website or even calling to make a complaint. Is your customer experience consistent across these channels? Or do these multiple touch points mean multiple consumer experiences?
Join our webinar to learn how you can create a truly seamless omnichannel experience and gain better insights from a single customer view, all through unlocking the power of your data.
Attend this webinar to find out:
• Where the omnichannel dream falls flat
• What role data and its quality plays in omnichannel and the single customer view
• How to address common data collection challenges across every channel where you interact with customers
• How to achieve omnichannel success through better data quality
While firmly on the regulatory horizon in the EU, IDMP compliance and corresponding ISO standards are intended for global use and will underpin many future regulations. The ultimate goal is to simplify the exchange of information at a national, regional and global level. And this, in turn, should lead to increased regulatory efficiency, and contribute directly to improving patient safety.
But Life Science companies are struggling to comply with IDMP requirements. It’s challenging because differing departmental vocabularies and pre-existing hurdles thwart even the simplest internal exchange of information across business and data silos.
Join Informatica, the industry leader in all things data, for a webinar showcasing industry best practices for laying a strategic data foundation for IDMP compliance. You will learn about:
• Best practice for a single authoritative, trusted view of substance, product, organization and reference data
• Key capabilities for the automation of data quality tasks:
o Discover anomalies in your data
o Generate data quality scorecards based on business and quality rules
o Validate and standardise your data
o Route records for manual remediation to data stewards.
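As a rough illustration of what automated data quality tasks like these look like in practice, here is a minimal Python sketch. It is not Informatica’s implementation or API; the field names and rules are hypothetical examples of validating, standardising, scoring, and routing records.

```python
# Illustrative sketch of automated data-quality checks (not Informatica's API).
# The field names and validation rules below are hypothetical examples.
import re

RULES = {
    "substance_code": lambda v: bool(re.fullmatch(r"[A-Z]{3}-\d{4}", v or "")),
    "product_name":   lambda v: bool(v and v.strip()),
}

def standardise(record):
    """Trim whitespace and upper-case the substance code."""
    out = dict(record)
    if out.get("substance_code"):
        out["substance_code"] = out["substance_code"].strip().upper()
    return out

def score(records):
    """Return a per-rule pass rate (a minimal 'scorecard') plus the failures."""
    failures = []
    passes = {name: 0 for name in RULES}
    for rec in map(standardise, records):
        ok = True
        for name, rule in RULES.items():
            if rule(rec.get(name)):
                passes[name] += 1
            else:
                ok = False
        if not ok:
            failures.append(rec)  # route to a data steward for manual remediation
    n = len(records) or 1
    return {name: passes[name] / n for name in RULES}, failures

scorecard, steward_queue = score([
    {"substance_code": " abc-1234 ", "product_name": "Aspirin"},
    {"substance_code": "bad", "product_name": ""},
])
print(scorecard)  # {'substance_code': 0.5, 'product_name': 0.5}
```

The point of the sketch is the shape of the workflow: standardise first, score every record against business rules to build a scorecard, and push anything that fails into a remediation queue for data stewards.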
Digitization is affecting all organizations as they transform themselves into fully digital businesses to stay competitive. Customers want to engage and transact online and expect a quick, seamless experience, yet many organizations fail to deliver on both counts. For example, if customers cannot find your products, or find them presented with missing or incorrect product information, you will lose the sale.
Challenges & Reality of Commerce (Online & Offline)
• 90% of all shopping cart abandonments happen because customers feel they do not have enough information
• 40% of all returns are the result of poor product information
• 65% of consumers find it frustrating to be presented with inconsistent offers, experiences or treatment through different channels when shopping for the same product or service
This webinar will focus on how to address these challenges so that you can:
• Introduce products faster
• Reduce the number of returns
• Generate higher conversion rates
• Ensure that you are always compliant with regulations
• Automate your product information data quality processes
Join us to hear how to deliver rich, clean and consistent product information that accelerates time to purchase and profit no matter which channel your customers use, and how to extend data quality to the entire customer journey.
Join the Webcast and learn how the latest enhancements of Documentum Q&M can deliver lifecycle support for medical device documentation - from design to production manufacturing, while ensuring compliance.
Hear Jessica Kelley, Quality Product Manager, provide an overview of the key enhancements and integrations with Documentum Q&M announced at EMC World in Las Vegas in May. She’ll also explain how our latest release (4.2) delivers lifecycle support for medical device documentation from design to production manufacturing and provides flexible print controls to maximize efficiency while ensuring compliance.
Implementing a more strategic approach to testing can have a huge impact on product quality, but measuring exactly how your QA strategy has made a difference can be challenging. The long-term success of any QA strategy depends on measuring change and communicating that change to the team at large, so it’s important to measure the right metrics.
In this webinar, discover...
1. Five essential metrics that QA teams should track
2. How to avoid common pitfalls when it comes to measuring the success of a QA strategy
3. Techniques to level up product quality by taking a data-driven approach to software testing
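For a taste of what a data-driven approach to QA metrics can look like, here is a small Python sketch. The metric choices are generic examples (pass rate, defect density, escape rate), not necessarily the five the webinar covers.

```python
# Generic QA-metric sketch; the metric choices are illustrative examples,
# not necessarily the five metrics covered in the webinar.

def pass_rate(passed, executed):
    """Share of executed test cases that passed."""
    return passed / executed if executed else 0.0

def defect_density(defects, kloc):
    """Defects found per thousand lines of code."""
    return defects / kloc if kloc else 0.0

def escape_rate(found_in_prod, found_total):
    """Share of all known defects that escaped to production."""
    return found_in_prod / found_total if found_total else 0.0

print(pass_rate(184, 200))       # 0.92
print(defect_density(12, 48.0))  # 0.25
print(escape_rate(3, 45))        # ~0.067
```

Tracked over successive releases, simple ratios like these are what let a team show whether a QA strategy is actually moving the needle rather than relying on anecdote.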
Like most asset-intensive companies, you experience issues with poor-quality asset and maintenance data in your CMMS. You know you need to fix these issues to optimize your maintenance procedures, but you don't know where to begin. Join us as Paul Peterson and David Hattrick discuss the effects of bad data on the maintenance organization. Discover best practices to help you easily find and fix data issues.
LESSONS IN CUSTOMER SERVICE FROM THE MASTERCHEF KITCHEN!
If you’ve ever watched MasterChef, you know that the pantry is stocked with only the finest ingredients – a prerequisite for any great meal. And the home cooks’ dishes are judged not only on their flavor but also on how they look. There are a couple of lessons here for your customer service:
- Without a basis in quality, you can’t provide a great customer experience; and
- Presentation matters
Learn about the ingredients that determine knowledge quality and enable you to provide superior customer support experiences:
- How to write high-quality knowledge articles
- How to design article templates for the most effective knowledge delivery
- KCS article quality index (AQI) considerations
You will also see a demo of the RightAnswers automated Knowledge Quality tool that improves the quality of your knowledge as it’s being created, eliminating manual operations – because you have enough on your plate already!
Join this webinar where data quality & MDM expert Jake Freivald from Information Builders will give a high level overview covering the fundamentals of data quality and MDM. This webinar will also give you the opportunity to bring along your own questions and have them answered live.
This session will cover:
- How to build a repeatable and scalable data quality methodology
- How to develop a data quality management scorecard
- How you can establish the building blocks of a master data management (MDM) solution
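To make the scorecard idea above concrete, here is a hypothetical Python sketch (not Information Builders’ tooling). It scores each column of a dataset on two common data quality dimensions, completeness and uniqueness; the column names are made up for illustration.

```python
# Hypothetical data-quality scorecard sketch (not Information Builders' product).
# Scores each column on completeness (non-empty values) and uniqueness.

def scorecard(rows, columns):
    n = len(rows) or 1
    card = {}
    for col in columns:
        values = [r.get(col) for r in rows]
        non_empty = [v for v in values if v not in (None, "")]
        card[col] = {
            "completeness": len(non_empty) / n,
            "uniqueness": len(set(non_empty)) / (len(non_empty) or 1),
        }
    return card

rows = [
    {"email": "a@x.com", "country": "DE"},
    {"email": "a@x.com", "country": ""},
    {"email": "b@x.com", "country": "FR"},
]
print(scorecard(rows, ["email", "country"]))
```

A repeatable methodology amounts to running a scorecard like this on a schedule and watching the per-column scores trend over time, which is far more actionable than a one-off audit.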
Send your questions in advance to firstname.lastname@example.org.
Innovation Sessions 2017 #5
Poor data quality can have serious financial consequences. Regulatory fines, monetary losses from bad business decisions, and legal fees resulting from errors can add up to millions of dollars. When it comes to patient or consumer safety, bad data can cost lives.
This webinar will highlight effective steps for preventing and fixing bad data, and processes to help ensure optimum integrity in your data. Best practices in data quality and real-world success stories will be featured.
Sign up today!
Where some see complexity and uncertainty, we try to see past the clutter. Global Equities Specialists since 1984.
The Geisinger Health Plan cares for over 500,000 members in Northeast and Central Pennsylvania. During this webinar, you’ll learn how the Geisinger team is using Informatica’s metadata driven products and solutions to ensure that data is reliable and trusted. You’ll discover both strategic and tactical considerations for leveraging data quality to meet your goals, including industry best practices, design concepts for governance organizations and the development of core technical competencies. Don’t miss this webinar where you’ll hear about lessons learned and innovative approaches directly from the experts at Geisinger.
The journey to cloud is an evolution, not a revolution. Data management issues are amplified in the cloud because of a complex landscape spanning cloud and on-premise applications. When you are moving data around, data security, data governance and data quality should be top priorities. Join Joe McKendrick from Forbes and Anthony Smith, Head of Technology Solutions at JLL, to discover:
• How to get the architecture right: understand what data goes to the cloud and what stays on-premises, and what data management brings to the table around data quality, data security, and data governance
• Best practices for embracing both cloud and on-premises environments to scale flexibly and innovate faster with accessible, secure and relevant data
• How JLL is embracing both cloud and on-premise worlds for flexibility and faster development of best-of-breed solutions, irrespective of where they are hosted. For JLL, hybrid is about digital innovation: scaling flexibly and innovating faster to meet customers’ needs.
This webinar is all about the Holistic Data Stewardship life cycle: where to start, and how you can build a repeatable, collaborative and easy-to-use stewardship process.
This interactive session, led by data quality experts David Lyle and Dominic Sartorio, will focus on the daily routine of a typical data steward and show how Informatica Data Quality 9.6 can help simplify that routine by:
-- Enhancing collaboration between IT and business stakeholders
-- Reducing time to discover data asset relationships and error remediation
-- Accelerating development and deployment of business rules
-- Increasing agility with self-service
You want to transform your business, but your data is spread across a multitude of on-premise and cloud systems. How are you going to organize and manage it all? And how do you even know what data you have and if it’s clean and reliable? We have your answers.
Discover how Informatica simplifies the journey to cloud at the latest unveiling of the world’s #1 data integration and data quality solution, including:
- Informatica PowerCenter 10.2
- Informatica Data Quality 10.2
- Informatica Data Integration Hub 10.2
Approved for one credit for CAMPEP, MDCB, ASRT
As radiation-oncology clinics transition from paper to paperless environments, more documentation is being stored in computer databases. Medical-physics documentation was traditionally stored in large binders on a bookshelf and disparate Excel files on a PC hard drive. Historically this was the only option for storing the large volume of data required for regulatory compliance. In the last 5 years many vendors of radiation-oncology quality-assurance (QA) equipment have stepped up to fill this gap. Additionally, QA software is now available that combines the interfacing of hardware with QA systems. In this lecture I will cover some of the software packages available from vendors and will share the LewisGale experience in implementing one software package for integrated QA. I will conclude the lecture with some of my personal thoughts regarding suggestions for QA software vendors for the future.
Contact and lead records enter Salesforce from various sources, but bad data is costly and leaves you unable to communicate with your customers. While different departments have different needs, they typically use the same contact data, and each one has its own approach to fixing bad data.
Bad data in your Salesforce environment impacts your ability to:
• Communicate with customers
• Identify cross-sell and upsell opportunities
• Plan and align territories
• Score and route leads
To fix these and other issues caused by bad data it is imperative that you implement the right tools and processes to understand the scale of the problem and fix it.
Informatica Cloud Data Quality Radar quickly identifies, fixes, and monitors data quality problems in your Salesforce environment. Attend this short webinar to hear how you can:
• Easily identify and visualize data quality issues within your applications, and drive increased confidence in data
• Automatically fix the most impactful data quality problems sooner, while reducing reliance on IT for data quality related activities
• Continuously monitor data quality across source systems
Today we are in an age of digital consumers who are hyper-connected, expecting information and service wherever and whenever they want. Companies must be able to respond to the increasing demands of their customers with superior products and services without sacrificing quality or security.
In this webinar you will learn about key findings from the eighth edition of the World Quality Report, which illustrates the impact of trends like Agile, DevOps, IoT and others. These trends are forcing organizations to disrupt with digital at an ever-faster pace. Specifically, you will learn more about these four key trends in QA and testing:
• Digital Transformation driving and shaping IT strategy
• Agile and DevOps continue to expand and grow in adoption
• Internet of Things a disrupting force with real implications on quality
• Infrastructure costs (test environments); a rising concern
The quality control (QC) of radiation-therapy (RT) treatment units is essential to deliver safe and effective radiation-therapy treatments. The complexity of the equipment and treatment techniques demands that many different tests be performed with varying frequencies, making the management of an RT department’s QC programme a complex and time-consuming task. Many professionals such as therapists, service engineers and physicists are working in collaboration to meet the machine QC requirements.
AQUA is a departmental quality-management software that centralizes all of the machine QC activities, helping to manage the complexity of quality-assurance requirements in a radiation-therapy department. AQUA is a server-based application that can be accessed throughout an RT department using a web browser. It uses a centralized database to consolidate all QC tests, procedures and results in one location and offers a workflow manager to guide users in their day-to-day QC tasks. QC tests in AQUA can easily be created or customized using an XML-based scripting language. Integration between AQUA and the Elekta linear accelerators and other measurement devices (such as MV flat panels, electrometers and 2D arrays) via a built-in software interface has been implemented to automate time-consuming tasks. Finally, review tools such as a near-real-time dashboard, a plotting tool and reports are available in AQUA to streamline machine-performance assessment.
In this presentation, we will review some of the main AQUA features such as the workflow manager, the QC test scripting language, and the dashboard for performance and compliance review. We will also describe test automation for some QC tasks and review the impact on QC workflows and test frequency. We will then discuss the impact of efficient review tools on the ability to detect changes in machine performance and to guide servicing decisions. Finally, we will review new features in the upcoming version of the software.
For a world obsessed with data, we sure don’t handle it with the care it needs. And when we do, we sometimes go overboard. In this webcast, internationally acclaimed lecturer Rick van der Lans explores the relationship between data quality and business agility, emphasizing the alignment between businesspeople and IT to ensure successful analytics and operations. You will learn:
- How to determine what data is worth cleansing and governing
- How to align IT and the business to ensure that your data is well-suited for its purposes
- What traps to avoid when implementing a data quality or data governance program
Join this Information Builders webinar on data quality.
It will give a high-level overview of the fundamentals of data quality and MDM.
This session will cover:
- How to build a repeatable and scalable data quality methodology
- How to develop a data quality scorecard
- How you can establish the building blocks of an MDM (master data management) solution
We live in the age of data. Organisations gather increasing quantities of data, but not everyone knows how to use it well.
Analytics can be worthless, counterproductive and even harmful when based on data that isn't high quality. No matter how fast or sophisticated your analytics capability, you won't be able to turn poor data into better decisions.
With a growing number of companies investing in analytic tools to visualise their data, there has never been a better time to look at the quality of your data.
Join Leopard Business Solutions, the data management experts, and Yellowfin, a global analytics vendor, as we discuss how to make business decisions based on data you can trust.
This webinar examines technical developments in coal beneficiation, covering dense-media and dry coal treatment, and various upgrading technologies.
Amid a global trend to use low quality, inexpensive coal, it is now recognised that feedstock quality is a key element of a future power strategy to raise power station performance and meet environmental legislation. Preparing feedstock in order to remove inert matter and reduce contaminants can benefit every aspect of a coal plant operation. We examine technical developments in coal beneficiation covering dense-media and dry coal treatment, and upgrading technologies such as coal refining, digestion, oxidation, fuel blending and biomass substitution.
Lignite or brown coal, the lowest quality coal, is normally used in its raw state resulting in significant energy and reliability penalties. Energy efficient technologies to reduce moisture and ash levels can significantly improve performance. As lignite demand declines in OECD countries, alternate markets are being sought that utilise synthesised humates.
As the variety of data sources you are asked to handle grows, so, naturally, do concerns about the quality of the data you are ingesting into your systems.
In this 30 minute high-level overview, we will describe multiple techniques for assessing and ensuring data quality and show you how to use CloverETL's built-in data quality capabilities to automate the process of detecting and handling low quality data.