MDM as a Platform for Systems Consolidation, Migration and Upgrade

How was your last data migration project – was it on time and within budget? Between 1999 and 2007, 84% of data migrations ran over budget and behind schedule. On average, a data migration project cost $875,000, with roughly 30% of that attributable to project overrun. To complicate matters further, consolidations, migrations, and upgrades are too often handled as one-off projects, leading to costly delays in product launches, few or no best practices and reusable assets, and increased risk to achieving business objectives. To top it all off, 34% of migrations lost or missed data.

Successful organizations have combated these issues by using master data management (MDM) as a platform for their systems consolidation, migration, and upgrade projects. MDM creates authoritative, trustworthy data for use in migration, simplifies the migration architecture with a hub-and-spoke design, maintains data consistency across old and new systems post-migration, and, above all, enables reuse of data, mappings, and rules for the next migration project. MDM allows organizations to minimize risk and increase the speed of data migration.
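The hub-and-spoke idea above can be sketched in a few lines of code: each source system is a "spoke" whose records flow through shared, reusable field mappings and survivorship rules into a single master ("golden") record at the hub. This is a minimal illustration only — the system names, fields, and rules below are hypothetical, not Informatica's API or any specific MDM product.

```python
# Field mappings per source system: source field name -> master field name.
# Defining these once is what makes them reusable for the next migration.
MAPPINGS = {
    "legacy_crm":  {"cust_nm": "name", "cust_email": "email"},
    "billing_sys": {"customer_name": "name", "contact_email": "email"},
}

def survivor(values):
    """Survivorship rule: keep the most recently updated non-empty value."""
    candidates = [v for v in values if v["value"]]
    if not candidates:
        return None
    return max(candidates, key=lambda v: v["updated"])["value"]

def consolidate(records):
    """Merge source-system records for one entity into a golden record."""
    fields = {}
    for rec in records:
        mapping = MAPPINGS[rec["source"]]
        for src_field, master_field in mapping.items():
            fields.setdefault(master_field, []).append(
                {"value": rec["data"].get(src_field), "updated": rec["updated"]}
            )
    return {field: survivor(vals) for field, vals in fields.items()}

# Two spokes disagree; the hub keeps the freshest non-empty value per field.
records = [
    {"source": "legacy_crm", "updated": 2012,
     "data": {"cust_nm": "Acme Corp", "cust_email": ""}},
    {"source": "billing_sys", "updated": 2014,
     "data": {"customer_name": "ACME Corporation",
              "contact_email": "ap@acme.example"}},
]
golden = consolidate(records)
# golden == {"name": "ACME Corporation", "email": "ap@acme.example"}
```

Because the mappings and survivorship rules live in one place rather than inside each migration script, the next consolidation project starts from them instead of from scratch — the reuse benefit the abstract describes.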

In this webinar, you will learn:
- The challenges in systems consolidation, migration, and upgrade
- How MDM helps address these challenges in the pre-migration, during-migration, and post-migration phases
- Examples of companies using MDM to manage data migration as a repeatable process
- Expanding the use of MDM beyond data migration for operational and analytical purposes
Recorded Mar 19 2014 28 mins
Presented by
Ravi Shankar, VP, MDM Product Marketing, Informatica
