
The State of Enterprise Data Quality 2016

In this recorded webinar, Carl Lehmann of 451 Research presents the findings of the firm's survey of 200 IT professionals, revealing how organizations are using advanced analytics and machine learning.
Recorded Aug 30 2016 43 mins
Presented by
Carl Lehmann - 451 Research
Presentation preview: The State of Enterprise Data Quality 2016

  • IT Management Is Becoming So Predictive – Which Is Good. Right? Recorded: May 25 2017 5 mins
    Dan Ortega - Vice President of Marketing
    Improvements in IT data quality and analysis tools have enabled IT management to spend less time looking into the past and more time enabling the dynamic enterprise of the future. This allows them to anticipate business events more accurately, forecast costs and capacity, and identify operational risks before they appear. Empowered by technology-driven insights and technology-enabled prediction ability, IT leaders have secured a long-sought seat at the table with their business counterparts during the strategic planning process. IT management becoming more predictive is good. Right? Perhaps, but there are some risks to consider.

Technology-enabled prediction is only as good as the underlying data, and does a poor job of addressing unknown variables. Human intuition and analysis skills have traditionally been used to fill gaps in available data, interpret meaning and project future events. The predictive abilities of most IT leaders are heavily dependent on the quality of information and technology-enabled processing power at their disposal. Modern machine learning systems have made tremendous strides in analyzing large volumes of data to identify trends and patterns based on past and current observations. Their capability to do so is limited, however, by the quality and dependability of data inputs. “Garbage in, garbage out” has been the rule for many years.

Learning how to harness the power of technology and information and applying it to create valuable predictive insights for an organization is definitely good; IT leaders should be commended for bringing new capabilities to the decision-making table. As we all know, however, no information is perfect, and technology has its limitations. Becoming entirely reliant on technology for prediction and losing the ability to apply a human filter is a risky situation for businesses. As with many business decisions, it is important to balance the potential benefits with the acceptable risk profile for your organization. A toy sketch of the "garbage in, garbage out" risk follows this summary.
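As a minimal illustration of the "garbage in, garbage out" point above, here is a short Python sketch. Everything in it is an illustrative assumption rather than material from the video: the naive trend forecast, the synthetic monthly load figures, and the 30% record-dropout rate.

```python
# Hypothetical sketch: "garbage in, garbage out" in a simple forecast.
# The forecast method, data, and dropout rate are illustrative assumptions.
import random

def linear_forecast(history):
    """Naive trend forecast: extrapolate the average step between points."""
    steps = [b - a for a, b in zip(history, history[1:])]
    avg_step = sum(steps) / len(steps)
    return history[-1] + avg_step

random.seed(42)
true_load = [100 + 5 * month for month in range(12)]  # clean ground truth
clean_history = true_load[:11]
# Simulate bad inputs: roughly 30% of records dropped (recorded as zero).
dirty_history = [x if random.random() > 0.3 else 0 for x in clean_history]

actual_next = true_load[11]
print("forecast from clean data:", round(linear_forecast(clean_history), 1),
      "actual:", actual_next)
print("forecast from dirty data:", round(linear_forecast(dirty_history), 1),
      "actual:", actual_next)
```

With the fixed seed, the clean history extrapolates to the true next value, while the corrupted history, run through exactly the same logic, lands nowhere near it.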
  • Why CMDBs are Sexier than you Think Recorded: May 18 2017 3 mins
    Dan Ortega - Vice President of Marketing
    Sexy may not be the first word that comes to mind when you think about your CMDB and the operational data of your company… but (seriously) maybe it should be! After all, your CMDB has a number of attractive qualities and (with some care and feeding) could be the ideal partner for a lasting long-term relationship. There are lots of potential reasons this can work, but let’s focus on the top three:
Substance: Your CMDB is not shallow and fickle; it is strong and deep, with a history as long as your company’s. The CMDB is built on a core of your master data and pulls together all of the facets of operational data your company creates every day. It contains the complex web of connective tissue that can help you understand how your company works. Those insights then become part of the CMDB itself – enabling the strength of your data to be balanced by the wisdom that comes from analytics and self-awareness.
    Long-term potential: You may lust after the latest new tool or trend, but your CMDB will stand by your company’s side through thick and thin, long into the future. It will grow and evolve with you, always be honest about what’s going on, and work with you to provide insights to get your company through troubled times. As your company changes with new markets, products, customers, and competitors or becomes a part of something bigger through acquisition or partnership, your CMDB is there to help you navigate the changes and achieve success.
    Air of mystery: You may never fully understand all of the secrets that your CMDB holds about your company. As you unlock one insight, the potential for others seems to appear magically. What would you expect from something that brings together all parts of your company data and the complex interrelationships in one place for you to explore?
Deep substance, long-term potential and an air of mystery. Maybe your CMDB is sexier than you think.
  • Data Integrity: the Key to Operational Insights or an Elephant in the Room? Recorded: May 5 2017 4 mins
    Dan Ortega - Vice President of Marketing
    Throughout history, business has always struggled with the challenge of data accuracy and integrity. Executives constantly ask their IT leaders how they can improve the quality and integrity of data in order to obtain the insights needed to guide their company effectively. While it sounds reasonable, it may well be the wrong question. Rather than focusing on the quality of raw data, a better approach is to focus on the quality of insights available and the speed/cost to obtain them by asking, “How can we better leverage the data we already have to cost effectively obtain the insights we need?”
Advances in machine learning, data science and correlation analysis during the past decade have enabled a broader range of capabilities to analyze data from disparate operational processes and information systems. This has been accomplished without developing some of the structured relationships and incurring data-model-integration costs associated with traditional data warehousing and reporting approaches.
Through assessment of the trends and relationships between different data elements, modern data analysis systems are able to “discover” a variety of insights that may not have been available in the past. Examples include undocumented dependencies within operational processes, sources of data inaccuracy and the evolution of operational processes over time. Instead of focusing on what is “known” about operational data, modern methods focus on understanding what is “unknown” about operational data.
    Is data integrity the key to operational insights or is it the elephant in the room? That depends on how organizations want to view the situation. Data Integrity at both the informational and operational level is a core requirement of any modern business, and has been an area of focus for Blazent since the early days of Big Data.
  • Why Is Operational Data Important for IT? Recorded: Apr 28 2017 4 mins
    Dan Ortega - Vice President of Marketing
    Each day, with every customer transaction, employee task and business process, companies generate vast amounts of operational data that provides leaders and managers with insight into what is working well and what requires attention. Operational data is particularly important to those responsible for stewarding the information and technology assets of their organization.
    In this context, operational data is particularly important to IT, which is why it is so critical to understand the three different types of operational data on which IT leaders rely.
    Business operational data is all about the business processes and user experiences, which IT enables with the technology and services it provides. The reason organizations invest in technology is to improve the productivity and effectiveness of business operations. Process and user-related data evaluated over time provides a contextual picture into how effectively the technology is achieving that goal.
IT operational data is concerned with the content of “what” technology components are operating and being used. IT operational data is important as a part of the IT planning process to understand capacity utilization and determine where scalability constraints exist, as well as to understand the cost of services provided to users and to assess security and risk considerations of the business-technology ecosystem. Within IT service management processes, operational data is critical to ensure performance and availability Service Level Agreements (SLAs) are honored (a toy availability calculation follows this item), and to drive technology cost reduction through infrastructure optimization.
    Operational data provides IT with the critical picture it needs to understand and optimize the role it plays in the context of the company.
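As a toy version of the availability-SLA check mentioned above, here is a minimal sketch; the outage minutes and the 99.9% target are illustrative assumptions, not figures from the video.

```python
# Hypothetical sketch: checking an availability SLA from operational data.
# Outage records and the 99.9% target are illustrative assumptions.

MINUTES_PER_MONTH = 30 * 24 * 60  # 43,200 minutes in a 30-day month

outage_minutes = [12, 35, 8]  # downtime incidents recorded this month
downtime = sum(outage_minutes)
availability = 100 * (MINUTES_PER_MONTH - downtime) / MINUTES_PER_MONTH

SLA_TARGET = 99.9
status = "met" if availability >= SLA_TARGET else "breached"
print(f"availability: {availability:.3f}% (target {SLA_TARGET}%) -> SLA {status}")
```

The calculation is trivial on clean inputs; the hard part in practice, as the abstract notes, is trusting the outage records that feed it.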
  • Benefits of Machine Learning in IT Infrastructure Recorded: Apr 21 2017 4 mins
    Dan Ortega - Vice President of Marketing
Over the next five years, machine learning is poised to play a pivotal and transformational role in how IT Infrastructure is managed. Two key scenarios are possible: transforming infrastructure from a set of under-utilized capital assets to a highly efficient set of operational resources through dynamic provisioning based on consumption; and the identification of configurations, dependencies and the cause/effect of usage patterns through correlation analysis.
    In the world of IT infrastructure, it’s all about efficient use of resources. With on-premise infrastructure (compute, storage and network) utilization rates for most organizations in the low single digits, the cloud has sold the promise of a breakthrough. For those organizations moving to Infrastructure as a Service (IaaS), utilization in the middle to high teens is possible, and for those moving to Platform as a Service (PaaS), utilization in the mid-twenties is within reach.
    Dynamic provisioning driven by demand is essentially the same operational concept as power grids and municipal water systems – capacity allocation driven by where resources are consumed, rather than where they are produced.
    The second part of the breakthrough relates to right-sizing infrastructure. Whether this is network capacity or compute Virtual Machine size – machine learning will enable analysis of the patterns of behavior by users and correlate them to the consumption of infrastructure resources.
In the near term, these benefits will be much more tactical. Automated discovery combined with behavioral correlation analysis will virtually eliminate the need for manual inventory and mapping of components and configuration items in the IT ecosystem to reveal how the ecosystem is operating.
Today, IT has the opportunity to automate the mapping of components in its infrastructure to provide a more accurate and actionable picture. The sketch below illustrates the kind of usage-to-consumption correlation this analysis relies on.
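The following sketch shows, in miniature, correlating user activity with infrastructure consumption. The hourly samples are invented; in practice the inputs would come from monitoring tools.

```python
# Hypothetical sketch: correlating user activity with resource consumption.
# The hourly samples are made up for illustration.

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

active_users = [120, 250, 400, 380, 300, 150, 80, 60]  # hourly samples
cpu_percent  = [18, 35, 62, 58, 47, 22, 12, 10]

r = pearson(active_users, cpu_percent)
print(f"correlation between user activity and CPU consumption: r = {r:.2f}")
# A high r suggests the VM could be right-sized to track the usage pattern.
```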
  • Downstream Impacts of IT Data Improvement Recorded: Apr 17 2017 4 mins
    Dan Ortega - Vice President of Marketing
    How well prepared is your organization for growth? What are the challenges to making progress?
IT systems, and the data they contain, are often seen as a foundational capability, an underpinning function, or simply a static resource separate from the organization’s core value chain. Framing data as part of a value chain can enable you to see the downstream impact of upstream IT data improvements in the activities that consume them.
    When the integrity and quality of IT data improves, leaders have more confidence in the decisions they make. They have the ability to evaluate opportunities and problems faster and more easily, without the need to question and independently validate the information they are continuously receiving. This increase in confidence can lead to the pursuit of more ambitious and more broadly scoped business opportunities, as well as the ability to preemptively mitigate organizational risks.

Improving the quality of the data available from IT enables data professionals to spot process-performance variances more easily and quickly, and to correlate previously independent data sets that can drive new operational insights.

    Data integration improvements across IT systems improve the efficiency of employees involved in executing transactional processes by reducing the need for redundant data entry tasks to keep operational data in sync as transactions flow through business processes. By removing manual tasks, managers and leaders have greater transparency into operational performance with a lower risk of intentional data manipulation and/or human error.
Blazent supports this by providing an automated solution that delivers high-quality data, using information gained from multiple sources to create refined data records.
  • The Three Key Requirements to Achieve Data Integrity Recorded: Apr 13 2017 3 mins
    Dan Ortega - Vice President of Marketing
Having data you can rely on is foundational to good decision making. Data Integrity is an important requirement that can be defined in many ways. The Techopedia definition of Data Integrity focuses on three key attributes: completeness, accuracy, and consistency.
In this video, we review these attributes in the context of IT Service and Operations Management. So, to begin:
    Completeness: A data record such as a description of an IT asset needs to be complete in order to satisfy the needs of all its consumers. For example, IT Operations cares about whether the asset is active, as well as its location, while Finance wants to manage attribution of software licenses. Gaps in the attribute data can impair an organization’s ability to manage the asset.
Accuracy: Having wrong or misleading data helps no one. Inaccuracy can stem from manual input errors, or from mishandled conflicts between sources, such as poor IT discovery tools that miss or double-count an asset.
    Consistency: This is one of the harder data integrity issues to resolve. If you only have a single source of data, it is likely to be consistent (although potentially consistently wrong). However, in order to verify the data, it has to be validated against multiple sources. Deciding which source is the most accurate is complicated, and setting up automated precedence rules can be challenging without the right tool.
Achieving and maintaining data integrity can be done using various error-checking methods such as normalization and validation procedures. Blazent’s Data Integrity platform was designed to make data management scalable through an automated process for exception handling. To learn more about the importance of good data integrity in an IT Service Management context, you can read Blazent’s white paper on Data Powered IT Service Management, available on the resources page of our website at www.blazent.com. A minimal sketch of the three checks described above follows this item.
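Here is a minimal sketch of the three integrity checks applied to an IT asset record. The field names, sources, and precedence order are illustrative assumptions, not Blazent's actual exception-handling rules.

```python
# Hypothetical sketch of completeness, accuracy, and consistency checks
# for an IT asset record. Fields, sources, and precedence are assumptions.

REQUIRED_FIELDS = {"asset_id", "status", "location", "owner"}   # completeness
SOURCE_PRECEDENCE = ["discovery_tool", "cmdb", "spreadsheet"]   # consistency

def completeness_gaps(record):
    """Return required attributes that are missing or empty."""
    return {f for f in REQUIRED_FIELDS if not record.get(f)}

def reconcile(field, source_records):
    """Resolve conflicting values: take the most trusted source that has one."""
    for source in SOURCE_PRECEDENCE:
        value = source_records.get(source, {}).get(field)
        if value:
            return value
    return None

sources = {
    "discovery_tool": {"asset_id": "srv-042", "status": "active", "location": ""},
    "cmdb": {"asset_id": "srv-042", "status": "retired", "location": "DC-1",
             "owner": "finance"},
}

merged = {f: reconcile(f, sources) for f in REQUIRED_FIELDS}
print("merged record:", merged)
print("remaining gaps:", completeness_gaps(merged) or "none")
```

The precedence list is the simplest possible answer to the "which source wins" question raised under Consistency; real deployments would weigh source trustworthiness per attribute.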
  • Top Five Limitations of IT Discovery Tools Recorded: Apr 2 2017 3 mins
    Dan Ortega - Vice President of Marketing
IT discovery tools promise to uncover all hardware, software, servers, databases, applications, dependencies and more. Unfortunately, discovery tools have some significant limitations.
    Below are the Top 5 limitations users face:
5. Complexity: Discovery tools usually depend on Unix or Linux package manager data, or Windows registry data, that is complex and hard to decipher. For discovery to be useful, you also need data that cannot be discovered automatically. For example, finding out who owns a software license on a server, and why it is deployed, is difficult if not impossible to obtain from automated discovery.

4. Inaccuracy: Software installers can fail to update fingerprints to reflect the true owner of the license, creating a false positive. Dependencies can cause false negatives. For example, if a given process only runs sporadically, the discovery process may have to be running at just the right time to catch it.

3. Currency: The discovery tool usually runs as a batch process. It caches data that becomes outdated almost immediately and stays stale until the next run, when the cycle repeats. Continuous discovery is rare because of the overhead it generates.

    2. Reach: It is rare for an organization to have a network that is not segmented. Discovery has to be given access to every segment to be complete.

    1. It’s just one source. Your business architecture, services, products, and applications must still be mapped. Who owns what? Who is called? How do escalation paths work? What about chargebacks? Who is accountable for compliance exceptions? None of this can be discovered. Discovery has to be augmented with data from many sources to be validated and complete.

Blazent has been working with some of the largest and most complex IT infrastructures in the world, helping customers overcome the limitations of their discovery tools. Learn more at www.blazent.com. The sketch below shows one simple way to augment discovery output with data it cannot see.
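As a minimal sketch of augmenting discovery output with undiscoverable context (limitations #5 and #1 above), assuming invented records and field names:

```python
# Hypothetical sketch: enriching automated discovery output with ownership
# data that discovery cannot see. Records and field names are made up.

discovered = [
    {"host": "app-01", "software": "OracleDB 12c"},
    {"host": "app-02", "software": "nginx 1.10"},
]

# Undiscoverable context maintained elsewhere (e.g., procurement records):
ownership = {
    "app-01": {"license_owner": "ERP team", "purpose": "order processing"},
    # app-02 has no owner on file: a gap discovery alone would never reveal.
}

for record in discovered:
    extra = ownership.get(record["host"])
    record.update(extra or {"license_owner": "UNKNOWN", "purpose": "UNKNOWN"})
    print(record)
```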
  • The Top Priorities for Data Intelligence and Integrity Solutions Recorded: Mar 28 2017 3 mins
    Dan Ortega - Vice President of Marketing
Because Blazent is a leading provider of Data Intelligence and Integrity solutions, I felt compelled to read a white paper by Philip Russom of TDWI Research outlining his top priorities for data quality solutions. This list is a subset of Philip’s, with a focus on the data that typically supports the IT Service Management (ITSM) functions in an organization. ITSM functions rely heavily on a foundation of dependable data. This makes improving data quality a critical requirement for the successful delivery of IT services.
Priority #1: Broader Scope for Data Quality. We say data quality as if it’s a single, solid monolith. In reality, data quality is a family of eight or more related techniques. Data standardization is the most commonly used technique, followed by verification, validation, monitoring, profiling, matching, and so on. These techniques are applicable to data used by any business function, including IT, Operational Technology (OT), Finance, Sales, and Marketing. Don’t make the mistake of limiting the benefits of data quality management to just IT.
    Priority #2: Real-Time Data Quality. TDWI’s survey revealed that real-time data quality is the second-fastest-growing data management discipline, after master data management and just before real-time data integration. Applying real-time data quality techniques as data is created and streamed makes ITSM more responsive to real-time business needs.
    Priority #3: Data Quality Services. Data quality techniques need to be generalized so they are available as services that can be called from a wide range of tools, applications, databases, and business processes. Data quality services enable greater interoperability among tools and modern application architectures, as well as reuse and consistency.
Because ITSM processes rely so heavily on data accuracy and completeness, data quality services have tremendous potential to drive operational efficiencies. A minimal sketch of such a callable service follows this item.
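A minimal sketch of data quality as a callable service, per Priority #3 above. The standardization and validation rules are illustrative assumptions, not TDWI's or Blazent's implementation.

```python
# Hypothetical sketch: standardization and validation exposed as a reusable
# "data quality service" any tool or process can call. Rules are assumptions.

def standardize(record):
    """Normalize formatting so records from different tools compare cleanly."""
    return {
        "hostname": record.get("hostname", "").strip().lower(),
        "os": record.get("os", "").strip().upper(),
    }

def validate(record):
    """Return a list of rule violations; an empty list means the record passes."""
    issues = []
    if not record["hostname"]:
        issues.append("missing hostname")
    if record["os"] not in {"LINUX", "WINDOWS", "AIX"}:
        issues.append(f"unrecognized os: {record['os']!r}")
    return issues

# Any caller (an ITSM tool, an ETL job, a stream processor) uses the same service:
incoming = {"hostname": "  Web-03 ", "os": "linux"}
clean = standardize(incoming)
print(clean, validate(clean) or "passes all checks")
```

Centralizing the rules this way is what gives the reuse and consistency the priority describes: every consumer sees the same cleansing logic.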
  • Avoiding IT and Operational Technology Convergence Pitfalls Recorded: Mar 23 2017 4 mins
    Dan Ortega - Vice President of Marketing
In this video, we discuss pitfalls to avoid when consolidating IT and Operational Technologies.

    A key technology convergence impacting the mainstream adoption of the Internet-of-Things (IoT) is the coming together of Information Technology (IT) and Operational Technology (OT).
    Below we explore five potential pitfalls to avoid when considering unified IT and OT:
1. Visibility: Improving visibility across unified IT/OT infrastructure has some benefits such as enabling a single service desk to handle both IT and OT domains, and being able to use common management tools.

2. Security: We have discussed how operational technology can create a risk for IT. There is, however, an upside of converging IT and OT. The converged technology infrastructure can be subject to the same security policies and can use common compliance controls.

3. Scalability: By operating OT and IT in separate silos, you miss out on opportunities to procure complementary technology for both. Purchasing can negotiate better discounts when buying technology in high volumes, and IT gets to buy IT and OT technology that works together because it can be pre-integrated.

4. Administration: By keeping IT and OT separated, an organization cannot benefit from being able to lower administration costs through streamlining and centralizing management.

5. Collaboration: Higher up the food chain, since OT is normally more closely aligned with how the business makes money, a converged IT and OT solution can improve the partnership between business and IT.
Blazent focuses on providing near real-time insights by ingesting and analyzing large numbers of IT and IoT data streams, correcting data gaps and inconsistencies before the data is consumed. The sketch below shows one simple gap-repair strategy for such a stream.
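As one toy example of repairing gaps in a telemetry stream before it is consumed, here is a last-known-good carry-forward in Python. The readings are invented, and this is just one simple strategy, not a description of Blazent's processing.

```python
# Hypothetical sketch: correcting gaps in an IoT telemetry stream before it
# is consumed, by carrying the last known-good reading forward.

def fill_gaps(stream):
    last_good = None
    for reading in stream:
        if reading is None:      # sensor dropout
            yield last_good
        else:
            last_good = reading
            yield reading

sensor_stream = [21.5, 21.7, None, None, 22.1, None, 22.4]
print(list(fill_gaps(sensor_stream)))
# -> [21.5, 21.7, 21.7, 21.7, 22.1, 22.1, 22.4]
```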
  • The Top 5 Configuration Management Database Challenges for 2017 Recorded: Mar 11 2017 4 mins
    Dan Ortega - VP of Marketing at Blazent
    While not particularly high-visibility, a well-managed Configuration Management Database (CMDB) provides organizations with tremendous value. Along with this value, a CMDB requires an organization to take responsibility for keeping it fit-for-purpose. Hewlett Packard Enterprise (HPE) sponsored a leading analyst firm to poll 100 IT executives on CMDB and discovery tools. This video reviews the most interesting findings on CMDB challenges.
  • The Top 5 Reasons to Augment Discovery Tools Recorded: Mar 7 2017 4 mins
    Dan Ortega - VP of Marketing at Blazent
In this episode we cover Blazent’s Top 5 reasons that discovery tools need to be augmented with a process that increases data accuracy to levels that make the data authoritative and useful to multiple IT and Finance functions.
  • Top 5 Enterprise Data Layers Recorded: Feb 26 2017 3 mins
    Dan Ortega - VP of Marketing at Blazent
    In this video, we look at the top 5 data layers that manage the operations and governance of an enterprise.
  • Top Five Reasons to Use Verified Asset Discovery Recorded: Feb 16 2017 4 mins
    Dan Ortega - VP of Marketing for Blazent
    In this Top 5 video we list the top responses from a survey of 58 IT managers on the benefits they observed from using IT asset discovery and mapping technology.
    During February 2017, Forrester Research published a report titled, “IT Efficiency Begins with Effective Discovery and Dependency Mapping,” sponsored by BMC, which focuses on the benefits of using discovery and mapping tools.
    The top 5 responses, in ascending order and assuming validated data, are as follows:
    5. Improved disaster recovery: An accurate inventory of systems and services to be failed-over or recovered in the event of an outage is essential for a successful disaster recovery. This is an important factor to consider when deciding how often to refresh any inventory.
    4. Improved ability to track and report on existing and new assets: Maintaining an accurate inventory improves service management, compliance, security and IT operations functions. Conversely, an un-validated asset inventory will hinder these functions.
    3. Ability to reduce risk by upgrading software: Unsupported software creates a weakness in an organization’s armor that attackers can exploit. Software versions must be meticulously tracked and updated to contain exposures and control risk.
2. Better asset management: When managing the asset life cycle, delaying the recording of milestones, such as deployments, decommissions and changes in asset ownership, can be costly. In an environment that outsources the management of assets, inaccurate billing can occur, which is expensive, burdensome to reconcile, and reduces customer satisfaction.
    1. Better able to address compliance issues: Compliance was the #1 reason to automate asset discovery, since incomplete or inaccurate inventories can lead to compliance audit failures and fines. The fines can be from vendors or worse, regulators who have the power to interrupt business operations.
  • Top Five Steps to Improve Business Insight Through Data Quality Recorded: Feb 10 2017 3 mins
    Dan Ortega - VP of Marketing at Blazent
    Organizations generate phenomenal amounts of data every day.
    This video suggests five steps that should be taken to improve operational insight as well as to minimize the negative effects of bad data.
  • Top 5 Security Breaches of 2016 Recorded: Feb 1 2017 5 mins
    Dan Ortega - VP of Marketing at Blazent
In this episode, we look back at the Top 5 Security Breaches of 2016. These include commercial companies such as Yahoo as well as US states.
  • Top 5 Benefits of Doing IoT Right Recorded: Jan 23 2017 4 mins
    Dan Ortega - VP of Marketing at Blazent
In this episode we review the Top 5 reasons why IoT is fast becoming mainstream. The benefits come from a November 2016 Hewlett Packard Enterprise survey of approximately 80 IT professionals, conducted to understand how IoT was working for them.
  • Top 5 Accelerators Defining the Business Landscape in 2020 Recorded: Jan 18 2017 4 mins
    Dan Ortega - Blazent's Vice President of Marketing
    In this episode we will review the Top 5 Accelerators Defining the Business Landscape in 2020. What follows is a discussion of the accelerators primarily identified by Forrester and their impact on IT.
  • Top 5 CMDB Source Types Recorded: Jan 10 2017 4 mins
    Dan Ortega - Vice President of Marketing
    In this episode we will review our list of the Top 5 CMDB Source Types.
    These are based on data gathered from hundreds of CMDB stand-ups and data quality initiatives using our data quality management solutions.
  • Top 5 IoT applications for IT Recorded: Jan 4 2017 4 mins
    Dan Ortega - Vice President of Marketing
In this episode we will review the Top 5 Internet-of-Things (IoT) applications for IT. This Top 5 list was inspired by an article in Forbes reporting on Gartner predictions for Internet-of-Things technologies in 2017 and 2018.