The Foundation of Data Quality

Blazent's CEO, Charlie Piper, and Dan Ortega introduce the company's strategy and vision, and its value to customers and MSP partners. Together, Charlie and Dan describe how Blazent's platform finds the most accurate data to improve decision making in IT and beyond.
Recorded: Jul 22 2016 5 mins
Presented by
Charlie Piper - CEO and Dan Ortega - VP of Marketing
Presentation preview: The Foundation of Data Quality

  • Why Is Operational Data Important for IT? Recorded: Apr 28 2017 4 mins
    Dan Ortega - Vice President of Marketing
    Each day, with every customer transaction, employee task and business process, companies generate vast amounts of operational data that provides leaders and managers with insight into what is working well and what requires attention. Operational data is particularly important to those responsible for stewarding the information and technology assets of their organization.
    This makes it critical to understand the three different types of operational data on which IT leaders rely.
    Business operational data is all about the business processes and user experiences, which IT enables with the technology and services it provides. The reason organizations invest in technology is to improve the productivity and effectiveness of business operations. Process and user-related data evaluated over time provides a contextual picture into how effectively the technology is achieving that goal.
    IT operational data is concerned with “what” technology components are operating and being used. It is important as part of the IT planning process for understanding capacity utilization, determining where scalability constraints exist, understanding the cost of services provided to users, and assessing security and risk considerations of the business-technology ecosystem. Within IT service management processes, operational data is critical to ensure performance and availability Service Level Agreements (SLAs) are honored, and to drive technology cost reduction through infrastructure optimization.
    Operational data provides IT with the critical picture it needs to understand and optimize the role it plays in the context of the company.
  • Benefits of Machine Learning in IT Infrastructure Recorded: Apr 21 2017 4 mins
    Dan Ortega - Vice President of Marketing
    During the next 5 years, machine learning is poised to play a pivotal and transformational role in how IT Infrastructure is managed. Two key scenarios are possible: transforming infrastructure from a set of under-utilized capital assets to a highly efficient set of operational resources through dynamic provisioning based on consumption; and the identification of configurations, dependencies and the cause/effect of usage patterns through correlation analysis.
    In the world of IT infrastructure, it’s all about efficient use of resources. With on-premise infrastructure (compute, storage and network) utilization rates for most organizations in the low single digits, the cloud has sold the promise of a breakthrough. For those organizations moving to Infrastructure as a Service (IaaS), utilization in the middle to high teens is possible, and for those moving to Platform as a Service (PaaS), utilization in the mid-twenties is within reach.
    Dynamic provisioning driven by demand is essentially the same operational concept as power grids and municipal water systems – capacity allocation driven by where resources are consumed, rather than where they are produced.
    The second part of the breakthrough relates to right-sizing infrastructure. Whether this is network capacity or compute Virtual Machine size – machine learning will enable analysis of the patterns of behavior by users and correlate them to the consumption of infrastructure resources.
    During the near term, these benefits will be much more tactical. Automated discovery combined with behavioral correlation analysis will virtually eliminate the need for manual inventory and mapping of components and configuration items in the IT ecosystem to reveal how the ecosystem is operating.
    Today, IT has the opportunity to automate the mapping of components in their infrastructure to provide a more accurate and actionable picture.
  • Downstream Impacts of IT Data Improvement Recorded: Apr 17 2017 4 mins
    Dan Ortega - Vice President of Marketing
    How well prepared is your organization for growth? What are the challenges to making progress?
    IT systems and the data they contain are often seen as a foundational capability, an underpinning function, or simply a static resource separate from the organization’s core value chain. Framing data as part of a value chain can enable you to see the downstream impact of upstream IT data improvements in the activities that consume them.
    When the integrity and quality of IT data improves, leaders have more confidence in the decisions they make. They have the ability to evaluate opportunities and problems faster and more easily, without the need to question and independently validate the information they are continuously receiving. This increase in confidence can lead to the pursuit of more ambitious and more broadly scoped business opportunities, as well as the ability to preemptively mitigate organizational risks.

    Improving the quality of the data available from IT enables data professionals to see process-performance variances more easily and quickly, and to correlate previously independent data sets to drive new operational insights.

    Data integration improvements across IT systems improve the efficiency of employees involved in executing transactional processes by reducing the need for redundant data entry tasks to keep operational data in sync as transactions flow through business processes. By removing manual tasks, managers and leaders have greater transparency into operational performance with a lower risk of intentional data manipulation and/or human error.
    Blazent provides an automated solution that delivers the highest data quality, using information gained from multiple sources to create refined data records.
  • The Three Key Requirements to Achieve Data Integrity Recorded: Apr 13 2017 3 mins
    Dan Ortega - Vice President of Marketing
    Having data you can rely on is foundational to good decision making. Data Integrity is an important requirement which can be defined in many ways. The Technopedia definition of Data Integrity focuses on three key attributes of completeness, accuracy, and consistency.
    In this video, we review these attributes in the context of IT Service and Operations Management. So, to begin:
    Completeness: A data record such as a description of an IT asset needs to be complete in order to satisfy the needs of all its consumers. For example, IT Operations cares about whether the asset is active, as well as its location, while Finance wants to manage attribution of software licenses. Gaps in the attribute data can impair an organization’s ability to manage the asset.
    Accuracy: Having wrong or misleading data helps no one. Inaccuracy can be caused by manual input errors, or by mishandled conflicts between sources, such as poor IT discovery tools that miss or double-count an asset.
    Consistency: This is one of the harder data integrity issues to resolve. If you only have a single source of data, it is likely to be consistent (although potentially consistently wrong). However, in order to verify the data, it has to be validated against multiple sources. Deciding which source is the most accurate is complicated, and setting up automated precedence rules can be challenging without the right tool.
    Achieving and maintaining data integrity can be done using various error-checking methods, such as normalization and validation procedures. Blazent’s Data Integrity platform was designed to make data management scalable through an automated process for exception handling. To learn more about the importance of good data integrity in an IT Service Management context, you can read Blazent’s white paper on Data Powered IT Service Management, available on the resources page of our website at www.blazent.com.
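The completeness and consistency attributes described above can be sketched as simple record checks. This is an illustrative Python sketch only; the field names, source names, and required-attribute set are hypothetical examples, not Blazent's actual schema or platform logic:

```python
# Illustrative sketch of two of the data-integrity checks described above.
# Field names and required attributes are hypothetical, not Blazent's schema.

REQUIRED_FIELDS = {"asset_id", "status", "location", "license_owner"}

def completeness(record: dict) -> set:
    """Return the set of required attributes missing or empty in a record."""
    present = {k for k, v in record.items() if v not in (None, "")}
    return REQUIRED_FIELDS - present

def consistency(records_by_source: dict) -> dict:
    """Compare the same asset as reported by multiple sources.
    Returns {field: {source: value}} for every field the sources disagree on."""
    conflicts = {}
    fields = set().union(*(r.keys() for r in records_by_source.values()))
    for field in fields:
        values = {src: r[field] for src, r in records_by_source.items() if field in r}
        if len(set(values.values())) > 1:
            conflicts[field] = values
    return conflicts

# Example: two sources agree on status but disagree on an asset's location.
sources = {
    "discovery_tool": {"asset_id": "srv-042", "status": "active", "location": "DC-1"},
    "cmdb":           {"asset_id": "srv-042", "status": "active", "location": "DC-2"},
}
print(completeness(sources["cmdb"]))  # required attributes the CMDB record lacks
print(consistency(sources))           # fields on which the two sources conflict
```

In practice, a conflict like the `location` disagreement above is exactly the case where precedence rules or manual exception handling decide which source wins.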
  • Top Five Limitations of IT Discovery Tools Recorded: Apr 2 2017 3 mins
    Dan Ortega - Vice President of Marketing
    IT discovery tools promise to uncover all hardware, software, servers, databases, applications, dependencies and more. Unfortunately, discovery tools have some significant limitations.
    Below are the Top 5 limitations users face:
    5. Complexity: Discovery tools usually depend on Unix or Linux package-manager data or Windows registry data, which is complex and hard to decipher. For discovery to be fully useful, you also need data that is not discoverable: for example, finding out who owns a software license on a server, and why it is deployed, is difficult if not impossible through automated discovery alone.

    4. Inaccuracy: Software installers can fail to update fingerprints to reflect the true owner of the license creating a false positive. Dependencies can cause false negatives. For example, if a given process only runs sporadically, then the discovery process may have to be running at just the right time to catch it.

    3. Currency: The discovery tool usually runs as a batch process. It caches the data, which can then become outdated almost immediately, until the next run of the discovery, when it becomes outdated again almost immediately. Continuous discovery is rare because of the overhead it generates.

    2. Reach: It is rare for an organization to have a network that is not segmented. Discovery has to be given access to every segment to be complete.

    1. It’s just one source. Your business architecture, services, products, and applications must still be mapped. Who owns what? Who is called? How do escalation paths work? What about chargebacks? Who is accountable for compliance exceptions? None of this can be discovered. Discovery has to be augmented with data from many sources to be validated and complete.

    Blazent has been working with some of the largest and most complex IT infrastructures in the world, helping customers overcome the limitations of their discovery tools. Learn more at www.blazent.com
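Augmenting discovery with other sources, as limitation #1 above describes, typically comes down to a precedence-driven merge: each attribute of an asset is taken from the most trusted source that reports it. A minimal sketch, assuming hypothetical source names and a hypothetical precedence order (this is not Blazent's actual implementation):

```python
# Minimal sketch of precedence-driven reconciliation across data sources.
# Source names and the precedence order below are hypothetical examples.

# Highest-trust source first; each field is taken from the first source that has it.
PRECEDENCE = ["asset_db", "discovery_tool", "spreadsheet"]

def merge_asset(records: dict) -> dict:
    """Merge per-source records for one asset into a single refined record."""
    merged = {}
    for source in PRECEDENCE:
        for field, value in records.get(source, {}).items():
            # setdefault keeps the value already taken from a higher-precedence source
            merged.setdefault(field, value)
    return merged

# Example: discovery sees the OS but not the owner; the asset database knows the owner.
records = {
    "discovery_tool": {"asset_id": "srv-042", "os": "RHEL 7"},
    "asset_db":       {"asset_id": "srv-042", "owner": "Finance"},
}
print(merge_asset(records))
```

A real reconciliation pipeline would also need per-field precedence (one source may be authoritative for ownership, another for hardware attributes) and exception handling for conflicts, but the merge principle is the same.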
  • The Top Priorities for Data Intelligence and Integrity Solutions Recorded: Mar 28 2017 3 mins
    Dan Ortega - Vice President of Marketing
    Because Blazent is a leading provider of Data Intelligence and Integrity solutions, I felt compelled to read a white paper by Philip Russom of TDWI Research outlining his top priorities for data quality solutions. This list is a subset of Philip’s, with a focus on the data that typically supports the IT Service Management (ITSM) functions in an organization. ITSM functions rely heavily on a foundation of dependable data, which makes improving data quality a critical requirement for the successful delivery of IT services.
    Priority #1: Broader Scope for Data Quality. We talk about data quality as if it were a single, solid monolith. In reality, data quality is a family of eight or more related techniques. Data standardization is the most commonly used technique, followed by verification, validation, monitoring, profiling, matching, and so on. These techniques are applicable to data used by any business function, including IT, Operational Technology (OT), Finance, Sales, and Marketing. Don’t make the mistake of limiting the benefits of data quality management to just IT.
    Priority #2: Real-Time Data Quality. TDWI’s survey revealed that real-time data quality is the second-fastest-growing data management discipline, after master data management and just before real-time data integration. Applying real-time data quality techniques as data is created and streamed can make ITSM more responsive to real-time business needs.
    Priority #3: Data Quality Services. Data quality techniques need to be generalized so they are available as services that can be called from a wide range of tools, applications, databases, and business processes. Data quality services enable greater interoperability among tools and modern application architectures, as well as reuse and consistency.
    Because ITSM processes rely so heavily on data accuracy and completeness, data quality services have a tremendous value potential to drive operational efficiencies.
  • Avoiding IT and Operational Technology Convergence Pitfalls Recorded: Mar 23 2017 4 mins
    Dan Ortega - Vice President of Marketing
    In this video, we discuss pitfalls to avoid when consolidating IT and Operational Technologies.

    A key technology convergence impacting the mainstream adoption of the Internet-of-Things (IoT) is the coming together of Information Technology (IT) and Operational Technology (OT).
    Below we explore five potential pitfalls to avoid when considering unified IT and OT:
    1. Visibility: Improving visibility across unified IT/OT infrastructure has some benefits such as enabling a single service desk to handle both IT and OT domains, and being able to use common management tools.

    2. Security: We have discussed how operational technology can create a risk for IT. There is, however, an upside of converging IT and OT. The converged technology infrastructure can be subject to the same security policies and can use common compliance controls.

    3. Scalability: By operating OT and IT in separate silos, you miss out on opportunities to procure complementary technology for both. Purchasing can negotiate better discounts when buying technology in higher volumes, and IT gets to buy IT and OT technology that works together because it can be pre-integrated.

    4. Administration: By keeping IT and OT separated, an organization cannot benefit from being able to lower administration costs through streamlining and centralizing management.

    5. Collaboration: Higher up the food chain, since OT is normally more closely aligned with how the business makes money, a converged IT and OT solution can improve the partnership between business and IT.
    Blazent focuses on providing near real-time insights that can be gained by being able to ingest and analyze large numbers of IT and IoT data streams, correcting data gaps and inconsistencies before the data is consumed.
  • The Top 5 Configuration Management Database Challenges for 2017 Recorded: Mar 11 2017 4 mins
    Dan Ortega - VP of Marketing at Blazent
    While not particularly high-visibility, a well-managed Configuration Management Database (CMDB) provides organizations with tremendous value. Along with this value, a CMDB requires an organization to take responsibility for keeping it fit-for-purpose. Hewlett Packard Enterprise (HPE) sponsored a leading analyst firm to poll 100 IT executives on CMDB and discovery tools. This video reviews the most interesting findings on CMDB challenges.
  • The Top 5 Reasons to Augment Discovery Tools Recorded: Mar 7 2017 4 mins
    Dan Ortega - VP of Marketing at Blazent
    In this episode we cover Blazent’s Top 5 reasons that discovery tools need to be augmented with a process that increases data accuracy to levels that make the data authoritative and useful to multiple IT and Finance functions.
  • Top 5 Enterprise Data Layers Recorded: Feb 26 2017 3 mins
    Dan Ortega - VP of Marketing at Blazent
    In this video, we look at the top 5 data layers that manage the operations and governance of an enterprise.
  • Top Five Reasons to Use Verified Asset Discovery Recorded: Feb 16 2017 4 mins
    Dan Ortega - VP of Marketing for Blazent
    In this Top 5 video we list the top responses from a survey of 58 IT managers on the benefits they observed from using IT asset discovery and mapping technology.
    During February 2017, Forrester Research published a report titled, “IT Efficiency Begins with Effective Discovery and Dependency Mapping,” sponsored by BMC, which focuses on the benefits of using discovery and mapping tools.
    The top 5 responses, in ascending order and assuming validated data, are as follows:
    5. Improved disaster recovery: An accurate inventory of systems and services to be failed-over or recovered in the event of an outage is essential for a successful disaster recovery. This is an important factor to consider when deciding how often to refresh any inventory.
    4. Improved ability to track and report on existing and new assets: Maintaining an accurate inventory improves service management, compliance, security and IT operations functions. Conversely, an un-validated asset inventory will hinder these functions.
    3. Ability to reduce risk by upgrading software: Unsupported software creates a weakness in an organization’s armor that attackers can exploit. Software versions must be meticulously tracked and updated to contain exposures and control risk.
    2. Better asset management: When managing the asset life cycle, delaying the recording of milestones such as deployments, decommissions and changes in asset ownership can be costly. In an environment that outsources the management of assets, inaccurate billing can occur, which is expensive, burdensome to reconcile, and reduces customer satisfaction.
    1. Better able to address compliance issues: Compliance was the #1 reason to automate asset discovery, since incomplete or inaccurate inventories can lead to compliance audit failures and fines. The fines can be from vendors or worse, regulators who have the power to interrupt business operations.
  • Top Five Steps to Improve Business Insight Through Data Quality Recorded: Feb 10 2017 3 mins
    Dan Ortega - VP of Marketing at Blazent
    Organizations generate phenomenal amounts of data every day.
    This video suggests five steps that should be taken to improve operational insight as well as to minimize the negative effects of bad data.
  • Top 5 Security Breaches of 2016 Recorded: Feb 1 2017 5 mins
    Dan Ortega - VP of Marketing at Blazent
    In this episode, we look back at the Top 5 Security Breaches of 2016. These include commercial companies such as Yahoo, as well as U.S. states.
  • Top 5 Benefits of Doing IoT Right Recorded: Jan 23 2017 4 mins
    Dan Ortega - VP of Marketing at Blazent
    In this episode we will review some of the Top 5 reasons why IoT is fast becoming mainstream. The benefits came from a November 2016 Hewlett Packard Enterprise survey of approximately 80 IT professionals, conducted to understand how IoT was working for them.
  • Top 5 Accelerators Defining the Business Landscape in 2020 Recorded: Jan 18 2017 4 mins
    Dan Ortega - Blazent's Vice President of Marketing
    In this episode we will review the Top 5 Accelerators Defining the Business Landscape in 2020. What follows is a discussion of the accelerators primarily identified by Forrester and their impact on IT.
  • Top 5 CMDB Source Types Recorded: Jan 10 2017 4 mins
    Dan Ortega - Vice President of Marketing
    In this episode we will review our list of the Top 5 CMDB Source Types.
    These are based on data gathered from hundreds of CMDB stand-ups and data quality initiatives using our data quality management solutions.
  • Top 5 IoT applications for IT Recorded: Jan 4 2017 4 mins
    Dan Ortega - Vice President of Marketing
    In this episode we will review the Top 5 Internet-of-Things (IoT) applications for IT. This Top 5 list was inspired by an article in Forbes reporting on Gartner predictions for IoT technologies in 2017 and 2018.
  • Top 5 CMDB Asset Types Recorded: Dec 17 2016 5 mins
    Dan Ortega - Vice President of Marketing
    In this episode of Blazent's Top 5, we will review the Top 5 CMDB asset types we have encountered across hundreds of stand-ups and data quality engagements.
  • Top 5 IT Considerations for M&A Recorded: Dec 9 2016 4 mins
    Dan Ortega - Vice President of Marketing
    Mergers and Acquisitions (M&A) are a widely used strategy for organizations. In this episode we discuss the top 5 IT considerations for M&A.
  • Top 5 Reasons Why CMDB Initiatives Fail Recorded: Dec 6 2016 4 mins
    Dan Ortega - Vice President of Marketing
    Configuration Management Database (CMDB) initiatives are complex and require engagement from multiple IT disciplines to be successful.

    Understanding why CMDB initiatives fail can help you make the right decisions early on and avoid running off the road.
