

  • Why CMDBs are Sexier than you Think
    Dan Ortega - Vice President of Marketing Recorded: May 18 2017 3 mins
    Sexy may not be the first word that comes to mind when you think about your CMDB and the operational data of your company… but (seriously) maybe it should be! After all, your CMDB has a number of attractive qualities and (with some care and feeding) could be the ideal partner for a lasting long-term relationship. There are lots of potential reasons this can work, but let’s focus on the top three:
    Substance: Your CMDB is not shallow and fickle, it is strong and deep, with a history as long as your company’s. The CMDB is built on a core of your master data and pulls together all of the facets of operational data your company creates every day. It contains the complex web of connective tissue that can help you understand how your company works. Those insights then become part of the CMDB itself – enabling the strength of your data to be balanced by the wisdom that comes from analytics and self-awareness.
    Long-term potential: You may lust after the latest new tool or trend, but your CMDB will stand by your company’s side through thick and thin, long into the future. It will grow and evolve with you, always be honest about what’s going on, and work with you to provide insights to get your company through troubled times. As your company changes with new markets, products, customers, and competitors or becomes a part of something bigger through acquisition or partnership, your CMDB is there to help you navigate the changes and achieve success.
    Air of mystery: You may never fully understand all of the secrets that your CMDB holds about your company. As you unlock one insight, the potential for others seems to appear magically. What would you expect from something that brings together all parts of your company data and the complex interrelationships in one place for you to explore?
    Deep substance, long-term potential and an air of mystery. Maybe your CMDB is sexier than you think.
  • Data Integrity the Key to Operational Insights or an Elephant in the Room?
    Dan Ortega - Vice President of Marketing Recorded: May 5 2017 4 mins
    Throughout history, business has always struggled with the challenge of data accuracy and integrity. Executives constantly ask their IT leaders how they can improve the quality and integrity of data in order to obtain the insights needed to guide their company effectively. While it sounds reasonable, it may well be the wrong question. Rather than focusing on the quality of raw data, a better approach is to focus on the quality of insights available and the speed/cost to obtain them by asking, “How can we better leverage the data we already have to cost effectively obtain the insights we need?”
    Advances in machine learning, data science and correlation analysis during the past decade have enabled a broader range of capabilities to analyze data from disparate operational processes and information systems. This has been accomplished without developing some of the structured relationships and incurring the data-model-integration costs associated with traditional data warehousing and reporting approaches.
    By assessing the trends and relationships between different data elements, modern data analysis systems are able to “discover” a variety of insights that may not have been available in the past. Examples include undocumented dependencies within operational processes, sources of data inaccuracy, and the evolution of operational processes over time. Instead of focusing on what is “known” about operational data, modern methods focus on understanding what is “unknown” about it.
    Is data integrity the key to operational insights or is it the elephant in the room? That depends on how organizations want to view the situation. Data Integrity at both the informational and operational level is a core requirement of any modern business, and has been an area of focus for Blazent since the early days of Big Data.
  • Why Is Operational Data Important for IT?
    Dan Ortega - Vice President of Marketing Recorded: Apr 28 2017 4 mins
    Each day, with every customer transaction, employee task and business process, companies generate vast amounts of operational data that provides leaders and managers with insight into what is working well and what requires attention. Operational data is particularly important to those responsible for stewarding the information and technology assets of their organization.
    In this context, operational data is particularly important to IT, which is why it is so critical to understand the three different types of operational data on which IT leaders rely.
    Business operational data is all about the business processes and user experiences, which IT enables with the technology and services it provides. The reason organizations invest in technology is to improve the productivity and effectiveness of business operations. Process and user-related data evaluated over time provides a contextual picture into how effectively the technology is achieving that goal.
    IT operational data is concerned with “what” technology components are operating and being used. It is important as part of the IT planning process to understand capacity utilization and determine where scalability constraints exist, as well as to understand the cost of services provided to users and to assess security and risk considerations of the business-technology ecosystem. Within IT service management processes, operational data is critical to ensure that performance and availability Service Level Agreements (SLAs) are honored, and to drive technology cost reduction through infrastructure optimization.
    Operational data provides IT with the critical picture it needs to understand and optimize the role it plays in the context of the company.
  • Benefits of Machine Learning in IT Infrastructure
    Dan Ortega - Vice President of Marketing Recorded: Apr 21 2017 4 mins
    Over the next five years, machine learning is poised to play a pivotal, transformational role in how IT infrastructure is managed. Two key scenarios are possible: transforming infrastructure from a set of under-utilized capital assets into a highly efficient set of operational resources through dynamic, consumption-based provisioning; and identifying configurations, dependencies, and the cause and effect of usage patterns through correlation analysis.
    In the world of IT infrastructure, it’s all about efficient use of resources. With on-premise infrastructure (compute, storage and network) utilization rates for most organizations in the low single digits, the cloud has sold the promise of a breakthrough. For those organizations moving to Infrastructure as a Service (IaaS), utilization in the middle to high teens is possible, and for those moving to Platform as a Service (PaaS), utilization in the mid-twenties is within reach.
    Dynamic provisioning driven by demand is essentially the same operational concept as power grids and municipal water systems – capacity allocation driven by where resources are consumed, rather than where they are produced.
    The second part of the breakthrough relates to right-sizing infrastructure. Whether this is network capacity or compute Virtual Machine size – machine learning will enable analysis of the patterns of behavior by users and correlate them to the consumption of infrastructure resources.
    In the near term, these benefits will be more tactical. Automated discovery combined with behavioral correlation analysis will virtually eliminate the need to manually inventory and map components and configuration items in the IT ecosystem, revealing how the ecosystem is operating.
    Today, IT has the opportunity to automate the mapping of components in their infrastructure to provide a more accurate and actionable picture.
  • Downstream Impacts of IT Data Improvement
    Dan Ortega - Vice President of Marketing Recorded: Apr 17 2017 4 mins
    How well prepared is your organization for growth? What are the challenges to making progress?
    IT systems, and the data they contain, are often seen as a foundational capability, an underpinning function, or simply a static resource separate from the organization’s core value chain. Framing data as part of a value chain lets you see the downstream impact of upstream IT data improvements on the activities that consume them.
    When the integrity and quality of IT data improves, leaders have more confidence in the decisions they make. They have the ability to evaluate opportunities and problems faster and more easily, without the need to question and independently validate the information they are continuously receiving. This increase in confidence can lead to the pursuit of more ambitious and more broadly scoped business opportunities, as well as the ability to preemptively mitigate organizational risks.

    Improving the quality of the data available from IT enables data professionals to see process-performance variances more easily and quickly, and to correlate previously independent data sets that can drive new operational insights.

    Data integration improvements across IT systems improve the efficiency of employees involved in executing transactional processes by reducing the need for redundant data entry tasks to keep operational data in sync as transactions flow through business processes. By removing manual tasks, managers and leaders have greater transparency into operational performance with a lower risk of intentional data manipulation and/or human error.
    Blazent provides an automated solution that delivers the highest data quality by using information gained from multiple sources to create refined data records.
  • The Three Key Requirements to Achieve Data Integrity
    Dan Ortega - Vice President of Marketing Recorded: Apr 13 2017 3 mins
    Having data you can rely on is foundational to good decision making. Data Integrity is an important requirement which can be defined in many ways. The Techopedia definition of Data Integrity focuses on three key attributes: completeness, accuracy, and consistency.
    In this video, we review these attributes in the context of IT Service and Operations Management. So, to begin:
    Completeness: A data record such as a description of an IT asset needs to be complete in order to satisfy the needs of all its consumers. For example, IT Operations cares about whether the asset is active, as well as its location, while Finance wants to manage attribution of software licenses. Gaps in the attribute data can impair an organization’s ability to manage the asset.
    Accuracy: Having wrong or misleading data helps no one. Inaccuracy can stem from manual input errors, or from mishandled conflicts between data sources, such as poor IT discovery tools that miss or double-count an asset.
    Consistency: This is one of the harder data integrity issues to resolve. If you only have a single source of data, it is likely to be consistent (although potentially consistently wrong). However, in order to verify the data, it has to be validated against multiple sources. Deciding which source is the most accurate is complicated, and setting up automated precedence rules can be challenging without the right tool.
    Achieving and maintaining data integrity can be done using various error-checking methods such as normalization and validation procedures. Blazent’s Data Integrity platform was designed to make data management scalable through an automated process for exception handling. To learn more about the importance of good data integrity in an IT Service Management context, you can read Blazent’s white paper on Data Powered IT Service Management, available on the resources page on our website at www.blazent.com
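    The precedence rules mentioned above can be sketched minimally (the sources, field names, and rules are hypothetical illustrations, not Blazent's actual engine):

```python
# Two hypothetical records for the same asset from different sources.
discovery = {"hostname": "srv-042", "os": "RHEL 7.9", "location": None}
cmdb = {"hostname": "srv-042", "os": "RHEL 7.4", "location": "DC-East"}

# Precedence: trust discovery for technical attributes, the CMDB for
# business attributes; fall back to whichever source has a value.
precedence = {"os": "discovery", "location": "cmdb"}
sources = {"discovery": discovery, "cmdb": cmdb}

def reconcile(field):
    preferred = sources[precedence.get(field, "discovery")]
    other = cmdb if preferred is discovery else discovery
    return preferred.get(field) or other.get(field)

record = {f: reconcile(f) for f in discovery}
# Completeness check: flag any attribute no source could supply.
missing = [f for f, v in record.items() if v is None]
print(record, missing)
```

Here the reconciled record takes the fresher OS value from discovery and the location from the CMDB; real platforms add automated exception handling for the conflicts such rules cannot resolve.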
  • Top Five Limitations of IT Discovery Tools
    Dan Ortega - Vice President of Marketing Recorded: Apr 2 2017 3 mins
    IT discovery tools promise to uncover all hardware, software, servers, databases, applications, dependencies and more. Unfortunately, discovery tools have some significant limitations.
    Below are the Top 5 limitations users face:
    5. Complexity: Discovery tools usually depend on Unix or Linux package-manager data or Windows registry data that is complex and hard to decipher. For discovery to be truly useful, you also need data that cannot be discovered automatically: finding out who owns a software license on a server, and why it is deployed there, is difficult if not impossible to obtain from automated discovery.

    4. Inaccuracy: Software installers can fail to update fingerprints to reflect the true owner of the license, creating a false positive. Dependencies can cause false negatives. For example, if a given process only runs sporadically, the discovery process may have to be running at just the right time to catch it.

    3. Currency: The discovery tool usually runs as a batch process. It caches the data, which can become outdated almost immediately and remains stale until the next discovery run. Continuous discovery is rare because of the overhead it generates.

    2. Reach: It is rare for an organization to have a network that is not segmented. Discovery has to be given access to every segment to be complete.

    1. It’s just one source. Your business architecture, services, products, and applications must still be mapped. Who owns what? Who is called? How do escalation paths work? What about chargebacks? Who is accountable for compliance exceptions? None of this can be discovered. Discovery has to be augmented with data from many sources to be validated and complete.

    Blazent has been working with some of the largest and most complex IT infrastructures in the world helping customers overcome the limitations of their discovery tools. Learn more at www.blazent.com
  • The Top Priorities for Data Intelligence and Integrity Solutions
    Dan Ortega - Vice President of Marketing Recorded: Mar 28 2017 3 mins
    Because Blazent is a leading provider of Data Intelligence and Integrity solutions, I felt compelled to read a white paper by Philip Russom of TDWI Research outlining his top priorities for data quality solutions. This list is a subset of Philip’s, focused on the data that typically supports the IT Service Management (ITSM) functions in an organization. ITSM functions rely heavily on a foundation of dependable data, which makes improving data quality a critical requirement for the successful delivery of IT services.
    Priority #1: Broader Scope for Data Quality. We say “data quality” as if it’s a single, solid monolith. In reality, data quality is a family of eight or more related techniques. Data standardization is the most commonly used, followed by verification, validation, monitoring, profiling, matching, and so on. These techniques apply to data used by any business function, including IT, Operational Technology (OT), Finance, Sales, and Marketing. Don’t make the mistake of limiting the benefits of data quality management to just IT.
    Priority #2: Real-Time Data Quality. TDWI’s survey revealed that real-time data quality is the second-fastest-growing data management discipline, after master data management and just before real-time data integration. Applying real-time data quality techniques as data is created and streamed can make ITSM more responsive to real-time business needs.
    Priority #3: Data Quality Services. Data quality techniques need to be generalized so they are available as services that can be called from a wide range of tools, applications, databases, and business processes. Data quality services enable greater interoperability among tools and modern application architectures, as well as reuse and consistency.
    Because ITSM processes rely so heavily on data accuracy and completeness, data quality services have a tremendous value potential to drive operational efficiencies.
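    A callable data quality service of the kind described here might look like this minimal sketch (the field names and rules are hypothetical illustrations, not a TDWI or Blazent API):

```python
# A validation check packaged as a reusable service: any tool or
# process can call validate() on a record and get back a list of issues.
REQUIRED = {"asset_id", "owner", "status"}
VALID_STATUS = {"active", "retired", "in-repair"}

def validate(record: dict) -> list[str]:
    # Completeness: every required field must be present.
    issues = [f"missing field: {f}" for f in sorted(REQUIRED - record.keys())]
    # Validity: status must come from the controlled vocabulary.
    status = record.get("status")
    if status is not None and status not in VALID_STATUS:
        issues.append(f"invalid status: {status}")
    return issues

print(validate({"asset_id": "A-100", "status": "unknown"}))
# flags the missing owner and the invalid status value
```

Exposing checks like this behind one interface is what lets many tools share the same rules, giving the reuse and consistency the priority describes.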
  • Avoiding IT and Operational Technology Convergence Pitfalls
    Dan Ortega - Vice President of Marketing Recorded: Mar 23 2017 4 mins
    In this video, we discuss pitfalls to avoid when consolidating IT and Operational Technologies.

    A key technology convergence impacting the mainstream adoption of the Internet-of-Things (IoT) is the coming together of Information Technology (IT) and Operational Technology (OT).
    Below we explore five potential pitfalls to avoid when considering unified IT and OT:
    1. Visibility: Improving visibility across unified IT/OT infrastructure has some benefits, such as enabling a single service desk to handle both IT and OT domains and being able to use common management tools.

    2. Security: We have discussed how operational technology can create a risk for IT. There is, however, an upside of converging IT and OT: the converged technology infrastructure can be subject to the same security policies and can use common compliance controls.

    3. Scalability: By operating OT and IT in separate silos, you miss out on opportunities to procure complementary technology for both. Purchasing can negotiate better discounts when buying technology in higher volumes, and IT can buy IT and OT technology that works together because it can be pre-integrated.

    4. Administration: By keeping IT and OT separated, an organization cannot benefit from being able to lower administration costs through streamlining and centralizing management.

    5. Collaboration: Higher up the food chain, since OT is normally more closely aligned with how the business makes money, a converged IT and OT solution can improve the partnership between business and IT.
    Blazent focuses on providing near real-time insights that can be gained by being able to ingest and analyze large numbers of IT and IoT data streams, correcting data gaps and inconsistencies before the data is consumed.
  • The Top 5 Configuration Management Database Challenges for 2017
    Dan Ortega - VP of Marketing at Blazent Recorded: Mar 11 2017 4 mins
    While not particularly high-visibility, a well-managed Configuration Management Database (CMDB) provides organizations with tremendous value. Along with this value, a CMDB requires an organization to take responsibility for keeping it fit-for-purpose. Hewlett Packard Enterprise (HPE) sponsored a leading analyst firm to poll 100 IT executives on CMDB and discovery tools. This video reviews the most interesting findings on CMDB challenges.
