  • Integrated Data is the Key for State Agencies to Become More Customer-centric
    Dan Ortega - Vice President of Marketing Recorded: Aug 26 2017 5 mins
    For a long period of time, state agencies have been built on a foundation of bureaucracy, process and structure, imposing governmental culture and value systems on the citizens and organizations that interact with them. The impact of this is not only in the inherent inefficiencies that have been created, but also in the steadily increasing governmental costs associated with providing service. Fortunately, the environment is changing. Government agencies are increasingly looking to private industry as an example of modern customer-centric interactions and the internal capabilities needed to enable them.
    State IT organizations have been some of the strongest proponents of IT service management, enterprise architecture and data governance standards. While it may appear that these approaches perpetuate the bureaucratic mindset, in reality, they establish a framework where the lines between government/private industry can be blurred, and citizens can benefit from the strengths of government organizations in new and innovative ways.
    State processes have always been data-centric – collecting, processing and analyzing information to support the agency’s charter. Recently, however, the interpretation of this charter has changed to include a stronger focus on the efficient use of resources and the effectiveness of the organization in making a positive impact on its served community. While standards provide a framework for transparency, responsiveness and connectivity, achieving success relies strongly on implementation. How IT systems are implemented, both internally to the organization and in conjunction with the broader ecosystem of public and private partner organizations, is critical for determining whether the organization’s charter can be effectively fulfilled in the context of modern interactions and under the present-day cost constraints.
  • How Data Quality Management can Help Create a Patient Health Timeline
    Dan Ortega - Vice President of Marketing Recorded: Aug 8 2017 4 mins
    One of the goals of health reform and digital medical records efforts during the past decade has been enabling the creation of unified medical records. This “patient health timeline” would be a complete digital chronology of the patient’s lifetime medical history (including symptoms, test results, diagnoses, provider notes and treatment activities) that providers can use when treating the patient.

    An ambitious goal, the “patient health timeline” has been a difficult vision to realize due to the volume and fragmentation of patient health records – some of which have been digitized and some still reside in paper form only.

    Fragmentation: Health records for a single patient are spread across the systems of a number of healthcare providers, insurance companies, pharmacies, hospitals and treatment centers. Each of these systems is unique, with no standard means of integrating patient data. Properly contextualizing data through an accurate set of relationships is key to establishing the integrity of integrated data from different sources.

    Accuracy: There are portions of a patient’s health record which are relatively static throughout their lifetime (family medical history, allergies, chronic conditions and demographic data) and other portions that change with the patient’s health status and general aging (height/weight, reported symptoms, diagnoses and treatments, mental state, etc.). For the static portions (e.g., profile information), provider records often contain conflicting information.

    Patient Privacy: Regulations require patients to grant specific authorization for the use and sharing of personal health records. Compiling the patient health timeline would require the patient to authorize the integration of the data and the use of the timeline after it is compiled, while retaining the ability to revoke authorization for specific data points or sets in the future.
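    The fragmentation and accuracy barriers above can be illustrated with a minimal record-merge sketch in Python. The field names and merge rules are hypothetical, not a real EHR schema: static fields (allergies) accumulate across sources, while changing fields (weight) take the most recent observation.

```python
from datetime import date

# Hypothetical patient records pulled from two providers' systems.
provider_a = {"patient_id": "P-100", "allergies": {"penicillin"},
              "weight_kg": 82, "as_of": date(2016, 3, 1)}
provider_b = {"patient_id": "P-100", "allergies": {"latex"},
              "weight_kg": 79, "as_of": date(2017, 5, 1)}

def merge_records(a, b):
    """Union static fields (allergies accumulate over a lifetime);
    for changing fields (weight), keep the most recent observation."""
    newest = max(a, b, key=lambda r: r["as_of"])
    return {"patient_id": a["patient_id"],
            "allergies": a["allergies"] | b["allergies"],
            "weight_kg": newest["weight_kg"],
            "as_of": newest["as_of"]}

timeline_entry = merge_records(provider_a, provider_b)
# Both allergies survive the merge; the weight reading is the newer one.
```

    A real implementation would also need the relationship context described above to confirm that records from different systems refer to the same patient before merging.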
  • Now That the Low-Hanging Fruit Has Been Harvested – Focus on Integrated Insights
    Dan Ortega - Vice President of Marketing Recorded: Aug 4 2017 4 mins
    For almost a decade, companies have been investing in IT systems to support business process automation and to enable data-driven decision making. The good news is that those investments have generated acceptable ROIs, most core functions have IT systems to support them, and the leaders of those functions use the generated data to make decisions every day. What happens now?
    Most systems produce reports aligned to the business functions the software is designed to support, providing improved functional insights to end-users and decision makers. This reporting is sufficient (and in some cases ideal) to support the discrete needs of the individual business function or process and, over time, has enabled companies to independently optimize sales, manufacturing, finance, customer support, IT and other functions. The downside has been a tendency to create siloed business behavior and blind spots to data in other parts of the organization.
    Modern businesses are becoming more aware of the blurred dividing lines across organizations, as business leaders work together to address mounting cost pressures and retain their competitive advantage. The low-hanging fruit of functional optimization has already been harvested, and it is becoming clear to many leaders that optimizing cross-functionally across the company not only leads to greater efficiency and reduced duplication, but also creates opportunities and potential for value on a much larger scale.
    To enable cross-functional optimization, IT organizations must deliver capabilities that let business decision makers look at data across the organization, allowing them to gain the integrated insights they need.
  • Maintaining an accurate CMDB Demo
    Dan Ortega - Vice President of Marketing Recorded: Jul 28 2017 4 mins
    In less than 4 minutes, this video demonstrates how Blazent creates and maintains the highest level of CMDB data quality.
  • 5 Key Barriers to IT/OT Integration and How to Overcome Them
    Dan Ortega - VP Marketing at Blazent Recorded: Jul 21 2017 4 mins
    Operational Technology (OT) consists of hardware and software that are designed to detect or cause changes in physical processes through direct monitoring and control of devices. As companies increasingly embrace OT, they face a dilemma as to whether to keep these new systems independent or integrate them with their existing IT systems. As IT leaders evaluate the alternatives, there are 5 key barriers to IT/OT integration to consider.
    Business Process Knowledge
    Manageability & Support
    Dependency Risk – Two of the key challenges of enterprise IT environments are managing the complex web of dependencies and managing the risk of service impact when a dependent component fails or is unavailable. With traditional IT, the impact typically falls on some human activity, and the user is able to mitigate it through some type of manual workaround. For OT, companies must be very careful managing dependencies on IT components to avoid the risk of impacting physical processes when and where humans are not available to intervene and mitigate the situation.
    Management of OT Data – The data produced by OT devices can be large, diverse in content, time sensitive for consumption and geographically distributed (sometimes not even connected to the corporate network). In comparison, most IT systems have some level of tolerance for time delays, are relatively constrained in size and content and reliably connected to company networks, making them accessible to the IT staff for data management and support.
    Security – IT systems are a common target for malicious behavior by those wishing to harm the company. The integration of OT systems with IT creates additional vulnerability targets, with the potential of impacting not just people but also physical processes.
    Segmentation of IT
  • IT Under Attack: Data Quality Technology Helps Companies Assess Security Vulnerabilities
    Dan Ortega - VP Marketing at Blazent Recorded: Jul 13 2017 6 mins
    In the wake of the most recent (May 2017) malware attack impacting computer systems around the world, company executives are in urgent discussions with IT leaders, asking them to provide assessments of risks and vulnerabilities and recommendations to safeguard the company’s information and operations. CIOs and IT leaders strongly depend on the accuracy, completeness and trustworthiness of the data at their disposal to make informed decisions. How confident are you of the data being used to protect your organization from harm?
    There are commonly at least 5 independent sources of data that must be combined to identify what devices are potentially vulnerable and what business functions depend on them. When these data sets are gathered, there will undoubtedly be a large number of duplicates, partial records, records for devices that have been retired or replaced, conflicting data about the same device and records with old data that is inaccurate. According to Gartner, at any moment, as much as 40% of enterprise data is inaccurate, missing or incomplete. Data quality technology can help integrate the data, resolve the issues, alert data management staff to areas that need attention and help decision makers understand the accuracy and completeness of the data on which they depend.
    Blazent has been a leader in providing Data Quality solutions for more than 10 years and is an expert in integrating the types of IT operational data needed to help CIOs and IT leaders assemble an accurate and unified big picture view of their technology ecosystem. With data quality and trustworthiness enabled by Blazent’s technology, your leaders and decision makers can be confident that the information they are using to assess vulnerabilities and risks will lead to solid recommendations and decisions that protect your organization from harm.
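    As a rough illustration of the consolidation problem described above, here is a minimal sketch in Python. The source names, record fields and vulnerability rule are all hypothetical: records from independent sources are keyed by serial number, conflicting values are flagged for data-management staff, and retired devices are excluded from the vulnerability list.

```python
# Illustrative device records from three independent data sources.
discovery = [{"serial": "SN1", "os": "Win7", "retired": False},
             {"serial": "SN2", "os": "Win10", "retired": False}]
asset_db  = [{"serial": "SN1", "os": "Win10", "retired": False},
             {"serial": "SN3", "os": "WinXP", "retired": True}]
av_agents = [{"serial": "SN2", "os": "Win10", "retired": False}]

def consolidate(*sources):
    """Key records by serial number; flag serials with conflicting data."""
    merged, conflicts = {}, set()
    for source in sources:
        for rec in source:
            key = rec["serial"]
            if key in merged and merged[key] != rec:
                conflicts.add(key)        # needs human or rule-based review
            merged.setdefault(key, rec)   # first source wins in this sketch
    return merged, conflicts

inventory, conflicts = consolidate(discovery, asset_db, av_agents)
# Vulnerable = still in service and running an OS flagged as unpatched.
vulnerable = [s for s, r in inventory.items()
              if not r["retired"] and r["os"] in ("WinXP", "Win7")]
```

    Real data quality tooling would resolve the flagged conflicts with precedence and recency rules rather than simply taking the first source.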
  • The Future Is Closer Than You Think – Data Is coming (and Fast). How Will You Ma
    Dan Ortega - VP Marketing at Blazent Recorded: Jul 6 2017 5 mins
    What will you do when your job and the future of your company hinge on your ability to analyze almost every piece of data your company ever created against everything known about your markets, competitors and customers – and the impact of your decision will determine success or failure? That future is closer than you think. Data on an entirely different level is coming, and much faster than anyone realizes. Are you prepared for this new paradigm?

    • Technologists have been talking about “big data” as a trend for more than a decade, saying that it is coming “soon.” “Soon” is now in your rear-view mirror.
    • Companies have been capturing and storing operational and business process data for more than 20 years (sometimes longer), providing a deep vault of historical data, assuming you can access it.
    • IoT is leading to the creation of a massive stream of new operational data at an unprecedented rate. If you think volumes are high now, you’ve seen nothing yet.
    • The free flow of user-generated (un-curated) information across social media has enabled greater contextual insights than ever before, but concurrently the signal-to-noise ratio is off the charts.

    What does all this mean? It means big data is already driving everything we do. The analytics capabilities of IT systems are becoming more sophisticated and easier for business leaders to use to analyze and tune their businesses. For them to be successful and make good decisions, however, the data on which they rely must be trustworthy, complete, accurate and inclusive of all available data sets.

    Delivering the underlying quality data that leaders need is no small feat for the IT department. The problem has transformed from “not enough data” to “too much of a good thing.” The challenge facing most organizations is filtering through the noise in the data and amplifying the signal of information that is relevant and actionable for decision-making.
  • Answers to 8 Essential Questions about Assets That Should Be in Your CMDB
    Dan Ortega - VP Marketing at Blazent Recorded: Jun 29 2017 6 mins
    1. What are they? An accurate inventory of what assets and configuration items exist in your IT ecosystem is the foundation of your CMDB. Your asset/CI records may come from discovery tools, physical inventories, supplier reports, change records, or even spreadsheets, but whatever their origin, you must know what assets you have in your environment.
    2. Where are they? Asset location may not seem relevant at first, but the physical location of hardware, software and likely infrastructure impacts what types of SLAs you can provide to users, the cost of service contracts with suppliers and, in some areas,
    3. Why do we have them? Understanding the purpose of an asset is the key to unlocking the value it provides to the organization. Keep in mind that an asset’s purpose may change over time as the business evolves.
    4. To what are they connected? Dependency information is critical for impact assessment, portfolio management, incident diagnosis and coordination of changes.
    5. Who uses them? User activities and business processes should both be represented in the CMDB as CIs (they are part of your business/IT ecosystem).
    6. How much are they costing? Assets incur both direct and indirect costs for your organization. Some examples may include support contracts, licensing, infrastructure capacity, maintenance and upgrades, service desk costs, taxes and management/overhead by IT staff.
    7. How old are they? Nothing is intended to be in your environment forever. Understanding the age and expected useful life of each of your assets helps you understand past and future costs (TCO) and informs decisions about when to upgrade versus when to replace an asset.
    8. How often are they changing? Change requests, feature backlogs and change management records provide valuable insights into the fitness of the asset for use (both intended use and incidental).
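    The eight questions above map naturally onto the fields of a CI record. The sketch below is illustrative only; the field names are hypothetical and do not come from any specific CMDB product.

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class ConfigurationItem:
    """One CMDB record, with a field per question above."""
    name: str                                         # 1. What is it?
    location: str                                     # 2. Where is it?
    purpose: str                                      # 3. Why do we have it?
    depends_on: list = field(default_factory=list)    # 4. Connections
    used_by: list = field(default_factory=list)       # 5. Users / processes
    annual_cost: float = 0.0                          # 6. Direct + indirect cost
    deployed: date = None                             # 7. Age
    changes_last_year: int = 0                        # 8. Rate of change

ci = ConfigurationItem(name="payroll-db-01", location="Denver DC",
                       purpose="Payroll processing",
                       depends_on=["san-02"], used_by=["HR payroll run"],
                       annual_cost=12500.0, deployed=date(2014, 1, 15),
                       changes_last_year=6)
```

    Note that question 5 is satisfied here by modeling the business process itself ("HR payroll run") as a CI referenced by the asset.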
  • Machine Learning Is Re-Inventing Business Process Optimization
    Dan Ortega - VP Marketing at Blazent Recorded: Jun 23 2017 5 mins
    Machine Learning is a game changer for business process optimization – enabling organizations to achieve levels of cost and quality efficiency never imagined previously. For the past 30 years, business process optimization was a tedious, time-consuming manual effort. Those tasked with this effort had to examine process output quality and review a very limited set of operational data to identify optimization opportunities based on historical process performance. Process changes would require re-measurement and comparison to pre-change data to evaluate the effectiveness of the change. Often, improvement impacts were either un-measurable or failed to satisfy the expectation of management.
    With modern machine-learning capabilities, process management professionals are able to integrate a broad array of sensors and monitoring mechanisms to capture large volumes of operational data from their business processes. This data can be ingested, correlated and analyzed in real-time to provide a comprehensive view of process performance. Before machine learning, managing the signals from instrumented processes was limited to either pre-defined scenarios or the review of past performance. These limitations have now been removed.
    In business process optimization, there is an important distinction to be made between “change” and “improvement.” Machine-learning systems can correlate a large diversity of data sources – even without pre-defined relationships. They provide the ability to qualify operational (process) data with contextual (cost/value) data to help process managers quantify the impacts of inefficiencies and the potential benefits of changes. This is particularly important when developing a business justification for process optimization investments.
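    The kind of real-time signal monitoring described above can be sketched with a simple rolling statistic. This is a minimal stand-in, not Blazent's method or a full machine-learning model: readings more than a chosen number of standard deviations from the rolling mean are flagged as anomalies.

```python
from collections import deque
from statistics import mean, stdev

def anomaly_monitor(window=20, threshold=3.0):
    """Return a checker that flags readings far from the rolling mean."""
    history = deque(maxlen=window)
    def check(value):
        # Need a few baseline readings (and nonzero spread) before judging.
        is_anomaly = (len(history) >= 5 and stdev(history) > 0 and
                      abs(value - mean(history)) > threshold * stdev(history))
        history.append(value)
        return is_anomaly
    return check

check = anomaly_monitor()
readings = [10.0, 10.2, 9.9, 10.1, 10.0, 10.1, 9.8, 10.2, 50.0]
flags = [check(v) for v in readings]   # only the last reading is flagged
```

    Production systems would correlate many such signals with contextual cost and value data, which is where the qualification step described above comes in.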
  • Unpatched Windows Machines Will Make You “Wanna Cry”
    Dan Ortega - Vice President of Marketing Recorded: Jun 15 2017 3 mins
    The Wanna Cry ransomware worm ravaged computers across 150 countries. The attacks began May 12, 2017, infecting PCs of organizations that had not applied security updates to some versions of Microsoft Windows. This menace paired ransomware that encrypted computers and demanded payment with a worm that enabled it to spread quickly. The ransomware encrypts all the user’s data, then a pop-up message appears demanding a $300 Bitcoin payment in return for the decryption key.
    In the UK, the National Health System attack resulted in hospital workers being unable to review patient health histories, causing postponed surgeries and increasing risks to all new patients. Medical staff reported seeing computers go down “one by one” as the attack took hold, locking machines and demanding money to release the data.
    Organizations had only days to patch their Windows end-user and server systems. Once on a system, the malware discovers on what subnet it is located, so it can infect its neighbors. Anti-virus software is the next defense when a worm has breached a machine. Ensuring total coverage of IT infrastructure is critical. Any chinks in the armor must be detected and remediated. Anti-virus products detect strings of code known as virus signatures before killing the offending program. When these products fail, network administrators are forced to redirect suspicious traffic to IP sinkholes, steering it out of harm’s way.
    Just like anti-virus software, patch management solutions usually require a management agent to be installed on the target system. Not surprisingly, 100% coverage is very rare.
    Despite encouraging reports of waning threat activity, Wanna Cry continues to pose significant risks. Blazent provides a SaaS solution that enables its customers to take advantage of five or more data sources to build an accurate inventory of their IT assets, such as end-user systems and servers.
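    The coverage gap noted above (agent-based tools rarely reach 100% of machines) reduces to a set comparison once an accurate inventory exists. The host names below are hypothetical:

```python
# The consolidated inventory vs. machines reporting to the patch agent.
inventory     = {"ws-001", "ws-002", "ws-003", "srv-01", "srv-02"}
agent_reports = {"ws-001", "ws-003", "srv-01"}

unprotected = inventory - agent_reports            # chinks in the armor
coverage    = len(agent_reports & inventory) / len(inventory)
print(sorted(unprotected))   # ['srv-02', 'ws-002']
print(f"{coverage:.0%}")     # 60%
```

    The hard part, of course, is producing a trustworthy `inventory` set in the first place, which is the multi-source data quality problem this series keeps returning to.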
