
Blazent

  • Machine Learning Is Re-Inventing Business Process Optimization
    Dan Ortega - VP Marketing at Blazent Recorded: Jun 23 2017 5 mins
    Machine Learning is a game changer for business process optimization – enabling organizations to achieve levels of cost and quality efficiency never previously imagined. For the past 30 years, business process optimization was a tedious, time-consuming manual effort. Those tasked with this effort had to examine process output quality and review a very limited set of operational data to identify optimization opportunities based on historical process performance. Process changes would require re-measurement and comparison to pre-change data to evaluate the effectiveness of the change. Often, improvement impacts were either unmeasurable or failed to satisfy the expectations of management.
    With modern machine-learning capabilities, process management professionals are able to integrate a broad array of sensors and monitoring mechanisms to capture large volumes of operational data from their business processes. This data can be ingested, correlated and analyzed in real-time to provide a comprehensive view of process performance. Before machine learning, managing the signals from instrumented processes was limited to either pre-defined scenarios or the review of past performance. These limitations have now been removed.
    In business process optimization, there is an important distinction to be made between “change” and “improvement.” Machine-learning systems can correlate a large diversity of data sources – even without pre-defined relationships. They provide the ability to qualify operational (process) data with contextual (cost/value) data to help process managers quantify the impacts of inefficiencies and the potential benefits of changes. This is particularly important when developing a business justification for process optimization investments.
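    To make the idea concrete – as a hypothetical sketch, not Blazent’s actual method – the Python snippet below joins invented per-step process timings (operational data) with per-step cost rates (contextual data) to rank where optimization effort would pay off most.

    ```python
    # Hypothetical sketch: qualify operational data (timings) with contextual
    # data (cost rates) to rank process steps by optimization potential.
    # All step names and figures are invented for illustration.
    import pandas as pd

    timings = pd.DataFrame({
        "step": ["intake", "review", "approval", "fulfillment"],
        "avg_minutes": [12, 45, 30, 20],
        "runs_per_day": [400, 400, 380, 375],
    })
    costs = pd.DataFrame({
        "step": ["intake", "review", "approval", "fulfillment"],
        "cost_per_minute": [0.8, 2.5, 3.0, 1.2],
    })

    merged = timings.merge(costs, on="step")
    merged["daily_cost"] = (
        merged["avg_minutes"] * merged["runs_per_day"] * merged["cost_per_minute"]
    )
    # Steps with the highest daily cost are the strongest optimization candidates.
    print(merged.sort_values("daily_cost", ascending=False))
    ```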
  • Unpatched Windows Machines Will Make You “Wanna Cry”
    Dan Ortega - Vice President of Marketing Recorded: Jun 15 2017 3 mins
    The Wanna Cry ransomware worm ravaged computers across 150 countries. The attacks began May 12, 2017, infecting PCs of organizations that had not applied security updates to some versions of Microsoft Windows. This menace paired ransomware that encrypted computers and demanded payment with a worm that enabled it to spread quickly. The ransomware encrypts all the user’s data, then a pop-up message appears demanding a $300 Bitcoin payment in return for the decryption key.
    In the UK, the National Health System attack resulted in hospital workers being unable to review patient health histories, causing postponed surgeries and increasing risks to all new patients. Medical staff reported seeing computers go down “one by one” as the attack took hold, locking machines and demanding money to release the data.
    Organizations had only days to patch their Windows end-user and server systems. Once on a system, the malware discovers which subnet it is on so it can infect its neighbors. Anti-virus software is the next line of defense once a worm has breached a machine. Ensuring total coverage of IT infrastructure is critical: any chinks in the armor must be detected and remediated. Anti-virus products detect strings of code known as virus signatures and kill the offending program. When these products fail, network administrators are forced to redirect suspicious traffic to IP sinkholes, steering it out of harm’s way.
    Just like anti-virus software, patch management solutions usually require a management agent to be installed on the target system. Not surprisingly, 100% coverage is very rare.
    Despite encouraging reports of waning threat activity, Wanna Cry continues to pose significant risks. Blazent provides a SaaS solution that enables its customers to take advantage of five or more data sources to build an accurate inventory of their IT assets, such as end-user systems and servers.
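    As a miniature illustration of that multi-source reconciliation (a sketch of the general technique, not the product’s logic), the snippet below unions asset records from several hypothetical discovery sources and flags machines the patch tool has never seen; every source and host name is invented.

    ```python
    # Hypothetical sketch: reconcile asset records from multiple discovery
    # sources to expose gaps in patch-management coverage.
    ad_hosts     = {"web01", "web02", "db01", "hr-laptop-17"}   # directory services
    network_scan = {"web01", "web02", "db01", "db02", "printer-3"}
    patch_agent  = {"web01", "db01"}                            # patch tool's view

    # The union of all sources approximates the true inventory.
    all_assets = ad_hosts | network_scan | patch_agent

    # Assets visible to other sources but unknown to the patch tool are the
    # "chinks in the armor" a worm like this exploits.
    unpatched_gap = all_assets - patch_agent
    print(sorted(unpatched_gap))
    ```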
  • Optimizing Business Performance with People, Process and Data
    Dan Ortega - Vice President of Marketing Recorded: Jun 9 2017 6 mins
    People are the heart and mind of your business. Processes form the backbone of your operations. Data is the lifeblood that feeds everything you do. For your business to operate at peak performance and deliver the results you seek, people, processes and data must be healthy individually, as well as work in harmony. Technology has always been important to bringing people, process and data together; however, technology’s importance is evolving. As it does, the relationships among people, processes and technology are also changing.
    People are the source of the ideas and the engine of critical thinking that enables you to turn customer needs and market forces into competitive (and profitable) opportunities for your business. The human brain is uniquely wired to interpret a large volume of information from the environment, analyze it and make decisions about how to respond.
    Business and manufacturing processes provide the structure of your company’s operations – aligning the activities and efforts of your people into efficient and predictable workflows. Processes are critical to enable the effective allocation of the organization’s resources and ensure consistent and repeatable outcomes in both products and business functions.
    Operational data enables the people and process elements of your company to work together, providing both real-time and historical indications of what activities are taking place and how well they are performing. The ability of companies to fine tune their organization effectively for optimal business performance will be largely dependent on the quality and trustworthiness of the data assets they have at their disposal. Business processes have become more data-centric, and technology adoption has expanded the possibilities for new and diverse instrumentation. Bringing all of the operational, environmental and strategic data sources together to enable decision making has become critical to business success.
  • Quality to Integrity
    Dan Ortega - Vice President of Marketing Recorded: Jun 5 2017 3 mins
    Dan Ortega outlines the importance of Data Quality Management and data integrity to IoT initiatives.
  • Machine Learning and the Rise of the Dynamic Enterprise
    Dan Ortega - Vice President of Marketing Recorded: Jun 2 2017 4 mins
    The term “dynamic enterprise” was introduced during 2008, as an enterprise architecture concept. Rather than striving for stability, predictability and maturity, dynamic enterprises began focusing on continuous and transformational growth – embracing change as the only constant. This shift began with the proliferation of social media and user-generated (Web 2.0) content, which started to replace the curated information previously available.
    As the data consumption trends evolved within the business environment, technologists (including Tim Berners-Lee, the inventor of the World Wide Web) were working behind the scenes on standards for a Semantic Web (Web 3.0), where computers could consume and analyze all of the content and information available.
    Making the data readable by computers was only part of the challenge. Most companies still lacked the technology capabilities and know-how to take advantage of the information at their disposal. Advancements in machine learning and cloud infrastructure during the past 3 years have finally unlocked the potential of big data to the masses. A few large cloud service providers have invested in computing infrastructure and developed the capabilities to ingest and process vast quantities of data. They have analyzed, correlated and made it available to users in the form of cloud services that require neither the technical expertise nor the capital investment that were former barriers to adoption.
    As more enterprises and individuals leverage machine learning to draw insights from data, those insights become part of the “learned knowledge” of the system itself, helping the computer understand context and consumption behavior patterns that further improve its capability to bridge the human-information divide.
  • IT Management Is Becoming So Predictive – Which Is Good. Right?
    Dan Ortega - Vice President of Marketing Recorded: May 25 2017 5 mins
    Improvements in IT data quality and analysis tools have enabled IT management to spend less time looking into the past and more time enabling the dynamic enterprise of the future. This allows them to anticipate business events more accurately, forecast costs and capacity, and identify operational risks before they appear. Empowered by technology-driven insights and technology-enabled prediction ability, IT leaders have secured a long-sought seat at the table with their business counterparts during the strategic planning process. IT management becoming more predictive is good. Right? Perhaps, but there are some risks to consider.

    Technology-enabled prediction is only as good as the underlying data, and does a poor job of addressing unknown variables. Human intuition and analysis skills have traditionally been used to fill gaps in available data, interpret meaning and project future events. The predictive abilities of most IT leaders are heavily dependent on the quality of information and technology-enabled processing power at their disposal. Modern machine learning systems have made tremendous strides in analyzing large volumes of data to identify trends and patterns based on past and current observations. Their capability to do so is limited, however, by the quality and dependability of data inputs. “Garbage in, garbage out” has been the rule for many years.

    Learning how to harness the power of technology and information and applying it to create valuable predictive insights for an organization is definitely good; IT leaders should be commended for bringing new capabilities to the decision-making table. As we all know, however, no information is perfect, and technology has its limitations. Becoming entirely reliant on technology for prediction and losing the ability to apply a human filter is a risky situation for businesses. As with many business decisions, it is important to balance the potential benefits with the acceptable risk profile for your organization.
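    A toy demonstration of the garbage-in, garbage-out point: the same simple trend model, fitted once to clean history and once to history whose missing records were silently stored as zeros, produces very different forecasts. The data below is synthetic and purely illustrative.

    ```python
    # Synthetic illustration: data quality limits prediction quality.
    import numpy as np

    rng = np.random.default_rng(0)
    months = np.arange(24)
    clean = 100 + 5 * months + rng.normal(0, 3, size=24)  # steady growth + noise

    corrupted = clean.copy()
    corrupted[[4, 9, 15]] = 0.0  # missing records silently recorded as zero

    for label, series in [("clean", clean), ("corrupted", corrupted)]:
        slope, intercept = np.polyfit(months, series, 1)  # fit a linear trend
        forecast = slope * 36 + intercept                 # project 12 months out
        print(f"{label:9s} month-36 forecast: {forecast:.1f}")
    ```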
  • Why CMDBs are Sexier than you Think
    Dan Ortega - Vice President of Marketing Recorded: May 18 2017 3 mins
    Sexy may not be the first word that comes to mind when you think about your CMDB and the operational data of your company… but (seriously) maybe it should be! After all, your CMDB has a number of attractive qualities and (with some care and feeding) could be the ideal partner for a lasting long-term relationship. There are lots of potential reasons this can work, but let’s focus on the top three:
    Substance: Your CMDB is not shallow and fickle; it is strong and deep, with a history as long as your company’s. The CMDB is built on a core of your master data and pulls together all of the facets of operational data your company creates every day. It contains the complex web of connective tissue that can help you understand how your company works. Those insights then become part of the CMDB itself – enabling the strength of your data to be balanced by the wisdom that comes from analytics and self-awareness.
    Long-term potential: You may lust after the latest new tool or trend, but your CMDB will stand by your company’s side through thick and thin, long into the future. It will grow and evolve with you, always be honest about what’s going on, and work with you to provide insights to get your company through troubled times. As your company changes with new markets, products, customers, and competitors or becomes a part of something bigger through acquisition or partnership, your CMDB is there to help you navigate the changes and achieve success.
    Air of mystery: You may never fully understand all of the secrets that your CMDB holds about your company. As you unlock one insight, the potential for others seems to appear magically. What would you expect from something that brings together all parts of your company data and the complex interrelationships in one place for you to explore?
    Deep substance, long-term potential and an air of mystery. Maybe your CMDB is sexier than you think.
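    For anyone who has never peeked inside one: a CMDB is, at its core, a graph of configuration items and the relationships between them. The miniature, entirely invented sketch below shows the kind of impact question that graph lets you explore.

    ```python
    # Hypothetical miniature CMDB: configuration items and their dependencies
    # as a directed graph. All names are invented.
    from collections import deque

    depends_on = {
        "crm-app":      ["app-server-1", "db-cluster"],
        "app-server-1": ["vm-host-a"],
        "db-cluster":   ["vm-host-a", "vm-host-b"],
        "vm-host-a":    [],
        "vm-host-b":    [],
    }

    def impact_of(ci: str) -> set[str]:
        """Return every configuration item that transitively depends on `ci`."""
        affected, queue = set(), deque([ci])
        while queue:
            current = queue.popleft()
            for item, deps in depends_on.items():
                if current in deps and item not in affected:
                    affected.add(item)
                    queue.append(item)
        return affected

    # If vm-host-a fails, which items feel it?
    print(impact_of("vm-host-a"))  # {'app-server-1', 'db-cluster', 'crm-app'}
    ```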
  • Data Integrity the Key to Operational Insights or an Elephant in the Room?
    Dan Ortega - Vice President of Marketing Recorded: May 5 2017 4 mins
    Throughout history, business has always struggled with the challenge of data accuracy and integrity. Executives constantly ask their IT leaders how they can improve the quality and integrity of data in order to obtain the insights needed to guide their company effectively. While it sounds reasonable, it may well be the wrong question. Rather than focusing on the quality of raw data, a better approach is to focus on the quality of insights available and the speed/cost to obtain them by asking, “How can we better leverage the data we already have to cost effectively obtain the insights we need?”
    Advances in machine learning, data science and correlation analysis during the past decade have enabled a broader range of capabilities to analyze data from disparate operational processes and information systems. This has been accomplished without developing some of the structured relationships and incurring data-model-integration costs associated with traditional data warehousing and reporting approaches.
    Through assessment of the trends and relationships between different data elements, modern data analysis systems are able to “discover” a variety of insights that may not have been available in the past. Examples include undocumented dependencies within operational processes, sources of data inaccuracy and the evolution of operational processes over time. Instead of focusing on what is “known” about operational data, modern methods focus on understanding what is “unknown” about operational data.
    Is data integrity the key to operational insights or is it the elephant in the room? That depends on how organizations want to view the situation. Data Integrity at both the informational and operational level is a core requirement of any modern business, and has been an area of focus for Blazent since the early days of Big Data.
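    As a small synthetic example of correlation analysis without a pre-defined data model, the sketch below computes pairwise correlations across metrics from unrelated systems; a strong off-diagonal value hints at an undocumented dependency worth investigating. All data and metric names are invented.

    ```python
    # Synthetic sketch: surfacing an undocumented dependency via correlation.
    import numpy as np
    import pandas as pd

    rng = np.random.default_rng(1)
    orders = rng.poisson(200, size=90).astype(float)  # e-commerce orders per day
    metrics = pd.DataFrame({
        "orders": orders,
        "invoice_queue": orders * 0.98 + rng.normal(0, 5, 90),  # hidden dependency
        "hr_tickets": rng.poisson(12, size=90).astype(float),   # unrelated process
    })

    # Strong off-diagonal correlations flag relationships no one documented.
    print(metrics.corr().round(2))
    ```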
  • Why Is Operational Data Important for IT?
    Dan Ortega - Vice President of Marketing Recorded: Apr 28 2017 4 mins
    Each day, with every customer transaction, employee task and business process, companies generate vast amounts of operational data that provides leaders and managers with insight into what is working well and what requires attention. Operational data is particularly important to those responsible for stewarding the information and technology assets of their organization.
    In this context, operational data is particularly important to IT, which is why it is so critical to understand the different types of operational data on which IT leaders rely.
    Business operational data is all about the business processes and user experiences, which IT enables with the technology and services it provides. The reason organizations invest in technology is to improve the productivity and effectiveness of business operations. Process and user-related data evaluated over time provides a contextual picture into how effectively the technology is achieving that goal.
    IT operational data is concerned with “what” technology components are operating and how they are being used. IT operational data is important as a part of the IT planning process to understand capacity utilization and determine where scalability constraints exist, as well as to understand the cost of services provided to users and to assess security and risk considerations of the business-technology ecosystem. Within IT service management processes, operational data is critical to ensure performance and availability Service Level Agreements (SLAs) are honored, and to drive technology cost reduction through infrastructure optimization.
    Operational data provides IT with the critical picture it needs to understand and optimize the role it plays in the context of the company.
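    As a minimal sketch of the SLA use of IT operational data described above, the snippet below derives a monthly availability figure from logged outage minutes and checks it against a target; all numbers are invented.

    ```python
    # Illustrative availability check against an SLA target.
    minutes_in_month = 30 * 24 * 60
    outage_minutes = [14, 3, 27]   # hypothetical incidents for one service
    sla_target = 0.999             # "three nines" availability

    availability = 1 - sum(outage_minutes) / minutes_in_month
    print(f"availability: {availability:.5f} - "
          f"SLA {'met' if availability >= sla_target else 'breached'}")
    ```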
  • Benefits of Machine Learning in IT Infrastructure
    Dan Ortega - Vice President of Marketing Recorded: Apr 21 2017 4 mins
    During the next 5 years, machine learning is poised to play a pivotal and transformational role in how IT Infrastructure is managed. Two key scenarios are possible: transforming infrastructure from a set of under-utilized capital assets to a highly efficient set of operational resources through dynamic provisioning based on consumption; and the identification of configurations, dependencies and the cause/effect of usage patterns through correlation analysis.
    In the world of IT infrastructure, it’s all about efficient use of resources. With on-premises infrastructure (compute, storage and network) utilization rates for most organizations in the low single digits, the cloud has sold the promise of a breakthrough. For those organizations moving to Infrastructure as a Service (IaaS), utilization in the middle to high teens is possible, and for those moving to Platform as a Service (PaaS), utilization in the mid-twenties is within reach.
    Dynamic provisioning driven by demand is essentially the same operational concept as power grids and municipal water systems – capacity allocation driven by where resources are consumed, rather than where they are produced.
    The second part of the breakthrough relates to right-sizing infrastructure. Whether this is network capacity or compute Virtual Machine size – machine learning will enable analysis of the patterns of behavior by users and correlate them to the consumption of infrastructure resources.
    During the near term, these benefits will be much more tactical. Automated discovery combined with behavioral correlation analysis will virtually eliminate the need for manual inventory and mapping of components and configuration items in the IT ecosystem to reveal how the ecosystem is operating.
    Today, IT has the opportunity to automate the mapping of components in their infrastructure to provide a more accurate and actionable picture.
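    As a hedged sketch of the right-sizing idea (synthetic data, and an arbitrary 20% headroom figure chosen for illustration), the snippet below sizes a VM to a high percentile of observed CPU demand rather than to rare peaks.

    ```python
    # Synthetic right-sizing sketch: recommend capacity from usage patterns.
    import numpy as np

    rng = np.random.default_rng(2)
    cpu_cores_used = rng.gamma(shape=2.0, scale=1.5, size=24 * 30)  # hourly samples

    p95 = np.percentile(cpu_cores_used, 95)   # typical peak demand
    recommended = int(np.ceil(p95 * 1.2))     # 20% headroom (illustrative choice)

    print(f"p95 demand: {p95:.1f} cores -> recommend a {recommended}-core VM")
    ```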
