This presentation will provide an overview of both UML (Unified Modeling Language) and SysML (Systems Modeling Language) and demonstrate how they are successfully applied in practice. You will also learn about the exciting work going on as these standards continue to evolve, including the expanding suite of standards defining precise, executable semantics for UML and SysML and the broad industry effort ongoing to develop SysML v2. Presentation recorded in Ottawa, Canada, September 2018.
Software systems, hardware systems, business systems - the mind reels at all that we are building or modernizing today. Can we possibly handle the complexity and achieve reliability? Yes, we can, by creating models. A model is a concrete representation of the often abstract, hard-to-comprehend aspects of a system. With models, we can understand requirements, analyze problems, and design solutions, whether for a system to be built or an existing one under study. And, crucially, we can also use models to communicate within and between teams. But clear communication requires the use of a common language backed up by a common community of practice.
This is exactly what OMG modeling language standards provide: a broad suite of industry-defined languages, with extensive tooling support, for modeling all sorts of systems: Unified Modeling Language (UML), Systems Modeling Language (SysML), Business Process Model and Notation (BPMN), Case Management Model and Notation (CMMN) and Decision Management Model and Notation (DMN).
But if you are new to this richness of standards, it is sometimes hard to know where to begin! This presentation will give you a concise overview of how these standards came to be, how they are being applied today and how they are continuing to evolve to meet growing user expectations. At the end of this presentation, you should feel comfortable with the breadth of modeling languages that the OMG has to offer and be ready to explore them in a bit more depth in subsequent presentations. Recorded live in Ottawa, September 2018.
Cyber threats are becoming more frequent and more targeted. Bad actors are more adept at social engineering and investigating your network and infrastructure to understand your organization’s cyber strengths and weaknesses. Security teams need to focus on who or what will seek to exploit them and how they are likely to do so, instead of being hyper-focused on just the threat itself.
This webinar delves into how one of the world's top financial services firms developed and implemented a robust threat model capable of repelling the world's most sophisticated hackers and nation-state actors. Join LookingGlass Product Manager Dan Martin and Security Ledger Editor-in-Chief Paul Roberts for an introduction to ScoutThreat™, a threat management platform that helps security analysts streamline threat analysis work and extract the maximum value from threat intelligence.
In this webinar you will learn:
- The advantages of modeling adversaries to get ahead of threats to your IT environment
- How to structure threat models to account for a myriad of sophisticated cyber risks
- How to overcome hurdles in creating robust threat models that address real-world risks
- How ScoutThreat can help you build a proactive security posture
Predictive modeling of Wi-Fi networks is an efficient and cost-effective way to determine the ideal quantity, placement, and configuration of APs for optimal performance, security, and compliance. The accuracy of predictive modeling is determined both by the quality of the predictive modeling tool and by the accuracy of the data entered into the tool by the user. There is an industry-standard process for gathering accurate RF-loss-object data to ensure optimal output accuracy of predictive modeling software, such as AirMagnet’s Planner. The data gathering process involves measurement of wall, door, and object loss within facilities, but there’s a bit more to the process than one might initially expect, including dB offsets for low-capability devices.
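The kind of link-budget arithmetic a predictive modeling tool performs can be sketched as follows. This is a minimal illustration only: the wall-loss values and the low-capability-device offset are hypothetical placeholders standing in for the site-measured data the webinar describes, not AirMagnet Planner's actual model.

```python
import math

def free_space_path_loss_db(distance_m, freq_mhz):
    """Free-space path loss in dB: 20*log10(d_km) + 20*log10(f_MHz) + 32.44."""
    return 20 * math.log10(distance_m / 1000) + 20 * math.log10(freq_mhz) + 32.44

# Illustrative per-obstacle attenuation in dB; in practice these are
# measured on site, as the data-gathering process above describes.
WALL_LOSS_DB = {"drywall": 3.0, "brick": 8.0, "concrete": 12.0, "glass_door": 2.0}

def predicted_rssi_dbm(tx_power_dbm, distance_m, freq_mhz, obstacles,
                       low_capability_offset_db=0.0):
    """Predicted received signal strength after free-space path loss,
    per-wall losses, and an optional offset for low-capability clients."""
    loss = free_space_path_loss_db(distance_m, freq_mhz)
    loss += sum(WALL_LOSS_DB[o] for o in obstacles)
    return tx_power_dbm - loss - low_capability_offset_db

# Example: a 17 dBm AP at 5.18 GHz, 20 m away through one brick wall,
# with a 5 dB offset for a weak client radio.
rssi = predicted_rssi_dbm(17, 20, 5180, ["brick"], low_capability_offset_db=5)
```

The point of the dB offset parameter is the one the process makes: a model tuned only for strong client radios will overestimate coverage for low-capability devices.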
Attend this webinar to understand the tools and processes required to optimize your predictive models for maximum accuracy.
Devin Akin is the founder of Divergent Dynamics, a Wi-Fi Systems Integrator, Value-Added Reseller (VAR), and Training organization. Akin has over 20 years in IT, with 15 years in WLAN specifically, and was recently named to the TWW Top 100 Wireless Technology Experts list for 2014.
The importance of facing, understanding, predicting and even mitigating uncertainty has been well acknowledged and studied in various fields such as philosophy, physics and finance. In software and systems engineering, due to the increasing complexity of large-scale systems themselves, and the dynamic and unpredictable environments in which such systems are deployed and operated, growing attention has been given to the challenges of explicitly specifying and modeling uncertainty.
Uncertainty Modeling (UM) aims to promote and enable explicit specification of uncertainty and uncertainty-related concepts in various contexts (e.g., developing large-scale Cyber-Physical Systems and the Internet of Things), for different purposes (e.g., enabling uncertainty-wise requirements specification, modeling, verification and validation, and facilitating the definition of uncertainty-wise testing strategies), and at different phases of the development of such complex and uncertainty-inherent systems (e.g., requirements, architecture, design, testing and operation).
In the context of OMG, we see a diverse set of uncertainty modeling applications, such as 1) integrating with UML use cases and SysML requirements diagrams to enable requirements V&V, 2) capturing uncertainty as part of SysML or UML models to facilitate design and/or testing, and 3) integrating with BPMN and other OMG standards to facilitate different kinds of analysis and generation.
OMG’s Uncertainty Modeling Request for Information (RFI) is currently open for responses. The RFI aims to solicit ideas, discussions, comments, recommendations, user needs and experiences about uncertainty modeling. Collected responses will be carefully analyzed and used to identify requirements, on the basis of which an RFP for a UM specification will be developed. Instructions for responding to this RFI are specified in the OMG Uncertainty Modeling Request for Information document (ad/16-09-02 (Uncertainty RFI)).
We invite you to join the conversation.
Risk and actuarial modeling are among the fastest growing workloads in the insurance industry.
The Microsoft cloud is helping insurers to manage increasing demands, including regulatory demands, in ways that deliver immediate value to the business and its customers. Driven by some of the major regulatory changes of the last few years, insurers are now being asked to run more complex models, more often.
Combine on-demand capacity with ground-breaking advanced analytics, and the cloud can make a real difference for you.
Join this webcast to:
- Learn how the Microsoft cloud enables insurers to dramatically improve their risk and actuarial modeling environments
- Stay ahead of regulatory reporting demands
- Explore the power of on-demand capacity with the ground-breaking advanced analytics that are unique to Microsoft
Presented by one of the foremost experts in BPM standards, this session will concretely demonstrate usage of the three leading business modeling standards produced by the Object Management Group (OMG). This session will explore the positioning and core behavioral differences between the Business Process Model and Notation (BPMN), the Case Management Model and Notation (CMMN) and the Decision Model and Notation (DMN). The specific roles and usage of these dominant business modeling notations will be explained and demonstrated using a worked-out example integrating BPMN, CMMN and DMN models.
What exactly are BPMN, CMMN and DMN?
When is one of these standards best suited for the purpose?
How to use BPMN, CMMN and DMN together?
What are the best practices for these standards?
To succeed, an analytics or data science team must effectively engage with business experts who are often inexperienced with advanced analytics, machine learning and data science. They need a framework for connecting business problems to possible analytics solutions. Decision modeling brings clarity to analytics projects, linking analytics solutions to business problems to deliver value.
In this webinar, IIA Expert James Taylor shares four key lessons learned by the central analytics team at a global leader in information technology. These lessons underscore that decision modeling builds a shared understanding with business clients, revives projects that have lost their purpose, brings clarity to problems long thought difficult and delivers value quickly. The webinar will also introduce the basics of decision modeling and provide practical recommendations for adoption. A Leading Practice Brief based on this case study will be available to all registrants.
Ensuring analytics projects have clarity of purpose is critical to delivering business value. Learn how decision modeling helps a leading organization bring focus to its data science and analytic investments.
The focus of modern business intelligence has been self-service: pushing data into the hands of end users more quickly, with more accessible user interfaces, so they can get answers fast and on their own. This has helped alleviate a major BI pain point: centralized, IT-dominated solutions have been too slow and too brittle to serve the business.
What has been masked is a lack of innovation in data modeling. Data modeling is a huge, valuable component of BI that has been largely neglected. In this webinar, we discuss Looker’s novel approach to data modeling and how it powers a data exploration environment with unprecedented depth and agility.
Topics covered include:
-A new architecture beyond direct connect
-Language-based, git-integrated data modeling
-Abstractions that make SQL more powerful and more efficient
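The idea behind language-based data modeling can be sketched in a few lines: declare dimensions and measures once in a model, then generate SQL from those declarations. This is a toy illustration of the concept only; it is not LookML syntax and does not reflect Looker's implementation, and the table and field names are invented for the example.

```python
# Hypothetical declarative model: field definitions live in one place.
MODEL = {
    "table": "orders",
    "dimensions": {"status": "orders.status",
                   "created_date": "DATE(orders.created_at)"},
    "measures": {"order_count": "COUNT(*)",
                 "total_revenue": "SUM(orders.amount)"},
}

def generate_sql(model, dimensions, measures):
    """Build a GROUP BY query from declared fields, so analysts reuse one
    definition of each measure instead of rewriting the SQL every time."""
    select_cols = [f"{model['dimensions'][d]} AS {d}" for d in dimensions]
    select_cols += [f"{model['measures'][m]} AS {m}" for m in measures]
    sql = f"SELECT {', '.join(select_cols)} FROM {model['table']}"
    if dimensions:
        sql += " GROUP BY " + ", ".join(str(i + 1) for i in range(len(dimensions)))
    return sql

print(generate_sql(MODEL, ["status"], ["order_count", "total_revenue"]))
```

Because the model is plain text, it can live in version control, which is the "git-integrated" part of the pitch: changes to shared metric definitions get the same review workflow as code.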
Data Modeling is a critical component of enterprise BI. Most data modeling is desktop-based. This can lead to a large number of problems including scale, maintenance, collaboration and security.
Join Chris Webb, independent consultant, and Peter Sprague, VP of Solutions Engineering at Pyramid Analytics, for a discussion about how to avoid an epidemic of poorly designed and scattered data models throughout an organization. Questions tackled will include:
-What is the impact of poor data modeling decisions?
-Can data modeling decisions be delayed?
-Do new tools and technologies alleviate the need for good data modeling?
-What are the data modeling needs of an enterprise?
The Internet of Things is finally here. We know because it attacked everyone in October! If you are working at a company that is making—or even using—an Internet-connected gadget, you’ll want to learn how to properly model the threats against it. Legendary security guru David Holmes will walk you through a full threat model assessment process specific to the IoT. Hilarious examples of what not to do are included.
This presentation was originally delivered at the F5 Agility conference August, 2017.
Boost your modeling performance. This simple demonstration of ensemble modeling will show you one way to do it. Intended for marketers and analysts of all levels.
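The core idea behind the demonstration can be sketched quickly: when several models make independent errors, averaging their predictions cancels much of the noise. The synthetic data and stand-in models below are illustrative assumptions, not the specific demonstration from the presentation.

```python
# Minimal sketch of ensemble modeling: averaging the scores of several
# models typically beats any single one of them.
import random

random.seed(0)

# Synthetic "truth": each of 1000 customers has a response score we try to predict.
truth = [random.random() for _ in range(1000)]

def noisy_model(noise):
    """Stand-in for a trained model: the true score plus independent error."""
    return [t + random.gauss(0, noise) for t in truth]

models = [noisy_model(0.3) for _ in range(10)]

def mean_squared_error(pred):
    return sum((p - t) ** 2 for p, t in zip(pred, truth)) / len(truth)

single_mse = mean_squared_error(models[0])

# Ensemble: average the ten models' scores for each customer.
ensemble = [sum(scores) / len(scores) for scores in zip(*models)]
ensemble_mse = mean_squared_error(ensemble)
# Independent errors partially cancel, so ensemble_mse comes out well
# below single_mse.
```

The same averaging trick underlies production techniques such as bagging and model stacking; the gain shrinks as the models' errors become correlated.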
It’s probably not too often that you’ll get this perspective: Star Wars was really all about information disclosure threats! You’ll want to find out more as noted presenter and author Adam Shostack draws on one of George Lucas’ epic sagas to deliver lessons on threat modeling. Not only was the Death Star badly threat modeled, but the politics between Darth Vader and Grand Moff Tarkin distracted incident response after the plans were stolen. This session will provide you with proven foundations for effective threat modeling as you develop and deploy systems. Adam will help you understand what works for threat modeling and how various approaches conflict or align. The force is strong with this session.
Marketers have been effectively analyzing their own customer data and identifying similar buyer attributes offline for decades. Unfortunately, customer data is often siloed or spread across dozens of applications, preventing marketers from leveraging insights that should be used to model the perfect new audience for digital campaigns, or to measure campaign performance against offline consumer sales.
When your data is connected, insights can be shared across marketing applications and measured offline or online. Data from every marketing technology in your toolbox can finally be connected, allowing you to continuously refine your campaigns. Sign up to join our next webinar, where you’ll learn:
– New strategies for audience modeling and practical tips for getting started
– How to reach the perfect audience by connecting your customer data to digital marketing campaigns
– Why to forget about measuring campaign performance with vanity metrics like “reach” or “impressions”
– How to determine which digital campaigns are actually delivering measurable sales
The last 10 years have seen a major shift in the BI market — legacy tools which provided governance but greatly limited access by most employees have given way to self-service workbook analytics tools. Workbook analytics allows data to be shared, but with everyone slicing and dicing data their own way, people find different answers to the same questions, and data chaos breaks out.
Learn how successful data-driven companies are striking a balance that allows for governance without bottlenecks and self-service without chaos, and taps the true benefits of BI by using a cloud-based data model.
Hear from BI experts as they deep dive into:
- The evolution of BI to better understand how we got here
- Why data modeling is key to a successful data strategy
- LookML: Looker’s light-weight, git-integrated data modeling language
Understanding data modeling can help you get the best insights out of your data. The challenge of data modeling is to understand how to work with complex data in order to standardize, structure and optimize it to get accurate insights quickly.