For most organizations, threat modeling is a difficult and expensive undertaking. There are good reasons why this is the case: threat modeling traditionally requires an experienced security architect with know-how in architecture patterns, design patterns, a breadth of technologies, and above all deep security knowledge.
Join this webinar and learn:
- Consistency/Reliability: Patterns let us identify recurring problems and consistently apply the same solution. For security, this means that identifying patterns during threat modeling allows us to create consistent design, development, testing, and risk guidance.
- Efficiency: Patterns allow us to automate parts of a problem while leaving the more complex concerns to experts, creating efficiencies.
- Commonly understood taxonomy: Patterns create a common taxonomy for organizing knowledge, training users/practitioners, and communicating with stakeholders (developers, testers, architects, security analysts, etc.).
Predictive modeling of Wi-Fi networks is an efficient and cost-effective way to determine the ideal quantity, placement, and configuration of APs for optimal performance, security, and compliance. The accuracy of predictive modeling is determined both by the quality of the modeling tool and by the accuracy of the data the user enters into it. There is an industry-standard process for gathering accurate RF-loss data to ensure optimal output accuracy from predictive modeling software such as AirMagnet’s Planner. The data-gathering process involves measuring wall, door, and object loss within facilities, but there is a bit more to it than one might initially expect, including dB offsets for low-capability devices.
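To make the dB bookkeeping concrete, here is a minimal sketch of the arithmetic a predictive model performs at each point: transmit power plus antenna gain, minus free-space path loss, minus each measured wall or object loss, minus an offset for low-capability client devices. The attenuation values and the client offset below are illustrative placeholders, not AirMagnet defaults.

```python
import math

def free_space_path_loss_db(distance_m: float, freq_mhz: float) -> float:
    """Free-space path loss (FSPL) in dB for a given distance and frequency."""
    # FSPL(dB) = 20*log10(d_km) + 20*log10(f_MHz) + 32.44
    return 20 * math.log10(distance_m / 1000.0) + 20 * math.log10(freq_mhz) + 32.44

def predicted_rssi_dbm(tx_power_dbm, antenna_gain_dbi, distance_m, freq_mhz,
                       wall_losses_db, client_offset_db=0.0):
    """Rough predicted signal strength at a client location.

    wall_losses_db: measured attenuation (dB) of each wall/door/object
                    between the AP and the client location.
    client_offset_db: extra margin for low-capability devices
                      (weaker radios/antennas) -- an illustrative value.
    """
    return (tx_power_dbm + antenna_gain_dbi
            - free_space_path_loss_db(distance_m, freq_mhz)
            - sum(wall_losses_db)
            - client_offset_db)

# Example: a 5 GHz AP at 17 dBm with a 3 dBi antenna, 20 m away,
# through one drywall partition (3 dB) and one brick wall (10 dB),
# with a 3 dB offset for a low-capability handheld device.
print(predicted_rssi_dbm(17, 3, 20, 5180, [3, 10], client_offset_db=3))
```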
Attend this webinar to understand the tools and processes required to optimize your predictive models for maximum accuracy.
Devin Akin is the founder of Divergent Dynamics, a Wi-Fi Systems Integrator, Value-Added Reseller (VAR), and Training organization. Akin has over 20 years in IT, with 15 years in WLAN specifically, and was recently named to the TWW Top 100 Wireless Technology Experts list for 2014.
The importance of facing, understanding, predicting, and even mitigating uncertainty has been widely acknowledged and studied in fields such as philosophy, physics, and finance. In software and systems engineering, due to the increasing complexity of large-scale systems themselves and the dynamic, unpredictable environments in which such systems are deployed and operated, growing attention has been given to the challenges of explicitly specifying and modeling uncertainty.
Uncertainty Modeling (UM) aims to promote and enable explicit specification of uncertainty and uncertainty-related concepts in various contexts (e.g., developing large-scale Cyber-Physical Systems and the Internet of Things), for different purposes (e.g., enabling uncertainty-wise requirements specification, modeling, verification, and validation, or facilitating the definition of uncertainty-wise testing strategies), and at different phases of the development of such complex, uncertainty-inherent systems (e.g., requirements, architecture, design, testing, and operation).
In the context of OMG, we see a diverse set of uncertainty modeling applications, such as 1) integrating with UML use cases and SysML Requirements Diagrams to enable requirements V&V, 2) capturing uncertainty as part of SysML or UML models to facilitate design and/or testing, and 3) integrating with BPMN and other OMG standards to facilitate different kinds of analysis and generation.
OMG’s Uncertainty Modeling Request for Information (RFI) is currently open for responses. The RFI aims to solicit ideas, discussions, comments, recommendations, user needs, and experiences related to uncertainty modeling. The collected responses will be carefully analyzed and used to identify requirements, on the basis of which an RFP for a UM specification will be developed. Instructions for responding are given in the OMG Uncertainty Modeling Request for Information document (ad/16-09-02 (Uncertainty RFI)).
We invite you to join the conversation.
Risk and actuarial modeling are among the fastest growing workloads in the insurance industry.
The Microsoft cloud is helping insurers manage increasing demands, including regulatory demands, in ways that deliver immediate value to the business and its customers. Driven by some of the major regulatory issues of the last few years, insurers are now being asked to run more complex models, more often.
Combine on-demand capacity with ground-breaking advanced analytics, and that’s where the cloud can make a real difference for you.
Join this webcast to:
- Learn how the Microsoft cloud enables insurers to dramatically improve their risk and actuarial modeling environments
- Stay ahead of regulatory reporting demands
- Explore the power of on-demand capacity with the ground-breaking advanced analytics that are unique to Microsoft
Presented by one of the foremost experts in BPM standards, this session will concretely demonstrate the three leading business modeling standards produced by the Object Management Group (OMG). It will explore the positioning and core behavioral differences between Business Process Model and Notation (BPMN), Case Management Model and Notation (CMMN), and Decision Model and Notation (DMN). The specific roles and usage of these dominant business modeling notations will be explained and demonstrated through a worked-out example that integrates BPMN, CMMN, and DMN models.
- What exactly are BPMN, CMMN, and DMN?
- When is each standard best suited to the task at hand?
- How can BPMN, CMMN, and DMN be used together?
- What are the best practices for these standards?
To succeed, an analytics or data science team must effectively engage with business experts who are often inexperienced with advanced analytics, machine learning and data science. They need a framework for connecting business problems to possible analytics solutions. Decision modeling brings clarity to analytics projects, linking analytics solutions to business problems to deliver value.
In this webinar, IIA Expert James Taylor shares four key lessons learned by the central analytics team at a global leader in information technology. These lessons show how decision modeling builds a shared understanding with business clients, revives projects that have lost their purpose, brings clarity to problems long thought difficult, and delivers value quickly. The webinar will also introduce the basics of decision modeling and provide practical recommendations for adoption. A Leading Practice Brief based on this case study will be available to all registrants.
Ensuring analytics projects have clarity of purpose is critical to delivering business value. Learn how decision modeling helps a leading organization bring focus to its data science and analytic investments.
The focus of modern business intelligence has been self-service: pushing data into the hands of end users more quickly, with more accessible user interfaces, so they can get answers fast and on their own. This has helped alleviate a major BI pain point: centralized, IT-dominated solutions have been too slow and too brittle to serve the business.
What has been masked is a lack of innovation in data modeling. Data modeling is a huge, valuable component of BI that has been largely neglected. In this webinar, we discuss Looker’s novel approach to data modeling and how it powers a data exploration environment with unprecedented depth and agility.
Topics covered include:
-A new architecture beyond direct connect
-Language-based, git-integrated data modeling
-Abstractions that make SQL more powerful and more efficient (a conceptual sketch follows below)
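As a rough illustration of what "language-based data modeling" and SQL-generating abstractions mean in practice, here is a small conceptual sketch in Python. It is not LookML and not Looker's API; the model, table, and column names are hypothetical. The point is that dimensions and measures are declared once, live in version control, and queries are generated from them rather than hand-written.

```python
# Conceptual sketch (not LookML): a declarative model of dimensions and
# measures that generates SQL, so query logic is defined once and reused.
from dataclasses import dataclass

@dataclass
class Dimension:
    name: str
    sql: str        # column or expression, e.g. "DATE(created_at)"

@dataclass
class Measure:
    name: str
    sql: str        # aggregate expression, e.g. "COUNT(*)"

@dataclass
class Model:
    table: str
    dimensions: dict
    measures: dict

    def query(self, dims, meas):
        """Generate a GROUP BY query from the requested dimensions and measures."""
        select = [f"{self.dimensions[d].sql} AS {d}" for d in dims]
        select += [f"{self.measures[m].sql} AS {m}" for m in meas]
        group_by = ", ".join(str(i + 1) for i in range(len(dims)))
        return (f"SELECT {', '.join(select)}\n"
                f"FROM {self.table}\n"
                f"GROUP BY {group_by}")

# Hypothetical "orders" model -- the definitions live in version control,
# so they can be reviewed, branched, and shared like any other code.
orders = Model(
    table="orders",
    dimensions={"order_date": Dimension("order_date", "DATE(created_at)")},
    measures={"order_count": Measure("order_count", "COUNT(*)"),
              "revenue": Measure("revenue", "SUM(amount)")},
)

print(orders.query(dims=["order_date"], meas=["order_count", "revenue"]))
```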
Data modeling is a critical component of enterprise BI. Most data modeling is desktop-based, which can lead to a large number of problems with scale, maintenance, collaboration, and security.
Join Chris Webb, independent consultant, and Peter Sprague, VP of Solutions Engineering at Pyramid Analytics, for a discussion of how to avoid an epidemic of poorly designed and scattered data models throughout an organization. Questions tackled will include:
-What is the impact of poor data modeling decisions?
-Can data modeling decisions be delayed?
-Do new tools and technologies alleviate the need for good data modeling?
-What are the data modeling needs of an enterprise?
The Internet of Things is finally here. We know because it attacked everyone in October! If you are working at a company that is making—or even using—an Internet-connected gadget, you’ll want to learn how to properly model the threats against it. Legendary security guru David Holmes will walk you through a full threat model assessment process specific to the IoT. Hilarious examples of what not to do are included.
This presentation was originally delivered at the F5 Agility conference August, 2017.
Boost your modeling performance. This simple demonstration of ensemble modeling will show you one way how. Intended for marketers and analysts of all levels.
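The webinar's exact demonstration isn't reproduced here, but a minimal sketch of one common ensemble approach, soft-voting across several classifiers, looks roughly like this (scikit-learn on synthetic data; all parameters are illustrative):

```python
# Minimal sketch of one ensemble technique: averaging predicted probabilities
# across several models ("soft voting"), compared against each model alone.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import RandomForestClassifier, VotingClassifier
from sklearn.metrics import accuracy_score

# Synthetic stand-in for a marketing response dataset.
X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

models = {
    "logistic": LogisticRegression(max_iter=1000),
    "tree": DecisionTreeClassifier(max_depth=5, random_state=0),
    "forest": RandomForestClassifier(n_estimators=100, random_state=0),
}

# Score each individual model first...
for name, model in models.items():
    model.fit(X_train, y_train)
    print(name, accuracy_score(y_test, model.predict(X_test)))

# ...then an ensemble that averages their predicted probabilities.
ensemble = VotingClassifier(estimators=list(models.items()), voting="soft")
ensemble.fit(X_train, y_train)
print("ensemble", accuracy_score(y_test, ensemble.predict(X_test)))
```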
It’s probably not too often that you’ll get this perspective: Star Wars was really all about information disclosure threats! You’ll want to find out more as noted presenter and author Adam Shostack references one of George Lucas’ epic sagas to deliver lessons on threat modeling. Not only was the Death Star badly threat modeled, the politics between Darth Vader and Grand Moff Tarkin distracted incident response after the plans were stolen. This session will provide you with proven foundations for effective threat modeling as you develop and deploy systems. Adam will help you understand what works for threat modeling and how various approaches conflict or align. The force is strong with this session.
Marketers have been effectively analyzing their own customer data and identifying similar buyer attributes offline for decades. Unfortunately, customer data is often siloed or spread across dozens of applications, preventing marketers from leveraging insights that could be used to model the perfect new audience for digital campaigns, or to measure campaign performance against offline consumer sales.
When your data is connected, insights can be shared across marketing applications and measured offline or online. Data from every marketing technology in your toolbox can finally be connected, allowing you to continuously refine your campaigns. Sign up to join our next webinar, where you’ll learn:
– New strategies for audience modeling and practical tips for getting started
– How to reach the perfect audience by connecting your customer data to digital marketing campaigns
– Why to stop measuring campaign performance with vanity metrics like “reach” or “impressions”
– How to determine which digital campaigns are actually delivering measurable sales
Understanding data modeling can help you get the best insights out of your data. The challenge of data modeling is understanding how to work with complex data in order to standardize, structure, and optimize it so you get accurate insights quickly.
The last 10 years have seen a major shift in the BI market — legacy tools, which provided governance but greatly limited access for most employees, have given way to self-service workbook analytics tools. Workbook analytics allows data to be shared, but with everyone slicing and dicing data their own way, people find different answers to the same questions, and data chaos breaks out.
Learn how successful data-driven companies are striking a balance that allows for governance without bottlenecks and self-service without chaos, and that taps the true benefits of BI by using a cloud-based data model.
Hear from BI experts as they deep dive into:
- The evolution of BI to better understand how we got here
- Why data modeling is key to a successful data strategy
- LookML: Looker’s lightweight, git-integrated data modeling language
Nor-Tech will show you how to get a better ROI on your investments in Simulation & Modeling software. Software license costs are 75% or more of your TCO in most Simulation & Modeling solutions, and Nor-Tech’s Intel HPC technology makes smarter use of your license investment for your software dollars. Nor-Tech’s 20 years in HPC technology, plus their built-in add-ons called NT-EZ, make their HPC solutions easy to use and, in many cases, lower cost than workstations or the cloud.