    • Enhancing Predictive Modeling & RF Site Survey Accuracy
      Devin Akin, CEO, Divergent Dynamics | Recorded: Nov 30 2017 7:00 pm UTC | 60 mins
    • Predictive Modeling of Wi-Fi networks is an efficient and cost-effective way to determine ideal quantity, placement, and configuration of APs for optimal performance, security, and compliance. The accuracy of predictive modeling is determined both by the quality of the predictive modeling tool and by the accuracy of the data entered into the tool by the user. There is an industry standard process by which to gather accurate RF-loss-object data to ensure optimal output accuracy of predictive modeling software, such as AirMagnet’s Planner. The data gathering process involves measurement of wall, door, and object loss within facilities, but there’s a bit more to the process than one might initially expect, including dB offsets for low-capability devices.

      Attend this webinar to understand the tools and processes required to optimize your predictive models for maximum accuracy.

      Devin Akin is the founder of Divergent Dynamics, a Wi-Fi Systems Integrator, Value-Added Reseller (VAR), and Training organization. Akin has over 20 years in IT, with 15 years in WLAN specifically, and was recently named to the TWW Top 100 Wireless Technology Experts list for 2014.
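      The dB bookkeeping the description mentions (per-wall loss plus offsets for low-capability devices) can be sketched as a simple link budget. This is an illustrative model only, using the standard free-space path loss formula with hypothetical wall-loss and offset values; it is not AirMagnet Planner's actual algorithm:

```python
import math

def predicted_rssi(tx_power_dbm, freq_mhz, distance_m,
                   wall_losses_db=(), device_offset_db=0.0):
    """Estimate received signal strength: transmit power minus
    free-space path loss, per-wall attenuation, and a device offset."""
    # Free-space path loss (d in km, f in MHz): 20*log10(d) + 20*log10(f) + 32.44
    fspl = (20 * math.log10(distance_m / 1000.0)
            + 20 * math.log10(freq_mhz)
            + 32.44)
    return tx_power_dbm - fspl - sum(wall_losses_db) - device_offset_db

# AP at 17 dBm on channel 36 (5180 MHz), client 20 m away behind one
# drywall (~3 dB) and one brick wall (~8 dB), with a 4 dB offset for a
# low-capability client (all loss values are illustrative assumptions)
rssi = predicted_rssi(17, 5180, 20, wall_losses_db=(3, 8), device_offset_db=4)
```

      Measured wall losses from a site survey would replace the guessed constants here, which is exactly the data-gathering step the webinar covers.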

    • Introduction to Uncertainty Modeling
      Tao Yue, Shaukat Ali, and Bran Selic, Simula Research Laboratory | Recorded: Jan 25 2017 4:00 pm UTC | 61 mins
    • The importance of facing, understanding, predicting, and even mitigating uncertainty is well acknowledged and has been studied in fields such as philosophy, physics, and finance. In software and systems engineering, the increasing complexity of large-scale systems, together with the dynamic and unpredictable environments in which such systems are deployed and operated, has drawn growing attention to the challenges of explicitly specifying and modeling uncertainty.
      Uncertainty Modeling (UM) aims to promote and enable explicit specification of uncertainty and related concepts in various contexts (e.g., developing large-scale Cyber-Physical Systems and the Internet of Things), for different purposes (e.g., enabling uncertainty-wise requirements specification, modeling, verification, and validation, or facilitating the definition of uncertainty-wise testing strategies), and at different phases of the development of such complex, uncertainty-inherent systems (e.g., requirements, architecture, design, testing, and operation).
      In the context of OMG, we see a diverse set of uncertainty modeling applications, such as 1) integrating with UML use cases and the SysML Requirements Diagram to enable requirements V&V, 2) capturing uncertainty as part of SysML or UML models to facilitate design and/or testing, and 3) integrating with BPMN and other OMG standards to facilitate different kinds of analyses and generation.
      OMG’s Uncertainty Modeling Request for Information (RFI) is currently open for responses. The RFI aims to solicit ideas, discussion, comments, recommendations, user needs, and experiences related to uncertainty modeling. Responses will be carefully analyzed and used to identify requirements, on the basis of which an RFP for a UM specification will be developed. Instructions for responding to this RFI are specified in the OMG Uncertainty Modeling Request for Information document (ad/16-09-02 (Uncertainty RFI)).
      We invite you to join the conversation.

    • Bringing Clarity to Analytics Projects with Decision Modeling: A Case Study
      James Taylor, IIA Expert in Decision Management Systems | Recorded: Feb 2 2017 5:00 pm UTC | 57 mins
    • To succeed, an analytics or data science team must effectively engage with business experts who are often inexperienced with advanced analytics, machine learning and data science. They need a framework for connecting business problems to possible analytics solutions. Decision modeling brings clarity to analytics projects, linking analytics solutions to business problems to deliver value.
       
      In this webinar, IIA Expert James Taylor shares four key lessons learned by the central analytics team at a global leader in information technology. These lessons underscore how decision modeling builds a shared understanding with business clients, revives projects that have lost their purpose, brings clarity to problems long thought difficult, and delivers value quickly. The webinar will also introduce the basics of decision modeling and provide practical recommendations for adoption. A Leading Practice Brief based on this case study will be available to all registrants.
       
      Ensuring analytics projects have clarity of purpose is critical to delivering business value. Learn how decision modeling helps a leading organization bring focus to its data science and analytic investments.

    • Threat Modeling: Lessons from Star Wars - Adam Shostack
      Adam Shostack | Recorded: Oct 13 2015 8:40 pm UTC | 43 mins
    • It’s probably not often that you’ll get this perspective: Star Wars was really all about information disclosure threats! You’ll want to find out more as noted presenter and author Adam Shostack draws on George Lucas’ epic saga to deliver lessons on threat modeling. Not only was the Death Star badly threat modeled, but the politics between Darth Vader and Grand Moff Tarkin distracted incident response after the plans were stolen. This session will provide you with proven foundations for effective threat modeling as you develop and deploy systems. Adam will help you understand what works for threat modeling and how various approaches conflict or align. The Force is strong with this session.

    • Emerging Trends in Audience Modeling
      Dave Katz, Sr. Dir of Data, Hearst; John Wallace, CGO, MarketShare; Rebecca Stone, Marketing Dir, LiveRamp | Recorded: Oct 29 2015 4:00 pm UTC | 33 mins
    • Marketers have been effectively analyzing their own customer data and identifying similar buyer attributes offline for decades. Unfortunately, customer data is often siloed or spread across dozens of applications, preventing marketers from leveraging insights that could be used to model the perfect new audience for digital campaigns, or to measure campaign performance against offline consumer sales.

      When your data is connected, insights can be shared across marketing applications and measured offline or online. Data from every marketing technology in your toolbox can finally be connected, allowing you to continuously refine your campaigns. Sign up to join our next webinar, where you’ll learn:

      – New strategies for audience modeling and practical tips for getting started
      – How to reach the perfect audience by connecting your customer data to digital marketing campaigns
      – Why to look past vanity metrics like “reach” or “impressions” when measuring campaign performance
      – How to determine which digital campaigns are actually delivering measurable sales

    • Data Modeling in Hadoop
      Maloy Manna, Engineering, AXA Data Innovation Lab | Recorded: Jun 16 2016 1:00 pm UTC | 49 mins
    • The USP of Hadoop over a traditional RDBMS is “schema on read”.

      While Hadoop’s flexibility in data organization, storage, compression, and formats makes it easy to process data, understanding the impact of these choices on search, performance, and usability leads to better design patterns.

      Knowing when and how to use schemas, and how to evolve a data model as requirements change, is key to building data-driven applications.

      This webinar will explore the various options available and their impact to allow better design choices for data processing and metadata management in Hadoop.
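      The schema-on-read idea can be illustrated outside Hadoop in a few lines of Python: raw records are stored as-is, and a schema is projected onto them only at query time. The field names below are invented for illustration:

```python
import json

# "Schema on write": an RDBMS validates rows against a fixed schema at load time.
# "Schema on read": the store keeps raw records; a schema is applied when queried.
raw_records = [
    '{"user": "alice", "clicks": 12}',
    '{"user": "bob", "clicks": 7, "country": "DE"}',  # a new field appears later
]

def read_with_schema(lines, schema):
    """Project each raw record onto the schema at read time,
    tolerating missing or extra fields (basic schema evolution)."""
    for line in lines:
        rec = json.loads(line)
        yield {field: rec.get(field) for field in schema}

rows = list(read_with_schema(raw_records, schema=("user", "clicks", "country")))
# rows[0]["country"] is None: the older record simply lacks the newer field
```

      Formats such as Avro or Parquet handle this projection and evolution for you; the point is that the schema lives with the reader, not the loader.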

    • Learning Environment Modeling Language: The New Language of Learning Design
      Phylise Banner | Recorded: Oct 18 2017 4:30 pm UTC | 55 mins
    • Creating a vision that is shared among designers, subject matter experts, clients, and leaders is one of the fundamental challenges that content professionals must overcome in the learning design space. Many project stakeholders lack the background knowledge, experience, or “language” to relate their vision in ways that instructional designers can effectively implement.

      Join me, Scott Abel, The Content Wrangler, and my special guest, Phylise Banner, Learning Experience Designer, for this free, one-hour webinar. In this session, Phylise will introduce an easy-to-use and powerful visual learning design method called Learning Environment Modeling (LEM), a unique visual language created to enhance communication and foster collaboration between instructional design professionals and diverse stakeholders.

      During the session, you will learn how to visually communicate the correlation of specific design elements to learning results and use Learning Environment Modeling (LEM) to collaborate effectively with blended learning project teams. Phylise will talk about how to facilitate more effective communication throughout the design process, and how to use a learning environment design system and tools to remove or reduce ego-centric behaviors and attitudes during the design process.

      Join us as we explore how LEM can support creative learning experience design and remove barriers to communication throughout the learning design process.

    • Enabling Content in Chatbots and AI: Modeling Microcontent Structure for DITA
      Steve Manning, Precision Content | Recorded: Dec 12 2017 4:00 pm UTC | 58 mins
    • We’ve all heard about the benefits of content modeling and structured content for technical information. We’ve implemented DITA or other topic-based strategies to break content into smaller blocks to manage and publish, but topics are not small enough.

      But the addition of Bots, voice-enabled interfaces, and AI means we must change the way we structure content. We are moving from a broadcast style of communication – publish and hope for the best – to a more conversational style of communication. More question and answer. This imposes requirements on the content models you need to create if you want to talk to the Bots. We must be more granular in our models. We need to implement Microcontent.

      Join Val Swisher, sitting in for Scott Abel, The Content Wrangler, and her special guest, Steve Manning of Precision Content, for this webinar. Steve will discuss how new technology is changing the way we approach content and the topic-based model, and how to deliver what you need for these new technology challenges. Steve will work through a different approach to content that focuses more on reader outcomes and on how those outcomes affect your content models. Attendees will learn how deeply they need to model their content to get the most from chatbots, voice-enabled systems, and ultimately AI.

      Attendees will also learn 1) Why traditional topic-based DITA is not granular enough for the future; 2) How microcontent is a better approach to future-proof your content, and 3) How to use user outcomes to drive models and granularity.
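      As a rough illustration of the granularity argument, a microcontent unit might pair a single user intent with a short, standalone answer that a bot can return directly. The element names below are hypothetical, not part of the DITA standard or Precision Content’s actual model:

```xml
<!-- Illustrative sketch of a microcontent unit; element names are invented -->
<microcontent id="mc-reset-password">
  <!-- The question (user intent) a chatbot or voice interface matches on -->
  <intent>How do I reset my password?</intent>
  <!-- A self-contained answer small enough to speak or display alone -->
  <short-answer>Choose Settings, then Account, then Reset Password.</short-answer>
  <!-- Optional longer detail, still scoped to this one outcome -->
  <detail>
    <steps>
      <step>Open <uicontrol>Settings</uicontrol>.</step>
      <step>Select <uicontrol>Account</uicontrol>.</step>
      <step>Choose <uicontrol>Reset Password</uicontrol> and follow the prompts.</step>
    </steps>
  </detail>
</microcontent>
```

      A conventional DITA task topic would bundle many such outcomes together; modeling at this level is what lets a conversational interface retrieve exactly one answer.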
