Artificial intelligence (AI) and machine learning (ML) are exciting, growing in adoption and more applicable to business contexts than ever before. The problem is that much AI/ML work to date has relied on rare and expensive data scientists. These practitioners take data sets, then experiment with different frameworks, algorithms and parameter values to create the best predictive models possible. This bespoke approach can be fascinating, but it just won’t scale sufficiently to bring AI and ML into the enterprise mainstream.
Why perpetuate an AI process that is so manual when heuristics – and even ML itself – can help automate data cleansing, algorithm selection and parameter tuning? Yes, AI automation can help organizations without data scientists do AI and ML. But beyond that, even orgs with strong data science teams can use AI automation to remove tedium from, and increase accuracy in, their work. It’s win-win, even if it removes some of AI’s mystique.
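The automation described above, trying candidate algorithms and parameter values and keeping the best model, can be sketched in a few lines. This is a deliberately tiny, purely illustrative example of automated parameter search; it is not Dataiku's implementation, and real AutoML adds data cleansing, feature engineering, and smarter search strategies:

```python
# Minimal sketch of automated parameter tuning: try candidate
# parameter values for a simple model and keep whichever scores
# best on held-out data. Purely illustrative.

# Toy labeled data: (feature, label) pairs.
holdout = [(0.1, 0), (0.3, 0), (0.7, 1), (0.8, 1)]

def threshold_model(threshold):
    """A one-parameter 'algorithm': predict 1 if x >= threshold."""
    return lambda x: 1 if x >= threshold else 0

def accuracy(model, data):
    return sum(model(x) == y for x, y in data) / len(data)

# Automated search over parameter values, no human in the loop.
best_acc, best_t = -1.0, None
for t in [0.1, 0.3, 0.5, 0.7]:
    acc = accuracy(threshold_model(t), holdout)
    if acc > best_acc:
        best_acc, best_t = acc, t

print(best_t, best_acc)  # → 0.5 1.0
```

The same loop generalizes to trying whole families of algorithms, which is the tedium AutoML removes from data scientists' workflows.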
Want to learn more about automating AI, and making it actionable and accessible to all your data workers? Join us for this free 1-hour webinar from GigaOm Research. The webinar features GigaOm analyst Andrew Brust with Kurt Muehmel, VP of Sales Engineering at Dataiku, a leader in AI for the Enterprise.
In this 1-hour webinar, you will discover:
Why bespoke AI work is fast becoming unsustainable
The science and power behind automated machine learning (AutoML)
How to leverage AI automation while combating bias, drift and other AI/ML pitfalls
Register now to join GigaOm Research and Dataiku for this free expert webinar.
Emily Pali, Product Manager, and James Royalty, Principal Engineer | Recorded: Jun 25, 2019 | 22 mins
CDNs are critical in a globally distributed world where users expect optimal performance regardless of where they are located.
The reasons for implementing a multi-CDN environment might seem obvious--high availability and redundant infrastructure--but if not set up correctly, it can actually impede the user experience.
The reality is, if you’re not routing traffic efficiently across your multi-CDN environment you’re not truly benefiting from it. NS1’s Pulsar Active Traffic Steering allows you to truly harness the power of multi-CDN without the headache.
Fulya Sengil, Solution Architect & Adam Reyland, Regional Marketing Specialist at Veracode | Recorded: Jun 25, 2019 | 28 mins
Many commentators observe that IoT devices just aren’t up to scratch when it comes to security. GDPR requires vendors and service providers to design things with security as standard. In February 2019, the European standards body ETSI published security guidelines for the consumer Internet of Things, aligning with the IoT Security Compliance Framework.
IoT requires the best in all aspects of security — physical, operational technology, and cybersecurity. Thus, it makes sense to envisage IoT security as an ecosystem in itself. Unexpected challenges are likely to erupt because of the existence of several layers in the IoT ecosystem. This calls upon leaders to initiate regular automated risk assessments and simulations so that IoT-specific breaches can be monitored closely, and helps businesses build reliable playbooks for responding to IoT security challenges.
Software installed on these devices could be vulnerable if it has not undergone an automated security assessment before deployment. We take a look at how it’s possible to secure the software that drives these devices, and the backend serverless technologies, based on the requirements in the IoT Security Compliance Framework.
H2O Driverless AI is H2O.ai's flagship platform for automatic machine learning. It fully automates the data science workflow, including some of the most challenging tasks in applied data science such as feature engineering, model tuning, model optimization, and model deployment. Driverless AI turns Kaggle Grandmaster recipes into a fully functioning platform that delivers "an expert data scientist in a box" from training to deployment.
In the latest version of our Driverless AI platform, we have included Natural Language Processing (NLP) recipes for text classification and regression problems. With this new capability, Driverless AI can now address a whole new set of problems in the text space, such as automatic document classification, sentiment analysis and emotion detection, using textual data. Join the webinar to learn more.
Dave Berry, Senior Solutions Engineer - International, Unravel Data | Recorded: Jun 25, 2019 | 42 mins
The movement to utilize data to drive more effective business outcomes continues to accelerate. But with this acceleration comes an explosion of complex platforms to collect, process, store, and analyze this data. Ensuring these platforms are utilized optimally is a tremendous challenge for businesses.
Join Dave Berry, Senior Solutions Engineer at Unravel Data, as he takes you through an AI/ML based approach to Application Performance Management applied to data applications on any infrastructure - whether it be cloud, on-premises, or a combination of the two.
Rajesh Ghai, Research Director, Carrier Network Infrastructure, IDC | Recorded: Jun 24, 2019 | 53 mins
Enterprises have embraced cloud computing to unlock the opportunity offered by digital transformation. The cloud’s flexibility and agility enable enterprises to grow their business without borders and ensure productivity and efficiency.
While enterprise applications continue to migrate to the cloud, the necessary change in the wide area network (WAN) is often overlooked. The hub-and-spoke WAN architecture that served the needs of the enterprise when applications were delivered from the datacenter must evolve to serve the needs of the era of cloud applications. Software-defined WAN (SD-WAN) is the WAN's response to this paradigm shift in application traffic to the cloud.
While SD-WAN has emerged as a key enabler of secure and seamless direct cloud application access with the benefits of transport independence, better security and path selection, it has brought into focus the importance of the transport underlay and the technology investment — both past and future — that an enterprise needs to consider before adopting SD-WAN.
This webinar spotlights two critical success factors for driving mainstream enterprise adoption of SD-WAN:
*Predictable and robust Internet connectivity
*Investment protection (of installed legacy network equipment or in new technology) as the WAN evolves to support applications delivered from the cloud.
Join us at this webinar as we make sense of SD-WAN adoption for your company.
David Linthicum, John Mao | Recorded: Jun 21, 2019 | 56 mins
This free 1-hour webinar from GigaOm Research brings together experts in Kubernetes on-prem ops success, featuring GigaOm Analyst David Linthicum and special guest John Mao, VP Business Development from Stratoscale.
You don’t have to read all of the analyst surveys to understand that Kubernetes usage is accelerating in 2019. Indeed, while the growth in the cloud is exceptional, the growth in leveraging Kubernetes on premises is just as impressive.
However, many enterprises with a need to deploy Kubernetes in their data center are left out in the cold. Tools that focus on Kubernetes on-prem don’t provide the value IT is expecting, and thus their Kubernetes on-prem ops are at risk of failing.
Enter new tools and approaches that can ensure success. Moreover, enter new guidance that allows leaders to approach the problem armed with the right knowledge.
Indeed, five core secrets exist for approaching on-prem ops: deployment, monitoring, security, upgrading, and of course scaling.
In this 1-hour webinar, we will explore:
• Why approaches to Kubernetes success are changing.
• Emerging best practices and technology.
• Five secrets of Kubernetes container orchestration that enterprises and cloud providers need for success
• Must-know keys to enabling the technology successfully
Andrew Brust, Jagane Sundar | Recorded: Jun 20, 2019 | 62 mins
Cloud adoption is in full swing, no longer dominated by hype and just a few early adopters. In the world of data analytics, migrating data from on-premises distributed storage systems to cloud object storage is the mission. Once the data has landed there, multiple data services can query and analyze it. The payoff can be huge, but the devil’s in the details.
While the problem of migrating applications to the cloud has largely been solved, migrating data is much less straightforward. Today’s enterprise data volumes can be quite large, resulting in long-running data migration processes. Enterprises can’t just hit pause on their businesses, though. So how can they maintain operational continuity while conducting lengthy migrations?
To learn more about cloud data migration challenges and solutions, join us for this free 1-hour webinar from GigaOm Research. The webinar features GigaOm analyst Andrew Brust and special guest Jagane Sundar from WANdisco, a leader in cloud data movement and hybrid data lake solutions.
In this 1-hour webinar, you will discover:
How to migrate Hadoop clusters at scale
The realities of cloud data migration management
The multi-data-engine benefits of cloud data lakes
Register now to join GigaOm and WANdisco for this free expert webinar.
Who Should Attend:
Chief Data Officers
Database Administrators (DBAs)
Jon Collins, Chris Merz, Ingo Fuchs | Recorded: Jun 20, 2019 | 62 mins
This free 1-hour webinar from GigaOm Research brings together experts in DevOps and Hybrid Cloud, featuring GigaOm analyst Jon Collins and special guests from NetApp, Ingo Fuchs, Chief Technologist, Cloud and DevOps, and Chris Merz, Principal Technologist, DevOps. The discussion will focus on delivering DevOps and the challenges faced when adopting this culture.
Not only do organizations find it challenging to scale DevOps practices across the business, it’s also hard to keep pace with the fast, real-time needs of application development. As companies face the reality of the pace required for software development, they’re simultaneously challenged to balance quality and security, governance and management, complexity and collaboration, and they require an agile storage strategy to allow them to pivot as needs change.
Should such organizations just go back to waterfall models and single-server, three-layer architectures? Of course not. The right approach allows them to pick and choose the right infrastructure, whether it be in the cloud, or on-premises, to deliver the right solutions for their business needs - all without affecting their developer experience. In this webinar, we talk to NetApp, itself a large enterprise using both DevOps and hybrid multi-cloud strategy, about lessons learned from helping customers towards this goal.
In this 1-hour webinar, attendees will discover:
• The challenges faced by organizations looking to adopt DevOps
• What this means in terms of symptoms, costs, efficiency and effectiveness impacts
• Best practices for supporting software development
• What tools, technologies, and processes enable DevOps to exist, no matter your storage strategy
• How to choose the right infrastructure for your strategy
Rob Lauer, Senior Manager, Developer Relations, Progress & Tara Manicsic, Principal Developer Advocate, NativeScript | Recorded: Jun 20, 2019 | 45 mins
It’s a safe bet that your organization manages countless apps, ranging from 20-year-old web apps to cutting-edge native mobile apps. And the number of new app requests is increasing at an unsustainable rate. Plus, your users expect more: top-line performance, modern usability, and a bug-free experience.
Your IT team is stuck in the middle. How do you keep building more apps, for more users, while maintaining a consistent UX, provisioning the right apps to the right people, and keeping up with today’s development standards?
The solution to your problems could very well be the Kinvey Microapps platform, powered by NativeScript.
Microapps are single-purpose workflows that help users accomplish individual tasks as part of a more complex workflow. Microapps help break complicated systems into an easier-to-manage array of workflows (microapps) that can be made instantly available on any device a user may want.
William McKnight, Paige Roberts | Recorded: Jun 20, 2019 | 61 mins
This free 1-hour webinar from GigaOm Research brings together experts in modern uses of data warehousing. Featuring GigaOm Analyst William McKnight and a special guest from Vertica, Paige Roberts, the presentation will focus on strategies enterprises should be undertaking today for evolving data warehouse ecosystems.
Despite the allure and utility of other analytic data constructs, enterprises continue to invest the most in, and get tremendous benefit from, great data warehouses. The data warehouse is still the key component of the actionable analytic future. However, the data warehouse of today is used differently, can accomplish new things, and requires different technical and business strategies than before to get the most out of it.
In this webinar, McKnight dives into the current state of data warehouse initiatives. He discusses how implementation realities of the last two decades have demonstrated the importance of the data warehouse, and why enterprise leaders must think differently about data warehousing as a whole.
In this 1-hour webinar, attendees will discover:
• The state of the data warehouse today and strategies to maximize the return on data warehouse investments of 2019 and beyond.
• The flavors of the data warehouse and the criteria for ensuring your data warehouse(s) is up to standard
• Key considerations when building a new data warehouse, evolving an existing data warehouse, or contemplating a re-platforming.
• The continued relevance of data warehousing today
• How to know when to put your data warehouse in contain mode
• Ideas for data platform selection for a data warehouse
Michael Ducy, Director of Community & Evangelism, Sysdig | Recorded: Jun 20, 2019 | 39 mins
The Falco community is celebrating three years of container protection from this open source run-time security project, born out of Sysdig and now part of the Cloud Native Computing Foundation. The project has come a long way since its initial release in 2016. We’ll cover those early days and talk about how the project - and the world of container security - has grown over the years. We’ll also share the latest updates on Falco, including: adoption, ways it's being used, newly released features, and the upcoming roadmap. Whether you’re new to the world of container security or a seasoned expert, you’ll want to join to learn more about how Falco is evolving as the standard for container run-time security.
Ahyoung An, Sr. Product Marketing Manager, MuleSoft, and Aaron Landgraf, Sr. Product Marketing Manager, MuleSoft | Recorded: Jun 20, 2019 | 22 mins
The latest release of Anypoint Platform helps organizations create and engage a vibrant ecosystem of developers, partners and employees and succeed with API programs.
This release features the launch of Anypoint API Community Manager and Catalyst Mobilize to increase customer loyalty and engagement, improve internal ops effectiveness and introduce new business models through API programs.
See an in-depth overview of Anypoint Platform, including how to:
- Promote API products, increase adoption, and build successful API programs with Anypoint API Community Manager.
- Build and engage a vibrant ecosystem with Catalyst Mobilize, which includes in-person workshops, playbooks, and tailored exercises.
- Monitor and optimize the performance of API programs.
- Additional enhancements within Anypoint Partner Manager (EDI/B2B), Flow Designer, Government Cloud, and more.
Aditya Guthey, Speaker and Coach, www.whoweare.io | Recorded: Jun 20, 2019 | 64 mins
The digital revolution is happening at a fast pace. Things are constantly changing, and companies either play catch-up or lead the revolution. This webinar helps you consciously choose how to respond to the digital revolution.
We will discuss the six levels of energy that help build consciousness around the decisions we make. Then we discuss how these six levels of energy apply to digital transformation. This helps companies know where they are and where they want to be with respect to the digital revolution, so that they can consciously take action steps toward their goals.
About the speaker:
Aditya Guthey is a performance coach who helps engineers perform at their peak. An engineer by craft, he offers keynotes, workshops, group, and individual coaching to help create high performing engineering teams.
Adam Baldwin, VP of Security | Recorded: Jun 20, 2019 | 26 mins
Matt Sitelman, Mimecast; Shayla Treadwell, ECS Federal; Ashley Schwartau, Sec. Awa. Co.; Brandon Dunlap | Recorded: Jun 20, 2019 | 57 mins
All organizations wrestle with their security awareness programs. It is clear that users need to be part of the solution and not just part of the problem. Many users however see these programs as “gimmicks” and don’t take the training as seriously as the organization would like them to. Security managers are often putting out other fires and can’t devote the time they would like to ongoing awareness training. So what can be done to overcome these issues? Join Mimecast and (ISC)2 on June 20, 2019 at 1PM Eastern for a discussion on the do’s and don’ts of security awareness training and testing, why it’s important to customize training for different groups and how to truly engage your end-users to make them part of your security program. We will also provide tips for gaining management support and building a company-wide culture of security with training as a key component.
Rani Osnat, VP of Product Marketing, Aqua Security | Recorded: Jun 20, 2019 | 47 mins
The concept of “shift left” engages security earlier in the development cycle of cloud-native applications, accelerating development while reducing risk. However, migrating to cloud-native environments also requires the security team to “shift up”, focusing on the application layer to account for the shared-service model and “thin OS” environments that are prevalent there.
Attend this webinar to learn why Shifting Up provides improved security and cost efficiency in cloud-native environments, including:
- Kubernetes orchestrated applications
- Containers running on VMs
- Serverless containers (e.g., AWS Fargate and Azure Container Instances)
- Serverless functions (e.g., AWS Lambda and Azure Functions)
Mayank Gupta, Phil Sellers, Ray Lucchesi, Steve Ginsberg | Recorded: Jun 20, 2019 | 60 mins
This free 1-hour webinar from GigaOm Research brings together experts in hyper-converged architecture (HCI) and data center infrastructure and administration to explore how enterprise leaders are using HCI to modernize data center infrastructure and prepare it for public cloud co-existence.
Featuring Mayank Gupta, Product Marketing Lead, Core HCI, at Nutanix, the discussion will be moderated by GigaOm Analyst Ray Lucchesi and includes GigaOm analysts Phil Sellers and Steve Ginsberg. Many data center managers are struggling with infrastructure sprawl, adding servers and storage to address never-ending application requirements. But for many, this is a symptom rather than a solution. Resource utilization is typically abysmal in most non-virtualized data centers. Modernizing a data center with HCI has the potential to drastically improve utilization, reduce complexity and provide a platform that can better co-exist with public cloud.
In this 1-hour webinar, the team examines:
• How employing HCI software-defined infrastructure can modernize the data center
• How the use of HCI can increase resource utilization and reduce complexity across the data center.
• How to build HCI in the data center and public cloud to co-exist with and support IT application needs.
Participants will come away with a better appreciation of how HCI can help modernize the data center while at the same time improving utilization, reducing complexity and bringing about a data center better able to co-exist with public cloud.
Josh Caid, Chief Evangelist, Cherwell Software | Recorded: Jun 20, 2019 | 49 mins
Garbage in, garbage out, or so the saying goes. If you get more efficient at garbage, then you too can have a bigger pile of garbage. Automation is an area where “work smarter, not harder” really comes into play. Let’s look at what some companies have done to avoid this.
Derek Weeks, VP at Sonatype and Co-Founder of All Day DevOps | Recorded: Jun 20, 2019 | 30 mins
We've spent six years studying secure coding practices of DevOps and Continuous Delivery organizations by surveying over 15,000 IT professionals. We've analyzed their staffing practices, educational priorities, automation choices, and process improvements that improve their cybersecurity preparedness. Our study has also uncovered details of where automation fails, awareness falls short and breaches happen.
Come participate in this session where we will share the 10 habits practiced by the DevSecOps Elite that you can then apply to -- or further mature within -- your own organization. We will also uncover what our analysis revealed about securing CI/CD pipelines, including what popular Jenkins plug-ins are used for security.
Over the last few years, a major shift has occurred in the Quality Assurance (QA) and Software Testing Management space. Organizations are adopting smart test automation, but continue to find the need for traditional PMO-level reporting offered by tools like Micro Focus ALM and Quality Center. Often, this results in a tool mismatch that negatively impacts the ability to report in real-time across all phases of manual and automated testing.
Smart test automation is no longer a “nice to have” but rather a foundational aspect of the testing process, and this is impacting the toolchain. Organizations are adding new test automation tools like Selenium, Tricentis Tosca and others to their suite to better align with their Agile and test automation efforts. Global QA directors are recognizing that a multi-vendor, multi-tool approach to software quality assurance, with commercial and open-source tools, yields the best results.
With test planning, design, execution, and reporting now split across multiple tools, organizations have lost their visibility into key QA metrics for measuring manual vs. automated test execution effectiveness, test automation progress, automated testing execution progress, and defect remediation for all phases of the software testing lifecycle.
Join Senior Value Stream Architect, Tina Dankwart, to learn how Tasktop can help you:
• Modernize test management by adding new, specialized tools for test automation
• Improve collaboration between QA teams working in their preferred tools of choice
• Eliminate the laborious effort of manual synchronization and report generation
• Centralize test data from automated and manual tests spanning multiple tools and instances for real-time reporting
Building a brand isn’t just the job of your marketing team - it’s also a critical part of product development. But in the tech world, brand building isn’t often a topic of discussion. Instead, product teams are focused on identifying user problems, getting user feedback and optimizing products. From a brand perspective, how do you create products and experiences that drive greater emotional connection while transforming cultural experiences? Casper CXO Eleanor Morgan will share the secret to building brands that resonate and influence product development to bring delight to your customers.
Eleanor Morgan is the Chief Experience Officer at Casper - responsible for the design and development of the Casper customer experience across all touchpoints (digital, retail, in-home). Prior to joining Casper, Eleanor spent nearly 10 years at IDEO where she led the development of dozens of award-winning products and brand experiences for startups and Fortune 500 companies. Prior to IDEO, she worked as a product designer at Volkswagen. Eleanor holds a Masters in Mechanical Engineering from Stanford University and currently lives in NYC.
Tim Berglund, Sr. Director Developer Experience, Confluent + Rachel Pedreschi, Worldwide Director of Field Engineering, Imply | Recorded: Jun 19, 2019 | 56 mins
The next generation architecture for exploring and visualizing event-driven data in real-time requires the right technology. Microservices deliver significant deployment and development agility, but raise questions of how data will move between services and how it will be analyzed. This online talk explores how Apache Druid and Apache Kafka® can turn a microservices ecosystem into a distributed real-time application with instant analytics. Apache Kafka and Druid form the backbone of an architecture that meets the demands imposed on the next-generation applications you are building right now. Join industry experts Tim Berglund, Confluent, and Rachel Pedreschi, Imply, as they discuss architecting microservices apps with Druid and Apache Kafka.
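The pattern at the heart of this architecture, services publishing events to a shared log while an analytics consumer maintains real-time aggregates, can be sketched with an in-memory stand-in. This is purely illustrative: the `topic` deque stands in for a Kafka topic and the rollup consumer stands in for Druid's role; no actual Kafka or Druid APIs are used:

```python
# Illustrative in-memory stand-in for the event-streaming pattern:
# producers append events to a "topic" (Kafka's role) and an
# analytics consumer maintains real-time rollups (Druid's role).
from collections import defaultdict, deque

topic = deque()  # stands in for a Kafka topic partition

def produce(event):
    """A microservice publishing an event to the shared log."""
    topic.append(event)

class RollupConsumer:
    """Aggregates events by key, like a streaming rollup ingestion."""
    def __init__(self):
        self.counts = defaultdict(int)

    def poll(self):
        # Drain available events and update the live aggregate.
        while topic:
            event = topic.popleft()
            self.counts[event["page"]] += 1

# Two "microservices" emit page-view events.
produce({"page": "/home"})
produce({"page": "/checkout"})
produce({"page": "/home"})

consumer = RollupConsumer()
consumer.poll()
print(dict(consumer.counts))  # → {'/home': 2, '/checkout': 1}
```

In the real architecture, Kafka adds durability, partitioning and replay, and Druid answers ad hoc analytical queries over the ingested events at interactive speed.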
Amy Feldman, Head of APM Product Marketing, and Nishant Kabra, Product Management | Jun 25, 2019, 4:00 pm UTC | 75 mins
With increased reliance on applications, it’s more important than ever that IT organizations monitor and manage the end-user application experience across physical, virtual, web, mobile, wearables, cloud, containers, mainframe, user journey, behavior, and funnel. However, the ability to predict, identify, diagnose, and fix application issues is difficult, especially in today’s modern app environments that are complex, noisy, and dynamically changing. For organizations seeking to improve customer experience, reduce application issues, and gain insights into user behavior, analytics and machine learning become critical capabilities for any APM solution. As a result, modern APM solutions have evolved to include AIOps platforms that correlate and analyze data across users, applications, infrastructure, and network services, applying machine learning, advanced analytics, and automation to deliver a new level of visibility with automated actionable intelligence. Attend this session to learn more about the exciting innovations delivered in DX APM 11.
David Linthicum, Carsten Puls, Simon Gibson | Jun 25, 2019, 4:00 pm UTC | 58 mins
Led by GigaOm Analyst David Linthicum, and co-presented with Simon Gibson and special guest, Carsten Puls, Senior Director, Frame at Nutanix, this free 1-hour GigaOm Research webinar focuses on the value of desktop-as-a-service platforms to simplify multicloud and remove complexities. As the world of IT, including cloud, becomes more complex, private and public clouds are increasingly expected to replace traditional computing and eventually make things easier. However, for most enterprises, the cloud is now an additive to existing traditional systems, and that adds complexity and reduces business value.
Interfaces, or the ways that users interact with the infrastructure, are key to removing this complexity. This is the approach where user interfaces are virtualized, in short, abstracted from the complexity of the underlying environment, including clouds. A world where IT is simpler, and thus more productive for the business. So where to start? Begin by addressing complexity head-on. In this 1-hour webinar, you will discover:
• The concept of desktop-as-a-service including how it works, what it is, and where to get it.
• How things have become complex, and how to measure complexity for your enterprise.
• How to understand the business value of dealing with complexity.
• Steps toward mapping out a course of action.
Michael O'Connell, Chief Analytics Officer, TIBCO Software, Inc., and Steven Hillion, Senior Director, Data Science, TIBCO Software, Inc. | Jun 25, 2019, 5:00 pm UTC | 60 mins
Artificial intelligence is no longer in the future. It’s right here, right now—and it’s changing our lives. In this latest Data Science Central webinar, we’ll focus on the growing influence of anomaly detection on the Internet of Things (IoT), fintech, and healthcare.
You will learn how to:
Detect anomalies in IoT applications using TIBCO® Data Science with deep learning libraries (e.g. H2O, Python, TensorFlow, Amazon SageMaker)
Use TIBCO® Data Science models on the AWS Marketplace
Deploy models into operations for real-time monitoring and surveillance
Optimize your business and experience explosive growth with real-time anomaly detection.
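Real-time anomaly detection can be as simple as flagging values that deviate sharply from recent history. The sketch below uses a rolling z-score, a deliberately simple illustrative stand-in for the deep-learning detectors (H2O, TensorFlow, etc.) the webinar covers; thresholds and window sizes here are arbitrary assumptions:

```python
# Minimal real-time anomaly detection sketch using a rolling z-score.
# Illustrative only; production systems would use richer models.
from collections import deque
from statistics import mean, stdev

class RollingZScoreDetector:
    def __init__(self, window=20, threshold=3.0):
        self.history = deque(maxlen=window)  # recent observations
        self.threshold = threshold           # z-score cutoff

    def observe(self, value):
        """Return True if value is anomalous vs. recent history."""
        anomalous = False
        if len(self.history) >= 5:  # need enough history to judge
            m, s = mean(self.history), stdev(self.history)
            if s > 0 and abs(value - m) / s > self.threshold:
                anomalous = True
        self.history.append(value)
        return anomalous

detector = RollingZScoreDetector()
readings = [10.0, 10.2, 9.9, 10.1, 10.0, 10.3, 9.8, 50.0]
flags = [detector.observe(r) for r in readings]
print(flags)  # only the final spike (50.0) is flagged
```

The same observe-and-score loop is what an IoT, fintech or healthcare monitoring pipeline runs continuously over streaming sensor or transaction data.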
Michael O'Connell, Chief Analytics Officer - TIBCO Software, Inc.
Steven Hillion, Senior Director, Data Science - TIBCO Software, Inc.
Rafael Knuth, Contributing Editor - Data Science Central
Nelson Petracek, Chief Technology Officer, TIBCO Software Inc. | Jun 25, 2019, 6:00 pm UTC | 64 mins
A Smarter Way to Drive Efficiency in the Semiconductor and Electronic Supply Chains
For years, manufacturers have been trying to streamline processes and associated supply chains by building marketplaces, standardized exchanges, and monolithic systems. In many cases, these solutions missed the mark, resulting in fragmented and brittle processes without easy data sharing.
Blockchain, the underlying technology behind Bitcoin and other cryptocurrencies, can potentially solve these problems. Sharing information via a trusted distributed network with embedded business logic offers many benefits for manufacturing, especially when combined with the IoT and AI communities.
Watch this webinar: Blockchain & Manufacturing: A Smarter Way to Drive Efficiency in the Semiconductor and Electronics Supply Chains, with speaker Nelson Petracek, chief technology officer, TIBCO Software.
With over 20 years’ experience, Nelson works to deliver solutions for the next stage of digital business, drawing upon his deep knowledge of cloud, blockchain, low-code applications, microservices, and event-driven applications.
Jon Collins, Baruch Sadogursky | Jun 25, 2019, 6:00 pm UTC | 59 mins
This free 1-hour webinar from GigaOm Research brings together experts in DevOps, featuring GigaOm analyst Jon Collins and a special guest from JFrog, Baruch Sadogursky, Head of Developer Relations, discussing how to set a pragmatic strategy for scaling DevOps that enables your organization to increase efficiency and productivity, without hampering delivery. Explore how to bring under management the assets — code and binaries — used in the development pipeline, building a competency that delivers and distributes right to the edge.
If you’ve embraced DevOps as a way of delivering better software, faster and cheaper into deployment and management, you may be looking to move from a single success story to broader use, or perhaps you are ready to take its principles to a whole new area of development and operations, such as sensor-based IoT. In each case, the challenge has become clear: how can the enterprise align different development groups without becoming prescriptive or stifling innovation?
In this 1-hour webinar, you will discover:
- What challenges enterprises face as they look to scale their use of DevOps across the business and out to the edge
- What stages to work through to ensure that decision-makers balance the needs of efficiency and productivity with standardization and governance
- Where tools and technologies provide support at each stage, and how roles, responsibilities, and processes evolve along the way
- Where you can start delivering higher levels of efficiency without impacting productivity or creating new overheads
So, if you are looking to broaden your use of DevOps and want to know where to start, or if you are already on the journey and dealing with the challenges, attend this webinar and bring your questions with you.
Terry Simpson, Technical Evangelist - Winslow Morgan Hodge, Sr. Acct. Mgr. - Dan Burke - Sales Engineer - NintexJun 25 20196:00 pmUTC54 mins
Workflow automation software is a must have for today’s enterprises. Not only do automated platforms give businesses a competitive advantage, but they also help mitigate manual, outdated practices that cause operational bottlenecks threatening everything from revenue to customer satisfaction.
Unlike other solutions, Nintex makes it fast and easy to manage, automate, and optimize your business processes. Its unique drag-and-drop designer and multiple out-of-the-box connectors make it simple for even non-technical personnel to create sophisticated workflow solutions that integrate with your everyday business applications.
Watch Nintex experts Terry Simpson and Winslow Moran-Hodge guide you through our webinar, “Process Automation Made Easy,” taking a closer look at how businesses can identify opportunities for process improvement – and why intelligent process automation is the key to addressing those improvements. In the webinar, you’ll uncover:
* How our customers use the Nintex Platform to manage, automate and optimize their business processes.
* Insights into the strength and simplicity of the Nintex Platform.
* Use cases for Nintex Platform features including task forms, advanced logic and anonymous forms.
Amy DeMartine, Forrester Principal Analyst and Utsav Sanghani, Senior Product Manager, SynopsysJun 25 20196:00 pmUTC50 mins
Application vulnerabilities are a prime target for attackers, and the critical task of identifying and remediating these flaws before they’re exploited can be daunting, especially for organizations adopting DevOps and CI/CD practices. Security teams don’t have the time or resources to find and fix every vulnerability, and developers prefer to do what they do best – build and deploy features quickly. Fortunately, developers can be good at their jobs and be your most effective application security resources if you enable them with the low-friction tools and training at the precise time they need them.
Join guest speaker Amy DeMartine, principal analyst at Forrester Research, and Utsav Sanghani, senior product manager at Synopsys, as they explore tools and techniques that can transform your developers into AppSec rock stars:
- How rapid, continuous in-IDE security testing helps your developers find and fix issues before they are ever committed to your codebase.
- How to deliver short, contextualized AppSec training modules to developers in real time, at the moment they introduce vulnerabilities.
- How to help your developers identify and avoid risky open source components, since most modern applications contain more open source code than proprietary code.
Robin Moffatt, Developer Advocate, ConfluentJun 25 201910:00 pmUTC56 mins
Companies new and old are all recognizing the importance of a low-latency, scalable, fault-tolerant data backbone, in the form of the Apache Kafka® streaming platform. With Apache Kafka, developers can integrate multiple sources and systems, which enables low latency analytics, event-driven architectures and the population of multiple downstream systems.
In this talk, we’ll look at one of the most common integration requirements – connecting databases to Apache Kafka. We’ll consider the concept that all data is a stream of events, including that residing within a database. We’ll look at why we’d want to stream data from a database, including driving applications in Apache Kafka from events upstream. We’ll discuss the different methods for connecting databases to Apache Kafka, and the pros and cons of each. Techniques including Change-Data-Capture (CDC) and Apache Kafka Connect will be covered, as well as an exploration of the power of KSQL, streaming SQL for Apache Kafka, for performing transformations such as joins on the inbound data.
Register now to learn:
•Why databases are just a materialized view of a stream of events
•The best ways to integrate databases with Apache Kafka
•Anti-patterns to be aware of
•The power of KSQL for transforming streams of data in Apache Kafka
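The claim that a database table is just a materialized view of a stream of events can be sketched in a few lines of plain Python. This is an illustration of the concept behind CDC into Kafka, not Kafka or KSQL code, and the event format shown is hypothetical:

```python
# Illustrative sketch (not Kafka code): a table as the materialized view
# of a stream of change events, the core idea behind change-data-capture.

def materialize(events):
    """Fold a stream of change events into the table state they imply."""
    table = {}
    for op, key, value in events:
        if op in ("insert", "update"):
            table[key] = value        # upsert: latest event per key wins
        elif op == "delete":
            table.pop(key, None)      # tombstone removes the row
    return table

# A CDC feed from a hypothetical 'customers' table as (operation, key, row):
change_stream = [
    ("insert", 1, {"name": "Alice", "city": "Oslo"}),
    ("insert", 2, {"name": "Bob", "city": "Lima"}),
    ("update", 1, {"name": "Alice", "city": "Paris"}),
    ("delete", 2, None),
]

print(materialize(change_stream))
# The final table holds only Alice's latest row.
```

Replaying the same event stream always yields the same table, which is why a database can be treated as just one possible projection of its change log.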
Vladimir Stanishevskiy, Technical Consultant, Micro Focus RussiaJun 26 20198:00 amUTC60 mins
Micro Focus SecureMail is an email encryption solution that preserves the convenience of communication. It is compatible with mobile platforms and provides mail archiving capabilities for internal control, investigations, and compliance with corporate correspondence retention requirements.
Automation is changing the way we work and live. From self-driving cars to AI-driven virtual assistants, everyone is vying for the latest in automation technology.
The immense pressure on digitized business operations has companies searching for process automation that improves accuracy and speed while also raising customer satisfaction.
In this webinar you will see three process automation use cases that free staff to work on more productive projects, reduce human error, and respond to incidents and requests faster and more efficiently.
Tom Finch, Solutions Architect EMEA, Chef & Ben Riley, Technical Director UK, SWEAGLEJun 26 201911:30 amUTC45 mins
Why Watch this?
Around 22% of production incidents are caused by the use of incorrect configuration data. SWEAGLE prevents this from happening.
Join this webinar to hear how SWEAGLE works with deployment tools to avoid these production incidents occurring. This webinar focuses on CHEF but it equally applies to all deployment tools.
Your world is already complex, delivering excellent software is critical, and time is short. Add a constantly changing ecosystem of complex, interrelated applications that need to be more resilient and secure than ever, and the mission to deploy world-class software that delivers an amazing customer experience becomes imperative.
In this webinar, see how Chef Habitat & SWEAGLE combine to take your enterprise applications to the next level in their deployment, scalability, reliability & security.
- Understand how Chef Habitat is able to automate its deployment quality using SWEAGLE
- See SWEAGLE consume, structure, validate and provide Habitat environment configData
- See how rigorous role-based access doesn’t slow your deployments down but keeps your data secure
- Watch Habitat plan, build & execute your estates build process and utilise SWEAGLE in the process
Take your software journey to the next level.
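The idea of validating configuration data before it ever reaches production can be sketched in a few lines. This is a minimal illustration of the concept a tool like SWEAGLE automates; the rule format and config keys below are hypothetical, not SWEAGLE's actual API:

```python
# Sketch of pre-deployment configuration validation: run every config
# value through a set of rules and block the deploy on any violation.
# Rules and keys are hypothetical illustrations.

def validate_config(config, rules):
    """Return a list of violations; an empty list means safe to deploy."""
    errors = []
    for key, check, message in rules:
        if key not in config:
            errors.append(f"{key}: missing ({message})")
        elif not check(config[key]):
            errors.append(f"{key}: invalid value {config[key]!r} ({message})")
    return errors

rules = [
    ("db_pool_size", lambda v: isinstance(v, int) and 1 <= v <= 100, "must be 1-100"),
    ("tls_enabled",  lambda v: v is True, "must be enabled in production"),
    ("endpoint",     lambda v: str(v).startswith("https://"), "https only"),
]

bad_config = {"db_pool_size": 0, "tls_enabled": False, "endpoint": "http://api"}
for problem in validate_config(bad_config, rules):
    print(problem)   # one violation per rule
```

Wiring a check like this into the deployment pipeline is what turns "incorrect configuration data" from a production incident into a failed build.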
Who should attend:
Technology managers & Leaders
David Linthicum, Enrico Signoretti, Jim Donovan, Nathan GouldingJun 26 20192:00 pmUTC57 mins
Data is now created and consumed from more sources than ever, and IoT and edge computing are accelerating the trend. The challenge is no longer how to make cloud computing central to the IT strategy; it is now about avoiding lock-in and keeping costs at bay while giving access to an increasing number of applications, devices and users, all dispersed across the globe on different clouds and networks.
As the great migration to cloud is fully underway, organizations are now moving from cloud-first strategies to multi-cloud, and they seek solutions allowing access and the ability to process data quickly at reasonable costs, no matter where data is created or consumed.
The goals are to build a tailored cloud made of best-of-breed “Cloud 2.0” solutions while removing any form of lock-in, and to outperform in terms of performance and cost what is currently available from established market vendors. But how can you build a new cloud model such as this one? There’s a host of disruptive Cloud 2.0 companies that provide Alternative Cloud Strategies built on Best-of-Breed Solutions, including Wasabi and Packet, our featured guests in this webinar.
This one-hour GigaOm webinar is moderated by GigaOm analyst David Linthicum and co-presented by Enrico Signoretti, joined by special guests Jim Donovan, SVP of Product at Wasabi, and Nathan Goulding, SVP of Engineering at Packet. We'll discuss how to overcome the limits and lock-ins imposed by traditional approaches, and how to lay the next-gen infrastructure for today's and tomorrow's applications and data. We will look specifically at cloud strategies and how to execute them.
Dr. Shawn Andrews joins us to discuss gender equality and whether gender parity is attainable.
Dr. Shawn Andrews is a keynote speaker, organizational consultant, business school professor, and best-selling author. She has research expertise in the areas of Leadership, Emotional Intelligence, Gender, Unconscious Bias, and Diversity & Inclusion. She addresses the current leadership gender gap, barriers to leadership, gender-specific emotional intelligence, and gender culture differences. She explains how each of these show up every day in the workplace, how they affect perception and promotion, and provides recommendations for individuals to improve career advancement, and for organizations to enhance diversity and inclusion.
Andrew Brust, Mathias Golombek, Helena SchwenkJun 26 20194:00 pmUTC61 mins
The cloud will inevitably be a component of your customers’ and prospects’ data strategies, not to mention your own. But how does this impact analytics and data science? That question is especially important since the journey to the cloud has many stops, and most companies won’t move all their data to the cloud immediately.
It’s a heterogeneous, hybrid world out there: some data must stay on-premises, while some data is born in the cloud and should stay there. Other data allows discretion around migration and can be left in place. But when it comes to both analytics and data science, the work should encompass all data. Yet, how is that achieved with a single platform, when data isn’t centralized?
The good news is that solutions and architectures exist to achieve this mission. To learn more about applying analytics and harnessing data science across all data, in the cloud, on-premises or otherwise, plan to attend this free 1-hour webinar from GigaOm Research. The webinar features GigaOm analyst Andrew Brust and special guests Mathias Golombek, CTO, and Helena Schwenk, AR and Market Insights Manager, from Exasol, a leader in in-memory analytic databases.
In this 1-hour webinar, we will explore:
- Why the cloud together with analytics and data science are such a good match
- What hybrid cloud is, what its practical implications are, and how to make it work
- Considerations for cloud analytics performance and scale
- Using the cloud for AI and data science
Digital Transformation is dead! As you plan your journey, one of the first steps is to think about what needs to be done, why and how. The webinar presents 3 ways to navigate your way without losing track of your goal(s).
Each path has its own set of focus areas. Irrespective of the approach you take, there are six dimensions of transformation that you must plan around to be successful.
Kathleen Randall, EVP North America CISSP, CISA, GSNA and Ben Dalton, Senior Technical ConsultantJun 26 20194:00 pmUTC60 mins
In the past, healthcare organizations have paid lip service to HIPAA’s privacy requirements for third-party vendors, or “business associates.” As data breaches and malware continue to cripple healthcare institutions, many are realizing that their weakest links may be out of their control and in their vendors’ hands. How can you be assured that your ePHI data is in safe hands? By asking the right questions. Industries like finance, insurance, and energy have managed third-party risk programs for years. In this webinar, learn how industry best practices can be incorporated not only to meet HIPAA requirements but also to gain the visibility to manage external risk to your healthcare organization.
The session will cover:
• Managing the risk of your Business Associates (and the risk of their Business Associates)
• Case Studies: Vendors as a common source of breaches
• What you should be doing before you sign off on the Business Associate Agreement
• Work smarter, not harder: How to drive a cost-effective and OCR-compliant process.
View the future of PLM today! See a live and interactive demonstration of Propel’s game-changing cloud PLM solution. Join us on Wednesday, June 26, to see why Propel is the fastest-growing PLM solution in the market.
Pubudu Gunatilaka, Associate Technical Lead and Andrea Perera, Trainee Software Engineer, WSO2Jun 26 20196:00 pmUTC64 mins
Kubernetes is a leading open source container orchestration solution for managing containerized applications across multiple hosts. It allows users to easily deploy, maintain, and scale applications in containers. WSO2 API Manager is a fully open source solution for managing all aspects of the API lifecycle and is designed for massively scalable deployments.
In this webinar, we will explore a scalable deployment of WSO2 API Manager with API analytics on Kubernetes.
We will discuss:
* How to deploy WSO2 API Manager with Analytics in Google Kubernetes Engine (GKE)
* Autoscaling WSO2 API Manager based on the production load
* How to apply WSO2 Update Manager (WUM) updates in a production Kubernetes environment
* Best practices for deploying WSO2 API Manager in Kubernetes
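Autoscaling a deployment like this ultimately rests on the scaling rule Kubernetes' Horizontal Pod Autoscaler applies. The sketch below reproduces that documented formula in plain Python to make the behavior concrete; the example workload numbers are hypothetical:

```python
# Sketch of the Kubernetes Horizontal Pod Autoscaler scaling rule:
# desiredReplicas = ceil(currentReplicas * currentMetric / targetMetric),
# clamped to the deployment's configured replica bounds.
import math

def desired_replicas(current_replicas, current_metric, target_metric,
                     min_replicas=1, max_replicas=10):
    """Compute the replica count the HPA would request."""
    desired = math.ceil(current_replicas * current_metric / target_metric)
    return max(min_replicas, min(max_replicas, desired))

# 2 gateway pods at 90% average CPU against a 50% target -> scale to 4.
print(desired_replicas(2, 90, 50))   # 4
# Load drops to 20% average CPU -> scale back down to 2.
print(desired_replicas(4, 20, 50))   # 2
```

In a real cluster the same bounds come from the HorizontalPodAutoscaler resource's `minReplicas` and `maxReplicas` fields, and the metric is averaged across the pods behind the deployment.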
Valerie Padilla, Technology Strategist in the Server CTO, Dell EMCJun 26 20196:00 pmUTC60 mins
In this webinar, attendees will gain insight into Gen-Z technology applications and a deeper understanding of potential deployment options for their purposes. Attendees will learn how Gen-Z can help overcome current challenges within existing computer architecture and provide future opportunities for innovative, open, efficient, secure, and cost-effective solutions.
Arno Candel, CTO at H2O.aiJun 26 20196:00 pmUTC60 mins
Driverless AI is H2O.ai's latest flagship product for automatic machine learning. It fully automates some of the most challenging and productive tasks in applied data science such as feature engineering, model tuning, model ensembling and production deployment. Driverless AI turns Kaggle-winning grandmaster recipes into production-ready code (Java and C++), and is specifically designed to avoid common mistakes such as under- or overfitting, data leakage or improper model validation, which are some of the hardest challenges in data science. Other industry-leading capabilities include automatic data visualization and machine learning interpretability.
We're now excited to add the ability for users, partners and customers to extend the platform with Bring-Your-Own-Recipe. Domain experts and advanced data scientists can now write their own recipes and seamlessly extend Driverless AI with their favorite tools from the rich ecosystem of open-source data science and machine learning libraries. During this webinar we'll demonstrate how easy it is to write a new recipe for feature transformation or use a third-party algorithm to extend Driverless AI.
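To make the recipe idea concrete, here is a minimal sketch of a pluggable feature transformer in the general shape such recipes take. This is a hypothetical interface for illustration only; the actual Driverless AI recipe base classes and method signatures differ, so consult H2O.ai's recipe examples for the real API:

```python
# Hypothetical plug-in interface illustrating a "bring your own recipe"
# feature transformer; NOT the actual Driverless AI API.
import math

class Log1pRecipe:
    """Custom feature: log(1 + x) of a numeric column, a common recipe
    for taming right-skewed features."""
    name = "log1p_recipe"

    def fit_transform(self, values):
        # Fit has nothing to learn here; just apply the transform.
        return [math.log1p(v) for v in values]

# The platform would discover the class and try it during feature search:
recipe = Log1pRecipe()
features = recipe.fit_transform([0, 1, 9])
print([round(f, 4) for f in features])   # [0.0, 0.6931, 2.3026]
```

The point of the recipe mechanism is exactly this separation: the domain expert supplies a small transformer or model class, and the platform handles validation, tuning and deployment around it.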
Arno Candel is the Chief Technology Officer at H2O.ai. He is the main committer of H2O-3 and Driverless AI and has been designing and implementing high-performance machine-learning algorithms since 2012. Previously, he spent a decade in supercomputing at ETH and SLAC and collaborated with CERN on next-generation particle accelerators.
Arno holds a PhD and Masters summa cum laude in Physics from ETH Zurich, Switzerland. He was named “2014 Big Data All-Star” by Fortune Magazine and featured by ETH GLOBE in 2015. Follow him on Twitter: @ArnoCandel.
Andreas Widmann, Senior Principal Consultant, Micro Focus & Christian Schütz, Business Value Consultant, Micro FocusJun 27 20198:00 amUTC60 mins
Agile development and DevOps are a given today for companies that want to compete successfully in the market. The typical pitfalls, such as cultural change, mindset and processes, are well known and are being addressed. But is that all it takes? Certainly not. There are further challenges to master:
- When you implement Agile and DevOps at scale, i.e. at the enterprise level, it is easy to lose the overview: What is happening at the portfolio, solution and program levels? How is the release train doing? What about code commits? Can you answer these questions simply and without great effort?
- Lean and Agile also work together successfully when it comes to speed. But bottlenecks hurt overall system velocity. Have you identified and eliminated the bottlenecks across the entire process chain?
- Agile development means sprints, and sprints mean frequent releases. That raises the risk of serious security flaws in the code, because developers are not security experts. Do they have to become them?
In this webinar we will explore the answers to all of these questions together. Learn how to gain end-to-end governance and visibility into every aspect of the DevOps process, optimize throughput/velocity through holistic automation, and integrate security into the overall process, all without turning your developers into pentesters.
Join us and take the opportunity to put your questions to our experts.
We look forward to your participation.
Rohit Kelapure, Pivotal Consulting Practice LeadJun 27 20198:30 amUTC63 mins
Digital transformation includes replatforming applications to streamline release cycles, improve availability, and manage apps and services at scale. But many enterprises are afraid to take the first step because they don’t know where to start. In this webinar, Rohit will provide a step-by-step guide that covers:
● How to find high-value modernization projects within your application portfolio
● Easy tools and techniques to minimally change applications in preparation for replatforming
● How to choose the platform with the right level of abstraction for your app
● Examples that show how Java EE Websphere applications can be deployed to Pivotal Cloud Foundry
Nabil Bousselham, Solution Architect at VeracodeJun 27 20199:00 amUTC45 mins
Software containerization is helping companies completely change how applications are deployed to meet customer requirements. The technology has the potential to radically reduce the cost of ownership and puts enormous power in the hands of DevOps engineers.
These benefits also change how risk must be handled in the development cycle. Not only must the software application meet the organization's security standards inside the Docker container, but the base image must also be free of exploitable vulnerabilities.
In this webinar, I will show you the security challenges involved in using third-party open source libraries in applications and Docker containers. I will also present how Veracode can help you secure them.
Quality test data is an important part of overall test design, and it is crucial to producing realistic results. But creating test data today requires significant manual intervention, and it is a source of acute pain for development teams working in-sprint, who simply want to test code the moment it’s created. This is why development teams are turning to Agile Requirements Design (ARD).
By managing test data inside ARD’s model-based test designs, development teams can identify the right data needs and match them to each test case early in their Agile development cycle. By integrating test data generation into requirements design, applications are developed to actual requirements and function across the right set of user experiences.
Join ARD expert Alyson Henry as we discuss how to integrate different types of test data within ARD, to get the most out of your model-based testing and reduce your costly manual efforts:
• How to Overlay Data Combinations, Dependencies & Business Rules to ARD Models
• Where to Populate Test Data Inside a Flow
• How each Test Case Reacts to Rich Data Sets
• Techniques for Modeling Applications with Different Data Needs
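The core mechanic behind overlaying data combinations and business rules on a model can be sketched briefly: enumerate the combinations a model's dimensions imply, then prune the invalid ones with the rules. The dimensions and the rule below are hypothetical illustrations, not ARD's actual notation:

```python
# Sketch of model-based test data generation: enumerate every data
# combination from a model's dimensions, then prune with business rules.
# Dimensions and rules here are hypothetical illustrations.
from itertools import product

dimensions = {
    "account_type": ["guest", "member", "admin"],
    "payment":      ["card", "invoice"],
    "region":       ["EU", "US"],
}

# Business rule overlaid on the model: guests cannot pay by invoice.
def allowed(row):
    return not (row["account_type"] == "guest" and row["payment"] == "invoice")

keys = list(dimensions)
rows = [dict(zip(keys, combo)) for combo in product(*dimensions.values())]
test_data = [row for row in rows if allowed(row)]

print(len(rows), "combinations,", len(test_data), "after business rules")
```

Each surviving row becomes the data set for one test case, which is how matching data to test cases early in the cycle removes the manual effort the paragraph above describes.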
Madhukar Kumar, Vice President Technical and Product Marketing, at Redis LabsJun 27 20194:00 pmUTC43 mins
The tech industry regularly sees hype cycles rise and fall, from the advent of the dot-com era to cloud computing, big data and, more recently, artificial intelligence (AI) and blockchain. Looking back, it is clear that each of these major changes was additive or in some way related to the disruption that came before it. For example, AI would not be where it is today without big data, big data would not have been possible without the advent of cloud computing, and cloud itself would be non-existent without the world wide web boom of the ’90s. Watch this webinar to learn the disruptive trends in data for 2019.
You Will Learn:
- Top 5 emerging megatrends
- Top 5 data trends
- What a zero latency future entails
About the presenter:
Madhukar is a product strategist with a track record of successfully running product management and product marketing teams over the last 10+ years. He has held several leadership positions at technology companies small and large, including Zuora, HP and Oracle, where he was responsible for building vision, crafting go-to-market strategies and opening up new markets for hypergrowth. Most recently, he ran product strategy and product marketing at Oracle for the Customer Experience (CX) product portfolio, including Marketing, CRM, Commerce and Service. Over the last 10 years he has also written for technology journals and spoken at industry events across the globe on disruptive new technologies affecting businesses and the future of customer experience.