Jenn Gille, Kate Osborn, Amir Rawdat, Jason Williams, NGINX
When it’s time to move from an old version of a service to a new one, you don’t want to shift all your traffic at once in case there are any issues with the new version. That’s why traffic-management techniques such as traffic splitting (including canary and blue/green deployments) and circuit breaking are valuable tools for ensuring resiliency. In this live stream, our experts will:
Discuss best practices for microservices traffic management
Outline use cases for traffic splitting at the edge vs. within the cluster
Give live demos of traffic splitting with NGINX Ingress Controller and NGINX Service Mesh
Read our blog to learn more about how to make Kubernetes more resilient with advanced traffic management: http://bit.ly/resilient-K8s-blog
About NGINX Ingress Controller: https://bit.ly/3qtW6k8
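To make the canary pattern concrete, here is a minimal, illustrative sketch (not taken from the webinar) that generates an NGINX Ingress Controller VirtualServer resource with a weighted split, built in Python and dumped as YAML. The host name, service names, and the 90/10 weights are assumptions for demonstration only.

```python
# Minimal sketch: generate a VirtualServer manifest that sends 90% of traffic
# to the stable service and 10% to the canary. Names and weights are examples.
import yaml  # pip install pyyaml

def canary_virtualserver(host: str, stable_svc: str, canary_svc: str,
                         canary_weight: int = 10, port: int = 80) -> dict:
    """Build an NGINX Ingress Controller VirtualServer with a weighted split."""
    return {
        "apiVersion": "k8s.nginx.org/v1",
        "kind": "VirtualServer",
        "metadata": {"name": f"{stable_svc}-split"},
        "spec": {
            "host": host,
            "upstreams": [
                {"name": "stable", "service": stable_svc, "port": port},
                {"name": "canary", "service": canary_svc, "port": port},
            ],
            "routes": [{
                "path": "/",
                "splits": [
                    {"weight": 100 - canary_weight, "action": {"pass": "stable"}},
                    {"weight": canary_weight, "action": {"pass": "canary"}},
                ],
            }],
        },
    }

if __name__ == "__main__":
    # Apply with: python canary.py | kubectl apply -f -
    print(yaml.safe_dump(canary_virtualserver("example.com", "app-v1", "app-v2")))
```

Gradually raising the canary weight (and rolling it back if errors appear) is the kind of workflow the live demos walk through.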
App developers and DevOps teams at more than 400 million websites rely on NGINX for a wide range of app delivery functions – load balancer, API gateway, reverse proxy, and web server. But the more NGINX instances you have, the bigger the challenge in keeping track of them all. That’s where NGINX Instance Manager comes in.
Join this webinar to learn about NGINX Instance Manager, a new solution that helps you effortlessly discover, configure, and monitor NGINX Open Source and NGINX Plus instances at scale. The webinar includes a demo of NGINX Instance Manager in action.
Masood Noori, Alliances Manager, SUSE | Allen Cheon, Business Development Manager | Dylen Turnbull, Solution Architect, NGINX
With Rancher's market-leading Kubernetes management platform and NGINX’s market-leading, production-grade Ingress solution, organizations are empowered to simplify the deployment and management of Kubernetes at scale with enterprise-grade solutions while focusing on delivering value to customers. Join us for this webinar and learn how to get started using Rancher + NGINX Ingress Controller.
David Luke, Enterprise Architect at NGINX, Now Part of F5 Networks
Identity and Access Control play a vital role in your application modernisation story. In this webinar we look at why you should care about OpenID Connect and why you need to ensure that your API Management solution supports Identity Management. We will talk about why you need to manage identity and access control at the application gateway level, together with the benefits from a security, policy and governance standpoint.
The following topics will be covered in this webinar:
• Identity Management – the current landscape, open standards, and how these have evolved
• Access Management – achieving fine-grained control of API user privilege
• RBAC – Role Based Access Control of API Management resources
• Microgateways – Best practices for orchestration within containerized deployment models, including Kubernetes and Red Hat OpenShift
• Ingress Control – The relationship between the Kubernetes Ingress Controller / Red Hat OpenShift Router and API management
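As a rough illustration of the token validation an API gateway or microgateway performs under OpenID Connect, here is a minimal Python sketch using the PyJWT library. The issuer, audience, JWKS URL, and scope name are placeholder assumptions, not values from the webinar.

```python
# Minimal sketch of validating an OIDC-issued JWT access token, as a gateway
# might before forwarding a request upstream. All URLs below are placeholders.
import jwt  # pip install pyjwt[crypto]
from jwt import PyJWKClient

ISSUER = "https://idp.example.com/"                          # assumed identity provider
AUDIENCE = "https://api.example.com"                         # assumed API identifier
JWKS_URL = "https://idp.example.com/.well-known/jwks.json"   # assumed key endpoint

def validate_token(token: str) -> dict:
    """Verify signature, issuer, audience, and expiry; return the claims."""
    signing_key = PyJWKClient(JWKS_URL).get_signing_key_from_jwt(token)
    return jwt.decode(
        token,
        signing_key.key,
        algorithms=["RS256"],
        audience=AUDIENCE,
        issuer=ISSUER,
    )

def is_authorized(claims: dict, required_scope: str) -> bool:
    """Coarse access check: look for a scope in the token's 'scope' claim."""
    return required_scope in claims.get("scope", "").split()
```

Centralizing this check at the application gateway, rather than repeating it in every service, is the governance and policy benefit the session discusses.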
Mark Boddington, Enterprise Solutions Architect, NGINX
Docker and Kubernetes are considered standard components when building and orchestrating applications today. Yet you’re still responsible for managing applications built with “traditional” tools. How do you meet the demands of new, microservices applications, while still maintaining the legacy, monolith ones?
As technologists, we value innovation and discovering new tools and solutions. This webinar does just that, but first we take a look back at how we got here. Join this webinar to learn:
• The history of current trends in computing - cloud, containerization, and Kubernetes
• How load balancing and network architecture have evolved from basic DNS to complex systems that support new development styles like CI/CD, A/B testing of code releases, and self-service
• Challenges that come along with virtualization, distributed applications, and multi/hybrid cloud
• Solutions to reduce the complexity of enterprise application infrastructure, including a new platform from NGINX, now part of F5, that bridges the gap between traditional hardware and newer software solutions
Helen Beal - DevOps Institute | Rajiv Kapoor - Product Marketing, NGINX | Wendy Ng - Cloud Security Architect, OneWeb
In 2021, a web application firewall (WAF) is essential. Amid prolonged remote working and the pandemic-driven surge in the use of digital services and eCommerce solutions, organizations are facing a bigger threat from malicious website visitors than ever before. Attackers who submit malicious code instead of an email address when creating an account can bypass the web application layer and gain direct access to SQL databases - a very real threat today.
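To ground the injection scenario described above, here is a minimal, self-contained Python sketch (not from the webinar) showing how a crafted "email address" can alter a naively concatenated SQL query, and how a parameterized query avoids it. The table, data, and payload are illustrative only.

```python
# Minimal sketch of the SQL injection pattern described above, using an
# in-memory SQLite database. The table and payload are illustrative only.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (email TEXT, is_admin INTEGER)")
conn.execute("INSERT INTO users VALUES ('alice@example.com', 1)")

# A "signup form" value crafted by an attacker instead of a real email address.
payload = "x' OR '1'='1"

# Vulnerable: the input is concatenated straight into the SQL statement,
# so the OR clause makes the WHERE condition match every row.
unsafe = conn.execute(
    "SELECT email FROM users WHERE email = '" + payload + "'"
).fetchall()
print("unsafe query returned:", unsafe)   # leaks alice@example.com

# Safer: a parameterized query treats the whole payload as a literal value.
safe = conn.execute(
    "SELECT email FROM users WHERE email = ?", (payload,)
).fetchall()
print("parameterized query returned:", safe)  # returns nothing
```

Parameterized queries are the application-level fix; the discussion below turns to the protection a WAF adds in front of the application.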
The risk to businesses is enormous - no organization can afford to lose customer trust and data in a crowded, competitive marketplace. But there is a solution: the web application firewall (WAF). WAFs safeguard company data by protecting servers from hackers. All incoming requests are inspected, bad traffic is filtered, and attacks are prevented. An integral component of application protection, WAFs are also a requirement for complying with PCI DSS and help protect against the OWASP Top 10.
However, not all WAFs are created equal. In 2021, it’s crucial to have a modern and lightweight WAF that maps well to business requirements, drives superior customer experience, and ultimately helps organizations outpace their competitors. In episode 2 of Fuel Business Growth and Innovation with New Application Delivery Strategies, we’re sharing just what the modern WAF can do.
Join us to hear:
- How modern WAFs address some of the most difficult challenges facing modern DevOps environments, including integrating security controls directly into the development automation pipeline
- Best practices for applying and managing security for modern and distributed application environments such as containers and microservices
- Why best-in-class WAF strategies support container-based environments and complement components like API gateways
- And more
Liam Crilly, Director of Product Management, F5 | Timo Stark, Professional Services Engineer, F5
You may have heard about NGINX Unit as a polyglot application server for running your application code. But there is a lot more to discover. In this session we deep dive into the network layer of NGINX Unit, configuring routes with matching patterns to proxy traffic to our NGINX Unit-controlled applications. We show you tips and tricks for using NGINX Unit in your day-to-day engineering work by means of APIs written in Python and a frontend application created with ReactJS. After this session you will be able to use NGINX Unit as your development control plane to easily run and proxy traffic to different applications without the operational overhead of multiple containers and a separate NGINX proxy.
Agenda
Dive deep into the network layer of NGINX Unit.
Configure routes with matching patterns to proxy traffic to our NGINX Unit-controlled applications.
Share tips and tricks for using NGINX Unit in your day-to-day engineering work, with APIs written in Python and a frontend application created with ReactJS.
Speakers:
Liam Crilly — Director of Product Management, F5
Timo Stark — Professional Services Engineer, F5
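As a flavor of the route matching covered in the session, here is a minimal Python sketch that pushes a listener, routes, and a Python application definition to a running NGINX Unit instance over its control API. It assumes the control API has been exposed on 127.0.0.1:8080 (by default Unit listens on a Unix socket instead), and the application path, module name, and React dev-server port are illustrative assumptions.

```python
# Minimal sketch: configure NGINX Unit routes over its control API.
# Assumes the control API is reachable at http://127.0.0.1:8080 (by default
# Unit exposes a Unix socket instead); paths and ports are examples only.
import json
import urllib.request

CONTROL_API = "http://127.0.0.1:8080/config"

config = {
    "listeners": {
        "*:8300": {"pass": "routes"}            # all app traffic enters here
    },
    "routes": [
        {   # API requests go to the Unit-managed Python application.
            "match": {"uri": "/api/*"},
            "action": {"pass": "applications/python-api"},
        },
        {   # Everything else is proxied to the React dev server.
            "action": {"proxy": "http://127.0.0.1:3000"},
        },
    ],
    "applications": {
        "python-api": {
            "type": "python",
            "path": "/srv/api",   # assumed project directory
            "module": "wsgi",     # assumed WSGI module (wsgi.py with "application")
        }
    },
}

req = urllib.request.Request(
    CONTROL_API,
    data=json.dumps(config).encode(),
    method="PUT",
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req) as resp:
    print(resp.read().decode())  # Unit replies with a success or error message
```

Because the configuration is just JSON pushed to an API, the same pattern works as a lightweight development control plane without extra containers or a separate proxy, which is the workflow the session demonstrates.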
Despite powering some of the most popular apps on the planet, microservices – including containers and Kubernetes – are still a mystery to many. Microservices is both an approach to software architecture that builds large, complex apps from multiple small components and the term for the small components themselves. In this “Microservices 101” webinar, you’ll get an introduction to microservices that will give you a working understanding of the technologies:
- Monolithic, microservices, and hybrid architectures
- Containers and Kubernetes
- Ingress controllers and service meshes
Jenn Gille, Amir Rawdat, Jason Williams, NGINX | Ward Bekker, Grafana
Kubernetes is complex, and the only way to begin to get control is with robust visibility and monitoring. In this webinar, you’ll see our microservices experts demonstrate how to improve visibility in Kubernetes by:
- Leveraging the NGINX dashboard for live monitoring of key load-balancing and performance metrics
- Exporting the metrics to Prometheus
- Creating Grafana dashboards for a view of cumulative performance.
Read our blog for more details on how to improve insight in Kubernetes: http://bit.ly/visibility-blog
Learn more about NGINX Controller: https://bit.ly/3qtW6k8
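To make the Prometheus step more tangible, here is a minimal Python sketch (not the official NGINX Prometheus exporter) that scrapes NGINX's stub_status page and republishes the connection counters for Prometheus to scrape. The stub_status URL and the exporter port are assumptions for illustration.

```python
# Minimal sketch of exposing NGINX connection metrics to Prometheus.
# It polls the stub_status endpoint (assumed at http://localhost/nginx_status)
# and serves the values on :9000/metrics for Prometheus to scrape.
import re
import time
import urllib.request

from prometheus_client import Gauge, start_http_server  # pip install prometheus-client

STATUS_URL = "http://localhost/nginx_status"   # assumed stub_status location

active = Gauge("nginx_connections_active", "Active client connections")
reading = Gauge("nginx_connections_reading", "Connections reading requests")
writing = Gauge("nginx_connections_writing", "Connections writing responses")
waiting = Gauge("nginx_connections_waiting", "Idle keepalive connections")

def scrape() -> None:
    """Parse the plain-text stub_status page and update the gauges."""
    text = urllib.request.urlopen(STATUS_URL).read().decode()
    active.set(int(re.search(r"Active connections:\s+(\d+)", text).group(1)))
    r, w, q = re.search(
        r"Reading:\s+(\d+)\s+Writing:\s+(\d+)\s+Waiting:\s+(\d+)", text
    ).groups()
    reading.set(int(r))
    writing.set(int(w))
    waiting.set(int(q))

if __name__ == "__main__":
    start_http_server(9000)      # Prometheus scrapes http://<host>:9000/metrics
    while True:
        scrape()
        time.sleep(15)
```

A Grafana dashboard can then graph these series alongside the richer metrics shown on the NGINX dashboard in the demo.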
In this webinar we will focus on NGINX Controller and automation. Our challenge will be deploying and provisioning infrastructure to support API management and security. Our solution: using tools like Terraform and Ansible, we can easily deploy API management infrastructure to multiple clouds and/or on-premises environments, resulting in a consistent, distributed multi-cloud solution for API management.
Rajesh Bavanantham & Mike Holland, Solution Architect, NGINX
In this webinar we will focus on NGINX Controller and security. We will show you how to effectively secure your new and existing API deployments. We will also move security policies closer to the API's microservices workflow, allowing technology-specific signature sets to protect your APIs while reducing false positives.
David Luke, Enterprise Architect at NGINX, Now Part of F5 Networks
Identity and Access Control play a vital role in your application modernisation story. In this webinar we look at why you should care about OpenID Connect and why you need to ensure that your API Management solution supports Identity Management. We will talk about why you need to manage identity and access control at the application gateway level, together with the benefits from a security, policy and governance standpoint.
The following topics will be covered in this webinar:
• Identity Management – the current landscape, open standards, and how these have evolved
• Access Management – achieving fine-grained control of API user privilege
• RBAC – Role Based Access Control of API Management resources
• Microgateways – Best practices for orchestration within containerized deployment models, including Kubernetes and Red Hat OpenShift
• Ingress Control – The relationship between the Kubernetes Ingress Controller / Red Hat OpenShift Router and API management
Rajesh Bavanantham & Mike Holland, Solution Architect, NGINX
In this webinar we will focus on NGINX Controller and configuration. Our challenge will be integrating an APIM solution into CI/CD pipelines for easy API updates. The solution we will present integrates an API management solution into a CI/CD pipeline for seamless API deployment.
The Kubernetes Ingress Controller is both widely used and widely misunderstood. In this webinar David Luke will cover all aspects of this software infrastructure component, including the benefits and the potential downsides. We will dig deep into questions such as ‘Is the ingress controller mandatory?’ and ‘What are the similarities between the Ingress Controller and an API gateway?’
There is also a fascinating history as to how this component came to be, and where it fits into the Kubernetes Open Source Project. Key architectural and purchasing decisions with important implications going forward are regularly made regarding the Ingress Controller, sometimes without all the necessary information. This webinar aims to leave all visitors with key insights into making the right design decisions.
Helen Beal - DevOps Institute | Jenn Gile - NGINX | Fintan Wilson - African Bank | Yair Cohen - Datadog
In 2021, Digital Transformation has become necessary for businesses both to survive and to unlock future agile operations, efficiencies, and revenue streams. Microservices sit increasingly at the heart of Digital Transformation projects: the sixth NGINX survey of its open-source community, conducted in 2020, found that the proportion of applications built with microservices jumped 20 percentage points in the past year, from 40% to 60%.
In episode 1 of Fuel Business Growth and Innovation with New Application Delivery Strategies, NGINX and guest experts are taking a look at what business demands are driving this spike in microservices adoption and what organizational benefits are leading to Gartner predicting that by 2022, 75% of global organisations will be running containerised applications in production environments.
Join us to discover:
- What benefits - including scalability, flexibility and faster development - are leading to businesses increasingly adopting microservices, containerised applications and Kubernetes
- How microservices can vastly improve user experience - a critical success factor for all organizations
- Advice and words of wisdom for overcoming microservices management and application development challenges - including scale, speed, quality, tooling and language selection
- And more
Hosted by Helen Beal - Chief Ambassador at DevOps Institute.
Guests:
Jenn Gile - Manager, Product Marketing at NGINX
Yair Cohen - Product Manager at Datadog
Fintan Wilson - Technical Architect at African Bank
This fourth and final part of the “Show Code” series will focus on how to secure API workloads that run on Kubernetes using ingress as an edge gateway or as a micro gateway for your LOB/BU.
To run the NGINX Kubernetes Ingress Controller at the edge:
https://github.com/b-rajesh/nginx-plus-ingress-kube-terraform
To run multiple NGINX Kubernetes Ingress Controllers at the edge and per namespace:
https://github.com/b-rajesh/multi-nginx-plus-ingress-terraform
Mike Amundsen, Author, Speaker, Trainer; Kevin Jones, Senior Product Manager at NGINX; Mehdi Medjaoui, founder of APIdays
In a world increasingly connected by APIs, applications rely on each other to operate. With the continuous demand to always go faster, businesses and customers expect everything to be real-time. But in this highly connected world, a “slow” API in the digital supply chain may make the whole internal production or business workflow fail, or not deliver the expected customer experience. How can we build reliable and fast real-time architecture? What monitoring best practices do you need to apply to achieve speed and guarantee safety at the same time? What paradigm do you need to understand to transition to an event-driven architecture?
In our next API scene webinar, we will discuss why API performance is critical and needs to be addressed on both the technical and business sides, how we can address it internally, and what we need to build to make it real.
This webinar will feature Mike Amundsen, author of many books on APIs, including API Traffic Management (O’Reilly), who will present his best practices and recommendations for managing the traffic of your APIs to ensure speed and safety at scale.
We will also be joined by Kevin Jones, Senior Product Manager at NGINX, an expert on delivering fast and reliable architecture with the NGINX API gateway. He will share how to build real-time API infrastructure with NGINX. The webinar will be moderated by Mehdi Medjaoui, founder of the APIdays conferences and co-author of Continuous API Management (O’Reilly).
This fourth and final part of the “Show Code” series will focus on how to secure API workloads that run on Kubernetes using ingress as an edge gateway or as a micro gateway for your LOB/BU.
To run the NGINX Kubernetes Ingress Controller at the edge:
https://github.com/b-rajesh/nginx-plus-ingress-kube-terraform
To run multiple NGINX Kubernetes Ingress Controllers at the edge and per namespace:
https://github.com/b-rajesh/multi-nginx-plus-ingress-terraform
NGINX helps companies deliver their sites and applications with performance, reliability, security, and scale. NGINX offers an award-winning, comprehensive application delivery platform in use on more than 300 million sites worldwide. Watch this webinar to learn how to ensure flawless digital experiences through features such as advanced load balancing, web and mobile acceleration, security controls, application monitoring, and management.
Securing API Workloads Using IDPs - Rajesh Bavanantham, Solutions Architect, NGINX