The New Cyber: Faster, Better, Cheaper

Government agencies need to augment traditional SIEM systems, and to do so in a way that is better, faster and cheaper than before. This need for SIEM augmentation is driven by three factors: threat, scale, and cost. In addition, most agencies now realise that they want to be more independent of cloud and SIEM vendors, so having a way to bring on new analytic destinations, including modern SIEMs, is an emerging requirement.



Confluent enables you to bridge the gap between old-school SIEM solutions and next-gen offerings by consolidating, categorising and enriching the event logs, network data and application logs from all relevant data sources for real-time monitoring and security forensics.



Join the Confluent team for this online talk in which you will learn:

1. The top enterprise SIEM challenges facing government agencies
2. The value of augmenting your SIEM with Confluent
3. An overview of the Confluent solution and how it works
4. Customer use cases and examples
5. An integration example with Splunk
6. The role Kafka plays in all of this
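The consolidate-categorise-enrich pattern described above can be sketched in a few lines. The following is a minimal, broker-free illustration of the idea; the topic names, field names and threat-intelligence table are all hypothetical, and a real deployment would run this logic in Kafka Streams or ksqlDB rather than plain Python.

```python
# Minimal sketch of the SIEM enrichment pattern: categorise raw security
# events and enrich them from a threat-intelligence lookup before they
# reach an analytic destination (e.g. a Splunk index). All names here are
# illustrative, not part of any Confluent product API.

THREAT_INTEL = {  # hypothetical lookup, e.g. backed by a compacted Kafka topic
    "203.0.113.7": {"category": "known-scanner", "severity": "high"},
}

def enrich(event: dict) -> dict:
    """Tag an event with threat intel; unknown sources pass through as benign."""
    intel = THREAT_INTEL.get(event.get("src_ip"),
                             {"category": "benign", "severity": "low"})
    return {**event, **intel}

def route(event: dict) -> str:
    """Send high-severity events to the SIEM, everything else to cheap storage."""
    return "siem-alerts" if event["severity"] == "high" else "cold-archive"

raw = [
    {"src_ip": "203.0.113.7", "action": "login-fail"},
    {"src_ip": "198.51.100.2", "action": "login-ok"},
]
for e in map(enrich, raw):
    print(route(e), e["category"])
```

In a real pipeline the lookup table would typically be a compacted topic joined in-stream, and the routing decision determines which destination (premium SIEM index or cheap archive) receives each event, which is where the cost savings come from.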
Recorded: Feb 4 2021, 30 mins
Presented by
Johnny Varley, Central Government Solutions Engineer at Confluent
  • Inside IIoT and Manufacturing: Event Streaming Use Cases in Manufacturing May 26 2021 11:00 am UTC 48 mins
    Kai Waehner, Global Technology Advisor
    The manufacturing industry must process billions of events per day in real time and ensure consistent, reliable data processing and correlation across machines, sensors and standard software such as MES, ERP, PLM and CRM. Deployments must run in hybrid architectures, in factories and across the globe in cloud infrastructures. Mission-critical, secure 24/7 operation, 365 days a year, is the norm and a key requirement.

    Join us to learn how event streaming with Apache Kafka, Confluent Platform and Confluent Cloud provides a scalable, reliable, and efficient infrastructure to make manufacturing companies more innovative and successful in automotive, aerospace, semiconductors, chemical, food, and other industries.



    The session will discuss use cases and architectures for various scenarios, including:
    - 10,000 Feet View – Event Streaming for Industry 4.0
    - Track&Trace / Production Control / Plant Logistics
    - Quality Assurance / Yield Management & Predictive Maintenance
    - Supply Chain Management
    - Cybersecurity
    - Servitization using Digital Twins
    - Additive Manufacturing
    - Augmented Reality
  • Modernising Change For Speed and Scale with Confluent and Kong May 25 2021 12:00 am UTC 45 mins
    Goran Stankovski
    Delivering new products and services at speed and scale continues to present the greatest opportunity, and the greatest challenge, to most organisations, especially as business and consumer expectations continue to grow exponentially.

    Modern applications and platforms espouse the benefits of simplification, scale, and improved cost efficiency, but organisations often face the challenge of managing significant increases in change volume and the associated cost of that change.

    Modern platforms such as Confluent and Kong provide many benefits to organisations individually, but together they form a unique capability offering as enablers of change, empowering organisations to supercharge their ability to deliver, set data in motion and manage change at scale.

    Please join us 10am - 10:45am on May 25th for this introductory session (30-minute talk + 15-minute Q&A), where LimePoint, Kong and Confluent will explore how modern applications and platforms are deployed, consumed, and managed.

    Come along and learn how platforms like Confluent and Kong, when used together, are enablers of change, empowering organisations to deliver new products and services at speed.

    Our Partners
    Confluent - The platform to set data in motion
    Kong - Service Connectivity for Modern Architectures
    You can register free of charge here. We look forward to welcoming you.
  • Preparing the Digital Banking Future of KeyBank (18th largest Bank in the US) Apr 28 2021 3:00 am UTC 60 mins
    Michael Roseman, SVP & Chief Architect, KeyBank
    In this rapidly changing environment, financial services firms must find innovative ways to grow revenues, fight fraud, and deliver great customer experiences over various digital channels. At the same time, they must meet continually evolving compliance standards around data residency, identity and access control, and risk management.

    KeyBank, the 18th largest bank in the US, was no exception. KeyBank felt the threat of disintermediation by market entrants, increased customer expectations, and pressure to innovate and deliver more real-time products and services. But none of this was easy—especially because their data was scattered across multiple legacy mainframes and trapped in ETL processes.

    Hear from KeyBank about how they are leveraging Confluent Platform and Google’s Business Application Platform (Google Cloud’s Anthos and Apigee) to drive ongoing digitization initiatives such as real-time fraud detection, lead management, and microservices.

    In this webinar, you will learn how to:

    - Improve operational efficiency by adopting an event-driven architecture enterprise-wide, backed by a Center of Excellence
    - Bring new real-time products to market faster by moving away from a top-down, project-oriented approach
    - Reduce legacy costs by cutting mainframe and messaging queue costs
    Presented by:

    Michael Roseman, SVP & Chief Architect, KeyBank
    Scott Tucker, Director of Open Bank Engineering, KeyBank
    Roger Scott, Chief Customer Officer, Confluent
    Gabe Jaynes, Hybrid Cloud Engineer, Google
    Register today to start planning the future of event streaming in your business.
  • Flattening the Curve with Kafka Apr 22 2021 5:00 pm UTC 29 mins
    Rishi Tarar, Northrop Grumman Corp
    Responding to a global pandemic presents a unique set of technical and public health challenges. The real challenge is gathering data that arrives via many streams in a variety of formats: the ability to do so influences real-world outcomes and impacts everyone.

    The Centers for Disease Control and Prevention's CELR (COVID Electronic Lab Reporting) program was established to rapidly aggregate, validate, transform, and distribute laboratory testing data submitted by public health departments and other partners. Confluent Kafka with Kafka Streams and Connect plays a critical role in the program's objectives to:

    - Track the threat of the COVID-19 virus
    - Provide comprehensive data for local, state, and federal response
    - Better understand locations with an increase in incidence
  • Accelerate your Kafka Adoption: Special Session for India Apr 21 2021 7:30 am UTC 60 mins
    Kel Graham, Senior Solutions Architect, APAC, Confluent
    Accelerate your Kafka Adoption
    Kafka is crucial for any organization that benefits from real-time data. Kafka is fast, reliable, fault tolerant, and durable, modernizing architectures for new use cases large and small. From performant, scalable streaming data pipelines and message brokering, to event streaming, IoT data integration, and microservices communication, there are countless benefits across every industry.

    Attend this one-hour talk to uncover how you can accelerate your Kafka adoption and drive faster time to value. Learn how to overcome the most common challenges, how to apply best practices and tooling throughout your engagements, and how to reach self-sufficiency faster.

    Agenda:

    Welcome and Introduction - Aster Lee, Professional Services and Customer Success Director, APAC, Confluent

    Keynote: Accelerating your Kafka Adoption - Kel Graham, Senior Solutions Architect, APAC, Confluent

    Top 3 Customer Use Cases of Accelerated Kafka Adoption - Rana Banerji, Sales Director, APAC, Confluent

    Overview of Professional Services and Education - Aster Lee and Thomas Trepper, Head of Education Services

    Question & Answer
  • Modernising Data Pipelines with Confluent and MongoDB Apr 21 2021 4:00 am UTC 60 mins
    Guru Sattanathan and Noble Raveendran
    Confluent and MongoDB invite you for a session of education and conversation around today's modern data pipeline and how to set data in motion:


    Join us on April 21st to learn how you can effectively bring real-time data from across the enterprise to a database that can serve your APIs, services, and queries - resulting in faster data access and accelerated app development for a better customer experience.

    In this 45 minute online session, we’ll cover:

    Customer challenges
    Modern architecture with MongoDB
    Confluent: the platform to set data in motion
    MongoDB connector for Apache Kafka
    Fleet management demo

    We’ll then have 15 minutes for Q&A with our experts.

    Timezones:
    9:30am Mumbai
    12pm Singapore / Hong Kong
    2pm Sydney
    4pm Auckland


    Don’t miss this opportunity to hear customer stories from our experts. Reserve your spot today!
  • Accelerate your Kafka Adoption Apr 20 2021 3:00 am UTC 60 mins
    Kel Graham, Senior Solutions Architect, APAC, Confluent
    Kafka is crucial for any organization that benefits from real-time data. Kafka is fast, reliable, fault tolerant, and durable, modernizing architectures for new use cases large and small. From performant, scalable streaming data pipelines and message brokering, to event streaming, IoT data integration, and microservices communication, there are countless benefits across every industry.

    Attend this one-hour talk to uncover how you can accelerate your Kafka adoption and drive faster time to value. Learn how to overcome the most common challenges, how to apply best practices and tooling throughout your engagements, and how to reach self-sufficiency faster.

    Agenda:

    Welcome and Introduction - Aster Lee, Professional Services and Customer Success Director, APAC, Confluent

    Keynote: Accelerating your Kafka Adoption - Kel Graham, Senior Solutions Architect, APAC, Confluent

    Top 3 Customer Use Cases of Accelerated Kafka Adoption - Rana Banerji, Sales Director, APAC, Confluent

    Overview of Professional Services and Education - Aster Lee and Mario Sanchez, Head of Education Services

    Question & Answer
  • Data In Motion in the Insurance Industry Apr 15 2021 4:00 am UTC 30 mins
    Kai Waehner / Brett Randall
    Join Kai Waehner, Field CTO and Global Technology Advisor at Confluent to learn more about “Data In Motion in the Insurance Industry” with use cases and local experts on hand to answer your questions on Thursday April 15th 12pm SGT, 2pm AEST.
  • Confluent’s Premium Connector Spotlight: Oracle CDC Source Apr 14 2021 9:00 am UTC 32 mins
    Nathan Nam, Senior Product Manager, Connectors
    One of the most common relational database systems needing to connect to Kafka is Oracle, which holds highly critical enterprise transaction workloads. While Oracle excels at storing data, it struggles to implement continuous real-time syncs to other data warehouses. Change Data Capture (CDC) seeks to solve this challenge by efficiently identifying and capturing data that has been added to, updated, or removed from Oracle relational tables. It then makes this change data available to the rest of the organization. Most enterprises seek to utilize this change data to enhance their real-time use cases, which requires bridging legacy systems to their modern data systems and applications through Kafka.

    However, sending Oracle CDC data to Kafka adds additional complexity for development teams, as few tools exist in the market today to address this need. Confluent’s new Oracle CDC Source Connector, the first of Confluent’s Premium Connectors, allows customers to reliably and cost-effectively implement continuous real-time syncs by offloading data from Oracle Database to Confluent.

    In this webinar, you will learn how to:

    1. Jumpstart technical use cases by leveraging a pre-built connector with enterprise features and functionality ready out of the box
    2. Reliably and quickly connect your Oracle database to Kafka and other modern data systems by leveraging 120+ pre-built connectors
    3. Quickly scale horizontally without taking on additional licensing costs with Oracle CDC Source Connector’s native Kafka connectivity
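As a concrete sketch of what registering such a connector looks like, the Kafka Connect REST API accepts a JSON payload along these lines. This is illustrative only, not a copy-paste recipe: hostnames, credentials, topic prefix and the table regex are placeholders, and property names should be verified against the current Oracle CDC Source Connector documentation.

```python
import json

# Hypothetical Oracle CDC Source connector registration payload.
# Property names follow the connector's documented style, but check them
# against current docs; hosts and credentials below are placeholders.
connector = {
    "name": "oracle-cdc-orders",
    "config": {
        "connector.class": "io.confluent.connect.oracle.cdc.OracleCdcSourceConnector",
        "oracle.server": "oracle.example.internal",   # placeholder host
        "oracle.port": "1521",
        "oracle.sid": "ORCLCDB",
        "oracle.username": "cdc_user",                # placeholder credentials
        "oracle.password": "changeme",
        "table.inclusion.regex": "ORCLCDB\\.APP\\.ORDERS",
        "redo.log.consumer.bootstrap.servers": "broker:9092",
        "key.converter": "io.confluent.connect.avro.AvroConverter",
        "value.converter": "io.confluent.connect.avro.AvroConverter",
        "tasks.max": "1",
    },
}

# This payload would be POSTed to the Connect REST API, e.g.:
#   curl -X POST -H "Content-Type: application/json" \
#        -d @connector.json http://connect:8083/connectors
print(json.dumps(connector, indent=2))
```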
  • Building an Event Driven Global Data Fabric with Apache Kafka Apr 13 2021 5:00 pm UTC 58 mins
    Will LaForest, CTO Confluent Public Sector
    Agencies are grappling with a growing challenge of distributing data across a geographically diverse set of locations around the US and globally. In order to ensure mission success, data needs to flow to all of these locations rapidly. Additionally, latency, bandwidth and reliability of communication can prove to be a challenge for agencies. A global data fabric is an emerging approach to help connect mission to data across multiple locations and deliver uniformity and consistency at scale.

    This webinar will cover:

    An overview of Apache Kafka and how an event streaming platform can support your agency's mission

    Considerations around handling varying quality communication links
    Synchronous vs asynchronous data replication
    New multi-region capabilities in Confluent Platform for Global Data Fabric
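In Confluent Platform's multi-region clusters, the synchronous-vs-asynchronous distinction above maps to synchronous replicas versus asynchronously replicating observers, expressed as a replica-placement constraint at topic creation. The sketch below is illustrative; the rack names and counts are hypothetical.

```python
import json

# Sketch of a multi-region replica placement: "replicas" are synchronous
# (in the ISR, so writes can be acknowledged by them), while "observers"
# replicate asynchronously in the remote region and do not delay acks.
# Rack names ("east", "west") are hypothetical.
placement = {
    "version": 2,
    "replicas": [{"count": 3, "constraints": {"rack": "east"}}],
    "observers": [{"count": 2, "constraints": {"rack": "west"}}],
}

# Typically applied at topic creation, e.g.:
#   kafka-topics --create --topic telemetry \
#     --replica-placement placement.json ...
print(json.dumps(placement))
```

The trade-off is the usual one: synchronous replicas give zero data loss at the cost of write latency over the WAN, while observers keep latency low but lag slightly behind.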
  • Bringing Industry Machine Learning to Government with Apache Kafka Recorded: Mar 30 2021 49 mins
    Will LaForest, Public Sector CTO, Confluent
    Apache Kafka® & Machine Learning
    Connecting the Dots from Inception to Production

    Join this session to understand how and why Apache Kafka® has become the de facto standard for reliable and scalable streaming infrastructures. AI/machine learning and the Apache Kafka ecosystem are a great combination for training, deploying and monitoring analytic models at scale in real time. They are showing up in more and more projects, but can still feel like buzzwords and hype reserved for science experiments.

    See how to connect the dots!

    - How are Kafka and Machine Learning related?
    - How can they be combined to productionize analytic models in mission-critical and scalable real time applications?
    - A step-by-step approach to building a scalable and reliable real-time infrastructure for anomaly detection in cyber data
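As a toy illustration of that step-by-step approach, anomaly detection over a stream of cyber events can start as simply as a rolling z-score over a sliding window. This sketch is hypothetical and not the presenter's architecture; in production the same logic would typically run in a Kafka Streams application or a model-scoring consumer.

```python
from collections import deque
from statistics import mean, stdev

def zscore_anomalies(stream, window=20, threshold=3.0):
    """Flag values deviating more than `threshold` std-devs from a rolling window."""
    history = deque(maxlen=window)
    for value in stream:
        if len(history) >= 2:  # stdev needs at least two samples
            mu, sigma = mean(history), stdev(history)
            if sigma > 0 and abs(value - mu) / sigma > threshold:
                yield value
        history.append(value)

# Steady traffic with one burst (e.g. connections/sec from a network sensor)
events = [10, 11, 9, 10, 12, 11, 10, 9, 11, 10, 95, 10, 11]
print(list(zscore_anomalies(events)))  # [95] — the burst is flagged
```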
  • Unlock the Valuable Data from Oracle Database with Confluent's Premium Connector Recorded: Mar 30 2021 35 mins
    Suraj Pillai, Senior Solutions Engineer at Confluent
    Oracle might hold the bulk of highly critical enterprise transaction workloads and even act as the main system of record for enterprise data in your organization. But until now, it hasn’t been easy to share the data siloed in Oracle with other systems in order to truly take advantage of it.

    Confluent’s new Oracle CDC Source Connector allows you to reliably and cost-effectively offload data from your Oracle Database to Confluent, greatly reducing net licensing costs you might accrue by using a third-party connector, and freeing up your developers’ time to focus on valuable business-impacting activities, rather than spending cycles building foundational tooling for supporting data-in-motion.

    Curious to hear more? Join us for a Premium Connector Forum on Oracle CDC Source. You’ll learn about:

    Supported features
    Example architecture
    Dependencies and limitations

    Register now for this virtual event!
  • Gaining Control of Hybrid Cloud with Apache Kafka® and Confluent Cloud Recorded: Mar 24 2021 26 mins
    Kai Waehner, Global Technology Advisor
    Kai Waehner from Confluent is your host for a session in which you will learn how you can accelerate your application modernization and benefit from the open source Apache Kafka ecosystem. Waehner will share real customer stories describing timely insights gained from event-driven applications built on an event streaming platform utilizing hybrid architectures with on premise / edge Kafka deployments and Confluent Cloud running on AWS/GCP/Azure, to store and process historical data and real-time data streams.
  • Confluent’s Premium Connector Spotlight: Oracle CDC Source Recorded: Mar 22 2021 33 mins
    Nathan Nam, Senior Product Manager, Connectors
  • Applied Apache Kafka for Government: Introduction to Kafka Recorded: Mar 16 2021 55 mins
    Ken McCaleb, Enterprise Data Streaming Solutions Advisor, Confluent
    This is an introduction to Kafka and how government is modernizing systems to meet critical mission requirements using a modern event streaming platform.

    We will explain the origin of Kafka, why it is relevant, and provide high-level use case examples of how it is used by the U.S. Government to greatly improve its capabilities to serve its citizens.
  • Confluent “Enabling real time Digital Transformation in the public sector” Recorded: Mar 4 2021 48 mins
    Johnny Mirza
    Each one of us now has first-hand experience of living through a pandemic where situations change minute by minute, and critical information, updates and changes must be rolled out to citizens, businesses, intra-government agencies and other key stakeholders.

    Today, the need for access to real time data is undeniable and the Public Sector has never played more of a pivotal role in every citizen’s life.

    Where once real time may have been something to aim for in the future, today even near real time is not enough.

    Join Johnny Mirza, Senior Solutions Engineer at Confluent, for a virtual discussion (45 mins + 15 min Q&A), “Enabling real time Digital Transformation in the Public Sector”, on Thursday March 4th at 11am SGT / 2pm AEST. You will learn how event streaming is a real-time infrastructure revolution that is fundamentally changing how public sector organisations think about data and build applications, resolving the real-time challenge that thousands of other organisations around us are addressing.

    Beyond the current pandemic, event-driven architecture is the future of data infrastructure, and this talk is designed to demonstrate and share practical examples of how to leverage the power of your event streams in real-time to deliver on mission outcomes, better serve citizens, ensure security and compliance, enhance IT efficiency, and maximise productivity.


    This session is suitable for both non-technical and technical attendees.

    Attendees will gain:
    -an overview of key event streaming concepts including “everything is an event”
    -a framework for solution architects and developers looking to learn a new skill
    -an understanding of common use cases in the public sector
    -an introduction to some of the technology behind event streaming, including Apache Kafka
  • The New Cyber: Faster, Better, Cheaper Recorded: Feb 25 2021 29 mins
    Johnny Varley, Central Government Solutions Engineer at Confluent
  • Transforming Financial Services with Event Stream Data Recorded: Feb 25 2021 44 mins
    Ananda Bose, Senior Solutions Engineering Manager, APAC, Confluent
    Recent research shows that nearly all financial services organizations (97%) consider it important to accelerate the flow of information and improve the responsiveness of the organization. With the advent of streaming data technologies that capture and process large volumes of data in real time, organizations can quickly turn events into valuable business outcomes in the form of new products, services or revenue.

    Banks such as Citigroup, Bank Rakyat Indonesia, RBC, DBS, Bank Mandiri and Euronext have partnered with Confluent to identify organizational or revenue producing improvements and develop new products or services with event streams.

    Attend this live forum as we delve deeper into the rise of streaming data and how it is transforming the financial services industry by providing the critical, timely information needed to:
    - Deliver world-class customer service
    - Detect and prevent fraud
    - Tie together disparate legacy systems
    - Power new services and sources of revenue

    Agenda
    - Welcome and Introductions
    - Keynote: Transforming Financial Services with Event Stream data
    - Top Use Cases in Financial Services
    - Q & A
  • The Top 5 Event Streaming Use Cases & Architectures in 2021 Recorded: Feb 17 2021 36 mins
    Kai Waehner, Technology Evangelist, Confluent
    With just a few weeks of 2020 left, it's time to make some predictions on the top event streaming use cases that the Confluent team expects to see in 2021. Given the unpredictability of 2020, this may seem brave or even a little foolhardy, but Kai Waehner, Senior Solutions Engineer at Confluent, is willing to strike out and offer his predictions for next year.


    Register now to access this fascinating online talk in which Kai will discuss his top five cutting-edge use cases and architectures that will be adopted by more and more enterprises in 2021. Learn how companies will leverage event streaming, Apache Kafka, and Confluent to meet the demand of a real-time market, rising regulations, and customer expectations, and much more in 2021:

    1. Edge deployments outside the data center: It's time to challenge the normality of limited hardware and disconnected infrastructure. Event streaming can provide low latency and cost-efficient data integration and data processing in retail stores, restaurants, trains, and other remote locations.


    2. Hybrid architectures: Discover how these span multiple sites across regions, continents, data centers, and clouds with real-time information at scale to connect legacy and modern infrastructures.



    3. Service mesh-based microservice architectures: Learn what becomes possible when organisations can provide a cloud-native event-based infrastructure for elastic and scalable applications and integration scenarios.

    4. Streaming machine learning: In 2021, many companies will move streaming machine learning into production without the need for a data lake, enabling scalable real-time analytics.

    5. Cybersecurity: While security never goes out of style, in 2021 we will see cybersecurity in real-time at scale with openness and flexibility at its core. This protects computer systems and networks and prevents the theft of or damage to software and data.
  • The New Cyber: Faster, Better, Cheaper Recorded: Feb 4 2021 30 mins
    Johnny Varley, Central Government Solutions Engineer at Confluent
Confluent: Data in motion.
Confluent is building the foundational platform for data in motion. Our cloud-native offering is designed to be the intelligent connective tissue enabling real-time data, from multiple sources, to constantly stream across the organisation. With Confluent, organisations can create a central nervous system to innovate and win in a digital-first world.

  • Title: The New Cyber: Faster, Better, Cheaper
  • Live at: Feb 4 2021 4:05 pm
  • Presented by: Johnny Varley, Central Government Solutions Engineer at Confluent