Traditional systems were designed in an era that predates large-scale distributed computing, and they often cannot scale to meet the needs of the modern data-driven organisation. Compounding this, the accumulation of technologies and the explosion of data frequently result in complex point-to-point integrations that leave data siloed across the enterprise.
The demand for fast results and rapid decision making has driven financial institutions to adopt real-time event streaming and data processing in order to stay on the competitive edge. Apache Kafka and the Confluent Platform are designed to solve the problems associated with traditional systems, providing a modern distributed architecture and real-time data streaming capability. In addition, these technologies open up a range of use cases for financial services organisations, many of which will be explored in this talk.
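To make the streaming model concrete, here is a minimal sketch of publishing an event with the Kafka Java client. The broker address (localhost:9092), the topic name (trades), the key, and the payload are illustrative assumptions, not part of the talk material.

```java
import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class TradeEventProducer {
    public static void main(String[] args) {
        // Broker address, topic, and payload are hypothetical examples.
        Properties props = new Properties();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            // Key the event by account ID so all events for the same account
            // land on the same partition and preserve their ordering.
            producer.send(new ProducerRecord<>("trades", "account-42",
                    "{\"symbol\":\"ABC\",\"qty\":100,\"price\":12.34}"));
        }
    }
}
```

Downstream consumers subscribe to the same topic independently, which is what removes the point-to-point integrations described above.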
By attending this talk, you will develop a new understanding of:
• How Apache Kafka enables a 360-degree view of the customer
• How to provide a backbone for the distribution of trade data
• How Kafka and the Confluent Platform enable you to meet regulatory requirements for trade information, payments, and liquidity
• How to overcome security concerns with SIEM
• How to integrate mainframe data with event streaming and the cloud
• How to reduce fraud with real-time fraud processing, fraud analytics, and fraud notifications (a minimal sketch follows this list)
• How to develop and enhance microservices
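As a taste of the real-time fraud-processing use case above, the following Kafka Streams sketch flags high-value payments as they arrive. The topic names (payments, fraud-alerts), the comma-separated "accountId,amount" value format, and the fixed threshold are all hypothetical simplifications for illustration.

```java
import java.util.Properties;
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.KStream;

public class FraudAlertStream {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "fraud-alerts-app");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

        StreamsBuilder builder = new StreamsBuilder();
        // Values are assumed to be plain "accountId,amount" strings.
        KStream<String, String> payments = builder.stream("payments");

        // Flag payments above a naive threshold as potential fraud and
        // publish them to an alerts topic for downstream notification.
        payments
            .filter((key, value) -> Double.parseDouble(value.split(",")[1]) > 10_000.0)
            .to("fraud-alerts");

        KafkaStreams streams = new KafkaStreams(builder.build(), props);
        streams.start();
        Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
    }
}
```

A real deployment would replace the threshold with analytics or a scoring model; the point here is only that detection and notification happen continuously on the stream rather than in a nightly batch.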