Ken Johnson, Vice President, Application Services Business Unit, Red Hat
Whether you’re trying to better manage fraudulent claims and citizen interactions, respond to weather events or health crises, mitigate cyberattacks, or route fleets of equipment and vehicles, a comprehensive, real-time event processing portfolio gives you the flexibility to produce sophisticated solutions with the assurance of stability, simplicity, and proven enterprise patterns. Developing a solution that reacts to real-time events and produces insights from them requires an event-driven architecture built to run across agency and infrastructure boundaries. The speed and volume of streaming data demand a high-performance, scalable event processing architecture and add complexity around fault tolerance, data durability, and scalability. Properly analyzed, these real-time data streams can identify pain points, successes, and behavioral habits, and provide real-time situational awareness.

Critical to the adoption of these solutions is their innate portability. They must be able to run anywhere (on-premises, cloud, or hybrid cloud) and must be designed as decoupled microservices deployed on a robust container orchestration platform.
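As a minimal sketch of that decoupled pattern (hypothetical names and threshold; a real deployment would publish to a durable broker such as a Kafka topic rather than an in-memory queue), a producer emits events without knowing who consumes them, and an independent consumer service reacts to each event:

```python
from dataclasses import dataclass
from queue import Queue

@dataclass
class Transaction:
    claim_id: str
    amount: float

# In-memory stand-in for a durable event stream (e.g., a Kafka topic).
stream: Queue = Queue()

def publish(event: Transaction) -> None:
    """Producer side: agencies push events without knowing the consumers."""
    stream.put(event)

def flag_suspicious(threshold: float = 10_000.0) -> list:
    """Consumer side: a decoupled service reacts to each event as it arrives."""
    flagged = []
    while not stream.empty():
        event = stream.get()
        if event.amount > threshold:  # hypothetical fraud rule for illustration
            flagged.append(event.claim_id)
    return flagged

publish(Transaction("claim-001", 2_500.0))
publish(Transaction("claim-002", 48_000.0))
print(flag_suspicious())  # → ['claim-002']
```

Because the producer and consumer share only the event schema, either side can be scaled, replaced, or redeployed (on-premises or in any cloud) without touching the other, which is precisely what makes the architecture portable.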
Want to continue the conversation? Have more questions about mission applications? Join us again on Thursday, August 20 for Public Sector Social Hours, where we will demonstrate a complete fraud detection application running on OpenShift and leveraging Kafka, Knative-based serverless, Istio, and other modern technologies and architectures. A panel of experts will discuss key points of the demo and field architectural questions, and attendees are encouraged to ask questions of their own. Register: https://events.redhat.com/profile/form/index.cfm?PKformID=0x211262abcd&sc_cid=7013a000002goyuAAA