Keith Babo, Jeffrey Bride and Ken Johnson
Whether you’re trying to better manage fraudulent claims and citizen interactions, respond to weather events or health crises, mitigate cyber attacks, or route fleets of equipment and vehicles, a comprehensive, real-time event processing portfolio gives you the flexibility to build complex use cases with the assurance of stability, simplicity, and proven enterprise patterns. A solution that reacts to real-time events and produces insights from them requires an event-driven architecture built to run across agency and infrastructure boundaries. The speed and volume of streaming data demand a high-performance, scalable event processing architecture and add a layer of complexity around fault tolerance, data durability, and scalability. Properly analyzed, these real-time data streams can reveal pain points, successes, and behavioral patterns, and provide real-time situational awareness. That analysis, along with the automated processes it triggers, is achieved through self-learning AI with decisioning logic optimized for real-time data. Critical to the adoption of these solutions is their innate portability: they must be able to run anywhere (on-premises, cloud, or hybrid cloud) and must be designed as decoupled microservices deployed on a robust container orchestration platform.
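To make the pattern concrete, the following is a minimal sketch of a decoupled, event-driven pipeline: a producer publishes claim events onto a queue, and an independent consumer applies a decisioning rule to each one. All names and the fraud threshold here are hypothetical; a production system would use a durable event broker (for example, Apache Kafka) and externalized decision logic rather than an in-process queue and a hard-coded rule.

```python
import queue
import threading

events = queue.Queue()   # stands in for a durable event stream
flagged = []             # claims routed for human review

def decision_rule(event):
    """Flag claims above a (hypothetical) amount threshold."""
    return event["amount"] > 10_000

def consumer():
    # Runs independently of the producer, consuming events as they arrive.
    while True:
        event = events.get()
        if event is None:        # sentinel: shut down cleanly
            break
        if decision_rule(event):
            flagged.append(event["claim_id"])
        events.task_done()

worker = threading.Thread(target=consumer)
worker.start()

# Producer side: publish a stream of claim events.
for claim in [
    {"claim_id": "C-1", "amount": 2_500},
    {"claim_id": "C-2", "amount": 48_000},
    {"claim_id": "C-3", "amount": 9_999},
]:
    events.put(claim)

events.put(None)   # signal shutdown
worker.join()
print(flagged)
```

Because the producer and consumer share nothing but the event stream, either side can be scaled, replaced, or redeployed independently, which is the property that makes the architecture portable across on-premises, cloud, and hybrid environments.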
Keith Babo, Consulting Product Manager, Middleware, Red Hat
Jeffrey Bride, Business Automation Specialist, Emerging Technology Domain Architects, Red Hat
Ken Johnson, Vice President, Application Services Business Unit, Red Hat