In this webinar, independent journalist Pat Brans interviews Martin Hulbert, Technical Director at Ignite Technology.
Martin explains data pipelines, the stream of data across an organisation from start to finish, and data pipeline orchestration: coordinating all the individual components that make up the pipeline so data can flow from multiple sources, through multiple systems, to its final destination.
Because processes are inherently isolated and visibility is minimal, it is nearly impossible to orchestrate a data pipeline manually. The advent of APIs has made it easier to connect to multiple systems, but the problem of coordinating disparate processes and jobs persists.
We will discuss regulatory compliance, especially in financial services, billing in utility companies, and order processing in retail. However, virtually every organisation in the world has at least one data pipeline that underlies some critical process and needs to be orchestrated.
Slow or absent remediation causes problems: lost time, missed deadlines, and potentially lost customers. Tracking back through the pipeline to identify a problem often requires escalation all the way up to the C-level, because the processes span the organisation and may even require cooperation from outside the company.
Orchestration platforms can solve many of these challenges by coordinating processes from a central location. A spider-like tool with a central administration point and agents reaching out to different systems can be used to automate data pipeline orchestration. The tool frees up time for administrators and allows them—and everybody else involved—to turn their attention to more strategic tasks, whilst building in reliability and resilience.
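To make the central-coordination idea concrete, here is a minimal sketch, not any specific product: a single coordinator holds the dependency graph and dispatches each step only when its upstream steps have finished, the way an orchestration platform's central administration point drives agents on different systems. All task names and the `Orchestrator` class are illustrative assumptions.

```python
# Minimal central orchestrator sketch: tasks run in dependency order
# (topological order), and a log records completed steps for visibility.
from collections import deque

class Orchestrator:
    def __init__(self):
        self.tasks = {}   # task name -> (callable action, dependency names)
        self.log = []     # completed steps, in execution order

    def add_task(self, name, action, depends_on=()):
        self.tasks[name] = (action, tuple(depends_on))

    def run(self):
        # Count unmet dependencies and index each task's downstream tasks.
        indegree = {n: len(deps) for n, (_, deps) in self.tasks.items()}
        downstream = {n: [] for n in self.tasks}
        for n, (_, deps) in self.tasks.items():
            for d in deps:
                downstream[d].append(n)
        # Start with tasks that have no dependencies.
        ready = deque(n for n, deg in indegree.items() if deg == 0)
        while ready:
            name = ready.popleft()
            action, _ = self.tasks[name]
            action()
            self.log.append(name)
            # A finished task may unblock its downstream tasks.
            for nxt in downstream[name]:
                indegree[nxt] -= 1
                if indegree[nxt] == 0:
                    ready.append(nxt)
        if len(self.log) != len(self.tasks):
            raise RuntimeError("cycle or unmet dependency in pipeline")
        return self.log

# Hypothetical three-step pipeline: extract from two sources, then load.
orc = Orchestrator()
results = []
orc.add_task("extract_crm", lambda: results.append("crm rows"))
orc.add_task("extract_billing", lambda: results.append("billing rows"))
orc.add_task("load_warehouse", lambda: results.append("loaded"),
             depends_on=["extract_crm", "extract_billing"])
order = orc.run()
```

Real platforms add the parts this sketch omits: remote agents, retries, alerting, and audit trails, which is where the reliability and resilience Martin describes come from.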