CP4D runs many components simultaneously, but there is no orchestration across them. A Data & AI platform should provide an orchestration framework for building and maintaining workflows, so that the end-to-end pipeline described in CP4D customer presentations can actually be automated.
There are countless use cases for workflow orchestration in Data & AI. Having such a feature in CP4D would give it an edge over the competition.
Competition: Apache Oozie is a good example of such a workflow framework. Oozie is available on AWS, Databricks, and Azure cloud services.
|Who would benefit from this IDEA?||Customer|
How should it work?
Here is an example: a data pipeline is built with DataStage to bring data into CP4D; that data is then consumed by Python scripts to build models (or perform some other task), and the results are deployed to WKC. To run such a workflow today, we rely on the Jobs APIs, which is not efficient. A workflow feature should own this automation of the pipeline: executing the relevant scripts once the data is available, then deploying the new models or features.
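The workflow described above is essentially a dependency graph: each step should start only after its prerequisites finish. As a minimal sketch (not a real CP4D API; the step names `ingest_datastage`, `train_model`, and `deploy_to_wkc` are hypothetical placeholders for the DataStage flow, the Python scripts, and the WKC deployment), this is the kind of automation an orchestration feature would provide:

```python
# Minimal sketch of dependency-ordered workflow execution.
# Step names are illustrative placeholders, not real CP4D job names.
from graphlib import TopologicalSorter

def run_pipeline(steps, dependencies):
    """Execute the callables in `steps` in an order that respects
    `dependencies` ({step: {prerequisite steps}}); return that order."""
    order = list(TopologicalSorter(dependencies).static_order())
    for name in order:
        # In a real orchestrator this would trigger a DataStage flow,
        # a Python script, or a deployment job, and wait for completion.
        steps[name]()
    return order

if __name__ == "__main__":
    log = []
    steps = {
        "ingest_datastage": lambda: log.append("ingest"),
        "train_model":      lambda: log.append("train"),
        "deploy_to_wkc":    lambda: log.append("deploy"),
    }
    deps = {
        "ingest_datastage": set(),
        "train_model": {"ingest_datastage"},
        "deploy_to_wkc": {"train_model"},
    }
    run_pipeline(steps, deps)
    print(log)
```

A real implementation would also handle triggers (run when new data lands), retries, and monitoring, which is exactly what today's ad-hoc Jobs API calls do not give us.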
|Priority Justification||Customers are asking for such a feature, and most of IBM's competitors already have it.|