I'm one of the maintainers of Kedro.
@Ryan Carlson was right to share that article. We have a completely different aim.
We focus on the problem of workflow standardisation: helping you create maintainable data science code by applying software engineering conventions to DS projects. Our users dig us if they've had issues trying to take code into production because:
1. They were too heavily reliant on Jupyter notebooks
2. They mish-mashed tons of scripts together, creating their own CLIs and project structures that are difficult to maintain
I suspect you understand problem 2 very well, because of how you described what you're looking for. Kedro also determines the running order for your pipeline, so that's a worry off your shoulders.
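To illustrate what "determines the running order" means: each Kedro node declares its inputs and outputs, and the framework works out the execution order so every dataset is produced before it's consumed. Here's a minimal plain-Python sketch of that idea (a topological sort over declared inputs/outputs) — this is not Kedro's actual code, and the node and dataset names are made up:

```python
# Sketch: infer pipeline running order from each node's declared
# inputs and outputs. Each node is a (name, inputs, outputs) tuple.

def running_order(nodes):
    # Map each output dataset to the node that produces it.
    produced_by = {out: name for name, _, outs in nodes for out in outs}
    by_name = {name: (name, ins, outs) for name, ins, outs in nodes}
    order, seen = [], set()

    def visit(name, ins, outs):
        if name in seen:
            return
        seen.add(name)
        # Visit upstream nodes first so their outputs exist before we run.
        for dep in ins:
            if dep in produced_by:
                visit(*by_name[produced_by[dep]])
        order.append(name)

    for n in nodes:
        visit(*n)
    return order

nodes = [
    ("train_model", ["features"], ["model"]),
    ("make_features", ["raw_data"], ["features"]),
    ("evaluate", ["model", "features"], ["metrics"]),
]
print(running_order(nodes))  # → ['make_features', 'train_model', 'evaluate']
```

In real Kedro you'd declare nodes the same way (a function plus named inputs and outputs) and the framework resolves the order for you, so you never hand-maintain a run script.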
We are not an orchestrator, though, which is why we believe in a workflow that starts in Kedro and ends with whichever tool helps you schedule and orchestrate your pipeline runs — we've found that Dagster, Prefect and Airflow are perfect for that. We've even started creating a series of deployment docs in our latest sprint; the Prefect one is already complete as an example. The Astronomer team is also picking up the Kedro-Airflow plugin. Hope this helps, shout if you have more questions.