Hi Fred. That’s a great point, and I’d love to flesh it out a bit more. Dagster does currently store the outputs of solids as intermediates, which means that if you have run storage set up, you can in theory re-run parts of a pipeline over and over in dagit. However, you’re totally right that if you’re tweaking config (i.e. your hyperparameter search space) between runs, you lose everything. So why not split your code into two pipelines: one that handles feature generation, and another that grabs the dataset and trains a model? That way you run the first pipeline once, and then you can run the training pipeline over and over again; see the sketch below.
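
To make that concrete, here’s a minimal sketch of the two-pipeline split using the legacy `@solid`/`@pipeline` API. The solid names, the CSV handoff path, and the `learning_rate` config field are all placeholders, and `config_schema` assumes a reasonably recent 0.x release; adapt it to your own setup:

```python
import pandas as pd
from dagster import pipeline, solid

# Hypothetical handoff location; swap in whatever storage you actually use.
FEATURES_PATH = "features.csv"


@solid
def generate_features(context):
    # Expensive feature engineering happens once, then gets persisted.
    features = pd.DataFrame({"x": [1.0, 2.0, 3.0], "y": [2.0, 4.0, 6.0]})
    features.to_csv(FEATURES_PATH, index=False)
    context.log.info(f"Wrote {len(features)} rows to {FEATURES_PATH}")


@pipeline
def feature_pipeline():
    generate_features()


@solid
def load_features(context):
    # Cheap read of the already-persisted features.
    return pd.read_csv(FEATURES_PATH)


@solid(config_schema={"learning_rate": float})
def train_model(context, features):
    # Hyperparameters come in through run config, so you can tweak them
    # between runs without ever touching the feature pipeline.
    lr = context.solid_config["learning_rate"]
    context.log.info(f"Training on {len(features)} rows with learning_rate={lr}")


@pipeline
def training_pipeline():
    train_model(load_features())
```

Then in dagit you’d kick off `feature_pipeline` once, and launch `training_pipeline` repeatedly with something like `solids: {train_model: {config: {learning_rate: 0.01}}}` in the run config for each point in your sweep.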