# dagster-feedback
a
Having an issue with large logs. I'm logging cURL commands at the end of my pipe, which include the data being submitted. It seems the message is too big, the logs get corrupted, and I get a GraphQL error. Perhaps some sort of guard that truncates log messages when they get too big would be good? I'm getting errors and can't see the reason because the logs are knocked out.
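For now a guard on our side could look something like this (just a sketch; the limit is arbitrary and `log_truncated` is a made-up helper, not a Dagster API):

```python
# Rough sketch of a truncation guard. MAX_LOG_LEN is an arbitrary cutoff,
# not a real Dagster setting, and log_truncated is a hypothetical helper.
MAX_LOG_LEN = 10_000

def log_truncated(context, message: str, limit: int = MAX_LOG_LEN):
    """Log via the op context, cutting the message off past `limit` chars."""
    if len(message) > limit:
        message = message[:limit] + f"... [truncated {len(message) - limit} chars]"
    context.log.info(message)
```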
Oh, I'm using 1.0.15
s
Hmm, sounds like something we should fix. Would you mind posting a little code to help us repro the problem?
a
Yeah, I don't have an example because the logs are corrupted 😛 I was using the curlify module to generate a cURL statement (our Laravel guys want one when we have a problem submitting data), but it includes the bearer token and all the data for one request, which can be a lot (we're batching for now). So just log a lot of data and you'll get the problem; a rough sketch is below.
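Something like this should reproduce it (just a sketch; the endpoint, token, and payload size are made up, but the shape matches what we do):

```python
import curlify
import requests
from dagster import job, op

@op
def log_big_curl(context):
    # Build one oversized request the way we do in production: a batched
    # payload plus a bearer token, then log the full cURL command for it.
    payload = {"rows": ["x" * 100] * 50_000}  # several MB of fake data
    req = requests.Request(
        "POST",
        "https://example.com/api/batch",  # placeholder endpoint
        headers={"Authorization": "Bearer <token>"},
        json=payload,
    ).prepare()
    # One huge log message -> corrupted logs / GraphQL error in the UI
    context.log.info(curlify.to_curl(req))

@job
def repro_job():
    log_big_curl()
```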