# ask-community
p
hey, i just ran into a weird issue when running a job locally through dagit where dagster seemingly can’t find the temporary directory being used for an op. any ideas?
appears to be thrown right at the start of op execution if that helps
c
Can you send a code snippet?
p
oof hold on, i just realized the error was being thrown by code i’d copy-pasted from dagster-shell and modified a little. i’ll share a snippet after i’ve made sure i’m not just doing something stupid haha
c
If I had to guess, it's possible that the temp directory is being destroyed before the op executes
p
okay so the error is being thrown on line 40 of this fn (copied from dagster-shell), which is called inside a context manager with that file open. it was working fine until i tried to yield an AssetMaterialization within that function on line 79, and works again once i take out 75-79
the command i’m running with that function loads some data to a bigquery table and also logs the load job result from bigquery. i’m trying to parse the log message to extract metadata about the load job and table and yield an asset materialization
c
Are you creating that temporary directory? Or is that temp directory being created by some code that you didn't write?
I'm wondering if the yield of the asset materialization is somehow causing you to reach a state where the temp directory is deleted, but it's hard to tell just looking at this function
p
i’m creating it here and then calling the function i pasted above on line 6. and that seems plausible, instead of yielding the asset materialization while the tempfile is open i can try returning a list of materializations and yielding them later
c
Okay yea I think this is a result of the fact that you are returning instead of yielding here. Try
yield from execute_script_file(...)
instead
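For context, a minimal sketch of what's likely going wrong (names and file contents are hypothetical, standing in for the modified dagster-shell function and the temp directory): because the inner function contains a yield, calling it only creates a generator, so its body doesn't run until something iterates it, and by then the tempdir context manager has already cleaned up.

```python
import os
import tempfile

# Hypothetical stand-in for the modified dagster-shell function: since it
# contains a yield, calling it only creates a generator -- the body does
# not execute until something iterates it.
def execute_and_yield(path):
    with open(path) as f:
        yield f.read()  # imagine this yields an AssetMaterialization

def broken_op():
    with tempfile.TemporaryDirectory() as tmp:
        path = os.path.join(tmp, "out.txt")
        with open(path, "w") as f:
            f.write("load job result")
        # BUG: returning the generator. The `with` block exits (deleting
        # tmp) before anything iterates it, so the file is already gone
        # when the body finally runs.
        return execute_and_yield(path)

def fixed_op():
    with tempfile.TemporaryDirectory() as tmp:
        path = os.path.join(tmp, "out.txt")
        with open(path, "w") as f:
            f.write("load job result")
        # FIX: yield from drives the inner generator to completion while
        # tmp still exists.
        yield from execute_and_yield(path)
```

Iterating the generator returned by broken_op() raises FileNotFoundError because the temp directory was cleaned up when the function returned, whereas list(fixed_op()) yields the value normally.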
p
okay i’ll try that!
That worked btw, thanks!
c
Great!