# announcements
r
Hi all, I’m struggling a little with importing custom modules in my `example_pipeline.py` file. I’m using `import src.load_and_store` in `example_pipeline.py`, which is in the same directory as `src`. This works correctly when running the pipeline using the Python API (through `execute_pipeline` under `__name__ == "__main__"` in `example_pipeline.py`) and when running it through `dagit` with a `workspace.yaml` file:
```
load_from:
  - python_file: example_pipeline.py
```
However, when I try to run `dagster pipeline execute -f example_pipeline.py` (from the same directory), I receive the error `ModuleNotFoundError: No module named 'src'`. I’m sure this is a fairly basic import error, but could someone please provide a solution/best practice for this?
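For reference, a minimal sketch of what `example_pipeline.py` might look like; only the file names and the import come from the thread, while the solid, the pipeline wiring, and the `run()` entry point are illustrative assumptions:
```python
# example_pipeline.py -- minimal sketch of the setup described above.
# Only the file names and the import come from the thread; the solid,
# the pipeline wiring, and src.load_and_store.run() are illustrative guesses.
from dagster import execute_pipeline, pipeline, solid

import src.load_and_store  # resolves via the Python API and dagit, but not via the bare CLI


@solid
def load_and_store_solid(context):
    src.load_and_store.run()  # hypothetical entry point in the custom module


@pipeline
def example_pipeline():
    load_and_store_solid()


if __name__ == "__main__":
    # Running this file directly works because Python puts the script's own
    # directory on sys.path, so the sibling `src` package is importable.
    execute_pipeline(example_pipeline)
```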
a
so to resolve you can do `dagster pipeline execute -f example_pipeline.py -d ./`, where `-d` is setting the “working directory”
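Concretely, assuming `example_pipeline.py` and `src/` sit side by side as described, the two invocations compare like this:
```
dagster pipeline execute -f example_pipeline.py        # ModuleNotFoundError: No module named 'src'
dagster pipeline execute -f example_pipeline.py -d ./  # works: `src` is resolved relative to the working directory
```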
I think this points at our default behavior being inconsistent between `dagit` and the `dagster` CLI
r
Fantastic, thanks! If you’re suggesting that the default behaviour of `dagster pipeline…` should be to use the current working directory as the working directory (without specifying the `-d` flag), then I think that seems more intuitive (but I’m not sure what other implications that may have).
👍 2
d
Hi Richard, thanks for reporting this. A quick fix to make the working directory have a reasonable default should be live in our next release, going out tomorrow.
👍 2