
Nikhaar Gupta

03/22/2022, 4:40 PM
Hi - I'm running into issues importing an internal python module that I wrote. I have a directory structure like this:

test1
└── dagster_cloud_template
    ├── __init__.py
    ├── repo.py
    ├── internal_util
    │   └── util_script1.py
    └── Dockerfile

Let's say util_script1.py contains a function called internal_function1. I am trying to import this function inside repo.py. Running dagster locally, I find that:
• "from internal_util.util_script1 import internal_function1" - this import works
• "from .internal_util.util_script1 import internal_function1" - this import does not work
From dagster cloud, it's reversed. Note that locally, I am not actually running anything through docker; I just launch dagit directly pointing to the repo.py file. Dagster cloud is running through the docker container. Perhaps this is causing the difference. Do you know if there's a good way to keep the import code consistent between running locally vs the cloud? Or should I write conditional code for the import?

daniel

03/22/2022, 4:44 PM
Hi Nikhaar - what error message are you seeing when it isn't working?
And what does your Workspace tab look like in cloud, how is your code location configured?

Nikhaar Gupta

03/22/2022, 4:56 PM
So this is the folder structure with the actual names (my initial message was just to illustrate):

test1
└── dagster_cloud_template
    ├── __init__.py
    ├── s3_test.py
    ├── spf_util
    │   └── aws_secrets.py
    └── Dockerfile

s3_test is the repository and the function inside aws_secrets.py is called get_secret_op
Here is the error message in the cloud:
the cloud is running through docker container built on amazon ECR
location_name: spf-ecr-test1
image: "ecr image tag"
code_source:
  package_name: dagster_cloud_template
(this is the configuration we used in dagster cloud to deploy)
and here is the error message locally:

daniel

03/22/2022, 5:29 PM
And when you run dagit locally and it works, what command do you use?

Nikhaar Gupta

03/22/2022, 5:57 PM
locally, I just point directly to the script: dagit -f "s3_test.py"
note that in the cloud, I load the repository in the init file (from .s3_test import s3_read_test_repo)

daniel

03/22/2022, 6:13 PM
What if you run
dagit -m dagster_cloud_template
? I think that would more closely map to how you are running it within cloud.
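The difference between the two invocation styles can be sketched like this (assuming dagit is installed and the commands are run from the test1 directory; the paths come from the thread):

```shell
# Loads s3_test.py as a standalone top-level script: it has no parent
# package, so only "from spf_util.aws_secrets import ..." resolves.
dagit -f dagster_cloud_template/s3_test.py

# Imports dagster_cloud_template as a package, mirroring how Cloud loads
# it via code_source.package_name, so ".spf_util.aws_secrets" resolves.
dagit -m dagster_cloud_template
```
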

Nikhaar Gupta

03/22/2022, 6:41 PM
Thanks!! Yes that does work and lets me keep the import consistent between dagster local and dagster cloud. I'm also trying to run the s3_test.py code locally in python using pycharm without launching the dagster UI (this is my preferred way to develop code, before running it in dagster). So right now, the import that works for dagster in the s3_test.py script is "from .spf_util.aws_secrets import get_secret_op". However, when I run the script in pycharm, it breaks:

Traceback (most recent call last):
  File "/opt/project/test1/dagster_cloud_template/s3_test.py", line 9, in <module>
    from .spf_util.aws_secrets import get_secret_op
ImportError: attempted relative import with no known parent package

The other way to import does work locally in pycharm, but not dagster as we discussed ("from spf_util.aws_secrets import get_secret_op"). Do you know if there is a way to keep the import consistent between this and dagster?

daniel

03/22/2022, 6:44 PM
Does putting an empty __init__.py file in spf_util help? So that python realizes it can be loaded as a package?

Nikhaar Gupta

03/22/2022, 6:46 PM
no, I actually have that and it isn't working (forgot to include that in my directory structure in earlier messages)

daniel

03/22/2022, 6:48 PM
I'm not a pycharm expert, but is there a way to tell it to load things as a package rather than as a file? I think the root cause of the trouble here is that some codepaths load it as a file while others load it as a module/package (we prefer the latter in dagster when possible because it's less likely to result in weird import errors like this one).
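The file-vs-module split described above can be reproduced with plain Python, no Dagster involved. A minimal sketch using the names from this thread (the get_secret_op body here is a stub invented for the demo):

```python
import os
import subprocess
import sys
import tempfile

# Recreate the layout from the thread:
# dagster_cloud_template/
#     __init__.py
#     s3_test.py          -> "from .spf_util.aws_secrets import get_secret_op"
#     spf_util/
#         __init__.py
#         aws_secrets.py  -> defines a stub get_secret_op
root = tempfile.mkdtemp()
pkg = os.path.join(root, "dagster_cloud_template")
os.makedirs(os.path.join(pkg, "spf_util"))
for path, body in [
    (os.path.join(pkg, "__init__.py"), ""),
    (os.path.join(pkg, "spf_util", "__init__.py"), ""),
    (os.path.join(pkg, "spf_util", "aws_secrets.py"),
     "def get_secret_op():\n    return 'stub'\n"),
    (os.path.join(pkg, "s3_test.py"),
     "from .spf_util.aws_secrets import get_secret_op\n"),
]:
    with open(path, "w") as f:
        f.write(body)

# Run s3_test.py as a plain file (what `dagit -f s3_test.py` and a
# script-path run configuration do): __package__ is unset, so the
# relative import fails.
as_file = subprocess.run(
    [sys.executable, os.path.join(pkg, "s3_test.py")],
    capture_output=True, text=True,
)
assert "attempted relative import" in as_file.stderr

# Run it as a module inside the package (what `dagit -m
# dagster_cloud_template` does): the parent package exists, so the same
# relative import succeeds.
as_module = subprocess.run(
    [sys.executable, "-m", "dagster_cloud_template.s3_test"],
    capture_output=True, text=True, cwd=root,
)
assert as_module.returncode == 0, as_module.stderr
```

If PyCharm's run configuration supports running a module name instead of a script path (recent versions do), pointing it at dagster_cloud_template.s3_test should line up with the module-style behavior and keep the relative import working everywhere.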

Nikhaar Gupta

03/22/2022, 6:49 PM
gotcha, I'll take a look and try to take it from here. Appreciate your help!