n
(shared an error snippet titled "Untitled")
d
Hi Nick - are you adding a secret that includes the AWS_DEFAULT_REGION env var? From the error message it looks like that's what ECS is unhappy with
n
Yes, upon further inspection, I had that variable duplicated, both in the task definition and in AWS with the dagster tag. I am able to launch my tasks with the launcher now, but I need to execute a command with the task that is launched, like how you can configure a command on the task definition itself. Do you know where this can be configured?
d
We don’t currently support customizing the command that’s launched - the expectation is that you’re using it to launch a particular dagster run, so we set the command to do that. You could call out to the ECS API inside one of your ops to run a second ECS task, though.
Unless you mean there’s another command you want to run before the job starts?
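(For reference, a minimal sketch of that "second ECS task" call, shown here in AWS CLI form; inside a Dagster op the equivalent call would typically be made with boto3's ecs.run_task. The cluster, task definition, and subnet values below are placeholders.)
# Launch another ECS task from within a run - a rough sketch, not the
# dagster-supplied command. Replace the cluster / task-definition / subnet
# values with your own.
aws ecs run-task \
  --cluster my-cluster \
  --task-definition my-other-task \
  --launch-type FARGATE \
  --network-configuration 'awsvpcConfiguration={subnets=[subnet-0123456789abcdef0],assignPublicIp=ENABLED}'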
n
Yes, there would be another command I want to run before the job starts. Is that possible?
Specifically I'd want to execute a shell script on the new task before my job is executed.
d
Here's a discussion that goes into this https://github.com/dagster-io/dagster/discussions/12687
so you could set an ENTRYPOINT like the one in that example in your Dockerfile
n
The entrypoint is working as expected in that it is allowing me to run my script, but now I'm seeing a strange behavior where the task exits with code 0 after the script is done executing, leaving the dagster run hanging open. The code for my pipeline never ends up executing. Any suggestions?
d
Can you share the ENTRYPOINT that you ended up using - was it the one from the linked discussion?
n
Yes. I used the same ENTRYPOINT command referenced in your linked discussion, but executing my own .sh script. I noticed that there is another command in your example:
exec "$@"
Is this required in the Dockerfile to allow the run to execute?
d
Yeah, that's required - that's what keeps it running the CMD that dagster supplies
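(As a rough sketch, an entrypoint script following the pattern from that discussion could look like the below; the setup script name is a placeholder, and the Dockerfile would point its ENTRYPOINT at this file.)
#!/bin/sh
# Referenced from the Dockerfile, e.g.: ENTRYPOINT ["/entrypoint.sh"]
# Run your own setup before the Dagster-supplied command.
./my_setup.sh   # placeholder for the shell script you want to run first
# Hand off to whatever command was passed to the container (the CMD that
# dagster supplies), so the run / gRPC server actually starts.
exec "$@"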
n
Gotcha. I have one last wrinkle that I can hopefully get some help with. I want to pass the "dagster api grpc" command via the command option in my ECS task definition. But now, after adding that exec "$@" to the entrypoint .sh file, the command will not properly execute when the task stands up:
./code_repository.sh: line 37: exec: dagster api grpc --package-name rdbms_code_repository.repositories -h 0.0.0.0 -p 4322: not found
d
Is it possible to share the task definition JSON?
Both of a run that’s working and of a server that isn’t? Maybe there’s a clue in the difference between the two - both should work fine.
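(One thing that commonly produces an error of the form "exec: dagster api grpc ...: not found" is the task definition's command being supplied as a single string element rather than as separate arguments - exec "$@" then looks for a program whose name is the entire string. A hedged sketch of the difference, worth checking against the task definition JSON:)
# In the container definition, a command supplied as one string element:
#   "command": ["dagster api grpc --package-name rdbms_code_repository.repositories -h 0.0.0.0 -p 4322"]
# reaches the entrypoint as a single argument, so
exec "$@"
# tries to run a program literally named "dagster api grpc ...".
# Supplying it as separate elements instead, e.g.
#   "command": ["dagster", "api", "grpc",
#               "--package-name", "rdbms_code_repository.repositories",
#               "-h", "0.0.0.0", "-p", "4322"]
# lets exec "$@" start the gRPC server as intended.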