# ask-community
Hi there, I need some help. I have a Dagster (v1.2.4) job that runs a Meltano command to load data from an S3 bucket into a PostgreSQL database. The file size is 50 MB. When I ran the job, it failed with the error below. I would like to know how to solve this issue. Thanks in advance. Here is my run_launcher config:
```yaml
run_launcher:
  module: dagster_aws.ecs
  class: EcsRunLauncher
  config:
    include_sidecars: true
    secrets_tag: "" 
    run_resources:
      cpu: "256"
      memory: "512" # In MiB
    ephemeral_storage: "128" # In GiB
```
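For reference, 256 CPU units / 512 MiB is the Fargate minimum task size, so my first guess is that the run task simply needs more headroom for Meltano plus the multiprocess executor. Here is a sketch of the change I'm considering (the cpu/memory values are my own untested guesses; they just need to be a valid Fargate pairing):
```yaml
run_launcher:
  module: dagster_aws.ecs
  class: EcsRunLauncher
  config:
    include_sidecars: true
    run_resources:
      cpu: "1024"    # 1 vCPU, in CPU units
      memory: "4096" # In MiB; 1024 CPU pairs with 2048-8192 MiB on Fargate
```
I also gather from the dagster-aws docs that individual runs can override these defaults with `ecs/cpu` and `ecs/memory` tags, if I've read them correctly.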
The error message:
```
Multiprocess executor: child process for step load_update_data was terminated by signal 9 (SIGKILL). This usually indicates that the process was killed by the operating system due to running out of memory. Possible solutions include increasing the amount of memory available to the run, reducing the amount of memory used by the ops in the run, or configuring the executor to run fewer ops concurrently.
dagster._core.executor.child_process_executor.ChildProcessCrashException

Stack Trace:
  File "/usr/local/lib/python3.9/site-packages/dagster/_core/executor/multiprocess.py", line 240, in execute
    event_or_none = next(step_iter)
,  File "/usr/local/lib/python3.9/site-packages/dagster/_core/executor/multiprocess.py", line 357, in execute_step_out_of_process
    for ret in execute_child_process_command(multiproc_ctx, command):
,  File "/usr/local/lib/python3.9/site-packages/dagster/_core/executor/child_process_executor.py", line 174, in execute_child_process_command
    raise ChildProcessCrashException(exit_code=process.exitcode)
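The error also suggests running fewer ops concurrently. If I understand the docs correctly, that would be run config for the default multiprocess executor (supplied in the Launchpad or as run_config, not in dagster.yaml), along these lines:
```yaml
# Run config sketch, per the error's "fewer ops concurrently" suggestion.
# max_concurrent: 1 keeps only one op child process alive at a time,
# so peak memory is one step's usage rather than the sum of several.
execution:
  config:
    multiprocess:
      max_concurrent: 1
```
Would raising the task memory, capping concurrency, or both be the right fix here?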