# ask-community
f
For some of our runs we are getting the following error. Any ideas what would cause this?
OverflowError: timestamp too large to convert to C _PyTime_t
  File "/dagster-cloud/dagster_cloud/agent/dagster_cloud_agent.py", line 828, in _process_api_request
    api_result = self._handle_api_request(
  File "/dagster-cloud/dagster_cloud/agent/dagster_cloud_agent.py", line 718, in _handle_api_request
    launcher.launch_run(LaunchRunContext(pipeline_run=run, workspace=None))
  File "/dagster-aws/dagster_aws/ecs/launcher.py", line 311, in launch_run
    run_task_kwargs = self._run_task_kwargs(run, image, container_context)
  File "/dagster-aws/dagster_aws/ecs/launcher.py", line 501, in _run_task_kwargs
    backoff(
  File "/dagster/dagster/_utils/backoff.py", line 59, in backoff
    time.sleep(next(delay_generator))
We are running a Hybrid deployment on AWS (ECS).
r
Looks like the exponential back-off delay has grown so large that it has overflowed Python's internal time representation.
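To illustrate the failure mode, here is a minimal sketch (not Dagster's actual back-off code; the generator name and parameters are made up): an uncapped exponential delay eventually exceeds what CPython can convert to its internal signed 64-bit nanosecond timestamp (roughly 292 years), and time.sleep raises exactly this OverflowError.

import itertools
import time

def uncapped_delays(base=0.1):
    # Hypothetical back-off generator: the delay doubles on every
    # attempt and is never bounded, so it grows without limit.
    for attempt in itertools.count():
        yield base * (2 ** attempt)

delays = uncapped_delays()
for _ in range(40):          # simulate ~40 failed attempts
    delay = next(delays)

# delay is now about 0.1 * 2**39 seconds (~1,700 years), which cannot be
# converted to CPython's 64-bit nanosecond timestamp, so this raises:
#   OverflowError: timestamp too large to convert to C _PyTime_t
time.sleep(delay)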
d
Hi Frederik - I believe this is happening due to a bug that was fixed in a recent Dagster version, so upgrading your agent should resolve it if that's an option.
I would also expect restarting the agent task to help as a temporary mitigation
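For reference, the general way to avoid this class of error is to cap the back-off delay. A minimal sketch of a bounded delay generator (not necessarily how the fix was implemented in Dagster; the names here are hypothetical):

import itertools
import time

def capped_delays(base=0.1, max_delay=60.0):
    # Bounded back-off: the delay doubles per attempt but is clamped to
    # max_delay, so time.sleep always receives a representable value.
    for attempt in itertools.count():
        yield min(base * (2 ** attempt), max_delay)

for delay in itertools.islice(capped_delays(), 5):
    time.sleep(delay)   # sleeps 0.1, 0.2, 0.4, 0.8, 1.6 seconds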
f
Thank you very much! We will try and do that 🙂