c
I'm having some trouble with backfilling my partitioned pipeline and was wondering if someone could help me debug this. My partitions are defined as follows:
```python
import dagster
import datetime


def run_config_for_date_partition(partition):
    date = partition.value
    return {"solids": {"query_telemetry_events": {"config": {"date": date}}}}


test_sessions_partition = dagster.PartitionSetDefinition(
    name="test_sessions_partition",
    pipeline_name="test_sessions",
    partition_fn=dagster.utils.partitions.date_partition_range(
        start=datetime.datetime(2021, 2, 16),
        delta_range="hours",
        inclusive=True,
        fmt="%Y-%m-%d-%H",
    ),
    run_config_fn_for_partition=run_config_for_date_partition,
)
```
When I try to run a backfill from the Dagit UI (`v0.9.21`) I get the following error:
```
2021-02-16T22:25:40.046372Z [error    ] Exception calling application: Object of type Pendulum is not JSON serializable [grpc._server]
Traceback (most recent call last):
  File "/Users/caleb/Library/Caches/pypoetry/virtualenvs/panopticon-M-mcCUTC-py3.8/lib/python3.8/site-packages/grpc/_server.py", line 435, in _call_behavior
    response_or_iterator = behavior(argument, context)
  File "/Users/caleb/Library/Caches/pypoetry/virtualenvs/panopticon-M-mcCUTC-py3.8/lib/python3.8/site-packages/dagster/grpc/server.py", line 385, in ExternalPartitionSetExecutionParams
    serialized_external_partition_set_execution_param_data_or_external_partition_execution_error=serialize_dagster_namedtuple(
  File "/Users/caleb/Library/Caches/pypoetry/virtualenvs/panopticon-M-mcCUTC-py3.8/lib/python3.8/site-packages/dagster/serdes/__init__.py", line 227, in serialize_dagster_namedtuple
    return _serialize_dagster_namedtuple(
  File "/Users/caleb/Library/Caches/pypoetry/virtualenvs/panopticon-M-mcCUTC-py3.8/lib/python3.8/site-packages/dagster/serdes/__init__.py", line 213, in _serialize_dagster_namedtuple
    return seven.json.dumps(_pack_value(nt, whitelist_map), **json_kwargs)
  File "/Library/Developer/CommandLineTools/Library/Frameworks/Python3.framework/Versions/3.8/lib/python3.8/json/__init__.py", line 234, in dumps
    return cls(
  File "/Library/Developer/CommandLineTools/Library/Frameworks/Python3.framework/Versions/3.8/lib/python3.8/json/encoder.py", line 199, in encode
    chunks = self.iterencode(o, _one_shot=True)
  File "/Library/Developer/CommandLineTools/Library/Frameworks/Python3.framework/Versions/3.8/lib/python3.8/json/encoder.py", line 257, in iterencode
    return _iterencode(o, 0)
  File "/Library/Developer/CommandLineTools/Library/Frameworks/Python3.framework/Versions/3.8/lib/python3.8/json/encoder.py", line 179, in default
    raise TypeError(f'Object of type {o.__class__.__name__} '
TypeError: Object of type Pendulum is not JSON serializable
```
Not sure how to go about fixing this b/c I could only get partitions working by using `date_partition_range`, but it seems to be returning the wrong data type. Any help is greatly appreciated!
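The root cause visible in the traceback: `date_partition_range` hands back partition values as Pendulum datetime objects, and Dagster's serdes layer ultimately calls the standard-library JSON encoder, which rejects anything that isn't a JSON primitive. A minimal illustration of the failure mode, with a plain `datetime` standing in for Pendulum:
```python
# Illustration only: json.dumps cannot serialize datetime-like objects,
# which is what ends up inside the run config here.
import datetime
import json

run_config = {
    "solids": {
        "query_telemetry_events": {"config": {"date": datetime.datetime(2021, 2, 16)}}
    }
}

try:
    json.dumps(run_config)
except TypeError as exc:
    print(exc)  # Object of type datetime is not JSON serializable
```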
a
my guess is that in `run_config_for_date_partition` you need to cast `date` to `str`
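A sketch of what that cast could look like; the commented `strftime` variant assumes the `%Y-%m-%d-%H` format from `fmt`, which the thread doesn't confirm is what the solid expects:
```python
def run_config_for_date_partition(partition):
    # partition.value is a Pendulum/datetime object, so cast it to a plain
    # string to keep the run config JSON serializable.
    date = str(partition.value)
    # Alternative (assumption): format it to match the partition set's fmt.
    # date = partition.value.strftime("%Y-%m-%d-%H")
    return {"solids": {"query_telemetry_events": {"config": {"date": date}}}}
```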
c
Nice! That did the trick 😄 Still having trouble launching the backfill though. When I launch the run nothing happens. Is it absolutely required that I use something other than the default run launcher? I'm just running dagit locally along the lines of `dagit -h 0.0.0.0 -w ./workspace.yaml`
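For context, the workspace file referenced in that command just points Dagit at the repository code; a minimal sketch (the target file name below is a placeholder, not taken from the thread):
```yaml
# workspace.yaml -- hypothetical minimal example; repo.py is a placeholder
load_from:
  - python_file: repo.py
```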
a
nothing happening is unexpected - do you see any error output anywhere?
c
Nope. No errors in my terminal and nothing in the runs tab
a
js console?
c
Nope
I added print statements to `run_config_for_date_partition` and I'm seeing those in the terminal, but that is it
p
@Caleb Schoepp do you have time to hop on a call to debug? We need to do a better job of surfacing errors here…
c
@prha sure I'm free
p
Tracking the error handling here: https://github.com/dagster-io/dagster/issues/3695