# announcements
d
Hi, I have a strange problem. We are trying to run a pipeline from a Jenkins job (`pipenv run dagster pipeline execute -n ...`). Running the same command locally works fine, but in the Jenkins job all the INFO messages are missing, as is the rest of stdout/stderr from other printouts etc.
After some more investigation, the issue seems to be with running dagster inside a Docker container. We've tried running the same command in the latest Alpine, Amazon Linux, and Python Docker images, all with the same behaviour. Again, running the same command on my Mac works as expected: all log messages and printouts are sent to the console.
m
interesting -- do you see any output at all in the docker logs?
d
Yes, DEBUG messages from dagster itself, but no messages from the solids I've created. I use the `context.log` logger, and I also tried a regular `print`, with the same result.
Messages containing `ENGINE_EVENT` with e.g. `Launching subprocess` and `Starting initialization`
If you have any ideas of what I might try that'd be great. We are in different time zones so you will be asleep when I'm back to work tomorrow.
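One general Python logging pitfall worth ruling out here (an assumption, not something confirmed in this thread): a logger with no level set inherits WARNING from the root logger, so INFO records are dropped silently before they ever reach a handler. A minimal sketch with a hypothetical logger name:

```python
import io
import logging

# Hypothetical logger name; dagster's context.log normally configures
# levels for you, so this only illustrates the default-level pitfall.
logger = logging.getLogger("my_solid")
buf = io.StringIO()
logger.addHandler(logging.StreamHandler(buf))

logger.info("invisible")   # dropped: effective level is WARNING, inherited from root
logger.warning("visible")  # passes the level check and reaches the handler

print(repr(buf.getvalue()))  # -> 'visible\n'
```

Calling `logger.setLevel(logging.INFO)` (or configuring the root logger) makes the INFO record appear.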
m
are you central european time?
@nate is attempting to reproduce this
is it possible that you're using a docker logging driver to redirect docker logs somewhere?
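Another thing that can make stdout seem to vanish in containers (again an assumption, not confirmed here): Python block-buffers stdout when it isn't attached to a TTY, which is the normal situation under Docker log drivers, so prints can lag until the buffer flushes. A quick way to see the non-TTY condition:

```python
import subprocess
import sys

# A child process whose stdout is a pipe (as under most Docker log
# drivers) reports isatty() == False, so Python block-buffers instead
# of line-buffering; `python -u` or PYTHONUNBUFFERED=1 disables this.
child = subprocess.run(
    [sys.executable, "-c", "import sys; print(sys.stdout.isatty())"],
    capture_output=True,
    text=True,
)
print(child.stdout.strip())  # -> False
```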
n
hey @Daniel Olausson - this is the repro case I set up https://gist.github.com/natekupp/ca98b61d843b849a555ffc9a721bbd9d - but the logs look as I expect. Let us know if there’s something substantially different in the way you’re invoking execution
d
I took your example and added our setup bit by bit, and I came to a point where it stopped behaving nicely.
We need to dynamically change the number of inputs to the solid we have defined; that's why we have a solid factory.
m
thanks for this, I think I've found your issue
you have an extra (unnecessary) level of indirection in your solid factory, and we aren't warning about this in the pipeline function
but i think if you use the following instead you'll be fine:
```python
def solid_factory(name):
    @solid(required_resource_keys={"my_mode"})
    def chatty_solid(context):
        context.resources.my_mode.do_the_thing(context)

    return chatty_solid.alias(name)
```
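For what it's worth, the single-level shape of this factory can be sketched in plain Python, no dagster required (the names here are hypothetical): the factory builds one callable, renames it, and returns it directly, with no inner wrapper that has to be called first.

```python
def solid_factory(name):
    # Plain-Python stand-in for the factory above: one level of
    # indirection only -- the factory returns the renamed callable itself.
    def chatty_solid():
        return f"{name} did the thing"

    chatty_solid.__name__ = name  # rough analogue of .alias(name)
    return chatty_solid

a = solid_factory("a")
print(a.__name__, "->", a())  # -> a -> a did the thing
```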
d
Why does it work on my local machine, then, and not in the docker container? In our case it's not really unnecessary, because we want flexibility in the number of inputs. This is more or less exactly what our solid factory really looks like:
```python
def task(name):
    def fn(*inputs):
        input_map = {i.solid_name: i for i in inputs}
        input_defs = [
            InputDefinition(name=input_name, dagster_type=Optional[Bool], default_value=None)
            for input_name in input_map.keys()
        ] if inputs else None

        @solid(name=name,
               input_defs=input_defs or [InputDefinition('start', Nothing)],
               output_defs=[OutputDefinition(name='out0', dagster_type=Bool)]
               )
        def _x_solid(context, **_ins) -> Bool:
            # do stuff
            pass

        return _x_solid.alias(name)() if not inputs else _x_solid.alias(name)(**input_map)

    return fn
```
We do this so that we can do e.g. this:
```python
a = task('a')
b = task('b')
c = task('c')
d = task('d')
# etc..

a_out = a()
b_out = b()
c_out = c(a_out, b_out)
d_out = d(a_out, c_out)
```
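The call-wiring shown here can be mimicked in plain Python (a hypothetical `Node` class, no dagster) to make visible the dependency graph the calls build up:

```python
class Node:
    """Stand-in for a solid invocation result; records its upstream deps."""
    def __init__(self, solid_name, deps=()):
        self.solid_name = solid_name
        self.deps = list(deps)

    def __call__(self, *inputs):
        # Calling a node wires its inputs in as dependencies, mirroring
        # how calling a solid inside a pipeline records its upstreams.
        return Node(self.solid_name, (i.solid_name for i in inputs))

def task(name):
    return Node(name)

a, b, c, d = task('a'), task('b'), task('c'), task('d')
a_out = a()
b_out = b()
c_out = c(a_out, b_out)
d_out = d(a_out, c_out)
print(c_out.deps, d_out.deps)  # -> ['a', 'b'] ['a', 'c']
```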
m
@Daniel Olausson is the same code running locally but not in docker, or is the code different?
it might be easier to keep track of some of this stuff by using the `SolidDefinition` API directly rather than the `@solid` decorator
d
It's the exact same code.
m
are the versions of dagster running inside the container and outside the same?
this runs for me