# ask-ai
j
Can you give me an example of adding a custom logger to the Dagster `Definitions`?
s
Certainly! The documentation provides an example of how to define a custom logger using the `@logger` decorator and then how to add it to a job's definitions. Here's a step-by-step explanation based on the example from the documentation:

1. **Define the Custom Logger**: Start by defining a function that will create your custom logger. This function takes a single argument, `init_context`, which is available during logger initialization, and should return an instance of `logging.Logger`.
2. **Use the `@logger` Decorator**: Decorate the function with `@logger`, which lets you specify a configuration schema for your logger. This schema can include fields such as `log_level` and `name`, which users can configure when they run a job.
3. **Return the Logger Instance**: Inside the function, configure and instantiate your logger, set up any handlers or formatters you need, and return the logger instance.
4. **Add the Logger to Job Definitions**: Once you have defined your custom logger, add it to a job by passing a dictionary to the `logger_defs` argument of the `@job` decorator. The keys in this dictionary are the names you want to use for your loggers within the job, and the values are the logger functions you defined.

Here is an example from the documentation:
```python
import logging
import json
from dagster import logger, Field, job, op

@logger(
    {
        "log_level": Field(str, is_required=False, default_value="INFO"),
        "name": Field(str, is_required=False, default_value="dagster"),
    },
    description="A JSON-formatted console logger",
)
def json_console_logger(init_context):
    level = init_context.logger_config["log_level"]
    name = init_context.logger_config["name"]

    klass = logging.getLoggerClass()
    logger_ = klass(name, level=level)

    handler = logging.StreamHandler()

    class JsonFormatter(logging.Formatter):
        def format(self, record):
            return json.dumps(record.__dict__)

    handler.setFormatter(JsonFormatter())
    logger_.addHandler(handler)

    return logger_

@op
def hello_logs(context):
    context.log.info("Hello, world!")

@job(logger_defs={"my_json_logger": json_console_logger})
def demo_job():
    hello_logs()

# Now you can execute the job, and it will use your custom logger
demo_job.execute_in_process()
```
In this example, `json_console_logger` is a custom logger that outputs logs in JSON format to the console. It is added to `demo_job` under the name `my_json_logger`. When `demo_job` is executed, it uses this custom logger to log the message "Hello, world!" from the `hello_logs` op. Please note that the actual code you use may vary depending on your specific requirements for logging; this example is meant to illustrate the process as described in the documentation.
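The JSON formatting itself is plain standard-library `logging` and can be tried without Dagster at all. Here's a minimal standalone sketch of what the `JsonFormatter` above emits (the logger name `json_demo` and the `StringIO` capture are just illustrative, not part of the Dagster example):

```python
import io
import json
import logging

# Same idea as the JsonFormatter in the Dagster example:
# serialize the LogRecord's attributes as a JSON object.
class JsonFormatter(logging.Formatter):
    def format(self, record):
        return json.dumps(record.__dict__, default=str)

# Capture output in a StringIO so we can inspect it.
stream = io.StringIO()
handler = logging.StreamHandler(stream)
handler.setFormatter(JsonFormatter())

log = logging.getLogger("json_demo")
log.setLevel(logging.INFO)
log.addHandler(handler)
log.propagate = False  # don't also emit through the root logger

log.info("Hello, world!")

# Each log line is a JSON object with the LogRecord's fields.
payload = json.loads(stream.getvalue())
print(payload["levelname"], payload["msg"])  # → INFO Hello, world!
```

This is just to show the shape of the JSON output; in Dagster, the logger instance is built inside the `@logger`-decorated function instead.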
j
@Scout Ok but I want to add the logger to Definitions - so it runs for all assets
s
To add a custom logger to `Definitions` so that it applies to all assets within a code location, you can follow these steps:

1. **Define the Custom Logger**: As in the previous example, define a custom logger using the `@logger` decorator.
2. **Create a `Definitions` Object**: Instantiate a `Definitions` object and pass your custom logger in the `loggers` argument. This applies the logger to all jobs and assets within the code location.

Here's an example based on the documentation:
```python
from dagster import logger, Field, Definitions, define_asset_job, asset

# Define your custom logger
@logger(config_schema={"log_level": Field(str, is_required=False, default_value="INFO")})
def json_console_logger(init_context):
    # Configure your logger here, similar to the previous example
    ...

# Define an asset
@asset
def some_asset():
    # Asset computation logic here
    ...

# Define a job for your assets
the_job = define_asset_job("the_job", selection="*")

# Create a Definitions object with your custom logger
defs = Definitions(
    jobs=[the_job],
    assets=[some_asset],
    loggers={"json_logger": json_console_logger},
)

# When you load this code location, the custom logger will be applied to all assets
```
In this example, `json_console_logger` is the custom logger that you want to apply to all assets. The `Definitions` object is created with the `jobs` and `assets` that you want to include in the code location, and the `loggers` argument is used to specify the custom logger. When you load this code location, Dagster will use the `json_console_logger` for all the assets and jobs defined in the `defs` object. If loggers are explicitly specified at the job level, they will override the loggers provided to the `Definitions` object. Please ensure that you are using a version of Dagster that supports this feature, as APIs may change over time.
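Because the logger's config schema declares `log_level`, it can also be adjusted per run. As a sketch, assuming the `json_logger` key from the example above, the run config you might supply (for example in the launchpad) could look like:

```yaml
loggers:
  json_logger:
    config:
      log_level: DEBUG
```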
j
@Scout This is how I have set things up but my custom logger is never called!
s
If your custom logger is not being called, there are a few common issues you might want to check:

1. **Logger Configuration**: Ensure that your logger is correctly configured in your `Definitions` object. Double-check the name you've given to your logger and make sure it matches the name used in the job or asset definitions.
2. **Logger Registration**: Verify that the `Definitions` object with your custom logger is correctly registered in your code location. If you're using a `workspace.yaml` file, ensure that it points to the module where your `Definitions` object is defined.
3. **Logger Usage**: Make sure that your job or asset is actually producing log messages. You can test this by temporarily using a built-in logger and seeing whether any log messages are output.
4. **Dagster Configuration**: Check your `dagster.yaml` configuration file to ensure that there are no conflicting logger settings that might be overriding your custom logger.
5. **Code Location**: Confirm that the code location where your `Definitions` object is defined is the one being loaded by Dagster. If you have multiple code locations or repositories, it's possible that the wrong one is being used.
6. **Dagster Version**: Ensure that you are using a version of Dagster that supports the features you are trying to use. If necessary, consult the documentation for your version to confirm that your approach is correct.
7. **Error Handling**: Check for any errors in the initialization of your logger. If there are issues in the `@logger`-decorated function, they might prevent the logger from being properly instantiated.
8. **Execution Context**: If you are running your job or asset through Dagit or some other execution environment, ensure that the environment is correctly configured to use your custom logger.

If you've checked all of these and the logger is still not being called, you may want to add debug statements or breakpoints in your logger definition to confirm that the code is being reached and to inspect the `init_context` for clues. If the issue persists, consult the documentation for any additional configuration steps, or reach out to the Dagster community for support, as there might be other less common issues at play.