Hi all,
I am working on a project where I create three different containers in Docker:
1. Dagit UI
2. my code container (where all my pipelines are listed in repository.py)
3. the Dagster daemon
My code container has all the pipelines listed in the repository, and I execute them using the GraphQL client's submit_pipeline_execution function. I also have a separate dagster.yaml file with the settings to store event logs in a Postgres database. Before moving to the GraphQL client we used to call execute_pipeline directly, and inside the solids we have customised logger functions that record all the activity happening in the solids.
Since integrating the GraphQL client with the Dagit UI, my customised logger functions are not working; it seems execution never even reaches those functions. context.log.info works fine, as I can see the event logs with the run ID captured in the Postgres database. Is there any way I can have my event logs and all my logging info statements written to a single log file?
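For context, this is roughly how I submit a run from the client side (host, port, and the pipeline/repository names below are placeholders, not my real values):

    from dagster_graphql import DagsterGraphQLClient, DagsterGraphQLClientError

    # Hypothetical example: connect to the Dagit container and submit a run
    client = DagsterGraphQLClient("dagit", port_number=3000)
    try:
        new_run_id = client.submit_pipeline_execution(
            pipeline_name="my_pipeline",              # placeholder pipeline name
            repository_location_name="my_location",   # placeholder location name
            repository_name="my_repository",          # placeholder repository name
            run_config={},
            mode="default",
        )
        print(f"Launched run {new_run_id}")
    except DagsterGraphQLClientError as exc:
        print(f"Submission failed: {exc}")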
dagster.yaml:
run_storage:
  module: dagster_postgres.run_storage
  class: PostgresRunStorage
  config:
    postgres_db:
      hostname:
        env: DAGSTER_POSTGRES_HOST
      username:
        env: DAGSTER_POSTGRES_USER
      password:
        env: DAGSTER_POSTGRES_PASSWORD
      db_name:
        env: DAGSTER_POSTGRES_DB
      port: 5432

event_log_storage:
  module: dagster_postgres.event_log
  class: PostgresEventLogStorage
  config:
    postgres_db:
      hostname:
        env: DAGSTER_POSTGRES_HOST
      username:
        env: DAGSTER_POSTGRES_USER
      password:
        env: DAGSTER_POSTGRES_PASSWORD
      db_name:
        env: DAGSTER_POSTGRES_DB
      port: 5432
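The DAGSTER_POSTGRES_* values above are injected through the container environment; a hypothetical docker-compose fragment for the code container would look something like this (service name and credentials are placeholders):

    services:
      code_container:
        environment:
          DAGSTER_POSTGRES_HOST: "postgres"
          DAGSTER_POSTGRES_USER: "postgres_user"
          DAGSTER_POSTGRES_PASSWORD: "postgres_password"
          DAGSTER_POSTGRES_DB: "postgres_db"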
My customised log function, which I call inside the solids like this:

log_info_statement(p_in_log_file=l_log_file, p_in_statement='Exiting', p_in_function=l_function)

The function itself:
import logging
from datetime import datetime

logger = logging.getLogger(__name__)
logger.setLevel(logging.INFO)  # make sure INFO records reach the file handler

def log_info_statement(p_in_log_file, p_in_statement, p_in_function):
    # Attach a file handler to the module-level logger on first use
    if not logger.handlers:
        handler = logging.FileHandler(p_in_log_file)
        formatter = logging.Formatter('%(asctime)s %(levelname)s %(message)s')
        handler.setFormatter(formatter)
        logger.addHandler(handler)
    logger.info(f'{str(datetime.now())} - {p_in_function}')
    logger.info(f'\t{p_in_statement}\n')
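For reference, this is roughly how the helper sits next to context.log.info inside a solid (the solid name and log file path are placeholders): context.log.info ends up in the Postgres event log, while log_info_statement is supposed to write to the local file.

    from dagster import solid

    @solid
    def extract_data(context):
        l_log_file = '/opt/dagster/logs/datamax_etl.log'  # placeholder path
        l_function = 'extract_data'
        context.log.info('Starting extract_data')  # captured as an event log in Postgres
        # ... solid body ...
        log_info_statement(p_in_log_file=l_log_file, p_in_statement='Exiting', p_in_function=l_function)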
Dockerfile of my code container:
CMD dagster api grpc -h 0.0.0.0 -p 3001 --working-directory $(pwd) -f ./orchestration_manager/repositories/datamax_etl_repository.py
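The Dagit container points at this gRPC server through a workspace.yaml along these lines (the host is the docker-compose service name of my code container; names are placeholders):

    load_from:
      - grpc_server:
          host: code_container
          port: 3001
          location_name: "datamax_etl_repository"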