is it possible to have the compiled dbt sql appear...
# ask-ai
d
is it possible to have the compiled dbt sql appear in the asset materialization logs?
s
Yes, it is possible to have the compiled dbt SQL appear in the asset materialization logs. According to the documentation, you can achieve this by creating a custom `DagsterDbtTranslator` with a `get_description` method that takes the `compiled_code` from the dbt manifest and sets it as the description for your dbt assets. This requires that you create your dbt manifest using `dbt compile`, so that the compiled code is present in the manifest. Here are the steps you would follow based on the documentation:
1. Create your dbt manifest using `dbt compile`, so that the compiled code is available in the manifest.
2. Define a custom `DagsterDbtTranslator` with a `get_description` method that extracts the `compiled_code` from the dbt manifest.
3. Set the `compiled_code` as the description for your dbt assets in Dagster.
This approach allows you to customize the descriptions of your dbt assets to include the compiled SQL, which will then be visible in the asset materialization logs within the Dagster UI.
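The description logic in step 3 can be sketched as a small helper (a sketch only; `description_from_manifest_node` is a hypothetical name, and the exact `DagsterDbtTranslator.get_description` signature may differ across dagster-dbt versions):

```python
def description_from_manifest_node(dbt_resource_props: dict) -> str:
    """Build an asset description from a dbt node's manifest entry.

    `compiled_code` is only populated when the manifest was produced by
    `dbt compile`; fall back to the raw model SQL otherwise.
    """
    compiled = dbt_resource_props.get("compiled_code")
    return compiled or dbt_resource_props.get("raw_code", "")


# In a custom translator, this helper would back the description hook,
# roughly (hypothetical subclass -- check your dagster-dbt version):
#
# class CompiledCodeTranslator(DagsterDbtTranslator):
#     def get_description(self, dbt_resource_props):
#         return description_from_manifest_node(dbt_resource_props)
```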
d
@Scout how would i do that if the vars argument in dbt compile command is different from my desired dbt run command?
s
If the `vars` argument you want to use in the `dbt compile` command is different from the one you want to use in the `dbt run` command, you can configure the `dbt_cli_resource` in Dagster to pass different arguments to each command. The `dbt_cli_resource` allows you to configure dbt CLI commands with specific flags and arguments that are passed to the dbt executable. Here's how you can approach this situation:
1. Configure the `dbt_cli_resource` with the specific `vars` argument for the `dbt compile` command. This ensures that when you compile your dbt project, the correct variables are used.
2. When defining your dbt run operation in Dagster, configure the `dbt run` command separately with its own set of `vars` arguments.
3. Use the `configured` method to create different configurations of the `dbt_cli_resource` for the compile and run commands, each with the appropriate `vars` arguments.
4. In your Dagster job or pipeline, use these configured resources for the respective dbt operations.
Here's a conceptual example of how you might configure the `dbt_cli_resource` for different `vars` arguments:
```python
from dagster import job
from dagster_dbt import dbt_cli_resource

# Configure dbt_cli_resource for dbt compile with specific vars
dbt_compile_resource = dbt_cli_resource.configured({
    "vars": {"var_for_compile": "value1"},
    # ... other configurations specific to dbt compile
})

# Configure dbt_cli_resource for dbt run with different vars
dbt_run_resource = dbt_cli_resource.configured({
    "vars": {"var_for_run": "value2"},
    # ... other configurations specific to dbt run
})

# Define your Dagster job with the configured resources
@job(resource_defs={
    "dbt_compile": dbt_compile_resource,
    "dbt_run": dbt_run_resource,
})
def my_dbt_job():
    # Use the resources in your ops or assets
    ...
```
In this example, `dbt_compile_resource` and `dbt_run_resource` are configured with different `vars` arguments, which you can then use in your job definition to execute the `dbt compile` and `dbt run` commands with the desired variables. Please note that the actual implementation may vary based on your specific use case and the versions of Dagster and dbt you are using. Always refer to the latest documentation for the most accurate and up-to-date guidance.
d
@Scout can you update the above example using the `@dbt_assets` decorator?
s
As of my knowledge cutoff date in early 2023, the `@dbt_assets` decorator in Dagster is used to define a set of assets that correspond to dbt models. The decorator allows you to specify a dbt manifest and other settings that control how dbt models are translated into Dagster assets. However, the `@dbt_assets` decorator does not directly accept `vars` as an argument, since it is designed to work with the dbt manifest rather than individual dbt commands. To pass different `vars` to `dbt compile` and `dbt run` when using the `@dbt_assets` decorator, you would typically handle this within the function that the decorator is applied to. Inside this function, you can invoke dbt CLI commands using the `DbtCliResource` provided by Dagster, passing different `vars` to each command as needed. Here's a conceptual example of how you might use the `@dbt_assets` decorator and pass different `vars` to `dbt compile` and `dbt run`:
```python
import json

from dagster import AssetExecutionContext
from dagster_dbt import DbtCliResource, dbt_assets

# Path to your dbt project's manifest.json
dbt_manifest_path = "path/to/manifest.json"


@dbt_assets(manifest=dbt_manifest_path)
def my_dbt_assets(context: AssetExecutionContext, dbt: DbtCliResource):
    # Compile the dbt project with one set of vars
    dbt.cli(
        ["compile", "--vars", json.dumps({"var_for_compile": "value1"})],
        context=context,
    ).wait()

    # Run the dbt project with a different set of vars, streaming the
    # events so the dbt models materialize as Dagster assets
    yield from dbt.cli(
        ["run", "--vars", json.dumps({"var_for_run": "value2"})],
        context=context,
    ).stream()
```
In this example, the compile and run invocations are issued with different `vars` arguments within the function decorated by `@dbt_assets`. The `yield from` statement streams the results of the `dbt run` command, which materializes the dbt models as Dagster assets. Please note that the actual implementation details may vary, and you should consult the latest Dagster and dbt documentation for the most accurate and up-to-date information. Additionally, the capabilities and APIs of Dagster and its dbt integration may have evolved since my knowledge cutoff date, so I recommend checking the latest resources for any new developments.
t
Is the crux of your question that you want an individual run’s exact SQL code shown or that you want an explanation of the SQL code shown in your asset’s definition?
If you want to see your actual run's code, you might want to grab the `manifest.json` or `run_results.json` after running and log the executed code with `context.log.info`. The schema for this can be found on dbt's site.
d
Hey Tim thanks for responding. I'd like to see the runs' actual code.
t
So yeah, a way to show that is to fetch one of your run artifacts after a run, loop through the models and log each executed SQL
d
that works for me. thank you very much for your response
t
ack that the anchor linking doesn’t work very well, so wanted to explicitly share the code to fetch an artifact after a run:
```python
from pathlib import Path

from dagster import AssetExecutionContext
from dagster_dbt import DbtCliResource, dbt_assets


@dbt_assets(manifest=Path("target", "manifest.json"))
def my_dbt_assets(context: AssetExecutionContext, dbt: DbtCliResource):
    dbt_build_invocation = dbt.cli(["build"], context=context)

    yield from dbt_build_invocation.stream()

    run_results_json = dbt_build_invocation.get_artifact("run_results.json")
```
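The "loop through the models and log each executed SQL" step can be sketched as a pure helper over the fetched `run_results.json` dict (a sketch; `executed_sql_by_model` is a hypothetical name, and it assumes a run_results schema version whose per-result entries include a `compiled_code` field):

```python
def executed_sql_by_model(run_results: dict) -> dict:
    """Map each dbt node's unique_id to the SQL recorded for it in
    run_results.json, skipping results without compiled code."""
    return {
        result["unique_id"]: result["compiled_code"]
        for result in run_results.get("results", [])
        if result.get("compiled_code")
    }
```

Inside `my_dbt_assets` you could then iterate over `executed_sql_by_model(run_results_json).items()` and call `context.log.info` for each model's SQL.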
d
perfect, thanks!
How would I go about changing this so that the run_results will still be fetched even if the dbt run fails? I noticed that this works well for successful dbt runs, but the information is lost on failed runs.
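One way to keep the artifact retrievable on failed runs (a sketch; `stream_with_artifact` is a hypothetical helper, and depending on your dagster-dbt version a `raise_on_error` argument on `dbt.cli` may offer a simpler alternative) is to drain the event stream inside `try`/`finally`, so the artifact is fetched and logged even when streaming raises:

```python
def stream_with_artifact(events, fetch_artifact, log):
    """Yield dbt events, then fetch and log the run artifact in a
    `finally` block so it still happens when streaming raises."""
    try:
        yield from events
    finally:
        try:
            log(fetch_artifact())
        except FileNotFoundError:
            # The run died before run_results.json was written at all.
            pass
```

In the snippet above, `yield from dbt_build_invocation.stream()` would become `yield from stream_with_artifact(dbt_build_invocation.stream(), lambda: dbt_build_invocation.get_artifact("run_results.json"), context.log.info)`.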