# ask-community
m
Hi all, I've got a conceptual question about Dagster resources. In an implementation I'm doing, I'm using a resource to connect to Snowflake with the
snowflake-connector-python
package. I'm not currently using the dagster-snowflake integration because I'd like to be able to use web-based browser authentication for this resource on a local machine. Setting up the resource is no trouble, but I'm noticing that if I have multiple ops consuming this resource, the browser auth is triggered for every distinct op that uses it. Reading through the documentation, it conceptually makes sense that this would happen: the code in a resource definition runs for every op. But is there any way to instantiate a connection object/cursor once at the beginning of a job within a resource, so that the resource doesn't recreate the object each time an op uses it?
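Within a single process, the reuse Mark is describing can be sketched as a lazily initialized holder that the resource hands to every op, so the connection factory (and any browser prompt it triggers) fires at most once per process. This is only a sketch with placeholder names, not Dagster's actual resource machinery, and it doesn't help once ops run in separate processes:

```python
# A minimal sketch of per-process connection reuse, with a stand-in
# factory instead of snowflake.connector.connect (names are hypothetical).
class LazyConnection:
    """Creates the underlying connection on first use, then reuses it."""

    def __init__(self, factory):
        self._factory = factory   # e.g. lambda: snowflake.connector.connect(...)
        self._conn = None
        self.init_count = 0       # how many times the factory actually ran

    def get(self):
        if self._conn is None:
            self._conn = self._factory()
            self.init_count += 1
        return self._conn


# Every op that asks for the connection gets the same object back, so
# the factory runs at most once per process.
holder = LazyConnection(lambda: object())
first = holder.get()
second = holder.get()
print(first is second, holder.init_count)  # True 1
```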
s
Hey Mark - the root of the issue is that ops typically execute in different processes, often on different machines, and it's generally not possible to transfer a connection/cursor object across process boundaries.
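The process-boundary point can be made concrete with the stdlib: a live DB connection wraps OS-level state (sockets, file handles) and can't be serialized, which is the same failure mode a Snowflake connection would hit if Dagster tried to ship it between op processes. The sqlite3 stand-in here is just an illustration, not the Snowflake connector:

```python
# A live connection object can't be pickled, so it can't cross a
# process boundary; sqlite3 is used here purely as a stdlib stand-in.
import pickle
import sqlite3

conn = sqlite3.connect(":memory:")
try:
    pickle.dumps(conn)
    picklable = True
except TypeError:
    picklable = False
print(picklable)  # False
```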
s
I've run into this problem setting up dbt. I'm not sure, but for local environments you may be able to set up caching as described in the Snowflake docs. From the dbt docs:
Note: By default, every connection that dbt opens will require you to re-authenticate in a browser. The Snowflake connector package supports caching your session token, but it currently only supports Windows and Mac OS. See the Snowflake docs for how to enable this feature in your account.
Will also note I never got this working satisfactorily (hazy on the details of why) and went with a keypair route, so I'm just pointing you there; can't endorse it. 😬
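For reference, the connector side of the caching that quote describes looks roughly like this. The parameter names come from the Snowflake connector docs, but the account/user values are placeholders, the account-level setting must be enabled by an admin, and (per the quote above) caching is only supported on Windows and macOS:

```python
# Hypothetical connection settings for browser-based SSO with session
# token caching. An account admin must first run:
#     ALTER ACCOUNT SET ALLOW_ID_TOKEN = TRUE;
CONNECT_KWARGS = {
    "account": "my_account",   # placeholder
    "user": "my_user",         # placeholder
    "authenticator": "externalbrowser",
    # Cache the SSO ID token locally so repeated connections can skip
    # the browser prompt (Windows/macOS only).
    "client_store_temporary_credential": True,
}
# import snowflake.connector
# conn = snowflake.connector.connect(**CONNECT_KWARGS)
```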
m
I think we're going to end up using the keypair route as well. We also tried (unsuccessfully) to set up connection caching, but saw the same behavior where it didn't actually do anything.