# ask-community

Anthony Reksoatmodjo

07/07/2022, 9:24 PM
Hi! It seems Dagit's default SQLite backend doesn't scale up with multithreading, so someone suggested switching to PostgreSQL. I don't have much experience with PostgreSQL; are there any docs or tutorials explaining how to make this switch?

prha

07/07/2022, 9:28 PM
Hi Anthony. You would need to stand up a Postgres instance and then configure your `dagster.yaml` to point to that Postgres instance for Dagster storage: https://docs.dagster.io/deployment/dagster-instance#postgres-storage
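For reference, a minimal sketch of the kind of storage block that page describes, assuming a unified `storage` key (the exact schema can vary by Dagster version, and the environment variable names here are placeholders):

```yaml
# dagster.yaml -- route run, event log, and schedule storage to Postgres
storage:
  postgres:
    postgres_db:
      username:
        env: DAGSTER_PG_USERNAME
      password:
        env: DAGSTER_PG_PASSWORD
      hostname:
        env: DAGSTER_PG_HOST
      db_name:
        env: DAGSTER_PG_DB
      port: 5432
```

You will also need the `dagster-postgres` package installed in the environment running Dagit and the daemon so the Postgres storage classes can be loaded.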

Anthony Reksoatmodjo

07/07/2022, 9:33 PM
Thanks!
Another question: Is this expected behavior? Is there a way to configure the SQLite backend to work with increased core counts?

prha

07/07/2022, 10:39 PM
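As a side note on the busy-handler idea discussed in this thread: a busy handler (or busy timeout) is a general SQLite mechanism rather than a Dagster-level setting, and Dagster manages its own SQLite connections internally, so whether it is directly tunable from user code isn't confirmed here. A minimal illustration using Python's standard sqlite3 module, where the hypothetical `example.db` file stands in for one of Dagster's storage databases:

```python
import sqlite3

# Two connections to the same database file, standing in for two processes.
# The timeout argument installs a busy handler that retries for up to
# 30 seconds instead of failing immediately with "database is locked".
writer = sqlite3.connect("example.db", timeout=30.0)
reader = sqlite3.connect("example.db", timeout=30.0)

writer.execute(
    "CREATE TABLE IF NOT EXISTS events (id INTEGER PRIMARY KEY, msg TEXT)"
)
writer.execute("INSERT INTO events (msg) VALUES ('hello')")
writer.commit()

# The same setting can also be applied per connection via PRAGMA (milliseconds).
reader.execute("PRAGMA busy_timeout = 30000")
print(reader.execute("SELECT count(*) FROM events").fetchone())

writer.close()
reader.close()
```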
Hmm, the sqlite faq indicates that it does support multiple processes writing because it uses db file locks, but the GH issue you referenced is indicating there’s some sort of locking issue with the multiprocess_executor. Presumably this is because there are multiple processes simultaneously writing to the event log for the in progress run. You might have some success tuning this by configuring a sqlite busy handler: https://www.sqlite.org/c3ref/busy_handler.html