# ask-community
a
👋 Hello, team! I'm having a problem with dagster-mysql: I've set up dagster.yaml and DAGSTER_HOME on Linux, but when I launch dagit I get an exception:
/home/aingeru/anaconda3/envs/pyspark_env/lib/python3.7/site-packages/dagster_mysql/run_storage/run_storage.py:98: ExperimentalWarning: "MySQLRunStorage" is an experimental class. It may break in future versions, even between dot releases. To mute warnings for experimental functionality, invoke warnings.filterwarnings("ignore", category=dagster.ExperimentalWarning) or use one of the other methods described at <https://docs.python.org/3/library/warnings.html#describing-warning-filters>.
  return MySQLRunStorage(inst_data=inst_data, mysql_url=mysql_url_from_config(config_value))
Traceback (most recent call last):
  File "/home/aingeru/anaconda3/envs/pyspark_env/lib/python3.7/site-packages/mysql/connector/connection_cext.py", line 236, in _open_connection
    self._cmysql.connect(**cnx_kwargs)
_mysql_connector.MySQLInterfaceError: Lost connection to MySQL server during query

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/home/aingeru/anaconda3/envs/pyspark_env/lib/python3.7/site-packages/sqlalchemy/engine/base.py", line 3256, in _wrap_pool_connect
    return fn()
  File "/home/aingeru/anaconda3/envs/pyspark_env/lib/python3.7/site-packages/sqlalchemy/pool/base.py", line 310, in connect
    return _ConnectionFairy._checkout(self)
  File "/home/aingeru/anaconda3/envs/pyspark_env/lib/python3.7/site-packages/sqlalchemy/pool/base.py", line 868, in _checkout
    fairy = _ConnectionRecord.checkout(pool)
  File "/home/aingeru/anaconda3/envs/pyspark_env/lib/python3.7/site-packages/sqlalchemy/pool/base.py", line 476, in checkout
    rec = pool._do_get()
  File "/home/aingeru/anaconda3/envs/pyspark_env/lib/python3.7/site-packages/sqlalchemy/pool/impl.py", line 256, in _do_get
    return self._create_connection()
  File "/home/aingeru/anaconda3/envs/pyspark_env/lib/python3.7/site-packages/sqlalchemy/pool/base.py", line 256, in _create_connection
    return _ConnectionRecord(self)
  File "/home/aingeru/anaconda3/envs/pyspark_env/lib/python3.7/site-packages/sqlalchemy/pool/base.py", line 371, in __init__
    self.__connect()
  File "/home/aingeru/anaconda3/envs/pyspark_env/lib/python3.7/site-packages/sqlalchemy/pool/base.py", line 666, in __connect
    pool.logger.debug("Error on connect(): %s", e)
  File "/home/aingeru/anaconda3/envs/pyspark_env/lib/python3.7/site-packages/sqlalchemy/util/langhelpers.py", line 72, in __exit__
    with_traceback=exc_tb,
  File "/home/aingeru/anaconda3/envs/pyspark_env/lib/python3.7/site-packages/sqlalchemy/util/compat.py", line 207, in raise_
    raise exception
  File "/home/aingeru/anaconda3/envs/pyspark_env/lib/python3.7/site-packages/sqlalchemy/pool/base.py", line 661, in __connect
    self.dbapi_connection = connection = pool._invoke_creator(self)
  File "/home/aingeru/anaconda3/envs/pyspark_env/lib/python3.7/site-packages/sqlalchemy/engine/create.py", line 590, in connect
    return dialect.connect(*cargs, **cparams)
  File "/home/aingeru/anaconda3/envs/pyspark_env/lib/python3.7/site-packages/sqlalchemy/engine/default.py", line 597, in connect
    return self.dbapi.connect(*cargs, **cparams)
  File "/home/aingeru/anaconda3/envs/pyspark_env/lib/python3.7/site-packages/mysql/connector/__init__.py", line 272, in connect
    return CMySQLConnection(*args, **kwargs)
  File "/home/aingeru/anaconda3/envs/pyspark_env/lib/python3.7/site-packages/mysql/connector/connection_cext.py", line 85, in __init__
    self.connect(**kwargs)
  File "/home/aingeru/anaconda3/envs/pyspark_env/lib/python3.7/site-packages/mysql/connector/abstracts.py", line 1028, in connect
    self._open_connection()
  File "/home/aingeru/anaconda3/envs/pyspark_env/lib/python3.7/site-packages/mysql/connector/connection_cext.py", line 242, in _open_connection
    sqlstate=exc.sqlstate)
mysql.connector.errors.OperationalError: 2013 (HY000): Lost connection to MySQL server during query

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/home/aingeru/anaconda3/envs/pyspark_env/bin/dagit", line 8, in <module>
    sys.exit(main())
  File "/home/aingeru/anaconda3/envs/pyspark_env/lib/python3.7/site-packages/dagit/cli.py", line 162, in main
    cli(auto_envvar_prefix="DAGIT")  # pylint:disable=E1120
  File "/home/aingeru/anaconda3/envs/pyspark_env/lib/python3.7/site-packages/click/core.py", line 1128, in __call__
    return self.main(*args, **kwargs)
  File "/home/aingeru/anaconda3/envs/pyspark_env/lib/python3.7/site-packages/click/core.py", line 1053, in main
    rv = self.invoke(ctx)
  File "/home/aingeru/anaconda3/envs/pyspark_env/lib/python3.7/site-packages/click/core.py", line 1395, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "/home/aingeru/anaconda3/envs/pyspark_env/lib/python3.7/site-packages/click/core.py", line 754, in invoke
    return __callback(*args, **kwargs)
  File "/home/aingeru/anaconda3/envs/pyspark_env/lib/python3.7/site-packages/dagit/cli.py", line 105, in dagit
    with get_instance_for_service("dagit") as instance:
  File "/home/aingeru/anaconda3/envs/pyspark_env/lib/python3.7/contextlib.py", line 112, in __enter__
    return next(self.gen)
  File "/home/aingeru/anaconda3/envs/pyspark_env/lib/python3.7/site-packages/dagster/cli/utils.py", line 13, in get_instance_for_service
    with DagsterInstance.get() as instance:
  File "/home/aingeru/anaconda3/envs/pyspark_env/lib/python3.7/site-packages/dagster/core/instance/__init__.py", line 394, in get
    return DagsterInstance.from_config(dagster_home_path)
  File "/home/aingeru/anaconda3/envs/pyspark_env/lib/python3.7/site-packages/dagster/core/instance/__init__.py", line 409, in from_config
    return DagsterInstance.from_ref(instance_ref)
  File "/home/aingeru/anaconda3/envs/pyspark_env/lib/python3.7/site-packages/dagster/core/instance/__init__.py", line 424, in from_ref
    run_storage=instance_ref.run_storage,
  File "/home/aingeru/anaconda3/envs/pyspark_env/lib/python3.7/site-packages/dagster/core/instance/ref.py", line 241, in run_storage
    return self.run_storage_data.rehydrate()
  File "/home/aingeru/anaconda3/envs/pyspark_env/lib/python3.7/site-packages/dagster/serdes/config_class.py", line 86, in rehydrate
    return klass.from_config_value(self, result.value)
  File "/home/aingeru/anaconda3/envs/pyspark_env/lib/python3.7/site-packages/dagster_mysql/run_storage/run_storage.py", line 98, in from_config_value
    return MySQLRunStorage(inst_data=inst_data, mysql_url=mysql_url_from_config(config_value))
  File "/home/aingeru/anaconda3/envs/pyspark_env/lib/python3.7/site-packages/dagster_mysql/run_storage/run_storage.py", line 58, in __init__
    table_names = retry_mysql_connection_fn(db.inspect(self._engine).get_table_names)
  File "/home/aingeru/anaconda3/envs/pyspark_env/lib/python3.7/site-packages/sqlalchemy/inspection.py", line 64, in inspect
    ret = reg(subject)
  File "/home/aingeru/anaconda3/envs/pyspark_env/lib/python3.7/site-packages/sqlalchemy/engine/reflection.py", line 182, in _engine_insp
    return Inspector._construct(Inspector._init_engine, bind)
  File "/home/aingeru/anaconda3/envs/pyspark_env/lib/python3.7/site-packages/sqlalchemy/engine/reflection.py", line 117, in _construct
    init(self, bind)
  File "/home/aingeru/anaconda3/envs/pyspark_env/lib/python3.7/site-packages/sqlalchemy/engine/reflection.py", line 128, in _init_engine
    engine.connect().close()
  File "/home/aingeru/anaconda3/envs/pyspark_env/lib/python3.7/site-packages/sqlalchemy/engine/base.py", line 3210, in connect
    return self._connection_cls(self, close_with_result=close_with_result)
  File "/home/aingeru/anaconda3/envs/pyspark_env/lib/python3.7/site-packages/sqlalchemy/engine/base.py", line 96, in __init__
    else engine.raw_connection()
  File "/home/aingeru/anaconda3/envs/pyspark_env/lib/python3.7/site-packages/sqlalchemy/engine/base.py", line 3289, in raw_connection
    return self._wrap_pool_connect(self.pool.connect, _connection)
  File "/home/aingeru/anaconda3/envs/pyspark_env/lib/python3.7/site-packages/sqlalchemy/engine/base.py", line 3260, in _wrap_pool_connect
    e, dialect, self
  File "/home/aingeru/anaconda3/envs/pyspark_env/lib/python3.7/site-packages/sqlalchemy/engine/base.py", line 2107, in _handle_dbapi_exception_noconnection
    sqlalchemy_exception, with_traceback=exc_info[2], from_=e
  File "/home/aingeru/anaconda3/envs/pyspark_env/lib/python3.7/site-packages/sqlalchemy/util/compat.py", line 207, in raise_
    raise exception
  File "/home/aingeru/anaconda3/envs/pyspark_env/lib/python3.7/site-packages/sqlalchemy/engine/base.py", line 3256, in _wrap_pool_connect
    return fn()
  File "/home/aingeru/anaconda3/envs/pyspark_env/lib/python3.7/site-packages/sqlalchemy/pool/base.py", line 310, in connect
    return _ConnectionFairy._checkout(self)
  File "/home/aingeru/anaconda3/envs/pyspark_env/lib/python3.7/site-packages/sqlalchemy/pool/base.py", line 868, in _checkout
    fairy = _ConnectionRecord.checkout(pool)
  File "/home/aingeru/anaconda3/envs/pyspark_env/lib/python3.7/site-packages/sqlalchemy/pool/base.py", line 476, in checkout
    rec = pool._do_get()
  File "/home/aingeru/anaconda3/envs/pyspark_env/lib/python3.7/site-packages/sqlalchemy/pool/impl.py", line 256, in _do_get
    return self._create_connection()
  File "/home/aingeru/anaconda3/envs/pyspark_env/lib/python3.7/site-packages/sqlalchemy/pool/base.py", line 256, in _create_connection
    return _ConnectionRecord(self)
  File "/home/aingeru/anaconda3/envs/pyspark_env/lib/python3.7/site-packages/sqlalchemy/pool/base.py", line 371, in __init__
    self.__connect()
  File "/home/aingeru/anaconda3/envs/pyspark_env/lib/python3.7/site-packages/sqlalchemy/pool/base.py", line 666, in __connect
    pool.logger.debug("Error on connect(): %s", e)
  File "/home/aingeru/anaconda3/envs/pyspark_env/lib/python3.7/site-packages/sqlalchemy/util/langhelpers.py", line 72, in __exit__
    with_traceback=exc_tb,
  File "/home/aingeru/anaconda3/envs/pyspark_env/lib/python3.7/site-packages/sqlalchemy/util/compat.py", line 207, in raise_
    raise exception
  File "/home/aingeru/anaconda3/envs/pyspark_env/lib/python3.7/site-packages/sqlalchemy/pool/base.py", line 661, in __connect
    self.dbapi_connection = connection = pool._invoke_creator(self)
  File "/home/aingeru/anaconda3/envs/pyspark_env/lib/python3.7/site-packages/sqlalchemy/engine/create.py", line 590, in connect
    return dialect.connect(*cargs, **cparams)
  File "/home/aingeru/anaconda3/envs/pyspark_env/lib/python3.7/site-packages/sqlalchemy/engine/default.py", line 597, in connect
    return self.dbapi.connect(*cargs, **cparams)
  File "/home/aingeru/anaconda3/envs/pyspark_env/lib/python3.7/site-packages/mysql/connector/__init__.py", line 272, in connect
    return CMySQLConnection(*args, **kwargs)
  File "/home/aingeru/anaconda3/envs/pyspark_env/lib/python3.7/site-packages/mysql/connector/connection_cext.py", line 85, in __init__
    self.connect(**kwargs)
  File "/home/aingeru/anaconda3/envs/pyspark_env/lib/python3.7/site-packages/mysql/connector/abstracts.py", line 1028, in connect
    self._open_connection()
  File "/home/aingeru/anaconda3/envs/pyspark_env/lib/python3.7/site-packages/mysql/connector/connection_cext.py", line 242, in _open_connection
    sqlstate=exc.sqlstate)
sqlalchemy.exc.OperationalError: (mysql.connector.errors.OperationalError) 2013 (HY000): Lost connection to MySQL server during query
(Background on this error at: <https://sqlalche.me/e/14/e3q8>)
What am I doing wrong?
d
Hi Angel, assuming you have a MySQL server set up (and are able to access it, e.g. from the command line), there are some configuration tips on the MySQL side for dealing with this particular error here: https://dev.mysql.com/doc/refman/5.7/en/error-lost-connection.html
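For reference, a minimal run-storage block in dagster.yaml for dagster-mysql looks roughly like the sketch below. All connection values here are placeholders, not taken from this thread — substitute your own host, user, database, and port, and make sure that user can actually reach the server from the machine running dagit:

```yaml
# Sketch of a dagster.yaml MySQL run-storage config.
# Hostname, db_name, port, and env var names are placeholders.
run_storage:
  module: dagster_mysql.run_storage
  class: MySQLRunStorage
  config:
    mysql_db:
      username: { env: DAGSTER_MYSQL_USER }
      password: { env: DAGSTER_MYSQL_PASSWORD }
      hostname: localhost
      db_name: dagster
      port: 3306
```

A quick way to rule out Dagster itself is to connect with the same credentials from the mysql CLI (e.g. `mysql -h <hostname> -P <port> -u <user> -p`) before launching dagit — error 2013 during the initial connection usually points at a network, firewall, or server timeout issue rather than the dagster.yaml contents.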