My Spark version is 3.0. The solution is only spottily described in the Azure and Databricks documentation (as well as on SO) for both the PySpark JDBC driver and the MS connector libraries. One approach is using a keytab, by providing Spark with a principal and keytab (e.g. via spark-submit's --principal and --keytab options).
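As a rough sketch of that approach, under the assumption that the target is SQL Server reached through the Microsoft JDBC driver (the host, realm, table, and file names below are hypothetical placeholders):

```python
# Submit with a keytab so Spark can obtain and renew Kerberos tickets itself:
#   spark-submit --principal etl_user@EXAMPLE.COM --keytab /secure/etl_user.keytab app.py
# (Equivalent Spark 3.0 configs: spark.kerberos.principal / spark.kerberos.keytab.)

from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("kerberos-jdbc-sketch").getOrCreate()

# Hypothetical SQL Server instance; integratedSecurity + JavaKerberos tell the
# Microsoft JDBC driver to authenticate with the Kerberos ticket instead of a password.
jdbc_url = (
    "jdbc:sqlserver://sqlhost.example.com:1433;"
    "databaseName=mydb;"
    "integratedSecurity=true;"
    "authenticationScheme=JavaKerberos"
)

df = (
    spark.read.format("jdbc")
    .option("url", jdbc_url)
    .option("dbtable", "dbo.some_table")
    .option("driver", "com.microsoft.sqlserver.jdbc.SQLServerDriver")
    .load()
)
df.show()
```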
.config("spark.logConf", "true") \ should cause Spark to log its effective configuration at INFO level when the context starts, but the default log level is WARN, so I don't see any of those messages. To review the UI and logs of already finished (and, with event logging enabled, currently running) Spark applications you have to use the Spark History Server. When I submit the job, there is.
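A minimal sketch of how those pieces fit together, assuming a stock PySpark setup (the event-log directory is a hypothetical placeholder):

```python
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("log-conf-demo")
    # Dump the effective SparkConf at INFO when the SparkContext starts.
    .config("spark.logConf", "true")
    # Persist events so the Spark History Server can replay the UI after the app exits.
    .config("spark.eventLog.enabled", "true")
    .config("spark.eventLog.dir", "file:///tmp/spark-events")  # hypothetical path
    .getOrCreate()
)

# The config dump happens during context start-up, so it is only visible if the
# log4j root level is already INFO (e.g. log4j.rootCategory=INFO, console in
# conf/log4j.properties). setLogLevel only affects messages emitted afterwards.
spark.sparkContext.setLogLevel("INFO")
```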
Then log in again in Spark AR Studio; this will. I'm trying to simplify notebook creation for developers/data scientists in my Azure Databricks workspace that connects to an Azure Data Lake Gen2 account. I am trying to run the Spark sample SparkPi Docker image on EKS. I have logging.info("This is an informative message.") and logging.debug("This is a debug message."), but I want to use the same logger that Spark is using so that the log messages come out in the same format.
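One common pattern for that last point is to grab Spark's own log4j logger from the driver. This is a sketch that relies on the internal _jvm gateway (not a stable public API), and the logger name is an arbitrary placeholder:

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("driver-logging-sketch").getOrCreate()

# Reach through the Py4J gateway to the log4j LogManager the JVM side already uses,
# so driver messages share Spark's log format, appenders, and level configuration.
# Spark 3.0 ships log4j 1.x; _jvm is an internal attribute, not a public API.
log4j = spark.sparkContext._jvm.org.apache.log4j
logger = log4j.LogManager.getLogger("my.notebook.logger")  # placeholder name

logger.info("This is an informative message.")
logger.debug("This is a debug message.")  # only visible if the configured level allows DEBUG
```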
I am trying to figure out how to configure the ABFS (Azure Data Lake Storage Gen2) driver to authenticate with Azure storage accounts as the regular user who is logged in. Once the Spark application has finished, so has the UI.
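For reference, the commonly documented ABFS OAuth setup looks roughly like the sketch below. Note that it authenticates as a service principal (ClientCredsTokenProvider) rather than as the logged-in user, so it is only a starting point; per-user passthrough in Azure Databricks is a cluster-level setting rather than something set in the notebook. The storage account, container, tenant ID, and secret are hypothetical placeholders.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("abfs-oauth-sketch").getOrCreate()

storage_account = "mystorageaccount"  # hypothetical account name
suffix = f"{storage_account}.dfs.core.windows.net"

# Hadoop ABFS driver settings for OAuth with a service principal (client credentials).
spark.conf.set(f"fs.azure.account.auth.type.{suffix}", "OAuth")
spark.conf.set(
    f"fs.azure.account.oauth.provider.type.{suffix}",
    "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
)
spark.conf.set(f"fs.azure.account.oauth2.client.id.{suffix}", "<application-id>")
spark.conf.set(f"fs.azure.account.oauth2.client.secret.{suffix}", "<client-secret>")
spark.conf.set(
    f"fs.azure.account.oauth2.client.endpoint.{suffix}",
    "https://login.microsoftonline.com/<tenant-id>/oauth2/token",
)

# Hypothetical container and path on the Gen2 account.
df = spark.read.text(f"abfss://mycontainer@{suffix}/some/path")
df.show(5)
```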