Hello,
I'm trying to send application logs from Databricks to an Azure Log Analytics Workspace.
I configured the Databricks workspace following the documentation and everything seems fine; however, when I try to write a custom log from a Python UDF executed on the workers, the code
from pyspark import SparkContext
sc = SparkContext.getOrCreate()
spark_log4j = sc._jvm.org.apache.log4j
logger = spark_log4j.LogManager.getLogger("myCustomLogger")
fails with the exception:
Exception: SparkContext should only be created and accessed on the driver.
It is not clear to me how to get a log4j logger from Python code executed by the workers in order to create custom logs.
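For reference, here is a minimal sketch of what I'm attempting (the DataFrame, column, and UDF names are just placeholders for illustration); calling SparkContext.getOrCreate() inside the UDF body is what triggers the error above:

from pyspark import SparkContext
from pyspark.sql import SparkSession
from pyspark.sql.functions import udf
from pyspark.sql.types import StringType

spark = SparkSession.builder.getOrCreate()

def tag_row(value):
    # This runs on a worker: accessing the SparkContext here raises
    # "SparkContext should only be created and accessed on the driver."
    sc = SparkContext.getOrCreate()
    logger = sc._jvm.org.apache.log4j.LogManager.getLogger("myCustomLogger")
    logger.info("processing value %s" % value)
    return value

tag_row_udf = udf(tag_row, StringType())

df = spark.createDataFrame([("a",), ("b",)], ["value"])
df.withColumn("tagged", tag_row_udf("value")).show()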
Can you please help me with this?
Thank you
Tommaso