While running a Databricks job, especially one with large datasets and long-running queries that pull a lot of data back to the driver, you may hit an error along these lines if the cluster is left at a minimal configuration:

org.apache.spark.SparkException: Job aborted due to stage failure: Total size of serialized results of ... tasks (... GB) is bigger than spark.driver.maxResultSize (... GB)
The simple way to fix this is to raise the driver's result-size limit in the cluster's Spark config (cluster Edit page, Advanced Options, Spark tab):

spark.driver.maxResultSize 100g

Adjust the value based on your cluster size: it should stay comfortably below the driver's memory, since collected results land in driver memory. Setting it to 0 removes the limit entirely, but that trades this error for a possible driver out-of-memory failure.
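Note that spark.driver.maxResultSize is a driver property, so it must be set before the driver JVM starts; on Databricks that means the cluster's Spark config box, not a notebook cell. If you are running a standalone PySpark script instead, you can set it when building the session. Here is a minimal sketch; the app name and the 8g value are placeholders, not recommendations:

from pyspark.sql import SparkSession

# spark.driver.maxResultSize must be set before the driver starts;
# it cannot be changed afterwards with spark.conf.set().
spark = (
    SparkSession.builder
    .appName("large-result-job")                 # placeholder app name
    .config("spark.driver.maxResultSize", "8g")  # placeholder size
    .getOrCreate()
)

# The kind of action that trips this limit: collecting a large
# result set back to the driver, e.g.
# rows = spark.sql("SELECT * FROM some_big_table").collect()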