Issue
I have looked through similar questions that have been asked before, with no luck so far. I'm using PySpark inside a venv environment. How do I go about changing the spark.sql.debug.maxToStringFields setting? Do I do it from within a Jupyter notebook/Python script, or do I need to use a bash command? Is it in a specific configuration file? If so, where is it located?
Solution
This config, along with many others, has been moved to SQLConf: sql/catalyst/src/main/scala/org/apache/spark/sql/internal/SQLConf.scala
It can be set either in a Spark configuration file or at runtime on the SparkSession, using:
spark.conf.set("spark.sql.debug.maxToStringFields", 1000)
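For a PySpark setup like the one described in the question, here is a minimal sketch of the common ways to apply the setting; the session variable name spark, the app name, and the value 1000 are only illustrative, not requirements:

# Option 1: set it when building the SparkSession (works in a Jupyter notebook or a Python script)
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("example")  # hypothetical app name
    .config("spark.sql.debug.maxToStringFields", "1000")
    .getOrCreate()
)

# Option 2: change it on an already-running session
spark.conf.set("spark.sql.debug.maxToStringFields", "1000")

# Option 3 (shell): pass it to spark-submit instead of setting it in code
# spark-submit --conf spark.sql.debug.maxToStringFields=1000 my_script.py

If you prefer a configuration file, add a line like spark.sql.debug.maxToStringFields 1000 to $SPARK_HOME/conf/spark-defaults.conf so it applies to every session started from that Spark installation.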
Answered By - ozeyboy