config - Is it possible to get the current Spark context settings in PySpark?


I'm trying to get the path to spark.worker.dir for the current SparkContext.

If I explicitly set a config param, I can read it back out of SparkConf, but is there any way to access the complete config (including all the defaults) using PySpark?
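For concreteness, a minimal sketch of the "explicitly set" case the question describes; the key/value and app name below are just illustrative:

from pyspark import SparkConf, SparkContext

# A value set explicitly on the conf object can be read straight back
conf = (SparkConf()
        .setMaster("local")
        .setAppName("conf-demo")                     # hypothetical app name
        .set("spark.worker.dir", "/tmp/spark-work"))  # hypothetical path
sc = SparkContext(conf=conf)
print(conf.get("spark.worker.dir"))  # prints /tmp/spark-work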

Yes: sc._conf.getAll()

which uses the method:

SparkConf.getAll()

as accessed by

SparkContext.sc._conf

Note the underscore: that makes this tricky. I had to look at the Spark source code to figure it out ;)

But it does work:

In [4]: sc._conf.getAll()
Out[4]:
[(u'spark.master', u'local'),
 (u'spark.rdd.compress', u'true'),
 (u'spark.serializer.objectStreamReset', u'100'),
 (u'spark.app.name', u'PySparkShell')]
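Building on that, a single key such as the spark.worker.dir from the question can be looked up on the same (underscore-prefixed, so technically private) conf object; a small sketch, assuming the key may not have been set explicitly:

# Look up one setting, falling back to None if it was never set
worker_dir = sc._conf.get("spark.worker.dir", None)
print(worker_dir)

# Or turn the full getAll() list into a dict and query it
settings = dict(sc._conf.getAll())
print(settings.get("spark.app.name"))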
