Question
How can we configure Spark to use the Hive Metastore for metadata?
Answer
To do this, set the following Spark configuration property:
'spark.sql.catalogImplementation=hive'
This can be done at spark-submit time by passing it as a command-line parameter:
'spark-submit --conf spark.sql.catalogImplementation=hive 356.py'
To configure this for all applications (recommended):
- Open /etc/spark/conf/spark-defaults.conf
- Add the following line to the file:
'spark.sql.catalogImplementation=hive'
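The same setting can also be applied programmatically when building the SparkSession. A minimal PySpark sketch (the application name here is illustrative, not from the original):

```python
from pyspark.sql import SparkSession

# enableHiveSupport() sets spark.sql.catalogImplementation=hive,
# so Spark resolves table metadata through the Hive Metastore.
spark = (
    SparkSession.builder
    .appName("hive-metastore-example")  # illustrative name
    .enableHiveSupport()
    .getOrCreate()
)

# Catalog queries now go against the Hive Metastore.
spark.sql("SHOW DATABASES").show()
```

Note that enableHiveSupport() must be called before getOrCreate(); once the session exists, the catalog implementation cannot be changed.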