I can work around this by using:

bin/pyspark --conf spark.sql.catalogImplementation=in-memory

for now, but I still wonder what's going on with HiveConf.
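
For reference, the same workaround can be made persistent instead of passing --conf on every launch. A minimal sketch, assuming a standard Spark install (add this line to conf/spark-defaults.conf):

```
spark.sql.catalogImplementation  in-memory
```

With that in place, plain bin/pyspark should pick up the in-memory catalog without the command-line flag.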

On Thu, Jun 14, 2018 at 11:37 AM, Li Jin <[EMAIL PROTECTED]> wrote: