Does anyone know what I need to set so that spark-submit uses the HDP
version of Spark rather than the one bundled with Kylin?

Currently I see:

export HADOOP_CONF_DIR=/ebs/kylin/hadoop-conf &&
/ebs/kylin/apache-kylin-2.2.0-bin/spark/bin/spark-submit
In the kylin.properties file I see:
## Spark conf (default is in spark/conf/spark-defaults.conf)

However, it doesn't show how I can change this to use the HDP spark-submit.
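
My guess is that Kylin picks up spark-submit from SPARK_HOME and only falls
back to the bundled $KYLIN_HOME/spark when that isn't set, so I was thinking
of something like the following before starting Kylin (the
/usr/hdp/current/spark-client path is just the usual HDP client location on
our nodes; I haven't confirmed this is actually how Kylin resolves it):

export SPARK_HOME=/usr/hdp/current/spark-client
$KYLIN_HOME/bin/kylin.sh start

Is that the right approach, or is there a property for it?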

Also, HDP is on Spark 1.6.1 while Kylin internally uses Spark 2.x; I'm not
sure whether that matters at submit time. I can't seem to get more than 2
executors to run without the job failing with other errors, even though we
have about 44 slots on our cluster.
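
In case it helps, this is how I understand extra Spark settings get passed
through kylin.properties (via the kylin.engine.spark-conf. prefix, same as
the HDP lines below); the numbers here are just placeholders I'd tune for
our 44 slots, not values from any doc:

kylin.engine.spark-conf.spark.executor.instances=10
kylin.engine.spark-conf.spark.executor.memory=4G
kylin.engine.spark-conf.spark.executor.cores=2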

I also uncommented the HDP-specific lines:

## uncomment for HDP
kylin.engine.spark-conf.spark.driver.extraJavaOptions=-Dhdp.version=current
kylin.engine.spark-conf.spark.yarn.am.extraJavaOptions=-Dhdp.version=current
kylin.engine.spark-conf.spark.executor.extraJavaOptions=-Dhdp.version=current
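
Related to those lines: I'm also not sure whether -Dhdp.version=current is
enough, or whether it needs the concrete version string (whatever
hdp-select versions prints on the nodes), e.g. something along these lines:

kylin.engine.spark-conf.spark.driver.extraJavaOptions=-Dhdp.version=<actual HDP version>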

See the attachment for the other properties I have set.