Spark on YARN setup
1. On top of a working Hadoop on YARN environment, add the following Spark configuration.
spark-env.sh:
export HADOOP_CONF_DIR=/usr/local/hadoop34/etc/hadoop
export YARN_CONF_DIR=/usr/local/hadoop34/etc/hadoop
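Default resources for YARN executors can optionally go in spark-env.sh as well. A minimal sketch using variables documented in spark-env.sh.template; the values below are illustrative assumptions, not requirements:
# Optional: default executor/driver resources (example values)
export SPARK_EXECUTOR_CORES=2      # cores per executor (default: 1)
export SPARK_EXECUTOR_MEMORY=2G    # memory per executor (default: 1G)
export SPARK_DRIVER_MEMORY=1G      # memory for the driver (default: 1G)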
workers file (one worker hostname per line):
slave1
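Note that in YARN mode executors are allocated by YARN itself; the workers file is only read by Spark's standalone sbin/ scripts, and is kept here so the conf/ directory stays consistent across nodes. A sketch of syncing the configuration to the worker, assuming Spark is installed at /usr/local/spark on both machines (the path is an assumption):
# Copy the Spark config files to the worker node (install path assumed)
scp /usr/local/spark/conf/spark-env.sh /usr/local/spark/conf/workers slave1:/usr/local/spark/conf/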
2. Run a test
./bin/spark-submit --master yarn --class org.apache.spark.examples.SparkPi ./examples/jars/spark-examples_2.12-3.5.5.jar 10
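If the job succeeds, a line like "Pi is roughly 3.14..." appears in the driver output (client mode) or in the YARN application logs. A quick check with the standard YARN CLI; the application ID below is a placeholder, substitute the one printed by spark-submit:
# List finished applications, then pull the logs for one of them
yarn application -list -appStates FINISHED
yarn logs -applicationId application_1700000000000_0001 | grep "Pi is roughly"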
# Run PySpark
./bin/pyspark --master yarn
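Once the shell starts (YARN should show an application named "PySparkShell"), a quick smoke test using the SparkContext the shell provides:
# Inside the pyspark REPL: distribute a small range across executors and sum it
sc.parallelize(range(1000)).sum()   # expected result: 499500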