Docker container build-and-run setup, with foundations for adding Docker integration tests. Docker images are built with Hadoop 2.8.4, Hive 2.3.3 and Spark 2.3.1, and published to Docker Hub.
See the quickstart document for how to set up Docker and run the demo.
This commit is contained in:
committed by vinoth chandar
parent 9710b5a3a6
commit f3418e4718
docker/hoodie/hadoop/sparkworker/worker.sh (new file, 16 lines)
@@ -0,0 +1,16 @@
#!/bin/bash

. "/spark/sbin/spark-config.sh"

. "/spark/bin/load-spark-env.sh"

mkdir -p $SPARK_WORKER_LOG

export SPARK_HOME=/opt/spark

ln -sf /dev/stdout $SPARK_WORKER_LOG/spark-worker.out

date
echo "SPARK HOME is : $SPARK_HOME"
/opt/spark/sbin/../bin/spark-class org.apache.spark.deploy.worker.Worker \
    --webui-port $SPARK_WORKER_WEBUI_PORT $SPARK_MASTER >> $SPARK_WORKER_LOG/spark-worker.out
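The script above relies on three environment variables being supplied from outside (for example, by the container's Dockerfile or a compose file): `SPARK_WORKER_LOG`, `SPARK_WORKER_WEBUI_PORT`, and `SPARK_MASTER`. The sketch below is a hedged illustration, not part of the commit: the values assigned here (the log directory, port, and master URL) are assumptions chosen for demonstration, and instead of launching a real worker it just prints the command the script would build from them.

```shell
#!/bin/bash
# Hedged sketch: these values are assumptions for illustration only;
# the real image sets them elsewhere (e.g. in the Dockerfile/compose setup).
export SPARK_WORKER_LOG=/tmp/spark-worker-logs       # hypothetical log dir
export SPARK_WORKER_WEBUI_PORT=8081                  # Spark's default worker UI port
export SPARK_MASTER=spark://sparkmaster:7077         # hypothetical master URL

# Mirror what worker.sh does with the log directory.
mkdir -p "$SPARK_WORKER_LOG"

# Print (rather than execute) the worker launch command worker.sh would run.
echo "spark-class org.apache.spark.deploy.worker.Worker" \
     "--webui-port $SPARK_WORKER_WEBUI_PORT $SPARK_MASTER"
```

Routing the worker log through `ln -sf /dev/stdout`, as worker.sh does, is the usual trick for making a log file visible via `docker logs`.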