add spark setup configs
parent d375b6fd08
commit 73346a9f8f
2 changed files with 63 additions and 0 deletions
18	spark/README.txt	Normal file
@@ -0,0 +1,18 @@
# These are Docker images that can be
# used for every node in the Spark cluster

### spark-master ###
image: chasik/sparkmaster:v1
# NB: the container created from this image must have
# "spark-master" as its name so that the workers can join it
# docker pull chasik/sparkmaster:v1

### spark-worker ###
image: chasik/sparkworker:v1
# docker pull chasik/sparkworker:v1

### spark-submit ###
image: chasik/sparksubmit:v1
# docker pull chasik/sparksubmit:v1
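The three `docker pull` commands above can be run in one go. A minimal sketch — the `pull_cmds` helper is illustrative, not part of the original notes; it only prints the commands, so pipe its output to `sh` to actually pull:

```shell
#!/bin/sh
# Print one "docker pull" command per cluster image.
# Run them with:  pull_cmds | sh
pull_cmds() {
    for img in chasik/sparkmaster:v1 chasik/sparkworker:v1 chasik/sparksubmit:v1; do
        echo "docker pull $img"
    done
}

pull_cmds
```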
45	spark/swarmtoken.txt	Normal file
@@ -0,0 +1,45 @@
Information about the Spark cluster
######################################################################################

**************************************************************************************
SWARM MANAGER
docker swarm init --advertise-addr IP_FLOATINGc1:PORTx --data-path-addr IP_FLOATINGc1 --listen-addr 0.0.0.0:PORTx

SWARM WORKER
docker swarm join --token <...> --advertise-addr IP_FLOATINGc2:PORTy IP_FLOATINGc1:PORTx

NB: PORTx and PORTy >= 50000 (INSA restriction)

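The `<...>` token in the join command is printed on the manager by `docker swarm join-token worker` (or just the token itself with the `-q` flag). A small sketch that composes the full worker command from its parts — `join_cmd` and the sample values are illustrative, not from the original notes:

```shell
#!/bin/sh
# Compose the worker join command.
# On the manager you would first run:  TOKEN=$(docker swarm join-token -q worker)
join_cmd() {
    # $1 = join token, $2 = worker IP:port, $3 = manager IP:port
    echo "docker swarm join --token $1 --advertise-addr $2 $3"
}

join_cmd "SWMTKN-example" IP_FLOATINGc2:PORTy IP_FLOATINGc1:PORTx
```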
**************************************************************************************
MASTER
docker run -dit --name spark-master --network sparknet -p 8080:8080 chasik/sparkmaster:v1

SUBMIT
docker run -it --name spark-submit -v /home/user/entry:/entry --network sparknet -p 4040:4040 chasik/sparksubmit:v1 bash

WORKER1
docker run -dit --name spark-worker1 --network sparknet -p 8081:8081 -e CORES=2 chasik/sparkworker:v1

**************************************************************************************
WORKER2
docker run -dit --name spark-worker2 --network sparknet -p 8081:8081 -e CORES=2 chasik/sparkworker:v1

WORKER3
docker run -dit --name spark-worker3 --network sparknet -p 8082:8081 -e CORES=2 chasik/sparkworker:v1

**************************************************************************************
WORKER4
docker run -dit --name spark-worker4 --network sparknet -p 8081:8081 -e CORES=2 chasik/sparkworker:v1

WORKER5
docker run -dit --name spark-worker5 --network sparknet -p 8082:8081 -e CORES=2 chasik/sparkworker:v1

**************************************************************************************
WORKER6
docker run -dit --name spark-worker6 --network sparknet -p 8081:8081 -e CORES=2 chasik/sparkworker:v1

WORKER7
docker run -dit --name spark-worker7 --network sparknet -p 8082:8081 -e CORES=2 chasik/sparkworker:v1

**************************************************************************************
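The seven nearly identical worker commands above can be generated with a loop. This is a sketch, not part of the notes: it reproduces the host-port pattern shown (8081 for workers 1, 2, 4 and 6; 8082 for 3, 5 and 7 — presumably because two containers scheduled on the same host cannot publish the same host port). It only prints the commands; pipe the output to `sh` to actually start the containers.

```shell
#!/bin/sh
# Print the "docker run" command for spark-worker1..7.
# Run them with:  worker_cmds | sh
worker_cmds() {
    i=1
    while [ "$i" -le 7 ]; do
        # worker1 and the even-numbered workers publish 8081,
        # the remaining odd-numbered workers publish 8082
        if [ "$i" -gt 1 ] && [ $((i % 2)) -eq 1 ]; then
            port=8082
        else
            port=8081
        fi
        echo "docker run -dit --name spark-worker$i --network sparknet -p $port:8081 -e CORES=2 chasik/sparkworker:v1"
        i=$((i + 1))
    done
}

worker_cmds
```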

Task Submit (spark-submit node)
/spark/bin/spark-submit --conf spark.executor.cores=2 --conf spark.executor.memory=5G --master spark://spark-master:7077 /spark/examples/src/main/python/<pi.py> <2000> <param2> ... <paramN>

Information: files can be put in the "entry" folder so that they are easy to launch from spark-submit