sbin/start-master.sh --> sbin/spark-daemon.sh --> bin/spark-class org.apache.spark.deploy.master.Master
bin/spark-shell --> bin/spark-submit --class org.apache.spark.repl.Main --name "Spark shell" --> bin/spark-class org.apache.spark.deploy.SparkSubmit --class org.apache.spark.repl.Main --name "Spark shell" --> java org.apache.spark.launcher.Main org.apache.spark.deploy.SparkSubmit --class org.apache.spark.repl.Main --name "Spark shell"
bin/spark-submit --> bin/spark-class org.apache.spark.deploy.SparkSubmit --> java org.apache.spark.launcher.Main
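All three chains converge on bin/spark-class. A quick way to watch where a chain ends up is the SPARK_PRINT_LAUNCH_COMMAND environment variable, which makes the launcher echo the final java command to stderr before it is run, e.g.:

SPARK_PRINT_LAUNCH_COMMAND=1 bin/spark-shell

The "Spark Command: ..." line printed on stderr is the java invocation that spark-class is about to exec.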
org.apache.spark.launcher.Main builds the command to run, prints it to stdout, and then exits. bin/spark-class reads that output and launches the actual process with it:
while IFS= read -d '' -r ARG; do
  CMD+=("$ARG")
done < <("$RUNNER" -cp "$LAUNCH_CLASSPATH" org.apache.spark.launcher.Main "$@")
exec "${CMD[@]}"
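The same launcher invocation can be run by hand to inspect the command before spark-class execs it. This is a rough sketch rather than anything taken from the Spark scripts: it assumes SPARK_HOME points at an unpacked Spark distribution with its jars under $SPARK_HOME/jars, and tr converts the NUL separators in the launcher's output into newlines for readability.

export SPARK_HOME=/opt/spark   # assumed install location
# Ask launcher.Main for the spark-shell submit command (same arguments as the chain above)
java -cp "$SPARK_HOME/jars/*" org.apache.spark.launcher.Main \
  org.apache.spark.deploy.SparkSubmit --class org.apache.spark.repl.Main --name "Spark shell" \
  | tr '\0' '\n'

Each line of the output is one token of the java command that spark-class would collect into CMD and exec.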