Package org.apache.spark.launcher

Library for launching Spark applications.


This library allows applications to launch Spark programmatically. There's only one entry point to the library - the SparkLauncher class.

The SparkLauncher.startApplication(org.apache.spark.launcher.SparkAppHandle.Listener...) method can be used to start Spark and provide a handle to monitor and control the running application:

   import org.apache.spark.launcher.SparkAppHandle;
   import org.apache.spark.launcher.SparkLauncher;

   public class MyLauncher {
     public static void main(String[] args) throws Exception {
       SparkAppHandle handle = new SparkLauncher()
         .setAppResource("/my/app.jar")
         .setMainClass("my.spark.app.Main")
         .setMaster("local")
         .setConf(SparkLauncher.DRIVER_MEMORY, "2g")
         .startApplication();
       // Use handle API to monitor / control application.
     }
   }

It's also possible to launch a raw child process, using the SparkLauncher.launch() method:

   import org.apache.spark.launcher.SparkLauncher;

   public class MyLauncher {
     public static void main(String[] args) throws Exception {
       Process spark = new SparkLauncher()
         .setAppResource("/my/app.jar")
         .setMainClass("my.spark.app.Main")
         .setMaster("local")
         .setConf(SparkLauncher.DRIVER_MEMORY, "2g")
         .launch();
       spark.waitFor();
     }
   }

This method requires the calling code to manage the child process manually, including consuming its output streams: if the streams are left undrained, the OS pipe buffers can fill and block the child, deadlocking it. For this reason, using SparkLauncher.startApplication(org.apache.spark.launcher.SparkAppHandle.Listener...) is recommended instead.
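When launch() is used anyway, the output streams can be drained on background threads. A generic sketch of that technique (the class and method names here are illustrative, and a plain "echo" process stands in for the Spark child process):

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStream;
import java.io.InputStreamReader;

public class StreamDrainer {
  // Consume a process output stream on a background thread so the
  // OS pipe buffer never fills up and blocks the child process.
  static Thread drain(InputStream in, StringBuilder sink) {
    Thread t = new Thread(() -> {
      try (BufferedReader r = new BufferedReader(new InputStreamReader(in))) {
        String line;
        while ((line = r.readLine()) != null) {
          sink.append(line).append('\n');
        }
      } catch (IOException ignored) {
        // The stream is closed when the process exits.
      }
    });
    t.setDaemon(true);
    t.start();
    return t;
  }

  public static void main(String[] args) throws Exception {
    // "echo" stands in for the child process returned by launch().
    Process p = new ProcessBuilder("echo", "hello").start();
    StringBuilder out = new StringBuilder();
    Thread drainer = drain(p.getInputStream(), out);
    p.waitFor();
    drainer.join();
    System.out.print(out);
  }
}
```

The same drain() helper would be attached to both getInputStream() and getErrorStream() of the process returned by launch(), since either stream can fill its buffer.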