Class TestUtils

java.lang.Object
  ↳ org.apache.spark.TestUtils

public class TestUtils extends Object
Utilities for tests. Included in main codebase since it's used by multiple projects.

TODO: See if we can move this to the test codebase by specifying test dependencies between projects.

  • Constructor Details

    • TestUtils

      public TestUtils()
  • Method Details

    • createJarWithClasses

      public static URL createJarWithClasses(scala.collection.immutable.Seq<String> classNames, String toStringValue, scala.collection.immutable.Seq<scala.Tuple2<String,String>> classNamesWithBase, scala.collection.immutable.Seq<URL> classpathUrls)
      Create a jar that defines classes with the given names.

      Note: if this is used during class loader tests, class names should be unique in order to avoid interference between tests.

      Parameters:
      classNames - names of the classes to define in the jar
      toStringValue - value returned by the toString method of each generated class
      classNamesWithBase - (class name, base class name) pairs for generated classes that should extend a base class
      classpathUrls - extra classpath URLs to use when compiling the classes
      Returns:
      URL of the created jar file
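      From Scala, a class-loader test might build a throwaway jar and load one of the generated classes back out of it. The following is a minimal sketch, assuming spark-core's test classes are on the classpath; FakeClassA is a made-up class name:

      ```scala
      import java.net.URLClassLoader
      import org.apache.spark.TestUtils

      // Jar with one generated class whose toString returns "hello".
      val jarUrl = TestUtils.createJarWithClasses(
        Seq("FakeClassA"), "hello", Seq.empty, Seq.empty)

      // Load the generated class and check its toString value.
      val loader = new URLClassLoader(Array(jarUrl), getClass.getClassLoader)
      val instance = loader.loadClass("FakeClassA").getConstructor().newInstance()
      assert(instance.toString == "hello")
      ```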
    • createJarWithFiles

      public static URL createJarWithFiles(scala.collection.immutable.Map<String,String> files, File dir)
      Create a jar file containing multiple files. The files map contains a mapping of file names in the jar file to their contents.
      Parameters:
      files - mapping from file names inside the jar to their contents
      dir - directory in which to create the jar file
      Returns:
      URL of the created jar file
    • createJar

      public static URL createJar(scala.collection.immutable.Seq<File> files, File jarFile, scala.Option<String> directoryPrefix, scala.Option<String> mainClass)
      Create a jar file that contains the given set of files. All files are placed under the given directory prefix inside the jar, or at the jar root if no prefix is given.
      Parameters:
      files - files to include in the jar
      jarFile - destination jar file
      directoryPrefix - optional directory prefix under which to place the files inside the jar
      mainClass - optional main class to record in the jar manifest
      Returns:
      URL of the created jar file
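      In Scala this can be used to package fixture files under a common prefix. A sketch, assuming spark-core is on the classpath; the file names and the Main class are placeholders:

      ```scala
      import java.io.File
      import org.apache.spark.TestUtils

      // Place a.txt and b.txt under "fixtures/" inside /tmp/test.jar,
      // recording "Main" as the manifest's main class.
      val url = TestUtils.createJar(
        Seq(new File("a.txt"), new File("b.txt")),
        new File("/tmp/test.jar"),
        directoryPrefix = Some("fixtures"),
        mainClass = Some("Main"))
      ```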
    • assertSpilled

      public static void assertSpilled(SparkContext sc, String identifier, scala.Function0<scala.runtime.BoxedUnit> body)
      Run some code involving jobs submitted to the given context and assert that the jobs spilled.
      Parameters:
      sc - the SparkContext under which the jobs are run
      identifier - string identifying the jobs, included in the assertion failure message
      body - code that submits the jobs
    • assertNotSpilled

      public static void assertNotSpilled(SparkContext sc, String identifier, scala.Function0<scala.runtime.BoxedUnit> body)
      Run some code involving jobs submitted to the given context and assert that the jobs did not spill.
      Parameters:
      sc - the SparkContext under which the jobs are run
      identifier - string identifying the jobs, included in the assertion failure message
      body - code that submits the jobs
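      In Scala, both assertions take the job-submitting code as a trailing block (the Java signatures above flatten the curried Scala parameter lists). A sketch; whether the first job actually spills depends on the context's memory configuration:

      ```scala
      import org.apache.spark.{SparkConf, SparkContext, TestUtils}

      val sc = new SparkContext(
        new SparkConf().setMaster("local[2]").setAppName("spill-test"))

      // Fails unless at least one task in the block's jobs spilled.
      TestUtils.assertSpilled(sc, "groupByKey") {
        sc.parallelize(1 to 100000).map(i => (i % 10, i)).groupByKey().count()
      }

      // Fails if any task in the block's jobs spilled.
      TestUtils.assertNotSpilled(sc, "small count") {
        sc.parallelize(1 to 10).count()
      }

      sc.stop()
      ```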
    • assertExceptionMsg

      public static <E extends Throwable> void assertExceptionMsg(Throwable exception, String msg, boolean ignoreCase, scala.reflect.ClassTag<E> evidence$1)
      Asserts that the exception message contains the given message. Please note this checks all exceptions in the cause tree. If a type parameter E is supplied, this will additionally confirm that the exception is a subtype of E.
      Parameters:
      exception - the thrown exception to inspect
      msg - substring expected in the exception message
      ignoreCase - whether to ignore case when matching the message
      evidence$1 - implicit ClassTag for E, supplied by the Scala compiler
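      For example, searching a cause chain for an IllegalArgumentException with the expected text (a sketch, assuming spark-core is on the classpath):

      ```scala
      import org.apache.spark.TestUtils

      val e = new RuntimeException("outer", new IllegalArgumentException("bad input"))

      // Walks the cause tree; passes because the nested IllegalArgumentException's
      // message contains "bad input" (matched case-insensitively here).
      TestUtils.assertExceptionMsg[IllegalArgumentException](e, "BAD INPUT", ignoreCase = true)
      ```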
    • testCommandAvailable

      public static boolean testCommandAvailable(String command)
      Test if a command is available.
      Parameters:
      command - name of the command to test
      Returns:
      true if the command is available on the current system
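      A typical use is to skip tests that shell out to an external tool, for example in a ScalaTest suite (a sketch; the command name is just an example):

      ```scala
      import org.apache.spark.TestUtils

      // assume() comes from ScalaTest's Assertions trait; it cancels rather than
      // fails the test when the command is missing.
      assume(TestUtils.testCommandAvailable("python3"), "python3 is not available")
      ```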
    • minimumPythonSupportedVersion

      public static String minimumPythonSupportedVersion()
    • isPythonVersionAvailable

      public static boolean isPythonVersionAvailable()
    • getAbsolutePathFromExecutable

      public static scala.Option<String> getAbsolutePathFromExecutable(String executable)
      Get the absolute path from the executable. This implementation was borrowed from spark/dev/sparktestsupport/shellutils.py.
      Parameters:
      executable - name of the executable to locate
      Returns:
      absolute path of the executable if found, None otherwise
    • httpResponseCode

      public static int httpResponseCode(URL url, String method, scala.collection.immutable.Seq<scala.Tuple2<String,String>> headers)
      Returns the response code from an HTTP(S) URL.
      Parameters:
      url - the URL to connect to
      method - the HTTP method to use, e.g. "GET"
      headers - request headers as (name, value) pairs
      Returns:
      the HTTP response code
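      For example, probing a server started by a test (a sketch; the URL is a placeholder, and the code returned depends on the endpoint):

      ```scala
      import java.net.URL
      import org.apache.spark.TestUtils

      // Issue a GET with an Accept header and inspect the status code.
      val code = TestUtils.httpResponseCode(
        new URL("http://localhost:4040/api/v1/applications"),
        "GET",
        Seq("Accept" -> "application/json"))
      // e.g. assert(code == 200) once the server under test is known to be up
      ```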
    • redirectUrl

      public static String redirectUrl(URL url, String method, scala.collection.immutable.Seq<scala.Tuple2<String,String>> headers)
      Returns the Location header from an HTTP(S) URL.
      Parameters:
      url - the URL to connect to
      method - the HTTP method to use, e.g. "GET"
      headers - request headers as (name, value) pairs
      Returns:
      the value of the Location header
    • httpResponseMessage

      public static String httpResponseMessage(URL url, String method, scala.collection.immutable.Seq<scala.Tuple2<String,String>> headers)
      Returns the response message from an HTTP(S) URL.
      Parameters:
      url - the URL to connect to
      method - the HTTP method to use, e.g. "GET"
      headers - request headers as (name, value) pairs
      Returns:
      the HTTP response message
    • withHttpConnection

      public static <T> T withHttpConnection(URL url, String method, scala.collection.immutable.Seq<scala.Tuple2<String,String>> headers, scala.Function1<HttpURLConnection,T> fn)
    • withListener

      public static <L extends SparkListener> void withListener(SparkContext sc, L listener, scala.Function1<L,scala.runtime.BoxedUnit> body)
      Runs some code with the given listener installed in the SparkContext. After the code runs, this method will wait until all events posted to the listener bus are processed, and then remove the listener from the bus.
      Parameters:
      sc - the SparkContext to install the listener in
      listener - the listener to install
      body - code to run with the listener installed
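      In Scala the body is a trailing block that receives the listener; because the method waits for the listener bus to drain before removing it, state accumulated by the listener can be read safely afterwards. A sketch; the JobCounter listener is a made-up example:

      ```scala
      import org.apache.spark.{SparkContext, TestUtils}
      import org.apache.spark.scheduler.{SparkListener, SparkListenerJobEnd}

      // Counts completed jobs while installed.
      class JobCounter extends SparkListener {
        @volatile var jobs = 0
        override def onJobEnd(end: SparkListenerJobEnd): Unit = jobs += 1
      }

      def countJobs(sc: SparkContext): Int = {
        val counter = new JobCounter
        TestUtils.withListener(sc, counter) { _ =>
          sc.parallelize(1 to 10).count()
        }
        // Safe to read here: all bus events have been processed.
        counter.jobs
      }
      ```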
    • withHttpServer

      public static void withHttpServer(String resBaseDir, scala.Function1<URL,scala.runtime.BoxedUnit> body)
    • configTestLog4j2

      public static void configTestLog4j2(String level)
      Configures log4j2 properties for use by the test suite.
      Parameters:
      level - the root log level to set
    • recursiveList

      public static File[] recursiveList(File f)
      Lists files recursively.
      Parameters:
      f - root file or directory to list
      Returns:
      all files found under f, recursively
    • listDirectory

      public static String[] listDirectory(File path)
      Returns the list of files at 'path' recursively. This skips files that are ignored normally by MapReduce.
      Parameters:
      path - directory to list
      Returns:
      paths of the files found under 'path', recursively
    • createTempJsonFile

      public static String createTempJsonFile(File dir, String prefix, org.json4s.JValue jsonValue)
      Creates a temp JSON file that contains the input JSON record.
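      With the json4s DSL this makes staging small JSON fixtures a one-liner. A sketch; the directory, prefix, and field names are placeholders:

      ```scala
      import java.io.File
      import org.json4s.JsonDSL._
      import org.apache.spark.TestUtils

      // Writes {"name":"test","count":1} to a fresh temp file and returns its path.
      val json = ("name" -> "test") ~ ("count" -> 1)
      val path: String = TestUtils.createTempJsonFile(new File("/tmp"), "event", json)
      ```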
    • createTempScriptWithExpectedOutput

      public static String createTempScriptWithExpectedOutput(File dir, String prefix, String output)
      Creates a temp bash script that prints the given output.
    • org$apache$spark$util$SparkTestUtils$$SOURCE

      public static JavaFileObject.Kind org$apache$spark$util$SparkTestUtils$$SOURCE()
    • createCompiledClass

      public static File createCompiledClass(String className, File destDir, SparkTestUtils.JavaSourceFromString sourceFile, scala.collection.immutable.Seq<URL> classpathUrls)
    • createCompiledClass

      public static File createCompiledClass(String className, File destDir, String toStringValue, String baseClass, scala.collection.immutable.Seq<URL> classpathUrls, scala.collection.immutable.Seq<String> implementsClasses, String extraCodeBody, scala.Option<String> packageName)
    • createCompiledClass$default$3

      public static String createCompiledClass$default$3()
    • createCompiledClass$default$4

      public static String createCompiledClass$default$4()
    • createCompiledClass$default$5

      public static scala.collection.immutable.Seq<URL> createCompiledClass$default$5()
    • createCompiledClass$default$6

      public static scala.collection.immutable.Seq<String> createCompiledClass$default$6()
    • createCompiledClass$default$7

      public static String createCompiledClass$default$7()
    • createCompiledClass$default$8

      public static scala.Option<String> createCompiledClass$default$8()