
Save the contents of a SparkDataFrame as an ORC file, preserving the schema. Files written out with this method can be read back in as a SparkDataFrame using read.orc().

Usage

write.orc(x, path, ...)

# S4 method for SparkDataFrame,character
write.orc(x, path, mode = "error", ...)

Arguments

x

A SparkDataFrame

path

The directory where the file is saved

...

additional argument(s) passed to the method. ORC-specific options for writing ORC files are listed under "Data Source Option" at https://spark.apache.org/docs/latest/sql-data-sources-orc.html#data-source-option for the Spark version you use.

mode

one of 'append', 'overwrite', 'error', 'errorifexists', 'ignore'; the save mode (defaults to 'error').
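
As a sketch of how the save modes behave (the paths here are hypothetical, and an active Spark session with a SparkDataFrame `df` is assumed): the default 'error' (alias 'errorifexists') fails if the target directory already exists, 'overwrite' replaces its contents, 'append' adds new files alongside existing ones, and 'ignore' silently does nothing when the target exists.

```r
# Hypothetical paths; requires sparkR.session() and a SparkDataFrame df
write.orc(df, "/tmp/out/")                      # errors if /tmp/out/ exists (default mode "error")
write.orc(df, "/tmp/out/", mode = "overwrite")  # replaces any existing contents
write.orc(df, "/tmp/out/", mode = "append")     # adds files next to existing ones
write.orc(df, "/tmp/out/", mode = "ignore")     # no-op if /tmp/out/ already exists
```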

Note

write.orc since 2.0.0

Examples

if (FALSE) {
sparkR.session()
# Read a JSON file into a SparkDataFrame, then write it out in ORC format
path <- "path/to/file.json"
df <- read.json(path)
write.orc(df, "/tmp/sparkr-tmp1/")
}
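
A round trip can also be sketched without an input file: build a small SparkDataFrame with `createDataFrame`, write it as ORC, and read it back with `read.orc` (the output path is hypothetical).

```r
if (FALSE) {
sparkR.session()
# Create a small SparkDataFrame from a local data.frame
df <- createDataFrame(data.frame(id = 1:3, name = c("a", "b", "c")))
# Write it as ORC, replacing any previous output at that path
write.orc(df, "/tmp/people-orc/", mode = "overwrite")
# Read it back; the schema (id, name) is preserved
df2 <- read.orc("/tmp/people-orc/")
head(df2)
}
```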