DataFrame-class {SparkR}	R Documentation

S4 class that represents a DataFrame

Description

DataFrames can be created using functions such as createDataFrame, read.json, and table.
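
A minimal sketch of these three creation paths, assuming an initialized sqlContext (as in the Examples below); the JSON path and the table name "people" are placeholders:

## Not run: 
# From a local R data.frame
df1 <- createDataFrame(sqlContext, faithful)
# From a JSON file (path is a placeholder)
df2 <- read.json(sqlContext, "examples/src/main/resources/people.json")
# From a table registered with the SQLContext (table name is a placeholder)
df3 <- table(sqlContext, "people")
## End(Not run)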

Usage

dataFrame(sdf, isCached = FALSE)

groupedData(sgd)

Arguments

sdf

A Java object reference to the backing Scala DataFrame

isCached

TRUE if the DataFrame is cached

sgd

A Java object reference to the backing Scala GroupedData

Slots

env

An R environment that stores bookkeeping states of the DataFrame

sdf

A Java object reference to the backing Scala DataFrame
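
A short sketch of inspecting these slots with standard S4 accessors, assuming a DataFrame df created as in the Examples; that the cache flag is kept inside env is an assumption based on the constructor's isCached argument:

## Not run: 
# Java object reference to the backing Scala DataFrame
df@sdf
# Bookkeeping environment; assumed to hold the cache flag set via dataFrame(..., isCached)
df@env$isCached
## End(Not run)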

See Also

createDataFrame, read.json, table

https://spark.apache.org/docs/latest/sparkr.html#sparkr-dataframes

Other DataFrame functions: $, $<-, select, select, select,DataFrame,Column-method, select,DataFrame,list-method, selectExpr; [, [, [[, subset; agg, agg, count,GroupedData-method, summarize, summarize; arrange, arrange, arrange, orderBy, orderBy; as.data.frame, as.data.frame,DataFrame-method; attach, attach,DataFrame-method; cache; collect; colnames, colnames, colnames<-, colnames<-, columns, names, names<-; coltypes, coltypes, coltypes<-, coltypes<-; columns, dtypes, printSchema, schema, schema; count, nrow; describe, describe, describe, summary, summary, summary,PipelineModel-method; dim; distinct, unique; dropna, dropna, fillna, fillna, na.omit, na.omit; dtypes; except, except; explain, explain; filter, filter, where, where; first, first; groupBy, groupBy, group_by, group_by; head; insertInto, insertInto; intersect, intersect; isLocal, isLocal; join; limit, limit; merge, merge; mutate, mutate, transform, transform; ncol; persist; printSchema; rbind, rbind, unionAll, unionAll; registerTempTable, registerTempTable; rename, rename, withColumnRenamed, withColumnRenamed; repartition; sample, sample, sample_frac, sample_frac; saveAsParquetFile, saveAsParquetFile, write.parquet, write.parquet; saveAsTable, saveAsTable; saveDF, saveDF, write.df, write.df, write.df; selectExpr; showDF, showDF; show, show, show,GroupedData-method; str; take; unpersist; withColumn, withColumn; write.json, write.json; write.text, write.text

Examples

## Not run: 
# Initialize a SparkContext and a SQLContext (SparkR 1.6 API)
sc <- sparkR.init()
sqlContext <- sparkRSQL.init(sc)
# Create a DataFrame from the local faithful data set
df <- createDataFrame(sqlContext, faithful)
## End(Not run)
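
A few follow-on calls drawn from the functions listed under See Also, continuing from the df created above (not run):

## Not run: 
# Basic inspection
printSchema(df)
head(df)
count(df)
# Cache the DataFrame and bring it back as a local data.frame
cache(df)
localDF <- collect(df)
## End(Not run)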

[Package SparkR version 1.6.1 Index]