with {SparkR}    R Documentation

Evaluate an R expression in an environment constructed from a SparkDataFrame


Description

with() allows access to the columns of a SparkDataFrame by simply referring to their names. It appends every column of the SparkDataFrame to a new environment, and the given expression is then evaluated in that environment.


Usage

with(data, expr, ...)

## S4 method for signature 'SparkDataFrame'
with(data, expr, ...)



Arguments

data

(SparkDataFrame) SparkDataFrame to use for constructing an environment.


expr

(expression) Expression to evaluate.


...

arguments to be passed to future methods.


Note

with since 1.6.0

See Also


Other SparkDataFrame functions: SparkDataFrame-class, agg, alias, arrange, as.data.frame, attach,SparkDataFrame-method, broadcast, cache, checkpoint, coalesce, collect, colnames, coltypes, createOrReplaceTempView, crossJoin, cube, dapplyCollect, dapply, describe, dim, distinct, dropDuplicates, dropna, drop, dtypes, except, explain, filter, first, gapplyCollect, gapply, getNumPartitions, group_by, head, hint, histogram, insertInto, intersect, isLocal, isStreaming, join, limit, localCheckpoint, merge, mutate, ncol, nrow, persist, printSchema, randomSplit, rbind, registerTempTable, rename, repartition, rollup, sample, saveAsTable, schema, selectExpr, select, showDF, show, storageLevel, str, subset, summary, take, toJSON, unionByName, union, unpersist, withColumn, withWatermark, write.df, write.jdbc, write.json, write.orc, write.parquet, write.stream, write.text


Examples

## Not run: 
with(irisDf, nrow(Sepal_Width))
## End(Not run)
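A minimal additional sketch, assuming an active SparkR session; faithful is a dataset shipped with base R, and df is a hypothetical SparkDataFrame built from it:

## Not run: 
df <- createDataFrame(faithful)
# Inside with(), waiting resolves to the column df$waiting,
# so the filter condition can reference it by bare name.
with(df, head(filter(df, waiting > 70)))
## End(Not run)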

[Package SparkR version 2.3.0]