R Session

Configuration for the R session.

Spark

Starting RStudio in this mode will configure the Spark, PySpark, and SparkR kernels.


Spark Static

Spark is a general-purpose distributed data processing engine.

When using Spark static, you set a fixed number of executors. Your application keeps all of the resources it is allocated for its entire lifetime, even when they are idle.
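As a sketch, a static allocation could be expressed with standard Spark configuration properties like these (the key names are standard Spark settings; the values are placeholders, not recommendations from this page):

```properties
# Static allocation: a fixed executor count is held for the whole
# application, whether or not the executors are busy. Example values only.
spark.dynamicAllocation.enabled  false
spark.executor.instances         4
spark.executor.cores             2
spark.executor.memory            4g
```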

Spark Dynamic

With Spark dynamic, you set a minimum and maximum number of executors. Your application releases idle resources back to the cluster and requests them again later when demand increases.
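The equivalent dynamic setup can be sketched with Spark's dynamic allocation properties (again, the values are illustrative placeholders): executors scale between the configured minimum and maximum, and executors idle beyond the timeout are returned to the cluster. Dynamic allocation typically also requires the external shuffle service.

```properties
# Dynamic allocation: executor count scales between min and max with demand;
# executors idle longer than the timeout are released. Example values only.
spark.dynamicAllocation.enabled              true
spark.dynamicAllocation.minExecutors         1
spark.dynamicAllocation.maxExecutors         10
spark.dynamicAllocation.executorIdleTimeout  60s
spark.shuffle.service.enabled                true
```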