Apache Spark cluster manager types
As discussed previously, Apache Spark currently supports three cluster managers:
- Standalone cluster manager
- Apache Mesos
- Hadoop YARN
We'll look at setting these up in much more detail in Chapter 8, Operating in Clustered Mode.
Building standalone applications with Apache Spark
Until now, we have used Spark for exploratory analysis through the Scala and Python shells. Spark can also be used in standalone applications written in Java, Scala, Python, or R. As we saw earlier, the Spark shell and PySpark provide you with a SparkContext. However, when you are writing a standalone application, you need to initialize your own SparkContext. Once you have a SparkContext reference, the remaining ...
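As a minimal sketch, initializing your own SparkContext in a standalone Scala application might look like the following. The object name, application name, and master URL here are placeholders, not values from the book:

```scala
import org.apache.spark.{SparkConf, SparkContext}

object MyStandaloneApp {
  def main(args: Array[String]): Unit = {
    // Configure the application; the app name and master URL are placeholders.
    val conf = new SparkConf()
      .setAppName("MyStandaloneApp")  // shown in the cluster UI
      .setMaster("local[*]")          // or e.g. spark://host:7077 for standalone,
                                      // yarn, or mesos://host:5050
    val sc = new SparkContext(conf)

    // Use the context just as you would in the shell
    val data = sc.parallelize(1 to 100)
    println(s"Sum: ${data.sum()}")

    sc.stop()  // release cluster resources when done
  }
}
```

In practice, the master URL is often left out of the code and supplied at launch time via `spark-submit --master ...`, so the same application can run under any of the three cluster managers without recompilation.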