Contexts
Mist creates and orchestrates Apache Spark contexts automatically. Every job runs in a context. A context describes a named Spark context together with the Mist settings for it.
Contexts may be created using mist-cli or the HTTP API.
There is also a special default context, which can be configured only in the Mist configuration file.
Its goal is to provide default values for all contexts, so that a new context does not have to define values for every one of its fields.
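As a sketch, overriding the default context in the Mist configuration file might look like the following. The HOCON layout and the exact key names here are assumptions and may differ between Mist versions; check your distribution's `default.conf` for the authoritative names:

```hocon
# Assumed layout of the Mist configuration file (HOCON).
# Key names below are illustrative, not authoritative.
mist {
  context-defaults {
    max-parallel-jobs  = 1            # corresponds to maxJobs
    worker-mode        = "exclusive"  # corresponds to workerMode
    downtime           = Inf          # corresponds to downtime
    streaming-duration = 1s           # corresponds to streamingDuration
    run-options        = ""           # corresponds to runOptions
    spark-conf         { }            # corresponds to sparkConf
  }
}
```

Any context created later inherits these defaults unless it overrides them explicitly.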
Settings:
| Key | Default | Meaning |
|-----|---------|---------|
| `sparkConf` | empty | settings for the underlying [Spark context](https://spark.apache.org/docs/latest/configuration.html) |
| `maxJobs` | 1 | number of jobs executed in parallel |
| `workerMode` | `exclusive` | `exclusive` starts a new worker for every job; `shared` reuses one worker for all jobs in the context |
| `maxConnFailures` | 5 | allowed number of worker crashes before the context is switched into the `broken` state (it fails all incoming requests until the context settings are updated) |
| `runOptions` | "" | additional command-line arguments for the `spark-submit` command that starts the worker, e.g. to pass `--jars` |
| `streamingDuration` | 1s | Spark Streaming batch duration |
| `precreated` | false | if `true`, the worker starts immediately; if `false`, the worker starts on the first job request. *NOTE*: works only with the `shared` workerMode |
| `downtime` | infinity | idle timeout for a `shared` worker |
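Putting the settings together, a request body for creating a context via the HTTP API might look like the example below. The field names mirror the table above, but the context name, the Spark properties, and the exact JSON shape accepted by the endpoint are assumptions for illustration:

```json
{
  "name": "streaming-ctx",
  "sparkConf": { "spark.executor.memory": "1g" },
  "maxJobs": 2,
  "workerMode": "shared",
  "maxConnFailures": 5,
  "runOptions": "--jars /opt/libs/extra.jar",
  "streamingDuration": "1s",
  "precreated": true,
  "downtime": "10min"
}
```

With `workerMode` set to `shared` and `precreated` set to `true`, the worker starts immediately and is reused by all jobs in this context until it has been idle for `downtime`.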