Telemetry/Custom analysis with spark



2. The connection from PySpark to the Spark driver might be lost. Unfortunately, the most reliable way to recover from this at the moment is to spin up a new cluster.
3. Canceling execution of a notebook cell doesn't cancel any Spark jobs that might still be running in the background. If your Spark commands seem to be hanging, try running `sc.cancelAllJobs()` (see the sketch below).
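A minimal sketch of the tip above, assuming a PySpark notebook environment where a SparkContext (`sc`) is already running, as on the analysis clusters this page describes:

```python
from pyspark import SparkContext

# Reuse the SparkContext the notebook environment already started
# (creates one only if none exists).
sc = SparkContext.getOrCreate()

# Cancel every active job on this context. Any cell still blocked on
# those jobs will stop waiting and return control to the notebook.
sc.cancelAllJobs()
```

Run this in its own cell after interrupting the hung cell; it only cancels jobs on the current context and does not restart the cluster.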