Telemetry/Custom analysis with spark

##      Set a schedule frequency using the remaining fields.


Now the notebook will be re-run automatically and the results can be easily shared. Furthermore, all files stored in the notebook's local working directory at the end of the job will be automatically uploaded to S3, which comes in handy for simple ETL jobs, for example.
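Since everything left in the working directory at the end of the job is uploaded, a scheduled notebook only needs to write its output files locally. A minimal sketch (the metric values and the <code>dashboard_data.json</code> filename are hypothetical placeholders):

```python
import json
import os

# Placeholder results a scheduled analysis job might compute.
results = {"active_users": 12345, "crash_rate": 0.02}

# Writing to the notebook's local working directory is enough:
# the scheduler uploads these files to S3 when the job finishes,
# so no explicit S3 upload call is needed in the notebook itself.
with open("dashboard_data.json", "w") as f:
    json.dump(results, f)

print(os.path.exists("dashboard_data.json"))
```

A dashboard or follow-up job can then fetch the uploaded file from S3 instead of re-running the analysis.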


For reference, see [https://robertovitillo.com/2015/03/13/simple-dashboards-with-scheduled-spark-jobs-and-plotly Simple Dashboard with Scheduled Spark Jobs and Plotly].