This project uses Apache Spark to explore the popular New York City Current Job Postings Kaggle dataset.
Apache Zeppelin notebooks are used to work interactively with the data; please refer to the Wiki for details of how the development environment was configured for this project.
- Hello World - A basic notebook to make sure Apache Spark is working; it includes a simple word count example.
- Word Cloud - This notebook demonstrates how to draw a simple word cloud with the D3.js library.
- Spark Core - Simple reference snippets for Apache Spark Core.
- Basic Exploration - Some rudimentary exploration of the data to see what deserves further investigation.
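The word count mentioned above follows the classic map/reduce shape. A minimal sketch in plain Scala (the sample text is illustrative, not taken from the dataset; in the notebook the same steps run on an RDD, as noted in the comment):

```scala
// Word count over a small in-memory sample.
// The Spark RDD equivalent of the pipeline below would be roughly:
//   textRdd.flatMap(_.split("\\s+")).map((_, 1)).reduceByKey(_ + _)
val lines = Seq("to be or not to be", "to see or not to see")

val counts = lines
  .flatMap(_.split("\\s+"))               // split each line into words
  .groupBy(identity)                      // group identical words together
  .map { case (w, ws) => (w, ws.size) }   // count occurrences of each word
  .toSeq
  .sortBy(-_._2)                          // most frequent words first

counts.foreach(println)
// → (to,4) first, followed by the words that occur twice
```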
It is quite useful to draw a word cloud of text data such as the job descriptions. The Word Cloud notebook illustrates how this can be done with a small amount of code using D3.js and two additional external JavaScript files.
```scala
println("%html")
println("<script>")
println("  var wordCollection = [")
textRdd.collect.foreach(t =>
  println("    {text: \"" + t._1 + "\", size: " + t._2 * 5 + "},")
)
println("  ]")
println("</script>")
```
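The snippet above assumes `textRdd` holds `(word, count)` pairs and turns each into one JavaScript object literal for the cloud. The same string-building step can be checked without Spark (the pairs here are hard-coded for illustration; in the notebook they come from `textRdd.collect`):

```scala
// Build wordCollection entries from (word, count) pairs.
// The count is scaled by 5 to give D3 a usable font size, matching
// the notebook snippet above.
val pairs = Seq(("data", 12), ("analyst", 7))

val entries = pairs.map { case (text, size) =>
  s"""{text: "$text", size: ${size * 5}},"""
}

entries.foreach(println)
// → {text: "data", size: 60},
//   {text: "analyst", size: 35},
```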
```scala
print(s"""%html
<script src="http://d3js.org/d3.v3.min.js"></script>
<script src="https://cdn.statically.io/gh/JohnnyFoulds/nyc-job-exploration/b9cd4af7/zeppelin/notebook/word-cloud/js/d3.layout.cloud.js"></script>
<script src="https://cdn.statically.io/gh/JohnnyFoulds/nyc-job-exploration/b9cd4af7/zeppelin/notebook/word-cloud/js/word.cloud.js"></script>
<div id="wordCloud"></div>
<script>
  // Create a new instance of the word cloud visualisation.
  var myWordCloud = wordCloud('#wordCloud');
  myWordCloud.update(wordCollection);
</script>
""")
```
The d3.layout.cloud.js library is taken from https://github.com/jasondavies/d3-cloud.
Statistical and Mathematical Functions with DataFrames in Apache Spark - https://databricks.com/blog/2015/06/02/statistical-and-mathematical-functions-with-dataframes-in-spark.html