
Project Description

TODO

Installation

Prerequisites:

For the graph implementation specifically, you need to install graphframes manually from a third party, since the official release is incompatible with Spark 3.x (pull request pending). A prebuilt copy is supplied in the spark-packages directory.
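
Assuming the prebuilt jar from spark-packages has been attached to the Spark session (for example via spark-submit's --jars and --py-files options), a minimal sanity check that the library is usable could look like the sketch below; the toy graph is purely illustrative and not part of the project.

```python
# Minimal GraphFrames check, assuming the prebuilt jar from spark-packages
# has already been attached to the session (e.g. by the submit script).
from pyspark.sql import SparkSession
from graphframes import GraphFrame

spark = SparkSession.builder.appName("graphframes-check").getOrCreate()

# Tiny illustrative graph: vertices need an "id" column, edges need "src"/"dst".
vertices = spark.createDataFrame([("a",), ("b",), ("c",)], ["id"])
edges = spark.createDataFrame([("a", "b"), ("b", "c")], ["src", "dst"])

g = GraphFrame(vertices, edges)
g.degrees.show()  # simple sanity check that the library is wired up
```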

Setting up

  • Modify settings.json to reflect your setup. If you are running everything locally, you can use start_services.sh to start everything in one go. It might take a few minutes for Cassandra to become available.
  • Load the development database by running python3 setup.py from the project root. By default this loads small_test_data.csv into the transactions table; a rough sketch of this kind of load follows this list.
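
For orientation only, the kind of load setup.py performs could look roughly like the sketch below, written against the Spark Cassandra connector. The keyspace name, connection host, and the assumption that the keyspace and table already exist are illustrative, not the project's actual code.

```python
# Hedged sketch of a development-data load: copy small_test_data.csv into a
# Cassandra "transactions" table. Requires the Spark Cassandra connector on the
# classpath and assumes the keyspace/table already exist.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("load-dev-data")
    .config("spark.cassandra.connection.host", "127.0.0.1")  # assumed local setup
    .getOrCreate()
)

df = spark.read.csv("small_test_data.csv", header=True, inferSchema=True)

(
    df.write.format("org.apache.spark.sql.cassandra")
    .options(table="transactions", keyspace="dev")  # keyspace name is an assumption
    .mode("append")
    .save()
)
```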

Deploying:

  • Start the Spark workload by running either submit.sh (slow) or submit_graph.sh (faster).
  • If you need to clean out the database, run python3 clean.py. Be aware that this wipes all table definitions and data; a rough sketch of this kind of wipe follows below.
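
For reference, a wipe like this can be expressed directly against Cassandra with the Python driver. The snippet below is only a sketch; clean.py may be implemented differently, and the keyspace name and host are assumptions.

```python
# Hedged sketch of a destructive clean-out: dropping the development keyspace
# removes all table definitions and data. Keyspace name and host are assumptions.
from cassandra.cluster import Cluster

cluster = Cluster(["127.0.0.1"])  # assumed local Cassandra
session = cluster.connect()
session.execute("DROP KEYSPACE IF EXISTS dev")
cluster.shutdown()
```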