Search engine for the InterPlanetary File System. Sniffs the DHT gossip and indexes file and directory hashes.
Metadata and contents are extracted using ipfs-tika, search is provided by Elasticsearch 5, and queueing is handled by RabbitMQ. The crawler is implemented in Go; the API and frontend are built with Node.js.
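Since search is served straight from Elasticsearch, the index can also be queried directly once the stack is running (see the Vagrant setup below). A minimal sketch using Elasticsearch's URI search, querying across all indices since the exact index name used by the crawler is not stated here:

$ curl 'http://localhost:9200/_search?q=readme&pretty'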
Preliminary, minimal documentation can be found in the docs folder.
Building a search engine like this takes considerable resources (money and TLC). If you are able to help out with either, mail us at info@ipfs-search.com or find us in #ipfssearch on Freenode (or #ipfs-search:chat.weho.st on Matrix).
To discuss or suggest features, see the project planning.
To build the crawler:

$ go get ./...
$ make
First, make sure Ansible 2.2 is installed:
$ pip2 install 'ansible<2.3'
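To confirm that the expected version ends up on your path:

$ ansible --version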
Local installation is done using Vagrant:
$ git clone https://github.com/ipfs-search/ipfs-search.git $GOPATH/src/github.com/ipfs-search/ipfs-search
$ cd $GOPATH/src/github.com/ipfs-search/ipfs-search
$ vagrant up
This starts the API on port 9615, Elasticsearch on port 9200 and the RabbitMQ management interface on port 15672.
The Vagrant setup does not currently start the frontend.
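Assuming Vagrant forwards these ports to the host, a quick sanity check that the backend services are up:

$ curl http://localhost:9200/

This should return Elasticsearch's cluster information; the RabbitMQ management interface should likewise be reachable at http://localhost:15672/ in a browser.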
Deployment requires an Ubuntu 16.04 target machine:
$ ansible-playbook provisioning/bootstrap.yml --user root --ask-pass
$ ansible-playbook provisioning/backend.yml
$ ansible-playbook provisioning/frontend.yml
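The playbooks run against whatever inventory Ansible is configured with. To preview changes without touching the machine, a dry run can be used (the inventory file name here is just an example; -i and --check are standard ansible-playbook flags):

$ ansible-playbook -i hosts provisioning/backend.yml --check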