
User Guide

Andrea Gazzarini edited this page Apr 7, 2015 · 22 revisions

Get me up and running

If you don't have Solr installed

I assume you already have Java (7), Maven (3.x) and git on your system.

Checkout the project

Open a new shell and type the following:

# cd /tmp
# git clone https://github.com/agazzarini/SolRDF.git solrdf-download

Build and run SolRDF

# cd solrdf-download/solrdf
# mvn clean package cargo:run

The very first time you run this command a lot of things will be downloaded, Solr included. At the end you should see something like this:

[INFO] Jetty 7.6.15.v20140411 Embedded started on port [8080]
[INFO] Press Ctrl-C to stop the container...

SolRDF is up and running!

If you already have Solr installed

I assume you already have Java (7) and Apache Solr (4.8.x - 4.10.x) on your system.

Download the SolRDF home

Download from here a packaged and ready-to-use Solr home. It includes configuration files and all the required dependencies.

Extract the downloaded archive

Let's suppose the extraction directory is $EXTRACT_DIR. After extracting the archive, you will find a solrdf directory within $EXTRACT_DIR.
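The expected layout can be sketched as follows. The archive name solrdf-home.zip is hypothetical (use the file you actually downloaded); the snippet simulates the extracted tree with a dummy directory so the final listing is reproducible without the real archive:

```shell
# Hypothetical extraction step (the archive name is illustrative):
#   unzip solrdf-home.zip -d "$EXTRACT_DIR"
# Below we simulate the resulting layout instead of running unzip:
EXTRACT_DIR=$(mktemp -d)
mkdir -p "$EXTRACT_DIR/solrdf"
ls "$EXTRACT_DIR"    # prints: solrdf
```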

Start Solr

Follow the standard procedure for starting Solr: open a new shell under the example folder of the Solr installation and type:

> java -Dsolr.solr.home=$EXTRACT_DIR/solrdf -Dsolr.data.dir=/tmp/tmpdata -jar start.jar

Note that I've used /tmp/tmpdata as the data directory; you may want to change that.
You should see something like this:

[INFO] Jetty 7.6.15.v20140411 Embedded started on port [8080]
[INFO] Press Ctrl-C to stop the container...

SolRDF is up and running!

Add data

Now let's add some data. You can do that in one of the following ways:

Browser

Open your favourite browser and type the following URL (the line has been split for readability):

http://localhost:8080/solr/store/update/bulk?commit=true
&update.contentType=application/n-triples
&stream.file=/tmp/solrdf-download/solrdf/src/test/resources/sample-data/bsbm-generated-dataset.nt

This is an example with the bundled sample data. If your file is somewhere else (e.g. remote), you can use the stream.url parameter to indicate its URL. For example:

http://localhost:8080/solr/store/update/bulk?commit=true
&update.contentType=application/rdf%2Bxml
&stream.url=http://ec.europa.eu/eurostat/ramon/rdfdata/countries.rdf
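To make the encoding explicit, here is a sketch that assembles the URL above from its parts. All values are taken from the example; note that the "+" in "application/rdf+xml" must be URL-encoded as "%2B":

```shell
# Assemble the bulk-update URL from its parts. The "+" in the content
# type "application/rdf+xml" is URL-encoded as "%2B"; left unencoded it
# would be decoded as a space by the server.
BASE="http://localhost:8080/solr/store/update/bulk"
CONTENT_TYPE="application/rdf%2Bxml"
SOURCE="http://ec.europa.eu/eurostat/ramon/rdfdata/countries.rdf"
echo "${BASE}?commit=true&update.contentType=${CONTENT_TYPE}&stream.url=${SOURCE}"
```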

Command line

Open a shell and type the following

# curl -v "http://localhost:8080/solr/store/update/bulk?commit=true" \
  -H "Content-Type: application/n-triples" \
  --data-binary @/tmp/solrdf-download/solrdf/src/test/resources/sample-data/bsbm-generated-dataset.nt

Ok, you just added about 5,000 triples.

SPARQL 1.1 endpoint

SolRDF is a fully compliant SPARQL 1.1 endpoint. To issue a query, just run something like this:

# curl "http://127.0.0.1:8080/solr/store/sparql" \
  --data-urlencode "q=SELECT * WHERE { ?s ?p ?o } LIMIT 10" \
  -H "Accept: application/sparql-results+json"
  
Or  
  
# curl "http://127.0.0.1:8080/solr/store/sparql" \
  --data-urlencode "q=SELECT * WHERE { ?s ?p ?o } LIMIT 10" \
  -H "Accept: application/sparql-results+xml"

Hybrid mode

If the request contains a valid SPARQL query and at least one of the parameters listed below, SolRDF switches into a so-called "hybrid" mode. This enables a set of interesting features, such as results pagination (without using the LIMIT keyword) and faceting (on the overall results of the SPARQL query).

| Parameter | Description | Reference |
|-----------|-------------|-----------|
| rows | The maximum number of results returned in the response. Defaults to 10 in case of a negative or invalid value. | Solr Wiki (rows) |
| start | The start offset within the complete result set. Defaults to 0 in case of a negative or invalid value. | Solr Wiki (start) |
| facet | A boolean value that enables or disables (the default) faceting. | Solr Wiki (facet) |
| facet.field | The name of the field which should be treated as a facet. The parameter can be repeated in case of multiple fields. | Solr Wiki (facet.field) |

For more information about Solr query and facet parameters, see here [1] and here [2]. Remember that only the parameters listed in the table above are supported; hopefully I will gradually add support for the others.
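As a sketch, a hybrid-mode request can be built browser-style by appending these Solr parameters to the SPARQL query string. The endpoint is the one used in the earlier examples; faceting on the field p (the predicate field that appears in the sample facet output) is an assumption based on that response:

```shell
# Build a hybrid-mode request URL: a SPARQL query plus Solr parameters.
# rows/start paginate the complete result set; facet=true together with
# facet.field=p requests facet counts on the "p" (predicate) field.
ENDPOINT="http://localhost:8080/solr/store/sparql"
# URL-encoded form of: SELECT * WHERE { ?s ?p ?o }
QUERY="SELECT%20*%20WHERE%20%7B%20%3Fs%20%3Fp%20%3Fo%20%7D"
echo "${ENDPOINT}?q=${QUERY}&rows=2&start=100&facet=true&facet.field=p"
```

Open the echoed URL in a browser, or pass it to curl, exactly as with the bulk-update examples above.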

When SolRDF runs in Hybrid mode, it will produce a response like this:

<?xml version="1.0" encoding="UTF-8"?>
<response>
	<lst name="responseHeader">
		<int name="status">0</int>
		<int name="QTime">25</int>
		<int name="rows">2</int>
		<int name="start">100</int>
		<str name="query">SELECT *
		   WHERE
		   { ?s <http://www.w3.org/1999/02/22-rdf-syntax-ns#type> ?o}
		</str>
	</lst>
	<result name="response" numFound="18176" start="100" maxScore="1.0">
		<sparql xmlns="http://www.w3.org/2005/sparql-results#">
			<head>
				<variable name="s" />
				<variable name="o" />
			</head>
			<results>
				<result>
					<binding name="s">
						<bnode>b0</bnode>
					</binding>
					<binding name="o">
						<uri>http://purl.org/dc/terms/W3CDTF</uri>
					</binding>
				</result>
				<result>
					<binding name="s">
						<uri>http://www.gutenberg.org/feeds/catalog.rdf#etext20867</uri>
					</binding>
					<binding name="o">
						<uri>http://www.gutenberg.org/rdfterms/etext</uri>
					</binding>
				</result>
			</results>
		</sparql>
	</result>
	<lst name="facet_counts">
		<lst name="facet_queries" />
		<lst name="facet_fields">
		    <lst name="p">
			<int name="<http://www.w3.org/1999/02/22-rdf-syntax-ns#type>">18176</int>
		    </lst>
		</lst>
		<lst name="facet_dates" />
		<lst name="facet_ranges" />
	</lst>
</response>

Pseudo-Graph Store HTTP Protocol support

SolRDF supports a subset of the SPARQL 1.1 Graph Store HTTP Protocol specification [3]. The protocol describes a set of HTTP operations for managing a collection of RDF graphs:

| HTTP Method | Description | Supported by SolRDF |
|-------------|-------------|---------------------|
| GET | Retrieves the content of a graph (named or default) | Yes |
| POST | Adds data to a given graph (named or default) | Yes |
| PUT | Replaces the data of a given graph (named or default) | No |
| DELETE | Deletes a given graph (named or default) | No |

PUT and DELETE requests are not supported because in Solr those HTTP methods are reserved for REST operations related to schema and configuration. As a consequence, a custom handler (in this case Sparql11GraphStoreProtocolHandler) never gets a chance to be notified of PUT or DELETE requests.

The target graph of an HTTP request is indicated by means of the "graph" or "default" parameters. Some examples:

| Request | Named Graph | Default Graph |
|---------|-------------|---------------|
| /rdf-graph-store?default | No | Yes |
| /rdf-graph-store | No | Yes |
| /rdf-graph-store?graph=http://a.b.c | Yes | No |

As you can see, when neither parameter is present, the request is assumed to refer to the default graph.

GET

A request that uses the HTTP GET method will retrieve an RDF payload that is a serialization of a given graph. Some examples:

> curl "http://localhost:8080/solr/store/rdf-graph-store"
> curl "http://localhost:8080/solr/store/rdf-graph-store?default"
> curl "http://localhost:8080/solr/store/rdf-graph-store?graph=http://a.b.c"

POST

A request that uses the HTTP POST method will add an RDF payload to a given graph. Some examples:

> curl -X POST "http://localhost:8080/solr/store/rdf-graph-store" \
       -H "Content-Type: application/n-triples" \
       --data-binary @/path/to/your/datafile.nt

> curl -X POST "http://localhost:8080/solr/store/rdf-graph-store?default" \
       -H "Content-Type: application/n-triples" \
       --data-binary @/path/to/your/datafile.nt

> curl -X POST "http://localhost:8080/solr/store/rdf-graph-store?graph=http://a.b.c" \
       -H "Content-Type: application/n-triples" \
       --data-binary @/path/to/your/datafile.nt

[1] http://wiki.apache.org/solr/CommonQueryParameters
[2] https://wiki.apache.org/solr/SimpleFacetParameters
[3] http://www.w3.org/TR/sparql11-http-rdf-update