@ldf/server v3.3.0
Linked Data Fragments Server - Quad Pattern Fragments
A Linked Data Fragments server with Quad Pattern Fragments (a.k.a. Triple Pattern Fragments) support.
This package has been renamed from ldf-server to @ldf/server.
Find more information about migrating from ldf-server 2.x.x on our wiki.
This package is a Linked Data Fragments Server module.
Motivation
On today's Web, Linked Data is published in different ways, which include data dumps, subject pages, and results of SPARQL queries. We call each such part a Linked Data Fragment.
The issue with current Linked Data Fragments is that they are either so powerful that their servers suffer from low availability rates (as is the case with SPARQL), or they don't allow efficient querying.
Instead, this server offers Quad Pattern Fragments (a.k.a. Triple Pattern Fragments). Each Quad Pattern Fragment offers:
- data that corresponds to a quad/triple pattern (example).
- metadata that consists of the (approximate) total triple count (example).
- controls that lead to all other fragments of the same dataset (example).
An example server is available at data.linkeddatafragments.org.
Install the server
This server requires Node.js 10.0 or higher and is tested on macOS and Linux. To install, execute:

$ [sudo] npm install -g @ldf/server

Use the server
Configure the data sources
First, create a configuration file config.json similar to config/config-example.json,
in which you detail your data sources.
For example, this configuration uses an HDT file
and a SPARQL endpoint as sources:
{
"@context": "https://linkedsoftwaredependencies.org/bundles/npm/@ldf/server/^3.0.0/components/context.jsonld",
"@id": "urn:ldf-server:my",
"import": "preset-qpf:config-defaults.json",
"title": "My Linked Data Fragments server",
"datasources": [
{
"@id": "urn:ldf-server:myHdtDatasource",
"@type": "HdtDatasource",
"datasourceTitle": "DBpedia 2014",
"description": "DBpedia 2014 with an HDT back-end",
"datasourcePath": "dbpedia",
"hdtFile": "data/dbpedia2014.hdt"
},
{
"@id": "urn:ldf-server:mySparqlDatasource",
"@type": "SparqlDatasource",
"datasourceTitle": "DBpedia (Virtuoso)",
"description": "DBpedia with a Virtuoso back-end",
"datasourcePath": "dbpedia-sparql",
"sparqlEndpoint": "https://dbpedia.org/sparql"
}
]
}

The following sources are supported out of the box:
- HDT files (HdtDatasource with hdtFile setting)
- N-Triples documents (NTriplesDatasource with file setting)
- Turtle documents (TurtleDatasource with file setting)
- N-Quads documents (NQuadsDatasource with file setting)
- TriG documents (TrigDatasource with file setting)
- JSON-LD documents (JsonLdDatasource with file setting)
- SPARQL endpoints (SparqlDatasource with sparqlEndpoint setting)
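For example, a file-based source can be configured analogously to the datasources above. Here is a minimal sketch of a Turtle datasource entry for the datasources array, in which the path, title, and file name are placeholders:

{
  "@id": "urn:ldf-server:myTurtleDatasource",
  "@type": "TurtleDatasource",
  "datasourceTitle": "My Turtle dataset",
  "description": "A Turtle document loaded from disk",
  "datasourcePath": "my-turtle",
  "file": "data/mydata.ttl"
}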
Support for new sources is possible by creating a new module implementing the Datasource interface.
Start the server
After creating a configuration file, execute
$ ldf-server config.json 5000 4

Here, 5000 is the HTTP port on which the server will listen,
and 4 the number of worker processes.
Now visit http://localhost:5000/ in your browser.
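You can also request fragments directly over HTTP. For instance, assuming the example configuration above (whose HDT datasource is exposed at the dbpedia path), the following request selects all triples with a given predicate; subject, predicate, and object are the standard quad pattern query parameters:

$ curl -H "Accept: text/turtle" "http://localhost:5000/dbpedia?predicate=http%3A%2F%2Fxmlns.com%2Ffoaf%2F0.1%2Fname"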
Reload running server
You can reload the server without any downtime in order to load a new configuration or version. To do this, you need the process ID of the server's master process. One way to obtain it is from the server logs:
$ bin/ldf-server config.json
Master 28106 running.
Worker 28107 running on http://localhost:3000/.

If you send the server a SIGHUP signal:

$ kill -s SIGHUP 28106

it will reload by replacing its workers.
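If you did not note the master PID from the logs, you can combine both steps in one command; a sketch, assuming the master is the oldest running process matching ldf-server:

$ kill -s SIGHUP $(pgrep -o -f ldf-server)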
Note that crashed or killed workers are always replaced automatically.
(Optional) Set up a reverse proxy
A typical Linked Data Fragments server will be exposed
on a public domain or subdomain along with other applications.
Therefore, you need to configure the server to run behind an HTTP reverse proxy.
To set this up, configure the server's public URL in your server's config.json:
{
"title": "My Linked Data Fragments server",
"baseURL": "http://data.example.org/",
"datasources": { … }
}

Then configure your reverse proxy to pass requests to your server. Here's an example for nginx:
server {
server_name data.example.org;
location / {
proxy_pass http://127.0.0.1:3000$request_uri;
proxy_set_header Host $http_host;
proxy_pass_header Server;
}
}

Change the value 3000 to the port on which your Linked Data Fragments server runs.
If you would like to proxy the data in a subfolder such as http://example.org/my/data,
modify the baseURL in your config.json to "http://example.org/my/data"
and change location from / to /my/data (excluding a trailing slash).
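For instance, the nginx example above would then look like this (assuming the server still runs on port 3000):

server {
    server_name example.org;
    location /my/data {
        proxy_pass http://127.0.0.1:3000$request_uri;
        proxy_set_header Host $http_host;
        proxy_pass_header Server;
    }
}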
(Optional) Running under HTTPS
HTTPS can be enabled in two ways: natively by the server, or through a proxy (explained above).
With native HTTPS, the server will establish the SSL layer. Set the following values in your config file to enable this:
{
"protocol": "https",
"ssl": {
"keys" : {
"key": "./private-key-server.key.pem",
"ca": ["./root-ca.crt.pem"],
"cert": "./server-certificate.crt.pem"
}
}
}

If protocol is not specified, the server derives it from the baseURL. Hence, HTTPS can also be enabled as such:
{
"baseURL": "https://data.example.org/",
"ssl": {
"keys" : {
"key": "./private-key-server.key.pem",
"ca": ["./root-ca.crt.pem"],
"cert": "./server-certificate.crt.pem"
}
}
}

If you decide to let a proxy handle HTTPS, use this configuration to run the server as http, but construct links as https (so clients don't break):
{
"protocol": "http",
"baseURL": "https://data.example.org/",
}
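In that setup, TLS is terminated by the reverse proxy. Here is a minimal nginx sketch for HTTPS termination, in which the certificate and key paths are placeholders:

server {
    listen 443 ssl;
    server_name data.example.org;
    ssl_certificate /etc/ssl/certs/server-certificate.crt.pem;
    ssl_certificate_key /etc/ssl/private/private-key-server.key.pem;
    location / {
        proxy_pass http://127.0.0.1:3000$request_uri;
        proxy_set_header Host $http_host;
        proxy_pass_header Server;
    }
}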
(Optional) Running in a Docker container
If you want to rapidly deploy the server as a microservice, you can build a Docker container as follows:
$ docker build -t ldf-server .

After that, you can run your newly created container:
$ docker run -p 3000:3000 -t -i --rm -v $(pwd)/config.json:/tmp/config.json ldf-server /tmp/config.json

(Optional) Host historical versions of datasets
You can enable the Memento protocol to offer different versions of an evolving dataset.
Relation to other modules
This package should be used if you want to use an LDF server with the QPF feature.
If you want to extend this server with additional modules,
you can make use of @ldf/preset-qpf instead.
Concretely, it configures the following packages:
- @ldf/core: Shared functionality for LDF servers.
- @ldf/feature-qpf: Feature that enables Quad Pattern Fragments (a.k.a. Triple Pattern Fragments).
- @ldf/feature-summary: Feature that adds summaries to datasources.
- @ldf/feature-memento: Feature that enables datetime negotiation using the Memento protocol.
- @ldf/datasource-hdt: Datasource that allows HDT files to be loaded.
- @ldf/datasource-jsonld: Datasource that allows JSON-LD files to be loaded.
- @ldf/datasource-n3: Datasource that allows N-Quads, N-Triples, TriG, and Turtle files to be loaded.
- @ldf/datasource-sparql: Datasource that allows SPARQL endpoints to be used as a data proxy.
- @ldf/datasource-composite: Datasource that delegates queries to a sequence of other datasources.
License
The Linked Data Fragments server is written by Ruben Verborgh, Miel Vander Sande, Ruben Taelman and colleagues.
This code is copyrighted by Ghent University – imec and released under the MIT license.