Who says using RDF is hard?

A brief overview on RDF and Linked Data development in JavaScript.

6 October 2019

The Linked Data ecosystem and RDF as its graph data model have been around for many years. Even though there is increasing interest in knowledge graphs, many developers are scared off by RDF due to its reputation of being complicated. In this post, I will show some concrete examples of how RDF can be used in JavaScript applications, to illustrate that RDF is actually pretty easy to work with if you use the right tools.

To paraphrase Dan Brickley and Libby Miller, people think using RDF is hard because of its complexity. However, RDF itself is actually pretty simple; it merely allows you to handle very complex problems. To handle these problems, flexible and expressive tools are needed. These tools should be easy to use, so that we don’t get lost in the complexity of the problems themselves during development. While there have been calls to make RDF easier, this post aims to show that RDF is not the problem, and that good tooling gets us where we want to be.

Concretely, this post covers the fundamentals of RDF in JavaScript, how to create RDF graphs yourself, how to retrieve them from the Web, and how to execute queries over them. I offer some concrete examples for each aspect using existing tools. It is not my aim to discuss all the available tooling, but instead to present the general idea, and to give pointers to tools that could be used for this.
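As a taste of what those fundamentals look like, here is a minimal sketch of the RDF/JS idea: quads built from plain term objects, plus a toy in-memory store with triple-pattern matching. The factory functions and `TinyStore` below are simplified stand-ins of my own, not the API of any particular library; real implementations such as N3.js provide this kind of interface properly.

```javascript
// Toy RDF/JS-style terms: plain objects with a termType and a value.
// (Hypothetical helpers for illustration, not a real library's API.)
const namedNode = (value) => ({ termType: 'NamedNode', value });
const literal = (value) => ({ termType: 'Literal', value });
const quad = (subject, predicate, object) => ({ subject, predicate, object });

// A minimal in-memory store with triple-pattern matching;
// pass null for a term to use it as a wildcard.
class TinyStore {
  constructor() { this.quads = []; }
  add(q) { this.quads.push(q); }
  match(s, p, o) {
    return this.quads.filter(q =>
      (s === null || q.subject.value === s.value) &&
      (p === null || q.predicate.value === p.value) &&
      (o === null || q.object.value === o.value));
  }
}

const store = new TinyStore();
const ex = (name) => namedNode('http://example.org/' + name);
const foafName = namedNode('http://xmlns.com/foaf/0.1/name');
const foafKnows = namedNode('http://xmlns.com/foaf/0.1/knows');

store.add(quad(ex('alice'), foafName, literal('Alice')));
store.add(quad(ex('alice'), foafKnows, ex('bob')));
store.add(quad(ex('bob'), foafName, literal('Bob')));

// Graph traversal: print the names of everyone Alice knows.
for (const { object } of store.match(ex('alice'), foafKnows, null)) {
  for (const q of store.match(object, foafName, null)) {
    console.log(q.object.value); // prints "Bob"
  }
}
```

The point of the sketch is that a triple is just three terms, and querying is just pattern matching over them; everything a full-blown library adds (parsers, serializers, indexes, SPARQL engines) builds on that same simple core.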


Nice writeup. I used to work with RDF a looong time ago, before most of the fancy tools had been built.

In my experience, the difficulties in working with RDF boil down to:

  1. agreeing on a vocabulary/schema for your (possibly shared) application. For public-facing vocabs, this can be an endless process.
  2. maintaining and updating your vocab over time as you add/edit features.
  3. enforcing schemas on the graph so that properties don’t get added willy-nilly in unpredictable ways that make your code blow up. It’s like working with a NoSQL (schemaless) database.
  4. making graph loading and traversal performant enough when dealing with large data sets.

Hopefully the tools available now address these difficulties…