Stream: implementers
Topic: RDF Support in hapi-fhir
Josh Collins (Feb 21 2020 at 16:17):
Hi, I'm new to FHIR and digging into using it with RDF. I see some pull requests to hapi-fhir (https://github.com/jamesagnew/hapi-fhir/pull/1321) that appear to add RDF support, but I'm not able to get RDF responses (by adding Accept: text/turtle, application/fhir+turtle, or application/x-turtle, or by using _format=turtle or _format=text/turtle).
I'm testing with the hapi-fhir-jpaserver-starter, which is on the 4.2 release.
Any guidance is appreciated.
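For reference, the requests I'm making look roughly like this (the base URL is just a placeholder for my local jpaserver-starter instance, and Patient/example stands in for any resource):
```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class TurtleRequests {
    public static void main(String[] args) throws Exception {
        // Placeholder base URL for a local hapi-fhir-jpaserver-starter instance
        String base = "http://localhost:8080/fhir";
        HttpClient client = HttpClient.newHttpClient();

        // Attempt 1: content negotiation via the Accept header
        HttpRequest byHeader = HttpRequest.newBuilder()
                .uri(URI.create(base + "/Patient/example"))
                .header("Accept", "text/turtle")
                .build();

        // Attempt 2: the _format query parameter
        HttpRequest byParam = HttpRequest.newBuilder()
                .uri(URI.create(base + "/Patient/example?_format=turtle"))
                .build();

        for (HttpRequest req : new HttpRequest[] { byHeader, byParam }) {
            HttpResponse<String> resp = client.send(req, HttpResponse.BodyHandlers.ofString());
            // In both cases the server currently answers with JSON, not Turtle
            System.out.println(resp.headers().firstValue("Content-Type").orElse("(no Content-Type)"));
        }
    }
}
```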
James Agnew (Feb 21 2020 at 16:31):
Unfortunately that PR was not successful and was rolled back (I didn't realize at the time that all of its tests were @Ignored).
I know that @Harold Solbrig and @David Booth were working on reimplementing it - they may have updates.
Josh Collins (Feb 21 2020 at 16:43):
Ah, that would certainly explain what I'm seeing! Thank you for the response. I'm interested in any updates from those working on it, and possibly in contributing if there is room in that effort.
Miguel Rochefort (Mar 02 2020 at 01:38):
I'm extremely interested in RDF support as well. It seems to be the only thing stopping FHIR from finally making data integration possible. At the very least, I have to be able to use SPARQL to query and manipulate data. I'm willing to contribute, whether by initially adding support for Turtle or JSON-LD.
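To make that concrete, here is the sort of thing I want to be able to do once servers can emit Turtle - a rough sketch using Apache Jena, where the file name and query are purely illustrative:
```java
import org.apache.jena.query.*;
import org.apache.jena.rdf.model.Model;
import org.apache.jena.rdf.model.ModelFactory;
import org.apache.jena.riot.Lang;
import org.apache.jena.riot.RDFDataMgr;

public class FhirSparqlSketch {
    public static void main(String[] args) {
        // Load a FHIR resource serialized as Turtle (file name is a placeholder)
        Model model = ModelFactory.createDefaultModel();
        RDFDataMgr.read(model, "patient-example.ttl", Lang.TURTLE);

        // Find every resource typed as fhir:Patient
        String query =
                "PREFIX fhir: <http://hl7.org/fhir/> " +
                "SELECT ?patient WHERE { ?patient a fhir:Patient }";

        try (QueryExecution qe = QueryExecutionFactory.create(query, model)) {
            ResultSet results = qe.execSelect();
            while (results.hasNext()) {
                System.out.println(results.next().get("patient"));
            }
        }
    }
}
```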
Grahame Grieve (Mar 02 2020 at 02:08):
there is an RDF format - turtle. All the examples are provided as turtle, and the definitions are made available in turtle format too. In addition, ShEx is published. Also, both test.fhir.org and the HAPI server (including hapi.fhir.org) can read and write the turtle format.
Josh Collins (Mar 02 2020 at 14:11):
Thanks @Grahame Grieve, I'm able to read turtle from test.fhir.org, but I'm not seeing it supported on hapi.fhir.org. When passing Accept: text/turtle I get JSON instead of RDF.
Josh Collins (Mar 02 2020 at 14:23):
There appears to be a nascent attempt at an RDFParser in the master branch of hapi-fhir-base (https://github.com/jamesagnew/hapi-fhir/blob/master/hapi-fhir-base/src/main/java/ca/uhn/fhir/parser/RDFParser.java), but it isn't fully wired in, and the parser itself breaks Jena when trying to render the triples.
@James Agnew @Grahame Grieve would there be a path to using the version-specific RdfParsers that exist in the HL7 FHIR Core project (https://github.com/hapifhir/org.hl7.fhir.core/blob/master/org.hl7.fhir.r4/src/main/java/org/hl7/fhir/r4/formats/RdfParser.java)?
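For context, what I'd hope for eventually is RDF plugging into HAPI the same way the existing parsers do - something like the sketch below (assuming the R4 structures are on the classpath), where newRDFParser() is hypothetical and does not exist in 4.2:
```java
import ca.uhn.fhir.context.FhirContext;
import ca.uhn.fhir.parser.IParser;
import org.hl7.fhir.r4.model.Patient;

public class HypotheticalRdfParserUsage {
    public static void main(String[] args) {
        FhirContext ctx = FhirContext.forR4();

        // Existing, working parsers
        IParser jsonParser = ctx.newJsonParser();
        IParser xmlParser = ctx.newXmlParser();

        // Hypothetical: an RDF/Turtle parser wired in the same way.
        // FhirContext.newRDFParser() does NOT exist in HAPI 4.2.
        // IParser rdfParser = ctx.newRDFParser();
        // String turtle = rdfParser.setPrettyPrint(true).encodeResourceToString(patient);

        Patient patient = new Patient();
        patient.addName().setFamily("Example");
        System.out.println(jsonParser.setPrettyPrint(true).encodeResourceToString(patient));
        System.out.println(xmlParser.encodeResourceToString(patient));
    }
}
```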
Grahame Grieve (Mar 02 2020 at 19:15):
Ah, yes - James took the old one out of HAPI, and there's a small project to replace it with an updated one, but that hasn't landed yet.
I don't think the core parsers will fit into HAPI at all well. Also, that RDF parser doesn't actually parse yet.
Grahame Grieve (Mar 02 2020 at 19:16):
In Sydney, @Harold Solbrig and I were discussing a new version of JSON-LD that may make the existing JSON usable directly as RDF - it's close. Harold, when does the RDF group meet?
Vassil Peytchev (Mar 02 2020 at 20:07):
I think I saw an announcement that the RDF group is switching to JSON-LD for RDF support.
Grahame Grieve (Mar 02 2020 at 20:16):
I saw some interest in that, but it's not something we can just do. There's water to go under the bridge yet.
Josh Collins (Mar 02 2020 at 20:43):
I was on a call with Harold's group earlier today, but that effort does seem focused on JSON-LD 1.1 first. I'm looking to support turtle.
Josh Collins (Mar 03 2020 at 12:33):
@Grahame Grieve @James Agnew I'd like to invest some time in getting the current, non-working implementation to function, but I'm admittedly still getting up to speed on both the FHIR spec and the nuances of its RDF representations. Is it a fair assumption that I should be able to render the appropriate turtle with just the resource definitions, or will other external data be needed? Also, the JSON and XML parsers are currently generic across versions; should that be achievable for Turtle as well?
Grahame Grieve (Mar 03 2020 at 12:34):
sure. it should be
Grahame Grieve (Mar 03 2020 at 12:34):
someone else was working on this...
Eric Prud'hommeaux (Mar 03 2020 at 12:44):
@Grahame Grieve I suspect you are thinking of the person at Mayo whose code Josh is poring through. Does that seem likely?
Grahame Grieve (Mar 03 2020 at 12:46):
y
Eric Prud'hommeaux (Mar 03 2020 at 12:50):
Josh is looped in with the folks at Mayo, so they should be helping route him to existing work.
Josh Collins (Mar 03 2020 at 12:53):
The code I'm working through is from Raul Estrada - I'm not sure whether he's associated with Mayo. Regardless, the Mayo folks seem to have the most interest in this, so iterating there is probably my next step.
Harold Solbrig (Mar 04 2020 at 20:37):
Grahame Grieve said:
In Sydney, Harold Solbrig and I were discussing a new version of JSON-LD that may make the existing JSON into RDF - it's close. Harold, when does the RDF group meet?
It meets Thursday at 11:00 AM Eastern at http://tinyurl.com/fhirrdf (A link to a Google Hangout)
David Booth (Mar 05 2020 at 14:52):
Hi @Josh Collins and @Miguel Rochefort ! I wanted to jump in and introduce myself and provide some clarification if I can, since it looks like we haven't been doing quite enough to get the word out. The FHIR/RDF effort is a W3C/HL7 collaboration which I "chair". (I put "chair" in quotes because I'm an official chair from the W3C perspective, but not an official HL7 chair.) Anyway, I am very glad to hear of your interest in FHIR/RDF!
First off, we'd be delighted if you could join our Thursday teleconferences at 11am Boston time on google hangout: http://tinyurl.com/fhirrdf . That's the primary place where we coordinate the FHIR/RDF efforts. We'd also love to hear what you are doing or plan to do with FHIR/RDF, if you're able to talk about it. But whether or not you're able to join the calls or talk, we're still very interested in your input!
The Mayo project that @Eric Prud'hommeaux mentioned also has separate weekly calls. There is a lot of overlap between those calls and the official FHIR/RDF calls because the Mayo project is using FHIR/RDF and contributing a lot of work on it.
Regarding HAPI, there was work on getting an RDF serialization into the HAPI server, but as Grahame mentioned, it was incomplete and pulled out (at least until it can be fully completed, with test cases and all).
Although there is ongoing interest in completing the RDF work on the HAPI server, in recent weeks our Thursday teleconferences have been dominated by two other efforts:
- We are developing a new way to convert FHIR/JSON to FHIR/RDF, via JSON-LD 1.1, which promises to be easier to maintain than the current mechanism. (More on this below.)
- Based on experience with existing FHIR/RDF, we are also developing a new revision of the standard FHIR/RDF representation (which we're calling R5), with the main goal of being easier and more natural for RDF users to process. To this end, we are very interested in hearing from any others who have been using the existing FHIR/RDF representation (which we're calling R4), to best understand common usage and query patterns, so that we can make the right design choices. We want to tweak the existing FHIR/RDF spec to make common cases easier while keeping rare cases still possible.
Regarding JSON-LD, our goal with JSON-LD 1.1 is to simplify the maintenance of both the FHIR/RDF specification and the conversion between FHIR/JSON and FHIR/RDF. It is not at all to replace Turtle. The 1.1 version of JSON-LD added functionality that was not present in the original 1.0 version, which is what we evaluated when we first considered and rejected JSON-LD a few years ago. To understand where JSON-LD 1.1 now fits in, consider what we had to do to standardize FHIR/RDF. We mainly did two things:
- Implemented appropriate code in the FHIR spec build process to generate both the human-oriented FHIR/RDF specification and the machine-processable artifacts, like the ontology and the ShEx.
- Implemented converters that translate between FHIR/JSON (or perhaps FHIR/XML) and FHIR/RDF. That's where the HAPI server fits in.
Both of those tasks require the ability to translate between FHIR/JSON (or the FHIR internal model) and FHIR/RDF, and when we did the initial version of FHIR/RDF we accomplished them with a certain amount of custom code. But with the advent of JSON-LD 1.1, we realized that we could reduce the custom code required by using a JSON-LD @context to perform much of the JSON-to-RDF mapping. The @context also allows the mapping to be more declarative in style.
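To give a flavour of what I mean by declarative (this fragment is purely illustrative and is not the actual FHIR @context), a context can map plain JSON keys onto FHIR IRIs so that an off-the-shelf JSON-LD 1.1 processor produces RDF triples without custom code:
```json
{
  "@context": {
    "@vocab": "http://hl7.org/fhir/",
    "resourceType": "@type",
    "id": "Resource.id"
  },
  "resourceType": "Patient",
  "id": "example"
}
```
A JSON-LD processor reading that would emit something equivalent to a node typed as fhir:Patient with a fhir:Resource.id of "example"; the real mapping has to handle many more cases, which is where the remaining custom code comes in.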
It is important to note that, even though we intend to adopt JSON-LD 1.1 for some of the internal processing, this does NOT mean that we are abandoning Turtle or ShEx as the languages that are used for visible consumption. In theory we could consider switching from Turtle to JSON-LD as the expository language for describing FHIR/RDF and giving examples, but we have no plans to do that, and I seriously doubt that it would be wise, because I fear it would create too much confusion with standard FHIR/JSON.
In short, even though JSON-LD 1.1 may become an important piece of the internal process, it does not need to be visible to FHIR/RDF users unless they specifically choose to serialize their RDF as JSON-LD, which they are always free to do, since RDF can be serialized in any of several standard formats, including Turtle, RDF/XML, JSON-LD, etc.
I hope this helps a little to clarify our efforts. Let me know what other questions come to mind, and please join our teleconference if you can fit it into your schedule. Thanks!
Josh Collins (Mar 05 2020 at 15:00):
Thanks for the details @David Booth. I won't be able to attend this week's meeting, but I'll put it on my calendar for next week.
Is it safe to say that if RDF (specifically turtle) support is needed in the short term (a ~3-month time horizon), the current JSON-LD 1.1 efforts are likely not going to be in place?
Assuming that is the case, my current plan is to implement RDF support on a fork and then work to get it upstream.
David Booth (Mar 05 2020 at 17:10):
@Josh Collins even when we finish getting the JSON-LD 1.1 approach done, there could still be significant value in adding RDF support to HAPI without going through JSON-LD 1.1, because, as @Eric Prud'hommeaux pointed out, users will want to know line numbers for errors, and I don't know how those would be carried through if the processing went via JSON-LD 1.1. But maybe you (or someone) can devise a way to do that. Another possible benefit of skipping JSON-LD 1.1 in HAPI is that it might run faster, though of course programmer effort is usually far more expensive than machine cycles, so speed may not matter much.
Miguel Rochefort (Mar 06 2020 at 23:28):
@David Booth Are JSON-LD contexts flexible enough to add all the RDF-equivalent semantics to existing JSON without modifying it? If so, couldn't we just tweak the header of the existing JSON API to include those contexts, or provide an Endpoint-to-Context mapping for clients to do it themselves? JSON-LD to Turtle should be trivial at that point, and then a SPARQL endpoint could be implemented as a proxy service the same way GraphQL maps to existing REST APIs. That is, assuming I'm not oversimplifying things.
Grahame Grieve (Mar 07 2020 at 02:33):
I think it's not quite there yet
David Booth (Mar 07 2020 at 07:11):
@Miguel Rochefort , Grahame is correct. There are still a few essential things that JSON-LD 1.1 cannot do, which must be done with custom code.
Grahame Grieve (Mar 07 2020 at 08:42):
One apparently didn't happen because we wouldn't want it to? I didn't follow that logic...
Grahame Grieve (Mar 07 2020 at 10:11):
Oh, it was to do with the list representation. I don't see how JSON-LD (actually, RDF generally) is usable until a usable common approach for 0..* relationships has been figured out.
Eric Prud'hommeaux (Sep 05 2020 at 09:21):
btw, native RDF support for HAPI is now implemented; work on release engineering and a PR for HAPI is underway.