Stream: hapi
Topic: Use Hapi’s RDF feature
Simon de turck (Dec 17 2018 at 20:39):
Hi, I'm not entirely sure whether this is the correct stream for my question (pointers to a better one would be appreciated :-) ), but since I'm dealing with a HAPI FHIR service this is my best guess.
My question is about getting FHIR data as RDF in order to load it into a graph database like Neo4j or, preferably, Dgraph. The FHIR spec has some minimal information and refers to the HAPI reference implementation, but I was unable to find any information or examples on the HAPI site or in the HAPI documentation.
My understanding is that it is possible to export FHIR data as RDF, and that some of the standard FHIR ontologies are made available. How to actually get that data is not documented anywhere, as far as I could see.
Does anyone here know where I can find more information on how to do this with HAPI specifically, or does anyone know of a generic way to get structured FHIR data into a graph database? Or better yet, is there any FHIR implementation that uses a graph database as its data store?
Grahame Grieve (Dec 17 2018 at 21:31):
Here's the situation as I understand it:
- the definitions of the spec itself are available in Turtle format from the downloads page
- HAPI doesn't support RDF natively, but it does include the class org.hl7.fhir.r4.formats.RdfParser, which can convert resources to Turtle (see the sketch after this list)
- I don't know of any open source implementations that use a graph database in the RDF sense, and the only commercial one I know of is (or has been) migrating to Hadoop using JSON
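For reference, a minimal sketch of what the RdfParser call looks like, assuming the org.hl7.fhir.r4 structures are on the classpath (they ship alongside HAPI's R4 support); the Patient and its contents are just illustrative:
```java
import java.io.ByteArrayOutputStream;

import org.hl7.fhir.r4.formats.RdfParser;
import org.hl7.fhir.r4.model.Patient;

public class FhirRdfExport {
    public static void main(String[] args) throws Exception {
        // Build a resource with the org.hl7.fhir.r4.model classes
        // (the same structures HAPI's R4 module is built on).
        // The data here is made up purely for illustration.
        Patient patient = new Patient();
        patient.setId("example");
        patient.addName().setFamily("Chalmers").addGiven("Peter");

        // RdfParser is used for composing (serialising) resources to Turtle.
        RdfParser rdf = new RdfParser();
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        rdf.compose(out, patient);

        // Plain Turtle, ready to be handed to RDF tooling.
        System.out.println(out.toString("UTF-8"));
    }
}
```
The output is standard Turtle, so from there it is a matter of loading it with whatever RDF tooling your target store understands.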