Stream: implementers
Topic: HAPI FHIR Batch Upload
Aaron Jahns (Jul 24 2018 at 16:39):
The organization I work for is interested in FHIR, so I recently set up a HAPI FHIR server. My question is: is there any way to upload data as a batch as opposed to one resource at a time? Also, is there any way to put query results into a table directly on the HAPI FHIR server?
Grahame Grieve (Jul 24 2018 at 19:59):
have you looked at http://hl7.org/fhir/http/html#batch?
Aaron Jahns (Jul 24 2018 at 20:08):
The link you sent is broken. We have been able to convert a large amount of data into resources with no issues. We are trying to find a way to search our millions of resources for particular variables and apply models to the data we have converted into resources. We were hoping a HAPI FHIR server might have such capabilities, but it seems only one resource at a time can be uploaded to the HAPI FHIR server (not efficient for millions of resources), and we haven't found anything that would take the resulting data and turn it into a table.
Grahame Grieve (Jul 24 2018 at 20:09):
sorry: http://hl7.org/fhir/http.html#batch
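For reference, a batch is a single Bundle of type "batch" POSTed to the server base URL, so many resources go up in one HTTP request. A minimal Python sketch (the base URL and the Patient payloads are placeholders, not from this thread):

```python
import json
import requests

# Base URL of the HAPI FHIR server -- a placeholder, adjust to your deployment.
FHIR_BASE = "http://localhost:8080/fhir"

def make_batch(resources):
    """Wrap a list of FHIR resources (as dicts) in a Bundle of type 'batch'."""
    return {
        "resourceType": "Bundle",
        "type": "batch",
        "entry": [
            {
                "resource": res,
                # Each entry carries its own request; POST <type> creates the resource.
                "request": {"method": "POST", "url": res["resourceType"]},
            }
            for res in resources
        ],
    }

# Illustrative payload: 100 trivial Patient resources.
patients = [{"resourceType": "Patient", "name": [{"family": f"Test{i}"}]} for i in range(100)]

# The Bundle is POSTed to the server base, not to a resource-type endpoint.
resp = requests.post(
    FHIR_BASE,
    data=json.dumps(make_batch(patients)),
    headers={"Content-Type": "application/fhir+json"},
)
resp.raise_for_status()
print(resp.json()["type"])  # expect "batch-response", one entry per input resource
```

Using "type": "transaction" instead makes the whole Bundle succeed or fail as a unit, which HAPI also supports.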
Grahame Grieve (Jul 24 2018 at 20:10):
but I think that HAPI isn't the fastest at uploading generally. I've seen BigQuery used for your task
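For the BigQuery route, the usual pattern is to write resources out as newline-delimited JSON and run a load job. A sketch using the google-cloud-bigquery client (project, dataset, table, and file names are assumptions; schema autodetection may need hand-tuning for deeply nested FHIR resources):

```python
from google.cloud import bigquery

# Project, dataset, table, and file names are placeholders.
client = bigquery.Client()
table_id = "my-project.fhir_dataset.observations"

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.NEWLINE_DELIMITED_JSON,
    autodetect=True,  # let BigQuery infer a schema; nested FHIR may need an explicit one
)

# observations.ndjson holds one FHIR resource serialized as JSON per line.
with open("observations.ndjson", "rb") as f:
    load_job = client.load_table_from_file(f, table_id, job_config=job_config)

load_job.result()  # block until the load job finishes
print(client.get_table(table_id).num_rows, "rows loaded")
```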
Aaron Jahns (Jul 24 2018 at 20:15):
Does that have the capability to take XML files and parse them?
Grahame Grieve (Jul 24 2018 at 20:21):
JSON for BigQuery
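If the source files are XML, one low-effort option is to let the FHIR server do the format conversion: POST each XML resource and ask for the JSON representation back, then write those out as NDJSON for BigQuery. A sketch under those assumptions (base URL is a placeholder, and this does create the resources on the server as a side effect):

```python
import glob
import json
import requests
import xml.etree.ElementTree as ET

# Placeholder base URL for the HAPI server.
FHIR_BASE = "http://localhost:8080/fhir"

with open("resources.ndjson", "w") as out:
    for path in glob.glob("xml/*.xml"):
        with open(path, "rb") as fh:
            xml_bytes = fh.read()
        # The resource type is the local name of the XML root element,
        # e.g. <Patient xmlns="http://hl7.org/fhir"> -> "Patient".
        rtype = ET.fromstring(xml_bytes).tag.split("}")[-1]
        resp = requests.post(
            f"{FHIR_BASE}/{rtype}",
            data=xml_bytes,
            headers={
                "Content-Type": "application/fhir+xml",
                "Accept": "application/fhir+json",
                # Ask the server to echo the stored resource back in the response body.
                "Prefer": "return=representation",
            },
        )
        resp.raise_for_status()
        out.write(json.dumps(resp.json()) + "\n")
```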
nicola (RIO/SS) (Jul 24 2018 at 22:55):
PostgreSQL and BigQuery can solve your problem - you can create SQL batch inserts for PostgreSQL and/or load JSON into BigQuery using the Google API - then just use SQL and indexing to search your data.
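A minimal sketch of the PostgreSQL side: one jsonb column per resource, a GIN index for containment queries, and a batched insert (connection string, table layout, file name, and the example LOINC query are assumptions):

```python
import json
import psycopg2
from psycopg2.extras import execute_values, Json

# Connection string, table name, and file name are placeholders.
conn = psycopg2.connect("dbname=fhir user=fhir")

with conn, conn.cursor() as cur:
    # One jsonb column per resource keeps the load simple; FHIR ids are only
    # unique per resource type, hence the composite primary key.
    cur.execute("""
        CREATE TABLE IF NOT EXISTS resources (
            id    text NOT NULL,
            rtype text NOT NULL,
            body  jsonb NOT NULL,
            PRIMARY KEY (rtype, id)
        )
    """)
    # GIN index supports @> containment searches over arbitrary fields.
    cur.execute("""
        CREATE INDEX IF NOT EXISTS resources_body_gin
        ON resources USING gin (body jsonb_path_ops)
    """)

    # resources.ndjson: one FHIR resource (JSON) per line, each carrying an id.
    with open("resources.ndjson") as f:
        rows = [(r["id"], r["resourceType"], Json(r)) for r in map(json.loads, f)]

    # One batched INSERT instead of millions of single-row round trips.
    execute_values(
        cur,
        "INSERT INTO resources (id, rtype, body) VALUES %s "
        "ON CONFLICT (rtype, id) DO NOTHING",
        rows,
    )

    # Example search: count Observations with a given LOINC code.
    cur.execute(
        "SELECT count(*) FROM resources WHERE rtype = 'Observation' AND body @> %s",
        (Json({"code": {"coding": [{"code": "718-7"}]}}),),
    )
    print(cur.fetchone())
```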
nicola (RIO/SS) (Jul 24 2018 at 22:55):
You can find some hints here - https://github.com/fhir-fuel/fhir-storage-and-analytics-track
Last updated: Apr 12 2022 at 19:14 UTC