Article
· Feb 18 · 5 min read

OMOP Odyssey - Celebration (House of Hades)

Hades DatabaseConnector 6.4.0 now does IRIS stuff!

A well-deserved celebration 🎉 that OHDSI officially announced IRIS as a supported platform and dialect with DatabaseConnector 6.4.0 in the OHDSI Hades Project! Let's showcase this accomplishment and create the OMOP CDM 5.4 on InterSystems IRIS. Persona time!


Ms. TLS - I am running InterSystems Cloud Document (Health Connect Cloud or equivalent)

I have a running IRIS Cloud Document workload and managed to create the OMOPCDM54 database with minimal steps across my system, the Cloud Portal, and RStudio.


System

However you normally import certs on your system, it needs to be done here to connect to the InterSystems Cloud Document deployment. I use the following script loosely across solutions: I paste in the cert I downloaded from the portal, and it imports the cert for me.
 

 
import_iris_certificate.sh

 

The Cloud Portal

This can be done other ways over the listener as well, but to highlight that it can be done in the Cloud Portal, we can use the SQL Query Tools.


RStudio

The next step is RStudio, where the OHDSI packages will do most of the work after a little bit of setup.
 

install.packages("devtools")
devtools::install_github("OHDSI/CommonDataModel")
install.packages("DatabaseConnector")
install.packages("SqlRender")
Sys.setenv("DATABASECONNECTOR_JAR_FOLDER" = "/home/sween/Desktop/OMOP/iris-omop-extdb/jars")
library(DatabaseConnector)
downloadJdbcDrivers("iris")



Back to RStudio

We can check whether the iris dialect (now included with the OHDSI tooling) is actually in there.
 

CommonDataModel::listSupportedDialects()

Crowd goes wild, indeed IRIS is in there.



I was ready to go here, but for starters I wanted to just dump the DDLs it creates to take a peek at them and check them in somewhere. This dumps the DDLs in "dry run" fashion.

CommonDataModel::buildRelease(cdmVersions = "5.4",
                              targetDialects = "iris",
                              outputfolder = "./ddl/iris/")

Inspecting our output folder:
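If you would rather not leave the R console, a quick way to list what was generated, assuming the output folder used above:

# List the generated DDL scripts
list.files("./ddl/iris/", recursive = TRUE)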



Now let's target our InterSystems Cloud Document instance and actually create the OMOP database, in two steps: one to connect, and one to execute the DDLs.

Connect

cd <- DatabaseConnector::createConnectionDetails(dbms = "iris",
                                                 connectionString = "jdbc:IRIS://k8s-0a6bc2ca-a096b3cb-578f6ec1b7-8b0e928263a4990a.elb.us-east-2.amazonaws.com:443/USER/:::true",
                                                 user = "SQLAdmin",
                                                 port = "443",
                                                 password = "PASSWORD",
                                                 pathToDriver = "./jars")

Execute

CommonDataModel::executeDdl(connectionDetails = cd,
                            cdmVersion = "5.4",
                            cdmDatabaseSchema = "OMOPCDM54"
                            )

If everything goes well, no errors will be reported in RStudio.

Check your work on the InterSystems Cloud Document deployment: it's empty, but it's there!
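You can also verify from R; here is a minimal sketch with DatabaseConnector, reusing the connection details (cd) from above:

# Connect, list the tables created in the OMOPCDM54 schema, and clean up
conn <- DatabaseConnector::connect(cd)
DatabaseConnector::getTableNames(conn, databaseSchema = "OMOPCDM54")
DatabaseConnector::disconnect(conn)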

👣🔫 Alert, and I will mention this one again on the OMOP Journey. In the CDM for IRIS, there is something to be aware of: a name collision of sorts with the "DOMAIN" table. If you look in the screenshot above, you can see DOMAIN is in all caps. This is because domain is a keyword in InterSystems SQL workloads, so keep in mind that you have to account for quoting the table name in operations against that table.
 

# "domain" collides with a reserved word in InterSystems SQL, so quote it
for table in cdmtables:
    try:
        if table == "domain":
            sql = f'SELECT COUNT(*) FROM (SELECT * FROM OMOPCDM54."{table}") AS temp_table;'
        else:
            sql = f'SELECT COUNT(*) FROM (SELECT * FROM OMOPCDM54.{table}) AS temp_table;'
        cursor.execute(sql)
    except Exception as e:
        print(f"{table}: {e}")

 

 

Mr. Developer (I am using IRIS and can't be bothered with implementation overhead like TLS)

I'm in a hurry, security is somebody else's job, and I want the OMOP database on my local IRIS Community, Eval, Container, or pod. Most of the steps above apply, apart from a new connection string, and we will create the schema through the SMP (System Management Portal).


 

Now, back in RStudio again, point to the developer instance.

cd <- DatabaseConnector::createConnectionDetails(dbms = "iris",
                                                 connectionString = "jdbc:IRIS://192.168.1.162:1972/USER",
                                                 user = "_SYSTEM",
                                                 port = "1972",
                                                 password = "SYS",
                                                 pathToDriver = "./jars")

CommonDataModel::executeDdl(connectionDetails = cd,
                            cdmVersion = "5.4",
                            cdmDatabaseSchema = "OMOPCDM54"
                            )
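Before heading to the SMP, you can also poke at the result from R. Note the quoted "DOMAIN", per the footgun called out above; a sketch reusing the connection details (cd):

# DOMAIN must be a delimited identifier in InterSystems SQL
conn <- DatabaseConnector::connect(cd)
DatabaseConnector::querySql(conn, 'SELECT COUNT(*) FROM OMOPCDM54."DOMAIN"')
DatabaseConnector::disconnect(conn)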

Connecting with nostalgic credentials...



Now refresh the SMP and expand the OMOPCDM54 schema.



Woooo!

🙌 🎉 

Excellent work, and shout-outs to those involved in this effort!

Article
· Feb 18 · 8 min read

OMOP Odyssey - InterSystems OMOP, The Cloud Service (Troy)




An implementer's approach into the OHDSI (pronounced "Odyssey") community through an expiring trial of the InterSystems OMOP Cloud Service.

What is it? 

The InterSystems OMOP, available as a HealthShare service through the InterSystems Cloud Services Portal, transforms HL7® FHIR® data into the Observational Medical Outcomes Partnership (OMOP) Common Data Model (CDM). The InterSystems OMOP looks at FHIR data stored in an S3 bucket and automatically transforms and sends the data to the cloud-hosted repository in the OMOP CDM format. You can then use external Observational Health Data Sciences and Informatics (OHDSI) tools, such as ATLAS or HADES, in conjunction with a database driver, such as JDBC, to perform analytical tasks on your data.

Abridged: it transforms S3-hosted FHIR bulk export data to the OMOP CDM, landing in a cloud-hosted IRIS or a Postgres-flavored database of your choice.

I'm going to take the above for a spin here, from "soup to nuts" as they say, and go end to end with an implementation surrounded by modern, powerful tools and the incredible ecosystem of applications from the OHDSI community. I will try not to re-hash the docs, here or there, and will surface some foot guns 👣🔫 along the way.

Everything Good Starts with a Bucket

When you first provision the service, you may immediately feel that you are in a chicken-and-egg situation when you get to the creation dialog and are prompted for S3 information right out of the gate. You can fake this as best you can and update it later, or take a less hurried approach where you understand how you are provisioning an Amazon S3 bucket for transformation use. It's a modern approach implemented in most cloud-based data solutions to share data: you provision the source location yourself, then grant the service access to interact with it.

  • Provision Bucket and Initial Policy Stack
  • Create the Deployment for the Service
  • Update the Bucket Policy constrained to the Deployment

We can click the console to death, or do this with an example stack.

 
s3-omop-fhir-stack.yaml

Create the stack any way you want to; one way is to use the AWS CLI.

aws cloudformation create-stack --stack-name omopfhir --template-body file://s3-omop-fhir-stack.yaml --parameters ParameterKey=BucketName,ParameterValue=omop-fhir

Create some initial keys in the bucket to use for provisioning and the source folder for FHIR ingestion.

aws s3api put-object --bucket omop-fhir --key Transactions/in --profile pidtoo
aws s3api put-object --bucket omop-fhir --key termz --profile pidtoo

You should now be set up to provision the service with the following. Pay attention to the field asking for the ARN: it is actually asking for the ARN of the bucket, despite the description asking for the name... small 👣🔫 here.



After the deployment is created, head over to the "Configurations" navigation item inside the "FHIR to OMOP Pipeline" deployment and grab the policy by copying it to your clipboard. You can just follow the directions supplied there and wedge this into your current policy, or just snag the value of the role and update your stack.

aws cloudformation update-stack --stack-name omopfhir --template-body file://s3-omop-fhir-stack.yaml --parameters ParameterKey=PolicyfromOMOPConfiguration,ParameterValue="arn:aws:iam::1234567890:role/skipper-deployment-4a358965ec38ba179ebeeeeeeeeeee-Role"

Either way, you should end up with a policy that looks like this on your source bucket, under permissions... (account number and role fuzzed)

 
 Recommended Policy

I used a more open policy that allowed the root account but constrained the buckets. This way I could support multiple deployments with a single bucket (or multiple buckets). Not advised, I guess, but it is a second example for reference to support multiple environments in a single policy for IaC purposes.

 
Root Account

That's our source for the transformation; now let's move on to the target, the OMOP database.


Meet OMOP

Let's take a quick look over at the other deployment, "OMOP on IRIS", and meet the Common Data Model.

The OMOP (Observational Medical Outcomes Partnership) database is a monument to how you boil down the ridiculous complexity of supporting multiple sources into a common data model, referred to as the CDM. Any further explanation outside of the community would be an exercise in cut-and-paste (or, even worse, generative content), and the documentation in this community is really, really well done.

Navigate to "SQL Query Tools" and you can see the InterSystems implementation of the Common Data Model, shown here next to the infamous diagram of the OMOP schema from the OHDSI community.

That's as far as we go with this work of art; let's investigate another option: using the service for transformation purposes only.

BYODB (Bring Your Own Database)

We got a database for free when we provisioned last time, but if we want to target another database, we can surely do that, as the service, at the time of writing, supports transforming to flavors of Postgres as well. For this we will outline how to wrangle an external database, via Amazon RDS, and connect it to the service.


Compute


I'll throw a flag here and call another 👣🔫 I refer to as "Biggie Smalls", in regard to sizing your database for the service if you bring your own. InterSystems does a pretty good job of matching the transform side to the database side, so you will have to follow suit and consider that your transform performance depends on the SQL instance you procure to write to, so size accordingly. This may be obvious to some, but I witnessed it and thought I'd call it out, as I went cheap with RDS, Google Cloud SQL, et al., and the persistence times of the FHIR bundles to the OMOP database were impacted.

Having said all that, I do exactly the opposite and give Jeff Bezos the least amount of money possible for the task, with a db.t4g.micro Postgres RDS instance.

We expose it publicly and head over to download the certificate bundle for the corresponding region your database is in... make sure it's in .pem format.
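As a sketch, AWS publishes the regional RDS bundles at truststore.pki.rds.amazonaws.com, so grabbing the one for us-east-2 (the region assumed throughout) can be done right from R:

# Download the RDS CA bundle for us-east-2 in .pem format
download.file("https://truststore.pki.rds.amazonaws.com/us-east-2/us-east-2-bundle.pem",
              destfile = "us-east-2-bundle.pem")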

Next, however you interact with databases these days, connect to your DB instance and create a DATABASE and a SCHEMA:
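As one sketch of that step: once an OMOPCDM54 database exists (create it however you like), the schema can be added from R with DatabaseConnector. The host and credentials below are placeholders:

# Connect to the new database and create the schema the DDL step below targets
conn <- DatabaseConnector::connect(dbms = "postgresql",
                                   server = "your-rds-host.us-east-2.rds.amazonaws.com/OMOPCDM54",
                                   user = "postgres",
                                   password = "REDACTED",
                                   pathToDriver = "./jars")
DatabaseConnector::executeSql(conn, "CREATE SCHEMA IF NOT EXISTS omopcdm54;")
DatabaseConnector::disconnect(conn)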




Load OMOP CDM 5.4

Now we get a little help from our friends in the OHDSI community to provision the supported schema, at version 5.4, in RStudio using the OHDSI tools.

install.packages("devtools")
devtools::install_github("OHDSI/CommonDataModel")
install.packages("DatabaseConnector")
install.packages("SqlRender")
Sys.setenv("DATABASECONNECTOR_JAR_FOLDER" = "/home/sween/Desktop/OMOP/iris-omop-extdb/jars")
library(DatabaseConnector)
downloadJdbcDrivers("postgresql")

We now have what we need: we can connect to our Postgres instance and create the tables in the OMOPCDM54 database we provisioned above.

Connect

cd <- DatabaseConnector::createConnectionDetails(dbms = "postgresql",
                                                 server = "extrdp-ops.biggie-smalls.us-east-2.rds.amazonaws.com/OMOPCDM54",
                                                 user = "postgres",
                                                 password = "REDACTED",
                                                 pathToDriver = "./jars"
                                                 )


Create

CommonDataModel::executeDdl(connectionDetails = cd,
                            cdmVersion = "5.4",
                            cdmDatabaseSchema = "omopcdm54"
                            )

Barring a "sea of red", it should have executed successfully.



Now let's check our work; we should have an external Postgres OMOP database suitable for use with the service.
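A quick sanity check from R, reusing the connection details (cd) from above, counting the tables the DDL created in the schema:

# Expect a few dozen CDM tables in the omopcdm54 schema
conn <- DatabaseConnector::connect(cd)
DatabaseConnector::querySql(conn, "SELECT COUNT(*) FROM information_schema.tables WHERE table_schema = 'omopcdm54'")
DatabaseConnector::disconnect(conn)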


Configure OMOP Cloud Service

We have the sources and we have the targets; let's configure the service to glue them together and complete the transformation pipeline from FHIR to the external database.


The InterSystems OMOP Cloud Service should be all set!

The OMOP Journey continues...

Question
· Feb 18

Using a SQL Server stored procedure in a CSP page

Hi Community, 

I have a problem I am hoping someone can help with. I have created a front-end with HTML/CSS/JS in a CSP page in IRIS Studio. I am trying to use ObjectScript on the back end to connect to a SQL Server database (with valid credentials) and execute a stored procedure.

I have a fileList variable that is stored as a comma-separated string through a user input textBoxContainer. This is to be used as a parameter in the stored procedure, called @Docs.

When I run the stored procedure from SSMS and enter the fileList manually on execution, it runs as expected; however, when I run it from the CSP page, it doesn't execute. I have put the code snippet below in hopes someone has done this before and can provide an example. For an example fileList, you can assume fileList = "111,222,333".

set gc=##class(%SQLGatewayConnection).%New()

set CredentialObj=##class(Ens.Config.Credentials).%OpenId("DatabaseCred")

set pDSN="DB-DSN"

set sc=gc.Connect(Assume DSN and User&Pass Credentials Here,0)

if $$$ISERR(sc) {
    &js<alert(#(..QuoteJS("DB connection Failed: "_$system.Status.GetErrorText(sc)))#);>
    quit
} else {
    &js<alert(#(..QuoteJS("Connected To DB"))#);>
}

if sc '= 1 {
    &js<alert(#(..QuoteJS("Error executing stored procedure: "_gc.%SQLCODE))#);>
} elseif scriptName = "procedure1" {
    set sc = gc.Execute("EXEC ExternalDB.DB.StoredProcName @Docs = ?", fileList)
}

 

Thanks in advance. Also, just in case this is relevant: the stored procedure is in one DB and updates the data on a different DB. However, I don't think that would be an issue, as it connects to the DB as expected from the CSP page; it is just the SP that doesn't work.

Kind Regards,

Dan

Announcement
· Feb 18

[Video] FHIR Repository and FHIR SQL Builder

Hello Community,

Enjoy the new video on the InterSystems Developers YouTube channel:

⏯ FHIR Repository and FHIR SQL Builder @ Global Summit 2024

Discover the challenges the Israeli Ministry of Health faced when deploying InterSystems FHIR Server and FHIR SQL Builder for several important national initiatives.
 

🗣 Presenter: @Keren Skubach, Senior Sales Engineer, InterSystems

Broaden your horizons with us: watch, learn, and subscribe for more! 👍

Question
· Feb 18

Apache server .../temp/ does not exist, limiting Gateway functionality

I am using an Apache web server on RH8, which is working fine. However, I get the following message:

Directory /opt/webgateway/conf/temp/ does not exist or has incorrect permissions. This will limit the Web Gateway's functionality.

The folder is indeed missing, but in what way is it limiting functionality?
