Article
· Jul. 11, 2022 · 1 min read

How to Call an External Python File from Embedded Python

Assume IRIS is installed in c:\InterSystems\IRIS.

(1) Place a.py in C:\InterSystems\IRIS\lib\python.

a.py

def test():
    print('Hello World!')

(2) Call it from an IRIS class.

ClassMethod xx() [ Language = python ]
{
    import a
    a.test()
}

 

USER>do ##class(User.test).xx()
Hello World!

 

If you want to place a.py in a different directory, such as C:\temp, you need to add C:\temp\ to Python's path.


The following approaches are available.

 

(A) In the Management Portal, go to System Administration > Configuration > Additional Settings > Advanced Memory > PythonPath, set the value to C:\temp, place a.py in that directory, and restart IRIS.
 

ClassMethod xx() [ Language = python ]
{
   import a
   a.test()
}
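When restarting IRIS is not convenient, the same effect can be had at runtime by appending the directory to sys.path before importing. The sketch below simulates this in plain Python: it creates a stand-in for C:\temp\a.py in a temporary directory (the temporary directory is purely for illustration; inside an IRIS Python method you would append C:\temp itself):

```python
import os
import sys
import tempfile

# Stand-in for C:\temp\a.py: write a small module into a scratch directory
module_dir = tempfile.mkdtemp()
with open(os.path.join(module_dir, "a.py"), "w") as f:
    f.write("def test():\n    return 'Hello World!'\n")

# Appending the directory to sys.path makes `import a` resolvable,
# just as adding C:\temp to PythonPath does inside IRIS
sys.path.append(module_dir)
import a

print(a.test())
```

The append takes effect only for the current process, which is why the PythonPath setting plus a restart is the persistent alternative.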
Question
· Jul. 7, 2022

Help: Doubts about MSM-MUMPS on UNIX

Hi, can anyone tell me how I should calculate the parameters below in InterSystems MSM/MUMPS?

STACK SIZE
MAX MODIFIED BUFFERS
DISK I/O RATE
SLICE COMMANDS
PARTITION SIZE
BUFFER POOL SIZE
NUMBER OF MUSERVER PROCESSES

We have an MSM system running in production that suffers hangs of the Activate service and the Workstation service several times a day, and we are unable to determine whether those hangs are related to the number of users or to overload on the system. There are times when MSM/UNIX freezes and we lose all interaction with MSM; we have to kill the service directly to get it running again.

Article
· Jul. 5, 2022 · 4 min read

IRIS Data to Google Big Query - InterSystems Cloud SQL via Dataflow

This article shows how to include IRIS data in your Google Big Query data warehouse and in your Data Studio data explorations. We will use Google Cloud Dataflow to connect to our InterSystems Cloud SQL service and build a job that persists the results of an IRIS query to Big Query on an interval.

If you were lucky enough to get access to Cloud SQL at Global Summit 2022, as mentioned in "InterSystems IRIS: What's New, What's Next", the example is a snap, but you can pull this off with any publicly or VPC-accessible listener you have provisioned instead.

 

Prerequisites

Provision InterSystems Cloud SQL for temporary use
Setup Google Cloud Platform

Google Dataflow Job

If you followed the steps above, you should have everything in your inventory needed to execute the job that reads your InterSystems IRIS data and ingests it into Google Big Query using Google Dataflow.

In the Google Cloud Console, head over to Dataflow and select "Create Job from Template".

 
This is a rather exhaustive illustration of how to fill out the form with the generated prerequisites, but it calls out where each of the components comes from...

 

 ... to round it out, make sure you expand the bottom section and supply your credentials for IRIS.

 

For those who found the screenshots an insult to their intelligence, here is the alternate route to stay inside the comfort zone of the CLI and run the job:

gcloud dataflow jobs run iris-2-bq-dataflow \
--gcs-location gs://dataflow-templates-us-central1/latest/Jdbc_to_BigQuery \
--region us-central1 --num-workers 2 \
--staging-location gs://iris-2-datastudio/tmp \
--parameters connectionURL=jdbc:IRIS://k8s-c5ce7068-a4244044-265532e16d-2be47d3d6962f6cc.elb.us-east-1.amazonaws.com:1972/USER,driverClassName=com.intersystems.jdbc.IRISDriver,query=SELECT TABLE_CATALOG, TABLE_SCHEMA, TABLE_NAME, TABLE_TYPE, SELF_REFERENCING_COLUMN_NAME, REFERENCE_GENERATION, USER_DEFINED_TYPE_CATALOG, USER_DEFINED_TYPE_SCHEMA, USER_DEFINED_TYPE_NAME, IS_INSERTABLE_INTO, IS_TYPED, CLASSNAME, DESCRIPTION, OWNER, IS_SHARDED FROM INFORMATION_SCHEMA.TABLES;,outputTable=iris-2-datastudio:irisdata.dataflowtable,driverJars=gs://iris-2-datastudio/intersystems-jdbc-3.3.0.jar,bigQueryLoadingTemporaryDirectory=gs://iris-2-datastudio/input,username=SQLAdmin,password=Testing12!
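Because the --parameters value is a single comma-separated argument, it is easy to mangle when editing by hand. A small helper sketch can assemble the string before pasting it into the gcloud command; the host, query, bucket, and credential values below are placeholders standing in for the article's examples, and note that commas inside the query value itself may require gcloud's alternative delimiter syntax:

```python
# Sketch: build the Jdbc_to_BigQuery --parameters string from a dict so the
# comma-separated argument stays well-formed. All values are placeholders.
parameters = {
    "connectionURL": "jdbc:IRIS://<cloud-sql-host>:1972/USER",
    "driverClassName": "com.intersystems.jdbc.IRISDriver",
    "query": "SELECT TABLE_SCHEMA, TABLE_NAME FROM INFORMATION_SCHEMA.TABLES",
    "outputTable": "iris-2-datastudio:irisdata.dataflowtable",
    "driverJars": "gs://iris-2-datastudio/intersystems-jdbc-3.3.0.jar",
    "bigQueryLoadingTemporaryDirectory": "gs://iris-2-datastudio/input",
    "username": "SQLAdmin",
    "password": "<password>",
}

# Join key=value pairs with commas, as the template flag expects
param_string = ",".join(f"{key}={value}" for key, value in parameters.items())
print(param_string)
```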

Once you have kicked off your job, you can bask in the glory of a successful job run:

 
Results

Taking a look at our source data and query in InterSystems Cloud SQL...

 

... and then inspecting the results in Big Query, it appears we do, in fact, have InterSystems IRIS data in Big Query.

 
Once we have the data in Big Query, it is trivial to include our IRIS data in Data Studio by selecting Big Query as the data source. The example below is missing some flair, but you can quickly see the IRIS data ready for manipulation in your Data Studio project.


 
 

Announcement
· Jun. 16, 2022

George James Software at the Global Summit

Only a few days to go until the Global Summit! George James Software will be on hand to talk about any projects you may have on the horizon, such as application development, data and platform migration, system integration, training, and support – we can work with you to find practical and maintainable solutions that support the growing needs of your organization.

We're also running a User Group Session on Wednesday, July 22nd at 12pm. It's a great opportunity to find out more about our tools and ask us (and current users!) any questions.

Article
· Jun. 3, 2022 · 5 min read

Questionnaire & Forms in FHIR: From Creation to Usage

This article discusses the FHIR Questionnaire and QuestionnaireResponse resources, from creating a form and uploading it to the server to filling it in.
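For orientation, the pair of resources looks roughly like this: a Questionnaire defines the items, and a QuestionnaireResponse answers them by linkId. This is a minimal sketch of the FHIR R4 shapes; the id, linkId, and answer values are illustrative, not taken from the article:

```python
import json

# Minimal FHIR R4 Questionnaire with one free-text item (illustrative values)
questionnaire = {
    "resourceType": "Questionnaire",
    "id": "example-form",
    "status": "active",
    "item": [
        {"linkId": "q1", "text": "What is your name?", "type": "string"},
    ],
}

# A QuestionnaireResponse answering that item, matched by linkId
response = {
    "resourceType": "QuestionnaireResponse",
    "questionnaire": "Questionnaire/example-form",
    "status": "completed",
    "item": [
        {"linkId": "q1", "answer": [{"valueString": "Alice"}]},
    ],
}

print(json.dumps(questionnaire, indent=2))
```

Both resources would be POSTed to the FHIR server as JSON bodies; the linkId is what ties each answer back to its question.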
