InterSystems Official
· Oct. 22

Maintenance Releases 2025.1.2 and 2024.1.5 of InterSystems IRIS, IRIS for Health, & HealthShare Health Connect are now available

The 2025.1.2 and 2024.1.5 maintenance releases of InterSystems IRIS® data platform, InterSystems IRIS® for Health™, and HealthShare® Health Connect are now Generally Available (GA). These releases include fixes for a number of recently issued alerts and advisories, including the following:

Please share your feedback through the Developer Community so we can build a better product together.

Documentation

You can find the detailed change lists & upgrade checklists on these pages:

Early Access Programs (EAPs)

There are many EAPs available now. Check out this page and register for those you are interested in.

How to get the software?

Full installation packages for both InterSystems IRIS and InterSystems IRIS for Health are available from the WRC's InterSystems IRIS Data Platform Full Kits page. HealthShare Health Connect kits are available from the WRC's HealthShare Full Kits page. Container images are available from the InterSystems Container Registry.

Availability and Package Information

This release comes with classic installation packages for all supported platforms, as well as container images in Docker format. For a complete list, refer to the Supported Platforms document. The build numbers for these maintenance releases are 2025.1.2.374.0 and 2024.1.5.649.0.

Question
· Oct. 22

Windows issue with scripting

Hi,

I'm trying to run some scripting on Windows. I'm using an IRIS for Health Community edition instance.

I'm just trying to run a simple sequence of commands so I can use irissession in non-interactive mode, like:

irissession.exe IRISHEALTH -U "%SYS" < "myprogram.iris"

For instance, to run an online backup, the contents of myprogram.iris are:

set $namespace="%SYS"
set cbk = "C:\Test1.cbk"
set log = "C:\Test1.log"
w $$BACKUP^DBACK("","F","MyBackup",cbk,"Y",log,"NOINPUT","Y","Y", 1000, "Test1")
HALT

The execution starts, but then it hangs completely; I just see, endlessly:

%SYS>

%SYS>

This makes the approach completely unusable. I can see in the log that the backup actually completes, but I have to kill the process and close the session, and sometimes the testing VM even freezes completely.

By the way, the same procedure works without issue using a Docker instance on Ubuntu.

The problem therefore seems to lie in the input/output behavior of the irissession binary on Windows, but I haven't been able to solve it after trying many different approaches (running a single-line routine, creating a class and loading it, etc.).

Is it possible to run something like this unattended in Windows?

Thanks!

Question
· Oct. 22

Using the Workflow Engine to capture Data Quality Issues

I am looking for a way to capture data quality issues with the source data that populates HealthShare Provider Directory. One way is to use Managed Alerts, but since there could be multiple Providers and different messages, it seems excessive to alert on every message that has an error. Instead, I was thinking of using the Workflow Engine so it could populate a worklist for someone to review and work.

Looking over the Demo.Workflow engine example, I cannot work out how to send a task to the Workflow manager from a DTL to populate the worklist.

Does anyone have any examples of how this could be done? Do I need to do it in a process, or can it be sent via a DTL?
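For reference, the rough shape I imagine from reading the docs is a business process sending an EnsLib.Workflow.TaskRequest to a workflow operation; the operation name and the message text below are just my placeholders, and I haven't tested this:

// Hedged sketch (untested): send a task to a workflow role from a
// business process, not from the DTL itself.
Set task = ##class(EnsLib.Workflow.TaskRequest).%New()
Set task.%Subject = "Provider Directory data quality issue"
Set task.%Message = "Provider 12345: missing NPI"           // hypothetical detail
Set task.%Actions = "Corrected,Ignore"                      // buttons offered to the user
Set tSC = ..SendRequestAsync("DataQualityWorkflow", task)   // assumed operation name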

Question
· Oct. 22

Help creating an index in a DTL based on a variable value

Hello. I need to transform a message

FROM:

MSH|^~\&|
SCH||61490||
PID|1||
RGS|1||1
AIS|1||
AIS|2||
AIS|3||
AIL|1||
AIP|1||

TO:

MSH|^~\&|
SCH||61490||
PID|1||
RGS|1||1
AIS|1||
AIL|1||
AIP|1||
RGS|1||1
AIS|2||
AIL|1||
AIP|1||
RGS|1||1
AIS|3||
AIL|1||
AIP|1||

The RGS, AIS, AIL, and AIP segments are all under the RGS group. The one RGS segment that comes in must be copied across the groups: if 3 AIS segments come in, I need 3 RGS groups; if 2, I need 2 RGS groups; and so on.

In my DTL (screenshot below) I have currently hardcoded the RGS index (1, 2, 3), but this will not be sufficient if, say, 4 AIS segments are sent in. I need this index to be variable, based on the number of AIS segments that come in.

I have seen posts about using an increment and a while loop, and I did try that, but it didn't work. In the DTL above, instead of setting the index to 1, 2, 3, I set it to "trgs"; "tais" is the count of the AIS segments sent in.
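In code form, the looping logic I am after would be something like the sketch below, as it might appear in a DTL code action. The group names ("RGSgrp") and paths are my assumptions from the custom schema, and source/target are the usual EnsLib.HL7.Message objects:

// Untested sketch: a trailing (*) in a virtual-document path returns
// the repetition count, so count the incoming AIS segments first.
Set tais = source.GetValueAt("RGSgrp(1).AIS(*)")
For i = 1:1:tais {
    // Copy the single incoming RGS, plus one AIS, into each outgoing group
    Do target.SetValueAt(source.GetValueAt("RGSgrp(1).RGS"), "RGSgrp("_i_").RGS")
    Do target.SetValueAt(source.GetValueAt("RGSgrp(1).AIS("_i_")"), "RGSgrp("_i_").AIS(1)")
    Do target.SetValueAt(source.GetValueAt("RGSgrp(1).AIL(1)"), "RGSgrp("_i_").AIL(1)")
    Do target.SetValueAt(source.GetValueAt("RGSgrp(1).AIP(1)"), "RGSgrp("_i_").AIP(1)")
}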

Schema:

<?xml version="1.0"?>
<Category name="TEST_Schema_2.3.1" description="Orders scheduling." base="2.3.1">
    <MessageType name="SIU_S12" structure="SIU_ALL" returntype="base:ACK_S12"/>
    <MessageType name="SIU_S13" structure="SIU_ALL" returntype="base:ACK_S13"/>
    <MessageType name="SIU_S14" structure="SIU_ALL" returntype="base:ACK_S14"/>
    <MessageType name="SIU_S15" structure="SIU_ALL" returntype="base:ACK_S15"/>
    <MessageStructure name="SIU_ALL"
        definition="base:MSH~base:SCH~base:PID~{~base:RGS~base:AIS~base:AIL~base:AIP~}"
        description="Custom structure with repeating RGS group"/>
</Category>

Any leads, please?

Article
· Oct. 22 · 2 min read

Tips on handling large data

Hello community,

I wanted to share my experience of working on large data projects. Over the years, I have had the opportunity to handle massive patient data, payor data, and transactional logs while working in the hospital industry. I have had to build huge reports with complex logic that fetched data across multiple tables whose indexing did not support efficient code.

Here is what I have learned about managing large data efficiently.

Choosing the Right Data Access Method

As we in the community are all aware, IRIS provides multiple ways to access data, and choosing the right method depends on the requirement.

  • Direct Global Access: Fastest for bulk read/write operations. For example, if I have to traverse indexes and fetch patient data, I can loop through the globals directly to process millions of records, which saves a lot of time:
Set ToDate = +$H, FromDate = +$H-1   // today and yesterday in $HOROLOG date format
For {
    Set FromDate = $Order(^PatientD("Date",FromDate))
    Quit:(FromDate="")||(FromDate>ToDate)
    Set PatId = ""
    For { Set PatId = $Order(^PatientD("Date",FromDate,PatId))  Quit:PatId=""
          Write $Get(^PatientD("Date",FromDate,PatId)),! }
}
  • Using SQL: Useful for reporting or analytical requirements, though slower for huge data sets; a sketch follows below.
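To illustrate the SQL route, here is a minimal sketch using a dynamic %SQL.Statement; the table and column names (SQLUser.PatientD, AdmitDate, Name) are made up for illustration:

// Minimal dynamic-SQL sketch; table and column names are assumptions
Set stmt = ##class(%SQL.Statement).%New()
Set sc = stmt.%Prepare("SELECT ID, Name FROM SQLUser.PatientD WHERE AdmitDate = ?")
If $System.Status.IsError(sc) { Do $System.Status.DisplayError(sc)  Quit }
Set rs = stmt.%Execute(+$H)   // bind today's $HOROLOG date to the ? parameter
While rs.%Next() {
    Write rs.%Get("ID"), ": ", rs.%Get("Name"), !
}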

Streamlining Bulk Operations

Handling millions of records one by one is slow and heavy. To optimize, I have found that saving in batches, using temporary globals for intermediate steps, and breaking large jobs into smaller chunks make a huge difference. Turning off non-essential indices during bulk inserts also speeds things up. A rough sketch of the batching idea follows below.
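For example, something like this, assuming a hypothetical Hospital.Patient persistent class fed from a hypothetical ^StagingD global, committing every 1,000 records to keep transactions small:

// Batching sketch; the class and global names are placeholders
Set count = 0
TSTART
Set id = ""
For {
    Set id = $Order(^StagingD(id), 1, data)  Quit:id=""
    Set p = ##class(Hospital.Patient).%New()
    Set p.Name = $Piece(data, "^", 1)        // map fields from the staged row
    Set sc = p.%Save()
    Set count = count + 1
    If count # 1000 = 0 { TCOMMIT  TSTART }  // commit in batches of 1,000
}
TCOMMIT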

Using Streams

For large text, XML, or JSON payloads, stream objects prevent memory overload. Dealing with huge files can consume a lot of memory if we load everything at once, so I prefer stream objects that read or write the data in chunks. This keeps things fast and efficient.

Set stream = ##class(%Stream.GlobalCharacter).%New()
Do stream.CopyFromFile("C:\Desktop\HUGEDATA.json")   // load the file into the stream
Write "Size: ", stream.Size, !                       // Size is a property, not a method

This is a simple way of handling data safely without slowing down the system.
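And to actually process the payload in pieces rather than pulling it all into one string, a chunked read like this works (the 32,000-character chunk size is an arbitrary choice):

Do stream.Rewind()                  // start reading from the beginning
While 'stream.AtEnd {
    Set chunk = stream.Read(32000)  // read up to 32,000 characters
    // ... process the chunk here ...
}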

So, to sum it up: handling huge data is not just about making things fast; it's about choosing the right way to access and store it while keeping the system balanced.

From migrating millions of patient records to building APIs that handle quite large datasets, these approaches have made a real difference in performance and maintainability.

If you are working with similar concepts and want to swap ideas, please feel free to reach out; I am always happy to share what has worked for me. Open to feedback and your opinions too.

Thanks!!! :-)
