Question
· Oct. 11

Can I implement custom stream compression algorithms in IRIS for %Stream.GlobalBinary?

For space optimization, we want to apply a domain-specific compression algorithm to binary stream data before writing to %Stream.GlobalBinary. Is it possible to override or extend stream classes to include compression/decompression?

Question
· Oct. 11

How can I implement an efficient multi-shard SQL query in an InterSystems IRIS sharded cluster?

We are using IRIS with a sharded architecture. Complex SQL queries (with joins, aggregates, and subqueries) are performing slowly. How can I design queries or indexes to optimize distributed execution across shards?

Article
· Oct. 11 · 4 min read

Working with Stream Objects in InterSystems IRIS

Introduction

In modern applications, especially those involving large data, documents, logs, or multimedia, handling large or unstructured content efficiently becomes essential. InterSystems IRIS offers a robust and scalable way to manage such data using stream objects.

Stream objects allow developers to work with large text or binary data without being constrained by string size limits or memory inefficiencies. In this article, we’ll explore how to create, read, write, store, and manipulate stream objects within IRIS using ObjectScript.


What Are Stream Objects in IRIS?

InterSystems IRIS provides built-in stream classes under the %Stream package to represent sequences of characters or bytes that can be read or written incrementally. These streams are particularly useful when working with:

  • Text documents (logs, reports, etc.)
  • Binary files (images, PDFs)
  • Large API payloads (file uploads/downloads)
  • Persistent class properties exceeding string limits (~3.6MB)

Types of Streams

Stream Type      | Class                                          | Use Case
Character Stream | %Stream.GlobalCharacter, %Stream.FileCharacter | Large text data
Binary Stream    | %Stream.GlobalBinary, %Stream.FileBinary       | Images, PDFs, binary files

Creating and Writing to a Stream

Here’s how to create and write to a global character stream:

Set stream = ##class(%Stream.GlobalCharacter).%New()
Do stream.Write("This is line one.")
Do stream.WriteLine("This is line two.")

The .Write() method appends text to the stream, while .WriteLine() adds a newline at the end.


Reading from a Stream

After writing to a stream, you can read from it using .Read() or .ReadLine():

Do stream.Rewind()
Set line1 = stream.ReadLine()
Set line2 = stream.ReadLine()
Write line1, !, line2

Always call .Rewind() before reading if you’ve already written to the stream — this resets the internal pointer to the beginning.
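Besides .ReadLine(), very large streams are usually consumed in fixed-size chunks with .Read() and the .AtEnd property. A minimal sketch (the 32,000-character chunk size is an arbitrary choice for illustration):

```objectscript
// Read a stream in chunks instead of loading it all at once
Do stream.Rewind()
While 'stream.AtEnd {
    Set chunk = stream.Read(32000)  ; reads up to 32,000 characters per call
    // process chunk here (write it, parse it, forward it, etc.)
}
```

This pattern keeps memory usage bounded no matter how large the stream is.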


Saving a Stream to a File

Stream objects can be saved to files directly:

Set sc = stream.OutputToFile("/tmp/output.txt")
If $$$ISERR(sc) { Write "Error saving file" }

Similarly, you can read from a file into a stream:

Set fileStream = ##class(%Stream.FileCharacter).%New()
Set fileStream.Filename = "/tmp/input.txt"
Set sc = fileStream.CopyTo(stream) ; Copy contents into another stream


Storing Stream Objects in Persistent Classes

Streams integrate easily with persistent objects. You can define a stream as a property in your data model:

Class MyApp.Report Extends (%Persistent)
{
Property Title As %String;
Property Content As %Stream.GlobalCharacter;
}

You can now save large content as part of your data model:

Set report = ##class(MyApp.Report).%New()
Set report.Title = "October Report"
Set report.Content = ##class(%Stream.GlobalCharacter).%New()
Do report.Content.Write("Full report content goes here...")
Do report.%Save()
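Reading the content back later is symmetric. A sketch, assuming the saved object's ID is known (the ID value 1 is illustrative):

```objectscript
// Reopen the saved report and stream its content back out
Set report = ##class(MyApp.Report).%OpenId(1)
Do report.Content.Rewind()
While 'report.Content.AtEnd {
    Write report.Content.Read(16000)  ; emit the stored content in chunks
}
```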


Binary Streams

For non-text data (e.g., images or attachments), use %Stream.GlobalBinary:

Set binStream = ##class(%Stream.GlobalBinary).%New()
Do binStream.Write($Char(255, 216, 255)) ; Sample binary data (JPEG header)
Do binStream.OutputToFile("/tmp/sample.jpg")
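A common companion pattern is importing an existing binary file into a global stream so it can be stored in the database. A sketch, with a hypothetical file path:

```objectscript
// Link a file stream to an existing file, then copy it into a database stream
Set fileBin = ##class(%Stream.FileBinary).%New()
Set sc = fileBin.LinkToFile("/tmp/photo.jpg")  ; hypothetical path
If $System.Status.IsError(sc) { Write "Cannot open file" Quit }
Set gblBin = ##class(%Stream.GlobalBinary).%New()
Set sc = gblBin.CopyFrom(fileBin)  ; copies the whole file into the global stream
Write gblBin.Size  ; number of bytes copied
```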


Common Use Cases for Stream Objects

  • REST API File Uploads/Downloads
    Handle large multipart file uploads using stream input/output in custom REST services.
  • PDF or Log Storage
    Store application logs or reports in a database without hitting string size limits.
  • Data Archiving
    Save binary backups or exports as BLOBs in the database.
  • Dynamic File Generation
    Generate files in-memory using streams before sending them over HTTP or saving them to disk.

Tips & Best Practices

  • Always rewind before reading a stream you’ve just written.
  • Use stream copying for efficient memory handling: source.CopyTo(destination).
  • For binary data, never use character streams — it may corrupt content.
  • Monitor .Size to track stream length.
  • Use %JSONExportToStream() for serializing objects directly into streams (e.g., for REST responses).
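The copying and size tips above can be sketched together. A minimal example, using %Stream.TmpCharacter as an arbitrary destination:

```objectscript
// Copy one stream into another without materializing a large string
Set source = ##class(%Stream.GlobalCharacter).%New()
Do source.Write("Some large content...")
Set target = ##class(%Stream.TmpCharacter).%New()
Set sc = source.CopyTo(target)  ; stream-to-stream copy
If $System.Status.IsOK(sc) { Write "Copied ",target.Size," characters" }
```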

Example: Export Object as JSON Stream

Set obj = ##class(MyApp.Report).%OpenId(1)
Set jsonStream = ##class(%Stream.GlobalCharacter).%New()
Do obj.%JSONExportToStream(.jsonStream)
Do jsonStream.OutputToFile("/tmp/export.json")

Note that %JSONExportToStream() is available only on classes that also extend %JSON.Adaptor.


Conclusion

Stream objects are an essential tool in every InterSystems IRIS developer’s toolbox. They provide a flexible, scalable, and efficient way to handle large or unstructured data within applications — whether you’re storing logs, manipulating files, or transmitting large payloads.

By understanding how to create, manipulate, and persist streams, you unlock a new level of data handling power within your IRIS environment.


Article
· Oct. 10 · 2 min read

Webinar | Building a Real-Time Data Hub with IDFS: From Multi-Source Integration to Intelligent Analytics

InterSystems Data Fabric Studio (IDFS) offers a new approach to delivering the right data to the right consumers at the right time in a secure, controlled environment. On October 17 at 14:00 we will host an online webinar titled "Building a Real-Time Data Hub with IDFS: From Multi-Source Integration to Intelligent Analytics" — 👉 click here to register!

IDFS is a fully cloud-managed solution designed to make it easy to implement and maintain a smart data fabric, connecting and transforming disparate data into a single, unified, actionable source of information. This self-service solution enables data analysts, data stewards, and data engineers to access and process the data that business stakeholders need, without depending on developers.

The session will demonstrate how to integrate heterogeneous data systems seamlessly through automated multi-source data pipeline construction (defining data source connections, field extraction, and cleansing rules) and business-calendar-driven real-time scheduling (running data tasks automatically on a recurring basis).

In this session you will learn about the following classic scenarios:

  • Data engineer perspective: define data transformation logic with the visual "Recipes" tool, automating loads from raw data to analytics tables without writing code;
  • Analyst practice: quickly build production-efficiency BI cubes on top of the integrated, standardized datasets and connect them to Power BI for dynamic dashboards;
  • Compliance management: use the built-in "snapshot scheduling" feature to automatically generate audit-ready historical data archives, combined with hierarchical access control (administrator, engineer, and analyst roles) to keep data secure and traceable.

Whether you are a data engineer, an architect, or an AI application developer, this webinar offers hands-on IDFS experience, technical architecture design ideas, and insight into emerging trends. IDFS helps you easily deploy data-centric AI applications that bridge data and application silos!

We look forward to further interaction with you.

1. Q&A

If you have any questions during the session, or would like to discuss further, click the "Ask a Question" button at the top of the screen to submit your question. We will collect the questions after the session and reply to you by email.

2. Survey with prizes

During the session, click "Survey" in the upper right corner of the screen and complete the questionnaire for a chance to win a small customized gift.

Come join us! 👉 Click here to view

Article
· Oct. 10 · 2 min read

Why do I still see old messages after running the purge task?

To control the accumulation of production data, InterSystems IRIS lets users keep database size in check by periodically purging data. This purge can apply to messages, logs, business processes, and managed alerts.

Please check the documentation for more details on the settings of the purge task:
https://docs.intersystems.com/irislatest/csp/docbook/DocBook.UI.Page.cls?KEY=EGMG_purge#EGMG_purge_settings

An issue many users run into is still finding old messages after running the message purge task. For example, suppose a purge task for messages has NumberOfDaysToKeep=45. This means that messages generated in the last 45 days are retained, and messages older than 45 days should be deleted. After the purge task completes successfully, most messages older than the 45-day retention period are gone, yet some old messages remain. Why aren't these messages purged?

In this article, I will discuss the most commonly seen root causes and how to handle the situation. I will assume that the user has a purge task using Ens.Util.Tasks.Purge. 

1) Check the purge task
First, we want to check the purge task's settings. We want to confirm the NumberOfDaysToKeep setting and make sure that the task is set to purge messages in the namespace we are examining, but the setting we want to check the most is KeepIntegrity.

If KeepIntegrity is enabled, the purge task will only purge complete sessions. By definition, a complete session contains only messages with a status of Completed, Error, Aborted, or Discarded. If any message in the session has a different status (e.g., it is Queued or Suspended), none of the messages in that session will be purged.

2) Check the status of all messages in the session
Knowing that KeepIntegrity can make the purge task skip messages in incomplete sessions, we can check whether this is the cause by examining message statuses. In the Message Viewer, apply the time filter to find messages that should have been deleted according to the NumberOfDaysToKeep criteria. Then, using the Session ID, check the status of every message in one of these sessions. Are there any messages with a status other than Completed, Error, Aborted, or Discarded?

In addition to the Message Viewer, you can verify this information using SQL by querying the Ens.MessageHeader table.
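A sketch of such a query from ObjectScript, using %SQL.Statement in display mode so that Status is shown as a name like Completed rather than an internal value (the session ID here is illustrative):

```objectscript
// List every message in a session together with its status
Set sessionId = 1234  ; hypothetical session ID taken from the Message Viewer
Set stmt = ##class(%SQL.Statement).%New()
Set stmt.%SelectMode = 2  ; 2 = display mode
Set sc = stmt.%Prepare("SELECT ID, Status, SourceConfigName FROM Ens.MessageHeader WHERE SessionId = ?")
If $System.Status.IsError(sc) { Write "Prepare failed" Quit }
Set rs = stmt.%Execute(sessionId)
While rs.%Next() {
    Write rs.%Get("ID"),"  ",rs.%Get("Status"),"  ",rs.%Get("SourceConfigName"),!
}
```

Any row whose Status is not Completed, Error, Aborted, or Discarded explains why the whole session survived the purge.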

3) Manage the incomplete sessions
To resolve the issue, you need to change the status of these messages so they can be purged. For example, some messages might still be in a queue and need to be aborted.

Another way to resolve it is to create a purge task with KeepIntegrity disabled to purge the messages even if the sessions are incomplete. You should choose an appropriate NumberOfDaysToKeep.
