Question
· Mar. 11, 2024

Reusing and editing recordmaps in studio

I have a complex fixed-width recordmap that targets an older version of a file format, and I am trying to retool it to fit the latest version. The old recordmap has almost 500 entries, and the new version adds a few more fields along with changes to some field lengths. Adding a new field appends it at the end of the recordmap, and moving it from position 490 to position 139 is a painful process. Is there an easier way of doing this?

I have tried editing the recordmap in Studio by moving the fields to the correct locations. After recompiling, the Storage Default changed and the new fields appear to have been renumbered accordingly. However, this change is not reflected in the recordmap when viewed in the Management Portal.

Article
· Mar. 11, 2024 · 2 min read

Handling $LIST() data in Embedded Python

This is an article from the InterSystems FAQ site.

As of this writing (March 2024), you can use the Python library "iris-dollar-list", published on the community, to work with IRIS $LIST() data as a Python list.

Note: this is not a standard tool, but it is available for use. For details, see the community article "Another implementation of $ListBuild(): the Python library iris-dollar-list".

To use it with an IRIS instance installed on Windows, install "iris-dollar-list" as follows.

Note: on IRIS instances installed on platforms other than Windows, you can install it in the usual way with the pip command.

Open a command prompt and run the following (the directory shown assumes a default IRIS installation):

> cd C:\InterSystems\IRIS\bin
> irispip install --target C:\InterSystems\IRIS\mgr\python iris-dollar-list

An example run:

USER>set ^ListTest=$LISTBUILD("test","あいうえお",101)

USER>:py

Python 3.10.12 (main, Jun 11 2023, 05:26:28) [GCC 11.4.0] on linux
Type quit() or Ctrl-D to exit this shell.
>>> from iris_dollar_list import DollarList
>>> glo=iris.gref("^ListTest")
>>> pythonlist=DollarList.from_bytes(glo[None].encode('ascii')).to_list()
>>> pythonlist
['test', 'あいうえお', 101]
>>>

Alternatively, you can use the ToList() method of the %SYS.Python class, although at the moment it only works with alphanumeric data, not Japanese text (Japanese is planned to be supported in a future version).

Updated 2024/6/28: as of version 2024.1, $LISTBUILD() data containing Japanese is now converted correctly to a Python list. For details, see the example in the reply posted by @Ayumu Tanaka.

An example run:

USER>set ^ListTest2=$LISTBUILD(123,"hello")

USER>:py

Python 3.10.12 (main, Jun 11 2023, 05:26:28) [GCC 11.4.0] on linux
Type quit() or Ctrl-D to exit this shell.
>>> glo=iris.gref("^ListTest2")
>>> pythonlist=iris.cls("%SYS.Python").ToList(glo[None])
>>> pythonlist
[123, 'hello']
>>> 
Article
· Mar. 11, 2024 · 8 min read

Generating meaningful test data using Gemini

We all know that having a proper set of test data before deploying an application to production is crucial for ensuring its reliability and performance. It allows us to simulate real-world scenarios and identify potential issues or bugs before they impact end users. Moreover, testing with representative data sets lets us optimize performance, identify bottlenecks, and fine-tune algorithms or processes as needed. Ultimately, a comprehensive set of test data helps deliver a higher-quality product, reducing the likelihood of post-production issues and enhancing the overall user experience.

In this article, let's look at how one can use generative AI, namely Gemini by Google, to generate (hopefully) meaningful data for the properties of multiple objects. To do this, I will use the RESTful service to generate data in a JSON format and then use the received data to create objects.

This leads to an obvious question: why not use the methods from %Library.PopulateUtils to generate all the data? Well, the answer is quite obvious as well if you've looked at the class's list of methods: there aren't many that generate meaningful data.

So, let's get to it.

Since I'll be using the Gemini API, I first need to generate an API key. To do this, open aistudio.google.com/app/apikey, click Create API key, and create the key in a new project.

After this is done, you just need to write a REST client to fetch and transform the data, and come up with a query string for Gemini. Easy peasy 😁

For the ease of this example, let's work with the following simple class

Class Restaurant.Dish Extends (%Persistent, %JSON.Adaptor)
{
Property Name As %String;
Property Description As %String(MAXLEN = 1000);
Property Category As %String;
Property Price As %Float;
Property Currency As %String;
Property Calories As %Integer;
}
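To make the target concrete, here is what a single Dish object should look like in the JSON we will ask Gemini for, matching %JSON.Adaptor's default projection of the class above (the field values are invented for illustration):

```python
import json

# One element of the array we will ask Gemini to produce; values are invented
sample_dish = {
    "Name": "Margherita Pizza",
    "Description": "Classic pizza with tomato, mozzarella and basil",
    "Category": "Main Course",
    "Price": 9.5,
    "Currency": "EUR",
    "Calories": 850,
}
serialized = json.dumps(sample_dish)
```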

In general, it would be really simple to use the built-in %Populate mechanism and be done with it. But in bigger projects you will have a lot of properties that are not so easily populated with meaningful data automatically.

Anyway, now that we have the class, let's think about the wording of a query to Gemini. Let's say we write the following query:

{"contents": [{
    "parts":[{
      "text": "Write a json object that contains a field Dish which is an array of 10 elements. Each element contains Name, Description, Category, Price, Currency, Calories of the Restaurant Dish."}]}]}
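As a sanity check, the same request body can be assembled programmatically rather than by hand; a minimal Python sketch (the function name is mine):

```python
import json

def build_gemini_payload(prompt: str) -> str:
    # The generateContent endpoint expects contents -> parts -> text
    payload = {"contents": [{"parts": [{"text": prompt}]}]}
    return json.dumps(payload)

body = build_gemini_payload(
    "Write a json object that contains a field Dish which is an array of "
    "10 elements. Each element contains Name, Description, Category, Price, "
    "Currency, Calories of the Restaurant Dish."
)
```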

If we send this request to https://generativelanguage.googleapis.com/v1beta/models/gemini-pro:generateContent?key=APIKEY we will get something like:

(sample Gemini response omitted)

Already not bad. Not bad at all! Now that I have the wording of my query, I need to generate it as automatically as possible, call it and process the result.

Next step: generating the query. Using a very useful community article on how to get the list of properties of a class, we can generate most of the query automatically.

ClassMethod GenerateClassDesc(classname As %String) As %String
{
    set cls=##class(%Dictionary.CompiledClass).%OpenId(classname,,.status)
    set x=cls.Properties
    set profprop = $lb()
    // start at 3 to skip inherited system properties
    for i=3:1:x.Count() {
        set prop=x.GetAt(i)
        set $list(profprop, i-2) = prop.Name
    }
    quit $listtostring(profprop, ", ")
}

ClassMethod GenerateQuery(qty As %Numeric) As %String [ Language = objectscript ]
{
    set classname = ..%ClassName(1)
    // e.g. "Write a json object that contains a field Dish which is an array of ..."
    set str = "Write a json object that contains a field "_$piece(classname, ".", 2)_
        " which is an array of "_qty_" elements. Each element contains "_
        ..GenerateClassDesc(classname)_" of a "_$translate(classname, ".", " ")_". "
    quit str
}

When dealing with complex relationships between classes, it may be easier to use the object constructor to link different objects together, or to use the built-in %Library.Populate mechanism.

The following step is to call the Gemini RESTful service and process the resulting JSON.

ClassMethod CallService() As %Status
{
 // GetLink() is assumed to return a %Net.HttpRequest pointed at generativelanguage.googleapis.com
 Set request = ..GetLink()
 set query = "{""contents"": [{""parts"":[{""text"": """_..GenerateQuery(20)_"""}]}]}"
 do request.EntityBody.Write(query)
 set request.ContentType = "application/json"
 set sc = request.Post("v1beta/models/gemini-pro:generateContent?key=<YOUR KEY HERE>")
 if $$$ISOK(sc) {
    Set response = request.HttpResponse.Data.Read()
    set p = ##class(%DynamicObject).%FromJSON(response)
    // drill down to candidates[0].content.parts[0].text
    set iter = p.candidates.%GetIterator()
    do iter.%GetNext(.key, .value, .type )
    set iter = value.content.parts.%GetIterator()
    do iter.%GetNext(.key, .value, .type )
    // strip the Markdown fence around the returned JSON before parsing it
    set obj = ##class(%DynamicObject).%FromJSON($Extract(value.text,8,*-3))

    // create and save one Dish object per array element
    set dishes = obj.Dish
    set iter = dishes.%GetIterator()
    while iter.%GetNext(.key, .value, .type ) {
        set dish = ##class(Restaurant.Dish).%New()
        set sc = dish.%JSONImport(value.%ToJSON())
        set sc = dish.%Save()
    }
 }
 quit sc
}

Of course, since it's just an example, don't forget to add status checks where necessary.

Now, when I run it, I get a pretty impressive result in my database. Let's run a SQL query to see the data.

The description and category correspond to the name of the dish. Moreover, the prices and calories look correct as well. This means I actually get a database filled with reasonably realistic-looking data, and the results of the queries I am going to run will resemble real results.

Of course, a huge drawback of this approach is the need to write a query to a generative AI and the time it takes to generate the result. But the quality of the data may be worth it. Anyway, that is for you to decide 😉

 
P.S. The first image is how Gemini imagines the "AI that writes a program to create test data" 😆

Article
· Mar. 11, 2024 · 3 min read

Deploying IRIS For Health on OpenShift

In case you're planning on deploying IRIS For Health, or any of our containerized products, via the IKO on OpenShift, I wanted to share some of the hurdles we had to overcome.

As with any IKO-based installation, we first need to deploy the IKO itself. However, we were getting this error:

Warning FailedCreate 75s (x16 over 3m59s) replicaset-controller Error creating: pods "intersystems-iris-operator-amd-f6757dcc-" is forbidden: unable to validate against any security context constraint:

followed by a list of all the security context constraints (SCCs) it could not validate against.

If you're like me, you may be surprised to see such an error when deploying in Kubernetes, because a security context constraint is not a Kubernetes object. This comes from the OpenShift universe, which extends the regular Kubernetes definition (read more about that here). 

What happens is that when we install the IKO via helm (see more on how to do that here) we create a service account.

[ User accounts are for humans. Service accounts are for application processes - Kubernetes docs].

This service account is put in charge of creating objects, such as the IKO pod. However, it fails.

OpenShift has a wide array of security permissions that can be limited, and one way to do this is via the security context constraint. 

What we needed to do was to create the following SecurityContextConstraint:

# Create SCC for ISC resources
kind: SecurityContextConstraints
apiVersion: security.openshift.io/v1
metadata:
  name: iris-scc
  namespace: <iris namespace>
allowPrivilegedContainer: false
runAsUser:
  type: RunAsAny
seLinuxContext:
  type: RunAsAny
fsGroup:
  type: RunAsAny
supplementalGroups:
  type: RunAsAny
allowHostNetwork: false
allowHostPID: false
allowHostPorts: false
allowHostDirVolumePlugin: false
allowHostIPC: false
readOnlyRootFilesystem: false
allowPrivilegeEscalation: false
users:
  - system:serviceaccount:<iris namespace>:intersystems-iris-operator-amd

This gives access to the intersystems-iris-operator-amd service account to create objects by allowing it to validate against the iris-scc.

Next is to deploy the IrisCluster itself (more on that here). But this was failing too, because we needed to give the default service account access to the anyuid SCC, allowing our containers to run as any user (more specifically, we need to let the irisowner user, UID 51773, run the containers!). We do this as follows:

oc adm policy add-scc-to-user anyuid -z default -n <iris namespace>

We then create a rolebinding for the Admin role to the service account intersystems-iris-operator-amd, giving it the ability to create and monitor in the namespace. In OpenShift one can do this via the console, or as explained in kubectl create rolebinding.
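For reference, the same rolebinding can be created declaratively; a sketch, where the rolebinding name is illustrative and `admin` is the built-in ClusterRole:

```yaml
kind: RoleBinding
apiVersion: rbac.authorization.k8s.io/v1
metadata:
  name: iris-operator-admin
  namespace: <iris namespace>
subjects:
  - kind: ServiceAccount
    name: intersystems-iris-operator-amd
    namespace: <iris namespace>
roleRef:
  kind: ClusterRole
  name: admin
  apiGroup: rbac.authorization.k8s.io
```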

One very last thing to note is that you may notice the container getting a SIGKILL, as is shown in the IRIS Messages Log:

Initializing IRIS, please wait...
Merging IRIS, please wait...
Starting IRIS
Startup aborted.
Unexpected failure: The target process received a termination signal 9.
Operation aborted.
[ERROR] Command "iris start IRIS quietly" exited with status 256

This could be due to Resource Quotas and Limit Ranges. Take into account that these exist at both the pod level and the container level.
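In plain Kubernetes terms, the relevant knob is the container's resources block; if the memory limit is lower than what IRIS needs at startup, the process is OOM-killed with signal 9. A sketch with illustrative values (size these for your own workload):

```yaml
# Illustrative values only; too small a memory limit leads to SIGKILL at startup
resources:
  requests:
    memory: "2Gi"
    cpu: "1"
  limits:
    memory: "4Gi"
    cpu: "2"
```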

Hope this helps and happy deploying!

P.S.

You may have noted that in the values.yaml of the Helm chart, there is this snippet:

serviceAccount:
  # Specifies whether a ServiceAccount should be created
  create: true
  # The name of the ServiceAccount to use.
  # If not set and create is true, a name is generated using the fullname template
  name:

You can actually edit this and use a service account that already exists. For example:

serviceAccount:
  # Specifies whether a ServiceAccount should be created
  create: false
  # The name of the ServiceAccount to use.
  # If not set and create is true, a name is generated using the fullname template
  name: myExistingServiceAccount

Note that this is not one-size-fits-all, but it could help if you're deploying on a strict system where you cannot create service accounts but can use ones that already exist.

Question
· Mar. 8, 2024

Classic View Page for CCR to be deprecated - did we miss anything on the new UI?

Last year we introduced our new angular-based View page for CCR as part of the UI refresh for the application.  This has been used very effectively by close to 1000 users around the world as the default UI for viewing CCR, and as a result we're getting ready to completely disable the "classic" View page. 

Benefits of the new page include:

  • modern look and feel
  • reworked UX   
  • dynamic data updates
  • new tabbed access to reduce scrolling
  • dynamic workflow visualization (coming soon)

Before we turn off access to the old UI, we really want to make sure that users are able to do everything they need to with the new UI.  Have you found that you need to revert to using the Classic View page to accomplish certain tasks?  If so, please let us know in the comments below.
