
Article · Oct 10 · 9 min read

IRIS install automation using Ansible

Deploying new IRIS instances can be a time-consuming task, especially when setting up multiple environments with mirrored configurations.

I’ve encountered this issue many times and want to share my experience and recommendations for using Ansible to streamline the IRIS installation process. My approach also includes handling additional tasks typically performed before and after installing IRIS.

This guide assumes you have a basic understanding of how Ansible works, so I won’t go into much detail on its fundamentals. However, if you have questions about anything mentioned here, feel free to ask in the comments below.

The examples provided in this guide were tested using Ansible 3.6 on a Red Hat 8 server, with IRIS 2023.1.1 and Red Hat 8 as the client environment. Other versions of Ansible, Red Hat (or other UNIX flavors), and IRIS may also work, but your mileage may vary.

 

Ansible install

The Ansible server must be a Linux distribution. We use Red Hat 8 in this article, but other Linux distros and versions should work as well.

To install the Ansible packages, you must first install EPEL:

[ansible@auto01 ansible]$ yum install https://dl.fedoraproject.org/pub/epel/epel-release-latest-8.noarch.rpm

Then install Ansible:

[ansible@auto01 ansible]$ yum install ansible
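
You can confirm the installation and check the installed version with:

[ansible@auto01 ansible]$ ansible --version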

In addition to the packages, Ansible requires SSH access to remote servers. I recommend creating an SSH key pair, which is more secure than using traditional passwords. Also, the user used to connect to remote servers must have administrative privileges (i.e., be part of the wheel group).
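
As a minimal sketch, assuming the ansible user on the control node and the managed host test01.mydomain used in the inventory below, you could generate and distribute a key pair like this:

[ansible@auto01 ~]$ ssh-keygen -t ed25519 -C "ansible@auto01"
[ansible@auto01 ~]$ ssh-copy-id ansible@test01.mydomain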

 

Files and folders

To maintain an organized structure, I recommend the following files and folders under the ansible directory:

[ansible@auto01 ansible]$ ls -l
total 4
-rw-r--r--. 1 ansible ansible 247 Dec  5 00:57 ansible.cfg
drwxrwxr-x. 2 ansible ansible   6 Dec  5 00:56 files
drwxrwxr-x. 2 ansible ansible   6 Dec  5 00:56 inventory
drwxrwxr-x. 2 ansible ansible   6 Dec  5 00:56 library
drwxrwxr-x. 2 ansible ansible   6 Dec  5 00:56 playbooks
drwxrwxr-x. 2 ansible ansible   6 Dec  5 00:56 templates
drwxrwxr-x. 2 ansible ansible   6 Dec  5 00:56 vars
drwxrwxr-x. 2 ansible ansible   6 Dec  5 00:56 vault
Each of these serves the following purpose:

ansible.cfg: The Ansible configuration file. Contains directives on how Ansible behaves (a minimal sample is sketched below).
files: Contains extra files needed by playbooks, such as the IRIS install tar.gz file.
inventory: Contains host inventory files. You can have one large inventory file or multiple smaller ones; splitting the inventory requires more effort when running playbooks on multiple hosts.
library: Contains extra Ansible library files. Not required for these examples, but useful for future extensions.
playbooks: Contains all the playbooks that you develop, including the IRIS installation playbook discussed below.
templates: Contains template files used by playbooks. These are transferred to servers and instantiated with the correct parameters.
vars: Contains variables available to all playbooks.
vault: Contains sensitive variables accessible only through the ansible-vault command. Useful for handling passwords.
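
For reference, a minimal ansible.cfg consistent with this layout could look like the sketch below; these directives are common defaults, not necessarily the exact file used here:

[ansible@auto01 ansible]$ cat ansible.cfg
[defaults]
inventory   = ./inventory
library     = ./library
remote_user = ansible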

 

After setting up this folder structure, copy the IRIS installer and IRIS license key into the files folder. It should look like this:

[ansible@auto01 ansible]$ ls -l files/
total 759976
-rw-rw-r--. 1 ansible ansible 778207913 Dec  5 14:32 IRISHealth-2023.1.1.380.0.22870-lnxrh8x64.tar.gz
-rw-rw-r--. 1 ansible ansible      1160 Sep  5 19:13 iris.key

 

The inventory

To run playbooks in Ansible, you must define your inventory of servers. There are several ways to do this, each with its own advantages. In this article, we’ll use a single file to define all servers.

The servers.yml file will contain the entire inventory, listing each server along with the variables required for the IRIS installation. Here’s an example:

[ansible@auto01 ansible]$ cat inventory/servers.yml 
---
all:
  hosts:
    test01.mydomain:
      iris_user: irisusr
      iris_group: irisgrp
      mgr_user: irisown
      mgr_group: irismgr
      platform: lnxrh8x64
      iris_cmd: iris
      iris_instances:
        - name: TEST01
          superserver_port: 51773
          webserver_port: 52773
          binary_file: IRISHealth-2023.1.1.380.0.22870-lnxrh8x64
          key_file: iris.key
          install_dir: /test/iris
          jrnpri_dir: /test/jrnpri
          jrnsec_dir: /test/jrnsec
          config_globals: 16384
          config_errlog: 10000
          config_routines: "0,128,0,128,0,1024"
          config_gmheap: 1048576
          config_locksiz: 128057344

 

The vault

To keep passwords secure, create a vault file containing the passwords for the IRIS SuperUser and CSPSystem accounts.

Use the following command to edit the default vault file:

[ansible@auto01 ansible]$ ansible-vault edit vault/defaults.yml
---
# Default passwords
iris_user_passwd: "Ch4ngeTh!s"
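
If the vault file does not exist yet, create it first with ansible-vault create, which prompts you to set the vault password:

[ansible@auto01 ansible]$ ansible-vault create vault/defaults.yml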

 

The playbook

To perform an IRIS installation, several tasks must be executed on the target server. These tasks are grouped and ordered in a file called a playbook.
A playbook is essentially a list of tasks that are executed sequentially on the remote hosts.

Below is the playbook I developed to install IRIS:

[ansible@auto01 ansible]$ cat playbooks/install_iris.yml
#
# Playbook to install Iris
#
- hosts: all
  become: yes
  gather_facts: no
  tasks:
  - name: "Load default passwords"
    include_vars: "../vault/defaults.yml"
  ### PRE-INSTALL TASKS:
  - name: "Install required packets"
    yum:
      name: "{{ item }}"
      state: latest
    loop:
      - "httpd"
      - "java-1.8.0-openjdk"
      - "mod_auth_mellon"
      - "mod_ssl"
  - name: "Create iris group"
    group:
      name: "{{ iris_group }}"
      gid: 5005
  - name: "Create iris mgr group"
    group:
      name: "{{ mgr_group }}"
      gid: 5006
  - name: "Create iris owner user"
    user:
      name: "{{ mgr_user }}"
      uid: 5006
      group: "{{ iris_group }}"
      groups: "{{ mgr_group }}"
  - name: "Create iris user"
    user:
      name: "{{ iris_user }}"
      uid: 5005
      group: "{{ iris_group }}"
  - name: "Create mgr folder"
    file:
      path: "{{ item.install_dir }}/mgr"
      state: directory
      owner: "{{ iris_user }}"
      group: "{{ iris_group }}"
      mode: 0775
    loop: "{{ iris_instances }}"
    loop_control:
      label: "{{ item.name }}"
  - name: "Copy license key"
    copy:
      src: "../files/{{ item.key_file }}"
      dest: "{{ item.install_dir }}/mgr/iris.key"
      owner: "{{ iris_user }}"
      group: "{{ iris_group }}"
      backup: yes
    loop: "{{ iris_instances }}"
    loop_control:
      label: "{{ item.name }}"
  - name: "Create /install folder"
    file:
      path: "/install"
      state: directory
      mode: 0777
  - name: "Create Instances install folders"
    file:
      path: "/install/{{ item.name }}"
      state: directory
      mode: 0777
    loop: "{{ iris_instances }}"
    loop_control:
      label: "{{ item.name }}"
  - name: "Copy IRIS installer"
    copy:
      src: "../files/{{ item.binary_file }}.tar.gz"
      dest: "/install/{{ item.name }}/"
    loop: "{{ iris_instances }}"
    loop_control:
      label: "{{ item.name }}"
  - name: "Untar IRIS installer"
    command:
      cmd: "tar -xzf /install/{{ item.name }}/{{ item.binary_file }}.tar.gz"
      chdir: "/install/{{ item.name }}/"
    loop: "{{ iris_instances }}"
    loop_control:
      label: "{{ item.name }}"
  ### IRIS INSTALL:
  - name: "Install Iris"
    command:
      cmd: "./irisinstall_silent"
      chdir: "/install/{{ item.name }}/{{ item.binary_file }}"
    environment:
      ISC_PACKAGE_INSTANCENAME: "{{ item.name }}"
      ISC_PACKAGE_INSTALLDIR: "{{ item.install_dir }}"
      ISC_PACKAGE_PLATFORM: "{{ platform }}"
      ISC_PACKAGE_UNICODE: "Y"
      ISC_PACKAGE_INITIAL_SECURITY: "Normal"
      ISC_PACKAGE_MGRUSER: "{{ mgr_user }}"
      ISC_PACKAGE_MGRGROUP: "{{ mgr_group }}"
      ISC_PACKAGE_USER_PASSWORD: "{{ iris_user_passwd }}"
      ISC_PACKAGE_CSPSYSTEM_PASSWORD: "{{ iris_user_passwd }}"
      ISC_PACKAGE_IRISUSER: "{{ iris_user }}"
      ISC_PACKAGE_IRISGROUP: "{{ iris_group }}"
      ISC_PACKAGE_SUPERSERVER_PORT: "{{ item.superserver_port }}"
      ISC_PACKAGE_WEBSERVER_PORT: "{{ item.webserver_port }}"
      ISC_PACKAGE_CLIENT_COMPONENTS: "standard_install"
      ISC_PACKAGE_STARTIRIS: "N"
    loop: "{{ iris_instances }}"
    loop_control:
      label: "{{ item.name }}"
  - name: "Remove installers"
    file:
      path: "/install/{{ item.name }}"
      state: absent
    loop: "{{ iris_instances }}"
    loop_control:
      label: "{{ item.name }}"
  ### IRIS CUSTOMIZATIONS:
  - name: "Change iris.cpf"
    lineinfile:
      path: "{{ item[0].install_dir }}/iris.cpf"
      regexp: "{{ item[1].from }}"
      line: "{{ item[1].to }}"
      backup: yes
    with_nested:
      - "{{ iris_instances }}"
      - [ { from: "^TerminalPrompt=.*", to: "TerminalPrompt=8,3,2" },
          { from: "^FreezeOnError=0", to: "FreezeOnError=1" },
          { from: "^AutoParallel=.*", to: "AutoParallel=0" },
          { from: "^FastDistinct=.*", to: "FastDistinct=0" },
          { from: "^LockThreshold=.*", to: "LockThreshold=10000" },
          { from: "^EnsembleAutoStart=.*", to: "EnsembleAutoStart=1" },
          { from: "^MaxIRISTempSizeAtStart=.*", to: "MaxIRISTempSizeAtStart=300" } ]
    loop_control:
      label: "{{ item[0].name }}: {{ item[1].to }}"
  - name: "Change Journal Current Dir"
    lineinfile:
      path: "{{ item.install_dir }}/iris.cpf"
      regexp: "^CurrentDirectory=.*"
      line: "CurrentDirectory={{ item.jrnpri_dir }}"
      backup: yes
    loop: "{{ iris_instances }}"
    loop_control:
      label: "{{ item.name }}"
  - name: "Change Journal Alternate Dir"
    lineinfile:
      path: "{{ item.install_dir }}/iris.cpf"
      regexp: "^AlternateDirectory=.*"
      line: "AlternateDirectory={{ item.jrnsec_dir }}"
      backup: yes
    loop: "{{ iris_instances }}"
    loop_control:
      label: "{{ item.name }}"
  - name: "Change Journal Prefix name"
    lineinfile:
      path: "{{ item.install_dir }}/iris.cpf"
      regexp: "^JournalFilePrefix=.*"
      line: "JournalFilePrefix={{ item.name }}_"
      backup: yes
    loop: "{{ iris_instances }}"
    loop_control:
      label: "{{ item.name }}"
  - name: "Change Globals memory"
    lineinfile:
      path: "{{ item.install_dir }}/iris.cpf"
      regexp: "^globals=.*"
      line: "globals=0,0,{{ item.config_globals }},0,0,0"
      backup: yes
    loop: "{{ iris_instances }}"
    loop_control:
      label: "{{ item.name }}"
  - name: "Change errlog memory"
    lineinfile:
      path: "{{ item.install_dir }}/iris.cpf"
      regexp: "^errlog=.*"
      line: "errlog={{ item.config_errlog }}"
      backup: yes
    loop: "{{ iris_instances }}"
    loop_control:
      label: "{{ item.name }}"
  - name: "Change routines memory"
    lineinfile:
      path: "{{ item.install_dir }}/iris.cpf"
      regexp: "^routines=.*"
      line: "routines={{ item.config_routines }}"
      backup: yes
    loop: "{{ iris_instances }}"
    loop_control:
      label: "{{ item.name }}"
  - name: "Change gmheap memory"
    lineinfile:
      path: "{{ item.install_dir }}/iris.cpf"
      regexp: "^gmheap=.*"
      line: "gmheap={{ item.config_gmheap }}"
      backup: yes
    loop: "{{ iris_instances }}"
    loop_control:
      label: "{{ item.name }}"
  - name: "Change locksiz memory"
    lineinfile:
      path: "{{ item.install_dir }}/iris.cpf"
      regexp: "^locksiz=.*"
      line: "locksiz={{ item.config_locksiz }}"
      backup: yes
    loop: "{{ iris_instances }}"
    loop_control:
      label: "{{ item.name }}"
  ### START IRIS:
  - name: "Start Iris"
    command: "iris start {{ item.name }}"
    loop: "{{ iris_instances }}"
    loop_control:
      label: "{{ item.name }}"
...

As you can see, there are multiple tasks in this playbook—most of them self-explanatory by name. The comments indicate pre-install tasks, installation, and post-install customizations. After executing this playbook, you will have a new IRIS instance installed on the target system, customized with memory and other settings.
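
If you want to validate the playbook before touching any server, you can run a syntax check and, optionally, a dry run in check mode. Note that in check mode the command-based tasks (including the installer itself) are skipped, so it is mainly useful for catching typos and unreachable hosts:

[ansible@auto01 ansible]$ ansible-playbook -i inventory/servers.yml --syntax-check playbooks/install_iris.yml
[ansible@auto01 ansible]$ ansible-playbook -K --ask-vault-pass -i inventory/servers.yml --check playbooks/install_iris.yml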

 

Run the installation!

After configuring the inventory, vault, and playbooks, you’re ready to execute the IRIS installation using Ansible.
To do so, run the following command:

[ansible@auto01 ansible]$ ansible-playbook -K --ask-vault-pass -i inventory/servers.yml playbooks/install_iris.yml
BECOME password: 
Vault password: 

PLAY [all] **************************************************************************************************************************************************
. . .

Once the playbook execution finishes, you’ll receive a summary of task statuses where you can verify that everything completed successfully.
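
If your inventory grows to several servers and you only want to install on one of them, you can restrict the run with --limit, for example:

[ansible@auto01 ansible]$ ansible-playbook -K --ask-vault-pass -i inventory/servers.yml --limit test01.mydomain playbooks/install_iris.yml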

And that’s it — you’ve just installed IRIS using Ansible! 😁

Article · Oct 9 · 6 min read

Writing a REST API service for exporting the generated patient data in .csv

Hi,

 

It's me again 😁. Recently I have been working on generating some fake patient data for testing purposes with the help of ChatGPT, using Python, and at the same time I would like to share my learning curve. 😑

First of all, building a custom REST API service is easy: you just extend %CSP.REST.

Creating a REST Service Manually

Let's start! 😂

1. Create a class datagen.restservice which extends %CSP.REST

Class datagen.restservice Extends %CSP.REST
{
Parameter CONTENTTYPE = "application/json";
}

 

2. Add a function genpatientcsv() to generate the patient data and package it into a CSV string

Class datagen.restservice Extends %CSP.REST
{
Parameter CONTENTTYPE = "application/json";
ClassMethod genpatientcsv() As %String [ Language = python ]
{
    # w ##class(datagen.restservice).genpatientcsv()
    # python.exe -m pip install faker
    # python.exe -m pip install pandas

    from faker import Faker
    import random
    import pandas as pd
    from io import StringIO

    # Initialize Faker
    fake = Faker()

    def generate_patient(patient_id):
        return {
            "PatientID": patient_id,
            "Name": fake.name(),
            "Gender": random.choice(["Male", "Female"]),
            "DOB": fake.date_of_birth(minimum_age=0, maximum_age=100).strftime("%Y-%m-%d"),
            "City": fake.city(),
            "Phone": fake.phone_number(),
            "Email": fake.email(),
            "BloodType": random.choice(["A+", "A-", "B+", "B-", "AB+", "AB-", "O+", "O-"]),
            "Diagnosis": random.choice(["Hypertension", "Diabetes", "Asthma", "Healthy", "Flu"]),
            "Height_cm": round(random.uniform(140, 200), 1),
            "Weight_kg": round(random.uniform(40, 120), 1),
        }

    # Generate 10 patients
    patients = [generate_patient(i) for i in range(1, 11)]

    # Convert to DataFrame
    df = pd.DataFrame(patients)

    # Convert to CSV string (without saving to file)
    csv_buffer = StringIO()
    df.to_csv(csv_buffer, index=False)
    csv_string = csv_buffer.getvalue()

    return csv_string
}
}

You can test the function in the terminal by typing:

w ##class(datagen.restservice).genpatientcsv()

3. Add a function GetMyDataCSV() that returns the generated CSV string as a downloadable CSV file through the REST API service. This can be achieved by:
   3.1. calling the patient data generation function to get the CSV string
   3.2. setting %response.ContentType = "text/csv"
   3.3. setting the "Content-Disposition" header value to "attachment; filename=mydata.csv"
   3.4. writing the generated CSV string as output

Remember to pip install the related Python libraries (faker and pandas).

Class datagen.restservice Extends %CSP.REST
{
Parameter CONTENTTYPE = "application/json";
ClassMethod GetMyDataCSV() As %Status
{
    // Build CSV string
    Set tCSVString = ##class(datagen.restservice).genpatientcsv()

    //Set headers and output CSV
    Set %response.ContentType = "text/csv"
    Do %response.SetHeader("Content-Disposition","attachment; filename=mydata.csv")
    
    // Output the data
    W tCSVString

    Quit $$$OK
}

ClassMethod genpatientcsv() As %String [ Language = python ]
{
    # w ##class(datagen.restservice).genpatientcsv()
    # python.exe -m pip install faker
    # python.exe -m pip install pandas

    from faker import Faker
    import random
    import pandas as pd
    from io import StringIO

    # Initialize Faker
    fake = Faker()

    def generate_patient(patient_id):
        return {
            "PatientID": patient_id,
            "Name": fake.name(),
            "Gender": random.choice(["Male", "Female"]),
            "DOB": fake.date_of_birth(minimum_age=0, maximum_age=100).strftime("%Y-%m-%d"),
            "City": fake.city(),
            "Phone": fake.phone_number(),
            "Email": fake.email(),
            "BloodType": random.choice(["A+", "A-", "B+", "B-", "AB+", "AB-", "O+", "O-"]),
            "Diagnosis": random.choice(["Hypertension", "Diabetes", "Asthma", "Healthy", "Flu"]),
            "Height_cm": round(random.uniform(140, 200), 1),
            "Weight_kg": round(random.uniform(40, 120), 1),
        }

    # Generate 10 patients
    patients = [generate_patient(i) for i in range(1, 11)]

    # Convert to DataFrame
    df = pd.DataFrame(patients)

    # Convert to CSV string (without saving to file)
    csv_buffer = StringIO()
    df.to_csv(csv_buffer, index=False)
    csv_string = csv_buffer.getvalue()

    return csv_string
}
}

4. Add the route to this function and compile the class

Class datagen.restservice Extends %CSP.REST
{
Parameter CONTENTTYPE = "application/json";
XData UrlMap [ XMLNamespace = "http://www.intersystems.com/urlmap" ]
{
<Routes>
        <Route Url="/export/patientdata" Method="GET" Call="GetMyDataCSV"/>
</Routes>
}

ClassMethod GetMyDataCSV() As %Status
{
    // Build CSV string
    Set tCSVString = ##class(datagen.restservice).genpatientcsv()

    //Set headers and output CSV
    Set %response.ContentType = "text/csv"
    Do %response.SetHeader("Content-Disposition","attachment; filename=mydata.csv")
    
    // Output the data
    W tCSVString

    Quit $$$OK
}

ClassMethod genpatientcsv() As %String [ Language = python ]
{
    # w ##class(datagen.restservice).genpatientcsv()
    # python.exe -m pip install faker
    # python.exe -m pip install pandas

    from faker import Faker
    import random
    import pandas as pd
    from io import StringIO

    # Initialize Faker
    fake = Faker()

    def generate_patient(patient_id):
        return {
            "PatientID": patient_id,
            "Name": fake.name(),
            "Gender": random.choice(["Male", "Female"]),
            "DOB": fake.date_of_birth(minimum_age=0, maximum_age=100).strftime("%Y-%m-%d"),
            "City": fake.city(),
            "Phone": fake.phone_number(),
            "Email": fake.email(),
            "BloodType": random.choice(["A+", "A-", "B+", "B-", "AB+", "AB-", "O+", "O-"]),
            "Diagnosis": random.choice(["Hypertension", "Diabetes", "Asthma", "Healthy", "Flu"]),
            "Height_cm": round(random.uniform(140, 200), 1),
            "Weight_kg": round(random.uniform(40, 120), 1),
        }

    # Generate 10 patients
    patients = [generate_patient(i) for i in range(1, 11)]

    # Convert to DataFrame
    df = pd.DataFrame(patients)

    # Convert to CSV string (without saving to file)
    csv_buffer = StringIO()
    df.to_csv(csv_buffer, index=False)
    csv_string = csv_buffer.getvalue()

    return csv_string
}
}

 

 

OK, now our code is ready. 😁 The next thing is to add the REST service to a web application.

Enter your Path, Namespace, and REST service class name, and then click Save.

Assign the proper application role to this web application (because I am lazy, I simply assigned %All for testing 🤐).

 

OK, everything is ready!! 😁 Let's test the REST API!!! 😂

Enter the following path in a browser:

http://localhost/irishealth/csp/mpapp/export/patientdata

It triggers a file download; the file name is mydata.csv 😗
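
Alternatively, assuming the same web application path, you can fetch the file from the command line with curl:

curl -o mydata.csv http://localhost/irishealth/csp/mpapp/export/patientdata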

Let's check the file 😊

 

Yeah!!! It works well!! 😁😁

Thank you so much for reading. 😉

Article · Oct 9 · 4 min read

Expand ObjectScript's ability to process YAML

The ObjectScript language has excellent JSON support through classes like %DynamicObject and %JSON.Adaptor. That support followed the JSON format's immense popularity, which ended the earlier dominance of XML: JSON made data representation less verbose and more readable for the humans who had to interpret it. To reduce verbosity and increase readability even further, the YAML format was created. Thanks to that readability and minimal verbosity, YAML quickly became the most popular format for representing configurations and parameterizations. XML is now rarely used for configuration, and with YAML's rise JSON is gradually being confined to data exchange rather than configurations, parameterizations, and metadata representations; much of that work is now done with YAML. Therefore, the primary language of InterSystems technologies needs broad support for YAML processing, at the same level it has for JSON and XML. For this reason, I've released a new package to make ObjectScript a powerful YAML processor. The package name is yaml-adaptor.

Let's start by installing the package

1. If you are using IPM, open the IRIS Terminal and call:

USER>zpm "install yaml-adaptor"

2. If you are using Docker, clone or git pull the yaml-adaptor repo into any local directory:

$ git clone https://github.com/yurimarx/yaml-adaptor.git

3. Open the terminal in this directory and run:

$ docker-compose build

4. Run the IRIS container with your project:

$ docker-compose up -d

Why use this package?

With this package, you'll be able to easily interoperate, read, write, and transform YAML to DynamicObjects, JSON, and XML bidirectionally. This package allows you to read and generate data, configurations, and parameterizations in the most popular market formats dynamically, with little code, high performance, and in real time.

The package in action!

It is very simple to use this package. The features are:

1. Convert from YAML string to JSON string

ClassMethod TestYamlToJson() As %Status
{
    Set sc = $$$OK
    
    set yamlContent = ""_$CHAR(10)_
        "user:"_$CHAR(10)_
        "    name: 'Jane Doe'"_$CHAR(10)_
        "    age: 30"_$CHAR(10)_
        "    roles:"_$CHAR(10)_
        "    - 'admin'"_$CHAR(10)_
        "    - 'editor'"_$CHAR(10)_
        "database:"_$CHAR(10)_
        "    host: 'localhost'"_$CHAR(10)_
        "    port: 5432"_$CHAR(10)_
        ""

    Do ##class(dc.yamladapter.YamlUtil).yamlToJson(yamlContent, .jsonContent)
    Set jsonObj = {}.%FromJSON(jsonContent)
    Write jsonObj.%ToJSON()

    Return sc
}

2. Generate a JSON file from a YAML file

ClassMethod TestYamlFileToJsonFile() As %Status
{

    Set sc = $$$OK

    Set yamlFile = "/tmp/samples/sample.yaml"
    Set jsonFile = "/tmp/samples/sample_result.json"

    Write ##class(dc.yamladapter.YamlUtil).yamlFileToJsonFile(yamlFile,jsonFile)
    

    Return sc
}

3. Convert from JSON string to YAML string

ClassMethod TestJsonToYaml() As %Status
{
    Set sc = $$$OK
    
    set jsonContent = "{""user"":{""name"":""Jane Doe"",""age"":30,""roles"":[""admin"",""editor""]},""database"":{""host"":""localhost"",""port"":5432}}"

    Do ##class(dc.yamladapter.YamlUtil).jsonToYaml(jsonContent, .yamlContent)
    Write yamlContent

    Return sc
}

4. Generate a YAML file from a JSON file

ClassMethod TestJsonFileToYamlFile() As %Status
{

    Set sc = $$$OK

    Set jsonFile = "/tmp/samples/sample.json"
    Set yamlFile = "/tmp/samples/sample_result.yaml"

    Write ##class(dc.yamladapter.YamlUtil).jsonFileToYamlFile(jsonFile, yamlFile)
    

    Return sc
}

5. Load a dynamic object from YAML string or YAML files

ClassMethod TestYamlFileToDynamicObject() As %Status
{
    Set sc = $$$OK

    Set yamlFile = "/tmp/samples/sample.yaml"
    
    Set dynamicYaml = ##class(YamlAdaptor).CreateFromFile(yamlFile)

    Write "Title: "_dynamicYaml.title, !
    Write "Version: "_dynamicYaml.version, !

    Return sc
}
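
The file-based examples assume that /tmp/samples/sample.yaml already exists. Its content is not shown here, but a minimal, hypothetical file consistent with the title and version properties read above could be:

title: "Sample document"
version: "1.0"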

6. Generate YAML from dynamic objects

ClassMethod TestDynamicObjectToYaml() As %Status
{
    Set sc = $$$OK

    Set dynaObj = {}
    Set dynaObj.project = "Project A"
    Set dynaObj.version = "1.0"
    
    Set yamlContent = ##class(YamlAdaptor).CreateYamlFromDynamicObject(dynaObj)

    Write yamlContent

    Return sc
}

7. Generate a YAML file from an XML file

ClassMethod TestXmlFileToYamlFile() As %Status
{

    Set sc = $$$OK

    Set xmlFile = "/tmp/samples/sample.xml"
    Set yamlFile = "/tmp/samples/sample_xml_result.yaml"

    Write ##class(dc.yamladapter.YamlUtil).xmlFileToYamlFile(xmlFile, yamlFile)
    

    Return sc
}

8. Generate an XML file from a YAML file

ClassMethod TestYamlFileToXmlFile() As %Status
{

    Set sc = $$$OK

    Set yamlFile = "/tmp/samples/sample.yaml"
    Set xmlFile = "/tmp/samples/sample_result.xml"

    Write ##class(dc.yamladapter.YamlUtil).yamlFileToXmlFile(yamlFile, "sample", xmlFile)
    

    Return sc
}
Article · Oct 9 · 1 min read

A tutorial on building a RAG AI chatbot that uses IRIS vector search to give users up-to-date, accurate responses

Hello, developers!

In this article, I would like to introduce a new tutorial that has been added to the tutorials on the Developer Hub: RAG using InterSystems IRIS Vector Search. (No setup is required; you can try it with nothing but a browser!)

In this tutorial, you can experience how vector search and Retrieval Augmented Generation (RAG) improve the accuracy of generative AI applications.

Specifically, you will use the provided sample code to build a knowledge base for a generative AI chatbot using the vector search features of InterSystems IRIS.

You will also run a chatbot built with Streamlit and see how the generative AI's answers change as you add information to the knowledge base.

No account creation or login is required; you can get started just by clicking the button 👍

The link to the tutorial can also be reached from the Developer Community Resources page!

Please give it a try!
