
Article
· Oct 10 · 2 min read

Why do I still see old messages after running the purge task?

To manage the accumulation of production data, InterSystems IRIS lets users control database size by periodically purging data. This purge can apply to messages, logs, business processes, and managed alerts.

Please check the documentation for more details on the settings of the purge task:
https://docs.intersystems.com/irislatest/csp/docbook/DocBook.UI.Page.cls?KEY=EGMG_purge#EGMG_purge_settings

An issue that many users run into is still finding old messages after running the message purge task. For example, a user has a purge task for messages with NumberOfDaysToKeep=45. This means that messages generated in the last 45 days are retained, and older messages should be deleted. After the purge task completes successfully, most messages older than the 45-day retention period are deleted, but some remain. Why aren't these messages purged?

In this article, I will discuss the most commonly seen root causes and how to handle the situation. I will assume that the user has a purge task using Ens.Util.Tasks.Purge. 

1) Check the purge task
First, we want to check the purge task's settings. We want to confirm the NumberOfDaysToKeep setting and make sure that the task is set to purge messages in the namespace we are examining, but the setting we want to check the most is KeepIntegrity.

If KeepIntegrity is enabled, the purge task will only purge complete sessions. By definition, a complete session contains only messages with a status of Complete, Error, Aborted, or Discarded. Notice that if any message in the session has a status other than one of these four (e.g., the message is Queued or Suspended), none of the messages in the session will be purged.

2) Check the message status of all messages in that session
Knowing that KeepIntegrity can make the purge task skip some messages in incomplete sessions, we can now check if this is the current issue by checking the status of the messages in the session. In the Message Viewer, search for messages that should have been deleted based on the NumberOfDaysToKeep criteria by applying the time filter. Check the status of all of the messages in one of these sessions by using the Session ID. Are there any messages with a status other than Complete, Error, Aborted, or Discarded?

In addition to the Message Viewer, you can verify this information using SQL by querying the Ens.MessageHeader table.
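For example, a query along these lines lists non-final messages in sessions that are older than the retention period (a sketch: the column names are standard Ens.MessageHeader columns, but the exact status display values and date functions may vary by version, so verify against your instance):

```sql
-- Find messages in old sessions that are NOT in a final status.
-- Any such message keeps its whole session from being purged
-- when KeepIntegrity is enabled.
SELECT SessionId, ID, TimeCreated, Status, SourceConfigName
FROM Ens.MessageHeader
WHERE TimeCreated < DATEADD('dd', -45, GETDATE())
  AND Status NOT IN ('Completed', 'Error', 'Aborted', 'Discarded')
ORDER BY SessionId, TimeCreated
```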

3) Manage the incomplete sessions
To resolve the issue, you need to change the status of these messages so they can be purged. For example, some messages might still be in a queue and need to be aborted.

Another way to resolve it is to create a purge task with KeepIntegrity disabled to purge the messages even if the sessions are incomplete. You should choose an appropriate NumberOfDaysToKeep.

Announcement
· Oct 10

Introducing the new Developer Community AI Chat interface!

Hi Community,

We're excited to announce a major update to the Developer Community AI Chat: it now features a completely new interface!

With this update, your experience will be smoother and more intuitive:

  • Easier navigation between your chats
  • Clearer history and answers at a glance
  • Seamless context retention for more natural conversations

As before, the experience is powered by InterSystems IRIS Vector Search, ensuring that your interactions remain fast, context-aware, and highly relevant.

Both old and new features are available:

  1. Hide the side panel
  2. Start a new chat
  3. View the chat history
  4. Share the chat (in two places)
  5. Copy the generated text
  6. Ask the community: click to create a new post on the Developer Community
  7. Choose the sources

We'd love your feedback on the accuracy of the answers, so don't forget to give them a thumbs up or down.

Try the new interface today and see how it improves your experience! Don't forget to share your feedback on the interface and user experience (UI/UX) in the comments below.

Announcement
· Oct 10

[Video] MayVilleHop after 1 year of use: territorial coordination & population health in Mayenne

Hi Community!

Enjoy the new video on the InterSystems France YouTube channel:

📺 MayVilleHop after 1 year of use: territorial coordination & population health in Mayenne

Feedback from the GHT de la Mayenne & du Haut Anjou on the use of the MayVilleHop platform: city-hospital coordination, implementation of care pathways, and population-level responsibility for better patient care across the territory. An analysis of the results after one year of use.

Speakers:
🗣  Vincent Errera, Deputy Director of the GHT de la Mayenne et du Haut-Anjou
🗣  Émilie Boudonnet Peloin, Care Pathway Nurse (IDE Parcours), GHT de la Mayenne et du Haut-Anjou
🗣  Nicolas Eiferman, General Manager, InterSystems France & Benelux
🗣  Dr Hervé Rivière, Medical Director, InterSystems France

Subscribe to our YouTube channel for more videos!

Article
· Oct 10 · 9 min read

IRIS install automation using Ansible

Deploying new IRIS instances can be a time-consuming task, especially when setting up multiple environments with mirrored configurations.

I’ve encountered this issue many times and want to share my experience and recommendations for using Ansible to streamline the IRIS installation process. My approach also includes handling additional tasks typically performed before and after installing IRIS.

This guide assumes you have a basic understanding of how Ansible works, so I won’t go into much detail on its fundamentals. However, if you have questions about anything mentioned here, feel free to ask in the comments below.

The examples provided in this guide were tested using Ansible 3.6 on a Red Hat 8 server, with IRIS 2023.1.1 and Red Hat 8 as the client environment. Other versions of Ansible, Red Hat (or other UNIX flavors), and IRIS may also work, but your mileage may vary.

 

Ansible install

The Ansible server must be a Linux distribution. We use Red Hat 8 in this article, but other Linux distros and versions should work as well.

To install the Ansible packages, you must first install EPEL:

[ansible@auto01 ansible]$ yum install https://dl.fedoraproject.org/pub/epel/epel-release-latest-8.noarch.rpm

Then install Ansible:

[ansible@auto01 ansible]$ yum install ansible

In addition to the packages, Ansible requires SSH access to remote servers. I recommend creating an SSH key pair, which is more secure than using traditional passwords. Also, the user used to connect to remote servers must have administrative privileges (i.e., be part of the wheel group).
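For instance, a key pair can be created and distributed like this (a sketch: the key path, user name, and host name are assumptions, so adjust them to your environment):

```shell
# Generate an ed25519 key pair with no passphrase (path is illustrative)
ssh-keygen -q -t ed25519 -N '' -f /tmp/ansible_id_ed25519

# Then copy the public key to each managed host, e.g.:
#   ssh-copy-id -i /tmp/ansible_id_ed25519.pub ansible@test01.mydomain
```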

 

Files and folders

To maintain an organized structure, I recommend the following files and folders under the ansible directory:

[ansible@auto01 ansible]$ ls -l
total 4
-rw-r--r--. 1 ansible ansible 247 Dec  5 00:57 ansible.cfg
drwxrwxr-x. 2 ansible ansible   6 Dec  5 00:56 files
drwxrwxr-x. 2 ansible ansible   6 Dec  5 00:56 inventory
drwxrwxr-x. 2 ansible ansible   6 Dec  5 00:56 library
drwxrwxr-x. 2 ansible ansible   6 Dec  5 00:56 playbooks
drwxrwxr-x. 2 ansible ansible   6 Dec  5 00:56 templates
drwxrwxr-x. 2 ansible ansible   6 Dec  5 00:56 vars
drwxrwxr-x. 2 ansible ansible   6 Dec  5 00:56 vault
File/Folder: Description
ansible.cfg: Ansible configuration file. Contains directives on how Ansible behaves.
files: Extra files needed by playbooks, such as the IRIS install tar.gz file.
inventory: Host inventory files. You can have one large inventory file or multiple smaller ones; splitting the inventory requires more effort when running playbooks on multiple hosts.
library: Extra Ansible library files. Not required for these examples, but useful for future extensions.
playbooks: All the playbooks that you develop, including the IRIS installation playbook discussed below.
templates: Template files used by playbooks. These are transferred to servers and instantiated with the correct parameters.
vars: Variables available to all playbooks.
vault: Sensitive variables accessible only through the ansible-vault command. Useful for handling passwords.
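As an illustration, a minimal ansible.cfg matching this layout could look like the following (the values are assumptions, so adjust the user and key path to your setup):

```ini
[defaults]
inventory         = ./inventory
library           = ./library
remote_user       = ansible
private_key_file  = ~/.ssh/id_ed25519

[privilege_escalation]
become        = True
become_method = sudo
```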

 

After setting up this folder structure, copy the IRIS installer and IRIS license key into the files folder. It should look like this:

[ansible@auto01 ansible]$ ls -l files/
total 759976
-rw-rw-r--. 1 ansible ansible 778207913 Dec  5 14:32 IRISHealth-2023.1.1.380.0.22870-lnxrh8x64.tar.gz
-rw-rw-r--. 1 ansible ansible      1160 Sep  5 19:13 iris.key

 

The inventory

To run playbooks in Ansible, you must define your inventory of servers. There are several ways to do this, each with its own advantages. In this article, we’ll use a single file to define all servers.

The servers.yml file will contain the entire inventory, listing each server along with the variables required for the IRIS installation. Here’s an example:

[ansible@auto01 ansible]$ cat inventory/servers.yml 
---
all:
  hosts:
    test01.mydomain:
      iris_user: irisusr
      iris_group: irisgrp
      mgr_user: irisown
      mgr_group: irismgr
      platform: lnxrh8x64
      iris_cmd: iris
      iris_instances:
        - name: TEST01
          superserver_port: 51773
          webserver_port: 52773
          binary_file: IRISHealth-2023.1.1.380.0.22870-lnxrh8x64
          key_file: iris.key
          install_dir: /test/iris
          jrnpri_dir: /test/jrnpri
          jrnsec_dir: /test/jrnsec
          config_globals: 16384
          config_errlog: 10000
          config_routines: "0,128,0,128,0,1024"
          config_gmheap: 1048576
          config_locksiz: 128057344

 

The vault

To keep passwords secure, create a vault file containing the passwords for the IRIS SuperUser and CSPSystem accounts.

Use the following command to edit the default vault file:

[ansible@auto01 ansible]$ ansible-vault edit vault/defaults.yml
---
# Default passwords
iris_user_passwd: "Ch4ngeTh!s"

 

The playbook

To perform an IRIS installation, several tasks must be executed on the target server. These tasks are grouped and ordered in a file called a playbook.
A playbook is essentially a list of tasks that are executed sequentially on the remote hosts.

Below is the playbook I developed to install IRIS:

[ansible@auto01 ansible]$ cat playbooks/install_iris.yml
#
# Playbook to install Iris
#
- hosts: all
  become: yes
  gather_facts: no
  tasks:
  - name: "Load default passwords"
    include_vars: "../vault/defaults.yml"
  ### PRE-INSTALL TASKS:
  - name: "Install required packets"
    yum:
      name: "{{ item }}"
      state: latest
    loop:
      - "httpd"
      - "java-1.8.0-openjdk"
      - "mod_auth_mellon"
      - "mod_ssl"
  - name: "Create iris group"
    group:
      name: "{{ iris_group }}"
      gid: 5005
  - name: "Create iris mgr group"
    group:
      name: "{{ mgr_group }}"
      gid: 5006
  - name: "Create iris owner user"
    user:
      name: "{{ mgr_user }}"
      uid: 5006
      group: "{{ iris_group }}"
      groups: "{{ mgr_group }}"
  - name: "Create iris user"
    user:
      name: "{{ iris_user }}"
      uid: 5005
      group: "{{ iris_group }}"
  - name: "Create mgr folder"
    file:
      path: "{{ item.install_dir }}/mgr"
      state: directory
      owner: "{{ iris_user }}"
      group: "{{ iris_group }}"
      mode: 0775
    loop: "{{ iris_instances }}"
    loop_control:
      label: "{{ item.name }}"
  - name: "Copy license key"
    copy:
      src: "../files/{{ item.key_file }}"
      dest: "{{ item.install_dir }}/mgr/iris.key"
      owner: "{{ iris_user }}"
      group: "{{ iris_group }}"
      backup: yes
    loop: "{{ iris_instances }}"
    loop_control:
      label: "{{ item.name }}"
  - name: "Create /install folder"
    file:
      path: "/install"
      state: directory
      mode: 0777
  - name: "Create Instances install folders"
    file:
      path: "/install/{{ item.name }}"
      state: directory
      mode: 0777
    loop: "{{ iris_instances }}"
    loop_control:
      label: "{{ item.name }}"
  - name: "Copy IRIS installer"
    copy:
      src: "../files/{{ item.binary_file }}.tar.gz"
      dest: "/install/{{ item.name }}/"
    loop: "{{ iris_instances }}"
    loop_control:
      label: "{{ item.name }}"
  - name: "Untar IRIS installer"
    command:
      cmd: "tar -xzf /install/{{ item.name }}/{{ item.binary_file }}.tar.gz"
      chdir: "/install/{{ item.name }}/"
    loop: "{{ iris_instances }}"
    loop_control:
      label: "{{ item.name }}"
  ### IRIS INSTALL:
  - name: "Install Iris"
    command:
      cmd: "./irisinstall_silent"
      chdir: "/install/{{ item.name }}/{{ item.binary_file }}"
    environment:
      ISC_PACKAGE_INSTANCENAME: "{{ item.name }}"
      ISC_PACKAGE_INSTALLDIR: "{{ item.install_dir }}"
      ISC_PACKAGE_PLATFORM: "{{ platform }}"
      ISC_PACKAGE_UNICODE: "Y"
      ISC_PACKAGE_INITIAL_SECURITY: "Normal"
      ISC_PACKAGE_MGRUSER: "{{ mgr_user }}"
      ISC_PACKAGE_MGRGROUP: "{{ mgr_group }}"
      ISC_PACKAGE_USER_PASSWORD: "{{ iris_user_passwd }}"
      ISC_PACKAGE_CSPSYSTEM_PASSWORD: "{{ iris_user_passwd }}"
      ISC_PACKAGE_IRISUSER: "{{ iris_user }}"
      ISC_PACKAGE_IRISGROUP: "{{ iris_group }}"
      ISC_PACKAGE_SUPERSERVER_PORT: "{{ item.superserver_port }}"
      ISC_PACKAGE_WEBSERVER_PORT: "{{ item.webserver_port }}"
      ISC_PACKAGE_CLIENT_COMPONENTS: "standard_install"
      ISC_PACKAGE_STARTIRIS: "N"
    loop: "{{ iris_instances }}"
    loop_control:
      label: "{{ item.name }}"
  - name: "Remove installers"
    file:
      path: "/install/{{ item.name }}"
      state: absent
    loop: "{{ iris_instances }}"
    loop_control:
      label: "{{ item.name }}"
  ### IRIS CUSTOMIZATIONS:
  - name: "Change iris.cpf"
    lineinfile:
      path: "{{ item[0].install_dir }}/iris.cpf"
      regexp: "{{ item[1].from }}"
      line: "{{ item[1].to }}"
      backup: yes
    with_nested:
      - "{{ iris_instances }}"
      - [ { from: "^TerminalPrompt=.*", to: "TerminalPrompt=8,3,2" },
          { from: "^FreezeOnError=0", to: "FreezeOnError=1" },
          { from: "^AutoParallel=.*", to: "AutoParallel=0" },
          { from: "^FastDistinct=.*", to: "FastDistinct=0" },
          { from: "^LockThreshold=.*", to: "LockThreshold=10000" },
          { from: "^EnsembleAutoStart=.*", to: "EnsembleAutoStart=1" },
          { from: "^MaxIRISTempSizeAtStart=.*", to: "MaxIRISTempSizeAtStart=300" } ]
    loop_control:
      label: "{{ item[0].name }}: {{ item[1].to }}"
  - name: "Change Journal Current Dir"
    lineinfile:
      path: "{{ item.install_dir }}/iris.cpf"
      regexp: "^CurrentDirectory=.*"
      line: "CurrentDirectory={{ item.jrnpri_dir }}"
      backup: yes
    loop: "{{ iris_instances }}"
    loop_control:
      label: "{{ item.name }}"
  - name: "Change Journal Alternate Dir"
    lineinfile:
      path: "{{ item.install_dir }}/iris.cpf"
      regexp: "^AlternateDirectory=.*"
      line: "AlternateDirectory={{ item.jrnsec_dir }}"
      backup: yes
    loop: "{{ iris_instances }}"
    loop_control:
      label: "{{ item.name }}"
  - name: "Change Journal Prefix name"
    lineinfile:
      path: "{{ item.install_dir }}/iris.cpf"
      regexp: "^JournalFilePrefix=.*"
      line: "JournalFilePrefix={{ item.name }}_"
      backup: yes
    loop: "{{ iris_instances }}"
    loop_control:
      label: "{{ item.name }}"
  - name: "Change Globals memory"
    lineinfile:
      path: "{{ item.install_dir }}/iris.cpf"
      regexp: "^globals=.*"
      line: "globals=0,0,{{ item.config_globals }},0,0,0"
      backup: yes
    loop: "{{ iris_instances }}"
    loop_control:
      label: "{{ item.name }}"
  - name: "Change errlog memory"
    lineinfile:
      path: "{{ item.install_dir }}/iris.cpf"
      regexp: "^errlog=.*"
      line: "errlog={{ item.config_errlog }}"
      backup: yes
    loop: "{{ iris_instances }}"
    loop_control:
      label: "{{ item.name }}"
  - name: "Change routines memory"
    lineinfile:
      path: "{{ item.install_dir }}/iris.cpf"
      regexp: "^routines=.*"
      line: "routines={{ item.config_routines }}"
      backup: yes
    loop: "{{ iris_instances }}"
    loop_control:
      label: "{{ item.name }}"
  - name: "Change gmheap memory"
    lineinfile:
      path: "{{ item.install_dir }}/iris.cpf"
      regexp: "^gmheap=.*"
      line: "gmheap={{ item.config_gmheap }}"
      backup: yes
    loop: "{{ iris_instances }}"
    loop_control:
      label: "{{ item.name }}"
  - name: "Change locksiz memory"
    lineinfile:
      path: "{{ item.install_dir }}/iris.cpf"
      regexp: "^locksiz=.*"
      line: "locksiz={{ item.config_locksiz }}"
      backup: yes
    loop: "{{ iris_instances }}"
    loop_control:
      label: "{{ item.name }}"
  ### START IRIS:
  - name: "Start Iris"
    command: "iris start {{ item.name }}"
    loop: "{{ iris_instances }}"
    loop_control:
      label: "{{ item.name }}"
...

As you can see, there are multiple tasks in this playbook—most of them self-explanatory by name. The comments indicate pre-install tasks, installation, and post-install customizations. After executing this playbook, you will have a new IRIS instance installed on the target system, customized with memory and other settings.
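The lineinfile tasks deserve one note: each one replaces the (last) line matching regexp with line, or appends line if nothing matches. Conceptually, the behavior looks like this hypothetical Python sketch (not Ansible's actual implementation):

```python
import re

def lineinfile(text, regexp, line):
    """Replace the last line matching `regexp` with `line`; append if no match."""
    lines = text.splitlines()
    pattern = re.compile(regexp)
    matches = [i for i, l in enumerate(lines) if pattern.search(l)]
    if matches:
        lines[matches[-1]] = line   # replace the last matching line
    else:
        lines.append(line)          # no match: append at the end
    return "\n".join(lines)

# e.g., bumping the globals buffer setting in a cpf-style text
cpf = "globals=0,0,4096,0,0,0\nlocksiz=1179648"
print(lineinfile(cpf, r"^globals=.*", "globals=0,0,16384,0,0,0"))
```

This is why anchoring the regexp with ^ matters: an unanchored pattern could match (and overwrite) an unintended line elsewhere in iris.cpf.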

 

Run the installation!

After configuring the inventory, vault, and playbooks, you’re ready to execute the IRIS installation using Ansible.
To do so, run the following command:

[ansible@auto01 ansible]$ ansible-playbook -K --ask-vault-pass -i inventory/servers.yml playbooks/install_iris.yml
BECOME password: 
Vault password: 

PLAY [all] **************************************************************************************************************************************************
. . .

Once the playbook execution finishes, you’ll receive a summary of task statuses where you can verify that everything completed successfully.

And that’s it — you’ve just installed IRIS using Ansible! 😁

Article
· Oct 9 · 6 min read

Writing a REST API service for exporting generated patient data as CSV

Hi,

 

It's me again 😁. Recently I have been working on generating fake patient data for testing purposes with the help of ChatGPT, using Python. At the same time, I would like to share my learning journey. 😑

First of all, building a custom REST API service is easy: just extend %CSP.REST.

Creating a REST Service Manually

Let's start! 😂

1. Create a class datagen.restservice which extends  %CSP.REST 

Class datagen.restservice Extends %CSP.REST
{
Parameter CONTENTTYPE = "application/json";
}

 

2. Add a function genpatientcsv() to generate the patient data and package it into a CSV string

Class datagen.restservice Extends %CSP.REST
{
Parameter CONTENTTYPE = "application/json";
ClassMethod genpatientcsv() As %String [ Language = python ]
{
    # w ##class(datagen.restservice).genpatientcsv()
    # python.exe -m pip install faker
    # python.exe -m pip install pandas

    from faker import Faker
    import random
    import pandas as pd
    from io import StringIO

    # Initialize Faker
    fake = Faker()

    def generate_patient(patient_id):
        return {
            "PatientID": patient_id,
            "Name": fake.name(),
            "Gender": random.choice(["Male", "Female"]),
            "DOB": fake.date_of_birth(minimum_age=0, maximum_age=100).strftime("%Y-%m-%d"),
            "City": fake.city(),
            "Phone": fake.phone_number(),
            "Email": fake.email(),
            "BloodType": random.choice(["A+", "A-", "B+", "B-", "AB+", "AB-", "O+", "O-"]),
            "Diagnosis": random.choice(["Hypertension", "Diabetes", "Asthma", "Healthy", "Flu"]),
            "Height_cm": round(random.uniform(140, 200), 1),
            "Weight_kg": round(random.uniform(40, 120), 1),
        }

    # Generate 10 patients
    patients = [generate_patient(i) for i in range(1, 11)]

    # Convert to DataFrame
    df = pd.DataFrame(patients)

    # Convert to CSV string (without saving to file)
    csv_buffer = StringIO()
    df.to_csv(csv_buffer, index=False)
    csv_string = csv_buffer.getvalue()

    return csv_string
}
}

You can test the function in the terminal by typing:

w ##class(datagen.restservice).genpatientcsv()
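If you would like to try the CSV-building idea outside IRIS without the faker and pandas dependencies, here is a hypothetical standard-library-only sketch of the same approach (the name column uses a placeholder instead of faker.name(), and only a few of the columns are shown):

```python
import csv
import io
import random

FIELDS = ["PatientID", "Name", "Gender", "BloodType"]

def gen_patient_csv(n=10, seed=None):
    """Build a CSV string for n fake patients using only the standard library."""
    rng = random.Random(seed)
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=FIELDS)
    writer.writeheader()
    for pid in range(1, n + 1):
        writer.writerow({
            "PatientID": pid,
            "Name": f"Patient {pid}",  # placeholder instead of faker.name()
            "Gender": rng.choice(["Male", "Female"]),
            "BloodType": rng.choice(["A+", "A-", "B+", "B-", "AB+", "AB-", "O+", "O-"]),
        })
    return buf.getvalue()

print(gen_patient_csv(3))
```

Writing to an in-memory StringIO buffer mirrors what the article's pandas version does with df.to_csv(csv_buffer), just without the DataFrame in between.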

3. Add a function GetMyDataCSV() to serve the CSV string as a downloadable file through the REST service. This can be achieved by:
   3.1. calling the patient data generation function to get the CSV string
   3.2. setting %response.ContentType = "text/csv"
   3.3. setting the "Content-Disposition" header to "attachment; filename=mydata.csv"
   3.4. writing the generated CSV string as output

Remember to pip install the required libraries (faker and pandas).

Class datagen.restservice Extends %CSP.REST
{
Parameter CONTENTTYPE = "application/json";
ClassMethod GetMyDataCSV() As %Status
{
    // Build CSV string
    Set tCSVString = ##class(datagen.restservice).genpatientcsv()

    //Set headers and output CSV
    Set %response.ContentType = "text/csv"
    Do %response.SetHeader("Content-Disposition","attachment; filename=mydata.csv")
    
    // Output the data
    W tCSVString

    Quit $$$OK
}

ClassMethod genpatientcsv() As %String [ Language = python ]
{
    # w ##class(datagen.restservice).genpatientcsv()
    # python.exe -m pip install faker
    # python.exe -m pip install pandas

    from faker import Faker
    import random
    import pandas as pd
    from io import StringIO

    # Initialize Faker
    fake = Faker()

    def generate_patient(patient_id):
        return {
            "PatientID": patient_id,
            "Name": fake.name(),
            "Gender": random.choice(["Male", "Female"]),
            "DOB": fake.date_of_birth(minimum_age=0, maximum_age=100).strftime("%Y-%m-%d"),
            "City": fake.city(),
            "Phone": fake.phone_number(),
            "Email": fake.email(),
            "BloodType": random.choice(["A+", "A-", "B+", "B-", "AB+", "AB-", "O+", "O-"]),
            "Diagnosis": random.choice(["Hypertension", "Diabetes", "Asthma", "Healthy", "Flu"]),
            "Height_cm": round(random.uniform(140, 200), 1),
            "Weight_kg": round(random.uniform(40, 120), 1),
        }

    # Generate 10 patients
    patients = [generate_patient(i) for i in range(1, 11)]

    # Convert to DataFrame
    df = pd.DataFrame(patients)

    # Convert to CSV string (without saving to file)
    csv_buffer = StringIO()
    df.to_csv(csv_buffer, index=False)
    csv_string = csv_buffer.getvalue()

    return csv_string
}
}

4. Add the route to this function and compile the class

Class datagen.restservice Extends %CSP.REST
{
Parameter CONTENTTYPE = "application/json";
XData UrlMap [ XMLNamespace = "http://www.intersystems.com/urlmap" ]
{
<Routes>
        <Route Url="/export/patientdata" Method="GET" Call="GetMyDataCSV"/>
</Routes>
}

ClassMethod GetMyDataCSV() As %Status
{
    // Build CSV string
    Set tCSVString = ##class(datagen.restservice).genpatientcsv()

    //Set headers and output CSV
    Set %response.ContentType = "text/csv"
    Do %response.SetHeader("Content-Disposition","attachment; filename=mydata.csv")
    
    // Output the data
    W tCSVString

    Quit $$$OK
}

ClassMethod genpatientcsv() As %String [ Language = python ]
{
    # w ##class(datagen.restservice).genpatientcsv()
    # python.exe -m pip install faker
    # python.exe -m pip install pandas

    from faker import Faker
    import random
    import pandas as pd
    from io import StringIO

    # Initialize Faker
    fake = Faker()

    def generate_patient(patient_id):
        return {
            "PatientID": patient_id,
            "Name": fake.name(),
            "Gender": random.choice(["Male", "Female"]),
            "DOB": fake.date_of_birth(minimum_age=0, maximum_age=100).strftime("%Y-%m-%d"),
            "City": fake.city(),
            "Phone": fake.phone_number(),
            "Email": fake.email(),
            "BloodType": random.choice(["A+", "A-", "B+", "B-", "AB+", "AB-", "O+", "O-"]),
            "Diagnosis": random.choice(["Hypertension", "Diabetes", "Asthma", "Healthy", "Flu"]),
            "Height_cm": round(random.uniform(140, 200), 1),
            "Weight_kg": round(random.uniform(40, 120), 1),
        }

    # Generate 10 patients
    patients = [generate_patient(i) for i in range(1, 11)]

    # Convert to DataFrame
    df = pd.DataFrame(patients)

    # Convert to CSV string (without saving to file)
    csv_buffer = StringIO()
    df.to_csv(csv_buffer, index=False)
    csv_string = csv_buffer.getvalue()

    return csv_string
}
}

 

 

OK, now our code is ready. 😁 The next thing is to add the REST service as a web application.

Enter your path and namespace, set the REST service class as the dispatch class, and then save.

Assign the proper application role to this web application (because I am lazy, I simply assigned %All for testing 🤐).

 

OK, everything is ready!! 😁 Let's test the REST API!!! 😂

Enter the following path in a browser:

http://localhost/irishealth/csp/mpapp/export/patientdata

It triggers a file download; the file name is mydata.csv 😗
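Alternatively, you can fetch it from a command line (the URL matches the example above; add credentials if your web application requires authentication):

```shell
# -O -J saves the file under the name from the Content-Disposition header
curl -O -J "http://localhost/irishealth/csp/mpapp/export/patientdata"
```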

Let's check the file 😊

 

Yeah!!! Works well!! 😁😁

Thank you so much for reading. 😉
