Question
· Mar. 15, 2024

Is the WriteDaemon Alert "severe" warning in the database a problem? Where can I find the relevant help documentation?

The log keeps showing the following "severe"-level error. Is this error anything to worry about?

03/13/24-09:44:12:182 (39059) 2 [SYSTEM MONITOR] WriteDaemon Alert: Write Daemon still on pass 581

Searching the Documentation for the keyword turns up only the following. Where else can I find a more detailed tutorial or explanation?

SYS.History.WriteDaemon — The properties in this class describe the performance of write daemon cycles. The system automatically keeps track of the last 20 write daemon cycles, and the History Monitor stores the data for the cycles that occurred in each interval. Typically, there are multiple cycles within each interval.
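
One way to look at the data this class stores is to query it directly. A minimal ObjectScript sketch, run in the %SYS namespace; it assumes the standard SQL projection of the class as the SYS_History.WriteDaemon table, and that the History Monitor has been configured to collect samples:

// show the most recently recorded write daemon cycles (assumed table name)
set rs = ##class(%SQL.Statement).%ExecDirect(, "SELECT TOP 5 * FROM SYS_History.WriteDaemon")
do rs.%Display()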


There is too little material on Caché; I wanted to buy a book to study from, but couldn't even find one.

Article
· Mar. 15, 2024 · 4 min read

Uncovering Clues by Querying the Interoperability Message tables

When using InterSystems IRIS as an interoperability engine, we all know and love how easy it is to use the Message Viewer to review message traces and see exactly what's going on in your production. When a system is handling millions of messages per day, you may not know exactly where to begin your investigation though.

Over my years supporting IRIS productions, I often find myself investigating things like...

  • What sort of throughput does this workflow have?
  • Where is the bottleneck?
  • What are my most common errors?

One of my favorite places to look for clues is the Message Header table, which stores metadata about every message running through the system. These are the same messages that appear in the Message Viewer and the Visual Traces. 

I've built up a collection of handy SQL queries, and I'd love to share them with you. My examples are mostly from HealthShare or IRIS for Health use cases, but they can be easily adapted for whatever workflow you have...
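
If you prefer a terminal to the SQL shell or the Management Portal's SQL editor, any of the queries below can also be run through %SQL.Statement. A minimal sketch (the columns are standard Ens.MessageHeader fields) that lists the ten most recent headers:

// list the ten most recent message headers in this namespace
set sql = "SELECT TOP 10 ID, SessionId, SourceConfigName, TargetConfigName, MessageBodyClassName, TimeCreated, Status "
set sql = sql_"FROM Ens.MessageHeader ORDER BY TimeCreated DESC"
set rs = ##class(%SQL.Statement).%ExecDirect(, sql)
do rs.%Display()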

-- SQL query to find the # of messages through a component per day
select {fn SUBSTRING(timeprocessed,1,10)} AS day, count(*) MessagesThisDay 
FROM Ens.MessageHeader
where TargetConfigName = 'HS.Hub.Push.Evaluator' 
GROUP BY {fn SUBSTRING(timeprocessed,1,10)}
ORDER BY day ASC
-- SQL query to find long-running messages through particular components
SELECT PReq.SessionID as SessionId, 
  PReq.TimeCreated as pReqTimeCreated, 
  PRes.TimeCreated as pResTimeCreated, 
  {fn TIMESTAMPDIFF(SQL_TSI_SECOND, PReq.TimeCreated,PRes.TimeCreated)} as TimeDelay
FROM (
  SELECT ID, SessionId, TimeCreated
  FROM Ens.MessageHeader
  WHERE MessageBodyClassName = 'HS.Message.PatientSearchRequest'
  AND SourceConfigName = 'HS.Hub.MPI.Manager'
  AND TargetConfigName = 'HUB'
) as PReq
INNER JOIN (
  SELECT ID, SessionId, TimeCreated
  FROM Ens.MessageHeader
  WHERE MessageBodyClassName = 'HS.Message.PatientSearchResponse'
  AND SourceConfigName = 'HS.Hub.MPI.Manager'
  AND TargetConfigName = 'HS.IHE.PIX.Manager.Process'
) as PRes on pReq.SessionId = PRes.SessionId
WHERE {fn TIMESTAMPDIFF(SQL_TSI_SECOND, PReq.TimeCreated,PRes.TimeCreated)} > 1
ORDER BY SessionId DESC
/*-- Query to find the bottleneck message through a particular component
  -- set your threshold for "how long is too long" (e.g. 20 seconds)
  -- look for clusters of messages that took longer than that (e.g. the first cluster started at 3:22:00, then there was a second cluster at 5:15:30)
  -- in each cluster, look at the first message in that cluster (chronologically). That is likely to be the bottleneck message, and all messages after it are victims of its bottleneck
*/
SELECT %NOLOCK req.TargetConfigName, req.MessageBodyClassName, req.SessionId, req.TimeCreated, req.TimeProcessed, {fn TIMESTAMPDIFF(SQL_TSI_SECOND, req.TimeCreated, req.TimeProcessed)} as TimeToProcess
FROM Ens.MessageHeader AS req
WHERE req.TargetConfigName = 'HS.Hub.Management.Operations'
  AND req.TimeCreated BETWEEN '2021-04-21 00:00:00' AND '2021-04-21 11:00:00'
  AND {fn TIMESTAMPDIFF(SQL_TSI_SECOND, req.TimeCreated, req.TimeProcessed)} > 20
/* If you have a particular error that you're investigating, try this one. It scans through the Ensemble Error Log for "Object to Load not found" entries, then returns some key fields from the relevant PatientSearchRequest message */
SELECT l.SessionId, mh.MessageBodyID, mh.TimeCreated, psr.SearchMode, psr.RequestingUser, psr.FirstName, psr.MiddleName, psr.LastName, psr.SSN, psr.Sex, psr.DOB
FROM Ens_Util.Log as l
INNER JOIN Ens.MessageHeader as mh on l.SessionId = mh.SessionId
INNER JOIN HS_Message.PatientSearchRequest as psr on mh.MessageBodyID = psr.ID
WHERE l.Type = 'Error'
AND l.ConfigName = 'HSPI.Server.APIOperation'
AND l.Text like 'ERROR #5809: Object to Load not found%'
AND mh.MessageBodyClassName = 'HS.Message.PatientSearchRequest'
AND mh.SourceConfigName = 'HSPI.Server.APIWebService'
AND mh.TargetConfigName = 'HSPI.Server.APIOperation'
-- Scan the Ensemble Error Log for a particular timeframe. Count up the different types of errors
SELECT substring(text,1,80) as AbbreviatedError, count(*) as NumTheseErrors
FROM Ens_Util.Log
WHERE Type = 'Error'
AND TimeLogged > '2022-03-03 00:00:00' -- when the last batch started
AND TimeLogged < '2022-03-03 16:00:00' -- when we estimate this batch might end
GROUP BY substring(text,1,80)
ORDER BY NumTheseErrors desc
-- Find the Gateway Processing Time for each StreamletRequest / ECRFetchResponse pair
SELECT sr.Gateway,request.sessionid, response.sessionid, request.timecreated AS starttime, response.timecreated AS stoptime, 
  datediff(ms,request.timecreated,response.timecreated) AS ProcessingTime, 
  Avg(datediff(ms,request.timecreated,response.timecreated)) AS AverageProcessingTimeAllGateways
FROM Ens.MessageHeader request
INNER JOIN Ens.MessageHeader AS response ON response.correspondingmessageid = request.id
INNER JOIN HS_Message.StreamletRequest AS sr ON sr.ID = request.MessageBodyId
WHERE request.messagebodyclassname = 'HS.Message.StreamletRequest'
AND response.messagebodyclassname = 'HS.Message.ECRFetchResponse'
InterSystems Official
· Mar. 14, 2024

InterSystems announces General Availability of InterSystems IRIS 2024.1

The 2024.1 release of InterSystems IRIS Data Platform is now Generally Available (GA).

Release Highlights

In this release, you can expect a host of exciting updates, including:

  1. Using vectors in ObjectScript: A powerful capability for optimizing data manipulation.
  2. Vector Search (experimental): A cutting-edge feature for efficient data retrieval (see the sketch after this list).
  3. Multi-Volume Database: Enhancing scalability and storage management.
  4. FastOnline Backup (experimental): Streamlining backup processes.
  5. Multiple Super Server Ports: Providing flexibility in network configuration.
  6. and much more!
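
To give a flavor of the vector features, here is a hypothetical ObjectScript sketch that exercises the new VECTOR SQL type through dynamic SQL. The Demo.Embedding table, its three-dimensional data, and the probe vector are invented for illustration, and the exact TO_VECTOR/VECTOR_COSINE syntax should be verified against the documentation for your build:

// create a tiny table with a 3-dimensional vector column (hypothetical demo schema)
do ##class(%SQL.Statement).%ExecDirect(, "CREATE TABLE Demo.Embedding (Label VARCHAR(50), Vec VECTOR(DOUBLE, 3))").%Display()
do ##class(%SQL.Statement).%ExecDirect(, "INSERT INTO Demo.Embedding (Label, Vec) VALUES ('sample', TO_VECTOR('0.1,0.2,0.3', DOUBLE))").%Display()
// rank rows by cosine similarity to a probe vector
set rs = ##class(%SQL.Statement).%ExecDirect(, "SELECT Label, VECTOR_COSINE(Vec, TO_VECTOR('0.1,0.1,0.4', DOUBLE)) AS Sim FROM Demo.Embedding ORDER BY Sim DESC")
do rs.%Display()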


Documentation

Details on all the highlighted features are available through the links below:

In addition, check out this link for upgrade information related to this release.


Early Access Programs (EAPs)

Many EAPs are available now. Check out this page and register for those you are interested in.


How to get the software?

As usual, Extended Maintenance (EM) releases come with classic installation packages for all supported platforms, as well as container images in Docker container format. For a complete list, refer to the Supported Platforms page.

Classic installation packages

Installation packages are available from the WRC's Extended Maintenance Releases page. Kits can also be found on the Evaluation Services website. InterSystems IRIS Studio is still available in the release, and you can get it from the WRC's Components distribution page.

Containers

Container images for both Enterprise and Community Editions of InterSystems IRIS and IRIS for Health, and all corresponding components, are available from the InterSystems Container Registry web interface.

✅ The build number for this InterSystems IRIS data platform release is 2024.1.0.262.0. In the ICR, containers are tagged as both "2024.1" and "latest-em".
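
For example, pulling the Community Edition image from the ICR might look like the line below (the repository path is an assumption based on the registry's usual naming; check the ICR web interface for the authoritative list of repositories and tags):

docker pull containers.intersystems.com/intersystems/iris-community:2024.1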

Article
· Mar. 14, 2024 · 7 min read

Tutorial: Adding OpenAI to Interoperability Production

Artificial Intelligence (AI) is getting a lot of attention lately because it can change many areas of our lives. Better computing power and more data have helped AI do amazing things, like improving medical tests and making self-driving cars. AI can also help businesses make better decisions and work more efficiently, which is why it is becoming more popular and widely used. How can one integrate OpenAI API calls into an existing IRIS Interoperability application?


Prerequisites

In this tutorial we assume that you already have an existing interoperability production and a set of OpenAI credentials for making calls to the OpenAI APIs. You can download the code we use in this tutorial from the following GitHub project branch: https://github.com/banksiaglobal/bg-openai/tree/test-app-original
To learn how to get OpenAI credentials, follow this tutorial https://allthings.how/how-to-get-your-open-ai-api-key/ or just open the OpenAI API Keys page and create one: https://platform.openai.com/api-keys

Original Application

Our application, AppExchange, emulates InterSystems OpenExchange publishing: it receives a request with a project description, project logo, and GitHub URL, and publishes it in the AppExchange repository.

Adding a bit of Artificial Intelligence

Now let's assume that a person who looks after our repository noticed that some app developers are lazy and provide neither a short summary nor a logo for the apps they publish. That's where our AI friend can come to the rescue!

The desired workflow would look like this:

1. The application receives a repository URL, a summary, and a logo URL as input.
2. If the summary is empty, the repository URL is sent to a GPT-based model that parses the repository contents and generates a descriptive summary of the project. This process may involve parsing README files, code comments, and other relevant documentation within the repository to extract key information about the project's purpose, features, and usage.
3. The generated project summary is then used as input to another GPT-based model, which is tasked with creating a logo for the project. This model uses the description to understand the project's theme, and then designs a logo that visually represents the project's essence and identity.
4. The application outputs a response that includes the original URL, the generated project summary, and the newly created logo. This response provides a comprehensive overview of the project, along with a visual identifier that can be used in branding and marketing efforts.

To achieve this integration, we will use the Business Process Designer to visually design the application's workflow.

Step 1: Installation

To start, we will download the bg-openai package from Open Exchange using the ZPM package manager:

zpm "install bg-openai"

You can have a look at this package here: https://openexchange.intersystems.com/package/bg-openai-1 and check out its source code here: https://github.com/banksiaglobal/bg-openai

This package is based on the great work of Francisco Lopez available here: https://github.com/KurroLopez/iris-openai, with four small changes: we changed the class names to be more in line with standard IRIS naming conventions, we added a new SimplePrompt request which allows users to send simple AI text prompts very easily, we changed the API key to be a credential rather than a setting, and we changed the top-level package name to "Banksia" in line with company standards.
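
As an aside, once the package is installed and the OpenAiOut operation from Step 2 below is running, you can exercise the SimplePrompt message directly from a terminal. A minimal sketch using the interoperability testing service; the prompt and model values are just placeholders:

// send a one-off SimplePrompt request to the OpenAiOut operation
set req = ##class(Banksia.OpenAi.Msg.SimplePrompt.Request).%New()
set req.Prompt = "Describe the main idea of this project in one paragraph."
set req.UserInput = "https://github.com/banksiaglobal/bg-openai"
set req.Model = "gpt-4"
set sc = ##class(EnsLib.Testing.Service).SendTestRequest("OpenAiOut", req, .resp, .sessionId, 1)
if sc { write resp.Content }  // on success, print the generated text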


Step 2: Set up OpenAI Operation

To continue configuring the production, open the Management Portal. If you are using the Docker image with our original application, it is available at the following link:

http://localhost:42773/csp/sys/UtilHome.csp

Navigate to Interoperability->[Namespace]->Configure->Production and make sure that our original production is running.

Add a new operation based on the class Banksia.OpenAi.Operation, name it OpenAiOut, and enable it. This operation will communicate with the OpenAI API servers.

• Operation Class: Banksia.OpenAi.Operation
• Operation Name: OpenAiOut

Now let's make the minimal setup required to use our new operation in the production: add an API key and an SSL configuration.

Navigate to OpenAiOut->Settings->Basic Settings->Credentials and click on the magnifying glass icon to configure credentials.

Fill in the form and enter your API key in the Password field; the ID and User Name fields you can fill in as you like. Save the data by clicking Save.

In the Credentials field, select the ID of the credentials we saved earlier.

Set up the SSL configuration: create a new client SSL configuration named OpenAiSSL and select it in the dropdown.
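
If you prefer to script this step, a client SSL/TLS configuration can also be created from a terminal. A minimal sketch, run in the %SYS namespace, assuming the default configuration settings are acceptable for outbound HTTPS:

// create a client SSL/TLS configuration named OpenAiSSL with default settings
zn "%SYS"
set sc = ##class(Security.SSLConfigs).Create("OpenAiSSL")
write sc  // 1 on success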


Step 3: Add Summary Generation to the Business Process using the Business Process Designer

Navigate to Interoperability > Business Process Designer and open the AppExchange.Process business process by clicking Open.

Build a flowchart of the process based on the algorithm we described above. An example implementation is shown in the image below.

Check that the repository URL is provided and that no description has been entered, which tells us we need to query ChatGPT to create one:

(request.Summary="") & (request.GitHubUrl '="")

Then add a <call> block and set its target to OpenAiOut, which, depending on the type of request, will call the OpenAI API.

• Name: Generate Summary

Customize the request and response types, and assign the variables in the actions:

• Request Message Class: Banksia.OpenAi.Msg.SimplePrompt.Request

set callrequest.Prompt = "Visit the website you will be provided on the next step. Describe the main idea of the project, its objectives and key features in one paragraph."

set callrequest.UserInput = request.GitHubUrl

set callrequest.Model = "gpt-4"

• Response Message Class: Banksia.OpenAi.Msg.SimplePrompt.Response

set request.Summary = callresponse.Content

Add a <sync> step to wait for the response; in the Calls field, add the name of the previous <call>:

• Calls: Generate Summary

Step 4: Add Logo Generation to the Business Process

After getting the repository description, let's move on to the next logical part: logo generation. Let's check that there is a description from which to generate the image, and that no image URL has been provided. Set the following condition:

(request.LogoUrl="") & (request.Summary'="")

Configure the next <call> element, again targeting our OpenAiOut operation.

• Name: Generate Logo

Customize the request and response types.

• Request Message Class: Banksia.OpenAi.Msg.Images.Request

set callrequest.ResponseFormat = "url"

set callrequest.Operation = "generations"

set callrequest.Prompt = "Create a simple app icon for the following mobile application: "_request.Summary

set callrequest.Size = "256x256"

• Response Message Class: Banksia.OpenAi.Msg.Images.Response

set request.LogoUrl = callresponse.Data.GetAt(1).Url

After completing the modification of our business process, click the Compile button.

You can download the finished OpenAI-integrated sample from the following GitHub project branch: https://github.com/banksiaglobal/bg-openai/tree/test-app

Step 5: Test our new Business Process in Production

Go to the Interoperability->Configure->Production section.

First we need to restart our process to apply all the latest changes: navigate to AppProcess->Actions->Restart.

To test the process, go to AppProcess->Actions->Test.
Create a test message with a GitHub URL for the OpenAI API and send it through the production:

Verify that the response from the OpenAI API is received and processed correctly by the application. Go to the Visual Trace to see the full application cycle and make sure that the correct data is transmitted in each process element.

This is AI's take on our app logo:

Conclusion

By following these steps, you can integrate the OpenAI API into an interoperability production using a Business Process in InterSystems IRIS. The bg-openai module is a great resource for developers looking to incorporate AI into their applications. By simplifying the integration process, it opens up new possibilities for enhancing applications with the power of artificial intelligence.

About the Author

Mariia Nesterenko is a certified IRIS developer at Banksia Global. She is involved in application development, data structures, system interoperability, and geospatial data.

About Banksia Global

Banksia Global is an international boutique consultancy headquartered in Sydney, Australia, specializing in professional services for InterSystems technologies. With a team of dedicated and experienced professionals, we pride ourselves on being an official InterSystems Premier Partner, authorized to provide services worldwide. Our passion for excellence and innovation drives us to deliver high-quality solutions that meet the unique needs of our clients.

Question
· Mar. 13, 2024

Change Stream Property in Ens.StreamContainer with DTL

I'm trying to change the Stream property inside a DTL with a Source Class of Ens.StreamContainer. The code below changes it within the DTL testing tool, but running an actual message through the Production's Process doesn't change the Stream property. I can change other properties of Ens.StreamContainer using the normal Set action, and those changes are reflected when running it through the Process. For context, this uses an FTP service to grab a file. Any thoughts on why I can't just write modified stream data to the Stream property?

I use target here because I'm using the DTL copy function to preserve the OriginalFileName property.

1  set   streamData = target.Stream.Read(target.Stream.Size)
2  set   streamData = $REPLACE(streamData,"header1,header2"...
3  code  do target.Stream.Write(streamData)