eXist-db triggers: after-create-document event does not fire when a document is stored as a result of another trigger

With eXist-db I am able to get triggers working. For example, in my setup, adding a file to collection A triggers an XQuery that reads this file and generates several new files based on its content, writing them to another collection B (a sibling of A).
Next, I would like to set a trigger on collection B that starts an XQuery for every file written into it.
If the file is written by (or on behalf of) the first trigger, this seems to be impossible. I can see that the files are written into collection B by the trigger on collection A, but the after-create-document event does not fire. However, if I manually place a file into B via WebDAV, the event is fired.
Is this a fundamental limitation, or is there a better way to do this?

This sounds like a bug to me. Could you file an issue on the project's GitHub issue tracker: https://github.com/exist-db/exist/issues


Flow Triggering Itself (Possibly), Each run hits past IDs that were edited

I am pretty new to Power Automate. I created a flow that triggers when an item is created or modified. It initializes some variables and then uses some switch cases to assign values to each of them. The variables then go into an array, and another variable is incremented to get the total of the array. I then have a conditional that assigns a value to a column in the list. I tested the flow by going into the modern view of the list and clicking the save button. This worked a number of times, so I sent it for user testing. One of the users edited multiple items by double-clicking into each item, which saves after every column change (and which I assume triggers a run of the flow).
The flow seemingly works, but based on the run history it seemed to get bogged down at a certain point. I let it sit overnight, tested again, and now it shows runs for multiple IDs at a time even though I only edited one specific item.
I had another developer take a look at my flow and he could not spot anything wrong with it. It never had a hard error in testing, only warnings about conditionals possibly causing a loop, but all my conditionals resolve. Pictures included. I am just not sure what caveats I might be missing.
I am currently letting the flow sit to see if it finishes catching up. I have read about the concurrent run option as well as conditions on the trigger itself. I am curious why it seems to run on two (or more) records at once without me or anyone else editing each one.
You might be able to ignore updates made by the service account (the account used in the connection for the flow's actions) by using the following trigger condition expression:
@not(equals(triggerOutputs()?['body/Editor/Claims'], 'i:0#.f|membership|johndoe@contoso.onmicrosoft.com'))

Maximo: Use script to update work order when a related table is updated

I have an automation script in Maximo 7.6.1.1 that updates custom fields in the WORKORDER table.
I want to execute the automation script when the LatitudeY and LongitudeX fields (in the WOSERVICEADDRESS table) are edited by users.
What kind of launch point do I need to do this?
Edit:
For anyone who's learning automation scripting in Maximo, I strongly recommend Bruno Portaluri's Automation Scripts Quick Reference PDF. It doesn't have information about launch points, but it's still an incredibly valuable resource.
I wish I'd known about it when I was learning automation scripting...it would have made my life so much easier.
You can create an attribute action launch point on the latitudeY field and another on the longitudeX field. These trigger whenever the respective field is modified, so one will fire when latitudeY is changed, again if longitudeX is changed, again if longitudeX is changed once more, and so on. This all happens before the data is saved, so the user may still choose to cancel their changes even though the scripts have already fired.
You could also make an "on save" object launch point for WOSERVICEADDRESS (if that is what is actually being updated via the map). This will run any time data in the object is saved, so you would have to add extra checks to see whether either of those fields has changed before running your logic, but at least it would run once, and only if the user commits their changes.
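A minimal Jython sketch of that save-point check, assuming the implicit mbo variable is the WOSERVICEADDRESS record being saved and that its owner is the work order; the isModified() calls and the owner relationship are assumptions to verify in your environment:

# Hypothetical "on save" object launch point script for WOSERVICEADDRESS (Jython).
# 'mbo' is the implicit service-address record; names below are illustrative.
if mbo.isModified('latitudey') or mbo.isModified('longitudex'):
    woMbo = mbo.getOwner()  # owning WORKORDER when edited from the work order
    if woMbo is not None:
        woMbo.setValue('WOSAX', mbo.getString('longitudex'))
        woMbo.setValue('WOSAY', mbo.getString('latitudey'))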
Related:
Populates WORKORDER.WOSAX and WORKORDER.WOSAY (custom fields) from the values in WOSERVICEADDRESS.LONGITUDEX and WOSERVICEADDRESS.LATITUDEY.
# Attribute launch point script (Jython) on WOSERVICEADDRESS; 'mbo' is the service address record.
woMbo = mbo.getOwner()
longitudex = mbo.getString('longitudex')
latitudey = mbo.getString('latitudey')
if woMbo is not None:
    wosax = woMbo.getString('WOSAX')
    wosay = woMbo.getString('WOSAY')
    # Only push the coordinates up to the work order when they actually differ.
    if longitudex != wosax:
        woMbo.setValue('WOSAX', longitudex)
    if latitudey != wosay:
        woMbo.setValue('WOSAY', latitudey)
The launch points are Attribute Launch Points, not Object Launch Points.

VS Load Test Scenario - How can we add custom scenario info to the database

I have a Load Test project that works perfectly and saves all the perf counters in the LoadTest DB as expected.
I want to add some scenario-specific attribute information to my test run.
That way, when I create a report in Excel at the end, I'll be able to filter based on those attributes.
Example:
Environment Attribute: (QA, PreProd, Production)
Target Attribute: (UI, API, ...)
I searched everywhere but couldn't find the information. I am not sure if I have to create a new table in that DB and populate it myself, or if there is an easier way.
The closest I have come to doing what you ask is modifying the "reporting names" of requests to include the wanted information. In one test I added a test case that executed a small number of times (so few that the overall results would not be skewed). It had two requests: the first collected some data and saved it to a context parameter; the second had a PreRequest plugin that wrote the data to the ReportingName field of that request. (Experiments showed that setting the field in a PostRequest plugin had no effect.)
Another approach I have used is a PostWebTest plugin that writes a line of useful data to a log file, so I get one line per test case executed.
It may be possible to add another table to the load test database, but I would wonder whether it would be properly handled by the Open, Import, Export and Delete commands of the Open and Manage Test Results window.

Get created/modified/deleted files by a specific process from an event tracing (ETW) session

I've been searching for a way to get all files created, modified, or deleted by a specific process from an event tracing (ETW) session (I will process data from an .etl file, not from a real-time session).
Apparently the simplest solution is to take the FileCreate and FileDelete events from the FileIo_Name class and map them to the corresponding DiskIo_TypeGroup1 events. However, this isn't working for me: I don't receive any DiskIo_TypeGroup1 events for the corresponding FileDelete events, so I cannot get the process ID. Also, not all FileCreate events have an associated DiskIo_TypeGroup1 event (I think this happens for files created empty, or for files that were only opened).
Note: I need the DiskIo_TypeGroup1 mapping because FileIo_Name events don't have the ThreadId and ProcessId members populated - they are set to (ULONG)-1. Also, I cannot tell which files were merely opened versus modified without knowing the "file write size". DiskIo_TypeGroup1 events also don't have ThreadId and ProcessId populated (in the event header, on newer OSes), but they do have the IssuingThreadId structure member, from which I can obtain the ProcessId by mapping to Thread_TypeGroup1 class events.
So I investigated how the FileIo_Create class could help me and noticed that I can get the CreateOptions member, which can have the following flags: FILE_SUPERSEDE, FILE_CREATE, FILE_OPEN, FILE_OPEN_IF, FILE_OVERWRITE, FILE_OVERWRITE_IF. But the initial problem still persists: how can I check whether a file was created from scratch rather than just opened (e.g., in the case of FILE_SUPERSEDE)?
Maybe I can use the FileIo_ReadWrite class to get Write events, similar to using the DiskIo_TypeGroup1 class. So if something was written to a file, can I assume the file was either created or modified?
To find deleted files, I think the FileIo_Info class and its Delete event are the solution. I guess I can receive Delete events and map them to FileIo_Name events to get the file names.
Note: The FileIo_Create, FileIo_Info, and FileIo_ReadWrite events contain the process ID.
Are my assumptions right? What would be the best solution for my problem?
I will share the solution I implemented:
Created Files:
I stored every FileIo_Create event as a pending create operation and waited for the associated FileIo_OpEnd event to decide, from its ExtraInfo structure member, whether the file was opened, created, overwritten, or superseded.
Modified Files:
I marked a file as dirty for every Write event from FileIo_ReadWrite and for every SetInfo event from FileIo_Info whose InfoClass is FileEndOfFileInformation or FileValidDataLengthInformation. Finally, on the Cleanup event from FileIo_SimpleOp, I check whether the file was marked dirty and, if so, store it as modified.
Deleted Files:
I marked a file as deleted if it was opened with the FILE_DELETE_ON_CLOSE flag in CreateOptions from FileIo_Create, or if a Delete event from FileIo_Info appeared. Finally, on the Cleanup event from FileIo_SimpleOp, I store the file as deleted.
The process ID and file name were obtained from the FileIo_Create events, more precisely from the OpenPath structure member and the ProcessId event header member.
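To make that bookkeeping concrete, here is a minimal Python sketch of the same state machine. It assumes the events have already been decoded from the .etl file into plain dicts; the keys ('class', 'opcode', 'FileObject', 'OpenPath', 'ProcessId', 'CreateOptions', 'ExtraInfo', 'InfoClass') and the string values are hypothetical names for the decoded fields, and it only illustrates the classification logic, not any particular ETW consumer API:

# Sketch of the create/modify/delete bookkeeping, keyed by the kernel FileObject.
# The event dicts and their keys are hypothetical; decoding the .etl file itself
# (e.g., with a kernel trace consumer) is out of scope here.
FILE_DELETE_ON_CLOSE = 0x00001000  # NtCreateFile CreateOptions flag

pending = {}                       # FileObject -> per-file state
created, modified, deleted = set(), set(), set()

def handle(evt):
    fo = evt['FileObject']
    cls, op = evt['class'], evt['opcode']
    if cls == 'FileIo_Create':
        pending[fo] = {
            'path': evt['OpenPath'],
            'pid': evt['ProcessId'],
            'dirty': False,
            'pending_delete': bool(evt['CreateOptions'] & FILE_DELETE_ON_CLOSE),
        }
    elif cls == 'FileIo_OpEnd' and fo in pending:
        # ExtraInfo carries the create disposition: created/overwritten/superseded
        # means a new file; a plain open means the file already existed.
        if evt['ExtraInfo'] in ('FILE_CREATED', 'FILE_OVERWRITTEN', 'FILE_SUPERSEDED'):
            created.add(pending[fo]['path'])
    elif cls == 'FileIo_ReadWrite' and op == 'Write' and fo in pending:
        pending[fo]['dirty'] = True
    elif cls == 'FileIo_Info' and op == 'SetInfo' and fo in pending:
        if evt['InfoClass'] in ('FileEndOfFileInformation', 'FileValidDataLengthInformation'):
            pending[fo]['dirty'] = True
    elif cls == 'FileIo_Info' and op == 'Delete' and fo in pending:
        pending[fo]['pending_delete'] = True
    elif cls == 'FileIo_SimpleOp' and op == 'Cleanup' and fo in pending:
        # Cleanup closes out the file object: decide its final classification.
        state = pending.pop(fo)
        if state['pending_delete']:
            deleted.add(state['path'])
        elif state['dirty']:
            modified.add(state['path'])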

How to force ActiveRecord to ALWAYS insert, update, delete and retrieve from the database?

We are using ActiveRecord standalone (not as part of a Rails application) to do unit testing with RSpec. We are testing a database trigger that inserts rows into an audit table.
The classes are:
Folder has many File
Folder has many FileAudit
The sequence of events is like this:
Create Folder
START TEST ONE
Create File
Do some stuff to File
Get Folder.file_audits
Check associated FileAudit records
Destroy File
Destroy FileAudits
END TEST ONE
START TEST TWO
Create File
Do some other stuff to File
Get Folder.file_audits
Check associated FileAudit records
Destroy File
Destroy FileAudits
END TEST TWO
Destroy Folder
The FileAudits from test one are getting destroyed, but not from test two. ActiveRecord seems to think that there is nothing new in that table to delete at the end of the second test.
I can do Folder.file_audits(true) to refresh the cache, but I would rather just disable any and all kinds of caching and have ActiveRecord just do what I tell it instead of it doing what it thinks is best.
I also need to set a flag on File to the same value and verify that the trigger did not create an audit record. When I set the flag to a different value, I can see the update statement in the log, but when I set it to the same value and save, there is no update in the log.
I am sure the caching and so on is fine for a website, but that is not what we are doing. We need ActiveRecord to always fetch all records from the database and always run the updates and deletes, no matter what. How can we do that?
We are using ActiveRecord 3.1.3.
Thanks
My initial guess would be that you are doing something with transactions in one of the tests. If you are, you are effectively eliminating the outer transaction that wraps the unit test itself, which leaves the unit test cleanup with nothing to roll back.
I don't know if this applies to you or not, but I've had problems in the past with using model.save instead of model.save!. Sometimes I would get validation errors on save, but without the bang the validation errors don't raise an actual exception, so I never knew that the save wasn't successful.
