How to make an event-based flow run only once in a while - power-automate

I have made a flow that triggers when a file is created in a folder (let's call it the event folder). Based on that, the flow creates another file in a different folder and sends me a message that a new file has been created.
Now the event folder could have one or multiple files generated at once. My flow triggers for each file created and spams me for all the files at once. I want to receive only one message for any number of files created within a span of 5 minutes. Is there a way to do that?
Here is my flow

Looks like you'll need to store a 'new files created' status in some variable (see the hints at https://learn.microsoft.com/power-automate/create-variable-store-values).
Then create another flow, scheduled to run every 5 minutes (https://learn.microsoft.com/power-automate/run-scheduled-tasks), that checks the status, optionally sends a message, and resets the status to 'nothing to do'.

Using the When a file is created (properties only) trigger with the Split On setting turned off, Concurrency Control turned on, and Degree of Parallelism set to 1 does the trick.
See attached image below

Related

Spring Integration with Spring Batch

Please let me know: when I put, say, 5 files in a directory, 5 messages get generated by the poller. I want the Spring Batch job to be triggered only once, not five times, if the files arrive together, say within a 1-minute window. Is this possible?
You may consider using an Aggregator for this kind of task. That way you collect several files together, by an expected group size or within some time window. You need to use a static correlationKey to let the component group the files.
When the group is ready, a single message is emitted and you are good to trigger a Batch job for this set of files.
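
A minimal sketch of that approach using Spring Integration's Java DSL (assuming Spring Integration 5.x with the file module on the classpath; the /inbox directory, the 5-second poll, and the 60-second window are illustrative, and the actual job launch is left as a comment):

import java.io.File;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.integration.dsl.IntegrationFlow;
import org.springframework.integration.dsl.IntegrationFlows;
import org.springframework.integration.dsl.Pollers;
import org.springframework.integration.file.dsl.Files;

@Configuration
public class FilesToSingleJobConfig {

    @Bean
    public IntegrationFlow filesToSingleJobFlow() {
        return IntegrationFlows
            // Poll the watched directory for new files every 5 seconds
            .from(Files.inboundAdapter(new File("/inbox")),
                  e -> e.poller(Pollers.fixedDelay(5000)))
            .aggregate(a -> a
                // Static correlation key: every file joins the same group
                .correlationStrategy(m -> "all-files")
                // Release the group 60 seconds after the last file arrives
                .groupTimeout(60_000)
                .sendPartialResultOnExpiry(true)
                // Let a fresh group form for the next burst of files
                .expireGroupsUponCompletion(true))
            // The aggregated payload is a List<File>: one message per burst
            .handle((payload, headers) -> {
                // Launch the Spring Batch job once here,
                // e.g. jobLauncher.run(importJob, parameters built from the file list)
                return null;
            })
            .get();
    }
}

With the static key plus expireGroupsUponCompletion(true), each burst of files produces exactly one released group, and therefore one job launch.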

Watch new files in a directory in NiFi

I have a use case where new files arrive every day at different moments, e.g. every hour or two, so I need to watch a directory in my folder, and on the arrival of new files trigger an event which sends those new file paths to my web service in NiFi. Any idea how to implement this and what tool to use?
Or maybe this is not the best approach?
Take a look at the ListFile and FetchFile processors:
https://nifi.apache.org/docs/nifi-docs/components/org.apache.nifi/nifi-standard-nar/1.5.0/org.apache.nifi.processors.standard.ListFile/index.html
https://nifi.apache.org/docs/nifi-docs/components/org.apache.nifi/nifi-standard-nar/1.5.0/org.apache.nifi.processors.standard.FetchFile/index.html
Complete NiFi documentation can be found at https://nifi.apache.org/docs.html
If your files are in a file system, use the GetFile processor, which triggers an event whenever a new file is added to the configured Input Directory and immediately feeds the data into NiFi without any delay.
If your requirement is to schedule it, e.g. every hour or at any specific time, use the Scheduling tab present in each processor's configuration and schedule it using the Cron driven strategy with a Quartz cron expression; for every hour it would look like this:
0 0 * * * ?
If your files are in an S3 bucket, you have to use SQS queue notifications via the GetSQS processor, documented in detail at the links below:
http://crazyslate.com/apache-nifi-intergration-with-aws-s3/
https://community.hortonworks.com/content/idea/76919/how-to-integrate-aws-sqs-queue-with-nifi.html

jmeter - Clean up previously saved files before running a new test plan

I have been using the following BeanShell script in order to delete all output files before each new run of a test plan:

import java.io.File;
import org.apache.commons.io.FileUtils;

// Clear out everything in the output folder (path stored in the OutPutFolder variable)
FileUtils.cleanDirectory(new File(vars.get("OutPutFolder")));
My test plan consists of a Thread group that is using 100 users (number of threads) with a loop count of 5.
I have tried several ways to incorporate the aforementioned BeanShell script, but I cannot seem to find the right place to put it.
Things that I have already tried:
Creating a new setUp Thread Group ("Run thread groups consecutively" is checked)
and placing the script in a BeanShell Sampler > files do get deleted, but outcome files are not persisted to disk (re-deleted by my script)
and placing the script in a BeanShell PreProcessor > files do not get deleted
Placing the script in my main Thread Group > files do get deleted, but outcome files are not persisted to disk (re-deleted by my script)
Using an If Controller (${__BeanShell(vars.getIteration() == 1)} && ${__threadNum} == 1)
Not solving your problem, but based on your comment, here's a different simple solution, which gives you the ability to separate current test results from previous ones.
All the listeners in JMeter can accept a dynamic file path, and one of the dynamic values they can use is the TESTSTART.MS property (see here), which will be different for each test execution.
So, to make the name of each output file (or report) unique per run, you can use that property in the name of the output file:
your/path/${TESTSTART.MS}_errors.csv
You can even create a sub-folder holding the set of output files for a specific execution, like this:
your/path/${TESTSTART.MS}/name1.ext
your/path/${TESTSTART.MS}/name2.ext
That way the names never collide, and they are grouped by execution (all files from the same execution will have the same timestamp, so you can filter them).
On top of that, on Linux you could use logrotate to zip old JMeter logs (and since they are just text files, they zip nicely).

How to architect file processing in Laravel

I have a task to observe a folder where files arrive via SFTP. The files are big and processing one file is relatively time-consuming. I am looking for the best approach. Here are some ideas, but I am not sure which way is best.
Run a scheduler every 5 minutes to check for new files.
For each new file, trigger an event signalling that there is a new file.
Create a listener which listens for this event and uses queues. In the listener, copy the new file into a processing folder and process it. When processing of a new file starts, insert a record into the DB with status 'processing'. When processing is done, change the record status and copy the file to a processed folder.
In this solution I have 2 copy operations for each file. This is because, if a second scheduler run executes before all files are processed, some files could overlap across 2 processing jobs.
What is the best way to do it? Should I use another approach to avoid the 2 copy operations, such as a database check during scheduler execution to see whether the file is already in the processing state?
You should use ->withoutOverlapping(), as stated in the task scheduler manual here.
Using this, you will make sure that only one instance of the task runs at any given time.

Running A Large List of Files Once and Only Once in JMeter (Possibly in a Particular Order)

I'm trying to run around 15000 soap requests through JMeter. I have 15000 individual soap files in a folder.
I know that the WebService(SOAP) Request component has the option to point to a folder.
But the problem is that the files in the folder get picked up and run randomly, and a file can be run multiple times.
This is not ideal, because each request has a unique correlation ID, and if a file gets run twice, the second run will fail due to a duplicated correlation ID.
Is there any way I could tell JMeter to run the files only once?
Also, as certain soap requests are dependent upon other request having already run, the ability to run these in a specified order would be desirable. Is this possible?
These seem like common problems that should have already been solved, but I can't find much on Google.
Do you guys have any ideas?
I would use the JSR223 Sampler to run a script (e.g. Groovy) to iterate through the files in the directory and store the text of each file in a String.
See, for example, this other answer about using a Groovy script to iterate a list of values.
You could put the data into a csv file and read it in using a CSV Data Set Config. If you need unique values over multiple threads then you have to create multiple files, one per thread.
You could also put the data in a database and use a JDBC Config/Sampler to access it, making sure to either a: delete the data after it is read, or b: mark it as 'read' using a flag. Both methods would prevent the same record being read twice by different threads.
If you need to run requests in order, you should structure the test plan as such; requests will be made sequentially, top to bottom.
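
For the CSV Data Set approach above, here is a minimal JSR223 sketch for a setUp Thread Group (Java syntax, runnable as Groovy in JMeter); the /data/soap directory and the requests.csv name are assumptions. It lists the SOAP files exactly once, sorts them for a deterministic order, and writes one path per line:

import java.io.File;
import java.io.PrintWriter;
import java.util.Arrays;

// Collect every SOAP file and sort for a stable, repeatable order
File[] files = new File("/data/soap").listFiles();
Arrays.sort(files);

// Write one absolute path per line for the CSV Data Set Config to read back
PrintWriter out = new PrintWriter("requests.csv");
for (File f : files) {
    out.println(f.getAbsolutePath());
}
out.close();

Pointing a CSV Data Set Config at requests.csv with Recycle on EOF = False and Stop thread on EOF = True then ensures each file is consumed at most once, in order.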
