Is there a way to speed up the triggering of a Power Automate workflow (Cloud)?

I have a few Power Automate workflows that are triggered when:
An XLSX file is created or modified
A JSON file is created or modified
An HTML file is created or modified
in designated SharePoint folders. Each trigger can take up to 5 minutes to fire, so the whole process takes more than 15 minutes in total. Is there a way to speed up the triggering of these workflows?
Thank you

Related

Sync Files from folder to another using windows task scheduler

I was trying to sync files from one folder to another automatically.
I managed to do this using the Windows 10 Task Scheduler:
I used a PowerShell script someone gave me, and I run it every 5 minutes.
I wonder if there is any way to make the task run every time a file is added to the source folder, to avoid running it every 5 minutes.
Yes, there is: you can use the FileSystemWatcher class. A sample script can be found in the TechNet gallery: https://gallery.technet.microsoft.com/scriptcenter/Powershell-FileSystemWatche-dfd7084b
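FileSystemWatcher is a .NET class, so the linked sample is PowerShell. As a language-neutral illustration of the sync step itself (copying files that are new in the source folder), here is a minimal Python sketch; the function name and the one-way copy behavior are assumptions for illustration, not part of the linked script:

```python
import shutil
from pathlib import Path

def sync_new_files(source: str, dest: str) -> list:
    """Copy files that exist in source but not yet in dest.

    Returns the names of the files that were copied, so the caller
    can log what happened on each run.
    """
    src, dst = Path(source), Path(dest)
    dst.mkdir(parents=True, exist_ok=True)
    copied = []
    for f in sorted(src.iterdir()):
        if f.is_file() and not (dst / f.name).exists():
            shutil.copy2(f, dst / f.name)  # copy2 preserves timestamps
            copied.append(f.name)
    return copied
```

An event-driven watcher would simply call a function like this from its "file created" callback instead of from a 5-minute timer.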

How to report on hours per period in Azure Devops?

I can't work out how to report how many hours per week were recorded by each team member.
For example, if a task is 20 hours, and 10 hours were done last week and 10 this week, how can I tell how many hours were recorded against the task each week?
I saw some time-tracker plugins, but they are not what I'm talking about: they require the user to start and stop time tracking, kind of like Upwork.
The developers already enter completed hours as they progress through the tasks. So if DevOps remembers the date of each entry, I should be able to query how many hours were added this week, or month, or whatever, per user per task.
Here is an excerpt from a discussion we had about it:
Meh, as far as I'm concerned:
1. I make the tasks in Notepad, usually, so someone can add them to DevOps.
2. Someone adds them to DevOps.
3. The developers receive them in DevOps and record their status and times as they develop.
4. DevOps doesn't seem to report weekly times to us, so we ask the developers to write everything down in Notepad or a doc and pass us a list of what they did, with times.
5. We get the developers' lists and use them to report to management.
SO
What is the point of steps 2 and 3? We may as well just pass the guys their tasks as the plain-text list I make in Notepad; it would be easier for them to add their times and pass it back. For us, DevOps is just extra unwanted work as long as we can't work out how to report on developer hours.
I think you can use one of several approaches:
Way 1. Use Power BI reporting.
Each developer updates their work hours every day.
You can use an Analytics view to get the task history:
Add the Completed Work field to this view.
Connect to Analytics with the Power BI Data Connector. Then you can find the revisions of each task and calculate the hour differences.
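Once you have exported each task's revision history (revision date plus the cumulative Completed Work value) from the Analytics view, the "calculate the hour differences" step could look like the following Python sketch. The tuple format and function name are illustrative, not an Azure DevOps API:

```python
from collections import defaultdict
from datetime import date

def weekly_hours(revisions):
    """Turn task revisions into hours booked per ISO week.

    `revisions` is a list of (revision_date, cumulative_completed_work)
    tuples for one task. Completed Work is cumulative in DevOps, so the
    hours added by a revision are the difference from the previous one.
    Returns a dict mapping (iso_year, iso_week) -> hours added.
    """
    totals = defaultdict(float)
    prev = 0.0
    for day, completed in sorted(revisions):
        delta = completed - prev
        prev = completed
        year, week, _ = day.isocalendar()
        totals[(year, week)] += delta
    return dict(totals)
```

For the 20-hour example from the question (10 hours last week, 10 this week), this yields 10 hours in each of the two ISO weeks.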
Way 2. Customize your process template.
Add a new work item type (like Time or Activity) to your process template. Add a new field (like Activity Date) to these work items, or use the existing Target Date field. Add the Completed Work field to these work items. See the docs: Add a custom work item type, Add a custom field to a work item type.
Your developers add a new Time work item as a child of their task and update its Completed Work field. You can then use work item queries to see the time updates.
Time tracking in Azure DevOps is a pain. We have developed our own custom solution to track time in DevOps. You can install the extension https://marketplace.visualstudio.com/items?itemName=Sense4code.Sense4code-Manday. For more info, visit https://www.manday.io/doc/azure-devops-getting-started.

Can TFS 2010 generate an e-mail alert when an item has been checked out for longer than 'X' days?

If a user checks out a file and fails to check it back in after a certain number of days, I'd like TFS 2010 to generate an e-mail. (For example, the user would be notified after the item has been checked out for 1 day, and the entire team notified after 2 days.)
I found the article about writing TFS server plugins
but I was wondering if there is a simpler way.
The problem is that notifications/alerts in TFS are event-triggered, so if nothing changes, there is no trigger.
A little easier than writing your own plugin (which also requires a trigger, though it can check more than the changed item) would be a small application. A small command-line tool is enough, and you could run it once a day using Scheduled Tasks. The tool would check all pending changes, see how long each has been pending, and send a mail to the user for any that are too old.
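The core of that command-line tool might look like the sketch below. The actual query against the TFS pending-changes API and the mail sending are not shown; the tuple format and function name are invented for illustration:

```python
from datetime import datetime, timedelta

def stale_checkouts(pending, now, user_after_days=1, team_after_days=2):
    """Classify pending changes by how long they have been checked out.

    `pending` is a list of (user, item, checkout_time) tuples; in a real
    tool this data would come from the TFS pending-changes API.
    Returns (notify_user, notify_team): items old enough to mail the
    user, and items old enough to mail the whole team.
    """
    notify_user, notify_team = [], []
    for user, item, when in pending:
        age = now - when
        if age >= timedelta(days=team_after_days):
            notify_team.append((user, item))   # escalate to the team
        elif age >= timedelta(days=user_after_days):
            notify_user.append((user, item))   # nudge just the user
    return notify_user, notify_team
```

Run daily from Scheduled Tasks, this reproduces the 1-day/2-day escalation from the question without any server-side plugin.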

VBScript - How to know when complete?

I'm running a simple/single vbscript in Windows Scheduler to perform 13 individual file exports from our SalesForce app. The script runs as expected. Depending upon network traffic, the 13 exports take 3-5 minutes total to complete.
My intent was to run these exports serially, but vbscript seems happy to run them in parallel. SalesForce accommodates with no issue or complaint.
Upon successful completion of the Export, I run a second vbscript to import these results into another application (via an msaccess function). This second vbscript also provides the desired result.
Question: Is there any way to programmatically determine when the Export script has completed, to permit me to safely kick off the Import script? Currently I have set up a second Scheduler job to run the Import script 10 minutes after the separate Export script, but this could fail. I am looking to tie these two scripts more closely to one another.
Any suggestions?
Thanks!
There are a couple of options. If both scripts are running on the same system with the same permissions, you could have the first script actually kick off the second script whenever it's finished.
If the scripts require different permissions, or you need them to start from a task scheduler, have your first script begin by looking for an existing file such as SCRIPT1.COMPLETE. If that file exists, have script1 delete it and start processing; when script1 finishes its processing, create the file. Then in script2, create a while loop that looks for SCRIPT1.COMPLETE: if the file is not there, hold off for a few seconds and try again, and don't exit the loop until the file shows up. Have script2 delete the COMPLETE file when it finishes processing. I would recommend setting your "wait a while" interval to at least 30 seconds or so, so the script isn't constantly checking.
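The wait-loop half of that flag-file handshake can be sketched as follows. This is Python rather than VBScript, but the pattern is identical; the flag-file name is just the example from above, and the timeout is an added safety net so the importer can't wait forever:

```python
import os
import time

def wait_for_flag(path, poll_seconds=30, timeout_seconds=3600):
    """Block until the flag file appears, checking every poll_seconds.

    Returns True once the flag is found (and deletes it so the next
    run starts clean), or False if timeout_seconds elapse first.
    """
    waited = 0
    while not os.path.exists(path):
        if waited >= timeout_seconds:
            return False          # export never signaled completion
        time.sleep(poll_seconds)  # don't hammer the filesystem
        waited += poll_seconds
    os.remove(path)               # consume the flag
    return True
```

Script2 would call this first and only run the import when it returns True; script1 simply creates the file as its last step.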

Why is download speed drastically decreasing with CuteFTP 8.0?

While downloading a large number of files (roughly 7,000 to 10,000) with CuteFTP, the transfer slows down file by file. When I start my job, it downloads about one 1 KB file per second, but it slowly drops to only 2 or 3 files per minute.
I have no such issue while uploading. All of my files are around 1 KB in size.
Please find the CuteFTP settings I have configured:
Transfer settings under Global Options
--> In the Transfer window
Transfer mode: Auto-Detect
Data mode: AUTO
Unchecked: allow transfers to occur in the existing (browse) session
Checked: send REST command prior to APPE when resuming a transfer
Max segments in a multi-part transfer: the default value of 4 threads
--> In the Transfer/Events window
Checked for the options below:
Remove successful transfer items from the queue automatically
Preserve remote time stamps for downloaded files
Unchecked for the remaining options
--> In the General/Log Files window
Checked for the options below:
Delete saved logs every 5 days
Delete error logs after exiting the transfer engine
Delete item logs of successful transfers immediately after completion
Unchecked for:
Record/display time stamps
When the transfer engine exits: delete images in thumbnail cache
Please ask if there is any more information I need to provide, and kindly guide me on improving the download transfer speed.
That's just normal: the client has to write each small file to its own sectors and do per-file bookkeeping, unlike big files where the data portion is transferred seamlessly.
If the FTP server has an online panel or something similar (for me it's GoDaddy) that lets you compress everything into a ZIP file, you can then download that one ZIP at full speed. That is what I do.
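The point of bundling is that one archive transfer replaces thousands of per-file round trips. If you can run a script on the server instead of using a hosting panel, the zipping step could look like this minimal Python sketch (the function name and paths are placeholders):

```python
import zipfile
from pathlib import Path

def bundle(folder: str, archive: str) -> int:
    """Pack every file in `folder` into one ZIP archive.

    Downloading the single archive avoids the per-file connection
    and disk overhead that makes thousands of 1 KB transfers slow.
    Returns the number of files added.
    """
    count = 0
    with zipfile.ZipFile(archive, "w", zipfile.ZIP_DEFLATED) as zf:
        for f in sorted(Path(folder).iterdir()):
            if f.is_file():
                zf.write(f, arcname=f.name)  # store without directory prefix
                count += 1
    return count
```

Write the archive outside the folder being packed so it does not end up including itself.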
