Can I use MS Flow "peek code"/JSON to set up new flows? - power-automate

We use SharePoint for our project files, which means there are often several layers of work to be done (project work, finances, administrative, etc.) with different teams working in different locations. There are some folders that I would like to keep "in sync" by copying files back and forth on change. I made some flows to do so, but it is pretty tedious, as there are 6 subdirectories for each project and a bunch of projects.
I noticed the peek code is pretty simple; it is just the name part of the directory that needs to change. Is there any way to manually define a flow, or some way I can use the JSON/peek code to set up a flow?
For reference, I am using the SharePoint "When a file is created or modified (properties only)" trigger and the Copy file action. My directory structure looks like the following.
Projects Team:
  ProjectNumber ProjectName
    projNumber Administrative
    projNumber invoices
    projNumber Received
    projNumber Sent
    projNumber Requested
Accounting Team:
  Invoices
    projNumber projName
      projNumber Received
      projNumber Sent
      projNumber Requested
Thanks!

It is possible, but it is not recommended, because if you end up with a different hierarchy it may cause a severe impact. Anyway, let me put it in steps:
1. Export your existing flow.
2. Open the exported .zip file; it should contain a file called definition.json.
3. Search for the exact hierarchy you are looking for and replace it with the one you need (if you have many projects, see the scripted sketch after these steps).
4. Compress it back into a .zip file.
5. Now go to https://us.flow.microsoft.com/
6. In the My flows tab, click Import and import the .zip file you compressed in step 4.
7. You are done. It should be working; if not, make sure the JSON syntax is valid throughout the file.
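If you have many projects, doing that find-and-replace by hand for every copy gets tedious. As a rough sketch only (the template file name, folder paths, and project numbers below are placeholders, and the exact location of definition.json inside the package depends on your export), the clone-and-replace step could be scripted, for example in C#:

// Sketch: clone an exported flow package per project by rewriting the folder
// path inside definition.json. All names and paths here are placeholders.
using System;
using System.IO;
using System.IO.Compression;
using System.Text;

class FlowCloner
{
    static void Main()
    {
        string template = @"C:\flows\CopyInvoices_Template.zip"; // exported flow package
        string[] projects = { "1001", "1002", "1003" };           // your project numbers

        foreach (string proj in projects)
        {
            string output = $@"C:\flows\CopyInvoices_{proj}.zip";
            File.Copy(template, output, overwrite: true);

            using (ZipArchive zip = ZipFile.Open(output, ZipArchiveMode.Update))
            {
                foreach (ZipArchiveEntry entry in zip.Entries)
                {
                    if (!entry.Name.Equals("definition.json", StringComparison.OrdinalIgnoreCase))
                        continue;

                    using (Stream stream = entry.Open())
                    {
                        // Read the definition, swap the folder path, write it back.
                        string json;
                        using (var ms = new MemoryStream())
                        {
                            stream.CopyTo(ms);
                            json = Encoding.UTF8.GetString(ms.ToArray());
                        }

                        json = json.Replace("/1000 Template/1000 invoices",
                                            $"/{proj} ProjectName/{proj} invoices");

                        byte[] bytes = Encoding.UTF8.GetBytes(json);
                        stream.SetLength(0);
                        stream.Write(bytes, 0, bytes.Length);
                    }
                }
            }
        }
    }
}

Each rewritten .zip can then be imported from the My flows tab exactly as in the manual steps above.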

Related

Check if a file exists on SharePoint Online by file name with Power Automate

I am trying to create a flow in Power Automate Cloud that allows me to extract some files from Google Drive and copy them to a SharePoint folder. But before copying a file, the flow has to check, by file name, that the file does not already exist in the SharePoint folder. I have managed to do the whole flow except the checking.
I would be very grateful for your help.
Thank you very much. Best regards.
A "File exists" action doesn't currently exist, but you can use the "Get file metadata using path" action from the SharePoint connector to do the same sort of thing.
The flow below shows an example of what you can do ...
For demonstration purposes, I've created an array with a list of file names (full path required) that I will then test the existence for.
One of those files doesn't exist.
I then loop through each one of those and try to get the metadata for each file.
If the resulting statusCode from the call to get the metadata is not equal to 200, then we know the file doesn't exist.
Examples ...
To make this work, however, you need to make sure that the Condition step executes after the metadata call no matter what the result is.
Do this by making sure the Configure run after settings are correct ...
So after all of that, all you need to do is deal with the true and false sections of the condition.
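As a rough sketch, assuming the metadata step keeps its default name (adjust the action name in the expression to match your own flow), the condition can compare an expression like the following against 200:
outputs('Get_file_metadata_using_path')?['statusCode']
If it does not equal 200, treat the file as not existing and go ahead with the copy.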

'ClientFilter' validation failed: client view is too loose. each client should include one project only

I am trying to do the same operations from the command line that I do with the Perforce GUI.
When I tried to do p4 edit on any file, it said Client '<host-name>' unknown - use 'client' command to create it.
So, I tried p4 client -o <my-workspace-name> | p4 client -i but this returned:
Error in client specification.
'ClientFilter' validation failed:
======================================================
client view is too loose !!!
each client should include one project only
======================================================
I have no experience with the p4 tool. Please help me understand what it means by "client view is too loose".
This is a trigger that your admin has set up. Based on the error, I surmise that they want you to set up your client's View to only include one project (they want to keep you from syncing down the entire world when you set up your new client).
To create a new client, run:
p4 set P4CLIENT=your_workspace_name
p4 client
and take a look at the form that pops up. The View field defines which part of the depot(s) your client will "see" and operate on. According to the error message, your admin wants you to restrict this to a single "project" -- I don't know what that means in this context (maybe it means just a single depot, or maybe a single folder in a particular depot?) so you might need to talk to your admin about it, or maybe browse around in the GUI and try to glean from context clues (i.e. names of directories) what that message is referring to.
Just to use a made-up example, if you have a few different depots your default ("loose") View might look like:
//depot_one/... //your_workspace_name/depot_one/...
//mumble/... //your_workspace_name/mumble/...
//widgets/... //your_workspace_name/widgets/...
and if you want to only map the project //mumble/core to your workspace root, you'd change that View to:
//mumble/core/... //your_workspace_name/...
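If you would rather script this than edit the form interactively, a rough sketch (the workspace name and View line are placeholders taken from the made-up example above) is to dump the spec to a file, tighten the View, and feed it back in:
p4 client -o your_workspace_name > clientspec.txt
(edit the View: field in clientspec.txt so it maps only one project, e.g. //mumble/core/... //your_workspace_name/...)
p4 client -i < clientspec.txt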

Email alert for inserting/updating/removing files inside a folder in Windows 7

We have the above-mentioned case at our company: for a certain folder (whether it is shared or not), they need to know if any files are inserted, removed, or updated, and to send an email alert periodically for those actions. Any thoughts and ideas, please?
You can fairly easily create an application in C# that monitors a folder using the FileSystemWatcher class:
https://msdn.microsoft.com/en-us/library/system.io.filesystemwatcher%28v=vs.110%29.aspx
With that you can then have it perform whatever actions you want.
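A minimal sketch of that approach (the watched path, SMTP host, and email addresses below are placeholders you would replace, and a real service would likely batch notifications rather than send one mail per event):

// Minimal sketch: watch a folder and email a notification on create/change/delete.
// The watched path, SMTP host, and addresses are placeholders.
using System;
using System.IO;
using System.Net.Mail;

class FolderAlert
{
    static void Main()
    {
        var watcher = new FileSystemWatcher(@"C:\SharedFolder")
        {
            IncludeSubdirectories = true,
            EnableRaisingEvents = true
        };

        watcher.Created += (s, e) => Notify($"Created: {e.FullPath}");
        watcher.Changed += (s, e) => Notify($"Changed: {e.FullPath}");
        watcher.Deleted += (s, e) => Notify($"Deleted: {e.FullPath}");
        watcher.Renamed += (s, e) => Notify($"Renamed: {e.OldFullPath} -> {e.FullPath}");

        Console.WriteLine("Watching... press Enter to stop.");
        Console.ReadLine();
    }

    static void Notify(string message)
    {
        // In a real service you would collect these and send a periodic digest
        // instead of one mail per event.
        using (var client = new SmtpClient("smtp.example.local"))
        {
            client.Send("alerts@example.local", "team@example.local",
                        "Folder change detected", message);
        }
    }
}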

Windows folder for temporary files created by my application

I'm trying to decide on the 'correct' folder in which to store documents and logs created by my Windows Forms application. The application is used in education and has all paths held in SQL Server. Some paths (like the log file paths, which are shared) are accessible on the network, but specifically for temporary documents, where should I default the storage to? I've recently tried the Users/username/AppData/ folder, but I seem to be getting differing results after installation; so far I have put this down to users' credentials, as often in schools they can do whatever they want (yes, I know, shocking indeed).
If anyone can point me in the direction of an MSDN article or knows better, please reply - Thanks.
** Edit 10/09/2013 - Sorry all, I should be more explicit. I'm looking for the folder / structure Microsoft has designed for this sort of activity. My application already provides users with the ability to create their own working directories (there are several required), but I'm keen to use the 'correct' locations for this sort of activity... I thought the right place would be c:/Users/USERNAME/Appdata/APPLICATION FOLDER/ but, as I mention, I've come across a few access-rights issues when users install the application.... hope that explains better - thanks
To create a temporary directory you can use something like this:
public string GetTempDirectory() {
    // Combine the user's temp folder with a random name so the directory is unique.
    string path = Path.Combine(Path.GetTempPath(), Path.GetRandomFileName());
    Directory.CreateDirectory(path);
    return path;
}
Path class info
Directory class info
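For the broader question in the edit about where per-user application data should live, here is a small sketch of the locations Windows intends for this; "MyCompany" and "MyApp" are placeholder names:

// Sketch: the folders Windows intends for per-user app data and temp files.
// "MyCompany" and "MyApp" are placeholder names.
using System;
using System.IO;

class AppFolders
{
    static void Main()
    {
        // Roaming per-user data (follows the user between machines on a domain).
        string roaming = Path.Combine(
            Environment.GetFolderPath(Environment.SpecialFolder.ApplicationData),
            "MyCompany", "MyApp");

        // Local per-user data (stays on the machine; good for caches and logs).
        string local = Path.Combine(
            Environment.GetFolderPath(Environment.SpecialFolder.LocalApplicationData),
            "MyCompany", "MyApp");

        // Per-user temp folder for short-lived documents.
        string temp = Path.GetTempPath();

        Directory.CreateDirectory(roaming);
        Directory.CreateDirectory(local);

        Console.WriteLine(roaming);
        Console.WriteLine(local);
        Console.WriteLine(temp);
    }
}

These per-user locations are normally writable without elevated rights, which should avoid the access-rights issues mentioned after installation.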

Using PDI transformation in Pentaho BI server as data source for report parameters

Any advice on how to use a PDI transformation as a data source for report parameters in the BI server's console?
I've uploaded the .prpt reports to the BI server, but I get a message "Error parsing parameter information". The .prpt and .ktr files are both in the same directory.
Actually, I just realized that the issue could be solved by adding the transformation (KTR) as a resource. In this case, one can use the File > Resources menu selection. In the dialog, select the transformation you wish to import and pick the text/xml format. Give the resource a name and save it. You must save your PRPT file again (File > Save).
The caveat here is that the transformation should be in the same folder as the PRPT file. Then, in the data sources, don't select the transformation via its folder path, but use the name of the resource assigned in the previous step (there is no drop-down menu for looking through the files). You have to know the exact name of the resource in order to do so.
Check the logs carefully. I suspect it's not finding the KTR. When you select the KTR in the prpt it usually (annoyingly) saves the whole path, so it's probably the full path to the ktr as defined on your dev box.
This does work, so do persevere!
