How to simply test file download and upload? - go

I have two APIs for file download and upload (/static/{filename} for downloading, /upload for uploading). For consistency, I need a test file to verify that both of them work. I checked the official fs_test.go, but it is too massive to reuse. Is there a simpler way to do it?

Do you need a unit test? If you are supposed to test saving the file, you can use brute force (see the sketch after this list):
Create file
Check if it exists (optional: check the file data)
Delete file
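Here is a minimal sketch of that brute-force round trip as a Go test, assuming your handlers are registered on an http.Handler returned by a hypothetical newRouter(), that /upload accepts a multipart form field named "file", and that /static/{filename} serves the stored file back. Adjust the field name and router constructor to match your code.

package main

import (
	"bytes"
	"io"
	"mime/multipart"
	"net/http"
	"net/http/httptest"
	"testing"
)

func TestUploadThenDownload(t *testing.T) {
	// newRouter() is a placeholder for whatever registers /upload and /static/{filename}.
	srv := httptest.NewServer(newRouter())
	defer srv.Close()

	// 1. Create the test file content in memory instead of touching the disk.
	want := []byte("hello, test file")

	// Build a multipart body with a single "file" field (assumed field name).
	var body bytes.Buffer
	mw := multipart.NewWriter(&body)
	fw, err := mw.CreateFormFile("file", "test.txt")
	if err != nil {
		t.Fatal(err)
	}
	if _, err := fw.Write(want); err != nil {
		t.Fatal(err)
	}
	mw.Close()

	// 2. Upload it.
	resp, err := http.Post(srv.URL+"/upload", mw.FormDataContentType(), &body)
	if err != nil {
		t.Fatal(err)
	}
	resp.Body.Close()
	if resp.StatusCode != http.StatusOK {
		t.Fatalf("upload: got status %d, want 200", resp.StatusCode)
	}

	// 3. Download it again and check that the bytes round-trip.
	resp, err = http.Get(srv.URL + "/static/test.txt")
	if err != nil {
		t.Fatal(err)
	}
	defer resp.Body.Close()
	got, err := io.ReadAll(resp.Body)
	if err != nil {
		t.Fatal(err)
	}
	if !bytes.Equal(got, want) {
		t.Fatalf("downloaded %q, want %q", got, want)
	}

	// 4. Delete: if your handler writes to a real directory, remove the file here
	//    (e.g. os.Remove or t.Cleanup); the path depends on your implementation.
}

httptest.NewServer spins up a real listener for the duration of the test, so the same round trip exercises both endpoints without a separate fixture file.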

Related

check if a file exists on SharePoint Online by file name with Power Automate

I am trying to create a flow in Power Automate Cloud that extracts some files from Google Drive and copies them to a SharePoint folder. But before copying a file, the flow has to check, by file name, that the file does not already exist in the SharePoint folder. I have managed to do the whole flow except the checking.
I would be very grateful for your help.
Thank you very much. Best regards.
A File exists action doesn't currently exist but you can use the Get file metadata using path action from the SharePoint connector to do the same sort of thing.
The flow below shows an example of what you can do ...
For demonstration purposes, I've created an array with a list of file names (full path required) that I will then test the existence for.
One of those files doesn't exist.
I then loop through each one of those and try and get the metadata for each file.
If the resulting statusCode from the call to get the metadata is not equal to 200, then we know the file doesn't exist.
Examples ...
To make this work, however, you need to make sure that the Condition step will execute after the metadata call no matter what the result is.
Do this by making sure the Configure run after settings are correct ...
So after all of that, all you need to do is deal with the true and false sections of the condition.
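Outside of Power Automate, the same check-by-status-code idea looks like this as a rough Go sketch; the metadata endpoint, query parameter and site URL below are hypothetical placeholders, not the real SharePoint connector.

package main

import (
	"fmt"
	"net/http"
	"net/url"
)

// fileExists probes a (hypothetical) metadata endpoint for the given
// server-relative path and reports existence based on the status code:
// 200 means the file is there, anything else (typically 404) means it is not.
func fileExists(baseURL, path string) (bool, error) {
	resp, err := http.Get(baseURL + "/getfilemetadata?path=" + url.QueryEscape(path))
	if err != nil {
		return false, err
	}
	defer resp.Body.Close()
	return resp.StatusCode == http.StatusOK, nil
}

func main() {
	for _, p := range []string{"/Shared Documents/report.pdf", "/Shared Documents/missing.pdf"} {
		ok, err := fileExists("https://example.sharepoint.com", p)
		if err != nil {
			fmt.Println(p, "error:", err)
			continue
		}
		fmt.Println(p, "exists:", ok)
	}
}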

delete uploaded file from folder using JMeter

I am sending an HTTP request to upload a file, and the request is set up like this:
(screenshot: uploadFile)
The Directory Listing plugin is pointing to a directory with all the files, and the request picks one file at a time. It works fine when run with one thread, but when I run multiple threads, I see that an already uploaded file is picked again for upload, which leads to an error.
I have added a Regular Expression Extractor to get the filename from the request body, like this:
(screenshot: extract-filename-from-requestbody)
Then I am trying to use a Beanshell post-processor script to either delete the file from the folder or move it to a different folder, but I have not been successful. I need some help with this.
The first issue is that I am not sure whether I am extracting the value the right way. The value needs to come from the request body, not the request header, but I don't see that option in the extractor.
Second, I am unable to use/retrieve the value from the extractor. I tried vars.get, vars.getObject and simply "${fileName}". Nothing works.
I don't think that deleting the file will help, because the Directory Listing Config reads the folder at the beginning of the test (see the Execution Order chapter), so no matter whether the file is physically present or not, JMeter will try to upload it.
If you want to get unique files without repetition, just untick the "Rewind on end of list" box:
This way each virtual user will read the next value, so there will be no duplicates. When the last file has been used, the test will stop.
More information: Introducing the Directory Listing Config Plugin on JMeter
Also, going forward, consider using JSR223 Test Elements and the Groovy language instead of Beanshell; it has been the recommended option since JMeter 3.1.

VS Load testing: delete file test case

I am testing my SharePoint site by uploading/downloading/deleting files using a web test in Visual Studio Ultimate.
The upload and download file test cases are working as expected.
But the delete file test cases are not working; they fail with an error.
I even tried uploading and deleting the file in the same test case, but that also failed.
Kindly assist with how to perform a delete file test case in a VS load test.
Regards,
David.
Load testing is all about replaying the same scenario multiple times. Note the difference between file upload, download and delete test cases. You can upload a file and download the same file multiple times. However you cannot just delete the same file multiple times.
The problem with the delete test case is likely caused by your script trying to delete the same file multiple times which is causing a SharePoint error.
If this is the case, to fix your script you need to correlate the identifier of the file you're trying to delete.
Currently, the request that triggers the file delete transaction contains a constant identifier of the file you deleted during recording. You need to replace it with an extractor for the identifier of the file you will delete during test execution. For example, if you record a test scenario where you delete the top file in the webpage's list, create an extractor from that webpage's response that retrieves the top file's identifier, then use it in the subsequent request that deletes the file.
It looks like Visual Studio did not auto-correlate this test case, so you have to do it manually.
Keep in mind that the "identifier" here can be more than just a single id. It depends on your SharePoint version and some configuration settings.
If you cannot figure out how to find the file identifier in the response, or how to extract it, here's a hint: our load testing tool StresStimulus auto-correlates SharePoint. Download it, record the delete file test case, and examine which parameter the file-delete request uses and which extractor feeds it.
After that, recreate them in VS.
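If it helps to see what the correlation looks like outside the VS recorder, here is a rough Go sketch of the idea: fetch the list response, extract the top file's identifier, and feed it into the delete request instead of replaying a constant captured at recording time. The URLs and the regular expression are hypothetical placeholders, not real SharePoint endpoints.

package main

import (
	"fmt"
	"io"
	"net/http"
	"regexp"
)

func main() {
	base := "https://example.sharepoint.com/sites/test" // hypothetical site URL

	// 1. Request the page (or API) that lists the files.
	resp, err := http.Get(base + "/files")
	if err != nil {
		panic(err)
	}
	body, _ := io.ReadAll(resp.Body)
	resp.Body.Close()

	// 2. The "extractor": pull the identifier of the top file out of the response.
	//    The pattern is a placeholder; the real one depends on your SharePoint
	//    version and page markup.
	re := regexp.MustCompile(`data-itemid="(\d+)"`)
	m := re.FindSubmatch(body)
	if m == nil {
		panic("no file identifier found in the response")
	}
	itemID := string(m[1])

	// 3. Feed the extracted identifier into the subsequent delete request
	//    instead of the constant value captured at recording time.
	req, _ := http.NewRequest(http.MethodDelete, base+"/api/files/"+itemID, nil)
	res, err := http.DefaultClient.Do(req)
	if err != nil {
		panic(err)
	}
	res.Body.Close()
	fmt.Println("delete of item", itemID, "returned", res.Status)
}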

How To Write Your Own FTP Uploader with Automator

Is it possible to write your own application/command that will automatically upload your files to your FTP server?
Basically, the flow I want to achieve is this:
My app/action/whatever is scheduled to upload at a certain time
When that time arrives, the files in my specified folder will be uploaded
Of course, to upload, some data must be set, like the username, password, FTP server, etc.
After my files have been uploaded, the local files will be wiped out.
I don't exactly know where to start. Can someone help me with this? Thank you.
Take a look at http://editkid.com/upload_to_ftp/. It comes with the source code so you can modify it to fit your needs. You can combine it with an Automator action to delete the files after upload.
To schedule it, see http://smallbusiness.chron.com/schedule-automator-tasks-mac-os-x-39132.html.
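If you would rather script the flow yourself than rely on a prebuilt Automator action, here is a minimal sketch in Go using the third-party github.com/jlaffaye/ftp client (an assumption on my part; any FTP library would do). It uploads every file in a local folder and deletes each local copy only after a successful upload; the scheduled run can then be handled by launchd or cron.

package main

import (
	"log"
	"os"
	"path/filepath"
	"time"

	"github.com/jlaffaye/ftp"
)

func main() {
	const (
		server   = "ftp.example.com:21" // placeholder server
		user     = "username"           // placeholder credentials
		password = "password"
		localDir = "/Users/me/outbox" // folder to upload and then wipe
	)

	conn, err := ftp.Dial(server, ftp.DialWithTimeout(10*time.Second))
	if err != nil {
		log.Fatal(err)
	}
	defer conn.Quit()

	if err := conn.Login(user, password); err != nil {
		log.Fatal(err)
	}

	entries, err := os.ReadDir(localDir)
	if err != nil {
		log.Fatal(err)
	}
	for _, e := range entries {
		if e.IsDir() {
			continue
		}
		path := filepath.Join(localDir, e.Name())
		f, err := os.Open(path)
		if err != nil {
			log.Printf("skip %s: %v", path, err)
			continue
		}
		// Upload the file, then remove the local copy only if the upload succeeded.
		err = conn.Stor(e.Name(), f)
		f.Close()
		if err != nil {
			log.Printf("upload %s failed: %v", path, err)
			continue
		}
		if err := os.Remove(path); err != nil {
			log.Printf("could not delete %s: %v", path, err)
		}
	}
}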

Serving Files in Zend Framework MVC

What is the best practice when serving files from the Zend Framework MVC? These files have to be served from the MVC as they are protected.
I know you can read the file in and place it into the Response object, but this seems like bad practice, as you would be reading the entire file into memory and then serving it. Right now I usually do:
header('Content-type: image/jpeg');
fpassthru(fopen($path, 'rb'));
exit;
But this also doesn't seem right as I'm stopping the execution of the script. Any suggestions?
I see nothing wrong with just exit(); What you will need to be careful of is any output buffering layers you may have on (gzip compression, etc.). Large files could blow up those buffers pretty quickly, so you'll want to close them out and potentially 'chunk' your output with a fopen/fread loop.
I would suggest building a super-simple script for retrieving files based on a ticket system: as in a CMS, you generate a ticket in the DB (filename, unique hash) and then redirect to the super-simple file-retrieving script (file.php?hash=asd52ad3as1g5). It gets the hash from the query string, fetches the real filename based on it, and pushes that to the output using fpassthru as you have written. The hash needs to be unique and hard to guess...
You could try using the X-Sendfile header. It is supported by lighttpd and newer versions of Apache. Basically, the webserver will replace the output of the script with the file you specified. The downside is that this is specific to the webserver's configuration, so you may be on a host that doesn't support it.
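For comparison with the Go question at the top of the page, here is a rough sketch of both approaches on the download side in Go; the directory, routes and front-end configuration are illustrative assumptions. Either stream the protected file with http.ServeFile (which copies in chunks and never loads the whole file into memory), or hand the transfer off to the webserver with an X-Sendfile / X-Accel-Redirect header.

package main

import (
	"log"
	"net/http"
	"path/filepath"
)

// streamHandler checks access (omitted here) and then streams the file;
// http.ServeFile copies in chunks and handles Range requests, so the
// whole file is never held in memory.
func streamHandler(w http.ResponseWriter, r *http.Request) {
	// ... authorization check for the protected file goes here ...
	name := filepath.Base(r.URL.Path) // avoid path traversal
	http.ServeFile(w, r, filepath.Join("/srv/protected", name))
}

// sendfileHandler only sets a header and lets Apache (mod_xsendfile) or
// lighttpd do the actual I/O; nginx uses X-Accel-Redirect with an internal
// location instead. This works only if the front-end server is configured for it.
func sendfileHandler(w http.ResponseWriter, r *http.Request) {
	// ... authorization check ...
	name := filepath.Base(r.URL.Path)
	w.Header().Set("X-Sendfile", filepath.Join("/srv/protected", name))
}

func main() {
	http.HandleFunc("/static/", streamHandler)
	http.HandleFunc("/sendfile/", sendfileHandler)
	log.Fatal(http.ListenAndServe(":8080", nil))
}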
