We have two environments here: a development server and my local machine, both running an application with the same code (the same branch is checked out locally as is deployed on dev).
One of the APIs there downloads a CSV file. Both environments return the same response (checked with a diff checker). But as soon as I save the CSV file, double quotes in the file are replaced with doubled double quotes (""). The file also does not open properly in Excel (even after changing the delimiters).
This is the file I saved using the API on my local machine.
This is the one from the dev server.
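For reference, doubled double quotes are the standard way CSV escapes a literal quote inside a quoted field (RFC 4180), so a row like this is valid and should render as he said "hello" in a conforming parser:

    value1,"he said ""hello""",value3

If the quotes are being doubled again on save, the content may be getting CSV-quoted a second time.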
I've set up the pipeline and it works (I followed this documentation: https://learn.microsoft.com/en-us/azure/connectors/connectors-create-api-ftp); it downloads the zip file and loads it into blob storage.
However, the resulting zip file is corrupted: it is slightly different in size from the original file.
I set "Infer Content Type" to Yes; I also tried setting it to No, but that didn't change the result.
I tried both hardcoded and dynamic naming.
I would like to know if there is a way to read a file on a remote server using its URL. For example, I have been given a URL to a file on a remote server, and I need to read it using bash commands or any other tool to retrieve data (e.g. view the first 50 rows and write them to a file) without downloading the whole file to the local system.
The use case is to avoid downloading/uploading huge files located on the remote server to local systems, and instead access the file content directly through the URL.
Any resources on this would help.
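For illustration only, here is a minimal Java sketch of the idea (the URL is hypothetical): it issues an HTTP range request so only the first 64 KB travel over the wire, then writes the first 50 lines to a local file. This only works when the file is served over HTTP(S) by a server that honors Range headers; it is the same mechanism behind curl's --range option.

    import java.io.BufferedReader;
    import java.io.InputStreamReader;
    import java.io.PrintWriter;
    import java.net.HttpURLConnection;
    import java.net.URL;

    public class RemoteHead {
        public static void main(String[] args) throws Exception {
            URL url = new URL("https://example.com/data/huge-file.csv"); // hypothetical URL
            HttpURLConnection conn = (HttpURLConnection) url.openConnection();
            // Ask for only the first 64 KB instead of the whole file;
            // requires the server to support HTTP range requests.
            conn.setRequestProperty("Range", "bytes=0-65535");
            try (BufferedReader in = new BufferedReader(
                     new InputStreamReader(conn.getInputStream()));
                 PrintWriter out = new PrintWriter("first-50-rows.txt")) {
                String line;
                for (int i = 0; i < 50 && (line = in.readLine()) != null; i++) {
                    out.println(line);
                }
            }
        }
    }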
I have an Excel file at a shared location in a Windows environment. My DataStage server is on a Unix box. I want to read the Excel file and load the data into a Teradata table. I need help with reading the Excel file. One option is to transfer the file to the server and access it from there, but can I read the Excel file directly from the shared folder in the Windows environment?
I tried using FTP first in DataStage, but I am getting the error below.
    <FTP_Enterprise_18> Error occurred during initializeFromArgs().
    <FTP_Enterprise_18> uri : ftp://server/path/file.xlsx is not valid remote file.
    <main_program> Creation of a step finished with status = FAILED.
No, it is not possible to read it from a remote location, so you will need to transfer it first (unless the shared location is a SAMBA mount on the Unix machine).
You can use the "Unstructured Data" stage to read the Excel file once it is on the Unix server.
I have a spring-batch job scanning an SFTP server at a given interval. When it finds a new file, it starts processing it.
This works fine in most cases, but there is one case where it doesn't:
The user starts uploading a new file to the SFTP server.
The batch job checks the server and finds the new file.
It starts processing it.
But since the file is still being uploaded, the job hits an unexpected end of input block during processing, and an error occurs.
How can I check that a file has been fully uploaded to the SFTP server before the batch job starts processing it?
Locking files while uploading / Upload to temporary file name
You may have an automated system monitoring a remote folder, and you want to prevent it from accidentally picking up a file that has not finished uploading yet. As the majority of SFTP and FTP servers (WebDAV being an exception) do not support file locking, you need to prevent the automated system from picking up the file by other means.
Common workarounds are:
Upload a “done” file once the upload of the data files finishes, and have the automated system wait for the “done” file before processing the data files. This is an easy solution, but it won't work in a multi-user environment.
Upload data files to a temporary (“upload”) folder and move them atomically to the target folder once the upload finishes.
Upload data files under a distinct temporary name, e.g. with a .filepart extension, and rename them atomically once the upload finishes. Have the automated system ignore the .filepart files (sketched below).
(Quoted from here.)
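If the uploading side is under your control, here is a minimal sketch of the rename workaround using JSch (assuming an already-connected ChannelSftp; the paths are hypothetical):

    import com.jcraft.jsch.ChannelSftp;

    public class SafeUpload {
        // Upload under a temporary name the consumer ignores,
        // then atomically rename so the file "appears" only when complete.
        public static void upload(ChannelSftp sftp, String localPath) throws Exception {
            String tempName  = "/incoming/data.csv.filepart"; // ignored by the batch job
            String finalName = "/incoming/data.csv";          // picked up by the batch job
            sftp.put(localPath, tempName);    // the slow transfer happens here
            sftp.rename(tempName, finalName); // atomic on most SFTP servers
        }
    }

On the consuming side, configure the file filter to skip *.filepart names.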
We had a similar problem. Our solution was to configure the spring-batch cron trigger to fire every 10 minutes (we could have used 5 minutes, since the file transfer took less than 3), and then read/process all the files created more than 10 minutes before the run. We assume the FTP operation completes within 3 minutes. This gave us some additional flexibility, for example when the spring-batch app was down.
For example, if the batch job triggers at 10:20 AM, we read all the files created before 10:10 AM; likewise, the job that runs at 10:30 reads all the files created before 10:20.
Note: once a file has been read, you need to either delete it or move it to a history folder to avoid duplicate reads.
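A minimal sketch of the cutoff check described above, assuming the files have already been mirrored to a local staging directory (the directory name and the 10-minute window are assumptions):

    import java.io.File;
    import java.util.ArrayList;
    import java.util.List;

    public class CutoffFilter {
        // Returns files last modified more than cutoffMinutes ago, i.e. files
        // whose transfer should have finished by the time this run starts.
        public static List<File> readyFiles(File dir, long cutoffMinutes) {
            long cutoff = System.currentTimeMillis() - cutoffMinutes * 60_000L;
            List<File> ready = new ArrayList<>();
            File[] files = dir.listFiles();
            if (files != null) {
                for (File f : files) {
                    if (f.isFile() && f.lastModified() < cutoff) {
                        ready.add(f);
                    }
                }
            }
            return ready;
        }
    }

For the 10:20 AM run above, readyFiles(new File("/incoming"), 10) would return only the files finished before 10:10 AM.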
I am new to JMeter. I prepared some JMeter scripts to move a file from a local directory to a remote FTP directory, and I succeeded using the JMeter FTP sampler. Now I am facing a challenge in changing the file name every time before I put it in the remote directory: I want to issue multiple FTP requests with different file names.
Is there any way I can change the file name in every FTP request? This is JMeter version 2.13.
Thanks,
Ajeesh
Do you have a test plan that looks something like the following?
If you right-click on "FTP Request" in the tree in the left-hand pane, you can select "Duplicate" to add another FTP request to the test plan.
You can then change the remote/local file in the right-hand pane for the newly added FTP request.
You could use a CSV configuration element; this discusses the same problem, but for HTTP.
First you need to add an FTP sampler:
And you need to load the file names from a CSV configuration file:
JMeter will consume one CSV entry per iteration and substitute it wherever the ${files} variable is referenced.
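For example, a CSV Data Set Config with "Variable Names" set to files and "Filename" pointing at a list like this (the file names are hypothetical):

    upload_01.txt
    upload_02.txt
    upload_03.txt

Then put ${files} in the FTP Request's "Local File" and/or "Remote File" field; each loop iteration reads the next line, so every request uses a different file name.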