I have many files that I get from the sensor, and more arrive every hour. The filenames consist of 3 parts, rain_date_time. How can I open each file recursively, get what's inside it, and add that to a database? I have found a way to read the files one by one, yet I face difficulty in reading them recursively.
This is my code:
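Since the snippet itself isn't shown, here is a minimal sketch of one possible approach instead, assuming plain-text files named like rain_20200705_1230.txt and an SQLite database (both are assumptions):

    import sqlite3
    from pathlib import Path

    DATA_DIR = Path("data")  # hypothetical folder the sensor writes into
    db = sqlite3.connect("rain.db")
    db.execute("CREATE TABLE IF NOT EXISTS readings"
               " (name TEXT PRIMARY KEY, date TEXT, time TEXT, body TEXT)")

    # rglob() walks the tree recursively, so files in subfolders are found too;
    # rerunning the script picks up files that arrived since the last run
    for path in DATA_DIR.rglob("rain_*"):
        _, date, time = path.stem.split("_")  # the 3 filename parts: rain_date_time
        # INSERT OR IGNORE skips files that were already loaded on an earlier run
        db.execute("INSERT OR IGNORE INTO readings VALUES (?, ?, ?, ?)",
                   (path.name, date, time, path.read_text()))
    db.commit()

Running it on a schedule (cron, Task Scheduler) handles the hourly arrivals without tracking anything yourself, since already-loaded files are ignored.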
I have to upload around 40-50 files (same extension) to FTP every day, into separate folders where those files belong.
I'm a complete newbie at scripting; I just got to know rclone, and it's amazing what it can do.
So I'm wondering: is there any script for rclone to work out the destination folder automatically for the files to be uploaded, based on their names? More precisely, based on the numbers in the file name:
The filename's 2nd and 3rd digits are the same as the destination folder's last two digits.
The destination folders are in different places, but under the same root folder.
Is there any way to ask rclone to check the 2nd and 3rd characters of each file waiting to be uploaded and, based on those two numbers, upload it to the directory whose name ends in those two numbers?
For example:
50321_XXXXX.txt -----goes_to-----ftp:/xxxx/yyyy/zzzz/nn03/
51124_XXXXX.txt -----goes_to-----ftp:/xxxx/wwww/kkkk/nn11/
53413_XXXXX.txt -----goes_to-----ftp:/xxxx/dddd/aaaa/nn34/
Could you help me with where to go?
Thank you for your answers.
Nothing. I don't know where to start.
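One place to start is wrapping rclone in a small script. A rough Python sketch of the idea (the remote root, the local staging folder, and the assumption that every target directory already exists and ends with the two digits are all mine):

    import subprocess
    from pathlib import Path

    REMOTE_ROOT = "ftp:/xxxx"       # root folder from the examples above
    UPLOAD_DIR = Path("to_upload")  # hypothetical local folder holding the .txt files

    # list every directory under the root once (recursively, directories only)
    dirs = subprocess.run(
        ["rclone", "lsf", "-R", "--dirs-only", REMOTE_ROOT],
        capture_output=True, text=True, check=True,
    ).stdout.splitlines()

    for f in UPLOAD_DIR.glob("*.txt"):
        digits = f.name[1:3]  # 2nd and 3rd characters, e.g. "03" in 50321
        # pick the directory whose name ends with those two digits, e.g. .../nn03/
        dest = next(d for d in dirs if d.rstrip("/").endswith(digits))
        subprocess.run(["rclone", "move", str(f), f"{REMOTE_ROOT}/{dest}"],
                       check=True)

Listing the tree once up front keeps it to one rclone call per file; if no folder matches, the next() call raises an error, which is probably what you want for an unexpected filename.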
I'm new to Talend, and I'm having trouble tidying up after I have looped through files and loaded them into Postgres.
The load works, and it moves all but one of the files. If there are 5 files in the folder it archives 4, and if there is one file it doesn't archive any. It always leaves one file behind in the folder and gives a permissions error.
I have tried various configurations of this job: with and without the tUnite, with a second loop (as shown), and with just a move as part of the main job flow. File locking is the common theme across all of the approaches I've tried. This is the current error I get:
tFileCopy_1 C:\Users\stuar\Documents\vb-stock-20200705.csv -> C:\Users\stuar\Documents\Archive\vb-stock-20200705.csv: The process cannot access the file because it is being used by another process.
C:\Users\stuar\Documents\vb-stock-20200705.csv tFileCopy_1 - The source file "C:\Users\stuar\Documents\vb-stock-20200705.csv" could not be removed from the folder because it is open or you only have read-only rights.
Should I be splitting this into 2 jobs called from a parent, with the load in one and the move in a separate job? Or writing to a new file after the tUnite and loading that? It feels like the load is still hanging on to the file I am trying to move.
OK, I solved it. I needed to:
add a tFileOutputDelimited after the tUnite, which merges all the files into a new file
add a tFileDelete at the end of my tPostJob to delete the staging file created in the previous step
This all worked fine.
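For anyone wanting the shape of that fix outside Talend, here is a Python sketch of the same pattern (the paths come from the error above; the staging file name is made up): merge everything into one staging file, load only that file, and the originals are never held open when it is time to move them.

    import shutil
    from pathlib import Path

    SRC = Path(r"C:\Users\stuar\Documents")
    ARCHIVE = SRC / "Archive"
    STAGING = SRC / "staging.csv"  # hypothetical merged file (the tFileOutputDelimited step)

    files = sorted(SRC.glob("vb-stock-*.csv"))

    # 1. merge the inputs into the staging file (assumes headerless CSVs)
    with STAGING.open("wb") as out:
        for f in files:
            with f.open("rb") as src:
                shutil.copyfileobj(src, out)

    # 2. load STAGING into Postgres here; no original file is open any more

    # 3. archive the originals and drop the staging file (the tPostJob step)
    for f in files:
        shutil.move(str(f), str(ARCHIVE / f.name))
    STAGING.unlink()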
I am trying to download a few files using 3 threads. My requirement is to download the files on 3 threads so that all files are downloaded 3 times into 3 different folders and don't overwrite each other. I am using __counter to append 1, 2, 3 to the folder names. The problem is that whether I set the thread count to 1, 2, or 3, it behaves the same in every scenario: it always creates two folders, Folder1 and Folder2; Folder1 gets all the files, and Folder2 only gets the last file, with a size of 0 KB.
Number of threads = 1
Attaching what I have tried so far.
Please try without the counter function, with a prefix, and with two threads. I am guessing this based on the information below:
https://jmeter.apache.org/usermanual/component_reference.html#Save_Responses_to_a_file
Please note that Filename Prefix must not contain Thread related data,
so don't use any Variable (${varName}) or functions like
${__threadNum} in this field
Or try to keep some delay/pacing between two threads.
Hope this helps.
Update:
Just give the folder path and file name without the extension. It will save the file with the extension. I tried with an image and it was saved as Myfile1.jpeg.
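For comparison, the behaviour being asked for (every thread saving every file into its own folder, so nothing overwrites) looks like this as a plain Python sketch outside JMeter, with made-up URLs:

    import urllib.request
    from pathlib import Path
    from threading import Thread

    URLS = ["https://example.com/a.jpg", "https://example.com/b.jpg"]  # made-up list

    def worker(n):
        folder = Path(f"Folder{n}")  # one folder per thread: Folder1, Folder2, Folder3
        folder.mkdir(exist_ok=True)
        for url in URLS:
            # each thread downloads the full list into its own folder
            urllib.request.urlretrieve(url, str(folder / url.rsplit("/", 1)[-1]))

    threads = [Thread(target=worker, args=(n,)) for n in (1, 2, 3)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()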
I'd like to make one file representing (linking to) a bunch of files, something like a named pipe does on Linux. The motivation is to avoid concatenating the files (not to create a new file when I have the originals and want to keep them), so the data is not duplicated. For example, I want to use this to load videos from a camera, which are split into pieces of approx. 2 GB.
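The named-pipe idea itself can be scripted. A Python sketch (Linux/macOS only; the file names are made up): whoever reads joined.mp4 sees the parts back to back, and no data is copied onto disk.

    import os
    import shutil

    PARTS = ["video_part1.mp4", "video_part2.mp4"]  # the ~2 GB camera chunks
    PIPE = "joined.mp4"

    os.mkfifo(PIPE)  # create the named pipe
    try:
        # opening a FIFO for writing blocks until a reader opens the other end,
        # then the parts are streamed through it in order
        with open(PIPE, "wb") as out:
            for part in PARTS:
                with open(part, "rb") as src:
                    shutil.copyfileobj(src, out)
    finally:
        os.remove(PIPE)

The catch is that a pipe is not seekable and is consumed once per writer run, so this suits streaming readers rather than players that jump around in the file.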
I've got an edge case where two files with the same name but different contents are written to the same tarball, which leaves two entries in the tarball. I'm wondering if there's anything I can do to make tar overwrite a file that already exists in the tarball instead of creating another entry with the same name.
No way: the first file has already been written by the time you ask to write the second one, and the stream has advanced past it. Remember, tar files are accessed sequentially.
You should deduplicate before you start writing.
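A sketch of that deduplication with Python's standard tarfile module (the file names are made up): collect the entries first, let the later path win for a duplicate name, then write each name exactly once.

    import tarfile

    files = ["build/app.conf", "override/app.conf", "logs/run.log"]  # made-up inputs

    # deduplicate by archive name before writing; the last occurrence wins
    by_name = {}
    for path in files:
        by_name[path.rsplit("/", 1)[-1]] = path

    with tarfile.open("bundle.tar", "w") as tar:
        for arcname, path in by_name.items():
            tar.add(path, arcname=arcname)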