Bash script behaving differently for different files - bash

I have a bash script that uses awk to process some files that I have downloaded. If I run the script on any of the downloaded files it does not work properly. However, if I transfer the contents of a file into a newly created one, it seems to work as expected. Could it have anything to do with the settings of the files?
I have two files, hotel_12313.dat and hotel_99999.dat. The first one was downloaded and the second one was created by me. If I copy the data from the first file into the second one and execute the script on both of them, the output is different.
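One way to check whether the two files really differ only in their contents, or also in hidden properties such as line endings, encoding, or a byte-order mark (all common in downloaded files), is a quick comparison from the shell. A sketch, using the file names from the question:

file hotel_12313.dat hotel_99999.dat    # reports encoding and CRLF line terminators, if any
cmp hotel_12313.dat hotel_99999.dat     # byte-level comparison; prints nothing if identical
cat -A hotel_12313.dat | head           # GNU cat: shows carriage returns as ^M, tabs as ^I
# If carriage returns turn out to be the difference, they can be stripped with:
# tr -d '\r' < hotel_12313.dat > hotel_12313_unix.dat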

Related

Configure file creation output directory of Batch script, when it is called by a separate script

I currently have a Batch script (a.bat) that produces a file in the current working directory, but I am unable to edit it.
I also have a second Batch script (b.bat) that calls it.
I would like to be able to change the output directory of a.bat, without directly editing it, possibly by implementing some form of configuration within b.bat.
Is this possible?

Uploading files from multiple directories to an SFTP site using Shell Scripting

I'm trying to upload items from multiple local folder locations to an SFTP site. I'm using an existing shell script that I know works for uploads from a single local location, but I can't figure out how to make it work for uploads from multiple local locations.
I'm fairly new to coding and have only basic experience with batch scripting and some minor editing of existing shell scripts, so I would appreciate any help that can be given.
Here's a sample of my existing single-location upload script:
open sftp://(userid):(password)@(sftp site) -hostkey="(hostkey)"
pwd
ls
lcd "(local directory)"
lls
cd (remote directory)
ls
put * -filemask=|*/ ./
exit
This has worked well for us previously. I'm now trying to clean up some of our existing scripts by combining them into one process that runs as an automated task, but I can't figure out how to chain multiple uploads like this together.
Just repeat the upload code for each location:
cd /remote/directory
lcd /local/directory1
put * -filemask=|*/ ./
lcd /local/directory2
put * -filemask=|*/ ./
Though if it's really a WinSCP script, you can use just one command like:
put -filemask=|*/ /local/directory1/* /local/directory2/* /remote/directory/
See the documentation for the put command:
put <file> [ [ <file2> ... ] <directory>/[ <newname> ] ]
...
If more parameters are specified, all except the last one specify set of files to upload. Filename can be replaced with Windows wildcard to select multiple files. To upload all files in a directory, use mask *.
The last parameter specifies target remote directory and optionally operation mask to store file(s) under different name. Target directory must end with slash. ...
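If the list of local folders changes over time, the repeated lcd/put block above can also be generated by a small wrapper script instead of being maintained by hand. A rough sketch in bash (the directory list is a placeholder, the open line reuses the placeholders from the question, and how the generated script is ultimately handed to WinSCP depends on your setup):

#!/bin/bash
# Build a WinSCP script with one lcd/put pair per local directory.
local_dirs=("/local/directory1" "/local/directory2" "/local/directory3")

{
  echo 'open sftp://(userid):(password)@(sftp site) -hostkey="(hostkey)"'
  echo 'cd /remote/directory'
  for dir in "${local_dirs[@]}"; do
    echo "lcd \"$dir\""
    echo 'put * -filemask=|*/ ./'
  done
  echo 'exit'
} > upload_script.txt

# Run the generated script with your usual WinSCP invocation,
# e.g. winscp.com /script=upload_script.txt on Windows.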

concatenate fastq files in a directory

I have a file uploader, resumable.js, which takes a file, breaks it into 1MB 'chunks', and then sends the file over 1MB at a time. So after an upload I have a directory with thousands, sometimes millions, of individual fastq files. I can concatenate all of these 'chunks' back into the file's original state with this line of code:
cat file_name.* > merged.fastq
How would I go about concatenating the files back into their original state without manually running this command on the command line? Should I set up some bash script to handle this issue, maybe a cronjob? Any ideas to solve this issue are greatly appreciated.
ANSWER: For what it's worth, I used this npm module and it works great.
https://www.npmjs.com/package/joiner
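For reference, the reassembly can also be done with a plain bash script, run manually or from a cronjob as suggested in the question. A sketch, assuming every chunk is named <original_name>.<chunk_number> as in the cat example above, that all chunks sit in a single upload directory, and that file names contain no whitespace; the paths are placeholders:

#!/bin/bash
# Reassemble chunked uploads: for every distinct base name,
# concatenate its numbered chunks (in order) into one merged file.
upload_dir=/path/to/uploads     # placeholder
out_dir=/path/to/merged         # placeholder

mkdir -p "$out_dir"
cd "$upload_dir" || exit 1

for base in $(ls | sed 's/\.[0-9]*$//' | sort -u); do
    # ls -v sorts the chunk numbers numerically, so .10 comes after .9
    cat $(ls -v "$base".*) > "$out_dir/$base"
done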

Installer for a .bin file that will run on Ubuntu

I have a .bin file that will consist of 3 files:
1. tar.gz file
2. .zip file
3. install.sh file
For now the install.sh file is empty. I am trying to write a shell script that should be able to extract the .zip file and copy the tar.gz file to a specific location when the *.bin file is executed on an Ubuntu machine. There is a Jenkins job that will pull in these 3 files to create the *.bin file.
My question is: how do I access the tar.gz and .zip files from my shell script?
There are two general tricks that I'm aware of for this sort of thing.
The first is to use a file format that will ignore invalid data and find the correct file contents automatically (I believe zip is one such format/tool).
When this is the case you just run the tool on the packed/concatenated file and let the tool do its job.
For formats and tools where that doesn't work or isn't possible, the general trick is to embed markers in the concatenated file such that the original script ignores the appended data but can operate on itself to "extract" the embedded data, so the other tool can then work on the extracted contents.
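To make the second trick concrete, here is a minimal sketch of what a self-extracting install.sh could look like. It assumes the Jenkins job appends a single tar.gz payload (containing the .zip and the tar.gz) right after the marker line; the marker name and all paths are placeholders:

#!/bin/bash
# install.sh -- everything below the __PAYLOAD_BELOW__ marker is a tar.gz payload.
set -e

# Find the first line after the marker; that is where the payload starts.
payload_line=$(awk '/^__PAYLOAD_BELOW__$/ { print NR + 1; exit }' "$0")

# Extract the payload into a temporary directory.
workdir=$(mktemp -d)
tail -n +"$payload_line" "$0" | tar xzf - -C "$workdir"

# The packaged files are now ordinary files and can be handled as needed,
# e.g. unzip the .zip and copy the tar.gz to its target location.
unzip "$workdir"/*.zip -d /desired/unzip/location      # placeholder
cp "$workdir"/*.tar.gz /desired/install/location/      # placeholder

exit 0
__PAYLOAD_BELOW__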

MeshLab: processing multiple files in meshlabserver

I'm new to using meshlabserver and meshlab in general.
I created the .mlx file and tried to run a meshlabserver command for one file, and it worked. I would like to know how to write a command for hundreds of files.
Thanks in Advance.
I just created a batch file with the necessary loops that calls meshlabserver with the .mlx file. However, one should know that the resulting files will be saved in the same directory where meshlabserver.exe is.
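The answer above describes a Windows batch file; a shell version of the same loop might look roughly like this (the -i/-o/-s flags follow typical meshlabserver command-line usage, and the paths, extension, and script name are placeholders):

#!/bin/bash
# Apply the same .mlx filter script to every mesh in a directory.
script=filter.mlx                  # the .mlx file created earlier
in_dir=/path/to/input/meshes       # placeholder
out_dir=/path/to/output/meshes     # placeholder

mkdir -p "$out_dir"
for mesh in "$in_dir"/*.ply; do
    meshlabserver -i "$mesh" -o "$out_dir/$(basename "$mesh")" -s "$script"
done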
