File as parameter from the shell - Ruby

I would like to pass a file as a parameter to a given Ruby script. The file contains just a number (an ID).
The command to run the Ruby script looks something like:
test export 123456 -o ./path/to/export -x
The number 123456 represents the parameter that I want to pass via a txt/dat file from GitLab.
I tried:
test export "$(< /home/file.dat)" -o ./path/to/export -x
And also:
test export "`cat file.dat`" -o ./path/to/export -x
But I always get the same error:
cat: file.dat: No such file or directory
The very interesting point is that if I run cat on its own before the other command, the content of the file is printed (so the file is found). If I run it "nested" inside the Ruby command, it isn't found.
Any ideas how I can solve this?
Thank you very much.

I'm not sure if this is what you're looking for, but you could pass the name of the file via the command line and read the content of the file within the Ruby script:
id = nil
# ARGV holds all the parameters passed to the script
File.readlines(ARGV[0]).each do |line|
  id = line.chomp # id is set to the value contained in the file you passed as a parameter; chomp strips the trailing newline
end
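With that change, the GitLab job passes the path of the .dat file instead of its expanded contents; a minimal sketch of the invocation, reusing the paths from the question (how the path ends up in ARGV depends on how the script parses its arguments):
# Pass the path of the file; the Ruby side reads the ID out of it
test export /home/file.dat -o ./path/to/export -x
Note that if the script reads ARGV directly, the path from this call lands in ARGV[1] rather than ARGV[0], because export is the first argument it sees.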

Related

Passing bash script variables to .yml file for use as child and subdirectories

I'm trying to write a bash script test-bash.sh that will pass arguments from a text file test-text.txt to a .yml file test-yaml.yml, to be used as child and subdirectory arguments (by that I mean e.g. /path/to/XXXX/child/ and /path/to/XXXX, where XXXX is the argument passed from the .sh script).
I'm really struggling to wrap my head around a way to do this, so here is a small example of what I'm trying to achieve:
test-text.txt:
folder1
folder2
test-yaml.yml:
general:
  XXXX: argument_from_bash_script
  rawdatadir: '/some/data/directory/XXXX'
  input: '/my/input/directory/XXXX/input'
  output: '/my/output/directory/XXXX/output'
test-bash.sh:
#!/bin/bash
FILENAME="test-text.txt"
FOLDERS=$(cat $FILENAME)
for FOLDER in $FOLDERS
do
  ~ pass FOLDER to .yml file as argument ~
  ~ run stuff using edited .yml file ~
done
Where the code enclosed in '~' symbols is pseudocode.
I've found this page on using export; would this work in the looping case above, or am I barking up the wrong tree with this?
Does envsubst solve your problem?
For example, if I have a test-yaml.yml that contains $foo:
cat test-yaml.yml
output:
general:
  $foo: argument_from_bash_script
  rawdatadir: '/some/data/directory/$foo'
  input: '/my/input/directory/$foo/input'
  output: '/my/output/directory/$foo/output'
You can replace $foo inside test-yaml.yml with the shell variable $foo using envsubst:
export foo=123
envsubst < test-yaml.yml
output:
general:
  123: argument_from_bash_script
  rawdatadir: '/some/data/directory/123'
  input: '/my/input/directory/123/input'
  output: '/my/output/directory/123/output'
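Putting that together with the loop from the question, a minimal sketch could look like the following. It assumes the placeholders in test-yaml.yml are written as $XXXX (as with $foo above); the test-rendered-*.yml output names are made up for illustration:
#!/bin/bash
FILENAME="test-text.txt"
# Render the template once per folder listed in the text file
while read -r FOLDER; do
  export XXXX="$FOLDER"                                    # value substituted for $XXXX in the template
  envsubst < test-yaml.yml > "test-rendered-${FOLDER}.yml"
  # ~ run stuff using "test-rendered-${FOLDER}.yml" here ~
done < "$FILENAME"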

How to output variables and values defined in a shell file

I want to print out all the variables defined in the file (not environment variables), so that I can quickly locate the error. I thought of printing through echo, but this is not friendly. Is there any easy way to achieve this?
For example:
var1=${VAR1:-"test1"}
var2=${VAR2:-"test2"}
var3=${VAR1:-"test3"}
var4=${VAR1:-"test4"}
The desired output looks like below:
var1=test1
var2=modify // modified by environment var
var3=test3
var4=test4
I really appreciate any help with this.
In Bash you can:
# List all variables to a file named before
declare -p > before
# source the file
. the_file
# list all variables to a file named after
declare -p > after
# difference between variables before and after sourcing the file
diff before after
You can combine this with env -i bash -c to get a clean environment.
The other way is just to write a parser for your file. A simple sed 's/=.*//' the_file will give you the list of variable names defined in it.
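For completeness, a minimal sketch of the clean-environment variant; the vars.sh file name is an assumption, and PATH is set explicitly because env -i clears it:
# Compare variables before and after sourcing the file, in a clean environment
env -i PATH=/usr/bin:/bin bash -c '
  declare -p > before    # variables before sourcing
  . ./vars.sh            # the file to inspect (assumed name)
  declare -p > after     # variables after sourcing
  diff before after      # added lines are what vars.sh defined (a few bash internals may change too)
'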

Get the full path of the only file in a directory

I have a directory with a single file. The name of the file is randomized, but the extension is fixed.
myDirectory
|----file12321.txt
Is there a one-line way to extract the full path of that file?
MY_FILE=myDirectory/*.txt
Current output:
/home/me/myDirectory/*.txt
Expected:
/home/me/myDirectory/file12321.txt
Use readlink to get the canonicalized path.
MY_FILE=$(readlink -f myDirectory/*.txt)
If you want only the myDirectory/file12321.txt part, you could run a command that lets the shell expand *, like:
MY_FILE=$(printf "%s\n" myDirectory/*.txt)
If it's certain that there is exactly one file, you can just use an array:
MY_FILE=( /home/me/myDirectory/*.txt )
Filename expansion takes place inside an array definition but not when setting the value of a normal variable. And you can just use the array like a normal variable, as that will provide the value of the first element:
$ foo=(1 2 3)
$ echo "$foo"
1
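Applied to the directory from the question, the same pattern expands the glob at assignment time (assuming the glob matches exactly one file):
# The glob is expanded inside the array definition
MY_FILE=( /home/me/myDirectory/*.txt )
echo "$MY_FILE"   # prints /home/me/myDirectory/file12321.txt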
MY_FILE=$(pwd)/$(ls myDirectory/*.txt)
# $MY_FILE == /home/me/myDirectory/file12321.txt

How to make my script use a CSV file that was given in the terminal as a parameter

I tried to google this, but I can't really find the "good words" to get to my solution, so maybe someone here can help me out.
I have a script (let's call it script.rb) that uses File.read to read a CSV file called somefile.csv, and I have another CSV file called somefileV2.csv.
Script.rb
csv_text = File.read('/home/XXX/XXX/XXX/somefile.csv')
Right now it uses somefile.csv by default, but I would like to know if it is possible to make my script use a CSV file that was given in the terminal as a parameter, like:
Terminal
home$ script.rb somefileV2
so that instead of reading the file that is hard-coded in the script, it reads the other CSV file (somefileV2.csv) that is in the directory. It is kind of annoying to change the file manually every time in the script itself.
You can access the parameters (arguments) using the ARGV array.
So your program could be like:
default = "/home/XXX/XXX/XXX/somefile.csv"
csv_text = File.read(ARGV[0] || default)
which gives you the possibility to supply a filename or, if not supplied, use the default value.
ARGV[0] refers to the first, ARGV[1] to the second argument and so on.
ruby myscript.rb foo bar baz would result in ARGV being
["foo", "bar", "baz"]. Note that the elements will always be strings, so if you want anything else (numbers, dates, ...) you need to process them accordingly in your program.
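Applied to the question, the call from the terminal would then look roughly like this (assuming you pass the full file name, including the .csv extension, and the file sits in the directory you run the command from):
# Read the supplied file...
ruby script.rb somefileV2.csv
# ...or fall back to the hard-coded default when no argument is given
ruby script.rb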

Shell script file takes partial path from parameter file

I have a parameter file (parameter.txt) which contains something like below:
SASH=/home/ec2-user/installers
installer=/home/hadoop/path1
And my shell script (temp_pull.sh) is like below:
EPATH=`cat $1|grep 'SASH' -w| cut -d'=' -f2`
echo $EPATH
${EPATH}/data-integration/kitchen.sh -file="$KJBPATH/hadoop/temp/maxtem/temp_pull.kjb"
When I run my temp_pull.sh like below:
./temp_pull.sh parameter.txt
$EPATH gives me the correct path, but the 3rd line of code takes only a partial path.
The error output is pasted below:
/home/ec2-user/installers --> output of the 2nd line
/data-integration/kitchen.sh: No such file or directory2-user/installer --> output of the 3rd line
There is no need to manually parse the values in the file, because it already contains data in the format in which shell variables are defined: var=value.
Hence, if the file is safe enough, you can source it so that the SASH value will be available just by saying $SASH.
Then, you can use the following:
source "$1" # source the file given as first parameter
"$SASH"/data-integration/kitchen.sh -file="$KJBPATH/hadoop/temp/maxtem/temp_pull.kjb"
The problem is that the file we were using was copied from Windows to UNIX, so Windows line endings are the root cause.
By using dos2unix paramfile.txt we were able to fix the issue.
command:
dos2unix paramfile.txt
This converts all the Windows line endings to Unix format.
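If dos2unix is not installed, the same fix can be done with tr (an alternative to the accepted fix, not part of the original answer):
# Strip the Windows carriage returns and write the cleaned copy back over the original
tr -d '\r' < parameter.txt > parameter.clean && mv parameter.clean parameter.txt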
