I would use backticks like this in Bash:
$ export b=`ls /`
$ echo $b
Applications Library Network System Users Volumes tmp usr var
and now I want to find similar functionality in Matlab. I also want a command that outputs relative paths, not absolute paths like Matlab's ls -- parsing that output with a regex feels like reinventing the wheel. I need this command to debug what is wrong with namespaces here. Familiar Bash-style functionality would be so cool.
For your first question, I think you can get that behavior with anonymous functions:
b = @() ls('C:\'); % I'm on Windows
b()
The expression b() now returns the contents of my C drive.
The Matlab equivalent of bash backticks is calling the system() function and using the second output argument. It will run an external command and capture the output.
[status,b] = system('ls /');
If it's a string of Matlab code you want to run and capture the console output of, use evalc.
But to just get a listing of files, you want the Matlab dir function. It's a lot easier than parsing that output string, and you get more info. See the Matlab dir documentation or doc dir for more details.
children = dir('/');
childNames = { children.name };
childIsDir = [ children.isdir ];
I have a set of text files and a set of GoLang files. The GoLang files contain directives such as the following:
//go:embed hello.txt
var s string
I want to write a bash script which takes the above code and substitutes the following in its place:
var s string = "<contents of hello.txt>"
Specifically, I want the bash script to go through all GoLang source files and replace all go:embed/string declaration pairs with a string defined to be the contents of the file specified in the embed directive.
I'm wondering if there is an existing program which can be configured to do the above. Otherwise, I'm planning on writing the algorithm myself.
Further explanation:
I am trying to replicate GoLang's embed directive (https://tip.golang.org/pkg/embed/).
We are not yet on GoLang 1.16, so we cannot use this functionality, but we are replicating it as closely as possible so that moving over to the standard implementation is as painless as possible.
Below is an attempt at solving your problem:
for i in file1 file2; do
awk '/^\/\/go:embed /{f=$2;next}/^var/&&f{printf"%s = \"",$0;system("cat "f);print"\"";f=0;next}1' < "$i" > "$i.new"
done
The awk script prints all normal lines; only when it encounters the embed directive is that line skipped (and the file name remembered in the variable f). A subsequent line starting with var is then extended with the content of the file with the remembered name (via a system() call to cat).
Beware: there are no error checks at all and no attempt to escape quotes or the like. So for practical use - unless the file contents you are about to embed are known to be well-behaved - you probably have to take a more sophisticated approach.
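For illustration, here is a run on a minimal, hypothetical example (file1 and hello.txt are made-up names). Note that hello.txt is written without a trailing newline; if it had one, cat would copy the newline too and the closing quote would land on its own line:
$ printf 'Hello' > hello.txt
$ cat file1
//go:embed hello.txt
var s string
$ awk '/^\/\/go:embed /{f=$2;next}/^var/&&f{printf"%s = \"",$0;system("cat "f);print"\"";f=0;next}1' < file1
var s string = "Hello"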
I'm pretty new to shell scripting, but it's been great in helping me automate cumbersome tasks in OS X.
One of the functions I'm trying to write in a new script needs to find the specific filename in a subdirectory given a regex string. While I do know that the file exists, the version (and therefore filename itself) is being continually updated.
My function is currently as follows:
fname(){
$2=$(find ./Folder1 -name "$1*NEW*")
}
Which I'm then calling later in my script with the following line:
fname Type1 filename1
What I'm hoping to do is save the filename I'm looking for in variable filename1. My find syntax seems to be correct if I run it in Terminal, but I get the following error when I run my script:
./myscript.sh: line 13: filename1=./Folder1/Type1-list-NEW.bin: No such file or directory
I'm not sure why the result of find is not just saving to the variable I've selected. I'd appreciate any help (regardless of how trivial this question may end up being). Thanks!
EDIT: I have a bunch of files in the subdirectory, but with the way I'm setting that folder up I know my "find" query will return exactly 1 filename. I just need the specific filename to do various tasks (updating, version checking, etc.)
The syntax for writing output to a file is command > filename. So it should be:
fname() {
find ./Folder1 -name "$1*NEW*" > "$2"
}
= is for assigning to a variable, not saving output in a file.
Are you sure you need to put the output in a file? Why not put it in a variable:
fname() {
find ./Folder1 -name "$1*NEW*"
}
var=$(fname Type1)
If you really want the function to take the variable name as a parameter, you have to use eval:
fname() {
eval "$2='$(find ./Folder1 -name "$1*NEW*")'"
}
Okay, so I'm reading this as: you want to take the output of the find and save it in a shell variable whose name is given by $2.
You can't directly use a shell variable to dynamically name another shell variable to assign to: when the shell sees an expansion at the beginning of a line, it immediately begins processing the words as a command and its arguments, not as an assignment.
There might be some way of pulling this off with declare and export, but generally speaking you don't want to use a shell variable to hold n file names, particularly on OS X: those file names probably contain whitespace, and the way your find outputs them does not protect against that.
Generally what you do in this case is take the list of files find spits out and act on them immediately, either with find -exec or as part of a find . -print0 | xargs -0 pipeline.
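As a hedged sketch of both patterns, with a hypothetical do_something standing in for your real task (updating, version checking, etc.):
# act on each match directly from find
find ./Folder1 -name "Type1*NEW*" -exec do_something {} \;
# or stream NUL-delimited names through xargs, which is safe for
# names containing whitespace
find ./Folder1 -name "Type1*NEW*" -print0 | xargs -0 do_something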
It's my first time using Bash scripting; I have been looking at some tutorials but can't figure out some of the code. I just want to list all the files in a folder, but I can't do it.
Here's my code so far.
#!/bin/bash
# My first script
echo "Printing files..."
FILES="/Bash/sample/*"
for f in $FILES
do
echo "this is $f"
done
and here is my output:
Printing files...
this is /Bash/sample/*
What is wrong with my code?
You misunderstood what bash means by the word "in". The statement for f in $FILES simply iterates over the (space-delimited) words in the string $FILES, whose value is "/Bash/sample/*" (one word). You seemingly want the files that are "in" the named directory, a spatial metaphor that bash's syntax doesn't assume, so you would have to explicitly tell it to list the files.
for f in `ls $FILES` # illustrates the problem - but don't actually do this (see below)
...
might do it. This converts the output of the ls command into a string, "in" which there will be one word per file.
NB: this example is to help understand what "in" means but is not a good general solution. It will run into trouble as soon as one of the files has a space in its name - such files will contribute two or more words to the list, each of which taken alone may not be a valid filename. This highlights (a) that you should always take extra steps to program around the whitespace problem in bash and similar shells, and (b) that you should avoid spaces in your own file and directory names, because you'll come across plenty of otherwise useful third-party scripts and utilities that have not made the effort to comply with (a). Unfortunately, proper compliance can often lead to quite obfuscated syntax in bash.
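For completeness, a whitespace-safe sketch that lets the shell glob directly instead of going through ls (the array-based answer further down is another way; both assume the directory actually exists and matches something):
for f in /Bash/sample/* # one word per file, even if a name contains spaces
do
    echo "this is $f"
done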
I think the problem is in the path "/Bash/sample/*".
You need to change this location to an absolute path, for example:
/home/username/Bash/sample/*
Or use tilde expansion, for example:
~/Bash/sample/*
On most systems this is fully equivalent to:
/home/username/Bash/sample/*
where username is your current username; use whoami to see it.
Best place for learning Bash: http://www.tldp.org/LDP/abs/html/index.html
This should work:
echo "Printing files..."
FILES=(/Bash/sample/*) # create an array.
# Works with filenames containing spaces.
# String variable does not work for that case.
for f in "${FILES[#]}" # iterate over the array.
do
echo "this is $f"
done
And you should not parse ls output.
Taking a list of your files:
If you want to take a list of your files and see them:
ls ###Takes list###
ls -sh ###Takes list + File size###
...
If you want to send the list of files to a file, to read and check them later:
ls > FileName.Format ###Takes list and sends it to a file###
ls -sh > FileName.Format ###Takes list with file sizes and sends it to a file###
I have created a shell script (.run) that accepts the prefix for the names of the pictures as a parameter, and then calls gnuplot. However, for some reason, the picture is not saved. The code is:
#!/bin/sh
molecule=$1
echo "Plotting DFT-ADF PY results for $molecule"
echo "Tranmission plot (negatory SO)"
gnuplot -p << EOF
#!/usr/bin/gnuplot
set terminal epslatex size 5,3 color colortext
set output '$molecule_trans.tex'
plot cos(x) w l title 'cos(x)', sin(x) w l title 'sin(x)'
EOF
For my bachelor thesis I have to make several plots that are the same. Additionally, the computational cluster uses a queueing system. For the purpose of being true to this system, I have created several shell scripts that automatically do stuff. In particular, about 45 simulations are called by the shell scripts, followed by a shell script that enters each simulation's directory and uses Python files to evaluate the data into [.dat] files. Next, it should use a gnuplot file to make the graph. I use EPSLaTeX to make my figures, because it is so much nicer. However, in the current implementation this requires me to manually edit the various LaTeX files to rename the pictures.
In case you'll need more variables and do not want a $1, $2... mess:
You must use curly braces around the variable name:
set output '${molecule}_trans.tex'
because the underscore is a valid character in variable names, so the shell looks for the variable $molecule_trans; see http://www.gnu.org/software/bash/manual/bashref.html#Shell-Parameter-Expansion.
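A quick way to see the difference in plain bash (benzene is just a made-up molecule name):
$ molecule=benzene
$ echo "$molecule_trans.tex" # expands the unset variable molecule_trans
.tex
$ echo "${molecule}_trans.tex" # expands molecule, then appends _trans.tex
benzene_trans.tex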
I'm dealing with a pipeline of predominantly shell and Perl files, all of which pass parameters (paths) to the next. I decided it would be better to use a single file to store all the paths and just call that for every file. The issue is that I am using awk to grab the paths at the beginning of each file, and it's turning out to be a lot of repetition.
My question is: is there a way to store key-value pairs in a file so the shell can natively do something with the key and return the value? It needs to access an external file, because the pipeline uses many scripts, and a map in a specific file would result in parameters being passed everywhere. Is there some little quirk I do not know of that performs a map function on an external file?
You can make a file of env var assignments and source that file as needed, i.e.:
$ cat myEnvFile
path1=/x/y/z
path2=/w/xy
path3=/r/s/t
otherOpt1="-x"
Inside your script you can source it with either . myEnvFile or the more verbose version of the same feature, source myEnvFile (assuming the bash shell), i.e.:
$ cat myScript
#!/bin/bash
. /path/to/myEnvFile
# main logic below
....
# references to defined var
if [[ -d $path2 ]] ; then
cd $path2
else
echo "no pa4h2=$path2 found, can't continue" 1>&1
exit 1
fi
Based on how you've described your problem this should work well, and provide a one-stop shop for all of your variable settings.
IHTH
In bash, there's mapfile, but that reads the lines of a file into a numerically-indexed array. To read a whitespace-separated file into an associative array, I would
declare -A map
while read -r key value; do
map[$key]=$value
done < filename
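For example, with a hypothetical paths.txt holding the same values as the env file in the other answer, one whitespace-separated pair per line (associative arrays need bash 4+):
$ cat paths.txt
path1 /x/y/z
path2 /w/xy
$ declare -A map
$ while read -r key value; do map[$key]=$value; done < paths.txt
$ echo "${map[path2]}"
/w/xy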
However this sounds like an XY problem. Can you give us an example (in code) of what you're actually doing? When I see long pipelines of grep|awk|sed, there's usually a way to simplify. For example, is passing data by parameters better than passing via stdout|stdin?
In other words, I'm questioning your statement "I decided it would be better..."