I have large pcapng files, and I want to split them based on my desired Wireshark filters. I want to split the files with the help of bash scripts and pcapsplitter, but when I use a loop, it always gives me the same file.
I have written a small script.
#!/bin/bash
for i in {57201..57206}
do
mkdir destination/$i
done
tcp="tcp port "
for i in {57201..57206}
do
tcp="$tcp$i"
pcapsplitter -f file.pcapng -o destination/$i -m bpf-filter -p $tcp
done
The question is: can I use bash for my goal or not?
If yes, why does it not work?
Definitely, this is something Bash can do.
Regarding your script, the first thing I can think of is this line:
pcapsplitter -f file.pcapng -o destination/$i -m bpf-filter -p $tcp
where the value of $tcp is actually tcp port 57201 (and the following numbers on the next rounds). However, without quotes, you're actually passing only tcp to the -p parameter.
It should work better after you've changed this line into:
pcapsplitter -f file.pcapng -o destination/$i -m bpf-filter -p "$tcp"
NB: as general advice, it's usually safer to double-quote variable expansions in Bash.
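For instance (a minimal illustration, not taken from your script), word splitting turns an unquoted expansion into several arguments, while the quoted form stays a single argument:
filter="tcp port 57201"
printf '<%s>\n' $filter     # three arguments: <tcp> <port> <57201>
printf '<%s>\n' "$filter"   # one argument: <tcp port 57201>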
NB2: you don't need those two for loops. Also note that, as written, your filter keeps growing: each iteration appends another port number to $tcp. Here is how I'd rewrite your script:
#!/bin/bash
for portNumber in {57201..57206}; do
destinationDirectory="destination/$portNumber"
mkdir "$destinationDirectory"
thePparameter="tcp port $portNumber"
pcapsplitter -f 'file.pcapng' -o "$destinationDirectory" -m bpf-filter -p "$thePparameter"
done
I have a bash script that looks like below.
$TOOL is another script which runs twice with different inputs (VAR1 and VAR2).
#Iteration 1
${TOOL} -ip1 ${VAR1} -ip2 ${FINAL_PML}/$1$2.txt -p ${IP} -output_format ${MODE} -o ${FINAL_MODE_DIR1}
rename mods mode_c_ ${FINAL_MODE_DIR1}/*.xml
#Iteration 2
${TOOL} -ip1 ${VAR2} -ip2 ${FINAL_PML}/$1$2.txt -p ${IP} -output_format ${MODE} -o ${FINAL_MODE_DIR2}
rename mods mode_c_ ${FINAL_MODE_DIR2}/*.xml
Can I make these 2 iterations in parallel inside a bash script without submitting it in a queue?
If I read this right, what you want is to run them in the background.
c.f. https://linuxize.com/post/how-to-run-linux-commands-in-background/
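As a minimal sketch reusing the variables from your snippet (untested here, adjust as needed): start both iterations with a trailing &, then wait for them before the renames:
#Iteration 1, in the background
${TOOL} -ip1 ${VAR1} -ip2 ${FINAL_PML}/$1$2.txt -p ${IP} -output_format ${MODE} -o ${FINAL_MODE_DIR1} &
#Iteration 2, in the background
${TOOL} -ip1 ${VAR2} -ip2 ${FINAL_PML}/$1$2.txt -p ${IP} -output_format ${MODE} -o ${FINAL_MODE_DIR2} &
# wait for both background jobs to finish before renaming their outputs
wait
rename mods mode_c_ ${FINAL_MODE_DIR1}/*.xml
rename mods mode_c_ ${FINAL_MODE_DIR2}/*.xml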
More importantly, if you are going to be writing scripts, PLEASE read the following closely:
https://www.gnu.org/software/bash/manual/html_node/index.html#SEC_Contents
https://mywiki.wooledge.org/BashFAQ/001
I am trying to provide a clickable .command file to set up printers on Macs for my workplace. Since it is something I do very frequently, I thought I could write a shell script for each printer and save it on a shared server. Then, when I need to add a printer for someone, I can just find the shell script on the server and execute it. My current command works in Terminal, but once executed as a .command, it produces the error below.
This is my script:
#!/bin/sh
lpadmin -p ‘PRINTERNAME’ -D PRINTER\ NAME -L ‘OFFICE’ -v lpd://xx.xx.xx.xx -P /Library/Printers/PPDs/Contents/Resources/Xerox\ WorkCentre\ 7855.gz -o printer-is-shared=false -E
I get this error after running the script:
lpadmin: Unknown option “?”.
I find this strange, because there is no "?" in the script.
I have an idea: why not try it like this? There are huge differences between sh shells, so let me know if it works; I have more ideas.
#!/bin/sh
PPD="PRINTERNAME"
INFO="PRINTER\ NAME"
LOC="OFFICE"
URI="lpd://xx.xx.xx.xx"
OP ="printer-is-shared=false"
# This parameter P is new to me. Is it the paper-name ?
P="/Library/Printers/PPDs/Contents/Resources/Xerox\ WorkCentre\ 7855.gz"
lpadmin -p "$PPD" -D "$INFO" -L "$LOC" -v "$URI" -P "$P" -o "$OP" -E;
I have a script that was kicking off ~200 jobs for each sub-analysis. I realized that a job array would probably be much better for this, for several reasons. It seems simple enough but is not quite working for me. My input files are not numbered, so, following examples I've seen, I do this first:
INFILE=`sed -n ${SGE_TASK_ID}p <pathto/listOfFiles.txt`
My qsub command takes quite a few variables, as it is both pulling from and outputting to different directories. $res does not change; $INFILE, however, is what I am looping through.
qsub -q test.q -t 1-200 -V -sync y -wd ${res} -b y perl -I /master/lib/ myanalysis.pl -c ${res}/${INFILE}/configFile-${INFILE}.txt -o ${res}/${INFILE}/
Since this was not working, I was curious as to what exactly was being passed. So I did an echo on this and saw that it only seems to expand up to the first time $INFILE is used. So I get:
perl -I /master/lib/ myanalysis.pl -c mydirectory/fileABC/
instead of:
perl -I /master/lib/ myanalysis.pl -c mydirectory/fileABC/configFile-fileABC.txt -o mydirectory/fileABC/
Hoping for some clarity on this and welcome all suggestions. Thanks in advance!
UPDATE: It doesn't look like $SGE_TASK_ID is set on the cluster. I looked for any variable that could be used for an array ID and couldn't find anything. If I see anything else I will update again.
Assuming you are using a grid engine variant, SGE_TASK_ID should be set within the job. It looks like you are expecting it to be set to something useful before you run qsub. Submitting a script like this would do roughly what you appear to be trying to do:
#!/bin/bash
INFILE=$(sed -n ${SGE_TASK_ID}p <pathto/listOfFiles.txt)
exec perl -I /master/lib/ myanalysis.pl -c ${res}/${INFILE}/configFile-${INFILE}.txt -o ${res}/${INFILE}/
Then submit this script with
res=${res} qsub -q test.q -t 1-200 -V -sync y -wd ${res} myscript.sh
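If you want to sanity-check the array mechanics first, a throwaway dry-run script along these lines (assuming pathto/listOfFiles.txt lists one input name per line) just prints what each task would do, without calling myanalysis.pl:
#!/bin/bash
# Dry run: show which file each array task would pick
INFILE=$(sed -n "${SGE_TASK_ID}p" <pathto/listOfFiles.txt)
echo "Task ${SGE_TASK_ID} would use ${res}/${INFILE}/configFile-${INFILE}.txt"
Submit it with the same qsub options (-q test.q -t 1-200 ...) and inspect the job output files.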
I want to create multiple subdirectories.
My command is:
mkdir -p dir1/{dir1.1/{dir1.1.1,dir1.1.2},dir1.2,dir1.3}
It works, result is:
dir1
    dir1.1
        dir1.1.1
        dir1.1.2
    dir1.2
    dir1.3
However, I want to make this command look nicer (more readable). I tried:
mkdir -p \
dir1/{\
  dir1.1/{\
    dir1.1.1,\
    dir1.1.2},\
  dir1.2,\
  dir1.3}
And this doesn't work. Result is:
ls *
dir1 dir1.1 dir1.1.1, dir1.1.2}, dir1.2, dir1.3}
How can I wrap such mkdir command?
Try the following:
eval mkdir -p `echo \
dir1/{\
  dir1.1/{\
    dir1.1.1,\
    dir1.1.2},\
  dir1.2,\
  dir1.3}\
| sed -E 's/\s*//g'`
Explanation: Your original code introduces spaces into the parameter, so instead of calling
mkdir -p dir1/{dir1.1/{dir1.1.1,dir1.1.2},dir1.2,dir1.3}
You are actually calling the command with the following parameters:
mkdir -p dir1/{ dir1.1/{ dir1.1.1, dir1.1.2}, dir1.2, dir1.3}
And this is why the wrong directories got created. To solve this, I first stripped the whitespace using sed, then used eval to evaluate the resulting command. This solution should work for simple cases, but some special characters within the directory names (such as whitespace) may cause issues.
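A quick way to see this for yourself is to put echo in front of the multi-line version and look at what the shell actually passes (just a diagnostic sketch, nothing gets created):
echo mkdir -p \
dir1/{\
  dir1.1/{\
    dir1.1.1,\
    dir1.1.2},\
  dir1.2,\
  dir1.3}
# prints: mkdir -p dir1/{ dir1.1/{ dir1.1.1, dir1.1.2}, dir1.2, dir1.3}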
Hope this helps!
If you want readable, just call mkdir multiple times. I doubt that directory creation is going to form any kind of bottleneck in your program.
mkdir dir1
mkdir -p dir1/dir1.{1,2,3}
mkdir -p dir1/dir1.1/dir1.1.{1,2}
The problem is the whitespace at the beginning of each line, which causes the lines to be treated as separate arguments to the mkdir command. To overcome this, you can write:
mkdir -p \
dir1/{\
dir1.1/{\
dir1.1.1,\
dir1.1.2},\
dir1.2,\
dir1.3}
with no whitespace at the beginning of the lines. Whether this is more readable than the original one-liner is debatable.
I'm sure there is a simple way to do this, but I am not finding it. What I want to do is execute a series of commands using lftp, and I want to avoid repeatedly connecting to the server if possible.
Basically, I have a file with a list full of ftp directories on the server. I want to connect to the server then execute something like the following: (assume at this point that I have already converted the text file into an array of lines using cat)
for f in "${myarray}"
do
cd $f;
nlist >> $f.txt;
cd ..;
done
Of course that doesn't work, but I have to imagine there is a simple solution to what I am trying to accomplish.
I am quite inexperienced when it comes to shell scripting. Any suggestions?
First build a string that contains the list of lftp commands. Then call lftp, passing the command on its standard input. Lftp itself can redirect the output of a command to a file, with a syntax that resembles the shell.
list_commands=""
for dir in "${myarray[#]}"; do
list_commands="$list_commands
cd \"$dir\"
nlist >\"$dir.txt\"
cd .."
done
lftp <<EOF
open -u $username,$password $site
$list_commands
bye
EOF
Note that I assume that the directory names don't contain backslashes, single quotes or globbing characters. Add proper escaping if necessary.
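For example (a rough sketch, assuming lftp honours backslash escapes inside double quotes, as its shell-like syntax suggests), you could escape backslashes and double quotes before embedding each name:
escaped_dir=$(printf '%s' "$dir" | sed 's/[\\"]/\\&/g')   # escape \ and " for lftp's quoting
list_commands="$list_commands
cd \"$escaped_dir\"
nlist >\"$escaped_dir.txt\"
cd .."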
By the way, to read lines from a file, see "Why is while IFS= read used so often, instead of IFS=; while read..?". You might prefer to combine reading from the list of directories and building the commands:
list_commands=""
while IFS= read -r dir; do
list_commands="$list_commands
cd \"$dir\"
nlist >\"$dir.txt\"
cd .."
done <directory_list.txt