Batch localize using IBTools? - xcode

Is there a way to run ibtool on a bunch of NIB files with a single command? I'm trying to extract strings from NIBs. Am I supposed to run ibtool once for each NIB?
I find it tedious to run ibtool so many times. (I have only 9 NIB files, but it could be worse...)
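For reference, the command I end up repeating for each NIB looks roughly like this (the file name is just an example):
ibtool --generate-strings-file MainMenu.strings MainMenu.nib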

I don't think ibtool can take multiple files as arguments. The only way I see is to write a bash script to perform the task:
#!/bin/bash
find . -name "*.xib" | while read -r FILENAME
do
    ibtool --export-strings-file "$FILENAME.strings" "$FILENAME"
done

Here is a more full-featured script I wrote for the same operation:
#!/bin/bash
# Arguments: -o output_dir -i input_dir

usage()
{
cat << EOF
usage: $0 [options]

This script generates .strings files from all XIBs in a given directory.

OPTIONS:
   -h   Show this message
   -i   Input directory where the XIBs are located [default: ./]
   -o   Output directory where the .strings files will be generated [default: ./]
EOF
}

INPUT_DIRECTORY="."
OUTPUT_DIRECTORY="."

while getopts "hi:o:" OPTION
do
    case $OPTION in
        h)
            usage
            exit 1
            ;;
        i)
            INPUT_DIRECTORY=$OPTARG
            ;;
        o)
            OUTPUT_DIRECTORY=$OPTARG
            ;;
        ?)
            usage
            exit
            ;;
    esac
done

if [[ -z $INPUT_DIRECTORY ]] || [[ -z $OUTPUT_DIRECTORY ]]
then
    usage
    exit 1
fi

# do the generation
find "$INPUT_DIRECTORY" -name "*.xib" | while read -r FILENAME
do
    XIBNAME=$(basename "$FILENAME")
    XIBNAME="${XIBNAME%.*}"
    ibtool --generate-strings-file "$OUTPUT_DIRECTORY/$XIBNAME.strings" "$FILENAME"
done
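Saved as, say, generate_strings.sh and made executable (the script name and paths below are just examples), it can be invoked like this:
./generate_strings.sh -i path/to/xibs -o path/to/strings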


How can I use getopts in a script that appends lines from files in a separate directory to a new file?

I am trying to write a bash script that takes in a directory, reads each file in the directory, and then appends the first line of each file in that directory to a new file. When I hard-code the variables in my script, it works fine.
This works:
#!/bin/bash
rm /local/SomePath/multigene.firstline.btab
touch /local/SomePath/multigene.firstline.btab
btabdir=/local/SomePath/test/*
outfile=/local/SomePath/multigene.firstline.btab
for f in $btabdir
do
head -1 $f >> $outfile
done
This does not work:
#!/bin/bash
while getopts ":d:o:" opt; do
case ${opt} in
d) btabdir=$OPTARG;;
o) outfile=$OPTARG;;
esac
done
rm $outfile
touch $outfile
for f in $btabdir
do
head -1 $f >> $outfile
done
Here is how I call the script:
bash /local/SomePath/Scripts/btab.besthits.wBp-q_wBm-r.sh -d /local/SomePath/test/* -o /local/SomePath/out.test/multigene.firstline.btab
And here is what I get when I run it:
rm: missing operand
Try 'rm --help' for more information.
touch: missing file operand
Try 'touch --help' for more information.
/local/SomePath/Scripts/btab.besthits.wBp-q_wBm-r.sh: line 23: $outfile: ambiguous redirect
Any suggestions? I'd like to be able to use getopts so I can make the script more generic. Thanks!
You have to pay extra attention to quoting and globbing when writing bash scripts.
When you call the script with a glob (* here), it gets expanded and split into words by your shell. This happens before your script is even executed.
If, for example, you run cat *.txt, cat will get all the .txt files in the directory as its arguments. It is the same as calling cat afile.txt nextfile.txt (and so on); cat never sees the asterisk.
In your script this means that the input -d /local/SomePath/test/* gets expanded to something like /local/SomePath/test/someFile /local/SomePath/test/someOtherFile /local/SomePath/test/someThirdFile.
getopts then takes only the first matched file after -d as $btabdir, and the -o option never gets handled in the case statement, so $outfile stays empty.
I suggest you start by quoting every variable, preferably in the "${name}" style, and only invoke the script with quoted input.
You might also send in a directory path instead, test that it is a directory (test -d), and change your for loop to for f in "${btabdir}"/*.
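To see the expansion concretely, compare what the script actually receives (the expanded file names below are illustrative, and the output path is shortened):
# unquoted: the shell expands the glob, so -d is followed by many file names
bash btab.besthits.wBp-q_wBm-r.sh -d /local/SomePath/test/* -o out.btab
# ...which reaches the script roughly as:
# bash btab.besthits.wBp-q_wBm-r.sh -d /local/SomePath/test/someFile /local/SomePath/test/someOtherFile -o out.btab
# quoted: the script receives the directory (or literal glob) as a single argument
bash btab.besthits.wBp-q_wBm-r.sh -d "/local/SomePath/test" -o out.btab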
This also works:
head -n1 -q /local/SomePath/test/* >> /local/SomePath/out.test/multigene.firstline.btab
I think the right answer here is "don't do it that way." :-)
The reason your current script isn't working may be that the wildcard is expanded by your interactive shell, not by your script. Try running your command with an echo at the beginning of the line for a hint at what's really happening. Once getopts sees the second of the files matched by the glob, it stops processing options, so -o never gets read and $outfile remains unset. And since you don't quote your variable in rm $outfile, it's as if you're running rm with no argument at all. Test the difference in your shell between rm alone and rm "".
Also, what happens to your for loop if there's a space in a filename? Since you have bash, you have arrays. And arrays are much better for processing lists of files.
Perhaps use something like this instead:
#!/bin/bash

# initialize an array
files=()

while getopts :d:o: opt; do
    case "$opt" in
        d)
            if [[ ! -d "$OPTARG" ]]; then
                printf 'ERROR: not a directory: %s\n' "$OPTARG" >&2
                exit 65
            fi
            # add the directory's contents to the array
            files+=( "$OPTARG"/* )
            ;;
        o)
            outfile="$OPTARG"
            ;;
        *)
            printf 'ERROR: unknown option: %s\n' "$opt" >&2
            exit 64
            ;;
    esac
done

if ! rm -f "$outfile" || ! touch "$outfile"; then
    printf 'ERROR: cannot create %s\n' "$outfile" >&2
    exit 73
fi

for f in "${files[@]}"; do
    read -r < "$f"
    printf '%s\n' "$REPLY"
done > "$outfile"
Here are some highlights of the changes....
We're using arrays, of course. The array "${files[@]}" will contain one file per element, without relying on whitespace, so with proper quoting you'll avoid problems with special characters in filenames.
We test for more error conditions, and actually show errors and exit if we see them. (The exit values are sysexits.)
Instead of using head, we use read and a single redirect to $outfile. This saves multiple forks to an external program, and multiple fopen() calls to your output file.
Note that the argument to -d should be a directory, not a glob. And you can specify options multiple times. Multiple -d options will be added together, but only the last -o option will be used.
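With those changes, the invocation from the question passes the directory itself (not a glob) along with the output file:
bash /local/SomePath/Scripts/btab.besthits.wBp-q_wBm-r.sh -d /local/SomePath/test -o /local/SomePath/out.test/multigene.firstline.btab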

Comparing files in incrementally numbered directories using bash

I have created a subdirectory in the test dir for each of the test cases.
#ls test/
test1 test2 test3 test4 test5
Each subdir contains expected.txt and actual.txt files. I am trying to validate that the contents of those two files are identical using the diff command.
Since there will be tens of such test cases, I am iterating over the subdirectories, which are named with an increasing number as a suffix.
However, I am getting the following error on the while statement:
num=1
while [[ -f ./test/test${num} ]] ; do
DIFF=$(diff test/test${num}/actual.txt test/test${num}/expected.txt)
if [ "$DIFF" != "" ]
then
echo "Test Case ${num} ...Failed"
else
echo "Test Case ${num} ...Passed"
fi
num++
done
Error
while [[ -f ./test/test ]] ; do
/bin/sh: -c: line 1: syntax error: unexpected end of file
If there is a better way of achieving what I am trying to do, that would be a learning experience too.
You get a syntax error because you use the bash extension [[ but run your script with /bin/sh. Either run your script under bash or replace the [[ with a single [.
Your script also has a logic problem:
[[ -f checks whether ./test/test${num} is a file, so the while body executes only if it is a regular file. But then you compare ./test/test${num}/actual.txt, which cannot exist if ./test/test${num} is a file rather than a directory. Most probably you meant to use -d.
diff is for producing nice-looking output in a terminal; use cmp to compare files in scripts.
I guess your script could look like this:
find . -mindepth 2 -maxdepth 2 -type d -regex './test/test[0-9]+' |
while read -r dir; do
    if cmp -s "$dir"/actual.txt "$dir"/expected.txt; then
        echo "Test case $dir Passed"
    else
        echo "Test case $dir Failed"
    fi
done
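If you would rather keep the numbered loop from the question, a minimal corrected version (run with bash, not sh) could look like this; note the -d test, the quoting, and the ((num++)) arithmetic:
num=1
while [[ -d "./test/test${num}" ]]; do
    if cmp -s "./test/test${num}/actual.txt" "./test/test${num}/expected.txt"; then
        echo "Test Case ${num} ...Passed"
    else
        echo "Test Case ${num} ...Failed"
    fi
    ((num++))
done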

How to access a target directory using bash scripting

I am relatively new to shell scripting. I am writing a script to compress all the files in the current directory and in a target directory. I have succeeded in compressing the files of the current directory, but I'm unable to write a script that compresses the files of a target directory. Can anyone guide me?
I want to do something like this
% myCompress -t /home/users/bigFoot/ pdf ppt jpg
Next time, try to format your code properly; it will make it easier to answer:
#!/bin/bash
if [[ $# == 0 ]]; then
    echo "This shell script compress files with a specific extensions"
    echo "Call Syntax: compress <extension_list>"
    exit
fi
for ext in $*; do
    for file in ls *.$ext; do
        gzip -k $file
    done
done
Mistakes made
1) $* expands to all arguments given to the command, so -t and the path are looped over as $ext values too.
2) ls *.$ext inside the for loop is read as the two plain words ls and *.$ext; it would have to be written as $(ls *.$ext) for the ls command to actually be executed (although a plain glob, as shown below, is the better fix).
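For the extension-only case in the current directory, a minimal corrected loop would therefore use a glob directly (a sketch; gzip -k assumes a gzip version that can keep the input files):
for ext in "$@"; do
    for file in *."$ext"; do
        [[ -e "$file" ]] || continue   # skip when nothing matches the glob
        gzip -k "$file"
    done
done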
My script for your request
#!/bin/bash
script_name=$(basename "$0")
if [[ $# == 0 ]]; then
    echo "This shell script compresses files with specific extensions"
    echo "Call Syntax: $script_name <directories_list> <extension_list>"
    exit
fi

# sort the arguments: directories go into $path, everything else is treated as an extension
path=". "
file_type=""
for check_type in $* ; do
    if [[ -d $check_type ]]; then
        path=$path$check_type" "
    else
        file_type=$file_type"*."$check_type" "
    fi
done

echo paths to gzip: $path
echo file types to check: "$file_type"

for x in $path; do
    cd "$x"
    for file in $(ls $file_type); do
        gzip "$file"
    done
    cd -
done
Explanation
1) basename "$0" - gets the script's name; this is more generic in case you rename the script.
2) path=". " - this variable holds a string of all directories to be compressed; your request was to also run on the current directory, hence the ". ".
file_type="" - this variable holds a string of all extensions to be compressed inside the $path directories.
3) The loop runs over all input arguments and concatenates directory names onto the $path string and the remaining arguments (file types) onto $file_type.
4) For each of the directories passed to the script:
i. cd $x - enter the directory
ii. gzip - compress all files with the given extensions
iii. cd - - go back to the previous directory
Check gzip
I'm not that familiar with the gzip command; check that your version supports the -k (keep input files) flag before relying on it.
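As a simpler alternative for a single target directory, find can hand the matching files straight to gzip; a sketch using the path and extensions from the question (again assuming a gzip that supports -k):
for ext in pdf ppt jpg; do
    find /home/users/bigFoot -type f -name "*.${ext}" -exec gzip -k {} +
done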

Use string as argument in a bash script

I'm trying to write a script that uses vlc to create a playlist.
#!/bin/bash
filename=/media/*/*.mp3
while [ "$1" != "" ]; do
case $1 in
-f | --filepath ) shift
filename=$1
;;
-h | --help ) usage
exit
;;
* ) usage
exit 1
esac
shift
done
#echo $filename
vlc $filename --novideo --quiet
This script works, but it only finds mp3 files in the root of any USB device, so I want to change the filename variable. The following code gives similar results, but it lists everything.
filename=$(find /media/* -name *.mp3 -print)
filename=$(tr '\n' ' ' <<<$filename)
Now the problem is that I can't pass it as an argument. I tried:
vlc $filename --novideo --quiet
or
vlc $*filename --novideo --quiet
or
vlc "$filename" --novideo --quiet
Nothing worked. Any suggestions?
UPDATE:
The problem I want help with is how to make vlc accept the filename variable as an argument (or arguments) naming the files to use in the playlist. filename contains
/media/MULTIBOOT/Linkin Park - In The End.mp3 /media/MULTIBOOT/Man with a
Mission ft. Takuma - Database.mp3 /media/MULTIBOOT/Sick Puppies - You're
Going Down.mp3 /media/MULTIBOOT/Skillet - Rise.mp3 /media/MULTIBOOT/Song
Riders - Be.mp3 /media/MULTIBOOT/30 Seconds to Mars - This is War.mp3
/media/MULTIBOOT/Fade - One Reason.mp3
Now, this is a single string; how do I use it as file path arguments?
I would use bash's recursive globbing and arrays:
#!/bin/bash
shopt -s globstar nullglob

files=()
while [[ $1 ]]; do
    case $1 in
        -f | --filepath ) shift
                          files+=("$1")
                          ;;
        -h | --help )     usage
                          exit
                          ;;
        * )               usage
                          exit 1
    esac
    shift
done

if [[ ${#files[@]} -eq 0 ]]; then
    files=( /media/**/*.mp3 )
    if [[ ${#files[@]} -eq 0 ]]; then
        echo "no mp3 files found"
        exit 1
    fi
fi

#printf "%s\n" "${files[@]}"
vlc "${files[@]}" --novideo --quiet
With this code, you can specify -f filename multiple times to play a few songs.
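For example, assuming the script is saved as playlist.sh (the name is a placeholder):
# play everything found under /media
./playlist.sh
# or queue specific files
./playlist.sh -f "/media/MULTIBOOT/Skillet - Rise.mp3" -f "/media/MULTIBOOT/Fade - One Reason.mp3"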
You need to quote *.mp3. Otherwise it will be expanded in the current directory.
filename=$(find /media/* -name '*.mp3' -print)
You also don't need to remove the newlines. When you use the variable without quoting it, all whitespace, including newlines, is treated as word delimiters. (Note, though, that this also splits on the spaces inside your filenames, so a path like /media/MULTIBOOT/Linkin Park - In The End.mp3 becomes several arguments.)
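When filenames contain spaces, a null-delimited find is a safer way to build the list; this is a sketch rather than part of the original answer:
files=()
while IFS= read -r -d '' f; do          # read NUL-terminated paths from find
    files+=("$f")
done < <(find /media -name '*.mp3' -print0)
vlc "${files[@]}" --novideo --quiet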
Rather than storing all filenames in a variable, you can tell find to call the application with all the files directly. This prevents problems with whitespace, newlines and the like:
find /media -name '*.mp3' -exec vlc --novideo --quiet \{\} \+
A better way to handle options in your script might be to use getopts, if you don't mind giving up long options. For example:
#!/usr/bin/env bash
while getopts f:h opt; do
    case "$opt" in
        f) filename+=("$OPTARG") ;;
        h) usage; exit 0 ;;
        *) usage; exit 1 ;;
    esac
done
shift $((OPTIND - 1))

# fall back to searching /media when no -f arguments were given
if [[ ${#filename[@]} -eq 0 ]]; then
    filename=($(find /media/ -name \*.mp3 -type f))
fi

vlc --novideo --quiet "${filename[@]}"
I don't know this usage of VLC, but the effect of this script is to build a command line with all the files found by the find command (when no -f options are given), stored in an array called filename.
An advantage of handling things in an array is that it lends itself to use in for loops.
for thisfile in "${filename[@]}"; do
    vlc "$thisfile"    # with options to convert just one file
done
NOTE that since you're using bash, you may not need to use find at all.
shopt -s globstar
filelist=(/media/**/*.mp3)
Check man bash for discussion of globstar.

Passing a path as an argument to a shell script

I've written a bash script to open a file passed as an argument and write it into another file, but my script only works properly if the file is in the current directory. Now I need to open and write files that are not in the current directory as well.
If compile is the name of my script, then ./compile next/123/file.txt should open the file.txt at the given path. How can I do it?
#!/bin/sh
#FIRST SCRIPT
clear
echo "-----STARTING COMPILATION-----"
#echo $1
name=$1 # Copy the filename to name
find . -iname $name -maxdepth 1 -exec cp {} $name \;
new_file="tempwithfile.adb"
cp $name $new_file #copy the file to new_file
echo "compiling"
dir >filelist.txt
gcc writefile.c
run_file="run_file.txt"
echo $name > $run_file
./a.out
echo ""
echo "cleaning"
echo ""
make clean
make -f makefile
./semantizer -da <withfile.adb
Your code and your question are a bit messy and unclear.
It seems that you intended to find your file, given as a parameter to your script, but failed because of the -maxdepth option.
If you are given next/123/file.txt as an argument, your find gives you a warning:
find: warning: you have specified the -maxdepth option after a
non-option argument -iname, but options are not positional (-maxdepth
affects tests specified before it as well as those specified after
it). Please specify options before other arguments.
Also, -maxdepth limits how deep find will descend before it gives up, and next/123/file.txt lies two directories deep.
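For illustration, a find call with the options placed before the tests, and a depth large enough to reach next/123/file.txt, might look like this (a sketch; as asked below, you may not need find at all if the full path is already known):
# options such as -maxdepth go before tests such as -iname
find . -maxdepth 3 -iname "$(basename "$name")"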
Also, you are trying to copy the given file from within find, and then copy it again with cp afterwards.
As said, your code is really messy and I don't know what you are trying to do. I will gladly help if you can elaborate :).
There are some questions that are open:
Why do you have to find the file if you already know its path? Are you always given the whole path as an argument? Or only part of the path? Only the basename?
Do you simply want to copy a file to another location?
What does your writefile.c do? Does it write the content of your file to another? cp does that already.
I also recommend using CAPITALIZED variable names and checking the exit status of commands like cp and find to see whether they failed.
Anyway, here is my script that might help you:
#!/bin/sh
#FIRST SCRIPT
clear
echo "-----STARTING COMPILATION-----"
echo "FILE: $1"
[ $# -ne 1 ] && echo "Usage: $0 <file>" 1>&2 && exit 1
FILE="$1" # Copy the filename to name
FILE_NEW="tempwithfile.adb"
cp "$FILE" "$FILE_NEW" # Copy the file to new_file
[ $? -ne 0 ] && exit 2
echo
echo "----[ COMPILING ]----"
echo
dir > filelist.txt 2>&1 # list directory contents and write to filelist.txt (&> is bash-only; this script runs under /bin/sh)
gcc writefile.c # ???
FILE_RUN="run_file.txt"
echo "$FILE" > "$FILE_RUN"
./a.out
echo
echo "----[ CLEANING ]----"
echo
make clean
make -f makefile
./semantizer -da < withfile.adb
