how to filter svn export - windows

I want to get only specific files using the svn command line utility.
I have a batch script that gets only specific files from VSS, using the ss tool of VSS.
In VSS the command is:
ss get *.c
I need similar functionality with the svn command line utility.
How can I start?

You can try something like this: get the list of files matching some extension, like .txt or .c.
svn list -R http://svn/url/till/the/path/you/need/ | grep '\.extension$'
Then, for every line in the output of the above command, use svn export to get the file onto your local machine.
Edit:
repository=http://svn/url/till/the/path/need/
target_directory=/some/path/on/the/machine/

# note: the for/in loop splits on whitespace, so this assumes no spaces in the paths
for line in $(svn list -R "$repository" | grep '\.extension$')
do
    filename=`echo "$line" | sed "s|$repository||g"`
    if [ ! -f "$target_directory$filename" ]; then
        directory=`dirname "$filename"`
        mkdir -p "$target_directory$directory"
        svn export --force -r HEAD "$repository$line" "$target_directory$filename" --username abc --password password123
    fi
done
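If you only need a handful of known files, the loop reduces to a single svn export per file; a minimal example (the file path src/main.c and the target path are placeholders):
svn export --force -r HEAD http://svn/url/till/the/path/need/src/main.c /some/path/on/the/machine/src/main.c --username abc --password password123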

Related

hash method to verify integrity of dir vs dir.tar.gz

I'm working on a Python script that verifies the integrity of some downloaded projects.
On my NAS, I have all my compressed folders: folder1.tar.gz, folder2.tar.gz, …
On my Linux computer, the equivalent uncompressed folders: folder1, folder2, …
So, I want to compare the integrity of my files without any untar or download!
I think I can do it on the NAS with something like this (with md5sum):
sshpass -p 'password' ssh login@my.nas.ip tar -xvf /path/to/my/folder.tar.gz | md5sum | awk '{ print $1 }'
This gives me a hash, but I don't know how to get an equivalent hash to compare with the normal folder on my computer. Maybe the way I am doing it is wrong.
I need one command for the NAS, and one for the Linux computer, that output the same hash (if the folders are the same, of course).
If you did that, tar xf would actually extract the files, and md5sum would only see the verbose file listing, not the file contents.
However, if you have GNU tar on the server and the standard utility paste, you could create checksums this way:
mksums:
#!/bin/bash
data=/path/to/data.tar.gz
sums=/path/to/data.md5
paste \
    <(tar xzf "$data" --to-command=md5sum) \
    <(tar tzf "$data" | grep -v '/$') \
    | sed 's/-\t//' > "$sums"
Run mksums above on the machine with the tar file.
Copy the sums file it creates to the computer with the folders and run:
cd /top/level/matching/tar/contents
md5sum -c "$sums"
paste joins lines of the files given as arguments.
<( ... ) runs a command, making its output appear in a FIFO.
--to-command is a GNU tar extension which runs the given command once per extracted file, with the file's data on stdin.
grep filters out directories from the tar listing.
sed removes the extraneous "-" and tab so the checksum file can be understood by md5sum.
The above assumes you don't have any very oddly named files (for example, the names can't contain newlines).
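Since the tarball lives on the NAS, one way to run mksums there is over ssh, reusing the sshpass invocation from the question; a sketch (host, login and paths are the question's placeholders):
# run mksums on the NAS, then fetch the checksum file it wrote
sshpass -p 'password' ssh login@my.nas.ip 'bash -s' < mksums
sshpass -p 'password' scp login@my.nas.ip:/path/to/data.md5 /tmp/data.md5
# verify against the uncompressed folders on the Linux computer
cd /top/level/matching/tar/contents
md5sum -c /tmp/data.md5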

git grep and xargs in Windows Batch file?

I am trying to create a Windows-friendly .bat implementation of the following .sh script. The top few lines are all fine: just add SET, and cd is fine. git grep is fine; however, xargs isn't... What would the git grep | xargs logic look like in a .bat?
INFINITY=10000
TOPDIR=$(pwd)
METEOR_DIR="./code"
cd "$METEOR_DIR"
# Call git grep to find all js files with the appropriate comment tags,
# and only then pass it to JSDoc which will parse the JS files.
# This is a whole lot faster than calling JSDoc recursively.
git grep -al "#summary" | xargs -L ${INFINITY} -t \
"$TOPDIR/node_modules/.bin/jsdoc" \
-t "$TOPDIR/jsdoc/docdata-jsdoc-template" \
-c "$TOPDIR/jsdoc/jsdoc-conf.json" \
2>&1 | grep -v 'WARNING: JSDoc does not currently handle'
Any recent Git for Windows release has more than 200 Linux commands packaged in it.
Add <path\to\Git>\usr\bin to your PATH and you will have xargs.
vonc@VONCM D:\prgs\git\PortableGit-2.9.2-64-bit\usr\bin
> dir xargs.exe
Directory of D:\prgs\git\PortableGit-2.9.2-64-bit\usr\bin
20/01/2016 10:17 64 058 xargs.exe
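With that directory on the PATH, the original pipeline runs unchanged under the bash that ships with Git for Windows, so the .bat can simply delegate to it; a sketch (generate-docs.sh is a placeholder name for the .sh script above):
set PATH=%PATH%;D:\prgs\git\PortableGit-2.9.2-64-bit\usr\bin
bash generate-docs.sh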

how to unzip the latest file only

I am downloading a daily FTP file with the following command:
wget -mN --ftp-user=myuser --ftp-password=mypassword ftp://ftp2.link.com/ -P /home/usr/public_html/folder/folder2
My files are structured like this:
Data_69111232_2016-01-29.zip
Data_69111232_2016-01-28.zip
Data_69111232_2016-01-27.zip
Can you please let me know how I can extract only the latest downloaded file?
Usually I am using the following command to unzip the file, but I don't know what I should add to extract only the latest file:
unzip -o /home/user/public_html/folder/folder2/ftp2.directory/????.zip -d /home/user/public_html/folder/folder2/
Your help is really appreciated.
Thanks in advance.
Updated Answer
I thought your question was about FTP, but maybe it is about finding the newest file to unzip.
You can get the newest file like this:
newest=$(ls -t /home/user/public_html/folder/folder2/ftp2.directory/*zip | head -1)
and see the value like this:
echo $newest
and use it like this:
unzip -o "$newest" ...
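Put together, a minimal sketch for the daily job (paths taken from the question; note that parsing ls -t output assumes no newlines in the filenames):
# pick the most recently modified zip and extract it
newest=$(ls -t /home/user/public_html/folder/folder2/ftp2.directory/*.zip | head -1)
[ -n "$newest" ] && unzip -o "$newest" -d /home/user/public_html/folder/folder2/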
Original Answer
You can probably string something together using lftp. For example, I can get a listing in reverse time order with the newest file at the bottom like this:
lftp -e 'cd path/to/daily/file; ls -lrt; bye' -u user,password host | tail -1
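To act on that listing, a sketch that pulls out the filename and fetches it (user, password, host and path are placeholders as above; awk's $NF grabs the last field, so this assumes no spaces in the filenames):
newest=$(lftp -e 'cd path/to/daily/file; ls -lrt; bye' -u user,password host | tail -1 | awk '{print $NF}')
lftp -e "cd path/to/daily/file; get $newest; bye" -u user,password host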

Passing curl results to wget with bash

I have a small script that I'd like to use with cron.
Purpose: get a webpage with links, extract dates from the links, and download files.
The script below is not working 100% and I can't see the problem.
#!/bin/bash
for i in $(curl http://107.155.72.213/anarirecap.php 2>&1 | grep -o -E 'href="([^"#]+)"' | cut -d'"' -f2 | grep '_whole_1_3000.mp4'); do
    GAMEDAY=$(echo "$i" | grep -Eo '[[:digit:]]{4}/[[:digit:]]{2}/[[:digit:]]{2}')
    wget "$i" --output-document="$GAMEDAY.mp4"
done
It gets the webpage ("curl http://...etc") - works.
$GAMEDAY - extracts the date - works.
The wget part is not working when I add $GAMEDAY. Am I blind ... what am I missing?
Look at your output format here:
wget "$i" -O 2015/05/12.mp4
This is looking for a directory named 2015 with a subdirectory named 05 in which to place the file 12.mp4. Those directories don't exist, so you get 2015/05/12.mp4: No such file or directory.
If you want to replace the /s with underscores:
wget -O "${GAMEDAY//\//_}" "$i"
Alternately, if you want to create the directories if they don't exist:
mkdir -p -- "(dirname "$GAMEDAY")"
wget -O "$GAMEDAY" "$i"

TAR doesn't work properly with the crontab

First of all, I'm saying that it doesn't work properly with the crontab because when I run the script manually it works fine.
The problem is that when the backup script runs from the cronjob and gets to tarring up the MySQL dumps, the tar archive is only 16 bytes (and it's empty, so it looks like there were no files to pack into the archive). The strange thing is that when I run the script manually, it runs for almost 5 minutes and the tar archive is ~1.8GB.
Here is my bash code:
#!/usr/local/bin/bash
# Configuration
BACKUPD="/backup/mysql"
MySQLuser='root'
MySQLpass='xxxx'
# End configuration
# date parts (Polish variable names: ROK = year, MIESIAC = month, DZIEN = day, GIM = hour-minute)
ROK=`date +%Y`
MIESIAC=`date +%m`
DZIEN=`date +%d`
GIM=`date +%H-%M`
if [ -d $BACKUPD/$ROK/$MIESIAC/$DZIEN ]
then
    echo
else
    mkdir -p $BACKUPD/$ROK/$MIESIAC/$DZIEN
fi
for db in $(echo "SHOW DATABASES;" | mysql --user=$MySQLuser --password=$MySQLpass | grep -v -e "Database" -e "information_schema")
do
    mysqldump --skip-lock-tables --ignore-table=log.log --user="$MySQLuser" --password="$MySQLpass" $db >$BACKUPD/$ROK/$MIESIAC/$DZIEN/$db.sql
done
cd $BACKUPD/$ROK/$MIESIAC/$DZIEN && tar jcPf $BACKUPD/$ROK/$MIESIAC/$DZIEN/mysql-$GIM.tar.bz2 *.sql && rm -rf *.sql
Where is the problem? Has anyone experienced a problem like this before?
Regards.
Can you try the full path names for mysqldump and mysql inside your script? cron runs jobs with a minimal PATH, so commands that resolve fine in an interactive shell may not be found from the crontab.
So, if which mysql points to /usr/local/mysql/bin/mysql, and which mysqldump points to /usr/local/mysql/bin/mysqldump, modify your script to:
for db in $(echo "SHOW DATABASES;" | /usr/local/mysql/bin/mysql --user=$MySQLuser --password=$MySQLpass | grep -v -e "Database" -e "information_schema")
do
    /usr/local/mysql/bin/mysqldump --skip-lock-tables --ignore-table=log.log --user="$MySQLuser" --password="$MySQLpass" $db >$BACKUPD/$ROK/$MIESIAC/$DZIEN/$db.sql
done
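Alternatively, a sketch of the same idea: set PATH explicitly near the top of the script so that every MySQL client binary resolves under cron (the directory assumes the which output above):
# near the top of the backup script
PATH=/usr/local/mysql/bin:/usr/bin:/bin
export PATH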
My guess is that the last line is your problem. The shell glob (*.sql) in:
cd $BACKUPD/$ROK/$MIESIAC/$DZIEN && tar jcPf $BACKUPD/$ROK/$MIESIAC/$DZIEN/mysql-$GIM.tar.bz2 *.sql && rm -rf *.sql
is expanded in the current directory and not after the cd as you might expect. Try the following instead; it is safer.
old_dir=`pwd`
cd "$BACKUPD/$ROK/$MIESIAC/$DZIEN"
tar jcPf mysql-$GIM.tar.bz2 *.sql
rm -fr *.sql
cd "$old_dir"
There still might not be any .sql files to tar up. I don't have MySQL installed, but I suspect that the for loop is messed up as well. Try something like the following instead:
mysqlshow | \
xargs mysqldump --databases | \
bzip2 > $BACKUPD/$ROK/$MIESIAC/$DZIEN/mysql-$GIM.bz2
You will probably need to insert other arguments for the mysqlshow and mysqldump commands. Of course this won't create a tarball, but it will give you a compressed backup.
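A variant of that idea built on the question's own SHOW DATABASES pipeline, sketched with the full paths suggested earlier (mysql's -N suppresses the column header, replacing the grep -v "Database" step):
echo "SHOW DATABASES;" \
    | /usr/local/mysql/bin/mysql -N --user="$MySQLuser" --password="$MySQLpass" \
    | grep -v -e information_schema \
    | xargs /usr/local/mysql/bin/mysqldump --skip-lock-tables --user="$MySQLuser" --password="$MySQLpass" --databases \
    | bzip2 > $BACKUPD/$ROK/$MIESIAC/$DZIEN/mysql-$GIM.sql.bz2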
