I am trying to create a Windows-friendly .bat implementation of the following .sh script. The top few lines are fine: a few SET statements and a cd cover them, and git grep works as-is, but xargs isn't available. What would the git grep | xargs logic look like in a .bat file?
INFINITY=10000
TOPDIR=$(pwd)
METEOR_DIR="./code"
cd "$METEOR_DIR"
# Call git grep to find all js files with the appropriate comment tags,
# and only then pass it to JSDoc which will parse the JS files.
# This is a whole lot faster than calling JSDoc recursively.
git grep -al "#summary" | xargs -L ${INFINITY} -t \
"$TOPDIR/node_modules/.bin/jsdoc" \
-t "$TOPDIR/jsdoc/docdata-jsdoc-template" \
-c "$TOPDIR/jsdoc/jsdoc-conf.json" \
2>&1 | grep -v 'WARNING: JSDoc does not currently handle'
Any recent Git for Windows release has more than 200 Linux commands packaged in it.
Add <path\to\Git>\usr\bin to your PATH and you will have xargs.
vonc@VONCM D:\prgs\git\PortableGit-2.9.2-64-bit\usr\bin
> dir xargs.exe
Directory of D:\prgs\git\PortableGit-2.9.2-64-bit\usr\bin
20/01/2016 10:17 64 058 xargs.exe
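For illustration, here is a minimal sketch of what the .bat version could look like once Git's usr\bin is on PATH. It is untested; the Git install path (C:\Program Files\Git) and the jsdoc.cmd shim in node_modules\.bin are assumptions to adjust to your machine.
REM Put Git's bundled Linux tools (xargs, grep, ...) on PATH for this script.
SET "PATH=%PATH%;C:\Program Files\Git\usr\bin"
SET INFINITY=10000
SET TOPDIR=%CD%
SET METEOR_DIR=.\code
cd "%METEOR_DIR%"
REM git grep lists the matching files; xargs hands them to JSDoc, and grep
REM (also from usr\bin) filters the known warning, as in the .sh version.
git grep -al "#summary" | xargs -L %INFINITY% -t ^
  "%TOPDIR%\node_modules\.bin\jsdoc.cmd" ^
  -t "%TOPDIR%\jsdoc\docdata-jsdoc-template" ^
  -c "%TOPDIR%\jsdoc\jsdoc-conf.json" ^
  2>&1 | grep -v "WARNING: JSDoc does not currently handle"
Note the ^ line continuations: cmd.exe uses ^ where the shell script uses \.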
I'm working on a Python script that verifies the integrity of some downloaded projects.
On my NAS, I have all my compressed folders: folder1.tar.gz, folder2.tar.gz, …
On my Linux computer, the equivalent uncompressed folders: folder1, folder2, …
So, I want to compare the integrity of my files without untarring or downloading anything!
I think I can do it on the NAS with something like this (with md5sum):
sshpass -p 'password' ssh login@my.nas.ip tar -xvf /path/to/my/folder.tar.gz | md5sum | awk '{ print $1 }'
This gives me a hash, but I don't know how to get an equivalent hash to compare against the normal folder on my computer. Maybe the way I am doing it is wrong.
I need one command for the NAS and one for the Linux computer that output the same hash (if the folders are the same, of course).
If you did that, tar xvf would actually extract the files on the NAS, and md5sum would only see the file listing, not the file content.
However, if you have GNU tar on the server and the standard utility paste, you could create checksums this way:
mksums:
#!/bin/bash
data=/path/to/data.tar.gz
sums=/path/to/data.md5
paste \
<(tar xzf "$data" --to-command=md5sum) \
<(tar tzf "$data" | grep -v '/$') \
| sed 's/-\t//' > "$sums"
Run mksums above on the machine with the tar file.
Copy the sums file it creates to the computer with the folders and run:
cd /top/level/matching/tar/contents
md5sum -c "$sums"
paste joins lines of files given as arguments
<( ...) runs a command, making its output appear in a fifo
--to-command is a GNU tar extension which allows running commands which will receive their data from stdin
grep filters out directories from the tar listing
sed removes the extraneous -\t so the checksum file can be understood by md5sum
The above assumes you don't have any oddly named files (for example, the names can't contain newlines).
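Putting it together, an end-to-end run could look like the sketch below. It assumes the mksums script has been saved on the NAS as /path/to/mksums (a placeholder), and reuses the host and paths from the question and the script above:
# On the NAS (via ssh, as in the question): build the checksum file next to the archive.
sshpass -p 'password' ssh login@my.nas.ip 'bash /path/to/mksums'
# Copy the sums file to the Linux computer and verify the extracted folder against it.
sshpass -p 'password' scp login@my.nas.ip:/path/to/data.md5 /tmp/data.md5
cd /top/level/matching/tar/contents
md5sum -c /tmp/data.md5    # prints OK per file, or flags any mismatch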
Looking for a simple way to download a .zip from the latest GitHub release.
There are other similar questions, but I haven't been able to get them to work. :(
Trying to pull latest release from https://github.com/CTCaer/hekate
Currently I've got:
#!/bin/bash
curl -s https://api.github.com/repos/CTCaer/hekate/releases/latest | jq -r ".assets[] | select(.name | test(\"hekate_ctcaer\")) | .browser_download_url"
I'm trying to fetch the URL of the latest .zip and only grab the "hekate_ctcaer_X.X.X_Nyx_X.X.X.zip".
I saw someone trying to achieve this with 'Xidel', so I'm open to trying that if someone knows the syntax to grab a specific file from the GitHub API.
As I understand it, the GitHub API returns an array of release 'assets', so I'm trying to select the item in that array that matches "hekate_ctcaer" and download that file.
GitHub is also an ordinary Git remote, so here is another train of thought.
Use git ls-remote to fetch the latest release tag:
git -c 'versionsort.suffix=-' ls-remote --tags --sort='v:refname' http://github.com/CTCaer/hekate.git \
  | tail --lines=1 \
  | cut --delimiter='/' --fields=3
Here this example outputs v5.8.0.
Then clone the remote repo:
git clone --branch v5.8.0 http://github.com/CTCaer/hekate.git
Then zip the repo into an archive:
zip hekate.zip -r hekate/
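For convenience, the three steps can be chained in a short script. This is a sketch reusing the pipeline and repo URL from above; --depth 1 is optional and merely avoids fetching the full history:
#!/bin/bash
# Find the newest tag on the remote without cloning.
tag=$(git -c 'versionsort.suffix=-' ls-remote --tags --sort='v:refname' \
        http://github.com/CTCaer/hekate.git \
      | tail --lines=1 | cut --delimiter='/' --fields=3)
# Clone just that tag and zip it up.
git clone --depth 1 --branch "$tag" http://github.com/CTCaer/hekate.git
zip -r "hekate-$tag.zip" hekate/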
This fetches the URL of the source zipball for the latest tag and downloads it as latest.zip:
curl -sL https://api.github.com/repos/CTCaer/hekate/tags \
| jq -r '.[0].zipball_url' \
| xargs -I {} curl -sL {} -o latest.zip
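Note that zipball_url is a source-code archive of the tag, not a release asset. If you want the actual hekate_ctcaer asset, the jq filter from the question can be combined with the same xargs/curl pattern; a sketch:
# Grab the browser_download_url of the matching asset and save it under its own name.
curl -s https://api.github.com/repos/CTCaer/hekate/releases/latest \
  | jq -r '.assets[] | select(.name | test("hekate_ctcaer")) | .browser_download_url' \
  | xargs -I {} curl -sLO {}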
I saw someone trying to achieve this with 'Xidel'
I assume you're referring to my answer here. That answer is tagged batch-file, so first of all you have to swap the quotes for bash ("function('string')" --> 'function("string")'). And secondly, you're right: you have to select the appropriate object in the "assets"-array.
$ xidel -s "https://api.github.com/repos/CTCaer/hekate/releases/latest" \
-f '$json/(assets)()[starts-with(name,"hekate_ctcaer")]/browser_download_url' \
--download '{substring-after($headers[starts-with(.,"Content-Disposition")],"filename=")}'
This downloads 'hekate_ctcaer_5.8.0_Nyx_1.3.0.zip' in the current dir.
With r8389 or newer you can just use --download . instead.
Also, how would I modify this for the following: github.com/Atmosphere-NX/Atmosphere/releases/tag/1.3.2 (the .zip AND the .bin)?
Strictly speaking you'd have to raise a new question for this, but ok.
It appears that (at the moment) v1.3.2 is also the latest release for this repo, so you can use...
$ xidel -s "https://api.github.com/repos/Atmosphere-NX/Atmosphere/releases/latest" \
-e '$json'
or alternatively...
$ xidel -s "https://api.github.com/repos/Atmosphere-NX/Atmosphere/releases" \
-e '$json()[tag_name="1.3.2"]'
The "assets"-array here has just 2 objects; one with the zip-file and one with the bin-file, so just "follow" (--follow / -f) the 2 "browser_download_url"-keys to download:
$ xidel -s "https://api.github.com/repos/Atmosphere-NX/Atmosphere/releases" \
-f '$json()[tag_name="1.3.2"]//browser_download_url' \
--download .
I'm trying to create a cron job that downloads the latest version of WhatsApp's APK from their website using a bash script and makes it available through my site.
So far, I'm able to obtain the version number from the site using the following (user-agent part omitted):
wget -q -O - "$@" whatsapp.com/android | grep -oP '(?<=Version )([\d.]+)'
And I can download the APK using the following command:
wget http://www.whatsapp.com/android/current/WhatsApp.apk
That part is fine. What I can't figure out is how to download the APK only if it's newer than the existing APK on the server. What should the script look like?
Since I'm not a command-line pro, I guess there's a better way to achieve this than my current approach, so if you have any suggestions, I'd appreciate it very much.
Seems like you need to manage the version yourself.
I would store the APK files with a version number in the filename, e.g. WhatsApp_<version-number>_.apk. The script that downloads the newer file could then look like this:
# Get the local version from the newest file name
oldVer=$(ls -v1 | grep -v latest | tail -n 1 | awk -F "_" '{print $2}')
# Get the server version
newVer=$(wget -q -O - "$@" whatsapp.com/android | grep -oP '(?<=Version )([\d.]+)')
# Determine which of the two versions is newer (sort -V compares dotted version numbers)
newestVer=$(echo -e "$oldVer\n$newVer" | sort -V | tail -n 1)
# Download only if the server version is strictly newer
if [ "$newVer" = "$newestVer" ] && [ "$oldVer" != "$newVer" ]; then
    wget -O "WhatsApp_${newVer}_.apk" http://www.whatsapp.com/android/current/WhatsApp.apk
else
    echo "The newest version is already downloaded"
fi
# Delete all files that are not the new version
find ! -name "*$newVer*" ! -type d -exec rm -f {} \;
# Point the 'latest' symlink at the newest file
ln -sf $(ls -v1 | grep -v latest | tail -n 1) latest
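Since the question mentions a cron job, the script above can be saved somewhere like /usr/local/bin/update-whatsapp.sh (a hypothetical name and path) and scheduled with a crontab entry such as the sketch below. Because the script works with relative paths, cd into the directory your site serves the APKs from first (shown here as /var/www/apk, also a placeholder):
# crontab -e: check for a new APK every day at 04:00 and log the output.
0 4 * * * cd /var/www/apk && /usr/local/bin/update-whatsapp.sh >> /var/log/update-whatsapp.log 2>&1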
I want to get only specific files using the svn command line utility.
I have a batch script that gets only specific files from VSS, using the ss tool of VSS.
In VSS the command is:
ss get *.c
I need similar functionality with the svn command line utility.
How can I start?
You can try something like the following: get the list of files matching some extension, like .txt or .c.
svn list -R http://svn/url/till/the/path/you/need/ | grep ".extension"
Then, for every line in the output of the above command, use svn export to get the files onto your local machine.
Edit:
repository=http://svn/url/till/the/path/need/
target_directory=/some/path/on/the/machine
# Loop over every matching file in the repository and export it locally.
for line in $(svn list -R "$repository" | grep ".extension")
do
    filename=$(echo "$line" | sed "s|$repository||g")
    if [ ! -d "$target_directory$filename" ]; then
        directory=$(dirname "$filename")
        mkdir -p "$target_directory$directory"
        svn export --force -r HEAD "$repository$line" "$target_directory$filename" --username abc --password password123
    fi
done
#!/bin/bash
mkdir /tmp
curl -O http://www.mucommander.com/download/nightly/mucommander-current.app.tar.gz /tmp/mucommander.tgz
tar -xvzf /tmp/mucommander.tgz */mucommander.app/*
cp -r /tmp/mucommander.app /Applications
rm -r /tmp
I'm trying to create a shell script to download and extract muCommander to my applications directory on a Mac.
I tried cd'ing into the tmp dir, but then the script stops when I do that.
I can extract everything using the -C argument, but the current tgz path is muCommander-0_9_0/mucommander.app, which could change in later builds, so I'm trying to keep it generic.
Can anyone give me pointers where I'm going wrong?
Thanks in advance.
Strip the first path component when you untar the archive, from tar(1):
--strip-components count
(x mode only) Remove the specified number of leading path elements.
Pathnames with fewer elements will be silently skipped. Note that the
pathname is edited after checking inclusion/exclusion patterns but
before security checks.
Update
Here is a working bash example of how to, fairly generically, copy the contents of the tgz file to /Applications.
shopt -s nocaseglob
TMPDIR=/tmp
APP=mucommander
TMPAPPDIR=$TMPDIR/$APP
mkdir -p $TMPAPPDIR
curl -o $TMPDIR/$APP.tgz http://www.mucommander.com/download/nightly/mucommander-current.app.tar.gz
tar --strip-components=1 -xvzf $TMPDIR/$APP.tgz -C $TMPAPPDIR
mv $TMPAPPDIR/${APP}* /Applications
# rm -rf $TMPAPPDIR $TMPDIR/$APP.tgz
The rm command is commented out for now, verify that it does no harm before you use it.
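One way to follow that advice (a sketch) is to append a quick check to the end of the script and only clean up once the bundle is confirmed in place:
# Verify the app bundle made it to /Applications, then remove the temporary files.
ls /Applications | grep -iq "$APP" && rm -rf "$TMPAPPDIR" "$TMPDIR/$APP.tgz"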
The following will update your muCommander.
# For safety, remove any old temporary extraction from /tmp
rm -rf /tmp/muCommander.app
# Kill the running muCommander - you don't want to replace the running app
ps -ef | grep ' /Applications/muCommander.app/' | grep -v grep | awk '{print $2}' | xargs kill
# Download, extract, remove old, move new, open.
# Each command runs only if the previous one succeeded.
curl http://www.mucommander.com/download/nightly/mucommander-current.app.tar.gz |\
tar -xzf - -C /tmp --strip-components=1 '*/muCommander.app' && \
rm -rf /Applications/muCommander.app && \
mv /tmp/muCommander.app /Applications && \
open /Applications/muCommander.app
Beware: after each '\' there must be a newline immediately, not any spaces...