I would like to download a file with curl, check its checksum with sha1sum (or a similar tool), and pipe the file into tar to unpack it, provided that sha1sum exits with status 0.
I know that without the checksum verification it would be a simple curl <link> | tar x, but I'm having a hard time fitting sha1sum in there since its syntax is very foreign to me. I could probably manage it if sha1sum were able to receive the checksum as a parameter and read the file from stdin, but as far as I have seen this is not possible. Is there a way to achieve this nonetheless?
set -ex; \
curl -o wordpress.tar.gz -fSL "https://wordpress.org/wordpress-4.7.3.tar.gz"; \
echo "35adcd8162eae00d5bc37f35344fdc06b22ffc98 *wordpress.tar.gz" | sha1sum -c -; \
tar -xzf wordpress.tar.gz -C ./
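sha1sum has to see the whole download before you can decide whether extraction is safe, so an intermediate file (as above) is hard to avoid in a plain shell pipeline. A minimal sketch of the same idea with explicit short-circuiting and cleanup, reusing the URL and checksum from the snippet above (the temporary filename is an arbitrary choice):
url="https://wordpress.org/wordpress-4.7.3.tar.gz"
sha1="35adcd8162eae00d5bc37f35344fdc06b22ffc98"
curl -fSL -o /tmp/wp.tar.gz "$url" \
  && echo "$sha1  /tmp/wp.tar.gz" | sha1sum -c - \
  && tar -xzf /tmp/wp.tar.gz -C . \
  && rm /tmp/wp.tar.gz
Each step runs only if the previous one succeeded, so a failed checksum means nothing gets unpacked.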
I'm working on a Python script that verifies the integrity of some downloaded projects.
On my NAS, I have all my compressed folders: folder1.tar.gz, folder2.tar.gz, …
On my Linux computer, I have the equivalent uncompressed folders: folder1, folder2, …
So I want to compare the integrity of my files without untarring or downloading anything.
I think I can do it on the NAS with something like this (using md5sum):
sshpass -p 'password' ssh login@my.nas.ip tar -xvf /path/to/my/folder.tar.gz | md5sum | awk '{ print $1 }'
This gives me a hash, but I don't know how to get an equivalent hash to compare against the normal folder on my computer. Maybe the way I am doing it is wrong.
I need one command for the NAS and one for the Linux computer that output the same hash (if the folders are the same, of course).
If you did that, tar xf would actually extract the files. md5sum would only see the file listing, and not the file content.
However, if you have GNU tar on the server and the standard utility paste, you could create checksums this way:
mksums:
#!/bin/bash
data=/path/to/data.tar.gz
sums=/path/to/data.md5
paste \
<(tar xzf "$data" --to-command=md5sum) \
<(tar tzf "$data" | grep -v '/$') \
| sed 's/-\t//' > "$sums"
Run mksums above on the machine with the tar file.
Copy the sums file it creates to the computer with the folders and run:
cd /top/level/matching/tar/contents
md5sum -c "$sums"
paste joins lines of files given as arguments
<(...) runs a command, making its output appear in a FIFO (named pipe)
--to-command is a GNU tar extension which allows running commands which will receive their data from stdin
grep filters out directories from the tar listing
sed removes the extraneous -\t so the checksum file can be understood by md5sum
The above assumes you don't have any very oddly named files (for example, names containing newlines will break it)
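Each line of the resulting data.md5 ends up in the usual md5sum format: hash, two spaces, relative path. An illustrative line (the hash shown here is just the md5 of an empty file, used as an example) looks like:
d41d8cd98f00b204e9800998ecf8427e  folder1/some/file.txt
md5sum -c then recomputes the hash of each listed path under the current directory and prints OK or FAILED per file.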
This is a 2 part question.
I've made a bash script that logs into a remote server, makes a list.txt, and saves it locally.
#!/bin/bash
sshpass -p "xxxx" ssh user#pass ls /path/to/files/ | grep "^.*iso" > list.txt
It then starts a for loop using the list.txt
for f in $(cat list.txt); do
The next command splits the target file and saves it locally
sshpass -p "xxxx" ssh user#pass tar --no-same-owner -czf - /path/to/files/$f | split -b 10M - "$f.tar.bz2.part"
Question 1
I need help understanding the above command: why is it saving the part files locally? Even though that is what I intend to do, I would like to understand it better. How would I do this the other way round, i.e. tar and split the files but save the output to a remote directory (flip around what is happening in the above command, using the same tools; sshpass is a requirement)?
Question 2
When running the above command, even though I have not made it verbose, it still prints this message:
tar: Removing leading `/' from member names
How do I get rid of it? I have my own echo output as part of the script. I have tried the following after searching online, but I think piping a few commands together confuses tar and breaks the operation.
I have tried these with no luck:
sshpass -p "xxxx" ssh user#pass tar --no-same-owner -czfP - /path/to/files/$f | split -b 10M - "$f.tar.bz2.part
sshpass -p "xxxx" ssh user#pass tar --no-same-owner -czf -C /path/to/files/$f | split -b 10M - "$f.tar.bz2.part
sshpass -p "xxxx" ssh user#pass tar --no-same-owner -czf - /path/to/files/$f | split -b 10M - "$f.tar.bz2.part > /dev/null 2>&1
sshpass -p "xxxx" ssh user#pass tar --no-same-owner -czf - /path/to/files/$f > /dev/null 2>&1 | split -b 10M - "$f.tar.bz2.part
All of the above break the operation, and I would like it to not display any messages at all. I suspect it has something to do with how the pipe passes arguments through. Any input is appreciated.
Anyway, this is just part of the script; the other part uploads the processed file after tarring and splitting it, but I've had to break it up into a few commands: a 'tar | split' locally, then uploading via rclone. It would be far more efficient if I could pipe the output of split and save it remotely via ssh.
First and foremost, consider the security implications of using sshpass.
About question 1:
Using tar with the -f - option creates the archive on the fly and sends it to stdout.
The | separates the commands:
sshpass -p "xxxx" ssh user@host tar --no-same-owner -czf - /path/to/files/$f - runs remotely
split -b 10M - "$f.tar.bz2.part" - runs in the local shell
The second command reads its stdin from the first command (the tar output) and creates the part files locally.
If you want to perform all the operations on the remote machine, you can enclose the whole pipeline in quotes like this (the quoting matters: the outer double quotes let your local shell expand $f before the command is sent to the server):
sshpass -p "xxxx" ssh user@host "tar --no-same-owner -czf - /path/to/files/$f | split -b 10M - '$f.tar.bz2.part'"
About question 2:
The message tar: Removing leading `/' from member names is generated by tar, which sends errors and warnings to stderr; in a terminal, stderr defaults to the user's screen.
So you can suppress tar's warnings by adding 2>/dev/null:
sshpass -p "xxxx" ssh user@host tar --no-same-owner -czf - /path/to/files/$f 2>/dev/null | split -b 10M - "$f.tar.bz2.part"
#!/bin/bash
mkdir /tmp
curl -O http://www.mucommander.com/download/nightly/mucommander-current.app.tar.gz /tmp/mucommander.tgz
tar -xvzf /tmp/mucommander.tgz */mucommander.app/*
cp -r /tmp/mucommander.app /Applications
rm -r /tmp
I'm trying to create a shell script to download and extract muCommander to my applications directory on a Mac.
I tried to cd into the tmp dir, but then the script stops when I do that.
I can extract all using the -C argument, but the current tgz path is muCommander-0_9_0/mucommander.app, which could change on later builds, so I'm trying to keep it generic.
Can anyone give me pointers where I'm going wrong?
Thanks in advance.
Strip the first path component when you untar the archive, from tar(1):
--strip-components count
(x mode only) Remove the specified number of leading path ele-
ments. Pathnames with fewer elements will be silently skipped.
Note that the pathname is edited after checking inclusion/exclu-
sion patterns but before security checks.
Update
Here is a working bash example of how to, fairly generically, copy the contents of the tgz file to /Applications.
shopt -s nocaseglob
TMPDIR=/tmp
APP=mucommander
TMPAPPDIR=$TMPDIR/$APP
mkdir -p $TMPAPPDIR
curl -o $TMPDIR/$APP.tgz http://www.mucommander.com/download/nightly/mucommander-current.app.tar.gz
tar --strip-components=1 -xvzf $TMPDIR/$APP.tgz -C $TMPAPPDIR
mv $TMPAPPDIR/${APP}* /Applications
# rm -rf $TMPAPPDIR $TMPDIR/$APP
The rm command is commented out for now, verify that it does no harm before you use it.
The following will update your muCommander.
#for safety, remove any old temporary extraction from /tmp
rm -rf /tmp/muCommander.app
#kill the running muCommander - you don't want to replace the running app
ps -ef | grep ' /Applications/muCommander.app/' | grep -v grep | awk '{print $2}' | xargs kill
#download, extract, remove old, move new, open
#each command run only when the previous ended with success
curl http://www.mucommander.com/download/nightly/mucommander-current.app.tar.gz |\
tar -xzf - -C /tmp --strip-components=1 '*/muCommander.app' && \
rm -rf /Applications/muCommander.app && \
mv /tmp/muCommander.app /Applications && \
open /Applications/muCommander.app
Beware: after the '\' there must be a newline, not any spaces...
I have a pkg file created by Install Maker for Mac.
I want to replace one file in the pkg, but I must do this under Linux, because it is part of the download process: when a user starts to download the file, the server must replace one file in the pkg.
I have a solution for unpacking the pkg and replacing a file, but I don't know how to pack it into a pkg again.
http://emresaglam.com/blog/1035
http://ilostmynotes.blogspot.com/2012/06/mac-os-x-pkg-bom-files-package.html
Packages are just .xar archives with a different extension and a specified file hierarchy. Unfortunately, part of that file hierarchy is a cpio.gz archive of the actual installables, and usually that's what you want to edit. And there's also a Bom file that includes information on the files inside that cpio archive, and a PackageInfo file that includes summary information.
If you really do just need to edit one of the info files, that's simple:
mkdir Foo
cd Foo
xar -xf ../Foo.pkg
# edit stuff
xar -cf ../Foo-new.pkg *
But if you need to edit the installable files:
mkdir Foo
cd Foo
xar -xf ../Foo.pkg
cd foo.pkg
cat Payload | gunzip -dc |cpio -i
# edit Foo.app/*
rm Payload
find ./Foo.app | cpio -o | gzip -c > Payload
mkbom Foo.app Bom # or edit Bom
# edit PackageInfo
rm -rf Foo.app
cd ..
xar -cf ../Foo-new.pkg *
I believe you can get mkbom (and lsbom) for most linux distros. (If you can get ditto, that makes things even easier, but I'm not sure if that's nearly as ubiquitously available.)
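If you want to double-check a Bom you generated or edited, lsbom prints its contents (paths, modes, uid/gid). A quick sanity check, assuming lsbom is available (it is standard on macOS, and the bomutils project provides it for Linux):
lsbom Bom | head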
Here is a bash script inspired by abarnert's answer which will unpack a package named MyPackage.pkg into a subfolder named MyPackage_pkg and then open the folder in Finder.
#!/usr/bin/env bash
filename="$*"
dirname="${filename/\./_}"
pkgutil --expand "$filename" "$dirname"
cd "$dirname"
tar xvf Payload
open .
Usage:
pkg-upack.sh MyPackage.pkg
Warning: This will not work in all cases, and will fail with certain files, e.g. the PKGs inside the OSX system installer. If you want to peek inside the pkg file and see what's inside, you can try SuspiciousPackage (free app), and if you need more options such as selectively unpacking specific files, then have a look at Pacifist (nagware).
You might want to look into my fork of pbzx here: https://github.com/NiklasRosenstein/pbzx
It allows you to stream pbzx files that are not wrapped in a XAR archive. I've experienced this with recent Xcode Command Line Tools disk images (e.g. Xcode 8 for 10.12).
pbzx -n Payload | cpio -i
In addition to what @abarnert said, today I found out that the default cpio utility on Mountain Lion uses a different archive format by default (not sure which), even though the man page states it would use the old cpio/odc format. So, if anyone stumbles upon the cpio read error: bad file format message while trying to install their manipulated packages, be sure to include the format in the re-pack step:
find ./Foo.app | cpio -o --format odc | gzip -c > Payload
@shrx I succeeded in unpacking the BSD.pkg (part of the Yosemite installer) by using the "pbzx" command.
pbzx <pkg> | cpio -idmu
The "pbzx" command can be downloaded from the following link:
pbzx Stream Parser
If you are experiencing errors during PKG installation after following the accepted answer, here is another procedure that worked for me (note the small changes to the xar, cpio and mkbom commands):
mkdir Foo
cd Foo
xar -xf ../Foo.pkg
cd foo.pkg
cat Payload | gunzip -dc | cpio -i
# edit Foo.app/*
rm Payload
find ./Foo.app | cpio -o --format odc --owner 0:80 | gzip -c > Payload
mkbom -u 0 -g 80 Foo.app Bom # or edit Bom
# edit PackageInfo
rm -rf Foo.app
cd ..
xar --compression none -cf ../Foo-new.pkg *
The resulting PKG has no compression, cpio now uses the odc format and sets the owner of the files, and mkbom sets the same owner and group.
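Before installing, it can be worth listing the rebuilt archive to confirm that Payload, Bom and PackageInfo all made it in; xar's -t flag prints an archive's table of contents:
xar -tf ../Foo-new.pkg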
Bash script to extract a pkg (inspired by this answer: https://stackoverflow.com/a/23950738/16923394):
Save the following code to a file named pkg-upack.sh in the $HOME/Downloads folder.
#!/usr/bin/env bash
filename="$*"
dirname="${filename/\./_}"
mkdir "$dirname"
# pkgutil --expand "$filename" "$dirname"
xar -xf "$filename" -C "$dirname"
cd "$dirname"/*.pkg
pwd
# tar xvf Payload
cat Payload | gunzip -dc |cpio -i
# cd usr/local/bin
# pwd
# ls -lt
# cp -i * $HOME/Downloads/
Uncomment the last four lines if you are using a Rudix package.
Usage:
cd $HOME/Downloads
chmod +x ./pkg-upack.sh
./pkg-upack.sh MyPackage.pkg
This was tested with the ffmpeg and mawk packages from rudix.org (https://rudix.org); search for the ffmpeg and mawk packages on that site.
Source: My open source projects: https://sourceforge.net/u/nathan-sr/profile/
I often need to fetch tgz files, decompress them, and then delete the tgz.
How can I do all three steps with one simple command?
wget http://site/path/file.tgz -O - | tar -zxvf -
You can use:
curl <url> | tar xz
Or put in your bashrc:
function ctxz {
curl "$1" | tar xz
}
and just use:
ctxz <url>
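If the URL can redirect or return an error page, a slightly more defensive variant of the same function may help (a sketch: -fSL, the same flags the first answer in this thread uses, makes curl fail on server errors instead of feeding an error page to tar, and follows redirects):
function ctxz {
    curl -fSL "$1" | tar xz
}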