Is there any command line code which can help me verify passwords for files within zip files? - shell

I have about 200 zip files and need to verify the password for each of them. Is there any command-line way to verify the passwords for the files within these zip files?

pass="testpassword"
for arch in *.zip; do
unzip -P $pass -qq -t "$arch" && echo "Pass $pass for $arch"
done
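
A variant of the same loop that reports both outcomes, under the assumption that every archive is supposed to accept the same password:

pass="testpassword"
for arch in *.zip; do
    # -t tests the archive, which fails when the password is wrong;
    # -qq plus the redirect keep the output to one line per archive.
    if unzip -P "$pass" -qq -t "$arch" > /dev/null 2>&1; then
        echo "Pass ok:     $arch"
    else
        echo "Pass FAILED: $arch"
    fi
done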

Related

How to define the output folder of curl while running a script

I have a command that executes when the script finishes and downloads files from a list. I use Termux on Android, and it says you can't use cd while running a script.
xargs -n 1 curl -O -C - <url
But it downloads all files to the folder where I ran this script. How can I change the output directory?
PS: Only curl, please. aria2c and wget will be ignored by me.
Okay. This is the script I use now:
while read -r url
do
    curl --create-dirs -o "$filepath/$name" "$url"
done < url
I use the "basename" of the URL for $name.
Please answer if you have better code.
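
A minimal sketch of one way to do this without cd, assuming the URL list lives in a file named url (as in the xargs example above) and that the installed curl is 7.73.0 or newer, which added the --output-dir option; /sdcard/Download is just an example target directory:

# Save every download into /sdcard/Download while keeping the remote names.
# --output-dir needs curl >= 7.73.0; on older builds, fall back to the
# --create-dirs/-o form shown above.
xargs -n 1 curl --create-dirs --output-dir /sdcard/Download -O -C - < url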

curl script to extract tar files

I have a curl script that I use to read files from a URL. I now need to extract the tar files after they are downloaded.
for file in $(/usr/bin/curl 'https://linktofile.com/yourfile.txt'); do
    echo "$file"
    /usr/bin/curl -L -J -O "$file"
done
The output to the screen is:
curl: Saved to filename 'your_file_1.tar'
How do I extract the file that was saved?
I tried adding tar -xvf $file, but nothing happens.
How do I get the name of the file that was just saved?
Using -J makes the saved filename depend on the Content-Disposition header, so you need to retrieve the name curl actually used. You can do that with -w, telling curl to print the effective filename:
for url in $(/usr/bin/curl 'https://linktofile.com/yourfile.txt'); do
    filename=$(/usr/bin/curl -sLJOw '%{filename_effective}' "$url")
    tar -xvf "$filename"
done
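
A slightly more defensive variant of the same idea, under the assumption that the listing contains one URL per line and that not every download is a tar archive:

/usr/bin/curl -s 'https://linktofile.com/yourfile.txt' | while read -r url; do
    # Capture the name curl chose; skip this URL if the download failed.
    filename=$(/usr/bin/curl -sLJOw '%{filename_effective}' "$url") || continue
    # Only extract files that look like tar archives.
    case $filename in
        *.tar) tar -xvf "$filename" ;;
    esac
done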

How to replace the manifest of a war file from bash?

I have a war file foo.war in my current working directory. I have a MANIFEST.MF file, too.
ls .
foo.war MANIFEST.MF
Within foo.war, there is a file foo/META-INF/MANIFEST.MF.
I want to replace that file inside the war with the file in my working directory.
I think zip can be used to do this, but I don't understand the zip documentation.
What would be the correct usage?
How can I verify that it was done correctly?
How would I do that with jar?
After experimenting for a while, I found this solution:
#!/bin/bash
set -e
apt-get install -y zip
apt-get install -y unzip
rm -f foo.war
cp foo.original.war foo.war
touch META-INF/MANIFEST.MF
zip -f foo.war META-INF/MANIFEST.MF
rm -f test
unzip -p foo.war META-INF/MANIFEST.MF >test
# Uncomment next line to test verification:
#echo "X" >>test
if diff test META-INF/MANIFEST.MF; then
    echo "OK."
else
    ERR=$?
    echo "Failed (diff exit code $ERR)."
    exit 1
fi
zip seems to replace the file only if the timestamp is new enough. What exactly "new enough" means is still unclear to me.
I extracted the MANIFEST.MF file from zip using unzip -p foo.war META-INF/MANIFEST.MF >test
I use the exit code of diff to verify MANIFEST.MF has really been replaced.
zip spits out the warning "zip warning: Local Entry CRC does not match CD: META-INF/MANIFEST.MF". The meaning of this warning is unclear to me.
Do I really have to put MANIFEST.MF inside the META-INF folder or is there a possibility to specify the source location of the file that should be replaced?
I don't like that "touch". Is there a way to force the replacement without fiddling with timestamps?
What about character encoding, unusual file names and line endings?
Try placing the MANIFEST.MF in a directory called foo/META-INF/,
then run:
zip -f foo.war foo/META-INF/MANIFEST.MF
To verify the file, run:
unzip -l foo.war
Use the same approach for .jar files.
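
Since the question also asks about jar: a minimal sketch using the JDK's jar tool, which updates the manifest without any timestamp fiddling. Note that jar merges the given file's attributes into the existing manifest rather than replacing it byte for byte, so verify afterwards as above:

# u = update an existing archive, m = apply the named manifest file,
# f = operate on the named archive (arguments follow in flag order).
jar umf MANIFEST.MF foo.war
# Verify by printing the manifest back out of the archive.
unzip -p foo.war META-INF/MANIFEST.MF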

Bash scp several files password issue

I am trying to copy several files from a remote server onto a local drive in Bash using scp.
Here's the part of the code
scp -r -q $USR@$IP:/home/file1.txt $PWD
scp -r -q $USR@$IP:/home/file2.txt $PWD
scp -r -q $USR@$IP:/root/file3.txt $PWD
However, the problem is that EVERY time that it wants to copy a file, it keeps asking for the password of the server, which is the same. I want it to ask only once and then copy all my files.
And please, do not suggest rsync or setting up key-based authentication, since I do not want to do that.
Are there any other ways...?
Any help would be appreciated
You can use an expect script or sshpass.
sshpass -p 'password' scp ...
#!/usr/bin/expect -f
spawn scp ...
expect "password:"
send "ur_password\r"
expect eof
A disadvantage is that your password is now in plain text.
I'm assuming that if you can scp files from the remote server, you can also ssh in and create a tarball of the remote files.
The -r flag is for recursively copying entire directories, but you're listing distinct files in your command, so -r is superfluous.
Try this from the bash shell on the remote system:
$ mkdir /home/file_mover
$ cp /home/file1.txt /home/file_mover/
$ cp /home/file2.txt /home/file_mover/
$ cp /root/file3.txt /home/file_mover/
$ tar -cvf /home/myTarball.tar /home/file_mover/
$ scp -q $USR@$IP:/home/myTarball.tar $PWD
Well, in this particular case, you can write...
scp -q $USR@$IP:/home/file[1-3].txt $PWD
(The bracket glob only matches under /home, so file3.txt in /root would still need its own transfer.)
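
Another way to keep plain password authentication while prompting only once is OpenSSH connection sharing. A minimal sketch, assuming OpenSSH on both ends; the control-socket path is just an example:

ctl=/tmp/scp_ctl.%h
# Open a master connection in the background; this is the single password prompt.
ssh -Nf -o ControlMaster=yes -o ControlPath="$ctl" -o ControlPersist=60 $USR@$IP
# Subsequent copies reuse the master connection, so no further prompts.
scp -o ControlPath="$ctl" -q $USR@$IP:/home/file1.txt $PWD
scp -o ControlPath="$ctl" -q $USR@$IP:/home/file2.txt $PWD
scp -o ControlPath="$ctl" -q $USR@$IP:/root/file3.txt $PWD
# Close the master connection when done.
ssh -O exit -o ControlPath="$ctl" $USR@$IP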

Download data from FTP Website

There is something I am missing, or it might be the whole approach. I am trying to download NCDC data from the NCDC Datasets and am unable to do it on the Unix box.
The command which I have used so far is
wget ftp://ftp.ncdc.noaa.gov:21/pub/data/noaa/1901/029070-99999-1901.gz">029070-99999-1901.gz
This is for one file, but I will be very happy if I can download the entire parent directory.
You seem to have a lonely " just before the >
To download everything, you can try this command to get the whole directory's content:
wget -r ftp://ftp.ncdc.noaa.gov:21/pub/data/noaa/1901/*
for i in {1990..1993}
do
    echo "$i"
    cd /home/chile/data
    # -nH  disable generation of host-prefixed directories
    # -nd  save all files to the current directory
    # -np  never ascend to the parent directory when retrieving recursively
    # -R   don't download files matching these glob patterns (quoted so the
    #      local shell does not expand them)
    wget -r -nH -nd -np -R "*.html,999999-99999-$i.gz*" "http://www1.ncdc.noaa.gov/pub/data/noaa/$i/"
done
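
For the original wish of downloading the entire parent directory (all years at once), a hedged sketch along the same lines; --cut-dirs=2 strips the leading pub/data/ path components and may need adjusting:

# Mirror everything under /pub/data/noaa/ into the current directory tree.
wget -r -nH --cut-dirs=2 -np -R "index.html*" "http://www1.ncdc.noaa.gov/pub/data/noaa/"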
