sha256sum: 'standard input': no properly formatted checksum lines found - macOS

I'm trying to verify an installer by running this command:
grep node-vx.y.z.tar.gz SHASUMS256.txt | sha256sum -c -
But I get this response from Terminal:
sha256sum: 'standard input': no properly formatted checksum lines found
Obviously, I already have the SHASUMS256.txt file, and I'm in the right directory in Terminal (macOS).
Also, prior to this, I had already installed coreutils from brew by running this command:
brew install coreutils
So what does 'standard input': no properly formatted checksum lines found mean in this context on MacOS Terminal?

It would help if you posted the contents of the file:
SHASUMS256.txt
If it is properly formatted it should contain a line that looks something like:
09ac98ea433e8cb19d09ead1a0bd439cafb1c3e4cb699af04b4e4da1f6ca3f02 node-vx.y.z.tar.gz
You could be getting that stderr output for two possible reasons:
1. The line in SHASUMS256.txt is not formatted correctly. It should be formatted as <ChecksumHash>  <Filename> (without the <>s; GNU sha256sum expects two spaces, or a space plus an asterisk for binary mode, between the two fields).
2. The output from your grep search is blank (grep returned no matches). This is the more likely cause, and I would suggest running the grep command by itself first to see what its output is before piping it into sha256sum.
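That debugging order can be sketched with a throwaway file (filenames here are hypothetical, not from the question), reproducing both the working case and the empty-pipe failure:

```shell
# Build a known-good checksum file, then show both outcomes.
printf 'hello\n' > demo.txt
sha256sum demo.txt > SUMS.txt            # line format: "<hash>  demo.txt"

grep demo.txt SUMS.txt | sha256sum -c -  # match found -> prints "demo.txt: OK"
grep no-match SUMS.txt | sha256sum -c -  # grep prints nothing, so sha256sum reads
                                         # empty stdin -> the stderr error above
```

An empty pipe is indistinguishable from a malformed checksum file as far as sha256sum -c is concerned, which is why checking the grep output first narrows things down quickly.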

On my macOS Monterey version 12.6.1 install, I have to add an asterisk before the filename, like this:
36a2c2c01723151b36987e377ebfc152337cc61699eaa6bf45d9460a4c3e216b *nifi-toolkit-1.17.0-bin.zip
And then run the command:
shasum -a 256 -c input.txt
Finally, I got this:
nifi-toolkit-1.17.0-bin.zip: OK
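For context (my understanding, worth checking against the shasum man page): the asterisk marks binary mode, while text mode uses two spaces between hash and filename. GNU sha256sum accepts the same asterisk convention, so the format can be sketched with a throwaway file:

```shell
# Create a file, then write its checksum line in "hash *filename" (binary) form.
printf 'data\n' > demo.bin
hash=$(sha256sum demo.bin | awk '{print $1}')
printf '%s *demo.bin\n' "$hash" > input.txt
sha256sum -c input.txt     # prints "demo.bin: OK"
```

On macOS, `shasum -a 256 -c input.txt` checks the same file; binary vs text mode only matters on platforms that distinguish the two, but the verifier still insists the marker column be well formed.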

Related

How to exclude a line from being printed in terminal on Mac OS

I am trying to use grep to filter my terminal commands.
When searching the entire pc for a certain term I get a few results I want to exclude.
I use the following command taken from this tutorial.
sudo find / -iname "searchterm" | grep -v "exclude from search term"
For some reason my terminal still prints every line containing the excluded search term.
My grep version is grep (BSD grep) 2.5.1-FreeBSD according to grep --version. I have also installed grep via homebrew and executed the above command with ggrep instead of grep with the same results.
I use this command quite a lot and would like to find a less verbose method to use it. Does anyone know what I am missing here?
UPDATE:
Since my question might have been misleading. I want to suppress lines from the output to the terminal, but not from the search result.
I repeatedly use sudo find / -iname "searchterm" to search for leftover files after uninstalling an application. Even with the sudo command I still get multiple lines of find: /some/path/*: Operation not permitted. This verbose output makes it difficult to find the files that I am actually looking for.
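One likely explanation (an assumption, since the question doesn't show it): those Operation not permitted lines are written to stderr, and a plain | only carries stdout, so grep -v never sees them. A minimal reproduction:

```shell
# Simulate find: one real result on stdout, one error on stderr.
# Without 2>&1 the error bypasses the pipe and still reaches the terminal;
# with it (or with 2>/dev/null) grep can filter or drop the noise.
{ echo '/found/leftover.file'; echo 'find: /x: Operation not permitted' >&2; } 2>&1 \
  | grep -v 'Operation not permitted'
```

Applied to the original command, `sudo find / -iname "searchterm" 2>/dev/null | grep -v "exclude from search term"` suppresses the permission errors while leaving the search results intact.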

How to download firefox with bash?

I need to download and run Firefox through a bash script, so I tried running the commands below:
curl -o ~/firefox.tar.bz2 https://download.mozilla.org/?product=firefox-latest-ssl&os=linux64
tar xjf ~/firefox.tar.bz2
~/firefox/firefox
Yet the very first command fails to download the tar file.
Note: The OS is Ubuntu 16, and I don't want to use apt-get.
Quote the address; otherwise the shell interprets the ampersand as a control operator and ends up trying to download something different from what you expect. Also, add the -L parameter to tell cURL to follow redirects:
curl -L -o ~/firefox.tar.bz2 "https://download.mozilla.org/?product=firefox-latest-ssl&os=linux64"
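The word splitting behind that can be demonstrated safely with echo standing in for curl (host shortened and hypothetical):

```shell
# Unquoted: the shell treats '&' as a control operator, backgrounds echo with
# a truncated URL, then runs `os=linux64` as a plain variable assignment.
echo https://host/dl?product=firefox-latest-ssl&os=linux64

# Quoted: one intact argument reaches the command.
echo "https://host/dl?product=firefox-latest-ssl&os=linux64"
```

The same applies to `?`, which is a glob character; quoting the whole URL sidesteps both problems at once.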

Using curl with an unpredictable target filename

The filename of my curl download target is unpredictable and globbing with an asterisk isn't possible. I can download the file using the following command, but only after I've determined its name in advance:
curl -O -vvv -k -u user:password https://myURL/ws/myfile.zip
How can I tailor my curl command to succeed with an unpredictable target name?
There's no easy way to get a directory listing using HTTP. You can use curl to just print the HTML generated by the site. If there's an index with links to the files on that server, simply running
curl -s -u user:password https://myURL/ws/ | grep .zip
will print HTML-formatted links to the zip files available for download on that page.
Intro:
Like the OP, I had a similar issue scripting the download of a binary (for docker-compose) from GitHub, because the version number keeps iterating, making the file name unpredictable.
This is how I solved it. Might not be the tidiest solution, but if you have a more elegant way, ping me a comment and I'll update the answer.
Solution:
I merely used an auto-populating variable that takes the output of curl, prints the first line (which will be the most recent release), and then greps for the release number prefaced by a "v". The result is saved to the path /home/ubuntu under the arbitrary file name "docker-compose-latest":
curl -L "https://github.com/docker/compose/releases/download/$(curl https://github.com/docker/compose/releases | grep -m1 '<a href="/docker/compose/releases/download/' | grep -o 'v[0-9:].[0-9].[0-9]')/docker-compose-$(uname -s)-$(uname -m)" -o /home/ubuntu/docker-compose-latest
And we validate that we received the correct binary (I'm downloading to a Raspberry Pi, which has an ARM processor, on 64-bit Ubuntu 20.04 LTS):
file /home/ubuntu/docker-compose-latest
Produces the following feedback on the file:
/home/ubuntu/docker-compose-latest: ELF 64-bit LSB executable, ARM aarch64, version 1 (SYSV), statically linked, Go BuildID=QqyJMzYMWOofWehXt3pb/T7U4zg-t8Xqz_11RybNZ/ukJOlZCpzQuZzBcwSK3b/d6ecQ2m2VfqKb_EQRUZA, stripped
To validate this solution works, just execute the above commands remembering to change the path of the file command if not using Ubuntu.
Conclusion:
Again, might not be the most elegant solution, but it's a solution for how one can download a target with curl that has an unpredictable filename.
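A tidier variant worth noting (an assumption on my part: GitHub continues to serve a releases/latest/download redirect for this repo) skips the HTML scraping entirely, since the server resolves "latest" for you. The URL construction itself can be sketched offline:

```shell
# Build the release URL from the machine's own identifiers.
ver="v2.20.0"                        # hypothetical tag; with the /releases/latest/
arch="$(uname -s)-$(uname -m)"       # download redirect, no tag is needed at all
url="https://github.com/docker/compose/releases/download/$ver/docker-compose-$arch"
echo "$url"
```

With the redirect, `curl -fsSL "https://github.com/docker/compose/releases/latest/download/docker-compose-$(uname -s)-$(uname -m)" -o docker-compose-latest` needs no version scraping and no pinning.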

Strange behavior when trying to source AWS environment variables from .sh file

To quickly switch identities while using the AWS CLI I have a folder of downloaded credentials (csv files). I hear there are more sophisticated ways to switch profiles but this way works for me and I've been too lazy to change it. I tried to change one of them into a shell script that I could source:
# identity.sh
export AWS_ACCESS_KEY_ID='AKIABLAHBLAHKQKLQLGA'
export AWS_SECRET_ACCESS_KEY='secretblahblahblah'
If I copy/paste these two lines one at a time into my session, everything works fine. But if I source the file (. ~/credentials/identity.sh), the variables are corrupted. It seems like my shell thinks the first variable has a newline in it. To illustrate:
$ export | grep AWS
"eclare -x AWS_ACCESS_KEY_ID="AKIABLAHBLAHKQKLQLGA
"eclare -x AWS_SECRET_ACCESS_KEY="secretblahblahblah
If I export the variables one at a time via copy/paste the output is as follows:
$ export | grep AWS
declare -x AWS_ACCESS_KEY_ID="AKIABLAHBLAHKQKLQLGA"
declare -x AWS_SECRET_ACCESS_KEY="secretblahblahblah"
It will surprise no one to learn that the CLI does not work when I try to source the file:
$ aws s3 ls
Invalid header value 'AWS AKIABLAHBLAHKQKLQLGA\r:blahblah='
Now this has been more of a curious puzzle than an actual showstopper issue, but I've tried various approaches to solving this:
Both variables on one line without a newline separated by && or ;
Double quotes, single quotes, no quotes.
Different editors for the file (the \r made me think that the editor was using the old Mac \r convention for newlines but even using nano and vi the same issue occurred)
Regenerating credentials (the first secret had a slash in it).
Permissions on the .sh file (you don't need -x to source variables and adding -x doesn't have any effect)
Anyone have a clue? Thanks in advance.
It is likely the identity.sh file was created on a Windows machine and has non-printing DOS line endings (carriage returns). You can check by using cat -t:
-t Display non-printing characters
$ cat -t identity.sh
will show you the control characters. You can convert the file to Unix format by using the dos2unix command. If you don't have dos2unix installed:
$ brew install dos2unix
Another option is to create the file from scratch on the Mac using vi or your favorite editor.
NAME
dos2unix - DOS/Mac to Unix and vice versa text file format converter
SYNOPSIS
dos2unix [options] [FILE ...] [-n INFILE OUTFILE ...]
unix2dos [options] [FILE ...] [-n INFILE OUTFILE ...]
DESCRIPTION
The Dos2unix package includes utilities "dos2unix" and "unix2dos" to convert plain text files in DOS or Mac format to Unix format and vice versa.
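If dos2unix isn't at hand, the same repair can be sketched with tr, which ships with every Mac (filename hypothetical, CRLF content simulated for the demo):

```shell
# Simulate a CRLF file like the one sourced in the question,
# then strip the carriage returns that corrupt the variables.
printf 'export AWS_ACCESS_KEY_ID=abc\r\n' > identity.sh
tr -d '\r' < identity.sh > identity.unix.sh
mv identity.unix.sh identity.sh
cat -t identity.sh     # no trailing ^M markers remain
```

The stray \r is exactly what produced the `AKIABLAHBLAHKQKLQLGA\r` seen in the Invalid header value error, so removing it fixes both the export display and the CLI call.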

Ubuntu Shell Script Output Directory

I don't use bash much, but lately I've had to use it and I have been continuously running into an issue with command-line arguments, specifically output directories. Some example code below:
Running:
log2timeline -f win7 -z EST5EDT -r -p -o csv -w /home/sansforensics/Desktop/PATH/file /mnt/hgfs/DRIVE/INPUT
on the command line works exactly as expected, the -w flag signifies the output path/file and that is exactly where it ends up, on the user's desktop in the PATH folder.
But when I run it from a .sh file sitting on the Desktop of the same user, i.e.
sudo bash test.sh
and all the file contains is the same exact command, the output file does not show up in the expected location. What am I missing here?
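One thing worth ruling out (an assumption, since the script itself isn't shown): sudo bash runs with a different environment, so $HOME, $PWD, or an unquoted ~ in the script may resolve differently than in the interactive shell. A quick probe:

```shell
# Write a tiny script and compare its view of the environment
# when run plainly versus under sudo.
cat > env-check.sh <<'EOF'
echo "HOME=$HOME"
echo "PWD=$PWD"
EOF
bash env-check.sh       # then compare against: sudo bash env-check.sh
```

If the two runs print different HOME or PWD values, any path the tool builds relative to them will land somewhere other than the user's Desktop.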
