I'm trying to implement a git hook that edits some JSON every time I push.
I have jq installed on my Mac via Homebrew (brew install jq), but when the git hook runs my .sh script I get the error:
jq: command not found
My latest attempt has been to use curl to download the jq binary, point to it, and run jq that way:
jq=/usr/local/Cellar/jqz
curl -L -o $jq https://github.com/stedolan/jq/releases/latest/download/jq-osx-amd64
Unfortunately, this also returns the same 'command not found' error.
Sidenote: jq=/usr/bin/jq gives me a permission error when I try to write to it.
jq=/usr/local/Cellar/jqz
curl -L -o $jq https://github.com/stedolan/jq/releases/latest/download/jq-osx-amd64
It looks like you are storing the binary under the name jqz. It's no surprise that it cannot be executed as jq; you would have to invoke it as jqz.
I also don't know whether /usr/local/Cellar is part of your PATH.
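You can check from inside the hook itself; git hooks don't always inherit the PATH of your interactive shell, so printing it there tells you what the script actually sees:

echo "$PATH" >&2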
The canonical way would be:
jq='/usr/local/bin/jq'
curl -L -o "$jq" https://github.com/stedolan/jq/releases/latest/download/jq-osx-amd64
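Note that curl doesn't mark the downloaded file as executable, so you'd also need:

chmod +x "$jq"

After that, the hook can invoke "$jq" by its absolute path regardless of PATH.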
You could also store it in the bin directory of your home folder: jq="$HOME/bin/jq". That directory should be added to your PATH automatically on most installations (it might require a logout and login).
On Ubuntu 16.04, what I do is:
Download the .sh script: wget https://gist.githubusercontent.com/...
Make the .sh file executable: sudo chmod ugo+x sysInit.sh
Execute it: sudo ./sysInit.sh
I was wondering if it is possible to run the code directly from the web.
It would be something like: sudo ./ https://gist.githubusercontent.com/....
Is it possible to do that?
You can use curl to download and run your script. I don't think it's installed by default on Ubuntu, so you'll have to run sudo apt-get install curl first if you want to use it. To download and run your script with sudo, just run:
curl -sL https://gist.githubusercontent.com/blah.sh | sudo sh
Be warned: this is very risky and not advised for security reasons. See this related question: why-using-curl-sudo-sh-is-not-advised.
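If you'd rather not pipe straight into a root shell, a safer pattern (same placeholder URL as above) is to download to a file first, read it, and only then execute it:

curl -sL -o /tmp/blah.sh https://gist.githubusercontent.com/blah.sh
less /tmp/blah.sh      # inspect what you're about to run as root
sudo sh /tmp/blah.sh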
Yes, it is possible using curl and piping the result to sh.
Try the following command.
curl https://url-to-your-script-file/scriptFile.sh | sh
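In practice it's worth adding a few flags so that a failed request (say, a 404 error page) isn't piped into sh as if it were a script: -f makes curl fail on HTTP errors, -sS silences the progress bar but keeps error messages, and -L follows redirects:

curl -fsSL https://url-to-your-script-file/scriptFile.sh | sh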
No, sudo only works from a command line prompt in a shell
I am trying to download a huge file via curl. As far as I can tell, there is some bash script hooked in between to deliver the correct file (in this case a virtual machine that runs IE10):
curl -s https://raw.githubusercontent.com/xdissent/ievms/master/ievms.sh | IEVMS_VERSIONS=10 bash
Due to a wobbly internet connection, the download fails constantly, so I need a way to resume it at its current position. I've tried resuming the download like so:
curl -s -C - https://raw.githubusercontent.com/xdissent/ievms/master/ievms.sh | IEVMS_VERSIONS=10 bash
However, all I get is an 'MD5 check failed' error. Am I missing something?
The curl command you're running there doesn't download the VM images. It downloads a bash script called ievms.sh and then pipes the script to bash, which executes it.
Looking at the script, it looks like the file it downloads for IE10 is here:
http://virtualization.modern.ie/vhd/IEKitV1_Final/VirtualBox/OSX/IE10_Win8.zip
I think if you download that file (you could use your browser or curl) and put it in ~/.ievms, and then run the command again, it should see that the file has already been downloaded and finish the installation.
If the partially-downloaded file is already there, then you could resume that download with this command:
curl -L "http://virtualization.modern.ie/vhd/IEKitV1_Final/VirtualBox/OSX/IE10_Win8.zip" \
-C - -o ~/.ievms/IE10_Win8.zip
(Then run the original IEVMs curl command to finish installation.)
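If the connection is wobbly enough that even the resumed download keeps dying, you can wrap that command in a retry loop; this is just a sketch, with the sleep interval chosen arbitrarily:

until curl -L -C - -o ~/.ievms/IE10_Win8.zip \
    "http://virtualization.modern.ie/vhd/IEKitV1_Final/VirtualBox/OSX/IE10_Win8.zip"; do
  echo "download interrupted, retrying..." >&2
  sleep 5
done

Each pass picks up from wherever the previous one stopped, thanks to -C -.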
wget command won't work in cron?
I am using the wget command in a script to hit a URL and fetch its content. When I run the script manually at a command prompt, it works fine, but the same script set up in cron does not work.
wget -o wgetlog2 --output-document=wgettesting2.html "URL"
Any help would be appreciated.
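The usual culprit is cron's environment: cron typically runs jobs with a minimal PATH (often just /usr/bin:/bin), so commands that resolve fine in your interactive shell may not be found. A sketch of the common fix, assuming wget lives at /usr/bin/wget (check with which wget) and using a placeholder schedule, is to spell out absolute paths in the crontab entry:

*/5 * * * * /usr/bin/wget -o /tmp/wgetlog2 --output-document=/tmp/wgettesting2.html "URL"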
How can I do an HTTP GET from a Un*x shell script on a stock OS X system? (Installing third-party software is not an option, because this has to run on many different systems over which I have no control.)
For example if I start the Mercurial server locally doing a hg serve:
... $ hg serve
And then, from a Linux box that has the wget command, I do a wget:
... $ wget http://127.0.0.1:8000
--2010-12-31 22:18:25-- http://127.0.0.1:8000/
Connecting to 127.0.0.1:8000... connected.
HTTP request sent, awaiting response... 200 Script output follows
Length: unspecified [text/html]
Saving to: `index.html'
And on the terminal in which I launched the "hg serve" command, I can indeed see that an HTTP GET made its way:
127.0.0.1 - - [30/Dec/2010 22:18:17] "GET / HTTP/1.0" 200 -
So on Linux one way to do an HTTP GET from a shell script is to use wget (if that command is installed of course).
What other ways are there to do the equivalent of a wget? I'm looking, in particular, for something that would work on stock OS X installs.
The following native command will work:
curl http://127.0.0.1:8000 -o outfile
Note that curl does not follow redirects by default. To tell it to do so, add -L to the argument list.
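For example, to follow redirects and save the body to a file in one go:

curl -L http://127.0.0.1:8000 -o outfile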
brew install wget
Homebrew is a package manager for OS X analogous to yum, apt-get, choco, emerge, etc. Be aware that you will also need to install Xcode and the Command Line Tools. Virtually anyone who uses the command line on OS X will want to install these things anyway.
If you can't or don't want to use homebrew, you could also:
Install wget manually:
curl -# "http://ftp.gnu.org/gnu/wget/wget-1.17.1.tar.xz" -o "wget.tar.xz"
tar xf wget.tar.xz
cd wget-1.17.1
./configure --with-ssl=openssl --with-libssl-prefix=/usr/local/ssl && make -j8 && make install
Or, use a bash alias:
function _wget() { curl -L "${1}" -o "$(basename "${1}")" ; };
alias wget='_wget'
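With that in your shell profile, a wget-style invocation such as:

wget https://example.com/index.html

fetches the URL with curl and saves it under the URL's basename (index.html here).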
curl has a mode that is almost equivalent to wget's default behavior.
curl -O <url>
This works just like
wget <url>
And, if you like, you can add this to your .bashrc:
alias wget='curl -O'
It's not 100% compatible, but it works for the most common wget usage (IMO)
1) On your Mac, type
nano /usr/bin/wget
2) Paste the following in:
#!/bin/bash
curl -L "$1" -o "$2"
3) Close the file, then make it executable:
chmod 755 /usr/bin/wget
That's it.
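Usage then mirrors the two arguments the wrapper expects, URL first and output filename second, e.g.:

wget https://example.com/file.txt file.txt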
Use curl:
curl http://127.0.0.1:8000 -o index.html
Here's the Mac OS X equivalent of Linux's wget.
For Linux, for instance Ubuntu on an AWS instance, use:
wget http://example.com/textfile.txt
On a Mac, e.g. for local development, use this:
curl http://example.com/textfile.txt -o textfile.txt
The -o parameter is required on a Mac to send the output to a file instead of the screen. Specify a different target name to rename the downloaded file.
With wget, use capital -O for renaming; lowercase -o specifies the output file for the transfer log.
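For example, these two commands do the same thing, one per tool:

wget -O textfile.txt http://example.com/textfile.txt
curl -o textfile.txt http://example.com/textfile.txt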
Instead of going with an equivalent, you can try brew install wget and use wget itself. You need to have brew installed on your Mac.
You can either build wget on the Mac machine or use MacPorts to install it directly.
sudo port install wget
This works like a charm, and you can update to the latest version as soon as it's available. MacPorts is much more stable than brew, although it has far fewer formulas and ports.
You can install MacPorts from https://www.macports.org/install.php; download the .pkg file and install it.
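Once MacPorts is installed, keeping wget current is the standard two-step:

sudo port selfupdate
sudo port upgrade wget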
You could use curl instead. It is installed by default into /usr/bin.
wget Precompiled Mac Binary
For those looking for a quick wget install on Mac, check out Quentin Stafford-Fraser's precompiled binary here, which has been around for over a decade:
https://statusq.org/archives/2008/07/30/1954/
MD5 for 2008 wget.zip: 24a35d499704eecedd09e0dd52175582
MD5 for 2005 wget.zip: c7b48ec3ff929d9bd28ddb87e1a76ffb
No make/install/port/brew/curl junk. Just download, install, and run. Works with Mac OS X 10.3-10.12+.