Piping raw code from GitHub to Ruby not working? - ruby

I am doing some basic piping of simple raw code from GitHub to the terminal, i.e.
curl https://raw.github.com/leachim6/hello-world/master/r/ruby.rb | ruby
When I try it, it doesn't produce "Hello World", but instead I just see
% Total % Received % Xferd Average Speed Time Time Time Current
Dload Upload Total Spent Left Speed
0 0 0 0 0 0 0 0 --:--:-- 0:00:01 --:--:-- 0

Use
curl -sSL https://raw.github.com/leachim6/hello-world/master/r/ruby.rb | ruby
This should work.
Update to explain:
The URL is redirecting to
https://raw.githubusercontent.com/leachim6/hello-world/master/r/ruby.rb
so the -L (--location) option was required to follow the redirect; it makes curl redo the request at the new location.
-sS hides the progress bar while still showing errors if any occur.
To debug a curl request you can use the -v option, which lets you see exactly what is happening.
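If you want to confirm the redirect for yourself, one quick check (just a sketch that inspects the response headers) is:
curl -sI https://raw.github.com/leachim6/hello-world/master/r/ruby.rb | grep -i '^location'
-I fetches only the headers, and the Location header shows where curl is being sent.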

Related

cURL command not returning data or error messages

Figuring out a new API.
I’m trying to call an endpoint and populate a JSON file with the data. I see the following output:
admin#server:~$ curl -H "Authorization: Bearer <NOT DISPLAYED FOR THIS POST>" -o /home/admin/result.json https://www.endpoint.com/manage/query/run?id=55408&cmd=service&output=json
[1] 14493
[2] 14494
admin#server:~$ % Total % Received % Xferd Average Speed Time Time Time Current
Dload Upload Total Spent Left Speed
0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0
It hangs here seemingly indefinitely until I hit Enter, and then the following is displayed:
[1]- Done curl -H "Authorization: Bearer <NOT DISPLAYED FOR THIS POST>" -o /home/admin/result.json https://www.endpoint.com/manage/query/run?id=55408
[2]+ Done cmd=service
No error messages, and no data in result.json. Calling the same cURL command without the -o option also returns the same results, when normally I would expect to see the data pop up in my terminal. If I visit the endpoint URL in browser (auth token can be a URL parameter as well for this API), I see the exact data I want to download. Changing the Auth Token makes no difference in the output.
I know every API is different, and there are a hundred different troubleshooting questions I haven't addressed in this post, but has anyone experienced this type of output before with a cURL command? I've never seen this behavior before.
The API is for Slate, an SIS for universities, if that helps. Thank you!
Your URL has a & in it, which is a shell syntax character that runs a command in the background. Quote the URL to prevent it from being interpreted as syntax.
admin#server:~$ curl -H "Authorization: Bearer <NOT DISPLAYED FOR THIS POST>" -o /home/admin/result.json "https://www.endpoint.com/manage/query/run?id=55408&cmd=service&output=json"
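To see why, here is roughly how the shell parsed the unquoted URL (a sketch of the parsing, with the token elided as in your output):
curl -H "Authorization: Bearer ..." -o /home/admin/result.json https://www.endpoint.com/manage/query/run?id=55408 &   # backgrounded, job [1]
cmd=service &    # variable assignment, backgrounded as job [2]
output=json      # another plain assignment
which is exactly why you saw the two job numbers and the two "Done" lines instead of any curl output.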

Bash script - check how many times public IP changes

I am trying to create my first bash script. The goal of this script is to check at what rate my public IP changes. It is a fairly straightforward script. First it checks if the new address is different from the old one. If so, it should update the old one to the new one and print out the date along with the new IP address.
At this point I have created a simple script in order to accomplish this. But I have two main problems.
First, the script keeps printing out the IP even though it hasn't changed and I have updated PREV_IP with CUR_IP.
My second problem is that I want the output to go to a file instead of the terminal.
The interval is currently set to 1 second for test purposes. This will change to a higher interval in the final product.
#!/bin/bash
while true
PREV_IP=00
do
    CUR_IP=$(curl https://ipinfo.io/ip)
    if [ $PREV_IP != "$CUR_IP" ]; then
        PREV_IP=$CUR_IP
        "$(date)"
        echo "$CUR_IP"
        sleep 1
    fi
done
I also get a really weird output. I have edited my public IP to xx.xxx.xxx.xxx:
Sat 20 Mar 09:45:29 CET 2021
xx.xxx.xxx.xxx
% Total % Received % Xferd Average Speed Time Time Time Current
Dload Upload Total Spent Left Speed
0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:--
while true
PREV_IP=00
do
is the reason you are seeing the IP on each loop. It's the same as while true; PREV_IP=00; do. The exit status of true; PREV_IP=00 is the exit status of the last command, and the exit status of an assignment is 0 (success), so the loop will always execute. But PREV_IP is reset to 00 on every iteration... This is a typo: you meant to set the variable once, before the loop starts.
"$(date)"
will try to execute the output of the date command as the next command, so it will print:
$ "$(date)"
bash: sob, 20 mar 2021, 10:57:02 CET: command not found
And finally, to silence curl, read man curl and look up -s. I use -sS so that errors are still visible.
Do not use uppercase variable names in your scripts; prefer lowercase ones. Check your scripts with http://shellcheck.net. Quote variable expansions.
I would sleep on each iteration. Your script could look like this:
#!/bin/bash
prev=""
while true; do
    cur=$(curl -sS https://ipinfo.io/ip)
    if [ "$prev" != "$cur" ]; then
        prev="$cur"
        echo "$(date) $cur"
    fi
    sleep 1
done
that I want the output to go to a file instead of the terminal.
Then research how redirection works in the shell and how to use it. The simplest option is to redirect the echo output:
echo "$(date) $cur" >> "a_file.txt"
The interval is currently set to 1 second for test purposes. This will change to a higher interval in the final product.
You are still limited by the time it takes to connect to https://ipinfo.io/ip. And from the ipinfo.io documentation:
Free usage of our API is limited to 50,000 API requests per month.
And finally, I once wrote a script, get_ip_external, in which I tried to use as many public services as I could find for getting the external IP address. You can query multiple public services for the IPv4 address and pick one at random (or round-robin) so that rate limiting doesn't kick in as fast.
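A minimal sketch of that idea (the service list here is just an example; I'm assuming each endpoint returns a bare IPv4 address in the response body):
#!/bin/bash
# Example list of public "what is my IP" services; each is assumed to
# return just the address in the response body.
services=("https://ipinfo.io/ip" "https://api.ipify.org" "https://icanhazip.com")
prev=""
while true; do
    # pick one service at random to spread requests across providers
    url="${services[RANDOM % ${#services[@]}]}"
    cur=$(curl -sS "$url")
    if [ "$prev" != "$cur" ]; then
        prev="$cur"
        echo "$(date) $cur ($url)"
    fi
    sleep 60   # arbitrary interval for the sketch
done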

Calling curl command from Ruby's system method

I'm using a curl command inside the Logstash exec plugin to post a message to a Stride group. The plugin documentation states that it uses the Ruby system method to execute the command, so I'm trying to run it in my Ruby IRB.
Escaping the double quotes with a backslash gives the error "The request body cannot be parsed as valid JSON". Here is the full error:
irb(main):050:0' --inf-ruby-2f3827a9-23243-13517-726000--
curl: (6) Couldn't resolve host 'first'
curl: (3) [globbing] unmatched close brace/bracket in column 9
% Total % Received % Xferd Average Speed Time Time Time Current
Dload Upload Total Spent Left Speed
100 165 0 87 100 78 60 54 0:00:01 0:00:01 --:--:-- 60
{
"statusCode": 400,
"message": "The request body cannot be parsed as valid JSON"
}=> true
I tried swapping double quotes with single quotes and using full double quotes everywhere. Nothing is working.
system("curl -X POST -H 'Content-Type: application/json' -H 'Authorization: Bearer blah-blah' -d '{\"body\":{\"version\":1,\"type\":\"doc\",\"content\":[{\"type\":\"paragraph\",\"content\":[{\"type\":\"text\",\"text\":\"My first message!\"}]}]}}' --url 'https://api.atlassian.com/site/blah-blah/conversation/blah-blah/message'")
Is there any way to make this work?
EDIT: I have tried running the cURL command in the terminal and it is working fine.
Besides the related answers on Stack Overflow, Net::HTTP has notoriously poor documentation but can be used to post what you're interested in.
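If you would rather keep shelling out to curl from system, one way to sidestep the quoting problem entirely (a sketch; payload.json is a hypothetical file name) is to write the JSON body to a file and let curl read it with -d @file:
# write the body once, with no shell escaping needed
cat > payload.json <<'EOF'
{"body":{"version":1,"type":"doc","content":[{"type":"paragraph","content":[{"type":"text","text":"My first message!"}]}]}}
EOF
# the command handed to system() then contains no embedded double quotes
curl -X POST -H 'Content-Type: application/json' -H 'Authorization: Bearer blah-blah' -d @payload.json --url 'https://api.atlassian.com/site/blah-blah/conversation/blah-blah/message'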

How do I download a file to a newly created directory with curl on OS X?

I am trying to download my Heroku backups to a folder.
Downloading to the current folder like this works:
curl -o latest.dump `heroku pg:backups public-url`
But when I tried adding a folders path to latest.dump it looks like this:
$ curl -o /db-bkups/latest.dump `heroku pg:backups public-url`
% Total % Received % Xferd Average Speed Time Time Time Current
Dload Upload Total Spent Left Speed
0 44318 0 0 0 0 0 0 --:--:-- 0:00:01 --:--:-- 0Warning: Failed to create the file
Warning: /db-bkups/latest.dump: No such file
Warning: or directory
36 44318 36 16384 0 0 9626 0 0:00:04 0:00:01 0:00:03 9626
curl: (23) Failed writing body (0 != 16384)
Ideally, I would like it to be saved and downloaded like this:
/db-bkups/nov-1-2016/timestamp-db.dump
where the folder nov-1-2016 is created dynamically when the cron job runs, and the filename is the timestamp of when the backup was run.
You could try using the --create-dirs argument, which was added in curl 7.10.3.
Here is an example that will create the directory hierarchy (if it doesn't already exist) and name the subdirectory with the output of the date command:
curl -o /db-bkups/$(date +"%b-%d-%Y")/timestamp-db.dump --create-dirs http://www.w3schools.com/xml/simple.xml
The result is a file stored in a directory like so /db-bkups/Nov-04-2016/timestamp-db.dump.
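Applied to the Heroku backup from the question, a cron-friendly version could look like this (a sketch; the %s epoch timestamp is just one way to name the file):
curl -o "/db-bkups/$(date +%b-%d-%Y)/$(date +%s)-db.dump" --create-dirs "$(heroku pg:backups public-url)"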

Elasticsearch 1.7.1 not creating a snapshot, but not reporting error either

I'm trying to create a snapshot of all indexes in my local Elasticsearch instance.
I've set path.repo as follows in elasticsearch.yml:
path.repo: ["F:\\backup\\elasticsearch"]
And here's the command I'm using to create the snapshot:
curl -XPUT http://localhost:9200/_snapshot/my_test_backup -d '
{
  "type": "fs",
  "settings": {
    "location": "F:\\backup\\elasticsearch\\my_test_backup"
  }
}'
Executing this generates the following output from Elasticsearch:
% Total % Received % Xferd Average Speed Time Time Time Current
Dload Upload Total Spent Left Speed
100 136 100 21 100 115 269 1474 --:--:-- --:--:-- --:--:-- 1854{"acknowledged":true}
Note, no error.
There are a couple of indexes set up on my local instance, which aren't particularly large, so when I check the snapshot status it shows no in-progress snapshots:
$ curl -XGET http://localhost:9200/_snapshot/_status
% Total % Received % Xferd Average Speed Time Time Time Current
Dload Upload Total Spent Left Speed
100 16 100 16 0 0 1000 0 --:--:-- --:--:-- --:--:-- 1000{"snapshots":[]}
As I say, this isn't necessarily a worry because the snapshot would be small anyway. I can see the snapshot I've just created by executing the following, but it appears to have hung:
$ curl -XGET http://localhost:9200/_snapshot
% Total % Received % Xferd Average Speed Time Time Time Current
Dload Upload Total Spent Left Speed
100 100 100 100 0 0 100 0 0:00:01 --:--:-- 0:00:01 97k{"my_test_backup":{"type":"fs","settings":{"location":"F:\\backup\\elasticsearch\\my_test_backup"}}}
Consistent with this, when I navigate to F:\backup\elasticsearch\my_test_backup the folder is empty.
Could somebody tell me why my snapshot isn't working? What am I doing wrong?
Many thanks,
Bart
All you have done there is create a repository, not a snapshot. Creating the repository is a necessary first step; it is where all the snapshots you create will be stored.
So now that you have your repository, you can simply kick off the snapshot creation as follows:
curl -XPUT "localhost:9200/_snapshot/my_test_backup/snapshot_1"
If you run the following command instead, it will only return when the snapshot is done:
curl -XPUT "localhost:9200/_snapshot/my_test_backup/snapshot_1?wait_for_completion=true"
