How to use `jq` in a shell pipeline?

I can't seem to get jq to behave "normally" in a shell pipeline. For example:
$ curl -s https://api.github.com/users/octocat/repos | jq | cat
results in jq simply printing out its help text*. The same thing happens if I try to redirect jq's output to a file:
$ curl -s https://api.github.com/users/octocat/repos | jq > /tmp/stuff.json
Is jq deliberately bailing out if it determines that it's not being run from a tty? How can I prevent this behavior so that I can use jq in a pipeline?
Edit: it looks like this is no longer an issue in recent versions of jq. I have jq-1.6 now and the examples above work as expected.
* (I realize this example contains a useless use of cat; it's for illustration purposes only)

You need to supply a filter as an argument. To pass the JSON through unmodified other than the pretty printing jq provides by default, use the identity filter .:
curl -s https://api.github.com/users/octocat/repos | jq '.' | cat

Another use case I run into frequently is "How do I construct JSON data to supply to other shell commands, for example curl?" The way to do this is with the --null-input/-n option, which the jq manual describes:
Don’t read any input at all! Instead, the filter is run once using null as the input. This is useful when using jq as a simple calculator or to construct JSON data from scratch.
And an example passing it into curl:
jq -n '{key: "value"}' | curl -d @- \
--url 'https://some.url.com' \
-H 'Content-Type: application/json' \
-H 'Accept: application/json'
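The -n form pairs well with jq's --arg option, which injects shell values as properly quoted JSON strings. A small sketch (the variable names and values are illustrative):

```shell
# Build a JSON body from a shell variable; --arg quotes and escapes it safely,
# so no manual string interpolation is needed. -c emits compact output.
user="alice"
payload=$(jq -nc --arg u "$user" --arg role "admin" '{user: $u, role: $role}')
echo "$payload"
```

This is safer than splicing `$user` into the filter text, because --arg handles characters that would otherwise break the JSON (quotes, backslashes, newlines).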

Related

How to get more than one json element from a cURL call using jq?

I am trying to gather multiple json fields from a curl call in the command line. I have only been able to get one field at a time, and each field requires me to redo the curl call. For example, if there are two fields field_1 and field_2 under a parent field, I am having to do:
curl "https://www.somewebsite.com" | jq -r '.parent | .field_1'
and
curl "https://www.somewebsite.com" | jq -r '.parent | .field_2'
which is inefficient since the first curl call already returns all the information. How can I gather both fields with just one curl call?
Just separate them with commas:
curl "https://www.somewebsite.com" | jq -r '.parent | .field_1, .field_2'
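If you need the fields in separate shell variables, note that with -r each comma-separated filter emits one value per line, so a single jq invocation suffices. A sketch, with a hard-coded stand-in for the real curl response:

```shell
# Stand-in for the curl response (the real data comes from the URL in the question):
json='{"parent":{"field_1":"a","field_2":"b"}}'
# One jq invocation emits both values, one per line:
values=$(printf '%s\n' "$json" | jq -r '.parent | .field_1, .field_2')
# Split the two lines into two variables:
field_1=$(printf '%s\n' "$values" | sed -n 1p)
field_2=$(printf '%s\n' "$values" | sed -n 2p)
echo "$field_1 $field_2"
```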

how to pass json file name to jq?

currently I call
cat my_file.json | jq
to pretty-print JSON data. I would like to avoid the extra cat, and I am a bit surprised that I can't simply do
jq my_file.json
Can I specify a file name?
You need to specify the jq program to run:
jq . my_file.json
jq -h
The usage line produced by jq -h:
Usage: jq [options] <jq filter> [file...]
Note that the summary produced by invoking jq with the -h option does not (currently) provide a complete listing of the options. For the supported options, see the jq manual: https://stedolan.github.io/jq/manual/
Two undocumented options of note are:
--debug-dump-disasm
--debug-trace
jq .
Under certain circumstances, jq . can be abbreviated to just jq, but it is always safe to use the full form; a good rule of thumb: if in doubt, write the . explicitly.
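To see why the explicit filter matters: when jq is given a non-option argument, the first one is parsed as the program, so a bare file name would be treated as a filter rather than as input. A minimal demonstration (the temp-file path is illustrative):

```shell
printf '{"a":1}\n' > /tmp/demo.json
# jq /tmp/demo.json        # wrong: the path is parsed as a (broken) filter program
jq . /tmp/demo.json        # right: identity filter first, then the file name
```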

What does @ mean?

Here's the code I'm looking at:
#!/bin/bash
nc -l 8080 &
curl "http://localhost:8080" \
-H "Accept: application/json" \
-H "Content-Type: application/json" \
--data @<(cat <<EOF
{
"me": "$USER",
"something": $(date +%s)
}
EOF
)
What does the @ do? Where is the documentation for it?
It is a curl-specific symbol. man curl shows you:
-d, --data <data>
(HTTP) Sends the specified data in a POST request to the HTTP server, in the
same way that a browser does when a user has filled in an HTML form and
presses the submit button. This will cause curl to pass the data to the
server using the content-type application/x-www-form-urlencoded. Compare to
-F, --form.
--data-raw is almost the same but does not have a special interpretation of
the @ character. To post data purely binary, you should instead use the
--data-binary option. To URL-encode the value of a form field you may use
--data-urlencode.
If any of these options is used more than once on the same command line, the
data pieces specified will be merged together with a separating &-symbol.
Thus, using '-d name=daniel -d skill=lousy' would generate a post chunk that
looks like 'name=daniel&skill=lousy'.
If you start the data with the letter @, the rest should be a file name to
read the data from, or - if you want curl to read the data from stdin.
Multiple files can also be specified. Posting data from a file named
'foobar' would thus be done with -d, --data @foobar. When --data is told to
read from a file like that, carriage returns and newlines will be stripped
out. If you don't want the @ character to have a special interpretation use
--data-raw instead.
See also --data-binary and --data-urlencode and --data-raw. This option
overrides -F, --form and -I, --head and -T, --upload-file.
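Putting that together, the script from the question can also write the body to a temporary file and let curl read it via the @ prefix. A sketch; the curl line is commented out because it needs the netcat listener from the question running:

```shell
# Build the JSON body with the same fields as the question's heredoc:
cat > /tmp/body.json <<EOF
{
  "me": "$USER",
  "something": $(date +%s)
}
EOF
# curl reads the request body from the file because the argument starts with @:
# curl "http://localhost:8080" -H "Content-Type: application/json" --data @/tmp/body.json
jq -c 'keys' /tmp/body.json   # sanity-check that the body is valid JSON
```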

Parsing and storing the json output of a curl command in bash

I have five cURL statements that work fine by themselves, and I am trying to put them together in a bash script. Each cURL statement relies on a variable generated by the cURL statement executed before it. I'm trying to figure out the smartest way to go about this. Here is the first cURL statement:
curl -i -k -b sessionid -X POST https://base/resource -H "Content-Type: application/json" -H "Authorization: Authorization: PS-Auth key=keyString; runas=userName; pwd=[password]" -d "{\"AssetName\":\"apiTest\",\"DnsName\":\"apiTest\",\"DomainName\":\"domainNameString\",\"IPAddress\":\"ipAddressHere\",\"AssetType\":\"apiTest\"}"
This works fine; it produces this output:
{"WorkgroupID":1,"AssetID":57,"AssetName":"apiTest","AssetType":"apiTest","DnsName":"apiTest","DomainName":"domainNameString","IPAddress":"ipAddressHere","MacAddress":null,"OperatingSystem":null,"LastUpdateDate":"2017-10-30T15:18:05.67-07:00"}
However, the next cURL statement needs the integer from AssetID in order to execute. In short, how can I store the AssetID value in a variable to be used in the next statement? In total I'll be using five cURL statements, each relying on values generated by the preceding statement. Any insight is appreciated.
Download and install jq, which is like sed for JSON data: you can use it to slice, filter, map, and transform structured data with the same ease that sed, awk, and grep provide for unstructured data. Remember to replace '...' with your actual curl arguments:
curl '...' | jq --raw-output '.AssetID'
To store it in a variable, use command substitution to run the command and capture its output:
asset_ID=$( curl '...' | jq --raw-output '.AssetID' )
In the curl command, drop the -i flag to output only the JSON data without the header information.
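A concrete sketch of the chaining, with a canned response standing in for the first curl call (the field names come from the question's output; the follow-up URL is hypothetical):

```shell
# Stand-in for the first curl response:
response='{"WorkgroupID":1,"AssetID":57,"AssetName":"apiTest"}'
# Capture AssetID via command substitution; -r emits it without JSON quoting:
asset_id=$(printf '%s' "$response" | jq -r '.AssetID')
# The next curl call can then interpolate it, e.g.:
# curl -k -b sessionid -X POST "https://base/resource/$asset_id" ...
echo "$asset_id"
```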

bash ldap search - variable as filter

I am wrestling with something I expected to be simple.
I want to lookup a users manager from ldap, then get the managers email and sam name.
I expected to be able to get the cn for the manager from ldap like this:
manager=$(/usr/bin/ldapsearch -LLL -H ldap://company.ads -x -D admin@company.ads -w password -b ou=employees,dc=company,dc=ads sAMAccountName=employee1 | grep "manager:" | awk '{gsub("manager: ", "");print}' | awk 'BEGIN {FS=","}; {print $1, $2 }' )
that gives me the cn like this:
CN=manager,\ Surname
Now when I run another query like this:
/usr/bin/ldapsearch -LLL -H ldap://company.ads -x -D admin@company.ads -w password -b ou=employees,dc=company,dc=ads $manager
I get bad search filter (-7). But if I echo the command, then copy, paste, and run it, I get the record back.
I've tried a number of variations on this; can anyone see what I'm missing?
Thanks.
Since there's a space in $manager, you need to quote it to prevent it from being split into multiple arguments.
/usr/bin/ldapsearch -LLL -H ldap://company.ads -x -D admin@company.ads -w password -b ou=employees,dc=company,dc=ads "$manager"
In general, it's best to always quote your variables, unless you specifically want it to be split into words.
You also need to remove the backslash \ from the LDAP entry. Backslashes escape literal spaces in shell source code; they shouldn't appear in data, because the shell doesn't process them when expanding variables.
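A sketch of both fixes together, using the manager value from the question:

```shell
# Value as produced by the grep/awk pipeline, backslash included:
manager='CN=manager,\ Surname'
# Strip the backslash: it is shell escaping, not part of the LDAP data.
manager=$(printf '%s' "$manager" | tr -d '\\')
# Pass it as a single, quoted argument:
# /usr/bin/ldapsearch ... "$manager"
echo "$manager"
```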
