BASH - post array with "wget --post-data"

I need to download an XLS table from a URL that requires two arrays as parameters: first the order IDs, then the columns. Every order has those columns.
wget --load-cookies cookies.txt \
--post-data='orderids=xxxx,xxxx,xxxx&columns=x,x,x,x,x' \
https://www.url.com/createordersexcel
This way, only the first value of each parameter gets inserted.

That wget command looks OK, and worked in a small test I did.
Your problem is probably on the server side.
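If it helps, here is a minimal sketch of building that --post-data string from bash arrays instead of typing the lists by hand (the IDs and column names below are placeholders; it assumes the server accepts comma-separated lists exactly as in your command):
orderids=(1111 2222 3333)
columns=(a b c d e)
# join each array with commas; the IFS change only affects the subshell
post_data="orderids=$(IFS=,; echo "${orderids[*]}")&columns=$(IFS=,; echo "${columns[*]}")"
wget --load-cookies cookies.txt \
--post-data="$post_data" \
https://www.url.com/createordersexcel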

Related

How to add sysdate from bcp

I have a .csv file with the following sample data format:
REFID|PARENTID|QTY|DESCRIPTION|DATE
AA01|1234|1|1st item|null
AA02|12345|2|2nd item|null
AA03|12345|3|3rd item|null
AA04|12345|4|4th item|null
To load the above file into a table, I am using the BCP command below:
/bcp $TABLE_NAME in $FILE_NAME -S $DB_SERVER -t "|" -F 1 -U $DB_USERNAME -d $DB_NAME
What I am trying to get is the output below (adding sysdate instead of null via BCP):
AA01|1234|1|1st item|3/16/2020
AA02|12345|2|2nd item|3/16/2020
AA03|12345|3|3rd item|3/16/2020
AA04|12345|4|4th item|3/16/2020
Update: I was able to exclude the header with the -F 1 option per @Jamie's answer, but I'm still looking for help on inserting the date with BCP. I've tried looking through some old Q&A, but no luck so far.
To exclude a single header record, you can use the -F option. This will tell BCP which line in the file is the first line to begin loading from. For your sample, -F2 should work fine. However, your command has other issues. See comments.
There is no way to introduce new data using the BCP command as you stated; BCP cannot insert a date value while copying data into your table. To accomplish this, I suggest either a default on your date column, or first loading the raw data into a table without the date column and then introducing the date value as you see fit in later processing.
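If pre-processing the file is acceptable, here is a hedged sketch of doing that "later processing" before the load instead. It assumes the date column is always the literal null at the end of each line and reuses the variables from your command:
# rewrite the trailing "null" column with today's date, then load the rewritten file
TODAY=$(date +%m/%d/%Y)   # adjust the format to whatever the table expects
sed 's#|null$#|'"$TODAY"'#' "$FILE_NAME" > "${FILE_NAME}.dated"
/bcp $TABLE_NAME in "${FILE_NAME}.dated" -S $DB_SERVER -t "|" -F 1 -U $DB_USERNAME -d $DB_NAME   # -F per the header discussion above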

How to for loop CURL commands?

I am using Watson's Speech-To-Text Lite service and I am trying to find a way to automate loading new audio files to transcribe. I am very new to Bash, so I'm unclear on even the more rudimentary terms, which makes it hard to find a solution for this problem.
For a single use case, I run the following command (my API key replaced with 'MY APIKEY'):
curl -X POST -u "apikey: MY APIKEY" --header "Content-Type: audio/flac" --data-binary "@audiofile_1.flac" "https://gateway-lon.watsonplatform.net/speech-to-text/api/v1/recognize?model=en-US_BroadbandModel&speaker_labels=true" > C:/Users/outputpath/output_1.txt
What I am essentially trying to achieve is to avoid manually typing and retyping the names of the audio files and outputs. So if I had three (or more) audio files (i.e. audiofile_1, 2, and 3.flac), I would like to create an output file corresponding to each audio file. Some pseudo-code that might help explain what I mean:
files = [file_1, file_2, file_3]
for file_x in files:
run curl command
save as output_x
You almost got it. You just need to learn some shell syntax:
files=("file_1" "file_2" "file_3")
for file_x in "${files[@]}"
do
curl -X POST -u "apikey: MY APIKEY" --header "Content-Type: audio/flac" --data-binary "@${file_x}" "https://gateway-lon.watsonplatform.net/speech-to-text/api/v1/recognize?model=en-US_BroadbandModel&speaker_labels=true" > "C:/Users/outputpath/${file_x}.txt"
done
First you create the files array with your list of files. Then, you iterate over those files and run the curl command on each of them.
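One small, optional tweak (a sketch, not part of the original answer): strip the .flac extension so audiofile_1.flac writes to audiofile_1.txt rather than audiofile_1.flac.txt.
for file_x in "${files[@]}"
do
# ${file_x%.flac} drops the trailing .flac before .txt is appended
curl -X POST -u "apikey: MY APIKEY" --header "Content-Type: audio/flac" --data-binary "@${file_x}" "https://gateway-lon.watsonplatform.net/speech-to-text/api/v1/recognize?model=en-US_BroadbandModel&speaker_labels=true" > "C:/Users/outputpath/${file_x%.flac}.txt"
done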

curl error 18 transfer closed with outstanding read data remaining

Setup
I'm using curl in the following bash script to push a JSON file to a REST API running in Tomcat sitting behind nginx.
while IFS= read -d '' -r file; do
base=$(basename "$file")
datetime=$(find $file -maxdepth 0 -printf "%TY/%Tm/%Td %TH:%TM:%.2TS")
curl -vX POST -H "Content-Type: application/json" -H "Cache-Control: no-cache" \
-d #"$file" -u vangeeij:eian12 \
"http://192.168.105.10/homeaccess/services/aCStats/uploadData?username=vangeeij&filename=$base&datetime=$datetime"
#sudo mv "$file" /home/vangeeij/acserver/resultsOld
done < <(sudo find . -type f -print0)
Problem
When running this script I get an HTTP 400 response with the curl error:
curl: (18) transfer closed with outstanding read data remaining
What I have tried
I have found two things. First, running the same URL and body through Postman yields a successful POST.
Second, this error goes away when the last parameter, &datetime=$datetime, is removed from the URL.
I have also found a few references connecting this error to setting a curl option, something like:
curl_setopt($curl, CURLOPT_HTTPHEADER, array('Expect:'));
But I'm not sure where or how to set this when using curl in a simple bash script.
Question
What do I need to change in my curl command to get rid of the error and still be able to use all parameters?
UPDATE
Starting a new question, as further investigation has led me to a better understanding of the problem.
New Question Link
The error has to do with the fact that the datetime= parameter ends up containing text that needs to be URL-encoded.
This was confirmed by replacing the variable with 2017%2F03%2F01%2008%3A50%3A56, which worked.
So now the problem is that I can't get --data-urlencode datetime=$datetime to work; it seems to just get appended to the JSON data.
This error is generated because the datetime= parameter is being passed in with non-encoded, non-URL-friendly characters (e.g. the space).
The fix is to find a way to convert $datetime to a URL-encoded string.
E.g. convert:
2017/03/01 08:50:56
TO
2017%2F03%2F01%2008%3A50%3A56
See the following discussion for one method to accomplish this:
Post JSON data to REST with URL-encoded query parameters
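As a sketch of one such method in plain bash (nothing here is specific to this API; it is just a hand-rolled percent-encoder):
# percent-encode a string so it is safe to put in a query parameter
urlencode() {
  local s="$1" out="" c i
  for (( i = 0; i < ${#s}; i++ )); do
    c="${s:i:1}"
    case "$c" in
      [a-zA-Z0-9.~_-]) out+="$c" ;;
      *) printf -v out '%s%%%02X' "$out" "'$c" ;;   # "'$c" yields the character's ordinal value
    esac
  done
  printf '%s' "$out"
}
datetime_enc=$(urlencode "$datetime")   # 2017/03/01 08:50:56 -> 2017%2F03%2F01%2008%3A50%3A56
The upload script then passes datetime=$datetime_enc in the URL instead of datetime=$datetime.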

cURL call works with number but not with variable containing number

I've run into a strange issue. I'm trying to script my router to collect usage stats and other stuff. I'm making one cURL call to the auth URL to get a valid session ID, then another, using that session ID, to the page I need.
Here is my script:
SESSION_ID=$(curl --silent -D - -X POST http://10.0.0.1/login.cgi -d'admin_username=admin&admin_password=admin' | grep 'SESSION' | sed 's/Set-Cookie: SESSION=//' | sed 's/; path=\///')
echo $SESSION_ID # 1234567890
curl -v -H "Cookie: SESSION=$SESSION_ID" http://10.0.0.1/modemstatus_dslstatus.html
If I manually take the session ID and insert it in place of "$SESSION_ID", everything is dandy. cURL shows the headers (via -v) and they are correct; running the command with the session ID inserted manually produces identical headers.
I'm sure it's something small. Please teach me something :)
Check for carriage returns (\r) in your variables; in some cases they won't show up with a simple echo.
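A minimal sketch of that fix, reusing the command above: strip any carriage return at the end of the pipeline, or from the variable afterwards.
SESSION_ID=$(curl --silent -D - -X POST http://10.0.0.1/login.cgi -d'admin_username=admin&admin_password=admin' | grep 'SESSION' | sed 's/Set-Cookie: SESSION=//' | sed 's/; path=\///' | tr -d '\r')
# or, after the fact:
SESSION_ID=${SESSION_ID//$'\r'/}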

curl file upload with semicolons in filename

I'm implementing an automated, command line file uploader using curl to a servlet.
The problem is, I've got tens of thousands of files with semicolons (;) in the filenames. I'm well aware of the annoyance of this but it is a legacy app that continues to produce new files each day. Renaming is not really an option for compatibility reasons downstream.
I've tried quoting, escaping, converting to "%3b", fully qualifying the path... the obvious stuff... but nothing seems to work, and the send fails on the client side. I'm on my Mac (bundled curl version 7.21.3), but that shouldn't make a difference?
Any ideas?
macbookpro:~$ curl -F upload=@"my file.txt" http://localhost:8080/data/upload
ok
macbookpro:~$ curl -F upload=@"my;file.txt" http://localhost:8080/data/upload
curl: (26) failed creating formpost data
macbookpro:~$ curl -F upload=@"my\;file.txt" http://localhost:8080/data/upload
curl: (26) failed creating formpost data
macbookpro:~$ curl -F upload=@"my\\;file.txt" http://localhost:8080/data/upload
curl: (26) failed creating formpost data
macbookpro:~$
curl uses ; to separate type (or other directives) from the actual name, so I'd simply use stdin instead:
cat 'my;file.txt' | curl -F upload=@- http://localhost:8080/data/upload
You may add a filename= directive as well if desired (but without the semicolon in the name!).
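For instance (a sketch; the posted name here is a placeholder, so pick whatever suits the servlet):
cat 'my;file.txt' | curl -F 'upload=@-;filename=myfile.txt' http://localhost:8080/data/upload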
According to the curl man page, when you have ; or , in your form data (file or raw), you should enclose it in double quotes.
So just enclose the filename in double quotes and it should work.
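A sketch of that form, using the same file and URL as above (support for this quoting depends on your curl version; the inner double quotes tell curl the semicolon is part of the filename):
curl -F 'upload=@"my;file.txt"' http://localhost:8080/data/upload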
