Bash shell script 101 - variable definition, pbcopy, etc. - macOS

I am trying to create a (my first) bash script, but I need a little help. I have the following:
#!/bin/bash
echo "Write a LaTeX equation:"
read -e TeXFormula
URIEncoded = node -p "encodeURIComponent('$(sed "s/'/\\\'/g" <<<"$TeXFormula")')"
curl http://latex.codecogs.com/gif.latex?$URIEncoded -o /Users/casparjespersen/Desktop/notetex.gif | pbcopy
I want it to:
Require user input (LaTeX equation)
URIEncode user input (the top result in Google was using node.js, but anything will do..)
Perform a cURL call to this website converting equation to a GIF image
Copy the image to the clipboard, so I can paste it into a note taking app like OneNote, Word, etc.
My script is malfunctioning in the following:
URIEncoded is undefined, so there is something wrong with my variable definition.
When I copy using pbcopy, the encoded text content of the image is copied, and not the actual image. Is there a workaround for this? Otherwise, the script could automatically open the image and I could manually Cmd + C the content.

URIEncoded is undefined, so there is something wrong with my variable
definition.
The line should read
URIEncoded=$(node -p "encodeURIComponent('$(sed "s/'/\\\'/g" <<<"$TeXFormula")')")
without spaces around the = sign, and using the $() construct to actually run the command; otherwise, the text of the command would be assigned to the variable.
When I copy using pbcopy the encrypted text content of the
image is copied, and not the actual image. Is there a workaround for
this? Otherwise, the script could automatically open the image and I
could manually Cmd + C the content.
pbcopy takes input from stdin but you are telling curl to write the output to a file rather than stdout. Try simply
curl http://latex.codecogs.com/gif.latex?$URIEncoded | pbcopy
or, for the second option you describe
curl http://latex.codecogs.com/gif.latex?$URIEncoded -o /Users/casparjespersen/Desktop/notetex.gif && open $_
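Putting both fixes together, a minimal sketch of the whole script might look like this (following the second option above; the ~/Desktop output path is an assumption, adjust it to taste):
#!/bin/bash
echo "Write a LaTeX equation:"
read -e TeXFormula
# URI-encode the input; the sed call escapes single quotes so they survive the node one-liner
URIEncoded=$(node -p "encodeURIComponent('$(sed "s/'/\\\'/g" <<<"$TeXFormula")')")
# fetch the rendered GIF and open it, so the image itself can be copied with Cmd + C
curl "http://latex.codecogs.com/gif.latex?$URIEncoded" -o ~/Desktop/notetex.gif && open ~/Desktop/notetex.gif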

Related

How to use base64 encoded image as an argument?

I am trying to use the base64 encoding of an image as a flag when I run my program. I'm getting back: Argument list too long
I am on an Ubuntu 16.04 Docker image on a Mac.
$ ./myProgram -input "/9j/4AAQSkZJRgABA [...]"
Instead of accepting it as an argument, I would suggest you read it from standard input.
$ base64 someImage.jpg | ./myProgram
If this program is a shell script, you can save standard in into a variable with something like this:
#!/bin/sh
MY_BASE64_IMAGE_INPUT=$(cat -)
# do something with that info
echo "$MY_BASE64_IMAGE_INPUT"
There is a length limit on a single command line, and this is your program, so you control the interface. It would be easier in every way to pass the name of the file as the argument and have your program read the desired information from the file itself. You could cat the contents of the file into a variable (if you really want it as a variable), but it may be more conventional to pass the contents of the file to your program via the standard input stream.
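As a sketch of that file-name approach (the interface of myProgram here is an assumption for illustration):
#!/bin/sh
# take the path to the image as the argument...
IMAGE_FILE="$1"
# ...and produce the base64 form only where the program actually needs it
MY_BASE64_IMAGE_INPUT=$(base64 "$IMAGE_FILE")
echo "$MY_BASE64_IMAGE_INPUT"
which you would invoke as ./myProgram someImage.jpg rather than pasting the encoded text itself.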

Curl through list in bash

I am trying to iterate through a list and curl each entry; ultimately this is to kick off a list of Jenkins jobs.
So I have a text file whose contents are:
ApplianceInsurance-Tests
BicycleInsurance-Tests
Breakdown-Tests
BridgingLoans-Tests
Broadband-Tests
Business-Loans
BusinessElectric-Tests
BusinessGas-Tests
and I am trying to create a loop in which I fire a curl command for each line in the txt file
for fn in `cat jenkins-lists.txt`; do "curl -X POST 'http://user:key#x.x.x.xxx:8080/job/$fn/build"; done
but I keep getting an error - No such file or directory.
I'm getting a little confused
Your do-done body is quoted wrong. It should be:
curl -X POST "http://user:key#x.x.x.xxx:8080/job/$fn/build"
I'd also recommend:
while read -r fn; do
curl -X POST "http://user:key#x.x.x.xxx:8080/job/$fn/build"
done < jenkins-lists.txt
instead of for fn in $(anything); do .... With the second way you don't
have to worry about inadvertent globbing and the jenkins-list file may
get nicely buffered instead of needing to be read all into memory at once (not that it matters for such a small file but why not have a technique that works well more or less regardless of file size?).
If the error had come from curl, it would probably have been html-formatted. The only way I can reproduce the error you describe is by cat-ing a non-existent file.
Double check the name of the jenkins-lists.txt file, and make sure your script is running in the same directory as the file. Or use an absolute path to the file.
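If the script might be run from a different directory, a small guard around an absolute path (a sketch; substitute your real path) makes that failure obvious:
list_file="$HOME/jenkins-lists.txt"   # absolute path to the list
if [ ! -f "$list_file" ]; then
    echo "cannot find $list_file" >&2
    exit 1
fi
while read -r fn; do
    curl -X POST "http://user:key#x.x.x.xxx:8080/job/$fn/build"
done < "$list_file"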

Piping input from a file to a command in windows cmd

My understanding is that the redirection operator, <, should allow me to take text from a file and give it as input to a command as if I had written out the contents of that file. Here is what I am trying to do:
python code.py < input.txt
I expect this to act as though I had typed the contents of input.txt after python code.py, but instead it acts as if I passed no input.
If I use cat, I get the contents of the file:
> cat input.txt
['2015-1-1','2015-5-1','2015-9-1','2015-10-1','2015-12-1','2016-1-1','2016-2-1','2016-4-1','2016-5-1'] [65,50,30,45,55,39,45,30,20]
And if I just copy and paste the contents of the file, I get the correct behavior.
I know this must be a really simple misunderstanding on my part, but I can't figure it out.
It's called Redirection, not piping, but you are correct that the < operator will push the file to the command. You can see this in action by using Sort instead of echo.
sort < input.txt
This will display the text file as a list, sorted alphabetically. Echo does not work with text files, so sending a text file to Echo simply runs "Echo".
If you just want to send a file to the command window, you can use Type instead, and not use the redirector.
type input.txt

Read a text file with Automator.app line by line

I'm a novice at coding so please be patient with me.
I've created a Workflow with Automator (OSX) which works fine. The only issue I have is that I want it to run on a number of inputs (that is as a batch). I've inserted the Loop action but the problem I'm having is about changing the initial input each time.
I would like to use an applescript to automate the insertion of the initial input each time.
I have a TXT file with URLs. With an apple script, I'd like to copy a URL (or a line of text) to clipboard.
In the next iteration I'd like to copy the next URL (or line of text).
Can anyone help?
Thank you!!
You can create one looping workflow (call it LinesToClipboard.workflow) that will:
get a line from a text file (not rtf or doc)
copy the line to the clipboard
run your current workflow
loop again for the next line
The workflow:
Create new automator workflow
create a variable
at the bottom find the icon "Show or hide the workflow variables list" and show the workflow variables (the list is empty at first)
right click and "New variable..."
name the variable as "LineNumber"
add actions:
Get Value of Variable (LineNumber)
Run Shell Script
shell: /bin/bash
important: change "Pass input" to "as arguments"
add the following content (copy exactly, with all quotes and such):
in the script, change /etc/passwd to the full path of your text file, e.g. /Users/myname/Documents/myfile.txt
at the end of this action the clipboard will contain one line from the file
linenum=${1:-0}        # line number from the workflow variable (defaults to 0 on the first run)
filename="/etc/passwd" # full path of your text-filename
let linenum++          # advance to the next line
sed -n "${linenum}p" < "$filename" | pbcopy   # copy just that line to the clipboard
echo "$linenum"        # output the new line number so it can be stored back into the variable
Set Value of Variable (LineNumber)
Run Workflow - add your current workflow (or the "ShowClipboard.workflow" - see below)
the Wait for workflow to finish should be checked
important The Output menu should be: "Return action input"
Loop
add your count...
Run Shell Script (Ignore this action's input), content one line: echo 0 (This will reset the variable LineNumber to zero when the loop ends)
Set Value of Variable (LineNumber)
For testing, you can create another workflow, called ShowClipboard.workflow, with the following content:
Get Contents of Clipboard
Set Value of Variable (clipval)
Ask for confirmation (and drag the (clipval) to the Message field)
Run the first workflow.
Screenshots: the main looping workflow and the second workflow (for testing).
You don't need AppleScript to get the URLs; you can extract them directly with Automator by using a shell task. After the task that gets the contents of a folder (this is a Finder task in Automator), add a shell task as the next task. Make sure you select that the input is sent as arguments instead of being sent to stdin. When you have done that, you only need something like one of the following shell scripts.
cat "$@" | egrep -io '\S?(http|https|ftp|afp|smb|mailto|webcal):\S+'
It first reads all the files using cat. "$@" expands to the arguments collected by the previous task: a list of paths to all files in the batch folder. We pipe them to egrep, which will only output the URLs, filtered by their schemes. If you want to support any scheme (official and unofficial):
cat "$@" | egrep -io '\S?[A-Z][A-Z0-9+-.]+:\S+'
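To try either filter by hand outside Automator, you can run the same pipeline over a single test file (the path below is only a placeholder):
cat ~/Desktop/urls.txt | egrep -io '\S?(http|https|ftp|afp|smb|mailto|webcal):\S+'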

KDE: klipper action script

So KDE's clipboard manager - klipper - allows one to write a script to be applied to clipboard contents matching a regexp. Say I want klipper to download an image through a bash script.
Here's a klipper QRegExp:
^http://.*\.(png|svg|gif|jpg|jpeg)
I know that this regexp works - klipper notifies me every time I copy an image URL to the clipboard. Then, here's the bash script
#!/bin/bash
# let's name it clip.bash
name=`basename $1`
curl -o ~/Downloads/$name $1
I put this script in the PATH (I tried to feed this script an image URL myself - it works), and finally I specify an action the following way:
clip.bash \%s
everything's fine and taken care of - but it doesn't work!
So my question is: "how to make klipper download an image through the bash script?"
First thoughts:
Are you sure about the backslash before the '%'? I don't have access to KDE right now, but I'm not sure you need it.
Are you sure that klipper "sees" your change to the PATH variable? You could try using an absolute path (something like "/home/../clip.bash")
If those don't work, you can try to log some debug info from your script. For example:
#!/bin/bash
name=`basename $1`
echo "curl -o ~/Downloads/$name $1" 1>&2
Run
tail ~/.xsession-errors
to see what command your script has just tried to execute.
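Once the invocation itself works, a slightly more defensive variant of clip.bash (a sketch: the quoting and the extra log line are optional additions) could look like:
#!/bin/bash
# clip.bash - download the image whose URL klipper passes as $1
url="$1"
name=$(basename "$url")
echo "clip.bash: downloading $url as $name" 1>&2   # shows up in ~/.xsession-errors for debugging
curl -o ~/Downloads/"$name" "$url"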

Resources