piping Linux cat command to web in openWRT - shell

I want to run a shell script from OpenWrt. Basically it needs to constantly read the Arduino serial port, and whenever it reads something, the data needs to be sent to a web-based service.
Currently this is my script, which only saves to a text file:
cat /dev/ttyACM0 >> /www/home/log.txt &
I want to avoid saving to a file and instead send the output string straight to a web-based service that stores the readings in a MySQL DB.
The data-saving web service is already set up and working, something like this:
http://my-service.com/?data=what-ever-the-arduino-spits
Is there a way to do it with wget?
maybe something like this:
cat /dev/ttyACM0 | xargs -I % wget "http://ivardi.info?todb=%"
Keep in mind that the OpenWrt device has 32 MB of RAM and 4 MB of flash storage, so this is only possible with a shell script and not Python/PHP.
Regards

Note that it could be risky in some cases to read the serial device (/dev/ttyACM0) directly and pass its output straight to wget, in case the read blocks for some reason (what happens if the serial port is disconnected and reconnected?).
It could be safer to route the output to a file, then in a loop read the most recent data and push it using wget. Perhaps something like:
#!/bin/bash
while true; do
    # push the most recent line of the log (wget options elided here)
    tail -1 /www/home/log.txt | wget <...options...>
    sleep 60
done
In reality you would probably need to do something a little more advanced so that you don't keep sending duplicate data.
Of course, in your own situation what you proposed may be sufficient...
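
If the direct approach turns out to be fine for your setup, a line-by-line read loop maps onto wget more naturally than xargs. A minimal sketch, assuming BusyBox ash and wget on OpenWrt, one reading per line, readings that need no URL-encoding, and a 9600 baud line (all assumptions on my part):
#!/bin/sh
# Read the Arduino line by line and push each reading with BusyBox wget.
stty -F /dev/ttyACM0 9600 raw              # baud rate is an assumption
while IFS= read -r reading; do
    [ -n "$reading" ] || continue          # skip empty lines
    wget -q -O /dev/null "http://my-service.com/?data=$reading"
done < /dev/ttyACM0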

Related

linux script to send me an email every time a log file changes

I am looking for a simple way to constantly monitor a log file and send me an email notification every time this log file changes (i.e. new lines have been added to it).
The system runs on a Raspberry Pi 2 (Raspbian / Debian Stretch) and the log comes from a GPIO Python script running as a daemon.
I need something very simple and lightweight; I don't even care about the text of the new log entry, because I know what it says: it is always the same 24 lines of text appended at the end.
Also, the log.txt file gets recreated every day at midnight, so that might represent another issue.
I already have a working python script to send me a simple email via gmail (called it sendmail.py)
What I tried so far was creating and running the following bash script:
monitorlog.sh
#!/bin/bash
tail -F log.txt | python ./sendmail.py
The problem is that it just sends an email every time I execute it, but when the log actually changes, it just quits.
I am really new to linux so apologies if I missed something.
Cheers
You asked for simple:
#!/bin/bash
cur_line_count="$(wc -l myfile.txt)"
while true
do
    new_line_count="$(wc -l myfile.txt)"
    if [ "$cur_line_count" != "$new_line_count" ]
    then
        python ./sendmail.py
    fi
    cur_line_count="$new_line_count"
    sleep 5
done
I've done this a bunch of different ways. If you run a cron job every minute that counts the number of lines (wc -l), compares that to a stored count (e.g. in /tmp/myfilecounter), and sends an email when the numbers differ, you get the same effect without a long-running process.
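A sketch of that cron variant (the log path and counter file location are illustrative):
#!/bin/sh
# Intended to be run from cron once a minute. Compares the current line
# count against the count stored on the previous run; paths are examples.
count_file=/tmp/myfilecounter
new_count=$(wc -l < /path/to/log.txt)
old_count=$(cat "$count_file" 2>/dev/null || echo 0)
if [ "$new_count" != "$old_count" ]; then
    python ./sendmail.py
fi
echo "$new_count" > "$count_file"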
If you have inotify, there are more direct ways to get "woken up" when the file changes, e.g. https://serverfault.com/a/780522/97447 or https://serverfault.com/search?q=inotifywait.
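For example, with inotifywait from the inotify-tools package (a sketch; the path is illustrative, and since log.txt is recreated at midnight you may want to watch the containing directory instead):
#!/bin/bash
# Blocks until the file is modified instead of polling it.
while inotifywait -qq -e modify /path/to/log.txt; do
    python ./sendmail.py
done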
If you don't mind adding a package to the system, incron is a very convenient way to run a script whenever a file or directory is modified, and it looks like it's supported on Raspbian (internally it uses inotify): https://www.linux.com/learn/how-use-incron-monitor-important-files-and-folders. Looks like it's as simple as:
sudo apt-get install incron
sudo vi /etc/incron.allow # Add your userid to this file (or just rm /etc/incron.allow to let everyone use incron)
incrontab -e # Add the following line to the incron table
/path/to/log.txt IN_MODIFY python ./sendmail.py
And you'd be done!

How to use a variable instead of file to let any command write to?

I'm using curl --cookie-jar <filename> to save cookies temporarily and load them later in a script.
On OS X there's no /dev/shm, and I don't want too many temp files written to the SSD.
Is it possible to use a variable instead of a file for this (or any) command to write to?
A variable can be read like a file with curl --cookie <(echo "$variable"); does that involve disk access?
Literal Question
In the completely generic sense: You can't. Shell variables don't have independent existence on the filesystem any more than Python variables or C programs' variables do. (Environment variables are exposed to be read by processes, but changes aren't propagated back to parent processes, so even if an operating system had extensions that provided environment variables to be accessible via a filesystem interface -- akin to /proc/self/environ on Linux -- that's not helpful here, where two-way communication is needed).
MacOS Workaround
You can use hdiutil and diskutil to create a ramdisk with a filesystem on it to serve the same purpose as /dev/shm. See e.g. https://gist.github.com/rxin/5085564
Alternate Approach: FIFO Abuse
One fugly-but-feasible approach is to use a background process pumping data between a pair of FIFOs:
#!/usr/bin/env bash
# These are the only two operations that touch disk
mkfifo cookie-write.fifo || exit
mkfifo cookie-read.fifo || exit
# Start a background process that pumps data from the read FIFO to the write FIFO
datapump() {
    while IFS= read -r -d '' content <cookie-write.fifo || [[ $content ]]; do
        printf '%s\0' "$content" >cookie-read.fifo
    done
}
datapump & datapump_pid=$!
# run an initial curl with cookies written to cookie-write.fifo
curl -c cookie-write.fifo http://example.com/login
cookies=$(<cookie-read.fifo) # read cookies set by login process
# write back from the shell variable to the FIFO to allow read by another curl
printf '%s\0' "$cookies" >cookie-write.fifo
# read it in that new curl process, write back to the FIFO again
curl -b cookie-read.fifo -c cookie-write.fifo http://example.com/do-something
cookies=$(<cookie-read.fifo) # read cookies as updated by do-something process
This kind of approach requires a great deal of care to avoid deadlocks: Note that the coprocess first reads and then writes; if either of those operations doesn't take place, then it's going to hang indefinitely. (Thus, if your curl operation doesn't write any cookies, the pump process won't switch modes over from reading to writing, and a subsequent attempt to read the cookie state may hang).
Of course!
Your usage:
curl something.com > file
Storing in a variable:
variable=$(curl something.com)
Note that this is not recommended for larger outputs (unless you introduce parsing/regex matching/whatever).
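Applied to the cookie case from the question, a sketch (the URLs are placeholders): curl writes the cookie jar to stdout when the --cookie-jar filename is '-', and bash's <(...) on macOS is backed by /dev/fd, not by a file on disk.
# Log in and capture the cookie jar in a variable instead of a file.
cookies=$(curl -s -o /dev/null -c - http://example.com/login)
# Feed the jar back to the next request via process substitution.
curl -s -b <(printf '%s\n' "$cookies") http://example.com/profile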

using pinentry-tty in a bash script (like read)

Is there a way to use pinentry-tty directly in a bash script, e.g. as a more secure replacement for 'read'? I was thinking of something like this:
local pass=$(pinentry-tty);
This allows me to enter several lines, but nothing gets saved to the variable.
The different pinentry implementations cannot easily be called the way you want to use them. pinentry follows a simple protocol, which also enables several possibilities to configure prompts and print error messages. An example session, with GETPIN being the command issued on stdin and foo being the passphrase the user entered, returned with other status messages on stdout:
$ pinentry
OK Pleased to meet you
GETPIN
D foo
OK
The full documentation is included in pinentry's source tarball, but also available online.
While this won't work with all pinentry implementations you can try invoking it as follows:
password=$(echo -e "SETPROMPT Please enter your password:\nGETPIN\n" | \
pinentry | \
sed -nr '0,/^D (.+)/s//\1/p')
On my Arch machine this works for pinentry-gnome3, pinentry-gtk-2 and pinentry-qt, but not with pinentry-curses or pinentry-tty.

Send command to open process in Shell file

In bash how can I issue a command to a running process I just started?
For example;
# Start Bluez gatttool then Connect to bluetooth device
gatttool -b $MAC -I
connect # send 'connect' to the gatttool process?
Currently my shell script doesn't get to the connect line because the gatttool process is running.
If you simply want to send the string "connect\n" to the process, you can use a standard pipe:
echo "connect" | gatttool -b $MAC -I
If you want to engage in a more complex "conversation" with the gatttool process, take a look at the expect(1) and chat(8) tools, which allow you to send a sequence of strings and wait for certain responses.
If you'd prefer a slightly "lighter" way of piping you could use a heredoc such as in:
gatttool -b $MAC -I <<EOF
connect
(...)
EOF
Everything contained between the two EOF tags will be piped to the command's input. I believe this will not allow you to interact with the command while it runs so, as mentioned in the previous answer, you might want to consider using expect if you need to act upon the command's output before sending something back to it.
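If the command needs time between inputs (gatttool usually does after connect), you can also generate its input from a command group. A sketch; the characteristic handle, value, and sleep timings are placeholders:
#!/bin/bash
# Pipe several commands into gatttool, pausing between them.
{
    printf 'connect\n'
    sleep 5                               # let the connection establish
    printf 'char-write-cmd 0x0011 0100\n' # handle/value are placeholders
    sleep 2
    printf 'disconnect\n'
} | gatttool -b "$MAC" -I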

Bash programming, interrogating ttyUSB port

I'm new to bash programming in Linux. Basically, what I want to do is write a bash script that opens the port ttyUSB0, interrogates it with AT commands (like "0100"), and assigns the response to a variable. I've been trying this in different ways:
1) Using cat
#!/bin/bash
PORT=`ls /dev/ttyU*`
cat $PORT
????
2) Using Minicom
#!/bin/bash
minicom
????
3) Using Screen
#!/bin/bash
PORT=`ls /dev/ttyU*`
screen $PORT
????
How can I interrogate the port once cat, minicom, or screen starts? What do I have to put in place of the ???? in the three snippets above?
Thank you so much!!!
Don't try writing to a tty device using bash, you'll end up chasing your own tail forever. Use minicom or C-Kermit for that.
If you want to check that the device is active before starting minicom, you can read from it with bash and there is a good explanation of how to achieve this here: Bash read from ttyUSB0 and send to URL
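A minimal sketch of such a check (the 5-second timeout is an assumption, and this only works if the device sends data on its own):
#!/bin/bash
# Consider the device active if it emits anything within 5 seconds.
if IFS= read -r -t 5 line < /dev/ttyUSB0; then
    echo "device is talking: $line"
else
    echo "no data from /dev/ttyUSB0" >&2
fi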
You should be able to use my atinout program for this. It is a command line tool to talk with a modem:
$ echo AT | atinout - /dev/ttyUSB0 -
AT
OK
$
So with a little bit of scripting you should be able to extract the response you want (remember to always check for a successful OK response).
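For instance, a sketch that captures the response in a variable and checks the final status line (plain AT is used just as an example command):
#!/bin/sh
# Capture the modem's reply; command substitution strips the trailing
# newline, so a successful reply ends with "OK".
response=$(echo 'AT' | atinout - /dev/ttyUSB0 -)
case "$response" in
    *OK) echo "modem answered: $response" ;;
    *)   echo "AT command failed" >&2; exit 1 ;;
esac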
