Display live git-clone progress using dialog - bash

I've been trying to get git-clone to provide output to a textbox in dialog. I tried something like this:
git clone github.com/CrazyWillBear/yes-replacement 2> /tmp/clone.log &
dialog --title "Cloning repo..." --textbox /tmp/clone.log 50 100
It doesn't work: it usually displays nothing, writes to the terminal instead, or shows only the first line of the output. A loop doesn't work either; it creates some really funky problems.
The result I'm trying to get is the one in the linked example, but with the entire output instead of just the first line.

Use tee to write the output to a file, then use dialog's tailbox feature to read it live.
git clone https://github.com/CrazyWillBear/pig --progress 2>&1 | tee -a /tmp/clone.$$ &> /dev/null &
dialog --title "Progress" --tailbox /tmp/clone.$$ 50 100

Related

Displaying an Xcode variable in a shell script in an alert fails to display the alert

I have a shell script that I run from Xcode to open the root folder in Terminal. It works as expected when invoked by a function key in Xcode: an alert with the expected message appears first, and then a Terminal window opens at the value of $SRCROOT as defined in Xcode.
#! /bin/bash
osascript -e 'display dialog "message" with title "Hi"'
open -a Terminal "$SRCROOT"
Yet when I try to replace "message" with the contents of $SRCROOT, the dialog doesn't display at all. I've tried all of the approaches listed in the solutions here: osascript using bash variable with a space
Using any of those approaches results in the alert not displaying at all.
I even tried to display the contents of $SRCROOT in a notification, with various methods of escaping it, but that just displays an empty notification.
Any ideas? TIA.
Save the following in test.sh:
#!/usr/bin/env bash
SRCROOT=~/Developer/SwiftVC
osascript \
-e "on run(argv)" \
-e 'display dialog item 1 of argv with title "Hi"' \
-e "end" \
-- "$SRCROOT"
open -a Terminal "$SRCROOT"
and run it with:
chmod +x test.sh
./test.sh
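When the script is actually invoked from Xcode rather than run standalone, $SRCROOT should already be exported by Xcode, so you would presumably drop the hard-coded assignment and just guard against the variable being empty. A minimal sketch of that variant (the guard line is an assumption, not part of the original answer):
#!/usr/bin/env bash
# SRCROOT is expected to be provided by Xcode's environment; fail early if it is not set
: "${SRCROOT:?SRCROOT is not set}"
osascript \
-e "on run(argv)" \
-e 'display dialog item 1 of argv with title "Hi"' \
-e "end" \
-- "$SRCROOT"
open -a Terminal "$SRCROOT"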

Email the output of a shell script

I have a shell script (script.sh) in which I run a simple python web scraper
python3 script.py
As I run this shell script in a cron job, I want to be notified by email if something goes wrong. I found this code on stackoverflow, which I put at the bottom of my .sh script:
./script.sh 2>&1 | tee output.txt | mail -s "Script log" email@address.com
The problem is that the script now seems to be looping; where it should only take a few seconds, it now takes a few minutes to run and I receive 10-20 emails in my mailbox. The content of these emails varies: most of the time the emails are empty, but sometimes they contain messages such as:
./script.sh: 4: ./script.sh: Cannot fork
or:
mail: Null message body; hope that's ok
I'm not sure what goes wrong here. What can I do to fix this?
The script runs, runs the Python scraper, then calls itself again, and so on until it runs out of resources and can no longer fork.
The .sh should contain:
python3 script.py 2>&1 | tee output.txt | mail -s "Script log" email@address.com
This assumes you actually want to create output.txt in whatever directory the script runs from... otherwise you could leave the tee out entirely.
Alternatively, you could just configure MAILTO in your crontab (see https://www.cyberciti.biz/faq/linux-unix-crontab-change-mailto-settings/).
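For the MAILTO route, a minimal crontab sketch might look like the following; the address, schedule, and script path are placeholders, and cron then mails any output of the job to that address, so the tee/mail pipeline inside the script is no longer needed:
MAILTO=you@example.com
# run the scraper every day at 06:00; any stdout/stderr output is mailed to MAILTO
0 6 * * * /path/to/script.sh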

sending echo and error to both terminal and file log

I am trying to modify a shell script someone created for Unix. This script mostly runs on back-end servers with no human interaction; however, I needed to make another script that allows users to input information, so it is just a modification of the old version for user input. The biggest issue I am running into is getting both error messages and echos saved in a log file. The script has a lot of them, and I want them shown on the terminal as well as sent to the specified log file, to be looked into later.
What I have is this:
exec 1> ${LOG} 2>&1
This line pretty much sends everything to the log file. That is all good, but I also have people entering information in the script, and it sends everything to the log file, including the echo needed for the prompt. This line is at the beginning of the script; after reading more about stderr and stdout, I tried:
exec 2>&1 1>>${LOG}
exec 1 | tee ${LOG}
But I only get this error when running it: "./bash_pam.sh: line 39: exec: 1: not found"
I have gone over sites such as this one to solve the issue, but I do not understand why it does not print to both. The way I insert it, it either only sends output to the log location and not to the terminal, or it sends output to the terminal but nothing is preserved in the log.
EDIT: Some of the solutions for this have mentioned that certain fixes will work in bash, but not in /bin/sh.
If you would like all output to be printed to the console while also being written to logfile.txt, you would run your script like this:
bash your_script.sh 2>&1 | tee -a logfile.txt
Or, for a single command within the file:
<bash_command> 2>&1 | tee -a logfile.txt
The -a option makes tee append to logfile.txt instead of overwriting it.
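If the duplication needs to happen inside the script itself rather than at the call site, one common bash-only approach is to combine exec with process substitution; note that this will not work under plain /bin/sh, which matches the limitation mentioned in the question's edit. A rough sketch, with the log path as a placeholder:
#!/bin/bash
LOG=/tmp/myscript.log
# route stdout and stderr through tee so everything appears on the terminal and in the log
exec > >(tee -a "$LOG") 2>&1
echo "This prompt text shows up in both places"
ls /nonexistent   # error output is duplicated as well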

Why isn't this command returning to shell after &?

In Ubuntu 14.04, I created the following bash script:
flock -nx "$1" xdg-open "$1" &
The idea is to lock the file specified in $1 (flock), then open it in my usual editor (xdg-open), and finally return to the prompt so I can open other files in sequence (&).
However, the & isn't working as expected. I need to press Enter to make the shell prompt appear again. In simpler constructs, such as
gedit test.txt &
it works as it should, returning the prompt immediately. I think it has to do with the existence of two commands in the first line. What am I doing wrong, please?
EDIT
The prompt is actually there, but it is somehow "hidden". If I issue the command
sudo ./edit error.php
it replies with
Warning: unknown mime-type for "error.php" -- using "application/octet-stream"
Error: no "view" mailcap rules found for type "application/octet-stream"
Opening "error.php" with Geany (application/x-php)
__
The errors above are not related to the question. But instead of __ I see nothing. I know the prompt is there, because I can issue other commands, like ls, and they work. But the question remains: WHY is the prompt hidden? And how can I make it show normally?
Why isn't this command returning to shell after &?
It is.
You're running a command in the background. The shell prints a new prompt as soon as the command is launched, without waiting for it to finish.
According to your latest comment, the background command is printing some message to your screen. A simple example of the same thing:
$ echo hello &
$ hello
The cursor is left at the beginning of the line after the $ hello.
As far as the shell is concerned, it has printed a prompt and is waiting for a new command. It doesn't know or care that a background process has messed up your display.
One solution is to redirect the command's output somewhere other than your screen, either to a file or to /dev/null. If it's an error message, you'll probably have to redirect both stdout and stderr.
flock -nx "$1" xdg-open "$1" >/dev/null 2>&1 &
(This assumes you don't care about the content of the message.)
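If you do care about the message, you could send it to a file instead of /dev/null; the log path here is only an example:
flock -nx "$1" xdg-open "$1" > "$HOME/xdg-open.log" 2>&1 &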
Another option, pointed out in a comment by alvits, is to sleep for a second or so after executing the command, so the message appears followed by the next shell prompt. The sleep command is executed in the foreground, delaying the printing of the next prompt. A simple example:
$ echo hello & sleep 1
hello
[1] + Done echo hello
$
or for your example:
flock -nx "$1" xdg-open "$1" & sleep 1
This assumes that the error message is printed in the first second. That's probably a valid assumption for your example, but it might not be in general.
I don't think the command is doing what you think it does.
Have you tried running it twice, to see whether the lock can no longer be obtained the second time?
If you do, you will see that it doesn't fail, because xdg-open just forks to exec the editor, so the lock is released almost immediately instead of being held for the editing session. Also, if it did fail, you would expect some indication.
You should use something like this instead:
flock -nx "$1" -c "gedit '$1' &" || { echo "ERROR"; exit 1; }

Hide dialog output in bash scrollback (Ubuntu)

I use the following script to display a dialog in the bash shell:
#!/bin/bash
TFILE=/tmp/habitat_resp_`whoami`.$$
dialog --menu "Commander?" 20 50 10 \
1 "MySQL" \
2 "Apache" \
3 "Postfix" \
4 "Dovecot" \
5 "Owncloud" \
2> $TFILE
# get response
RESPONSE=$(cat $TFILE)
echo $RESPONSE
clear
The problem is that when I scroll up, I can still see the dialog in my scrollback. I want it to behave like vi: I open my script, the dialog appears, and once the script is over you can't see the dialog in the scrollback.
How can this be achieved?
Regards
S.
Add the --keep-tite option, which tells dialog to use the terminal's "alternate screen", whose contents don't end up in the scrollback buffer. This produces some slightly annoying display artefacts, but possibly not as annoying as polluting your scrollback.
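Applied to the menu script above, that just means adding the flag to the dialog call; the rest of the script stays the same:
dialog --keep-tite --menu "Commander?" 20 50 10 \
1 "MySQL" \
2 "Apache" \
3 "Postfix" \
4 "Dovecot" \
5 "Owncloud" \
2> $TFILE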
