I just opened two separate Terminal windows; one is ttys000, the other is ttys001. I typed this command in the ttys000 window:
$ echo "Test" > /dev/ttys001
and it came back with "Permission denied". So I switched to root and tried the same command again, but this time it says "Operation not supported"...
I thought the ttys001 window would show the "Test" string.
Why does this happen?
I am trying to run the following command:
ssh xxx#99.99.99.99 ". ./.profile; myscript 2&1 >> /tmp/2244455.log"
But it comes up with the following error:
sh: 1: execute permission denied
When I run myscript 2&1 >> /tmp/2244455.log directly on the remote server, it works perfectly.
Also, when I run ssh xxx#99.99.99.99 ". ./.profile; myscript it works perfectly.
Please can you help me with this issue?
myscript is not owned by xxx; the permissions are:
ls -ltrh myscript
-rwxr-xr-x 1 yyy other 11K May 18 15:04 myscript
The 2&1 syntax is wrong; if you want to redirect stderr to stdout, you need to:
2>&1
You also have quotes that are not nested properly; the backtick overlaps with the double quotes. EDIT: I see you've edited the question, but there is now an unpaired backtick at the beginning of your command.
So I'm guessing at what you're after with the quotes, but your whole command might be:
ssh xxx#99.99.99.99 ". ./.profile; myscript >> /tmp/2244455.log 2>&1"
This would create /tmp/2244455.log on the remote 99.99.99.99 machine.
Putting backticks around the ssh command will cause your local shell to attempt to run its output as a new command line. Get rid of those, if they're not just a typo from formatting your question.
The real problem is that you're missing the > in the redirect: it's 2>&1, not 2&1. Without the >, the shell treats the 2 as an argument to the command, runs that command in the background, and then attempts to run a command named 1 in the foreground.
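A short demonstration of that parsing; the function and log path below are stand-ins for illustration, not the original myscript:

```shell
# "&" is a command separator, so "myscript 2&1 >> log" parses as TWO
# commands: "myscript 2" run in the background, then a command named "1"
# with its stdout appended to the log.  A stand-in for myscript:
myscript() { echo "args: $*"; }

# Capture everything the two commands print (stdout and stderr):
out=$( { myscript 2&1 >> /tmp/parse-demo.log; wait; } 2>&1 )
echo "$out"
```

Running this shows "args: 2" from the background job, followed by a "not found" error from the shell trying to execute a command named 1.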
In Ubuntu 14.04, I created the following bash script:
flock -nx "$1" xdg-open "$1" &
The idea is to lock the file specified in $1 (flock), then open it in my usual editor (xdg-open), and finally return to the prompt so I can open other files in sequence (&).
However, the & isn't working as expected: I need to press Enter to make the shell prompt appear again. In simpler constructs, such as
gedit test.txt &
it works as it should, returning the prompt immediately. I think it has to do with there being two commands on the first line. What am I doing wrong?
EDIT
The prompt is actually there, but it is somehow "hidden". If I issue the command
sudo ./edit error.php
it replies with
Warning: unknown mime-type for "error.php" -- using "application/octet-stream"
Error: no "view" mailcap rules found for type "application/octet-stream"
Opening "error.php" with Geany (application/x-php)
__
The errors above are not related to the question. But where the __ is, I see nothing. I know the prompt is there, because I can issue other commands, like ls, and they work. But the question remains: why is the prompt hidden? And how can I make it show normally?
Why isn't this command returning to shell after &?
It is.
You're running a command in the background. The shell prints a new prompt as soon as the command is launched, without waiting for it to finish.
According to your latest comment, the background command is printing some message to your screen. A simple example of the same thing:
$ echo hello &
$ hello
The cursor is left at the beginning of the line after the $ hello.
As far as the shell is concerned, it has printed a prompt and is waiting for a new command. It doesn't know or care that a background process has messed up your display.
One solution is to redirect the command's output somewhere other than your screen, either to a file or to /dev/null. If it's an error message, you'll probably have to redirect both stdout and stderr.
flock -nx "$1" xdg-open "$1" >/dev/null 2>&1 &
(This assumes you don't care about the content of the message.)
Another option, pointed out in a comment by alvits, is to sleep for a second or so after executing the command, so the message appears followed by the next shell prompt. The sleep command is executed in the foreground, delaying the printing of the next prompt. A simple example:
$ echo hello & sleep 1
hello
[1] + Done echo hello
$
or for your example:
flock -nx "$1" xdg-open "$1" & sleep 1
This assumes that the error message is printed within the first second. That's probably a valid assumption for your example, but it might not hold in general.
I don't think the command is doing what you think it does.
Have you tried running it twice, to see whether the lock can be obtained the second time?
Well, if you do, you will see that it doesn't fail, because xdg-open forks before exec'ing the editor, so the lock is released almost immediately. Also, if it did fail, you would expect some indication.
You should use something like this
flock -nx "$1" -c "gedit '$1' &" || { echo "ERROR"; exit 1; }
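A minimal sketch of how the -n flag behaves when the lock is already held (assumes util-linux flock; the lock file path is hypothetical):

```shell
lockfile=/tmp/flock-demo.lock    # hypothetical lock file

# Hold an exclusive lock for two seconds in the background
flock -x "$lockfile" sleep 2 &
holder=$!
sleep 0.5                        # give the holder time to acquire the lock

# With -n, the second attempt fails immediately instead of blocking
if flock -nx "$lockfile" true; then
    state=acquired
else
    state=busy
fi
echo "$state"
wait "$holder"
```

While the lock is held, the second flock reports failure immediately (here, busy); without -n it would block until the background sleep finishes.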
I'm trying to run a command on detached Screen. The problem is that it does not work when the command has additional parameters, e. g.:
screen -L "ls"
It produces a file (screenlog.0) containing the directory listing. But when I run
screen -L "ls -la"
Screen fails with error: Cannot exec 'ls -la': No such file or directory
Is there any way to run it properly in a Screen session?
Drop the quotes, so that screen receives the command and each of its arguments as separate words; quoted, "ls -la" is treated as the name of a single program to execute:
screen -L ls -la
I am trying to write a script that displays its output to the terminal only while it's running, much like the 'less' or 'ssh' commands.
When I launch said script, it would take over the whole terminal, print what it needs to print, and then when I exit the script, I would return to my terminal where the only record that my script has run will be the line that shows the command itself. When I scroll up, I don't want to see what my script output.
[snoopdougg#machine /home/snoopdougg/logs]$ ls
alog.log blog.log clog.log myScript.sh
[snoopdougg#machine /home/snoopdougg/logs]$ whoami
snoopdougg
[snoopdougg#machine /home/snoopdougg/logs]$ ./myScript.sh
[snoopdougg#machine /home/snoopdougg/logs]$
(Like nothing ever happened... but myScript.sh would have printed things to the terminal while it was running.)
How can I do this?
You're talking about the alternate screen, which you can switch to and from with a pair of terminfo capabilities, smcup and rmcup. Put this in a script and run it for a small demo:
tput smcup
echo hello
sleep 5
tput rmcup
Use screen:
screen ./myScript.sh
The following should print "hello" (or some reminder) on my Linux command line at 9:00 AM today:
$ at 9:00AM
warning: commands will be executed using /bin/sh
at> echo "hello"
at> <EOT>
However, at the specified time, nothing happens.
I have an empty /etc/at.deny and no /etc/at.allow file, so there shouldn't be any problems with permission to use the command. Also, writing to a file at 9:00 AM works:
$ at 9:00AM
at> echo "hello" > /home/mart/hello.txt
at> <EOT>
$ cat /home/mart/hello.txt
hello
All jobs are shown as scheduled, I just can't get any output to the terminal window (I'm on Crunchbang Linux with Terminator). Why? Do I need to somehow specify the window for that output?
Thanks for any help!
at runs commands from a daemon (atd), which doesn't have access to your terminal. Thus, output from the script isn't going to appear in your terminal (unless you pipe to the right tty in your command).
Instead, it does as man at says:
The user will be mailed standard error and standard output from his commands, if any.
You may be able to access these reports using mail if your machine is suitably configured.
If you want to have at write to your terminal, you can try piping the output to write, which writes a message to a user's TTY, or to wall if you want to write to every terminal connected to the system.
Okay, nneonneo's explanations led me to using wall, which sends a message to all users. So setting oneself reminders in a terminal window can be done like this:
$ at 9:00AM
warning: commands will be executed using /bin/sh
at> echo "hello" | wall
at> <EOT>