Is there a way to copy a string to the clipboard from the command line?
To be more specific, I want to make a script which copies my email address to the clipboard, so that when I need to enter it several times while logging in or registering, I can just run the script once and then Cmd+V it whenever I need it.
I've heard of pbcopy, but I don't think it covers my case. Any suggestions?
Many thanks!
You need to pipe the output of your script to pbcopy.
For example:
./somescript.sh | pbcopy
echo 'your-email@example.com' | pbcopy
(as @Jonathan Leffler stated above)
Or see the answer for the related question on piping to clipboard on different operating systems:
Pipe to/from the clipboard in Bash script
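If you want this as a reusable script, a minimal sketch might look like the following (the file name and the address are placeholders; printf is used instead of echo so that no trailing newline ends up in the clipboard):
#!/bin/sh
# copy-email.sh -- copy a fixed address to the macOS clipboard
printf '%s' 'you@example.com' | pbcopy
Make it executable with chmod +x copy-email.sh and run it whenever you need the address.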
I know :w means to save the file, but what does :%w do?
And also what does !pbcopy do?
See :h :w_c. It sends the given lines as standard input to an external command (run via your shell). The % sign is a range: it tells Vim to send the whole file, not only a part of it.
Note that :w! somefile.txt is totally different from :w !ext_command (with a space after :w).
I don't know the pbcopy external command myself, but you can run man pbcopy in your shell to learn what it does.
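For example, on macOS these two commands copy the whole buffer and a line range, respectively, to the system clipboard:
:%w !pbcopy
:10,20w !pbcopy
The first sends every line of the file to pbcopy's standard input; the second sends only lines 10 through 20.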
There are plenty of threads here discussing how to do this for scripts or for the cmdline (mostly involving pipes, redirections, tee).
What I didn't find is a solution which can be set up once and then just works globally, without manipulating single scripts or adding something to every command line.
What I want to achieve is something like described in the top answer of
How do I write stderr to a file while using "tee" with a pipe?
Isn't it possible to configure the bash session so that all stderr output is logged to a file, while still writing it to console? Something I could add to .bashrc and thus automatically set up every time I login?
Software: Bash 4.2.24(1)-release (x86_64-pc-linux-gnu), xterm, Ubuntu 12.04
Try this variation on @0xC0000022L's previous solution (put it in your .bash_profile):
exec 2> >( tee log.file > /dev/tty )
A couple of caveats:
The prompt and anything you type at the command line are printed to stderr, and so will be logged in your file.
There could be an issue with the newline that terminates a command not being displayed in your terminal; I observe it on my Linux host, but not on my Mac OS X laptop. Perhaps someone else can explain and/or fix the issue. For example, if I type "echo stdout", I see the following:
$ echo stdoutstdout
$
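If you want this to survive every login, a minimal sketch for your .bashrc might be (the log path is an assumption, and the interactive-shell guard keeps non-interactive shells such as scp sessions unaffected):
# log all stderr to a file while still showing it on the terminal
if [[ $- == *i* ]]; then
    exec 2> >( tee -a ~/.bash_stderr.log > /dev/tty )
fi
The -a flag makes tee append across sessions; drop it if you prefer one log per login.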
I have a program (that I did not write) which is not designed to read commands from a file. Entering commands on STDIN is pretty tedious, so I'd like to automate it by writing the commands in a file for re-use. Trouble is, if the program hits EOF, it loops infinitely trying to read the next command, dumping an endless torrent of menu options on the screen.
What I'd like to be able to do is cat a file containing the commands into the program via a pipe, then use some sort of shell magic to have it switch from the file to STDIN when it hits the file's EOF.
Note: I've already considered using cat with the '-' for STDIN. Unfortunately (I didn't know this before), piped commands wait for the first program's output to terminate before starting the second program -- they do not run in parallel. If there's some way to get the programs to run in parallel with that kind of piping action, that would work!
Any thoughts? Thanks for any assistance!
EDIT:
I should note that my goal is not only to prevent the system from hitting the end of the commands file. I would like to be able to continue typing in commands from the keyboard when the file hits EOF.
I would do something like
(cat your_file_with_commands; cat) | sh your_script
That way, when the file with commands is done, the second cat will feed your script with whatever you type on stdin afterwards.
Same as Idelic's answer, with simpler syntax ;)
cat your_file_with_commands - | sh your_script
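For a concrete test of this trick, an interactive calculator can stand in for the real program (bc here is just a placeholder):
$ printf '1+1\n2*3\n' > commands.txt
$ cat commands.txt - | bc
2
6
Once the file is exhausted, bc keeps reading from the keyboard, so typing 4+4 at that point prints 8.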
I would think expect would work for this.
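A minimal expect sketch (the program name and the command file are placeholders, and you may want expect calls to wait for the program's prompts instead of sending blindly):
#!/usr/bin/expect -f
# feed canned commands to an interactive program, then hand over the keyboard
spawn ./the_program
set f [open "commands.txt" r]
while {[gets $f line] >= 0} {
    send -- "$line\r"
}
close $f
# once the file is exhausted, attach the user's keyboard to the program
interact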
Have you tried something like tail -n +1 -f commandfile | command? That should pipe the lines of the file to command without closing the file descriptor afterwards. The -n +1 makes tail start from the first line; without it, tail -f only shows the last ten lines of the file.
I recently discovered 'comint-show-output' in Emacs shell mode, which jumps to the first line of shell output. I find it incredibly handy when looking at output that exceeds a screen length. The advantages of this command over scrolling with 'page up' are (a) you don't have to scan with your eyes for the first line of the output, and (b) you only have to hit the key combo once, instead of hitting 'page up' some number of times that you probably don't know beforehand.
I thought about ending all my commands with '| more', but that is not what I want: most of the time I want to retain all output in the terminal buffer, and I usually want to see the end of the shell output first.
I use OS X. Is there a terminal app (on OS X) and shell (on the remote Linux box) combination that offers something equivalent, so I can do something similar without using Emacs all the time? (I know, crazy talk.) I normally use bash, but I'd be fine with switching shells just for this feature.
The way I do this sort of thing is by sending my output to a file and then watching the file as it is written. You still get the results of the command dumped to terminal history in real time, and you can still inspect the output's actual contents after the fact (or in another terminal, etc.):
command > output &   # run in the background, capturing output to a file
tail -f output       # follow the output as it is written
head output          # jump back to the first lines whenever you like
You could always do something in bash like this:
alias foo='!! | more'
which is meant to make foo run the previous command through more. Be aware, though, that bash performs history expansion before alias expansion, so the !! inside the alias is not expanded when you run foo; that idiom comes from csh. In bash you can simply type !! | more at the prompt instead. I'm not sure of any way to do exactly what you are suggesting.
If you're expecting a lot of output and don't want to run your command twice, you can use tee(1) to fork the output:
my-command | tee /tmp/my-command.log | less
This will pipe the output to a pager (less), while simultaneously logging it to a file (in this case, a file named /tmp/my-command.log). If you need to review the output after you've quit less, you can just cat the log file instead of re-running the command.
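If you also want any error output captured, redirect stderr into the pipe first:
my-command 2>&1 | tee /tmp/my-command.log | less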
Can AppleScript take the output of a shell script or variable and put it in the paste buffer?
I have a file with a password for every day (formatted "date,password") and I want to write a script that when run will look up the date and output the password for that date.
That part is not a problem, I'm just wondering if there is a way to get the output to automatically go into the paste buffer?
Using AppleScript to put output of a shell command into the clipboard:
set the clipboard to (do shell script "ls")
If you do not want to use AppleScript you can use pbcopy in the shell:
sh$ some command | pbcopy
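Putting the two together for the daily-password lookup, a minimal shell sketch might be (the file name and the date format are assumptions; adjust the format string passed to date so it matches your file):
#!/bin/sh
# look up today's entry in a "date,password" file and copy the password
today=$(date +%Y-%m-%d)
grep "^$today," passwords.txt | cut -d, -f2- | pbcopy
The cut -d, -f2- keeps everything after the first comma, in case a password itself contains commas.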