Use drag and drop to populate bash script for awk [duplicate] - bash

I compiled mplayer from source on Ubuntu. I didn't want to use a GUI, but I wanted to make an executable bash file that gets the path from a file dropped onto it. How do I make such a thing possible?
I wanted to make it look something like this:
mplayer <get full path to file.file-ending>
I want the executable bash file to sit on my desktop ;)
If possible, I'd just like a right-click -> "start with mplayer" function, but I don't know how to make one.

You can access arguments passed to the script with $1 (for the first argument). You should also make a .desktop file so Nautilus (or your desktop manager) knows what to do, and use %u to pass the dropped path to the script.
For example you can create a file named DropOverMe.desktop:
[Desktop Entry]
Encoding=UTF-8
Name=Drop Over Me
Comment=Execute the script with the file dropped
Exec=gnome-terminal -e "/folder/to/the/script/launchme.sh \"%u\""
Icon=utilities-terminal
Type=Application
I use gnome-terminal since I'm on Ubuntu; use your preferred terminal application.
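Note that newer gnome-terminal releases have deprecated the -e option; if you see a warning about it, the equivalent form (an untested assumption on my part, adjust for your terminal) would be:
Exec=gnome-terminal -- /folder/to/the/script/launchme.sh %u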
And the script could be something like:
#! /bin/bash
echo "Launched with $1" >> /tmp/history.log

Try:
#!/bin/bash
mplayer "$1"
The file path of the dropped file will be passed to the script as the first command-line argument.

In an open terminal
With mate-terminal, gnome-terminal or konsole, you can drag and drop a file onto an open window.
This pastes the file's URL/path as if it had been typed, with a trailing space added but no newline.
To run mplayer on whatever gets dropped, I wrote this little loop:
while IFS=$' \t\r\n' read -d '' -p "Wait for URL to play: " -rsn 1 str &&
      [ "$str" ]; do
    while IFS= read -d '' -rsn 1 -t .02 char; do
        str+="$char"
    done
    if [ "$str" ]; then
        read -a req <<<"$str"
        echo "$req"
        mplayer "$req"
    fi
done
The first read determines whether anything is coming in; if not, the loop ends.
The second loop, with a very short timeout, reads the rest of the dropped URL as a single string.
read -a req splits that string so that only the first part is used.
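If your terminal pastes paths containing spaces wrapped in single quotes (some terminals shell-quote the dropped path), you may want to clean the string up instead of splitting it. A minimal sketch under that assumption, to use in place of the read -a req / mplayer lines:
str="${str%"${str##*[![:space:]]}"}"   # trim the trailing space the drop adds
str="${str#\'}"; str="${str%\'}"       # strip a surrounding pair of single quotes, if any
mplayer -- "$str"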

Related

Set target when open .html file via terminal on MacOS

There is this solved topic about opening .html files via command line.
I use the solution and it works well for using
open ./myfile.html
However, it always opens the file in a new tab. I would like it to always open in the same tab (using the browser target). This is an easy thing to do in JavaScript, but I can't figure out a way to do it in combination with the above-mentioned code.
My assumption for now is that there must be a way to pass the target as a parameter to the open command. The man page for open says the following about the --args parameter:
All remaining arguments are passed to the opened application in the
argv parameter to main(). These arguments are not opened or
interpreted by the open tool.
So I tried the following:
open ./myfile.html --args target=myfile_target # still opens in new tab
open ./myfile.html --args target="myfile_target" # still opens in new tab
open ./myfile.html --args target:myfile_target # still opens in new tab
I am not sure if this even works but I think there must be a way to do this.
Edit: for now it is enough to make this work with chrome.
This Bash script incorporates a bit of AppleScript in order to open a browser window with a reference that the script can keep track of and continue to target with ongoing URL requests.
You ought to be able to copy-n-paste it into a text editor and save it as whatever you wish to call this replacement open function. I saved it as url, in one of the directories listed in my $PATH variable. That way, I could simply type url dropbox.com from the command-line, and it would run.
You will have to make it executable before you can do that. So, after it's saved, run this command:
chmod +x /path/to/file
Then you should be good to go. Let me know if you encounter any errors, and I'll fix them.
#!/bin/bash
#
# Usage: url %file% | %url%
#
#   %file%: relative or absolute POSIX path to a local .html file
#   %url%:  [http[s]://]domain[/path/etc...]

IFS=''

# Determine whether the argument is a file or a web address
[[ -f "$1" ]] && \
    _URL=file://$( cd "$( dirname "$1" )"; pwd )/$( basename "$1" ) || \
    { [[ $1 == http* ]] && _URL=$1 || _URL=http://$1; }

# Read the value on the last line of this script
_W=$( tail -n 1 "$0" )

# Open a Safari window and store its AppleScript object id reference
_W=$( osascript \
    -e "use S : app \"safari\"" \
    -e "try" \
    -e "    set S's window id $_W's document's url to \"$_URL\"" \
    -e "    return $_W" \
    -e "on error" \
    -e "    S's (make new document with properties {url:\"$_URL\"})" \
    -e "    return id of S's front window" \
    -e "end try" )

_THIS=$( sed \$d "$0" )   # All but the last line of this script
echo "$_THIS" > "$0"      # Overwrite this file
echo -n "$_W" >> "$0"     # Append the object id value as the final line
exit
# (the bare number on the next line is the saved window id the script maintains; keep it as the file's last line)
2934
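For reference, usage would look something like this (assuming you saved the script as url in a directory on your $PATH, e.g. ~/bin):
chmod +x ~/bin/url          # make it executable once
url ./myfile.html           # first call opens a new Safari window and remembers its id
url https://example.com     # later calls reuse that same window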

2nd loop inside the script or asking to enter value

I have written some code, but I am having a problem making the double loop in my bash script. The script should read the files one by one in the given directory and upload them, but the value of "XYZ" changes for each file. Is there a way to make the code ask me to enter the "XYZ" value every time it reads a new file to upload, if possible with the name of the file being read, like "please enter the XYZ value of <file name>"? I could not think of any possible way of doing this. I also have the XYZ values listed in a file in a different directory, so maybe that file can be read in a loop like the one I use for the path? I might actually need both approaches as well...
#!/bin/bash
FILES=/home/user/downloads/files/
for f in $FILES
do
    curl -F pitch=9 -F Name='astn' \
         -F "path=#/home/user/downloads/files;$f" -F "pass 1234" -F "XYZ= 1.2" \
         -F time=30 -F outputFormat=json \
         "http://blablabla.com"
done
Try the following:
#!/bin/bash
FILES=/home/user/downloads/files/
for f in $FILES
do
    echo "Please enter the name variable value here:"
    read Name
    curl -F pitch=9 -F "$Name" \
         -F "path=#/home/user/downloads/files;$f" -F "pass 1234" -F "XYZ= 1.2" \
         -F time=30 -F outputFormat=json \
         "http://blablabla.com"
done
I have added a read command inside the loop, so each time it will prompt the user for a value. Since you haven't provided more details about your requirement, I haven't tested it completely.
The problem was actually the argument. Changing it to:
-F Name="$Name"
solved the problem. Passing the argument as just $Name or "$Name" causes the value to be received incorrectly.
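Putting the pieces together, a sketch of the per-file prompt the question asks for might look like this; the glob replaces the single-directory FILES value, the prompt and the xyz variable are mine, and the remaining curl fields are copied from the question unchanged (untested against the real endpoint):
#!/bin/bash
for f in /home/user/downloads/files/*; do
    [ -f "$f" ] || continue
    # Ask for this file's XYZ value, showing the file name in the prompt
    read -r -p "Please enter the XYZ value of '$(basename "$f")': " xyz
    curl -F pitch=9 -F Name='astn' \
         -F "path=#/home/user/downloads/files;$f" -F "pass 1234" -F "XYZ=$xyz" \
         -F time=30 -F outputFormat=json \
         "http://blablabla.com"
done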


Shell Script to load multiple FTP files

I am trying to upload multiple files from one folder to an FTP site and wrote this script:
#!/bin/bash
for i in '/dir/*'
do
if [-f /dir/$i]; then
HOST='x.x.x.x'
USER='username'
PASSWD='password'
DIR=archives
File=$i
ftp -n $HOST << END_SCRIPT
quote USER $USER
quote PASS $PASSWD
ascii
put $FILE
quit
END_SCRIPT
fi
It gives me the following error when I try to execute it:
username#host:~/Documents/Python$ ./script.sh
./script.sh: line 22: syntax error: unexpected end of file
I can't seem to get this to work. Any help is much appreciated.
Thanks,
Mayank
It's complaining because your for loop does not have a done marker to indicate the end of the loop. You also need more spaces in your if:
if [ -f "$i" ]; then
Recall that [ is actually a command, and it won't be recognized if it doesn't appear as such.
And... if you single quote your glob (at the for) like that, it won't be expanded. No quotes there, but double quotes when using $i. You probably also don't want to include the /dir/ part when you use $i as it's included in your glob.
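To make the quoting point concrete, a toy comparison (not part of the original answer):
for i in '/dir/*'; do echo "$i"; done   # prints the literal string /dir/*
for i in /dir/*;   do echo "$i"; done   # prints each entry in /dir, one per line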
If I'm not mistaken, ncftp can take wildcard arguments:
ncftpput -u username -p password x.x.x.x archives /dir/*
If you don't already have it installed, it's likely available in the standard repo for your OS.
First, the literal, fixing-your-script answer:
#!/bin/bash

# no reason to set variables that don't change inside the loop
host='x.x.x.x'
user='username'
password='password'
dir=archives

for i in /dir/*; do        # no quotes if you want the wildcard to be expanded!
    if [ -f "$i" ]; then   # need double quotes and whitespace here!
        file=$i
        ftp -n "$host" <<END_SCRIPT
quote USER $user
quote PASS $password
ascii
put $file $dir/$file
quit
END_SCRIPT
    fi
done
Next, the easy way:
lftp -e 'mput -a *.i' -u "$user,$password" "ftp://$host/"
(yes, lftp expands the wildcard internally, rather than expecting this to be done by the outer shell).
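For reference, filled in with the placeholder credentials used elsewhere in this thread, and assuming you want everything from the question's /dir uploaded into archives, the call might look like this (the trailing bye just makes lftp exit when the transfer finishes):
host='x.x.x.x'; user='username'; password='password'
lftp -e 'mput -a /dir/*; bye' -u "$user,$password" "ftp://$host/archives/"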
First of all, my apologies for not making myself clear in the question. My actual task was to copy a file from a local folder to an SFTP site and then move the file to an archive folder. Since the SFTP site is hosted by a vendor, I cannot use key sharing (a vendor limitation). Also, scp would require entering a password if used in a shell script, so I have to use sshpass. sshpass is in the Ubuntu repo; for CentOS it needs to be installed from here.
The current thread and "How to run the sftp command with a password from Bash script?" gave me a better understanding of how to write the script, and I will share my solution here:
#!/bin/bash
for i in /dir/*; do
    if [ -f "$i" ]; then
        file=$i
        export SSHPASS=password
        sshpass -e sftp -oBatchMode=no -b - user@ftp.com << !
cd foldername/foldername
put $file
bye
!
        mv "$file" /somedir/test
    fi
done
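A possible robustness tweak (my own suggestion, not part of the original solution): only archive the file when the sftp batch exits successfully, along these lines:
if sshpass -e sftp -oBatchMode=no -b - user@ftp.com << !
cd foldername/foldername
put $file
bye
!
then
    mv -- "$file" /somedir/test
fi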
Thanks everyone for all the responses!
--Mayank

bash save last user input value permanently in the script itself

Is it possible to save the last value of a variable entered by the user in the bash script itself, so that I can reuse that value the next time the script is executed?
Eg:
#!/bin/bash
if [ -d "/opt/test" ]; then
echo "Enter path:"
read path
p=$path
else
.....
........
fi
The above script is just a sample (which may be wrong). Is it possible to save the value of p permanently in the script itself, so that I can use it later in the script even when the script is re-executed?
EDIT:
I am already using sed to overwrite lines in the script while it executes. This method works, but it is not good practice, as has been said. Replacing the lines in the same file as described in the answer below is much better than what I am currently using, which looks like this:
...
....
PATH=""; #This is line no 7
DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )";
name="$(basename "$(test -L "$0" && readlink "$0" || echo "$0")")";
...
if [ condition ]
fi
path=$path
sed -i '7s|.*|PATH='$path';|' $DIR/$name;
Something like this should do what you asked:
#!/bin/bash
ENTERED_PATH=""
if [ "$ENTERED_PATH" = "" ]; then
    echo "Enter path"
    read path
    ENTERED_PATH=$path
    sed -i 's/ENTERED_PATH=""/ENTERED_PATH='$path'/g' $0
fi
This script will ask the user for a path only if ENTERED_PATH has not been defined previously, and it stores the value directly in the current file with the sed line.
A safer way to do this might be to write a config file somewhere with the data you want to save, and source it (. data.saved) at the beginning of your script.
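For illustration, a minimal sketch of that config-file approach (the data.saved name comes from the suggestion above; everything else is an assumption):
#!/bin/bash
CONF="$(dirname "$0")/data.saved"
[ -f "$CONF" ] && . "$CONF"                               # load previously saved values, if any
if [ -z "$ENTERED_PATH" ]; then
    read -r -p "Enter path: " ENTERED_PATH
    printf 'ENTERED_PATH=%q\n' "$ENTERED_PATH" > "$CONF"  # persist the value for the next run
fi
echo "Using path: $ENTERED_PATH"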
In the script itself? Yes, with sed, but it's not advisable.
#!/bin/bash
test='0'
echo "test currently is: $test";
test=`expr $test + 1`
echo "changing test to: $test"
sed -i "s/test='[0-9]*'/test='$test'/" $0
Preferable method:
Try saving the value in a separate file; then you can easily do
myvar=`cat varfile.txt`
and whatever was in the file is now in your variable.
I would suggest using the /tmp/ dir to store the file in.
Another option would be to save the value as an extended attribute attached to the script file. This has many of the same problems as editing the script's contents (permissions issues, weird for multiple users, etc) plus a few of its own (not supported on all filesystems...), but IMHO it's not quite as ugly as rewriting the script itself (a config file really is a better option).
I don't use Linux, but I think the relevant commands would be something like this:
path="$(getfattr --only-values -n "user.saved_path" "${BASH_SOURCE[0]}")"
if [[ -z "$path" ]]; then
read -p "Enter path:" path
setfattr -n "user.saved_path" -v "$path" "${BASH_SOURCE[0]}"
fi
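If you want to check first whether your filesystem supports this, the getfattr/setfattr tools (from the attr package on most Linux distributions) can be tried by hand; a quick sketch, with a hypothetical target file:
setfattr -n user.saved_path -v /tmp/demo ./myscript.sh    # store a test value
getfattr --only-values -n user.saved_path ./myscript.sh   # should print /tmp/demo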
