I'm writing a bash script to automatically decrypt a file for editing and encrypt it back after the file is closed. The file type could be anything: plain text, an office document, etc. I am on Linux Mint with MATE.
I'm stuck: I can't reliably detect when the file has been closed in the application, so that the script can proceed to encrypt it back and remove the decrypted version.
The first version of the script simply used vim with text files. The script called vim directly and did not go any further until vim was closed. Now that I want to do the same with other file types, I have tried the following:
xdg-open: exits immediately after launching the application associated with the file type, so the script continues and does no good.
xdg-open's modified function for launching an associated app: runs the app inside the current script, so now I can see the program exit. This works only if the application is not already running; if it is, the newly started process finishes immediately and the script continues.
So what I am trying now is to somehow watch for the file being closed in an already-running application. I am currently experimenting with pluma/gedit and inotifywait, but that doesn't work either: immediately after the file is opened, it reports a CLOSE_NOWRITE,CLOSE event.
Is it possible to detect this at all without application-specific hooks? Perhaps some X hooks?
Thank you.
You could use lsof to determine if a file is opened by a process:
myFile="/home/myUser/myFile"
/usr/sbin/lsof "$myFile" | grep "$myFile"
You can poll in a 1-second loop, waiting until the lsof output is empty. I have used this to keep a script from opening a newly discovered file that was still being written or downloaded.
Note that not all processes hold a file open while they are using it. vim, for example, holds open a temporary swap file (/home/myUser/.myFile.swp) and may only open the real file when loading or saving.
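A minimal polling sketch along those lines (the path is just an example; lsof exits non-zero when it finds no process holding the file, which is what ends the loop):

```shell
#!/bin/sh
# Poll once a second until no process holds the file open.
myFile="/home/myUser/myFile"
while lsof -- "$myFile" >/dev/null 2>&1; do
    sleep 1
done
# Nothing has the file open any more; safe to continue.
```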
You might do something like this.
decrypt "TheFile"&
pluma "TheFile"
encrypt "TheFile"
The & at the end of the first line runs decrypt in the background, and the script falls through to pluma. The script then pauses until pluma closes, after which the encrypt line runs.
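To make the flow visible without the real tools, here is the same pattern with placeholder functions standing in for decrypt, encrypt, and the editor (all three are hypothetical stubs, not actual commands):

```shell
#!/bin/sh
# Placeholder stubs so the control flow can be seen and run anywhere.
decrypt() { echo "decrypting $1"; }
encrypt() { echo "encrypting $1"; }
pluma()   { echo "editing $1"; }   # stands in for the real editor

decrypt "TheFile" &   # backgrounded, as in the snippet above
wait                  # ensure decryption is done before editing starts
pluma "TheFile"       # the script blocks here until the editor exits
encrypt "TheFile"     # runs only after the editor closes
```

The key point is simply that a foreground command blocks the script, so anything after the editor line runs only once the editor process exits.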
I could offer more help if you post your script.
I am looking for a strategy suggestion.
I am very new to Linux shell scripting; I have been learning tcsh for no more than a month.
I need a script that automatically detects when the result files have finished copying back from a remote server to a folder on a remote machine, and then starts scp-ing the files to my workstation.
I do not know in advance when the job will finish running, so the folder may hold no result files for a long while. I also do not know when the last result file will have finished copying back from the remote server to the folder (and thus when the scp can start).
I tried crontab. It works fine when I guess the timing correctly; most of the time it is just disappointing.
So I tried to write a script myself, and I have one now. I intend to produce a script that serves my colleagues as well as me.
To use the script, the user first needs to log in to the remote machine manually and execute the script there. The script begins by asking the user for their local machine name and the directory where they wish to save the result files.
Then the script loops, testing whether the total number of files has changed. When it detects a change, which means the first result file has started copying back from the remote server, it loops again to detect when the total file size in the folder stops changing, which means the last result file has finished copying. After that it executes scp to send all the result files to the user's workstation, into the initially specified directory.
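The two loops just described could be sketched like this in plain sh (the original script is tcsh; the directory name, interval, and the commented-out scp destination are placeholders):

```shell
#!/bin/sh
# Sketch: wait for the first result file, then wait for the folder's
# total size to stop changing, then transfer.
dir="${1:-results}"   # directory the results land in (placeholder default)
interval=10           # seconds between polls

# Loop until at least one file appears in the folder.
until [ -n "$(ls -A "$dir" 2>/dev/null)" ]; do
    sleep "$interval"
done

# Loop until the total size is identical on two consecutive polls.
prev=-1
size=$(du -s "$dir" | cut -f1)
while [ "$size" != "$prev" ]; do
    prev=$size
    sleep "$interval"
    size=$(du -s "$dir" | cut -f1)
done

echo "result files settled in $dir"
# scp "$dir"/* "$user@$machine:$savedir"   # the final transfer step
```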
The script works fine, but I wish to make it run in the background and keep running even if the user logs out of the remote machine and closes the terminal. I also wish to let the user start it by typing a simple command in the terminal, something like
./script.tcsh
I tried to run the script by command
./script.tcsh &
but it fails, because a background process cannot accept user input.
I googled and found something called disown, but the command is not found; apparently neither the remote machine nor mine supports it.
I tried modifying the script to first accept the user input, then attempted to use
cat > temp_script.tcsh << EOF
{rest of my script}
EOF
and then a line of
./temp_script.tcsh &
to create a second script file and have the first script launch it in the background. That also fails, because cat does not treat $variable as literal text; it replaces it with its value. I have a foreach i (1 2) loop, and the cat command keeps reporting an error (missing value of variable i, which is just the counter in the foreach loop syntax).
I am out of ideas at the moment.
Can anyone suggest a strategy that I can try myself?
The goal is to use only one script file, prompting the user for two inputs (machine name and directory to save to); after that there should be no more interaction or waiting, and the script should keep running even after the terminal is closed.
Note: I do not need password to login to remote machine and back.
In a batch script, when redirecting output to a file like so:
set "output=C:\output.txt"
echo Blah blah blah. >> %output%
is it required that the file be closed after the redirected writing has completed (similar to the approach in other programming languages)?
I tried searching for related information online but could not find anything; I assume the fact that most scripts simply exit after finishing their commands may be the reason.
But if, say, a script runs in an endless loop that writes a different output file each time (e.g. by appending the time to the output file name), or constantly redirects new output to the same file, could the "not closing" of the file potentially lead to problems, memory-related or otherwise?
No, you don't have to close any file handles in batch scripts. You don't know the handle's value, so you could not close it even if you wanted to.
On Windows, all open kernel handles are closed when a process ends or crashes, but since a batch file is in most cases interpreted by an existing cmd.exe rather than a freshly started one, it cannot rely on that automatic cleanup; instead, cmd.exe closes the file handle itself after each redirected operation.
I want to automate some really simple FTP transfers with WinSCP. (An example script file is shown below; the real file would handle many files, but all simple stuff.)
open ftp://username:password@ftp.site.com/
option confirm off
cd remotedirectory
get file.csv
close
exit
A batch file containing:
winscp.com /script="staging get.txt"
opens a command prompt window and executes correctly on Windows 10, but on Windows 7 the command window opens and then immediately closes, and no files are transferred. WinSCP is on the PATH in both environments. I assume that a parameter or command is missing from one file or the other, but I don't know what it would be.
It turned out I was making a couple of small syntax errors. I couldn't see them because the command prompt window closed almost immediately, but the log file showed me what was happening and the errors were easy to fix. The lesson: always create a log file.
system("start /wait file.docx")
This starts the file but fails to wait if another docx file is already open. It works perfectly if no file is open.
What I am trying to do: I would like to open a file in Windows with its default editor and wait for the user to make and save changes to the file, hence the /wait.
Thanks for any tips.
The default behaviour of winword is to reuse an existing instance of the executable to open multiple documents. So the second open-file operation delegates its work to the existing instance and exits, and the start command returns.
One usual option is to use COM to open the file and test for the instance closing. But I know nothing of Ruby or whether it supports COM.
The best approach would be to locate the winword executable and call it directly, passing /w filename.docx as parameters to force opening the file in a new instance.
I have a shell script which contains a sed command that does the insertion into an existing file:
sed -i "/<test name=\"test-$NUMBER\">/i $NEW_TEST_SUITE" test.xml
After running this shell script, I opened test.xml in Notepad++, and there is indeed a new line inserted before:
<test name="test-XXXX">
However, when I tried to pretty-print the file (Ctrl+Alt+Shift+B) and save it, an alert popped up saying:
Please check if this file is opened in another program
So I was wondering: could that be caused by modifying the file without closing it? Do I need to close the file after using sed? If so, could you tell me what the command is? I've searched online but found nothing about this. (My platform is Windows 7.)
No. When sed exits, the file is closed.
This is probably a permissions issue. Verify that your Windows user has write access to the file.
If it's on a Windows partition, try running Notepad++ as administrator. If it's on a Linux shared fs, try chmod.
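On the Linux side, a quick way to check write access from the shell (the filename is taken from the question) is the test built-in's -w flag:

```shell
#!/bin/sh
# Report whether the current user may write the file.
file="test.xml"
if [ -w "$file" ]; then
    echo "$file is writable"
else
    echo "$file is not writable"
fi
```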
No, sed does not keep files open. Once the command has completed, all the files it opened are closed.
Try using Process Explorer to find which process has the file open. Use Ctrl+F to search for an open handle attached to the file you are having trouble with.