Executing the following steps in bash [closed]

Here are the steps I do again and again, and I was wondering if I can write a script that does this.
I have two local accounts:
thrust
hduser
Now, I am writing Java code in Eclipse in my thrust account.
After the code runs satisfactorily, I do:
mvn clean
cp -r /home/thrust/projectA /tmp/
su - hduser
cp -r /tmp/projectA /home/hduser/
cd /home/hduser/projectA
mvn package
Is there a way I can automate all these steps?
Or is there a way I can write code in the thrust account and have it automatically sync to the hduser account?

Given that you are writing code (and you are doing it "again and again"), it seems you should be using a revision control system (like Subversion or Git), either with a local repository or with a hosting service (for example: GitHub or Bitbucket).
If you don't want to use an RCS, you can create a shell script to automate what you are already doing, as suggested by @iamnotmaynard.
Also, take a look at rsync or ssh. They can help you to copy files from one user to another more easily (rsync can also help you to keep these files synchronized).
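For example, here is a minimal sketch of such a script, assuming the paths from the question and using sudo instead of an interactive su - hduser (so it may prompt for a password):
#!/usr/bin/env bash
# build-and-sync.sh -- a sketch of the steps above, run from the thrust account.
set -euo pipefail

SRC=/home/thrust/projectA        # project in the thrust account
DEST=/home/hduser/projectA       # where hduser builds it

cd "$SRC"
mvn clean

# Sync the project into hduser's home (rsync keeps repeat runs fast
# and removes files deleted on the thrust side), then fix ownership.
sudo rsync -a --delete "$SRC/" "$DEST/"
sudo chown -R hduser:hduser "$DEST"

# Build the package as hduser.
sudo -u hduser bash -c "cd '$DEST' && mvn package"
Because rsync only transfers what changed, re-running the script after each edit stays cheap, and the /tmp staging step from the question is no longer needed.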

Related

bash iterate over files in folder, and update the loop when new files in folder [closed]

Hi, I have this bash script:
for mesh in meshFiles/*.unv; do
    scaleFolder=$(echo "$mesh" | cut -d'_' -f 1 | cut -d'/' -f 2)
    mkdir -p "$scaleFolder"
    caseFolder="$scaleFolder/$(basename "${mesh%.*}")"
    mkdir -p "$caseFolder"
    cp -r baseCase/* "$caseFolder"
    cp "$mesh" "$caseFolder/mesh.unv"
    echo "running $(basename "$mesh")"
    cd "$caseFolder"
    ./Allrun
    cd ../..
done
It works fine for all the files in the folder. The problem is that the process is slow, so I want to put more files in the meshFiles folder without having to break and restart the loop for every new file.
One way to solve this is to write a makefile. With this you won't be able to add files while the process is running, but you will be able to re-run it after it finishes and skip things that were already processed, i.e. not start from scratch. That is pretty simple to do, and might be enough for some cases.
If you really need to add files while it is running, you need a supervisor process that watches the folder contents and spawns a processing task for each new file. This also requires some task management (to know which files are new/in progress/done). That is not straightforward, so I recommend trying make first.
There are plenty of intros to make if you google a bit. The usual example is compiling C code when a .c source file changes, but make is generic and can run any command you tell it to (like running ./Allrun if any of the .unv files changes, or a new one appears).
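As a rough sketch (the <case>/done stamp file below is an invented convention to mark finished cases, not something Allrun produces, and the two-level scaleFolder/caseFolder layout is flattened to one directory per mesh for brevity; recipe lines must be indented with a tab):
MESHES := $(wildcard meshFiles/*.unv)
STAMPS := $(patsubst meshFiles/%.unv,%/done,$(MESHES))

all: $(STAMPS)

# One rule per mesh: set up the case directory, run it, leave a stamp.
%/done: meshFiles/%.unv
	mkdir -p $*
	cp -r baseCase/* $*/
	cp $< $*/mesh.unv
	cd $* && ./Allrun
	touch $@
Re-running make after dropping new .unv files into meshFiles only processes meshes whose done stamp is missing, and make -j4 would even run four cases in parallel.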

How can I use Filezilla and vsftpd to write to an AWS EC2 instance of Ubuntu 14.04? [closed]

I use FileZilla and vsftpd on another server and understand that I have to change vsftpd.conf and uncomment the line(s) that say:
# Uncomment this to allow local users to log in.
local_enable=YES
#
# Uncomment this to enable any form of FTP write command.
write_enable=YES
So, I have that done and have restarted vsftpd but still I am unable to move files to the server. Should I chmod the directory that I am putting things in? That directory is /var/www/html and current permissions are:
drwxr-xr-x 2 root root 4096 Jan 9 20:13 html
I don't know where else to look. It must be something simple.
If you want to be able to modify files in your web directories, try changing the ownership (instead of the mode) by doing this:
sudo chown -R $USER:$USER /var/www/html
The $USER variable will take the value of the user you are currently logged in as.
By doing this, your regular (non-root) user now owns the html directory you are trying to move files into.
It's probably a good idea to also adjust the permissions a bit, to ensure that read access is permitted on the general web directory and all of the files and folders it contains, so that pages can be served correctly:
sudo chmod -R 755 /var/www
Your web server should now have the permissions it needs to serve content, and your user should be able to create content within the necessary folders.
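Putting it together on the instance (assuming you log in over FTP as the default ubuntu user of a stock EC2 Ubuntu AMI; substitute your actual login user):
# Run these as the user FileZilla logs in with.
sudo chown -R "$USER:$USER" /var/www/html
sudo chmod -R 755 /var/www
ls -ld /var/www/html    # the owner column should now show your user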

mkdir -p cannot create directory `/DirName': Permission denied [closed]

I'm trying to create a directory in a shell script:
mkdir -p DirName
but I always get the same error:
cannot create directory `/DirName': Permission denied
If I run the same command directly from the shell instead of from the script, it works perfectly.
Any idea?
Thank you! :)
If you're going to use the -p option, it's safest to specify the full path:
mkdir -p /some/path/here/DirName
I suggest using the full path (in case your shell script later changes location).
If your shell script isn't going to change locations (you're not going to move it somewhere else later), I'd use:
mkdir ./DirName
These should all behave similarly to you creating the directory in the shell.
You are trying to create a directory in the root of the filesystem (/DirName) instead of in the current directory (DirName or ./DirName). You don't have permission to write to the root.
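Since the command shown is mkdir -p DirName, the leading slash in the error message suggests the real script builds the path from a variable. A common way this happens (an assumption, since the full script isn't shown):
# If BASE is unset or empty inside the script,
# "$BASE/DirName" expands to "/DirName" -- the filesystem root,
# which a normal user cannot write to.
mkdir -p "$BASE/DirName"

# Fail fast instead, with a clear message when BASE is missing:
mkdir -p "${BASE:?BASE is not set}/DirName"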

Copy shell script and preserve permissions [closed]

I have a small shell script that starts a program when I double-click it. (I have set the permissions to allow executing the script).
I want to be able to copy that script to another computer so that the new user can double-click it without needing to know anything about chmod or permissions. But I can't find out how to preserve the execute permission when I copy the file.
I can usually find answers with Google but this has me defeated - I guess I am not expressing my question properly.
Thanks
Use rsync or tar.
rsync -p file user@host:destdir
plus other options you might need.
Or
tar cvzf file.tar.gz file
then copy (or email, etc.) file.tar.gz to the other machine and extract the file:
tar xpvzf file.tar.gz
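scp also has a (lowercase) -p flag that preserves the file's mode bits, which covers the execute permission in one step:
scp -p file user@host:destdir/
The new user can then double-click the copied script without ever touching chmod.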

How to copy files across computers using SSH and MAC OS X Terminal [closed]

I'm trying to copy my .profile, .rvm and .ssh folders/files to a new computer and keep getting a "not a regular file" response. I know how to use the cp and ssh commands but I'm not sure how to use them in order to transfer files from one computer to another.
Any help would be great, thanks!
You can do this with the scp command, which uses the ssh protocol to copy files across machines. It extends the syntax of cp to allow references to other systems:
scp username1@hostname1:/path/to/file username2@hostname2:/path/to/other/file
Copy something from this machine to some other machine:
scp /path/to/local/file username@hostname:/path/to/remote/file
Copy something from another machine to this machine:
scp username@hostname:/path/to/remote/file /path/to/local/file
Copy with a port number specified:
scp -P 1234 username@hostname:/path/to/remote/file /path/to/local/file
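Note that .rvm and .ssh are directories, which is exactly what scp's "not a regular file" error is complaining about; add -r to copy them recursively (newhostname is a placeholder for your new machine):
scp -r ~/.rvm ~/.ssh username@newhostname:~/
scp ~/.profile username@newhostname:~/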
Alternatively, first zip or gzip the folders, using one of:
zip -r NameYouWantForZipFile.zip foldertozip/
or, for gzip compression:
tar -pvczf BackUpDirectory.tar.gz /path/to/directory
then use scp to transfer the archive, for example:
scp username@yourserver.com:~/serverpath/BackUpDirectory.tar.gz ~/Desktop
You may also want to look at rsync if you're doing a lot of files.
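For example, one rsync invocation (newhost is a placeholder) handles all three items, directories included, and can be re-run later to pick up changes:
rsync -avz ~/.profile ~/.rvm ~/.ssh username@newhost:~/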
If you're going to be making a lot of changes and want to keep your directories and files in sync, you may want to use a version control system like Subversion or Git. See http://xoa.petdance.com/How_to:_Keep_your_home_directory_in_Subversion
