Due to the way the Plesk Extension on my web server works, I am trying to write a shell command that fires after a deployment. This simply needs to copy the contents of one folder to another.
Currently, I am using this:
cp -r /deployed-site/public/ /httpdocs/
However, this only seems to work if the destination folder is empty. How can I have the contents of the first folder copied into the second on every deployment?
I would say it's better to clean the destination folder before copying files:
rm -rf /httpdocs
cp -r /deployed-site/public/ /httpdocs/
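If removing /httpdocs itself is undesirable (Plesk may set particular ownership or permissions on it), a variant that replaces only the directory's contents should also work. This is a sketch assuming GNU cp, where the trailing /. makes cp copy the folder's contents, dotfiles included, into the existing destination:
# clear the visible contents (note: * does not match dotfiles directly under /httpdocs)
rm -rf /httpdocs/*
# copy the contents of public/, including dotfiles, into the existing directory
cp -r /deployed-site/public/. /httpdocs/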
I am zipping the .xctest file from the Plugins folder inside the .app target generated by building my app. I have a build phase script that runs last in my test target to copy this file over. I use the following script to do the zipping:
XCTEST_FILE="${TARGET_BUILD_DIR}/${TARGET_NAME}.xctest"
XCTEST_ZIP="${TARGET_BUILD_DIR}/../../${TARGET_NAME}.xctest.zip"
zip -jr "${XCTEST_ZIP}" "${XCTEST_FILE}"
This gives me a TestTarget.xctest.zip file. But it unzips differently depending on which of these two methods I use:
unzip TestTarget.xctest.zip
-TestTarget
-CodeResources
-Info.plist
Double clicking TestTarget.xctest.zip in finder
-TestTarget.xctest
--TestTarget
--CodeResources
--Info.plist
Why is unzip going to the innermost node and extracting all the files? I want the unzip command to give me the .xctest directory. I tried renaming the zip file to TestTarget.zip and it still behaves similarly.
I was initially zipping using zip -r ${XCTEST_ZIP} ${XCTEST_FILE}, but the problem with this was that it retained the entire folder structure from root (/) when I double-clicked to unzip the file. A post recommended using the -j flag instead of -r, but -j on its own led to no zip file being generated. Another comment recommended -jr, which created a zip that produced the output I expected when double-clicking it. But the unzip command evidently handles it differently.
Similar Question: MacOs zip file - different result when double click and running unzip command
The cause of the error there was very different: the problem was introduced when the file was created. It was not a macOS issue but a known path-length issue on Windows.
Based on How to zip folder without full path, I had to update my script to first cd into TARGET_BUILD_DIR before generating the zip. I also had to remove the -j flag so that the local folder structure was retained when running unzip.
cd "${TARGET_BUILD_DIR}"
XCTEST_FILE="./${TARGET_NAME}.xctest"
XCTEST_ZIP="../../${TARGET_NAME}.xctest.zip"
zip -r "${XCTEST_ZIP}" "${XCTEST_FILE}"
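If anything later in the same build phase depends on the working directory, the same thing can be done in a subshell so the cd does not leak out; a minimal sketch of the script above:
# the parentheses run cd and zip in a subshell, so the caller's
# working directory is untouched afterwards
( cd "${TARGET_BUILD_DIR}" && zip -r "../../${TARGET_NAME}.xctest.zip" "./${TARGET_NAME}.xctest" )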
I am currently taking a class where we submit homework through an online tool. It requires submissions to be made up of individually zipped files in order to compile them properly, which is time consuming when I make multiple submissions. I am trying to write a makefile target that makes a zipped copy of each file in the folder, then moves those zipped files into a subdirectory I create afterward called zippedFiles. This is what I have so far. The for-loop line works when I run it directly in the terminal, but when I run make zip I get the following error: zip error: Nothing to do! (.zip). I am new to bash and makefiles and have been unable to find a solution on my own.
zip:
	rm -f ./*.zip       # remove any extra zip copies
	rm -rf zippedFiles  # delete old zippedFiles
	for i in *; do zip $i.zip $i; done  # zip each file ***not working
	mkdir zippedFiles   # remake new zippedFiles directory
In makefiles, you must double the $ to reference a shell variable such as the for-loop variable, because make consumes single $ signs for its own variable expansion.
for i in *; do zip $$i.zip $$i; done
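Putting that together, a sketch of the whole target might look like this. The if [ -f ... ] guard is an addition, not part of the original recipe; it skips directories so zip only sees regular files, and the mv at the end is one way to get the zips into the subdirectory. Each recipe line must start with a tab:
zip:
	rm -f ./*.zip
	rm -rf zippedFiles
	for i in *; do if [ -f "$$i" ]; then zip "$$i.zip" "$$i"; fi; done
	mkdir zippedFiles
	mv ./*.zip zippedFiles/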
I have two directories:
/dev/
/www/
The www directory is a copy of the dev directory. When files are ready to go live, I copy them across from dev to www with a script that deletes everything inside the www directory and then copies the dev files into it. I'm losing the last-modified times, though, as the new copies are essentially new files.
How can I copy the last-modified date too?
It was only a particular subdirectory that I was concerned about, so I did it with a for loop in my shell script:
DIR_DEV="/dev"
DIR_LIVE="/www"
for i in "$DIR_DEV"/demos/*.html
do
    DEMO_FILENAME=$(basename "$i")
    touch -d "$(stat --format=%y "$DIR_DEV/demos/$DEMO_FILENAME")" "$DIR_LIVE/demos/$DEMO_FILENAME"
done
OOPSY: As I write this I've realised that the copy command has a --preserve option... Could've saved a few hours. :-/
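For what it's worth, with GNU cp that option lets the whole copy step carry the timestamps across in one go; a sketch using the variables from the script above:
# --preserve=timestamps keeps each file's modification time on the copies
cp -r --preserve=timestamps "$DIR_DEV/demos" "$DIR_LIVE/"
cp -a goes further still, also preserving mode, ownership, and links.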
I have a folder where some XML files keep getting updated, and I want those files copied to another folder, from which a piece of software collects the XML. I want to watch the first folder and automatically copy only new files to the second folder as they appear.
Is there any ready-made solution, or can it be done using AutoIt, a shell script, or VB.NET (last choice)?
Main concerns:
Know when a file is added.
Copy it to the other folder.
Make sure it's not copied a second time.
If I schedule this code, it might also work:
cd /home/webapps/project1/folder1
for f in *.csv
do
    cp -v "$f" "/home/webapps/project1/folder2/${f%.csv}$(date +%m%d%y).csv"
done
But again, it has no way of skipping files that have already been copied.
rsync. Put it into cron to run every few minutes.
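A minimal sketch using the paths from the question; rsync transfers only files that are new or changed on the destination, which takes care of not copying anything a second time. The five-minute interval is an arbitrary choice:
# crontab entry: every five minutes, mirror new and changed files across
*/5 * * * * rsync -a /home/webapps/project1/folder1/ /home/webapps/project1/folder2/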
I am using rsync to copy tarballs to an external hard drive on a Windows XP machine.
My files are tar.gz files (perms 600) in a directory (perms 711).
However, when I do a dry-run, only the folders are returned, the files are ignored.
I use rsync a lot, so I presume there is no issue with my installation.
I have tried changing the permissions of the files, but this makes no difference.
The owner of the files is root, which is also the user the script logs in as.
I am not using rsync's --cvs-exclude option.
The command I am using is:
rsync^
-azvr^
--stats^
--progress^
-e 'ssh -p 222' root@servername:/home/directory/ ./
Is there something I am missing to get my files copied over?
I can think of only one possibility: in my experience, rsync creates the directory structure before copying files in. It may be terminating prematurely, but only after this directory step has completed.
Update:
You mentioned that you were running a dry run. By default, rsync shows only a directory's name when the directory and all of its contents are not present on the receiver.
After a lot of experimentation, I'm only able to reproduce the behaviour you describe if the directories on the source have later modification dates than on the receiver. In this instance, the modification times are adjusted on the receiver.
I had this problem too, and it turns out that when backing up to a Windows drive from Linux, rsync doesn't seem to move its temporary files into place after they are transferred.
Try adding the --inplace flag when rsyncing to Windows drives.
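Applied to the command from the question (same host and path placeholders, with only --inplace added), that would look like:
rsync^
-azvr^
--stats^
--progress^
--inplace^
-e 'ssh -p 222' root@servername:/home/directory/ ./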