Compiling files in a directory into their own zip folder using a makefile script - bash

I am currently taking a class where we submit homework through an online tool. However, it requires submissions to be made up of individually zipped files in order to compile them properly. This is time consuming, since I sometimes make multiple submissions. I am trying to write a makefile rule that makes a zipped copy of every file in the folder, and then moves those zipped files into a subdirectory called zippedFiles that I create afterward. This is what I have so far. The for loop works when I run it directly in the terminal, but when I run make zip I get the following error: zip error: Nothing to do! (.zip). I am new to learning bash and makefiles and have been unable to research a solution on my own.
zip:
	rm -f ./*zip #remove any extra zip copies.
	rm -rf zippedFiles #delete old zippedFiles
	for i in *; do zip $i.zip $i; done #zip each file ***not working
	mkdir zippedFiles #remake new zippedFiles directory

In makefiles, you must double the $ to reference a shell variable inside a recipe, including the for-loop variable, so that make passes a literal $ through to the shell.
for i in *; do zip $$i.zip $$i; done
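Putting that together, a minimal sketch of the whole target might look like the following. The reordering (creating zippedFiles before the loop and zipping straight into it), skipping directories, and excluding the Makefile itself are my assumptions rather than part of the original; recipe lines must start with a tab.
zip:
	rm -f ./*.zip            # remove any leftover zip copies
	rm -rf zippedFiles       # delete the old zippedFiles directory
	mkdir zippedFiles        # recreate it before zipping into it
	for i in *; do \
		if [ -f "$$i" ] && [ "$$i" != Makefile ]; then \
			zip "zippedFiles/$$i.zip" "$$i"; \
		fi; \
	done
Because the loop lines end in backslashes, make hands the whole loop to a single shell invocation, and each doubled $$ reaches that shell as a single $.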

Related

zip created using -jr flags unzipped differently on macOS when double clicking vs running unzip

I am zipping the .xctest file from Plugins folder inside the .app target generated by building my app. I have a build phase script that runs last in my test target to copy this file over. I use the following script to do the zipping:
XCTEST_FILE=${TARGET_BUILD_DIR}/${TARGET_NAME}.xctest
XCTEST_ZIP=${TARGET_BUILD_DIR}/../../${TARGET_NAME}.xctest.zip
zip -jr ${XCTEST_ZIP} ${XCTEST_FILE}
This gives me a TestTarget.xctest.zip file, but it unzips differently depending on which of these two methods I use:
unzip TestTarget.xctest.zip
-TestTarget
-CodeResources
-Info.plist
Double clicking TestTarget.xctest.zip in finder
-TestTarget.xctest
--TestTarget
--CodeResources
--Info.plist
Why is unzip going to the innermost node and extracting all the files? I want the unzip command to give me the .xctest directory. I tried renaming the zip file to TestTarget.zip and it still behaves similarly.
I was initially zipping using zip -r ${XCTEST_ZIP} ${XCTEST_FILE}, but the problem with this was that it would retain the entire folder structure from the root (/) when I double-clicked to unzip the file. A post recommended using the -j flag instead of -r, but -j alone led to no zip file being generated. Another comment recommended -jr, which created a zip that produced the output I expected when double-clicking it. But I guess the unzip command does things differently.
Similar Question: MacOs zip file - different result when double click and running unzip command
The cause of the error there was very different:
The problem was in how the file was created. It was not a macOS issue but a known path-length issue on Windows.
Based on How to zip folder without full path, I had to update my script to first cd into TARGET_BUILD_DIR before generating the zip. I also had to remove the -j flag so the local folder structure was retained when running unzip.
cd ${TARGET_BUILD_DIR}
XCTEST_FILE=./${TARGET_NAME}.xctest
XCTEST_ZIP=../../${TARGET_NAME}.xctest.zip
zip -r ${XCTEST_ZIP} ${XCTEST_FILE}
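To confirm what either extraction method will produce, you can list the paths stored in the archive; if they all start with TestTarget.xctest/, both unzip and Finder should recreate the .xctest directory. This check is my addition, not part of the original answer.
# List the entries stored in the archive without extracting anything
unzip -l "${TARGET_BUILD_DIR}/../../${TARGET_NAME}.xctest.zip"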

make rebuild target depending on zip file

Why does make rebuild the target (I suppose) when the dependency is a binary file?
To reproduce:
create (and enter) a new empty directory
download the GameLift SDK (it is just an example: the Makefile in this question uses this file)
create a simple Makefile with the content below
run the make command several times
all: GameLift_12_22_2020/GameLift-SDK-Release-4.0.2/GameLift-Cpp-ServerSDK-3.4.1/CMakeLists.txt
GameLift_12_22_2020/GameLift-SDK-Release-4.0.2/GameLift-Cpp-ServerSDK-3.4.1/CMakeLists.txt: GameLift_12_22_2020.zip
	unzip -oq GameLift_12_22_2020.zip
I would have expected the unzip command to be executed only the first time I issue the make command, but it continues to be executed on subsequent make runs... why?
There are two possibilities; we cannot know which is the case from the information you've provided.
The first is that the file GameLift_12_22_2020/GameLift-SDK-Release-4.0.2/GameLift-Cpp-ServerSDK-3.4.1/CMakeLists.txt is not present in the zip file, so the second time make runs it looks to see if that file exists and it doesn't, so it re-runs the rule. If, in the same directory you run make, you use ls GameLift_12_22_2020/GameLift-SDK-Release-4.0.2/GameLift-Cpp-ServerSDK-3.4.1/CMakeLists.txt (after the unzip runs) and you get "file not found" or similar, this is your problem.
If that's not it, then the problem is that the timestamp of the file in the zip file is older than the zip file itself, and when unzip unpacks the file it sets the timestamp to this older time.
So when make goes to build it finds the CMakeLists.txt file but the modification time is older than the zip file, so make unpacks the zip file again to try to update it.
You can use ls -l to see the modification time on that file. If this is the case you should touch the file when you unpack it, so it's newer:
GameLift_12_22_2020/GameLift-SDK-Release-4.0.2/GameLift-Cpp-ServerSDK-3.4.1/CMakeLists.txt: GameLift_12_22_2020.zip
	unzip -oq GameLift_12_22_2020.zip
	touch $@
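An alternative sketch, not from the answer above, is to have the rule build a separate stamp file, so make never compares against a timestamp that unzip restores from the archive; the .unpacked name is hypothetical.
all: GameLift_12_22_2020/.unpacked

GameLift_12_22_2020/.unpacked: GameLift_12_22_2020.zip
	unzip -oq GameLift_12_22_2020.zip
	touch $@
Here touch creates the stamp with the current time, so unzip is only re-run when the zip file itself changes.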

Shell command for copying the contents of one folder to another

Due to the way the Plesk Extension on my web server works, I am trying to write a shell command that fires after a deployment. This simply needs to copy the contents of one folder to another.
Currently, I am using this:
cp -r /deployed-site/public/ /httpdocs/
However, this only seems to work if the destination folder is empty. Every time a deployment occurs, I want the contents of the first folder copied into the second.
I would say it's better to clean the destination folder before copying files:
rm -rf /httpdocs
cp -r /deployed-site/public/ /httpdocs/
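If you would rather not delete the destination directory itself, a hedged alternative sketch is to mirror the source into it with rsync, using the same paths as in the question:
# -a preserves permissions and timestamps; --delete removes files that no longer exist in the source
rsync -a --delete /deployed-site/public/ /httpdocs/
The trailing slash on the source tells rsync to copy the directory's contents rather than the directory itself.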

zip all files and folders recursively in bash

I am working on a project where compilation involves zipping up various files, folders, and subfolders (html/css/js) selectively. I am working on the Windows platform, and I could continue to just use CTRL+A and then SHIFT-click to unselect, but it does get a little tedious. I am working with cygwin, so I was wondering whether it is possible to zip selected files/folders recursively whilst excluding others, in one command. I already have the zip command installed, but I seem to be zipping up the current zip file and the .svn directory too.
I would like this to be incorporated into a shell script if possible, so the simpler the better.
After reading the man pages, I think the solution I was looking for is as follows:
needs to recurse directories (-r),
needs to exclude certain files/directories (-x)
It works in the current directory, but the . can be replaced with the path of any directory
zip -x directories_to_exclude -r codebase_latest.zip .
I have incorporated this into a short shell script that deletes files, tidy up some code, and then zips up all of the files as needed.
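For example, a sketch that zips everything under the current directory while skipping the Subversion metadata and the archive itself; the exact exclude patterns are my assumption based on the question:
# -r recurses into subdirectories; each -x pattern excludes matching paths from the archive
zip -r codebase_latest.zip . -x "*.svn*" -x "codebase_latest.zip"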
You should read the man page of the zip command:
-R
--recurse-patterns
Travel the directory structure recursively starting at the current directory; for example:
zip -R foo "*.c"
In this case, all the files matching *.c in the tree starting at the current directory are stored into a zip archive named foo.zip. Note that *.c will match file.c, a/file.c and a/b/file.c. More than one pattern can be listed as separate arguments. Note for PKZIP users: the equivalent command is
pkzip -rP foo *.c
Patterns are relative file paths as they appear in the archive, or will after zipping, and can have optional wildcards in them. For example, given the current directory is foo and under it are directories foo1 and foo2 and in foo1 is the file bar.c,
zip -R foo/*
will zip up foo, foo/foo1, foo/foo1/bar.c, and foo/foo2.
zip -R */bar.c
will zip up foo/foo1/bar.c. See the note for -r on escaping wildcards.
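Applied to the question, a sketch using -R with patterns for the file types mentioned; the patterns and archive name are assumptions based on the question:
# Store every .html, .css, and .js file in the tree, excluding anything under .svn
zip -R codebase_latest.zip "*.html" "*.css" "*.js" -x "*.svn*"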
You can also have a look HERE

Copy new files to another folder on windows automatically

I have a folder where some XML files keep getting updated, and I want those files copied to another folder, from which a piece of software collects the XML. I want to watch the first folder and automatically copy only the new files to the other folder as they arrive.
Is there any ready-made solution, or can it be done using AutoIt, a shell script, or VB.NET (last choice)?
Main concerns:
Know when a file is added.
Copy them to the folder.
Make sure it's not copied a second time.
If I schedule this code, it might also work
cd /home/webapps/project1/folder1
for f in *.csv
do
	cp -v $f /home/webapps/project1/folder2/${f%.csv}$(date +%m%d%y).csv
done
But it still doesn't have a way to avoid copying files that have already been copied.
rsync. Put it into cron to run every few minutes.
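A minimal sketch of that approach; the source and destination paths are placeholders, and the crontab entry runs every five minutes:
# crontab entry (edit with crontab -e): mirror new and changed files every 5 minutes
*/5 * * * * rsync -a /path/to/watched/folder/ /path/to/destination/folder/
Because rsync only transfers files that are new or have changed, nothing is copied a second time.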
