I have the following makefile, which is meant to run some commands on HTML files whenever any of the .css or .js files change.
The first .htm file "works" and its rule is run, but the others are reported as "up to date" (they are not).
CSS_Files = $(wildcard wwwroot/assets/css/*.css)
JS_Files = $(wildcard wwwroot/assets/js/*.js)
HTML_Files = $(wildcard wwwroot/t*.htm wwwroot/pw*.html)

all : $(HTML_Files)

$(HTML_Files) : $(CSS_Files) $(JS_Files)
	@echo Build '$(@F)' - '$(%F)'
	-@attrib -r $@
	@hashinclude $@ -inplace -hvonly
Here's the output:
e:\code\project>make
Build 'template.htm' - ''
/assets/js/appsvc.js?hv=8a52d1d0
wwwroot\template.htm created. Files Updated: 1. Total: 10
Build 'terminal_logon.htm' - ''
wwwroot/terminal_logon.htm is up to date.
Build 'pwlookup.html' - ''
wwwroot/pwlookup.html is up to date.
Build 'pwreset.html' - ''
wwwroot/pwreset.html is up to date.
In this case appsvc.js was updated and template.htm was "built". However, the other HTML files were reported as "up to date", even though they need to be rebuilt too.
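For reference, make treats a rule with several targets as if the rule were written once per target: the shared recipe runs separately for each target that is out of date, with $@ bound to that target. A minimal sketch with hypothetical file names:

page1.htm page2.htm : style.css app.js
	@echo Rebuilding $(@F) because these prerequisites are newer: $?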
I am trying to run the programme Unicycler on multiple sets of fastq.gz files. A set of three fastq.gz files is required for each assembly; each set shares a common ID and is located in its own sub-directory (whose name contains that same ID).
For example, the three files QC_141696.fastq.gz, QC_141696_1.fastq.gz and QC_141696_2.fastq.gz are required to run one assembly and are located in the subdirectory assem_141696. I have 10 more sets of 3 files organised in the same way; all 11 subdirectories are named assem_(ID) and are located in the parent directory Assemblies.
Sequencing/Assemblies/assem_(IDset1)/QC_(IDset1).fastq.gz
Sequencing/Assemblies/assem_(IDset1)/QC_(IDset1)_1.fastq.gz
Sequencing/Assemblies/assem_(IDset1)/QC_(IDset1)_2.fastq.gz
An example of the command I am trying to run (not within a loop) is:
unicycler --short1 QC_141696_1.fastq.gz --short2 QC_141696_2.fastq.gz --long QC_141696.fastq.gz --out QC_141696_hybrid --threads 16
I want to loop through each of the assem_(IDset*) subdirectories and run Unicycler using the three files located within it; the output directory should be placed in the relevant assem_(IDset*) subdirectory.
This is the code that I have so far:
for file in Assemblies/assem*/*_1.fastq.gz;
do base=$(basename ${file} _1.fastq.gz)
echo "running unicycler hybrid assembly on ${base}"
unicycler --short1 ${base}_1.fastq.gz --short2 ${base}_2.fastq.gz --long ${base}.fastq.gz --out ${base}_hybridassem --threads 16
echo "unicycler assembly on ${base} finished"
done
I am running the code from within the Sequencing directory
But I get:
Error: could not find home/user/scratch/Sequencing/QC_181651_1.fastq.gz
So it seems that my code is not looping through the intended directories. Annoyingly it works fine when testing it with echo.
Any help would be greatly appreciated!
Your code, running in the Sequencing directory, will need to build the paths to the input and output files for each of the assem_(IDset*) subdirectories. You can use bash parameter expansion, dir=${file%/*}, to extract the directory inside your loop. (Note also that the base variable has been renamed to id):
#!/bin/bash
for file in Assemblies/assem*/*_1.fastq.gz ; do
    id=$(basename "${file}" _1.fastq.gz)
    dir=${file%/*}
    echo "running unicycler hybrid assembly on ${id}"
    unicycler --short1 "${dir}/${id}_1.fastq.gz" --short2 "${dir}/${id}_2.fastq.gz" --long "${dir}/${id}.fastq.gz" --out "${dir}/${id}_hybridassem" --threads 16
    echo "unicycler assembly on ${id} finished"
done
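To see what the two expansions produce, here is a quick illustration using one of the example paths from the question (run from the Sequencing directory):

file=Assemblies/assem_141696/QC_141696_1.fastq.gz
echo "${file%/*}"                          # Assemblies/assem_141696  (the directory part)
echo "$(basename "${file}" _1.fastq.gz)"   # QC_141696                (the common ID)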
Let's say I have a makefile like this:
build:
	# output will be something like dist/foo-1.2.3-py3-none-any.whl
	poetry build

upload: build
	aws s3 cp ${WHEEL_NAME} s3://some-bucket/
The name of the wheel obviously changes with every version, so it somehow needs to be recorded at the end of build and read back in upload. What is the correct way of implementing this?
Here's one simple approach. Instead of dist/, use a directory whose sole purpose is to contain the latest wheel:
WHEELDIR := latest-wheel

build:
	@rm -fr $(WHEELDIR)
	@mkdir $(WHEELDIR)
	# output will be something like $(WHEELDIR)/foo-1.2.3-py3-none-any.whl
	# I don't know what "poetry build" does

upload: build
	aws s3 cp $(WHEELDIR)/* s3://some-bucket/
If you like, you can add a line or two to store a copy of the wheel in dist/, or some other archive.
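For instance, a rough sketch only, assuming (as the question shows) that poetry build keeps writing the wheel into dist/: the build recipe can copy the freshly built wheel into the dedicated directory, while dist/ itself serves as the archive of older wheels.

WHEELDIR := latest-wheel

build:
	@rm -fr $(WHEELDIR)
	@mkdir $(WHEELDIR)
	poetry build
	# dist/ may accumulate wheels from earlier versions, so copy only the
	# newest one into the directory that upload reads from
	cp "$$(ls -t dist/*.whl | head -n 1)" $(WHEELDIR)/

upload: build
	aws s3 cp $(WHEELDIR)/* s3://some-bucket/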
I am using the moderncv class to create a CV in Rmarkdown. In order to make the cv reproducible out of the box I have included the .cls and .sty files in the root directory. However, in an effort to keep the root directory uncluttered I would prefer to keep all the moderncv related files in a subdirectory (assets/tex/). I am able to access the .cls file using a relative path in the yaml front matter, but I am not able to access the .sty files unless they are in the root directory.
Searching previous questions on Stack Overflow I learned the following: (1) keeping .cls and .sty files in nested directories is not recommended. I understand this and would like to do it anyway, so that other people can fork my project and knit the CV without having to find their texmf folder. (2) The solution to my problem seems to involve setting the TEXINPUTS environment variable using a Makefile (see this thread and another thread).
I am not very good with Makefiles, but I have managed to get one working that will knit my .Rmd file to pdf without problems, so long as the .sty files are still in root. This is what it looks like currently:
PDF_FILE=my_cv.pdf

all : $(PDF_FILE)
	echo All files are now up to date

clean :
	rm -f $(PDF_FILE)

%.pdf : %.Rmd
	Rscript -e 'rmarkdown::render("$<")'
My understanding is that I can set the TEXINPUTS using:
export TEXINPUTS=".:./assets/tex:"
Where "assets/tex" represents the subdirectory where the .sty files are located. I do not know how to incorporate the above code into my makefile so that the .sty files are recognized in the subdirectories and my .Rmd is knit to PDF. In its current state, I get the following error if I remove the .sty files from root and put then in the aforementioned subdirectory:
! LaTeX Error: Command \fax already defined.
Or name \end... illegal, see p.192 of the manual.
which I assume is occurring because the moderncv class needs---and cannot locate---the relevant .sty files.
You could try to define the environment variable in the make rule. Note that each recipe line runs in its own shell, so the variable needs to be set on the same line as the command that uses it:
%.pdf : %.Rmd
	TEXINPUTS=".:./assets/tex:" Rscript -e 'rmarkdown::render("$<")'
Or you could set the environment variable in a set-up chunk in your Rmd file:
```{r setup, include = FALSE}
Sys.setenv(TEXINPUTS=".:./assets/tex:")
```
Note: Not tested due to lack of minimal example.
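Another option, equally untested, is GNU make's export directive, which makes the variable visible to every recipe in the makefile:

# exported make variables are passed into the environment of all recipes,
# including the Rscript call that renders the .Rmd
export TEXINPUTS := .:./assets/tex:

%.pdf : %.Rmd
	Rscript -e 'rmarkdown::render("$<")'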
I'm using jam in my project to automate building in Visual Studio.
I'm trying to move subdirectories and files from $folder to $folder1.
$folder contains a project
$folder1 is empty.
I use File to copy files.
I try to copy files like this:
File ($folder1) : ($folder) ;                  # works
File ($folder1)\\subdir : ($folder)\\subdir ;  # doesn't work
# etc...
But $folder1 is empty and does not contain a matching folder structure, so File ($folder1)\\subdir : ($folder)\\subdir does nothing.
Is there a way in Jam to create a folder if it does not already exist?
I solved it by using MkDir
https://swarm.workshop.perforce.com/view/guest/perforce_software/jam/src/Jambase.html
It's important that you also add the target as a dependency with Depends:
Depends rule : $(1) ;
MkDir $(1) ;
I have some large data files that need to be copied from source folders to build folders during our Qmake/QtCreator build. Since they are large, I only want the copy to happen for new/changed files. And I'd really like to avoid listing them all specifically in the project file. Here's what I've tried:
This attempt at copying data files fails because the DemoData folder itself is the target. Therefore the copy is performed only when the folder does not exist, not when files within it are added or changed.
DemoData.commands = $$COPY_CMD $${SRC_DATA_DIR}DemoData $${BLD_DATA_DIR}DemoData
DemoData.target += $${BLD_DATA_DIR}DemoData
PRE_TARGETDEPS += $${BLD_DATA_DIR}DemoData
QMAKE_EXTRA_TARGETS += DemoData
This approach fails because the DemoData.target item is not expected to have a list of multiple items. QMake puts the list in quotes in the generated makefile so it becomes one target.
DemoData.commands = $$COPY_CMD $${SRC_DATA_DIR}DemoData $${BLD_DATA_DIR}DemoData
DEMO_DATA_FILES = $$files($${SRC_DATA_DIR}DemoData/*)
for(FILE, DEMO_DATA_FILES){
DemoData.target += $${BLD_DATA_DIR}DemoData\\$$basename(FILE)
PRE_TARGETDEPS += $${BLD_DATA_DIR}DemoData\\$$basename(FILE)
}
QMAKE_EXTRA_TARGETS += DemoData
This attempt fails because (AFAICT) QMake does not support variable names contained in other variables. It seems to be more of a one level substitution. A makefile is generated, but the DemoDataX targets all have no command lines. All attempts to display the contents of the 'commands' field generate syntax errors.
DEMO_DATA_FILES = $$files($${SRC_DATA_DIR}DemoData/*)
DEMO_DATA_NAME = DemoData
for(FILE, DEMO_DATA_FILES){
$${DEMO_DATA_NAME}.target = $${FILE}
$${DEMO_DATA_NAME}.commands = $$COPY_CMD $${FILE} $${BLD_DATA_DIR}DemoData
PRE_TARGETDEPS += $${FILE}
QMAKE_EXTRA_TARGETS += $${DEMO_DATA_NAME}
DEMO_DATA_NAME = $${DEMO_DATA_NAME}X
}
This approach works, but with two shortcomings. The minor one is that a separate 'make install' step must be performed. The major one is that the files are always copied unconditionally. Since our data files are large, this is unacceptable timewise.
DemoData.path = $${BLD_DATA_DIR}DemoData
DemoData.files = $${SRC_DATA_DIR}DemoData/*
INSTALLS += DemoData
Is there a way to do this, or am I left with some sort of external script or manually generated/maintained makefile?
Use the QMAKE_EXTRA_COMPILERS feature.
# list your files in this variable.
# Masks are available with the $$files function, but
# if your set of files changes (files added or removed)
# you have to re-run qmake explicitly afterwards, not just make
MYFILES = $$files($${PWD}/files/*.*)
copy_files.name = copy large files
copy_files.input = MYFILES
# change datafiles to a directory you want to put the files to
copy_files.output = $${OUT_PWD}/datafiles/${QMAKE_FILE_BASE}${QMAKE_FILE_EXT}
copy_files.commands = ${COPY_FILE} ${QMAKE_FILE_IN} ${QMAKE_FILE_OUT}
copy_files.CONFIG += no_link target_predeps
QMAKE_EXTRA_COMPILERS += copy_files
Add your big files to the MYFILES variable. For each file, a rule is generated in the Makefile that copies the file to the specified directory (datafiles in the example). The original file is listed as a dependency in the rule (this is default qmake behaviour), so the copy occurs only when the original file is newer than the existing copy. The generated rules are listed as dependencies of the target file rule (copy_files.CONFIG += target_predeps), so the copying happens automatically on every build.
The only caveat: if your set of files is dynamic (files are added or removed), you can use masks as in my example, but you have to be careful to run qmake again after changing the set. Be aware that Qt Creator builds projects by launching make, not qmake. The simplest way to ensure that qmake is run again is to modify the .pro file.
For those who can read Russian, there is more info about QMAKE_EXTRA_COMPILERS here.
Do you need the script to be cross platform? I personally wouldn't use the copy command, but robocopy on Windows and rsync on Mac/Linux.
win32: $${DEMO_DATA_NAME}.commands = robocopy $${SRC_DIR} $${DST_DIR} $${FILE} /MIR /XO
!win32: $${DEMO_DATA_NAME}.commands = rsync -aru $${FILE} $${BLD_DATA_DIR}
I'm not really sure what you want to copy here, but you get the idea; you can adapt the files and/or directories.
Robocopy parameters are described here.
/MIR Mirrors a directory tree
/XO Excludes older files.
Rsync parameters are described here.
-a Archive
-r Recursive
-u Update only when the source is newer
As a side note, if you don't want to run the make install command, you can set this extra target as a dependency of the project that needs these files: theProjectNeedingThoseFiles.depends += DemoData.
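Putting the pieces together, a rough, untested sketch of how such an extra target could look in the .pro file (SRC_DATA_DIR, BLD_DATA_DIR and theProjectNeedingThoseFiles are placeholders taken from the question and the answers above):

# mirror the DemoData directory with a tool that only copies new/changed files;
# note that robocopy reports success with non-zero exit codes (e.g. 1 = files
# copied), which make may treat as an error, so that may need extra handling
win32:  DemoData.commands = robocopy $${SRC_DATA_DIR}DemoData $${BLD_DATA_DIR}DemoData /MIR /XO
!win32: DemoData.commands = rsync -aru $${SRC_DATA_DIR}DemoData/ $${BLD_DATA_DIR}DemoData
QMAKE_EXTRA_TARGETS += DemoData

# and, as described above, make the project that needs the files depend on it
theProjectNeedingThoseFiles.depends += DemoData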