With the CoffeeScript coffee command, it is possible to compile a .coffee source file into its JavaScript equivalent with the -c flag:
coffee -c toto.coffee
This produces the corresponding toto.js file.
Now what I would like to do is to compile many .coffee files into their respective .js equivalents and then concatenate them into a final library file. Something like this:
coffee -c toto.coffee
coffee -c foo.coffee
coffee -c bar.coffee
??? toto.js foo.js bar.js # Would produce a final .js file
The coffee command makes it possible to do the following:
cat toto.coffee foo.coffee bar.coffee | coffee -c -s > library.js
But the problem is that if I modify one line in one .coffee file, the entire library has to be recompiled.
Neither the coffee command nor any of the CoffeeScript build tools solved my problem.
I also looked at this question, where gruntjs is mentioned, but I didn't find any clear explanation of how to use it in my specific case.
You could keep the CoffeeScript compiler watching for changes with the -w flag. That way, every time you change a file, only that file is recompiled right after you save.
The concatenation can also be triggered automatically whenever at least one file has changed, but as far as I know you have to write a small Node.js helper for that; I would use the chokidar library to do so.
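Alternatively, if make is available, a minimal Makefile sketch along these lines (file names taken from the question; GNU Make pattern-rule syntax) gives the incremental behaviour you describe, recompiling only the .coffee files that changed and then re-concatenating:
SRCS := toto.coffee foo.coffee bar.coffee
OBJS := $(SRCS:.coffee=.js)

# library.js is rebuilt from the per-file .js outputs whenever any of them changes.
library.js: $(OBJS)
	cat $(OBJS) > $@

# coffee -c writes foo.js next to foo.coffee, so only changed sources are recompiled.
%.js: %.coffee
	coffee -c $<

clean:
	rm -f $(OBJS) library.js
After editing foo.coffee, running make recompiles only foo.js and then rebuilds library.js from the already-compiled files.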
I'm working on a project that uses CMake. I'd like to add some custom rules to my makefile, but I can't quite get my head around how to do it.
Both the C source files and the header files are in the same directory. Also in this same directory are a number of .def files, which are the sources for some of the header files #included during compilation.
If I were doing this in a plain makefile, I'd use a simple suffix rule like
.SUFFIXES: .def
.def.h:
	$(PREPROC) $< > $@
How can I do this with CMake?
I've tried various permutations of the following, both with and without a CMake working-directory specification:
add_custom_command(
    OUTPUT vvr_const.h
    PRE_BUILD
    COMMAND preproc vvr_const.def > vvr_const.h
    DEPENDS vvr_const.def
)
add_custom_target(vvr_const.h DEPENDS vvr_const.def)
But the header file isn't generated by the time the C source file is compiled, so the compile fails. I've also tried a variation where I replace the last line above with
set_property(SOURCE main.c APPEND PROPERTY OBJECT_DEPENDS vvr_const.h)
In this case, the header file is correctly generated in advance, but make can't find it and complains that there is no rule to make the target .h file.
Ideally this would be a general rule, like the make rule above, but I'm not opposed to writing a separate rule for each of the .def files if that's what it takes.
Cheers.
There are two problems with the add_custom_command approach you present:
You did not specify a working directory; by default the command is run in the build directory, not in the source directory.
You rely on shell functionality (the redirect to a file). Even though this probably still works, you should prefer an approach that does not rely on the shell.
To solve both issues, I recommend creating a separate CMake script file that receives the absolute paths of the input and output files and uses them in the custom command. This allows you to use execute_process to specify the file to write to without relying on the platform's shell.
preprocess_def.cmake
# Preprocess a .def file.
# The parameters INPUT_FILE and OUTPUT_FILE denote the file to use as the
# source and the file to write the result to, respectively.

# Use the preproc tool to produce the data written to the output file.
execute_process(COMMAND preproc "${INPUT_FILE}"
                RESULT_VARIABLE _EXIT_CODE
                OUTPUT_FILE "${OUTPUT_FILE}")
if (_EXIT_CODE)
    message(FATAL_ERROR "An error occurred when preprocessing the file ${INPUT_FILE}")
endif()
CMakeLists.txt
set(_INPUT_FILE "${CMAKE_CURRENT_SOURCE_DIR}/vvr_const.def")
set(_OUTPUT_FILE "${CMAKE_CURRENT_SOURCE_DIR}/vvr_const.h")

# It is not necessary to use a build event here if we mark the output file as generated.
add_custom_command(OUTPUT "${_OUTPUT_FILE}"
                   COMMAND "${CMAKE_COMMAND}"
                           -D "OUTPUT_FILE=${_OUTPUT_FILE}"
                           -D "INPUT_FILE=${_INPUT_FILE}"
                           -P "${CMAKE_CURRENT_SOURCE_DIR}/preprocess_def.cmake"
                   DEPENDS "${_INPUT_FILE}")

add_executable(my_target vvr_const.h ...)
set_source_files_properties(vvr_const.h PROPERTIES GENERATED 1)
From the CMake documentation:
PRE_BUILD
On Visual Studio Generators, run before any other rules are executed within the target. On other generators, run just before PRE_LINK commands.
So possibly your command is just running too late.
According to the Clang documentation:
-I <directory>
Add the specified directory to the search path for include files.
I wonder if there is a way to add multiple search paths under the same directory with one -I option, something like this:
-I"Dir1/SubDir/SubDir/SubDir/{IncludePath1,IncludePath2,IncludePath3}"
My project folder tree is (unfortunately) laid out so that there are two main folders for include paths, each of which contains many paths to pass with the -I option. This makes the clang command very long; here is an example:
clang (...)
-I"Dir1/SubDir/SubDir/.../SubDir/IncludePath1"
-I"Dir1/SubDir/SubDir/.../SubDir/IncludePath2"
-I"Dir1/SubDir/SubDir/.../SubDir/IncludePath3"
-I"Dir1/SubDir/SubDir/.../SubDir/(And so on...)"
-I"Dir2/SubDir/SubDir/.../SubDir/IncludePath1"
-I"Dir2/SubDir/SubDir/.../SubDir/IncludePath2"
-I"Dir2/SubDir/SubDir/.../SubDir/IncludePath3"
-I"Dir2/SubDir/SubDir/.../SubDir/(And so on...)"
So again, I wonder whether there is a way to tell Clang to search multiple paths with one option, or perhaps to make it search within a specific directory.
Use options -isysroot and -iwithsysroot:
clang -isysroot"Dir1/SubDir/SubDir/SubDir/" -iwithsysroot"/IncludePath1/" \
-iwithsysroot"/IncludePath2/" -iwithsysroot"/IncludePath3/"
Unfortunately, this solution only works for one main folder and it also makes those include folders system ones, i.e., Clang won't show any warnings for them.
Also, -iwithsysroot is pretty long, so you may not save much typing there :)
But I'm not aware of any better way to do this directly via Clang options.
Although you could always write a shell script to ease the job...
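If your build is already driven by make, another way to ease the job is to let make collect the directories and prefix each one with -I (sketch only; the wildcard patterns below are placeholders for your real Dir1/Dir2 trees):
# Placeholder patterns -- adjust them to the real layout.
INCDIRS  := $(wildcard Dir1/SubDir/SubDir/SubDir/*/) \
            $(wildcard Dir2/SubDir/SubDir/SubDir/*/)
INCFLAGS := $(addprefix -I,$(INCDIRS))

%.o: %.c
	clang $(INCFLAGS) -c $< -o $@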
Note: while digging through the Clang command-line reference trying to find a better solution, I came across the -ivfsoverlay option, which seems like it might solve your problem.
I wasn't able to make it work, though, but I decided to mention it anyway in case it turns out to be useful for you.
Consider a makefile that generates some files from generating lines listed in another file: for example, file 001 is built using the first line, 002 using the second line, and so on.
This file can change (it has its own dependencies, but that doesn't matter here).
If some lines in this file change, the corresponding files should be remade, but the other files shouldn't be.
The solution I found is this: for every generated file there is a flag file whose content is the generating line it was built from last time. After remaking the file with the generating lines, I check all of these flag files and remove the ones whose line has changed. Files that depend on the removed flag files are then remade, and the other files aren't. But this works too slowly under MSYS make.
Can you suggest another solution that doesn't need so many extra filesystem calls and executable runs?
If I understand your description correctly, you have a Makefile that depends on another file which is functionally a Makefile but, for whatever reason, uses a different format.
Turn that file into Makefile format and include it in the original Makefile (assuming you're using GNU Make).
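A minimal sketch of that idea (the file names and the conversion recipe are hypothetical; replace them with whatever turns your line format into make syntax):
# lines.txt holds one generating line per output file (hypothetical name).
# GNU Make rebuilds an out-of-date included makefile and then restarts itself.
generated.mk: lines.txt
	./lines-to-make.sh $< > $@

include generated.mk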
We have what may be a similar problem. We have XML files, say foobar.xml, which contains the dependencies for foobar.out:
<files>
    <file>a</file>
    <file>b</file>
    <file>c</file>
</files>
We decided to adhere to this simple layout, so we don't need to parse XML. We use makepp (because we got fed up with gmake not noticing dependencies such as changed commands). This gives us the built-in &sed command (the expression is actually Perl code, but as you can see, you don't need to get into it much). Here's what we do, with three simple substitutions for the three kinds of lines:
%.d: %.xml
	&sed 's!<files>!$(stem).out: \\! || s!<file>(.+)</file>!$$1 \\! || s!</files>!!' \
	     $(input) -o $(output)

include foobar.d
This produces foobar.d, which the include line above pulls in; it looks like this:
foobar.out: \
a \
b \
c \
(Stack Overflow swallows the final empty line here; that trailing empty line is what keeps the backslash on the last entry harmless.)
I have a source code tree that contains about 300 Makefiles. These Makefiles compile sources into object files and link the object files into several firmware images.
What I want is to get a listing like this:
<firmware image name> : <list of object files> : <list of source files>
Is there any tool for that?
Well, if what you are shooting for is getting a list of images that will be built, you could try playing around with the -n and -W flags.
-n tells make to do a dry-run. It will print out all the commands it would have executed, but won't really execute them. If you do this after a make clean it might give you the information you want. Perhaps not in exactly the form you want, but that's what sed and awk are for.
-W file tells make to pretend that file was modified, and do its thing. This might be helpful if a make clean would be excessive.
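For example, something along these lines might get you started (the grep pattern is only a guess at what your firmware link lines look like, so adjust it to your toolchain):
make clean
make -n > build-commands.txt        # every command make would run, without running any of them
grep -E 'objcopy|\.bin' build-commands.txt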
Not by magic. Make doesn't know what is a source and what is an object; it just knows that they are all targets with various dependency relationships between them. You could try post-processing the output of make -p.
I'm building ubuntu-8.04 with gcc 3.4 and I need to generate the .i files, which are the output of the GCC preprocessor. I have tried adding the --save-temps flag, but this only generates the .i files for the top-level directory (i.e. source) and does not seem to get passed recursively to the child directories. I also tried the -E flag, which is supposed to output preprocessed files and stop compilation, but this did not generate the files either.
I'm specifically looking to generate the .i files for the source in net/core.
Any help is appreciated. Thanks!!
There is no support for bulk preprocessing.
For a single file, use "make net/core/foo.i".
For bulk preprocessing, a workaround is "make C=2 CHECK="cc -E"".
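If you only need the files under net/core, a small shell loop over that directory using the per-file target above may be enough (run it from the top of the kernel source tree; sketch only):
for f in net/core/*.c; do
    make "${f%.c}.i"
done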
I know this is an old post, but maybe it can be useful; for me this works:
gcc -E filename.c -o outputfile.i