Get list of files containing string(s) or pattern(s) - gradle

Is there a Gradle pattern for retrieving the list of files in a folder or set of folders that contain a given string, set of strings, or pattern?
My project produces RPMs and is using the Nebula RPM type (great package!). There are a couple of different sets of files that need post-processing. I am trying to generate the list of files that contain the strings that mark them for post-processing. For example, files that contain "#doc" need to be processed by the doc generator script. Files that contain "#HOSTNAME#" and "#HOSTFQDN#" need to be processed by sed to replace the strings with the actual host name or host FQDN.
The search root in the package will be src\main\resources. Using the result, the build script sets up the post-install script commands - something like:
postInstall('/opt/product/bin/postprocess.sh ' + join(filesContainingDocs, " "))
postInstall('/bin/sed -i -e "s/#HOSTNAME#/$(hostname -s)/" -e "s/#HOSTFQDN#/$(hostname)/" ' + join(filesContainingHostname, " "))
I can figure out the postinstall syntax. I'm having difficulty finding the filter for any of the regular Gradle 'things' (i.e., FileTree) that operate on contents of files rather than names of files. How would I populate filesContainingDocs and filesContainingHostname - something along the lines of:
filesContainingDocs = FileTree('src/main/resources', { contents.matches('#doc') })
filesContainingHostname = FileTree('src/main/resources', { contents.matches('#(HOSTNAME|HOSTFQDN)#') })
While the post-process script could simply do the grep, the several RPMs in our product overlay each other and each RPM should only post-process the files it provides, so a general grep over the final installed folder is not workable - it would catch files provided by other RPMs. It seems to me that I ought to be able to, at build time, produce the correct static list of files from the bigger set of source files that comprise the given RPM's project.
It doesn't have to be FileTree - running a command like findstr /s /m /c:"#doc" src\main\resources\*.conf (alas, the build platform is Windows) produces the answer on stdout, but I'm not sure how to get that output into an object Gradle can use. (I also suspect there is a 'more Gradle way' to do this.)
The set of files, and the contents of those files, is generally fairly small.

I'm having difficulty finding the filter for any of the regular Gradle 'things' (i.e., FileTree) that operate on contents of files rather than names of files.
You can apply any filter you can imagine to a Gradle file tree; in the end it is just Groovy (or Kotlin) code running in the JVM. Each Gradle FileTree is nothing more than a (lazily evaluated) collection of Java File objects. To filter those File objects, you can read their content, e.g. in the same way you would read them in Java. Groovy even provides a JDK enhancement for the Java class File that includes the simple method getText() for this purpose. Now you can easily filter for files that contain a certain string:
filesContainingDocs = fileTree('src/main/resources').filter { file ->
    file.text.contains('#doc')
}
Using Groovy, you can call getters like .getText() in the same way as accessing fields (.text in this case).
If a simple contains check is not enough, the Groovy JDK enhancements even provide the method matches(Pattern pattern) on CharSequence/String instances to perform a regular expression check:
filesContainingDocs = fileTree('src/main/resources').filter { file ->
    file.text.replace('\r\n', '\n').matches('(?s).*some regex.*')
}
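To connect this back to the postInstall calls in the question, a rough, untested sketch might look like the following. The /opt/product install prefix and the path relativization are assumptions about the RPM layout, and the question's join(...) is replaced by Groovy's join on the collected list:
// Sketch only: assumes files under src/main/resources are installed under
// /opt/product with the same relative paths - adjust to your RPM layout.
def resourceRoot = file('src/main/resources')
def filesContainingDocs = fileTree(resourceRoot).filter { it.text.contains('#doc') }

// Map each matching source file to its installed path, using forward slashes
// even when building on Windows.
def installedDocPaths = filesContainingDocs.collect { f ->
    '/opt/product/' + resourceRoot.toPath().relativize(f.toPath()).toString().replace('\\', '/')
}

postInstall('/opt/product/bin/postprocess.sh ' + installedDocPaths.join(' '))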

Related

Build translated Sphinx docs in separate directories

I work on documentation that will be published in several languages. That is one of the reasons I use Sphinx.
I know how to generate the translated versions, but with the setup described in the documentation, the resulting files replace the ones that were generated before. Thus, when generating multiple translations, I have to move the files to another directory before doing anything else. It would be more practical (and easier to deploy) to generate the translations in separate directories.
Is there a way to tell Sphinx or the makefile that when I run
make -e SPHINXOPTS="-D language='(lang)'" (format)
the files have to be generated in /build/(format)/(lang) ?
For now, only the HTML build is used (and I doubt that something else will be used) so a specific solution would be accepted if it is not possible to do it globally.
Sphinx version is 1.4.6.
I found a working solution by replacing the Makefile with a custom Python script (build.py).
Using sys.argv, I emulate the make target behaviour, and I added several options for the language. Using the subprocess module, specifically its call() function, I am able to run commands with a set of options. The script is based on a function that generates the command to be executed by subprocess.call():
def build_command(target, build_dir, lang=None):
    lang_opt = []
    if lang:
        lang_opt = ["-D", "language='" + lang + "'"]
        build_dir += "/" + lang
    else:
        build_dir += "/default"
    return ["sphinx-build", "-b", target, "-aE"] + lang_opt + ["source", "build/" + build_dir]
It is the lang parameter that allows me to separate each language, independently of the target. Later in the code, I just run
subprocess.call(build_command(target, target, lang))
to build the documentation in the desired language with the specified target (usually target = "html"). It can also emulate make gettext:
subprocess.call(build_command("gettext", "locale"))
And so on...
A better solution may exist, but at least this one will do the job.
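For reference, here is a minimal sketch of what the surrounding build.py driver could look like, with the sys.argv handling mentioned above (the argument order and defaults are assumptions, not the author's exact script):
#!/usr/bin/env python
# Minimal build.py sketch: first argument is the builder (html, gettext, ...),
# optional second argument is the language code. Argument handling is assumed.
import subprocess
import sys

def build_command(target, build_dir, lang=None):
    # Same helper as shown above.
    lang_opt = []
    if lang:
        lang_opt = ["-D", "language='" + lang + "'"]
        build_dir += "/" + lang
    else:
        build_dir += "/default"
    return ["sphinx-build", "-b", target, "-aE"] + lang_opt + ["source", "build/" + build_dir]

if __name__ == "__main__":
    target = sys.argv[1] if len(sys.argv) > 1 else "html"
    lang = sys.argv[2] if len(sys.argv) > 2 else None
    if target == "gettext":
        sys.exit(subprocess.call(build_command("gettext", "locale")))
    sys.exit(subprocess.call(build_command(target, target, lang)))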

How to set Sphinx's `exclude_patterns` from the command line?

I'm using Sphinx on Windows.
Most of my documentation is for regular users, but there are some sub-pages with content for administrators only.
So I want to build two versions of my documentation: a complete version, and a second version with the "admin" pages excluded.
I used the exclude_patterns in the build configuration for that.
So far, it works. Every file in every subfolder whose name contains "admin" is ignored when I put this into the conf.py file:
exclude_patterns = ['**/*admin*']
The problem is that I'd like to run the build once to get both versions.
What I'm trying to do right now is run make.bat twice and supply different parameters on each run.
According to the documentation, I can achieve this by setting the BUILDDIR and SPHINXOPTS variables.
So now I have a build.bat that looks like this:
path=%path%;c:\python27\scripts
rem BUILD ADMIN DOCS
set SPHINXOPTS=
set BUILDDIR=c:\build\admin
call make clean
call make html
rem BUILD USER DOCS
set SPHINXOPTS=-D exclude_patterns=['**/*admin*']
set BUILDDIR=c:\build\user
call make clean
call make html
pause
The build in the two different directories works when I delete the line set BUILDDIR=build from the sphinx-generated make.bat file.
However, the exclude pattern does not work.
The batch file listed above outputs this for the second build (the one with the exclude pattern):
Making output directory...
Running Sphinx v1.1.3
loading translations [de]... done
loading pickled environment... not yet created
Exception occurred:
File "C:\Python27\lib\site-packages\sphinx-1.1.3-py2.7.egg\sphinx\environment.
py", line 495, in find_files
['**/' + d for d in config.exclude_dirnames] +
TypeError: coercing to Unicode: need string or buffer, list found
The full traceback has been saved in c:\users\myusername\appdata\local\temp\sphinx-err-kmihxk.log, if you want to report the issue to the developers.
Please also report this if it was a user error, so that a better error message can be provided next time.
Either send bugs to the mailing list at <http://groups.google.com/group/sphinx-dev/>,
or report them in the tracker at <http://bitbucket.org/birkenfeld/sphinx/issues/>.
What am I doing wrong?
Is the syntax for exclude_patterns in the sphinx-build command line different than in the conf.py file?
Or is there a better way to build two different versions in one step?
My first thought was that this was a quoting issue, quoting being notoriously difficult to get right on the Windows command line. However, I wasn't able to come up with any combination of quoting that changed the behavior at all. (The problem is easy to replicate)
Of course it could still just be some quoting issue I'm not smart enough to figure out, but I suspect this is a Sphinx bug of some kind, and hope you will report it to the Sphinx developers.
In the meantime, here's an alternate solution:
Quoting from here:
There is a special object named tags available in the config file. It can be used to query and change the tags (see Including content based on tags). Use tags.has('tag') to query, tags.add('tag') and tags.remove('tag') to change.
This allows you to essentially pass flags into the conf.py file from the command line, and since the conf.py file is just Python, you can use if statements to set the value of exclude_patterns conditionally based on the tags you pass in.
For example, you could pass Sphinx options like:
set SPHINXOPTS=-t foradmins
to pass the "foradmins" tag, and then check for it in your conf.py like so:
exclude_patterns = ['**/*admin*']
if tags.has('foradmins'):
    exclude_patterns = []
That should allow you to do what you want. Good Luck!

Automatically generate conf file during make

I have a conf file that is of the format:
name=value
What I want to do is generate a result from a template, based on some values in another file.
So for example, say I have a file called PATHS that contains
CONF_DIR=/etc
BIN_DIR=/usr/sbin
LOG_DIR=/var/log
CACHE_DIR=/home/cache
This PATHS file gets included into a Makefile so that when I call make install, the paths are created and the built applications and conf files are copied appropriately.
Now I also have a conf file which I want to use as a template.
Say the template contains lines like
LogFile=$(LOG_DIR)/myapp.log
...
Then generate a destination conf that would have
LogFile=/var/log/myapp.log
...
etc
I think this can be done with a sed script, but I'm not very familiar with sed and regular expression syntax. I will accept a shell script version too.
You should definitely go with autoconf here, whose very job is to do this. You'll have to write a conf.in file, wherein all substitutions are marked with #'s, e.g.
prefix=#prefix#
bindir=#bindir#
and write up a configure.ac, which is a shell script that will perform these substitutions for you and create conf. conf is subsequently included in the Makefile. I'd even recommend using a Makefile.in file, i.e. including your snippet in the Makefile.
If you keep to the standard path names, your configure.ac is a four-liner and has the added advantage of being GNU compatible (easy to understand & use).
You may want to consider using m4 as a simple template language instead.

How can I set up an Xcode build rule with a variable output file list?

Build Rules are documented in the Xcode Build System Guide
They are well adapted to the common case where one input file is transformed into a fixed number (usually one) of output files.
The output files must be described in the "Output Files" area of the build rule definition; one line per output file. Typically the output files have the same name as the input file but have different extensions.
In my case, one single input file is transformed into a variable number of files with the same extensions. The number and the names of the output files depend on the content of the input file and are not known in advance.
The output files will have to be further processed later on (they are in this case C files to be compiled).
How can I set up a build rule for such a case?
Any suggestions welcome.
(I asked the same question on the Apple developer forum, but I figured it'd be a good idea to ask here too).
I dealt with this by, instead of generating multiple C files, just concatenating them all together into one file (e.g. "AUTOGENERATED.c"), and specifying that as the output file.
So long as your output files don't contain anything that will conflict (static functions with the same name, conflicting #defines etc.) this works well.
See this article on Cocoa With Love:
http://cocoawithlove.com/2010/02/custom-build-rules-generated-tables-and.html
This has an example of generating custom C code and using that as input to the normal build process. He's using ${} variable syntax in the output files setting.
The best way I found to add any number of files to my Xcode project (and do some processing on them) is to write a little PHP script. The script can simply copy files into the bundle. The tricky part is the integration with Xcode; it took me some time to find a clean way. (You can use whatever scripting language you like with this method.)
First, use "Add Run Script" instead of "Add Copy File".
Shell parameter:
/bin/sh
Command parameter:
${SRCROOT}/your_script.php -s ${SRCROOT} -o ${CONFIGURATION_BUILD_DIR}/${UNLOCALIZED_RESOURCES_FOLDER_PATH}
exit $?
(screenshot in xcode)
${SRCROOT} is your project directory.
${CONFIGURATION(...) is the bundle directory. Exactly what you need :)
This way, the script's return code can stop the Xcode build (use die(0) for success and die(1) for failure), and the script's output will be visible in Xcode's build log.
Your script will look something like this (don't forget to chmod +x it):
#!/usr/bin/php
<?php
error_reporting(E_ALL);
$options = getopt("s:o:");
$src_dir = $options["s"]."/";
$output_dir = $options["o"]."/";
// process_files (...)
die(0);
?>
BONUS: here is my 'add_file' function.
Note the special treatment for PNG files (it uses Apple's PNG compression tool).
Note the filemtime/touch usage to avoid copying files that have not changed.
define("COPY_PNG", "/Applications/Xcode.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/usr/bin/copypng -compress");
function add_file_to_bundle($output_dir, $filepath) {
    // split path
    $path_info = pathinfo($filepath);
    $output_filepath = $output_dir.$path_info['basename'];
    // get file's dates of input and output
    $input_date = filemtime($filepath);
    $output_date = @filemtime($output_filepath);
    if ($input_date === FALSE) { echo "can't get input file's modification date"; die(1); }
    // skip unchanged files
    if ($output_date === $input_date) {
        //message("skip ".$path_info['basename']);
        return 0;
    }
    // special copy for png with apple's png compression tool
    if (strcasecmp($path_info['extension'], "png") == 0) {
        //message($path_info['basename']." is a png");
        passthru(COPY_PNG." ".escapeshellarg($filepath)." ".escapeshellarg($output_filepath), $return_var);
        if ($return_var != 0) die($return_var);
    }
    // classic copy
    else {
        //message("copy ".$path_info['basename']);
        passthru("cp ".escapeshellarg($filepath)." ".escapeshellarg($output_filepath), $return_var);
        if ($return_var != 0) die($return_var);
    }
    // important: set output file date with input file date
    touch($output_filepath, $input_date, $input_date);
    return 1;
}

Path for tags in VIM for multiple projects

I've recently started using ctags on my projects. I currently have the following setup:
root/tags [contains all non-static tags]
root/foo/tags [contains static tags for the foo directory]
root/bar/tags [static]
root/something/else/tags [etc.]
...
I can set tags=./tags,tags,/path/to/root/tags and everything works perfectly.
However, my problem is that I work on several projects at once, so I have, for example, /path/to/root1, /path/to/root2, and /path/to/root3 all at once. I'd rather not manually set the tags each time I open a file; is there any way I can have 'tags' point to the /path/to/rootX based on the file I'm editing? (i.e., if I'm editing /path/to/root3/foo/x.c, use the tags in root3/tags?)
In my case, all of my projects share a common parent directory; what I really want is something like:
set tags=./tags,tags,substitute("%:p:h", "\(^\/path\/to\/.*/\).*$", "\1", "")
but I can't seem to get the right vimfu to make it work.
EDIT: I just realized that this won't work; I can't actually write to root*. Instead, I'd like to store my main ctags file in ~/ctags/root*/tags, where there's a 1:1 mapping between the subdirectories of ~/ctags/ and /path/to/ [For those who may be wondering, these are ClearCase UCM dynamic views; neither /view/XXX/ nor /view/XXX/vobs/ is writable]
If what you want is:
set tags=./tags,tags,substitute("%:p:h", "\(^\/path\/to\/.*/\).*$", "\1", "")
Try:
let &tags = './tags,tags,' . substitute(expand("%:p:h"), "\(^\/path\/to\/.*/\).*$", "\1", "")
There's no expansion in a :set command. Also, "%:p:h" won't be expanded automatically, so use expand(). See:
:help :let-option
:help expand()
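For the edited requirement above, where the tags files live under ~/ctags/<project>/tags instead of inside the read-only view, one untested sketch (the autocmd pattern and directory layout are assumptions) is to derive the project name from the file's path in an autocommand:
" Untested sketch: take the first path component after /path/to/ as the project
" name and point 'tags' at the matching ~/ctags/<project>/tags for this buffer.
autocmd BufRead,BufNewFile /path/to/*
      \ let &l:tags = './tags,tags,' . expand('~/ctags/') .
      \ substitute(expand('%:p'), '^/path/to/\([^/]*\)/.*$', '\1', '') . '/tags'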
