Using Gradle to Rsync instead of copy

I have a project that involves repeatedly moving files from one directory to another during build and debugging. To help with that, I ended up writing a task that copies parts of the project from one location to another.
Is there a way to get Gradle to perform an rsync instead of a copy? Spending two minutes copying all of the necessary files when only a few of them have changed isn't exactly efficient.
Or is there something wrong with Gradle for it to be taking that long?

Gradle doesn't ship with rsync-like functionality, but you could call rsync using an Exec task. It's also worthwhile to check whether there is a third-party plugin.
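A minimal sketch of that Exec approach, assuming rsync is available on the PATH; the task name, paths, and flags below are placeholders, not something from the question:

    task rsyncFiles(type: Exec) {
        // -a preserves permissions/timestamps, --delete removes stale files,
        // so only changed files are actually transferred
        commandLine 'rsync', '-a', '--delete',
                "${projectDir}/build/output/", '/path/to/destination/'
    }

You could then wire it into the build with dependsOn or finalizedBy wherever the current copy task runs.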

Related

How does a gradle build know which modules have changed?

I want to write a bash script that grabs only the output jars for the modules in my project that have changed (after a build), so that I can copy them up to a server. I don't want to copy every single module jar every time, as you would after a full clean build. It's a Gradle project using Git. I know that Gradle can do an incremental build based only on the modules whose code has changed, but is there a way to call into that mechanism (assuming it's a plugin)? I have done some searching online but can't find any info.
Gradle has the notion of inputs and outputs associated with a task. Gradle takes snapshots of a task's inputs and outputs the first time it runs and on each subsequent execution. These snapshots contain hashes of the contents of each file. This enables Gradle to check, on subsequent executions, whether the inputs and/or outputs have changed and to decide whether the task needs to be executed again.
This feature is also available to custom gradle tasks (those that you write yourself) and is one way in which you could implement the behaviour you are looking for. You could invoke the corresponding task from a bash script, if needed. More details can be found here:
Gradle User Guide, Chapter 14.
Otherwise, I imagine your bash script might need to compare the modified timestamps of the files in question or to compute and compare hashes itself.
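For illustration, a rough sketch of the inputs/outputs idea; the task name, paths, and marker file are invented here:

    // Gradle snapshots the declared inputs and outputs, so this task is
    // reported UP-TO-DATE and skipped on runs where no jar has changed.
    task uploadJars {
        inputs.files fileTree(dir: 'build/libs', include: '*.jar')
        outputs.file "$buildDir/upload.marker"
        doLast {
            // the actual copy/scp of the jars would go here
            file("$buildDir/upload.marker").text = "uploaded ${new Date()}"
        }
    }

In a multi-module build you would register such a task per module, so that only the modules whose jars changed do any work.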
The venerable rsync exists to do exactly this kind of thing: find differences between an origin and a (possibly remote) destination, and synchronize them, with lots of options to choose how to detect the differences and how to transfer them.
Or you could use find to search for .jar files modified in the last N minutes ...
Or you could use inotifywait to detect filesystem changes as they happen...
I get that getting Gradle to tell you directly what has been built would be the most logical thing, but for that I'd say you have to think more Java/Groovy than Bash... and fight your way through the manual.
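For what it's worth, here is a rough sketch of that Groovy route; the buildFinished hook and the output file name are just one possible approach, not something taken from the answers above:

    // After the build, record the jars produced by jar tasks that actually did
    // work (i.e. were not UP-TO-DATE), so a bash script can read the list.
    gradle.buildFinished {
        def changedJars = []
        gradle.taskGraph.allTasks.findAll { it.name == 'jar' && it.state.didWork }.each {
            changedJars += it.outputs.files.files
        }
        new File(rootDir, 'changed-jars.txt').text = changedJars.join('\n')
    }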

Symlinking or hardlinking (but not copying) files with Maven

I'm switching an old web-application to using Maven rather than Ant. It goes mostly fine, but there is one thing I'm not sure about.
With a custom-written Ant build file I had a "development deployment" mode, where it would symlink certain files (JSPs and certain others) rather than copy them. This resulted in a very streamlined development workflow: once you have the deployment running, you just edit files in your source checkout directory, and the web server picks up the changes automatically. Basically, you edit something in your editor, save the file, and within a few seconds the change becomes visible in your browser, without any further steps.
How would I go about implementing something similar with Maven?
While this doesn't seem possible without writing a custom plugin, I found the war:inplace goal of the maven-war-plugin, which achieves what I want. The only downside is that I have to keep JSP files, JS files, images, etc. together in src/main/webapp rather than logically separated into e.g. src/main/jsp and src/main/js, but that's not that important.

Using Gradle To Move Source Controlled Files Across Network

A project I'm working on uses Gradle to build code bases. I'm quite new to Gradle. I want to modify the current build process such that, once the build concludes, it moves a set of files (configuration files in this case), which are under source control (SVN), to a folder on another server on the network.
I've looked through the Gradle user guide, and have come across the 'copy' and 'sync' tasks. Will either of these be sufficient to carry out what I've described? Thanks for your help.
You can define a remote repository and upload to it, as described in "How to upload artifact to network drive using gradle?".
If that doesn't suit your needs, you can look at the gradle-ssh-plugin, or just write a task that executes scp or something similar.
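If the target folder happens to be reachable as a mounted path, a plain Sync task may already be enough; this is only a sketch with placeholder paths (for a genuinely remote host you would fall back to the ssh plugin or an Exec task running scp):

    // Sync mirrors the source into the destination, deleting anything in the
    // destination that no longer exists in the source.
    task deployConfigs(type: Sync) {
        from 'src/main/config'
        into '/mnt/other-server/app/config'   // placeholder network path
    }
    // e.g. run it automatically after the build:
    // build.finalizedBy deployConfigs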

Dynamically adding gradle projects at runtime

I'm in the process of updating our build process for Android to use gradle. We have client-specific apps - i.e. a single code template which is used as the basis for all the apps which are created dynamically.
To build the apps, I loop through a CSV file to get the details for each one. I then take a copy of the source template, inserting the client's name, images, etc. before compiling the app. This works fine in the current system. In the gradle version, I've got it successfully looping through the rows and creating the app source for each one with the right details. However when I try to actually build the app, it fails with the message:
Project with path ':xxxxxx' could not be found in root project 'android-gradle'.
From reading the documentation, I understand that this is because the project doesn't exist during the configuration phase as it's not created until the execution phase. However what I haven't been able to find is a way around this. Has anyone managed to achieve something similar? Or perhaps a suggestion for a better approach?
One option is to script settings.gradle. As in any other Gradle script, you have the full power of Groovy available. Once the settings have been evaluated, though, you can no longer change which projects the build is composed of. Note that settings.gradle will be evaluated for each and every invocation of Gradle, so evaluation needs to be fast.
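A sketch of what that could look like in settings.gradle, assuming a hypothetical clients.csv whose first column is the client name and a generated/ directory holding each app's source:

    // One sub-project per CSV row; the file name, column layout, and directory
    // structure are assumptions for illustration only.
    def csv = new File(rootDir, 'clients.csv')
    csv.readLines().drop(1).each { line ->            // drop(1) skips a header row
        def clientName = line.split(',')[0].trim()
        if (clientName) {
            include clientName
            project(":${clientName}").projectDir = new File(rootDir, "generated/${clientName}")
        }
    }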
While Peter's answer pointed me in the right direction, it ended up not being a workable solution. Unfortunately, with nearly 200 apps to build, creating a copy of the source for each one was too big an overhead, and Gradle kept running out of memory.
What I have done instead is to make use of the Android plugin's product flavors functionality. It was quite straightforward to add a productFlavor dynamically for each row in the CSV (and I can do that in build.gradle rather than settings.gradle); I just set the srcDir to point to the relevant images etc. for each one.
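A sketch of that flavor-per-client idea in build.gradle; the CSV name, its column layout, and the clients/ directory structure are made up for illustration:

    def clientRows = file('clients.csv').readLines().drop(1)   // drop(1) skips a header row

    android {
        productFlavors {
            clientRows.each { line ->
                def clientName = line.split(',')[0].trim()
                "$clientName" {
                    applicationId "com.example.${clientName}"   // hypothetical package name
                }
            }
        }
        sourceSets {
            clientRows.each { line ->
                def clientName = line.split(',')[0].trim()
                "$clientName" {
                    // point this flavor's resources at the client-specific images
                    res.srcDirs = ["clients/${clientName}/res"]
                }
            }
        }
    }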

How to add some prebuild steps to jenkins?

I am a Jenkins newbie and need a little hand-holding, because we only maintain parts of our app in SVN. I have a basic Jenkins install set up.
This is what I do to set up a local DEV environment, and I need the same steps translated into Jenkins in order to make a build:
1. Do an SVN checkout (and get the 2 folders that are under SVN)
2. Delete the folders
3. Copy over the full app from the FTP location
4. Do an SVN restore
5. Download the SQL file
6. Import it into MySQL
How would I set up the above-mentioned steps in Jenkins? I know there are some post-build steps that I can use; I'm just not sure how to put it all together. Any help will be much appreciated.
Tell Jenkins about the SVN repository and it will check it out automatically when a new build is started; that takes care of step 1. Steps 2-6 would be build steps (i.e. execute-shell commands). Basically, you can set up Jenkins to do exactly what you do on the command line, except that the first step happens automatically once you tell Jenkins about the repository.
Rather than trying to do these sorts of things in Jenkins, you'll likely save yourself some trouble if you make use of something like Ant or NAnt to handle the complexities of your build.
I've found that doing my builds this way gives me added flexibility (i.e. if it can be done via the command line, I can use it in my build, rather than needing a Jenkins plugin to support it) and makes maintenance easier as well (since my NAnt scripts become part of the project and are checked into the VCS, I can go back if I make a change that doesn't work out).
Jenkins has some build-history plugins, but over time I've found it easier to keep the majority of my 'build' logic and complexity outside of the CI environment and just call into it instead.

Resources