Maven, How to Package ONLY Changed Files?

I have a project of PL/SQL files (stored procedures). I need to hand the DBA an archive containing only the files that have changed since the last release, so they can execute it for deployment to production. How can I create such an archive with Maven?
Thanks

In the case of database scripts, look at solutions like Liquibase or Solidbase. These frameworks can recognize which scripts have already been executed and where to start from.
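For example, a minimal Liquibase changelog might look like the following sketch (file names invented); Liquibase records every executed changeset in its DATABASECHANGELOG table and skips it on subsequent runs, so reruns apply only what is new:

```xml
<databaseChangeLog
    xmlns="http://www.liquibase.org/xml/ns/dbchangelog"
    xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
    xsi:schemaLocation="http://www.liquibase.org/xml/ns/dbchangelog
        http://www.liquibase.org/xml/ns/dbchangelog/dbchangelog-3.8.xsd">

  <!-- Each changeset is keyed by id + author + changelog path. Once run,
       it is recorded in DATABASECHANGELOG and skipped on later updates. -->
  <changeSet id="20240101-audit-pkg" author="dba">
    <sqlFile path="procs/audit_pkg.sql"
             relativeToChangelogFile="true"
             splitStatements="false"/>
  </changeSet>
</databaseChangeLog>
```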

Related

Execute only SQL delta (change) files in Jenkins CI/CD and deploy them to the production server

I am new to DevOps and am working on Oracle CI/CD using Jenkins. What I am looking for is: can I execute and deploy only recently changed SQL files using a Jenkins CI/CD pipeline?
I have found similar blog posts, but none that fit. My task is to execute only recently changed or newly added SQL files (assume there are no dependencies) using a Jenkins pipeline.
For executing SQL files I am using the Flyway free edition, with GitLab as SCM.
One idea I have:
Check out only the recently committed files into a Jenkins custom workspace and execute only that workspace.
Say, for example, in the current build I change 4 files in GitLab: only those files should be dumped into the Jenkins custom workspace, and my job should execute only those 4 files. If in the next build I change only 2 files, the old files should be removed and the 2 new files dumped and executed in that custom workspace.
Is this possible? I am not sure. If not this way, is there another approach that would work for my scenario?
I am trying to dump files to a workspace so that I can copy them from the custom workspace to Flyway, which will then execute them.
Current challenges are:
-To dump only the recently changed files into my custom workspace
-To execute only that custom workspace (not the main workspace)
I am not sure whether I have described this understandably 😅. If my idea seems too complex, please suggest any other suitable solutions.
Thanks in advance
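A note on the Flyway part: Flyway already records applied migrations in its schema history table and executes only pending ones, so pointing it at the full migration directory may make the custom workspace unnecessary. A minimal sketch of the Flyway Maven plugin configuration (connection details and paths are illustrative):

```xml
<plugin>
  <groupId>org.flywaydb</groupId>
  <artifactId>flyway-maven-plugin</artifactId>
  <version>9.22.3</version>
  <configuration>
    <url>jdbc:oracle:thin:@//dbhost:1521/ORCLPDB1</url>
    <user>deploy_user</user>
    <!-- Flyway scans this directory on each run and applies only the
         migrations not yet recorded in its schema history table -->
    <locations>
      <location>filesystem:sql/migrations</location>
    </locations>
  </configuration>
</plugin>
```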

Liquibase: tracking changelogs

The project
We are ~50 developers plus DevOps staff and run ~30 Oracle 12c EE instances. We introduced Liquibase in 2018.
We use the Liquibase Maven plugin version 3.8.8; changelogs are stored in numerous Maven projects, which are committed to Subversion in the usual trunk/tags/branches structure.
The Goal
We want to ease provisioning of new database instances with release versions matching the respective environments. A typical use case is setting up a fresh database in the integration test environment.
One would start with an empty database schema and apply changelogs up to a certain version.
Unfortunately, changelogs that were applied to a schema are often stored in different Maven projects. This makes them hard to locate.
Liquibase does NOT store the actual changeset contents (the concrete DDL) in the DATABASECHANGELOG table; if it did, that would solve the problem.
In search of a solution, I first used Maven to store the changelog's SVN revision in the DATABASECHANGELOG when liquibase:update was executed.
Retrieving changelogs based on revision number was error-prone.
I have spent a week on this now, googled for hours and built several test cases (with adapted parent and concrete POMs, partly using the Maven SCM plugin and such), but without luck. Initially, I planned to use liquibase:tag to store file path + revision, but this works only if all changesets are in one single changelog file, which is not the case.
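To illustrate the kind of wiring involved (a sketch, not my exact setup: the buildnumber-maven-plugin captures the SCM revision into ${buildNumber}, which the liquibase:tag goal then writes to the TAG column; plugin versions, paths and the tag scheme are illustrative):

```xml
<!-- capture the SVN revision of the working copy into ${buildNumber};
     requires an <scm> section in the POM -->
<plugin>
  <groupId>org.codehaus.mojo</groupId>
  <artifactId>buildnumber-maven-plugin</artifactId>
  <version>1.4</version>
  <executions>
    <execution>
      <phase>validate</phase>
      <goals><goal>create</goal></goals>
    </execution>
  </executions>
</plugin>
<!-- after liquibase:update, running liquibase:tag records this value
     in the TAG column of DATABASECHANGELOG -->
<plugin>
  <groupId>org.liquibase</groupId>
  <artifactId>liquibase-maven-plugin</artifactId>
  <version>3.8.8</version>
  <configuration>
    <changeLogFile>src/main/liquibase/changelog.xml</changeLogFile>
    <tag>${project.artifactId}@${buildNumber}</tag>
  </configuration>
</plugin>
```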
Of course, it would be desirable to have all changelogs stored in ONE location,
but this is not always possible. For example, scripts that require DBA privileges must be committed to separate Maven projects.
I need a strong reference between each changeset and the corresponding changelog file, or the changelog must be stored directly in the DATABASECHANGELOG.
With our current setup, "database versioning" with Liquibase is not possible. There is theoretical traceability, but it is up to the users to somehow locate the original changelogs in a huge mess of 100+ separate Maven projects.
Question 1: Is it possible to store the actual changelog content for each changeset into the DATABASECHANGELOG?
Question 2: If not, how can one preserve a reference between a DATABASECHANGELOG entry and the originating changelog file?
(Also, what happens when a changelog file is deleted from Subversion by accident? The DATABASECHANGELOG would just tell me the date and time of the change, some details and a file name - pretty useless, because the actual file would be gone and there would be no way to restore the actual DDL. To prevent such a scenario, I would back up all changelog files. For that, the DATABASECHANGELOG metadata is insufficient, as Liquibase does not track SVN revisions and file paths.)
One option would be to combine the various SVN repositories into a new one using SVN externals and then create a new changelog file.
You can map URLs (SVN tags/branches/revisions) to a folder without copying by using SVN externals: http://svnbook.red-bean.com/en/1.7/svn.advanced.externals.html
Hope it helps.
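Also worth noting for Question 2 (a partial idea, not a full answer): Liquibase's logicalFilePath attribute pins the FILENAME value recorded in DATABASECHANGELOG, so encoding the owning Maven project into it keeps a stable reference back to the originating changelog. A sketch, with coordinates and file names invented:

```xml
<databaseChangeLog
    xmlns="http://www.liquibase.org/xml/ns/dbchangelog"
    logicalFilePath="com.example:dba-scripts:changelog-1.4.xml">
  <!-- DATABASECHANGELOG.FILENAME records the logical path above,
       regardless of where the file is checked out when update runs -->
  <changeSet id="grant-dba-roles" author="dba">
    <sqlFile path="grants.sql"
             relativeToChangelogFile="true"
             splitStatements="false"/>
  </changeSet>
</databaseChangeLog>
```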

TeamCity Using OctoPack - Isn't Excluding Superfluous Files

I'm just looking at streamlining the NuGet packages coming out of my build system, and I'm stuck on how to package only the files that are required.
I have several configurations sharing a Root VCS checkout. I have a configuration that runs a debug build with unit tests. I also have a release configuration that does a release build, this configuration then also uses the TeamCity OctoPack plugin to create the nuget packages.
What I want to achieve is the building of nuget packages that don't contain the *.pdb and *.xml documentation files as these aren't required for the release deployment.
I've looked through this page on the OD site:
http://docs.octopusdeploy.com/display/OD/Using+OctoPack
According to this page, OctoPack should package up only the required files by default. I'm not entirely clear on what needs to be done to get around this problem, as it doesn't appear to be working as described.
It seems one solution would be to provide a nuspec file for the projects I'm looking to deploy, but I'm also wondering if there is something I'm missing before I head off down that route.
I also have some MEF plugins that are copied in post-build events, and these aren't included in the NuGet packages when in fact they are needed for the application to run. I think I need to get explicit with a nuspec file but would like to confirm this.
What is the simplest way of achieving what I need?
Assuming you're running a later version of OctoPack, in your release build you can set a system parameter system.DebugType = None, which will get passed to the OctoPack build scripts and prevent the PDBs from being created.
This simply overrides the setting defined in your csproj MSBuild file (assuming C#), so you can use it wherever you want to prevent PDBs from being created at the build-configuration level (not just for OctoPack). I generally prefer this approach as it prevents side effects in your build from developers' changes to the project file.
As for the xml files, I haven't actually tried this, but you can try a similar approach and create a system parameter system.DocumentationFile = "" to blank out the output.
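For reference, these system parameters override the standard MSBuild properties that live in the .csproj; a typical Release PropertyGroup looks like this (file name illustrative):

```xml
<PropertyGroup Condition=" '$(Configuration)|$(Platform)' == 'Release|AnyCPU' ">
  <!-- system.DebugType=None from TeamCity overrides this, so no PDBs -->
  <DebugType>pdbonly</DebugType>
  <!-- an empty system.DocumentationFile would suppress the XML docs -->
  <DocumentationFile>bin\Release\MyApp.xml</DocumentationFile>
</PropertyGroup>
```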
If you really want to make sure that the files have been removed, there are a couple of ways you can do this. Modify your deployment process to:
Execute your own custom PowerShell script that removes the files
Include a script module from the Octopus Library to do the same. Check out the File System - Clean Directory script module from the Octopus Library
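If you do go the explicit nuspec route for the MEF plugins, a minimal sketch might look like this (ids and paths invented for illustration):

```xml
<?xml version="1.0"?>
<package xmlns="http://schemas.microsoft.com/packaging/2010/07/nuspec.xsd">
  <metadata>
    <id>MyApp.Web</id>
    <version>$version$</version>
    <authors>BuildTeam</authors>
    <description>Deployment package for MyApp.Web</description>
  </metadata>
  <files>
    <!-- main output, minus debug symbols and XML documentation -->
    <file src="bin\**\*.*" target="bin" exclude="**\*.pdb;**\*.xml" />
    <!-- MEF plugins copied in by the post-build event -->
    <file src="bin\Plugins\*.dll" target="bin\Plugins" />
  </files>
</package>
```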

Build action for sql script files in database projects (*.sqlproj)

For a SQL script file in a database project, is there anything to be gained by using a Build Action of Build or Compile? I usually just use None, but wonder if Build or Compile does anything for a SQL script, e.g. perhaps some sort of additional syntax checking?
(We use our own custom deployment for scripts, so deployment options are not required.)
When performing a schema compare, only .sql files with their build action set to Build are included.
If you are building the project, creating a dacpac and then using sqlpackage.exe to deploy the dacpac to the server, then setting the build action to None on a script will ensure that the script is not included in the update to the server. This gives you a way to control which updates are carried out through this tooling.
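For reference, the build action is just the MSBuild item type in the .sqlproj file; a sketch with invented paths:

```xml
<ItemGroup>
  <!-- Build: validated, included in the dacpac model and schema compare -->
  <Build Include="dbo\Stored Procedures\usp_GetOrders.sql" />
  <!-- None: ignored by the compiler and excluded from the dacpac -->
  <None Include="Scripts\AdHoc\one_off_data_fix.sql" />
</ItemGroup>
```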
Nope, there is nothing to compile, just use None...

CC.NET: howto trigger an extended build when a subdirectory in svn has changed

We have a couple of projects configured in CC.NET. Each of these projects has the following items in its working directory (SVN):
source
lib
db scripts
SSIS package(s)
We would like to know if there is a way to find out whether there are any modifications in the subdirectory containing the SSIS packages. This would allow us to do a full build (including execution of the packages). We don't want to do this with every build, since package execution might take some time...
Our other option is to create a CC.NET project that does the complete build nightly.
Does anybody have a nice solution to this problem?
I would use the SvnVersion task from MSBuild Community Tasks to identify the latest revision on the SSIS package subdirectory. Then compare it to one you've stored in a file somewhere in your working directory or elsewhere.
If it's different, pass a property to your main MSBuild task with a flag instructing it to build and execute the SSIS packages. Once that's done, update your revision file with the new revision number for the SSIS subdir.
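A sketch of that comparison in MSBuild, assuming MSBuild Community Tasks is installed (the revision file and property names are made up for illustration):

```xml
<Project xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
  <Import Project="$(MSBuildExtensionsPath)\MSBuildCommunityTasks\MSBuild.Community.Tasks.Targets" />

  <Target Name="CheckSsisChanges">
    <!-- latest committed revision of the SSIS subdirectory -->
    <SvnVersion LocalPath="$(WorkingDirectory)\SSIS">
      <Output TaskParameter="Revision" PropertyName="SsisRevision" />
    </SvnVersion>

    <!-- revision stored after the last full build, if any -->
    <ReadLinesFromFile File="$(WorkingDirectory)\ssis.revision"
                       Condition="Exists('$(WorkingDirectory)\ssis.revision')">
      <Output TaskParameter="Lines" PropertyName="LastSsisRevision" />
    </ReadLinesFromFile>

    <!-- flag a full build only when the revision has moved -->
    <PropertyGroup>
      <BuildSsis Condition=" '$(SsisRevision)' != '$(LastSsisRevision)' ">true</BuildSsis>
    </PropertyGroup>

    <!-- remember the revision for the next comparison -->
    <WriteLinesToFile File="$(WorkingDirectory)\ssis.revision"
                      Lines="$(SsisRevision)"
                      Overwrite="true"
                      Condition=" '$(BuildSsis)' == 'true' " />
  </Target>
</Project>
```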
