Is it possible to perform a search over all the configuration parameters in a server?
The reason I'm asking is that I have a path pointing to a specific C# solution which I need to update, but I don't want to have to look through every build configuration.
Basically, under the hood all your TeamCity build configurations are just XML files in the BuildServer\config\projects\ folder and its subfolders. You can just do a grep over those XML files to find your configuration parameter.
I have done mass find-and-replace operations on those XML files before without problems (of course, you've got to make sure that what you are replacing is not used anywhere else in the XML, and always keep a backup).
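For example, something like the following (the data-directory path and the solution name are just placeholders, not from the original question) would list every project config file that mentions the old path:
grep -r "OldSolution.sln" --include "*.xml" /opt/teamcity/.BuildServer/config/projects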
On our on-prem TeamCity 2018.1.3 install, the buildSettings are stored in /data/teamcity-data/system/artifacts.
The following grep command will find all the things you are looking for:
grep -rPe "searchRegexHere" --include buildSettings.xml .
I am looking for a build tool that allows me to store the build tool, along with additional pre- and post-scripts as well as the build configs, in folders separate from the source code.
Most build tools I've tried work with config files directly in the source code folder.
Could you recommend something?
I know that it sounds like I am misusing the concept and should simply put config files in the source code folder, but the reasoning behind this would blow up this post without adding a lot of value.
SCons can do it in several different ways, though Repository() might be the simplest.
See: https://scons.org/doc/production/HTML/scons-user.html#idm46358268990080
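For what it's worth, a minimal sketch of that approach (the folder names are made up): keep the SConstruct and your pre/post scripts in their own folder and point SCons at the source tree via the repository option.
cd /work/build-config            # folder holding SConstruct and the pre/post scripts
scons -Y /work/product-source    # -Y / --repository: also look for sources in this tree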
I'm still trying to create my first Azure Pipelines CI/CD. My CI part is working fine, and my CD is also working, except I cannot apply my Web.config file transformations.
Let me first show you what I have, then I will ask several questions below. Here is the build with the generated artifact; I also manually copy my 3 config files.
When I open my WebAPI.zip file, here are the path and contents:
Here is my pipeline project
And the details of my staging phase:
When I run this full pipeline my config file is never transformed, but I get no error, just this warning:
2019-05-02T03:27:23.5778958Z ##[warning]Unable to apply transformation for the given package.
I also have the debug log with full details, but it doesn't tell me much for now. I will add it here later.
Questions
Azure Pipeline File Transformation is not working. Why?
Is it because the File Transform task only looks for config files inside the zip?
Is the system then just ignoring my transformation file in the root of the artifact?
So is my manual copy of the config transformation file obsolete?
How can I then add my transformation file into the zip?
In my csproj I already set all my transformation files to Build Action: Content, Copy Always, but this is ignored too. Is that normal?
EDIT 1
One more important question: is it possible to simply tell the deployment system to ignore, or not deploy, my config file? It is not something I want to deploy every time; I like the idea of doing it manually or from an alternative deployment system. With this solution I could run into other issues if I save a version or build variable in my config file. So, is it possible to modify an already deployed file after deployment? I'm looking for a workaround here. Example: I read a value in my existing config file, then increment that value by one or simply replace it with another.
EDIT 2
I'm now able to add the config file to the WebApi.zip package in the root and/or in the bin folder. I followed the comment of Shayki Abramczyk by using the XML transform option of the deploy task. It's still not working, and the error messages are so poor. Seriously Microsoft? Is your transformation system even working? I see questions similar to mine everywhere.
And now I get
The file is correct; the transform works fine from the Visual Studio Publish tool. I really think the XML transform tool from Microsoft in Azure is just not working.
EDIT 3
Is it possible that the issues with my transformations come from NLog, because of the file name and the special rule I apply to it?
There have been some questions about this, but none of them solves my problem.
I use SonarQube to do code analysis on one of my projects, which contains a Migrations directory. I would like to exclude all the source files in that directory from the code analysis.
In the project's Configuration->Settings->Exclusions->Files->Source Files Exclusions I added "**/Migrations/*.*", but in the analysis results I still get issues in code files in that directory.
The directory structure of my project looks like this: \MyProject\Migrations\SourceFile.cs
What am I doing wrong? Am I entering the wildcard in the wrong place, or is my wildcard wrong?
In the logs I can see
13:06:23.460 INFO - Copy-paste detection exclusions:
13:06:23.476 **/Migrations/*.*
but then I can also see
13:06:12.076 INFO - Inspecting <MyProject>\Migrations\SourceFile.cs
That's the correct place to set it up. Please try simply /Migrations/** or /MyProject/Migrations/**. When you go to one of the issues you want to get rid of, you'll see what your "regex" path should start with.
And one more tip: to see the result, you have to rebuild the project and run the Sonar analysis again. And again, until you get it right.
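If you run the analysis from the command line rather than the UI, the same exclusion can also be passed as an analysis property (a sketch, assuming the standard sonar.exclusions parameter):
sonar-runner -Dsonar.exclusions=**/Migrations/**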
I had to use a different setting.
Instead of Configuration->Settings->Exclusions->Files->Source Files Exclusions I had to use Configuration->Settings->Exclusions->Issues->Ignore Issues on Multiple Criteria.
In this setting, I had to set the RULE KEY PATTERN to *, and put the path wildcard in the FILE PATH PATTERN; **/Migrations/*.* works perfectly.
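For reference, the same criteria can also be passed as analysis properties instead of UI settings (a sketch; the criterion id "e1" is arbitrary):
sonar-runner -Dsonar.issue.ignore.multicriteria=e1 -Dsonar.issue.ignore.multicriteria.e1.ruleKey=* -Dsonar.issue.ignore.multicriteria.e1.resourceKey=**/Migrations/*.*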
We are in the process of trying to automate our build/deploy processes with the Release Management tool for Visual Studio (formerly InCycle).
The Release Management tool includes a facility to modify settings in a web.config (or app.config). However, there are situations where I'd like to be able to do more than this.
For example, we have URL rewriter rules to automatically redirect HTTP requests to HTTPS. But this won't work (at present) on our dev workstations. So, the "base" version of the web.config doesn't include the rewriter rules -- they are inserted at build/publish time via a web.config transform.
But the Release Management "configuration variable" mechanism won't let me specify more than a single line as a replacement value.
I realize I can remove line breaks, and condense an XML fragment to a single line of text. But I'd rather not have a web.config with lines that are several thousand characters long. And I suspect our IT folks -- who after all may also need to view/edit the file -- would feel rather more strongly about this than I do ;)
In general, the web.config transform mechanism had several modes: you could change a setting but also insert or replace (or delete) an entire section / XML element. While it's nice to no longer be restricted to web.config files (out of the box), the new functionality seems to be much more limited.
Am I missing something? Has anyone else found this to be an issue? What did you do to work around it?
You can still use XML transforms to achieve what you want. Make sure that your transforms are applied during your build; the resulting web.config file in your build output folder will then contain your URL rewriter rules. RM will pick it up from there and apply any other normal token replacement.
Here is a post that helps in this regard: http://incyclesoftware.zendesk.com/entries/21487316-InRelease-with-Web-Deploy
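One way to make sure the transform runs at build time (a sketch, not taken from the linked post; the project name and package path are placeholders) is to let MSBuild package the site, which applies the Web.<Configuration>.config transform as part of packaging:
msbuild MyWebApp.csproj /p:Configuration=Release /p:DeployOnBuild=true /p:WebPublishMethod=Package /p:PackageLocation=C:\Drops\MyWebApp.zip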
If you have multiple stages in your release path, and for example the first stage should not have your URL rewriter section, then it may be a bit harder. You will need to apply your transform as part of your deployment; multiple components/actions will be needed for that (an xcopy component, an XML transform action/component).
I can't find it now, but I know there is some command line tool you can invoke to achieve your xml transformation as part of your deployment.
Apologies for my lack of knowledge about rewriter rules, but can they exist in the base version of web.config and be set up so that they don't effectively do anything and 'rewrite' to HTTP?
If that's possible, then the way I would do this is to configure a Web.Release.config file that creates a tokenised web.config via the transformation process. However, rather than use Web One Click Publish, I use the /p:UseWPP_CopyWebApplication=true /p:PipelineDependsOnBuild=false arguments in the TFS build definition to apply the transformation. This results in a build in the drops folder that is completely unaware of any environment it will be deployed to. You then simply use an XCopy Deployer-based component in RM to deploy the website and replace all the tokenised values for that environment. See my blog post here for more details of the technique.
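For reference, a sketch of the full MSBuild invocation behind those arguments (the project name and drop path are placeholders):
msbuild MyWebApp.csproj /p:Configuration=Release /p:UseWPP_CopyWebApplication=true /p:PipelineDependsOnBuild=false /p:OutDir=C:\Drops\MyWebApp\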
I would like to harvest a folder with a lot of files by using heat.exe. But instead of harvesting all files, I would like to exclude specific file extensions like "*.txt" or something like that.
How can I do this?
I think the only option for now is to harvest the entire folder and apply a transform to the resulting .wxs file (see the -t:<xsl> switch) to exclude what is not required (txt files in your case). I haven't tried the 3.5 version of heat (I'm judging based on 3.0), but I don't think there are changes in this area.
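A sketch of what that could look like (the component group, directory id, and XSL file name are placeholders; the XSL itself would need to drop File elements whose Source ends in .txt):
heat dir SourceDir -cg MyComponentGroup -gg -sfrag -dr INSTALLLOCATION -t:ExcludeTxtFiles.xsl -out Harvested.wxs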
I'm not a huge proponent of this pattern. How do you ensure change control when using a non-deterministic process? How do you know a file that appeared in a directory really should ship in a product, and how do you know a file that vanished from the directory shouldn't break a build? How do you know whether you are breaking the component rules and creating serviceability issues?
I used to do dynamic file linking in the 1990's because it was "easy" but I can remember it biting me many times and I haven't done it ever since.
I know Bob Arnson used to agree with this viewpoint:
http://www.mail-archive.com/wix-users@lists.sourceforge.net/msg03420.html
But now in WiX 3.5 I'm starting to see capabilities that support dynamic linking, and I just don't understand why they would go that way. I'd much rather update a WXS file and check it back into source control than risk putting my deployment process on autopilot.
Instead of trying to figure out how to harvest selected files from of a folder, I use a before build action to populate a folder with just the files that I want harvested. The following workflow has been working for me:
Delete a "files" if it exists
Create a "files" folder
Copy the files to the "files" folder. I use the robocopy build action, which gives me enough control to specify which files to include or exclude (see the sketch after this list).
Harvest the entire folder.
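A sketch of the copy and harvest steps (folder names, the excluded extension, and the WiX ids are examples; note that robocopy returns non-zero exit codes even on success, so the build action must tolerate that):
robocopy ..\SourceFiles files /MIR /XF *.txt
heat dir files -cg HarvestedFiles -gg -sfrag -dr INSTALLLOCATION -out Harvested.wxs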
I have it set to run the harvest action conditionally, just for debug builds. Release builds are generated from our TFS server and use the generated .wxs from source control. It should be OK to run harvest on the build server, but it's an extra step and not having it run eliminates the "non-deterministic process" problem described by Christopher Painter. Other than that one step, the same steps execute on the build server as they do on my dev machine.