We've been using TeamCity as a NuGet server with great success. Last week I migrated TeamCity to a new server machine. The database remained where it was on an external server. I copied the artifacts folder to the new server and I can see them all listed on the project configuration page. However, the old packages are not showing up in the NuGet package feed. What else do I need to do to get the new TeamCity server to list those old packages in the feed?
Other projects that depend on older versions of these packages are currently broken because those versions can't be found.
New builds of those NuGet packages are showing up in the feed, but I also need all the old ones.
Thanks Demis, you put me on the right path to resolving this.
To make the answer here a little more complete and formal:
Overwriting the following files:
provider-nuget.data
provider-nuget.index
in \system\caches\buildsMetadata
with the same files from the old build server worked for me (before creating new builds - not sure you'd be able to get a merge done after the fact).
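For anyone who wants to script the copy, here's a minimal sketch, assuming the default Windows data-directory path and an accessible admin share on the old server (both are assumptions; stop the TeamCity service before overwriting):

    import shutil
    from pathlib import Path

    # Assumed locations; adjust to your actual TeamCity data directories.
    OLD_DATA_DIR = Path(r"\\old-server\c$\ProgramData\JetBrains\TeamCity")
    NEW_DATA_DIR = Path(r"C:\ProgramData\JetBrains\TeamCity")

    CACHE = Path("system") / "caches" / "buildsMetadata"

    # Overwrite the NuGet feed metadata cache with the old server's copy.
    for name in ("provider-nuget.data", "provider-nuget.index"):
        src = OLD_DATA_DIR / CACHE / name
        dst = NEW_DATA_DIR / CACHE / name
        shutil.copy2(src, dst)  # TeamCity service must be stopped first
        print(f"copied {src} -> {dst}")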
I don't remember now exactly what I did to fix this, but it had something to do with the files inside this folder on the server running TeamCity: C:\ProgramData\JetBrains\TeamCity\system\caches\buildsMetadata
Did you move the old artifacts from older builds as well?
I'm pretty new to OctopusDeploy and am trying to set up a process to deploy our artifact to multiple Windows Servers.
Right now it is deploying the package to the default working directory of C:\Octopus\Applications\..., but I need it to be deployed to a different path.
I have defined a Custom Install Directory in the process editor, however this seems to be overlooked during the deployment, and the package just goes to the default directory.
I have tried substituting the path with a variable, but that didn't fix it. There are no errors or warnings in the deployment logs.
Can anyone help?
Sounds like you're taking the right steps to change your custom installation directory on your deployment.
One thing to check is that you've created a new release since updating your step configuration. Because releases in Octopus snapshot the deployment process, any updates you make won't show up in your deployments until you've created a new release.
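If you create releases from a script, here's a minimal sketch using the Octopus REST API; the server URL, API key, project slug, and version number are all placeholders, not values from the question:

    import requests

    # All of these values are assumptions for illustration.
    OCTOPUS_URL = "https://octopus.example.com"
    HEADERS = {"X-Octopus-ApiKey": "API-XXXXXXXXXXXXXXXX"}

    # Look up the project by its slug.
    project = requests.get(
        f"{OCTOPUS_URL}/api/projects/my-project", headers=HEADERS
    ).json()

    # Creating a new release snapshots the current deployment process,
    # so the updated Custom Install Directory gets picked up.
    resp = requests.post(
        f"{OCTOPUS_URL}/api/releases",
        headers=HEADERS,
        json={"ProjectId": project["Id"], "Version": "1.0.1"},
    )
    resp.raise_for_status()
    print("Created release", resp.json()["Version"])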
I have a TeamCity build server and wanted to migrate my solution piece by piece to PackageReference. Unfortunately, it seems that it doesn't find the references/packages for the projects I have already moved or recreated with PackageReference. The packages are restored correctly, as I can see them in the global .nuget\packages folder. I can also build locally without any issues.
Funnily enough, I have some other smaller projects which do work correctly with PackageReference on TeamCity.
TeamCity Version: 2019.2
NuGet: 5.4
I also tried adding a .NET CLI task with a restore command, but that didn't change anything either.
NuGet Installer step: (configuration screenshot)
dotnet restore step: (configuration screenshot)
In the end I created a new pipeline and reconfigured all the steps from scratch instead of copying an existing pipeline, and now it works. It seems there was some issue attached to those existing pipelines.
After an entire day of trying to fix this problem I finally found the cause.
In the parameters of the build configuration in TeamCity there was a parameter called "system.VisualStudioVersion" with its value set to "11.0". I changed the value to "16.0", and this fixed the problem for me.
This might explain the solution of NPadrutt, assuming he had the variable set in that particular build configuration. Recreating his build configuration would then result in a new build configuration without the bad parameter, fixing the problem for him. But in my situation the parameter was inherited from the root project, so recreating the build configuration wouldn't have fixed it for me.
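If you'd rather fix the parameter through the REST API than the UI, here's a minimal sketch; the server URL, credentials, and build configuration ID are all assumptions:

    import requests

    # Assumed values; substitute your own server, credentials, and build type ID.
    TEAMCITY_URL = "https://teamcity.example.com"
    BUILD_TYPE_ID = "MyProject_Build"

    # Setting the parameter on the build configuration overrides the value
    # inherited from the root project.
    resp = requests.put(
        f"{TEAMCITY_URL}/app/rest/buildTypes/id:{BUILD_TYPE_ID}"
        "/parameters/system.VisualStudioVersion",
        data="16.0",
        headers={"Content-Type": "text/plain"},
        auth=("admin", "password"),
    )
    resp.raise_for_status()
    print(resp.text)  # should echo the new value, 16.0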
I am using TC10.x, and one of my builds generates an artifact which is then loaded in one of my custom configuration tabs.
Now, after 10 runs, I wanted to change something in that file, so I edited all the artifacts that are created in the .buildserver/../../artifacts folder.
When I go to the build configuration and download the artifacts, I can see the contents have all changed, but when I click on the link in TeamCity, it still loads the old content. How can I work around this?
Do I need to bounce the TeamCity server instance or the agent?
Restarting the Apache web server resolved the issue; the new files were picked up.
I could not see any cache folder mentioned in the conf file, nor did I find a cache folder under Apache or C:\ProgramData, etc.
What is a good way to handle extensions/plugins during a Sonar upgrade? I am upgrading Sonar from 4.0 to 4.5.1 for the second time. The first time, I copied the old extensions/plugins folder into the new Sonar version. It so happened that there was a C# plugin, and during the database upgrade step we got the message "Impossible to upgrade Database". On removing this plugin, the database upgrade didn't happen and we were taken directly to the login page. As a result, projects were missing on the Sonar dashboard, though the LDAP users got imported. So I would like to know: which of the below is the right way?
1. Copy the old plugins folder from the Sonar 4.0 (old) folder to the Sonar 4.5.1 (new) folder.
2. Don't copy the old plugins folder; just download the new plugins which are required after the Sonar upgrade.
Don't do #2! It will screw up your rule profiles.
You started out correctly by copying the plugins folder. But you have to go a little farther.
You need to read the upgrade notes for each intervening version. They're all children of this general guide to upgrading. They should mention all plugin incompatibilities, and you'll have to deal with those manually. You may be able to do some of the upgrades via the update center in the old version before you shut it down. The rest you have to handle by deleting/replacing the old plugin jars.
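If you want to script the copy while skipping jars the upgrade notes flag as incompatible, here's a minimal sketch; the install paths and the incompatible-plugin name are assumptions, not taken from the question:

    import shutil
    from pathlib import Path

    # Assumed install locations; point these at your actual old and new installs.
    OLD_PLUGINS = Path("/opt/sonar-4.0/extensions/plugins")
    NEW_PLUGINS = Path("/opt/sonar-4.5.1/extensions/plugins")

    # Jars the upgrade notes flag as incompatible (example name, not authoritative).
    INCOMPATIBLE = {"sonar-csharp-plugin-2.1.jar"}

    for jar in OLD_PLUGINS.glob("*.jar"):
        if jar.name in INCOMPATIBLE:
            print(f"skipping incompatible plugin: {jar.name}")
            continue
        shutil.copy2(jar, NEW_PLUGINS / jar.name)
        print(f"copied {jar.name}")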
I am a complete noob at this, so if there is a completely obvious answer, by all means make fun, point and laugh, then give the answer.
We use Visual Studio 2010 to compile our published website. I have a repository that I use for my source code and one to which I publish the compiled code. I then check out the publish repository on the testing server, and once it tests good I check out the repository on my main server. This is fine and all, but I am using TortoiseSVN and automating the commit. Problem is, I really need to wipe the publish SVN repository, then copy the files, then commit. I just can't get that to happen and still have it recognized as an SVN repository. Suggestions?
First of all, don't put compiled code into your source repository. It's bad form.
Look at Jenkins as a build server. Jenkins can use the msbuild.exe command to build .NET projects using the .sln file your project creates.
When you do a commit in Subversion, Jenkins will automatically fire off the build. If you have NUnit tests, Jenkins will run those and give you the results. You can have Jenkins store the compiled files for you in its archive. If someone wants to install a particular build, they can directly download it from Jenkins without having to do a checkout in Subversion first.
Jenkins offers all of these advantages:
It shows you all the changes in your repository and what changed in each commit.
It can run all sorts of tests automatically for you.
You can mark builds that are released using the "Simple Promotion" plugin.
You can tag builds in Subversion directly in Jenkins without needing a command line or working directory.
It can alert the developers if a build fails due to bad code, or if testing fails. These alerts can be done via Email, instant messaging, phone text messages, Twitter, and many other ways. All it takes is the right plugin which Jenkins makes easy to install.
Jenkins can act as a release repository which makes it easy to find the release, what's in the release and why.
Jenkins integrates with web-based repository browsers such as ViewVC and Sventon. This way Jenkins not only shows you the files changed, but what changed in each file.
Jenkins is easy to use and install. Download it and give it a try.
Unless you have a hard-and-fast requirement which forces you to use two separate repositories, I'd suggest taking a look at SVN's tagging and branching functionality.
http://tortoisesvn.net/docs/release/TortoiseSVN_en/tsvn-dug-branchtag.html
Having a repository for the published code really doesn't buy you anything. IMO, you would be better off with a bunch of zip files (one per release) with the date and SVN branch reflected in the name. DO have a changelog .txt file in the zip, and also check that into the repo.
Problem is, I really need to wipe the publish SVN repository, then copy the files, then commit.
You don't need to wipe the repo. Just make a commit to the production repo with an export of HEAD from the dev repo (a post-commit hook can fill in the commit message).
And yes, tags are a more natural and bulletproof way.
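For illustration, here's a minimal sketch of the export-and-commit approach described above; the repo URL and working-copy path are assumptions, and files deleted in the dev repo would need extra handling:

    import subprocess

    # Assumed values for illustration.
    DEV_REPO = "https://svn.example.com/dev/trunk"
    PUBLISH_WC = r"C:\publish-wc"  # checked-out working copy of the publish repo

    # Export HEAD of the dev repo over the publish working copy;
    # --force lets export overwrite the existing files.
    subprocess.run(["svn", "export", "--force", DEV_REPO, PUBLISH_WC], check=True)

    # Schedule any new files for addition, then commit.
    subprocess.run(["svn", "add", "--force", PUBLISH_WC], check=True)
    subprocess.run(
        ["svn", "commit", "-m", "Publish build from dev HEAD", PUBLISH_WC],
        check=True,
    )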