SonarQube Incremental mode gives different results on different machines with same code - sonarqube

I am running a new sonar server with one project and one quality profile.
I have a "runner" machine that continuously running full scans (after getting the latest code)
and the developers machines where they would like to run incremental analyses before checking-in.
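For reference, the incremental runs described here are assumed to look roughly like the following; the project key, source folder, and command are illustrative placeholders, not the poster's actual setup:

    # sonar-project.properties (illustrative values only)
    sonar.projectKey=my:project
    sonar.sources=src
    # incremental run, e.g.:
    # sonar-runner -Dsonar.analysis.mode=incremental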
When running an incremental analysis on the "runner" machine without making any changes, I get 0 new issues as expected (but I also get a lot of "resolved" issues - what is the deal with that? I expected 0 new and 0 resolved, since I changed NOTHING).
BUT when running an incremental analysis on a developer machine (after getting the latest code), I get a huge number of new issues, even though no changes were made to the code there either.
To make sure I was not making any mistakes, I used TFS to compare the two project directories (the folders the analysis uses, on the runner and on the developer's local machine) and confirmed that both are identical (except for the Sonar-generated files).
So, to sum it up:
What could cause such behavior?
Why would I get resolved issues if I did not make any changes to the code?
Could it have anything to do with machine clocks? (I am desperate...)
If you tell me there is no chance in hell that such a problem can occur, I will go back and check my own setup, but I am running such a simple configuration that I don't think I am missing anything.
Thank you for your help.

Related

Azure DevOps on-premises, workspace mapping really slow

We are using the on-premises version of Azure DevOps Server 2019 (currently Update 1) with self-hosted agents (the agents are on the latest version available from GitHub), in combination with TFVC (2019).
The DevOps server runs in one virtual machine and the TFVC server runs in a different virtual machine.
The communication between them is fast; I already tested this by copying large test data from one to the other over the network. There is no problem there.
On each and every run, at the very beginning, the workspace mapping from the previous run is deleted, a new workspace is created, and then a new mapping to every source path defined in the repository is established. This takes about 30-60 minutes on each and every pipeline run.
We don't have just one single path defined in the repository; there are a lot of mappings, so that the amount of code pulled from TFVC stays small and only covers the source code needed by the solution being built.
This can't be changed and has to stay as it is, and we can't simply move to GitHub. (Just saying, in case someone would like to advise moving to GitHub :))
Has anyone experienced the same behavior, where the repository path mapping in the first build step takes about 30-60 minutes whenever a build is executed?
Thanks in advance for any hints.
The solution in the end was to install everything from scratch on a new machine.
After that, the mappings run in a tenth of the time they took before.
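As a side note for anyone debugging a similar slowdown (this is not from the original thread): the workspaces a build agent has accumulated can be listed and removed with tf.exe. The computer name, workspace name, and collection URL below are placeholders:

    tf workspaces /computer:BUILDAGENT01 /owner:* /collection:https://tfs.example.com/tfs/DefaultCollection
    tf workspace /delete ws_1_1;BuildService /collection:https://tfs.example.com/tfs/DefaultCollection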

New Visual Studio installation, tests not running in Test Explorer

This question is very similar to other questions that in some cases literally have the text "tests not running in Test Explorer" in the title. But my context is a bit different. In those questions, there was a fair bit of investigation into what might be wrong with the tests. I am fairly confident nothing is wrong with the tests in this case.
I am one of hundreds of developers working on a project, and this project has a large bank of automated tests (though perhaps not as large as it ought to be :-P). Everybody runs the tests frequently, and triggers fire to run them automatically when pull requests are created and merged. Tests were working fine for me as well. But I have just been given a new laptop with better hardware specs, and I am trying to get it set up. On the new laptop, the project builds just fine (and noticeably faster :-) ), but the automated tests just don't run. I can't figure out why, and I'm looking for suggestions about what to check in this context -- given that there are hundreds of places where the exact same code is working perfectly, I really don't think the tests or test projects themselves are at fault here.
I have observed that the build output, apparently randomly, sometimes does not contain the test adapter files:
Microsoft.VisualStudio.TestPlatform.MSTest.TestAdapter.dll
Microsoft.VisualStudio.TestPlatform.MSTestAdapter.PlatformServices.dll
Microsoft.VisualStudio.TestPlatform.MSTestAdapter.PlatformServices.Interface.dll
xunit.runner.visualstudio.testadapter.dll
If these files are missing, then VSTest.Console.exe also cannot run the tests. But usually rebuilding the project results in the files magically appearing, and then VSTest.Console.exe works just fine.
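For context, once the adapter files are present the tests can be driven directly from the command line, roughly like this (the assembly name and paths are placeholders, not the actual project):

    vstest.console.exe bin\Debug\MyTests.dll /TestAdapterPath:bin\Debug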
I haven't been able to ascertain a reason why the adapter files are sometimes put into the build output and sometimes not. In either case, the Test Explorer within Visual Studio always fails to run the tests -- it discovers them just fine and puts several thousand items into the forest of trees, but when told to run tests, it just sits there for a minute or two and then returns to an idle state with no output at all in the "Tests" output window.
This is a brand new installation of Visual Studio Enterprise 2019 Preview, the exact same version that is on my old laptop, but on my old laptop the tests run fine. What do I do? I don't know what to check next. :-(
Well, I am thoroughly confused. I tried installing new features, I tried checking for system updates, I rebooted multiple times, and tests did not work. So, finally, I decided to make a cut-down minimal test project and see if I could observe any differences in Process Monitor between the two computers. I made a project with two tiny tests, one with NUnit and one with xUnit, and ... they worked. So, I opened up the big project again and hit Run Tests, and ... they worked. I am completely stumped, and the only advice I can offer to anyone finding this question with a similar problem is, just keep trying.

How do I resolve a merge conflict when both server and local versions are wrong?

I am working in the enterprise on Visual Studio 2013 and Team Foundation Server. We have a test source and a production source, and I Get Latest on them both regularly. As there are only a few developers, we make changes directly in test; we don't have personal branches off of test (though when our shop was bigger, we did).
So I opened a file in Test, made a simple change to it, saved it, checked it into test, and published it to production. When I went to merge Test with production, I spawned a merge conflict. I am looking at the "server" version and the "local" version, and neither one is correct. The "server" version is the file I edited, minus the edit I just made. That makes sense. But the "local" version is something I haven't seen before; it looks like perhaps an older version that I've never worked on. Maybe this was my local copy and Get Latest didn't update it. But the local file I edited was exactly what I expected it to be.
So, what could cause that? How do I troubleshoot this problem? All I want is to get the correct version in TFS so it doesn't get blown away later, but I have no idea how to proceed.
This may be because, when you did the Get Latest operation, TFS did not update the workspace correctly.
A clean way to recover: back up the file with your changes, then undo your pending changes. Delete the old workspace and create a new one. Get the latest version from the server for both the test and production sources.
Reapply your change to the specific file, check it into test, and finally do the merge from Test to production again.
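If you prefer the command line, the same steps look roughly like this; the workspace names, collection URL, and server paths are placeholders (and back up your edited file somewhere safe before the undo):

    rem undo pending changes in the current workspace
    tf undo * /recursive
    rem remove the stale workspace and create a clean one
    tf workspace /delete OldWorkspace /collection:https://tfs.example.com/tfs/DefaultCollection
    tf workspace /new CleanWorkspace /collection:https://tfs.example.com/tfs/DefaultCollection
    rem get the latest sources for both branches
    tf get $/MyProject/Test /recursive
    tf get $/MyProject/Production /recursive
    rem after reapplying and checking in the change, redo the merge
    tf merge $/MyProject/Test $/MyProject/Production /recursive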

Build always takes forever

When I queue my builds in VS 2013 using the "Hosted Build Controller" (Visual Studio Online), my build keeps saying:
Position In Queue = 1 (it waits in the queue for more than half an hour)
It does not start my builds despite the fact that I have no other builds queued or running.
Sometimes it gives me a message saying the connection to the build server was lost.
Not sure why this is happening, because earlier, when I initiated my builds using VS 2010, they used to start immediately.
Any help is greatly appreciated.
You're dealing with what can be a painful issue. Have you tried creating another agent? You can specify which agent to use when queuing the build. If other agents work and not this one, you may have encountered a bug that requires updates to your current version of TFS.
If new agents fail consistently, or fail only intermittently, then you're dealing with a bigger issue that can be due to performance, cross-geography issues, or too much latency loading your template. I believe one thing that can cause problems here is having too many agents. I would also try clearing out your build caches under C:\Users\[user]\AppData\Local\Temp (i.e. the BuildAgent/Controller folders). Also, definitely make sure your build software matches your TFS version (including the update); slight differences there can cause issues.

How to speed up the eclipse project 'refresh'

I have a fairly large PHP codebase (10k files) that I work with using Eclipse 3.4/PDT 2 on a Windows machine, while the files are hosted on a Debian file server. I connect via a mapped drive on Windows.
Despite having a 1 Gbit Ethernet connection, doing an Eclipse project refresh is quite slow - up to 5 minutes - and I am blocked from working while this happens.
This normally wouldn't be such a problem, since Eclipse theoretically shouldn't have to do a full refresh very often. However, I also use the Subclipse plugin, which triggers a full refresh each time it completes a switch/update.
My hunch is that the slowest part of the process is Eclipse checking the 10k files one by one for changes over Samba.
There is a large number of files in the codebase that I would never need to access from Eclipse, so I don't need it to check them at all. However, I can't figure out how to prevent it from doing so. I have tried marking them 'derived', which prevents them from being included in the build process etc., but it doesn't seem to speed up the refresh at all. It seems that Eclipse still checks their changed status.
I've also removed the unneeded folders from PDT's 'build path'. This does speed up the 'building workspace' process but again it doesn't speed up the actual refresh that precedes building (and which is what takes the most time).
Thanks all for your suggestions. Basically, JW was on the right track. Work locally.
To that end, I discovered a plugin called FileSync:
http://andrei.gmxhome.de/filesync/
This automatically copies the changed files to the network share. Works fantastically. I can now do a complete update/switch/refresh from within Eclipse in a couple of seconds.
Do you have to store the files on a share? Maybe you can set up some sort of automatic mirroring, so you work with the files locally, and they get automatically copied to the share. I'm in a similar situation, and I'd hate to give up the speed of editing files on my own machine.
Given the code is under Subversion, why not keep the files locally and use a post-commit hook to update the dev server to the latest version after every commit? (Or put a specific string in the commit log (e.g. '##DEPLOY##') when you want to update dev, and only run the update when the post-commit hook sees that string.)
Apart from the refresh speed-up, the advantage of this technique is that you can have broken files that you are working on in Eclipse while the dev server is still OK (albeit with an older version of the code).
The disadvantage is that you have to do a commit to push your saved files onto the dev server.
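A minimal sketch of such a hook, assuming a Unix dev server with an existing working copy at /var/www/dev (the path and the marker string are just examples):

    #!/bin/sh
    # Subversion post-commit hook: $1 = repository path, $2 = committed revision
    REPOS="$1"
    REV="$2"
    # only deploy when the commit message contains the marker string
    if svnlook log -r "$REV" "$REPOS" | grep -q '##DEPLOY##'; then
        svn update /var/www/dev
    fi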
I solved this problem by changing the "File Transfer Buffer Size" at:
Window -> Preferences -> Remote Systems -> Files
and raising the "File transfer buffer size" Download (KB) and Upload (KB) values. I set them to 1000 KB; the default is 40 KB.
Use the offline folders feature in Windows by right-clicking the share and selecting "Make available offline".
It can save a lot of time and round-trip delay in the file-sharing protocol.
Using svn externals with the revision flag for the non-changing parts might prevent Subclipse from refreshing those files on update. Then again, it might not. Since you'd have to make some changes to the structure of your Subversion repository to get it working, I would suggest doing some simple testing before doing it for real.
