I used TeamCity to set up CI; the VCS is ClearCase. I tested the connection while configuring the Version Control Settings in TeamCity, and it reported "Connection successful".
But when I run the build something is wrong: the build status shows "Checking for changes" for a long time, even though the source code is only 40 MB.
My ClearCase view's config spec is as follows:
element * CHECKOUT
element * /main/LATEST
load \Tranning
Has anyone run into the same issue?
Is this configuration correct?
The page on "ClearCase support" mentions:
When using the ClearCase integration, it is helpful to open the Version Control tool window. Its Console tab displays the following data:
All the commands generated based on the settings you specify through the IntelliJ IDEA user interface.
Information messages concerning the results of executing generated ClearCase commands.
Error messages.
So opening that console would go a long way toward debugging whatever error the ClearCase integration currently has with your view.
I assume you have the full ClearCase client installed on your agent (not CCRC, the ClearCase Remote Client).
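A quick way to confirm that on the agent machine is to run cleartool from a console; the view tag below is just a placeholder for whatever view the build uses:
cleartool -version
cleartool lsview your_view_tag
cleartool catcs -tag your_view_tag
If cleartool is not on the PATH, or lsview cannot find the view, the TeamCity agent will not be able to collect changes either.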
I have a brand-new TFS 2018 test installation and am trying to run a Maven build with SonarQube analysis.
The SonarQube extension is installed from the marketplace and configured to use our internal SonarQube. I added the Prepare and Publish SonarQube steps to my build as described.
In the Prepare step I can successfully select my SonarQube endpoint from the drop-down box.
When I then tick the "Use SonarQube" check box in my Maven task, the SonarQube endpoint drop-down box is empty. My SonarQube server "Heuboe" does not show up, and even if I type it in, the box stays invalid.
Any hint as to what's going wrong? Does anybody know how to file an issue with SonarQube directly? I can view issues under https://jira.sonarsource.com/browse/VSTS/?selectedTab=com.atlassian.jira.jira-projects-plugin:summary-panel
but I didn't find a way to report one.
The Maven task is expecting a Generic Endpoint. The SonarQube extension adds another type of endpoint specifically for SonarQube, which the built-in Maven task has no awareness of.
Version 2.* of the task (which should be selectable from the dropdown menu for the task) has another method of running SonarQube, which will use the endpoint defined in the Prepare Analysis step.
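As a side note, while you sort out the endpoint you can also invoke the scanner straight from Maven and pass the server settings on the command line; the URL and token below are placeholders:
mvn sonar:sonar -Dsonar.host.url=http://your-sonarqube-host:9000 -Dsonar.login=your_token
This bypasses the endpoint drop-down entirely, at the cost of losing the Prepare/Publish integration in the build definition.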
I configured a project in SonarQube (6.1) to run from Jenkins and configured it to use ClearCase as the SCM.
sonar.scm.provider=clearcase
Our ClearCase is configured to use Local and Tst streams.
Now, when I check out and check in changes to the Local stream for the first time, a new branch is created for the changes.
When I check the ClearCase annotate (blame) information, it shows up correctly, but in SonarQube it is shown incorrectly.
Here is the information from annotate:
0 sgadey01 \main\FW_3.0.0.0_TST\FW_3.1.2.0_TST\FW_3.1.2.0_LOCAL\1 | | System.out.println("testing");
and here is the information from SonarQube, as shown in the link below, where the user is bdiaz:
https://s28.postimg.org/8m8l921rh/sonarerror.png
Is there any known limitation with SonarQube? For the CVS plugin the documented limitations mention that the revision has to be passed manually (https://github.com/SonarSource/sonar-scm-cvs).
Thanks
sandy
The SonarQube ClearCase SCM provider simply runs cleartool annotate from the command line and parses the output (see https://github.com/SonarQubeCommunity/sonar-scm-clearcase/blob/master/src/main/java/org/sonar/plugins/scm/clearcase/ClearCaseBlameCommand.java), so I'm really surprised you get different results.
You can try to restart the analysis and force the SonarQube scanner to collect blame again (there is a cache so that blame is not recomputed for files whose content has not changed since the previous analysis):
sonar-scanner -Dsonar.scm.forceReloadAll=true
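To see what the provider itself sees, you can also run cleartool annotate by hand on one of the affected files and compare it with the IDE output. The flags below are only an approximation; the exact command and format string are in the ClearCaseBlameCommand source linked above:
cleartool annotate -out - -fmt "%Sd %-8.8u %Vn " path\to\File.java
If the user shown there matches the IDE but not SonarQube, forcing the blame reload as above should clear the stale cache entry.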
I am trying to run SonarQube using Sonar Runner on a local dev box as a pre-commit check. We have a central SonarQube server where an analysis is done every day and published to the dashboard. When running on the local dev box, the issue report contains all issues as new every time, so incremental data is not available. I have also tried both incremental and preview modes, but the result is the same.
Please find below the versions of the tools used, as well as the configuration files. Please let me know if any other data is required.
SonarQube version : 5.1
Sonar Runner version : 2.4
sonar-runner.properties
sonar.host.url=http://[central sonar server]:9000/
sonar.issuesReport.html.enable=true
sonar.login=admin
sonar.password=admin
sonar-project.properties
sonar.projectKey=myProj:myProj-master
sonar.projectName=MASTER_PROJECT
sonar.projectVersion=21.0
sonar.sources=./src
sonar.binaries=./bin/
sonar.issuesReport.html.enable=true
sonar.exclusions=com/**/test/*.java
sonar.skipPackageDesign=true
sonar.profile=SonarWay
sonar.preview.excludePlugins=devcockpit,buildstability,pdfreport,report,buildbreaker,views,jira,issueassign,scmstats
Command used:
c:\sonar-runner-dist-2.4\sonar-runner-2.4\bin\sonar-runner -e -Dsonar.analysis.mode=preview -Dsonar.issuesReport.console.enable=true -Dsonar.issuesReport.html.enable=true
Update: additional properties were tried as well, in sonar-runner.properties.
I believe your problem is tied directly to your use of a local server.
The purpose of preview analysis is to allow you to compare your local changes with what's on the remote SonarQube server. Since your remote server is updated every night, running your preview against it will show you the issues you've introduced that day. Instead, you're running against a local instance which gets updated with a full analysis... never? Which (if true) would be why all your issues show up as new.
To execute a preview analysis against your remote server, you will need both the global Execute Preview Analysis permission and the project-level Browse permission for the project in question.
If for some reason you're unable to get those permissions (which is possibly why you're running a local SonarQube server?), then you'll want to do the same full checkout and analysis locally every night that's being done for the official, remote server, i.e. you'll probably have to set up a second, parallel architecture. In short, it's probably easier in the long run to nag for the appropriate permissions on the remote server.
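Concretely, once a user with those permissions exists, the preview run from the question can simply be pointed at the central server with that user's credentials (host, user and password below are placeholders):
sonar-runner -e -Dsonar.analysis.mode=preview -Dsonar.host.url=http://central-sonar-host:9000/ -Dsonar.login=preview_user -Dsonar.password=preview_password -Dsonar.issuesReport.html.enable=true
With that in place, only the issues introduced since the last nightly analysis should be reported as new.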
The issue is resolved. Two things fixed it:
Creating a user with the required permissions.
Installing the "Issues Report" plugin.
I have the following set-up:
TeamCity server running on one machine
TeamCity agent on a separate machine, connected via VPN to source control (TFS).
The VPN is a bit tricky to set up to run as a service, so I can't (and don't want to) set it up on the server as well. Rather, I was hoping to have everything go through that agent.
The build fails while collecting sources; it appears the server is trying to figure out which changes were made in TFS (but it can't reach the TFS host, since the server is not on the VPN). The build is set to check out the sources only on the agent.
I'm afraid the answer is obvious, but I couldn't find any documentation confirming this... Is it possible to have such a setup? Or does the build server need access to the TFS repo to check for changes and trigger builds?
The TeamCity server will still require access to the VCS root to evaluate the current revision and changeset details.
It's important to note the additional side-effects of agent side checkout as well. See VCS Checkout Mode in the TeamCity docs for more information (note the 2nd line).
I'm migrating our continuous integration system from TeamCity to Jenkins. We have a single SVN repository for all our projects, like this:
project/dev_db_build (folder)
project/module1 (folder)
project/module2 (folder)
project/pom.xml
For building the database on the CI server I use the URL project/dev_db_build and can poll this URL to trigger builds when there are changes.
For building the application I use the URL project/. So if I poll it and the changes are only in dev_db_build, the application build should be ignored and triggered only after the db build succeeds.
In TeamCity I used "Trigger patterns" for this, but in Jenkins there are so many triggering plugins (https://wiki.jenkins-ci.org/display/JENKINS/Plugins#Plugins-Buildtriggers) that I looked into some of them and have not found a suitable one.
Ideally, you should use a post-commit hook as suggested by #Mike, rather than polling. Otherwise, when configuring the Jenkins job under 'Source Code Management' with 'Subversion' selected, there is an Advanced button. Clicking it reveals a number of options, including 'Excluded Regions':
If set, and Jenkins is set to poll for changes, Jenkins will ignore any files and/or folders in this list when determining if a build needs to be triggered. Each exclusion uses regular expression pattern matching, and must be separated by a new line.
/trunk/myapp/src/main/web/.*.html
/trunk/myapp/src/main/web/.*.jpeg
/trunk/myapp/src/main/web/.*.gif
The example above illustrates that if only html/jpeg/gif files have been committed to the SCM, a build will not occur. More information on regular expressions can be found here.
In your case, you would set 'Excluded Regions' to something like
/project/dev_db_build/.*
Do you have the ability to edit your Subversion hooks? Instead of having your Jenkins server poll SVN, I would recommend having SVN call Jenkins via a post-commit hook to automatically kick off a build when a developer commits. This lessens the load on both the Jenkins and SVN servers, and removes the wait of up to one polling interval before a build starts.
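If you can edit the hooks, a minimal post-commit sketch that notifies the Jenkins Subversion plugin would look roughly like the following; the Jenkins host name is a placeholder, and the exact notifyCommit URL format is described in the Subversion plugin documentation:
#!/bin/sh
# post-commit hook: tell Jenkins which repository and revision changed
REPOS="$1"
REV="$2"
UUID=`svnlook uuid "$REPOS"`
wget --quiet \
  --header "Content-Type:text/plain;charset=UTF-8" \
  --post-data "`svnlook changed --revision $REV "$REPOS"`" \
  --output-document "-" \
  http://your-jenkins-host/subversion/$UUID/notifyCommit?rev=$REV
Jenkins then works out which jobs watching that repository need to run, so the polling interval no longer matters.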