Deploying a pre-built package to AppHarbor

I couldn't find any information on AppHarbor's website about whether it's possible to deploy pre-built ASP.NET (MVC) applications. Does anyone know if that's doable?
Another question I have is whether AppHarbor's build process supports projects that launch an executable (node.exe in this case) that's included in a solution folder as part of a custom build step.

If you're worried about precompilation, that's something AppHarbor does out of the box. If you push a repository without a solution file, we won't build it, but just deploy the contents (see part with no solution file).
You should also be able to run node.exe as part of the build, as long as all dependencies (incl. node.exe) are in the repository.
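For instance, a post-build event along these lines should run on AppHarbor as long as the referenced files are committed (the folder layout and script name here are purely illustrative):

    rem post-build event command line; Tools\node.exe and Scripts\build.js are hypothetical paths
    "$(SolutionDir)Tools\node.exe" "$(ProjectDir)Scripts\build.js"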

How to set up and maintain directory structure in TFS build server?

So I have this pretty huge solution with many projects. A few of them use DLLs from other projects in the solution, and some projects copy files to other directories after the build is performed (as post-build events).
When I build the solution locally on my machine, everything works, but when I configure a build and run it on the build server (we use TFS), something goes wrong and I get an error when I try to load one of the applications in the solution (the error does not give me much to go on).
So before I sit down to debug all of this: does anybody know how I can smartly manage all the build actions that are performed locally and via the build server and see the deltas?
I would like to be able to build the solution exactly the same way on the build server as I do on my machine (with directory structure, post-build events, etc.).
Thanks a lot
The generally accepted way to do what you're after is to use NuGet for managing your assembly references. You can publish your dependent assemblies into NuGet as part of a continuous delivery process, then reference (and update!) those dependencies in the solutions that consume them as necessary.
This removes ambiguity ("What version of Foo.dll is Project X using?") and reduces runtime errors ("Why is Project X using Foo.dll 3.0? It was never tested with 3.0! It needs to run with 2.7!").
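As a rough sketch of that flow (package names, versions and the feed URL below are placeholders), the producing build packs and pushes the assembly as a package, and the consuming solution restores the exact versions pinned in its packages.config:

    # on the build server that produces Foo.dll (hypothetical names/feed)
    nuget pack Foo/Foo.csproj -Properties Configuration=Release
    nuget push Foo.2.7.0.nupkg -Source https://nuget.example.com/feed -ApiKey <key>

    # in Project X's solution, restore whatever versions packages.config pins
    nuget restore ProjectX.sln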

Build dependencies and local builds with continuous integration

Our company currently uses TFS for source control and build server. Most of our projects are written in C/C++, but we also have some .NET projects and wouldn't want to be limited if we need to use other languages in the future.
We'd like to use Git for our source control and we're trying to understand what would be the best choice for a build server. We have started looking into TeamCity, but there are some issues we're having trouble with which will probably be relevant regardless of our choice of build server:
Build dependencies - We'd like to be able to control the build dependencies for each <project, branch>. For example, have <MyProj, feature_branch> depend on <InfraProj1, feature_branch> and <InfraProj2, master>.
From what we’ve seen, to do that we might need to use Gradle or something similar to build our projects instead of plain MSBuild. Is this correct? Are there simpler ways of achieving this?
Local builds - Obviously we'd like to be able to build projects locally as well. This becomes somewhat of a problem when project dependencies are introduced, as we need a way to reference these resources or copy them locally for the build to succeed. How is this usually solved?
I'd appreciate any input, but a sample setup which covers these issues will also be a great help.
IMHO both issues you mention really fall into the configuration management category and are thus, as you say, unrelated to the build server choice.
A workspace for a project build (doesn't matter if centralized or local) should really contain all necessary resources for the build.
How can you achieve that? Have a project "metadata" git repo with a "content" file listing all your project components and their dependencies (each with its own git/other repo) and their exact versions, effectively tying them together coherently. (You may find it useful to store other metadata in this file down the road as well, such as component-specific SCM info if you use a mix of SCMs across the workspace.)
A workspace pull wrapper script would first pull this metadata git repo, parse the content file, and then pull all the other project components and their dependencies according to the content file info. Any build in such a workspace would have all the parts it needs.
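A bare-bones sketch of such a wrapper (the content-file format, repository URLs and paths are invented for illustration; each line of the content file names a component, its repo URL and the exact ref to check out):

    # pull the metadata repo first, then every component at its pinned ref
    git clone git@git.example.com:myproj/metadata.git workspace/metadata
    while read name url ref; do
        git clone "$url" "workspace/$name"
        git -C "workspace/$name" checkout "$ref"
    done < workspace/metadata/content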
When the time comes to modify either the code in a project component or the version of one of the dependencies, you'll also need to update the content file in the metadata git repo to reflect the change and commit it; this is how your project makes progress coherently, as a whole.
Of course, actually managing dependencies is another matter. Tons of opinions out there, some even conflicting.

What exactly are TeamCity artifacts?

A noob question, but googling and searching Stack Overflow didn't seem to yield an answer.
Can someone explain what exactly TeamCity artifacts are?
From the documentation:
"Typically these include distribution packages, WAR files, reports, log files, etc. When creating a build configuration, you specify artifacts of your build at the General Settings page."
It doesn't really explain to me what an artifact is. A .NET-oriented answer would be very helpful. I have a couple of builds already working on TeamCity, but I'm not sure what exactly I would need an artifact for.
thank you
Artifacts are the files you want the TeamCity server to store so that they can be downloaded after the build has finished. They will be downloadable from the TeamCity dashboard from each build.
For a .NET project you might choose to store the output of the compiler (i.e. .exe and .dll files) and the log files from running unit tests, or you might just have a Windows Installer package (an .msi).
It is completely up to you what gets stored for your specific needs. Just note that build artifacts do take up disk space on the TeamCity server, so if yours are large you'll want to configure the Build History Clean-up rules.
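To make that concrete (the paths below are hypothetical), the artifact paths you enter on the General Settings page are just rules like these, one per line, optionally packing files into an archive:

    MyApp/bin/Release/*.dll
    MyApp/bin/Release/*.exe
    TestResults/**/*.trx => test-logs.zip
    Installer/Output/MyApp.msi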

Jenkins - multiple SVN locations. Is it possible to specify the build version?

I am new to the Jenkins CI tool and I want to know if it is possible to specify which build to use when there are several projects, in different SVN locations, that depend on one another. For example, say the web project is in SVN location1, the backend project is in SVN location2, and the web project depends on the backend. If one of the developers modifies something in the backend, then when the web developer commits, there will be a build failure. Is it possible to specify that the build of the web part should take into consideration build x of the backend and not the newest build?
Thanks in advance.
Yes, that can be done. In Jenkins, check the Build Triggers options in your project's configuration page: on the line Build after other projects are built you can specify the names of the projects you want built automatically after changes have been made to the base project.
And similarly, in the Post-build Actions, look for Build other projects, where you can specify that if the base project builds successfully, it will automatically trigger a build of the child projects.
Hope this helps.
Your example of building a project against a specific version of another project is a little non-standard, but not impossible.
In your case, I would use Jenkins' ability to execute arbitrary scripts to help. The script would take care of getting the correct version of the project that the one I want to build depends on.
Building on your example of a Web and Backend project, here's how I would do things without using a parametrized build:
Add a file to the repository of the Web project that stores the version of the Backend project to use
Configure a job to build the Web project when the source for the backend project changes in SVN.
The project should check out the latest version of the Web project
The first Build Step for the project would be a script (Execute Shell or Execute Windows Batch Command), sketched after this list, that does the following:
Gets the version of the Backend to use from the file containing the version info
Either pulls a pre-built copy of the appropriate version of the Backend from the Backend's repository, or pulls the source of that version of the Backend
(If you pulled only the source for the Backend, the next Build Step should be to build the Backend)
Build the Web piece
Do any unit tests
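A rough Execute Shell version of that first step (the file name, repository URL and build commands are made up for illustration):

    # read the pinned Backend revision from the file kept in the Web repo
    BACKEND_REV=$(cat backend.version)

    # pull the Backend source at exactly that revision
    svn checkout -r "$BACKEND_REV" http://svn.example.com/backend/trunk backend

    # build the Backend first, then the Web project against it
    msbuild backend/Backend.sln /p:Configuration=Release
    msbuild Web.sln /p:Configuration=Release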

Managing internal 3rd Party Dependencies

We have a lot of different solutions/projects which are managed by different teams. Our solution needs to reference several projects that another team owns. We don't want to add these dependencies as project references because we do not intend to modify that code, we just want to use it. Also, we already have quite a few projects in our solution and don't want to add a bunch more, since that will slow down Visual Studio. So we are building these projects in a separate solution and adding them as file references to our solution.
My question is, how do people manage these types of dependencies? Should I just have some automated process that looks for changes to those projects, builds them and checks the DLLs into our source control, after which we treat them like other 3rd-party dependencies? Is there a recommended way of doing this?
One solution, although it may not necessarily be what you are looking for, is to have each dependent sub-system perform a release. This release could be in the form of an MSI install, or just a network share of assemblies. When a significant change is made, that team could let you know, and you could run the install or a script to copy the files.
Once you have the release, you could put the assemblies into the GAC; that way you would not have to worry about copying them to your project bin folders.
Another solution, assuming you are using a build server or continuous integration of some kind, is to have a post-build step or process stage the files. Then at any given moment, the developers on the other teams could grab the new files, or have a script or bat file pull them down locally.
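As a sketch (the share path, folder names and the build-number variable are invented), the producing build could stage its output to a versioned network share, and a consuming team's bat file could pull down a specific published build:

    rem -- post-build staging step on the producing build server --
    xcopy /Y /I "bin\Release\*.dll" "\\buildshare\TeamALib\%BUILD_NUMBER%\"

    rem -- consumer-side pull script, pinned to published build 142 --
    xcopy /Y /I "\\buildshare\TeamALib\142\*.dll" "lib\TeamALib\"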
EDIT - ANOTHER SOLUTION
It might be best to ask why you have these dependencies. Do you really need them locally when building your part of the application? Could you mock out the dependencies in your solution, allowing you to code, build, and run unit tests? Then the actual application would wire these up in your DEV/Test/Prod environments. Keeping your solution decoupled and dependency-free may be a better approach for the individual team. Leave the integration and coupling to when the application runs in a real setting.
(Not a complete answer, but still:)
Any delivery is better stored in a file/binary repository, as opposed to a VCS, which is meant to manage source history.
We prefer managing those deliveries in a repo like Nexus, and we use Maven to retrieve the right dependencies.
Even if those tools are more Java-oriented, Nexus can store anything, and Maven is only there to read the pom.xml of each artifact and compute the right dependencies.
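A small sketch of that round trip (group/artifact IDs, version, file name and the Nexus URL are all placeholders): the delivering team deploys its binaries to Nexus, and a consuming build copies the pinned dependencies declared in its pom.xml into the workspace:

    # publish a delivery (e.g. a zip of assemblies) to the Nexus repo
    mvn deploy:deploy-file -DgroupId=com.example -DartifactId=team-a-lib \
        -Dversion=2.7.0 -Dpackaging=zip -Dfile=team-a-lib.zip \
        -Durl=https://nexus.example.com/repository/releases -DrepositoryId=releases

    # in the consuming project, copy the declared dependencies locally
    mvn dependency:copy-dependencies -DoutputDirectory=libs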
