How to develop code on multiple computers when using a CI server - continuous-integration

I currently do a lot of my development on my laptop on the train to and from work most days, but it is nowhere near as good or easy as my main machine at home. My issue is how to keep the files in sync. I use Subversion, but also have a CI server that rebuilds on each check-in. There is a question here that deals with multiple machines and says to use source control. But what if there is a CI server? Checking in unfinished work would lead to broken builds.
My first thought (which seems a bit daft) was to use two sets of source control, or at least two repos for the same code: one for the CI server and one to transfer between machines. I'm not sure you can even do that with SVN. I was also just looking at Mercurial after Joel's blog, but I'm not sure that solves it either, since the central repo you push to is still where the CI server pulls from.
I guess the crux is: how do you share code that is still buggy or doesn't compile, so you can switch machines, without breaking the build?
Thanks

You're actually on the right track with the multiple repository idea, but Subversion doesn't natively support this sort of thing. Look into using a so-called DVCS (Distributed Version Control System) such as Git or Mercurial, where you can have multiple repositories and easily share changes between them.
You can then push changes to the central server from either your desktop or your laptop machine, and your source control system will sort it all out for you. Changes that are not yet complete can be done on another branch, so if one morning on the train you write some good code and some bad code, you can push the good code up to the repository used by the CI server, and the bad code over to your desktop development machine, and carry on from the point you left off.
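As a hedged sketch of how that might look with Git (the remote and branch names here are hypothetical; "home" stands for any repo both machines can reach, such as your desktop):

# On the train: commit everything, finished or not, on a private WIP branch
git checkout -b wip/train-work
git commit -am "WIP: half-finished feature"
git push home wip/train-work          # publish only to your own machine

# Later: move just the finished commits onto the branch the CI server builds
git checkout master
git cherry-pick <sha-of-finished-commit>
git push origin master                # 'origin' = the central repo the CI server watches

The key point is that committing and publishing are separate steps in a DVCS, so you decide per branch, and per machine, what the CI server ever sees.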

What about branches? Develop on one branch and only reintegrate after you've finished a piece of development?

Related

Can you host a Bitbucket pipeline internally?

We are currently using Bitbucket Cloud to host our grails-app repository. We want to set up some pipelines to do things like run unit tests and make sure the app compiles before a branch can be merged to master.
I know this can be done pretty easily by letting Bitbucket host the pipeline and committing a well-written pipeline file. The problem, however, is that our app is very large: even on brand-new MacBook Pros it takes 20 minutes to compile, and on some older ones it can take 2 hours or more. Grails, thankfully, only recompiles files that have changed since the last compilation, but that is no help in a Bitbucket pipeline that works from a fresh pull of the app every time it runs.
My idea was to set up a pipeline that runs internally for us, so that it already has the app pulled and can just switch to the desired branch and run from there. This could still take time when switching between two heavily diverged branches, but it's better than compiling from fresh every time.
I can't seem to find any documentation on hosting a pipeline internally with Bitbucket Cloud. Does anyone know if this is possible, and if so, where the documentation is?
It would also be acceptable to find a solution to the long compilation problem itself with Bitbucket-hosted pipelines.
A few weeks ago, self-hosted runners were made available as a public beta. Here are the details: https://community.atlassian.com/t5/Bitbucket-Pipelines-articles/Bitbucket-Pipelines-Runners-is-now-in-open-beta/ba-p/1691022
Additionally, if you're looking to retain some of your files from one build to the next, to save doing the same work over and over again, have a look at caches: https://support.atlassian.com/bitbucket-cloud/docs/cache-dependencies/ There are some built-in ones that you could use, but you can define your own custom ones as well. Essentially, a cache is just a way of preserving the contents of a directory for a future build.
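For instance, a minimal sketch that writes such a configuration (the custom cache name and path and the Grails command are assumptions; check the docs for your setup):

cat > bitbucket-pipelines.yml <<'EOF'
definitions:
  caches:
    grails-build: build          # hypothetical custom cache for incremental build output
pipelines:
  default:
    - step:
        caches:
          - gradle               # built-in cache covering ~/.gradle
          - grails-build
        script:
          - ./grailsw test-app   # hypothetical Grails wrapper invocation
EOF

Bear in mind that caches expire after a while and can be evicted, so the build must still succeed from a cold cache.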

Netezza CI/CD tool

Is there any CI/CD tool for Netezza that can manage versions and can be used to migrate code across environments? We have used flywaydb for other databases and are happy with it, but it does not support Netezza. I have already googled and did not find a single tool, so any responses give me somewhere to begin analyzing further.
To my knowledge, there's nothing specifically geared for Netezza. That said, with a bit of understanding of your target environment, it's certainly possible.
We use git and GitHub Enterprise (GHE). The reason for GHE is not particular to this solution, but rather because I work at a hospital. Here's what we do.
Setup
Build a repository at /home/nz on your production server. Depending on how you handle nzlog, nzbad, and other temporary files, you may need to fiddle quite a bit with the .gitignore file. We have dedicated log directories where temporary files should reside.
Push that repo into GHE.
If you have a development server, clone the repo in the /home/nz directory on that server. Clearly you'll lose all development work up until that point and will want to make sure that things like .bashrc are not versioned. Alternatively, you could set up a different branch and repo and try merging the prod and dev versions. We did this, but I'd recommend just wiping your development box with production code one slow day.
Assign your production box a dedicated branch in git. For this discussion, I'll call them prod and dev. Do the same for development, if you have it. This is mainly a mental thing, not a tech thing, but it's crucial, like setting up a remote for Heroku or Azure.
Find or develop a tiny web server that can listen for GitHub webhooks. I built a Sinatra server with a simple configuration file; anything will do. Deploy the web server to each of the environments and configure them to perform the following activities on an update to the prod or dev branch, respectively:
git reset --hard   # discard any local modifications to tracked files
git clean -f       # remove untracked files left behind
git pull           # fast-forward to the newly pushed commit
Set up webhooks in your GHE repository to send the push event to the web servers.
Of course, you can always have the web server do other things on a branch update if you want to get fancy (maybe update cron from a versioned file or update schemas from all new files).
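As a very rough sketch of such a listener (the Sinatra server mentioned above is the real thing; this loop only illustrates the idea, assumes a traditional netcat that supports -l -p, and does no payload parsing or webhook-secret verification, which a real listener must do):

#!/bin/sh
# Answer every incoming webhook POST with 200, then resync the checkout.
while true; do
  printf 'HTTP/1.1 200 OK\r\nContent-Length: 0\r\n\r\n' | nc -l -p 8080 -q 1 >/dev/null
  cd /home/nz && git reset --hard && git clean -f && git pull
done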
Process
Fairly simply, follow the GitHub Flow workflow. You can pretty much follow whatever process you want with the understanding that your prod and dev branches should be protected and only removed or futzed with as an admin task. Create a feature branch, test it by pushing to dev, and then make a pull request for the prod branch.
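A sketch of that flow in commands (the prod and dev branch names are from above; the feature name is hypothetical):

git checkout -b feature/new-nzload-wrapper prod   # start from production code
# ...edit and commit...
git checkout dev
git merge feature/new-nzload-wrapper
git push origin dev        # the webhook deploys this to the development server
# once QA passes on dev, open a pull request from the feature branch into prod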
Why GHE? Mainly because it keeps an open area where our code is available. You could absolutely do this by pushing directly to Netezza's git repo, but your workflow will suffer--it just isn't as clean as having all code in one clear place with discussion around pull requests.

Magento - Automated deployments

Are there any automated deployment tools out there for Magento sites?
If not, does anyone have any best practices, so to speak, for maintaining and deploying Magento builds across local, staging, and production?
This is how I've been working for the past few months and it works pretty well for me.
Install SVN on your server. Or get your host to do it. Or choose a host with SVN in place. Or git.
or
Use Springloops.
The 'trunk' is your live site.
Branches are for staging. Set up the webserver to treat these folders as subdomains.
The live database is regularly copied to branches. This refreshes the data for testing. (Consider anonymizing sales & customer data)
Each repository has its own "app/etc/local.xml" file. Mark these with svn:ignore so that one will not upset another.
Also svn:ignore the "media" and "var" directories (see the sketch after this list).
Each dev has a local webserver to work on. When they finish a change, it is deployed to a branch ready for QA.
Nobody except the lead dev is allowed to merge branches to trunk on pain of death!
This means changes in code bubble up to the live site, and copies of the database bubble down to devs. Sometimes copies of the "media" dir are copied downwards as well. Extensions and upgrades are tested on branches too; I dislike using the Connect Manager on a live site.
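A hedged sketch of the svn:ignore step from the list above (note that svn:ignore only affects files not already under version control, so anything already committed must first be removed with svn rm --keep-local):

svn propset svn:ignore "local.xml" app/etc   # keep each environment's config out of SVN
svn propset svn:ignore "media
var" .                                        # newline-separated patterns on the web root
svn commit -m "Ignore environment-specific files and directories"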
I've been using Git lately and so far like it much more than SVN; I believe this same flow could be applied to SVN as well:
More details: http://nvie.com/posts/a-successful-git-branching-model/
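In outline, that branching model looks something like this (a sketch only; the feature branch name is hypothetical, the rest follows the article):

git checkout -b develop master                   # long-lived integration branch
git checkout -b feature/checkout-tweak develop   # hypothetical feature branch
# ...work and commit...
git checkout develop
git merge --no-ff feature/checkout-tweak         # --no-ff keeps the feature's commits grouped
git checkout -b release/1.2 develop              # stabilize, then merge to master and tag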
Currently, I think the best approach is a local VM with a base install of Magento, set up per project and ready to roll out to new developers. Most of us just use NetBeans inside the VM, with git pulls/pushes and some custom build modules for deployment to all of our usual environments: local, integration, UAT, and production. Production or integration is usually our system of record, database-wise.
Here is a base .gitignore file to start off with:
https://github.com/github/gitignore/blob/master/Magento.gitignore
A simple Git Deployment:
http://ryanflorence.com/simple-git-deployment/
You can try the packaged Magento that is automatically deployed with the help of the Jelastic PaaS: https://github.com/jelastic-jps/magento/tree/master/magento
You can get it pre-configured and installed with an NGINX or LiteSpeed server and MariaDB.
After customization, you can clone the whole environment to get identical replicas for dev, test, stage, and production. And when all the needed changes have been made on the cloned environment, you can just swap domains with the current production environment, making the updated version live.
Or you can set up an automated update process from Git/SVN.
I'm in the early stages of my first Magento site. It's a big project, and my team and I have been discussing this very issue. We've seriously considered using a Git repository to maintain versioning across local, staging, and live servers. Here is a good article on the subject; it's obviously focused on WordPress, but I think the workflow would be almost identical.
And to answer your first question, I know of nothing automated.
We use SVN for very large-scale projects. Almost any hosting service for your staging and production environments will be able to provide you with an SVN client to keep them in sync with your repository.
I've never heard of any automated deployment tools for Magento.

Distributed Revision Control with automatic synchronization or Eclipse plugin better than FileSync?

I have what I hope is not a unique situation...
...and I'm looking for suggestions.
I am looking for a better synchronization plugin for Eclipse than FileSync
-or-
I am looking for a distributed (preferably) version control system that will allow me and the other developers in my team the ability to work with local files and have that repository automatically upload changes and revision history to our development box
-or-
A combination of the two.
Most revision control applications I've tried cater more to the compiled-code workflow, where you only check in when you have a compilable code base, and that makes sense to me. We, however, are working with ColdFusion pages on a remote development server, which complicates the process of check-ins, quick updates, and debugging. Now, I don't necessarily want to have to check in every time I want to test code (because that would be a nightmare...), but it would be nice to have something that tracks changes throughout the day and logs those changes in revision control automatically (the dev would state their intention in a dialog on opening the project?) while keeping the files on the development server in sync with all the programmers' machines. It would be awesome if it tracked changes locally and did one auto check-in per day (at some scheduled time, preferably as a background process), but I've not seen anything like this.
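(A scheduled job plus a DVCS could approximate that once-a-day background check-in. A rough sketch, where the script name, paths, branch, and schedule are all hypothetical:

#!/bin/sh
# auto-checkin.sh -- run from cron, e.g.: 0 18 * * 1-5 /home/dev/bin/auto-checkin.sh
cd "$HOME/workspace/site" || exit 1
git add -A
git commit -m "auto: end-of-day snapshot $(date +%F)" || exit 0   # no-op when nothing changed
git push origin dev-snapshots   # hypothetical shared branch on the development server
)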
Currently we are shoehorned into using Serena PVCS (mainly because they have free licenses) and it's not a very fast solution when we all work in different states, our development server is in a state that none of us works in, and the repository is in yet another state. (No control over this!) It normally takes Eclipse 10-15 minutes to synchronize ~500 files with the PVCS server, and check-ins are "Eclipse-lockingly" slow (i.e., when checking in, forget using Eclipse for anything).
I would like to have a workflow process that manages all our workfiles locally, synchronizes those changes to a remote development server and pulls down any changes that happen to be up there. I do not count on having any/many conflicting merges during this because we all work on different parts of the same site. However, it may happen.
I have played around with Bazaar, and this is what made me think about having a distributed revision control system, but I'd like it to auto-merge with the remote repository (the development server in this case), and I did not find a way to do that when local files were updated. I'll admit I have not looked into Git or Mercurial much, and was hoping someone could share their experience of the feature sets, or solutions, if one of these other options will work.
To give some background: this came about when one of our developers started using FileSync in Eclipse and started overwriting all our changes, because the Eclipse FileSync plugin is only one-way... from the dev box to the server. The boss asked why we weren't checking in all the time... we blame the speed... and I get tasked with finding a solution.
Also, a centralized solution like SVN was already turned down (because we have Serena and a crew of people who are supposed to manage it... but I've been waiting two days for even a response to an issue log I submitted concerning our speed problem, so if we can self-manage a solution [thus distributed, and why I looked at Bazaar], that would be awesome).
A DVCS like Git or Mercurial would definitely be a sensible choice, especially for:
distributed development
distributed repos (including one dedicated for those checks of yours)
That notion of a dedicated repo is not a new one and has been used before (a local repo used for testing before pushing to a remote repo), but it can easily be adapted for the kind of automatic pushing you are looking for.
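For example (a sketch, assuming Git; the remote name "devbox" is hypothetical and must be added beforehand), a local post-commit hook can mirror every commit to the shared development repo without anyone having to remember to push:

#!/bin/sh
# .git/hooks/post-commit (must be executable)
branch=$(git rev-parse --abbrev-ref HEAD)
git push --quiet devbox "HEAD:refs/heads/$branch" &   # push in the background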
For strong Eclipse integration, I would go with Git (even though EGit is not fully baked yet): all the Eclipse projects (for the development of Eclipse itself) are, or soon will be, in Git repos.
Eclipse is committed to replace its current native CVS integration with a complete native Git integration.

Does using a DVCS make it harder to do continuous integration in corporate environments?

There are unarguably lots of advantages to using a DVCS, and it can be used like a centralized VCS, but do local commits, and being able to very easily fork the project for some smaller group, make it harder to support continuous integration? It helps development that everybody has access to the most recent changes, which are tested by the CI server, so the possibility of incompatible changes is minimized.
You can centralize a DVCS. The difference between a DVCS and a centralized one is that with a DVCS, you don't have to centralize it.
You could have a central repository where everyone can push changes, and everyone can pull the latest code from. You can write a commit hook on the server so that every time someone pushes code, it runs a test to make sure it passes the tests. It's just like centralized version control, only better, because I can create a local branch and make several local commits before I'm ready to push to the central server.
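A sketch of such a server-side check (a Git pre-receive hook; the test command is hypothetical, and edge cases such as ref deletions are ignored for brevity):

#!/bin/sh
# hooks/pre-receive on the central repository: reject pushes whose tests fail
while read oldrev newrev refname; do
  tmp=$(mktemp -d)
  git archive "$newrev" | tar -x -C "$tmp"            # export the pushed tree
  if ! (cd "$tmp" && make test </dev/null); then      # hypothetical test entry point
    rm -rf "$tmp"
    echo "Tests failed; push to $refname rejected" >&2
    exit 1
  fi
  rm -rf "$tmp"
done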
Have you ever been making a big change that breaks a lot of things, and wanted to make several commits, but not share them until you're done and everything is fixed again? That's what DVCS makes easy.
It does make it harder to perform CI as the source control system is encouraging you NOT to integrate continuously. However, there is absolutely nothing that prevents you from doing that regular integration into the central repository. The team just needs to remain disciplined about that.
Where smaller teams fork the project and do their own thing for a while, you should run continuous integration builds against that fork as well, and potentially set up a regular integration between the two forks.
This would be similar to the stream-based, multi-stage continuous integration strategies that AccuRev pushes:
http://www.accurev.com/multistage-continuous-integration.html
