GitLab CI VirtualBox runners do not support the GitLab CI cache (as laid out in the documentation). However, I require a cache (primarily for my Maven and Node folders).
Are there known home-made workarounds/scripts, best practices or third-party libraries for caching in VirtualBox runners?
Or in short: how can I cache specific folders when using GitLab CI VirtualBox runners?
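To make it concrete, the kind of home-made workaround I have in mind is a small script that pulls and pushes the cache folders around the actual build, since the VirtualBox VM is reverted between jobs. This is only a sketch; the cache host and paths are placeholders, not anything GitLab provides:

# cache pull/push sketch for a VirtualBox runner (no official cache support)
# CACHE_HOST is a hypothetical machine that survives the VM reset; any
# persistent, reachable location (or a VirtualBox shared folder) would do.
CACHE_HOST="cache.example.com"
CACHE_DIR="ci-cache/${CI_PROJECT_PATH}"   # CI_PROJECT_PATH is a standard GitLab CI variable

# before the build: restore the cache (ignore failures on the very first run)
rsync -az "${CACHE_HOST}:${CACHE_DIR}/m2/" "${HOME}/.m2/" || true
rsync -az "${CACHE_HOST}:${CACHE_DIR}/node_modules/" "./node_modules/" || true

# ... the actual build (mvn package, npm install, etc.) ...

# after the build: save the cache back
rsync -az "${HOME}/.m2/" "${CACHE_HOST}:${CACHE_DIR}/m2/"
rsync -az "./node_modules/" "${CACHE_HOST}:${CACHE_DIR}/node_modules/"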
Related
What are the supported ways to install and deploy Apache APISIX? Is there any corresponding documentation? How should I install Apache APISIX if I am in an isolated network environment?
What do you mean by an isolated network environment?
Perhaps you could try APISIX stand-alone. https://apisix.apache.org/docs/apisix/2.12/stand-alone/
I think the method of installing and deploying APISIX depends on your needs. I would even recommend using a source installation if you are using it for a personal website, as this gives you early access to the latest features.
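For the isolated-network case, stand-alone mode keeps everything on one box with no etcd. A rough sketch of what that looks like in the 2.x line (verify the exact keys against the page linked above):

# 1. In conf/config.yaml, disable the Admin API and read routes from a local YAML file:
#      apisix:
#        enable_admin: false
#        config_center: yaml
# 2. Declare routes in conf/apisix.yaml; the trailing #END marker is required:
cat > conf/apisix.yaml <<'EOF'
routes:
  - uri: /hello
    upstream:
      type: roundrobin
      nodes:
        "127.0.0.1:1980": 1
#END
EOF
# 3. Start APISIX; in stand-alone mode it reloads conf/apisix.yaml when the file changes.
apisix start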
As per the HashiCorp documentation on Nomad + Consul, the Consul service mesh cannot be run on macOS/Windows, since those platforms do not support bridge networking.
https://www.nomadproject.io/docs/integrations/consul-connect
What is the recommended way to set up a local development environment for Nomad + Consul?
I'd suggest having a look at setting up your local environment using Vagrant (which is also a HashiCorp product) and VirtualBox. There are plenty of examples online, for example:
Here is one of the most recent setups with Nomad and Consul, although it is not parametrised much.
Here is one with the core HashiCorp stack, i.e. Nomad, Vault and Consul. The repo is quite old, but that merely means it uses old versions of the binaries, which should be easy to update.
Here is one with only Vault and Consul, but you can add Nomad in a similar way. In fact, this Vagrant setup and the way its files are structured seem pretty close to the one above.
I ran the first two last week with a simple
vagrant up
and it worked almost like a charm. (I think I needed to upgrade my VirtualBox and maybe run vagrant up multiple times because of some weird runtime errors which I didn't want to debug.)
Once Vagrant finishes the build, you can
vagrant ssh
to get inside the created VM, although the configs are set up to mount volumes/sync files, and all UI components are also exposed on their default ports.
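For reference, inside the Linux VM the provisioning essentially boils down to running dev-mode agents; a minimal sketch, assuming the consul and nomad binaries are already installed (the example repos typically wrap these in systemd units rather than backgrounding them):

# Consul dev agent: single node, in-memory state, UI on :8500
consul agent -dev -client=0.0.0.0 &

# Nomad dev agent with Connect enabled; this needs Linux bridge networking and
# root privileges, which is exactly why macOS/Windows hosts need the VM. UI on :4646.
sudo nomad agent -dev-connect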
GitLab CI is highly integrated with Docker, but in some cases the application needs to interact with another app which cannot be deployed in Docker,
so I want my jobs (in .gitlab-ci.yml) to run on a Linux VM server.
How can I set that up in GitLab? I searched many websites but didn't find the answer.
Thank you.
You can use different executors with GitLab. For your case, you should set up GitLab Runner as a shell executor and register it (providing it with the token obtained from the repository):
https://docs.gitlab.com/runner/install/linux-repository.html
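After installing gitlab-runner on the Linux VM from the repository above, registration looks roughly like this (URL and token are placeholders taken from your project's Settings > CI/CD > Runners page):

# register the VM as a shell-executor runner for your project
sudo gitlab-runner register \
  --non-interactive \
  --url "https://gitlab.example.com/" \
  --registration-token "PROJECT_REGISTRATION_TOKEN" \
  --executor "shell" \
  --description "linux-vm-shell-runner"

Jobs picked up by this runner then execute directly in a shell on that VM, so they can interact with anything installed there, Docker or not.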
Background
Our current infrastructure consists of a Jenkins master and a number of slave VMs. We are running into a lot of scalability and, consequently, stability issues with our tests as the VMs are being overworked.
Mesosphere and Jenkins
That being said, I'm looking to explore more solutions, particularly Mesosphere, because of its ability to dynamically generate slaves as needed.
My only issue with that is that we have all these dependencies installed on the slave VMs. In order to make Jenkins work on Mesos, I would have to "dirty" the Mesos slaves by installing the dependencies on them. This would more or less render these Mesos slaves useless, as they would only be suited for running Jenkins.
Question
What is the proper method of implementing a Jenkins environment in Mesos alongside other applications?
Check out eBay's video and blogs about their Mesos+Marathon+Jenkins setup:
http://blog.docker.com/2014/06/dockercon-video-delivering-ebays-ci-solution-with-apache-mesos-docker/
http://www.ebaytechblog.com/2014/04/04/delivering-ebays-ci-solution-with-apache-mesos-part-i/
http://www.ebaytechblog.com/2014/05/12/delivering-ebays-ci-solution-with-apache-mesos-part-ii/
Part II of the blog talks about running Jenkins builds in Docker containers, which could alleviate the problem of "dirtying" the slaves with dependencies.
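In that model the slave only needs Docker itself; each build pulls the toolchain it needs as an image, so nothing project-specific lands on the host. A rough sketch of a Maven build step running that way (the image tag and paths are just examples, not something mandated by the plugin):

# run the build inside a throwaway container; $WORKSPACE is the standard Jenkins variable
docker run --rm \
  -v "$WORKSPACE":/workspace \
  -w /workspace \
  maven:3-jdk-8 \
  mvn clean package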
See the mesos-jenkins plugin for more documentation, and see Docker Hub for pre-built images:
https://github.com/jenkinsci/mesos-plugin
https://registry.hub.docker.com/u/folsomlabs/jenkins-mesos/ (latest)
https://registry.hub.docker.com/u/thefactory/jenkins-mesos/ (documented)
When installing a Magento cluster with two web servers and one database server, is it best to go through the Magento installation process on each server, or to install on one server and use rsync to copy across all the files?
And what is the best way to deal with caching between both web servers?
The best way for you is to go to Amazon and purchase the book "Continuous Delivery: Reliable Software Releases through Build, Test, and Deployment Automation" (Addison-Wesley Signature Series, Fowler).
However, I would suggest building a deployment pipeline based on Git and Fabric that pushes your code to each host at the push of a button. You can sync media and other shared resources with rsync.
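For the media sync, something along these lines can run as part of the deploy (host name and paths are placeholders for your own layout):

# push shared media from web1 to web2; --delete keeps the two trees identical
rsync -az --delete \
  /var/www/magento/media/ \
  web2.example.com:/var/www/magento/media/

Alternatively, the media directory can live on a shared mount so there is nothing to sync at all, which is close to the file-share approach described below.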
In the end, the solution was to create a file share on the DB server, add multiple layers of caching with Varnish, and run both Apache and Nginx.