I'm beginning to understand how Heroku works, but haven't yet used a pipeline. I have an app I'm working on that is near its first production version. I'd like to begin using pipelines.
But I don't understand how to begin. What do I need to do to create a copy of the current app for the development stage, and another copy for the staging stage? Do I fork my git repository twice and attach one fork to each app?
I'm trying to take this one step at a time. I don't need GitHub integration yet. This is a small project and will not have any pull requests for quite some time, if ever. I'm only interested in the ability to develop, stage and release in the three stages offered by Heroku.
While pipelines do use multiple apps, those apps should share the same git repository, added as different remotes. Heroku's help page helped me understand that the process is to link the repository to each app under a different remote name, then push to the remote for the stage I'm currently working on.
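For example, here is a minimal sketch of that setup with the Heroku CLI (app, pipeline, and branch names are illustrative):

heroku create myapp-dev --remote dev
heroku create myapp-staging --remote staging
heroku create myapp-prod --remote prod
heroku pipelines:create myapp --app myapp-dev --stage development
heroku pipelines:add myapp --app myapp-staging --stage staging
heroku pipelines:add myapp --app myapp-prod --stage production
git push dev master                            # deploy to the development app
heroku pipelines:promote --app myapp-staging   # promote staging's build to production

The nice part of pipelines:promote is that it moves the already-built slug downstream instead of rebuilding, so what you tested on staging is exactly what reaches production.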
We are currently using Bitbucket Cloud to host our grails-app repository. We want to set up some pipelines to do things like run unit tests and make sure the app compiles before a branch can be merged to master.
I know this can pretty easily be done by letting Bitbucket host the pipeline and committing a well-written pipeline file. The problem is that our app is very large: even on brand-new MacBook Pros it takes 20 minutes to compile, and on some older machines it can take 2 hours or more. Grails, thankfully, only recompiles files that have changed since the last compilation, but that doesn't help a Bitbucket pipeline that works from a fresh clone of the app every time it runs.
My solution was to set up a pipeline that runs internally for us, so that it already has the app pulled and just switches to the desired branch and runs from there. This might still take time when switching between two heavily diverged branches, but it's better than compiling from scratch every time.
I can't seem to find any documentation on hosting a pipeline internally with Bitbucket Cloud. Does anyone know if this is possible, and if so, where the documentation is?
It would also be acceptable to find a solution to the long compilation problem itself with Bitbucket-hosted pipelines.
A few weeks ago, self-hosted runners were made available as a public beta. Here are the details: https://community.atlassian.com/t5/Bitbucket-Pipelines-articles/Bitbucket-Pipelines-Runners-is-now-in-open-beta/ba-p/1691022
Additionally, if you're looking to retain some files from one build to the next, to avoid doing the same work over and over, have a look at caches: https://support.atlassian.com/bitbucket-cloud/docs/cache-dependencies/. There are some built-in ones you can use, but you can define your own custom ones as well. Essentially, a cache is just a way of preserving the contents of a directory for a future build.
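On the compile time itself: if the app is on Grails 3+ (and therefore Gradle), one avenue worth testing is Gradle's build cache, with the cache directory preserved between builds via a custom Bitbucket cache so compiled outputs survive the fresh clone. A rough sketch of the build step, under those assumptions:

./gradlew --build-cache classes   # reuses cached compilation outputs when inputs are unchanged
# by default the local build cache lives under ~/.gradle/caches/build-cache-1,
# which is the directory you'd cover with a custom cache definition

Verify the flag and cache location against your Gradle version; older Grails builds that don't run on Gradle won't have this option.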
I'm somewhat new to maintaining separate production vs development builds of an app.
I want my current build deployed to Heroku so I can easily get it in front of people for critique, but I'd also like to run a local version so I can make changes and see them quickly on the fly.
With my app on Heroku, every time I make a change I have to push to GitHub and then hit the deploy button. That takes a relatively long time compared to just launching the app on localhost and refreshing the browser page to see how the changes came out. It's fine if you've made a ton of changes and know they all work as expected, but it's horrible for trying incremental changes, as you can imagine.
I know this is sort of a newbie question, but how can I have the best of both worlds?
The only way to achieve something like this is with review apps.
Instead of doing a git push, you will need to enable GitHub Sync. You will then be able to deploy either through the Heroku dashboard or automatically whenever a push is made to master.
Review apps automatically create and configure a test app for each pull request.
Then, when you want other people to QA the new code, you just give them the address of that review app instead of the main one.
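For the fast-iteration half of the question, note that the Heroku CLI can also run the app on your own machine; a minimal sketch, assuming your app has a Procfile with a web process:

heroku local web   # runs the web process from your Procfile at http://localhost:5000
# edit code and refresh the browser; push to GitHub only when you're ready to deploy

That way local changes show up on a browser refresh, and the push-and-deploy cycle is saved for builds you actually want reviewed.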
I have a Parse application that will soon be used in production, and I need to be able to continue developing things locally without breaking things for live users when I make changes to cloud code.
I have cloned the app and can now deploy to either the production or staging app using the parse deploy staging and parse deploy production commands; however, these commands only work if I am on the master branch.
What I would like to have are two branches in git, one that can be pushed to my staging app, and the other that can be pushed to the production app.
At the moment all I can think of is tagging the commits on master that have been pushed to production and continuing development on top of that, but that is going to be a nightmare if I need to patch the released app while all my development changes are on master.
Pushing directly to the Heroku git repos doesn't seem to work either; parse deploy must be doing something extra (plus it builds the app, so I can see when things go wrong).
Another issue is that when other developers start working on this as well, we won't all be able to deploy to the development server, and as far as I know there isn't an easy way to run Parse cloud code locally on Windows.
What is the best way to manage all this?
You have to set up parse-server (use parse-server-example), parse-dashboard, and MongoDB on a local or remote development server. You and your team can then develop everything locally, test, and deploy to production.
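A minimal sketch of that local setup, assuming Node.js and a MongoDB instance on your machine (the app ID, master key, and database name below are placeholders):

git clone https://github.com/parse-community/parse-server-example.git
cd parse-server-example
npm install
export DATABASE_URI=mongodb://localhost:27017/dev   # local MongoDB
export APP_ID=myAppId                               # placeholder
export MASTER_KEY=myMasterKey                       # placeholder
npm start                                           # serves the Parse API at http://localhost:1337/parse

Each developer can then point a local client at their own server, which also sidesteps the problem of everyone sharing one development deployment.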
Is there any CI/CD tool for Netezza that can manage versions and be used to migrate code across environments? We have used flywaydb for other databases and are happy with it, but it does not support Netezza. I have already googled and did not find a single tool, so any pointers would give me somewhere to begin analyzing further.
To my knowledge, there's nothing specifically geared for Netezza. That said, with a bit of understanding of your target environment, it's certainly possible.
We use git and GitHub Enterprise (GHE). The reason for GHE is not particular to this solution, but rather because I work at a hospital. Here's what we do.
Setup
Build a repository at /home/nz on your production server. Depending on how you handle nzlogs, nzbads, and other temporary files, you may need to fiddle quite a bit with the .gitignore file. We have dedicated log directories where temporary files should reside.
Push that repo into GHE.
If you have a development server, clone the repo in the /home/nz directory on that server. Clearly you'll lose all development work up until that point and will want to make sure that things like .bashrc are not versioned. Alternatively, you could set up a different branch and repo and try merging the prod and dev versions. We did this, but I'd recommend just wiping your development box with production code one slow day.
Assign your production box a dedicated branch in git. For this discussion, I'll call them prod and dev. Do the same for development, if you have it. This is mainly a mental thing, not a tech thing, but it's crucial, like setting up a remote for Heroku or Azure.
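A rough sketch of that initial setup on the production box (the remote URL and ignore patterns are illustrative, not prescriptive):

cd /home/nz
git init
printf 'logs/\n*.nzlog\n*.nzbad\n' > .gitignore   # tune to where your temp files live
git add -A
git commit -m "baseline production code"
git branch -m prod                                # this box's dedicated branch
git remote add origin git@ghe.example.org:dw/netezza.git   # illustrative GHE URL
git push -u origin prod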
Find or develop a tiny web server that can listen for GitHub webhooks. I built a Sinatra server with a simple configuration file. Anything will do. Deploy the web server to each of the environments and tune them to perform the following activities on an update to the prod or dev branch, whichever belongs to that server.
git reset --hard   # discard any local modifications
git clean -f       # delete untracked files
git pull           # fast-forward to the newly pushed commit
Set up webhooks in your GHE repository to send the push event to the web servers.
Of course, you can always have the web server do other things on a branch update if you want to get fancy (maybe update cron from a versioned file or update schemas from all new files).
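For instance, a hypothetical extension of the script the webhook handler runs (the cron file and schemas/ layout are invented for illustration):

crontab /home/nz/cron/nz.crontab   # reload cron jobs from a versioned file
# apply any schema files added or changed by the push
for f in $(git diff --name-only HEAD@{1} HEAD -- schemas/); do
    nzsql -f "$f"
done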
Process
Fairly simply, follow the GitHub Flow workflow. You can pretty much follow whatever process you want with the understanding that your prod and dev branches should be protected and only removed or futzed with as an admin task. Create a feature branch, test it by pushing to dev, and then make a pull request for the prod branch.
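In plain git, one cycle of that process might look like this (branch names illustrative):

git checkout -b feature/new-report prod   # branch from the protected prod branch
# ...commit work...
git checkout dev
git merge feature/new-report              # the webhook redeploys the dev box for testing
git push origin dev
# once it checks out, open a pull request from feature/new-report into prod on GHE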
Why GHE? Mainly because it keeps an open area where our code is available. You could absolutely do this by pushing directly to Netezza's git repo, but your workflow will suffer--it just isn't as clean as having all code in one clear place with discussion around pull requests.
Are there any automated deployment tools out there for Magento sites?
If not, does anyone have any best practices, so to speak, for maintaining and deploying Magento builds across local, staging, and production?
This is how I've been working for the past few months and it works pretty well for me.
Install SVN on your server, get your host to do it, or choose a host with SVN (or git) already in place.
Alternatively, use Springloops.
The 'trunk' is your live site.
Branches are for staging. Set up the webserver to treat these folders as subdomains.
The live database is regularly copied to branches. This refreshes the data for testing. (Consider anonymizing sales & customer data)
Each working copy has its own "app/etc/local.xml" file. Mark these with svn:ignore so that one environment's config will not upset another's.
Also svn:ignore the "media" and "var" directories. (A sketch of these ignore rules follows this list.)
Each dev has a local webserver for working on. When they finish a change it is deployed to a branch ready for QA.
Nobody except the lead dev is allowed to merge branches to trunk on pain of death!
This means changes in code bubble up to the live site. Copies of the database bubble down to devs. Sometimes copies of the "media" dir are copied downwards as well. Extensions and upgrades are tested on branches too; I dislike using the Connect Manager on a live site.
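Here is a sketch of the ignore rules mentioned above, run from the root of a working copy (svn:ignore only affects files that aren't already versioned, so remove any committed copies first):

svn propset svn:ignore "local.xml" app/etc   # keep each environment's config local
svn propset svn:ignore "*" media             # unversioned uploads
svn propset svn:ignore "*" var               # cache, session, and temp files
svn commit -m "ignore per-environment files"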
I've been using Git lately and so far like it much more than SVN; I believe this same flow could be applied to SVN as well:
More details: http://nvie.com/posts/a-successful-git-branching-model/
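The gist of that branching model in plain git commands (branch and version names illustrative):

git checkout -b develop master                 # long-lived integration branch
git checkout -b feature/gift-cards develop     # one branch per feature
# ...work, commit...
git checkout develop
git merge --no-ff feature/gift-cards           # --no-ff keeps the feature's history grouped
git checkout master
git merge --no-ff develop                      # release: develop into master
git tag -a v1.2 -m "release 1.2"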
Currently, I think the best approach is a local VM with a base install of Magento, set up per project and rolled out to new developers. Most of us just use NetBeans inside the VM and use git pulls/pushes, as well as some custom build modules, for deployment to all of our usual environments: local, integration, UAT, and production. Production or integration is usually our system of record, database-wise.
Here is a base .gitignore file to start off with:
https://github.com/github/gitignore/blob/master/Magento.gitignore
A simple Git Deployment:
http://ryanflorence.com/simple-git-deployment/
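That article's approach boils down to a bare repository on the server with a post-receive hook that checks out pushed code into the web root; a condensed sketch, with server paths as placeholders:

# on the server:
git init --bare ~/site.git
cat > ~/site.git/hooks/post-receive <<'EOF'
#!/bin/sh
GIT_WORK_TREE=/var/www/site git checkout -f   # deploy the pushed commit
EOF
chmod +x ~/site.git/hooks/post-receive
# locally:
git remote add live ssh://user@server/~/site.git
git push live master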
You can try the packaged Magento that is automatically deployed with the help of the Jelastic PaaS: https://github.com/jelastic-jps/magento/tree/master/magento
You can get it pre-configured and installed with an NGINX or LiteSpeed server and MariaDB.
After customization, you can clone the whole environment to get similar replicas for dev, test, stage, and production. When all the needed changes are done on the cloned environment, you can just swap domains with the current production environment, making the updated version available.
Or you can set up automated update process from Git/SVN.
I'm in the early stages of my first Magento site. It's a big project, and my team and I have been discussing this very issue. We've seriously considered using a Git repository to maintain versioning across local, staging, and live servers. Here is a good article on the subject. It's obviously focused on WordPress, but I think the workflow would be almost identical.
And to answer your first question, I know of nothing automated.
We use SVN for very large scale projects. Almost any hosting service for your staging and production environments will be able to provide you with an SVN client to keep them in sync with your repository.
Never heard of any automated deployment tools for Magento.