Setting up a collaborative environment for web application development - magento

My office is growing and I've been tasked with building out the IT for our web development.
What's the best tool/setup for doing web development in a group setting? The requirements are a centralized code repository, a place to test development code, and a way to push tagged code out to a staging server. What I'm thinking is SVN/Redmine for the code repo; each user gets an account on a central development machine for SSH access (Eclipse over SSH) and their own virtual host on the dev server, which gives everyone a centralized development sandbox. Code is written and tested on this dev box, then checked back into SVN and later tagged and pushed out to the staging server. Yeah? Thoughts, comments or recommendations?
*Also, in a dev environment, what is the best way to handle databases? Is it wise to pull from the production database? And should each developer have his/her own DB or work off a master DB?
**We are building a Magento application and also have some custom back-office tools that run on CakePHP.

Although this subject is off-topic on StackOverflow and has been flagged as such, you need to concentrate on the following areas:
VERSION-CONTROL
Git has all the glory, and you don't need your own box for this: https://bitbucket.org/ offers unlimited private/public repos, so you can host your codebase there. http://github.com is also powerful and the de facto most popular version-control-oriented tool out there, although private repos come at a small price.
So your master branches live in your version control; your devs will check out from there and commit back to it as well.
Your deployment tools will then deploy to your live and staging environments from your master.
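As a minimal sketch of that day-to-day flow (the repository URL and branch names below are hypothetical):

    # each developer clones the central repo once
    git clone git@bitbucket.org:mycompany/myproject.git
    cd myproject

    # work on an isolated branch, not directly on master
    git checkout -b feature/price-rules
    # ...edit and test...
    git add -A
    git commit -m "Describe the change"

    # publish the branch so it can be reviewed and merged into master
    git push origin feature/price-rules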
ENVIRONMENTS
Usually three are used: LIVE, STAGE, DEV.
LIVE is, well, live, and only approved code gets deployed there.
STAGE is the pre-live environment and should be an exact replica of LIVE, so everything can be tested there by the merchant.
DEV is nice to have as an exact replica too, but it can just as well sit on a developer's local environment; it is meant for loose testing and experimenting.
DATABASES AND DEPLOYMENT
MySQL databases are a pain to sync, so you'd better have a script for it that syncs from LIVE to the others and prevents syncing from any other environment to LIVE. This limitation also means that all configuration and content must be entered on LIVE only, and only then synced down the line. Every change to the schema or to a permanent setting should be handled by update scripts (as we are talking Magento CE here; Magento EE has migration built in).
For deployment I also suggest you build a Fabric or Capistrano script that resets the dev and staging environments, handles the database reset and pull from the LIVE DB, and imports code from the central repository.
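A rough sketch of such a one-way sync script (hostnames, credentials and database names here are hypothetical; adapt before use):

    #!/bin/bash
    # one-way DB sync: LIVE is the only allowed source, never a target
    set -e
    DST=${1:?usage: dbsync.sh <stage|dev>}
    if [ "$DST" = "live" ]; then
        echo "refusing to sync INTO the live database" >&2
        exit 1
    fi

    # dump live, then rebuild the target database from the dump
    mysqldump -h db-live.example.com -u sync -p"$SYNC_PW" magento_live > /tmp/live.sql
    mysql -h "db-$DST.example.com" -u sync -p"$SYNC_PW" \
        -e "DROP DATABASE IF EXISTS magento_$DST; CREATE DATABASE magento_$DST;"
    mysql -h "db-$DST.example.com" -u sync -p"$SYNC_PW" "magento_$DST" < /tmp/live.sql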
It's also a good idea to target the following everyday tasks:
clients need to reset stage for their tests
project managers, developers or testers need to test, so spawning a test clone should be a one-click action (take the current DB and code and make it live in some subfolder for that specific test only), as should deleting the test; see the sketch after this list
3rd-party devs might need access to a specific test or dev environment (this is common with Magento, as on average there are at least 10 external extensions installed in every Magento store)
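A sketch of what that one-click clone might look like, assuming Magento CE 1.x (paths, names and URLs are hypothetical):

    #!/bin/bash
    # spawn a throwaway test clone of the current dev site into a subfolder
    set -e
    TEST="test_$(date +%s)"

    cp -a /var/www/dev "/var/www/tests/$TEST"                    # copy the codebase
    mysql -u dev -p"$DEV_PW" -e "CREATE DATABASE $TEST;"
    mysqldump -u dev -p"$DEV_PW" magento_dev | mysql -u dev -p"$DEV_PW" "$TEST"

    # point the clone at its own database and URL, then clear Magento's cache
    sed -i "s/magento_dev/$TEST/" "/var/www/tests/$TEST/app/etc/local.xml"
    mysql -u dev -p"$DEV_PW" "$TEST" -e "UPDATE core_config_data \
        SET value='http://dev.example.com/tests/$TEST/' \
        WHERE path IN ('web/unsecure/base_url','web/secure/base_url');"
    rm -rf "/var/www/tests/$TEST/var/cache/"*

Deleting the test is the reverse: remove the subfolder and drop the database.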

Related

Development versus Production in Parse.com

I want to understand how people are handling an update to a production app on the Parse.com platform. Here is the scenario that I am not sure about.
Create an app called myApp_DEV. The app contains a database as well as associated cloud code.
Once testing is complete and the app is ready for go-live, I will clone it into myApp_PRD (the production version). Cloning it will copy the database as well as the cloud code.
So far so good.
Now, 3 months down the line, I have added some functionality, which includes adding some cloud code functions as well as some new columns to the tables in the DB.
How do I update myApp_PRD with this new database structure? If I try to clone it from my DEV app, it tells me the app already exists.
If I clone a new app (say myApp_PRD2) from DEV, then all the data will be lost, since the customer is already live.
Any ideas on how to handle this scenario?
Cloud code supports deploying to production and development environments.
You'll first need to link your production app to your existing cloud code. This can be done on the command line:
    parse add production
When you're ready to release, it's a simple matter of:
    parse deploy production
See the Parse Documentation for all the details.
As for the schema changes, I guess we just have to manually add all the new columns.

TeamCity to deploy artifacts to external server

I've tried taking a look on Google for how this can be done but I thought I'd post a question anyway to see what the best practice is for doing this nowadays.
We are trying to set up a TeamCity build to deploy to a client's environment. Basically, we're generating an artifacts zip file, and the plan is to (somehow) deploy this to the client's UAT, Staging and Live servers (which are password protected). When the build runs, it executes a NAnt script.
From our network in the office we are able to remote into the UAT box, but we can only reach the Staging and Live servers while on the UAT box.
What is the best way of doing this? Are there any useful resources I can look at to help me move forward?
You can try the Deployer Plugin developed by the TeamCity team. It offers SMB/FTP/SSH deploy options as well as an SSH Exec option.
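Given that Staging and Live are only reachable from the UAT box, one approach (a sketch; hostnames and paths are hypothetical) is to let the build publish the artifact and then hop through UAT:

    # copy the build artifact to the UAT box
    scp artifacts.zip deploy@uat.example.com:/tmp/

    # hop through UAT to push it on to staging
    ssh deploy@uat.example.com \
        "scp /tmp/artifacts.zip deploy@staging.internal:/var/www/releases/"

    # newer OpenSSH can do the hop in one step via ProxyJump
    scp -o ProxyJump=deploy@uat.example.com \
        artifacts.zip deploy@staging.internal:/var/www/releases/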

configuring compatible development and production sites

I am developing a Magento site.
I have access to a local host and a remote host and would like to configure development and production environments. On the remote host I restore the database data that was backed up on the local host, but when I do so, I overwrite the base URL, and this causes the site to be redirected to a nonexistent URL when the page is loaded. How can I avoid this clash?
I want to be able to develop either (a) on http://remotehost/foobardev and back up my data to http://remotehost/foobar, or (b) develop on http://localhost/foobar and deploy on http://remotehost/foobar. I want to know how to transfer the database data back and forth without overwriting the values found in Magento Admin Panel -> System -> Configuration -> Web -> Unsecure Base URL / Secure Base URL when I run mysql and use the source command to reinstate the database entries from the development site on the production site.
So, I would like an easier way to restore the database contents without overwriting the base URL configured in the Magento admin panel, as overwriting it causes a redirect to a nonexistent or wrong place on each page load and thus renders the system unusable.
Not exactly an SO type of question. Magento EE has staging built in and can merge your data as well. You have to understand that syncing data from dev to live is not easily possible without a serious sync framework that keeps track of the state of every row and column, knows which data is new and which is old, and resolves syncing conflicts.
Here's your flow, based on the assumption that you are using CE and don't have data migration tools bundled:
set up the live database and accept that data will move only from live to dev, never from dev to live, as you don't have data migrations. Any config you need to create and preserve at the database level, do it on the live database (test it out in the dev environment first, then create it on live)
make a shell script, Fabric script, or whatever deployment script you are comfortable with, that exports a live DB dump, drops the dev database if it exists, creates a new one, imports the live dump into it, and runs a pre- or post-import SQL script that changes/deletes config values that are environment-dependent (like base_url, secure_base_url etc.); see the sketch after this list
to avoid double data entry, always create all attributes and config values that you need to preserve with Magento setup scripts
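A minimal sketch of that export/import script, assuming Magento CE 1.x (database names, credentials and the dev path are hypothetical):

    #!/bin/bash
    set -e
    # export live, then rebuild dev from the dump
    mysqldump -u live -p"$LIVE_PW" magento_live > /tmp/live.sql
    mysql -u dev -p"$DEV_PW" -e "DROP DATABASE IF EXISTS magento_dev; CREATE DATABASE magento_dev;"
    mysql -u dev -p"$DEV_PW" magento_dev < /tmp/live.sql

    # post-import fix-up: point the base URLs at dev so pages don't redirect to live
    mysql -u dev -p"$DEV_PW" magento_dev <<'SQL'
    UPDATE core_config_data
       SET value = 'http://remotehost/foobardev/'
     WHERE path IN ('web/unsecure/base_url', 'web/secure/base_url');
    SQL

    # Magento caches config, so clear it after touching core_config_data
    rm -rf /var/www/foobardev/var/cache/*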
The same goes for code; here's a common setup scenario with live, stage and development environments:
one master version-control repository (preferably bare, just to avoid anyone changing files there) based on a clean Magento versions tree
separate branches for each environment (live, stage, dev(n)) and a verified code flow from dev (where you develop and the codebase may be broken) to stage (where the release candidate resides, ready for testing, and does not change), and from stage to live (where your live code is in a stable state)
every developer works on a checkout from the dev branch, commits to their own dev branch, and then pushes changes to dev, where they can be evaluated to decide whether they are mature enough for staging
stage is where the release candidate lives and where the client (or automated tests) can test and diagnose whether it's ready to be released; no one ever changes code here, and code comes from the dev branch only
live is the live and running version, where no one ever changes any code directly. If the tests pass, code can come here from stage only.
So, to visualise it better, imagine your codebase residing in git:
myproject_magento_se (your project git repository on bitbucket.org, github, or wherever you can host it)
--> master (branch with all clean Magento versions, from your current one to the latest)
--> dev (while on master: git checkout -b dev, or branch from a specific version on master)
--> stage (while on dev: git checkout -b stage)
--> live (while on stage: git checkout -b live)
and imagine your hosts setup like this:
www.mylivesite.com = git clone yourgitrepo; git checkout live;
stage.mylivesite.com = git clone yourgitrepo; git checkout stage;
dev.mylivesite.com = git clone yourgitrepo; git checkout dev;
For all this, you'd better have deployment scripts that do the switching and the code and database lifting between environments at the push of a button.
Here are a few common actions that you need to perform daily with every software project:
move/reset data from live to stage and from live to dev (with obfuscation calls if needed, to scramble or change client-related data; see the sketch after this list)
move code from dev to stage
move code from stage to live
reset/create any dev with live state (data and code)
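The obfuscation call can be as small as this sketch (Magento CE 1.x table names; adapt it to whatever counts as client data in your store):

    # scramble customer emails after a live -> stage/dev import
    mysql -u dev -p"$DEV_PW" magento_dev <<'SQL'
    UPDATE customer_entity
       SET email = CONCAT('customer', entity_id, '@example.com');
    SQL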
Have fun :) and go through this thread as well: https://superuser.com/questions/90301/sync-two-mysql-databases, plus anything else you can find by searching SO on similar matters.

Incrementally updating a remote Joomla web site?

I have a Joomla 1.5 site on my localhost. It's hosted on a public hosting server as well.
I was wondering what the best way is to do incremental updates to the site. I mean I don't want to update the whole site if I've just changed one source file (HTML, PHP, images, etc.) or made changes to the database. I understand that, to be safe, I'd have to update the database every time (export from local and import on remote), but I'm sure we can avoid unnecessary uploads of unchanged files.
I've seen https://www.akeebabackup.com and it doesn't offer what I need. One option is to use an FTP client (like FileZilla) that does folder synchronization, but I'm not sure how well they work.
For the database you could use master-master replication, which is quite easy to set up, but you need GRANT privileges in MySQL, which most likely won't be possible on shared hosting. I'd also suggest connecting both machines via VPN to make it more secure.
The other easy way to sync databases is the "Synchronisation" tool, if you're using phpMyAdmin.
If not, look at MySQL management software like MySQL Workbench, which also has this feature built in.
You didn't say what privileges you have on the public hosting server.
If you're an admin, you can have SVN installed and configured to sync files with your local data.
You can also have a Git repository to do exactly the same, or an LDAP set-up via VPN to keep your files in sync.
If you're not an admin, ask your hosting company which of the above is available; I'm sure they'll be able to help you. Nowadays hosting companies commonly have SVN or Git installed, which should be what you need.
I often use the SVN tools built into PHP Designer 8, but you can have SVN, Git and many more in NetBeans as well.
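If the host gives you SSH access, rsync solves exactly this "upload only what changed" problem; a minimal sketch with hypothetical host and paths:

    # push only the files that changed from the local Joomla root to the host
    rsync -avz --delete \
        --exclude 'configuration.php' --exclude 'cache/' \
        /var/www/joomla/ user@example.com:/home/user/public_html/

    # for the database, a plain dump piped over SSH is the blunt but simple route
    # (assumes credentials in ~/.my.cnf on both ends)
    mysqldump joomla | ssh user@example.com "mysql joomla"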

Magento upgrading process and infrastructure for smallest possible downtime

I have a client who currently has one server with Magento, and his admin takes the whole site down for updates for multiple hours. I would like to make it an instant process, so I want to propose a new solution for how he should set it up:
Magento Production Server 1 (WEB+DB)
Magento Production Server 2 (WEB+DB)
Magento Dev Server 1
The DB would have to be synced somehow between those two servers (cluster? replication?). For the smallest downtime possible, I was thinking the updates should first be tested on the Dev Server (DB/web synced from the production servers just before upgrading). After checking that everything works fine, and knowing what the process looks like, I would disable load balancing (or round-robin DNS) down to only Server 1, do the upgrades/updates on Server 2, switch to Server 2 as the production server, and then update Server 1. When both are done, switch load balancing/round robin back on.
I come from a Windows environment, so this is how I would do it on Windows (maybe with separate database and web servers too), and with tools like Red Gate SQL Compare/SQL Data Compare etc. it should work.
But I don't know Magento at all, so please let me know what's possible, and maybe how this should be done if the client doesn't want to end up with his shop being down...
You'll definitely need a production server, and some sort of staging/version management system.
I recommend checking out Subversion or Git for version management.
Changes can be committed to a repository first, and then updated to the live site with no downtime. This would be more than sufficient for a development environment.
For bigger changes, like a Magento version upgrade, you might still want/need to take the site down for a few hours in the middle of the night, as this is a much bigger process.
As for multiple servers: as an example, I run a load balancer that balances between a primary and a secondary server, with a separate database server. Changes are made on a development server, committed to the primary server with Subversion, and then any differences between the primary and secondary servers are rsynced to the secondary every 60 seconds.
For this solution, session and cache data are stored in the database.
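That 60-second sync can be an ordinary cron job on the primary; a sketch with hypothetical hosts and paths:

    # /etc/cron.d/sync-secondary -- runs every minute on the primary server
    # (assumes passwordless SSH keys between the two servers)
    * * * * * www-data rsync -a --delete /var/www/magento/ secondary.internal:/var/www/magento/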
IMHO, with a good hosting environment, you won't need multiple servers unless you are literally in the thousands of simultaneous visitors. Plugins are the usual cause of admin-related problems.
We've had great success with "cloud" environments. Instantiate a new cloud instance, get its IP, then in your hosts file point something like dev.yourdomain.com to it for testing. The only real downtime is that you should freeze the production site while the database converts to the new version, which can take a couple of hours. Our MySQL DB backup is 3 GB or so, but thankfully it tgz's down to 280 MB.
We're using nginx and php-fpm and they are obscenely fast.
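The hosts-file trick is a single line on your workstation (the IP below is a placeholder):

    # /etc/hosts (or C:\Windows\System32\drivers\etc\hosts on Windows)
    203.0.113.10    dev.yourdomain.com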
Typical migration path for me:
back up the production site
start a new cloud instance and copy the production site to the dev site (restore the production database)
try upgrading the dev site one step at a time to see what breaks
start a new cloud instance and do a completely fresh install of the newest Magento version
once it's working, restore the production database and watch as it grinds on converting it; see what breaks
pick between upgrade versus fresh install
back up the production MySQL, put the production site in maintenance mode while the dev site converts the database
point the domain to the new IP address
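For the maintenance-mode step, Magento 1.x checks for a flag file in the web root; a sketch of the cutover (paths hypothetical):

    # freeze the production store while the new instance converts the DB
    touch /var/www/magento/maintenance.flag
    mysqldump magento_prod | gzip > /root/magento_prod.sql.gz   # final backup

    # ...once the new instance has converted the DB and checks out,
    # point DNS (or the load balancer) at the new IP, then unfreeze there:
    rm /var/www/magento/maintenance.flag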
