How can I keep a backup of my web application being developed live on a web server? - codeigniter

I am using Coda to create a web application with CodeIgniter. I am hosting it live on HostGator and testing it there as well. I want to know if there is a way to use some kind of revision control or backup system like GitHub that would let me save my files and keep them updated without having to copy folders back and forth manually.

You can set up a remote repo on HostGator and push changes to it using Git. That doesn't require GitHub; you can do it from a local repo on your machine.
Here is a tutorial.
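For example, assuming the HostGator account has SSH access enabled (the host, user, and paths below are placeholders), a minimal setup might look like this:
# On the server: create a bare repository to receive pushes
ssh user@yourdomain.com "git init --bare ~/myapp.git"
# On your machine: add the server as a remote and push to it
git remote add live ssh://user@yourdomain.com/~/myapp.git
git push live master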

Related

Transferring an app with an attached database to another Heroku account

I made an application in Heroku with a PostgreSQL addon and I want to transfer them to another account.
I didn't find any "transfer to another account" option for the PostgreSQL addon. Does it move automatically with the app? Should I create another database in the destination account and link it to the app after transferring it?
Will the repository location be affected? Do the collaborators have to re-clone the repository?
How long will the transfer process take? While it is transferring, can we still view the app / push code to it?
When you transfer a Heroku app its add-ons should come with it. Transfers take place very quickly once the receiver has accepted the transfer, and the transfer process shouldn't affect the running app.
The Heroku Git URL will change, but you shouldn't be using this as your main code repository:
Heroku provides the git service primarily for deployment, and the ability to clone from it is offered as a convenience. We strongly recommend you store your code in another git repository such as GitHub and treat that as canonical.
Nobody should be cloning from that URL. Either way, you'll have to update your Heroku remote's URL, e.g. using git remote set-url.
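For example (the app name below is a placeholder), updating the remote after the transfer might look like this:
# Point the existing "heroku" remote at the transferred app's Git URL
git remote set-url heroku https://git.heroku.com/new-owner-app.git
# Verify the remotes now point where you expect
git remote -v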

How to deploy my website online

I have a website built using the CodeIgniter framework. I have already uploaded it to a web server, a TurnKey LAMP appliance on Linux, using Git Bash. I can already access it since it is on the local network. What I want now is to make it visible on the internet. How can I do that?

Dev->Stage->Prod with Git deployment for Azure Websites

How best should I accomplish the following deployment objectives with Git deployment for Azure?
Easily switch, when working locally, between fake in-memory data and (eventually) a non-production snapshot of real data
Deploy to a staging environment on Azure where at first I could use fake in-memory data and eventually move to a non-production snapshot of real data
Deploy to production with real data
I currently deploy from GitHub, using a staging branch, to a staging Azure website. Since the repo is public, the web.config file is ignored by Git. (EDIT: I just learned that ignoring web.config actually causes a deployment error on Azure.)
Any help/suggestion is appreciated.
It's actually supposed to be simpler than that. Please see this page. Basically, the idea is that you set some AppSettings in the Azure portal to override the default values that are committed to your repo.
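As a hedged alternative to the portal (the resource group, site name, and the UseFakeData key below are placeholders, not names from the question), the same override can be set from the Azure CLI:
# An App Setting with the same key overrides the value committed in web.config
az webapp config appsettings set --resource-group my-rg --name my-staging-site --settings UseFakeData=false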
Well... Here's what I did that works for me right now.
To quickly switch between fake in-memory data locally, I use a compilation symbol LOCAL and a preprocessor directive #if LOCAL.
The same compilation symbol works when you deploy to Azure, so I can work with fake data until I'm ready to switch to the real database. I can also use app settings if I want to make that switch even easier.
The challenge was keeping a web.config with "secrets" (like the connection string) locally without exposing it on GitHub. I added it to .gitignore, but then my deployments started failing on Azure because it could not find the web.config. Just copying it to wwwroot via FTP did not help - Azure was looking for web.config in the repository.
So, to make this work I "slightly" altered the deployment process by first copying the Web.config from wwwroot to the repository before running the default deploy.cmd. This was simple - this is what you do:
Create a .deployment file in the root of your repository with the following:
[config]
command = deploy.my.cmd
Create deploy.my.cmd with the following script:
:: Copy the Web.config previously uploaded to wwwroot into the repository folder Azure deploys from
xcopy %DEPLOYMENT_TARGET%\Web.config %DEPLOYMENT_SOURCE%\ /Y
:: Then hand off to the standard Azure deployment script
call deploy.cmd
Now, I have web.config with secrets locally. Git ignores this file. I uploaded the correct web.config to Azure via FTP, and it gets used whenever I deploy.
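For reference, the two supporting pieces - the ignore rule and the one-time FTP upload - can be sketched as follows; the FTP host and deployment credentials are placeholders you would take from the site's publish profile:
# Keep the file with secrets out of Git
echo "Web.config" >> .gitignore
# Upload the real Web.config to the site's wwwroot once, over FTP
curl -T Web.config "ftp://waws-prod-xx.ftp.azurewebsites.windows.net/site/wwwroot/" --user 'mysite\$mysite:publish-password'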

Incrementally updating a remote joomla web site?

I have a Joomla 1.5 site on my localhost. It's hosted on a public hosting server as well.
I was wondering what the best way is to do incremental updates to the site. I mean I don't want to update the whole site if I just changed one source file (HTML, PHP, images, etc.) or made changes to the database. I understand that, to be safe, I'd have to update the database every time (export from local and import on the remote), but I'm sure we can avoid unnecessary uploads of unchanged files.
I've seen https://www.akeebabackup.com and it doesn't offer what I need. One option is to use an FTP client (like FileZilla) that does folder synchronization, but I'm not sure how well that works.
For the database you could use master-master replication, which is quite easy to set up, but you need GRANT privileges in MySQL, which most likely won't be possible on shared hosting. I'd also suggest connecting both machines via VPN to make it more secure.
The other easy way to sync databases is the "Synchronisation" tool, if you're using phpMyAdmin.
If not, look at a MySQL design and administration tool like MySQL Workbench, which also has this feature built in.
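If neither of those is available, a plain mysqldump export/import covers the "export from local and import on the remote" step mentioned in the question. A minimal sketch, where hostnames, users, and database names are placeholders:
# Export the local Joomla database
mysqldump -u local_user -p joomla_db > joomla_db.sql
# Import it into the remote database (run this on the host, or through an SSH tunnel)
mysql -h remote-db.example.com -u remote_user -p joomla_db < joomla_db.sql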
You didn't say what privileges you have on the public hosting server.
If you're an admin, you can have an SVN server installed and configured to sync files with your local data.
You can also use a Git repository to do exactly the same, or set up LDAP via VPN to keep your files in sync.
If you're not an admin, check with your hosting company which of the above are available; I'm sure they'll be able to help you. Nowadays hosting companies often have SVN or Git installed, which should be what you need.
I often use the SVN tools built into PHP Designer 8, but you can have SVN, Git, and many more in NetBeans as well.

Setting up a collaborative environment for web application development

My office is growing and I've been tasked with building out the IT for our web development.
What's the best tool/setup for doing web development in a group setting? The requirements are a centralized code repository, a place to test development code, and a way to push tagged code out to a staging server. What I'm thinking is SVN/Redmine for the code repo; each user gets an account on a central development machine for SSH access (Eclipse over SSH) and their own virtual host on the dev server, which gives everyone a centralized development sandbox. Code is written and tested on this dev box, then checked back into SVN and later tagged and pushed out to the staging server. Thoughts, comments, or recommendations?
*Also, in a dev environment, what is the best way to handle databases? Is it wise to pull from the production database? And should each developer have his/her own DB or work off a master DB?
**We are building a Magento application and also have some custom back-office tools that run on CakePHP.
Although this subject is off-topic on Stack Overflow and has been flagged as such, you need to concentrate on the following areas:
VERSION-CONTROL
Git has all the glory, and you don't need your own box for this, as https://bitbucket.org/ offers unlimited private/public repos and you can keep your codebase there. http://github.com is also powerful and the de facto most popular version-control host out there, although private repos come at a small price.
So your master branches live in version control, and your devs will check out from there and commit back to it as well.
Your deployment tools will deploy to your live and staging environments from master.
ENVIRONMENTS
Usually three are used: LIVE, STAGE, and DEV.
LIVE is, well, live, and only approved code gets deployed there.
STAGE is the pre-live environment and should be an exact replica of LIVE, so everything can be tested there by the merchant.
DEV is nice to have as an exact replica too, but it can just as well be the developer's local environment; it's meant for loose testing and experimenting.
DATABASES AND DEPLOYMENT
MySQL databases are a pain in the ass to sync, so you'd better have a script for it that syncs from LIVE to the others and prevents syncing from other environments to LIVE. This limitation also means that all configuration and content should be added on LIVE only and then synced down the line. Every change to the schema or a permanent setting should be handled by update scripts (as we are talking Magento CE; Magento EE has migrations built in).
For deployment I also suggest you build a Fabric or Capistrano script that resets the dev and staging environments, handles the database reset and pull from the LIVE DB, and imports code from the central repository.
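As a rough outline of what such a script automates (this is a plain shell sketch rather than actual Fabric or Capistrano code, and every host, path, and database name is a placeholder):
# 1. Reset the staging code to the latest master from the central repository
cd /var/www/stage && git fetch origin && git reset --hard origin/master
# 2. Pull a fresh copy of the LIVE database into staging (one way only: LIVE -> STAGE)
mysqldump -h live-db.example.com -u live_ro -p live_db > /tmp/live_db.sql
mysql -h stage-db.example.com -u stage_user -p stage_db < /tmp/live_db.sql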
It's also a good idea to target the following everyday tasks:
clients need to reset the stage for their tests
project managers, developers, or testers need to test, so spawning a test clone should be a one-click action (take the current DB and code and make it live in some subfolder for that specific test only), as should deleting the test
3rd-party devs might need access to a specific test or dev environment (this is common with Magento, as on average there are at least 10 external extensions installed in every Magento store)
