Our web developer picked OctoberCMS to build our new website (it's his area of expertise). Unfortunately, before completing it he had to leave us suddenly for health reasons and is no longer available. His Ubuntu environment has some problems, and we need the site on CentOS 7 anyway. The rest of us are OctoberCMS newbies, but we want to learn it.
We built a CentOS 7 VM and installed OctoberCMS and want to move his work over.
We cannot find any instructions on how to "export" the work he has done so far and import it into our new OctoberCMS install.
He was using 10 plugins, 3 of which he developed himself. (I don't know if that is relevant.)
Is there an easy way to do this or at least instructions?
We have been googling, YouTubing, and IRC'ing for a week and are still at a loss.
Any help would be most appreciated.
There really isn't anything special you need to know about moving an OctoberCMS install to a new server compared to moving over any other PHP application.
I am assuming you know how to do the basics of setting up a LAMP stack, such as setting up a virtual host for the domain you want to host the site on and setting up a MySQL database and user/password to access the database. There are of course many variants on how you could accomplish this such as using a management tool like Plesk or cPanel, or just configuring the services manually via the command line.
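As a point of reference, here is a minimal sketch of the database part of that setup on the new server; the database name, user, and password below are placeholders, and ideally you'd reuse whatever the old server had:

```
# Create the database and user on the new server (all names/passwords are placeholders).
# mysql prompts for the root password on the terminal even with the heredoc on stdin.
mysql -u root -p <<'SQL'
CREATE DATABASE october_db CHARACTER SET utf8 COLLATE utf8_general_ci;
CREATE USER 'october_user'@'localhost' IDENTIFIED BY 'change-this-password';
GRANT ALL PRIVILEGES ON october_db.* TO 'october_user'@'localhost';
FLUSH PRIVILEGES;
SQL
```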
1) Ensure your new server is running at least roughly the same version of Apache, MySQL, and PHP.
2) Copy over the directory that contains all of the web files from the old server into the document root for your domain on the new server.
3) Do a database dump on the old server and copy it to the new server. If possible, use the same database name, username, and password as on the old server; that way you don't have to worry about updating the website's configuration. (See the sketch after this list for the commands involved.)
4) Pull up the site and troubleshoot any errors that come up. It is helpful if OctoberCMS debug mode is on.
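To make steps 2 and 3 (and the debug hint in step 4) concrete, here is a rough sketch of the commands involved, assuming SSH access between the two servers; all paths, host names, and credentials are placeholders:

```
# Step 2 - on the new server: pull the web files over from the old server
# (paths and host names are placeholders).
rsync -avz olduser@old-server:/var/www/october/ /var/www/my-site/

# Step 3 - on the old server: dump the database and copy it across...
mysqldump -u october_user -p october_db > october_db.sql
scp october_db.sql newuser@new-server:/tmp/

# ...then on the new server: import it into the (already created) database.
mysql -u october_user -p october_db < /tmp/october_db.sql

# Step 4 - debug mode can be turned on by setting 'debug' => true in config/app.php
# (turn it back off once the site is working).
```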
Following the above method will ensure that you have the exact same setup on your new server that the old server had. This will copy over all of the plugins, data, etc.
There are of course many complexities that can come up during a switch over like this, but this should at least get you started and you can come back to StackOverflow with some more specific hurdles.
Hope that helps.
I know that https://forge.laravel.com/auth/register is available for $12/month*, but I'd like to understand how to accomplish the same thing myself.
What I assume is possible (and what I'm looking for): I create a server that has only Ubuntu 18.04.3 installed and nothing else, and I upload a script that installs all the appropriate software and sets up MySQL with the correct passwords, etc (without manual intervention).
I've tried Laradock and had tons of problems with Docker and don't want to do that anymore.
I see that https://cloud.digitalocean.com/droplets/new lets me create a LEMP droplet (Ubuntu, Nginx, MySQL, PHP-FPM) with one click. But it lacks Redis, and its versions are outdated (e.g. PHP 7.2).
I've heard people mention Chef (maybe this?), but that seems to be more complicated than what I'm imagining.
Unfortunately I'm not even sure how to search for what I'm trying to do (or how to tag this question); is this called "server provisioning"? I've been searching phrases like "automatic install script redis mysql server for laravel".
Thanks in advance for pointing me in the right direction.
* I also just found https://getcleaver.com/ and https://runcloud.io/server-management, which each look like Forge + Envoyer (and RunCloud offers a free plan).
It is called server provisioning, and Chef would be a good fit for it; check out Ansible too. Another thing you could do is set up the server yourself, create an image from that server, and then base your new servers on that image. That way you'll have all your services installed from the start.
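If you'd rather start with a plain shell script before learning Chef or Ansible, a very simplified provisioning sketch for a fresh Ubuntu 18.04 box might look like the following; package names are the standard Ubuntu ones, and the database name, user, and password are placeholders. Note that 18.04's default PHP is 7.2, so a newer PHP would need a third-party repository such as the ondrej/php PPA.

```
#!/usr/bin/env bash
# Minimal provisioning sketch for a fresh Ubuntu 18.04 server (run as root).
set -euo pipefail

apt-get update
apt-get install -y nginx mysql-server redis-server \
    php7.2-fpm php7.2-mysql php7.2-mbstring php7.2-xml php7.2-zip \
    git unzip curl

# Non-interactive database setup (placeholder names and password).
# On Ubuntu 18.04 the MySQL root account uses auth_socket, so root can connect without a password.
mysql <<'SQL'
CREATE DATABASE laravel_app;
CREATE USER 'laravel'@'localhost' IDENTIFIED BY 'change-this-password';
GRANT ALL PRIVILEGES ON laravel_app.* TO 'laravel'@'localhost';
FLUSH PRIVILEGES;
SQL

# Make sure everything starts now and on boot.
systemctl enable --now nginx php7.2-fpm redis-server mysql
```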
This sounds like a job for something like Puppet (or Chef/Ansible); however, Laravel Envoy may be another tool to look at, if you haven't already, for the second part of your problem.
I highly recommend Heroku (or similar service), as this is all done out of the box, and has a ton of other great features that make developing a pipeline a breeze.
I want to set up Nextcloud on my personal VPS. For the first-time setup, I have to access the web server via my browser, and the documentation (right at the beginning of the Nextcloud Installation Wizard) says I should do it over http://localhost/nextcloud/, but that does not work for me because the VPS is not my local machine. So I would have to open the setup page up to the public web, and anybody who knew the IP of my VPS could perform the first-time setup.
I have read tutorials for other web applications (for example the Confluence Installation Documentation, point 4.2) where this is the common way of setting things up the first time.
Is there another, more secure way to do this in general when setting up a web app for the first time? A firewall? A VPN? How do you guys do it?
Thank you for your help
Yes, this is the common way to set it up. In the unlikely case that somebody else runs the installer in the short time between you placing the files and running the setup yourself, you can also remove config/config.php and do the setup again.
If you don't want to do the web-based setup, you can use the CLI tool to run the installation. It can either ask for everything interactively or take all the parameters as CLI options.
See https://docs.nextcloud.com/server/12/admin_manual/installation/command_line_installation.html for more details on the CLI installation method.
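For reference, the non-interactive variant from that documentation looks roughly like this, run from the Nextcloud directory as the web server user (www-data on Debian/Ubuntu; often apache or nginx elsewhere). All names and passwords below are placeholders:

```
# Run from the Nextcloud directory as the web server user.
cd /var/www/nextcloud
sudo -u www-data php occ maintenance:install \
    --database "mysql" \
    --database-name "nextcloud" \
    --database-user "nextcloud" \
    --database-pass "change-this-password" \
    --admin-user "admin" \
    --admin-pass "change-this-too"
```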
I have a client who currently has one server running Magento, and his admin takes the whole site down for updates for multiple hours. I would like to make it a near-instant process, so I wanted to propose a new solution for how it should be set up:
Magento Production Server 1 (WEB+DB)
Magento Production Server 2 (WEB+DB)
Magento Dev Server 1
The DB would have to be synced somehow between those two servers (cluster? replication?). To keep downtime as short as possible, I was thinking the updates should first be tested on the dev server (DB/web synced from a production server just before upgrading). After checking that everything works fine and knowing what the process looks like, I would disable load balancing (or round-robin DNS) so that only Server 1 serves traffic, do the upgrades/updates on Server 2, switch to Server 2 as the production server, and then update Server 1. When both are done, I would switch load balancing / round robin back on.
I come from a Windows environment, so this is how I would do it on Windows (maybe with separate database and web servers too), and with tools like Red Gate SQL Compare / SQL Data Compare etc. it should work.
But I don't know Magento at all, so please let me know what's possible and maybe how this should be done if the client doesn't want to end up with his shop being down...
You'll definitely need a production server, and some sort of staging/version management system.
I recommend checking out Subversion or Git for version management.
Changes can be committed to a repository first, and then updated to the live site with no downtime. This would be more than sufficient for a development environment.
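As a deliberately simplified example of that workflow with Git, assuming the live document root is a clone of your repository (paths and file names are made up for illustration):

```
# On the development machine: commit and push a change.
git add app/design/frontend/default/mytheme/template/catalog/product/view.phtml
git commit -m "Tweak product page template"
git push origin master

# On the live server: pull the change into the document root (path is a placeholder).
cd /var/www/magento
git pull origin master

# For Magento, you may also need to flush the cache afterwards
# (via the admin panel, or by clearing var/cache/).
```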
For bigger changes, like a Magento version upgrade, you might still want/need to take the site down for a few hours in the middle of the night, as this is a much bigger process.
As for multiple servers, as an example I run a load balancer which balances between a primary and a secondary server. There is one database server that is separate. Changes are made to a development server, committed to the primary server with Subversion, and then any changes between the primary and secondary servers are rsynced to the secondary server every 60 seconds.
For this solution, session and cache data are stored in the database.
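A rough sketch of that 60-second sync, assuming SSH key access from the primary to the secondary server; the host name, paths, and excludes are placeholders, and since sessions and cache live in the database they don't need to be mirrored:

```
#!/usr/bin/env bash
# sync-docroot.sh - called from cron on the primary server every minute, e.g.:
#   * * * * * root /usr/local/bin/sync-docroot.sh
# Mirrors the document root to the secondary server (host and paths are placeholders).
rsync -a --delete \
    --exclude 'var/cache/' \
    --exclude 'var/session/' \
    /var/www/magento/ secondary-server:/var/www/magento/
```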
IMHO, with a good hosting environment, you won't need multiple servers unless you're literally into the thousands of simultaneous visitors. Plugins are the usual cause of admin-related problems.
We've had great success with "cloud" environments. Instantiate a new cloud instance, get its IP, then in your "hosts" file point something like dev.yourdomain.com to it for testing. The only real downtime is that you should freeze the production site while the database converts to the new version, which can take a couple of hours. Our MySQL DB backup is 3 GB or so, but thankfully it tgz's down to 280 MB.
We're using nginx and php-fpm and they are obscenely fast.
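For anyone unfamiliar with the hosts-file trick or with compressing the dump mentioned above, a quick sketch; the IP address, domain, database name, and user are placeholders:

```
# On your workstation: point dev.yourdomain.com at the new cloud instance for testing.
# (On Windows the file is C:\Windows\System32\drivers\etc\hosts instead of /etc/hosts.)
echo "203.0.113.10  dev.yourdomain.com" | sudo tee -a /etc/hosts

# On the production server: dump and compress the database in one go.
mysqldump -u magento_user -p magento_db | gzip > magento_db.sql.gz
```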
Typical migration path for me:
back up the production site
start a new cloud instance and copy the production site to it as a dev site (restore the production database there)
try upgrading the dev site one step at a time to see what breaks
start another new cloud instance and do a completely fresh install of the newest Magento version
once working, restore the production database and watch as it grinds through converting it; see what breaks
pick between upgrade versus fresh install
back up the production MySQL and put the production site in maintenance mode while the dev site converts the database (see the sketch after this list)
point the domain to the new IP address
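A sketch of that backup / maintenance-mode step for a Magento 1.x install; Magento 1 switches to maintenance mode when a maintenance.flag file exists in its root, and the paths, host, and database names below are placeholders:

```
# On the production server: enable maintenance mode, then take the final dump.
cd /var/www/magento
touch maintenance.flag
mysqldump -u magento_user -p magento_db | gzip > final_magento_db.sql.gz

# Copy the dump to the new instance...
scp final_magento_db.sql.gz new-instance:/tmp/

# ...then, on the new instance, restore it and let the upgrade convert it.
gunzip -c /tmp/final_magento_db.sql.gz | mysql -u magento_user -p magento_db
```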
I have ColdFusion Builder 2.0.0 installed and I am trying out the much-vaunted step debugging. However, I cannot seem to get it to work, as I don't have my site / JRun install set up in the naive way the examples show.
I am using version 9,0,1,274733 of ColdFusion and my configuration is as follows:-
Installed as multi-server version with Jrun here:- c:\Apps\JRun4
application files are here:- d:\websites\my.website.com
web root is here d:\websites\my.website.com\www
core library of CFCs is here d:\websites\frameworks\core which is mapped in CF as core
I have read this: http://help.adobe.com/en_US/ColdFusionBuilder/Using/WS0ef8c004658c1089-31c11ef1121cdfd6aa0-7fff.html and this: http://forta.com/blog/index.cfm/2007/5/30/CF8-Debugger-Getting-Started and watched this: https://experts.adobeconnect.com/_a204547676/p33029638/?launcher=false&fcsContent=true&pbMode=normal but I get stuck at the point after you have configured RDS and are setting up the server for your project.
Now, I am pretty sure the above is correct, but when I move to the next page in the wizard I get the following:-
As I understand it, my Server Home should be c:\Apps\JRun4 and my Document Root should be d:\websites\my.website.com
This all looks like it is going to be fine until you actually try to debug, at which point I get
followed by
I can confirm that the server is running and RDS is enabled as in the RDS Dataview I can see all my databases.
Any help would be gratefully received as this is very frustrating and the documentation is very lacking.
There is a video tutorial as well that you may want to check and see if that helps. http://blogs.adobe.com/anand/2011/01/learn-how-to-debug-coldfusion-applications-using-coldfusion-builder-2.html
You need to specify the RDS username/password and the "application server name". If you are using the base instance that was installed when you set up the multi-server install of CF, that is "cfusion"; otherwise it's the name of the instance you are using.
The RDS username is most likely "admin" unless you set up custom users for RDS. The password is the RDS password you specified when you installed CF.
I tried to deploy my project with capifony, because I found an answer here saying that deployment with capifony is easy. Well, I don't think it is, so my question is:
How can I deploy my project via FTP? I put all my files on the server, but even if I browse to web/app.php, the only thing I get is an empty page, whatever route I put in the URL. So could someone please explain how I can get this to work? Thank you!
A couple of things to think of when deploying a Symfony2 project to a new server or computer (as far as I've encountered) might be:
Make sure that the server and its PHP installation meet the Symfony2 requirements (and perhaps also the recommendations)
Check that you've somewhat followed the installation instructions (found here)
Try to clear the cache
Make sure that the web server and its PHP process have write permissions to the cache folder (see the sketch after this list)
If none of these helps, try to temporarily modify app_dev.php to allow access from your current (client) IP (instead of restricting it to localhost). Then, hopefully, you'll get a more useful and detailed error message instead of the blank page (which is often caused by some fatal error that has occurred during the initialization of the framework and its kernel)
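A rough sketch of the cache and permission steps on a *nix server, assuming a Symfony2 Standard Edition layout and a web server running as www-data (both assumptions; adjust to your setup):

```
# From the project root: clear and warm up the production cache.
php app/console cache:clear --env=prod --no-debug

# Give the web server user write access to the cache and logs directories.
chown -R www-data:www-data app/cache app/logs
chmod -R u+rwX app/cache app/logs
```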
Update: I noticed now that you've tagged your question with 'windows', but you don't mention which server you're trying to deploy to. I wrote the above with some *nix-based server in mind, but hopefully some of it is applicable to Windows servers too (though there might be other common sources of error running under Windows that I'm not familiar with).