I have installed a Joomla site with Cloud Foundry (CF) on Bluemix.
As you know, Joomla, like other CMSes, lets you install components to add functionality.
Installing a component uploads the PHP code it needs and adds tables/entries to the database.
My issue is that when I run cf push, the new component's code is removed from the Joomla folders on Bluemix, while the database still contains the component's tables/entries.
I guess this is the situation for all CMSes (Drupal, WordPress, Joomla, vBulletin, etc.).
How could I do a kind of "cf pull" (?) to keep the modified CMS code, including the new component, locally on my computer?
That way, when I redo the cf push, the installed component will not be erased.
Thank you in advance for your support,
Best regards
Yves
There is no cf pull command in Cloud Foundry. The closest you have is the cf files app-name command, which lets you navigate the directory structure of your deployed application and fetch specific files as needed, but this becomes really tedious if you have many files to copy to your local computer.
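As a minimal sketch of that workflow (the app name and paths are placeholders; on newer Diego-based Cloud Foundry deployments, cf ssh replaces cf files):

    # List the app's root directory, then drill down into the Joomla tree
    cf files my-joomla-app
    cf files my-joomla-app app/htdocs/administrator/components

    # Fetch a single file by redirecting its contents to a local copy
    cf files my-joomla-app app/htdocs/configuration.php > configuration.php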
It looks like Joomla is a better fit for the IBM Containers service in Bluemix. With IBM Containers you can run the Docker image for Joomla (https://hub.docker.com/_/joomla/) and use persistent volumes to keep your added functionality. You can also use any Bluemix service (like a database) with IBM Containers.
The article below provides more details and step-by-step instructions for creating an IBM Container for WordPress; you can easily adapt it for Joomla:
http://blog.ibmjstart.net/2015/05/22/wordpress-on-bluemix-containers/
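As a rough sketch of that approach with the official image (the container names, volume names, and passwords below are placeholders; on Bluemix you would use the equivalent IBM Containers cf ic commands):

    # Named volumes so the installed components and the database survive restarts
    docker volume create joomla_html
    docker volume create joomla_db

    # Database container (the official Joomla image documents linking to MySQL)
    docker run -d --name joomla-db \
        -e MYSQL_ROOT_PASSWORD=example \
        -v joomla_db:/var/lib/mysql \
        mysql:5.7

    # Joomla container with its document root kept on a named volume
    docker run -d --name joomla \
        --link joomla-db:mysql \
        -e JOOMLA_DB_PASSWORD=example \
        -p 8080:80 \
        -v joomla_html:/var/www/html \
        joomla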
When you push an application to a runtime (PHP, Java, or whatever), it restages all of the application sources, including whatever was configured and modified earlier through the CMS interface, while leaving the databases untouched. This applies to Joomla, but also to Drupal, WordPress, or any other CMS. So to achieve what you want, you have three options:
- push exactly the filesystem structure you need on Bluemix, including the configuration files and modules you want to use (see the sketch after this list)
- use a container instead of a runtime (as suggested above): even with a container you have to install your CMS on an external Docker volume, otherwise the CMS will be reset every time you restart the container
- use a Bluemix VM
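For the first option, a minimal sketch (the app name, local path, and buildpack are placeholders): keep the complete, already-configured Joomla tree, every installed component included, in the directory you push, so a restage cannot lose it. After installing a component through the admin UI, copy its files back into that local tree first (e.g. with cf files, as shown earlier), then:

    # Push the full docroot, components and configuration included
    cf push my-joomla-app -p ./joomla -b php_buildpack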
What is the recommended way to deploy changes (for example a change in some Content Type model) from development to production without downtime?
I'm using this setup:
I have a development instance with a development Postgres database.
In production I have 3 Strapi instances (serving both the API and the admin, using the same production Postgres database), and those instances sit behind a load balancer.
Let's say I have a Content Type named Article (both in development and deployed to production).
Let's assume I want to change that content type, for example add some fields and remove others in the Article content type.
How do I deploy the changes to production without downtime?
I've done some tests: when I update Strapi Production Instance #1 to pull the new code for the updated models, Strapi of course updates the database. From that moment, Strapi Production Instances #2 and #3 have problems serving the admin panel (JavaScript errors, because the database was changed but their JS model files were not updated).
After I update the code on instances #2 and #3, everything works as expected.
But doing something like this on a working product will be visible as downtime.
How to properly handle this situation? Thanks for help!
Could PM2 solve this problem? Strapi mentions this in its documentation:
PM2 Runtime allows you to keep your Strapi project alive and to reload it without downtime.
Strapi Docs v4
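PM2 on its own only swaps the Node process; the schema-mismatch window described above still comes from updating one server at a time, so the reload has to be part of a rolling deploy behind the load balancer. A minimal per-server sketch (the directory, process name, and scripts are assumptions about the setup):

    # One-time: run Strapi under PM2 on each production server
    cd /srv/strapi
    NODE_ENV=production pm2 start npm --name strapi -- run start

    # Per deploy, on one server at a time while the others keep serving traffic
    git pull
    npm ci
    NODE_ENV=production npm run build   # rebuild the admin with the new models
    pm2 reload strapi                   # PM2 swaps the process gracefully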
I am using Google Cloud Platform and I have created an instance in Compute Engine. I installed the Apache server and then a fresh Laravel installation over SSH. All my Laravel files exist at this path:
/var/www/html
But now when I try to edit any file, the change is not reflected. When I access my site using this link:
https://project-id.appspot.com/
it only displays a fresh Laravel installation, not reflecting the new changes.
I am using FileZilla to update the files.
The link you have provided is the URL format used by App Engine applications, not a Compute Engine instance. I believe you may be confusing the two.
To view the changes you have made to the files on a Compute Engine instance, you have to access the external IP of that instance, just as you would with a regular machine or VM.
Therefore, navigate to the Compute Engine section of the Cloud Console and look for the external IP of the Compute Engine instance where you have installed Laravel.
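For example, from Cloud Shell or any machine with the Cloud SDK installed (the instance name and zone are placeholders):

    # List all instances together with their internal and external IPs
    gcloud compute instances list

    # Or print just the external IP of a single instance
    gcloud compute instances describe my-laravel-vm --zone=us-central1-a \
        --format='get(networkInterfaces[0].accessConfigs[0].natIP)'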
We are working on an enterprise system written in Java. We use an Apache ACE server to deploy the OSGi bundles and Jenkins as the CI server. When we want to update a bundle, we build a JAR file in Eclipse and upload it to the ACE server through the web UI. When we want to release a new version, we must upload all the bundles through the web UI. I think that is foolish.
I think there must be a simpler way: when I finish coding, I should be able to do something directly in Eclipse to upload the bundle to the ACE server. And when we release a version, Jenkins should push all of the bundles to the ACE server itself.
Certainly, you basically have two options if you want to automate things:
Use the REST-based interface to talk to ACE.
Use the shell-based interface to script against ACE.
Both are explained on the website, so for more detailed steps refer to:
http://ace.apache.org/docs/rest-api.html
http://ace.apache.org/docs/shell-api.html
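As a very rough shell sketch of the REST route from a Jenkins job (the resource names and request bodies below are assumptions about the general flow, not a verified script; check them against the rest-api page above):

    ACE=http://ace-server:8080

    # 1. Create a working copy (workspace); ACE returns its URL in the Location header
    WORKSPACE=$(curl -si -X POST "$ACE/client/work" \
                | awk 'tolower($1) == "location:" {print $2}' | tr -d '\r')

    # 2. Register the (re)built bundle as an artifact in that workspace
    #    (see the REST API doc for the exact artifact representation)
    # curl -X POST -d @artifact.json "$WORKSPACE/artifact"

    # 3. Commit the workspace so the change becomes part of the next deployment
    curl -X POST "$WORKSPACE"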
I want to move my Magento site from AWS to Google, and I want to make sure I'm doing it the right way as I am new to Google cloud computing.
These are the steps I'm planning to take:
create an instance and install Redis and my Magento store on it
create a SQL instance for my DB
create a snapshot of this instance
create a template from this instance
create a group of instances from the template
create a load balancer and connect it with the instance group
Is that the correct way to build a solid and fairly scalable Magento site on GCP?
Are there any services on Google Cloud I can use to make my store even faster and more scalable?
That's a fairly good way to deploy, but you can offload a few of those pieces to GCP's managed services.
Use the Click-To-Deploy solution for Magento (https://cloud.google.com/launcher/solution/bitnami-launchpad/magento?q=magento)
Launch another Click-To-Deploy solution for Redis (https://cloud.google.com/launcher/solution/bitnami-launchpad/redis?q=redis)
Launch a Cloud SQL instance (https://cloud.google.com/sql/)
Update your Magento instance with the configuration for these servers
Use this instance as a template to launch an instance group (see the gcloud sketch after this list)
Put this group behind a load balancer
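A rough gcloud sketch of the managed pieces in that list (the names, zone, and sizes are placeholders; the Click-To-Deploy launches themselves happen in the console):

    # Managed MySQL instead of a self-run database
    gcloud sql instances create magento-db --tier=db-n1-standard-1 --region=us-central1

    # Turn the configured Magento VM into a template, then a managed instance group
    gcloud compute instance-templates create magento-template \
        --source-instance=magento-vm --source-instance-zone=us-central1-a
    gcloud compute instance-groups managed create magento-group \
        --template=magento-template --size=3 --zone=us-central1-a

    # The HTTP(S) load balancer is then assembled from a health check, a backend
    # service pointing at magento-group, a URL map, and a forwarding rule
    # (easiest to wire together from the console's Load balancing page).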
Why is this better?
You don't have to manage your SQL DB security and scaling
You get Redis and Magento with a few clicks, which saves a lot of time
All you need to manage are your settings, even if you later want to upgrade Magento or move to better servers
Bonus: you should also use a CDN for your static resources, and Cloud CDN (https://cloud.google.com/cdn/) will be helpful there.
Further reading: go through this to get a sense of what else you can do with GCP (https://cloud.google.com/solutions/commerce/)
I am a new Windows Azure user. I was selected for a 90-day trial account and was able to upload my ASP.NET MVC3 application, and my site is running now. After I published the site, I added more models, views, and controllers to my program. Now I cannot find a way to update my application. I can publish the application again, but there is no update option. I want to update only my new code, but the package option builds the full application. How can I push just the new code to my site in the Windows Azure cloud?
With Windows Azure you can publish/update an application in the following ways:
Log into your Windows Azure account. Select your hosted service name, and in the top panel you will see an "Upgrade" option. When you use it, you will be given a chance to select your CSPKG and CSCFG files from the local file system or from Windows Azure storage. Once you have selected the new or updated CSPKG, your currently running service will be upgraded.
You can also use the Windows Azure PowerShell Cmdlets to upgrade your currently running hosted service with the "Update-Deployment" command:
2.1 http://wappowershell.codeplex.com/
You can also use third-party applications built on the Windows Azure Service Management API to upgrade/manage your currently running hosted service.
3.1 http://wapmmc.codeplex.com/
3.2 http://www.cerebrata.com/Products/CloudStorageStudio/Default.aspx
Note: if you publish your application again from Visual Studio, it will delete the currently running hosted service and then create a new one, so that is not a good option for updates.
Finally, regarding your question about a partial update: that is not supported. Even when you make a single-line change in your code, the deployment is considered a full deployment, even when the action is "update/upgrade". There is no diff-package deployment, so every time you update your Windows Azure application you will use the newly created CSPKG file and upgrade your hosted application.
Regarding partial update: If you have multiple Roles, you may choose to upgrade a single role (so that would be a partial update of the deployment). For a given Role, all code is redeployed. If you're running more than one instance, the update will be rolled out across groups of instances, not all instances at once.
For updates such as static content: if you move these into blob storage (a great place for css, jquery, images, etc.), then you may update this content by simply uploading new items to blob storage individually. These updates don't require any code to be rebuilt or redeployed.
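For example, with today's Azure CLI (which postdates this answer; the account, container, and blob names are placeholders), refreshing one stylesheet is a single upload:

    az storage blob upload \
        --account-name mystorageaccount \
        --container-name static \
        --name css/site.css \
        --file ./css/site.css \
        --overwrite          # replace the existing blob in place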
If you're in dev mode (e.g. non-production), you may enable Web Deploy, which then allows very fast updates of your app to the running instance. This only works in single-instance mode, and it's great when doing frequent code+test cycles.