Speeding up the deployment process in SAP Commerce Cloud - performance

We have an SAP Commerce Cloud solution (SAP Commerce 2105.8) with a Spartacus storefront.
We create builds and deployments via the SAP Commerce Cloud Portal.
Can we decrease the deployment time? Can we somehow influence the deployment process?
Thank you for your reply
Best regards
Patrik
Edit: we currently use the Recreate deployment mode and the data migration setting when deploying.

This configuration is the fastest:
Platform Update Mode: No migration required
The database isn't updated and no data is imported.
Deployment Mode: Recreate (fastest, with downtime)
Shuts down the running deployment and creates a new one based on the selected build. This is the fastest way to deploy a build, but it requires site downtime; if a problem occurs during the deployment, this option is also safer in terms of rollback potential. Visitors to your storefront see a temporary maintenance page until the deploy action completes.
For official documentation, see: https://help.sap.com/docs/SAP_COMMERCE_CLOUD_PUBLIC_CLOUD/0fa6bcf4736c46f78c248512391eb467/16a4fd7c4fee4bcc8b18e96b016e9faf.html?version=v2205
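If you want to script deployments instead of clicking through the Portal, SAP also exposes a Commerce Cloud Management API. A minimal sketch follows, assuming the v2 deployments endpoint and an API token created in the Portal; the subscription code, environment code, and build code are placeholders, so verify the endpoint and field names against the current API documentation:

    # Trigger the "fastest" combination (no migration + recreate) via the API.
    # SUBSCRIPTION_CODE and API_TOKEN are placeholders for your own values.
    curl -X POST \
      "https://portalrotapi.hana.ondemand.com/v2/subscriptions/${SUBSCRIPTION_CODE}/deployments" \
      -H "Authorization: Bearer ${API_TOKEN}" \
      -H "Content-Type: application/json" \
      -d '{
            "buildCode": "20240101.1",
            "environmentCode": "d1",
            "databaseUpdateMode": "NONE",
            "strategy": "RECREATE"
          }'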

Related

Deploy Microsoft Portals

We are just configuring our first Microsoft CRM Portal using the Portals Add-In.
We are doing this in our sandbox, obviously. Question: how do we deploy this to our production system? As far as I can see, most of the data is not saved in a solution but in entity records. Is there a practical way to deploy this?
You'll want to use the Portal Records Mover in the XrmToolBox. It will allow you to export/import the relevant records for the Portal.
https://community.dynamics.com/crm/b/dynamicscrmtools/archive/2017/06/13/new-xrmtoolbox-plugin-portal-records-mover
Alternatively, you can use the Configuration Migration tool (available in the SDK NuGet package) to move the records. I prefer the Portal Records Mover because it provides more granular control and supports updates without overwriting.
As suggested by @Nicknow, use the Configuration Migration tool and the Portal Records Mover.
I would do the first transfer with the Configuration Migration tool and continue with the Portal Records Mover, where you can filter records that were updated within a given time frame.
You can also build your own SSIS package to filter the modified records and move them to the production system.

How to set up SonarQube for a large organization

I am in the process of setting up SonarQube for a large organization. I plan to have two separate systems: one for update testing and rule development, and a second production system where the real work occurs. I will be using SQL Server 2014; typically when I do that, I use a SQL AlwaysOn availability group to sync to a DR server in another datacenter. My question is: with a SonarQube instance, does it make sense to DR the application to that level? If my organization can wait for a period of time to stand up a new server in a DR event, would that be possible with a proper backup of the DB? Further, if there were no backups of the DB, what would be lost with a fresh new SonarQube server besides setup/config time? Is there historical value in the code scans that would be lost, or would the next scan of the code base have us right back to where we were in terms of critical issues found, etc.?
Thanks for your replies.
All the data is stored in the database, so using DR on the database is a good idea. Backing up the database and restoring it is also a good solution (note that you should also back up the installed plugins).
If you lose the database, you also lose all the configuration (quality profiles, credentials, etc.) and the history of the analyzed projects.
So to restore a SonarQube instance, you have to:
Restore the database
Restore SonarQube or install the same version
Restore the plugins (${SONAR_HOME}/extensions/plugins)
During the first start, the ES files (${SONAR_HOME}/data/es) will be regenerated, and your instance will then be up and running.
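A minimal sketch of those steps as a shell script, assuming a Linux install with the standard directory layout; SONAR_HOME and the backup paths are placeholders, and the database restore itself depends on your DBMS (for SQL Server 2014 it would be a RESTORE DATABASE from your backup):

    # 1. Restore the database first (DBMS-specific, not shown here).

    # 2. Install the same SonarQube version, then restore the plugins.
    SONAR_HOME=/opt/sonarqube
    cp /backup/sonarqube/plugins/*.jar "${SONAR_HOME}/extensions/plugins/"

    # 3. Clear any stale Elasticsearch data so it is regenerated on first start.
    rm -rf "${SONAR_HOME}/data/es"

    # 4. Start SonarQube; the ES index is rebuilt automatically during startup.
    "${SONAR_HOME}/bin/linux-x86-64/sonar.sh" start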
If you have commercial plugins or if you are working with a large SonarQube instance, you may contact the sales team for support on this setup.
Disclaimer: I work at SonarSource.

How to keep deployment history of Azure cloud services?

I'm using Jenkins to produce .cspkg files using MSBuild. It stores the build results in Azure blob storage. Then I use the management portal to deploy them.
The biggest drawbacks I see are:
1. Deployments can be accidentally deleted easily.
2. There is no straightforward way to check which version the cloud service is running.
Is there a better way to manage deployments?
It's definitely not the best experience, is it?
The approach I tend to use is as follows:
Build the deployment package and add the version number to the package filename (taken from AssemblyInfo.cs), e.g. MyCloudService-1.2.0.0.cspkg - this should be trivial using MSBuild (see the sketch after this list).
Push the package to cloud storage.
Perform the deployment of the package from storage, with the Deployment Label '[CLOUD SERVICE NAME]-[VERSION] # [DATE & TIME]', e.g. 'MyCloudService-1.2.0.0 # 10-09-2015 16:30'.
Check the deployment package into a 'Packages' directory in source control.
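A rough sketch of the versioning step in shell, assuming the usual AssemblyVersion attribute in AssemblyInfo.cs; file names and paths are placeholders:

    # Pull the version out of AssemblyInfo.cs and stamp it into the filename.
    VERSION=$(grep -oP 'AssemblyVersion\("\K[0-9.]+' Properties/AssemblyInfo.cs)
    cp MyCloudService.cspkg "MyCloudService-${VERSION}.cspkg"

    # Build the deployment label used in step 3.
    LABEL="MyCloudService-${VERSION} # $(date '+%d-%m-%Y %H:%M')"
    echo "${LABEL}"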
If you need to identify the version of the package deployed to the cloud service, you can see the Deployment Label on the Azure Management Portal, in both the 'old' portal (manage.windowsazure.com) and the 'new' portal (portal.azure.com).

Development versus Production in Parse.com

I want to understand how people are handing an update to a production app on the Parse.com platform. Here is the scenario that I am not sure about.
Create an app called myApp_DEV. The app contains a database as well as the associated Cloud Code.
Once testing is complete and it's ready for go-live, I will clone this app into myApp_PRD (the production version). Cloning it will copy all the database as well as the Cloud Code.
So far so good.
Now, 3 months down the line, I have added some functionality, which includes adding some Cloud Code functions as well as adding some new columns to the tables in the DB.
How do I update myApp_PRD with this new database structure? If I try to clone it from my DEV app, it tells me the app already exists.
If I clone a new app (say myApp_PRD2) from DEV, then all the data will be lost, since the customer is already live.
Any ideas on how to handle this scenario?
Cloud Code supports deploying to production and development environments.
You'll first need to link your production app to your existing Cloud Code. This can be done from the command line:

    parse add production

When you're ready to release, it's a simple matter of:

    parse deploy production
See the Parse Documentation for all the details.
As for the schema changes, I guess we just have to manually add all the new columns.

Setting up a collaborative environment for web application development

My office is growing, and I've been tasked with building out the IT for our web development.
What's the best tool/setup for doing web development in a group setting? The requirements are a centralized code repository, a location to test development code on, and a way to push tagged code out to a staging server. What I'm thinking is SVN/Redmine for the code repo; each user has an account on a central development machine to allow for SSH access (Eclipse over SSH) and their own virtual host on the dev server, which gives everyone a centralized development sandbox. Code is written and tested on this dev box, then checked back into SVN and later tagged and pushed out to the staging server. Thoughts, comments, or recommendations?
*Also, in a dev environment, what is the best way to handle databases? Is it wise to pull from the production database? And should each developer have his/her own DB or work off a master DB?
**We are building a Magento application and also have some custom back-office tools that run on CakePHP.
Although this subject is off-topic on Stack Overflow and has been flagged as such, you should concentrate on the following areas:
VERSION-CONTROL
Git has all the glory, and you don't need your own box for this, as https://bitbucket.org/ offers unlimited private/public repos where you can host your codebase. http://github.com is also powerful and is the de facto most popular version-control-oriented tool out there, although it comes at a small price.
So your master branches live in your version control, and your devs will check out from there and commit to it as well (see the sketch below).
Your deployment tools will deploy to your live and staging environments from your master.
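A minimal sketch of that day-to-day flow, assuming a Bitbucket-hosted repo; the repository URL and branch names are placeholders:

    # Clone the central repo once, then work on feature branches.
    git clone git@bitbucket.org:yourteam/yourproject.git
    cd yourproject
    git checkout -b feature/my-change
    # ...edit and test locally...
    git commit -am "Describe the change"
    git push origin feature/my-change   # merge into master once reviewed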
ENVIRONMENTS
Usually three are used: LIVE, STAGE, and DEV.
LIVE is, well, live, and only approved code gets deployed there.
STAGE is the pre-live environment and should be an exact replica of LIVE, so everything can be tested there by the merchant.
DEV is cool to have as an exact replica too, but it can just as well be on a developer's local environment; it is meant for loose testing and experimenting.
DATABASES AND DEPLOYMENT
MySQL databases are a pain in the ass to sync, so you'd better have a script for it that syncs from LIVE to the others and prevents syncing from other environments to LIVE. This limitation also means that all configuration and content must be added on LIVE only and then synced down the line. Every change to the schema or a permanent setting should be handled by update scripts (as we are talking about Magento CE; Magento EE has migrations built in).
For deployment I also suggest you build a Fabric or Capistrano script that resets the dev and staging environments, handles the database reset and pull from the LIVE DB, and imports code from the central repository (a minimal sync sketch follows below).
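As a rough illustration, a one-way LIVE-to-STAGE sync could look like this; host names, credentials, and database names are placeholders, and the direction is hard-coded so nobody can accidentally sync into LIVE:

    # Dump the LIVE database and load it straight into STAGE.
    set -euo pipefail
    mysqldump -h live-db.example.com -u deploy -p"${LIVE_DB_PASS}" shop_live \
      | mysql -h stage-db.example.com -u deploy -p"${STAGE_DB_PASS}" shop_stage
    # Afterwards, run your update scripts to apply any pending schema changes.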
it's also a good idea to target the following everyday tasks:
The client needs to reset the stage for their tests.
A project manager, developer, or tester needs to test, so spawning a test clone should be a one-click action (take the current DB and code and make it live in some subfolder for that specific test only), as should deleting the test (see the sketch after this list).
3rd-party devs might need access to a specific test or dev environment (this is relevant with Magento, as on average there are at least 10 external extensions installed in every Magento store).
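A hedged sketch of such a "one-click" test clone, assuming code lives under /var/www and a MySQL database per clone; all paths, DB names, and credentials are placeholders:

    # Copy the current stage code and database into a throwaway test clone.
    TEST_ID=$(date +%s)
    rsync -a --exclude=var/cache /var/www/stage/ "/var/www/tests/${TEST_ID}/"
    mysqladmin -u deploy -p"${DB_PASS}" create "shop_test_${TEST_ID}"
    mysqldump -u deploy -p"${DB_PASS}" shop_stage \
      | mysql -u deploy -p"${DB_PASS}" "shop_test_${TEST_ID}"
    echo "Test clone available under /var/www/tests/${TEST_ID}/"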
