Using Solr on multiple CMS - Magento

I have an eZ Publish and a Magento site on two different servers, and one Solr server. The Solr server is now used as the search engine for eZ Publish but I would also like to use the same Solr-server on Magento.
eZ Publish comes with an extension (eZ Find) which contains a schema.xml, and I got it working straight out of the box without any configuration (other than defining the Solr server, user, password, etc.).
Magento ships with its own schema.xml and solrconfig.xml, which according to the documentation need to be copied to the Solr server.
I'm a bit afraid of doing this since I don't want to break the search on eZ Publish.
Does anyone have any experience with this or has any recommendations on the Solr setup?

You need to use the multi-core feature of Solr, so that you have a single Solr instance serving two (or more) cores.
What does that mean? Each core is defined by at least two files (schema.xml and solrconfig.xml), located in dedicated folders within your Solr installation. The cores then have to be registered in a file named solr.xml which, in your case, could look like this:
<?xml version="1.0" encoding="UTF-8" ?>
<solr persistent="true" sharedLib="lib">
  <cores adminPath="/admin/cores">
    <core name="ezpublish" instanceDir="ezpublish" />
    <core name="magento" instanceDir="magento" />
  </cores>
</solr>
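For reference, the on-disk layout that goes with the solr.xml above would look roughly like this (the folder names simply mirror the instanceDir values; the data/ directories are created by Solr itself):

<solr home>/
    solr.xml
    ezpublish/
        conf/
            schema.xml        (from eZ Find)
            solrconfig.xml
        data/
    magento/
        conf/
            schema.xml        (from Magento)
            solrconfig.xml
        data/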
If your current Solr installation is still inside the eZ Find extension, then you should have a look at this page, which tells you how to move the bundled Solr installation outside of eZ Publish. Then add a new core with the Magento configuration files.
Depending on the Solr version you are using, I would even recommend installing Solr on your own (rather than using the one bundled with eZ Find) and applying the eZ Publish configuration to it.

You can use Solr's multi-core feature, which allows you to host multiple indexes, each with its own schema and each accessible under its own URL (http://localhost:8983/solr/ezpublish/ and http://localhost:8983/solr/magento).
eZ Publish has a tutorial on how to do this: http://doc.ez.no/Extensions/eZ-Publish-extensions/eZ-Find/eZ-Find-2.7/Advanced-Configuration/Using-multi-core-features
All that should be left to do is copy your Magento configuration files into the new core.
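For example, once the magento/conf directory is in place, the core can be registered without restarting Solr through the CoreAdmin API; a sketch, assuming the host, port, and adminPath from the solr.xml above:

# register the new core with the running Solr instance
curl "http://localhost:8983/solr/admin/cores?action=CREATE&name=magento&instanceDir=magento"
# check that both cores are now live
curl "http://localhost:8983/solr/admin/cores?action=STATUS"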

Related

Solr index is not updating with publish

Using Sitecore 10.0.1
Solr 8.4
I published a few items.
There are 3 servers, and each server points to a different Solr instance, say A, B, and C.
But no publishing changes show up in Solr indexes B and C. I checked the event queue tables and the changes are recorded there.
If you have a geo-distributed setup with Sitecore servers in multiple regions and a Solr server for each region, you should make sure that one Sitecore server in each region has the Indexing role in its web.config file. It can be a Content Management server or a Content Delivery server, depending on your environment configuration.
Here is an example of how the Indexing role can be added to web.config:
<add key="role:define" value="ContentManagement, Indexing"/>
See the Sitecore documentation for more details.

What is Sonar Search?

I am upgrading an install of SonarQube from 4.5.1 to 5.2. I wasn't part of the original install, and when looking at the sonar.properties file to see what needs to be updated in the new one, I see properties for "sonar.search".
What is Sonar Search? Why would I need to uncomment/update these properties?
I haven't been able to find any good documentation on the SonarQube website about what it is, and internet searches for "sonar" and "search" bring up way too many unrelated results to sift through.
It is an Elasticsearch instance used for indexing some database tables. It allows, for example, powerful search requests on issues: see the sidebar of the "Issues" page, which supports multi-criteria searches and displays valuable facets.
The default settings in sonar.properties are good enough for most environments. The JVM settings of this dedicated process could be overridden if dozens of millions of issues are stored in the database.
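For illustration, the relevant entries ship commented out in sonar.properties and look roughly like this (the values here are placeholders, not recommendations):

# JVM options for the dedicated search (Elasticsearch) process
#sonar.search.javaOpts=-Xmx1G -Xms256m -Xss256k
# TCP port used by the search process
#sonar.search.port=9001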

Bluemix and CMS (Joomla) and CF Push

I have installed a Joomla site with CF on Bluemix.
As you know, Joomla, like other CMSes, allows you to install components to add functionality.
This uploads the PHP code needed for the component and adds additional tables/entries in the database.
My issue is that when I run cf push, the new component's code is removed from the Joomla folders on Bluemix, while the database still contains the component's tables/entries.
I guess this is the situation for all CMSes (Drupal, WordPress, Joomla, vBulletin, etc.).
How could I get a kind of cf pull (?) to keep the modified CMS code, including the new component, locally on the computer side, so that when I redo the cf push the installed component will not be erased?
Thank you in advance for your support,
Best regards
Yves
There is no cf pull command in Cloud Foundry. The closest you have is the cf files app-name command, with which you can navigate the directory structure of your deployed application and fetch specific files as needed; but this becomes tedious if you have many files to copy to your local computer.
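A quick sketch of that workflow (the app name and component path are made up for the example):

# list the top-level directory of the deployed app
cf files my-joomla-app
# print a single file and save it locally
cf files my-joomla-app app/components/com_example/example.php > example.php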
It looks like Joomla fits better with the IBM Containers service in Bluemix. With IBM Containers you can run a Docker image of Joomla (https://hub.docker.com/_/joomla/) and use persistent volumes to save your added functionality. You can also use any Bluemix services (like a database) with IBM Containers.
The article below provides more details and step-by-step instructions to create an IBM Container for WordPress. You can easily modify it for Joomla:
http://blog.ibmjstart.net/2015/05/22/wordpress-on-bluemix-containers/
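As a rough sketch of the container approach with the official joomla image mentioned above (the container names, volume name, and linked MySQL container are placeholders):

# keep the Joomla web root, including installed components, on a named volume
docker run --name my-joomla \
    -v joomla_html:/var/www/html \
    --link my-mysql:mysql \
    -p 8080:80 -d joomla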
When you push an application to a runtime (PHP, Java, or whatever), it restages all the application sources, including what was configured and modified through the CMS interface, while leaving the databases untouched. This holds for Joomla, but also for Drupal, WordPress, or any other CMS. So to achieve what you want, you have three options:
- push exactly the filesystem structure you need on Bluemix, including the configuration files and modules to use on it
- use (as suggested above) a container instead of a runtime; note that even with a container you have to install your CMS on an external Docker volume, otherwise the CMS will be reset every time you restart the container
- use a Bluemix VM

Solr configuration and setup under Git source control

I've got Solr running as a service on windows. I used NSSM (http://nssm.cc/) to set up the service to automatically start. The web server is Jetty.
I'd like to have my Solr directory under source control in Git because the configuration changes (and sometimes plugin changes) need to be picked up by all team members. At the very least, I'd like to have the configuration files (solrconfig.xml, schema.xml, stopwords.txt, etc.) under Git control, but ideally, I'd like to put the whole solr directory (including jar and war files) under Git control. Will this pose any problems? I can foresee us pulling commits and switching branches, all while the Solr service is running.
How have other teams configured Solr under source control?
The rule I go by is to check in the configuration files (solrconfig.xml, stopwords.txt, dataconfig.xml, etc.).
There are reasons, IMHO, not to check the entire Solr directory into source control:
- The Solr directory contains the index data as well as the configuration. It is a bad idea to check in the index, because:
  - the size of the repo will grow;
  - your index isn't a data source. In most cases it relies on an external source such as an RDBMS to refresh itself, so there is a huge data-integrity risk when your database goes out of sync with your Solr index.
- Only on a development box do we have Solr and the consuming app deployed on the same machine; otherwise, setting up Solr is independent of the application deploy. Checking the Solr directory into source control would mean unnecessarily big repositories to deploy.
Rather than checking in the whole directory, we ended up having the config files checked in, plus basic scripts to set up Solr, create the index, start an instance, etc. So every team member could check out the code base, run a couple of build tasks, and get ready to party :)
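And if you do end up versioning more of the Solr directory than just the config, a .gitignore along these lines keeps the index data and runtime noise out of Git (paths assume a typical single-core layout and will need adjusting):

# index files are derived from the RDBMS; never version them
solr/data/
# runtime noise from Jetty/Solr
logs/
*.log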

Liferay Community Edition centralized/shared storage issue

For quite some time I've been working on getting the community version of Liferay 6.0.6 (Windows 2003 + Apache httpd 2.2 + Apache Tomcat 6.0.29) to run in a shared-storage environment. Only the application base (webapps) is currently running from the shared storage, but I also want to move the "data" and "deploy" directories to the shared storage. So please provide any custom settings/changes that need to be made to change the location of these two directories/folders.
Whenever I change the "deploy destination directory", some portlets work, but not all (custom portlets). It would be great if anybody could provide a checklist of points to follow in this situation.
Additionally, the configuration files used are attached to this thread. If any other files are needed, please let me know.
Thank you for your support in advance. All efforts are appreciated.
Thanks,
Joji VJ
Try setting "Cache Page Definition Only" at the system level.
This option creates a single cached copy of the page definition in the system cache for all users.
Because the page definition is the same for all users, page customization options are disabled. This caching option greatly reduces storage requirements and improves performance.
