I'm using GeoServer 2.19 and tried to edit global.xml directly to update the contact information, but when I reload the cache, GeoServer just discards my changes and writes global.xml back without them.
I tried modifying logging.xml as well; the change shows up in the GUI after reloading the cache, but the actual logging behaviour doesn't reflect the modifications I made.
Am I missing something?
To give a bit more information: I have 2 instances of GeoServer, and when I make changes to one instance, I call the REST reload to apply the changes on the other instance too. I've read about JMS clustering, but it seemed a bit too complex and rigid for what I need to do. Advice is welcome.
I am trying to achieve this: https://docs.geoserver.geo-solutions.it/edu/en/clustering/clustering/passive/passive.html, but I'm having trouble with the synchronization between the instances.
Thank you
Basically, don't do that! GeoServer manages that file, and you can break things very badly by editing it by hand.
You should only change things like contact information through the GUI or the REST API.
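Since you're already scripting REST reloads, here's a rough sketch of doing the contact update over REST with PHP's cURL. The settings/contact endpoint and field names are from memory, so verify them against the REST API docs for your version; the URL and credentials are placeholders.
<?php
// Hedged sketch: update global contact info via GeoServer's REST API.
$url  = 'http://localhost:8080/geoserver/rest/settings/contact';
$body = '<contact>'
      . '<contactPerson>Jane Doe</contactPerson>'
      . '<contactOrganization>Example Org</contactOrganization>'
      . '</contact>';

$ch = curl_init($url);
curl_setopt_array($ch, array(
    CURLOPT_CUSTOMREQUEST  => 'PUT',
    CURLOPT_USERPWD        => 'admin:geoserver',           // your admin credentials
    CURLOPT_HTTPHEADER     => array('Content-Type: application/xml'),
    CURLOPT_POSTFIELDS     => $body,
    CURLOPT_RETURNTRANSFER => true,
));
curl_exec($ch);
echo curl_getinfo($ch, CURLINFO_HTTP_CODE), PHP_EOL;       // expect 200 on success
curl_close($ch);
Going through REST keeps the catalog consistent, and you can follow it with the same /rest/reload call you already use on the second instance.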
I am using the GeoIP plugin in my Laravel system. Everything works correctly, but the GeoIP response time is very high, and I want to reduce it.
I tried fetching just the IP address, but that doesn't work for me, because I also need to do country validation.
use \Torann\GeoIP\Facades\GeoIP;
$respondentLocation = GeoIP::getLocation();
I just need a quick response, nothing else.
Thank you in advance.
I'm not familiar with that particular library, but I'm assuming it's looking up remote data (hence the speed issues). If so, there isn't a lot you can do beyond caching records, and possibly pre-fetching data (if your scenario allows for that).
Looking at their docs, it appears they do have caching mechanisms in place: http://lyften.com/projects/laravel-geoip/doc/commands.html
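If the remote lookup is the bottleneck, wrapping the call in Laravel's own cache is a cheap win. A minimal sketch, assuming the Cache facade; the key name, TTL, and helper function are my own placeholders, not part of the GeoIP package:
<?php
use Illuminate\Support\Facades\Cache;
use Torann\GeoIP\Facades\GeoIP;

// Hypothetical helper: look up an IP once, then serve repeats from cache.
// Note: Cache::remember's TTL is minutes before Laravel 5.8, seconds after.
function cachedLocation($ip)
{
    return Cache::remember('geoip:' . $ip, 60 * 24, function () use ($ip) {
        return GeoIP::getLocation($ip);
    });
}

$respondentLocation = cachedLocation(request()->ip());
Repeat visitors then skip the remote call entirely, which is usually where the latency comes from.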
Currently I am a bit lost, or maybe I just have a mental block.
My question concerns a PrestaShop 1.7.3.3 shop currently on shared hosting. Due to slow performance and a long TTFB, I am moving it to a VPS running Plesk, hosted on DigitalOcean.
Now comes the part where I am lost: I copied the files via wget, dumped the database, and applied the permissions correctly (to my knowledge). The shop comes up on the new Plesk host under the new domain without issues.
As soon as I enable MySQL caching, I can still edit pages with Apollo Pagebuilder, but no longer save them; at least the changes don't show up in the front office. If I switch back to file cache, changes are propagated as intended, but the modules page in the back office stops working (error 500, fixable by removing app/cache/prod and app/cache/dev).
So, to summarize my issue: with file cache enabled, everything except the modules page works; with MySQL cache enabled, everything except Apollo Pagebuilder propagation works.
What I already tried:
I have reinstalled Apollo Pagebuilder, but that pretty much breaks my front office completely (meaning I'd have to rebuild everything from scratch, as the current state doesn't seem to be read properly).
I exported, reimported, and ran "update and fix" on Apollo, without success :(
Only thing that comes to my mind as a fix would be sacrificing something to the gods, but I'd rather not do that.
Environment:
Ubuntu 16.04 LTS; Plesk Onyx 17.8.11; Prestashop 1.7.3.3; PHP 7.1.26
If no one has had this problem before, maybe someone has an idea of what to delete to get the modules page working in the back office again. I'd be willing to accept MySQL caching being unavailable.
Thank you in advance for your help.
OK, I think I found the answer. Since the server was migrated including its cache, the database connection was cached as well. (Fortunately it wasn't able to write to the previous DB.)
So, if anyone ever faces the same issue:
prestaroot/app/cache/prod/appProdProjectContainer.php stores the connection strings in two places.
Once in protected function getDoctrine_Dbal_DefaultConnectionService() (around line 670),
and once around line 5000. The easiest approach is to just search for your previous connection credentials.
You also need to make sure that prestaroot/app/cache/prod/appParameters.php contains the same valid credentials.
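If it helps, here is a hypothetical little scanner (plain PHP, nothing PrestaShop-specific) that lists cache files still mentioning the old host; adjust the path and host for your install:
<?php
// Scan the prod cache for files that still hold the old DB host string.
$cacheDir = __DIR__ . '/app/cache/prod';            // adjust to your prestaroot
$oldHost  = 'old-db-host.example.com';              // your previous DB host

$it = new RecursiveIteratorIterator(
    new RecursiveDirectoryIterator($cacheDir, FilesystemIterator::SKIP_DOTS)
);
foreach ($it as $file) {
    if ($file->isFile()
        && strpos(file_get_contents($file->getPathname()), $oldHost) !== false) {
        echo $file->getPathname(), PHP_EOL;         // candidate with stale credentials
    }
}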
I hope this helps someone one day.
I'm looking at the cache argument of the findAll() function in CFWheels, and I'm a little apprehensive about using it. My queries aren't so slow that I absolutely need caching, but a bit of a speed boost is always welcome: I get 10ms from a cached query that otherwise takes about 100ms. What I'm wondering about is invalidation: when an entry changes, I'd like the cache to be cleared on the next run. There doesn't seem to be any mechanism or flag in the framework for that, so I'd have to set and clear flags myself, which would most likely end up reading from the database anyway. I was hoping I could cache for a full day and refresh only when needed; is that horribly misguided? I'm probably not going to build any of this caching functionality for this application, but I'm curious whether it's worthwhile revisiting.
More precisely: whenever you make a new entry in the database, use the cfhttp tag to reload the application.
Caching can be cleared by reloading the application. It may not be the answer you are seeking, but it is a solution and another approach: you can reload the application by sending a request to its reload URL via <cfhttp> after adding your new database record. If you add records through a management site, you can reload your public site using <cfhttp>.
:)
I have developed an application using ZF. The app is fairly big, with a lot of features.
I use Zend_Application (already using the autoloader in the constructor), Zend_Layout, Zend_View, Zend_Form, etc. My current issue is that page loading is very slow, even on localhost with XAMPP.
To investigate the issue I enabled Xdebug, got a cachegrind file in the "tmp" folder, and viewed it with WinCacheGrind. There I can see a lot of processes and functions being run for each and every request or page load.
Also, I installed the YSlow add-on for Firefox and measured page load times in seconds, comparing ZF and non-ZF applications. In that comparison, pages in the non-ZF app take less than 1 second to load, while the ZF app takes at least 6-7 seconds. What a huge difference.
The main things happening in the app are:
1) A database connection happens on each request.
2) I'm not adding the view to the layout explicitly; ZF appends it to layout.phtml automatically, based on the action name.
3) Some windows have forms with a few drop-down boxes that fetch data from the database.
4) Menus have ACL implemented; previously the privileges were loaded from the DB for each and every request, but I have optimized it so they are loaded only during login, and the rest of the time they are taken from Zend_Registry.
I would like to attach the cachegrind file so someone can see what's happening in the background, but I can't see an option here for attaching files.
Could someone please help me find a solution for this? Any kind of help is really appreciated. Thanks a lot.
Let's try to give some hints.
First, the database connection should happen only once (unless you use several privilege levels on the database, or several databases). So check that you use a singleton pattern with your Zend_Db_Table objects.
Second, you do not use Zend_Cache. You should really start using Zend_Cache and build several cache objects: say a file cache for long-term storage, and a memcached or APC cache for storing objects. Then use these caches in several layers:
Give the file cache to Zend_Db_Table (defaultMetadataCache); this way you avoid a lot of metadata queries, the queries that ask for a description of each column of the tables you use (see the sketch after these hints).
Store one or more ACL objects (depending on how you use ACLs: one big ACL with all rules, or several with subsets), and put them in a mid-duration cache once they are built.
Think of other usages: detect heavy loops and semi-static content (like your select lists; how long can they be considered static?).
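A rough ZF1 sketch of the first two hints, assuming APC is available for the fast backend; paths, lifetimes, and the buildAcl() helper are placeholders:
<?php
// Long-lived file cache for table metadata.
$metadataCache = Zend_Cache::factory(
    'Core',
    'File',
    array('lifetime' => 86400, 'automatic_serialization' => true),
    array('cache_dir' => '/tmp/zf-cache')
);
// Stops Zend_Db_Table from running DESCRIBE queries on every request.
Zend_Db_Table_Abstract::setDefaultMetadataCache($metadataCache);

// Faster in-memory cache for built objects such as the ACL.
$objectCache = Zend_Cache::factory(
    'Core',
    'Apc',
    array('lifetime' => 3600, 'automatic_serialization' => true)
);
if (($acl = $objectCache->load('acl')) === false) {
    $acl = buildAcl();               // hypothetical: assembles roles/resources from the DB
    $objectCache->save($acl, 'acl');
}
The metadata cache alone often cuts a visible chunk off every ZF1 request, since the DESCRIBE queries otherwise run for every table on every page load.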
Finally, build a complete mental image of how your application engine works and how your data will grow and be used. You will need that step to use application-level caches in the best possible way (for example: should some elements be cached per group of users? Should ACL objects be built per group, per user, or for everybody? Are there blocks in the layout that should render the same for everybody?).
I am sure the answers to this question will be very subjective; I simply want to know what options are out there for building a proxy to load external content.
Typically I use cURL in PHP and pass a variable like proxy.url to fetch content, then make an AJAX call with JavaScript to populate the contents.
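For reference, a minimal sketch of that kind of PHP pass-through; the parameter name and whitelist are illustrative, and a real version must restrict target hosts or it becomes an open proxy:
<?php
// proxy.php: fetch a whitelisted external URL and echo it back for the AJAX call.
$target  = isset($_GET['url']) ? $_GET['url'] : '';
$allowed = array('example.com', 'api.example.org');   // hypothetical whitelist
if (!in_array(parse_url($target, PHP_URL_HOST), $allowed, true)) {
    header('HTTP/1.1 400 Bad Request');
    exit('URL not allowed');
}

$ch = curl_init($target);
curl_setopt_array($ch, array(
    CURLOPT_RETURNTRANSFER => true,
    CURLOPT_FOLLOWLOCATION => true,
    CURLOPT_TIMEOUT        => 10,
));
$body = curl_exec($ch);
header('Content-Type: ' . curl_getinfo($ch, CURLINFO_CONTENT_TYPE));
echo $body;
curl_close($ch);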
EDIT:
YQL (Yahoo Query Language) seems a very promising solution to me; however, it has a daily usage limit, which essentially prevents me from using it for large-scale projects.
What other options do I have? I am open to any language and any platform; the key criteria are performance and scalability.
Please share your ideas, thoughts and experience on this topic.
Thanks,
You don't need a proxy server or anything like that.
Just create a cronjob to fetch the contents every 5 minutes (or whenever you want).
You just need to create a script that grabs the content from the web and saves it (to a file, a database, ...), which will be started by the cronjob.
When somebody requests your page, you just send out the cached content and do whatever you want with it.
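A tiny sketch of that fetch-and-save script (the URL and cache path are placeholders), plus the crontab line to drive it:
<?php
// fetch_content.php: grab the remote content and store it for the front end.
$source    = 'http://example.com/feed';           // the external content you need
$cacheFile = '/var/cache/myapp/content.html';

$content = file_get_contents($source);            // swap in cURL if you need timeouts/headers
if ($content !== false) {
    file_put_contents($cacheFile, $content, LOCK_EX);
}

// Crontab entry, every 5 minutes:
// */5 * * * * php /path/to/fetch_content.php

// Serving side is then just: readfile('/var/cache/myapp/content.html');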
I think scalability and performance will be no problem.
Depending on what you need to do with the content, you might consider Erlang. It's lightning fast, ridiculously reliable, and great for scaling.