Geoserver layers displaced when cache activated - caching

I have been trying to solve a problem for several days, and I cannot even find the reason for this GeoServer behaviour.
I have a web application that displays some map layers from my GeoServer (version 2.1.3) through OpenLayers.
All the layers are configured correctly in my GeoServer instance, and they are drawn correctly by OpenLayers using the EPSG:900913 projection.
The problem appears when I switch from requesting the layer images directly from GeoServer to requesting them through the GeoWebCache included in the GeoServer distribution (by adding the parameter "&tiled=true" to the request).
The layers are still returned, but displaced several kilometres from their original position.
Maybe a different projection is being used, but in the GeoServer logs I saw that the request is made with the 900913 projection.
I have also wiped the GeoWebCache temp directory that holds the cache files, to force GeoWebCache to regenerate them, but it regenerates them with the same problem.
Has anybody had this problem, or does anyone know the reason for this layer displacement when using the cached layers?
Thank you very much,
Aleix
EDIT: I see the same question in the following post on the GIS Stack Exchange site, although it has not been answered there either... (http://gis.stackexchange.com/questions/4289/geowebcache-misalignment-of-tiles)

If you really want to use GWC, hit GWC's own WMS service; it will also tell you whether the requests you are making are misaligned with the cached grid:
http://localhost:8080/geoserver/gwc/service/wms?...
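For example, a quick way to check whether a given request lines up with the cached grid is to send the same GetMap parameters to the GWC endpoint and look at what comes back; a minimal sketch in Python (the layer name and bounding box are placeholders for your own values):

```python
import requests

# Placeholder layer name and extent; substitute your own workspace:layer and bbox.
GWC_WMS = "http://localhost:8080/geoserver/gwc/service/wms"
params = {
    "SERVICE": "WMS",
    "VERSION": "1.1.1",
    "REQUEST": "GetMap",
    "LAYERS": "myworkspace:mylayer",
    "SRS": "EPSG:900913",
    "BBOX": "-20037508.34,-20037508.34,20037508.34,20037508.34",
    "WIDTH": 256,
    "HEIGHT": 256,
    "FORMAT": "image/png",
}

resp = requests.get(GWC_WMS, params=params)
print(resp.status_code, resp.headers.get("Content-Type"))

# If the request does not line up with the cached gridset, GWC tends to answer
# with an exception report instead of an image, which makes the mismatch visible.
if "image" not in resp.headers.get("Content-Type", ""):
    print(resp.text)
```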

Related

Updating global.xml on disk doesn't update geoserver config

I'm using GeoServer 2.19 and tried to edit global.xml directly to update the contact information, but when I reload the cache GeoServer just discards my edits and writes global.xml back without them.
I tried modifying logging.xml as well; the change is visible in the GUI after reloading the cache, but the logs themselves do not reflect the modifications I made.
Am I missing something?
To give a bit more information: I have two GeoServer instances, and when I make changes to one instance I call the REST reload to apply the changes on the other instance too. I've read about JMS clustering, but it seemed a bit too complex and rigid for what I need. Advice is welcome.
I am trying to achieve this: https://docs.geoserver.geo-solutions.it/edu/en/clustering/clustering/passive/passive.html, but I'm having trouble with the synchronization between the instances.
Thank you
Basically, don't do that! GeoServer manages that file, and you can break things very badly if you edit it by hand.
You should only change things like contact information through the GUI or the REST API.
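For example, the contact block can be changed through the REST configuration API and the change then propagated to the other instance with a reload; a minimal sketch in Python, assuming the /rest/settings/contact and /rest/reload endpoints and placeholder hosts and credentials (do a GET first to confirm the exact field names on your version):

```python
import requests

GEOSERVER = "http://localhost:8080/geoserver"   # placeholder host
AUTH = ("admin", "geoserver")                   # placeholder credentials

# Read the current contact settings to see the exact field names in use.
current = requests.get(f"{GEOSERVER}/rest/settings/contact.json", auth=AUTH)
print(current.json())

# Update the contact block (field names assumed from the GET payload above).
payload = {"contact": {"contactPerson": "Jane Doe", "contactEmail": "jane@example.com"}}
requests.put(f"{GEOSERVER}/rest/settings/contact", json=payload, auth=AUTH).raise_for_status()

# Ask the second instance to reload its configuration so it picks up the change.
OTHER = "http://other-instance:8080/geoserver"  # placeholder host
requests.post(f"{OTHER}/rest/reload", auth=AUTH).raise_for_status()
```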

Cache OpenLayers3 Tiles server side

I'm developing a website that displays a map using the OpenLayers 3 API.
I know that most requests will refer to a specific location, so I'm wondering if it's possible to save the tiles of that region server-side, in order to reduce the calls to OpenStreetMap, which is sometimes slow.
Thanks in advance!
Yes, you can. There are several solutions.
You can try dedicated software, usually called a "map tile proxy":
Mapproxy
Mapcache
You can reuse your existing web server and tweak it to cache map tiles (there is an Nginx recipe; I did not find an out-of-the-box one for Apache).
You can use what we call a "reverse proxy", like Varnish. See a recipe for this. A toy sketch of the caching idea follows below.
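If the dedicated tools above feel too heavy, the caching idea itself is small enough to sketch; the following toy Python proxy (Flask, a local cache directory and the z/x/y URL scheme are my own assumptions) fetches each OSM tile once and then serves it from disk. It is only an illustration of the concept, not a replacement for MapProxy, MapCache or Varnish, and the OSM tile usage policy still applies:

```python
import os
import requests
from flask import Flask, abort, send_file

app = Flask(__name__)
CACHE_DIR = "tile_cache"                                     # local cache directory (assumption)
UPSTREAM = "https://tile.openstreetmap.org/{z}/{x}/{y}.png"  # mind the OSM tile usage policy


@app.route("/tiles/<int:z>/<int:x>/<int:y>.png")
def tile(z, x, y):
    path = os.path.join(CACHE_DIR, str(z), str(x), f"{y}.png")
    if not os.path.exists(path):
        # Cache miss: fetch the tile once from OpenStreetMap and store it.
        resp = requests.get(UPSTREAM.format(z=z, x=x, y=y),
                            headers={"User-Agent": "tile-cache-demo"})
        if resp.status_code != 200:
            abort(resp.status_code)
        os.makedirs(os.path.dirname(path), exist_ok=True)
        with open(path, "wb") as f:
            f.write(resp.content)
    return send_file(path, mimetype="image/png")


if __name__ == "__main__":
    app.run(port=8081)
```

An OpenLayers XYZ source would then point at http://localhost:8081/tiles/{z}/{x}/{y}.png instead of the OpenStreetMap servers.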

How to import MBTiles PBF into GeoServer?

I am trying to import MBTiles from openmaptiles.org into GeoServer.
When I try to add a layer, I get the message:
"No enum constant org.geotools.mbtiles.MBTilesMetadata.t_format.PBF"
How can I resolve this problem?
Thanks in advance
GeoServer does not currently support serving vector tiles stored in MBTiles.
Use the free, open-source TileServer GL project instead as the map server software for serving the vector tiles available from OpenMapTiles.org.
TileServer GL will also provide you with raster tiles (PNG/JPEG tiles for use in Leaflet or other traditional viewers and old browsers) created on demand from the vector tiles on your server, and it will deliver a WMTS service for opening the map layers in QGIS, ArcGIS, etc.
See: https://openmaptiles.org/docs/host/tileserver-gl/ and https://github.com/klokantech/tileserver-gl
Manual of TileServer GL is available at http://tileserver.readthedocs.io/
Disclaimer: we are the authors of the OpenMapTiles.org project you mention in your question, and you have probably downloaded the MBTiles file from our server.
We are also the programmers behind the TileServer GL project, so we are affiliated with the above-mentioned projects. The GeoServer developers may be able to give you more details about planned (or not?) support for vector tiles served from the MBTiles format.
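As a quick sanity check, you can confirm that a downloaded tileset really contains vector (PBF) tiles by looking at the format entry in the MBTiles metadata table; a small sketch assuming the standard MBTiles SQLite schema and a placeholder file name:

```python
import sqlite3

# Placeholder file name; point this at the MBTiles file you downloaded.
con = sqlite3.connect("tiles.mbtiles")
metadata = dict(con.execute("SELECT name, value FROM metadata"))
con.close()

# "pbf" means vector tiles, which the GeoServer MBTiles store does not read;
# "png" or "jpg" would be raster tiles, which it can serve.
print("format:", metadata.get("format"))
```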

Do any open-source standalone restful image servers exist?

I'm planning to develop a standalone restful Image Server with the following functionality, but first would like to know if something similar already exists in the open source world (language not important):
restful (crud) on master image, e.g: /GET/asd983249as
possibly bulk-gets / LIST
support for metadata (Creative Commons info, dimensions, etc.) that directly relates to the image (references from the domain to these images are NOT included)
restful lazy GET of different 'renditions' of an image, i.e. if a rendition doesn't exist it is created upon request (see the sketch at the end of this thread). Obviously the original image needs to exist. Different operations are allowed (resize and crop to begin with)
e.g: /GET/asd983249as/100x100 (simple resize)
allowed dimensions are configurable, so as not to get DoS'ed (not as quickly, anyway)
Non functional:
Reasonably performant / scalable / HA (yeah, I know this doesn't really say anything)
Possibly in-mem caching
Thinking about going the Mongo GridFS route, getting MongoDB sharding and replication almost for free. Putting Nginx in front, perhaps (in part) directly using nginx-gridfs (see below), should allow for the REST stuff and, with some config, some simple caching, if GridFS can't handle that by itself (I don't know).
Sources:
nginx-gridfs
http://www.coffeepowered.net/2010/02/17/serving-files-out-of-gridfs/
Idea of lazy GETs (and a simple implementation of what I'm looking for, although it seemed more of a hobby project than an actively maintained one)
http://sumitbirla.com/2011/11/how-to-build-a-scalable-caching-resizing-image-server/
Other stuff that comes close but isn't a complete solution:
https://github.com/adamdbradley/foresight.js/wiki/Server-Resizing-Images
Anything that already does this?
I would recommend this project:
https://github.com/imbo/imbo
It's easy to use, stable, and used in big projects.
But I am still curious about alternatives.
I was looking for options for a project and found the two below. They are not a perfect match for your requirements but seem quite mature. I have no experience with them yet, though.
https://imageresizing.net/ The Essential edition is open source; the more advanced editions are not.
http://thumborize.me/ (with associated github) has many interesting features like face detection, new codecs, smart cropping.
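For comparison, the lazy-rendition idea from the question (create a resized copy on the first request, then serve the cached file) fits in a few lines; this is only a sketch using Flask and Pillow, with a size whitelist standing in for the configurable dimensions and the local disk standing in for GridFS (all names here are placeholders):

```python
import os
from flask import Flask, abort, send_file
from PIL import Image

app = Flask(__name__)
ORIGINALS = "originals"                   # master images stored by id (placeholder layout)
RENDITIONS = "renditions"                 # lazily generated resized copies
ALLOWED_SIZES = {(100, 100), (200, 200)}  # whitelist so arbitrary sizes can't be used for DoS


@app.route("/images/<image_id>/<int:w>x<int:h>")
def rendition(image_id, w, h):
    # Toy code: no input sanitising or authentication here.
    if (w, h) not in ALLOWED_SIZES:
        abort(400)
    original = os.path.join(ORIGINALS, image_id)
    if not os.path.exists(original):
        abort(404)
    target = os.path.join(RENDITIONS, f"{image_id}_{w}x{h}.jpg")
    if not os.path.exists(target):
        # Lazy GET: the rendition is created on the first request only.
        os.makedirs(RENDITIONS, exist_ok=True)
        img = Image.open(original)
        img.thumbnail((w, h))
        img.convert("RGB").save(target, "JPEG")
    return send_file(target, mimetype="image/jpeg")


if __name__ == "__main__":
    app.run()
```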

Optimal delivery method for a large quantity of images

I have a website centered around an online chat application where each user can have up to several hundred contacts. Each contact has their own profile image. I want to make it so that each contact's profile image is loaded next to their name. However, having the user download 100+ images every time they load the site seems intensive (studies have shown that as much as 40% of users don't utilize their cache). Each image is around 60x60 pixels.
When I search on Google or sign in to Facebook, dozens of images are served nearly instantaneously. Beyond just having fast servers and a good connection, what are the optimal methods for delivering so many images to the user?
Possible approaches I have come up with are:
Storing each user's profile image in a database, constructing one combined image in a PHP file, then having the user download that, and using CSS to display each profile image (a small sketch of this approach appears at the end of this thread). However, this seems extremely demanding on the server, and referencing such a large file so many times might take a toll on the user's browser.
Using nginx rather than Apache to serve the images (nginx generally works better for serving static content such as this). However, this seems more like an optimization of a solution rather than a solution in itself.
I am also aware that data can be delivered across persistent HTTP connections, so multiple requests do not have to be made to the server for multiple files. But exactly how many files can be delivered across one persistent connection? Would this persistent model mean that just having the images load as separate files would not necessarily be a bad idea?
Any suggestions, solutions, and/or notes from personal experience with related matters would be greatly appreciated. Scalability is extremely important here, as is cross-browser support (IE7+, Opera, Firefox, Chrome, Safari).
EDIT: I AM NOT USING JQUERY.
Here's a jQuery plugin that delays loading images until they're actually needed (i.e. it only loads images "above the fold").
http://www.appelsiini.net/2007/9/lazy-load-images-jquery-plugin
An alternative may be to use Flash to display just the images. The advantage is that Flash has a much stronger local cache that you have programmatic control over.
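Coming back to the first approach from the question (one combined sprite image plus CSS background offsets), the sprite itself is easy to generate server-side; a small sketch using Pillow, where the 60x60 size comes from the question and everything else is a placeholder:

```python
from PIL import Image

TILE = 60  # each avatar is roughly 60x60 pixels, as in the question


def build_sprite(avatar_paths, out_path="contacts_sprite.png", columns=10):
    """Pack the given avatar files into one sprite image, left to right, row by row."""
    rows = (len(avatar_paths) + columns - 1) // columns
    sprite = Image.new("RGBA", (columns * TILE, rows * TILE))
    for i, path in enumerate(avatar_paths):
        avatar = Image.open(path).resize((TILE, TILE))
        x, y = (i % columns) * TILE, (i // columns) * TILE
        sprite.paste(avatar, (x, y))
        # Avatar i is then shown with CSS on a 60x60 element, e.g.
        # background: url(contacts_sprite.png) no-repeat; background-position: -<x>px -<y>px.
    sprite.save(out_path)


# build_sprite(["avatars/alice.png", "avatars/bob.png"])  # placeholder file names
```

The per-contact offsets would be written out alongside the sprite so the chat page can map each contact id to its background position.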
