GeoServer looks for .qix file for image pyramid

The latest GeoServer (2.17.1) with the ImagePyramid extension looks for a *.qix file on each request. I didn't have any trouble in the old version (2.8.0) with the same dataset.
The error message is
26 Jun 16:55:15 ERROR [data.shapefile] -/2/2.qix (Permission denied)
java.io.FileNotFoundException: /2/2.qix (Permission denied)
The pyramid tiles were generated with GDAL's gdal_retile.py. Any suggestions on how I can fix this?

The .qix file is an open spatial index that speeds up access to the shapefile; it fills the same role as ESRI's .sbn/.sbx index (but ESRI in their wisdom didn't open that part of the format).
To fix this you need to make sure the user running GeoServer can write to the pyramid directory (this is good practice anyway). If you really don't want to allow GeoServer to create these files, you can pre-generate the indexes yourself, for example with QGIS's Create Spatial Index tool, run as a user that does have write access to that directory.
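If it's unclear whether the account GeoServer runs as can actually write there, one quick check is to probe the pyramid directory from a small program run as that same user. A minimal sketch; the directory path is hypothetical:

```java
import java.nio.file.Files;
import java.nio.file.Path;

public class QixWriteProbe {
    public static void main(String[] args) throws Exception {
        // Hypothetical pyramid level directory; substitute your own,
        // and run this as the same user as the GeoServer process.
        Path level = Path.of("/data/pyramid/2");

        System.out.println("writable: " + Files.isWritable(level));

        // Creating and deleting a scratch file mirrors what the shapefile
        // store has to do when it builds the .qix index next to the .shp.
        Path probe = Files.createTempFile(level, "probe-", ".qix");
        Files.delete(probe);
        System.out.println("create/delete OK in " + level);
    }
}
```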

Related

Azure Cloud Bash Shell file storage 5 GB file downloaded but unable to mount/open

I want to download folders from Azure Cloud Bash Shell.
For this I went to the storage account behind the Azure Cloud Shell and located a 5 GB .img file.
The full location to access the file is as follows.
https://csg10032000b5360942.file.core.windows.net/cs-rajat-agrawal-kpitd365-onmicrosoft-com-10032000b5360942/.cloudconsole/acc_rajat.img
(The URL requires my access token to download.)
I downloaded the file on Windows 10, but when I right-click it and choose Mount, it gives the error "The disc image file is corrupted".
I am also unable to open the file using WinRAR or 7-Zip.
To download files from the Azure Cloud Shell, downloading the backing image directly is not a good way. Instead, store the files or create the folders in a more appropriate place: the path /home/user/clouddrive, which is the clouddrive share mounted inside Cloud Shell. When you create files or folders there, they appear directly in the file storage of the Azure Cloud Shell, and you can then download them as you want from the file storage, not the image.
After some struggling, I managed to open the image and check its contents: just rename it with a .vdi extension and open it with 7-Zip.

Using IIB File nodes to move, copy, download and rename (S)FTP files without opening them

I am using IIB, and several of the requirements I have are for message flows that can do the following things:
Download a file from an FTP and/or SFTP server to the local file system, with a different name
Rename a file on the local file system
Move and rename a file on the (S)FTP server
Upload a file from the file system to the (S)FTP server, with a different name
Looking at the nodes available (FileInputNode, FileReadNode, FileOutputNode), it appears they can read and write files in this way, but only by copying them into memory and then physically rewriting them, rather than issuing a plain copy/move/download-style command that never needs to open the file at all.
I've noticed there are options to move the files locally once a read is complete, however, so perhaps there's a way around it using that functionality? I don't need to load the files into memory at all; I don't care what's in them.
Currently I am doing this using a JavaCompute node and the Apache Commons Net classes for FTP, but those don't work for SFTP and the workaround seems too complex, so I was wondering whether there is a pure IIB way to do it.
There is no native way to do this, but it can be done from a JavaCompute node using Apache Commons VFS.
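For example, with commons-vfs2 (plus JSch for its SFTP provider) on the classpath of a JavaCompute node, a server-side move/rename and a download-with-rename look roughly like this. A minimal sketch: the host, credentials and paths are hypothetical, and host-key checking is relaxed for brevity only:

```java
import org.apache.commons.vfs2.FileObject;
import org.apache.commons.vfs2.FileSystemManager;
import org.apache.commons.vfs2.FileSystemOptions;
import org.apache.commons.vfs2.Selectors;
import org.apache.commons.vfs2.VFS;
import org.apache.commons.vfs2.provider.sftp.SftpFileSystemConfigBuilder;

public class SftpFileMover {
    public static void main(String[] args) throws Exception {
        FileSystemManager manager = VFS.getManager();

        FileSystemOptions opts = new FileSystemOptions();
        // For the sketch only; manage known_hosts properly in production.
        SftpFileSystemConfigBuilder.getInstance().setStrictHostKeyChecking(opts, "no");

        // Move + rename on the SFTP server: this is a server-side rename,
        // so the file's contents are never pulled into the flow.
        FileObject src = manager.resolveFile("sftp://user:secret@host/in/data.csv", opts);
        FileObject dst = manager.resolveFile("sftp://user:secret@host/done/data.old.csv", opts);
        src.moveTo(dst);

        // Download to the local file system under a different name; this
        // streams the bytes but never parses them into a message tree.
        FileObject local = manager.resolveFile("file:///C:/landing/data.renamed.csv");
        local.copyFrom(dst, Selectors.SELECT_SELF);
    }
}
```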

How to open DBF file in DBeaver with JDBC

I'm not sure why I can't connect to .DBF files using DBeaver with the built-in JDBC driver for "Flat files (CSV/DBF)".
I have a share drive with dozens of DBF files on it. I create the connections as shown in the attached images, but when I connect to the source I have two issues. I've included the steps I follow and the error that I get.
Does anyone have experience connecting to DBF files with JDBC and/or using the DBeaver tool that might help me here?
I did download the DANS DBF library JAR from GitHub, but I am not sure how to use it in this situation. I noticed the CsvJdbc site says
CsvJdbc requires Java version 1.6, or later. For reading DBF files, DANS DBF Library must be downloaded and included in the CLASSPATH.
But I'm not sure how to add it to DBeaver projects; they don't use build paths like an actual Java project.
(I know I can open them in Excel, but I prefer this tool for data queries.)
I create the database.
I select the built-in CSV/DBF connection type.
The driver properties only offered .CSV; I tried it with this setting, and when it didn't work, I changed it to .dbf, and it still didn't work.
I can connect to this folder fine, and I know there are plenty of DBF files in it.
When I try to open the one DBF file that appears, I get an error message.
I apologize for breathing life into this year-and-a-half-old post, but I had the same problem and this was the first link on Google.
After much research and fiddling, I got DBeaver to open a .dbf flat file using most of the settings you already described.
The CSV/DBF JDBC driver needs the DANS DBF library to open .dbf files, as you mentioned, and it must be added to the CLASSPATH. There was limited information on that process, and I found no easy way to modify the classpath in DBeaver. I also looked at a few other JDBC drivers that supposedly open xBASE files, such as HXTT, but they aren't free, which was a deal-breaker for my use.
I did, however, get it to work by placing the DANS DBF JAR in the same directory as the CsvJdbc driver. DBeaver had no trouble finding it as a dependency, and it ran like a charm.
So, for anyone looking to do this:
In DBeaver, open the Driver Manager and select the CSV flat-file driver.
Download the driver if needed.
Download DANS DBF from SourceForge:
http://dans-dbf-lib.sourceforge.net
Add that file to the driver, and make sure you put it in the same directory as the csvjdbc driver JAR. It should be in the .dbeaver-drivers folder under your user folder; if you click the driver file and then the information button, it should give you the file path.
Then add the DANS DBF file to the driver in the Driver Manager.
Make sure you change the file filter type to .dbf as you did, otherwise it will hide all .dbf files.
Make a new connection and you are good to go!
A few things to note: I found that the file-extension filter is case-sensitive, so if you filter by .dbf then .DBF files will not show up in the connection. A few people have commented that the JDBC driver doesn't like spaces in file names, and it is a read-only driver with a few quirks.
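As a sanity check outside DBeaver: DBeaver is just wrapping the CsvJdbc driver here, so the same pairing can be exercised with a few lines of plain JDBC. A minimal sketch, assuming the csvjdbc and DANS DBF JARs are on the classpath; the directory and table name are hypothetical (each .dbf file in the directory is exposed as a table named after the file):

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;
import java.util.Properties;

public class DbfQuery {
    public static void main(String[] args) throws Exception {
        Class.forName("org.relique.jdbc.csv.CsvDriver"); // register the driver

        Properties props = new Properties();
        // Point CsvJdbc at .dbf files (read through the DANS DBF library)
        // instead of its default .csv extension.
        props.put("fileExtension", ".dbf");

        try (Connection conn = DriverManager.getConnection(
                "jdbc:relique:csv:C:\\share\\dbf", props);
             Statement stmt = conn.createStatement();
             ResultSet rs = stmt.executeQuery("SELECT * FROM parcels")) {
            while (rs.next()) {
                System.out.println(rs.getString(1));
            }
        }
    }
}
```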
I ran into this issue recently and wanted to share, in case you're still having problems with DBeaver stating it can't find Field or getRecordCount, etc. Serphentelm mentioned following the steps and still getting an error. I found that the JAR file from SourceForge contains the source, NOT the compiled .class files.
I had to build the jar myself. For those needing it, I put it here:
http://s000.tinyupload.com/index.php?file_id=59469996816520223299
I placed that in the csvjdbc folder mentioned above, then just did "Add File" from the Edit Driver page in DBeaver to add the jar.
For DBeaver 2.22.1:
Download dans-dbf-lib-1.0.0-beta-10.jar (e.g. from SourceForge).
In the Drivers location, Local folder (on Windows: C:\Users\user\AppData\Roaming\DBeaverData\drivers), create a drivers\dbf directory. NB: 'drivers' must be created under drivers, so the path is ...\DBeaverData\drivers\drivers\...
Put dans-dbf-lib-1.0.0-beta-10.jar in this folder.
Now you can create a new connection using the Embedded/DBF driver.

Way to load locally developed Magento extension

I am developing a Magento extension. After getting help from Stack Overflow, I am able to create a package for the extension, which is stored in the [magento]/var/connect folder. I noticed that package.xml, myextension.xml and myextension-1.0.0.0.1.tgz files are created there.
I created another local Magento instance where I want to load and test that package. It's not practical to get my extension verified by the Magento team first just to test it quickly, or is it? I copied those package files under /var/connect of the test instance, but they do not appear under Admin > System > Magento Connect > Package Extensions.
Any idea how I can do that? All I need is the ability to give my customers my zip files (package files), which they will then upload somewhere. Any help would be appreciated.
You can upload a packaged Magento extension by going to:
System > Magento Connect > Magento Connect Manager
then uploading the package under the Direct package file upload section.
Furthermore, if you decide not to package your extension, just copy the working file structure of your extension to its own folder; then you can simply drop your extension files into the root directory of compatible Magento installs. Just be sure your extension's file structure is correct, e.g. app/code/local/MyNamespace/MyExtension etc.

How can I atomically replace a file on a webserver so its latest version is continually available?

I'm working on a project that generates Google Earth KML files and saves them to a web-accessible directory. It's running on Windows with ActivePerl (not my preferred platform, but it's what I must work with).
The method I'm using for this is: write to temp.kml, then use File::Copy to copy temp.kml to real.kml. This occurs once a second.
Google Earth grabs this real.kml via an Apache 2 web server. The problem is that errors get thrown when Google Earth grabs real.kml at the same moment that temp.kml is being copied over it.
I understand that there's a good chance this is unavoidable, but is there any way that I can minimize the frequency of errors thrown?
Instead of copying the file, why not just move it from your temp directory to the web directory once your processing has finished? If your temp directory is on the same filesystem as the web directory, this results in only the name of the file changing while the contents remain in place, so a reader never sees a half-copied real.kml. There should be a much smaller chance of a race condition.
Use File::Copy's move function to move the file.
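The question is Perl, where File::Copy's move does the rename, but the write-then-rename pattern is the same in any language. A minimal sketch of the idea in Java, with hypothetical paths; note that both paths must sit on the same filesystem for the move to be a rename:

```java
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardCopyOption;

public class AtomicPublish {
    public static void main(String[] args) throws Exception {
        Path temp = Path.of("C:/work/temp.kml");    // hypothetical staging file
        Path real = Path.of("C:/htdocs/real.kml");  // hypothetical web directory

        // Write the complete new document to the staging file first...
        Files.writeString(temp, "<kml>...</kml>");

        // ...then swap it into place. On the same filesystem this is a
        // rename, so a reader sees either the old file or the new one,
        // never a partially written copy.
        Files.move(temp, real,
                StandardCopyOption.REPLACE_EXISTING,
                StandardCopyOption.ATOMIC_MOVE);
    }
}
```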
