GeoServer - GetFeature SHAPE-ZIP request - 413 error

I am using Geoserver with an app written with OpenLayers 3. The app can download zipped shapefiles using a WFS service, which works unless I make a large (long URL) request. In that case I get a 413 error in Chrome.
Is there a way I can change this setting so that I can make a longer request to GeoServer (or is the problem something else)?
Here is the request:
$('#btnDownloadSHP').click(function (e) {
    var tostring = '(' + ids.toString() + ')';
    var data = {
        service: 'WFS',
        version: '1.1.0',
        request: 'GetFeature',
        typename: 'download_layer',
        format_options: "filename:" + shapefileName,
        srsname: 'EPSG:3857',
        outputFormat: 'SHAPE-ZIP',
        CQL_FILTER: "id IN " + tostring
    };
    var parameters = Object.keys(data).map(function (key) {
        return key + '=' + data[key];
    }).join('&');
    var url = "http://" + servername + "/geoserver/wfs?" + parameters;
    // make a dummy link and download the shapefile
    var link = document.createElement("a");
    link.download = 'Features';
    link.href = url;
    link.click();
});

That response is generated by the server that GeoServer is running on rather than by GeoServer itself, so depending on which httpd and/or servlet engine you are using, you may be able to raise the limit there.
But the easy answer is to switch from GET to POST.
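A minimal sketch of the POST approach, assuming the same servername, ids, and shapefileName variables as above: the id list moves into a standard WFS 1.1.0 XML GetFeature body (an ogc:Filter in place of the CQL_FILTER parameter), so the request length no longer depends on how many ids are selected. The blob-download handling at the end is one possible way to save the response client-side, not the only one:
$('#btnDownloadSHP').click(function () {
    // Build one PropertyIsEqualTo per id; wrap them in <ogc:Or> only when there are several.
    var comparisons = ids.map(function (id) {
        return '<ogc:PropertyIsEqualTo>' +
                   '<ogc:PropertyName>id</ogc:PropertyName>' +
                   '<ogc:Literal>' + id + '</ogc:Literal>' +
               '</ogc:PropertyIsEqualTo>';
    }).join('');
    var filter = ids.length > 1 ? '<ogc:Or>' + comparisons + '</ogc:Or>' : comparisons;

    var body =
        '<wfs:GetFeature service="WFS" version="1.1.0" outputFormat="SHAPE-ZIP"' +
        ' xmlns:wfs="http://www.opengis.net/wfs" xmlns:ogc="http://www.opengis.net/ogc">' +
        '<wfs:Query typeName="download_layer" srsName="EPSG:3857">' +
        '<ogc:Filter>' + filter + '</ogc:Filter>' +
        '</wfs:Query></wfs:GetFeature>';

    var xhr = new XMLHttpRequest();
    xhr.open('POST', 'http://' + servername + '/geoserver/wfs', true);
    xhr.setRequestHeader('Content-Type', 'text/xml');
    xhr.responseType = 'blob';
    xhr.onload = function () {
        // Save the returned zip via a temporary object URL.
        var link = document.createElement('a');
        link.href = URL.createObjectURL(xhr.response);
        link.download = shapefileName + '.zip';
        link.click();
        URL.revokeObjectURL(link.href);
    };
    xhr.send(body);
});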

Related

GET request from NetSuite to Oracle EPM, but faced with "Authorization Required - You are not authorized to access the requested resource"

Error: "Authorization Required - You are not authorized to access the requested resource. Check the supplied credentials (e.g., username and password)."
Using the exact same headers and URL, I can successfully make the request via Postman and PowerShell, but when making the call via SuiteScript, I get the auth error. I suspect it may have something to do with how I am constructing the headers.
Here is the code I used via NetSuite Debugger:
require(['N/https', 'N/encode'], function (https, encode) {
    function fetchCSVdata() {
        var authObj = encode.convert({
            string: "username:password",
            inputEncoding: encode.Encoding.UTF_8,
            outputEncoding: encode.Encoding.BASE_64
        });
        var psswd = 'Basic ' + authObj;
        var headerObj = { 'Authorization': psswd };
        var response = https.get({
            url: 'https://<bleep>.pbcs.us6.oraclecloud.com/interop/rest/11.1.2.3.600/applicationsnapshots/DemandPlan_ExportItemPlan.csv/contents',
            headers: headerObj
        });
        return response.body;
    }
    var x = fetchCSVdata();
    log.debug("error", x);
});
Looking at some working code of mine, it is structured differently from yours, but I don't see the error.
var authstring = encode.convert({
    string: 'username:password',
    inputEncoding: encode.Encoding.UTF_8,
    outputEncoding: encode.Encoding.BASE_64
});
var headerObj = { Authorization: 'Basic ' + authstring };
var response = https.get({ url: 'https://webservices.XXX.com', headers: headerObj });

Display contact list images in Outsystems Mobile

How can I display the contact images along with the numbers, like the contact list on the device? I tried to display the image from the URL "content://com.android.contacts/contacts/" by using the 'Contacts Plugin', but I can't fetch the image from that URL. The type of the image is set as 'External URL'.
I was facing the same issue but have resolved it now.
I used the JavaScript below; you must have the File Plugin as a dependency for your module.
window.resolveLocalFileSystemURL($parameters.ContactPhotoURI, onResolveSuccess, onResolveFail);

function onResolveSuccess(fileEntry) {
    fileEntry.file(function (file) {
        var reader = new FileReader();
        reader.onloadend = function (evt) {
            // Remove the "data:image/jpeg;base64," prefix from the returned value
            $parameters.ContactPhoto = evt.target.result.substring(evt.target.result.indexOf(',') + 1);
            $resolve();
        };
        reader.readAsDataURL(file);
    }, onErrorReadFile);
}

function onResolveFail(error) {
    console.log("Error resolving Local File System URL " + JSON.stringify(error));
    $resolve();
}

function onErrorReadFile(error) {
    console.log("ERROR!");
    console.log(error);
    $resolve();
}
Here ContactPhotoURI is the URI returned by the Contacts Plugin, and ContactPhoto is binary data which can be loaded into an Image.
If there is any doubt, you can follow the discussion here.

xmlhttprequest POST 405 - Method NOT ALLOWED

I have a problem that is driving me crazy because I'm not able to solve it. I want to upload some files from my application to an IIS server.
My code in HTML is:
<input id="files" type="file" />
And in the controller, when I detect that a new file has been added, I use XMLHttpRequest:
document.getElementById('files').addEventListener('change', function (e) {
    var file = this.files[0];
    var xhr = new XMLHttpRequest();
    (xhr.upload || xhr).addEventListener('progress', function (e) {
        var done = e.position || e.loaded;
        var total = e.totalSize || e.total;
        console.log('xhr progress: ' + Math.round(done / total * 100) + '%');
    });
    xhr.open('POST', 'http://10.0.19.25:80/CG/files', true);
    xhr.addEventListener('load', function (e) {
        console.log('xhr upload complete', e, this.responseText);
    });
    xhr.send(file);
});
When I launch my app on Chrome, Firefox or IE, I get this error:
POST http://10.0.19.25/CG/files 405 (Method Not Allowed)
Thanks in advance!
I had the same error. The problem was that the method I was trying to reach didn't exist: I was using POST, but the server expected PUT at that URL.
Looking at the server log will probably help!
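If that turns out to be the case here, the client-side fix is just the verb passed to xhr.open (a hedged sketch; confirm in the server log which methods the /CG/files endpoint actually allows):
xhr.open('PUT', 'http://10.0.19.25:80/CG/files', true); // was 'POST'
xhr.send(file);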
I think you need to read more about Content-Type. I had the same problem: I was sending JSON data to the server, and I just changed the Content-Type to application/json; charset=UTF-8, which helped. The default is text/html; charset=utf-8.
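For illustration, a minimal sketch of that change (hypothetical URL and payload), setting the header explicitly before sending JSON:
var xhr = new XMLHttpRequest();
xhr.open('POST', 'http://10.0.19.25:80/CG/files', true);
xhr.setRequestHeader('Content-Type', 'application/json;charset=UTF-8');
xhr.addEventListener('load', function () {
    console.log('response:', this.status, this.responseText);
});
xhr.send(JSON.stringify({ name: 'example.txt', size: 1234 }));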

PhantomJS if the proxy does not respond

Script test.js:
var page = require('webpage').create();
var system = require('system');
var url = system.args[1];

page.open(url, function (status) {
    console.log(status);
    phantom.exit();
});
Run script:
phantomjs --proxy=1.1.1.1:22 test.js 'http://nonexistent_site.com'
1.1.1.1:22 - nonexistent server
http://nonexistent_site.com - nonexistent site
How can I determine in PhantomJS which one is not responding - a proxy or a site?
You can catch network timeouts with the page.onResourceTimeout callback:
page.onResourceTimeout = function (request) {
    console.log('Response (#' + request.id + '): ' + JSON.stringify(request));
};
You can also set your own timeout:
page.settings.resourceTimeout = 3000; // ms
To intercept network errors, you can register the page.onResourceError callback:
page.onResourceError = function (resourceError) {
    console.log('Unable to load resource #' + resourceError.id + ' URL: ' + resourceError.url);
    console.log('Error code: ' + resourceError.errorCode + '. Description: ' + resourceError.errorString);
};
With this in place, a non-existent host will trigger a Host not found error.
But if you use a non-working proxy, you will always end up with a Network timeout on resource error first, even if the target host does not exist.
So if you want to check proxies, I'd suggest simply using page.open on hosts that are known to be working, for example a simple static web page set up on the very server you are operating from.
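Putting those pieces together, a minimal sketch of a check script (assuming the behaviour described above: a dead proxy shows up as a resource timeout, a missing host as a 'Host not found' resource error):
// Run as: phantomjs --proxy=1.1.1.1:22 check.js 'http://example.com'
var page = require('webpage').create();
var system = require('system');
var url = system.args[1];

page.settings.resourceTimeout = 3000; // ms

page.onResourceTimeout = function (request) {
    // With a non-working proxy this typically fires first.
    console.log('Timeout on #' + request.id + ': the proxy is probably not responding');
};

page.onResourceError = function (resourceError) {
    // 'Host not found' here points at the target site rather than the proxy.
    console.log('Error ' + resourceError.errorCode + ' (' + resourceError.errorString + ') for ' + resourceError.url);
};

page.open(url, function (status) {
    console.log('open: ' + status);
    phantom.exit();
});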
Also, there is a Node.js module for this: proxy-checker.

XmlHttpRequest corrupts headers in Firefox 3.6 with "Content-Type:multipart/form-data"

I'm working on a "multiple AJAX uploader". It works fine in bleeding-edge browsers (Chrome 6, Firefox 4), but in Firefox 3.6 I must manually create the output string to be sent, because this browser doesn't support the FormData object.
I followed many tutorials, especially this one. The author points out the correct setup of the headers and of the body content to be sent. I carefully followed that advice, but Firefox 3.6 defeated my efforts.
This is the correct setup of headers and body (captured by submitting a simple static form):
correct headers, correct body
This is what I get when I use Firefox's XHR object to submit the same data:
wrong headers, wrong body
As you can see, the XHR's headers are corrupted. This leads to total failure of the file upload. Here is the code I use:
function generateBoundary()
{
    var chars = '0123456789',
        out = '';
    for (var i = 0, len = chars.length; i < 30; i++) {
        out += chars[Math.floor(Math.random() * len)];
    }
    return '----' + out;
}

function getMultipartFd(file, boundary)
{
    var rn = '\r\n',
        body = '';
    body = boundary + rn;
    body += 'Content-Disposition: form-data; name="Files[]"; filename="' + file.name + '"' + rn;
    body += 'Content-Type: ' + file.type + rn + rn;
    body += file.getAsBinary() + rn;
    return body;
}

$(function () {
    $startUpload.click(function () {
        var url = $uploadForm.attr('action'),
            xhr = new XMLHttpRequest(),
            boundary = generateBoundary(),
            file = null,
            body = '';
        file = $SOME_ELEMENT_WITH_ATTACHED_FILE.file;
        body = getMultipartFd(file, boundary);
        console.info(file);
        console.info(body);
        xhr.upload.onload = function () {
            console.info('done');
        };
        xhr.open('POST', url, true);
        xhr.setRequestHeader('Content-Type', 'multipart/form-data; boundary=' + boundary);
        xhr.sendAsBinary(body + boundary + '--' + '\r\n');
        return false;
    });
});
Here is also a dump of the file and body variables:
dump file, dump body
Does anybody have any idea why XHR is corrupting the headers this way?
I have been scoping the problem. I tried the code in a fresh Firefox installation under WinXP (my primary system is Arch Linux); the problem remains. I found that Mozilla's XHR has an additional property called 'multipart'. With this set to true, the headers are OK, but my xhr events aren't fired and JS crashes after sending the file.
I scoped a bit deeper with Firebug's JS debugger and found that after xhr.multipart = true; the code jumps into the deep waters of the jQuery library, where strange things happen around some curious events.
Even more curious is that the headers/content seem to be right in Firebug's console, but in the HttpFox extension they are corrupted.
