XMLHttpRequest POST 405 - Method Not Allowed

I have a problem that is driving me crazy because I'm not able to solve it. I want to upload some files from my application to an IIS server.
My HTML code is:
<input id="files" type="file" />
And in the controller, when I detect that a new file has been added, I use XMLHttpRequest:
document.getElementById('files').addEventListener('change', function (e) {
    var file = this.files[0];
    var xhr = new XMLHttpRequest();
    (xhr.upload || xhr).addEventListener('progress', function (e) {
        var done = e.position || e.loaded;
        var total = e.totalSize || e.total;
        console.log('xhr progress: ' + Math.round(done / total * 100) + '%');
    });
    xhr.open('POST', 'http://10.0.19.25:80/CG/files', true);
    xhr.addEventListener('load', function (e) {
        console.log('xhr upload complete', e, this.responseText);
    });
    xhr.send(file);
});
When I launch my app on Chrome, Firefox or IE, I get this error:
POST http://10.0.19.25/CG/files 405 (Method Not Allowed)
Thanks in advance!

I had the same error. The problem was that the method I tried to use didn't exist on that endpoint: I was sending POST, but the server expected PUT at that URL.
Looking at the server log will probably help!
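If the request is same-origin (or the server exposes the Allow header through CORS), you can also probe which verbs the URL accepts, since a 405 response normally carries an Allow header. A minimal sketch in the same XHR style, reusing the URL from the question:
var probe = new XMLHttpRequest();
probe.open('OPTIONS', 'http://10.0.19.25:80/CG/files', true);
probe.onload = function () {
    // the Allow header of the response lists the permitted methods, e.g. "GET, PUT"
    console.log('Allowed methods: ' + probe.getResponseHeader('Allow'));
};
probe.send();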

I think you need to read more about Content-Type. I had the same problem: I was sending JSON data to the server, and I just changed the Content-Type to application/json; charset=UTF-8. That helped; the default is text/html; charset=utf-8.
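For example, if the endpoint expects JSON, the header can be set on the request before sending (a minimal sketch; the URL and payload here are placeholders):
var xhr = new XMLHttpRequest();
xhr.open('POST', '/CG/files', true);
xhr.setRequestHeader('Content-Type', 'application/json; charset=UTF-8');
xhr.send(JSON.stringify({ name: 'example.txt' }));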

Related

<Img> tag does not download image, while image is available [duplicate]

I created a script that extracts photos from the gallery of a certain profile…
Using instagram-web-api
Unfortunately it no longer works; Instagram does not return the image of the media.
This is the error:
ERR_BLOCKED_BY_RESPONSE
Has Instagram changed its CORS policy recently? How can I fix this?
For PHP, I changed my img src to this and it works like a charm! Assume that $image is the Instagram image CDN link that came from the Instagram page:
'data:image/jpg;base64,'.base64_encode(file_get_contents($image))
EDIT FOR BETTER SOLUTION
I have also noticed that this method causes a lot of latency, so I changed my approach and now use a proxy PHP file (also mentioned somewhere on Stack Overflow, but I don't remember where).
This is my common proxy file content:
<?php
function ends_with( $haystack, $needle ) {
    return substr($haystack, -strlen($needle)) === $needle;
}

if (!in_array(ini_get('allow_url_fopen'), [1, 'on', 'true'])) {
    die('PHP configuration change is required for image proxy: allow_url_fopen setting must be enabled!');
}

$url = isset($_GET['url']) ? $_GET['url'] : null;
if (!$url || substr($url, 0, 4) != 'http') {
    die('Please, provide correct URL');
}

$parsed = parse_url($url);
if ((!ends_with($parsed['host'], 'cdninstagram.com') && !ends_with($parsed['host'], 'fbcdn.net')) || !ends_with($parsed['path'], 'jpg')) {
    die('Please, provide correct URL');
}

// instagram only has jpeg images for now..
header("Content-type: image/jpeg");
readfile( $url );
?>
Then I just converted all my Instagram image links to this (also, don't forget to use the urlencode function on the image links):
./proxyFile.php?url=https://www.....
It works like a charm, there is no latency anymore, and it is now working 100%.
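A client-side sketch of the same rewriting step (the proxy path and the data-ig-src attribute are assumptions for illustration; encodeURIComponent plays the role of PHP's urlencode here):
function proxied(instagramUrl) {
    return './proxyFile.php?url=' + encodeURIComponent(instagramUrl);
}
// rewrite every image that carries the original CDN link in a data attribute
document.querySelectorAll('img[data-ig-src]').forEach(function (img) {
    img.src = proxied(img.dataset.igSrc);
});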
You can try this.
corsDown
Using the Google Translate vulnerability, it can display any image URL, with or without permission. All of this is done from the visitor's own IP and computer.
I have the same problem. When I try to load an Instagram picture URL (I tried with 3 IP addresses), I see this in the console:
Failed to load resource: net::ERR_BLOCKED_BY_RESPONSE
You can see it here: the Instagram image doesn't load. (Actually, when I paste this URL into Google it works, but Instagram puts a timestamp on their pictures, so it's possible it won't work for you.)
Until very recently (3 days ago) it worked with no issues.
<img src="https://scontent-cdt1-1.cdninstagram.com/v/t51.2885-19/s320x320/176283370_363930668352575_6367243109377325650_n.jpg?tp=1&_nc_ht=scontent-cdt1-1.cdninstagram.com&_nc_ohc=nC7FG1NNChYAX8wSL7_&edm=ABfd0MgBAAAA&ccb=7-4&oh=696d56547f87894c64f26613c9e44369&oe=60AF5A34&_nc_sid=7bff83">
The answer is as follows. You can use the imgproxy.php file. You can do it like this:
echo '<a href="' . $item->link . '" class="image" target="_blank">
<span style="background-image:url(imgproxy.php?url=' . urlencode($thumbnail) . ');"> </span>
</a>';
Using PHP
You can grab the content of the image and serve it from a PHP file as an image by setting the header:
<?php
$img_ctn = file_get_contents("https://scontent-ber1-1.cdninstagram.com/v/......");
header('Content-type: image/png');
echo $img_ctn;
You can display the image using Base64 encoding.
The Base64 function is based on @abubakar-ahmad's answer.
JavaScript:
export const checkUserNameAndImage = (userName) => {
    /* CALL THE API */
    return new Promise((resolve, reject) => {
        fetch(`/instagram`, {
            method: "POST",
            headers: {
                Accept: "application/json",
                "Content-Type": "application/json",
            },
            body: JSON.stringify({ userName }),
        })
            .then(function (response) {
                return response.text();
            })
            /* GET RES */
            .then(function (data) {
                const dataObject = JSON.parse(data);
                /* CALL BASE64 FUNCTION */
                toDataUrl(dataObject.pic, function (myBase64) {
                    /* INSERT THE BASE64 PROPERTY INTO THE OBJECT */
                    dataObject.picBase64 = myBase64;
                    /* RETURN THE OBJECT */
                    resolve(dataObject);
                });
            })
            .catch(function (err) {
                reject(err);
            });
    });
};
Base64 func:
function toDataUrl(url, callback) {
    var xhr = new XMLHttpRequest();
    xhr.onload = function () {
        var reader = new FileReader();
        reader.onloadend = function () {
            callback(reader.result);
        };
        reader.readAsDataURL(xhr.response);
    };
    xhr.open("GET", url);
    xhr.responseType = "blob";
    xhr.send();
}
Now, instead of using the original URL, use the picBase64 property:
<img src={data.picBase64} />
I have built a simple PHP-based media proxy to minimize copy & paste.
https://github.com/skmachine/instagram-php-scraper#media-proxy-solving-cors-issue-neterr_blocked_by_response
Create a mediaproxy.php file in the web server's public folder and pass Instagram image URLs to it.
<?php
use InstagramScraper\MediaProxy;
// use allowedReferersRegex to restrict other websites hotlinking images from your website
$proxy = new MediaProxy(['allowedReferersRegex' => "/(yourwebsite\.com|anotherallowedwebsite\.com)$/"]);
$proxy->handle($_GET, $_SERVER);
I was too lazy to do the suggested solutions, and since I had a Node.js server sending me the URLs, I just wrote new functions to fetch the images, convert them to base64 and send them to my frontend. Yes, it's slower and heavier, but it gets the job done for me since I don't have a huge need for performance.
Snippet to fetch a URL and return base64:
const https = require("https");

const getBase64Image = async (url) => {
    return new Promise((resolve, reject) => {
        // Safety net so the entire app doesn't crash on a missing URL
        if (!url) {
            resolve(null);
            return;
        }
        https
            .get(url, (resp) => {
                resp.setEncoding("base64");
                let body = "data:" + resp.headers["content-type"] + ";base64,";
                resp.on("data", (data) => {
                    body += data;
                });
                resp.on("end", () => {
                    resolve(body);
                });
            })
            .on("error", (e) => {
                reject(e.message);
            });
    });
};
You don't need any external modules for this.
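A hypothetical usage on the Node side (pictureUrl stands in for whatever URL your scraper produced):
getBase64Image(pictureUrl)
    .then(function (dataUri) {
        // dataUri can be sent to the frontend and used directly as an <img> src
        console.log(dataUri.slice(0, 40) + "...");
    })
    .catch(console.error);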

How can I catch and save to a file data sent from a form by AJAX in ColdFusion

I have the following JavaScript code:
function upload(blob) {
    var xhr = new XMLHttpRequest();
    var url = "test.cfm";
    xhr.onload = function (e) {
        if (this.readyState === 4) {
            console.log("Server returned: ", e.target.responseText);
        }
    };
    var fd = new FormData();
    fd.append("randomname", blob);
    xhr.open("POST", url, true);
    xhr.send(fd);
}
How can I catch it on the server side with ColdFusion and save the blob object to a file?
Can someone please share a code sample? Thanks.
PS: I am pretty new to CF.
Since you are using FormData, you can access the form variable from the AJAX post just like you would with a normal HTTP request.
#form.randomname#
#form['randomname']#
So you could save the content to a file with:
<cfscript>
fileWrite( 'c:\myfile.txt', form.randomname );
</cfscript>

a file upload progress bar with node (socket.io and formidable) and ajax

I was in the middle of teaching myself some Ajax, and this lesson required building a simple file upload form locally. I'm running XAMPP on Windows 7, with a virtual host set up for http://test. The solution in the book was to use Node and an almost unknown package called "multipart", which was supposed to parse the form data but was crapping out on me.
I looked for the best package for the job, and that seems to be formidable. It does the trick: my file uploads locally and I get all the details back through Ajax. BUT, it won't play nice with the simple JS code from the book, which was supposed to display the upload progress in a progress element. SO, I looked around, and people suggested using socket.io to emit the progress info back to the client page.
I've managed to get formidable working locally, and I've managed to get socket.io working with some basic tutorials. Now, I can't for the life of me get them to work together. I can't even get a simple console log message sent back to my page from socket.io while formidable does its thing.
First, here is the file upload form by itself. The script inside the upload.html page:
document.getElementById("submit").onclick = handleButtonPress;
var httpRequest;
function handleResponse() {
if (httpRequest.readyState == 4 && httpRequest.status == 200) {
document.getElementById("results").innerHTML = httpRequest.responseText;
}
}
function handleButtonPress(e) {
e.preventDefault();
var form = document.getElementById("myform");
var formData = new FormData(form);
httpRequest = new XMLHttpRequest();
httpRequest.onreadystatechange = handleResponse;
httpRequest.open("POST", form.action);
httpRequest.send(formData);
}
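As an aside (not from the book), the browser itself can report upload progress through the XHR upload object; a minimal sketch, where the "prog" progress element id is a placeholder and the listener must be attached before httpRequest.send():
httpRequest.upload.addEventListener("progress", function (e) {
    if (e.lengthComputable) {
        document.getElementById("prog").value = Math.round(e.loaded / e.total * 100);
    }
});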
And here's the corresponding Node script (the important part being form.on('progress')):
var http = require('http'),
    util = require('util'),
    formidable = require('formidable');

http.createServer(function(req, res) {
    if (req.url == '/upload' && req.method.toLowerCase() == 'post') {
        var form = new formidable.IncomingForm(),
            files = [],
            fields = [];
        form.uploadDir = './files/';
        form.keepExtensions = true;
        form
            .on('progress', function(bytesReceived, bytesExpected) {
                console.log('Progress so far: ' + (bytesReceived / bytesExpected * 100).toFixed(0) + "%");
            })
            .on('file', function(name, file) {
                files.push([name, file]);
            })
            .on('error', function(err) {
                console.log('ERROR!');
                res.end();
            })
            .on('end', function() {
                console.log('-> upload done');
                res.writeHead(200, "OK", {
                    "Content-Type": "text/html", "Access-Control-Allow-Origin": "http://test"
                });
                res.end('received files: ' + util.inspect(files));
            });
        form.parse(req);
    } else {
        res.writeHead(404, {'content-type': 'text/plain'});
        res.end('404');
    }
    return;
}).listen(8080);
console.log('listening');
Ok, so that all works as expected. Now here's the simplest socket.io script which I'm hoping to infuse into the previous two to emit the progress info back to my page. Here's the client-side code:
var socket = io.connect('http://test:8080');
socket.on('news', function(data) {
    console.log('server sent news:', data);
});
And here's the server-side node script:
var http = require('http'),
    fs = require('fs');

var server = http.createServer(function(req, res) {
    fs.createReadStream('./socket.html').pipe(res);
});

var io = require('socket.io').listen(server);
io.sockets.on('connection', function(socket) {
    socket.emit('news', {hello: "world"});
});

server.listen(8080);
So this works fine by itself, but my problem comes when I try to place the socket.io code inside my form handling.... I've tried placing it anywhere it might remotely make sense, and I've tried the asynchronous mode of fs.readFile too, but it just won't send anything back to the client; meanwhile, the file upload portion still works fine. Do I need to establish some sort of handshake between the two packages? Help me out here. I'm a front-end guy, so I'm not too familiar with this back-end stuff. I'll put this aside for now and move on to other lessons.
Maybe you can create a room for a single client and then broadcast the percentage to that room.
I explained it here: How to connect formidable file upload to socket.io in Node.js
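A minimal sketch of that idea (not the linked answer's exact code; the 'register' event name and the session query parameter are conventions invented for illustration): the client generates an id, joins a private room over socket.io, and includes the same id in the upload URL so the server knows which room should receive formidable's progress events.

Server side:
var http = require('http'),
    urlLib = require('url'),
    formidable = require('formidable');

var server = http.createServer(function (req, res) {
    if (req.url.indexOf('/upload') === 0 && req.method.toLowerCase() === 'post') {
        // hypothetical convention: the client appends ?session=<id> to the upload URL
        var session = urlLib.parse(req.url, true).query.session;
        var form = new formidable.IncomingForm();
        form.uploadDir = './files/';
        form.on('progress', function (bytesReceived, bytesExpected) {
            // send the percentage only to the room belonging to this client
            io.sockets.in(session).emit('progress', Math.round(bytesReceived / bytesExpected * 100));
        });
        form.parse(req, function (err, fields, files) {
            res.writeHead(200, {'Content-Type': 'text/plain'});
            res.end('done');
        });
    }
});

var io = require('socket.io').listen(server);
io.sockets.on('connection', function (socket) {
    // the client announces its session id and joins a room named after it
    socket.on('register', function (session) {
        socket.join(session);
    });
});

server.listen(8080);

Client side:
var socket = io.connect('http://test:8080');
var session = Math.random().toString(36).slice(2); // client-generated id for this upload
socket.emit('register', session);
socket.on('progress', function (pct) {
    console.log('upload progress: ' + pct + '%');
});
// then POST the form to '/upload?session=' + session as before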

Inconsistent AJAX POST status 400. Issues with image complexity

Our team has developed a JS HTML5 canvas-based paint application. In the following code, the image data is fetched from the canvas as a base64-encoded data URL and posted to a servlet via AJAX. The data post behaves erratically: if the image is simple, as in a straight line, I get an AJAX status of 200 and the image gets saved. If the image is complex, I get a status of 400 and the data is not saved.
Why should the content of the POST create issues with posting the data itself?
function getCode() {
    var canvas = document.getElementById('imageView');
    var context = canvas.getContext('2d');
    // draw cloud
    context.beginPath();
    // save canvas image as data url
    var dataURL = canvas.toDataURL();
    // set canvasImg image src to dataURL
    // so it can be saved as an image
    document.getElementById('canvasImg').src = dataURL;
    var uri = document.getElementById('canvasImg').src;
    uri = uri.replace('data:image/png;base64,', '');
    uri = uri.replace('=', '');
    uri = uri.trim();
    alert("uri is " + uri);
    var ajaxobject;
    if (window.XMLHttpRequest) {
        ajaxobject = new XMLHttpRequest();
    } else if (window.ActiveXObject) {
        ajaxobject = new ActiveXObject("Microsoft.XMLHTTP");
    } else if (window.ActiveXObject) {
        ajaxobject = new ActiveXObject("Msxml2.XMLHTTP");
    }
    ajaxobject.open("POST", "SaveImageServlet?image=" + uri, true);
    ajaxobject.setRequestHeader("Content-type", "application/x-www-form-urlencoded");
    ajaxobject.onreadystatechange = function() {
        if (ajaxobject.readyState == 4) {
            alert(ajaxobject.status);
            if (ajaxobject.status == 200) {
                alert(ajaxobject.responseText);
            }
        }
    };
    ajaxobject.send(null);
}
From looking at your code, the problem seems to be that you're passing the data in the query string instead of using the request body (as you should be doing, since you're using the POST verb).
Your URI should look like this:
SaveImageServlet
without the question mark and the parameter. The parameter should be set in the request body. Using jQuery ajax, your request would look like this:
$.ajax({
    // leave contentType at jQuery's default (form-urlencoded) so the server
    // can read the value from the request body (e.g. Request.Form)
    data: {
        "image": yourBase64string
    },
    dataType: 'json', // or whatever return dataType you want ('text', 'html', ...)
    success: function(data) {
        // callback in case of success
    },
    error: function() {
        // callback in case of error
    },
    type: 'POST',
    url: '/SaveImageServlet'
});
On the server side, you should read the data from the appropriate place. For example, if you're using .NET, read it like this:
Request.Form["image"]
instead of:
Request.QueryString["image"]
This should work as intended and consistently.
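For the plain XMLHttpRequest version in the question, the same idea looks roughly like this (a sketch; dataURL is the canvas data URL from getCode, and the servlet is assumed to read the "image" POST parameter):
var xhr = new XMLHttpRequest();
xhr.open("POST", "SaveImageServlet", true);
xhr.setRequestHeader("Content-Type", "application/x-www-form-urlencoded");
xhr.onreadystatechange = function () {
    if (xhr.readyState === 4 && xhr.status === 200) {
        alert(xhr.responseText);
    }
};
// encodeURIComponent keeps the "+" and "=" characters of the base64 data intact
xhr.send("image=" + encodeURIComponent(dataURL));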
@Matteo, thanks for your help and effort. However, the AJAX issue never got solved. I found another way to send the base64 image data to the servlet: I just appended it to a hidden field and sent it as a regular form field.

XmlHttpRequest corrupts headers in Firefox 3.6 with "Content-Type:multipart/form-data"

I'm working on a "multiple ajax uploader". It works fine in bleeding edge browsers (Chrome 6, Firefox 4), but in Firefox 3.6 I must manually create the output string to be sent, because this browser doesn't support the FormData object.
I followed many tutorials, especially this one. The author points out the correct setup of the headers and the body content to be sent. I carefully followed that advice, but Firefox 3.6 defeated my efforts.
This is the correct setup of the headers and body (captured by submitting a simple static form):
(screenshot: correct headers, correct body)
This is what I get when I use Firefox's xhr object to submit the same data:
(screenshot: wrong headers, wrong body)
As you can see, the xhr's headers are corrupted. This leads to total failure of the file upload. Here is the code I use:
function generateBoundary()
{
    var chars = '0123456789',
        out = '';
    for (var i = 0, len = chars.length; i < 30; i++) {
        out += chars[Math.floor(Math.random() * len)];
    }
    return '----' + out;
}
function getMultipartFd(file, boundary)
{
    var rn = '\r\n',
        body = '';
    body = boundary + rn;
    body += 'Content-Disposition: form-data; name="Files[]"; filename="' + file.name + '"' + rn;
    body += 'Content-Type: ' + file.type + rn + rn;
    body += file.getAsBinary() + rn;
    return body;
}
$(function(){
    $startUpload.click(function(){
        var url = $uploadForm.attr('action'),
            xhr = new XMLHttpRequest(),
            boundary = generateBoundary(),
            file = null,
            body = '';
        file = $SOME_ELEMENT_WITH_ATTACHED_FILE.file;
        body = getMultipartFd(file, boundary);
        console.info(file);
        console.info(body);
        xhr.upload.onload = function(){
            console.info('done');
        };
        xhr.open('POST', url, true);
        xhr.setRequestHeader('Content-Type', 'multipart/form-data; boundary=' + boundary);
        xhr.sendAsBinary(body + boundary + '--' + '\r\n');
        return false;
    });
});
Here is also a dump of the file and body variables:
(screenshot: dump file, dump body)
Does anybody have any idea why xhr is corrupting the headers this way?
I scoped the problem further. I tried the code on a fresh Firefox installation under WinXP (my primary system is Arch Linux); the problem remains. I found that Mozilla's xhr has an additional property called 'multipart'. With this set to true, the headers are OK, but my xhr events aren't fired and the JS crashes after sending the file.
I dug a bit deeper with Firebug's JS debugger and found that after xhr.multipart = true; the code jumps into the deep waters of the jQuery library, where strange things happen around some curious events.
Even more curious is that the headers/content seem to be right in Firebug's console, but in the HttpFox extension they are corrupted.
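For completeness, a small feature-detection sketch (not from the original post) that prefers FormData where the browser supports it and only falls back to the hand-built multipart body on old browsers such as Firefox 3.6, reusing the xhr, file, url and boundary variables from the code above:
if (window.FormData) {
    // modern path: the browser builds the multipart body and headers itself
    var fd = new FormData();
    fd.append('Files[]', file);
    xhr.open('POST', url, true);
    xhr.send(fd);
} else {
    // legacy path: manual body plus explicit boundary header, as in the code above
    xhr.open('POST', url, true);
    xhr.setRequestHeader('Content-Type', 'multipart/form-data; boundary=' + boundary);
    xhr.sendAsBinary(getMultipartFd(file, boundary) + boundary + '--' + '\r\n');
}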
