cy.intercept does not accept Hebrew letters in header values - cypress

I'm trying to modify request headers with cy.intercept in Cypress. When I enter English values everything is fine,
but when I enter Hebrew values I get:
"> Error: socket hang up"
For example:
Working: headers: { lastname: "shalom" }
Not working: headers: { lastname: "שלום" }
The file is encoded as UTF-8.
I've tried everything and nothing works.
If anyone can help me I would be very happy.
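This is consistent with HTTP header values being restricted to ISO-8859-1 text: the Node HTTP stack that Cypress runs on rejects header values containing characters outside that range, and the request dies with a socket error. A minimal sketch of one workaround, assuming you control (or can coordinate with) the server-side decoding, is to percent-encode the value so the header stays ASCII:

```javascript
// Hebrew text can't travel in a raw HTTP header value; percent-encoding
// turns it into ASCII that any header will accept.
const value = 'שלום';
const safe = encodeURIComponent(value);
console.log(safe);                               // "%D7%A9%D7%9C%D7%95%D7%9D"
console.log(decodeURIComponent(safe) === value); // true

// In a Cypress test this might look like (sketch; the route is hypothetical):
// cy.intercept('GET', '/api/*', (req) => {
//   req.headers['lastname'] = encodeURIComponent('שלום');
// });
```

The server then has to percent-decode the header before using it.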

Related

CakePHP 2.0 Inflector slug text encoding UTF-8 returns unexpected string

I have a problem with the Inflector::slug() function on my live server.
Locally everything works fine.
I use the following code:
Inflector::slug($data['menu_items']['page_url'], '_');
A URL like 'this is an url' outputs 't_u_l'.
It probably has something to do with text encoding, but I have changed everything to UTF-8.
Has anybody had this problem?
PHP has to be compiled with PCRE's "--enable-unicode-properties" option (a PCRE build flag, not a UTF-8 setting)
and/or:
it sounds like your PCRE libraries are broken (this happens when the PCRE libraries don't properly handle Unicode patterns).
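The symptom fits a regex engine that can't evaluate Unicode property classes: a slug routine that keeps only what matches a \p{L}-style class strips almost every letter once \p{L} stops matching. A rough JavaScript analogue of the healthy behavior (this slug function is a sketch, not CakePHP's actual implementation):

```javascript
// Keep Unicode letters and digits, collapse everything else into the
// replacement character, then trim stray replacements from the ends.
// Requires an engine with \p{...} support (the /u flag in JS;
// --enable-unicode-properties in PCRE).
function slug(text, replacement) {
  return text
    .replace(/[^\p{L}\p{Nd}]+/gu, replacement)
    .replace(new RegExp(`^${replacement}+|${replacement}+$`, 'g'), '');
}

console.log(slug('this is an url', '_')); // "this_is_an_url"
```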

I can't send a request from Firefox in ISO-8859-1

I have a Java EE application and my database is in ISO-8859-1, so I need to do the JSP encoding in ISO-8859-1 (all my pages are in ISO-8859-1).
I have a JSP with JavaScript code which makes a request to a Struts action.
This is my JS code:
$.ajax({
    type: 'GET',
    encoding: 'iso-8859-1',
    contentType: 'text/html;charset=ISO-8859-1',
    url: xUrl,
    success: function() {
        $("#MensajeOk").attr('style', 'display:block');
        $("#MensajeOk").delay(10000).slideUp(1000);
    }
});
With IE and Chrome everything is correct, because they encode the request in ISO-8859-1, but Firefox encodes the request in UTF-8, and this is a problem for me: on the server side I need ISO-8859-1, and with Firefox there are some characters I can't recover.
My form is
<html:form action="/action.do" acceptCharset="iso-8859-1">
<input type="text" name="title">
and my Java code is
new String((request.getParameter("title") + "").getBytes("iso-8859-1"), "iso-8859-1");
With it I can recover the text fine with IE and Chrome, but it fails with Firefox.
Another option would be to send the request UTF-8-encoded via encodeURI('data'), but on the server side I can't convert the text from UTF-8 to ISO-8859-1...
Any ideas?
Thanks a lot, and sorry for my English!
Looking at the documentation, there doesn't seem to be an option called encoding, but there is a nice little hint (including the solution to your problem) under the contentType option:
"Data will always be transmitted to the server using UTF-8 charset; you must decode this appropriately on the server side."
So it seems Firefox is right and the other browsers do it wrong: try removing the encoding option and doing the conversion server-side.
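In other words, no client-side option will change what Firefox puts on the wire; the fix is to decode those bytes as UTF-8 on the server (in Java terms, something like new String(param.getBytes("ISO-8859-1"), "UTF-8"), assuming the container decoded the raw bytes as ISO-8859-1 first). A small Node sketch of what the browser actually sends:

```javascript
// jQuery percent-encodes request data as UTF-8 regardless of options.
const title = 'añejo';                     // sample text with a non-ASCII char
const wire = encodeURIComponent(title);    // what actually goes on the wire
console.log(wire);                         // "a%C3%B1ejo" (UTF-8 bytes)

// Decoding the bytes as UTF-8 recovers the text:
console.log(decodeURIComponent(wire) === title); // true

// Interpreting those same bytes as ISO-8859-1 is what produces mojibake:
console.log(Buffer.from(title, 'utf8').toString('latin1')); // "aÃ±ejo"
```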

Firefox, PHP Headers and application/octet-stream

We have a page that outputs a file using PHP headers, so when you visit the page the download prompt comes up.
Randomly, the page will return the file as application/octet-stream; other times it will return the correct type.
It works perfectly in Chrome, IE, etc.
This only happens in Firefox, and it is completely random.
Basically, it outputs application/octet-stream instead of the correct content type.
After some more digging, I found the following:
http://support.mozilla.com/en-US/questions/800957
http://support.mozilla.com/en-US/questions/746116
http://support.mozilla.com/en-US/questions/699834

ajaxSubmit not working in IE jQuery Form plugin

I am using the jQuery Form plugin to upload images in my MVC project.
For some reason the code no longer works in IE (it worked before):
I can tell the submit is successful: the image is uploaded and recorded in the database. However, the response seems somehow corrupted in IE.
function showResponse(responseText, statusText, xhr, $form) {
    $("#loading").hide();
    AddImage(responseText.ImageId);
    buildArray();
}
I tested on Firefox, Chrome, and Safari, and it all works fine; however, when I use it in IE I get the error:
Message: 'ImageId' is null or not an object
Has anyone had a similar problem before?
Thanks in advance!
Well, the problem was solved by changing the content type from "text/plain" to "text/html", that's it.
OMFG, Internet Explorer!
The code I changed:
return Json(newImage, "text/html", Encoding.Unicode, JsonRequestBehavior.AllowGet);
Hope that helps someone else as well.
As said in the documentation about File Uploads:
It is important to note that even when the dataType option is set to 'script', and the server is actually responding with some javascript to a multipart form submission, the response's Content-Type header should be forced to text/html, otherwise Internet Explorer will prompt the user to download a "file".
I don't know why it worked before, but the documentation is clear about this concern.
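A related defensive measure: when the iframe transport delivers JSON with a text/html content type, the plugin may hand showResponse a raw string rather than a parsed object, which also produces "'ImageId' is null or not an object". A tolerant parse helper (parseResponse is a hypothetical name, not part of the plugin) covers both cases:

```javascript
// Accept either a parsed object or a raw JSON string from the transport.
function parseResponse(responseText) {
  return (typeof responseText === 'string')
    ? JSON.parse(responseText)
    : responseText;
}

console.log(parseResponse('{"ImageId": 7}').ImageId);  // 7
console.log(parseResponse({ ImageId: 7 }).ImageId);    // 7

// In showResponse: AddImage(parseResponse(responseText).ImageId);
```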

Asian characters in IE 8 get garbled in Server; is this due to HTTP header Content-Type?

One of the request parameters in an HTTP request made by the client contains Japanese characters. If I make this request in Firefox and look at the parameter as soon as it reaches the server by debugging in Eclipse, the characters look fine. If I do the same request using IE 8, the characters are garbled when I look at them at the same point in the server code (they look fine in both browsers, though). I have examined the POST requests made by both browsers, and they both pass the same sequence of characters, which is:
%2C%E3%81%9D%E3%81%AE%E4%BB%96
I am therefore thinking that this has to do with the encoding. If I look at the HTTP headers of the request, I notice the following differences. In IE:
Content-Type: application/x-www-form-urlencoded
Accept: */*
In Firefox:
Content-Type application/x-www-form-urlencoded; charset=UTF-8
Accept-Charset ISO-8859-1,utf-8;q=0.7,*;q=0.7
I'm thinking that the IE 8 header doesn't state the UTF-8 encoding explicitly, even though it's specified in the meta tag of the HTML document. I am not sure if this is the problem. I would appreciate any help, and please do let me know if you need more information.
Make sure the page that contains the form has UTF-8 as its charset. In IE's case, the best way to ensure this is by sending an HTTP header ('Content-Type: text/html; charset=utf-8') and adding a meta http-equiv tag with the content type/charset to your HTML (I've seen this actually matter, even when the appropriate header was sent).
Second, your form can also specify the content type:
<form enctype="application/x-www-form-urlencoded; charset=utf-8">
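As a sanity check on the question's premise: the percent-encoded sequence both browsers send decodes to the same text, so the garbling can only come from the server interpreting those bytes with the wrong charset, not from the browsers sending different data. Decoding it as UTF-8:

```javascript
// The identical byte sequence from both browsers, decoded as UTF-8:
const wire = '%2C%E3%81%9D%E3%81%AE%E4%BB%96';
console.log(decodeURIComponent(wire)); // ",その他" (a comma plus "sono ta", "other")
```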
