I'm uploading files from a form on foo.bar.
I'm using jquery.form.js because I don't want the browser to navigate away.
var ref = this;
var options = {
    success: onSuccess,
    beforeSubmit: onBeforeSubmit
};
$("form#file-upload").ajaxForm(options);
It works when the "action" attribute is set to the same domain, but I get an error if the "action" points to a different domain (e.g. api.foo.bar):
Unsafe JavaScript attempt to access frame with URL
http://api.foo.bar/file/ from frame with URL http://foo.bar/index.php.
Domains, protocols and ports must match.
I am aware the jquery.form plugin is creating an iframe and posting the request there.
Is there a way to avoid the error?
Set dataType: "jsonp" in your options. See here: http://api.jquery.com/jQuery.ajax/
Here's more info on jsonp:
http://en.wikipedia.org/wiki/JSONP
http://remysharp.com/2007/10/08/what-is-jsonp/
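For reference, a minimal sketch of that change against the options object from the question, assuming the endpoint at api.foo.bar can actually wrap its response for a JSONP callback:

// Sketch only: same options as above, with dataType added as the answer suggests.
var options = {
    success: onSuccess,
    beforeSubmit: onBeforeSubmit,
    dataType: "jsonp"
};

$("form#file-upload").ajaxForm(options);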
I know the title is a bit confusing, but I will explain here instead.
I have a CMS where users insert their own information, and inside that page there is a URL field.
This means users are able to insert whatever format they like, such as:
https://backend.com
www.backend.com
backend.com
This URL is put into a button on the frontend; clicking the button redirects to that URL accordingly.
The issue I'm facing is with the third format (backend.com). If a user keys in "backend.com" and clicks the button on the frontend, they are redirected to https://mywebsite.com/backend.com, when it should redirect to "backend.com" instead.
I'm using Laravel and use {{$url}} to read the data from the database. Simplified code below:
//html
<a id="book-submit">Book</a>

//javascript
$("#book-submit").on('click', function() {
    $.ajax({
        url: '',
        type: "POST",
        data: {},
        success: function(response) {
            window.location.href = '{{$url}}';
        }
    });
});

//mysql
$check = DB::connection('mysql_api')->table('course')->select('url')->first();
$data['url'] = $check->url;
Is there any method to prevent this? I'm hoping some of you could provide me with some advice. Thanks!
To redirect to another page using window.location.href, you need http:// or https:// on the URL.
If you already have URL data in your database without http:// or https://, you need to handle it with your own logic, for example using strpos or a regex.
You can also validate the user input.
For that, you can use FILTER_VALIDATE_URL to validate the URL.
See http://php.net/manual/en/filter.filters.validate.php for additional flags
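Since the redirect happens in the JavaScript click handler, a minimal client-side sketch of the same idea looks like this (the regex check and the https:// default are assumptions, not something from the original code):

// Sketch only: prepend a scheme when the stored URL has none,
// so the browser treats it as absolute instead of relative.
function toAbsoluteUrl(url) {
    if (!/^https?:\/\//i.test(url)) {
        return 'https://' + url; // assumption: default to https://
    }
    return url;
}

// e.g. inside the success callback from the question:
// window.location.href = toAbsoluteUrl('{{$url}}');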
What is the correct way to open a SeaDragon viewer with straight XML data? I need to know what I'm doing wrong here. I have a bunch of DZI images that are hosted on another domain that I need to display, but I can't do a simple OpenSeadragon() call with the appropriate URLs because the domain the images are on has no "Access-Control-Allow-Origin" header. As such, I've set up a proxy controller to retrieve the XML data and pass that back to my web page. However, I can't get the images to load with the XML data.
I've been using a working image (from a different website) to test the issue and figure out what I need to do. When I use the following code, the image displays:
var viewer = OpenSeadragon({
    id: "openseadragon1",
    prefixUrl: "../../Scripts/openseadragon/images/",
    tileSources: "https://familysearch.org/dz/v1/TH-1971-27860-10353-27/image.xml?ctx=CrxCtxPublicAccess&session"
});
Now I'm trying to display the image the way I am with my Proxy controller, by retrieving the XML and using the XML in my OpenSeadragon call:
var ajaxresult = $.ajax({
    url: "https://familysearch.org/dz/v1/TH-1971-27860-10353-27/image.xml?ctx=CrxCtxPublicAccess&session",
    type: 'get',
    success: function (data) {
        // data is an XMLDocument object
        var viewer = OpenSeadragon({
            id: "openseadragon1",
            prefixUrl: "../../Scripts/openseadragon/images/",
            tileSources: data
        });
    },
    error: function (jqXHR, textStatus, errorThrown) {
        alert(jqXHR.responseText || textStatus);
    }
});
I get a blank image and my console says that every tile failed to load. I have also tried pasting the xml directly into the tileSources field as a string, like this:
tileSources: '<?xml version="1.0" encoding="utf-8"?><Image TileSize="256" Overlap="1" Format="jpg" ServerFormat="Default" xmlns="http://schemas.microsoft.com/deepzoom/2009"> <Size Width="6233" Height="4683" /></Image>'
but that doesn't work either.
What am I doing wrong here?
I found a way to resolve the issue. Because my images were hosted on an S3 account, I discovered that I could log into the account and add CORS configuration to each of the image buckets. So, no need to use Ajax to pull the XML; once I added CORS to the buckets, I was able to put the URLs in the OpenSeadragon call directly.
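For reference, a bucket CORS rule for this kind of tile access can be as small as the sketch below (the wildcard origin and the max-age value are placeholders, not necessarily what you would want in production):

<!-- Sketch of an S3 CORS rule allowing cross-origin GETs for the DZI XML and tiles -->
<CORSConfiguration>
    <CORSRule>
        <AllowedOrigin>*</AllowedOrigin>
        <AllowedMethod>GET</AllowedMethod>
        <AllowedHeader>*</AllowedHeader>
        <MaxAgeSeconds>3000</MaxAgeSeconds>
    </CORSRule>
</CORSConfiguration>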
Unfortunately OpenSeadragon does not yet support passing the XML directly; you'll have to break apart the info. See the answer here:
https://github.com/openseadragon/openseadragon/issues/460
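Until that lands, the GitHub issue's advice amounts to parsing the XML yourself and handing OpenSeadragon an inline DZI object. A rough sketch under a few assumptions: the /MyProxy/GetDzi endpoint is hypothetical, the proxy is assumed to return Content-Type: text/xml so jQuery gives you an XMLDocument, and the image_files/ tiles URL follows the usual DZI folder convention rather than anything confirmed in the question.

// Rough sketch: fetch the DZI XML through the proxy, pull the attributes out,
// and pass OpenSeadragon an inline tile-source object instead of the XML itself.
var xmlUrl = "/MyProxy/GetDzi"; // hypothetical proxy endpoint returning the image.xml content

$.get(xmlUrl, function (data) {
    var image = data.documentElement; // the <Image> element
    var size = image.getElementsByTagName("Size")[0];

    var viewer = OpenSeadragon({
        id: "openseadragon1",
        prefixUrl: "../../Scripts/openseadragon/images/",
        tileSources: {
            Image: {
                xmlns: "http://schemas.microsoft.com/deepzoom/2009", // same namespace as in the XML above
                // Assumption: tiles live in the usual image_files/ folder next to image.xml
                Url: "https://familysearch.org/dz/v1/TH-1971-27860-10353-27/image_files/",
                Format: image.getAttribute("Format"),
                Overlap: image.getAttribute("Overlap"),
                TileSize: image.getAttribute("TileSize"),
                Size: {
                    Width: size.getAttribute("Width"),
                    Height: size.getAttribute("Height")
                }
            }
        }
    });
});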
I have an application that contains a grid and a button for the user to export the grid to Excel. I want to show the Save As dialog box when the server responds with the Excel file. The server accepts the parameters as a JSON object. Here is my code:
Ext.Ajax.request({
    url: '/export/excel/',
    method: 'POST',
    // Send the query as the message body
    jsonData: jsonStr,
    success: function (result, request) {
        Ext.DomHelper.append(document.body, {
            tag: 'iframe',
            frameBorder: 0,
            width: 0,
            height: 0,
            //src: result,
            css: 'display:none;visibility:hidden;height:1px;'
        });
    }, //success
    failure: function (response, opts) {
        var msg = 'server-side failure with status code: ' + response.status + ' message: ' + response.statusText;
        Ext.Msg.alert('Error Message', msg);
    }
});
I know there is a similar question (ExtJS AJAX save as dialog box), but that references a static file on the server. In my case, depending upon the JSON sent, the result is going to be different each time. I get back the entire Excel file that I want in result.responseText. What needs to be done so that the dialog box pops up asking the user for Save As options? Also, I'm not sure how the src in the DomHelper should be configured. Any help would be really appreciated.
I believe the only way to do this in a totally client-agnostic way is to force it from the server side using a Content-Type of application/octet-stream and a Content-Disposition of 'attachment' with a suggested filename. See:
http://www.w3.org/Protocols/rfc2616/rfc2616-sec19.html#sec19.5.1
and
http://www.ryboe.com/tutorials/php-headers-force-download
You can also use so-called 'data URIs', but they have their own set of issues:
http://en.wikipedia.org/wiki/Data_URI_scheme
My recommendation would be to return the Excel content dynamically on the server and make the button a link to that URL, which POSTs the current data you're working with and sets the correct response headers to trigger the browser to download the file. Since you are doing an AJAX call and not letting the web browser request the URL directly, any HTTP headers you set to control how the browser interprets the content won't matter, because you are handling the response in JS, not in the user's browser. By returning the content directly to the user through a link on the server, you'll get around this problem.
Since you actually want the user to click on the link, I don't think AJAX is appropriate here.
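If you want to keep sending the grid's JSON from the client, one common pattern in that spirit is to submit it through a plain hidden form instead of Ext.Ajax, so the browser itself receives the attachment response and shows the Save As dialog. A rough sketch, assuming /export/excel/ sets Content-Disposition: attachment as described above and can read the JSON from a form field (the field name "data" is hypothetical):

// Sketch only: let the browser handle the response instead of Ext.Ajax.
// Assumes the server answers with an Excel Content-Type and
// Content-Disposition: attachment; filename="export.xls".
function downloadExcel(jsonStr) {
    var form = document.createElement('form');
    form.method = 'POST';
    form.action = '/export/excel/';

    var field = document.createElement('input');
    field.type = 'hidden';
    field.name = 'data'; // hypothetical parameter name the server would read
    field.value = jsonStr;
    form.appendChild(field);

    document.body.appendChild(form);
    form.submit(); // browser receives the attachment and shows the Save As dialog
    document.body.removeChild(form);
}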
I made the server side return the path of the location where the file is created and saved, instead of sending the file as an attachment in the response, hence avoiding the problem of setting the headers in the response. Then I just set the source of the iframe in the code to the path returned in the response (result.responseText).
I've come across a problem: if I use jQuery's get method to fetch some content and then click back, instead of actually going back one page in the history, the browser shows the content returned by the Ajax query.
Any ideas?
http://www.dameallans.co.uk/preview/allanian-society/news/56/Allanian-test
On the above page, if you use the pagination below the list of comments, you will notice that when you click back after changing a page, it shows the HTML content used to generate the list of comments.
I've noticed it doesn't always do it, but if you click on a different page a few times and then click the back button, it simply displays the JSON text in the window instead of the website.
For some reason, this only affects Chrome; IE and Firefox work OK.
Make sure your AJAX requests use a different URL from the full HTML documents. Chrome caches the most recent request even if it is just a partial.
https://code.google.com/p/chromium/issues/detail?id=108425
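For example, something as small as a dedicated query flag keeps the partial and the full page from sharing a cache entry. A sketch (the parameter name is just an illustration, not something Chrome requires):

// Sketch only: give the partial request its own URL so Chrome caches it
// separately from the full page.
$.getJSON(window.location.pathname + '?partial=1', function (json) {
    // render the comment list from the JSON here
});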
Just in case you are using jQuery with the History API (or some library like history.js), you should change $.getJSON to $.ajax with cache set to false:
$.ajax({
    dataType: "json",
    url: url,
    cache: false,
    success: function (json) {...}
});
Actually, this is the expected behavior of the caching system according to the specs, and not a Chrome issue. The cache only differentiates requests based on the URL and the request method (GET, POST, ...), not on any of the request headers.
But there is a Vary header to tell the browser to consider certain headers when checking the cache. For example, by adding Vary: X-Requested-With to the server response, the browser knows that the response varies if the request's X-Requested-With header changes. Or by adding Vary: Content-Type to the server response, the browser knows that the response varies if the request's Content-Type header changes.
In PHP you can add this line to your router:
header('Vary: X-Requested-With');
Or use a middleware in Node.js:
app.use(function(req, res, next) {
    res.header('Vary', 'X-Requested-With');
    next(); // pass the request on to the next handler
});
You can also add a random value to the end of the AJAX URL. This bypasses the previously cached Chrome entry and requests a fresh version:
url = '/?' + Math.random()
Just add the following header to the response headers:
Vary: Accept
I couldn't give a different URL to each AJAX request because it was AJAX pagination, and declaring no-cache headers did nothing, so I included a little JavaScript in the view, output only when the request headers were those of the AJAX request:
<script>
    if (typeof jQuery == 'undefined') {
        window.location = "<?php echo $this->here; ?>";
    }
</script>
It is a dirty trick, but it works: if the AJAX content is loaded normally, the container already has jQuery loaded, so the script does nothing. But if you load the supposed AJAX content without the surrounding content, jQuery is missing (at least in my case), so I redirect to the current page, requesting a normal GET with all the headers and scripts.
If you put it at the top of the page, the user won't notice, because it doesn't wait until the page loads; it redirects as soon as the browser gets these four lines.
Replace <?php echo $this->here; ?> with the current URL in your app; this was CakePHP 2.x.
Still had this problem in 2021 in Chrome.
The problem is making the underlying AJAX request to the same URL as the one the user is currently on.
I was working in Symfony, and the complete fix that did the trick for me was:
$response->headers->addCacheControlDirective('no-cache', true);
$response->headers->addCacheControlDirective('max-age', 0);
$response->headers->addCacheControlDirective('s-maxage', 0);
$response->headers->addCacheControlDirective('must-revalidate', true);
$response->headers->addCacheControlDirective('no-store', true);
/**
* from https://stackoverflow.com/a/1975677/5418514
*
* The HTTP request header 'Accept' defines the Content-Types a client can process.
* If you have two copies of the same content at the same URL, differing only in Content-Type,
* then using Vary: Accept could be appropriate.
*/
$response->headers->set('Vary', 'Accept');
@abraham's answer is right.
I just wanted to post a solution for Rails: all you need to do is add a different path to routes.rb.
For example, I have resource :people and I want to compose the index page from AJAX parts, one of which is the list of people. The straightforward way is to create index.js.erb and load the partial via AJAX using url: people_path. But this is where the issue occurs.
So, for Rails, it needs just add a different route, like
get 'people_list', to: 'people#index', as: :people_list, format: :js
If I want the index method of a Laravel controller to return both an HTML and a JSON response, I add a GET parameter at the end of the endpoint to bypass browser caching:
axios.get(url, {params: {ajax: 1}})
I have my jQuery Mobile app pulling data from our MySQL DB using JSONP. The data pulls fine, but the problem comes when I go back to the previous "page" in my app and then click a different option: it doubles the data on the next screen, and it will just keep stacking the data as many times as I do that. What am I missing?
The app doesn't look right in any browser, but it looks fine in the iOS simulator or the appMobi simulator. I can post some code if needed; just know it won't look right in your browser.
Thank you for any help you can provide
$('#two').bind('pagecreate', function(event) {
    var img = getUrlVars()["st"];
    var photo = $('#img');
    $.ajax({
        type: 'GET',
        url: 'http://serverhidden/json/img.php?st=' + img,
        dataType: 'jsonp',
        success: function(data) {
            $.each(data, function(i, item) {
                var image = '<img class="stmap" src="images/states/lrg/' + item.img + '"/>';
                photo.html(image);
            });
        },
        error: function(e) {
            // called if there is an error
            //console.log(e.message);
        }
    });
});
Make sure you are not subscribing to your event multiple times. It seems silly, but it's easy to do.
I would recommend you add logs to your JQM site so that you can see how many times your site is being updated.
You should also be aware that updating a JQMobile page often requires a call to a method to update content after a page is rendered. See here: jQuery Mobile rendering problems with content being added after the page is initialized
Hope those help.
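For the first point, a minimal sketch of guarding against double-binding, reusing the #two page and handler shape from the question (the .off().on() pair is just one way to do it):

// Sketch only: drop any previous pagecreate handler before binding again,
// so re-evaluating this script can't stack up duplicate handlers.
$('#two').off('pagecreate').on('pagecreate', function(event) {
    // ... same $.ajax call as in the question ...
});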
So without any code from your project this is a shot in the dark but it seems like you populate a pseudo-page with information on pageshow with an .append() call. Instead of using .append(), use .html() as it will replace the information already present rather than add to it.
If each state has an individual page then you can bind to the pagecreate (or similar) event so the data will only be appended once rather than on each pageshow event.