I'm new to this Ajax thing. I wanted to try this
http://labs.adobe.com/technologies/spry/samples/data_region/SuggestSample.html
neat little Autosuggest form.
The form doesn't work when I save it locally.
Below is a list of what I've done so far:
Firefox -> Save Page As... (index.html)
new folder (test23)
also saved products.xml
opened index.html
changed this line: var dsProducts = new Spry.Data.XMLDataSet("../../demos/products/products.xml", "/products/product", { sortOnLoad: "name" })
into: var dsProducts = new Spry.Data.XMLDataSet("products.xml", "/products/product", { sortOnLoad: "name" })
test failed :(
Can anyone help me out?
AJAX requests cannot access the local file system, so requests like that will fail. You will need to have the page up on a webserver. If you want a local one, install XAMPP or something similar.
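If you don't want a full XAMPP install, a minimal Node.js static server is enough for a quick test like this. This is only a sketch, assuming Node.js is installed; the file name serve.js and port 8080 are arbitrary choices:

// serve.js -- minimal static file server for local testing (Node.js assumed).
// Run "node serve.js" inside the test23 folder, then open http://localhost:8080/index.html
var http = require("http");
var fs = require("fs");
var path = require("path");

var types = {
    ".html": "text/html",
    ".js": "text/javascript",
    ".css": "text/css",
    ".xml": "text/xml"
};

http.createServer(function (req, res) {
    var file = path.join(__dirname, req.url === "/" ? "index.html" : req.url);
    fs.readFile(file, function (err, data) {
        if (err) {
            res.writeHead(404);
            res.end("Not found");
            return;
        }
        res.writeHead(200, { "Content-Type": types[path.extname(file)] || "application/octet-stream" });
        res.end(data);
    });
}).listen(8080);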
I just tried for like three minutes and got it to work on the first try (without images). You have to remember to get all the scripts and actually point to them in the main HTML file.
Don't forget the script tags on lines 41 through 43.
Kris
-- additions:
I tested on my Mac's local filesystem without any server using Safari as my browser. I have since deleted the files but could easily do it again and put the files up for download.
Related
I'm creating a web extension and porting it from XUL. I used to be able to easily read files with:
var dJsm = Components.utils.import("resource://gre/modules/Downloads.jsm").Downloads;
var tJsm = Components.utils.import("resource://gre/modules/Task.jsm").Task;
var fuJsm = Components.utils.import("resource://gre/modules/FileUtils.jsm").FileUtils;
var nsiPromptService = Components.classes["@mozilla.org/embedcomp/prompt-service;1"].getService(Components.interfaces.nsIPromptService);
....
NetUtil.asyncFetch(file, function(inputStream, status) {
    if (!Components.isSuccessCode(status)) {
        return;
    }
    // read the whole stream as a string, then Base64-encode it
    var data = NetUtil.readInputStreamToString(inputStream, inputStream.available());
    data = window.btoa(data);
    var encoded_data_to_send_via_xmlhttp = encodeURIComponent(data);
    ...
});
The above will be deprecated.
I can use downloads.download() to know what the last download was, but I can NOT read the file and then get the equivalent of encoded_data_to_send_via_xmlhttp.
Also, Firefox 57 onwards disallows "Access to file:// URLs or reading files without any explicit user input", which means that I have to fake a user action with a button click or something, or upload a file.
Isn't there an easy way to read the last downloaded file?
The WebExtension API won't allow extensions to read local files anymore. You could give the extension CORS privileges and read the content directly from the URL via fetch() or XMLHttpRequest() as a blob, store it in IndexedDB or in memory, then encode and send it to the server. This comes with many restrictions and limitations, such as which origins you can read from, and so forth.
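A rough sketch of that fetch-as-blob route, assuming the extension has host permission for the file's origin; url would come from the downloads API shown further down, and converting to Base64 via a FileReader is just one possible way:

// Sketch only: fetch the downloaded file's URL again as a blob and Base64-encode it.
// "url" would come from browser.downloads.search(); the origin must be allowed by host permissions.
fetch(url)
    .then(function (response) { return response.blob(); })
    .then(function (blob) {
        var reader = new FileReader();
        reader.onload = function () {
            // reader.result is a data: URL; strip the prefix to keep only the Base64 part
            var base64 = reader.result.split(",")[1];
            var encoded_data_to_send_via_xmlhttp = encodeURIComponent(base64);
            // ... send to the server as before
        };
        reader.readAsDataURL(blob);
    });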
Also, this would add potentially many unneeded steps. If the purpose is, as it seems to be in the question at the moment, to share the downloaded file with a server, I would instead suggest that you obtain the last DownloadItem object, extract the URL (.url) from that object and send that URL back to the server.
This way the server can load directly from that URL (and encode it on the server if needed). The network load will be about the same (a little less, actually, since there is no Base64 encoding involved, which adds 33% to the size), and there is much less load on the client. The server would read the data as a binary/byte stream, about the same as if the data had been sent directly from the extension.
To obtain the last downloaded file you would do the following from a privileged script:
browser.downloads.search({
    limit: 1,
    orderBy: ["-startTime"]
})
.then(getLastDownload);

function getLastDownload(downloads) {
    if (downloads.length) {
        var url = downloads[0].url;
        // ... send url to the server and let the server fetch the data from it directly
    }
}
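To illustrate the "send url to the server" comment above, something like this could go in its place; the /last-download endpoint and the JSON shape are made-up placeholders:

// Hypothetical endpoint; the server then fetches the file from "url" itself.
fetch("https://example.com/last-download", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ url: url })
});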
According to this Mozilla support question:
(2) Local file security
Firefox limits access from pages on web servers to pages on local disk or UNC paths. [...]
Which solution?
Use the local-filesystem-links Firefox addon (not tested)
and/or
Run a small local webserver on the client side; supposing the server is run with sufficient privileges, you can then access any local content via http:// (but still not via file:///).
I am trying to make a playground like Plunker. I just noticed very odd behavior in production (with Cloudflare in active mode), whereas it works well on localhost.
Via an iframe, the playground previews index_internal.html, which may contain links to other files (e.g., .html, .js, .css). The iframe is able to interpret external links such as <script src="script.js"></script>.
So each time a user modifies their file (e.g., script.js) in the editor, my program saves the new file into a temporary folder on the server, then refreshes the iframe with iframe.src = iframe.src; this works well on localhost.
However, I realized that, in production, the browser always keeps loading the initial script.js, even though users modify it in the editor and a new version is written to the folder on the server. For example, what I see in Dev Tools ==> Network is always the initial version of script.js, whereas I can check with less that the new version of script.js has been saved on the server.
Does anyone know why it is like this? And how to fix it?
Edit 1:
I tried the following, which did not work with script.js:
var iframe = document.getElementById('myiframe');
iframe.contentWindow.location.reload(true);
iframe.contentDocument.location.reload(true);
iframe.contentWindow.location.href = iframe.contentWindow.location.href
iframe.contentWindow.src = iframe.contentWindow.src
iframe.contentWindow.location.replace(iframe.contentWindow.location.href)
I tried adding a version query string; it worked for index_internal.html, but did not reload script.js:
var newSrc = iframe.src.split('?')[0]
iframe.src = newSrc + "?uid=" + Math.floor((Math.random() * 100000) + 1);
If I turn Cloudflare to development mode, script.js is reloaded, but I do want to keep Cloudflare in active mode.
I found it.
We can create a custom rule for caching in Cloudflare:
https://support.cloudflare.com/hc/en-us/articles/200168306-Is-there-a-tutorial-for-Page-Rules-#cache
For example, I could set the Cache Level to Bypass for the folder www.mysite.com/tmp/*.
I've uploaded them to a public folder. Everything works apart from these two.
Is there some extra "Google Drive Hosting" script I need to put in?
Or should it be working without any extra and I need re-check my code?
Many thanks!
I also faced the same problem while uploading my custom jQuery plugin. I was able to upload the .js file to an existing folder already in use for a previous project; I only hit this problem when creating a new folder.
In my new folder I just deleted that .js file and uploaded it again, and now it's working for me.
Save the file as .js, with this content:
(function ( $ ) {
    $.fn.doValidation = function(options) {
        // your code goes here
    };
}( jQuery ));
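Then it can be called on any selector, for example (the selector and options here are just illustrative):

// Example call; "#myForm" and the options object are placeholders.
$('#myForm').doValidation({ required: true });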
I want to add validation to a filefield in ExtJS 4, so that the user can only browse for .png and .jpeg image files. How should I do it?
{
    xtype: 'filefield',
    id: 'photoUpload',
    buttonOnly: true,
    buttonText: 'Photo'
}
I think it is important to understand how file upload works, so to prevent yourself from troubles in the future...
For security reasons, the following applies:
Browsers cannot access the file system unless the user has explicitly clicked on an upload field.
The browser has minimal access to the file being uploaded; in particular, your JS code may be able to see the file name (the browser has to display it in the field), but nothing else (the path reported by most browsers is not the real one).
The upload process itself happens in these steps:
The user clicks on an upload field, initiating the file select dialog.
The browser implements access to the file system through the dialog, allowing the user to select a file.
Upon OK click, the browser sends the file to the server.
The server places the file in its temp directory (configured per server).
Once upload is complete, the upload script on the server is called with the file details, and that script will have full access to the uploaded file.
The last step is the only point where you have full access to the file details, including the real actual name, its size, and its content.
Anything the browser gives JavaScript is browser dependent. Even the file name may vary between browsers; although all the browsers I know of do keep the actual file name (but not the real path), you cannot rely on this to keep working in future versions. The reason for this is that the file name is displayed on the client side.
So the recommendation is this:
Do all file upload checks on the server side.
Again, you may get away with using the file name on the JS client side, particularly if you know and can test which browsers your clients will use, but I'd strongly recommend doing this check on the server.
The last thing you have to remember is that users might upload a file ending with .png where the file itself is a .zip with the extension changed; so to really confirm that the file is a .png you need to actually look into the file data, which only the server can do.
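As an illustration of that last point only (the question doesn't say what the server runs, so this is a Node.js sketch, not the actual setup): the first eight bytes of a real PNG are always 89 50 4E 47 0D 0A 1A 0A, so the server can compare against that fixed signature:

// Sketch: verify the 8-byte PNG signature on the server side (Node.js assumed).
var fs = require("fs");

function isPng(filePath) {
    var signature = Buffer.from([0x89, 0x50, 0x4E, 0x47, 0x0D, 0x0A, 0x1A, 0x0A]);
    var header = Buffer.alloc(8);
    var fd = fs.openSync(filePath, "r");
    fs.readSync(fd, header, 0, 8, 0);  // read the first 8 bytes of the uploaded file
    fs.closeSync(fd);
    return header.equals(signature);
}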
{
    xtype: 'filefield',
    id: 'photoUpload',
    buttonOnly: true,
    vtype: 'fileUpload',
    buttonText: 'Photo'
}
And the VType which I have used:
Ext.apply(Ext.form.VTypes, {
    fileUpload: function(val, field) {
        var fileName = /^.*\.(gif|png|bmp|jpg|jpeg)$/i;
        return fileName.test(val);
    },
    fileUploadText: 'Image must be in .gif,.png,.bmp,.jpg,.jpeg format'
});
Try the following snippet in your 'filefield' xtype config:
regex: /\.(gif|jpg|jpeg|png)$/i,
regexText: 'Only image files allowed for upload',
msgTarget: 'under'
I'm looking for a way, through AJAX (not via a JS framework!), to monitor a file for changes in real time. If changes were made to that file, I need it to show an alert message. I'm a total AJAX noob, so please be gentle. ;-)
Edit: let me explain the purpose in a bit more detail. I've written a chat script in PHP for a webshop, and what I want is to monitor the chat requests from an admin module. The chats are stored in text files, and if someone starts a chat session a new file is created. If that's the case, I want to see it in the admin module in real time.
Makes sense?
To monitor a file for changes with AJAX you could do something like this.
var previous = "";

setInterval(function() {
    var ajax = new XMLHttpRequest();
    ajax.onreadystatechange = function() {
        if (ajax.readyState == 4) {
            if (ajax.responseText != previous) {
                alert("file changed!");
                previous = ajax.responseText;
            }
        }
    };
    ajax.open("POST", "foo.txt", true); // Use POST to avoid caching
    ajax.send();
}, 1000);
I just tested it, and it works pretty well, but I still maintain that AJAX is not the way to go here. Comparing file contents will be slow for big files. Also, you mentioned no framework, but you should use one for AJAX, just to handle the cross-browser inconsistencies.
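One lighter variant, my own suggestion rather than anything from the snippet above: poll only the Last-Modified response header with a HEAD request and compare it, assuming the server sends that header for the file:

// Sketch: poll the Last-Modified header instead of the full file body.
var lastModified = "";
setInterval(function () {
    var xhr = new XMLHttpRequest();
    xhr.onreadystatechange = function () {
        if (xhr.readyState === 4) {
            var modified = xhr.getResponseHeader("Last-Modified");
            if (modified && modified !== lastModified) {
                if (lastModified !== "") {
                    alert("file changed!");
                }
                lastModified = modified;
            }
        }
    };
    xhr.open("HEAD", "foo.txt", true); // add a cache-busting query string if the browser caches this
    xhr.send();
}, 1000);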
AJAX is just JavaScript, so by definition you have no way to access the file yourself unless some other service answers a JS/AJAX call to notify you about the change.
I've done that from scratch recently.
I don't know how much of a noob you are with PHP (it's the only server script language I know), but I'll try to be as brief as possible; feel free to ask if anything is unclear.
I'm using long polling, which works like this (a client-side sketch follows these steps):
Create a PHP script that checks the content of the file periodically and only responds when it sees any change (it could include a description of the change in the response)
Create your XHR object
Include your notification code as a callback function (it can use the description)
Make the request
The PHP script will start checking the file, but won't reply until there is a change
When it responds, the callback will be called and your notification code will launch
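A minimal client-side sketch of that loop; check_file.php and the plain-text response are hypothetical placeholders for your own script:

// Minimal long-poll loop: the request stays open until the PHP side sees a change.
function poll() {
    var xhr = new XMLHttpRequest();
    xhr.onreadystatechange = function () {
        if (xhr.readyState === 4) {
            if (xhr.status === 200) {
                // The PHP script only answers once it has seen a change,
                // so reaching this point means the file changed.
                alert("File changed: " + xhr.responseText);
            }
            poll(); // immediately open the next long-poll request
        }
    };
    xhr.open("GET", "check_file.php", true);
    xhr.send();
}
poll();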
If you don't care about the content of the file, only that it has been changed, you can check the last-modified time instead of the content in the PHP script.
EDIT: from a comment I see there's something called FAM for monitoring file changes; that seems to be the way to go for the PHP script.