I'm new to ColdFusion and am trying out its ORM (Hibernate under the hood, I believe, which I don't know at all).
I came across a problem when trying to call two CF pages in rapid succession. The code on the two pages is super simple:
getAppointments.cfm:
<cfscript>
ORMReload();
appointments = serializeJSON(EntityLoad("Appointment"));
</cfscript>
<cfoutput>#appointments#</cfoutput>
getRooms.cfm:
<cfscript>
ORMReload();
rooms = serializeJSON(EntityLoad("Room"));
</cfscript>
<cfoutput>#rooms#</cfoutput>
The code that I use to call them is jQuery/AJAX:
var appointments;
var rooms;
$(document).ready(function () {
loadAppointments();
loadRooms();
});
function loadAppointments() {
$.ajax({
type: 'get',
url: 'getAppointments.cfm'
}).done(function (response) {
appointments = JSON.parse(response);
}).fail(function (response) {
var message = response.status + " - " + response.statusText;
alert("Failed to load appointments: " + message);
});
}
function loadRooms() {
$.ajax({
type: 'get',
url: 'getRooms.cfm'
}).done(function (jsonString) {
rooms = JSON.parse(jsonString);
}).fail(function (response) {
var message = response.status + " - " + response.statusText;
alert("Failed to load rooms: " + message);
})
}
If I set a breakpoint to pause execution before loadRooms() is called, all is well. If I let the code run straight through I get a 500 error, so it's pretty obvious that I'm running into some sort of concurrency issue with the ORM due to the asynchronous nature of the AJAX calls.
I'm running CF on IIS (localhost), with an SQL Server database.
None of the tutorials on CF that I've seen covers this kind of scenario. I know I could defer execution inside the JS functions, but that would only be masking the underlying problem.
Can anybody point me towards a solution? Is there something similar to a C# lock available in CF?
You are calling ORMReload() on every request, which reloads all entities, clears the ORM cache, and so on. With two requests in flight at once, one reload tramples the other, hence the 500.
Please read up on ORMReload(): it should only be called once, whenever you make changes to your entity CFCs.
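As for the lock question: CFML does have <cflock> for serializing access to shared resources, but no lock is needed once the per-request reload is gone. A minimal sketch of getAppointments.cfm under that change (the ?ormreload URL flag is an illustrative convention, not a built-in):
<cfscript>
// Reload the ORM only when explicitly asked to, e.g. after editing entity CFCs
if (structKeyExists(url, "ormreload")) {
    ORMReload();
}
appointments = serializeJSON(EntityLoad("Appointment"));
</cfscript>
<cfoutput>#appointments#</cfoutput>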
I am using ASP.NET Core MVC, and my site has a lot of CRUD pages with dynamic UI, such as modals that load when you click a table row to display more information or a form.
This means all of my pages have lots of jQuery AJAX requests, and it's starting to get messy keeping track of them all.
Is there a better way to structure this, or to handle real-time UI, with ASP.NET Core? I would like to use only ASP.NET Core if possible, but I would be open to trying a client-side framework; I'm just not sure which would be suitable.
An example of the sort of code in my html files.
$(document).ready(function () {
var currentId = 1;
var controller = "Software";
$('tbody tr').click(function () {
var id = $(this).children()[0].textContent;
currentId = id;
$.ajax({
url: window.location.origin + "/" + controller + "/Details",
data: {
id: id
},
type: "GET",
dataType: "json",
complete: function (result) {
$('.modal-body').html(result.responseText);
}
})
$('#details-modal').modal('show');
})
$('#details-modal').on("click", "#edit-software", function (event) {
event.preventDefault();
$.ajax({
url: window.location.origin + "/" + controller + "/EditViewComponent",
data: {
id: currentId
},
type: "GET",
dataType: "json",
complete: function (result) {
$('.modal-body').html(result.responseText);
}
})
})
$('#risk-icon').hover(
function () {
$('#risk-img').css({ "display": "block" });
}, function () {
$('#risk-img').css({ "display": "none" });
})
});
You are already using a client-side (framework or) library with your jQuery AJAX calls here. You can replace this with a client-side framework while still using ASP.NET Core for your API returning JSON, just as you do today. Structuring components and handling all the AJAX calls are a couple of the exact reasons client-side frameworks have become so popular, as people demand and design more responsive UIs. There are many to choose from: Angular, React, Vue, Knockout. There are plenty of arguments for each one, so choosing depends on your more specific needs.
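Short of adopting a framework, you can also cut the repetition immediately by factoring the shared pattern into one helper. A minimal sketch, reusing the controller/action URL scheme from the question (loadIntoModal is a hypothetical name, not an ASP.NET or jQuery API):
function loadIntoModal(controller, action, data) {
    // One place for the repeated "GET an action, put the returned HTML in the modal" pattern.
    return $.ajax({
        url: window.location.origin + "/" + controller + "/" + action,
        data: data,
        type: "GET",
        complete: function (xhr) {
            // complete receives the jqXHR; responseText holds the rendered partial
            $('.modal-body').html(xhr.responseText);
        }
    });
}
// Usage, replacing the two near-identical $.ajax blocks above:
loadIntoModal(controller, "Details", { id: id });
loadIntoModal(controller, "EditViewComponent", { id: currentId });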
I have an AJAX POST that sends a CSV file to the server. I have SignalR 3 working to inform the client, via a progress bar, how far it has got through the 16,000-odd records it is processing. It works... for small files whose processing finishes in under two minutes. Any time processing takes longer, it times out.
I have looked here and it's not what I require.
I have tried adding an extended timeout using ".timeout: 600000" on the POST, but that doesn't work. It still times out at two minutes, although processing completes, the database is still updated, and it even tries to send back the records that were new.
Can I use a SignalR server call to keep the session alive? (How, with SignalR 3?)
Can I use AJAX?
I need it to wait just over 8 minutes to avoid the timeout.
AJAX code:
//On button click, POST the file and the SignalR connection Id back to the controller.
$('#SubmitFile').click(function (e) {
e.preventDefault(); // <------------------ stop default behaviour of button
var url = "/SuburbsAndPostcodesAdmin/FileIndexView";
var connId = $.connection.hub.id;
var fd = new FormData();
fd.append('File', $("#txtFileUpload")[0].files[0]);
fd.append('connId', connId);
//Post back the file with the SignalR Id.
$.ajax({
type: "POST",
url: url,
data: fd,
processData: false,
contentType: false,
success: function (result) {
//display the results in a partial view.
$("#DisplayResults").html(result);
},
error: function (xhr, status, error) {
alert(xhr.responseText);
},
});
});
SignalR client code:
// Reference the auto-generated proxy for the hub.
var progress = $.connection.progressHub;
// Start the connection.
$.connection.hub.start().done(function () {
connectionId = $.connection.hub.id;
$("#SubmitFile").removeAttr("disabled");
$("#initialise").html("<h4>" + "Ready to upload" + "</h4>" + "<br /> " + "<br />");
});
updating progress bar:
//Send from server to the client process current notification.
progress.client.notifyProgress = function (processedRecords, newRecords, percentage) {
$("#NoOfRecordsProcessedValue").text(processedRecords);
$("#NoOfNewRecordsFoundValue").text(newRecords);
$('.progress-bar')
.css('width', percentage + '%')
.attr('aria-valuenow', percentage);
if (percentage > 5) {
$("#workDone").text(percentage + "%");
}
};
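A note on the jQuery side of this: the client-side timeout is a plain timeout key inside the $.ajax settings object, so ".timeout: 600000" as written above would not be recognized. That option only controls how long the browser-side XHR waits; if the two-minute cutoff is enforced on the server (an IIS/ASP.NET request limit or a proxy idle timeout is an assumption here, not something the question confirms), it has to be raised there as well. A minimal sketch:
$.ajax({
    type: "POST",
    url: url,
    data: fd,
    processData: false,
    contentType: false,
    timeout: 600000, // 10 minutes, in milliseconds; a plain settings key, not ".timeout"
    success: function (result) {
        $("#DisplayResults").html(result);
    },
    error: function (xhr, status, error) {
        alert(xhr.responseText);
    }
});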
I am experiencing two issues with my jQuery record-inserting process, and I am hoping that this wonderful SO community can help me to solve at least one of those issues. These are the issues:
Issue 1 - Intermittent server delay
The first issue is that my Ubuntu 10.04 server seems to exhibit intermittent 4.5-second delays when POSTing data to the MySQL database. Most POST commands execute within a normal number of milliseconds, but when a delay occurs, it is always approximately 4.5 seconds. This is not a busy public server, so server load shouldn't be the problem. These short videos demonstrate what I am trying to explain:
Video 1
Video 2
I have posted a question on serverfault and am awaiting some input from that forum which is probably more appropriate for this Issue 1.
Issue 2 - Timing of jQuery POST and GET Methods
The real issue that I am trying to resolve is to prevent the call to GET before all of the calls to POST have completed. Currently, I have implemented $.when.apply to delay the sending of GET. Here is the code for that:
function(){
$.when.apply(undefined, InsertTheAPPs()).done(function (){
$.ajax({
url: sURL + "fileappeal/send_apps_email",
success: function() {
var m = $.msg("my message",
{header:'my header', live:10000});
setTimeout(function(){
if(m)m.setBody('...my other message.');
},3000);
setTimeout(function(){
if(m)m.close(function(){
window.location.replace(sURL+'client/view');
});
},6000);
$('#ajaxShield').fadeOut(1000);
},
error: function(){
$.msg("error message",
{header:'error header', live:10000});
}
});
});
}
My problem arises from the delay described in Issue 1. The GET is being fired after all of the POSTs have begun, but I need it to wait until all of the POSTs have ended. This is the issue I need assistance with: my confirmation email is being sent before all of the records have been fully inserted into the MySQL database.
Here is the code for the jQuery $.each loop. This code must not only begin but also end before the AJAX call to fileappeal/send_apps_email above:
function InsertTheAPPs(){
$('input[name=c_maybe].c_box').each(function(){
var jqxhrs = [];
if($(this).prop('checked')){
var rn = $(this).prop('value');
jqxhrs.push(
$.ajax({
url: sURL + 'fileappeal/insert_app',
type:"POST",
dataType: 'text',
data: {'rn': rn},
error: function(data) {console.log('Error:'+rn+'_'+data);}
})
)
return jqxhrs;
}
});
}
Anyone have any suggestions for how I can workaround the server delay issue and prevent the call to the GET before all of the POST methods have completed? Thanks.
There's a small problem with your post. After you resolve it, this post should help you finish out your code: jQuery Deferred - waiting for multiple AJAX requests to finish
You're returning inside the .each, but the function itself doesn't return anything, so your $.when is never given the array of AJAX calls to wait for. Also, since your jqxhrs array is defined inside the .each, its scope is a fresh array per iteration over each c_box. Your method should look like this:
function InsertTheAPPs(){
var jqxhrs = [];
$('input[name=c_maybe].c_box').each(function(){
if($(this).prop('checked')){
var rn = $(this).prop('value');
jqxhrs.push(
$.ajax({
url: sURL + 'fileappeal/insert_app',
type:"POST",
dataType: 'text',
data: {'rn': rn},
error: function(data) {console.log('Error:'+rn+'_'+data);}
})
)
}
});
return jqxhrs;
}
You can also simplify your code. Since you just want to know whether something is checked, you can use the jQuery pseudo-class filter :checked, such as:
function InsertTheAPPs(){
var jqxhrs = [];
$('input[name=c_maybe].c_box').filter(':checked').each(function(){
var rn = $(this).prop('value');
jqxhrs.push(
$.ajax({
url: sURL + 'fileappeal/insert_app',
type:"POST",
dataType: 'text',
data: {'rn': rn},
error: function(data) {console.log('Error:'+rn+'_'+data);}
})
)
});
return jqxhrs;
}
You could combine the :checked filter into the main selector, such as $('input[name=c_maybe].c_box:checked'), but I left it in long form to really demonstrate what is going on.
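For completeness, with the corrected function the caller from the question now works as intended; a minimal sketch:
// $.when.apply receives the jqXHR array returned by InsertTheAPPs(),
// so .done fires only after every POST has finished.
$.when.apply(undefined, InsertTheAPPs()).done(function () {
    // now it is safe to fire the GET to fileappeal/send_apps_email
});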
I'm attempting to pull two separate things from outside sources onto an HTML page I'm creating. I have a working AJAX function that pulls the most recent video from a particular YouTube channel by parsing the channel's XML/RSS feed, which I receive through an AJAX call.
I'd also like to get the most recent blog post from a Blogger account. The code for parsing that feed to get the most recent entry shouldn't be difficult, but I'm having trouble with simultaneous AJAX calls. I read somewhere that it can only handle one at a time? I'm wary of queuing them because I don't want the content on the page to load in steps; I'd rather it all be fetched simultaneously. How might I go about doing this?
Here is my current script:
<script type="text/javascript" charset="utf-8">
$(function() {
$.ajax({
type: "GET",
url: "http://gdata.youtube.com/feeds/base/users/devinsupertramp/uploads?orderby=updated&alt=rss&client=ytapi-youtube-rss-redirect&v=2",
dataType: "xml",
success: parseXml
});
});
function parseXml(xml) {
$(xml).find("item:first").each(
function() {
var tmp = $(this).find("link:first").text();
tmp = tmp.replace("http://www.youtube.com/watch?v=", "");
tmp = tmp.replace("&feature=youtube_gdata", "");
var tmp2 = "http://www.youtube.com/embed/" + tmp + "?autoplay=1&controls=0&rel=0&showinfo=0&autohide=1";
var iframe = $("#ytplayer");
$(iframe).attr('src', tmp2);
}
);
}
</script>
I read somewhere that it can only handle one at a time?
Either you misunderstood what the person was trying to say or they were incorrect. JavaScript doesn't run any functions concurrently, so someone might loosely phrase that as "can only handle one at a time", but that doesn't mean you can't make multiple AJAX calls. jQuery is smart and will do what it needs to do to make sure both calls are executed eventually.
If you'd like all the content to be loaded simultaneously, the sad fact is that you can't. However, you can make it appear that way to the user by declaring a flag that is set by the success method of each call, then keeping the content hidden until both flags have been set.
EDIT:
Here's a very simplistic approach to make it appear that they are fetched simultaneously:
<script type="text/javascript" charset="utf-8">
var youtubeComplete = false;
var otherComplete = false;
$(function() {
$.ajax({
type: "GET",
url: "http://gdata.youtube.com/feeds/base/users/devinsupertramp/uploads?orderby=updated&alt=rss&client=ytapi-youtube-rss-redirect&v=2",
dataType: "xml",
success: parseXml
});
$.ajax({
type: "GET",
url: "http://someotherdata.com/",
dataType: "xml",
success: function() { otherComplete = true; checkFinished(); }
});
});
function parseXml(xml) {
$(xml).find("item:first").each(
function() {
var tmp = $(this).find("link:first").text();
tmp = tmp.replace("http://www.youtube.com/watch?v=", "");
tmp = tmp.replace("&feature=youtube_gdata", "");
var tmp2 = "http://www.youtube.com/embed/" + tmp + "?autoplay=1&controls=0&rel=0&showinfo=0&autohide=1";
var iframe = $("#ytplayer");
$(iframe).attr('src', tmp2);
}
);
youtubeComplete = true;
checkFinished();
}
function checkFinished()
{
if(!youtubeComplete || !otherComplete) return;
// ... Unhide your content.
}
</script>
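As an aside, not part of the answer above: since $.ajax returns a promise-compatible object, jQuery's own $.when can replace the manual flags. A minimal sketch, where youtubeFeedUrl and bloggerFeedUrl are hypothetical placeholders for the two feed URLs:
$.when(
    $.ajax({ type: "GET", url: youtubeFeedUrl, dataType: "xml" }),
    $.ajax({ type: "GET", url: bloggerFeedUrl, dataType: "xml" })
).done(function (ytResult, blogResult) {
    // with multiple requests, each argument is an array: [data, statusText, jqXHR]
    parseXml(ytResult[0]);
    // ... parse blogResult[0] here, then unhide the content ...
});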
The browser will support multiple outbound calls but there is a cap per domain. Take a look at this related Q/A How many concurrent AJAX (XmlHttpRequest) requests are allowed in popular browsers?.
There are several good libraries for request scheduling, including chaining and parallelizing AJAX calls. One good library is https://github.com/kriskowal/q, which provides an async promises framework enabling arbitrarily complicated chaining of AJAX requests. Q minified is about 3.3 KB.
// The jQuery.ajax function returns a 'then'-able
Q.when($.ajax(url, {dataType: "xml"}))
    .then(function (data) {
        var parsedXML = parseXML(data);
        ...
        // some other ajax call
        var urls = [Q.when($.ajax(url2, {data: {user: data.userId}})),
                    Q.when($.ajax(url3, {data: {user: data.userId}}))];
        // run the requests in parallel
        return Q.all(urls);
    })
    .then(function (data) {
        // data retrieved from url2 and url3
    });
This is my Ajax:
$("form[0] :text").live("keyup", function(event) {
event.preventDefault();
$('.result').remove();
var serchval = $("form[0] :text").val();
if(serchval){
$.ajax({
type: "POST",
url: "<?= site_url('pages/ajax_search') ?>",
data: {company : serchval},
success: function(data) {
var results = (JSON.parse(data));
console.log(results);
if(results[0]){
$.each(results, function(index) {
console.log(results[index].name);
$("#sresults").append("<div class='result'>" + results[index].name + "</div>");
});
}
else {
$("#sresults").append("<div class='result'>לא נמצאו חברות</div>");
}
}
});
}
});
When I type slowly (slower than a letter per second) I get the correct results; when I type faster I get the same results twice.
example:
slow typing: res1 res2 res3
fast typing: res1 res2 res3 res1 res2 res3
Also, any advice on improving the code would be welcome!
That's what is happening (pseudocode):
When you're typing slow:
.keyup1
.remove1
//asynchronous ajax1 request takes some time here...
.append1
.keyup2
.remove2
//asynchronous ajax2 request takes some time here...
.append2
When you're typing fast:
.keyup1
.remove1
//asynchronous ajax1 request takes some time here...
//and keyup2 happens before ajax1 is complete
.keyup2
.remove2
.append1
//asynchronous ajax2 request takes some time here...
.append2
//two results were appended _in a row_ - therefore duplicates
To solve the duplicates problem, you want to make the removal/appending of results an atomic operation, using .replaceWith.
Build the results HTML block first as a string, and then call .replaceWith instead of .remove/.append:
var result = '';
for (var i = 0; i < results.length; i++) {
    result += "<div class='result'>" + results[i].name + "</div>";
}
$("#sresults").replaceWith('<div id="sresults">' + result + '</div>');
Another problem (not related to duplicates) is that a response to an older request may arrive late and overwrite the results of a newer one (because AJAX is asynchronous, and the server may issue responses in a different order than it received the requests).
One approach to avoid this is attaching a round-trip marker (a kind of "serial number") to each request and checking it in the response:
//this is global counter, you should initialize it on page load, global scope
//it contains latest request "serial number"
var latestRequestNumber = 0;
$.ajax({
type: "POST",
url: "<?= site_url('pages/ajax_search') ?>",
//now we're incrementing latestRequestNumber and sending it along with request
data: {company : serchval, requestNumber: ++latestRequestNumber},
success: function(data) {
var results = (JSON.parse(data));
//server should've put "serial number" from our request to the response (see PHP example below)
//if response is not latest (i.e. other requests were issued already) - drop it
if (results.requestNumber < latestRequestNumber) return;
// ... otherwise, display results from this response ...
}
});
On server side:
function ajax_search() {
$response = array();
//... fill your response with search results here ...
//and copy request "serial number" into it
$response['requestNumber'] = $_REQUEST['requestNumber'];
echo json_encode($response);
}
Another approach would be to make the .ajax() requests synchronous by setting the async option to false. However, this may temporarily lock the browser while a request is active (see the docs).
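For illustration only, reusing the question's request (generally discouraged, since the page freezes while each request runs):
$.ajax({
    type: "POST",
    url: "<?= site_url('pages/ajax_search') ?>",
    data: { company: serchval },
    async: false, // blocks until the response arrives, so replies cannot come back out of order
    success: function (data) {
        // ... render the results exactly as before ...
    }
});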
And you should also definitely introduce a timeout, as algiecas suggests, to reduce load on the server (this is a third issue, unrelated both to the duplicates and to request/response ordering).
You should introduce a timeout before calling AJAX. Something like this should work:
var timeoutID;
$("form[0] :text").live("keyup", function(event) {
clearTimeout(timeoutID);
timeoutID = setTimeout(function()
{
$('.result').remove();
var serchval = $("form[0] :text").val();
if(serchval){
$.ajax({
type: "POST",
url: "<?= site_url('pages/ajax_search') ?>",
data: {company : serchval},
success: function(data) {
var results = (JSON.parse(data));
console.log(results);
for (var i = 0; i < results.length; i++) {
    console.log(results[i].id);
    $("#sresults").append("<div class='result'>" + results[i].name + "</div>");
}
}
});
}
}, 1000); // timeout in milliseconds
});
I hope this helps.
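One refinement on top of this, an addition rather than part of the answer above: the object $.ajax returns exposes abort(), so the debounced handler can also cancel a still-running search before starting a new one, which removes the out-of-order-response risk entirely. A sketch, with pendingRequest as a hypothetical handle:
var timeoutID;
var pendingRequest = null; // hypothetical handle to the in-flight search request

$("form[0] :text").live("keyup", function () {
    clearTimeout(timeoutID);
    timeoutID = setTimeout(function () {
        if (pendingRequest) {
            pendingRequest.abort(); // cancel the previous search, if any
        }
        pendingRequest = $.ajax({
            type: "POST",
            url: "<?= site_url('pages/ajax_search') ?>",
            data: { company: $("form[0] :text").val() },
            success: function (data) {
                // ... render the results as in the answer above ...
            }
        });
    }, 1000);
});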