I have hundreds of categories and sub-categories that were entered earlier. I want to be able to consolidate them and move them into a more logical order. However, in the adminhtml each move takes a long time.
I have already set catalog_rewrite_url to Manual mode. Can I disable anything else, programmatically, to make this faster?
Alternatively, is there any way I can programmatically change all of them into the new order?
(P.S. I am sorry if this is the wrong forum to ask this)
See this:
Magento - Move a category programmatically
You could work out all the moves you want and put together a little script to do it all for you.
You might also want to clear out the catalog_rewrite_url table completely (if you are on a dev environment). It can be rebuilt (it's an index), and that way the old URL redirects won't have to be written.
Actually, if you are on dev, try clearing this table out anyway to see how it speeds up using normal admin.
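For the scripted approach, a sketch along these lines should work in Magento 1 (the category IDs and the `$moves` map below are made-up placeholders; `move()` on the `catalog/category` model is the same operation the admin tree performs). Test on a dev copy and take a backup first:

```php
<?php
// Sketch of a one-off move script, run from the Magento 1 root directory.
// The category IDs below are placeholders; map out your real moves first.
require_once 'app/Mage.php';
Mage::app('admin');

// category ID => array(new parent ID, sibling ID to place it after, or null for first)
$moves = array(
    42 => array(10, null),
    43 => array(10, 42),
);

foreach ($moves as $categoryId => $target) {
    list($newParentId, $afterId) = $target;
    $category = Mage::getModel('catalog/category')->load($categoryId);
    $category->move($newParentId, $afterId); // same call the admin tree uses
}
```

Running all the moves in one script avoids paying the admin page-load overhead once per move.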
I have the same problem. After examining the code, I just changed the JS code so that the user sees the category moved in the category tree immediately, while the actual PHP backend job keeps processing for some time. To do that, you need to override:
app/design/adminhtml/default/default/template/catalog/category/tree.phtml
change the JS Ajax request code like this:
new Ajax.Request(
    '<?php echo $this->getMoveUrl() ?>',
    {
        method: 'POST',
        parameters: pd.join(""),
        onCreate: function(request) {
            console.log('onCreate()');
            Ajax.Responders.unregister(varienLoaderHandler.handler);
        },
        onComplete: function(request) {
            console.log('onComplete()');
            Ajax.Responders.register(varienLoaderHandler.handler);
        },
        onSuccess: success,
        onFailure: failure
    }
);
With this, the screen will not be blocked by JS while the actual move is made. You will also be able to move multiple categories while one is already being moved.
The code is not complete. You will need to add constraints, for example that a category cannot be moved into a category that is itself still being moved.
But if you know what you're doing everything will be ok. :)
Related
I have a server-side "cart" variable that gets updated via an AJAX call when a button is clicked (I'm using Shopify, if it matters, but I feel that this is a general question). I am using AJAX to reload a div once the cart changes. The problem I encountered was this:
I submit the "update cart" AJAX call
Immediately after, I try to reload the div
Depending on the exact timing, maybe 1 out of every 10 times the reload would use the old cart data, since the cart change hadn't registered on the server yet.
I came up with a solution to use setInterval, but I think there are some serious problems with this method. I'll show the code first, then share my concerns.
function addToCart(prodid, prodHandle, sizeString) {
    var oldSpaces = getNumSpaces(); //gets the number of "free spaces" to display
    //the actual call to update the cart
    push_to_queue(prodid, 1, {}, Shopify.moveAlong);
    //now wait for the number of items to change (ignore the possibility of cart update
    //failure, that's handled elsewhere)
    var timerVar = setInterval(function() {
        var newSpaces = getNumSpaces();
        if (newSpaces != oldSpaces) {
            $(document).ready(function() {
                $("#toybox").load(" #toybox > *");
                clearInterval(timerVar);
            });
        }
    }, 200);
}
My concerns are that this feels extremely hacky. I'm running my update function once every 200ms. Is there a way (in general preferably but in shopify only if need be) to ask the server itself to let me know when something has changed?
This seems like a strange question. The server does not change the cart. The client changes the cart. So when you ask for a preferable way to ask the server to let you know when something has changed, the answer will always be, that is never going to happen.
If you want to know when the cart changes, you will always know since you can listen to all cart events client side. Since you are coding up things client side, you need not trouble yourself with server events.
That is how the Shopify cart works, and you are asking for advice with that in mind, so I hope this helps. Polling every 200 ms, or at any interval, is a pointless exercise in wasting browser cycles: run your follow-up code in the update call's completion callback instead.
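To make that concrete, here is a minimal sketch of the callback approach that replaces the polling loop. `updateCart` below is a stand-in for whatever asynchronous call updates the cart (Shopify's own AJAX endpoints accept success callbacks); the point is that the reload runs only once the update has completed:

```javascript
// Stand-in for an async cart update call (e.g. a POST to the cart endpoint).
// It resolves only after the server-side state has actually changed.
function updateCart(cart, prodid) {
  return new Promise(function (resolve) {
    setTimeout(function () {     // simulate server latency
      cart.items.push(prodid);
      resolve(cart);
    }, 10);
  });
}

// Stand-in for re-rendering the "free spaces" div from current cart state.
function renderFreeSpaces(cart, capacity) {
  return 'free spaces: ' + (capacity - cart.items.length);
}

var cart = { items: [] };
updateCart(cart, 42).then(function (c) {
  // runs only after the update completes: no stale reads, no polling
  console.log(renderFreeSpaces(c, 10)); // → free spaces: 9
});
```

Chaining on completion removes both the 200 ms timer and the race condition, because the update has already been acknowledged by the time the reload fires.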
I need help with the form submission and saving to db in my ruby camping web app.
The app itself is a quality assurance app. Users fill in forms for any errors detected during the quality check. The problem is that the check can take a couple of hours, or in extreme cases even more. During that time, in order to save the data, users have to submit the form several times. Currently I've set it up so that users submit the form and are immediately redirected back to it.
How would I go about enabling some sort of autosaving, say every 10 minutes or upon each change to the form, so users don't have to do it manually?
This is quite difficult for me because I'm not a programmer per se, but I learn as I go along to optimize my processes.
I've been reading about ajax and jquery but I'd really appreciate if someone could point me in the right direction since I don't know where to start and camping examples are rather scarce.
EDIT
Some additional info:
I've successfully implemented jQuery in my Camping app and I can manipulate HTML elements, but AJAX doesn't work. The Firebug console doesn't return any errors.
Here are my assumptions:
If the form can be successfully submitted manually, I don't have to change anything in my controller or view in order for AJAX to work, right?
The URL passed to AJAX doesn't have to include the object id for which I want to submit the form, right? I'm passing the '/edit' URL, and the full URL to the relevant object is, for example, '/edit/5'.
Here is the code from the html head:
script :type => 'text/javascript' do "
    $(document).ready(function() {
        $('#np').change(function() {
            $('#add_form').submit(function() {
                $.ajax({
                    type: 'POST',
                    url: '/edit',
                    data: $('#add_form').serialize(),
                });
            });
        });
        $('#search_indiv_form').click(function(){
            $('#search_indiv_form_toggle').toggle(700);
        });
    });"
end
Are there any errors in my AJAX code that prevent the form from being submitted?
thank you.
regards,
seba
I'm confused as to how to accomplish this. I have a page which has a popup filter with some input elements and an "Apply" button (not a submit). When the button is clicked, two jQuery .get() calls are made, which load a graph, a DataTables grid, photos, and miscellaneous info into four separate tabs. Inside the graph, if one clicks on a particular element, the user is taken to another page where the data is drilled down to a finer level. All this works well.
The problem is when the user decides to go back to the original page and expects to see the Ajax-generated graph/grid/photos, etc. Originally I thought that I would store a session variable with the filter variables used to form the original query, and on returning to the page, if the session var was found, the original Ajax call would be made again, re-populating the tabs.
The problem that I find with this method is that Coldfusion doesn't recognize that the session variable has been set when returning to the page using the browser's back button. If I dump out the session var at both the original and the second page, I can see the newly set var at the second page, and I can see it if I go to the original page through the navigation menu, but NOT if I use the back button.
SO.... from reading posts on here about ajax browser history plugins, it seems that there are various jquery plugins which help with this, including BBQ. The problem that I see with this approach is that it requires the use of anchor elements to trigger it, and then modifies the query string using the anchors' href attributes. I suppose that I could modify the page to include a hidden anchor.
My question, at long last is: is an ajax history plugin like BBQ the best way to accomplish this, or is there a way to make Coldfusion see the newly created session var when returning to the page via the back button? Or, should I consider re-architecting the page so that the ajax calls are replaced by a form submission back to the page instead?
Thanks in advance, as always.
EDIT: some code to help clarify things:
Here's the button that makes the original ajax calls:
<button id="applyFilter">APPLY</button>
and part of the js called on #applyFilter, wrapped in $(document).ready():
$('#applyFilter').click(function() {
    // fill in the Photos tab
    $.get('tracking/listPhotos.cfm',
        {
            id: id,
            randParam: Math.random()
        },
        function(response) {
            $('#tabs-photos').html(response);
        }
    );
});
Finally, when the user calls the drill-down on the ajax generated graph, it uses the MaintAction form which has been populated with the needed variables:
function DrillDown() {
    //get the necessary variables and populate the form inputs
    document.MaintAction.action = "index.cfm?file=somepage.cfm&Config=someConfig";
    document.MaintAction.submit();
}
and that takes us to the new page, from which we'd like to return to the first page but with the ajax-loaded photos.
The best bet is to use the BBQ method. For this, you don't have to actually include the anchor tags in your page; in fact, doing so would cause problems. This page: http://ajaxpatterns.org/Unique_URLs explains how the underlying process works. I'm sure a jQuery plugin would make the actual implementation much easier.
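The core of that Unique URLs pattern is small enough to sketch here (the names are illustrative, and a plugin like BBQ handles the edge cases for you): the filter state is written into the URL fragment, which the browser preserves across Back-button navigation, and is parsed back out when the page is revisited:

```javascript
// Hypothetical helper: parse a "#filterid=5&tab=photos" style fragment back
// into an object. On a hashchange event you would read these values and
// re-run the original Ajax call to restore the page's state.
function parseHash(hash) {
  const state = {};
  hash.replace(/^#/, '').split('&').forEach(function (pair) {
    if (!pair) return;
    const eq = pair.indexOf('=');
    const key = eq === -1 ? pair : pair.slice(0, eq);
    state[key] = eq === -1 ? '' : decodeURIComponent(pair.slice(eq + 1));
  });
  return state;
}

console.log(parseHash('#filterid=5&tab=photos'));
// → { filterid: '5', tab: 'photos' }
```

Because the fragment is part of the URL, each filter change pushed into `location.hash` becomes a distinct history entry, which is exactly what the Back button needs.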
Regarding your other question, about how this could be done with session variables - I've actually done something similar to that, prior to learning about the BBQ method. This was specifically to save the state of a jqGrid component, but it could be easily changed to support any particular Ajax state. Basically, what I did was keep a session variable around for each instance of each component that stored the last parameters passed to the server via AJAX requests. Then, on the client side, the first thing I did was run a synchronous XHR request back to the server to fetch the state from that session variable. Using the callback method for that synchronous request, I then set up the components on my page using those saved parameters. This worked for me, but if I had to do it again I would definitely go with the BBQ method because it is much simpler to deal with and also allows more than one level of history.
Some example code based on your update:
$('#applyFilter').click(function() {
    var id = $("#filterid").val(); // assumes the below id value is stored in some input on the page with the id "filterid"
    // fill in the Photos tab
    $.get('tracking/listPhotos.cfm',
        {
            id: id // I'm assuming this is what you need to remember when the page is returned to via a back-button...
            //randParam: Math.random() - I assume this is to prevent caching? See below
        },
        function(response) {
            $('#tabs-photos').html(response);
        }
    );
});

/* fixes stupid caching behavior, primarily in IE */
$.ajaxSetup({ cache: false });

$.ajax({
    async: false,
    url: 'tracking/listPhotosSessionKeeper.cfm',
    success: function(data, textStatus, XMLHttpRequest) {
        if (data.length) {
            $("#filterid").val(data);
            $('#applyFilter').trigger('click');
        }
    }
});
This is what you need on the client-side to fetch the state of the photo list. On the server side, you'll need to add this modification to tracking/listPhotos.cfm:
<cfset session.lastUsedPhotoFilterID = URL.id>
And add this new one-line file, tracking/listPhotosSessionKeeper.cfm:
<cfif IsDefined("session.lastUsedPhotoFilterID")><cfoutput>#session.lastUsedPhotoFilterID#</cfoutput></cfif>
Together these changes will keep track of the last ID used by the user, and will load it up each time the page is rendered (whether via a back button, or simply by the user revisiting the page).
I'm having some trouble with the Firefox and IE cache. On my website the user can make a query with a form, and this query returns a picture; depending on which radio button is selected, it returns a different picture. It works just like that on Chrome, but in IE and Firefox the same image is always returned, and it only changes when I reopen the browser. Can you give me some light on how to make this work?
Thanks to everyone, I solved my problem by using a unique URL each time I made the Ajax call.
<?php
// append the current time to the image URL so each request is unique
$date = date("H:i:s");
echo '<img src="web/WEB-INF/classes/lineChart.php?id=' . $date . '" alt="">';
?>
Not sure which language you are coding with, but regardless I am sure the strategy will work across the board. I pass an arbitrary value in the query string, something like a GUID or a datetime stamp. This forces a fresh load, as the URL will be unique.
I use ASP.NET MVC, in which I set an optional parameter in my route that the controller method ignores. I then set my URL via JavaScript:
var d = new Date();
$('.thumbnail').attr('src', $('.thumbnail').attr('src') + '?' + d.getTime());
The solution I needed was unique, so this is probably not similar to what you are trying to resolve. However, it should get the point across.
Do private browsing in Firefox and see if the image changes successfully; that will tell you whether the problem is with the cache or with the code.
Giving a unique URL can be the solution, but you have to watch out, because the string length can keep growing. Instead, you can use this option at the beginning of your code:
$.ajaxSetup({ cache: false });
This will disable caching.
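For reference, `cache: false` works by doing what the answers above do by hand: jQuery appends a throwaway `_=<timestamp>` parameter to each GET request. Here is a standalone sketch of the same idea, which also avoids the ever-growing URL by using a proper separator instead of concatenating onto the old value:

```javascript
// Sketch: make each request URL unique by appending a timestamp parameter,
// choosing "?" or "&" depending on whether a query string already exists.
function bust(url) {
  const sep = url.indexOf('?') === -1 ? '?' : '&';
  return url + sep + '_=' + Date.now();
}

console.log(bust('lineChart.php'));      // e.g. lineChart.php?_=1700000000000
console.log(bust('lineChart.php?id=3')); // e.g. lineChart.php?id=3&_=1700000000000
```

Since the cache key is the full URL, the browser treats every busted request as new and always goes back to the server.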
It's pretty nice to sort a dataset by a number of filters and get the results shown instantly, right?
My solution to do this would be to POST the "filter" (read: form) parameters to a page called dataset.php, which returns the appropriate dataset as compiled HTML that can be loaded straight into my page.
So, besides this being a total no-no for SEO and for people having deactivated Javascript, It appears as a quite good solution to easily build on in the future.
However, I have yet not the experience to consider it a good or bad overall solution. What should be our concerns with an AJAX-fetched dataset?
So, besides this being a total no-no for SEO and for people having deactivated Javascript, It appears as a quite good solution to easily build on in the future.
Not entirely true, there are solutions out there like jQuery Ajaxy which enable AJAX content with History tracking while remaining SEO and javascript disabled friendly. You can see this in action on my own site Balupton.com with evidence it's still SEO friendly here.
However, I have yet not the experience to consider it a good or bad overall solution. What should be our concerns with an AJAX-fetched dataset?
Having Ajax-loaded content is great for the user experience: it's fast, responsive, and nice to look at. If you don't have history tracking, it can be quite confusing, especially if you are using Ajax-loaded content for things like pages rather than just sidebar content, because then you break away from the consistency users are used to. Another caveat is Google Analytics tracking for the Ajax pages. These shortcomings, the ones you've already mentioned, and some others mentioned elsewhere are all quite difficult problems.
jQuery Ajaxy (as mentioned before) provides a nice high-level solution for nearly all these problems. It can be a big learning curve if you haven't worked with controller architecture yet, but most people get it rather quickly.
For instance, to enable history trackable ajax content for changing a set of results using jQuery Ajaxy, you don't actually need any server side changes. You could do something like this at the bottom of your page: $('#results ul.pages li.page a').addClass('ajaxy ajaxy-resultset').ajaxify();
Then setup a Ajaxy controller like so to fetch just the content we want from the response:
'resultset': {
    selector: '.ajaxy-resultset',
    request: function(){
        // Hide Content
        $result.stop(true, true).fadeOut(400);
        // Return true
        return true;
    },
    response: function(){
        // Prepare
        var Ajaxy = $.Ajaxy;
        var data = this.State.Response.data;
        var state = this.state;
        // Show Content
        var Action = this;
        var newResultContent = $(data.content).find('#result').html();
        $result.html(newResultContent).fadeIn(400, function(){
            Action.documentReady($result);
        });
        // Return true
        return true;
    }
}
And that's all there is to it; most of the above is just copy-and-pasted code from the demonstration page. Of course this isn't ideal, as we return the entire page in our Ajax responses, but that would have to happen anyway. You can always upgrade the script a bit more and make it so that, on the server side, you check for the XHR header; if it is set (so we are an Ajax request), render just the results part rather than everything.
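That server-side check can be sketched as a pure function (the names here are illustrative; in practice you would read the header from your framework's request object). Most JS libraries, jQuery included, send `X-Requested-With: XMLHttpRequest` on their Ajax calls:

```javascript
// Hypothetical handler logic: if the X-Requested-With header marks the
// request as XHR, return just the results fragment; otherwise wrap it in
// the full page so non-JavaScript visitors still get a complete document.
function renderDataset(headers, resultsHtml) {
  const isXhr = (headers['x-requested-with'] || '') === 'XMLHttpRequest';
  return isXhr
    ? resultsHtml
    : '<html><body><div id="result">' + resultsHtml + '</div></body></html>';
}

console.log(renderDataset({ 'x-requested-with': 'XMLHttpRequest' }, '<ul>...</ul>'));
// → <ul>...</ul>
```

This keeps one URL per dataset serving both audiences, which is the progressive-enhancement property the answers below recommend.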
You already named the two big ones. Now all you need to do is make sure all the functionality works without JavaScript (reload the page with the requested dataset), and use AJAX to improve it (load the requested dataset without reloading the page).
This largely depends on the context. In some cases people today may expect the results to be delivered instantly without the page refreshing itself. It does also improve overall user-experience - again, this largely depends on the context.
However, it also has its pitfalls. Would the user need to return to previous pages after the Ajax content was delivered? That may not be as simple as pressing the Back button in the browser.