Ajax feedback from newsletter

I have created the email contents as an HTML page.
I have placed 4 checkboxes, and every time one of them is checked I want to run an ajax call to a URL with parameters. Is such a thing possible, or will the email client refuse to run it for security and/or other reasons?
I can do it with a link to a page with the same contents, but the recipient may not bother to click the link.
The whole idea is: can we run ajax calls in HTML email contents as if the contents were an autonomous web page?

No, you can't. All JavaScript is stripped from emails for security reasons. The best you can do, as you've listed, is to have a link in the email with parameters so the landing page can do it.
You can, however, achieve some interactivity with checkboxes - such as hiding/showing content that is already placed in the email. If you're already inserting the parameters, I'm assuming you have the information - so place that content hidden in the email and reveal it when the checkbox is checked. Mark Robbins shows how it's done: https://www.webdesignerdepot.com/2015/10/punched-card-coding-the-secret-of-interactive-email/ (if that's what you wanted, let us know in the comments and/or include your code and we can give a tailored example)
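As a rough illustration of the technique Mark Robbins describes, here is a minimal checkbox-hack sketch. The ids, class names, and copy are made up, and client support varies widely, so treat it as a starting point rather than production-ready email code:

<style>
  .extra-content { display: none; }
  /* When the checkbox is checked, reveal the sibling content block */
  #show-offer:checked ~ .extra-content { display: block; }
</style>

<!-- No JavaScript involved; the CSS above does the showing/hiding -->
<input type="checkbox" id="show-offer">
<label for="show-offer">Show me more</label>

<div class="extra-content">
  <p>Content that would otherwise have been fetched via ajax goes here.</p>
</div>

Clients that strip the style block will simply never hide .extra-content, so the fallback is a fully visible email rather than a broken one.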

Related

ClickFunnels integration with rails2 app

Is there a way to create a page on the ClickFunnels (https://www.clickfunnels.com/) website so that when I submit that page, the form details are stored in my Rails app (in a particular table)? In other words, I want my database to show up in the ClickFunnels integrations list. I googled for hours but couldn't find much information on this.
Can anyone advise if you have done this? A reference link would also be much appreciated.
We couldn't find any way to do this inside ClickFunnels itself; if there is an easy way to add custom systems to their integrations, I too look forward to seeing those answers. Until then, here's what we did: we put our own custom form on their page and used ajax to send the data to the endpoint in our system that needed it.
Then, since we also needed to submit the same info to ClickFunnels, we built a fake CF form (I think we actually put one on the page, but used CSS to hide it, then filled it in dynamically from our custom form) and called submit on that form, sending the user through the normal ClickFunnels submission process and on to the next page in the funnel.
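A minimal sketch of that approach, assuming the hidden ClickFunnels form has inputs named name and email and that the Rails endpoint is /api/leads - all of these names are placeholders for illustration, not anything ClickFunnels actually specifies:

document.querySelector('#custom-form').addEventListener('submit', async function (e) {
  e.preventDefault();

  var name = document.querySelector('#custom-name').value;
  var email = document.querySelector('#custom-email').value;

  // 1. Send the data to our own Rails endpoint first
  await fetch('/api/leads', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ name: name, email: email })
  });

  // 2. Copy the values into the hidden ClickFunnels form and submit it,
  //    so the visitor continues through the normal funnel flow
  var cfForm = document.querySelector('#hidden-cf-form');
  cfForm.querySelector('input[name="name"]').value = name;
  cfForm.querySelector('input[name="email"]').value = email;
  cfForm.submit();
});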

Ajax generated pages with different URLs

I couldn't really word the title very well, but here's my problem: I've got a webpage that reads from a database each time the user clicks a button, and the content of part of the page is then replaced.
Because it is an ajax load, everything is done in the background and the URL stays the same. This wouldn't be a problem at all, except that I realised I will want a different Facebook comments box for each set of content that is loaded - so if someone comments, it is posted to their Facebook profile, and people who click the link are taken to that specific content.
So... what I need is some way of referencing each set of content, and I've found a site that does exactly that (I'm sure there are a lot of them).
Here's the link.
Each set of content has a different 'hash code' (because I don't know the actual name for it) appended to the URL - in this case the code is "#1922934" - which allows people to post links to that specific set of content on Facebook etc., and also allows a different Facebook comments box for each set of content.
Does anyone know how such a set-up can be achieved, or how these 'hash codes' work?
Here's the Wikipedia article on it:
http://en.wikipedia.org/wiki/Fragment_identifier
The main idea is that URI fragments are used because changing them doesn't cause a page reload. They can also be used to refer to anchors on a web page.
What I would do is, on page load, use JavaScript to read the URI fragment (location.hash) and then make a request to your server to load the matching content and comments. The URI fragment is never sent to the server; it can only be read on the client (browser).
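A minimal sketch of that idea, assuming a hypothetical /content/<id> endpoint on your server and a #content-area element on the page (both made up for illustration):

function loadContentFromHash() {
  // location.hash includes the leading '#', e.g. "#1922934"
  var id = location.hash.replace('#', '');
  if (!id) return;

  fetch('/content/' + encodeURIComponent(id))
    .then(function (response) { return response.text(); })
    .then(function (html) {
      document.querySelector('#content-area').innerHTML = html;
      // This is also where you would (re)render the Facebook comments
      // box for the new URL, which now includes the fragment.
    });
}

// Handle the fragment present when the page first loads...
window.addEventListener('load', loadContentFromHash);
// ...and react when it changes later
window.addEventListener('hashchange', loadContentFromHash);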
Sounds like you want something like SammyJS.

get title and url of the previous page on Magento

I'm trying to get the title and URL of the previous page the customer visited in Magento.
We can get the URL via the $_SERVER['HTTP_REFERER'] variable, but I'm not sure how to get its title.
I would appreciate any help on this, or suggestions for other approaches to displaying the previous page (URL + title) in Magento.
This information isn't sent along normally, so you may want to add an observer that saves the last page title in the session. The catch is that this will save the last page from /any/ tab, not necessarily the referring page. You could save every page's title this way, but then you'd have to hold onto everything (they could click the same link twice in a row, etc.). This may also run into issues with full page caching.
I know this is old, but FYI you can add JavaScript to the page and use
alert(document.referrer);
Since the vast majority of web users (~98%) have JavaScript turned on, and most sites require it anyway, this is safe to rely on. You can then make whatever adjustments you need to the DOM. Just one way of getting what you need.
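If you go the JavaScript route, one way to get the previous page's title as well as its URL is to record the current page on every page view and read the previous entry back on the next one. A minimal sketch - the sessionStorage key is made up, and note that document.referrer only gives you the URL, so the title has to be recorded by your own pages:

// Read what the previous page stored (null on the first page view)
var previous = JSON.parse(sessionStorage.getItem('previousPage') || 'null');
if (previous) {
  console.log('Previous page:', previous.title, previous.url);
  // e.g. render a "Back to <title>" link somewhere in the DOM
}

// Record the current page so the *next* page can display it
sessionStorage.setItem('previousPage', JSON.stringify({
  title: document.title,
  url: location.href
}));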

Virtuemart Display Page after Remote redirect

I am aiming to create a payment module. Its users shall be redirected away from the site's URL in order for the transaction to be processed by a third party at a different URL. I would then like customers to be redirected back to a generic 'success' page that notifies them the order was a success. I have tried redirecting to the default success page (checkout.thankyou.php), but I get lots of errors; all the constants etc. that the application requires have obviously been lost during the redirect.
I would like to be able to retrieve the theme currently enabled in the configuration and use it to insert some basic HTML into the view. I would also like to access the database to perform some queries.
Can anybody advise? I am very stuck, and cannot find anything useful in the documentation! Thank you.
Can you be more specific about what type of information you want on your success page? If you just want basic HTML, there's no reason you can't write a basic Joomla article and redirect to that instead of trying to redirect to a VM partial. Again, if it's just basic HTML (no data from the transaction), you can simply use a code inspector (like Firefox's Inspect Element) to track down the CSS classes you like from the template and use them in your Joomla article to make it look like the VM template. You can find most of them in components/com_virtuemart/themes/default/themes.css.
If you need to display actual transaction data in your thank-you message, be prepared for a bit more work. You'll probably have to write a cookie containing the record data BEFORE the user is sent offsite, and then read the cookie just prior to rendering the thank-you page.
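As a rough illustration of the cookie idea - in a real Virtuemart payment module you would more likely do this server-side in PHP, and the cookie and field names here are invented - the flow looks like this:

// Before redirecting the customer to the third-party payment URL,
// stash the bits of the order you want to show on the thank-you page
function storeOrderForThankYou(orderNumber, orderTotal) {
  var data = encodeURIComponent(JSON.stringify({ number: orderNumber, total: orderTotal }));
  document.cookie = 'pending_order=' + data + '; path=/; max-age=3600';
}

// On the thank-you page, read the cookie back and render the details
function readPendingOrder() {
  var match = document.cookie.match(/(?:^|;\s*)pending_order=([^;]*)/);
  return match ? JSON.parse(decodeURIComponent(match[1])) : null;
}

Call storeOrderForThankYou(...) just before the redirect and readPendingOrder() on the success page; anything sensitive should of course be verified server-side rather than trusted from the cookie.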

Ajax generated content, crawling and black listing

My website uses ajax.
I've got a user list page which lists users in an ajax table (with paging, extra information, and so on).
The url of this page is :
/user-list
The user list is created by ajax. When the user clicks on one user, he is redirected to a page whose URL is: /member/memberName
So we can see here that ajax is used to generate content and not to manage navigation (with the # character).
I want to detect bots so that all pages get indexed.
So, for normal visitors I want to display an ajax table with paging and cool ajax effects (more info...), and when I detect a bot I want to display all users (without paging) with a link to each member page, like this:
JohnBob...
Do you think I could get blacklisted with this technique? If so, could you please suggest an alternative that keeps these clean URLs without having to redevelop the user list as a non-ajax page?
Google supports a specification to make ajax crawlable:
http://code.google.com/web/ajaxcrawling/docs/specification.html
I did an experiment and it works:
http://seo-website-designer.com/SEO-Ajax-Google-Solution
As this is a Google specification, you won't get penalised (unless you abuse it).
That said, only Google supports it at the moment (AFAIK).
Also, I believe following the concept of Progressive Enhancement is a better approach: create a working HTML website first, then let the JavaScript enhance it.
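For reference, the scheme described in that spec maps 'hash-bang' fragments onto an _escaped_fragment_ query parameter that your server answers with a plain HTML snapshot. A rough sketch of the client-side half (the page and parameter names are placeholders):

// Browser URL:   http://example.com/user-list#!page=2
// Crawler fetch: http://example.com/user-list?_escaped_fragment_=page=2
// Your server detects _escaped_fragment_ and returns a static HTML
// snapshot of that state; browsers keep getting the ajax version.

// Client side: use "#!" fragments for ajax navigation
function goToPage(page) {
  location.hash = '!page=' + page;   // gives .../user-list#!page=2
}

window.addEventListener('hashchange', function () {
  var state = location.hash.replace(/^#!/, '');  // e.g. "page=2"
  // load the matching slice of the user list via ajax here
});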
Maybe use real URLs with an onclick handler to trigger your ajax scripting? Like
Some URL
I don't think Google would punish you for this: you primarily use JavaScript, but you provide a fallback for their bot, so your site doesn't get any less accessible.
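To make that concrete, here is a sketch of such a fallback link - the URL, class name, and loadMemberViaAjax function are placeholders, not anything from the question. Bots and no-JavaScript visitors follow the href; everyone else gets the ajax behaviour:

<a href="/member/JohnBob" class="member-link">JohnBob</a>

<script>
  document.addEventListener('click', function (e) {
    var link = e.target.closest('.member-link');
    if (!link) return;           // not one of our member links

    e.preventDefault();          // stop the full page load for JS users
    loadMemberViaAjax(link.getAttribute('href'));  // your existing ajax routine
  });
</script>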
EDIT
Ok, I misunderstood. Then my guess would be you basically have two options:
1. Write a different part of your site where bots end up, or,
2. Rewrite your current site to, for example, always serve a 'full' page, with an option to fetch only, say, the content div. Then you can load just the content with JavaScript, but bots will always get a complete page (see the sketch below).
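A rough sketch of option 2, assuming your server returns only the inner content when it sees a hypothetical ajax=1 parameter (the parameter name and element id are made up):

function showUserPage(url) {
  // Ask the server for just the content fragment; bots and direct visits
  // to the same URL (without the parameter) still get the full page.
  fetch(url + (url.indexOf('?') === -1 ? '?' : '&') + 'ajax=1')
    .then(function (response) { return response.text(); })
    .then(function (html) {
      document.querySelector('#content').innerHTML = html;
    });
}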
