How to change the URL domain according to the user agent using Apache

Sample:
Desktop: www.example.com
Tablet: m.example.com
Mobile: mobile.example.com
How can I change www to m or to mobile according to the user agent? (I am testing in the browser inspector, using the device emulator for mobile and tablet.)
Can anyone please give me a solution?

You could use a filter at the backend which, when the user hits the website, redirects them to the appropriate domain based on their user agent.
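Since the question asks about Apache specifically, here is a minimal mod_rewrite sketch of that idea for the www.example.com virtual host or .htaccess. The User-Agent patterns are illustrative assumptions only; real device detection needs much more complete patterns:
RewriteEngine On

# Tablets -> m.example.com (pattern is an illustrative assumption)
RewriteCond %{HTTP_HOST} ^www\.example\.com$ [NC]
RewriteCond %{HTTP_USER_AGENT} (iPad|Tablet) [NC]
RewriteRule ^(.*)$ http://m.example.com/$1 [R=302,L]

# Phones -> mobile.example.com (pattern is an illustrative assumption)
RewriteCond %{HTTP_HOST} ^www\.example\.com$ [NC]
RewriteCond %{HTTP_USER_AGENT} (Mobile|Android|iPhone) [NC]
RewriteRule ^(.*)$ http://mobile.example.com/$1 [R=302,L]
The tablet rule comes first because many tablet UA strings also contain "Mobile"; the [L] flag stops processing after the first match.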
You can also do the redirect from a JavaScript perspective by parsing the user agent when the user hits the page and then issuing a redirect like:
<script type="text/javascript">
// The patterns below are illustrative only -- substitute your own detection logic.
var ua = navigator.userAgent;
if (/iPad|Tablet/i.test(ua)) {
  window.location = "http://m.example.com";
} else if (/Mobile|Android|iPhone/i.test(ua)) {
  window.location = "http://mobile.example.com";
}
</script>
You can find very good user-agent parsers on the web, but please note that parsing user agents is not foolproof, since a UA string can easily be spoofed.
JavaScript UA Parser
Java UA Parser
In Chrome, you can install the User-Agent-Switcher plugin to easily change it.

Related

Render a different website based on the HTTP header user agent

I have created 2 versions of the website, one for desktop and another for mobile. When users point their browser to www.example.com, I want my server to serve them a different website based on the HTTP user agent.
I don't want to use responsive design, because my design, page layout, and content are quite different between desktop and mobile. Furthermore, we may want to play around with search crawlers by having another rule that serves a separate plain HTML website.
I wonder, can I configure such a rule in my web server, or on Cloudflare?
You can detect the user's device by checking the User-Agent HTTP header for first-time visitors, or a cookie for returning visitors, then use a Cloudflare Worker script that acts as a reverse proxy, routing requests to either the desktop or the mobile version of the website/app.
import isMobile from "ismobilejs";

export default {
  fetch(req) {
    // Detect the device type from the User-Agent header
    const device = isMobile(req.headers.get("user-agent"));
    // TODO: Also check cookies (for returning visitors)
    const { pathname, search } = new URL(req.url);
    // Proxy to the mobile or desktop origin, preserving path and query
    const targetUrl = device.phone
      ? `https://m.example.com${pathname}${search}`
      : `https://example.com${pathname}${search}`;
    return fetch(targetUrl, req);
  }
};
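The TODO above could be filled in with a simple cookie check before falling back to UA sniffing. A minimal sketch, assuming a hypothetical preference cookie named site_pref:
// Honor an explicit preference cookie (the name "site_pref" is a hypothetical choice)
const cookie = req.headers.get("cookie") || "";
const prefersMobile = /(?:^|;\s*)site_pref=mobile/.test(cookie);
// Then choose targetUrl based on (prefersMobile || device.phone)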
References
https://github.com/kaimallea/isMobile — User-Agent parser
https://github.com/kriasoft/cloudflare-starter-kit — Cloudflare Workers Starter Kit

proxy.pac - exception for images

I'm a web developer and I use Squid as a proxy, which I have entered in Firefox as the proxy server.
So when I enter http://www.example.com in Firefox, I see the site from my local machine, having configured Squid accordingly.
Now the problem is that some of our customers have GBs of images, and it's a pain to load them all onto my machine. So basically I want to use my offline webpage but load the images from the live server, so I don't have a broken site without images.
In order to do this I've tried to create a proxy.pac and configured it this way:
function FindProxyForURL(url, host) {
  if (shExpMatch(url, "*.jpg")) {
    return "DIRECT";
  } else {
    return "PROXY 192.168.178.31:3128; DIRECT";
  }
}
Unfortunately it doesn't really work. What am I doing wrong, and how can I achieve my goal?
According to the Mozilla document on PAC files:
The path and query components of https:// URLs are stripped. In Chrome, you can disable this by setting PacHttpsUrlStrippingEnabled to false, in Firefox the preference is network.proxy.autoconfig_url.include_path.
What this means is that when you enter a URL such as https://www.example.com/image.jpg, what gets passed to the PAC script is just https://www.example.com. As a result, you're never going to enter the first branch of your if statement.
In Firefox, you can change this by going to the about:config page and setting network.proxy.autoconfig_url.include_path to true.
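If changing the browser preference is not an option, a host-based rule sidesteps the stripped path entirely. A minimal sketch, assuming the live images are served from a dedicated host (images.example.com is a hypothetical name):
function FindProxyForURL(url, host) {
  // Match on the host, which survives HTTPS stripping, instead of the path
  if (shExpMatch(host, "images.example.com")) {
    return "DIRECT";
  }
  return "PROXY 192.168.178.31:3128; DIRECT";
}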

Problem sending AJAX request with headers on BlackBerry WebWorks

I am developing a BlackBerry WebWorks application and I am having trouble with an AJAX request that I am making to a server. I am learning HTML/JavaScript/AJAX on the fly, so excuse any beginner mistakes. Basically, formatted HTTP requests are made to the server, which returns JSON objects that I use in the application. I am using AJAX to make the requests without any kind of framework.
Most requests do not have to be authenticated, and those return just fine. However, to access a directory part of the server, a username and password are encoded and sent as a header with the XMLHttpRequest. When I try to add the header, the request is sent, but I never get anything back. The readyState property is set to 1 but never goes beyond that. I know the server works fine, because I did the same thing on iPhone and it worked.
Here is the relevant code:
function grabFromServer(httpRequest) {
  var httpConnection = new XMLHttpRequest();
  var me = this;
  httpConnection.onreadystatechange = function() {
    alert(httpConnection.readyState);
    if (httpConnection.readyState == 4) {
      me.processResponseText(httpConnection.responseText);
    }
  };
  httpConnection.open("GET", httpRequest, true);
  if (this.request == "company" || this.request == "property" || this.request == "individual") {
    var authorized = this.checkCredentials();
    if (!authorized) {
      // ask for username/pword
    }
    // here, add credentials
    httpConnection.setRequestHeader("Authorization", "Basic : ODI5ZGV2bDokY19kdXN0Ym93bA==");
  }
  httpConnection.send();
}
Your code appears to be fine. Have you added an entry in your config.xml file to allow access to your domain? You should see an entry like <access subdomains="false" uri="http://data.mycompany.com/"/>. To make any HTTP requests to an external website from a WebWorks application, you have to add an entry that "whitelists" the domain like this.
If you're using the Eclipse plugin, open up the config.xml file, click the Permissions tab at the bottom, and click "Add Domain".
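For reference, a minimal sketch of where that entry lives in config.xml (data.mycompany.com is the hypothetical API host from the example above):
<widget xmlns="http://www.w3.org/ns/widgets">
  <!-- Whitelist the API host so XMLHttpRequest calls to it are allowed -->
  <access subdomains="false" uri="http://data.mycompany.com/" />
</widget>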

Can XDomainRequest be made to work with SSL?

I have code that uses Microsoft's XDomainRequest object in IE8. The code looks like this:
var url = "http://<host>/api/acquire?<query string>";
var xdr = new XDomainRequest();
xdr.onload = function() {
  $("#identifier").text(xdr.responseText);
};
xdr.open("GET", url);
xdr.send();
When the scheme in "url" is "http://", the command works fine. However, when the scheme is "https://", IE8 gives me an "Access denied" JavaScript error. Both schemes work fine in FF 3.6.3, where I am, of course, using XMLHttpRequest. With both browsers I am complying with W3C Access Control. "http://" works cross-origin for both browsers. So the problem is with IE8, XDomainRequest, and SSL.
The SSL certificate is not the problem. If I type https://<host>/ into the address bar of IE8, where <host> is the same as in "url" above, the page loads fine.
So we have the following:
- hitting https://<host>/ directly from the browser works fine;
- hitting https://<host>/api/acquire?<query string> via XDomainRequest is not allowed.
Can it be done? Am I leaving something out?
Apparently, the answer is here: Link
Point 7 on this page says, "Requests must be targeted to the same scheme as the hosting page."
Here is some of the supporting text for point 7:
"It was definitely our intent to prevent HTTPS pages from making
XDomainRequests for HTTP-based resources, as that scenario presents a
Mixed Content Security Threat which many developers and most users do
not understand.
However, this restriction is overly broad, because it prevents HTTP
pages from issuing XDomainRequests targeted to HTTPS pages. While it’s
true that the HTTP page itself may have been compromised, there’s no
reason that it should be forbidden from receiving public resources
securely."
It would appear at present that the answer to my original question is: YES, if the hosting page can use the "https://" scheme; NO, if it cannot.
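In practice, point 7 means the request URL's scheme has to match the hosting page's scheme. A minimal sketch of that workaround, reusing the placeholders from the question:
// Build the target URL with the same scheme as the hosting page,
// since XDomainRequest refuses cross-scheme requests (point 7 above).
var url = window.location.protocol + "//<host>/api/acquire?<query string>";
var xdr = new XDomainRequest();
xdr.onload = function() {
  $("#identifier").text(xdr.responseText);
};
xdr.open("GET", url);
xdr.send();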

Crawling an Ajax.request URL directly ... permission error

I need to crawl a web board, which uses AJAX for dynamic update/hide/show of comments without reloading the corresponding post.
I am blocked by this comment area.
In Ajax.request, the URL is specified as a path without a host name, like this:
new Ajax('/bbs/comment_db/load.php', {
  update: $('comment_result'),
  evalScripts: true,
  method: 'post',
  data: 'id=work_gallery&no=i7dg&sno=' + npage + '&spl=' + splno + '&mno=' + cmx + '&ksearch=' + $('ksearch').value,
  onComplete: function() {
    $('cmt_spinner').setStyle('display', 'none');
    try {
      $('cpn' + npage).setStyle('fontWeight', 'bold');
      $('cpf' + npage).setStyle('fontWeight', 'bold');
    } catch (err) {}
  }
}).request();
If I try to access the URL with the full host name, I just get the message "Permission Error". That is, this:
new Ajax('http://host.name.com/bbs/comment_db/load.php', {
  update: $('comment_result'),
  evalScripts: true,
  method: 'post',
  data: 'id=work_gallery&no=i7dg&sno=' + npage + '&spl=' + splno + '&mno=' + cmx + '&ksearch=' + $('ksearch').value,
  onComplete: function() {
    $('cmt_spinner').setStyle('display', 'none');
    try {
      $('cpn' + npage).setStyle('fontWeight', 'bold');
      $('cpf' + npage).setStyle('fontWeight', 'bold');
    } catch (err) {}
  }
}).request();
results in the same error.
This happens even when I call the actual PHP URL in the web browser, like this:
http://host.name.com/bbs/comment_db/load.php?'id=work_gallery&..'
I guess the PHP module is restricted to being called by a URL on the same host.
Any idea for crawling this data?
Thanks in advance.
-- Shin
Cross-site XMLHttpRequests are forbidden by most browsers. If you want to crawl different sites, you will need to do it in a server-side script.
As mentioned by Darin, the XMLHttpRequest object (which is the essence of AJAX requests) has security restrictions on making cross-site HTTP requests; I believe it's called the "Same Origin Policy for JavaScript".
While there is a working group within the W3C that has proposed a new Access Control for Cross-Site Requests recommendation, the restriction still remains in effect for most mainstream browsers.
I found some information on the Mozilla Developer Network that may provide a better explanation.
In your case, it appears that you are using the Prototype JavaScript framework, where Ajax.Request still uses the XMLHttpRequest object for its AJAX requests.
method: 'post'
might well be your problem: the host serving the request likely rejects GET requests, which is all you can throw at it from a browser address bar. If this is what's happening, you'll need to find or install some sort of scripting tool capable of doing the job (Perl would be my choice, and unless you're running Windows, you'll already have it).
I do have to wonder whether what you're trying to do is legit, though: trawling other sites' comment databases isn't usually encouraged.
I would solve this by running a PHP script locally that does the crawling of the outside pages. That way jQuery doesn't have to go to an outside domain.
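For illustration, a minimal sketch of that idea as a same-origin proxy (shown in Node.js rather than PHP; the host and path are taken from the question, and the fixed parameter values here are illustrative assumptions):
// Minimal same-origin proxy: the page calls this local server,
// which forwards the POST to the remote board (requires Node 18+ for global fetch).
const http = require("http");

http.createServer(async (req, res) => {
  // Illustrative fixed parameters; a real proxy would pass them through from the client
  const body = "id=work_gallery&no=i7dg&sno=1&spl=0&mno=0&ksearch=";
  const upstream = await fetch("http://host.name.com/bbs/comment_db/load.php", {
    method: "POST",
    headers: { "Content-Type": "application/x-www-form-urlencoded" },
    body: body,
  });
  res.writeHead(200, { "Content-Type": "text/html; charset=utf-8" });
  res.end(await upstream.text());
}).listen(8080);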