Adblock Plus: How do I whitelist a JavaScript file to remove anti-adblock

I am new to Stack Overflow; this is my first question.
I have a specific anti-adblock script:
http://www.pentafaucet.com/libs/advertisement.js
How do I whitelist this in Adblock Plus, to circumvent it?
P.S. A Greasemonkey solution is also fine.

You can whitelist this request using the following exception rule:
@@||www.pentafaucet.com/libs/advertisement.js^
However, to prevent the site from detecting Adblock Plus I'd suggest using the following filter:
@@||www.pentafaucet.com^$generichide (see https://adblockplus.org/filters#generic-specific)
Note that this filter might not work in other ad blockers, since the generichide option has only recently been added to Adblock Plus.
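Since a Greasemonkey solution is acceptable, another option is to neutralize the detection instead of whitelisting the script. This is only a rough sketch: it assumes the anti-adblock check merely tests for a global defined by advertisement.js, and the variable name canRunAds is a guess you would have to verify against the actual script.
// ==UserScript==
// @name        pentafaucet anti-adblock neutralizer (sketch)
// @match       http://www.pentafaucet.com/*
// @run-at      document-start
// @grant       none
// ==/UserScript==

// Hypothetical: pretend the (blocked) advertisement.js has already run by
// defining the bait global it would normally create. "canRunAds" is assumed.
window.canRunAds = true;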

Related

IE6 Users must be told to use Firefox

I am working on a site that has to use IE6, but it also requires intranet users to access apps that need Firefox. Is there a slick way of handling this when launching the app? (A link pop-up seems old-fashioned.) I'm not sure how much access I have to the root directory, e.g. for .htaccess.
Thoughts much appreciated.
You can place a <div> element inside IE conditional comments and display it for intranet users only, based on an IP check.
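For example, a rough sketch (the class name and download link are placeholders, and the intranet-only display would come from the server-side IP check, not from the comment itself):
<!--[if IE 6]>
  <div class="browser-notice">
    The intranet apps on this site require Firefox.
    <a href="http://www.mozilla.org/firefox/">Get Firefox</a>
  </div>
<![endif]-->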

MediaWiki can't display images or styles

I'm using MediaWiki v1.19.1.
My wiki works well when I use it locally.
But when I access it over the network (from another computer, or a different IP),
it displays the text only. There are no images.
It looks like the Classic skin, but it's not.
The reason is that no layout is being applied on my wiki (other public wikis display fine).
My wiki uses the monobook skin now, but I can see only the text on the page.
I have changed the permissions to 777 on all files and directories (/var/www/kj/*),
but still no images.
Help me, please...
I had the same issue some time ago, and the following worked for me.
The issue might be related to the LocalSettings.php file and the general setting $wgServer.
The following link provides more details: Manual of $wgServer.
Since 1.18, MediaWiki has also supported setting $wgServer to a protocol-relative URL,
e.g. //www.mediawiki.org.
This is used to support both HTTP and HTTPS with the same caches, by using links that work under both protocols.
So try removing localhost and providing your own URL, e.g. $wgServer = "//mywebsite.com";
There's not enough information to give a definite answer, however general recommendations for such situations are:
If you're using any Apache rewrite rules (for example, to make URLs prettier), try disabling them.
Especially if you're using the http://example.com/Page_title style URLs, you should know that they're unsupported by the developers and require serious MediaWiki/Apache skills (and even then they will likely introduce subtle bugs).
Install Firebug and check what the HTTP error for your images is: is access denied (HTTP 403), or does the webserver not see them at all (HTTP 404)? This should give you an idea of what's going on.

How to disable cross-site Ajax policies in Firefox?

I need a way to request any site using Ajax. I mean ANY site; I don't want to have to use the workarounds that Firefox offers that only apply when making requests to the same domain. Is there ANY way to make this happen? I want this to work from a local file.
Downgrade your Firefox to a version below 3.
Try http://dirolf.com/2007/06/enabling-cross-domain-ajax-in-firefox.html
Firefox 3 note
Versions of Firefox prior to Firefox 3 allowed you to set the preference capability.policy.<policyname>.XMLHttpRequest.open to allAccess to give specific sites cross-site access. This is no longer supported.
BTW, you can also save your web application (.html) as .hta; HTA applications are allowed to make cross-site requests.
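For reference, the old pre-Firefox 3 preference setup looked roughly like this in user.js (policy name and site are placeholders, and this no longer works in current Firefox):
// "xsajax" is an arbitrary policy name; list your own page's origin in .sites.
user_pref("capability.policy.policynames", "xsajax");
user_pref("capability.policy.xsajax.sites", "http://www.example.com");
user_pref("capability.policy.xsajax.XMLHttpRequest.open", "allAccess");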

Browser for cross-site-script testing (for testing Mozilla Add-On)

I am working on a Firefox extension that will involve ajax calls to domains that would normally fail due to the same-origin policy set by Firefox (and most modern browsers).
I was wondering if there is a way to either turn off the same-origin restriction (in about:config, perhaps) or if there was a standard lite-browser that developers turn to for this.
I really would like to avoid using any blackhat tools, if possible. Not because I'm against them, I just don't want to add another learning curve to the process.
I can use curl in PHP to confirm that the requests work, but I want to get started on writing the js that the addon will actually use, so I need a client that will execute js.
I also tried spidermonkey, but since I'm doing the ajax with jquery, it threw a fit at all of the browser-based default variables.
So, short version: is there a reliable browser/client for cross-site scripting that isn't primarily a hacker app? Or can I just turn off the same-origin policy in Firefox?
Use GreaseMonkey with GM_xmlhttpRequest
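GM_xmlhttpRequest is not restricted by the same-origin policy, so you can prototype the same requests your add-on will make. A minimal sketch (the target URL is a placeholder; newer Greasemonkey versions also need the @grant line shown):
// ==UserScript==
// @name        cross-domain request test (sketch)
// @match       http://localhost/*
// @grant       GM_xmlhttpRequest
// ==/UserScript==

// http://www.example.com/ stands in for whatever domain your add-on will call.
GM_xmlhttpRequest({
  method: "GET",
  url: "http://www.example.com/",
  onload: function (response) {
    console.log(response.status, response.responseText.length);
  },
  onerror: function (response) {
    console.log("request failed", response.status);
  }
});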
Did you look into HTTP Access Control (CORS)?

How to filter myself out of Google Analytics with a dynamic IP address?

Does anyone know how to set up Google Analytics to filter yourself out if you're visiting the site from a dynamic IP address? I don't want to include myself in my stats when browsing from home, where I have a dynamic IP address via Verizon FiOS.
Google currently has a browser add-on that will block any visits of yours from showing up in any Analytics. http://tools.google.com/dlpage/gaoptout
Pluses and minuses of this opt-out versus filters are discussed in this blog post.
There are a couple ways of doing this. If you know the range of IP addresses you're accessing your site from (and don't mind filtering them all out) you can set up an "Exclude" filter for that range of IP addresses. If that's too restrictive, you can set a cookie using the Google Analytics code and filter on that. Both techniques are documented at Google's help system.
Alternatively, if you're dynamically producing the pages on the server, you could simply not write the Google Analytics code into the pages in the first place, based on the currently logged in user. On my site, I'm choosing to write the code or not based on a few things, such as whether the website is running in debug mode or if an administrator is logged on.
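A minimal sketch of that idea, assuming a server-rendered page where the current user is available when the HTML is generated (the isAdmin and debugMode names are hypothetical):
// Returns the analytics markup, or nothing for admins and debug builds.
// "user.isAdmin" and "debugMode" stand in for whatever your app provides.
function analyticsSnippet(user, debugMode) {
  if (debugMode || (user && user.isAdmin)) {
    return "";
  }
  return '<script type="text/javascript" src="http://www.google-analytics.com/ga.js"></script>';
}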
You can do this by creating a special page on your site that sets a Google Analytics segmentation cookie, using code something like:
<body onLoad="javascript:__utmSetVar('exclude_from_report')">
Then create a custom filter in Analytics to exclude visitors that match the 'exclude_from_report' segment pattern.
Consider using the NoScript plugin for Firefox. Just mark google-analytics.com as an untrusted site and you should be all set. A nice side-benefit: better security in your browser.
Just block the domain where Google Analytics lives via your system's hosts file:
127.0.0.1 www.google-analytics.com
This is less disruptive than the NoScript plugin mentioned by jdigital, but still makes you effectively invisible to Google Analytics.
Setting a cookie to prevent the analytics code from being sent to the browser is by far the best option.
If you're a developer and concerned that you're going to get a bazillion hits while you're developing the site, you can add the following line to your analytics tracking code:
pageTracker._setDomainName(".yourwebsitename.com");
Assuming you're hitting a URL not ending in .yourwebsitename.com during testing, the tracking code will see that your URL (e.g. 'localhost') doesn't match 'yourwebsitename.com' and won't send any tracking data.
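In context, with the legacy ga.js tracker that pageTracker comes from (simplified; 'UA-XXXXXXX-X' is a placeholder for your own property ID, and the real snippet also switches between http and https):
<script type="text/javascript" src="http://www.google-analytics.com/ga.js"></script>
<script type="text/javascript">
  // Legacy (ga.js) tracker; _setDomainName ties tracking to this domain.
  var pageTracker = _gat._getTracker("UA-XXXXXXX-X");
  pageTracker._setDomainName(".yourwebsitename.com");
  pageTracker._trackPageview();
</script>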
You can always setup a proxy to tunnel all of your traffic through. Then simply exclude the proxy's IP from the results.
I can't find a way to reply to answers, but I second the hosts file trick:
127.0.0.1 www.google-analytics.com
as it works in all browsers at the same time, which matters since designers often try the site in all browsers.
I recommend using a 127.0.0.1 (localhost) redirect in the hosts file to block any abusive sites, domains, trackers, analytics scripts, and such. For a large list, take a look at the WinHelp website. I have used it, and still do, on all my PCs. You also need to look over the list for domains you do want and comment those lines out with a # in the list.
All instructions are on the site for different operating systems.
