How to monitor browsers and capture URLs without extensions (Windows)

How can I monitor browsers and capture URLs when they start downloading files, without using extensions or plugins? For example, Internet Download Manager (the version I have) captures URLs from Chrome and Firefox without any extension, and Free Download Manager does the same with Chrome (no extension installed), though for Firefox it uses an extension; both support almost every browser out there.
Can this be done without extensions and plugins?
Can this be done on other operating systems like OS X and Linux?
Thank you so much

One option would be to simply intercept all browser traffic and offer to download whenever the user navigates to an uncommon file type (one that browsers would typically download rather than render). There is probably a more elegant solution, but I might have it act as an odd sort of firewall: one that blocks certain types and downloads them itself instead.
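A minimal sketch of the filetype heuristic that answer describes, in Ruby. The MIME list and the function name are illustrative, not what any of these download managers actually use:

```ruby
# Heuristic for a hypothetical intercepting proxy: decide from the
# response's Content-Type whether a browser would render the resource
# inline or download it. The downloads are the ones worth capturing.

RENDERABLE = %w[
  text/html text/css text/plain application/javascript
  application/xhtml+xml image/png image/jpeg image/gif image/svg+xml
].freeze

def capture_as_download?(content_type, content_disposition = nil)
  # An explicit "attachment" disposition always means a download.
  return true if content_disposition.to_s.downcase.include?('attachment')
  # Otherwise, treat anything the browser would not render inline as one.
  mime = content_type.to_s.split(';').first.to_s.strip.downcase
  !RENDERABLE.include?(mime)
end

puts capture_as_download?('application/zip')                          # true
puts capture_as_download?('text/html; charset=utf-8')                 # false
puts capture_as_download?('image/png', 'attachment; filename=a.png')  # true
```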

I found the answer: IDM uses the Layered Service Provider (LSP) mechanism that Windows provides. Note that LSPs are deprecated in newer Windows versions in favor of the Windows Filtering Platform.

Related

Merge Mozilla Addon to Build

I am working on a few add-ons for Mozilla Firefox, since this is easier than editing the source code manually.
The bigger picture is to have a customized browser that I can share with my fellow geeks and friends.
Question: Is there a simple way to add an add-on to a Mozilla build so that my users don't have to install the add-ons manually on their computers? Something like a pre-packaged Setup.exe. The Setup.exe needs to be fully self-contained and must not require Mozilla to be pre-installed.
More info(Edited):
Another reason is that I do not want users to have access to the add-ons; they should be part of the browser's core. A user should not be able to turn them off, or even know they are add-ons rather than built-in functionality of their browser X.
You may want to use Portable Firefox, make your customizations, and share the folder with your friends.
http://portableapps.com/apps/internet/firefox_portable
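Another route, if you do repackage Firefox yourself: Firefox installs add-ons it finds under `distribution/extensions` in its install directory when a new profile is first created. A hedged Ruby sketch of staging an .xpi there (`bundle_extension` is a made-up helper; the file name must match the add-on's real id from its manifest):

```ruby
require 'fileutils'

# Sketch: bundle an .xpi into a (portable) Firefox folder so it is
# installed automatically when a new profile is created. Firefox picks
# up add-ons from <install dir>/distribution/extensions/<add-on id>.xpi.
def bundle_extension(xpi_path, firefox_dir, addon_id)
  dest_dir = File.join(firefox_dir, 'distribution', 'extensions')
  FileUtils.mkdir_p(dest_dir)
  dest = File.join(dest_dir, "#{addon_id}.xpi")
  FileUtils.cp(xpi_path, dest)
  dest
end
```

Note this auto-installs the add-on but does not hide it from the user; truly baking the functionality into the browser's core still means patching the source.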

Mac replacement for fiddler's auto-responder feature

Fiddler lets me intercept HTTP requests and respond with files from my local machine. I am looking for a tool that does exactly that, on the Mac.
I tested Charles, but its "Rewrite" tool does not allow that. I also tried HTTPScoop, which only lets you look at requests, and Wireshark, where I could not even find the GUI (probably due to my inexperience on the Mac).
As far as I understand it, Charles' Map Local feature offers what you're looking for.
If you have a Windows PC or VM on your Mac, you can use Fiddler to capture the Mac's traffic. Also worth noting is that I'm at Telerik now and one of our goals is to support more platforms with Fiddler. An alpha version of Fiddler for the Mono framework is now available.
You can try a free Chrome extension: Trumpet.
Features:
Wildcard pattern
RegEx pattern
Category
File drag
Try Tamper; it's based on mitmproxy and lets you see all requests made by the current tab, modify them, and serve the modified version the next time you refresh.
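The map-local / auto-responder idea behind all of these tools can be sketched in a few lines of Ruby (the class and method names here are invented for illustration, not any tool's API):

```ruby
# Sketch of the "map local" / auto-responder idea behind Fiddler and
# Charles: match a request URL against wildcard rules and, on a hit,
# answer with the contents of a local file instead of the real server.
class AutoResponder
  def initialize
    @rules = [] # [pattern, local_path] pairs, checked in order
  end

  def map(pattern, local_path)
    @rules << [pattern, local_path]
  end

  # Returns the replacement body, or nil to let the request through.
  def respond(url)
    _, path = @rules.find { |pattern, _| File.fnmatch(pattern, url) }
    path && File.read(path)
  end
end
```

Usage would look like `r.map('*example.com/api/*', 'stub.json')`; any URL not matching a rule passes through untouched.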

Browser cleanup like CCleaner using VB.NET

I'm working on a VB.NET application and I need to delete all cookies, internet caches, and auto-complete keywords from all browsers found on the system. What are the folders I should be deleting content from?
Thank you.
Install any browser you want to support in your app and then check where it stores the data you want to delete. Also check the browser's documentation (paths may depend on the OS and version).
Since it's easy to write your "own" browser (using a third-party rendering engine), you will never be able to support every browser found on a system.
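For reference, a Ruby sketch of the kind of path list involved. The paths shown are typical current locations for Chrome and Firefox on Windows, but they do change between browser versions, so verify against an installed copy as the answer above suggests:

```ruby
require 'fileutils'

# Sketch for a CCleaner-style cleanup: build a list of the usual
# cache/cookie locations per browser. Paths are typical, not guaranteed;
# they vary between browser versions.
def cleanup_targets(local_appdata, appdata)
  chrome = File.join(local_appdata, 'Google', 'Chrome', 'User Data', 'Default')
  [
    File.join(chrome, 'Cache'),
    File.join(chrome, 'Cookies'),            # older Chrome versions
    File.join(chrome, 'Network', 'Cookies'), # newer Chrome versions
    *Dir.glob(File.join(appdata, 'Mozilla', 'Firefox', 'Profiles', '*', 'cookies.sqlite')),
    *Dir.glob(File.join(local_appdata, 'Mozilla', 'Firefox', 'Profiles', '*', 'cache2'))
  ]
end

def clean!(targets)
  targets.each { |t| FileUtils.rm_rf(t) if File.exist?(t) }
end
```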

Can the User-Agent string reveal the plugins or extensions installed on the requesting browser?

Can the details of installed plugins or extensions be found by inspecting the User-Agent string? I tried installing many plugins but could not see any reflection of them in the User-Agent string.
The User-Agent string won't contain any information about the plugins or extensions installed on the client's browser. It simply identifies the browser and its version. If you are looking for the capabilities of the browser, try looking into Modernizr and checking out what capabilities it can test for.
I found this site.
It lists many user agents. I went through a lot of them and could finally see that older versions of IE and Opera included the names of plugins/extensions.
In Chrome and Firefox I could not find any such imprints.
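Those old IE-style component names sit inside the parenthesised comment of the UA string, so extracting them is a one-liner. A Ruby sketch (note that modern Chrome/Firefox UAs carry only platform info in that comment, no plugin names):

```ruby
# Older IE-style User-Agent strings list installed components (such as
# ".NET CLR" versions or toolbars) as semicolon-separated tokens inside
# the parentheses. This extracts those tokens.
def ua_tokens(user_agent)
  inside = user_agent[/\(([^)]*)\)/, 1] or return []
  inside.split(';').map(&:strip)
end

ua = 'Mozilla/4.0 (compatible; MSIE 8.0; Windows NT 6.1; Trident/4.0; .NET CLR 3.5.30729)'
p ua_tokens(ua)
# => ["compatible", "MSIE 8.0", "Windows NT 6.1", "Trident/4.0", ".NET CLR 3.5.30729"]
```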
Some extensions in Chrome or Firefox are JavaScript based. If an extension interacts with the page, it will most likely be using JavaScript. So depending on which extensions you're trying to detect, you might be able to detect them with JavaScript on the page. Many of them, though, run their code in a separate context, so that may be difficult.
Here's a link from Google's plugin guide about content scripts.

How to Programmatically take Snapshot of Crawled Webpages (in Ruby)?

What is the best solution to programmatically take a snapshot of a webpage?
The situation is this: I would like to crawl a bunch of webpages and take thumbnail snapshots of them periodically, say once every few months, without having to manually go to each one. I would also like to be able to take jpg/png snapshots of websites that might be completely Flash/Flex, so I'd have to wait until it loaded to take the snapshot somehow.
It would be nice if there was no limit to the number of thumbnails I could generate (within reason, say 1000 per day).
Any ideas how to do this in Ruby? Seems pretty tough.
Browsers to do this in: Safari or Firefox, preferably Safari.
Thanks so much.
This really depends on your operating system. What you need is a way to hook into a web browser and save the rendered page as an image.
If you are on a Mac, I would imagine your best bet would be to use MacRuby (or RubyCocoa, although I believe that is going to be deprecated in the near future) together with the WebKit framework to load the page and render it as an image.
This is definitely possible, for inspiration you may wish to look at the Paparazzi! and webkit2png projects.
Another option, which isn't dependent on the OS, might be to use the BrowserShots API.
There is no built-in library in Ruby for rendering a web page.
Using Selenium with Ruby is one possibility. You can run Firefox as a headless browser (i.e. on a server).
Here is the source code for BrowserShots: http://sourceforge.net/projects/browsershots/files/
If you are using Linux you could use http://khtml2png.sourceforge.net/ and script it via Ruby.
Some paid services that automate this:
http://webthumb.bluga.net/home
http://www.thumbalizr.com
As viewed by what, though? IE? Firefox? Opera? One of the myriad WebKit engines?
If only it were possible to automate http://browsershots.org :)
Use Selenium RC; it comes with snapshot capabilities.
With JRuby you can use SWT's Browser widget.
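Whichever capture tool you pick (Selenium, webkit2png, one of the paid APIs), the "every few months" cadence can be handled separately from the capture itself. A pure-Ruby sketch where the capture step is an injected, hypothetical callable:

```ruby
# Decide which URLs are due for a fresh thumbnail, e.g. every 90 days
# as the question suggests. `last_shot` maps url => Time of the last
# snapshot (a missing entry means the URL was never captured).
def due_for_snapshot(urls, last_shot, interval_days, now = Time.now)
  urls.select do |url|
    last = last_shot[url]
    last.nil? || (now - last) >= interval_days * 86_400
  end
end

# Usage sketch; `capture` stands in for whatever tool you choose:
#   due_for_snapshot(urls, last_shot, 90).each { |url| capture.call(url) }
```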
