I have some webpages with content that I want to print out. I could go to each webpage and just choose to print it, but I thought it would be more fun, and maybe a little useful (at least for me), to have a script that I can run that just goes and prints each page.
I would input each page I want to print, and the script would go through them, rendering and printing each one. Does anyone know how I could achieve this? What programming language would suit this best?
Could I do it with some shell script?
Thanks!
Consider using wkhtmltopdf. Alternatively, you could probably script it with JavaScript in a "fake" page that loads each target page, and open that fake page in a web browser from the command line.
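To flesh out the "fake page" idea, here is a minimal sketch. The urls list and the same-origin restriction are my own assumptions, not part of the original suggestion; browsers won't let you call print() on a cross-origin frame, so this only works for pages on the same origin as the page running the script.

// print-queue sketch: run from a <script> tag in an otherwise blank page.
var urls = ['/page1.html', '/page2.html'];   // placeholder list: your pages here
var frame = document.createElement('iframe');
frame.style.width = '100%';
frame.style.height = '800px';
document.body.appendChild(frame);

var i = 0;
function next() {
  if (i >= urls.length) return;   // all pages handled
  frame.src = urls[i++];          // render the next page
}

frame.onload = function () {
  if (!frame.src) return;         // ignore the initial about:blank load
  frame.contentWindow.focus();    // some browsers want focus before printing
  frame.contentWindow.print();    // modal print dialog for this page
  next();                         // continue once the dialog is dismissed
};

next();

For arbitrary third-party pages, the wkhtmltopdf route is the more practical one: render each URL to a PDF first, then send the PDFs to the printer.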
I am trying to write a Greasemonkey script to insert text into the Firefox search box that is next to the address bar.
Is this possible, and if so, how?
You can't do that.
That's because add-ons (in your case Greasemonkey) run in a privileged environment which isn't available to userscripts. If that privilege were given to page JavaScript, anyone could write a script so that whenever you visit their website, they could do anything with your browser (or even your device!). So, in other words, Firefox (or, more generally, browser) UI elements are inaccessible from JS.
You could try redirecting your webpage to something like this instead: https://www.google.com/search?q=YOUR+SEARCH+QUERY.
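As a rough sketch of that idea in a userscript (the query value here is just a placeholder):

// Greasemonkey sketch: instead of touching the browser's own search box,
// run the search from the page itself by navigating to a Google search URL.
var query = 'my search terms';   // placeholder query
location.href = 'https://www.google.com/search?q=' + encodeURIComponent(query);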
Some parts were quoted from the comments of the question.
I'm using Greasemonkey in Firefox and looking to create a script that will close all but a specified tab (or webpage).
I've little to no clue what I'm doing; I wanted it to be simple, i.e.:
if (tabs >= 3)
    close all tabs except the specified webpage
but I just can't seem to figure it out, or even find a list of my options =/
Any help would be greatly appreciated.
You can't do this, for the same reason that a random webpage can't: it would be a security nightmare. A given page can only control itself and its children.
I'm trying to import a script from one DB into another in FMP12 and found no other way than using UI scripting.
Most of the time the script gets stuck when it needs to tick the checkbox, complaining that the UI element cannot be found, even though, as visible in the picture, it is referenced properly, at least as far as I can see:
Please visit the link for a picture showing the problem; it is much clearer than if I write it down: https://www.dropbox.com/s/y8c7xuazrb5cvlq/Screenshot_25_4_13_3_08_PM.jpg
The funny thing is that sometimes the script works fine, and I cannot figure out what makes it work in those cases.
Any idea what am I doing wrong, or if there is some other way to refer to that UI element?
Thanks
Zsolt
I'm writing a Greasemonkey script that has a fair few user settings (just using GM_getValue and GM_setValue).
What I'd like to be able to do is create a settings page for the script, and add that to the #include-d sites. So, for example, it'd run on:
#include http://www.greasemonkeyedsite.com/*
#include about:myScriptConfig
Then the script would check the URL of the site it's being called for. If it's the about: one it'd create and display a settings page, otherwise it'd just run the script as usual.
I came up with this under the impression that you could type about:(anything) and it'd show up fine, with just the text following the about: as the page content. I remember this working last time I checked it, but that was years ago.
It seems that you can't just display arbitrary data via about:x any more, though. Firefox just displays a "The URL is not valid and cannot be loaded" error.
I know about the data: URI protocol, but it's not suitable as entering it manually into the address bar doesn't lead to its own page.
Is there some equivalent behaviour? Or am I going to have to just have a "settings" button on the top corner of greasemonkeyedsite.com that hides and shows a settings div?
If you have a permanent web site, you could host a URL there that becomes the Greasemonkey script's settings page. That could even be a convenient URL that lets the user download the script if they do not already have it installed, and that way you can also offer the user an update when a new version of your script is released. (Just have the Greasemonkey script check some "current version" part of the settings page.)
As mentioned by jnpcl, it is possible to create a chrome:// URI within the browser, but as I understand it, that requires a full-fledged Firefox add-on rather than just a Greasemonkey script.
If you do not have a permanent web site, you could use a designated URL on the affected site instead, like http://www.greasemonkeyedsite.com/myGreasemonkeySettingsPage. Your script could then strip out the parts of the site's 404 page it does not need and insert its list of settings in their place, as in the sketch below.
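A minimal sketch of that last approach; the path, the setting name, and the form contents are all made up for illustration:

// Runs on http://www.greasemonkeyedsite.com/* (per the question's @include).
// '/myGreasemonkeySettingsPage' is a hypothetical URL that 404s harmlessly.
var SETTINGS_PATH = '/myGreasemonkeySettingsPage';

if (location.pathname === SETTINGS_PATH) {
    // Replace the site's 404 page with a minimal settings form.
    document.title = 'My Script Settings';
    document.body.innerHTML =
        '<h1>My Script Settings</h1>' +
        '<label><input type="checkbox" id="opt"> Enable feature</label>';

    var box = document.getElementById('opt');
    box.checked = GM_getValue('featureEnabled', false);  // load stored value
    box.addEventListener('change', function () {
        GM_setValue('featureEnabled', box.checked);      // persist on change
    }, false);
} else if (GM_getValue('featureEnabled', false)) {
    // Normal operation on the rest of the site.
    // ... the script's usual behaviour goes here ...
}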
I'll start by describing the problem itself; rather than a problem, it is really a request for a better solution. I have an ASP.NET page with a bunch of images, each with a link underneath it. Each image is in fact the latest rendering of the page behind the link underneath it.
I scheduled a .bat script which runs every hour to fetch the images through IECapt, a web page rendering capture utility. One thing that annoys me about this utility is that it takes a lot of time for the 20 images I have, and for a few pages it fails to capture an actual screenshot of the website because of their Flash content.
Now I would like to know whether this rendering can be done with traditional programming; I'm not interested in using any utilities, but in trying this myself. The solution does not necessarily need to be C# based; I'm ready to try any other language, because it gives me a chance to learn.
Thank you.
You should probably look at moz-headless-screenshot. You should be able to embed the functionality you need.
http://blog.mozilla.com/ted/2010/07/29/moz-headless-screenshot/
The author also provided a sample embedding client application called moz-headless-screenshot: a simple command line tool that takes a URL, image size, and output filename and generates a PNG screenshot of the webpage.
You should look into Browsershots:
http://browsershots.org/
They do what you want to do for lots of different browsers. It is even open source.
There's no simple solution for what you're asking, because rendering HTML, CSS, and Flash is actually a very sophisticated process.
If you're up for quite a bit of coding, you can use the Gecko engine (which powers Firefox), or another open-source web-browser core (e.g. Dillo), to render the page onto a custom canvas, then save that canvas to a file. Unless you implement support for browser plug-ins, you won't get Flash this way, though; you could try using Gnash or the like. Good luck with that.
I don't know of an open-source project that already does this. It would be neat, though :-). If you write something, please push it to the world; it would be really cool to have a "get a screencap of this URL" tool.
One way is to use the IRobotSoft web scraper. You can design a robot to go to the URL every hour and capture the whole web page as an image via the function CapturePage(imagefile).
I am not sure if it will be better than IECapt though.
We have used the ACA WebThumb ActiveX Control (http://www.acasystems.com/en/web-thumb-activex/) quite successfully to capture parts or the whole of a web page on the web server and then write them to a file, just by passing in the URL. It performs fast enough for our needs.
I am not familiar with IECapt, but this might be something you want to have a look at.