Best way to make a newsletter slideshow kiosk for the office? - powerpoint

So, I've been tasked with making a kiosk for the office to show statistics about our SCRUM progress, build server status, profitability and so forth. It should ideally run a slideshow with a bunch of different pages, some showing text, some showing graphs and so on.
What is the best approach for this? I first thought of PowerPoint, but it should be able to pull the images from a web server so I can automate the graph generation. I would also like to take text from an external source for pages like "Who broke the build".
I have no doubt that ready-made systems exist, but I don't really know where to look for them.
Is this easy or hard in PowerPoint? Or is there a ubiquitous app that everybody but me knows about?

I would recommend creating it as a series of web pages which use Javascript or the meta refresh tag to cycle through the different pages. Simply full-screen the browser on a spare machine, and connect it to a projector/monitor/big TV.
This has lots of benefits:
it's trivial to display images from an external server (an <img> tag)
it costs nothing to set up (it can run on basically any functioning machine) and runs in a browser
it is quick to do (you do not have to worry about cross-browser compatibility or different screen resolutions, as you know the exact machine you are developing for)
it's expandable - while what you describe is probably possible within PowerPoint, doing it as a web page lets you use Javascript (or a JS framework like jQuery), and if you serve the pages via a web server you can use any server-side scripting language.
Basically, you would have a series of files, say slide001.htm, slide002.htm and slide003.htm. Slide001 would redirect to slide002 after 30 seconds, slide002 to slide003, and slide003 back to slide001.
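As a sketch, one of those slides could be as small as this (the heading and image URL are placeholders; the meta refresh is the part that drives the rotation):
    <!-- slide001.htm - rough sketch of one slide in the rotation -->
    <html>
    <head>
        <!-- After 30 seconds, hand over to the next slide -->
        <meta http-equiv="refresh" content="30;url=slide002.htm">
    </head>
    <body>
        <h1>Sprint burndown</h1>
        <!-- Placeholder image URL - point it at wherever your graphs are generated -->
        <img src="http://yourserver/graphs/burndown.png" alt="Sprint burndown chart">
    </body>
    </html>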
The specific things you mention: graph generation and "Who broke the build" text:
Not sure which CI tool you use, but many of them generate graphs anyway, so all that would be required is one "slide" with something like <img src="http://hudson.abc/job/proj042/buildTimeGraph">
For the who-broke-the-build text, the easiest approach would be to run the slides as .php files served through a web server such as XAMPP.
Then you would have a function that scrapes your CI server for whoever broke the last build, and in one of the slides, you would have <?PHP echo(who_broke_build()); ?>
(Obviously if you know some other language/system better, use that!)
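Sticking with PHP, a rough sketch of what that who_broke_build() helper could look like - the feed URL and the parsing below are stand-ins for whatever feed or API your CI tool actually exposes:
    <?php
    // Placeholder: ask the CI server who committed to the last broken build.
    // The feed URL and the <title> parsing are stand-ins for whatever your
    // CI tool really provides.
    function who_broke_build() {
        $feed = @file_get_contents('http://hudson.abc/job/proj042/rssFailed');
        if ($feed === false) {
            return 'nobody (CI server unreachable)';
        }
        if (preg_match('/<title>([^<]+)<\/title>/', $feed, $matches)) {
            return $matches[1];
        }
        return 'unknown';
    }
    ?>
    <p>The build was last broken by: <?php echo who_broke_build(); ?></p>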
The final benefit I can think of is that, if you serve the files through a web server, you can let people view the slideshow locally too, say as their browser's home page.

Thanks. I found jqS5 which did most of what you mentioned.
It requires a single document where every h2 becomes a new slide.
I can then use the meta refresh to move to the next slide every 10 seconds. When I reach the end of the slides, I pull the data again from an RSS feed that aggregates all the different systems (sketched below).
http://staticfree.info/projects/jqs5/
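As a sketch of that RSS-aggregation step (the feed URL is a placeholder), PHP's built-in SimpleXML is enough to turn each feed item into an h2, which jqS5 would then treat as its own slide:
    <?php
    // Placeholder feed URL - point this at the feed that aggregates all your systems.
    $feed = simplexml_load_file('http://yourserver/aggregated.rss');
    // One h2 per item, so each item becomes its own slide.
    foreach ($feed->channel->item as $item) {
        echo '<h2>' . htmlspecialchars((string) $item->title) . '</h2>';
        echo '<p>' . htmlspecialchars((string) $item->description) . '</p>';
    }
    ?>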

Related

Web Scraping an Image

I was thinking about the applications of web scraping (I'm still quite new to it) and came up with a question. Can you get an image from a page that also has advertisements on it (i.e. can you avoid the advertisements and only pick up the actual image content)? Also, if the image is itself a link to another page, can you follow it to the next page and get that image too, and so on, until you either reach a certain number of images or get all of them? This would mean avoiding going to the advertisement pages.
Absolutely. If you use a tool like kimonolabs.com this can be relatively easy. You click the data that you want on the page; instead of grabbing every image, advertisements included, Kimono uses the CSS selectors of the elements you clicked to know which data to scrape.
You can use Kimono to scrape data within links as well. It's actually a very common use. Here's a break-down of that strategy: https://help.kimonolabs.com/hc/en-us/articles/203438300-Source-URLs-to-crawl-from-another-kimono-API
This might be a helpful solution for you, especially if you're not a programmer because it doesn't require coding experience. It's a pretty powerful tool.
If you are OK with PHP programming, take a look at PHP Simple HTML DOM Parser. I have used it a lot and have scraped a number of websites with it.
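For example, a small sketch of the kind of scrape PHP Simple HTML DOM Parser makes easy - the page URL and the div#content selector are placeholders; the point is to scope the selector to the part of the page that holds the real images so ads elsewhere are ignored:
    <?php
    // Requires PHP Simple HTML DOM Parser (simplehtmldom.sourceforge.net).
    include 'simple_html_dom.php';
    // Placeholder URL - the page you want to scrape.
    $html = file_get_html('http://example.com/gallery.html');
    // Only look inside the main content area, so ad images elsewhere are skipped.
    foreach ($html->find('div#content img') as $img) {
        echo $img->src . "\n";
        // If the image is wrapped in a link, that link is the "next page" to follow.
        $parent = $img->parent();
        if ($parent && $parent->tag == 'a') {
            echo 'follow: ' . $parent->href . "\n";
        }
    }
    ?>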

How to access unrelated browser window?

So I know this might sound crazy, as it is technically a security concern, which I understand. I'm just trying to find out if there are any ideas on how to handle something like this.
Anyways, long story short, I was told to look into a possible way to scrape information from another browser window/tab. I have been asked to do this because, and I know this sounds crazy too, the users of our website are incompetent enough not to be able to copy/paste or type correctly something from a different website. I know it's tough for some to have several things in their workflow, but this is basically what they do: go to their first website (after logging in) and bring up a record with information on it, including an identification number. Then the user should take that number, go to the second website, ours (after logging in), and type that number into a textbox (and eventually do some other stuff). But we have found that getting that identification number from the first website to ours is difficult for them. Some copy/paste correctly, some copy/paste too much text from the page, some write it down on paper and then type it into our website, and some just seem to have trouble visually "copying" the number from site to site.
What I was thinking was that this could happen: the user would have already brought up the record on the first site, then they would come to ours. They could click a button, and that would run whatever I/we here come up with, that goes and finds the other browser window, finds the specific text needed, and puts it in our textbox. Sounds simple, right? HA.
The first website is not owned or managed by us in any way, otherwise this might be a little easier.
A little bit of background information: unfortunately, I'm technically targeting IE 9 and 10, so if there's a solution just for that (which is why I tagged vbscript), that's great. If there's a broader solution (like an applet or browser extensions... http://crossrider.com/ ), that's even better, but not required. If it helps, we already have a hidden applet on the page that accesses the OS (yes, it has the mayscript attribute on the element, so it is able to), so I thought that could be something to build on. Also, the way I expect to identify which window/tab to access is by URL and/or document title - either will be very specific.
We cannot install anything on the users' computers, at least nothing outside of the browser (extensions might be acceptable). I'm not sure how browser extensions work, so I'm wondering whether they'd need to be "installed".
I know of HTML5's postMessage, but it only has partial support in IE (and none in IE <= 7), and the partial support doesn't include exactly what I might need. It also requires that the other website be listening for the message (which we don't control, though it might technically be possible to get that added). So it doesn't count :)
The only things I found with Java would let me list the processes currently running, but I don't know how to access or control one of them, especially how to get at the browser's Document.
And vbscript...I just don't know. I don't know if it's just me, but I can't seem to find good documentation on it, so I'm not sure what can be done with it.
Even if I could get control of the other browser window, I don't know how I would get information from it (like the DOM).
I'm not looking for code, just ideas...I'll do the research. And although it may sound impossible, don't just brush it off because Javascript can't do it - I haven't.
UPDATE:
I ended up developing a browser extension with http://www.crossrider.com/ which wasn't ideal, but works.
You could use a bookmarklet for this ... the user would have to drag the bookmarklet into the bookmarks bar of their browser, but if doing that isn't beyond your users' abilities or the technical restrictions you've mentioned, then you'd definitely be able to send the information you need back to your site that way.
You'd just need to give your users instructions to:
i) drag the bookmarklet into their bookmarks bar on their browser
ii) go to the website in question and click the bookmarklet
You could code the bookmarklet so that it grabs the info you need and redirects the browser to your website. All done in one click.
I think you may be thinking about it the wrong way when you talk about posting from one "window" to another. You could write the bookmarklet so that it does an HTTP POST of whatever information you want into your site from the other site, and it could also redirect the window they were looking at when they clicked it (the other site) to your site. Or if, for some reason, you didn't want to redirect the window that had the "other" site in it, then you could add a listener to your site so that once the bookmarklet had posted the info you require, the window displaying your site would update automatically. The first option makes more sense and is easier, though.
Maybe open the other site from a button/link on your site using the window.open() method?

Download dynamically created Flash

Is there a way to download dynamically created Flash (SWF) files? I'd like to fetch them and convert to PNG. (This can be done for example with swfrender.)
For example, when pointing my web browser to http://flashserver.company.com/statsgraph.jsp?data=2012-04 (URL completely made up), a JSP script on that server creates a statistics graph. But this is done on the fly, so there's no SWF file that I could simply download (for example with wget).
I could take a screenshot of what my browser displays, but I'd prefer to do this automatically with a server-side (shell or PHP) script, because it's a few dozen graphs to be downloaded, once per month. BTW, what I intend to do is fully legal. :-)
Rendering SWFs server-side is unlikely to work and overly complicated. The SWF probably holds no data itself anyway; it probably just displays it.
You need to use Firebug or something similar to dig up the SWF's data source and save that data set instead of the SWF. Then you can create your own image graph from that data set, using jpGraph or something similar.
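For instance, if the data set turns out to be a simple list of numbers served over HTTP, a short jpGraph script could rebuild the graph as a PNG. The data URL and the comma-separated format below are assumptions about what the SWF actually loads:
    <?php
    // Sketch only: the data URL and comma-separated format are guesses at what
    // the Flash graph really fetches - adjust to the actual data source.
    require_once 'jpgraph/jpgraph.php';
    require_once 'jpgraph/jpgraph_line.php';
    $raw  = file_get_contents('http://flashserver.company.com/statsdata.jsp?data=2012-04');
    $data = array_map('floatval', explode(',', trim($raw)));
    // Plot the values and write the result straight to a PNG file.
    $graph = new Graph(640, 480);
    $graph->SetScale('textlin');
    $graph->Add(new LinePlot($data));
    $graph->Stroke('statsgraph-2012-04.png');
    ?>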

Alternative for Downloading Several Images to a Web Page

We all know that pre-fetching images can be slow because browsers limit the number of concurrent HTTP connections, right? So, I have XHTML, jQuery, Apache httpd, and PHP at my disposal. What's an easy solution to pre-fetch a lot of images, without using sprites or multiple hosts?
See, I have these themes one selects with a SELECT box. Changing it swaps the 200x200 theme image to the right of the box. Unfortunately there are about 150 of these, so when I load the page, the progress bar keeps running while it downloads them all.
How can I get these images pre-fetching faster without using sprites or multiple hosts?
If it's just a theme change, which probably happens rarely (right?), then why wouldn't you just load the image for a theme when the select is changed and a new theme is chosen? It seems "strange" to load 150 images of which 149 may never be seen.
Correct me if I'm missing the point - and if so, can you provide a screenshot so I can get an idea of what you're really trying to show?
Hindsight is probably 20/20. I probably should have implemented it in sprites, as well as for many of the buttons I used on the site. It's just that I lack a good sprite editor tool that speeds that process up.
Anyway, the strategy I went with was Javascript prefetching via jQuery. But even that wasn't enough: I had to wrap that function in a setTimeout(), and that only helped a little. I then had to fire that setTimeout() during a login form submit. It made the login form submission take slightly longer, but made the website appear snappy on load.

Web Page Rendering Capture

I'll start by describing the problem itself; rather than a problem, it's really a search for a better solution. I have an ASP.NET page which has a bunch of images, each with a link underneath it. Each image is in fact the latest rendering of the page the link underneath it points to.
I scheduled a .bat script which runs every hour to fetch the images through IECapt, a web-page rendering capture utility. One thing that annoys me about this utility is that it takes a lot of time for the 20 images I have, and for a few of them it misses the actual screenshot of the website because of the Flash content.
Now I'd like to know whether this rendering can be done with traditional programming; I am not interested in using any utilities, as I'm interested in trying this myself. The solution need not necessarily be C#-based; I am ready to try any other language, because it gives me a chance to learn.
Thank you.
You should probably look at moz-headless-screenshot
You should be able to embed the functionality you need.
http://blog.mozilla.com/ted/2010/07/29/moz-headless-screenshot/
He also provided a sample embedding client application called moz-headless-screenshot. This is a simple command-line tool that takes a URL, image size, and output filename and generates a PNG screenshot of the web page.
You should look into Browsershots:
http://browsershots.org/
They do what you want to do for lots of different browsers. It is even open source.
There's no really simple solution for what you're asking to do. This is because rendering HTML, CSS, and Flash is actually a very sophisticated process.
If you're up for quite a bit of coding, you can use the Gecko engine (which powers Firefox) or another open-source web-browser core (e.g. Dillo) to render the page onto a custom canvas, then save that canvas to a file. Unless you implement support for browser plug-ins, you won't get Flash this way, though; you could try using Gnash or the like. Good luck with that.
I don't know of an open-source project that already does this. It would be neat, though :-). If you write something, please push it to the world; it would be really cool to have a "get a screencap of this URL" tool.
One way is to use the IRobotSoft web scraper. You can design a robot to go to the URL every hour and capture the whole web page as an image via its CapturePage(imagefile) function.
I am not sure if it will be better than IECapt though.
We have used the ACA WebThumb ActiveX Control (http://www.acasystems.com/en/web-thumb-activex/) quite successfully to capture parts or the whole of a web page on the web server and then write them to a file, just by passing in the URL. It performs fast enough for our needs.
I am not familiar with IECapt, but this might be something you might want to have a look at.
