My problem is that I would like to download the CSV from a Facebook Analytics cohort. The button in the top right corner does just that, but downloading many reports manually is a lot of work, so I would like to do it from a script. I know of the Facebook Analytics export option in their API, but that only covers events (if I understood correctly), and it doesn't seem to work for me anyway (two requests have had the status "SCHEDULED" for over a day now, while the documentation says preparing the files takes 1-2 hours).
I have tried a simple web crawler, but Facebook's robots.txt forbids it from scraping any data.
Have I missed an API option that lets me specify which cohort and which date range the data should cover, and then simply fetch the result with an HTTPS request?
Thanks in advance!
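To make concrete what I'm after: the download button presumably just issues an HTTPS request, so ideally I could replay that request from a script once the URL and session cookies are copied out of the browser's network tab. A minimal Python sketch of the shape of what I want; the export URL, cookie values, and parameter names below are all hypothetical placeholders, not a documented Facebook endpoint:

    import requests

    # All of these are placeholders: copy the real values from your
    # browser's network tab while clicking the CSV download button.
    EXPORT_URL = "https://www.facebook.com/analytics/export"  # hypothetical
    COOKIES = {"c_user": "...", "xs": "..."}  # session cookies from the browser

    resp = requests.get(
        EXPORT_URL,
        cookies=COOKIES,
        headers={"User-Agent": "Mozilla/5.0"},
        # Guessed parameter names for cohort and date range:
        params={"cohort_id": "...", "start": "2017-01-01", "end": "2017-01-31"},
    )
    resp.raise_for_status()  # fails loudly if the session has expired

    with open("cohort.csv", "wb") as f:
        f.write(resp.content)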
I have submitted a sitemap for my AJAX web application to Google via their Webmaster Tools. The submitted URLs are of the form:
http://www.mysite.com/#!myscreen;id=object-id
http://www.mysite.com/#!myotherscreen;id=another-id
However, even though more than a week has passed since the sitemap submission, Google has not indexed the URLs. Google reports that the sitemap has been processed, that 60 URLs were detected, and that no errors occurred, yet it does not index any of the URLs.
I have already implemented the AJAX crawlability contract on the server side, where requests containing an _escaped_fragment_ parameter are answered with an HTML snapshot.
Any help/info regarding why Google is not indexing the URLs would be greatly appreciated.
See GWT SE friendly application
Suggestions include following the guide at http://code.google.com/web/ajaxcrawling/.
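To illustrate the contract from that guide: when Googlebot sees a #! URL, it requests the same URL with the hash fragment moved into an _escaped_fragment_ query parameter, and the server is expected to answer with a pre-rendered HTML snapshot. A minimal sketch of the server side in Python with Flask, where render_snapshot() is a hypothetical stand-in for whatever actually pre-renders your screens:

    from flask import Flask, request, send_file

    app = Flask(__name__)

    def render_snapshot(fragment):
        # Hypothetical stub: look up the object and return static HTML.
        screen, _, object_id = fragment.partition(";id=")
        return f"<html><body><h1>{screen}</h1><p>Object {object_id}</p></body></html>"

    @app.route("/")
    def index():
        # Googlebot rewrites  http://www.mysite.com/#!myscreen;id=object-id
        # to                  http://www.mysite.com/?_escaped_fragment_=myscreen;id=object-id
        fragment = request.args.get("_escaped_fragment_")
        if fragment is not None:
            return render_snapshot(fragment)
        # Normal users get the regular AJAX application shell.
        return send_file("index.html")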
Nowadays you don't need to do anything specific for Google anymore; the AJAX crawling scheme has been deprecated by Google.
Just make sure that your website is easy to use for your users, and Google will be able to properly crawl it.
If you want to go the extra mile, however, you can check out this article:
* https://moz.com/blog/optimizing-angularjs-single-page-applications-googlebot-crawlers
Hi all!
I wanted to develop a feature for a blog I wrote myself. I want to grab the search results from Google Images and display them in a user-friendly way in or beside my blog's post editor.
So I did some research on code.google.com.
I found Google's official AJAX API for Google Images, but its documentation page says something like this:
Important: The Google Image Search API has been officially deprecated as of May 26, 2011. It will continue to work as per our deprecation policy, but the number of requests you may make per day may be limited.
I know they can restrict the number of requests I make via my API key. But:
Can anyone tell me exactly how this API is restricted? For example, how much traffic or how many requests per day are allowed?
Is it possible to use this API to develop a WordPress plugin that everyone else can use?
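For context, this is roughly how the deprecated API was called from a script; the endpoint, parameters, and response fields below are what I gathered from the old documentation, so treat them as assumptions rather than a guaranteed interface:

    import requests

    # Deprecated Image Search AJAX endpoint (assumption from the old docs).
    url = "https://ajax.googleapis.com/ajax/services/search/images"
    resp = requests.get(url, params={"v": "1.0", "q": "kittens", "rsz": 8})
    resp.raise_for_status()

    # The old JSON shape, as I remember it from the docs (assumption).
    for result in resp.json()["responseData"]["results"]:
        print(result["unescapedUrl"])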
Basically, I want to use the Facebook Ads Manager Tool to estimate the number of users targeted by a particular set of targeting parameters. I know there is a published API available, but it is only usable if you are on their advertising application "whitelist." I am sure what I am asking is possible. Plus, it would be interesting to learn more about scraping.
Facebook's Ads Manager Tool is basically an AJAX UI for their ads API. In the process of creating a campaign, you can specify targeting parameters, and the page dynamically reports the number of users targeted as you modify the parameters. From what I've read on the web and here on Stack Overflow, it is possible to use Firebug or a similar tool to pick apart which requests the page makes and where they go, and then mimic those calls to get the information you want.
I'm having trouble interpreting Firebug's panels, though. I think the URI I should send requests to is www.facebook.com/ajax/inventory_estimator.php, but I'm not sure how to form the call.
So, if I want to write a script or program that takes a list of words to use as keywords and returns the estimated number of users for each keyword, how could I do it?
Link to Facebook's Ads Manager Tool, Campaign Creation Page:
http://www.facebook.com/ads/create
Yes, using an extension like Firebug to examine the HTTP requests is a good way to do this.
The Net tab is the one you want (the last one).
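To make that concrete, here's a rough Python sketch of replaying such a request once you've read it out of the Net tab. The only things taken from your question are the endpoint URL and the list-of-keywords idea; the parameter names and cookie values are guesses/placeholders you'd replace with what Firebug actually shows:

    import requests

    KEYWORDS = ["golf", "tennis", "chess"]
    COOKIES = {"c_user": "...", "xs": "..."}  # copy from a logged-in session

    for keyword in KEYWORDS:
        resp = requests.post(
            "https://www.facebook.com/ajax/inventory_estimator.php",
            cookies=COOKIES,
            headers={"User-Agent": "Mozilla/5.0"},
            data={"keywords": keyword, "__a": "1"},  # guessed field names
        )
        # Facebook AJAX responses have historically been prefixed with
        # "for (;;);" to prevent JSON hijacking; strip it before parsing.
        body = resp.text.replace("for (;;);", "", 1)
        print(keyword, body[:200])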
Have you tried the IRobotSoft web scraper? It has good AJAX support.
Check their forum here: http://irobotsoft.org/bb/YaBB.pl
I've Googled this a hundred times, and I must be looking in all the wrong places or looking up the wrong terms. I just don't know.
Basically, in my Google Checkout inbox I can see all my customers' orders: Chargeable, Canceled, Charged, etc. I can also export a CSV at the bottom of the page.
However, is there an API I can use to write a script to export Charged orders between 2 dates?
I see tons of API info for using Google Checkout to make and accept orders, but I can't find anything to pull my merchant data OUT.
Well, of course I find the answer RIGHT AFTER posting this. So, I would like to share what I found in case someone else has the same question.
Google's Polling API (in beta at the time of posting):
http://code.google.com/apis/checkout/developer/Google_Checkout_Beta_Polling_API.html
And Notifications:
http://code.google.com/apis/checkout/developer/Google_Checkout_XML_API_Notification_API.html
It appears that Polling lets you request information on demand, while Notifications requires a secure web server so that Google can push order notifications to you.
There is also a tool available (report.jar) that will allow you to leverage the Polling API to create detailed reports. The article below covers how to use the report.jar tool:
http://code.google.com/apis/checkout/articles/Order_Report_Tutorial.html
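For anyone who would rather script it directly than use report.jar, here is a rough Python sketch of the kind of authenticated POST the Order Report Tutorial describes. The endpoint and field names are my reading of that tutorial, so treat them as assumptions to verify against the docs:

    import requests

    MERCHANT_ID = "1234567890"   # your merchant ID (placeholder)
    MERCHANT_KEY = "your-key"    # your merchant key (placeholder)

    # Endpoint and field names as described in the tutorial (assumptions).
    url = f"https://checkout.google.com/api/checkout/v2/reportsForm/Merchant/{MERCHANT_ID}"
    resp = requests.post(
        url,
        auth=(MERCHANT_ID, MERCHANT_KEY),  # HTTP Basic auth
        data={
            "_type": "order-list-request",
            "start-date": "2011-01-01T00:00:00",
            "end-date": "2011-02-01T00:00:00",
            "financial-state": "CHARGED",  # only Charged orders
        },
    )
    resp.raise_for_status()

    # The response is the same CSV you'd export by hand.
    with open("charged_orders.csv", "wb") as f:
        f.write(resp.content)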
I am looking for a web service kind of like Google Analytics.
You paste some JavaScript into your web page, and if any of the links on it become invalid, hey presto, an email is sent to someone telling them which link on which page is broken.
Anyone heard of such a service?
This would slow page loading down a lot if it had to check for broken links every time someone visited (basically an HTTP request for every link). Not that it isn't possible, but the implementation would have to be very, very good.
JavaScript cannot send email; you would have to use AJAX to post the details to another page, which would then email the admin. And since this all happens client-side, it is very open to abuse.
I would suggest using a program to run the check every now and again; a sketch of one follows below. There are even Firefox extensions that will do it instead of a standalone program, and Google will also list a whole host of websites offering the service.
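As a bare-bones example of such a program: fetch the page, extract the links, request each one, and email whatever fails. A Python sketch, assuming the requests and beautifulsoup4 packages are installed and a local mail relay is running; the page and addresses are placeholders. You could run it from cron every now and again:

    import smtplib
    from email.message import EmailMessage
    from urllib.parse import urljoin

    import requests
    from bs4 import BeautifulSoup

    PAGE = "http://www.example.com/"  # page to check (placeholder)

    # Collect absolute URLs for every anchor on the page.
    html = requests.get(PAGE).text
    soup = BeautifulSoup(html, "html.parser")
    links = [urljoin(PAGE, a["href"]) for a in soup.find_all("a", href=True)]

    # HEAD-request each link and record failures.
    broken = []
    for link in links:
        try:
            status = requests.head(link, allow_redirects=True, timeout=10).status_code
            if status >= 400:
                broken.append((link, str(status)))
        except requests.RequestException as exc:
            broken.append((link, str(exc)))

    # Email the admin only when something is broken.
    if broken:
        msg = EmailMessage()
        msg["Subject"] = f"{len(broken)} broken link(s) on {PAGE}"
        msg["From"] = "checker@example.com"
        msg["To"] = "admin@example.com"
        msg.set_content("\n".join(f"{link}: {why}" for link, why in broken))
        with smtplib.SMTP("localhost") as smtp:  # assumes a local mail relay
            smtp.send_message(msg)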