We are in the US and using PageSpeed Insights, but we really want to know how our website performs for our Chinese (and other global) customers. We don't have a specific URL to pass in; localization is IP/cookie based. Is there a solution that will allow us, via the API, to pull back metrics for other locations?
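For reference, a minimal sketch of pulling metrics via the API from Python (this assumes the v5 runPagespeed endpoint; note that the API exposes no parameter for choosing the geographic location the test runs from, which is the crux of the question):

    # Minimal sketch: query the PageSpeed Insights API for one URL.
    # Assumes the v5 runPagespeed endpoint; there is no parameter for
    # picking the location the test is run from.
    import requests  # pip install requests

    API = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

    def fetch_score(url, api_key, strategy="mobile"):
        params = {"url": url, "key": api_key, "strategy": strategy}
        resp = requests.get(API, params=params, timeout=60)
        resp.raise_for_status()
        data = resp.json()
        # Lighthouse performance score, on a 0-1 scale
        return data["lighthouseResult"]["categories"]["performance"]["score"]

    print(fetch_score("https://example.com/", "YOUR_API_KEY"))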
I'm trying to get people at a company to fill in a Google Forms survey, with permission from management.
The problem is that someone there has blocked all Google features except Search (I know it's crazy).
The survey is fairly large and the question format is tailored to Google Forms, so rewriting it in a different survey system is hardly an option.
I also cannot ask them all to install a proxy or VPN.
I tried conventional proxy sites, but they are view-only, e.g. people can't fill in the survey through them.
Are there any other methods I can try to allow the people to fill in my Google Forms survey?
Thanks!
Is there any PHP API to gather information about a business (address, reviews) by its phone number from Yelp, Google, Insiderpages, Yahoo, etc.?
Please help. I have done research on these but didn't find the right info. Yelp does provide info by phone number, but it requires a ywsid as a mandatory parameter (http://api.yelp.com/phone_search?phone=1234567890&ywsid=XXXXXXXXXXXXXXXX), and I want to search by phone number only.
Please note that all of these APIs have terms of use: most of them won't allow you to store their data in your own database, and most of them have display requirements. Before proceeding with any development, please read their display requirements and terms carefully.
Depending on your project, you might not be allowed to use the data the way you need/want to.
The other alternative would be scraping sites, but most sites have rules against scraping too...
And again, read a lot before putting too much effort into something you are prohibited from doing in the first place.
Yelp
ywsid = API key. You need to get your own key if you are using the Yelp API; get it here.
If you are adding the data to your own database or storing the information anywhere, it is against their policies (display requirements & API terms).
If you are using any API, you must read their terms before even thinking of doing anything.
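To make the ywsid requirement concrete, here is a minimal Python sketch of the phone-search call from the question (the businesses/name/address1 response fields are my assumption, based on the old v1 API docs):

    # Minimal sketch of Yelp's v1 phone search. The ywsid (API key) is
    # mandatory -- there is no keyless lookup. Response fields ("businesses",
    # "name", "address1") are assumed from the old v1 documentation.
    import json
    import urllib.parse
    import urllib.request

    def yelp_phone_search(phone, ywsid):
        qs = urllib.parse.urlencode({"phone": phone, "ywsid": ywsid})
        with urllib.request.urlopen("http://api.yelp.com/phone_search?" + qs) as resp:
            return json.load(resp)

    result = yelp_phone_search("1234567890", "XXXXXXXXXXXXXXXX")
    for biz in result.get("businesses", []):
        print(biz.get("name"), biz.get("address1"))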
Google Places
API
Insiderpages
I don't think they have one, but you could use the CityGrid API, which searches a lot of sites at once.
Yahoo
Yahoo API
CityGrid
As mentioned above, the CityGrid API.
Foursquare
Foursquare API
Merchant Circle
Merchant Circle API
White Pages
White Pages API
Yellow Pages
Yellow Pages API
Bottom line: all these companies have put a lot of time, effort, and money into building their databases, and they want you to redirect people back to their pages so they can make their money back / turn a profit.
I am developing a Django app that functions basically as a data entry tool for websites. The use case has a trusted user or paid technician browsing the web. As they browse, they enter data into an overlaid bar, similar to what you see on many proxy websites, but containing a form that allows the user to write metadata about the website (in this case, training classification data for an ML algorithm) and submit it to my app.
See http://hidemyass.com/proxy/ for an example of a proxy website that inserts an overlay into browsed sites.
I have heard conflicting suggestions on how to approach this.
Serve Websites as Proxy
Pipe all URL requests through the Django app with something like http://httpproxy.yvandermeer.net/, and rewrite the responses to include the overlay bar (a rough sketch follows the cons list below).
Pros
I can process the responses with sexy scientific libraries like the NLTK
AJAX-free failover. Users can submit human data (albeit with more of a hassle) without the need to submit computed data.
Cons
Greatly increased traffic. Now my webapp has to retrieve all websites and upload them to the user.
Some websites might block proxy requests. My intention is to deploy this on Heroku, but they might frown on an app that generates so many requests.
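A rough sketch of the proxy approach, using a plain Django view plus the requests library rather than the httpproxy package itself (the view, URL scheme, and overlay markup are all illustrative; a real version needs cookie, redirect, and encoding handling):

    # Illustrative Django proxy view: fetch the requested page server-side
    # and inject an overlay bar before </body>. Everything named here is
    # hypothetical; cookies, redirects and encodings are not handled.
    import requests  # pip install requests
    from django.http import HttpResponse

    OVERLAY = ('<div id="annotator-bar">'
               '<form action="/submit/" method="post">'
               '<input name="label"><button>Save</button></form></div>')

    def proxy(request):
        target = request.GET.get("url")
        upstream = requests.get(target, timeout=30)
        body = upstream.text
        # Server-side is where NLTK-style processing of `body` would hook in.
        if "</body>" in body:
            body = body.replace("</body>", OVERLAY + "</body>", 1)
        content_type = upstream.headers.get("Content-Type", "text/html")
        return HttpResponse(body, content_type=content_type)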
User Browses in an iFrame
The overlay is separated from the content by an iframe, and I use JavaScript to inform the overlay of the page that is currently being browsed (a sketch of the Django side follows the cons list below).
Pros
Distributed Computing. User machines are used to make requests and do any necessary computations. The server is no longer a bottleneck.
Tighter Ajax integration. I can just post a JSON object representative of my entire Model.
Cons
iframes weren't really designed for full-scale browsing. Some websites force themselves out of iframes, and I worry that it won't be a reliable method of browsing.
I don't get to use all those sexy Python libraries. My language processing will have to be done in JavaScript.
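For the iframe route, a minimal sketch of the Django side: one page rendering the overlay plus an iframe, and one JSON endpoint that the overlay posts to (markup, URLs, and field names are illustrative; CSRF handling is omitted):

    # Illustrative Django views for the iframe approach: the overlay lives in
    # the parent page and POSTs a JSON object to the app. Note the parent page
    # cannot observe navigation inside a cross-origin iframe, which is part of
    # the reliability concern above. CSRF handling is omitted for brevity.
    import json
    from django.http import HttpResponse, JsonResponse

    PAGE = """
    <div id="annotator-bar">
      <input id="label"><button onclick="save()">Save</button>
    </div>
    <iframe id="site" src="{url}" style="width:100%;height:90vh"></iframe>
    <script>
    function save() {{
      fetch('/annotations/', {{
        method: 'POST',
        headers: {{'Content-Type': 'application/json'}},
        body: JSON.stringify({{
          url: document.getElementById('site').src,
          label: document.getElementById('label').value
        }})
      }});
    }}
    </script>
    """

    def browse(request):
        return HttpResponse(PAGE.format(url=request.GET.get("url", "about:blank")))

    def annotations(request):
        data = json.loads(request.body)  # e.g. {"url": ..., "label": ...}
        # ... persist `data` to your model here ...
        return JsonResponse({"saved": True})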
Question
I've never done anything like this before. I'm pretty new to all the tools involved, and seriously having trouble choosing between the two very different approaches.
Which method would you suggest? Why? Are there any considerations I have missed?
OKFN's Annotator provides, IMHO, a good basis for what you are trying to accomplish: http://okfn.github.com/annotator/
Hi all!
I wanted to develop a feature for a blog I wrote myself. I want to grab the search results of Google Images and display them in a user-friendly way in or beside my blog's post editor.
So I did some research on code.google.com
I found Google's official AJAX API for Google Images, but on its documentation site it says something like this:
Important: The Google Image Search API has been officially deprecated as of May 26, 2011. It will continue to work as per our deprecation policy, but the number of requests you may make per day may be limited.
I know they can restrict the number of requests I can make via my API key. But...
Can anyone tell me exactly how this API is restricted? Like how much traffic or how many requests are allowed?
Is it possible to use this API to develop a WordPress plugin that everyone else can use?
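For context, the deprecated API is a plain JSON-over-HTTP endpoint; a minimal Python sketch (endpoint and response fields follow the old v1.0 AJAX API documentation, and since it is deprecated it may stop working at any time):

    # Minimal sketch of the deprecated Google Image Search AJAX API (v1.0).
    # Endpoint and the responseData/results fields follow the old docs;
    # being deprecated, it is rate-limited and may vanish at any time.
    import json
    import urllib.parse
    import urllib.request

    def image_search(query):
        qs = urllib.parse.urlencode({"v": "1.0", "q": query})
        url = "http://ajax.googleapis.com/ajax/services/search/images?" + qs
        with urllib.request.urlopen(url) as resp:
            data = json.load(resp)
        return [r["url"] for r in data["responseData"]["results"]]

    print(image_search("kittens"))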
I want to know how an advertising network like AdWords is built. What kind of systems display the ads, and what kind of systems search for keywords in the content of the publisher's website?
Google has a spider which indexes the content of pages in its AdSense network. The ads are pulled in with JavaScript. The actual algorithms that decide which ads to display on a page are closely guarded secrets. Google uses Python a lot, so odds are most of the backend uses that.
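The matching algorithms themselves are secret, as noted, but a toy Python sketch of the contextual idea (extract keywords from the publisher's page, then pick the ad with the most keyword overlap) shows the basic shape; everything here is illustrative:

    # Toy illustration of contextual ad matching: extract keywords from the
    # page text and choose the ad whose keywords overlap most. This is not
    # Google's algorithm (those are secret), just the general idea.
    import re
    from collections import Counter

    STOPWORDS = {"the", "a", "an", "and", "of", "to", "in", "is", "for"}

    def keywords(text, n=20):
        words = re.findall(r"[a-z]+", text.lower())
        return [w for w, _ in Counter(
            w for w in words if w not in STOPWORDS).most_common(n)]

    def pick_ad(page_text, ads):
        """ads: list of (ad_id, keyword_set); returns the best-matching ad_id."""
        page_kw = set(keywords(page_text))
        return max(ads, key=lambda ad: len(ad[1] & page_kw))[0]

    ads = [("shoes-ad", {"shoes", "running", "sneakers"}),
           ("camera-ad", {"camera", "lens", "photo"})]
    print(pick_ad("Best running shoes for marathon training season", ads))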
To make this question approachable, you need to specify what level/type of detail you want/need. Are you looking for a broad understanding of the information architecture and flow? Do you need pseudocode/code for the search/parse algorithms? What exactly do you need?