I've come to you today hoping to get some support regarding the Google Distance Matrix API. Currently I'm using it in a very simple way, with a Web Services request through the HTTP interface, and am having no problems getting results. Unfortunately my project seems to be running into query limits due to the 2,500-query daily quota. I have added billing to the project to allow going over 2,500 queries, and the increased quota is reflected in my project. What's odd, though, is that the console is not showing any usage, so I'm not sure whether these requests are being run against what I have set up.
I am using a single API key for the project, which is present in my requests, and as I said the requests ARE working. I'm hoping someone can shed some light on why I might not be seeing my queries reflected in my usage, and on how I can verify that my requests are being run under the project to which I have attached billing.
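For context, here is a minimal sketch of the kind of request I'm making (in Ruby, with a placeholder key and made-up locations rather than my real values):

    require 'net/http'
    require 'json'

    # Build a Distance Matrix request. YOUR_API_KEY and the example
    # origins/destinations are placeholders, not my real values.
    uri = URI('https://maps.googleapis.com/maps/api/distancematrix/json')
    uri.query = URI.encode_www_form(
      origins:      'Seattle, WA',
      destinations: 'San Francisco, CA',
      key:          'YOUR_API_KEY'  # the project's single API key
    )

    response = JSON.parse(Net::HTTP.get(uri))
    puts response['status']  # "OK" when the request succeeds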
If there is any information I can provide to help find an answer, please let me know and I'll be happy to share what I can.
After doing some digging, I found the following thread, which answers my question:
Google API Key hits max request limit despite billing enabled
I am working on a family networking app for Android that lets family members share their location and track each other simultaneously; you can think of it as similar to Life360 or Sygic Family Locator. At first I decided to use an MBaaS, and I built it with Parse. However, I realized that even though each user reads and writes geolocation data only about once per minute (and in some cases less frequently), the request traffic exceeds my projections. For this reason, I want to build a more solid system, but I have doubts about whether Parse can still do the job if the user base grows to 100-500k.
Considering all this, I am looking for an alternative method or service for building such a system. I think a backend service like Parse is a reasonable solution but not the best one. What are the possible ways to achieve this, from bad to good? For example, one of my friends says I could use Sinch, an instant-messaging service that runs in the background between users and is priced by the number of active users. That sounds odd to me, though; I have never seen an instant-messaging service used the way he describes.
Your comments and suggestions will be highly appreciated. Thank you in advance.
Well, Sinch wouldn't handle location updates or the storage of location data; that would be Parse you're asking about.
And since you implied that the request volume would be too much for your usage, maybe I wrongly assumed price was the problem with Parse.
But to answer your question about sending location data: if I were you, I would throttle it to a mile or so. There is no need for family members to know a position down to the foot in real time. If there is a need for that, I would instead implement a request mechanism and ask the user's device for a location only when someone is actually interested.
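To make the throttling idea concrete, here is a rough sketch of the logic in Ruby (the one-minute and roughly-one-mile thresholds are made-up examples; on Android you would wire the same check into your location listener):

    # Sketch of update throttling: only push a location to the backend
    # when enough time has passed and the user has moved far enough.
    class LocationThrottle
      MIN_INTERVAL_SECONDS = 60      # example value
      MIN_DISTANCE_METERS  = 1_600   # roughly a mile; example value

      def initialize
        @last_time = nil
        @last_position = nil
      end

      # position is a [lat, lon] pair in degrees.
      def should_send?(position, now = Time.now)
        return true if @last_time.nil?
        now - @last_time >= MIN_INTERVAL_SECONDS &&
          distance_meters(@last_position, position) >= MIN_DISTANCE_METERS
      end

      def record_sent(position, now = Time.now)
        @last_time = now
        @last_position = position
      end

      private

      # Haversine distance between two [lat, lon] pairs, in meters.
      def distance_meters(a, b)
        rad  = Math::PI / 180
        dlat = (b[0] - a[0]) * rad
        dlon = (b[1] - a[1]) * rad
        h = Math.sin(dlat / 2)**2 +
            Math.cos(a[0] * rad) * Math.cos(b[0] * rad) * Math.sin(dlon / 2)**2
        2 * 6_371_000 * Math.asin(Math.sqrt(h))
      end
    end

Whether you require both conditions or either one is a product decision; the point is simply that the client decides when an update is worth sending instead of firing on every fix.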
I was wondering whether it is possible to autoscale if the demand for requests escalates. What do people do if the app they just created goes viral in the middle of the night and users start getting error codes instead of data? Or is such functionality in the pipeline?
If your app hits its request limit, your extra requests will begin to fail with error code 155 (RequestLimitExceeded). To prevent the requests from failing, you should adjust the request-limit slider for the relevant app on the Account Overview page.
Now, coming to your question: can this be done automatically? As of now, I would say no; Parse currently requires you to do it manually. Having gone through their blog posts thoroughly, I see no hints of this functionality arriving in the near future. Anyway, this question can only be answered definitively by someone from Parse; we at Stack Overflow can only guess.
This is a great question! As I see it, Parse is a good PaaS with all the "cloudy" features. Even the pricing looks new-generation, being hourly based; however, if it lacks automation for adjusting the limits, you will still pay for unused capacity over time, just as in old datacenters, which really bothers me (unless you pay someone to continuously monitor performance and set the limits manually).
I am working with the geocoder gem and would like to process a larger number of requests from one IP. By default, the Google API allows only 2,500 requests per day.
Please share your thoughts on how I can make more requests than the limit allows.
As stated before: when using only the Google API, the only way around the limitation is to pay for it. Or, in a shadier way, you could make the requests from more than one IP/API key, which I would not recommend.
But to stay on the safe side, I would suggest mixing services, since there are a few more geocoding APIs out there, for free.
With the right gem, mixing them is also not a big issue:
http://www.rubygeocoder.com/
It supports a couple of them with a nice interface. You would pretty much only have to add some rate-limiting counters to make sure you stay within each provider's limits.
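A rough sketch of what that could look like, assuming a geocoder version that supports the per-query :lookup option (the quota numbers are illustrative, and the Google lookup assumes a configured API key; check each provider's current terms):

    require 'geocoder'

    # Illustrative daily quotas per provider; not authoritative numbers.
    LIMITS = { google: 2_500, nominatim: 86_400 }

    # Try each provider in turn, skipping any whose counter is exhausted.
    def geocode_with_fallback(address, counts)
      LIMITS.each_key do |provider|
        next if counts[provider] >= LIMITS[provider]
        counts[provider] += 1
        results = Geocoder.search(address, lookup: provider)
        return results.first unless results.empty?
      end
      nil  # every provider exhausted or returned nothing
    end

    counts = Hash.new(0)  # requests made today, per provider
    result = geocode_with_fallback('350 5th Ave, New York, NY', counts)
    puts result.coordinates.inspect if result

In a real app you would persist the counters (and reset them daily) rather than keep them in memory.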
Or go the heavy route of implementing your own geocoding, for example against your own running OpenStreetMap database. The data can be downloaded here: http://wiki.openstreetmap.org/wiki/Planet.osm#Worldwide_data
Which way is best depends on your actual requirements and the resources you have available.
Is there a public API for programmatically querying PageRank? If so, what volume of queries would a service be permitted to send?
From my experience, it's OK to query PageRank as long as you do it at one-second intervals; otherwise Google blocks your IP address after a few queries. If you're using .NET, there is a library to help query PageRank programmatically; you can find it here.
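The pacing itself is easy to enforce; a sketch like this would do (in Ruby, with the block standing in for whatever PageRank call or library you actually use):

    # Run a block for each URL, enforcing at least `min_interval`
    # seconds between successive calls.
    def paced_queries(urls, min_interval = 1.0)
      last = nil
      urls.each do |url|
        if last
          elapsed = Time.now - last
          sleep(min_interval - elapsed) if elapsed < min_interval
        end
        last = Time.now
        yield url
      end
    end

    paced_queries(['http://example.com', 'http://example.org']) do |url|
      puts "querying #{url}"  # replace with the real PageRank lookup
    end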
I just tried a PageRank tool, and the answer from Google is clear:
http://www.google.com/support/websearch/bin/answer.py?&answer=86640&hl=en
That says:
"If you tried the steps above and haven't resolved the issue, it's very likely that a user or a computer in your network is sending automated traffic to Google. Your network administrator may be able to locate and shut down the source of the automated traffic; feel free to refer them to this page. Sending automated queries of any sort to Google is against our Terms of Service. This includes, among other things, the following activities:
Using any software that sends queries to Google to determine how a website or webpage ranks on Google for various queries"
So the answer is NO: Google forbids and blocks that kind of query!
This is from Google's guidelines page:
http://www.google.com/support/webmasters/bin/answer.py?answer=35769
"Don't use unauthorized computer
programs to submit pages, check
rankings, etc. Such programs consume
computing resources and violate our
Terms of Service. Google does not
recommend the use of products such as
WebPosition Gold™ that send automatic
or programmatic queries to Google"
So I guess that is a no.
But I found this: http://www.fourmilab.ch/webtools/PageRank/ which you could try. I do not know whether it is "legal".
I've seen some clients complaining about the slowness of my website lately, and I'm pretty sure the problem is related to their network. I'd like to be able to justify this to myself more thoroughly, and also to reach out proactively to clients who appear to be having network issues before they come banging on my door.
If I were running ASP.NET, I would try the Response.AppendToLog method and append a token so that I could tie everything back to my custom application-level logging (user, client, processing time, etc.). I can't seem to find a way to do that without ASP.NET; I'm guessing it's built into ASP's ISAPI. My requests go through IIS to JRun's ISAPI to ColdFusion (.cfm/.cfc files).
I'm most interested in knowing how long it took the client to receive the content, not just the time it took to process the request.
If there are other places or pieces of information worth looking at that I'm not considering, please let me know. Perhaps I should somehow log information from HTTP.sys?
I know I could set a cookie on every request and have IIS log it; I was just hoping there would be a better solution.
Thanks for your thoughts!
See Jiffy. It "is an end-to-end real-world web page instrumentation and measurement suite."
The introductory video gives a good overview.