Google API costs - google-api

Where do I see daily costs in the Google API interface for Translate? I can see how many requests were sent, but I can't see the actual cost breakdown. Any help would be greatly appreciated.

You can use the Price Calculator here to get the correct price for your quota:
Google Cloud Price Calculator

Google Translate API pricing can be found here.
Prices per month

                      0-1.5 billion characters         1.5 billion+ characters
Translation           $20 per 1,000,000 characters*    $15 per 1,000,000 characters*
Language Detection    $20 per 1,000,000 characters*    $15 per 1,000,000 characters*

As of 02 APR 2020
Effective 11/1/2019, up to $10 worth of free usage/month will be applied to Billed accounts using either the Basic or Advanced editions of the Cloud Translation API. The usage applies to NMT and PBMT text translation, and language detection calls, collectively.
More details can be found here: https://cloud.google.com/translate/pricing
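If the console doesn't give you a per-day breakdown, you can approximate the bill yourself from the character counts you already see. A minimal sketch, assuming the $20-per-million-characters tier and the $10 monthly free credit quoted above (rates change, so verify against the pricing page before relying on this):

```python
# Rough Cloud Translation cost estimate from character counts.
# Rates and free tier taken from the pricing quoted above; they may change.

PRICE_PER_MILLION_CHARS = 20.00   # USD, 0-1.5 billion characters/month tier
FREE_TIER_USD = 10.00             # monthly free usage credit

def estimate_monthly_cost(chars_translated: int, chars_detected: int = 0) -> float:
    """Estimate the monthly bill from translated/detected character counts."""
    billable_chars = chars_translated + chars_detected
    gross = billable_chars / 1_000_000 * PRICE_PER_MILLION_CHARS
    return max(0.0, gross - FREE_TIER_USD)

# Example: 3.2 million characters translated this month
print(estimate_monthly_cost(3_200_000))  # 54.0 USD
```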

Related

Subscriptions - Positive and negative balance?

The docs about Subscription.getBalance() say
The amount of outstanding charges associated with a subscription.
I noticed that I can achieve a negative balance as well by granting a credit voucher in the form of a discount, but this is unfortunately not mentioned in the docs.
A few questions come to mind:
Is this intentional, and how do subscriptions behave with a negative balance?
What happens if a subscription with a negative balance gets cancelled?
If transactions are not supposed to get refunded, what is the best way to handle a user's cancellation request anyway?
Full disclosure: I work at Braintree. If you have any further questions, feel free to contact support.
The ability to have a negative balance on a subscription is intended functionality. Some merchants may give several months "free" by creating a discount that will show as a negative balance. When a subscription has a negative balance and is then charged for the next billing cycle, the negative balance is applied against the charge. For example, a customer on a $30-per-month subscription has a negative balance of $20; on the next billing cycle they will be charged only $10, with the $20 negative balance covering the rest.
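To make that arithmetic concrete, here is a tiny illustrative sketch (my own, not Braintree SDK code):

```python
def next_charge(monthly_price: float, balance: float) -> float:
    """Amount actually charged on the next billing cycle.

    A negative balance (credit) offsets the charge; the customer is never
    charged less than zero.
    """
    due = monthly_price + balance   # balance is negative when in credit
    return max(0.0, due)

# Example from above: $30/month subscription with a -$20 balance
print(next_charge(30.0, -20.0))  # 10.0 charged
```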
If a subscription with a negative balance is cancelled that will be up to the merchant to determine how to handle. Refunding customers for transactions is certainly allowed and if you need to issue a refund to a customer you are able to do so.
For more information about negative balances and what refunds may pose problems I would recommend reaching out to support for more information. They will be able to walk through specific scenarios with you in greater detail.

Total POIs in a country or for every category

Not really a technical question, but they don't have any other means of contact apart from https://groups.google.com/group/google-places-api and SO.
I'd like to get the total count of all POIs, or the total number of POIs in a category, for a country. Is this possible in the API (since it always returns 20 results at a time)? If not, is there a Google Places support email where I can ask about certain questions?
As mentioned on Places API Google Group:
Google does not disclose such information. However, if you are familiar with a few places and locations in Thailand, you or your client could simply perform a few searches on maps.google.com and compare the results with your own knowledge to get a general understanding of how precise or complete the data is.

Ordering advertisements/offers to increase revenue (confidence algorithm?)

I have a site where there will be a list of offers that the user can fill out for virtual currency. What's a decent algorithm to decide what order to arrange them by?
What's important:
- New offers move up so more people see them, in order to get some metrics on them
- Highest EPC offers are on top (best money makers, highest converting)
The metrics I have:
- Tags (if the user likes movies, the offers tagged with movies should move up)
- Reported EPC - EPC of the offer according to the affiliate network
- Network EPC - EPC of the offer across all of our sites
- Site EPC - EPC of the offer on this site
- Source EPC - EPC of the offer from a certain source (there can be multiple per user)
- Payout - How much the offer pays per conversion (lead)
- Clicks - Clicks network-wide, site-wide, and from a certain source
Is there any recommended algorithm for this kind of problem? I was thinking of some sort of confidence algorithm (like the Wilson score used for sorting), but I don't have a clue how to implement that with the metrics I have. Any ideas?
You are basically trying to build an ad recommender system. This should be a good starting point: http://pages.cs.wisc.edu/~beechung/icml11-tutorial/ . Take a look at the Netflix challenge (movie recommender), the KDD Cup 2011 challenge (music recommender), etc.
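If you want something simpler to start with, the Wilson score lower bound you mentioned is straightforward to apply to per-offer conversion data (conversions out of clicks). A minimal sketch; blending the lower bound with payout to get a conservative EPC-like score is my own assumption, not a standard recipe:

```python
import math

def wilson_lower_bound(conversions: int, clicks: int, z: float = 1.96) -> float:
    """Lower bound of the Wilson score interval for the conversion rate.

    Offers with few clicks get a wide interval and therefore a low, cautious
    estimate that rises as data accumulates.
    """
    if clicks == 0:
        return 0.0
    p = conversions / clicks
    denom = 1 + z * z / clicks
    centre = p + z * z / (2 * clicks)
    margin = z * math.sqrt((p * (1 - p) + z * z / (4 * clicks)) / clicks)
    return (centre - margin) / denom

def offer_score(conversions: int, clicks: int, payout: float) -> float:
    """Rank by a conservative expected value per click: rate lower bound x payout."""
    return wilson_lower_bound(conversions, clicks) * payout

# Hypothetical offers: (name, conversions, clicks, payout per conversion)
offers = [
    ("movie survey", 12, 400, 1.50),
    ("new offer", 1, 10, 3.00),
    ("old banner", 150, 9000, 0.80),
]
for name, conv, clicks, payout in sorted(
        offers, key=lambda o: offer_score(o[1], o[2], o[3]), reverse=True):
    print(name, round(offer_score(conv, clicks, payout), 4))
```

Note that the lower bound is conservative for offers with little data, so your "new offers move up" requirement would still need a separate exploration boost (for example, a temporary score bonus or an epsilon-greedy slot in the list).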

An easy way with Twitter API to get the list of Followings for a very large list of users?

I have about 200,000 Twitter followers across a few twitter accounts. I am trying to find the twitter accounts that a large proportion of my followers are following.
Having looked over the Search API, I think this is going to be very slow, unless I am missing something.
40 calls using GET followers/ids to get the list of 200,000 accounts. Then all I can think of is doing 200,000 calls to GET friends/ids. But at the current rate limit of 150 calls/hour, that would take 55 days. Even if I could get Twitter to up my limit slightly, this is still going to be slow going. Any ideas?
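(For reference, that 55-day figure is just the total number of calls divided by the hourly cap; a quick sketch of the arithmetic, using the figures above:)

```python
def days_to_crawl(total_calls: int, calls_per_hour: int) -> float:
    """How many days a sequential crawl takes at a fixed hourly rate cap."""
    return total_calls / calls_per_hour / 24

# 200,000 GET friends/ids calls at 150 calls/hour
print(round(days_to_crawl(200_000, 150), 1))  # ~55.6 days
```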
The short answer to your question is, no, there is indeed no quick way to do this. And furthermore, with API v1.0 being deprecated sometime in March and v1.1 becoming the law of the land, it is about to get even slower (more on this in a moment).
As I understand it, what you want to do is compile a list of followed accounts for each of the initial 200,000 follower accounts. You then want to count each one of these 200,000 original accounts as a "voter", and then the total set of accounts followed by any of these 200,000 as "candidates". Ultimately, you want to be able to rank this list of candidates by "votes" from the list of 200,000.
A few things:
1.) I believe you're actually referencing the REST API, not the Search API.
2.) Based on what you've said about getting 150 requests per hour, I can infer that you're making unauthenticated requests to the API endpoints in question, which limits you to only 150 calls per hour. As a short-term fix (i.e., in the next few weeks, prior to v1.0 being retired), you could make authenticated requests instead, which will boost your hourly rate limit to 350 (source: Twitter API Documentation). That alone would more than double your calls per hour.
3.) If this is something you expect to need to do on an ongoing basis, things get much worse. Once API 1.0 is no longer available, you'll be subject to the v1.1 API limits, which a.) require authentication, no matter what, and b.) are limited per API method/endpoint. For GET friends/ids and GET followers/ids in particular, you will only be able to make 15 calls per 15 minutes, or 60 per hour. That means that the sort of analysis you want to do will basically become infeasible (unless you were to skirt the Twitter API terms of service by using multiple apps/IP addresses, etc.). You can read all about this here. Suffice to say, researchers and developers who rely on these API endpoints for network analysis are less than happy about these changes, but Twitter doesn't appear to be moderating its position on this.
Given all of the above, my best advice would be to use API version 1.0 while you still can, and start making authenticated requests.
Another thought -- not sure what your use case is -- but you might consider pulling in, say, the 1,000 most recent tweets from each of the 200,000 followers and then leveraging the metadata contained in each tweet about mentions. Mentions of other users are potentially more informative than knowing that someone simply follows someone else. You could still tally the most-mentioned accounts. The benefit here is that, in moving from API 1.0 to 1.1, the endpoint for pulling in user timelines will actually have its rate limit raised from 350 per hour to 720 (source: Twitter API 1.1 documentation).
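Whichever route you take (friend lists or mentions), the tallying step itself is trivial once the per-follower lists are cached locally. A minimal sketch, assuming the ID lists have already been fetched by some means:

```python
from collections import Counter

def rank_candidates(friends_by_follower: dict) -> list:
    """Count how many of your followers follow each candidate account.

    friends_by_follower maps follower_id -> iterable of account IDs that
    follower follows (e.g. the cached result of GET friends/ids).
    """
    votes = Counter()
    for follower_id, friend_ids in friends_by_follower.items():
        # set() guards against double-counting if a list contains duplicates
        votes.update(set(friend_ids))
    return votes.most_common()

# Hypothetical cached data for three followers
cached = {
    101: [1, 2, 3],
    102: [2, 3, 4],
    103: [3, 5],
}
print(rank_candidates(cached))  # [(3, 3), (2, 2), (1, 1), (4, 1), (5, 1)]
```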
Hope this helps, and good luck!
Ben

How to estimate the real number of internet users who visit a specific site?

Using Alexa.com I can find out that 0.05% of all internet users visit some site, but how many people is that 0.05%?
Are there any facts like: in the US, 1% in Alexa statistics roughly equals 15 million people, and in France 1% is about 3 million people, for example?
Compete.com reckons that Google has something like 147m monthly users, and Alexa says they have 34% monthly reach. Ergo, you could estimate it to be approximately 450 million. That's one way of estimating...
Of course the data from both Compete and Alexa gets progressively more rubbish the smaller the site gets. Data for the biggest sites is likely to be the least skewed, but I still wouldn't trust it for anything serious.
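That back-of-the-envelope estimate is just a ratio; the arithmetic, using the figures quoted above (treat the result with the same scepticism as the inputs):

```python
# Estimate total internet users from one site's known audience and its Alexa reach.
compete_monthly_users = 147_000_000   # Compete.com estimate for Google (from above)
alexa_reach = 0.34                    # Alexa monthly reach for Google (from above)

total_internet_users = compete_monthly_users / alexa_reach
print(f"{total_internet_users:,.0f}")  # ~432 million, i.e. the "approx 450 million" above

# A site with 0.05% reach would then be roughly:
print(f"{total_internet_users * 0.0005:,.0f} visitors")  # ~216,000
```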
InternetWorldStats.com puts the number of internet users worldwide at 1.6 billion.
You can get world population statistics online - estimates are available here:
Wikipedia World Population
This will help you rough up some statistics, but you need to remember:
- Population is not equal to "has an internet connection"
- 0.5% does not really equate to "internet users" - it's more like 0.5% of people who are the kind of people that would install a random toolbar that offers them very little - so you need to bear in mind that it's a certain "type" of person and that the statistics will be skewed (which is why www.alexa.com isn't ranked, as EVERYONE with the Alexa toolbar is going to visit that website at some point)
- The smaller your website, the less accurate the statistics are. If you aren't in the top 100,000 websites in the world, the statistics become largely an anomaly, as they "estimate up" the statistics from the toolbar users into an "average if everyone had a toolbar".
Hope this helps.
Alexa doesn't show "X% of French users use this site". Instead it shows "X% of worldwide users use this site". So you don't have that information, except in the marginal cases when 100% of a site's users are from one country.
Also, most toolbars show just the Alexa Rank. You can find an online "Alexa Rank -> Monthly Traffic" converter here: http://netberry.co.uk/alexa-rank-explained.htm
Well, here (http://netberry.co.uk/alexa-rank-explained.htm) a way to estimate traffic based on Alexa rank is described. Basically, the author offers an exponential function, not a linear or polynomial one.
There is also a web service which has aggregated Alexa rank information and already performed all the calculations: http://www.rank2traffic.com/
I checked it, and for 80% of the websites the results are very satisfying. Still, the remaining 20% is incorrect data (possibly manipulated by webmasters), where the estimated traffic is much higher than in reality.
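If you would rather roll your own estimate in the same spirit, you can fit a curve of that general shape through a couple of (rank, traffic) points you actually trust (e.g. your own sites' analytics) and interpolate. A minimal sketch using a power-law form; the anchor points below are made up for illustration and are not the coefficients from the netberry article:

```python
import math

def fit_power_law(rank1, traffic1, rank2, traffic2):
    """Fit traffic = C * rank**(-k) through two trusted (rank, traffic) points."""
    k = math.log(traffic1 / traffic2) / math.log(rank2 / rank1)
    C = traffic1 * rank1 ** k
    return C, k

def estimate_traffic(rank, C, k):
    return C * rank ** (-k)

# Hypothetical anchor points: two of your own sites with known monthly visits
C, k = fit_power_law(5_000, 120_000, 80_000, 6_000)
# Interpolated estimate for a site ranked around 20,000
print(round(estimate_traffic(20_000, C, k)))
```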
