YouTube Data API quota increase - youtube-data-api

Hi, I am writing an Android app that lets users take videos and upload them to YouTube. For now I am not anticipating getting anywhere close to the quota since I'm still developing the app, but I am worried that since the limit is 1,000,000 units per day, I will eventually cross it.
So how do I increase it? I noticed there is an option under the quota tab; clicking it brings up a form, but it doesn't mention how much it will cost me to increase the limit. I also couldn't find any Google support page, so I am asking here.
Thanks

With the current quota you can upload roughly 600-660 videos a day, since a videos.insert call costs on the order of 1,500-1,600 quota units (see the arithmetic sketch after the edit below). If you get close to that number, you can fill out that form. It is a long form; you will need two cups of coffee and perhaps more than two hours to do it. Within around 48 hours they will send the result and approve it, provided your app behaves compatibly with the terms of service. And it's free.
Cheers.
Edit
It is not documented anywhere, but very recently (almost since the day I answered this question) YouTube changed the Data API so that it accepts 50 videos per day, and after that it accepts only one video per 15 minutes. Because this was applied without being documented or explained by YouTube, we cannot anticipate what the limitation is going to be next.
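For a rough sense of what the default quota allows, here is a back-of-the-envelope sketch; the per-upload cost is an assumption taken from YouTube's public quota calculator and may change:

```python
# Back-of-the-envelope quota arithmetic.
# UPLOAD_COST is an assumption taken from YouTube's public quota calculator
# (videos.insert is listed at roughly 1,600 units) and may change over time.
DAILY_QUOTA = 1_000_000   # default daily quota units
UPLOAD_COST = 1_600       # approximate cost of one videos.insert call

print(DAILY_QUOTA // UPLOAD_COST, "uploads/day on quota alone")      # ~625

# The undocumented throttle described in the edit is far stricter:
# 50 uploads up front, then roughly one every 15 minutes.
print(50 + (24 * 60) // 15, "uploads/day as a very rough throttle ceiling")
```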

Failing Core Web Vitals despite 90%+ PSI results

Site: www.purelocal.com.au
Tested thousands of URLs in Google PSI - all are green, 90%+.
However, in Google webmaster tools it shows 0 good URLs.
Can someone please explain what Google requires and what we can do to pass Core Web Vitals before June?
We've spent months optimising everything and cannot optimise any further, but Google says that none of our URLs pass Core Web Vitals... it's just ridiculous.
Looking at your website's report in the CrUX Dashboard, there are a couple of things you could optimize more:
First, your site's LCP is right on the edge of having 75% good desktop experiences, and phone experiences are below that at 66% good. https://web.dev/optimize-lcp/ has some great tips for addressing LCP issues.
Second, while your site's desktop FID experiences are overwhelmingly good (98%), you do seem to have a significant issue for phone users (only 44% good). There are similarly great tips in the https://web.dev/optimize-fid/ article.
While the big green "98" score on PSI makes it look like the page is nearly perfect, what matters most in terms of the user experience is real field data. That information can be found in the "Field Data" and "Origin Summary" sections of the report.
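If you want to pull that same field data programmatically rather than reading it off PSI, a minimal sketch against the Chrome UX Report (CrUX) API might look like this; it assumes you have a CrUX API key, and the metric keys shown are the standard CrUX names for LCP and FID (metric availability can change as the metrics evolve):

```python
# Query origin-level field data from the Chrome UX Report (CrUX) API.
# Assumes a valid CrUX API key; requests is used for brevity.
import requests

API_KEY = "YOUR_CRUX_API_KEY"  # placeholder
ENDPOINT = f"https://chromeuxreport.googleapis.com/v1/records:queryRecord?key={API_KEY}"

resp = requests.post(ENDPOINT, json={
    "origin": "https://www.purelocal.com.au",
    "formFactor": "PHONE",
})
resp.raise_for_status()
metrics = resp.json()["record"]["metrics"]

for name in ("largest_contentful_paint", "first_input_delay"):
    print(name, "p75 (phone):", metrics[name]["percentiles"]["p75"])
```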
You mentioned in the comments that your server response time is an issue. I can confirm this with lab testing:
https://webpagetest.org/graph_page_data.php?tests=210506_AiDcJ0_0283e8c51814788904bdf19cebe7a5c8&medianMetric=TTFB&fv=1&median_run=1&zero_start=true&control=NOSTAT#TTFB
https://webpagetest.org/result/210506_AiDcJ0_0283e8c51814788904bdf19cebe7a5c8/8/details/#waterfall_view_step1
The long light blue bar on line 1 of the chart above shows how long it takes your server to respond to the request. In this case the time to first byte (TTFB) is 1.132 seconds. This is going to be a huge problem for most users to achieve a fast LCP because in these tests it takes 1.9 seconds just to get the HTML to the client. No amount of frontend optimizations can make the HTML arrive sooner than that. You need to focus on backend optimizations to get the TTFB down.
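If you want a quick spot check of TTFB outside of WebPageTest, here is a rough sketch; it times how long the response headers take to arrive, which approximates TTFB but does not break out DNS/TLS the way WebPageTest does:

```python
# Approximate TTFB: time from sending the request until response headers arrive.
import time
import requests

url = "https://www.purelocal.com.au/"  # the page under test

start = time.perf_counter()
resp = requests.get(url, stream=True)  # stream=True: headers only, body not read yet
ttfb = time.perf_counter() - start

print(f"Approximate TTFB: {ttfb:.3f}s (HTTP {resp.status_code})")
resp.close()
```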
I can't give you any specific hosting recommendations but it does seem like the shared hosting is adversely affecting your users' LCP performance.

Freezing while downloading large datasets through Shodan?

I'm using Shodan's API through the Anaconda Terminal on Windows 10 to get data against the query below, but after a few seconds of running, the ETA timer freezes, and my network activity drops to zero. Hitting Control+C restarts it when this happens and gets it moving for a few seconds again, but it stops soon after.
shodan download --limit 3100000 data state:"wa"
Also, when it is running, the download speed seems pretty slow, and I wanted to ask whether there is any way I can speed it up. My university's internet is capable of upwards of 300 Mbps, but the download seems to cap at 5 Mbps.
I don't know how to solve either of these issues; my device has enough space and my internet isn't disconnecting. We've tried running the Anaconda Terminal as an Administrator, but that hasn't helped either.
I am not familiar with the specific website, but in general, limited speeds or stalled downloads are not caused by things 'on your side' like the university connection, or even your download script.
Odds are that the website wants to protect itself and that you need to use the API differently (for example, with a different account), or that you are hitting some usage limits tied to your account.
The best course of action may be to contact the website and ask them how to do this.
I heard back from Shodan support; cross-posting some of their reply here:
The API is not designed for large, bulk export of data. As a result, you're encountering a few problems/limits:
There is a hard limit of 1 million results per search query. This means that it isn't possible to download all results for the search query "state:wa".
The search API performs best on the first few pages and responds progressively more slowly the deeper into the results you get. This means that the first few pages return instantly, whereas the 100th page can potentially take 10+ seconds.
You can only send 1 request per second, so you can't multiplex/parallelize the search requests.
A lot of high-level analysis can be performed using search facets.
There's documentation on facets, for returning summary information from their API, in the shodan.pdf booklet floating around their site.
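As an illustration of the facets suggestion, here is a minimal sketch using the shodan Python package; the facet names are just examples, and you need your own API key:

```python
# Summarise a query with facets instead of bulk-downloading millions of banners.
# Requires the `shodan` package and a valid API key; facet names are examples.
import shodan

api = shodan.Shodan("YOUR_API_KEY")

results = api.count('state:"wa"', facets=[("port", 10), ("org", 10)])
print("Total matching results:", results["total"])

for facet_name, buckets in results["facets"].items():
    print(f"\nTop {facet_name} values:")
    for bucket in buckets:
        print(f"  {bucket['value']}: {bucket['count']}")
```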

Google authenticator expiration time (or drift?)

I'm using a gem to enable Google's multi-factor authentication for my app: https://github.com/jaredonline/google-authenticator
We want to start using text messages to make this a bit more accessible, and I was wondering if anyone knew whether you can control the expiration of the tokens. Would bumping the drift up to 300 seconds work? Just curious if anyone else has encountered this. Thanks!
Basically, yes.
You can't change the length of a GA token's validity (the time step is fixed at 30 seconds), but the drift setting lets you define a window for how much older a token can be. The underlying ROTP library will compute all tokens across that window and succeed if any of them match.
But you probably don't want a window that large. The security comes from the user and the site being (mostly) in sync. You should account for a bit of drift between your server and the user's device, but anything much more than ~30 seconds will mostly be to your users' detriment. It's a 6-digit number -- it shouldn't take 5 minutes to key in.
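To make the drift/window idea concrete, here is a sketch of the same concept using Python's pyotp (the gem itself sits on Ruby's ROTP; pyotp is used here purely for illustration):

```python
# Illustration of a TOTP verification window ("drift").
# The gem in question uses Ruby's ROTP; pyotp is used here only to show the idea.
import pyotp

secret = pyotp.random_base32()
totp = pyotp.TOTP(secret)  # 30-second time step, the fixed GA token length

code = totp.now()

# valid_window=1 accepts the current 30-second step plus one step on either
# side, i.e. roughly +/-30 seconds of drift. A 300-second window would be
# valid_window=10 -- it works, but it widens the attack surface considerably.
print(totp.verify(code, valid_window=1))       # True
print(totp.verify("000000", valid_window=1))   # almost certainly False
```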

disqus comments refreshing takes really long

I added Disqus to my site (on localhost), and when I add comments it doesn't refresh automatically; it takes a really long time, about 10-15 minutes. I already turned developer mode on, and it is working because it reads the number of comments, it just takes ages to show them. So when there are 5 comments and I add 10 more, it will stay at 5 for about 15 minutes. Is there something I can do about that?
The comments are cached, so no, it's not something you can control. Realtime delivery is still immediate, so that's generally how new comments get delivered to the client.
However, I wouldn't necessarily count on it being that way forever. We do change things to better the user experience based on feedback, whenever possible.

Web site loading speed is slow

My website http://theminimall.com is taking longer to load than before.
Initially my server was in the US, and at that time my website loaded in around 5 seconds.
Now that I have transferred my server to Singapore, the loading time has increased to about 10 seconds.
Most of the waiting time is spent getting results from a stored procedure (SQL Server database),
but when I execute the stored procedure in SQL Server directly, it returns results very fast.
So I assume the time taken is not due to query execution delay but to the data transfer time from the SQL server to the web server. How can I eliminate or reduce this time? Any help or advice will be appreciated.
Thanks in advance.
I took a look at your site on websitetest.com. You can see the test here: http://www.websitetest.com/ui/tests/50c62366bdf73026db00029e.
I can see what you mean about the performance. It's definitely fastest in Singapore, but even there it's pretty slow, and elsewhere around the world it's even worse. There are a few things I would look at.
First pick any sample, such as http://www.websitetest.com/ui/tests/50c62366bdf73026db00029e/samples/50c6253a0fdd7f07060012b6. Now you can get some of this info in the Chrome DevTools, or FireBug, but the advantage here is seeing the measurements from different locations around the world.
Scroll down to the waterfall. All the way on the right side of the Timeline column heading is a drop down. Choose to sort descending. Here we can see the real bottlenecks. The first thing in the view is GetSellerRoller.json. It looks like hardly any time is spent downloading the file. Almost all the time is spent waiting for the server to generate the file. I see the site is using IIS and ASP.net. I would definitely look at taking advantage of some server-side caching to speed this up.
The same is true for the main HTML, though a bit more time is spent downloading that file. It looks like it's taking so long to download because it's a huge file (for HTML). I would take the inline CSS and JS out of there.
Go back to the natural order for the timeline, then you can try changing the type of file to show. Looks like you have 10 CSS files you are loading, so take a look at concatenating those CSS files and compressing them.
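As a toy illustration of the concatenation step (a real build would normally use a bundler and a minifier; the paths here are placeholders), something like this small Python script can merge the stylesheets into a single request:

```python
# Toy build step: merge many CSS files into one bundle to cut request count.
# Paths are placeholders; real projects usually do this with a bundler/minifier.
from pathlib import Path

css_dir = Path("static/css")
bundle_path = Path("static/bundle.css")

css_files = sorted(css_dir.glob("*.css"))
bundle_path.write_text(
    "\n".join(f.read_text(encoding="utf-8") for f in css_files),
    encoding="utf-8",
)
print(f"Bundled {len(css_files)} stylesheets into {bundle_path}")
```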
I see your site has to make 220+ connections to download everything. That's a huge number. Try to eliminate some of those.
Next down the list I see some big JPG files. Most of these are again waiting on the server, but some are taking a while to download. I looked at one of a laptop and was able to convert it to a highly compressed PNG, saving 30% on the size with a file that looked the same. Then I noticed that there are well over 100 images, many of which are really small. One of the big drags on your site is that there are so many connections for the browser to manage. Take a look at implementing CSS sprites for those small images; you can probably take 30-50 of them down to a single image download.
Final thing I noticed is that you have a lot of JavaScript loading right up near the top of the page. Try moving some of that (where possible) to later in the page and also look into asynchronously loading the js where you can.
I think that's a lot of suggestions for you to try. After you solve those issues, take a look at leveraging a CDN and other caching services to help speed things up for most visitors.
You can find a lot of these recommendations in a bit more detail in Steve Souders' book High Performance Web Sites. The book is 5 years old and still as relevant today as ever.
I've just taken a look at websitetest.com and that website is not right at all: my site is among the 97% fastest, yet from testing 13 locations it says it's at 26%. Their servers must be overloaded, and I recommend you use a more reputable testing site such as http://www.webpagetest.org, which is backed by many big companies.
Looking at your contact details, it looks like your target audience is in India? If that is correct, you should use hosting wherever your main audience is, or the closest neighbor.
