Is there a way to limit the user in the reCAPTCHA V2 image matching challenge? For example, if the user fails the image matching challenge 5 times, then return a failure - recaptcha

I am using reCAPTCHA V2. Sometimes a user knowingly or unknowingly clicks the wrong image in the image matching challenge, so a new image grid keeps coming up and the user keeps clicking the wrong images again and again. My question is: is there any way to limit that? Is there any setting that allows the developer to set a limit, so that if the user fails the image verification challenge N times we can stop the user from going further and ask them to try the entire process again from the start.
Please help me out regarding this.
Thanks in advance

Related

Cannot request an increased API Reads Per User Per Second - Google website error?

I keep getting errors from the Google API telling me that I have exceeded the limit of 100 reads per second per user when trying to read from Google Sheets using the API. I'm using CozyRoc's REST Connection and REST Source connected to Google Sheets then connecting to a SQL Server Destination table and trying to populate the table from the Google Sheet.
Please don't try to offer any suggestions about other programs or other ways of accessing the data. I have scores of SSIS packages that use the same set up to import data. In all of those that cycle through about 10 Google sheets with anywhere from one to 20 tabs and as many as 400 rows, I've had to set the Reads per Second to one to avoid the error. That makes my uploads incredibly slow. And YES, I contacted CozyRoc who tells me it's a Google API problem, not theirs, and there's nothing they can or will do about it.
So . . . I made sure we have a billing account . . .
I was able to sign in and look at the Quotas screen: 500 Reads per Second and 100 Reads per Second per User.
I was able to request an increased limit for the Reads Per Second to 500 which doesn't require a billing account, but that doesn't change the Reads Per Second Per User.
CozyRoc uses my user id (OAuth token) to access Google Sheets, so every read comes from only one user.
A popup displays when I try to edit the Reads Per Second Per User from the Quotas screen asking me to set the limit to a maximum of 100 or to request a quota limit increase.
I click on that request button and I am immediately returned to the Quotas screen. I'm never asked to set a new limit, I'm never told that the request was sent and received, and I see no notice telling me how long it may take to process the request. It's been about a week now since I first tried. It would be nice if they would tell you SOMETHING!
My thought is that there may be something wrong with their website, perhaps with that popup. I've tried calling Cloud Support, but they refuse to help, saying basically "Not my job to help you with setting quota limits," even though my question is really whether there is a known issue with the quota limit increase website/popup.
SO . . . Is anyone else having a similar problem?
Can anyone tell me if what's happening is normal and that it just takes WEEKS for Google to process a quota limit increase or how long it normally takes for them to process a quota limit rate increase?
Is there anywhere I can reach out to at Google where I can do a screen sharing session and show them what's happening to get an answer or find out whether or not my request was even received?
Any ideas or thoughts as to how I can find out what's going on?
Please let me know.
Thanks,
How to edit an API-specific quota
The per-user quota can be edited per API and per project:
Go to the GCP console
Choose the project for which you want to increase the quota limit
Go to IAM & Admin -> Quotas
Choose Google Sheets API and Read requests
Click on the pencil next to Read requests per 100 seconds per user
Choose a quota limit below 100 - which is the maximum limit allowed by the Sheets API
Click on Save
If you want to increase your limit above 100 requests per 100 seconds per user:
Mind that this limit is above the normally allowed quota
Click on Apply for higher quota
Click on ALL QUOTA for Read requests per 100 seconds per user
Check the tickbox next to GLOBAL
Confirm your contact details and click on Next
Enter the desired new limit and click on Done
KEEP IN MIND
If you do not have a billing account, you are not eligible to choose a limit of more than 100 read requests per user per 100 seconds, and you will get the error message:
You can't request more quota because your project is not linked to a billing account.

Throttling in Laravel

Can somebody help me out with Laravel's throttling? Right now, my website uses throttling to prevent a user from logging in for 'x' seconds if the password they entered was wrong 'x' number of times.
After logging in, the user needs two-factor authentication to update their information, but I would like to throttle the two-FA step too, so that they get locked out from updating their account. I could actually reuse the login's throttling code to lock the user out, but the issue is that when the user logs out, they won't be able to log in again due to the temporary lock.
I would like to create a custom throttle just for two-FA and probably prevent the user from accessing that specific route for 'x' seconds.
I have tried searching around, but everything is related to login. If somebody could suggest a package that fits my requirement, or provide a simple tutorial, it would really be helpful. Thanks for your time.
This is all outlined in the ThrottlesLogins trait, but I'll try to simplify it even further.
Generate a unique key for the user and type of request:
$key = '2fa:' . $user->id;
Add a hit (increment count) on every request to the endpoint using the Illuminate\Cache\RateLimiter class:
app(RateLimiter::class)->hit($key, $timeoutInMinutes);
Check if the limit has been reached before processing the request:
$bool = app(RateLimiter::class)->tooManyAttempts($key, $maxAttempts, $timeoutInMinutes);
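Putting those three pieces together, here is a minimal sketch of what a 2FA endpoint guarded this way could look like. It assumes the Laravel 5.x-era signatures shown above (they differ in newer versions), and the controller, route and limits are illustrative, not from the question:

use Illuminate\Cache\RateLimiter;
use Illuminate\Http\Request;

class TwoFactorController extends Controller
{
    public function verify(Request $request)
    {
        $limiter = app(RateLimiter::class);
        $key = '2fa:' . $request->user()->id;   // one counter per user
        $maxAttempts = 5;                        // illustrative limits
        $timeoutInMinutes = 10;

        // Refuse the request while the user is locked out.
        if ($limiter->tooManyAttempts($key, $maxAttempts, $timeoutInMinutes)) {
            return response('Too many attempts, please try again later.', 429);
        }

        // Count this attempt.
        $limiter->hit($key, $timeoutInMinutes);

        // ... validate the 2FA code here; on success you can reset the counter:
        // $limiter->clear($key);
    }
}

Because the key is scoped to '2fa:' plus the user id, locking this endpoint does not interfere with the login throttle, so logging out and back in keeps working.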

grails - how to create a new session for different browser tabs

I'm trying to create a simple web app using Grails.
Now, I need to create a new session when the user opens the same page in different tabs, to avoid displaying the same data in all opened tabs.
Is it possible to detect that the page was opened in a new tab? If so, how do I create a new session in a controller action?
Or maybe there is a way to get something like a browser tab id?
You seem to misunderstand how sessions work and how they are assigned.
A session is per browser (and domain/host).
So, even though you can create a new session in a controller action it won't help because that will become the session for all the tabs of the browser and the previous session(s) will be invalidated/abandoned.
There is no such thing as a browser tab id.
You'll need to address the root issue which is causing your data affinity to be based on a browser session. Make it based on something else. (Just a general suggestion since this isn't part of your questions and you haven't provided any details.)
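As an illustration of basing the affinity on something other than the session (not from the answer, just a hedged sketch with made-up names), one common workaround is to mint a token per tab and pass it back on every request:

class ReportController {

    def show() {
        // A new tab arrives without a token, so mint one; an existing tab
        // sends its token back as a request parameter (e.g. ?tabId=...).
        String tabId = params.tabId ?: UUID.randomUUID().toString()
        // Key any per-tab state by tabId instead of by session attributes.
        [tabId: tabId, data: loadDataFor(tabId)]
    }

    private Map loadDataFor(String tabId) {
        // Hypothetical lookup keyed by the tab token.
        [:]
    }
}

The GSP would then include tabId in its links and forms, so each tab keeps carrying its own identifier while all tabs still share the one HTTP session.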
Here are my thoughts on this.
What you are trying to accomplish may appear simple, but you will need some mechanism to identify each session, whether by a Spring Security username or the actual HTTP session id, then store against it which controller actions they have visited so far, and keep this consistently updated whilst checking it over and over again.
Something as simple as
[
    '10001': [[controller: 'someController', action: 'someAction'],
              [controller: 'someController1', action: 'someAction1']],
    '10002': [[controller: 'someController', action: 'someAction'],
              [controller: 'someController1', action: 'someAction1']]
]
Here '10001' is the key of your map and is your session id; it holds a list of internal maps of the places visited, which you capture and use to work out whether they have been there already. Basically the question here is....
Where is the logic to say that if they have seen someAction1 they should see someAction2, and what happens when they have seen someAction1 and someAction2, and so on - a never-ending loop of "and what next?"
Either way, you could do all of that as a session variable that contains a map like the one above - the issue you will hit is concurrent access (the map gets updated and read at the same time).
So you will then need to look into using concurrent hash maps to get around such issues, as in the sketch below.
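A rough sketch of that tracking structure, using a ConcurrentHashMap so simultaneous requests cannot corrupt it (the class and method names here are made up for illustration):

import java.util.concurrent.ConcurrentHashMap

class VisitTracker {
    // session id -> list of [controller: ..., action: ...] entries already served
    static final Map<String, List<Map>> visited = new ConcurrentHashMap<>()

    static void record(String sessionId, String controller, String action) {
        visited.computeIfAbsent(sessionId) { Collections.synchronizedList([]) }
               .add([controller: controller, action: action])
    }

    static boolean hasSeen(String sessionId, String controller, String action) {
        visited.get(sessionId)?.any {
            it.controller == controller && it.action == action
        } ?: false
    }
}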
Either way, the problem with all of the above is the consistent logic needed to figure out, once they have seen all possible options, what comes next.
I think you are far better off looking at it from a different point of view: base it on a timestamp and have the query (or whatever it is) generate a different output based on that timestamp, since that is always going to change regardless of the user.

Sending a large amount of data (base64 strings) to the server via AJAX (POST)

I have a page on which I am holding base64 representations of some images (around 1 MB each). I am posting this data via AJAX to the server (contentType is the default - url-encoded). This works fine if I have one or two images to send, but if I have more than 2 MB of request data the server doesn't accept it and the request parameters arrive empty. So I increased maxPostSize in my Tomcat and it started accepting more data, but I am a bit apprehensive whether this will create memory issues, especially if I have a lot of images.
Also, I tried changing the contentType to multipart/form-data, but it errors out, saying "the request was rejected because no multipart boundary was found".
EDIT
I think I should elaborate more. The actual requirement is something like this: the user clicks on an upload link, uploads a file, and should then see a thumbnail of the image on the page (all this without refreshing the page). I tried the following approaches:
1. Reading the file using FileReader, showing the thumbnail and then explicitly triggering the upload when the user clicks on save - simple, but not cross-browser; it doesn't work in IE.
2. Letting the user upload the file, sending the base64 version of the image back from the server, and when the user clicks on save, sending the base64 string back to the server, converting it to a byte array and saving it to the DB.
Now, I have a screen where all the records are editable by default, so clicking on save means sending the image strings for all the records to the server, which will of course create memory issues.
3. Not implemented yet, but I am thinking of first saving the other fields (the non-image fields) and then explicitly saving the images one by one (looks okay, but the number of requests will be high).
Waiting for someone to suggest a 4th approach; I hope I have explained enough.
Disclaimer... I have not done anything like this, but...
Why not send each image separately? :)
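For what it's worth, a hedged sketch of that idea (assuming jQuery, with a made-up /saveImage endpoint and field name): post each base64 string in its own request, one after another, so no single POST ever approaches maxPostSize.

// images is an array of base64 strings already held on the page
function uploadImagesSequentially(images) {
    if (images.length === 0) {
        return $.Deferred().resolve().promise();   // nothing left to send
    }
    return $.ajax({
        url: '/saveImage',                 // hypothetical endpoint
        type: 'POST',
        data: { image: images[0] }         // default url-encoded content type
    }).then(function () {
        // send the next image only after the previous one succeeded
        return uploadImagesSequentially(images.slice(1));
    });
}

That keeps each request small, and it pairs naturally with approach 3 above (save the non-image fields first, then the images one by one).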

python-twitter GetSearch giving empty list

My code was working correctly until yesterday and I was able to fetch tweets from GetSearch(), but now it is returning an empty list, even though I checked that my credentials are correct.
Has something changed recently?
Thank you
They might have a limit on requests in a certain amount of time, or they had a failure on their system. You can ask for new credentials to see whether the problem is the former, and try fetching the tweets with them.
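To help tell an authentication or rate-limit problem apart from a genuinely empty result, here is a small sketch using python-twitter (the keys and search term below are placeholders):

import twitter

api = twitter.Api(consumer_key='...',
                  consumer_secret='...',
                  access_token_key='...',
                  access_token_secret='...')

try:
    print(api.VerifyCredentials())            # confirms the keys still work
    results = api.GetSearch(term='python', count=15)
    print(len(results), 'tweets returned')
except twitter.TwitterError as e:
    # A rate-limit or authentication failure surfaces here
    # instead of silently producing an empty list.
    print('Twitter API error:', e)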
