reCAPTCHA validation API always returns true

I am using Google's reCAPTCHA test setup (site key: 6LeIxAcTAAAAAJcZVRqyHh71UMIEGNQ_MXjiZKhI) but I always get success: true back.
My Python code is straightforward, I guess:
challenge = event["body"]
data = {
    'secret': "6LeIxAcTAAAAAGG-vFI1TnRWxMZNFuojJ4WifJWe",  # Google's generic secret
    'response': challenge
}
captcha_verify_url = "https://www.google.com/recaptcha/api/siteverify"
r = requests.get(captcha_verify_url, params=data)
I can send whatever I want as the response and still get:
{
    "success": true,
    "challenge_ts": "2021-01-08T13:07:54Z",
    "hostname": "testkey.google.com"
}
Is that a normal behavior?

Yes, this is perfectly normal behaviour, and it is by design. That test site key / secret key pair exists for automated testing: with those keys, siteverify always reports success: true (with hostname testkey.google.com) no matter what response token you send. For real validation you have to register your own site and use your own key pair (see the reCAPTCHA docs on automated testing).
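For completeness, here is a minimal sketch of the same check against real keys, written in Node.js (18+, built-in fetch) rather than Python purely for illustration; the secret shown is a placeholder. With real keys, success is only true for a genuine, unexpired token, and error-codes explains any failure.
async function verifyRecaptcha(token) {
  const secret = "YOUR_REAL_SECRET_KEY";  // placeholder: the secret of your own registered site
  const params = new URLSearchParams({ secret, response: token });
  const res = await fetch("https://www.google.com/recaptcha/api/siteverify", {
    method: "POST",
    body: params,
  });
  const data = await res.json();
  if (!data.success) {
    console.log("verification failed:", data["error-codes"]);
  }
  return data.success;
}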

Related

Does Google Apps Script have an equivalent to Python's Session object?

I have this Python script and I want the Google Apps Script equivalent, but I do not know how to "pass" whatever needs to be passed between the subsequent GET or POST requests once I log in.
import requests
import json

# login
session = requests.session()
data = {
    'LoginName': 'name',
    'Password': 'password'
}
session.post('https://www.web.com/en-CA/Login/Login', data=data)
session.get('https://www.web.com//en-CA/Redirect/?page=Dashboard')

# get customer table
data = {
    'page': '1',
    'pageSize': '100'
}
response = session.post('https://www.web.com/en-CA/Reporting', data=data)
print(response.json())
I wonder if there is an equivalent to the .session() object from Python's requests module. I searched Google but could not find any working example. I am not a coder, so I don't know exactly what that .session() object does. Would it be enough to pass the headers from the response when making the next request?
UPDATE
I read in another question that Google might use a different IP for every single UrlFetchApp.fetch call, so the login and cookies might not work, I guess.
I believe your goal is as follows: you want to reproduce your Python script with Google Apps Script.
Issue and workaround:
If my understanding is correct, when Python's session() is used, multiple requests can share state because the session keeps the cookies. To achieve the same thing with Google Apps Script, the cookie has to be retrieved from the 1st response and included in the request header of the 2nd request, because at the current stage UrlFetchApp has no method for directly keeping a cookie and reusing it in the next request.
With that in mind, your script converted to Google Apps Script becomes as follows.
Sample script:
function myFunction() {
  const url1 = "https://www.web.com/en-CA/Login/Login";
  const url2 = "https://www.web.com//en-CA/Redirect/?page=Dashboard";
  const url3 = "https://www.web.com/en-CA/Reporting";

  // 1st request
  const params1 = {
    method: "post",
    payload: {LoginName: "name", Password: "password"},
    followRedirects: false
  };
  const res1 = UrlFetchApp.fetch(url1, params1);
  const headers1 = res1.getAllHeaders();
  if (!headers1["Set-Cookie"]) throw new Error("No cookie");

  // 2nd request
  const params2 = {
    headers: {Cookie: JSON.stringify(headers1["Set-Cookie"])},
    followRedirects: false
  };
  const res2 = UrlFetchApp.fetch(url2, params2);
  const headers2 = res2.getAllHeaders();

  // 3rd request
  const params3 = {
    method: "post",
    payload: {page: "1", pageSize: "100"},
    headers: {Cookie: JSON.stringify(headers2["Set-Cookie"] ? headers2["Set-Cookie"] : headers1["Set-Cookie"])},
    followRedirects: false
  };
  const res3 = UrlFetchApp.fetch(url3, params3);
  console.log(res3.getContentText());
}
With this sample script, the cookie is retrieved from the 1st request and then used for the next request.
Unfortunately, I have no information about your actual server and cannot test against your actual URLs, so I'm not sure whether this sample script works for your server as-is.
I'm also not sure whether followRedirects: false is required in each request. If an error occurs, please remove it and test again.
Regarding the way the cookie is included in the request header, JSON.stringify might not be required. But I'm not sure about this for your server.
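If JSON.stringify of the Set-Cookie value does not work for the server, a common alternative is to build a standard Cookie header from the name=value part of each Set-Cookie entry. A rough sketch, not tested against this server (buildCookieHeader_ is just an illustrative helper name):
function buildCookieHeader_(headers) {
  // "Set-Cookie" from getAllHeaders() can be a single string or an array of strings.
  var setCookie = headers["Set-Cookie"];
  if (!setCookie) return "";
  var cookies = Array.isArray(setCookie) ? setCookie : [setCookie];
  // Keep only the "name=value" part of each entry and join them into one Cookie header.
  return cookies.map(function (c) { return c.split(";")[0]; }).join("; ");
}
It would then be used as headers: {Cookie: buildCookieHeader_(headers1)} in params2 and params3.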
Reference:
Class UrlFetchApp

Mock GraphQL server with multiple stubs in Cypress

Problem:
I'm using Cypress with Angular and Apollo GraphQL. I'm trying to mock the GraphQL server so I can write my tests against custom responses. The issue here is that all GraphQL calls go to a single endpoint, and Cypress doesn't yet have full network support by default to distinguish between these calls.
An example scenario would be:
access /accounts/account123
when the API is hit, two GraphQL calls are sent out: a getAccountDetails query and a getVehicles query
Tried:
Using one stub of the GraphQL endpoint per test. Not working, as it answers all calls with the same stub.
Changing the app so that the query name is appended to the URL 'on the go', where I could intercept it in Cypress and therefore have a unique URL for each query. Not possible, as I can't change the app.
My only bet seems to be intercepting the XHR call, but I don't seem to be able to get it working. I tried all the XHR options outlined here, with no luck (it picks only the stub declared last and uses that for all calls): https://github.com/cypress-io/cypress-documentation/issues/122.
The answer from this question uses Fetch and therefore doesn't apply:
Mock specific graphql request in cypress when running e2e tests
Anyone got any ideas?
With Cypress 6.0, route and route2 are deprecated in favour of intercept. As described in the docs (https://docs.cypress.io/api/commands/intercept.html#Aliasing-individual-GraphQL-requests), you can mock GraphQL requests this way:
cy.intercept('POST', '/api', (req) => {
  if (req.body.operationName === 'operationName') {
    req.reply({ fixture: 'mockData.json' });
  }
});
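Building on that, the same docs section also shows dynamic aliasing, which lets different queries hitting the same endpoint be stubbed and waited on separately. A sketch using the operation names from the question (the fixture file names are illustrative):
cy.intercept('POST', '/api', (req) => {
  if (req.body.operationName === 'getAccountDetails') {
    req.alias = 'getAccountDetails';              // alias this call so it can be waited on
    req.reply({ fixture: 'accountDetails.json' });
  } else if (req.body.operationName === 'getVehicles') {
    req.alias = 'getVehicles';
    req.reply({ fixture: 'vehicles.json' });
  }
});
cy.visit('/accounts/account123');
cy.wait(['@getAccountDetails', '@getVehicles']);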
For anyone else hitting this issue, there is a working solution with the new Cypress release using cy.route2().
The requests are sent to the server, but the responses are stubbed/altered on the way back.
Later edit:
I noticed that the code version below doesn't alter the status code. If you need that, I'd recommend the version I left as a comment below.
Example code:
describe('account details', () => {
  it('should display the account details correctly', () => {
    cy.route2(graphEndpoint, (req) => {
      let body = req.body;
      if (body == getAccountDetailsQuery) {
        req.reply((res) => {
          res.body = getAccountDetailsResponse;
          res.status = 200;
        });
      } else if (body == getVehiclesQuery) {
        req.reply((res) => {
          res.body = getVehiclesResponse;
          res.status = 200;
        });
      }
    }).as('accountStub');
    cy.visit('/accounts/account123').wait('@accountStub');
  });
});
Both your query and response should be in string format.
This is the cy command I'm using:
import * as hash from 'object-hash';

Cypress.Commands.add('stubRequest', ({ request, response, alias }) => {
  const previousInterceptions = Cypress.config('interceptions');
  const expectedKey = hash(
    JSON.parse(
      JSON.stringify({
        query: request.query,
        variables: request.variables,
      }),
    ),
  );
  if (!(previousInterceptions || {})[expectedKey]) {
    Cypress.config('interceptions', {
      ...(previousInterceptions || {}),
      [expectedKey]: { alias, response },
    });
  }
  cy.intercept('POST', '/api', (req) => {
    const interceptions = Cypress.config('interceptions');
    const receivedKey = hash(
      JSON.parse(
        JSON.stringify({
          query: req.body.query,
          variables: { ...req.body.variables },
        }),
      ),
    );
    const match = interceptions[receivedKey];
    if (match) {
      req.alias = match.alias;
      req.reply({ body: match.response });
    }
  });
});
With that it is possible to stub exact request queries and variables:
import { MUTATION_LOGIN } from 'src/services/Auth';
...
cy.stubRequest({
  request: {
    query: MUTATION_LOGIN,
    variables: {
      loginInput: { email: 'test@user.com', password: 'test@user.com' },
    },
  },
  response: {
    data: {
      login: {
        accessToken: 'Bearer FakeToken',
        user: {
          username: 'Fake Username',
          email: 'test@user.com',
        },
      },
    },
  },
});
...
Cypress.config is what makes this possible: it acts as a kind of global key/value getter/setter in tests, which I'm using to store the interceptions, keyed by the hash of the expected request, together with the fake responses.
This helped me: https://www.autoscripts.net/stubbing-in-cypress/ (I'm not sure where the original source is).
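As a tiny illustration of that get/set pattern (the 'interceptions' key is simply the custom name used by the command above; someKey and the alias value are placeholders):
Cypress.config('interceptions', {});                          // seed a custom key
const stored = Cypress.config('interceptions');               // read it back later in the test
Cypress.config('interceptions', { ...stored, someKey: { alias: 'login' } });  // extend it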
A "fix" that I use is to create multiple aliases, with different names, on the same route, with wait on the alias between the different names, as many as requests you have.
I guess you can use aliases, as already suggested in the answer by @Luis above, like this. This is given in the documentation too. The only thing you need here is multiple aliases, since you have multiple calls and have to manage the sequence between them. Please correct me if I understood your question differently.
cy.route({
  method: 'POST',
  url: 'abc/*',
  status: 200,
  response: { /* whatever response is needed in the mock */ }
}).as('mockAPI')

// HERE YOU SHOULD WAIT till the mockAPI is resolved.
cy.wait('@mockAPI')

GET request with query parameters returns 403 error (signature does not match) - AWS Amplify

Problem
I was trying to make an 'aws-amplify' GET API request with query parameters on the client side, but it failed with Request failed with status code 403, and the response showed:
"message": "The request signature we calculated does not match the signature you provided. Check your AWS Secret Access Key and signing method. Consult the service documentation for details."
Note: React.js on the front end, JavaScript on the back end.
My code
Front-end
function getData() {
  const apiName = 'MyApiName';
  const path = '/path';
  const content = {
    body: {
      data: 'myData',
    },
  };
  return API.get(apiName, path, content);
}
Back-end
try {
  const result = await dynamoDbLib.call("query", params);
} catch (e) {
  return failure({ status: false });
}
What I did to debug
The GET Lambda function works fine in the Amazon Console (tested).
If I change the backend Lambda function so that the frontend request can be made without parameters, i.e. return API.get(apiName, path), then no error shows up.
My question
How can I make this GET request with query parameters work?
I changed GET to POST (return API.post(...)), and everything works fine now.
If anyone can provide a more detailed explanation, it would be very helpful.
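For what it's worth, the Amplify REST API also lets a GET carry query parameters through the init object (queryStringParameters) instead of a body, and sending a body with a GET is a likely cause of the signature mismatch here. A hedged sketch based on the code in the question (the 'data' key is just the one used above):
import { API } from 'aws-amplify';

function getData() {
  const apiName = 'MyApiName';
  const path = '/path';
  const init = {
    // send the values as query string parameters instead of a request body
    queryStringParameters: {
      data: 'myData',
    },
  };
  return API.get(apiName, path, init);
}
With a standard API Gateway proxy integration, the values then arrive in event.queryStringParameters on the Lambda side.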

Socket.io authorization, "No Authorization header was found"

Trying to add authorization to my real-time app, I get this message in the Chrome console:
socket.io-parser decoded 4{"message":"No Authorization header was found","code":"credentials_required", "type":"UnauthorizedError"}
Here is my code (I use angular-fullstack):
socket.service.js (client side)
'use strict';

angular.module('smthing')
  .factory('socket', function(socketFactory, Auth) {
    var ioSocket = io(null, {
      // Auth.getToken() returns $cookieStore.get('token') from angular auth.service.js
      'query': 'token=' + Auth.getToken()
    });
...
socketio.js (server side)
socketio.use(require('socketio-jwt').authorize({
  secret: 'smthing',
  handshake: true
}));

socketio.on('connection', function (socket) {
  console.log('smthing');
...
"smthing" never prints. If I remove the authorization part, everything works correctly. I thought it was pretty straight forward... Any help would be great !
I was able to resolve this in my own environment. I use the snippet below, though I had issues until I fixed the root cause.
io.set('authorization', socketioJwt.authorize({
  secret: jwtSecret,
  handshake: true
}));
The root issue was that the token required by socketio-jwt was not being added to the request query. If you inspect socketio-jwt/lib/index.js, you'll see some code like:
// get the token from query string
if (req._query && req._query.token) {
  token = req._query.token;
}
else if (req.query && req.query.token) {
  token = req.query.token;
}
This means you should be able to log the value of query and _query here and see your token.
If you cannot, you must make sure that your socket.io client is setting the token on the query. The typical way to do this is to add it to the connectParams.
In the iOS client, this looks like this (Objective-C shown):
self.socket = [[SocketIOClient alloc] initWithSocketURL:@"localhost:8091"
    options:@{
        @"connectParams" : @{@"token" : <my_token>}
    }];
Actually, the right syntax is:
var ioSocket = io(null, {
  query: {
    token: Auth.getToken()
  }
});

JSONP: pass API key

I've got an Arduino uploading sensor data to cosm.com. I made a simple webpage on my local web server to query the cosm.com API and print out the values.
The problem is that if I am not logged into cosm.com in another tab, I get a login popup.
The solution is to pass my public key to cosm.com, but I am in way over my head here.
The documentation gives an example of how to do it in curl, but not JavaScript:
curl --request GET --header "X-ApiKey: -Ux_JTwgP-8pje981acMa5811-mSAKxpR3VRUHRFQ3RBUT0g" https://api.cosm.com/v2/feeds/120687/datastreams/sensor_reading
How do I pass my key into the URL?
function getJson() {
  $.ajax({
    type: 'GET',
    url: "https://api.cosm.com/v2/feeds/120687/datastreams/sensor_reading",
    // This line isn't working
    data: "X-ApiKey: -Ux_JTwgP-8pje981acMa5811-mSAKxpR3VRUHRFQ3RBUT0g",
    success: function(feed) {
      var currentSensorValue = feed.current_value;
      $('#rawData').html(currentSensorValue);
    },
    dataType: 'jsonp'
  });
}
UPDATE:
It must be possible, because hurl.it is able to query the API:
http://www.hurl.it/hurls/75502ac851ebc7e195aa26c62718f58fecc4a341/47ad3b36639001c3a663e716ccdf3840352645f1
UPDATE 2:
While I never did get this working, I did find a workaround. Cosm has its own JavaScript library that does what I am looking for:
http://cosm.github.com/cosm-js/
http://jsfiddle.net/spuder/nvxQ2/5/
You need to send it as a header, not as a query string, so try this:
function getJson() {
  $.ajax({
    type: 'GET',
    url: "https://api.cosm.com/v2/feeds/120687/datastreams/sensor_reading",
    headers: { "X-ApiKey": "-Ux_JTwgP-8pje981acMa5811-mSAKxpR3VRUHRFQ3RBUT0g" },
    success: function(feed) {
      var currentSensorValue = feed.current_value;
      $('#rawData').html(currentSensorValue);
    },
    dataType: 'jsonp'
  });
}
It should be much easier to get this working using CosmJS. It is an officially supported library and provides full coverage of the Cosm API.
