For an API request I send the JWT in the Authorization header, so I can read information from the JWT on the server. With a websocket connection, however, it is not possible to set headers.
I also do not want to send the JWT as data with every request. So I want to implement authentication (server-side) that saves data from the JWT to a Redis DB, with the socket id as the key.
Now I have the problem that the socket or the socket id is not part of the Feathers context in hooks, or is only available as a Symbol that I cannot figure out how to read right now. But I think there has to be a more elegant way to save data for a socket connection and get it back again.
What is the best way to save user data for a socket connection, so that the data does not have to be sent again with every request?
Maybe it would also help if somebody could tell me how to read the Symbol in the following structure:
connection: { provider: "socketio", Symbol(@feathersjs/socketio/socket): Socket }
How to authenticate a socket connection without the Feathers client is described in the API documentation here. The connection can be authenticated like this:
const io = require('socket.io-client');
const socket = io('http://localhost:3030');

socket.emit('create', 'authentication', {
  strategy: 'local',
  email: 'hello@feathersjs.com',
  password: 'supersecret'
}, function(error, authResult) {
  console.log(authResult);
  // authResult will be {"accessToken": "your token", "user": user }
  // You can now send authenticated messages to the server
});
Or by sending the existing access token when establishing the socket connection:
const io = require('socket.io-client');
const socket = io('http://localhost:3030', {
  extraHeaders: {
    Authorization: `Bearer <accessToken here>`
  }
});
I found a solution in a similar question: How to add parameters to a FeathersJS socket connection
It is possible to save information on the connection by adding it to socket.feathers; it will then be added to params in every following request.
Socketio Middleware
// Assuming `decode` comes from the jsonwebtoken package and
// `Forbidden` from @feathersjs/errors.
const { decode } = require('jsonwebtoken');
const { Forbidden } = require('@feathersjs/errors');

const jwtHandler = (feathers, authorization, next) => {
  const regex = /Bearer (.+)/g;
  const jwt = (regex.exec(authorization) || [])[1]; // todo: check that a token was actually matched
  try {
    const { accountId, userId } = decode(jwt);
    feathers.accountId = accountId;
    feathers.userId = userId;
    feathers.authorization = authorization;
  } catch (err) {
    throw new Forbidden('No valid JWT', err);
  }
  next();
};

const socketJwtHandler = io => io.use((socket, next) => {
  jwtHandler(socket.feathers, socket.handshake.query.authorization, next);
});
and then reading context.params.userId in hooks.
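For illustration, a hook consuming those values could look roughly like this (a sketch; the account-based query restriction is just an example):

const { Forbidden } = require('@feathersjs/errors');

// Hypothetical hook: reject calls whose connection was not authenticated
// by the socket.io middleware above, and scope queries to the caller's account.
module.exports = () => async context => {
  const { userId, accountId } = context.params;

  if (!userId) {
    throw new Forbidden('No user information on this connection');
  }

  context.params.query = { ...context.params.query, accountId };
  return context;
};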
I'm working on a project where we currently use Cognito User Pools for authentication, but after some research we found that if we want more fine-grained access control we should use an Identity Pool instead.
The theory is simple: first we create an Identity Pool that uses the Cognito User Pool as the auth provider. Then in API Gateway we set up our Lambda to use Authorizer: AWS_IAM. To access it, a user now has to:
Sign in to the User Pool, which gives the user a JWT token.
Exchange that JWT token with the Identity Pool for temporary AWS credentials (a rough sketch of this step follows the list).
Use those new credentials to sign the API request to the protected Lambda.
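For reference, step 2 roughly looks like this with the AWS SDK for JavaScript (a sketch; the region, pool id, provider name and idToken are placeholders):

const AWS = require('aws-sdk');

// Exchange a Cognito User Pool ID token (step 1) for temporary AWS credentials.
const exchangeToken = async (idToken) => {
  const cognitoIdentity = new AWS.CognitoIdentity({ region: 'eu-west-3' });
  const logins = {
    // Key is the user pool's provider name, value is the ID token from step 1.
    'cognito-idp.eu-west-3.amazonaws.com/<userPoolId>': idToken,
  };

  const { IdentityId } = await cognitoIdentity
    .getId({ IdentityPoolId: '<identityPoolId>', Logins: logins })
    .promise();

  const { Credentials } = await cognitoIdentity
    .getCredentialsForIdentity({ IdentityId, Logins: logins })
    .promise();

  // Credentials.AccessKeyId / SecretKey / SessionToken are then used for signing (step 3).
  return Credentials;
};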
Steps 1 and 2 work fine: with a test user we manage to get the JWT token and successfully exchange it for AWS credentials. They look like this (modified for security reasons):
awsAccessKey: ASIAZFDXSW29NWI3QZ01
awsSecretKey: B+DrYdPMFGbDd1VRLSPV387uHT715zs7IsvdNnDk
awsSessionToken: IQoJb3JpZ2luX2VjEA8aCWV1LXdlc3QtMyJHMEUCIQC4kHasZrfnaMezJkcPtDD8YizZlKESas/a5N9juG/wIQIgShWaOIgIc4X9Xrtlc+wiGuSC1AQNncwoac2vFkpJ3gkqxAQIWBAAGgw2NTI5NTE0MDE0MDIiDDuTZ1aGOpVffl3+XCqhBDmjCS3+1vSsMqV1GxZ96WMoIoEC1DMffPrBhc+NnBf94eMOI4g03M5gAm3uKAVCBkKO713TsQMaf4GOqqNemFC8LcJpKNrEQb+c+kJqqf7VWeWxveuGuPdHl1dmD2/lIc8giY0+q4Wgtbgs6i0/gR5HzdPfantrElu+cRNrn/wIq4Akf+aARUm14XsIgq7/1fT9aKSHpTgrnTLHeXLKOyf/lZ947XdH71IHDZXBUdwdPikJP/Rikwill6RRTVw7kGNOoacagCmmK7CD6uh9h0OnoW3Qw5df+zX5Z8U7U55AyQfEyzeB7bW3KH65yJn6sopegxIIFfcG2CLIvtb5cZYImAz/4BdnppYpsrEgLPUTvRAXn6KUa5sXgc5Vd7tJeRo5qpYckrR2qfbebsU+0361BCYK2HxGJqsUyt1GVsEoAosxofpn/61mYJXqfeR0ifCAgL7OMOquvlaUVXhHmnhWnUSIOUQ+XtRc+DxUDjwn5RPD7QTwLHIat7d4BI4gZJPAcMT9gZrBVO/iN88lk5R0M5LBzFwd5jiUW46H/G755I4e5ZHaT1I37TY3tbcObIFGVVNz5iHDpK/NePTJevKTshe8cYxXczOQgos4J/RsNpqouO9qRgT9JDyXjU3Etyxqm9RzbLYgV3fl5WwZl5ofVmrBsy3adq+088qEz5b9cogPgDggA/nQaPv7nAZHT8u0ct/hw230pmXUDGCutjOML2G6ZYGOoUCy+BitAN0SZOYWlbZlYomIGKMNQuXjV4z+S9CEW8VunqW4Rgl7rTba6xbI0DdX9upYEczeln6pTl+2UPEDYf6usayFfMsGDvJXesqC5EOtWco1Z8tem/wDQIH7ZbioQHZ7UJDd5ntUAruFveY7sXmKsQbtah/RB5W5HLYy19hCmyGpYMnVXxR0FcNGImsweNcprtw9MmQqy2SUK9V6Rwn1yIE6svfAT3NVyzp9ILbP/qSQLGHNhm4CNd8+EJZZa9rcmCbQiQ+iBJ8FW+AmRSCC4LiB1dhuH1KsFo88DyNhYdVf3py8XV4CDR7l+UyuZMrIQsERwx9JzwVBjfv9COT948mvyGTY
The issue is the signing. Our Lambda is behind a CloudFront proxy + API Gateway. Requests to e.g john.dev.project.io are forwarded to the 'real' API origin at api.dev.project.io.
Using Postman with AWS Signature auth, the request doesn't work and gives the following error:
The request signature we calculated does not match the signature you provided. Check your AWS Secret Access Key and signing method. Consult the service documentation for details.\n\nThe Canonical String for this request should have been\n'................................................................................................................................................................................................................................................................'\n\nThe String-to-Sign should have been\n'............................................................................'\n
We found, however, that by overriding the Host header to the real origin of the API, the request works fine.
So it seems that since the custom URL we use and the original API URL are different, the signatures don't match. The problem is that by default browsers don't allow you to override the Host header for security reasons, so our signed front-end requests always fail.
Maybe the proxy is also modifying other headers before forwarding to the origin, which, from my understanding, would also invalidate the signature...
Any help appreciated in solving this issue!
I was facing a similar issue when trying to make a signed request to an API Gateway endpoint behind an Akamai proxy.
The trick to solve it was indeed to generate the request as if you were sending it directly to the API Gateway URL, sign that request using SigV4, and then send the signed request to the proxy endpoint instead.
I've put together a simple Node.js example to show how to do this:
const AWS = require("aws-sdk");
const { HttpRequest } = require("@aws-sdk/protocol-http");
const { SignatureV4 } = require("@aws-sdk/signature-v4");
const { NodeHttpHandler } = require("@aws-sdk/node-http-handler");
const { Sha256 } = require("@aws-crypto/sha256-browser");

const REGION = "ca-central-1";
const PROXY_DOMAIN = "proxy.domain.com";
const PROXY_PATH = "/proxypath";
const API_GATEWAY_DOMAIN = "API-ID.execute-api.ca-central-1.amazonaws.com";
const API_GATEWAY_PATH = "/apigateway/path";
const IDENTITY_ID = "{{identity-pool-region}}:{{identity-pool-id}}";
const POOL_REGION = "{{identity-pool-region}}";
const REQUEST_BODY = { test: "test" };
const METHOD = "POST";

const updatedSignedRequestExample = async () => {
  try {
    const BODY = JSON.stringify(REQUEST_BODY);
    // Build the request as if it were sent directly to API Gateway.
    const request = new HttpRequest({
      body: BODY,
      headers: {
        "Content-Type": "application/json",
        host: API_GATEWAY_DOMAIN,
      },
      hostname: API_GATEWAY_DOMAIN,
      port: 443,
      method: METHOD,
      path: API_GATEWAY_PATH,
    });
    console.log("request", request);

    const credentials = await getCredentials();
    console.log(credentials);

    // Sign the request against the API Gateway host...
    const signedRequest = await signRequest(request, credentials);
    console.log("signedRequest", signedRequest);

    // ...then point it at the proxy and send it.
    const updatedSignedRequest = updateRequest(signedRequest);
    console.log("updatedSignedRequest", updatedSignedRequest);

    const response = await makeSignedRequest(updatedSignedRequest);
    console.log(response.statusCode + " " + response.body.statusMessage);
  } catch (error) {
    console.log(error);
  }
};

const getCredentials = async () => {
  const cognitoidentity = new AWS.CognitoIdentity({ region: POOL_REGION });
  const params = {
    IdentityId: IDENTITY_ID,
  };
  const response = await cognitoidentity
    .getCredentialsForIdentity(params)
    .promise();
  return {
    accessKeyId: response.Credentials.AccessKeyId,
    secretAccessKey: response.Credentials.SecretKey,
    sessionToken: response.Credentials.SessionToken,
    expiration: response.Credentials.Expiration,
  };
};

const signRequest = async (request, credentials) => {
  const signer = new SignatureV4({
    credentials: credentials,
    region: REGION,
    service: "execute-api",
    sha256: Sha256,
  });
  const signedRequest = await signer.sign(request);
  return signedRequest;
};

const updateRequest = (httpRequest) => {
  httpRequest.hostname = PROXY_DOMAIN;
  httpRequest.path = PROXY_PATH;
  httpRequest.headers.host = PROXY_DOMAIN;
  return httpRequest;
};

const makeSignedRequest = async (httpRequest) => {
  const client = new NodeHttpHandler();
  const { response } = await client.handle(httpRequest);
  return response;
};

updatedSignedRequestExample();
Hope that helps.
I am using the Node.js ws library to listen to events in user accounts on a 3rd-party API. For each user, I open a websocket to listen to the events in that user's account.
Turns out, the 3rd-party API doesn't provide a userID for each event, so if I have 10 websocket connections to user-accounts, I cannot determine which account an event came from.
I have access to a unique userId prior to starting each of my connections.
Is there a way to attach or wrap the userId identifier to each websocket connection I make, such that when I receive an event I can access the custom identifier and know which user's account the event came from?
The code below is a mix of real code and pseudocode (i.e. customSocket):
const ws = new WebSocket('wss://thirdparty-api.com/accounts', {
  port: 8080,
});

ws.send(
  JSON.stringify({
    action: 'authenticate',
    data: {
      oauth_token: access_token,
    },
  })
);

// wrap and attach data here (pseudocode at top-level)
customSocket.add({userId,
  ws.send(
    JSON.stringify({
      action: 'listen',
      data: {
        streams: ['action_updates'],
      },
    })
  )
})

// listen for wrapper data here, pseudocode at top level
customSocket.emit((customData) {
  ws.on('message', function incoming(data) {
    console.log('incoming -> data', data.toString());
  })
  console.log('emit -> customData', customData);
})
Looking at the socket.io library, the namespace feature may solve for this, but I can't determine if that's true or not. Below is an example in their documentation:
// your application has multiple tenants so you want to dynamically create one namespace per tenant
const workspaces = io.of(/^\/\w+$/);

workspaces.on('connection', socket => {
  const workspace = socket.nsp;
  workspace.emit('hello');
});

// this middleware will be assigned to each namespace
workspaces.use((socket, next) => {
  // ensure the user has access to the workspace
  next();
});
I found a solution to this which is fairly simple. First create a message handler function:
const eventHandler = (uid, msg) => {
  console.log(`${uid} did ${msg}`);
};
Then, when you create the websocket for the given user, wrap the .on event with the handler:
const createSocketForUser = (uid, eventHandler) => {
  const socket = new WebSocket(/* ... */);
  socket.onmessage = (msg) => {
    eventHandler(uid, msg);
  };
  return socket;
};
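For illustration, using it for several users could look like this (the user ids are made up):

// One connection per user; every incoming message is tagged with its uid.
const sockets = ['user-1', 'user-2'].map((uid) =>
  createSocketForUser(uid, eventHandler)
);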
My C# WinForms application tries to communicate with a Node.js server over socket.io.
The client connects to the server, but the socket.emit and socket.on values do not get through normally.
I'd like to find a solution to this.
I would like to send the client's name to the server as JSON data, receive JSON data from the server, read it, and send data back as JSON.
Since the socket.emit and socket.on data were not working properly, that code has been deleted.
C# code:
private void socketLogin(string email, string pw)
{
    var socket = IO.Socket("http://localhost:3000/login.html");

    socket.On(Socket.EVENT_CONNECT, () =>
    {
    });

    var loginjson = new JObject();
    loginjson.Add("email", email);
    loginjson.Add("password", pw);

    socket.Emit("socketlogin", loginjson.ToString());

    socket.On("login", (data) => {
        MessageBox.Show(data.ToString());
    });
}
Node.js code:
var server = require('http').Server(app);
var io = require('socket.io')(server);

io.on('connection', function(socket) {
  console.log('connection');

  socket.on('socketlogin', function(data) {
    var testLogin = { 'Login': "success" };
    socket.emit('login', data);
  });
});

server.listen(app.get('3000'))
In your C# code you are creating your socket inside a function, but at the end of the function the socket is thrown away because it is only a local variable.
There are many ways to deal with this, but essentially what you want to do is keep the socket in a longer-lived scope, use a thread to handle the socket communication, and dispatch results back to your UI thread.
I'm looking into implementing a "subscription" type using server-sent events as the backing API.
What I'm struggling with is the interface, or more precisely, the HTTP layer of such an operation.
The problem:
The native EventSource does not support:
Specifying an HTTP method; "GET" is used by default.
Including a payload (the GraphQL query).
While #1 is irrefutable, #2 can be circumvented using query parameters, as sketched below. However, query parameters have a limit of roughly 2000 characters (this can be debated), which makes relying solely on them feel too fragile.
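For illustration, circumventing #2 with the native EventSource would look roughly like this (a sketch; the /graphql endpoint and SSE support on the server are assumptions):

// Pack the GraphQL operation into the URL, since EventSource only supports
// GET requests and has no request body.
const query = 'subscription { status(id: "123") { ready } }';
const params = new URLSearchParams({
  query,
  variables: JSON.stringify({}),
});

const source = new EventSource('/graphql?' + params.toString());
source.onmessage = (event) => console.log(JSON.parse(event.data));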
The solution I'm thinking of is to create a dedicated end-point for each possible event.
For example: A URI for an event representing a completed transaction between parties:
/graphql/transaction-status/$ID
will translate to this query on the server:
subscription TransactionStatusSubscription {
  status(id: $ID) {
    ready
  }
}
The issues with this approach are:
A handler for each URI-to-GraphQL translation has to be added.
A new version of the server has to be deployed for each new event.
Loss of the flexibility offered by GraphQL -> the client should control the query.
All the end-points have to be tracked in the code base (back-end, front-end, mobile).
There are probably more issues I'm missing.
Is there perhaps a better approach that you can think of?
One that would allow a better way of providing the request payload when using EventSource?
Subscriptions in GraphQL are normally implemented using WebSockets, not SSE. Both Apollo and Relay support using subscriptions-transport-ws client-side to listen for events. Apollo Server includes built-in support for subscriptions using WebSockets. If you're just trying to implement subscriptions, it would be better to utilize one of these existing solutions.
That said, there's a library for utilizing SSE for subscriptions here. It doesn't look like it's maintained anymore, but you can poke around the source code to get some ideas if you're bent on trying to get SSE to work. Looking at the source, it looks like the author got around the limitations you mention above by initializing each subscription with a POST request that returns a subscription id.
As of now there are multiple packages for GraphQL subscriptions over SSE.
graphql-sse
Provides both a client and a server for using GraphQL subscriptions over SSE. This package has a dedicated handler for subscriptions.
Here is an example usage with express.
import express from 'express'; // yarn add express
import { createHandler } from 'graphql-sse';
// Create the GraphQL over SSE handler
const handler = createHandler({ schema });
// Create an express app serving all methods on `/graphql/stream`
const app = express();
app.use('/graphql/stream', handler);
app.listen(4000);
console.log('Listening to port 4000');
@graphql-sse/server
Provides a server handler for GraphQL subscriptions. However, the HTTP handling is up to you, depending on the framework you use.
Disclaimer: I am the author of the @graphql-sse packages.
Here is an example with express.
import express, { RequestHandler } from "express";
import {
  getGraphQLParameters,
  processSubscription,
  RESULT_TYPE, // assumed to be exported by the package, since it is used below
} from "@graphql-sse/server";
import { schema } from "./schema";

const app = express();
app.use(express.json());

app.post(path, async (req, res, next) => {
  const request = {
    body: req.body,
    headers: req.headers,
    method: req.method,
    query: req.query,
  };
  const { operationName, query, variables } = getGraphQLParameters(request);
  if (!query) {
    return next();
  }
  const result = await processSubscription({
    operationName,
    query,
    variables,
    request: req,
    schema,
  });
  if (result.type === RESULT_TYPE.NOT_SUBSCRIPTION) {
    return next();
  } else if (result.type === RESULT_TYPE.ERROR) {
    result.headers.forEach(({ name, value }) => res.setHeader(name, value));
    res.status(result.status);
    res.json(result.payload);
  } else if (result.type === RESULT_TYPE.EVENT_STREAM) {
    res.writeHead(200, {
      'Content-Type': 'text/event-stream',
      Connection: 'keep-alive',
      'Cache-Control': 'no-cache',
    });
    result.subscribe((data) => {
      res.write(`data: ${JSON.stringify(data)}\n\n`);
    });
    req.on('close', () => {
      result.unsubscribe();
    });
  }
});
Clients
The two packages mentioned above have companion clients. Because of the limitations of the EventSource API, both packages implement a custom client that provides options for sending HTTP headers and a POST payload, which the EventSource API does not support. graphql-sse comes together with its client, while @graphql-sse/server has companion clients in separate packages.
graphql-sse client example
import { createClient } from 'graphql-sse';

const client = createClient({
  // singleConnection: true, use "single connection mode" instead of the default "distinct connection mode"
  url: 'http://localhost:4000/graphql/stream',
});

// query
const result = await new Promise((resolve, reject) => {
  let result;
  client.subscribe(
    {
      query: '{ hello }',
    },
    {
      next: (data) => (result = data),
      error: reject,
      complete: () => resolve(result),
    },
  );
});

// subscription
const onNext = () => {
  /* handle incoming values */
};

let unsubscribe = () => {
  /* complete the subscription */
};

await new Promise((resolve, reject) => {
  unsubscribe = client.subscribe(
    {
      query: 'subscription { greetings }',
    },
    {
      next: onNext,
      error: reject,
      complete: resolve,
    },
  );
});
@graphql-sse/client
A companion of the @graphql-sse/server package.
Example
import {
  SubscriptionClient,
  SubscriptionClientOptions,
} from '@graphql-sse/client';

const subscriptionClient = SubscriptionClient.create({
  graphQlSubscriptionUrl: 'http://some.host/graphl/subscriptions'
});

const subscription = subscriptionClient.subscribe(
  {
    query: 'subscription { greetings }',
  }
);

const onNext = () => {
  /* handle incoming values */
};
const onError = () => {
  /* handle incoming errors */
};

subscription.subscribe(onNext, onError);
@graphql-sse/apollo-client
A companion package of the @graphql-sse/server package for Apollo Client.
import { split, HttpLink, ApolloClient, InMemoryCache } from '@apollo/client';
import { getMainDefinition } from '@apollo/client/utilities';
import { ServerSentEventsLink } from '@graphql-sse/apollo-client';

const httpLink = new HttpLink({
  uri: 'http://localhost:4000/graphql',
});

const sseLink = new ServerSentEventsLink({
  graphQlSubscriptionUrl: 'http://localhost:4000/graphql',
});

const splitLink = split(
  ({ query }) => {
    const definition = getMainDefinition(query);
    return (
      definition.kind === 'OperationDefinition' &&
      definition.operation === 'subscription'
    );
  },
  sseLink,
  httpLink
);

export const client = new ApolloClient({
  link: splitLink,
  cache: new InMemoryCache(),
});
If you're using Apollo, they support automatic persisted queries (abbreviated APQ in the docs). If you're not using Apollo, the implementation shouldn't be too bad in any language. I'd recommend following their conventions just so your clients can use Apollo if they want.
The first time any client makes an EventSource request with a hash of the query, it'll fail, then retry the request with the full payload to a regular GraphQL endpoint. If APQ is enabled on the server, subsequent GET requests from all clients with query parameters will execute as planned.
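A rough client-side sketch of that flow (assuming an Apollo-style APQ server and a GET /graphql endpoint that can answer with text/event-stream):

// Hash the query, register it once via a regular POST, then open the
// EventSource with only the hash and variables in the query string.
const sha256Hex = async (text) => {
  const digest = await crypto.subtle.digest('SHA-256', new TextEncoder().encode(text));
  return [...new Uint8Array(digest)].map((b) => b.toString(16).padStart(2, '0')).join('');
};

const subscribeWithApq = async (query, variables = {}) => {
  const extensions = { persistedQuery: { version: 1, sha256Hash: await sha256Hex(query) } };

  // First request: send the full payload so the server learns the hash.
  await fetch('/graphql', {
    method: 'POST',
    headers: { 'content-type': 'application/json' },
    body: JSON.stringify({ query, variables, extensions }),
  });

  // Subsequent requests (from any client) only need the hash.
  const params = new URLSearchParams({
    variables: JSON.stringify(variables),
    extensions: JSON.stringify(extensions),
  });
  return new EventSource('/graphql?' + params.toString());
};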
Once you've solved that problem, you just have to make a server-sent events transport for GraphQL (should be easy considering the subscribe function just returns an AsyncIterator)
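A minimal sketch of such a transport with express and graphql-js (the schema, endpoint and lack of validation are simplifications):

// Execute a subscription with graphql-js and forward the resulting
// AsyncIterator as server-sent events.
const express = require('express');
const { subscribe, parse } = require('graphql');
const { schema } = require('./schema'); // placeholder

const app = express();

app.get('/graphql/stream', async (req, res) => {
  res.writeHead(200, {
    'Content-Type': 'text/event-stream',
    'Cache-Control': 'no-cache',
    Connection: 'keep-alive',
  });

  const result = await subscribe({
    schema,
    document: parse(req.query.query),
    variableValues: JSON.parse(req.query.variables || '{}'),
  });

  if (!result[Symbol.asyncIterator]) {
    // Validation or execution errors come back as a plain result.
    res.write(`data: ${JSON.stringify(result)}\n\n`);
    return res.end();
  }

  // Stop the iterator if the client disconnects.
  req.on('close', () => result.return && result.return());

  for await (const payload of result) {
    res.write(`data: ${JSON.stringify(payload)}\n\n`);
  }
  res.end();
});

app.listen(4000);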
I'm looking into doing this at my company because some frontend developers like how easy EventSource is to deal with.
There are two things at play here: the SSE connection and the GraphQL endpoint. The endpoint has a spec to follow, so just returning SSE from a subscription request is not done and needs a GET request anyway. So the two have to be separate.
How about letting the client open an SSE channel via /graphql-sse, which creates a channel token? Using this token the client can then request subscriptions, and the events will arrive via the chosen channel.
The token could be sent as the first event on the SSE channel, and to pass the token to the query, it can be provided by the client in a cookie, a request header or even an unused query variable.
Alternatively, the server can store the last opened channel in session storage (limiting the client to a single channel).
If no channel is found, the query fails. If the channel closes, the client can open it again, and either pass the token in the query string/cookie/header or let the session storage handle it.
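A very rough sketch of that channel-token idea with express (everything here, including the header name and the in-memory channel map, is hypothetical):

const express = require('express');
const crypto = require('crypto');

const app = express();
app.use(express.json());

const channels = new Map(); // channel token -> open SSE response

app.get('/graphql-sse', (req, res) => {
  const token = crypto.randomUUID();
  res.writeHead(200, {
    'Content-Type': 'text/event-stream',
    'Cache-Control': 'no-cache',
    Connection: 'keep-alive',
  });
  channels.set(token, res);
  // First event on the channel is the token itself.
  res.write(`event: channel\ndata: ${token}\n\n`);
  req.on('close', () => channels.delete(token));
});

app.post('/graphql', (req, res) => {
  // Token passed in a header; a cookie or query variable would work the same way.
  const channel = channels.get(req.header('x-sse-channel'));
  if (!channel) {
    return res.status(400).json({ errors: [{ message: 'No open SSE channel' }] });
  }
  // Here the subscription would be executed and each result pushed to the channel:
  channel.write(`data: ${JSON.stringify({ data: { greetings: 'hi' } })}\n\n`);
  res.status(204).end();
});

app.listen(4000);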
I'm developing a Dart application that will need authentication and session control. I'm trying shelf_auth for that, but the examples don't seem to work or, more likely, I'm not implementing them the right way.
In short, this is what I want to happen:
An user opens the application on the browser.
The user enters the login information (login and password), which are POSTED to the server.
If the provided information is valid, the application generates a session code that is passed to the client and stored on the DB (server-side). This code will be sent with every transaction to the server-side.
The package shelf_auth has some examples, but I don't know which one to follow. So my question is: how could I do that with shelf_auth? I'm not asking anyone to code this for me, just to point me in the right direction.
EDIT: The example I was trying out was this: example_with_login_and_jwt_session.dart. It seems to be lacking CORS headers (this question helped me fix that) and, even when providing valid information, it responds "Unauthorized".
This is how I'm POSTing the information:
import "dart:html";
void main() {
Map _queryParameters = {
"username": "fred",
"password": "blah"
};
var _button = querySelector("#login_button");
_button.onClick.listen((MouseEvent e) {
e.preventDefault();
var requisition = new HttpRequest();
Uri uri = new Uri(path: "http://localhost:8080/login", queryParameters: _queryParameters);
requisition.open("POST", uri.toString());
requisition.setRequestHeader("content-type", "application/x-www-form-urlencoded");
requisition.onLoadEnd.listen((_) {
print(requisition.response.toString());
});
requisition.send();
});
}
I got it working with this client code
import "dart:html";
void main() {
Map _queryParameters = {"username": "fred", "password": "blah"};
var _button = querySelector("#login_button");
_button.onClick.listen((MouseEvent e) async {
e.preventDefault();
var requisition = new HttpRequest();
Uri uri = new Uri(
path: "http://localhost:8080/login/");
requisition.onLoadEnd.listen((_) {
print(requisition.response.toString());
});
HttpRequest request = await HttpRequest.postFormData(
"http://localhost:8080/login/", _queryParameters
//,withCredentials: true
);
print(request.response);
});
}
The example server expects the credentials in the request body instead of in query parameters. Setting withCredentials: true would make authentication cookies be sent with the request, but it worked here without it.