This is meant to run as the handler for an AJAX request:
app.use(route.post('/ajax_request', function (ctx) {
  var p = new Promise(function (res) {
    res('Some result to be received as AJAX resp');
  });
  p.then(function (val) {
    ctx.body = val; // resolved after the response is already sent
  });
}));
So how can I send asynchronously received data (wrapped in a Promise in this case) back to the client as the AJAX response?
If you're using Koa 2, you should be using async functions for your middleware. If you're not familiar with using async/await or other ES6+ features with Koa 2, I would recommend learning how to use Babel to transpile your code so that you can use them. If you don't want a transpilation step, you should currently just use Koa 1.x.
In Koa 2, the code would look like this:
app.use(route.post('/ajax_request', async (ctx) => {
  let p = new Promise(function (res) {
    res('Some result to be received as AJAX resp');
  });
  let val = await p;
  ctx.body = val;
}));
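For completeness, under Koa 1.x (which needs no transpilation step) the same thing can be written with a generator function and yield. A minimal sketch, assuming the same koa-route setup as above:
app.use(route.post('/ajax_request', function* () {
  // In Koa 1.x middleware, `this` is the context and yield awaits the promise.
  this.body = yield new Promise(function (res) {
    res('Some result to be received as AJAX resp');
  });
}));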
I'm currently working on capturing cryptocurrency price information from OKEX. I found that when a page navigates to https://www.okex.com/trade-spot/eth-usdt, it opens a WebSocket named public and sends subscription messages over it. Data corresponding to those subscriptions then flows back through the same WebSocket.
My question is: in addition to passively inspecting the data flowing through this WebSocket, is there any way to control it, namely to send subscriptions? If so, can this approach be automated (with Puppeteer or something equivalent)?
After some research and learning, I solved this using evaluateHandle() and queryObjects(). Note, however, that communication over the selected WebSockets must happen in the page (DOM) context.
const puppeteer = require('puppeteer');

// Send an unsubscribe message through the page's own WebSocket.
async function unsubscribe(page, wsHandle) {
  await page.evaluate(arr => {
    arr[0].send(JSON.stringify({
      op: 'unsubscribe',
      args: [
        { channel: 'instruments', instType: 'SPOT' },
        { channel: 'instruments', instType: 'FUTURES' },
        { channel: 'instruments', instType: 'SWAP' },
        { channel: 'instruments', instType: 'OPTION' },
        { channel: 'tickers', instId: 'ETH-USDT' },
        { channel: 'cup-tickers-3s', ccy: 'USDT' },
        { channel: 'mark-price', instId: 'ETH-USDT' },
        { channel: 'index-tickers', instId: 'ETH-USDT' },
        { channel: 'itn-status' },
        { channel: 'optimized-books', instId: 'ETH-USDT' },
        { channel: 'trades', instId: 'ETH-USDT' },
      ],
    }));
  }, wsHandle);
}

// Keep the connection alive with the exchange's ping message.
async function ping(page, wsHandle) {
  await page.evaluate(arr => {
    arr[0].send('ping');
  }, wsHandle);
}

async function main() {
  const browser = await puppeteer.launch({
    headless: false,
    args: ['--auto-open-devtools-for-tabs'],
  });
  const page = (await browser.pages())[0];
  page.setDefaultNavigationTimeout(0);
  await page.goto('https://www.okex.com/trade-spot/eth-usdt');

  // Grab every live WebSocket instance on the page via its prototype.
  const wsHandle = await page.evaluateHandle(() => WebSocket.prototype);
  const ws = await page.queryObjects(wsHandle);

  await unsubscribe(page, ws);
  setTimeout(() => {
    for (let i = 0; i < 10; i++) ping(page, ws);
  }, 20000);
}

main();
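Sending a subscription works the same way: evaluate a send() call against one of the captured WebSocket objects. A sketch only; the payload mirrors the unsubscribe message above and the channel list is just an example:
async function subscribe(page, wsHandle) {
  await page.evaluate(arr => {
    // arr holds the live WebSocket instances returned by queryObjects().
    arr[0].send(JSON.stringify({
      op: 'subscribe',
      args: [{ channel: 'tickers', instId: 'ETH-USDT' }],
    }));
  }, wsHandle);
}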
I'm looking into implementing a "subscription" type using server-sent events as the backing API.
What I'm struggling with is the interface, or more precisely, the HTTP layer of such an operation.
The problem: the native EventSource does not support:
1. Specifying an HTTP method; "GET" is used by default.
2. Including a payload (the GraphQL query).
While #1 is irrefutable, #2 can be circumvented using query parameters. However, query parameters have a practical limit of around 2,000 characters (debatable), which makes relying solely on them feel too fragile.
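For illustration, circumventing #2 with query parameters would look roughly like this on the client (a sketch; the endpoint path and the query are placeholders):
// The GraphQL document (and variables) travel URL-encoded in the query string,
// because EventSource can only issue plain GET requests.
const query = 'subscription { somethingChanged { id } }';
const url =
  '/graphql/stream?query=' + encodeURIComponent(query) +
  '&variables=' + encodeURIComponent(JSON.stringify({}));
const source = new EventSource(url);
source.onmessage = (event) => console.log(JSON.parse(event.data));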
The solution I'm thinking of is to create a dedicated endpoint for each possible event.
For example, a URI for an event representing a completed transaction between parties:
/graphql/transaction-status/$ID
will translate to this query on the server:
subscription TransactionStatusSubscription {
  status(id: $ID) {
    ready
  }
}
The issues with this approach are:
- A handler has to be added for every URI-to-GraphQL translation.
- Each new event requires deploying a new version of the server.
- The flexibility offered by GraphQL is lost: the client should control the query.
- All the endpoints have to be tracked across the code base (back-end, front-end, mobile).
There are probably more issues I'm missing.
Is there perhaps a better approach that you can think of? One that would allow providing the request payload while still using EventSource?
Subscriptions in GraphQL are normally implemented using WebSockets, not SSE. Both Apollo and Relay support using subscriptions-transport-ws client-side to listen for events. Apollo Server includes built-in support for subscriptions using WebSockets. If you're just trying to implement subscriptions, it would be better to utilize one of these existing solutions.
That said, there's a library for utilizing SSE for subscriptions here. It doesn't look like it's maintained anymore, but you can poke around the source code to get some ideas if you're bent on trying to get SSE to work. Looking at the source, it looks like the author got around the limitations you mention above by initializing each subscription with a POST request that returns a subscription id.
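In rough terms, that POST-then-listen pattern looks like this on the client (a sketch only; the paths and the subscriptionId field are illustrative, not that library's actual API):
async function subscribeOverSSE(query, variables) {
  // 1. Register the subscription with a normal POST carrying the full GraphQL payload.
  const res = await fetch('/graphql/subscriptions', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ query, variables }),
  });
  const { subscriptionId } = await res.json();

  // 2. Listen for that subscription's events on a GET endpoint keyed by the id,
  //    which plain EventSource can handle.
  const source = new EventSource(`/graphql/subscriptions/${subscriptionId}`);
  source.onmessage = (event) => console.log(JSON.parse(event.data));
  return source;
}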
As of now there are multiple packages for GraphQL subscriptions over SSE.
graphql-sse
Provides both a client and a server for GraphQL subscriptions over SSE. This package has a dedicated handler for subscriptions.
Here is an example usage with Express.
import express from 'express'; // yarn add express
import { createHandler } from 'graphql-sse';
// Create the GraphQL over SSE handler
const handler = createHandler({ schema });
// Create an express app serving all methods on `/graphql/stream`
const app = express();
app.use('/graphql/stream', handler);
app.listen(4000);
console.log('Listening to port 4000');
@graphql-sse/server
Provides a server handler for GraphQL subscriptions. However, the HTTP handling is up to you, depending on the framework you use.
Disclaimer: I am the author of the @graphql-sse packages.
Here is an example with Express.
import express from "express";
import {
  getGraphQLParameters,
  processSubscription,
  RESULT_TYPE, // assumed to be exported by the package alongside the helpers above
} from "@graphql-sse/server";
import { schema } from "./schema";

const app = express();
app.use(express.json());

const path = "/graphql"; // any route of your choosing

app.post(path, async (req, res, next) => {
  const request = {
    body: req.body,
    headers: req.headers,
    method: req.method,
    query: req.query,
  };
  const { operationName, query, variables } = getGraphQLParameters(request);
  if (!query) {
    return next();
  }
  const result = await processSubscription({
    operationName,
    query,
    variables,
    request: req,
    schema,
  });
  if (result.type === RESULT_TYPE.NOT_SUBSCRIPTION) {
    return next();
  } else if (result.type === RESULT_TYPE.ERROR) {
    result.headers.forEach(({ name, value }) => res.setHeader(name, value));
    res.status(result.status);
    res.json(result.payload);
  } else if (result.type === RESULT_TYPE.EVENT_STREAM) {
    res.writeHead(200, {
      'Content-Type': 'text/event-stream',
      Connection: 'keep-alive',
      'Cache-Control': 'no-cache',
    });
    result.subscribe((data) => {
      res.write(`data: ${JSON.stringify(data)}\n\n`);
    });
    req.on('close', () => {
      result.unsubscribe();
    });
  }
});
Clients
The two packages mentioned above have companion clients. Because of the limitations of the EventSource API, both implement a custom client that supports sending HTTP headers and a POST payload, which the EventSource API does not. graphql-sse ships with its own client, while @graphql-sse/server has companion clients in separate packages.
graphql-sse client example
import { createClient } from 'graphql-sse';

const client = createClient({
  // singleConnection: true, use "single connection mode" instead of the default "distinct connection mode"
  url: 'http://localhost:4000/graphql/stream',
});

// query
const result = await new Promise((resolve, reject) => {
  let result;
  client.subscribe(
    {
      query: '{ hello }',
    },
    {
      next: (data) => (result = data),
      error: reject,
      complete: () => resolve(result),
    },
  );
});

// subscription
const onNext = () => {
  /* handle incoming values */
};

let unsubscribe = () => {
  /* complete the subscription */
};

await new Promise((resolve, reject) => {
  unsubscribe = client.subscribe(
    {
      query: 'subscription { greetings }',
    },
    {
      next: onNext,
      error: reject,
      complete: resolve,
    },
  );
});
@graphql-sse/client
A companion of @graphql-sse/server.
Example
import {
  SubscriptionClient,
  SubscriptionClientOptions,
} from '@graphql-sse/client';

const subscriptionClient = SubscriptionClient.create({
  graphQlSubscriptionUrl: 'http://some.host/graphl/subscriptions'
});

const subscription = subscriptionClient.subscribe(
  {
    query: 'subscription { greetings }',
  }
);

const onNext = () => {
  /* handle incoming values */
};
const onError = () => {
  /* handle incoming errors */
};

subscription.subscribe(onNext, onError);
@graphql-sse/apollo-client
A companion package of the @graphql-sse/server package for Apollo Client.
import { split, HttpLink, ApolloClient, InMemoryCache } from '@apollo/client';
import { getMainDefinition } from '@apollo/client/utilities';
import { ServerSentEventsLink } from '@graphql-sse/apollo-client';

const httpLink = new HttpLink({
  uri: 'http://localhost:4000/graphql',
});

const sseLink = new ServerSentEventsLink({
  graphQlSubscriptionUrl: 'http://localhost:4000/graphql',
});

const splitLink = split(
  ({ query }) => {
    const definition = getMainDefinition(query);
    return (
      definition.kind === 'OperationDefinition' &&
      definition.operation === 'subscription'
    );
  },
  sseLink,
  httpLink
);

export const client = new ApolloClient({
  link: splitLink,
  cache: new InMemoryCache(),
});
If you're using Apollo, they support automatic persisted queries (abbreviated APQ in the docs). If you're not using Apollo, the implementation shouldn't be too bad in any language. I'd recommend following their conventions just so your clients can use Apollo if they want.
The first time any client makes an EventSource request with a hash of the query, it will fail; the client then retries the request with the full payload against a regular GraphQL endpoint. Once APQ is enabled on the server, subsequent GET requests from all clients using query parameters will execute as planned.
Once you've solved that problem, you just have to make a server-sent events transport for GraphQL (which should be straightforward, considering the subscribe function just returns an AsyncIterator).
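For example, a minimal sketch of such a transport over graphql-js and Express (the route, parameter names, and the ./schema import are assumptions):
const express = require('express');
const { parse, subscribe } = require('graphql');
const { schema } = require('./schema'); // assumed to exist

const app = express();

app.get('/graphql/stream', async (req, res) => {
  res.writeHead(200, {
    'Content-Type': 'text/event-stream',
    'Cache-Control': 'no-cache',
    Connection: 'keep-alive',
  });

  // With EventSource, the operation has to arrive via query parameters (or an APQ hash).
  const result = await subscribe({
    schema,
    document: parse(req.query.query),
    variableValues: req.query.variables ? JSON.parse(req.query.variables) : undefined,
  });

  if (Symbol.asyncIterator in result) {
    // Subscriptions resolve to an AsyncIterator; forward each payload as an SSE event.
    for await (const payload of result) {
      res.write(`data: ${JSON.stringify(payload)}\n\n`);
    }
  } else {
    // Validation or execution errors come back as a single ExecutionResult.
    res.write(`data: ${JSON.stringify(result)}\n\n`);
  }
  res.end();
});

app.listen(4000);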
I'm looking into doing this at my company because some frontend developers like how easy EventSource is to deal with.
There are two things at play here: the SSE connection and the GraphQL endpoint. The endpoint has a spec to follow, so simply returning SSE from a subscription request isn't an option (it would need a GET request anyway), so the two have to be separate.
How about letting the client open an SSE channel via /graphql-sse, which creates a channel token. Using this token the client can then request subscriptions and the events will arrive via the chosen channel.
The token could be sent as the first event on the SSE channel, and to pass the token to the query, it can be provided by the client in a cookie, a request header or even an unused query variable.
Alternatively, the server can store the last opened channel in session storage (limiting the client to a single channel).
If no channel is found, the query fails. If the channel closes, the client can open it again, and either pass the token in the query string/cookie/header or let the session storage handle it.
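A rough sketch of that channel-token idea in Express (the routes, the header name, and the token mechanism are all assumptions for illustration):
const express = require('express');
const crypto = require('crypto');

const app = express();
app.use(express.json());

const channels = new Map(); // token -> open SSE response stream

// 1. The client opens the SSE channel; the token is sent as the first event.
app.get('/graphql-sse', (req, res) => {
  res.writeHead(200, {
    'Content-Type': 'text/event-stream',
    'Cache-Control': 'no-cache',
    Connection: 'keep-alive',
  });
  const token = crypto.randomUUID();
  channels.set(token, res);
  res.write(`event: token\ndata: ${token}\n\n`);
  req.on('close', () => channels.delete(token));
});

// 2. Subscription requests reference the channel via a header (or cookie/query variable).
app.post('/graphql', (req, res) => {
  const channel = channels.get(req.get('x-channel-token'));
  if (!channel) {
    return res.status(400).json({ errors: [{ message: 'No open channel' }] });
  }
  // ...execute the subscription and push each result to the channel, e.g.:
  // channel.write(`data: ${JSON.stringify(result)}\n\n`);
  res.status(202).end();
});

app.listen(4000);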
I'm trying to handle the redirect URL for distributing my Slack app (OAuth 2.0) with API Gateway and a Lambda function (AWS), but I can't figure out how to get the code.
The event that comes back is null.
My Lambda code:
// Lambda handler
const https = require('https');
const qs = require('querystring'); // or the 'qs' package

exports.handler = (event, context, callback) => {
  var messageTest = {
    client_id: CLIENT_ID,
    client_secret: CLIENT_SECRET,
    code: event.code
  };
  var queryTest = qs.stringify(messageTest);
  https.get(`https://slack.com/api/oauth.access?${queryTest}`, (res) => {
    console.log("statusCode: ", res.statusCode);
    console.log("headers: ", res.headers);
    var data = [];
    res.on('data', function (chunk) {
      data.push(chunk);
    });
    res.on('end', function () {
      var result = JSON.parse(data.join(''));
      console.log(result);
    });
  });
  callback(null);
};
My redirect URL is the Lambda URL.
The event that I get is null.
How can I get the "code" from OAuth 2.0?
Assuming you are using Lambda Proxy integration (and therefore you don't use a Body Mapping Template), the JSON payload that you send to your API Gateway will be received by your Lambda as stringified JSON in event.body.
So you'll need to parse that first, and then you can get your code.
const body = JSON.parse(event.body)
const code = body.code
Reference: Input Format of a Lambda Function for Proxy Integration
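Putting that together, a minimal sketch of the handler under proxy integration (the response shape is what API Gateway expects; the fallback to queryStringParameters is an assumption for the case where the OAuth redirect arrives as a GET request):
exports.handler = (event, context, callback) => {
  // event.body is a JSON string under proxy integration; it may be null for GET requests.
  const body = event.body ? JSON.parse(event.body) : {};
  const code =
    body.code ||
    (event.queryStringParameters && event.queryStringParameters.code);

  // ...exchange `code` with https://slack.com/api/oauth.access as in the question...

  callback(null, {
    statusCode: 200,
    body: JSON.stringify({ received: Boolean(code) }),
  });
};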
Here is my code:
var socket = require('socket.io-client')('http://127.0.0.1:3000/printers', {
  query: "shop=" + "123456",
  transports: ["websocket"]
});
If I delete the query, I can connect to the socket. Where am I going wrong?
There doesn't seem to be anything wrong with your client-side code. I can connect by copying and pasting your code.
I suspect the problem is within your server-side code. Here is an example I am using with your client-side code:
var http = require('http');
var io = require('socket.io');

var server = http.createServer(function (req, res) {
  res.writeHead(200);
  res.end('Hello, World!\n');
});
server.listen(80);

var socket = io.listen(server);

// Middleware: inspect the handshake query before allowing the connection.
socket.use(function (socket, next) {
  console.log("Query: ", socket.handshake.query);
  if (socket.handshake.query.shop == '123456') {
    return next();
  }
  next(new Error('You cannot use that shop'));
});

socket.on('connection', function (client) {
  console.log('New Connection');
});
I'm able to obtain the query data in the socket.use function. If I don't call next(), the client never gets confirmation that the server has accepted the connection.
I recommend checking out the example used in this thread.
I am using Node.js. One of my functions (let's call it funcOne) receives some input which I pass to another function (let's call it funcTwo), which produces some output.
Before I pass the input to funcTwo, I need to make an AJAX call to an endpoint with that input, and then pass the output of the AJAX call to funcTwo. funcTwo should be called only when the AJAX call is successful.
How can I achieve this in Node.js? I wonder if the Q library can be utilized in this case.
Using request
function funcOne(input) {
  var request = require('request');
  request.post(someUrl, { json: true, body: input }, function (err, res, body) {
    if (!err && res.statusCode === 200) {
      funcTwo(body, function (err, output) {
        console.log(err, output);
      });
    }
  });
}

function funcTwo(input, callback) {
  // process input
  callback(null, input);
}
Edit: Since request is now deprecated, you can find alternatives here.
Since request is deprecated, I recommend working with axios.
npm install axios@0.16.2
const axios = require('axios');

axios.get('https://api.nasa.gov/planetary/apod?api_key=DEMO_KEY')
  .then(response => {
    console.log(response.data.url);
    console.log(response.data.explanation);
  })
  .catch(error => {
    console.log(error);
  });
Using the standard http library to make requests requires more effort to parse and extract the data. As someone who was used to making AJAX requests purely in Java/JavaScript, I found axios easy to pick up.
https://www.twilio.com/blog/2017/08/http-requests-in-node-js.html
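Applied to the original question, funcOne could use axios.post and only call funcTwo when the request succeeds (a sketch; someUrl is the same placeholder used in the earlier answer):
const axios = require('axios');

function funcOne(input) {
  // axios rejects on non-2xx responses, so the .then branch only runs on success.
  return axios.post(someUrl, input)
    .then(response => {
      funcTwo(response.data, function (err, output) {
        console.log(err, output);
      });
    })
    .catch(error => {
      console.log(error);
    });
}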