Next.js 13 - new fetch approach makes two HTTP requests instead of one

I'm figuring out how data fetching works in Next.js 13, and I noticed that this code makes two HTTP requests to the server instead of one when it revalidates and fetches fresh data. The requests go out right after each other.
async function getData() {
  const url = `${process.env.API_URL}/public/monitoring/websocket/`
  const res = await fetch(url, { next: { revalidate: 10 } })
  if (!res.ok) {
    throw new Error('Failed to fetch data')
  }
  return res.text()
}

export default async function Home() {
  const data = await getData()
  return (<div>{data}</div>)
}
Q: Is there any reason why there are two HTTP requests to the server?

Related

How to pass RESTDataSource response headers to the ApolloServer response header (express)

I have ApolloServer running: the frontend makes a query request, ApolloServer handles it and then performs a request with RESTDataSource to a third-party service, and I receive a response with a header.
Currently, ApolloServer only parses the body through the resolver and sends it back to the client.
I want to pass the received header on to the client as well, but I don't know how to do that at the RESTDataSource level since I don't have access to the Apollo response there.
I hope this explains the problem clearly enough.
export abstract class myClass extends RESTDataSource {
  getSomething() {
    const endpoint = this.endpointPath;
    return this.get(endpoint);
  }

  async didReceiveResponse<T>(response, request): Promise<T | null> {
    // these are the response headers I want sent back to the client
    console.log(response.headers);
    if (response.ok) {
      return this.parseBody(response) as any as Promise<T>;
    } else {
      throw await this.errorFromResponse(response);
    }
  }
}
In the ApolloServer initialization I have:
const apolloServer = new ApolloServer({
  context: async ({ res, req }) => {
    // these headers are not the same as the ones received from the getSomething() response above
    console.log(res.getHeaders())
  },
});
I solved the issue by passing the res to the context and accessing the response in the didReceiveResponse, then adding the headers needed.
Adding the response to the context:
const apolloServer = new ApolloServer({
  context: async ({ res, req }) => {
    return {
      res: res,
    };
  },
});
Using the response to append the headers to it:
async didReceiveResponse<T>(response, request): Promise<T | null> {
  // copy the headers from the REST response onto the outgoing Apollo response
  this.context.res.setHeader(
    "x-is-request-cached",
    response.headers.get("x-is-request-cached") ?? false
  );
  this.context.res.setHeader(
    "x-request-cached-time",
    response.headers.get("x-request-cached-time")
  );
  if (response.ok) {
    return (await this.parseBody(response)) as any as Promise<T>;
  } else {
    throw await this.errorFromResponse(response);
  }
}
By doing this you will achieve the desired outcome of passing the headers to the GraphQL client.

Next.js API resolving before form.parse completes, so I can't send a response back

I am trying to send an image to a Next.js API route and then upload that image to the database.
I am using:
const body = new FormData();
body.append("file", prewiedPP);
const response = await fetch("/api/send-pp-to-server", {
  method: "POST",
  body,
  headers: {
    iext: iExt,
    name: cCtx.userDetail,
  },
});
Then in the API:
async function handler(req, res) {
  if (req.method === "POST") {
    console.log("In");
    const form = new formidable.IncomingForm();
    form.parse(req, async (err, fields, files) => {
      // console.log(req.headers.iext);
      // console.log(req.headers.name);
      const fdata = fs.readFileSync(files.file.filepath);
      await delUserPP(req.headers.name, req.headers.iext);
      await setUserPP(
        fdata,
        req.headers.name,
        req.headers.iext,
        files.file.mimetype
      );
      fs.unlinkSync(files.file.filepath);
      return;
    });
    console.log("out");
  }
}
export default handler;
The callback function in form.parse runs after the handler has already resolved.
Is there any way to make the API call resolve only after the setUserPP function is done?
I want to send a response back to the client, but the API script finishes too "fast", before the callback in form.parse runs.
Thanks
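One common way to handle this (a sketch based on the code above, not part of the original post) is to wrap form.parse in a Promise and await it, so the handler only responds after the work inside the callback has finished; setUserPP, delUserPP and the header names are the asker's own identifiers:
async function handler(req, res) {
  if (req.method === "POST") {
    const form = new formidable.IncomingForm();
    // Resolve only once the parse callback (and the awaits inside it) complete
    await new Promise((resolve, reject) => {
      form.parse(req, async (err, fields, files) => {
        if (err) return reject(err);
        try {
          const fdata = fs.readFileSync(files.file.filepath);
          await delUserPP(req.headers.name, req.headers.iext);
          await setUserPP(fdata, req.headers.name, req.headers.iext, files.file.mimetype);
          fs.unlinkSync(files.file.filepath);
          resolve();
        } catch (e) {
          reject(e);
        }
      });
    });
    // Reached only after the upload has been processed
    res.status(200).json({ ok: true });
  }
}
export default handler;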

AWS.ApiGatewayManagementApi() postToConnection no initial response detected on client

I have a lambda function that returns a message to the client.
function replyToMessage(messageText, connectionId) {
  const data = { message: messageText }
  const params = {
    ConnectionId: connectionId,
    Data: Buffer.from(JSON.stringify(data))
  }
  return api.postToConnection(params).promise()
    .then(data => {})
    .catch(error => { console.log("error", error) })
}
This code is called once when the connection is made and I get a response to my client. When I call the function again with a different endpoint, it doesn't send a response to my client. However, when I call it a third time, I get the response to my client from the second call. Here's my switch when the Lambda function is called.
switch (route) {
  case "$connect":
    break
  case "$disconnect":
    break
  case "connectTo":
    await connectToService(JSON.parse(event.body).eventId, connectionId)
    await replyToMessage("Connected eventId to connId", connectionId)
    break
  case "disconnectFrom":
    await disConnectToService(JSON.parse(event.body).eventId, connectionId)
    break
  case "project":
    responseItems = await getBroadcastIds(JSON.parse(event.body).eventId, JSON.parse(event.body).sourceId, connectionId)
    console.log(responseItems)
    responseItems.Items.forEach(async function (item) {
      await replyToMessage(JSON.parse(event.body).sourceId, item.connectionId)
    })
    responseItems = []
    break
  default:
    console.log("Unknown route", route)
}
The issue appears to be the async forEach loop. Switching to the following resolves the issue.
for (const item of responseItems.Items) {
  console.log("Sending to:", item.connectionId);
  await replyToMessage(JSON.parse(event.body).sourceId, item.connectionId)
}
See this post for the answer that led to this resolution: Using async/await with a forEach loop.
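If the order of the sends doesn't matter, an alternative (a sketch, not part of the original answer) is to start them all and wait for them together with Promise.all:
const sourceId = JSON.parse(event.body).sourceId
// Start every postToConnection call, then wait for all of them to finish
await Promise.all(
  responseItems.Items.map(item => replyToMessage(sourceId, item.connectionId))
)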

I can't get a header from the backend in Vue.js

I have a Spring Boot backend that validates user login credentials. After validating the user it sends a login token in its response header. This part definitely works because I have seen it work in Postman.
Now I am trying to get the token into my Vue.js front end by doing the following:
import axios from 'axios'

const databaseUrl = 'http://localhost:9090/api'
const datbaseUrlBase = 'http://localhost:9090'

async function getSubjects() {
  const result = await axios.get(`${databaseUrl}/subject`)
  return result.data
}

async function updateSubject(subject) {
  let body = {
    "name": subject.name,
    "first_name": subject.first_name,
    "date_of_birth": subject.date_of_birth
  }
  let result = await axios.put(`${databaseUrl}/subject/${subject.subjectid}`, body)
  return result.data
}

async function getSubject(id) {
  let result = await axios.get(`${databaseUrl}/subject/${id}`)
  return result.data
}

async function getSimulationsForSubject(id) {
  let result = await axios.get(`${databaseUrl}/subject/${id}/simulation`)
  return result.data
}

async function deleteSubject(id) {
  await axios.delete(`${databaseUrl}/subject/${id}`)
}

async function makeSubject(subject) {
  await axios.post(`${databaseUrl}/subject`, subject)
}

async function updateDiagnose(diagnose, id) {
  await axios.put(`${databaseUrl}/subject/${id}/diagnose/${diagnose.diagnoseid}`, diagnose)
}

async function addSymptomToDiagnose(symptom, diagnoseid, subjectid) {
  await axios.post(`${databaseUrl}/subject/${subjectid}/diagnose/${diagnoseid}/symptom`, symptom)
}

async function updateSymptom(symptom_id, symptom, subjectid, diagnoseid) {
  await axios.put(`${databaseUrl}/subject/${subjectid}/diagnose/${diagnoseid}/symptom/${symptom_id}`, symptom)
}

async function getDiagnoseForSubject(diagnoseid, subjectid) {
  let result = await axios.get(`${databaseUrl}/subject/${subjectid}/diagnose/${diagnoseid}`)
  return result.data
}

async function deleteSymptomForDiagnose(subjectid, diagnoseid, symptomid) {
  await axios.delete(`${databaseUrl}/subject/${subjectid}/diagnose/${diagnoseid}/symptom/${symptomid}`)
}

async function getStatisticsForSimulation(subjectid, simulationid) {
  let result = await axios.get(`${databaseUrl}/subject/${subjectid}/simulation/${simulationid}/statistics`)
  return result.data
}

async function login(login) {
  let result = await axios.post(`${datbaseUrlBase}/login`, login)
  return result.headers
}

export default {
  getSubjects,
  updateSubject,
  getSubject,
  getSimulationsForSubject,
  deleteSubject,
  makeSubject,
  updateDiagnose,
  addSymptomToDiagnose,
  getDiagnoseForSubject,
  deleteSymptomForDiagnose,
  updateSymptom,
  getStatisticsForSimulation,
  login
}
Notice the login function above. Whenever I run this code the console.log gives undefined in the browser, and console.log(result.headers) shows only a few default headers (screenshot omitted).
Is there any way of accessing this token in my Vue.js front end?
If the server is cross-origin then browser CORS dictates that only a handful of default headers are accessible in a response.
You need to either have a matching origin, or enable the Access-Control-Expose-Headers header by setting it in your response like this:
Access-Control-Expose-Headers: token
https://developer.mozilla.org/en-US/docs/Web/HTTP/Headers/Access-Control-Expose-Headers
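As a sketch (assuming the backend really names the header token, as in the example above), once the header is exposed the Axios code can read it directly:
async function login(credentials) {
  const result = await axios.post(`${datbaseUrlBase}/login`, credentials)
  // Readable only once the backend also sends Access-Control-Expose-Headers: token
  return result.headers['token']
}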

GraphQL subscription using server-sent events & EventSource

I'm looking into implementing a "subscription" type using server-sent events as the backing api.
What I'm struggling with is the interface, to be more precise, the http layer of such operation.
The problem:
The native EventSource does not support:
1. Specifying an HTTP method; "GET" is always used.
2. Including a payload (the GraphQL query).
While #1 is irrefutable, #2 can be circumvented using query parameters, as sketched below. However, query parameters have a practical length limit of roughly 2,000 characters (this can be debated), which makes relying solely on them feel too fragile.
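For illustration (a sketch; the /graphql path and the payload shape are assumptions, not from the original question), passing the operation through query parameters looks like this:
const query = 'subscription { greetings }'
// EventSource only issues GET requests, so the operation has to ride in the URL
const source = new EventSource('/graphql?query=' + encodeURIComponent(query))
source.onmessage = (event) => {
  console.log(JSON.parse(event.data))
}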
The solution I'm thinking of is to create a dedicated end-point for each possible event.
For example: A URI for an event representing a completed transaction between parties:
/graphql/transaction-status/$ID
will translate to this query on the server:
subscription TransactionStatusSubscription {
  status(id: $ID) {
    ready
  }
}
The issues with this approach are:
A handler has to be added for each URI-to-GraphQL translation.
A new version of the server has to be deployed.
Loss of the flexibility offered by GraphQL -> the client should control the query.
All the end-points have to be tracked across the code base (back-end, front-end, mobile).
There are probably more issues I'm missing.
Is there perhaps a better approach that you can think of?
One that would allow a better way of providing the request payload when using EventSource?
Subscriptions in GraphQL are normally implemented using WebSockets, not SSE. Both Apollo and Relay support using subscriptions-transport-ws client-side to listen for events. Apollo Server includes built-in support for subscriptions using WebSockets. If you're just trying to implement subscriptions, it would be better to utilize one of these existing solutions.
That said, there's a library for utilizing SSE for subscriptions here. It doesn't look like it's maintained anymore, but you can poke around the source code to get some ideas if you're bent on trying to get SSE to work. Looking at the source, it looks like the author got around the limitations you mention above by initializing each subscription with a POST request that returns a subscription id.
As of now there are multiple packages for GraphQL subscriptions over SSE.
graphql-sse
Provides both a client and a server for GraphQL subscriptions over SSE. This package has a dedicated handler for subscriptions.
Here is an example usage with Express.
import express from 'express'; // yarn add express
import { createHandler } from 'graphql-sse';
// Create the GraphQL over SSE handler
const handler = createHandler({ schema });
// Create an express app serving all methods on `/graphql/stream`
const app = express();
app.use('/graphql/stream', handler);
app.listen(4000);
console.log('Listening to port 4000');
@graphql-sse/server
Provides a server handler for GraphQL subscriptions. However, the HTTP handling is up to you, depending on the framework you use.
Disclaimer: I am the author of the @graphql-sse packages.
Here is an example with Express.
import express, { RequestHandler } from "express";
import {
  getGraphQLParameters,
  processSubscription,
} from "@graphql-sse/server";
import { schema } from "./schema";

const app = express();
app.use(express.json());

app.post(path, async (req, res, next) => {
  const request = {
    body: req.body,
    headers: req.headers,
    method: req.method,
    query: req.query,
  };
  const { operationName, query, variables } = getGraphQLParameters(request);
  if (!query) {
    return next();
  }
  const result = await processSubscription({
    operationName,
    query,
    variables,
    request: req,
    schema,
  });
  if (result.type === RESULT_TYPE.NOT_SUBSCRIPTION) {
    return next();
  } else if (result.type === RESULT_TYPE.ERROR) {
    result.headers.forEach(({ name, value }) => res.setHeader(name, value));
    res.status(result.status);
    res.json(result.payload);
  } else if (result.type === RESULT_TYPE.EVENT_STREAM) {
    res.writeHead(200, {
      'Content-Type': 'text/event-stream',
      Connection: 'keep-alive',
      'Cache-Control': 'no-cache',
    });
    result.subscribe((data) => {
      res.write(`data: ${JSON.stringify(data)}\n\n`);
    });
    req.on('close', () => {
      result.unsubscribe();
    });
  }
});
Clients
The two packages mentioned above have companion clients. Because of the limitations of the EventSource API, both packages implement a custom client that provides options for sending HTTP headers and a POST payload, which the EventSource API does not support. graphql-sse ships with its own client, while @graphql-sse/server has companion clients in separate packages.
graphql-sse client example
import { createClient } from 'graphql-sse';

const client = createClient({
  // singleConnection: true, use "single connection mode" instead of the default "distinct connection mode"
  url: 'http://localhost:4000/graphql/stream',
});

// query
const result = await new Promise((resolve, reject) => {
  let result;
  client.subscribe(
    {
      query: '{ hello }',
    },
    {
      next: (data) => (result = data),
      error: reject,
      complete: () => resolve(result),
    },
  );
});

// subscription
const onNext = () => {
  /* handle incoming values */
};

let unsubscribe = () => {
  /* complete the subscription */
};

await new Promise((resolve, reject) => {
  unsubscribe = client.subscribe(
    {
      query: 'subscription { greetings }',
    },
    {
      next: onNext,
      error: reject,
      complete: resolve,
    },
  );
});
@graphql-sse/client
A companion of the @graphql-sse/server package.
Example
import {
  SubscriptionClient,
  SubscriptionClientOptions,
} from '@graphql-sse/client';

const subscriptionClient = SubscriptionClient.create({
  graphQlSubscriptionUrl: 'http://some.host/graphl/subscriptions'
});

const subscription = subscriptionClient.subscribe(
  {
    query: 'subscription { greetings }',
  }
)

const onNext = () => {
  /* handle incoming values */
};
const onError = () => {
  /* handle incoming errors */
};

subscription.subscribe(onNext, onError)
@graphql-sse/apollo-client
A companion package of the @graphql-sse/server package for Apollo Client.
import { split, HttpLink, ApolloClient, InMemoryCache } from '@apollo/client';
import { getMainDefinition } from '@apollo/client/utilities';
import { ServerSentEventsLink } from '@graphql-sse/apollo-client';

const httpLink = new HttpLink({
  uri: 'http://localhost:4000/graphql',
});

const sseLink = new ServerSentEventsLink({
  graphQlSubscriptionUrl: 'http://localhost:4000/graphql',
});

const splitLink = split(
  ({ query }) => {
    const definition = getMainDefinition(query);
    return (
      definition.kind === 'OperationDefinition' &&
      definition.operation === 'subscription'
    );
  },
  sseLink,
  httpLink
);

export const client = new ApolloClient({
  link: splitLink,
  cache: new InMemoryCache(),
});
If you're using Apollo, they support automatic persisted queries (abbreviated APQ in the docs). If you're not using Apollo, the implementation shouldn't be too bad in any language. I'd recommend following their conventions just so your clients can use Apollo if they want.
The first time any client makes an EventSource request with a hash of the query, it'll fail, then retry the request with the full payload to a regular GraphQL endpoint. If APQ is enabled on the server, subsequent GET requests from all clients with query parameters will execute as planned.
Once you've solved that problem, you just have to make a server-sent events transport for GraphQL (should be easy considering the subscribe function just returns an AsyncIterator)
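For illustration only (a sketch assuming Apollo's documented APQ GET format; the /graphql endpoint and the sha256Hex helper are placeholders, not from this answer), the EventSource URL for a persisted query could be built like this:
// sha256Hex() is a hypothetical helper that returns the SHA-256 hash of the query
const query = 'subscription { greetings }'
const extensions = {
  persistedQuery: { version: 1, sha256Hash: sha256Hex(query) },
}
const params = new URLSearchParams({ extensions: JSON.stringify(extensions) })
// The GET request carries only the hash; the server looks up the full query
const source = new EventSource(`/graphql?${params}`)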
I'm looking into doing this at my company because some frontend developers like how easy EventSource is to deal with.
There are two things at play here: the SSE connection and the GraphQL endpoint. The endpoint has a spec to follow, so simply returning SSE from a subscription request isn't done, and EventSource needs a GET request anyway. So the two have to be separate.
How about letting the client open an SSE channel via /graphql-sse, which creates a channel token? Using this token the client can then request subscriptions, and the events will arrive via the chosen channel.
The token could be sent as the first event on the SSE channel, and to pass the token to the query, it can be provided by the client in a cookie, a request header or even an unused query variable.
Alternatively, the server can store the last opened channel in session storage (limiting the client to a single channel).
If no channel is found, the query fails. If the channel closes, the client can open it again, and either pass the token in the query string/cookie/header or let the session storage handle it.
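As a rough client-side sketch of that idea (the endpoints, the token event name and the X-Channel-Token header are hypothetical, not an existing API):
// Open the SSE channel; the first event is assumed to carry the channel token
const channel = new EventSource('/graphql-sse')
channel.addEventListener('token', async (event) => {
  const channelToken = event.data
  // Register a subscription against the open channel with an ordinary GraphQL POST
  await fetch('/graphql', {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      'X-Channel-Token': channelToken, // hypothetical header name
    },
    body: JSON.stringify({ query: 'subscription { greetings }' }),
  })
})
// Subscription results then arrive as normal SSE messages on the channel
channel.onmessage = (event) => {
  console.log(JSON.parse(event.data))
}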
