I'm trying to do the equivalent of a PUT request with GraphQL.
Mutation:
export const UPDATE_CAT = gql`
  mutation updateCat($catRef: RefInput, $payload: CatInput) {
    updateCat(ref: $catRef, input: $payload) {
      ${ref}
    }
  }
`;
Query variables:
{
  "catRef": {
    "id": "7b342789-e527-42a6-997b-cfe2fb6bdb07",
    "typename": "cat.beacon.Beacon"
  },
  "payload": {
    "position": null
  }
}
However, this seems to wipe all of the props on the cat entity (it does not delete the resource). Is this the correct syntax for a PUT-like request?
First of all, it's worth noting that GraphQL is protocol-agnostic. That means any transport protocol can be used, not necessarily HTTP (although HTTP is used in most cases).
Also, what do you mean by a PUT-like request? As I understand it, you assume that the HTTP PUT method is used to patch data, which might not be true. It depends on how you implement it. You could easily build a REST API that accepts data via GET and writes it via POST or PUT, even though best practices recommend against it.
The same goes for your problem: it's not about your mutation signature, it's about the mutation resolver implementation.
To make this work, your resolver function on the server should determine whether this is a new or an existing resource and handle both scenarios properly.
How can you check whether this is a new resource? The id prop will be undefined in that case, right?
If you are working with a third-party API, check its docs.
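A rough sketch of such a resolver (db.cats here is a hypothetical data-access layer, not a real API); the important part is merging the incoming input into the existing record rather than replacing it wholesale, so a partial payload does not wipe the other props:
// Hypothetical resolver sketch: db.cats is an assumed data-access layer.
async function updateCat(root, { ref, input }) {
  const existing = ref?.id ? await db.cats.findById(ref.id) : null;

  if (!existing) {
    // No id (or nothing found): treat it as a create.
    return db.cats.create(input);
  }

  // Existing resource: merge only the provided fields instead of replacing
  // the whole record, so omitted props keep their current values.
  return db.cats.update(ref.id, { ...existing, ...input });
}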
The Problem
When navigating away from query components that use the state of the app route as required variables, I get GraphQL errors of the sort:
Variable "$analysisId" of required type "ID!" was not provided.
"Navigating away" means, for example, going
from: /analysis/analysis-1/analyse/
to: /user-profile/
Background
I am building an SPA using Apollo GraphQL, and I have some queries which follow this pattern:
query Analyse($analysisId: ID!) {
  location @client {
    params {
      analysisId @export(as: "analysisId")
    }
  }
  analysis(analysisId: $analysisId) {
    id
    # ... etc
  }
}
The location field gets a representation of the SPA router's state. That state is held in an Apollo client "reactive variable". Query components are programmed to not begin subscribing to the query unless that reactive variable exists and has the required content.
shouldSubscribe(): boolean {
  return !!(locationVar()?.params?.analysisId);
}
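For reference, locationVar is an ordinary Apollo Client reactive variable, created roughly like this (simplified sketch; the exact shape is abbreviated here):
import { makeVar } from '@apollo/client';

// Reactive variable holding the SPA router state (simplified);
// the router updates it on every navigation.
export const locationVar = makeVar({
  pathname: '/analysis/analysis-1/analyse',
  params: { analysisId: 'analysis-1' },
});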
params represents Express-style URL params, so the route path is /analysis/:analysisId/analyse.
If the user navigates to /analysis/analysis-1/analyse, the query component's variables become { analysisId: "analysis-1" }. This works fine when loading the component.
What I Think is Happening
When the component connects to the DOM, it checks whether its required variables are present in the router state, and if they are, it creates an ObservableQuery and subscribes.
Later, when the user navigates away, the ObservableQuery is still subscribed to updates when suddenly the required analysisId variable, exported by the client field location.params.analysisId, is nullified.
I think that since the ObservableQuery is still subscribed, it sends off the query with a null analysisId variable, even though it's required.
What I've Tried
By breaking on every method in my query component base class, I'm reasonably sure that the base class is not at fault; there's no evidence that it refetches the query when the route changes. Instead, I think this is happening inside Apollo Client.
I could perhaps change the schema for the query from analysis(analysisId: ID!): Analysis to analysis(analysisId: ID): Analysis, but that seems roundabout, as I might not have control over the server.
How do I prevent apollo client from trying to fetch a query when it has required variables and they are not present?
This seems to be working fine so far, in my HTTP link (src/apollo/link/http.ts):
import { ApolloLink, from } from '@apollo/client/link/core';
import { HttpLink } from '@apollo/client/link/http';
import { hasAllVariables } from '@apollo-elements/lib/has-all-variables';

const uri = 'GRAPHQL_HOST/graphql';

export const httpLink = from([
  new ApolloLink((operation, forward) => {
    // If any declared variable is missing, do not forward the operation,
    // so no request is sent for this query.
    if (!hasAllVariables(operation))
      return;
    else
      return forward(operation);
  }),
  new HttpLink({ uri }),
]);
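For completeness, wiring this link into the client looks roughly like this (a sketch; the cache setup and the client module location next to the link file are assumptions, not part of the snippet above):
import { ApolloClient, InMemoryCache } from '@apollo/client/core';
import { httpLink } from './link/http';

// The guarding ApolloLink above runs before HttpLink, so operations with
// missing required variables are dropped instead of hitting the server.
export const client = new ApolloClient({
  cache: new InMemoryCache(),
  link: httpLink,
});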
I'm using Apollo Client for my React front-end to consume a GraphQL API hosted by Laravel Lighthouse.
Normal use cases work, but I'm now facing certain cases where I'd like to pre-fetch a GraphQL result on the initial page request instead of having the browser load the page and then send a separate AJAX query.
I see that Apollo client supports both "Store rehydration" and "Server-side rendering", but I haven't found any documentation within Lighthouse about how to do it.
I think my Blade view needs to contain something like this:
<script>
window.__APOLLO_STATE__ = '{!!$postsGqlResultJson ?? "null" !!}'; // https://www.apollographql.com/docs/react/performance/server-side-rendering/
</script>
So then the question is: how do I generate the $postsGqlResultJson?
UPDATE:
Let's say I have in the front-end:
const POSTS = gql`
  query Posts($first: Int, $page: Int) {
    posts(first: $first, page: $page, orderBy: { field: CREATED_AT, order: DESC }) {
      paginatorInfo {
        currentPage
        hasMorePages
        total
      }
      data {
        id
        ...
      }
    }
  }
`;
const options = {
  variables: {
    first: 100,
    page: 1,
  },
};

const { loading, error, data } = useQuery(POSTS, options);
I would think that Lighthouse might offer some sort of PHP function equivalent of useQuery, where I can somehow provide those same arguments (the POSTS gql and the options object), and it would return a JSON string of the results data, nested as described in the GQL.
Am I misunderstanding? I really appreciate your help. :-)
The Apollo Client features you describe seem like concerns of the frontend and assume you are serving your frontend from a Node.js server.
There is nothing special to do here in Lighthouse: from its perspective, it does not matter whether queries are sent as part of server-side rendering or from the client.
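On the client side, rehydration itself is plain Apollo Client API. Roughly (a sketch that assumes your Blade view has written the serialized cache into window.__APOLLO_STATE__ as in your snippet; how that JSON gets produced on the PHP side is up to your application):
import { ApolloClient, InMemoryCache } from '@apollo/client';

// __APOLLO_STATE__ is whatever the Blade view injected; it may be a JSON
// string (as in the snippet above) or an already-parsed object.
const initialState = typeof window.__APOLLO_STATE__ === 'string'
  ? JSON.parse(window.__APOLLO_STATE__)
  : window.__APOLLO_STATE__;

export const client = new ApolloClient({
  uri: '/graphql',
  // restore() pre-populates the normalized cache, so the first query
  // can be answered from the cache instead of a separate AJAX request.
  cache: new InMemoryCache().restore(initialState ?? {}),
});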
I am exploring this library
https://github.com/Asymmetrik/graphql-fhir
It does not contain logic for how to implement resolvers for a 3rd party FHIR Server.
Has anyone attempted this?
I'll add some pseudocode below, but that library is essentially a wrapper and has no backend, so you can definitely use it to wrap other FHIR servers. GraphQL resolvers can resolve synchronously or asynchronously. So if we took the patient resolver (https://github.com/Asymmetrik/graphql-fhir/blob/master/src/resources/4_0_0/profiles/patient/resolver.js), for example, and wanted to connect it to a third-party server like HAPI, you could implement it like so (pseudocode, so untested):
module.exports.getPatient = function getPatient(root, args, context = {}, info) {
  // args contains the arguments in GraphQL format; note that these may
  // not map directly to another FHIR server for naming-restriction reasons,
  // e.g. fooBar in GraphQL might be foo-bar in REST.
  // Make an HTTP request with any HTTP library, for example, fetch.
  return fetch('some/fhir/server/patient', {
    method: 'post',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify(args), // remember args may need to be mapped
  })
    .then(response => {
      if (!response.ok) {
        throw new Error(`FHIR server responded with ${response.status}`);
      }
      return response.json();
    })
    .then(results => {
      // Make sure the response matches what the resolver expects; in this
      // case, a single patient resource.
      return results;
    });
};
There is an example at https://github.com/Asymmetrik/graphql-fhir/blob/master/FAQ.md#resolvers, but that one loads a local patient; you just need to make an HTTP request to some 3rd-party server and return the results asynchronously. For handling errors, make sure to check out https://github.com/Asymmetrik/graphql-fhir/blob/master/FAQ.md#resolvers as well.
I'm trying to update the values and connections on my current viewer within the Relay store.
So, without calling the signIn mutation, if I print:
console.log(viewer.name) // "Visitor"
console.log(viewer.is_anonymous) // true
Mutations have an updater method, which gives us the store, so in my mutation I'm doing something like this:
mutation SignInMutation($input: SignInInput!) {
  signIn(input: $input) {
    user {
      id
      name
      email
      is_anonymous
      notifications {
        edges {
          node {
            id
            ...NotificationItem_notification
          }
        }
      }
    }
    token
  }
}
So my updater method has:
const viewer = store.get(viewer_id);
const signIn = store.getRootField('signIn');
viewer.copyFieldsFrom(signIn.getLinkedRecord('user'))
After this update to the store, the name, email, and is_anonymous fields hold the data that just came from the GraphQL endpoint (name is now "Erick" and is_anonymous is now false, which is great). But if I try to read viewer.notifications and render it, the length of the viewer's connections seems to be 0 even though it has notifications.
How can I update my current viewer and add the notifications from the MutationPayload into the store without the need to force fetch?
I'm using the latest relay-modern and graphql.
With some reorganisation of your GraphQL schema it might be possible to remove the need to interact directly with the Relay store after your sign-in mutation. Consider:
viewer {
  id
  currentUser {
    name
    email
  }
}
When a user is not logged in, currentUser would return null.
You could then modify your login mutation to be:
mutation SignInMutation($input: SignInInput!) {
  signIn(input: $input) {
    viewer {
      id
      currentUser {
        name
        email
        token
      }
    }
  }
}
Because the mutation payload includes the viewer's id, Relay can normalize the response and merge the returned fields into the existing viewer record automatically, so no manual updater is needed. Knowing the 'nullability' of the currentUser field also provides an elegant way of determining whether the user is logged in.
The presence of the token field implies that you are using a JWT or similar to track login status. You would need to store this token in local storage and, if it is present, attach it to the headers of the outgoing Relay requests to your GraphQL endpoint.
Storing the token itself would have to be done in the onCompleted callback where you make the mutation request (you will have access to the payload returned by the server in the arguments of the callback function).
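Roughly, that wiring on the Relay side could look like this (a sketch; 'authToken' and the /graphql endpoint are illustrative names, not prescribed by Relay):
import { Environment, Network, RecordSource, Store } from 'relay-runtime';

// Fetch function used by Relay for every operation; attaches the stored
// token (if any) as a bearer Authorization header.
function fetchQuery(operation, variables) {
  const token = localStorage.getItem('authToken'); // illustrative key name
  return fetch('/graphql', {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      ...(token ? { Authorization: `Bearer ${token}` } : {}),
    },
    body: JSON.stringify({ query: operation.text, variables }),
  }).then(response => response.json());
}

export const environment = new Environment({
  network: Network.create(fetchQuery),
  store: new Store(new RecordSource()),
});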
As an alternative to the token, you could also explore using cookies, which would provide the same user experience but likely require less work to implement than JWT tokens.
Web API allows me to capture the body of a POST request in a JObject:
$.post('/api/Query/DoSomething', { Foo: "one", Bar: 4 });
public string Post(JObject data)
{
// data is populated
}
However, the same technique does not work with a GET request and URI parameters.
$.get('/api/Controller', { Foo : "one", Bar : 4 });
public string Get([FromUri]JObject data)
{
// data is empty
}
Any workaround here?
It doesn't work because a GET request does not have a body, and hence no content type. Therefore, Web API does not know that you have JSON in your URL. You have a few choices:
1. Pass your data as query string parameters, as is traditionally done in GET requests (your $.get call already does this: jQuery serializes the object into /api/Controller?Foo=one&Bar=4), and change your method to accept those parameters individually, or in a regular class (POCO).
2. Change your GET method to accept a string instead of a JObject, then use JSON.Net to deserialize it manually, e.g. JObject obj = JObject.Parse(data);
3. If you're feeling ambitious, you might be able to implement a custom binder to do this.
My recommendation is option 1. Traditionally, a GET method is just intended to look something up, so you really should only be passing IDs and simple query options anyway. It is unusual to be passing JSON data in a URL. Also the length of URLs can be limited by some browsers. If you find you are needing to pass JSON data, use POST (or PUT) instead.
You can create an object and bind to it using the [FromUri] attribute.
Check out this solution, which I am using: https://stackoverflow.com/a/49632564/2463156.