Passing default/static values from server to client - graphql

I have an input type with two fields used for filtering a query on the client.
I want to pass the default values (rentIntervalLow + rentIntervalHigh) from server to the client, but don't know how to do it.
Below is my current code. I've come up with two naïve solutions:
1. Let the client introspect the whole schema.
2. Have a global config object, and create a queryable Config type with a resolver that returns the config object's values.
Are there any better suggestions than these for making default/config values on the server accessible to the client?
// schema.js
import { gql } from 'apollo-server' // or 'apollo-server-express'

const typeDefs = gql`
  input FilteringOptions {
    rentIntervalLow: Int = 4000
    rentIntervalHigh: Int = 10000
  }

  type Home {
    id: Int
    roomCount: Int
    rent: Int
  }

  type Query {
    allHomes(first: Int, cursor: Int, input: FilteringOptions): [Home]
  }
`

export default typeDefs
I'm using Apollo Server 2.8.1 and React Apollo 3.0.

It's unnecessary to introspect the whole schema to get information about a particular type. You can just write a query like:
query {
  __type(name: "FilteringOptions") {
    inputFields {
      name
      description
      defaultValue
    }
  }
}
Default values are values that will be used when a particular input value is omitted from the query. So to utilize the defaults, the client would pass an empty object to the input argument of the allHomes field. You could also give input a default value of {}, which would allow the client not to provide the input argument at all, while still relaying the min and max default values to the resolver.
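In the schema above, giving input a default value looks like this:
type Query {
  allHomes(first: Int, cursor: Int, input: FilteringOptions = {}): [Home]
}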
If, however, your intent is to provide the minimum and maximum values to your client in order to drive some client-specific logic (like validation, drop down menu values, etc.), then you should not utilize default values for this. Instead, this information should be queried directly by the client, using, for example, a Config type like you suggested.
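A minimal sketch of that approach; the Config type, its field names, and the resolver are illustrative, not part of the original schema:
# in the schema
type Config {
  rentIntervalLow: Int!
  rentIntervalHigh: Int!
}

type Query {
  config: Config!
}

// in the resolvers, returning your global config object
const resolvers = {
  Query: {
    config: () => ({ rentIntervalLow: 4000, rentIntervalHigh: 10000 }),
  },
}
The client then queries config { rentIntervalLow rentIntervalHigh } once and seeds its filter UI from the result.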

Related

Apollo graphql client - field policies - No read function for parent types

I can’t seem to find a way to read an entire type without having to resort to individual fieldPolicies for every field in that type.
const cache = new InMemoryCache({
  typePolicies: {
    SomeType: {
      fields: {
        // defining individual field (READ) policies would be insane (at least for my case)
        // is there at least something like a wildcard mechanism?
      },
      merge, // yeah... possible at type level
      read // ??? not possible (WHYYYY), so... Is there any other way to do this?
    }
  }
})

Apollo conditional data sources & initialization lifecycle

I have a specific use case where a user's data sources are conditional, e.g. based on the data sources saved in the database for each specific user.
This also means every data source has unique credentials for every user, which is fine for RESTDataSource because I can use willSendRequest to set the Authentication headers before each request.
However, I have custom data sources with proprietary clients (for example JSForce for Salesforce), and they have their own fetch mechanism.
As of now, I have a custom transformer directive that fetches the tokens from the database and adds them to the context. However, the directive is run before the dataSource.initialize() method, so I can't use the credentials there because the context doesn't have them yet.
I also don't want to initialize every data source for every user when it isn't used in the current request, but the dataSources() function doesn't accept any parameters and is not contextual.
Bottom line: is it possible to pass data sources conditionally, even based on the Express request? When is the right time to pass the tokens and credentials to the data source? Maybe add my own custom init function and call it from the directive?
So you have options. Here are two choices:
1. Just add your dataSources
If you just initialize all dataSources, each one can internally check whether the user has access. You could have a getClient function that either resolves the client or throws an UnauthorizedError, depending.
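A minimal sketch of that idea; getClient, UnauthorizedError, and buildSalesforceClient are illustrative names, not Apollo APIs:
const { DataSource } = require('apollo-datasource')

class SalesforceAPI extends DataSource {
  initialize({ context }) {
    this.context = context
  }
  // Every resolver path goes through getClient(), so the access check and
  // the per-user credentials live in one place.
  getClient() {
    const creds = this.context.user && this.context.user.salesforceCredentials
    if (!creds) throw new UnauthorizedError('Salesforce is not connected for this user')
    return buildSalesforceClient(creds) // e.g. wraps the proprietary JSForce client
  }
}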
2. Don't just add your dataSources
So if you really don't want to initialize the dataSources at ALL, you can absolutely do this by adding the "dataSources" yourself, just like Apollo does it.
const server = new ApolloServer({
  // this example uses apollo-server-express
  context: async ({ req, res }) => {
    const accessToken = req.headers?.authorization?.split(' ')[1] || ''
    const user = accessToken && buildUser(accessToken)
    const context = { user }
    // You can't use the name "dataSources" in your config because
    // ApolloServer will puke, so I called them "services"
    await addServices(context)
    return context
  }
})
const addServices = async (context) => {
  const { user } = context

  const services = {
    userAPI: new UserAPI(),
    postAPI: new PostAPI(),
  }
  if (user.isAdmin) {
    services.adminAPI = new AdminAPI()
  }

  const initializers = []
  for (const service of Object.values(services)) {
    if (service.initialize) {
      initializers.push(
        service.initialize({
          context,
          cache: null, // or add your own cache
        })
      )
    }
  }
  await Promise.all(initializers)

  /**
   * This is where you have to deviate from Apollo.
   * You can't use the name "dataSources" in your config because ApolloServer
   * will puke with the error 'Please use the dataSources config option
   * instead of putting dataSources on the context yourself.'
   */
  context.services = services
}
Some notes:
1. You can't call them "dataSources"
If you return a property called "dataSources" on your context object, Apollo will not like it very much [meaning it throws an Error]. In my example, I used the name "services", but you can do whatever you want... except "dataSources".
With the above code, in your resolvers, just reference context.services.whatever instead.
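For example, a resolver would then look something like this (userAPI comes from the sketch above; getUsers is a placeholder method):
const resolvers = {
  Query: {
    users: (parent, args, context) => context.services.userAPI.getUsers(),
  },
}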
2. This is what Apollo does
This pattern is copied directly from what Apollo already does for dataSources [source]
3. I recommend you still treat them as DataSources
I recommend you stick to the DataSources pattern and that your "services" all extend DataSource. It's going to be easier for everyone involved.
4. Type safety
If you're using TypeScript or something, you're going to lose a bit of type safety, since the context.services is either going to be one shape or another. Even if you're not, if you're not careful, you may end up throwing "Cannot read property users of undefined" errors instead of "Unauthorized" errors. You might be better off creating "dummy services" that reflect the same object shape but just throw Unauthorized.
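One way to sketch that last idea inside addServices above (UnauthorizedAdminAPI and its method names are placeholders):
// Mirrors AdminAPI's shape, but every method throws, so resolvers fail with
// "Unauthorized" instead of "Cannot read property ... of undefined".
class UnauthorizedAdminAPI {
  getUsers() { throw new Error('Unauthorized') }
  banUser() { throw new Error('Unauthorized') }
}

const services = {
  userAPI: new UserAPI(),
  postAPI: new PostAPI(),
  adminAPI: user && user.isAdmin ? new AdminAPI() : new UnauthorizedAdminAPI(),
}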

How to ignore unknown enum values?

I'm wondering what would be the best way to ignore/discard unknown enum values in GraphQL/Apollo Server.
Let's say my GraphQL schema defines an array of enums, "enum Service { Supermarket, TicketSales }", and it works fine now, but later on another service I'm using adds some new values (e.g. Playground) that my client just doesn't support. I would like to ignore them and return the supported values without an error.
What would be the best way to do this in GraphQL? My first idea was to make a directive that would read the supported values from the schema and ignore everything else, but after googling around I didn't find any good examples of how to do it. Can you point me in a direction?
If your resolver function will accept arbitrary strings, then you can use a custom scalar type, or just String.
"""
The type of a service. `Supermarket` means..., and
`TicketSales` means...; any other value is ignored.
"""
scalar Service
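If you go the scalar route, a minimal sketch could look like the following; the list of known values and the drop-to-null behavior are assumptions about the use case, not an official pattern:
const { GraphQLScalarType, Kind } = require('graphql')

const KNOWN_SERVICES = ['Supermarket', 'TicketSales']

// Pass known service names through and map anything else to null, so
// unexpected values coming from upstream services are dropped instead of
// raising a GraphQL error.
const Service = new GraphQLScalarType({
  name: 'Service',
  serialize: value => (KNOWN_SERVICES.includes(value) ? value : null),
  parseValue: value => (KNOWN_SERVICES.includes(value) ? value : null),
  parseLiteral: ast =>
    (ast.kind === Kind.STRING || ast.kind === Kind.ENUM) && KNOWN_SERVICES.includes(ast.value)
      ? ast.value
      : null,
})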
GraphQL generally places responsibility on the client to conform to the server's expectations, rather than making the server try to support any request. There are a couple of places you can reasonably expect an enum value like this to appear:
enum Service { Supermarket, TicketSales }

type Query {
  inAReturnValue: Service!
  asAQueryParam(service: Service!): Node
}

type Mutation {
  asAMutationInput(service: Service!): Node
}
In particular it may not make sense to tell the server "make the type of this object be a playground" if the server just doesn't understand that. Conversely, if the server knows about "playground", it could return it in cases the client may not expect. Having an enum here makes it explicit what the server knows about. The server has said what it supports and it's the client's responsibility to cooperate.
Note that the client can find out whether the server supports playgrounds, if it's an enum value, and this might help inform its behavior.
query GetServiceTypes {
  __type(name: "Service") {
    enumValues { name }
  }
}
After playing around I found something that I can use to get around my original problem, so I will post it here in case somebody else is wondering the same thing.
So my original problem, in short, was that I'm receiving several different "available services" kinds of string arrays from other services, and I was thinking of mapping them to an enum for better TypeScript support etc. But the problem was that if I get some unknown value from another service, my GraphQL will fail.
So my original idea was to fix it with a directive, which I eventually got working:
# In schema
directive @mapUnknownTo(value: String) on ENUM

enum SomeAttribute @mapUnknownTo(value: "__UNKNOWN__") {
  SomeAttribute1
  AnotherAttribute
  SomethingElse
  __UNKNOWN__
}
And the directive implementation is:
import { SchemaDirectiveVisitor } from 'graphql-tools';
import { GraphQLEnumType } from 'graphql';

export class MapUnknownToDirective extends SchemaDirectiveVisitor {
  visitEnum(type: GraphQLEnumType) {
    const { value = '__UNKNOWN__' } = this.args;
    const valueMap = type.getValues().reduce(
      (map, v) => map.set(v.value, v.name),
      new Map<string, string>()
    );
    type.serialize = (v: string): string => valueMap.get(v) || value;
  }
}
So this will map all the values not defined in schema into some custom value, which is not exactly what I originally wanted, but at least it's not giving an error, so it's okay-ish.
I'm still not 100% sure if directives are way to go on cases like this, but at least it's one possible solution.

Deleting Apollo Client cache for a given query and every set of variables

I have a filtered list of items based on a getAllItems query, which takes a filter and an order-by option as arguments.
After creating a new item, I want to delete the cache for this query, no matter what variables were passed. I don't know how to do this.
I don't think updating the cache is an option. The methods mentioned in the Apollo Client documentation (updating the cache after a mutation, refetchQueries, and update) all seem to need a given set of variables, but since the filter is a complex object (with some text information), I would need to update the cache for every set of variables that was previously submitted, and I don't know how to do that. Plus, only the server knows how the new item impacts pagination and ordering.
I don't think fetch-policy (for instance setting it to cache-and-network) is what I'm looking for, because while hitting the network is what I want right after creating a new item, when I'm just filtering the list (typing in a string to search) I want to keep the default cache-first behavior.
client.resetStore would reset the store for all types of queries (not only the getAllItems query), so I don't think it's what I'm looking for either.
I'm pretty sure I'm missing something here.
There's no officially supported way of doing this in the current version of Apollo but there is a workaround.
In your update function, after creating an item, you can iterate through the cache and delete all nodes where the key starts with the typename you are trying to remove from the cache. e.g.
// Loop through all the data in our cache
// and delete any items where the key starts with "Item".
// This empties the cache of all of our items and
// forces a refetch of the data only when it is next requested.
Object.keys(cache.data.data).forEach(key =>
  key.match(/^Item/) && cache.data.delete(key)
)
This works for queries that exist a number of times in the cache with different variables, i.e. paginated queries.
I wrote an article on Medium that goes into much more detail on how this works, as well as an implementation example and an alternative solution that is more complicated but works better in a small number of use cases. Since the article expands on a concept I have already explained in this answer, I believe it is ok to share here: https://medium.com/@martinseanhunt/how-to-invalidate-cached-data-in-apollo-and-handle-updating-paginated-queries-379e4b9e4698
This worked for me (it requires Apollo Client 3 for the cache.evict feature); it clears any query matched by the regexp from the cache.
After clearing the cache, the query will be refetched automatically without needing to trigger a refetch manually (if you are using Angular: gql.watch().valueChanges will perform an XHR request and emit the new value).
export const deleteQueryFromCache = (cache: any, matcher: string | RegExp): void => {
  const rootQuery = cache.data.data.ROOT_QUERY;
  Object.keys(rootQuery).forEach(key => {
    if (key.match(matcher)) {
      cache.evict({ id: "ROOT_QUERY", fieldName: key });
    }
  });
};
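For example, it can be called from a mutation's update callback; CREATE_ITEM and the getAllItems field name below are assumptions based on the question:
import { useMutation } from '@apollo/client'

const [createItem] = useMutation(CREATE_ITEM, {
  update(cache) {
    // evict every cached getAllItems entry, whatever variables it was stored with
    deleteQueryFromCache(cache, /^getAllItems/)
  },
})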
An ngrx-like approach, using a local resolver:
resolvers = {
  removeTask(
    parent,
    { id },
    { cache, getCacheKey }: { cache: InMemoryCache | any; getCacheKey: any }
  ) {
    const key = getCacheKey({ __typename: "Task", id })
    const { [key]: deleted, ...data } = cache.data.data
    cache.data.data = { ...data }
    return id
  }
}
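For reference, a local resolver like removeTask is typically triggered by a mutation document marked with the @client directive; this is an assumed usage sketch, not part of the original answer:
import gql from 'graphql-tag'

const REMOVE_TASK = gql`
  mutation RemoveTask($id: ID!) {
    removeTask(id: $id) @client
  }
`
// client.mutate({ mutation: REMOVE_TASK, variables: { id } }) then runs the
// local resolver above against the cache instead of sending an HTTP request.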

Google Datastore go client, storing dynamic data

Our application provides functionality that enables a customer to create dynamic forms and business rules. We recently decided to explore Google's infrastructure so we don't have to spend time tweaking and adjusting our own.
So far, we have managed well using a NoSQL database such as ArangoDB to store arbitrary data sets through its JSON HTTP REST APIs, which accept any sort of data structure as long as it is valid JSON. However, the Google Datastore Go client library and Datastore don't work with JSON and also impose rules like no slice ([]type) and no map (map[type]type), failing with errors such as "datastore: invalid Value type", etc.
I explored the option of implementing the PropertyLoadSaver interface's Load/Save functions, with modifications that build up a PropertyList ([]Property). In the case below, Collection is "type Collection map[string]interface{}", which holds any sort of data set:
func (m Collection) Save() ([]datastore.Property, error) {
    data := []datastore.Property{}
    for key, value := range m {
        if util.IsSlice(value) {
            props := datastore.PropertyList{}
            for _, item := range value.([]string) {
                props = append(props, datastore.Property{Name: key, Value: item})
            }
            data = append(data, datastore.Property{Name: key, Value: props})
        } else {
            data = append(data, datastore.Property{Name: key, Value: value, NoIndex: true})
        }
    }
    json.NewEncoder(os.Stdout).Encode(data)
    return data, nil
}
Yes, we can create a struct which we can populate based on map data and save that to Datastore. We were however wondering if there possibly is an easier way to just receive a map and save it to Datastore with no added complexity.
Alternative
type Person struct {
    Name      string
    Surname   string
    Addresses []Address
    ...
}

type Address struct {
    Type   string
    Detail string
}
This map[string]interface{}{Name: "Kwasi", Surname: "Gyasi-Agyei", Addresses: ...} can then be marshaled into the above struct to be saved by the Datastore Go client library.
I'm, however, more interested in taking advantage of PropertyList ([]Property), unless that route is unnecessarily complex. What I'm basically asking is: which route is most appropriate and offers the same kind of flexibility as a schemaless database?
