I have a custom hook that looks something like this:
import { useQuery, useQueryClient } from 'react-query'
import { get } from '#/util/api' // Custom API utility
import produce from 'immer' // Using immer for deep object mutation
export function useData() {
const queryClient = useQueryClient()
const { data, isSuccess } = useQuery(
'myData', () => get('data')
)
function addData(moreData) {
const updatedData = produce(data.results, (draft) => {
draft.push(moreData)
})
setData(updatedData)
}
function setData(newData) {
queryClient.setQueryData('myData', newData)
}
return {
data: data && data.results,
setData,
addData,
}
}
My data in data.results is an array of objects. When I call addData it creates a copy of my current data, mutates it, then calls setData where queryClient.setQueryData is called with a new array of objects passed in as my second argument. But my cached data either doesn't update or becomes undefined in the component hooked up to the useData() hook. Does anyone know what I'm doing wrong?
The code looks good from the react-query perspective, but I'm not sure that's how immer works. With your code, produce(data.results, ...) returns just a new results array, and that array is what gets written into the cache as the whole query data, so the results wrapper is lost. I would do:
const updatedData = produce(data, (draft) => {
draft.results.push(moreData)
})
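With that change in place, addData can also use the updater form of setQueryData so it no longer depends on the data captured in the closure (a sketch against the question's 'myData' key; react-query accepts a function as the second argument):
function addData(moreData) {
  // produce a new top-level object whose results array includes the new item
  queryClient.setQueryData('myData', (current) =>
    produce(current, (draft) => {
      draft.results.push(moreData)
    })
  )
}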
I'm digging into GraphQL, so I followed a tutorial, and I got stuck at this part.
Home.js
function Home() {
const {
loading,
data: { getPosts: posts } // <===## Here ##
} = useQuery(FETCH_POSTS_QUERY);
return (
<div>
{loading ? (
<h1>Loading posts..</h1>
) : (
posts &&
posts.map((post) => (
<p>
{post.content}
</p>
))
)}
</div>
);
}
const FETCH_POSTS_QUERY = gql`
{
getPosts {
id
content
}
}
`;
export default Home;
resolver
Query: {
async getPosts() {
try {
const posts = await Post.find().sort({ createdAt: -1 });
return posts;
} catch (err) {
throw new Error(err);
}
}
},
Whole code: https://github.com/hidjou/classsed-graphql-mern-apollo/tree/react10
The example above works fine, and it uses data: { getPosts: posts } to destructure the returned data. But when I followed it in my own code, I got an error:
TypeError: Cannot read property 'getPosts' of undefined
Instead, if I write it like this,
function Home() {
const {
loading,
data // <===## Here ##
} = useQuery(FETCH_POSTS_QUERY);
if(loading) return <h1>Loading...</h1>
const { getPosts: posts } = data // <===## Here ##
return (
<div>
{loading ? (
<h1>Loading posts..</h1>
) : (
posts &&
posts.map((post) => (
<p>
{post.content}
</p>
))
)}
</div>
);
}
it works well. It seems like my code tries to reference data before it has loaded, but I don't know why this happens. The code is almost the same; the differences are: 1. my code runs on Next.js, 2. my server uses apollo-server-express. Everything else is nearly identical: my resolver uses async/await and returns posts. Am I missing something?
My resolver looks like this:
Query: {
async getPosts(_, { pageNum, searchQuery }) {
try {
const perPage = 5
const posts =
await Post
.find(searchQuery ? { $or: search } : {})
.sort('-_id')
.limit(perPage)
.skip((pageNum - 1) * perPage)
return posts
} catch (err) {
throw new Error(err)
}
},
Your tutorial may be out of date. In older versions of Apollo Client, data was initially set to an empty object. This way, if your code accessed some property on it, it wouldn't blow up. While this was convenient, it also wasn't particularly accurate (there is no data, so why are we providing an object?). Now, data is simply undefined until your operation completes. This is why the latter code is working -- you don't access any properties on data until after loading is false, which means the query is done and data is no longer undefined.
If you want to destructure data when your hook is declared, you can utilize a default value like this:
const {
loading,
data: { getPosts: posts } = {}
} = useQuery(FETCH_POSTS_QUERY)
You could even assign a default value to posts as well if you like.
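For example (a sketch building on the snippet above), defaulting posts to an empty array means the map in the render never runs against undefined:
const {
  loading,
  data: { getPosts: posts = [] } = {}
} = useQuery(FETCH_POSTS_QUERY)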
Just keep in mind two other things. One, data will remain undefined if a network error occurs, even after loading changes back to false, so make sure your code accounts for this scenario. Two, depending on your schema, if there are errors in your response, it's possible for your entire data object to end up null. In that case you'll still hit an issue with destructuring, because default values only kick in for undefined, not null.
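If you'd rather have a single guard that covers both null and undefined, optional chaining plus nullish coalescing works too (a sketch, instead of destructuring defaults):
const posts = data?.getPosts ?? [];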
I am busy with a little proof of concept where basically the requirement is to have the home page be a login screen when a user has not logged in yet, after which a component with the relevant content is shown instead when the state changes upon successful authentication.
I have to state upfront that I am very new to react and redux and am busy working through a tutorial to get my skills up. However, this tutorial is a bit basic in the sense that it doesn't deal with connecting with a server to get stuff done on it.
My first problem was getting props to be available in the context of the last then of the fetch, as I was getting an error that this.props.dispatch was undefined. I used the old JavaScript trick of capturing this.props in a local variable, and if I put a console.log in the final then, I can see it is no longer undefined and is actually a function, as expected.
The problem for me now is that nothing happens when dispatch is called. However, if I manually refresh the page it will display the AuthenticatedPartialPage component as expected because the localstorage got populated.
My understanding is that when dispatch is called, the conditional statement will be re-evaluated and AuthenticatedPartialPage should display.
It feels like something is missing, that the dispatch isn't communicating the change back to the parent component and thus nothing happens. Is this correct, and if so, how would I go about wiring up that piece of code?
The HomePage HOC:
import React from 'react';
import { createStore, combineReducers } from 'redux';
import { connect } from 'react-redux';
import AuthenticatedPartialPage from './partials/home-page/authenticated';
import AnonymousPartialPage from './partials/home-page/anonymous';
import { loggedIntoApi, logOutOfApi } from '../actions/authentication';
import authReducer from '../reducers/authentication'
// unconnected stateless react component
const HomePage = (props) => (
<div>
{ !props.auth
? <AnonymousPartialPage />
: <AuthenticatedPartialPage /> }
</div>
);
const mapStateToProps = (state) => {
const store = createStore(
combineReducers({
auth: authReducer
})
);
// When the user logs in, in the Anonymous component, the local storage is set with the response
// of the API when the log in attempt was successful.
const storageAuth = JSON.parse(localStorage.getItem('auth'));
if(storageAuth !== null) {
// Clear auth state in case local storage has been cleaned and thus the user should not be logged in.
store.dispatch(logOutOfApi());
// Make sure the auth info in local storage is contained in the state.auth object.
store.dispatch(loggedIntoApi(...storageAuth))
}
return {
auth: state.auth && state.auth.jwt && storageAuth === null
? state.auth
: storageAuth
};
}
export default connect(mapStateToProps)(HomePage);
with the Anonymous component being:
import React from 'react';
import { connect } from 'react-redux';
import { Link } from 'react-router-dom';
import { loggedIntoApi } from '../../../actions/authentication';
export class AnonymousPartialPage extends React.Component {
constructor(props) {
super(props);
}
onSubmit = (e) => {
e.preventDefault();
const loginData = { ... };
// This is where I thought the problem initially occurred as I
// would get an error that `this.props` was undefined in the final
// then` of the `fetch`. After doing this, however, the error went
// away and I can see that `props.dispatch is no longer undefined
// when using it. Now though, nothing happens.
const props = this.props;
fetch('https://.../api/auth/login', {
method: 'POST',
headers: {
'Content-Type': 'application/json',
},
body: JSON.stringify(loginData)
})
.then(function(response) {
return response.json();
})
.then(function(data) {
if(data && data.jwt) {
props.dispatch(loggedIntoApi(data));
localStorage.setItem('auth', JSON.stringify(data));
}
// else show an error on screen
});
};
render() {
return (
<div>
... onSubmit gets called successfully somewhere in here ...
</div>
);
}
}
export default connect()(AnonymousPartialPage);
the action:
// LOGGED_INTO_API
export const loggedIntoApi = (auth_token) => ({
type: 'LOGGED_INTO_API',
auth: auth_token
});
// LOGGED_OUT_OF_API
export const logOutOfApi = (j) => ({
type: 'LOG_OUT_OF_API'
});
and finally the reducer:
const authDefaultState = { };
export default (state = authDefaultState, action) => {
switch (action.type) {
case 'LOGGED_INTO_API':
// SOLUTION : changed this line "return action.auth;" to this:
return { ...action.auth, time_stamp: new Date().getTime() }
case 'LOG_OUT_OF_API':
return { auth: authDefaultState };
default:
return state;
}
};
My suggestion would be to make sure that the state you are changing inside Redux actually changes according to JavaScript's equality check. There is a really good answer to another question that captures this idea here. Basically, you can't mutate an old object, send it back to Redux, and hope it will re-render, because the equality check against the old object returns true, so Redux thinks nothing changed. I had to solve this by creating an entirely new object with the updated values and sending it through dispatch().
Essentially:
x = {
  foo: "bar"
}
x.foo = "baz"
dispatch(thereWasAChange(x)) // doesn't update, because the x_old === x check returns TRUE!
Instead I created a new object:
x = {
foo:"bar"
}
y = JSON.parse(JSON.stringify(x)) // creates an entirely new object
dispatch(thereWasAChange(y)) // now it should update x correctly and trigger a rerender
// BE CAREFUL OF THE FOLLOWING!
y = x
dispatch(thereWasAChange(y)) // This WON'T work!!, both y and x reference the SAME OBJECT! and therefore will not trigger a rerender
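A lighter-weight way to get a brand-new reference than the JSON round-trip (and effectively what the reducer fix above does) is an object spread:
const y = { ...x, foo: "baz" } // new object, so the reference check sees a change
dispatch(thereWasAChange(y))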
Hope this helps!
I have a function that returns a BehaviorSubject, but the data I get back from it needs to be used only once all of it has arrived. Is there a way to know when the BehaviorSubject is done pulling all the data?
I tried using .finally but it never gets called. Here is the code I'm using.
getData() {
let guideList = '';
this.getChildren(event.node)
.subscribe(
function(data) {
console.log('here');
guideList = data.join(',');
},
function(err) {
console.log('error');
},
function() {
console.log('done');
console.log(guideList);
}
);
}
getChildren(node: TreeNode) {
const nodeIds$ = new BehaviorSubject([]);
//doForAll is a promise
node.doForAll((data) => {
nodeIds$.next(nodeIds$.getValue().concat(data.id));
});
return nodeIds$;
}
Attached is a screenshot of the console.log output.
The easiest way is to just collect all the data in an array and only emit once it has all been collected. Even better: don't use a subject at all. It is very rare that one ever needs to create a Subject; often people reach for Subjects when they should instead be using a more streamlined observable factory method or operator:
getChildren(node: TreeNode) {
return Observable.defer(() => {
const result = [];
return node.doForAll(d => result.push(d.id)).then(() => result);
});
}
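Called from getData, the subscriber then receives the complete array once the underlying promise resolves (a sketch based on the question's code):
getData() {
  this.getChildren(event.node)
    .subscribe((ids) => {
      const guideList = ids.join(','); // every id has already been collected here
      console.log('done', guideList);
    });
}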
I've followed the documentation about using graphql-tools to mock a GraphQL server, however this throws an error for custom types, such as:
Expected a value of type "JSON" but received: [object Object]
The graphql-tools documentation about mocking explicitly states that they support custom types, and even provide an example of using the GraphQLJSON custom type from the graphql-type-json project.
I've provided a demo of a solution on github which uses graphql-tools to successfully mock a GraphQL server, but this relies on monkey-patching the built schema:
// Here we Monkey-patch the schema, as otherwise it will fall back
// to the default serialize which simply returns null.
schema._typeMap.JSON._scalarConfig.serialize = () => {
return { result: 'mocking JSON monkey-patched' }
}
schema._typeMap.MyCustomScalar._scalarConfig.serialize = () => {
return mocks.MyCustomScalar()
}
Possibly I'm doing something wrong in my demo, but without the monkey-patched code above I get the error regarding custom types mentioned above.
Does anyone have a better solution than my demo, or any clues as to what I might be doing wrong, and how I can change the code so that the demo works without monkey-patching the schema?
The relevant code in the demo index.js is as follows:
/*
** As per:
** http://dev.apollodata.com/tools/graphql-tools/mocking.html
** Note that there are references on the web to graphql-tools.mockServer,
** but these seem to be out of date.
*/
const { graphql, GraphQLScalarType } = require('graphql');
const { makeExecutableSchema, addMockFunctionsToSchema } = require('graphql-tools');
const GraphQLJSON = require('graphql-type-json');
const myCustomScalarType = new GraphQLScalarType({
name: 'MyCustomScalar',
description: 'Description of my custom scalar type',
serialize(value) {
let result;
// Implement your own behavior here by setting the 'result' variable
result = value || "I am the results of myCustomScalarType.serialize";
return result;
},
parseValue(value) {
let result;
// Implement your own behavior here by setting the 'result' variable
result = value || "I am the results of myCustomScalarType.parseValue";
return result;
},
parseLiteral(ast) {
switch (ast.kind) {
// Implement your own behavior here by returning what suits your needs
// depending on ast.kind
}
}
});
const schemaString = `
scalar MyCustomScalar
scalar JSON
type Foo {
aField: MyCustomScalar
bField: JSON
cField: String
}
type Query {
foo: Foo
}
`;
const resolverFunctions = {
Query: {
foo: {
aField: () => {
return 'I am the result of resolverFunctions.Query.foo.aField'
},
bField: () => ({ result: 'of resolverFunctions.Query.foo.bField' }),
cField: () => {
return 'I am the result of resolverFunctions.Query.foo.cField'
}
},
},
};
const mocks = {
Foo: () => ({
// aField: () => mocks.MyCustomScalar(),
// bField: () => ({ result: 'of mocks.foo.bField' }),
cField: () => {
return 'I am the result of mocks.foo.cField'
}
}),
cField: () => {
return 'mocking cField'
},
MyCustomScalar: () => {
return 'mocking MyCustomScalar'
},
JSON: () => {
return { result: 'mocking JSON'}
}
}
const query = `
{
foo {
aField
bField
cField
}
}
`;
const schema = makeExecutableSchema({
typeDefs: schemaString,
resolvers: resolverFunctions
})
addMockFunctionsToSchema({
schema,
mocks
});
// Here we Monkey-patch the schema, as otherwise it will fall back
// to the default serialize which simply returns null.
schema._typeMap.JSON._scalarConfig.serialize = () => {
return { result: 'mocking JSON monkey-patched' }
}
schema._typeMap.MyCustomScalar._scalarConfig.serialize = () => {
return mocks.MyCustomScalar()
}
graphql(schema, query).then((result) => console.log('Got result', JSON.stringify(result, null, 4)));
I and a few others are seeing a similar issue with live data sources (in my case MongoDB/Mongoose). I suspect it is something internal to the graphql-tools makeExecutableSchema and the way it ingests text-based schemas with custom types.
Here's another post on the issue: How to use graphql-type-json package with GraphQl
I haven't tried the suggestion to build the schema in code, so can't confirm whether it works or not.
My current workaround is to stringify the JSON fields (in the connector) when serving them to the client, parse them on the client side, and do the reverse for writes. It's a little clunky, but I'm not really using GraphQL to query or selectively extract the properties within the JSON object. I suspect this wouldn't be optimal for large JSON objects.
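Roughly, that workaround looks like this (a sketch using the demo's bField name; the foo parent argument is illustrative, and it assumes the field is declared as String in the schema):
// server side: serve the JSON field as a string
bField: (foo) => JSON.stringify(foo.bField),
// client side: parse it back after the query returns
const bField = JSON.parse(data.foo.bField);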
If anyone else comes here from Google results, the solution for me was to add the JSON resolver as parameter to the makeExecutableSchema call. It's described here:
https://github.com/apollographql/apollo-test-utils/issues/28#issuecomment-377794825
That made the mocking work for me.
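Applied to the demo above, that amounts to passing the scalar implementations in alongside the other resolvers (a sketch; GraphQLJSON is the export of graphql-type-json the demo already requires):
const schema = makeExecutableSchema({
  typeDefs: schemaString,
  resolvers: {
    ...resolverFunctions,
    JSON: GraphQLJSON,                  // scalar resolver for the JSON type
    MyCustomScalar: myCustomScalarType  // scalar resolver for the custom scalar
  }
})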
When writing a Mocha test spec against an action creator how can I be certain what a timestamp will be if it is generated within the action creator?
It doesn't have to use Sinon, but I tried to make use of Sinon fake timers to "freeze time" and just can't seem to piece this together with my limited knowledge of stubbing and mocking. If this is considered a Redux anti-pattern, please point me in a better direction, but my understanding is that Redux action creators can be impure functions, unlike reducers.
Borrowing a little from the Redux "Writing Tests" recipes, here is the core of my problem as I understand it...
CommonUtils.js
import moment from 'moment';
export const getTimestamp = function () {
return moment().format();
};
TodoActions.js
import { getTimestamp } from '../../utils/CommonUtils';
export function addTodo(text) {
return {
type: 'ADD_TODO',
text,
timestamp: getTimestamp() // <-- This is the new property
};
};
TodoActions.spec.js
import expect from 'expect';
import * as actions from '../../actions/TodoActions';
import * as types from '../../constants/ActionTypes';
import { getTimestamp } from '../../utils/CommonUtils';
describe('actions', () => {
it('should create an action to add a todo', () => {
const text = 'Finish docs';
const timestamp = getTimestamp(); // <-- This will often be off by a few milliseconds
const expectedAction = {
type: types.ADD_TODO,
text,
timestamp
};
expect(actions.addTodo(text)).toEqual(expectedAction);
});
});
When testing time-dependent code, I have used this library successfully in the past: https://www.npmjs.com/package/timekeeper
Then, in a beforeEach and afterEach, you can freeze the time to something specific, make your assertions, and reset it back to normal afterwards.
import tk from 'timekeeper';

let time;
beforeEach(() => {
time = new Date(1451935054510); // 1/4/16
tk.freeze(time);
});
afterEach(() => {
tk.reset();
});
Now you can make assertions on what time is being returned. Does this make sense?
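With the clock frozen, getTimestamp() returns the same value inside the test and inside the action creator, so the original assertion holds (a sketch reusing the spec from the question):
it('should create an action to add a todo', () => {
  const text = 'Finish docs';
  const expectedAction = {
    type: types.ADD_TODO,
    text,
    timestamp: getTimestamp() // same frozen instant as the call inside addTodo()
  };
  expect(actions.addTodo(text)).toEqual(expectedAction);
});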
I would still love to see other answers but I finally got a reasonable solution. This answer uses proxyquire to override/replace the getTimestamp() method defined in CommonUtils when used by TodoActions for the duration of the test.
No modifications to CommonUtils.js or TodoActions.js from above:
TodoActions.spec.js
import expect from 'expect';
import proxyquire from 'proxyquire';
import * as types from '../../constants/ActionTypes';
const now = '2016-01-06T15:30:00-05:00';
const commonStub = {'getTimestamp': () => now};
const actions = proxyquire('../../actions/TodoActions', {
'../../utils/CommonUtils': commonStub
});
describe('actions', () => {
it('should create an action to add a todo', () => {
const text = 'Finish docs';
const timestamp = now; // <-- Use the variable defined above
const expectedAction = {
type: types.ADD_TODO,
text,
timestamp
};
expect(actions.addTodo(text)).toEqual(expectedAction);
});
});