My data model is a list with items. Very simple:
{
_id: 1,
name: "List 1",
items: [
{ _id: 2, text: "Item text 1" },
{ _id: 3, text: "Item text 2" }
]
}
Adding a new list with optimistic response works perfectly:
const [addListMutation] = useAddListMutation({
  update: (cache, { data }) => {
    const cachedLists =
      (cache.readQuery<GetAllListsQuery>({
        query: GetAllListsDocument,
      })?.lists as TList[]) ?? [];
    if (data) {
      cache.writeQuery({
        query: GetAllListsDocument,
        data: {
          lists: [...cachedLists, data?.list as TList],
        },
      });
    }
  },
});
const addList = async (name: string) => {
  const list = {
    _id: ..new id here,
    name,
    items: [],
  };
  const variables: AddListMutationVariables = {
    data: list,
  };
  await addListMutation({
    variables,
    optimisticResponse: {
      list,
    },
  });
};
This is reflected immediately in my component, which uses const { loading, data } = useGetAllListsQuery();. data is updated twice: first with the optimistic response and then again after the mutation is done. Just as expected.
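For context, the consuming component is nothing fancy; roughly like this (a simplified sketch, not my exact component):

const Lists = () => {
  const { loading, data } = useGetAllListsQuery();
  if (loading) return <p>Loading…</p>;
  return (
    <ul>
      {data?.lists.map((list) => (
        <li key={list._id}>
          {list.name} – {list.items.length} item(s)
        </li>
      ))}
    </ul>
  );
};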
Now I'm trying to add an item to the list this way:
const [updateListMutation] = useUpdateListMutation({
  update: (cache, { data }) => {
    const cachedLists =
      (cache.readQuery<GetAllListsQuery>({
        query: GetAllListsDocument,
      })?.lists as TList[]) ?? [];
    if (data?.list) {
      // Find existing list to update
      const updatedList = data?.list as TList;
      const updatedListIndex = cachedLists.findIndex(
        (list: TList) => list._id === updatedList._id,
      );
      // Create a copy of cached lists and replace entire list
      // with new list from { data }.
      const updatedLists = [...cachedLists];
      updatedLists[updatedListIndex] = { ...updatedList };
      cache.writeQuery({
        query: GetAllListsDocument,
        data: {
          lists: updatedLists,
        },
      });
    }
  },
});
const updateList = async (updatedList: TList) => {
  const variables: UpdateListMutationVariables = {
    query: {
      _id: updatedList._id,
    },
    set: updatedList,
  };
  await updateListMutation({
    variables,
    optimisticResponse: {
      list: updatedList,
    },
  });
};
const addListItem = async (list: TList, text: string) => {
  const updatedList = R.clone(list);
  updatedList.items.push({
    _id: ...new item id here,
    text, // e.g. 'My new list item'
  });
  await updateList(updatedList);
};
The problem is in my component: const { loading, data } = useGetAllListsQuery(); does not return what I expect. When data first changes with the optimistic response, it contains an empty list item:
{
_id: 1,
name: "List 1",
items: [{}]
}
Only after the mutation response returns does the items array get populated with the item with text 'My new list item'. So my component only updates once the mutation is finished, not with the optimistic response, because Apollo apparently can't figure out how to update the array. I don't know why.
(I have checked that the updatedLists array passed to writeQuery correctly contains the new item with text 'My new list item', so I am trying to write the correct data.)
Please let me know if you have any hints or solutions.
I've tried playing around with the cache (right now it's just initialized with the defaults: new InMemoryCache({})). I can see the cache is normalized into a bunch of List:1, List:2, ... and ListItem:3, ListItem:4, ... entries.
I tried disabling normalization so I only have List:{id} entries; that didn't help. I also tried adding __typename: 'ListItem' to the added item, but that only caused { data } in update for the optimistic response to be undefined. I have spent hours on this now; what I'm trying to do should be a fairly simple and common use case :).
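For reference, the __typename attempt looked roughly like this (a sketch; the typenames 'List'/'ListItem' match the cache keys above, and the root __typename: 'Mutation' is an assumption):

await updateListMutation({
  variables,
  optimisticResponse: {
    __typename: 'Mutation',
    list: {
      __typename: 'List',
      ...updatedList,
      items: updatedList.items.map((item) => ({
        __typename: 'ListItem',
        ...item,
      })),
    },
  },
});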
package.json
"#apollo/client": "^3.3.4",
"graphql": "^15.4.0",
"#graphql-codegen/typescript": "^1.19.0",
Related
I'm trying to create a page with a "load more" button and with a button that executes a mutation against a particular item of the list. This is the query:
const { data, fetchMore, networkStatus } =
useSearchItemsQuery({
notifyOnNetworkStatusChange: true,
variables,
});
this is the fetchMore:
fetchMore({
variables: {
offset: data.searchItems.nodes.length,
},
});
this is my apollo config:
return new ApolloClient({
  ...
  cache: new InMemoryCache({
    typePolicies: {
      Item: {
        keyFields: ["uuid"],
        merge: true,
      },
      Query: {
        fields: {
          searchItems: {
            keyArgs: false,
            merge(existing = [], incoming, { args }) {
              console.log("options", args);
              return deepmerge(existing, incoming, {
                arrayMerge: (destinationArray, sourceArray) => {
                  const refs = [...destinationArray, ...sourceArray].map(
                    (o) => o.__ref
                  );
                  const array = [...destinationArray, ...sourceArray].filter(
                    ({ __ref }, index) => !refs.includes(__ref, index + 1)
                  );
                  console.log("array", array);
                  return array;
                },
              });
            },
          },
        },
      },
    },
  }),
});
};
The fetchMore works correctly, but when I try to run a mutation I can see the changes on the screen; then the searchItems query is refetched and the list on the screen resets to the initial state from before the fetchMore.
Example:
open the page and 1 item appear
click fetchMore
a new item appear
run a mutation against the 1st item or the 2nd item
the list now contains only the 1st item
I can see in the network tab that after the mutation the initial query is executed again with the initial offset/limit.
I have records in Strapi and I am using the Strapi content API. In my front end, I need to display only 2 records, picked randomly. For limiting, I have used the limit query parameter from the content API, but which keyword do I need to use for random fetching? The official documentation doesn't provide any details regarding this - https://strapi.io/documentation/v3.x/content-api/parameters.html#available-operators
There's no official Strapi API parameter for random. You have to implement your own. Below is what I've done previously, using Strapi v3:
1 - Make a service function
File: api/mymodel/services/mymodel.js
This will contain our actual random query (SQL), and wrapping it in a service is handy because it can be used in many places (cron jobs, inside other models, etc).
module.exports = {
  serviceGetRandom() {
    return new Promise((resolve, reject) => {
      // There's a few ways to query data.
      // This example uses Knex.
      const knex = strapi.connections.default
      let query = knex('mydatatable')
      // Add more .select()'s if you want other fields
      query.select('id')
      // These rules enable us to get one random post
      query.orderByRaw('RAND()')
      query.limit(1)
      // Initiate the query and do stuff
      query
        .then(record => {
          console.log("getRandom() record: %O", record[0])
          resolve(record[0])
        })
        .catch(error => {
          reject(error)
        })
    })
  }
}
2 - Use the service somewhere, like a controller:
File: api/mymodel/controllers/mymodel.js
module.exports = {
  //(untested)
  getRandom: async (ctx) => {
    await strapi.services.mymodel.serviceGetRandom()
      .then(output => {
        console.log("getRandom output is %O", output.id)
        ctx.send({
          randomPost: output
        }, 200)
      })
      .catch(() => {
        ctx.send({
          message: 'Oops! Some error message'
        }, 204) // Place a proper error code here
      })
  }
}
3 - Create a route that points to this controller
File: api/mymodel/config/routes.json
...
{
  "method": "GET",
  "path": "/mymodelrandom",
  "handler": "mymodel.getRandom",
  "config": {
    "policies": []
  }
},
...
4 - In your front-end, access the route
(However you access your API)
e.g. ajax call to /api/mymodelrandom
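For example, with plain fetch it could look something like this (a sketch; adjust the URL prefix and auth to your setup):

// Hypothetical front-end call to the custom route above
const res = await fetch('/api/mymodelrandom');
const { randomPost } = await res.json();
console.log('random record:', randomPost);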
There is no API parameter for getting a random result, so handling it on the front end is the recommended solution for your question. You need to create a random request range and then get some random items from that range.
function getRandomInt(max) {
return Math.floor(Math.random() * Math.floor(max));
}
const firstID = getRandomInt(restaurants.length);
const secondID = getRandomInt(3);
const query = qs.stringify({
id_in:[firstID,secondID ]
});
// request query should be something like GET /restaurants?id_in=3&id_in=6
One way you can do this reliably is by two steps:
Get the total number of records
Fetch the number of records using _start and _limit parameters
// Untested code but you get the idea
// Returns a random number between min (inclusive) and max (exclusive)
function getRandomArbitrary(min, max) {
return Math.random() * (max - min) + min;
}
const { data: totalNumberPosts } = await axios.get('/posts/count');
// Fetch 20 posts
const _limit = 20;
// We need to be sure that we are not fetching less than 20 posts
// e.g. we only have 40 posts. We generate a random number that is 30.
// then we would start on 30 and would only fetch 10 posts (because we only have 40)
const _start = getRandomArbitrary(0, totalNumberPosts - _limit);
const { data: randomPosts } = await axios.get('/posts', { params: { _limit, _start } })
The problem with this approach is that it requires two network requests but for my needs, this is not a problem.
This seems to work for me with the Strapi v4 REST API.
Controller (get 6 random entries):
"use strict";
/**
* artwork controller
*/
const { createCoreController } = require("#strapi/strapi").factories;
module.exports = createCoreController("api::artwork.artwork", ({ strapi }) => {
const numberOfEntries = 6;
return {
async random(ctx) {
const entries = await strapi.entityService.findMany(
"api::artwork.artwork",
{
populate: ["image", "pageHeading", "seo", "socialMedia", "artist"],
}
);
const randomEntries = [...entries].sort(() => 0.5 - Math.random());
ctx.body = randomEntries.slice(0, numberOfEntries);
},
};
});
Route
random.js
"use strict";
module.exports = {
routes: [
{
method: "GET",
path: "/artwork/random",
handler: "artwork.random",
config: {
auth: false,
},
},
],
};
API
http://localhost:1337/api/artwork/random
To match the default data structure of Strapi:
"use strict";
/**
* artwork controller
*/
const { createCoreController } = require("#strapi/strapi").factories;
module.exports = createCoreController("api::artwork.artwork", ({ strapi }) => {
const numberOfEntries = 6;
return {
async random(ctx) {
const entries = await strapi.entityService.findMany(
"api::artwork.artwork",
{
populate: ["image", "pageHeading", "seo", "socialMedia", "artist"],
}
);
const randomEntries = [...entries]
.sort(() => 0.5 - Math.random())
.slice(0, numberOfEntries);
const structureRandomEntries = {
data: randomEntries.map((entry) => {
return {
id: entry.id,
attributes: entry,
};
}),
};
ctx.body = structureRandomEntries;
},
};
});
There is also a random sort plugin.
https://www.npmjs.com/package/strapi-plugin-random-sort
This seems to work for me with Strapi v4.3.8 and GraphQL.
src/index.js
"use strict";
module.exports = {
register({ strapi }) {
const extensionService = strapi.service("plugin::graphql.extension");
const extension = ({ strapi }) => ({
typeDefs: `
type Query {
randomTestimonial: Testimonial
}
`,
resolvers: {
Query: {
randomTestimonial: async (parent, args) => {
const entries = await strapi.entityService.findMany(
"api::testimonial.testimonial"
);
const sanitizedRandomEntry =
entries[Math.floor(Math.random() * entries.length)];
return sanitizedRandomEntry;
},
},
},
resolversConfig: {
"Query.randomTestimonial": {
auth: false,
},
},
});
extensionService.use(extension);
},
bootstrap({ strapi }) {},
};
graphql query:
query GetRandomTestimonial {
randomTestimonial {
__typename
name
position
location
description
}
}
This generates a new random testimonial on route change/refresh, e.g. https://jungspooner.com/biography
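On the front end, something like this could re-run the query on every mount/refresh (a sketch assuming @apollo/client is the GraphQL client; fetchPolicy: "network-only" skips the cache so a fresh random entry is fetched each time):

import { gql, useQuery } from "@apollo/client";

const GET_RANDOM_TESTIMONIAL = gql`
  query GetRandomTestimonial {
    randomTestimonial {
      name
      position
      location
      description
    }
  }
`;

function RandomTestimonial() {
  // "network-only" skips the cache so each mount gets a new random entry
  const { data, loading } = useQuery(GET_RANDOM_TESTIMONIAL, {
    fetchPolicy: "network-only",
  });
  if (loading || !data?.randomTestimonial) return null;
  return <blockquote>{data.randomTestimonial.description}</blockquote>;
}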
I am trying to either add a new field into a returned payload or add a new field copying the contents of another field in the returned payload object. Here is my reducer code...
[actionTypes.GET_PAYTYPE_CONTRIBUTORS]: (state, action) => {
return {...state, paytypeContributors: { ...action.payload }, loadingPaytypeContributors: false, }
},
For each entry in action.payload.Items, I need to either change the field name ID to Id or add Id to the payload Items array with the same contents as the ID field has.
Here is where I tried to do this...
[actionTypes.GET_PAYTYPE_CONTRIBUTORS]: (state, action) => ({...state, paytypeContributors: { ...action.payload, Id: action.payload.Items.ID }, loadingPaytypeContributors: false}),
The payload is an object, Items inside that object is an array, and ID is a field on each element of the array. Any ideas on how to do this?
If I understood correctly, this would be my take on it:
const actionFunc = (state, action) => ({
  ...state,
  paytypeContributors: {
    ...action.payload,
    Items: action.payload.Items.map(Item => {
      const newItem = {
        ...Item,
        Id: Item.ID
      };
      delete newItem['ID'];
      return newItem;
    })
  },
  loadingPaytypeContributors: false
});
const state = {};
const action = {
  payload: {
    Items: [
      { ID: 1 },
      { ID: 2 },
      { ID: 3 },
      { ID: 4 }
    ]
  }
};
console.log(actionFunc(state, action));
I have this function which works:
export const tagsByLabel = async (params) => {
const findManyParams = {
where: { userId: userIdFromSession },
orderBy: { title: "asc" },
};
if (params) {
const { searchTerm } = params;
findManyParams.where.title = { contains: searchTerm };
}
console.log("findManyParams", findManyParams);
const tagsByLabelResult = await db.tag.findMany(findManyParams);
console.log("tagsByLabelResult", tagsByLabelResult);
return tagsByLabelResult;
};
If I search for 'mex', I see:
findManyParams {
where: { userId: 1, title: { contains: 'mex' } },
orderBy: { title: 'asc' }
}
tagsByLabelResult [
{
id: 9,
title: 'mex',
description: 'Mexican food',
userId: 1,
createdAt: 2020-05-03T22:16:09.134Z,
modifiedAt: 2020-05-03T22:16:09.134Z
}
]
And for an empty query, tagsByLabelResult contains all tag records.
How can I adjust my tagsByLabel function to aggregate (using "group by") the records and output a "count" for each record of tagsByLabelResult in order by count descending?
tagsByLabelResult [
{
id: 9,
title: 'mex',
description: 'Mexican food',
count: 25,
userId: 1,
createdAt: 2020-05-03T22:16:09.134Z,
modifiedAt: 2020-05-03T22:16:09.134Z
}
]
I see the docs example of prisma.user.count(), but that seems to retrieve a simple count of the result of the whole query rather than a count as a field with a "group by".
I'm using RedwoodJs, Prisma 2, Apollo, GraphQL.
As of now, groupBy support is still at the spec stage, so currently you are only able to use count with specific queries. As a workaround, you would have to use prisma.raw for the time being.
In my tags.sdl.js I needed to add:
type TagCount {
id: Int!
title: String!
count: Int!
principles: [Principle]
description: String
createdAt: DateTime!
modifiedAt: DateTime!
}
And change the query tagsByLabel(searchTerm: String): [Tag!]! to tagsByLabel(searchTerm: String): [TagCount!]!
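So the Query definition in the SDL ends up roughly like this:

type Query {
  tagsByLabel(searchTerm: String): [TagCount!]!
}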
In my TagsAutocomplete.js component, I now have:
export const TagsAutocomplete = ({ onChange, selectedOptions, closeMenuOnSelect }) => {
const state = {
isLoading: false,
};
const client = useApolloClient();
const promiseOptions = useCallback(
async (searchTerm) => {
try {
const { data } = await client.query({
query: QUERY_TAGS_BY_LABEL,
variables: { searchTerm },
});
console.log("promiseOptions data", data);
const tags = data.tags.map((tag) => {
if (!tag.label.includes("(")) {
//ONEDAY why does the count keep getting appended if this condition isn't checked here?
tag.label = tag.label + " (" + tag.count + ")";
}
return tag;
});
console.log("promiseOptions tags", tags);
return tags;
} catch (e) {
console.error("Error fetching tags", e);
}
},
[client]
);
};
And in my tags.js service, I now have:
export const tagsByLabel = async (params) => {
  let query = `
    SELECT t.*, COUNT(pt.B) as count FROM tag t LEFT JOIN _PrincipleToTag pt ON t.id = pt.B WHERE t.userId = ${userIdFromSession} `;
  if (params) {
    const { searchTerm } = params;
    if (searchTerm) {
      query += `AND t.title LIKE '%${searchTerm}%' `;
    }
  }
  query += "GROUP BY t.id ORDER BY count DESC, t.title ASC;";
  console.log("query", query);
  const tagsByLabelResult = await db.raw(query);
  //TODO get secure parameterization working
  console.log("tagsByLabelResult", tagsByLabelResult);
  return tagsByLabelResult;
};
But, as mentioned in the comment, I'm still trying to figure out how to get secure parameterization working.
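One direction I'm looking at (a sketch, assuming the Prisma client in this version exposes the tagged-template $queryRaw API; I haven't verified it in my Redwood setup yet):

// Hypothetical: values interpolated into the tagged template are sent as
// bound parameters instead of being concatenated into the SQL string.
const tagsByLabelResult = await db.$queryRaw`
  SELECT t.*, COUNT(pt.B) as count
  FROM tag t
  LEFT JOIN _PrincipleToTag pt ON t.id = pt.B
  WHERE t.userId = ${userIdFromSession}
    AND t.title LIKE ${'%' + (searchTerm || '') + '%'}
  GROUP BY t.id
  ORDER BY count DESC, t.title ASC`;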
"vue-rx": "^6.1.0",
"rxjs": "^6.4.0",
"vue": "^2.5.17",
I'm new to vue-rx and rxjs, but after seeing several rx demos I'm quite interested in them. So I want to use them in my project, which should post a request when the attribute num does not change anymore.
[
{
id: 0,
name: 'giftA',
num: 0 // turns into 1,2,3,4,5,... after running the `send({id: 0})` function 1,2,3,4,5,... times
},
{
id: 1,
name: 'giftB',
num: 0
},
...
]
And here is my solution:
Use $watchAsObservable to watch changes of sendCalledTimes, and then use mergeMap to post the request.
The variable sendCalledTimes is a number that is incremented (sendCalledTimes++) whenever the send function is called, and after posting the request it is reset with sendCalledTimes = 0.
That way $watchAsObservable('sendCalledTimes') (vue-rx) fires at most every three seconds, which reduces the number of requests in my project. But I think it's still not good, because it acts just like a timer and can't watch whether num of each object in the array changes. A good example should be like this search example.
data() {
return {
sendCalledTimes: 0,
giftArr: []
}
},
created() {
this.$watchAsObservable('sendCalledTimes').pipe(
pluck('newValue'),
filter(val => val > 0),
debounceTime(3000),
// if `sendCalledTimes` is the same number as previous
// will not execute follows
// distinctUntilChanged(),
mergeMap(
(val) => this.requestSendGift()
),
).subscribe(
(val) => { }
)
},
methods: {
send (obj) {
let pushFlag = true
for (const gift of this.giftArr) {
if (gift.id === obj.id) {
gift.num++
pushFlag = false
break
}
}
if (pushFlag) {
this.giftArr.push(obj)
}
// observable
this.sendCalledTimes++
},
async requestSendGift () {
for (const gift of this.giftArr) {
// example for post a request to store each gift
await axios({
data: gift,
type: 'post',
url: '...'
}).then(res => { ... })
}
// reset `this.sendCalledTimes`
this.sendCalledTimes = 0
}
}
Also, since vue-rx doesn't have many examples on GitHub, I need help creating a good subscription for this situation.
I have tried this, but failed:
data () {
return {
giftArr: []
}
},
subscriptions: {
test: from(this.giftArr) // console.log(this.$observables.test) throws an error: TypeError: Cannot read property 'giftArr' of undefined
},
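A variant I think might avoid the undefined this is declaring subscriptions as a function (a sketch based on the vue-rx docs; I haven't verified it in my project):

// (pluck comes from 'rxjs/operators')
subscriptions() {
  // In the function form `this` is the component instance,
  // so giftArr is reachable; { deep: true } watches nested num changes.
  return {
    test: this.$watchAsObservable('giftArr', { deep: true }).pipe(
      pluck('newValue')
    ),
  };
},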
It would be greatly appreciated if anyone could help me solve this.
It's a little unclear from your question exactly what you're trying to do, but I've created an example based on what I believe to be your intent.
I made some assumptions:
You have a 'gifts' array that represents all of the gifts that will ever exist.
You want to make updates to that array.
Every time you make an update to the array, you want to see the update in the form of an Observable emitting an event.
Use a Subject
I think what you want is a Subject.
const gift$ = new Subject();
Make it Emit on Updates
And you would set it up to emit every time you increment num or add a new gift.
function addGift(gift) {
gifts.push(gift);
gift$.next(gift);
}
function incrementGift(gift) {
gift.num++;
gift$.next(gift);
}
All together it could look something like this:
import { Subject } from 'rxjs';
const gift$ = new Subject();
const gifts = [{ id: 0, name: 'giftA', num: 0 }, { id: 1, name: 'giftB', num: 0 }];
function addGift(gift) {
gifts.push(gift);
gift$.next(gift);
}
function incrementGift(gift) {
gift.num++;
gift$.next(gift);
}
function sendGift(newGift) {
const currentGift = gifts.find(g => g.id === newGift.id);
currentGift ? incrementGift(currentGift) : addGift(newGift);
}
gift$.subscribe(update => {
console.log(gifts);
console.log(update);
});
// You'll see a log of 'gifts' and the updated gift for every 'sendGift' call.
sendGift({ id: 0 });
sendGift({ id: 3, name: 'giftC', num: 0 });
StackBlitz