I want to get data from Firestore COLLECTION A, which stores the IDs of users (createdBy and lastUpdateBy). The users' data (firstName, surName, ...) lives in COLLECTION B. When I fetch the data from Firestore and use orderBy on lastUpdateAt, the rows end up ordered first by USER (from the newest to the oldest) and only then by lastUpdateAt, instead of simply by lastUpdateAt as I want. Please, can you help me figure out what is wrong?
// Fetch data from Firestore
let refUsers = db.collection('users')
db.collection('campTypes').orderBy("lastUpdateAt", "asc")
  .onSnapshot((snapshot) => {
    snapshot.docChanges().forEach(change => {
      refUsers.doc(change.doc.data().createdBy).get()
        .then(createdByUser => {
          var cbUser = createdByUser.data()
          refUsers.doc(change.doc.data().lastUpdateBy).get()
            .then(updatedByUser => {
              var ubUser = updatedByUser.data()
              if (change.type === 'added') {
                this.campTypes.unshift({
                  id: change.doc.id,
                  name: change.doc.data().name,
                  status: change.doc.data().status,
                  createdAt: change.doc.data().createdAt,
                  createdBy: change.doc.data().createdBy,
                  createdByUser: cbUser,
                  lastUpdateAt: change.doc.data().lastUpdateAt,
                  lastUpdateBy: change.doc.data().lastUpdateBy,
                  lastUpdateByUser: ubUser
                })
              }
            })
        })
    })
  })
Image: Datatable, where the newest update isn't on the top
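A likely cause, though not confirmed here, is that the two nested .get() promises resolve in arbitrary order, so unshift inserts rows in whatever order the user lookups happen to finish rather than in the snapshot order. A minimal sketch, assuming the same collections and field names as above, that resolves both user documents before inserting and walks the changes sequentially:
// Sketch: keep the snapshot order by awaiting the user lookups before unshifting
db.collection('campTypes').orderBy("lastUpdateAt", "asc")
  .onSnapshot(async (snapshot) => {
    for (const change of snapshot.docChanges()) {
      if (change.type !== 'added') continue
      const data = change.doc.data()
      // Fetch both referenced user documents in parallel, but wait for them here
      const [createdByUser, updatedByUser] = await Promise.all([
        refUsers.doc(data.createdBy).get(),
        refUsers.doc(data.lastUpdateBy).get()
      ])
      this.campTypes.unshift({
        id: change.doc.id,
        ...data,
        createdByUser: createdByUser.data(),
        lastUpdateByUser: updatedByUser.data()
      })
    }
  })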
Let's say I have a field called user with data that looks something like this:
{
"id": "abc123",
"name": "John Smith"
}
I want to make a route where I can find blogs where user.id equals, say, abc123, and it should return the blogs that have a user with the id above.
I've tried doing:
async findByUser(ctx) {
  let blogs = await strapi.services.blogs.find({
    user: { id: ctx.params.id },
  });
  return blogs;
},
but that doesn't seem to work: it returns an empty array and isn't searching specifically in the id property. How do I do this using Strapi?
Edit: user is not a relation, it is an individual JSON field.
Okay, for querying a JSON object property, you will need to write a custom query. Look at the example below.
Implementation for PostgreSQL
async findByUser(ctx) {
  const response = await strapi
    .query('blogs')
    .model.query((qb) => {
      // '@>' is the Postgres JSONB containment operator
      qb.where('user', '@>', `{"id": "${ctx.params.id}"}`);
      // qb.where('user', '@>', `{"name": "${ctx.params.name}"}`);
    })
    .fetch();
  return response.toJSON();
},
Implementation for SQLite
async findByUser(ctx) {
  const response = await strapi
    .query('blogs')
    .model.query((qb) => {
      qb.where('user', 'LIKE', `%"id":"${ctx.params.id}"%`);
    })
    .fetch();
  return response.toJSON();
},
P.S.: Just use fetch instead of fetchAll for consistency.
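For completeness, a controller action like findByUser also needs a route entry before it can be called over HTTP. A sketch for a Strapi v3 routes file (api/blogs/config/routes.json), where the path is an assumption:
{
  "routes": [
    {
      "method": "GET",
      "path": "/blogs/by-user/:id",
      "handler": "blogs.findByUser",
      "config": {
        "policies": []
      }
    }
  ]
}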
Hi there, thanks to Salvino's help I think I was able to find a solution:
async findByUser(ctx) {
  const response = await strapi
    .query('blogs')
    .model.query((qb) => {
      qb.where('user', 'LIKE', `%"id":"${ctx.params.id}"%`);
    })
    .fetchAll();
  return response.toJSON();
},
I have an array of countries received from an Apollo backend without an ID field.
export const QUERY_GET_DELIVERY_COUNTRIES = gql`
  query getDeliveryCountries {
    deliveryCountries {
      order
      name
      daysToDelivery
      zoneId
      iso
      customsInfo
    }
  }
`
Schema of these objects:
{
  customsInfo: null,
  daysToDelivery: 6,
  iso: "UA",
  name: "Ukraine",
  order: 70,
  zoneId: 8,
  __typename: "DeliveryCountry"
}
In nested components I read these objects via client.readQuery.
What I want is to store the array in localStorage, read it on initial load, and write the data into the Apollo Client cache.
What I've already tried:
useEffect(() => {
  const deliveryCountries = JSON.parse(localStorage.getItem('deliveryCountries') || '[]')
  if (!deliveryCountries || !deliveryCountries.length) {
    getCountriesLazy()
  } else {
    deliveryCountries.map((c: DeliveryCountry) => {
      client.writeQuery({
        query: QUERY_GET_DELIVERY_COUNTRIES,
        data: {
          deliveryCountries: {
            __typename: "DeliveryCountry",
            order: c.order,
            name: c.name,
            daysToDelivery: c.daysToDelivery,
            zoneId: c.zoneId,
            iso: c.iso,
            customsInfo: c.customsInfo
          }
        }
      })
    })
  }
}, [])
But after executing the code above I have only one object in the countries cache. How can I write all the objects without an explicit ID? Or maybe I'm doing something wrong?
Lol. I just had to put the whole array into the field without iterating. writeQuery replaces all the data; it doesn't append anything "to the end".
client.writeQuery({
  query: QUERY_GET_DELIVERY_COUNTRIES,
  data: {
    deliveryCountries: deliveryCountries
  }
})
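Putting that together with the original effect, a sketch that keeps the names from the question:
useEffect(() => {
  const deliveryCountries = JSON.parse(localStorage.getItem('deliveryCountries') || '[]')
  if (!deliveryCountries.length) {
    // Nothing stored locally yet, fall back to the lazy query
    getCountriesLazy()
  } else {
    // Write the whole array in one call; writeQuery replaces the field's data
    client.writeQuery({
      query: QUERY_GET_DELIVERY_COUNTRIES,
      data: { deliveryCountries }
    })
  }
}, [])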
At the moment I have the following query:
return await this.siteRepository.find({
  where: [{ id: Like(`%${q}%`) }, { name: Like(`%${q}%`) }]
});
But I would like to pass the list of column names for the query from an array instead of writing each one manually.
const columns = ["id", "name", "lastName", "age"]
const query = {};
return await this.siteRepository.find({
  where: columns.map(column => {(query[`${column}`] = `Like("%${q}%")}`)})
});
Is this even possible? I'm starting to feel like it currently is not.
I didn't manage to accomplish what I wanted with the TypeORM Repository methods, but I did manage to do it with the QueryBuilder.
Here is my solution:
const res = ['id', 'name'].map(item => `${item} LIKE :q`).join(' OR ');

return await this.siteRepository
  .createQueryBuilder()
  .where(res, { q: `%${q}%` })
  .getMany();
Which yields a query of
SELECT `User`.`id` AS `User_id`, `User`.`name` AS `User_name`, `User`.`lastName` AS `User_lastName`, `User`.`active` AS `User_active` FROM `user` `User` WHERE name LIKE ? OR id LIKE ?
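The same idea works with the columns array from the question, assuming every entry is a real column on the entity:
const columns = ['id', 'name', 'lastName', 'age'];
// Build "id LIKE :q OR name LIKE :q OR ..." from the array
const res = columns.map(column => `${column} LIKE :q`).join(' OR ');

return await this.siteRepository
  .createQueryBuilder()
  .where(res, { q: `%${q}%` })
  .getMany();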
I am using apollo-datasource-rest with apollo-server-lambda and trying to figure out how to map a query to a specific resolver. I have the following schema, in which the plan query is supposed to return a list of users, and that list should be driven by the users query rather than the user query.
type Query {
  user(user_id: String, username: String): User
  users(user_ids: [String!]): [User!]
  plan(plan_id: String): Plan
}

type User {
  id: String
  username: String
  first: String
  last: String
  image: String
}

type Plan {
  title: String
  image: String
  attending: [User]
}
The data source reducer used by the plan resolver is as follows:
planReducer(data) {
  return {
    image: data.public.info.image,
    title: data.public.info.title,
    attending: Object.keys(data.public.attending)
  }
}
Object.keys(data.public.attending) in the planReducer returns an array of user_ids that I would then like to feed into my users query rather than my user query.
These are my current resolvers:
user: (_, { username }, { dataSources }) =>
  dataSources.userData.getUserByUsername({ username: username }),

users: async (_, { user_ids }, { dataSources }) => {
  const usersArray = await dataSources.userData.getUsersByIds({ userIds: user_ids })
  return usersArray
},

plan: async (_, { plan_id }, { dataSources }) => {
  return dataSources.planData.getPlanById({ planId: plan_id })
}
Your resolver map should look like the one below:
const resolvers = {
  Query: {
    plan: async (_parent, { plan_id: planId }, { dataSources }) => (
      dataSources.planData.getPlanById({ planId })
    )
  },
  Plan: {
    attending: async ({ attending: userIds }, _args, { dataSources }) => (
      dataSources.userData.getUsersByIds({ userIds })
    )
  }
}
Every key within Query should be a resolver that corresponds to a query defined within the root Query type of your schema. Other top-level keys of the resolver map, in this case Plan, are used to resolve fields on the corresponding type when it is returned from a resolver such as plan.
If resolvers are not defined, GraphQL will fall back to a default resolver which in this case looks like:
const resolvers = {
  Plan: {
    title: (parent) => parent.title,
    image: (parent) => parent.image,
  }
}
By specifying a custom resolver, you are able to compute fields to return to your clients based on the return value of parent resolvers.
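For example, a hypothetical computed field (not part of the schema above, and it would also have to be added there) could be derived from the parent plan's data:
const resolvers = {
  Plan: {
    // attendeeCount is illustrative only: it counts the user_ids on the parent plan
    attendeeCount: (parent) => parent.attending.length
  }
}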
Let's say I have a "category" table. Each category has associated data in the "data" table, which in turn has associated data in another table, "associated", and I want to remove a category with all of its associated data.
What I'm currently doing is something like this:
getAllDataIdsFromCategory()
  .then(removeAllAssociated)
  .then(handleChanges)
  .then(removeDatas)
  .then(handleChanges)
  .then(removeCategory)
  .then(handleChanges);
Is there a way to chain these queries on the db-side?
My functions currently look like this:
var getAllDataIdsFromCategory = () => {
  return r
    .table('data')
    .getAll(categoryId, { index: 'categoryId' })
    .pluck('id').map(r.row('id')).run();
}

var removeAllAssociated = (_dataIds: string[]) => {
  dataIds = _dataIds;
  return r
    .table('associated')
    .getAll(dataIds, { index: 'dataId' })
    .delete()
    .run()
}

var removeDatas = () => {
  return r
    .table('data')
    .getAll(dataIds)
    .delete()
    .run()
}
Notice that I cannot use r.expr() or r.do(), since I want to run queries based on the result of the previous query.
The problem with my approach is that it won't work for large amounts of "data", since I have to bring all of the ids to the client side, and paging through them in a client-side loop feels like a workaround.
You can use forEach for this:
r.table('data').getAll(categoryId, {index: 'categoryId'})('id').forEach(function(id) {
  return r.table('associated').getAll(id, {index: 'dataId'}).delete().merge(
    r.table('data').get(id).delete())
});
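The category row itself is not covered by the query above; presumably it still needs its own delete once the associated and data rows are gone, assuming the table is literally named category:
// Run after the forEach above has removed the associated and data rows
r.table('category').get(categoryId).delete().run();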