Hopefully you guys will be able to help me out, please.
I can't get the pagination to work properly. It always counts the total documents and ignores the filter. For example, there are 24 documents in total, but when I filter by a particular item it returns one document yet still reports the total number of pages as 3 (since I have pageSize set to 9).
Please find my code below:
router.get('/', async (req, res) => {
  try {
    const pageSize = 9;
    const page = Number(req.query.page) || 1;
    let query;
    const queryObject = { ...req.query };
    const excludeFields = ['page', 'sort', 'limit', 'fields'];
    excludeFields.forEach((el) => delete queryObject[el]);
    let queryString = JSON.stringify(queryObject);
    queryString = queryString.replace(
      /\b(gte|gt|lte|lt)\b/g,
      (match) => `$${match}`
    );
    query = Vehicle.find(JSON.parse(queryString));
    if (req.query.sort) {
      const sortBy = req.query.sort.split(',').join(' ');
      query = query.sort(sortBy);
    } else {
      query = query.sort('-createdAt');
    }
    const count = await Vehicle.countDocuments();
    const vehicles = await Vehicle.find(JSON.parse(queryString))
      .limit(pageSize)
      .skip(pageSize * (page - 1));
    if (!vehicles) {
      return res.status(200).json({ success: true, data: [] });
    }
    res
      .status(200)
      .json({ vehicles, page, totalPages: Math.ceil(count / pageSize) });
  } catch (error) {
    console.error(error.message);
    res.status(500).send('Server Error');
  }
});
How would you also go about adding working sorting to this, please? For some reason sorting doesn't work at all for me.
Thanks very much, G.
To get the count to include only the filtered data, you need to pass the query filter to countDocuments:
const count = await Vehicle.countDocuments(JSON.parse(queryString));
To add sorting functionality you can append .sort() onto the end of your find query:
const vehicles = await Vehicle.find(JSON.parse(queryString))
  .limit(pageSize)
  .skip(pageSize * (page - 1))
  .sort(<your sort query>)
https://www.mongodb.com/community/forums/t/sorting-with-mongoose-and-mongodb/122573
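Putting both changes together, a minimal sketch of the corrected route handler could look like this (it reuses your existing filter-building code; the only differences are the filtered countDocuments call and running sort, skip and limit on a single find query instead of building a sorted query that is never executed):

router.get('/', async (req, res) => {
  try {
    const pageSize = 9;
    const page = Number(req.query.page) || 1;

    // Build the Mongo filter from the query string, as before
    const queryObject = { ...req.query };
    ['page', 'sort', 'limit', 'fields'].forEach((el) => delete queryObject[el]);
    const queryString = JSON.stringify(queryObject).replace(
      /\b(gte|gt|lte|lt)\b/g,
      (match) => `$${match}`
    );
    const filter = JSON.parse(queryString);

    // Count only the documents that match the filter
    const count = await Vehicle.countDocuments(filter);

    // Sort and paginate the same filtered query
    const sortBy = req.query.sort ? req.query.sort.split(',').join(' ') : '-createdAt';
    const vehicles = await Vehicle.find(filter)
      .sort(sortBy)
      .limit(pageSize)
      .skip(pageSize * (page - 1));

    res.status(200).json({ vehicles, page, totalPages: Math.ceil(count / pageSize) });
  } catch (error) {
    console.error(error.message);
    res.status(500).send('Server Error');
  }
});

With the filter passed to countDocuments, a filter that matches a single vehicle now reports totalPages: 1 instead of 3.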
I have a repeater that is connected to a dataset. The dataset is set to sort by a numerical ranking (high to low). When I preview, it works great. When I go live, I can see that it loads correctly and then reverts to sorting by date (newest to oldest). Any idea why the sort is getting overwritten?
import wixData from 'wix-data';
const collectionName = 'Projects';
const fieldToFilterByInCollection = 'tags';
$w.onReady(function () {
  setRepeatedItemsInRepeater(100);
  loadDataToRepeater(100);
  $w('#selectionTags1').onChange((event) => {
    const selectedTags = $w('#selectionTags1').value;
    loadDataToRepeater(selectedTags);
  })
});

function loadDataToRepeater(selectedCategories = []) {
  let dataQuery = wixData.query(collectionName);
  if (selectedCategories.length > 0) {
    dataQuery = dataQuery.hasSome(fieldToFilterByInCollection, selectedCategories);
  }
  dataQuery
    .find()
    .then(results => {
      const itemsReadyForRepeater = results.items;
      $w('#repeater1').data = itemsReadyForRepeater;
      const isRepeaterEmpty = itemsReadyForRepeater.length === 0
      if (isRepeaterEmpty) {
        $w('#container1').show();
      } else {
        $w('#container1').show();
      }
    })
}

function setRepeatedItemsInRepeater() {
  $w('#repeater1').onItemReady(($item, itemData) => {
    $item('#container1').src = itemData.projectimage;
    $item('#container1').tooltip = itemData.projectimage;
  })
}
The Wix Data Query function overrides the sort parameter and resets it to sort descending by created date. When you are building the query, before you call find, use the descending or ascending function to keep it sorted by your desired field.
wixData.query("collectionName").descending("fieldName").find()
I have the following code, which was working properly to execute a request via my method fetchDropdownDataByFederationId, but now I have a requirement to execute the same method x number of times.
fetchInProgress(queryString?): Observable<IPerson[]> {
  let PersonList: IPerson[] = [];
  return this.getItems<IPerson[]>('', queryString).pipe(
    take(1),
    switchMap((wls: IPerson[]) => {
      PersonList = [...wls];
      //const createdbyIds = [...new Set(wls.map((f) => f.createdBy))];
      return this.teamPageService.getInformation(wls.createdBy);
    }),
    map((teams: any) => {
      console.log('> teams', teams);
      for (let i = 0; i < PersonList.length; i++) {
        //update information
      }
      //console.log('> Final value: ', PersonList);
      return PersonList;
    })
  );
}
But I'm not finding a way to execute my switchMap x number of times and get the results back to use in my map operator to parse the information.
I just changed my switchMap to mergeMap, something like this:
mergeMap((wls: IWalklist[]) => {
  //let allIds = wls.contact.map(id => this.getSingleData(id._id));
  let drops: Dropdown[] = [];
  walklistList = [...wls];
  const allIds = [...new Set(wls.map((f) => f.createdBy))];
  return forkJoin(...allIds).pipe(
    map((idDataArray) => {
      drops.push(
        this.teamPageService.getInformation('')
      );
      return drops;
    })
  );
}),
But still no luck.
Can someone help me? How can I fix it?
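A minimal sketch of the pattern this seems to be reaching for: build one getInformation call per unique createdBy id and forkJoin them inside the switchMap, so the map operator receives all the results at once. This assumes teamPageService.getInformation(id) returns an Observable for a single id, and uses take, switchMap, map and forkJoin from 'rxjs' / 'rxjs/operators' as in the original snippet:

fetchInProgress(queryString?: string): Observable<IPerson[]> {
  let personList: IPerson[] = [];
  return this.getItems<IPerson[]>('', queryString).pipe(
    take(1),
    switchMap((wls: IPerson[]) => {
      personList = [...wls];
      // One request per unique createdBy id
      const createdByIds = [...new Set(wls.map((f) => f.createdBy))];
      const requests = createdByIds.map((id) => this.teamPageService.getInformation(id));
      // forkJoin emits a single array once every request has completed
      return forkJoin(requests);
    }),
    map((teams: any[]) => {
      // teams[i] is the result for createdByIds[i]; merge it back into personList here
      return personList;
    })
  );
}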
I want to increase the number of rows returned beyond the default of 10. Using the limit parameter does not seem to work.
I have tried passing the limit in several different ways, but I always receive only the default 10 rows. There should be 12 rows that match the filter criteria.
const filter = {
  or: [
    { firstname: { contains: searchValue } },
    { lastname: { contains: searchValue } },
    { emailaddress: { contains: searchValue } },
    { phone: { contains: searchValue } }
  ]
};
const limit = { limit: 50 };
// const limit = 50; // this does not work either
const result = await API.graphql(
  graphqlOperation(listProviders, {filter}, {limit})
);
I expect to receive the true number of rows that match the filter criteria, but I only receive 10 rows back. What am I doing incorrectly?
You were close.
const result = await API.graphql(
  graphqlOperation(listProviders, {filter}, {limit})
);
is the wrong syntax. graphqlOperation only takes two arguments: the query and a variables object. You want to put both keys onto that one variables object.
const limit = 50;
const result = await API.graphql(
  graphqlOperation(listProviders, {filter, limit})
);
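The matching rows then come back on the list query's items field. As a rough sketch (the result shape below follows the default Amplify codegen output for a listProviders query; adjust the field names if yours differ):

const providers = result.data.listProviders.items;
console.log(providers.length); // rows that match the filter, up to the given limit
// result.data.listProviders.nextToken can be passed back in as a nextToken variable to fetch further pages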
I currently have this situation:
In my service, MyService:
private users = ['user1', 'user2'];

//Generate list of requests to join
private getHttpList(): any[] {
  let gets = new Array();
  for (let index in this.users)
    gets.push(this.http.get('https://api.github.com/users/' + this.users[index]))
  return gets;
}
...
getList(): Observable<any[]> {
  return forkJoin(this.getHttpList())
}
And in my component, I do the subscribe:
this.MyService.getList().subscribe(results => {
  for (let res in results) {
    //...Do something here
    //...I want to do the GET to https://api.github.com/users/{user}/starred
  }
})
Suppose I only know the "starred URL" after getting the result of getList(). How can I make this part "synchronous", or what is the correct way to do this?
I tried doing it hardcoded, but the result is wrong, because "res" is an iterable:
this.MyService.getList().subscribe(results => {
  let url = 'https://api.github.com/users/';
  for (let res in results) { //This doesn't do the things "synchronously"
    this.http.get(url + res.login + '/starred').catch(err => {
      throw new Error(err.message);
    }).subscribe(starred_res => {
      //So we set the starred_list
      res.starred_list = starred_res
    })
  }
})
Thanks...
As I understand it, you want to get the starred list for every user.
The simplest way is to get all the starred lists and match them with the users result.
// Get users
this.MyService.getList().subscribe((results: any[]) => {
  const url = 'https://api.github.com/users/';
  // Create requests to get the starred list for every user
  const starredRequests = results.map(
    res => this.http.get('https://api.github.com/users/' + res.login + '/starred')
  );
  // Wait until all starred requests are done and map them onto the results array
  Observable.forkJoin(starredRequests).subscribe(starred => {
    results.forEach((res, index) => {
      res.starred_list = starred[index];
    });
    console.log(results);
  });
});
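If you would rather avoid the nested subscribe, roughly the same flow can be written with a single subscription by piping the user list through switchMap into the forkJoin. A sketch using the RxJS 6+ pipeable-operator style rather than the Observable.forkJoin form above (forkJoin comes from 'rxjs', switchMap and map from 'rxjs/operators'):

this.MyService.getList().pipe(
  switchMap((results: any[]) =>
    // Fire one starred request per user, then wait for all of them
    forkJoin(
      results.map(res => this.http.get('https://api.github.com/users/' + res.login + '/starred'))
    ).pipe(
      // Attach each starred list to its user
      map(starred => results.map((res, i) => ({ ...res, starred_list: starred[i] })))
    )
  )
).subscribe(usersWithStarred => console.log(usersWithStarred));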
I have an observable that emits arrays/lists of things.
And I have a use case where it is a pretty costly affair for the downstream consumer of this observable to have more items added to this list. So I'd like to slow down the rate of additions made to this list, but not lose any.
Something like an operator that takes this observable and returns another observable with the same signature, but whenever a new list is pushed and it has more items than last time, only one or a few items are added at a time.
So if the last push was a list with 3 items and the next push has 3 additional items (6 in total), and the batch size is 1, then this one push gets split into 3 individual pushes of lists with lengths 4, 5, and 6.
So additions are batched, and this way the consumer can more easily keep up with new additions to the list. The consumer doesn't have to stall for so long each time while processing additional items in the array/list, because the additions are split up and spread over batches of a configurable size.
I made an addAdditionalOnIdle operator that you can apply to any RxJS observable using the pipe operator. It takes a batchSize parameter, so you can configure the batch size. It also takes a dontBatchAfterThreshold, which stops batching of the list after a certain list size, which was useful for my purposes. The result also contains a morePending value, which you can use to show a loading indicator while you know more data is incoming.
The implementation uses the new requestIdleCallback function internally to schedule the batched pushes of additional items when there is idle time in the browser. This function is not available in IE or Safari yet, but I found this excellent polyfill for it, so you can use it today anyway: https://github.com/aFarkas/requestIdleCallback :)
See the implementation and example usage of addAdditionalOnIdle below:
const { NEVER, of, Observable } = rxjs;
const { concat } = rxjs.operators;

/**
 * addAdditionalOnIdle
 *
 * Only works on observables that produce values that are of type Array.
 * Adds additional elements on window.requestIdleCallback
 *
 * @param batchSize The amount of values that are added on each idle callback
 * @param dontBatchAfterThreshold Return all items after amount of returned items passes this threshold
 */
function addAdditionalOnIdle(
  batchSize = 1,
  dontBatchAfterThreshold = 22,
) {
  return (source) => {
    return Observable.create((observer) => {
      let idleCallback;
      let currentPushedItems = [];
      let lastItemsReceived = [];
      let sourceSubscription = source
        .subscribe({
          complete: () => {
            observer.complete();
          },
          error: (error) => {
            observer.error(error);
          },
          next: (items) => {
            lastItemsReceived = items;
            if (idleCallback) {
              return;
            }
            if (lastItemsReceived.length > currentPushedItems.length) {
              const idleCbFn = () => {
                if (currentPushedItems.length > lastItemsReceived.length) {
                  observer.next({
                    morePending: false,
                    value: lastItemsReceived,
                  });
                  idleCallback = undefined;
                  return;
                }
                const to = currentPushedItems.length + batchSize;
                const last = lastItemsReceived.length;
                if (currentPushedItems.length < dontBatchAfterThreshold) {
                  for (let i = 0; i < to && i < last; i++) {
                    currentPushedItems[i] = lastItemsReceived[i];
                  }
                } else {
                  currentPushedItems = lastItemsReceived;
                }
                if (currentPushedItems.length < lastItemsReceived.length) {
                  idleCallback = window.requestIdleCallback(() => {
                    idleCbFn();
                  });
                } else {
                  idleCallback = undefined;
                }
                observer.next({
                  morePending: currentPushedItems.length < lastItemsReceived.length,
                  value: currentPushedItems,
                });
              };
              idleCallback = window.requestIdleCallback(() => {
                idleCbFn();
              });
            } else {
              currentPushedItems = lastItemsReceived;
              observer.next({
                morePending: false,
                value: currentPushedItems,
              });
            }
          },
        });
      return () => {
        sourceSubscription.unsubscribe();
        sourceSubscription = undefined;
        lastItemsReceived = undefined;
        currentPushedItems = undefined;
        if (idleCallback) {
          window.cancelIdleCallback(idleCallback);
          idleCallback = undefined;
        }
      };
    });
  };
}

function sleep(milliseconds) {
  var start = new Date().getTime();
  for (var i = 0; i < 1e7; i++) {
    if ((new Date().getTime() - start) > milliseconds) {
      break;
    }
  }
}

let testSource = of(
  [1, 2, 3],
  [1, 2, 3, 4, 5, 6],
).pipe(
  concat(NEVER)
);

testSource
  .pipe(addAdditionalOnIdle(2))
  .subscribe((list) => {
    // Simulate a slow synchronous consumer with a busy loop sleep implementation
    sleep(1000);
    document.body.innerHTML += "<p>" + list.value + "</p>";
  });

<script src="https://unpkg.com/rxjs@6.5.3/bundles/rxjs.umd.js"></script>