How do you ignore Joomla Form Field SQL limit - joomla

When you use the standard Joomla SQL form field, there appears to be a limit on the number of entries it can return (somewhere around 5,000, I found). I discovered this when I populated a select list with 30k+ entries: it wouldn't work unless I added something like LIMIT 0, 5000 to the query.
Is creating a custom form field the only solution or is there some other way to stop this from happening? Also, does anyone know where and why this limit exists?

Figured it out guys...
https://stackoverflow.com/a/22330536/1985305
The limit wasn't being imposed by Joomla; it was a browser limit. Chrome/Firefox have a maximum of around 10k options in select boxes. I was using two selects with ~4k and ~5k options each and maxing it out, shoot.
EDIT:
Nope, just kidding. I did it again a different way and managed to get 26k out, but I can't reach the 34k I need to fit all my items.
I'm no longer using the Joomla form field; I'm using Joomla's DB methods instead. Any ideas?
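For reference, a minimal sketch of what "Joomla DB methods" can look like here, fetching the options in chunks so no single query needs a huge LIMIT. The table and field names (#__mytable, id, title) are placeholders, not my real schema:

    // Build the base query with Joomla's database API instead of the SQL form field.
    $db    = JFactory::getDbo();
    $query = $db->getQuery(true)
        ->select($db->quoteName(array('id', 'title')))
        ->from($db->quoteName('#__mytable'))
        ->order($db->quoteName('title') . ' ASC');

    // Fetch in chunks of 5000 so no single result set has to hold all ~34k rows.
    $options = array();
    $offset  = 0;
    $chunk   = 5000;

    do {
        $db->setQuery($query, $offset, $chunk); // adds LIMIT $offset, $chunk
        $rows = $db->loadObjectList();

        foreach ($rows as $row) {
            $options[] = JHtml::_('select.option', $row->id, $row->title);
        }

        $offset += $chunk;
    } while (count($rows) === $chunk);

That builds the full option list server-side; whether the browser can actually render a select that large is, as noted above, a separate problem.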

Related

How to perform a "from" in Elasticsearch scroll context?

I have a large dataset to query and display on a website as a table.
I built a pagination system on top of a scroll query, but I can only display a maximum of 100 items at a time, so I run into problems when I want to display page 200 and beyond: I have to scroll all the way to that page, and it takes too long.
I have checked the rest of my code and found no other performance issues; it's just the scroll queries that make my API calls so slow. I tried raising the request size from 100 to 10,000, but it doesn't change anything.
I don't think sliced scroll is the solution, or else I haven't understood how it works.
I'm desperately looking for a way to skip the scroll pages that come before the data I'm after, even if the method isn't precise.
Hoping someone has a solution or at least a clue.
Edit:
More details about what I'm trying to achieve.
I log some of my users' actions, such as calls, in Elasticsearch indices. They perform millions of actions per month, so Elasticsearch seems like a good option for storing them, especially since I never have to update them after they are stored.
I'm creating a page where my users can search for actions they've performed, but they build the "query" themselves: they can select the period and many other parameters, order by many parameters, etc. The number of results can be 1 or 100,000 items, but I can't show 100,000 items on my page for UI reasons, so I have to handle pagination and send only part of the results to the page.
For now I use a scroll query with a size of 1,000 and scroll until I reach the current page of my pagination. I tried varying the size, but it's inconclusive because I can't know the number of results before the query is made.
And the deeper my users go in the pagination, the longer the queries take.
I could raise index.max_result_window to some unreachable number (though I don't know what that implies), use a simple query with from for pagination and a second scroll query for the export case, but I wonder if there is a way to skip steps in a scroll when I know I'm only going to take 100 items after the 1,000,000th item?
Edit: I looked at how Google designs its pagination and noticed that you can't go deep into the search results except step by step. You can't jump directly to the 500th page.
This is how I did mine.
So I just redesigned my pagination to do the same as Google and force my users to use more precise filters to get fewer results. Thank you @Val for getting me to ask the right questions :)
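For anyone finding this later: step-by-step pagination like the above maps naturally onto Elasticsearch's search_after parameter, which avoids deep scrolling entirely. Below is a rough sketch using the elasticsearch-php 7.x client; the index name (user-actions), the query, and the action_id tiebreaker field are illustrative assumptions, not the poster's actual setup.

    require 'vendor/autoload.php';
    session_start();

    $client = Elasticsearch\ClientBuilder::create()->build();

    // 'sort' values of the last hit of the previous page, kept in the session
    // between requests; null on the first page.
    $lastSort = $_SESSION['last_sort'] ?? null;

    $body = [
        'size'  => 100,
        'query' => ['term' => ['user_id' => 42]],
        // search_after needs a deterministic sort with a unique tiebreaker field.
        'sort'  => [
            ['timestamp' => 'desc'],
            ['action_id' => 'desc'],
        ],
    ];

    if ($lastSort !== null) {
        $body['search_after'] = $lastSort;
    }

    $response = $client->search(['index' => 'user-actions', 'body' => $body]);
    $hits     = $response['hits']['hits'];

    // Remember where this page ended so the next request can continue from it.
    $_SESSION['last_sort'] = empty($hits) ? null : end($hits)['sort'];

Each request stays fast no matter how deep the user goes, at the cost of only being able to move forward one page at a time, which is exactly the Google-style behaviour described above.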

Smart pagination algorithm that works with local data cache

This is a problem I have been thinking about for a long time but I haven't written any code yet because I first want to solve some general problems I am struggling with. This is the main one.
Background
A single page web application makes requests for data to some remote API (which is under our control). It then stores this data in a local cache and serves pages from there. Ideally, the app remains fully functional when offline, including the ability to create new objects.
Constraints
Assume a server-side database of products containing ~50,000 products (50 MB)
Assume no specific DB type; we interact with it via a REST/GraphQL interface
Assume a single product record is < 1 kB
Assume a max payload for a result set of 256 kB
Assume max 5 MB storage on the client
Assume search result sets ranging between 0 ... 5000 items per search
Challenge
The challenge is to define a stateless but (network) efficient way to fetch pages from a result set, so that it is deterministic which results we will get.
Example
In traditional paging, when getting the next 100 results for some query using this url:
https://example.com/products?category=shoes&firstResult=100&pageSize=100
the search result may look like this:
{
  "totalResults": 2458,
  "firstResult": 100,
  "pageSize": 100,
  "results": [
    {"some": "item"},
    {"some": "other item"},
    // 98 more ...
  ]
}
The problem with this is that, based on this information alone, there is no way to get exactly the objects that are on a certain page, because by the time we request the next page the result set may have changed (due to changes in the DB), influencing which items are part of it. Even a small change can have a big impact: one item removed from the DB that happened to be on page 0 of the result set will change what we get when requesting all subsequent pages.
Goal
I am looking for a mechanism that makes the definition of the result set independent of future database changes, so that if someone searched for shoes and got a result set of 2,458 items, they could reliably fetch all pages of that result set even if the DB changed afterwards (for this purpose I plan not to actually delete items, but to set a removed flag on them).
Ideas so far
I have seen a solution where the result set included a "pages" property, which was an array with the first and last id of the items in that page. Assuming your IDs keep going up in number and you don't really delete items from the DB ever, the number of items between two IDs is constant. Meaning the app could get all items between those two IDs and always get the exact same items back. The problem with this solution is that it only works if the list is sorted in ID order... I need custom sorting options.
The only way I have come up with for now is to just send a list of all IDs in the result set... That way pages can be fetched by doing a SELECT * FROM products WHERE id IN (3,4,6,9,...)... but this feels rather inelegant...
Anyway, I hope this is not too broad or theoretical. I have a web-based DB, just no good idea of how to do paging with it. I am looking for answers that point me in a direction to learn, not full solutions.
Versioning the DB is the answer for result-set consistency.
Each record has a primary id, a modification counter (version number), and a timestamp of modification/creation. Instead of modifying record r, you add a new record with the same id, version number + 1, and sysdate as the modification time.
In the fetch response you include the DB request_time (do not use a client timestamp, because of possible clock differences between client and server). The first page is served normally, but you return sysdate as request_time. Subsequent pages are served differently: you add a condition like modification_time <= request_time for each versioned table.
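A rough sketch of what a page query could look like under that scheme, using PDO against a hypothetical products(id, version, modified_at, ...) table (the column names and sort order are assumptions, not part of the answer):

    // Page 2 (rows 100-199) of a result set "as of" $requestTime. Rows created or
    // re-versioned after $requestTime are excluded, so the result set stays stable
    // across page requests.
    $sql = "
        SELECT p.*
        FROM products p
        JOIN (
            SELECT id, MAX(version) AS version
            FROM products
            WHERE modified_at <= :request_time
            GROUP BY id
        ) latest ON latest.id = p.id AND latest.version = p.version
        ORDER BY p.name
        LIMIT :page_size OFFSET :first_result
    ";

    $stmt = $pdo->prepare($sql);
    $stmt->bindValue(':request_time', $requestTime);       // returned with page 1
    $stmt->bindValue(':page_size', 100, PDO::PARAM_INT);
    $stmt->bindValue(':first_result', 100, PDO::PARAM_INT);
    $stmt->execute();
    $page = $stmt->fetchAll(PDO::FETCH_ASSOC);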
You can cache the result set of IDs on the server side when a query arrives for the first time and return a unique ID to the frontend. This unique ID corresponds to the result set for that query, so the frontend can request something like next_page with the unique ID it received when it first made the query. You should still go ahead with your approach of changing the DELETE operation to a removed flag, because it ensures that none of the entries in the result set is deleted. You can discard the query's result set from the cache when the frontend reaches the end of the result set, or you can set a time limit on the lifetime of the cache entry.
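A sketch of that caching idea, assuming a phpredis-style client and the products table from the question (the key names, query, and one-hour TTL are illustrative):

    // First request: run the query once, cache the ordered ID list, return a token.
    $ids = $pdo->query("SELECT id FROM products WHERE category = 'shoes' ORDER BY name")
               ->fetchAll(PDO::FETCH_COLUMN);
    $token = bin2hex(random_bytes(16));
    $redis->setex("resultset:$token", 3600, json_encode($ids)); // 1 hour lifetime

    // Later page requests: slice the cached ID list and load just those rows.
    $ids  = json_decode($redis->get("resultset:$token"), true);
    $page = array_slice($ids, $firstResult, $pageSize);
    $in   = implode(',', array_map('intval', $page));
    $rows = $pdo->query("SELECT * FROM products WHERE id IN ($in)")
                ->fetchAll(PDO::FETCH_ASSOC);
    // Note: re-order $rows to match $page (e.g. ORDER BY FIELD(id, ...) on MySQL).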

Grafana Templating

I am wondering if there is a way to limit the number of rows generated by Grafana templating.
I have a drop-down variable "$x" on my Grafana page, and in the row editor I can choose to repeat the row for every value of $x.
Depending on the criteria, $x can produce anywhere between 1 and around 160 rows, but I only need to look at about 10 at a time. Is there somewhere in Grafana where I can control or change the number of rows shown?
Of course I can manually select items from the $x drop-down to show only a few, but what I want is for only, say, 10 items to be selected right when the page loads.
Please let me know if I am not describing the problem correctly or if I need to clarify more.
Thanks,
Karan
As far as I know there is no direct option for this, but there are some ways you may be able to achieve what you want.
You could select your ~10 default entries and then save the dashboard; this stores the selected values in the dashboard's JSON (or you can modify the dashboard's JSON directly).
You could use the regex field in the template settings to filter some of your values and split them into groups this way (one variable per regex group).
You could change your data in Elasticsearch so that there are multiple fields you can split on.
See PR #5616, as @Daniel Lee mentioned.
In general, I think you'd get a faster response to this from the Grafana project directly on GitHub.

RSS functioning problem

I need to create an RSS feed for our information system, which is written in PHP.
I had no problems with the RSS 2.0 specification, nor with the creation of RSS feed generator. Items for the feed are to be fetched from a large table containing lots of records, so it will take a lot of time to get all the necessary information from this table. Therefore, it is necessary to implement the following scheme:
To show the 5 latest items to new subscribers.
For existing subscribers, to show only those items which have been added since their last view of the feed.
I have no problems with the first condition: I can simply use the LIMIT clause to limit the number of fetched rows. Something like this:
$items = function_select("SELECT * FROM some_table ORDER BY date DESC LIMIT 5");
But this creates the following problem. Suppose there are feed subscribers who have already read items 1 through 10. While they are away for some period of time, new items are created; say, 10 new items.
On their next check-in we want them to see all 10 new items, but they will see only the last 5 (items 16 through 20), not all 10; items 11 through 15 will be omitted.
I suppose that to solve this, some kind of flag should be sent with the feed request, for example the pubDate of the last fetched item. Twitter's feed uses something similar; however, that link is hand-made. How could it be done another way?
Please let me know if you have any ideas. If you have any example ready (no matter in what language) just share a link with me. I would appreciate it greatly.
Thank you in advance.
Standard RSS feeds don't render different content to different users. They simply always provide the most recent few items (often 10), and rely on the RSS reader to poll them often enough that they don't miss any updates. Unless you have a particularly compelling reason not to do this, this is the simplest and most effective mechanism.
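A minimal sketch of that approach in PHP: one feed, always the most recent 10 items, the same for every subscriber. It reuses the question's function_select() helper and assumes the rows come back as associative arrays with title, link, description and date columns (those column names are assumptions):

    // Serve a single static RSS 2.0 feed with the latest 10 items.
    header('Content-Type: application/rss+xml; charset=utf-8');

    $items = function_select("SELECT * FROM some_table ORDER BY date DESC LIMIT 10");

    echo '<?xml version="1.0" encoding="UTF-8"?>' . "\n";
    echo '<rss version="2.0"><channel>';
    echo '<title>My feed</title>';
    echo '<link>https://example.com/</link>';
    echo '<description>Latest items</description>';

    foreach ($items as $item) {
        echo '<item>';
        echo '<title>' . htmlspecialchars($item['title']) . '</title>';
        echo '<link>' . htmlspecialchars($item['link']) . '</link>';
        echo '<description>' . htmlspecialchars($item['description']) . '</description>';
        // pubDate (and guid, if you add one) is what lets the reader decide
        // which items it has already seen; the server keeps no per-user state.
        echo '<pubDate>' . date(DATE_RSS, strtotime($item['date'])) . '</pubDate>';
        echo '</item>';
    }

    echo '</channel></rss>';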

JQgrid want an ALL option on the navigator

I am working with jqGrid. I have a problem with the navigator's rowList.
My value for this option looks like this: rowList: ['25','50','75','-1'].
I am using -1 to display all the records. In the row list, I would like to show "All" instead of "-1" for displaying all the records. Is there any possibility to do this?
Thanks in advance.
Regards,
Phani Kumar
There is no All value for rowNum or in the rowList array. You should also not use -1 (see the documentation on rowNum, for example) or other negative values. The value -1 was allowed in some cases in old releases of jqGrid, but it is no longer supported in recent releases. Values like 1000 or 10000 are good enough, depending on your requirements.
Having a realistic restriction on the number of displayed rows is a good idea independent of jqGrid's support for it. No real user is able to examine 1000 rows of information without paging, filtering (searching), or at least scrolling the data in the web browser. So displaying about 50 or 100 rows as a maximum is what I would recommend. Look at another old answer for more details and see the included examples.
