Wix Randomize Items in a gallery - velo

I'm trying to get a gallery of items to display randomly each time the page is opened. So far, I have:
set up a "collection" of items to display. This works.
set up some code to randomize the items in the gallery. The code randomizes the items correctly.
What I'm having trouble with is re-displaying the items in the gallery in the random order. According to the Wix docs (https://www.wix.com/velo/reference/$w/gallery/items) I should be able to assign the items to the shuffled array, but that's not working.
I've borrowed some of the code from here: Randomize dataset in a Wix repeater
But, since that's talking about a repeater and not a gallery, it's not quite the same, and I want to display all the items, not just one at a time.
import wixData from 'wix-data';

$w.onReady(function () {
    initiate();
});

function initiate() {
    wixData.query("ElectricityRegulators")
        .find()
        .then((results) => {
            let array = results.items;
            let random = shuffle(array);
            $w("#gallery1").data = random;
        });
}

function shuffle(array) {
    let currentIndex = array.length, temporaryValue, randomIndex;
    while (0 !== currentIndex) {
        randomIndex = Math.floor(Math.random() * currentIndex);
        currentIndex -= 1;
        temporaryValue = array[currentIndex];
        array[currentIndex] = array[randomIndex];
        array[randomIndex] = temporaryValue;
    }
    return array;
}
What am I missing to get the gallery to display the randomized array of items?
TIA.
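For reference, the direction the linked docs point to is the gallery's items property rather than data. A minimal sketch of that approach, assuming the collection rows carry an image field plus title and description (the field names are assumptions about the schema, not taken from the question):
$w.onReady(async function () {
    const results = await wixData.query("ElectricityRegulators").find();
    const shuffled = shuffle(results.items);
    // Galleries take an array of gallery items; each item needs at least a src.
    // The row field names below are assumed; adjust them to the collection's schema.
    $w("#gallery1").items = shuffled.map(row => ({
        src: row.image,
        title: row.title,
        description: row.description
    }));
});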

Related

Unique count of values in a dimension using crossfilter (dc.js) [duplicate]

I am stuck at a unique problem involving dc.js and crossfilter. I have some data which I need to display using number charts powered by dc.js. However, I found minimal documentation for the number charts, hence this question.
Here is the JSFiddle for what I have conceptualized so far.
I basically want to show the unique project count in box 1, which in this case would be 3, the unique place count in box 2, which in this case would be 11, and the screen failure rate, which would be 2/15*100, i.e. roughly 13.3%.
Currently I have made this work using jQuery, but that's just a hack. I would like to have these number charts driven by crossfilter aggregation so that I can drill down into the data.
I have come across examples of reductions to calculate counts, but they were for bar charts; for a number chart we need a value accessor for displaying the data.
Can someone help me out, please?
PS:
Here is the jQuery code that I wrote. I don't know whether it is helpful.
$(document).ready(function() {
    var baseURL = window.location.origin;
    $.ajax({
        url: baseURL + '/api/PlaceTable',
        type: 'GET',
        data: {},
        async: true,
        dataType: "json",
        success: function(response) {
            // Project count
            var projectIdCount = [];
            for (var i = 0; i < response.length; i++) {
                if (response[i].Project != undefined) {
                    if ($.inArray(response[i].Project, projectIdCount) === -1) {
                        projectIdCount.push(response[i].Project);
                    }
                }
            }
            $('#number-box1').text(projectIdCount.length);
            // Place count
            var placeIdCount = [];
            for (var i = 0; i < response.length; i++) {
                if (response[i].Place != undefined) {
                    if ($.inArray(response[i].Place, placeIdCount) === -1) {
                        placeIdCount.push(response[i].Place);
                    }
                }
            }
And for displaying a running sum of a column containing binary values, I used this code, which worked in the number chart:
numberChart
    .valueAccessor(function(x) {
        return +flag.groupAll().reduceCount().reduceSum(function(d) { return d.Flag; }).value();
    })
    .group(ndx.groupAll());
The failure percentage calculation is a separate problem which I think you've asked elsewhere. To get the unique count, it is pretty easy to make a "fake groupAll" which returns the number of unique keys in its value method.
We'll also need to filter out the empty bins since crossfilter doesn't do that automatically.
function bin_counter(group) {
    return {
        value: function() {
            return group.all().filter(function(kv) {
                return kv.value > 0;
            }).length;
        }
    };
}

var projectGroup = project.group();
projectCount
    .valueAccessor(function(x) { return x; })
    .group(bin_counter(projectGroup));
Updated fiddle here, still ignoring the failure% part:
http://jsfiddle.net/gordonwoodhull/vct0dzou/1/
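For the unique place count mentioned in the question, the same pattern should work. A short sketch, assuming a place dimension and a placeCount number display exist (the names are hypothetical, not from the fiddle):
var placeGroup = place.group();
placeCount
    .valueAccessor(function(x) { return x; })       // bin_counter already returns the count
    .group(bin_counter(placeGroup));                // counts non-empty bins, i.e. unique places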

A simple sort script on Google Sheets is not working

I have a "leaderboard"/"scoreboard", across four sheets, that I need to have auto sorting whenever updated by first Total Score (column 2) and then Total Kills (column 3). These columns are the same across all four sheets.
I've used a very simple script in the past when the scoreboard was limited to one sheet, but I have since expanded it to have Top Ten, Top Four, and Top Two on separate sheets within the same document.
The problem I'm running into: when the script updates one sheet, the others stop working entirely; in other words, the script breaks.
Can I please get some advice? I've already tried several scripts from this site, and the basic one I see some success with (before it seemingly breaks) is below.
function sortOnEdit(e) {
    var sheet = SpreadsheetApp.getActiveSpreadsheet().getSheetByName("MAIN EVENT");
    sheet.sort(3, false).sort(2, false);
}
function sortOnEdit(e) {
    var sheet = SpreadsheetApp.getActiveSpreadsheet().getSheetByName("TOP TEN");
    sheet.sort(3, false).sort(2, false);
}
function sortOnEdit(e) {
    var sheet = SpreadsheetApp.getActiveSpreadsheet().getSheetByName("TOP FOUR");
    sheet.sort(3, false).sort(2, false);
}
function sortOnEdit(e) {
    var sheet = SpreadsheetApp.getActiveSpreadsheet().getSheetByName("TOP TWO");
    sheet.sort(3, false).sort(2, false);
}
Ideally, when functioning, the sheets will literally just sort themselves by the Total Score column, with Total Kills being the "tiebreaker" for sorting.
I've included a copy of my sheet if anybody could help:
https://docs.google.com/spreadsheets/d/1a6XGv09TPt5Vnxqfcd1Xba3TGMis5OelGxlvzNDl5CY/edit?usp=sharing
try something like this instead of your scripts:
function onEdit(event) {
    var ss = SpreadsheetApp.getActiveSpreadsheet();
    var sheet = event.source.getActiveSheet().getName();
    var editedCell = event.range.getSheet().getActiveCell();
    if (sheet == "Sheet1") {
        var columnToSortBy = 2;
        var tableRange = "A3:C10"; // range to be sorted
        if (editedCell.getColumn() == columnToSortBy) {
            var range = ss.getActiveSheet().getRange(tableRange);
            range.sort({ column: columnToSortBy, ascending: true });
        }
    } else if (sheet == "Sheet2") {
        var columnToSortBy = 7;
        var tableRange = "A3:C10"; // range to be sorted
        if (editedCell.getColumn() == columnToSortBy) {
            var range = ss.getActiveSheet().getRange(tableRange);
            range.sort({ column: columnToSortBy, ascending: true });
        } else {
            return;
        }
    }
}
try this:
function sortOnEdit(e) {
    var sh = e.range.getSheet();
    var name = sh.getName();
    var incl = ['MAIN EVENT', 'TOP TEN', 'TOP FOUR', 'TOP TWO'];
    if (incl.indexOf(name) == -1) return;
    sh.sort(3, false).sort(2, false);
}
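For context, the four identically named sortOnEdit declarations in the question override one another, so only the last one ever runs; a single handler covering all sheets, as above, is the usual fix. If the sheets have a header row that should stay in place, a variant that sorts only the data range, with Total Score as the primary key and Total Kills as the tiebreaker, might look like this (the single header row and the column numbers are assumptions about the sheet layout):
function sortOnEdit(e) {
    var sh = e.range.getSheet();
    var incl = ['MAIN EVENT', 'TOP TEN', 'TOP FOUR', 'TOP TWO'];
    if (incl.indexOf(sh.getName()) === -1) return;
    // Assumes one header row; adjust the offsets to the actual layout.
    var dataRange = sh.getRange(2, 1, sh.getLastRow() - 1, sh.getLastColumn());
    dataRange.sort([
        { column: 2, ascending: false },  // Total Score (primary sort)
        { column: 3, ascending: false }   // Total Kills (tiebreaker)
    ]);
}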

Export crossfilter dataset to excel in dc.js

I made a visualization page using crossfilter.js and dc.js. I want to export the filtered dataset to Excel. Is there any way to do this?
I think the best way to do this is to create another dimension and then call dimension.top(Infinity) to get all the records (sorted by that dimension's key).
Jacob Rideout created a pull request for a new method to do just this without the overhead, but it was not accepted (doesn't look like it was rejected either ;):
https://github.com/square/crossfilter/pull/95
But I doubt you will notice any performance penalty for creating the extra dimension. (Please comment on that PR if you do!)
function groupArrayAdd(keyfn) {
    var bisect = d3.bisector(keyfn);
    return function (elements, item) {
        var pos = bisect.right(elements, keyfn(item));
        elements.splice(pos, 0, item);
        return elements;
    };
}

function groupArrayRemove(keyfn) {
    var bisect = d3.bisector(keyfn);
    return function (elements, item) {
        var pos = bisect.left(elements, keyfn(item));
        if (keyfn(elements[pos]) === keyfn(item))
            elements.splice(pos, 1);
        return elements;
    };
}

function groupArrayInit() {
    return [];
}

var facts = crossfilter(data); // pass your master dataset here
var filteredRows = facts.groupAll().reduce(
    groupArrayAdd(dc.pluck('shift')),
    groupArrayRemove(dc.pluck('shift')),
    groupArrayInit
);
filteredRows.value() will give you the crossfiltered data. Every time the data is filtered, this will automatically yield the filtered output, which you can then export to Excel using any jQuery plugin.
Another way to get the filtered data is the following crossfilter call:
dimension.top(Infinity)
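If a concrete export path is needed, one common approach (not specific to dc.js) is to turn the filtered records into CSV, which Excel opens directly. A minimal sketch, assuming d3 v4+ where d3.csvFormat is available (with d3 v3 the equivalent is d3.csv.format):
function exportFiltered(dimension, filename) {
    // All records that pass the current filters, sorted by this dimension's key.
    var rows = dimension.top(Infinity);
    var csv = d3.csvFormat(rows);
    var blob = new Blob([csv], { type: 'text/csv;charset=utf-8;' });
    var link = document.createElement('a');
    link.href = URL.createObjectURL(blob);
    link.download = filename || 'filtered-data.csv';
    link.click();
}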

Assigning functions to multiple slickgrids

Please help me solve this very annoying problem. I am using a for loop to iterate over an array of data and create multiple grids. It is working well, but the filter function is not binding properly (it only binds to the LAST grid created). Here is the code:
// this function iterates over the data to build the grids
function buildTables() {
    // "domains" contains the dataset array
    for (i = 0; i < domains.length; i++) {
        var dataView;
        dataView = new Slick.Data.DataView();
        var d = domains[i];
        grid = new Slick.Grid('#' + d.name, dataView, d.columns, grids.options);
        var data = d.data;
        // create a column filter collection for each grid - this works fine
        var columnFilters = columnFilters[d.name];
        // this seems to be working just fine
        // Chrome console confirms it is processed when rendering the filters
        grid.onHeaderRowCellRendered.subscribe(function (e, args) {
            $(args.node).empty();
            $("<input type='text'>")
                .data("columnId", args.column.id)
                .val(columnFilters[args.column.id])
                .appendTo(args.node);
        });
        // respond to changes in filter inputs
        $(grid.getHeaderRow()).delegate(":input", "change keyup", function (e) {
            var columnID = $(this).data("columnId");
            if (columnID != null) {
                // this works fine - when the user enters text into the input - it
                // adds the filter term to the filter obj appropriately
                // I have tested this extensively and it works appropriately on
                // all grids (ie each grid has a distinct columnFilters object)
                var gridID = $(this).parents('.grid').attr('id');
                columnFilters[gridID][columnID] = $.trim($(this).val());
                dataView.refresh();
            }
        });
        // ##### FAIL #####
        // this is where things seem to go wrong
        // The row item always provides data from the LAST grid populated!!
        // For example, if I have three grids, and I enter a filter term for
        // grids 1 or 2 or 3 the row item below always belongs to grid 3!!
        function filter(row) {
            var gridID = $(this).parents('.grid').attr('id');
            for (var columnId in grids.columnFilters[gridID]) {
                if (columnId !== undefined && columnFilters[columnId] !== "") {
                    var header = grid.getColumns()[grid.getColumnIndex(columnId)];
                    //console.log(header.name);
                }
            }
            return true;
        }
        grid.init();
        dataView.beginUpdate();
        dataView.setItems(data);
        dataView.setFilter(filter); // does it matter that I only have one dataView instance?
        dataView.endUpdate();
        grid.invalidate();
        grid.render();
    }
}
In summary, each function seems to be binding appropriately to each grid except for the filter function. When I enter a filter term into ANY grid, it returns the rows from the last grid only.
I have spent several hours trying to find the fault but have to admit defeat. Any help would be most appreciated.
Yes, it matters that you have only one instance of dataView. And sooner or later you will also run into the fact that a single grid variable shared by all grids is a bad idea.
So scope a separate dataView to each iteration of your loop; that should solve the problem.
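A sketch of that suggestion, with the DataView, grid, and filter scoped per iteration so each filter closes over its own instances (only the structure is shown; the filtering logic from the question is elided):
function buildTables() {
    domains.forEach(function (d) {
        // One DataView and one grid per domain, so the filter below
        // captures this iteration's instances rather than the last grid's.
        var dataView = new Slick.Data.DataView();
        var grid = new Slick.Grid('#' + d.name, dataView, d.columns, grids.options);

        function filter(row) {
            // per-grid filtering logic goes here
            return true;
        }

        grid.init();
        dataView.beginUpdate();
        dataView.setItems(d.data);
        dataView.setFilter(filter);
        dataView.endUpdate();
        grid.invalidate();
        grid.render();
    });
}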

facing performance issues with knockout mapping plugin

I have a fairly large data set of around 1100 records. This data set is mapped to an observable array which is then bound to a view. Since these records are updated frequently, the observable array is updated every time using the ko.mapping.fromJS helper.
This particular call takes around 40s to process all the rows, and the user interface locks up for that period of time.
Here is the code -
var transactionList = ko.mapping.fromJS([]);
// Getting the latest transactions, which are around 1100 in number
var data = storage.transactions();
// Mapping the data to the observable array, which takes around 40s
ko.mapping.fromJS(data, transactionList);
Is there a workaround for this? Or should I just opt for web workers to improve performance?
Knockout.viewmodel is a replacement for knockout.mapping that is significantly faster at creating viewmodels for large object arrays like this. You should notice a significant performance increase.
http://coderenaissance.github.com/knockout.viewmodel/
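A rough usage sketch, assuming knockout.viewmodel's fromModel/updateFromModel entry points as described in its documentation (verify the names against the link above):
// Build the view model once from the raw data...
var transactionList = ko.viewmodel.fromModel(storage.transactions());
// ...and on subsequent refreshes update it in place instead of rebuilding it.
ko.viewmodel.updateFromModel(transactionList, storage.transactions());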
I have also thought of a workaround as follows; it uses less code:
var transactionList = ko.mapping.fromJS([]);
// Getting the latest transactions, which are around 1100 in number
var data = storage.transactions();
// Mapping the data to the observable array, which takes around 40s
// Instead of: ko.mapping.fromJS(data, transactionList)
var i = 0;
// clear the list completely first
transactionList.destroyAll();
// Set an interval of 0 and keep pushing the content to the list one by one.
var interval = setInterval(function () {
    if (i == data.length - 1) {
        clearInterval(interval);
    }
    transactionList.push(ko.mapping.fromJS(data[i++]));
}, 0);
I had the same problem with the mapping plugin. The Knockout team says the mapping plugin is not intended to work with large arrays. If you have to load this much data onto the page, the design of the system is likely at fault.
The best way to fix this is to use server-side pagination instead of loading all the data on page load. If you don't want to change the design of your application, here are some workarounds that may help you:
Map your array manually:
var data = storage.transactions();
var mappedData = ko.utils.arrayMap(data, function (item) {
    return ko.mapping.fromJS(item);
});
var transactionList = ko.observableArray(mappedData);
Map the array asynchronously. I have written a function that processes the array in chunks (via setTimeout, so the UI stays responsive) and reports progress to the user:
function processArrayAsync(array, itemFunc, afterStepFunc, finishFunc) {
    var itemsPerStep = 20;
    var processor = new function () {
        var self = this;
        self.array = array;
        self.processedCount = 0;
        self.itemFunc = itemFunc;
        self.afterStepFunc = afterStepFunc;
        self.finishFunc = finishFunc;
        self.step = function () {
            var tillCount = Math.min(self.processedCount + itemsPerStep, self.array.length);
            for (; self.processedCount < tillCount; self.processedCount++) {
                self.itemFunc(self.array[self.processedCount], self.processedCount);
            }
            self.afterStepFunc(self.processedCount);
            if (self.processedCount < self.array.length - 1)
                setTimeout(self.step, 1);
            else
                self.finishFunc();
        };
    };
    processor.step();
};
Your code:
var data = storage.transactions();
var transactionList = ko.observableArray([]);
processArrayAsync(data,
    function (item) { // Step function
        var transaction = ko.mapping.fromJS(item);
        // Push to the underlying array to avoid a change notification per item.
        transactionList().push(transaction);
    },
    function (processedCount) {
        var percent = Math.ceil(processedCount * 100 / data.length);
        // Show progress to the user.
        ShowMessage(percent);
    },
    function () { // Final function
        // This function fires when all data are mapped. Do some work here (e.g. apply bindings).
    });
Also, you can try an alternative mapping library: knockout.wrap. It should be faster than the mapping plugin.
I chose the second option.
Mapping is not magic. In most cases, this simple recursive function can be sufficient:
function MyMapJS(a_what, a_path)
{
    a_path = a_path || [];
    if (a_what != null && a_what.constructor == Object)
    {
        var result = {};
        for (var key in a_what)
            result[key] = MyMapJS(a_what[key], a_path.concat(key));
        return result;
    }
    if (a_what != null && a_what.constructor == Array)
    {
        var result = ko.observableArray();
        for (var index in a_what)
            result.push(MyMapJS(a_what[index], a_path.concat(index)));
        return result;
    }
    // Write your condition here:
    switch (a_path[a_path.length - 1])
    {
        case 'mapThisProperty':
        case 'andAlsoThisOne':
            result = ko.observable(a_what);
            break;
        default:
            result = a_what;
            break;
    }
    return result;
}
The code above makes observables from the mapThisProperty and andAlsoThisOne properties at any level of the object hierarchy; other properties are left as plain values. You can express more complex conditions using a_path.length for the level (depth) the value is at, or using more elements of a_path. For example:
if (a_path.length >= 2
    && a_path[a_path.length - 1] == 'mapThisProperty'
    && a_path[a_path.length - 2] == 'insideThisProperty')
    result = ko.observable(a_what);
You can use typeof a_what in the condition, e.g. to make all strings observable.
You can ignore some properties, and insert new ones at certain levels.
Or you can even omit a_path, etc.
The advantages are:
Customizable (more easily than knockout.mapping).
Short enough to copy-paste it and write individual mappings for different objects if needed.
Smaller code; knockout.mapping-latest.js is not included in your page.
Should be faster as it does only what is absolutely necessary.
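A usage sketch, reusing the data variable from the earlier snippets (which properties get wrapped is controlled by the condition inside MyMapJS):
var data = storage.transactions();
// Only 'mapThisProperty' and 'andAlsoThisOne' become observables; everything else stays plain.
var viewModel = { transactions: MyMapJS(data) };
ko.applyBindings(viewModel);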
