YUI DataTable "Loading..." but no data retrieved - ajax

I have a YUI DataTable bound to a YUI DataSource that needs to be auto-refreshed every couple of seconds and also manually through a button. While I am able to read the data through a local datasource (one declared in the same page), I am not able to read it remotely. The grid stays at "Data Loading..." even though the requests to the target page (yui_data.cfm) are being made at the set interval.
The source code of yui_data.cfm (for testing) is the following:
{ "records": [ {"id": 31, "name":"4fruit", "price":8323, "number":231} ] }
Source code of the page requesting the data:
myDataSource = new YAHOO.util.XHRDataSource("yui_data.cfm?");
myDataSource.responseType = YAHOO.util.XHRDataSource.TYPE_JSON;
myDataSource.responseSchema = {
    resultsList: "records",
    fields: [
        {key: "id", parser: "number"},
        {key: "name"},
        {key: "price", parser: "number"},
        {key: "number", parser: "number"}
    ]
};
myDataTable = new YAHOO.widget.DataTable("dynamicdata", myColumnDefs, myDataSource);
// Callback for the polling refresh: replace the table rows with each new response.
myCallBack = {
    success: myDataTable.onDataReturnSetRows,
    failure: function() {
    },
    scope: myDataTable,
    argument: myDataTable.getState()
};
// Poll the datasource every 5 seconds.
myDataSource.setInterval(5000, null, myCallBack);
The above example only works when the line
myDataSource = new YAHOO.util.XHRDataSource("yui_data.cfm?");
is changed to:
myDataSource = new YAHOO.util.XHRDataSource(YAHOO.data.sample); // as an example!

I managed to fix the problem by wrapping the previous JSON output in a ResultSet/Result structure and then pointing the response schema's resultsList at that path.
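For reference, a minimal sketch of that wrapper (the ResultSet/Result names follow the description above, so treat the exact shape as an assumption):

{ "ResultSet": { "Result": [ {"id": 31, "name": "4fruit", "price": 8323, "number": 231} ] } }

with the response schema pointed at the new path:

myDataSource.responseSchema.resultsList = "ResultSet.Result";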

Related

How to modify just a property from a dexie store without deleting the rest?

I have the Dexie stores shown in the screenshot below:
Dexie stores print screen
My goal is to update a field in a row of a Dexie store without losing the rest of the data.
For example: when I edit and save the field "com_name" from the second row (key={2}), I want to update "com_name" only and not lose the rest of the properties; see the first and third rows.
I already tried collection.modify and table.update, but both deleted the rest of the properties when I used the code below:
dexieDB.table('company').where('dexieKey').equals('{1}')
    // USING table.update
    // .update(dexieRecord.dexiekey, {
    //     company: {
    //         com_name: "TOP SERVE 2"
    //     }
    // })
    .modify({
        company: {
            com_name: "TOP SERVE 2"
        }
    })
    .then(function (updated) {
        if (updated)
            console.log("Success.");
        else
            console.log("Nothing was updated.");
    })
    .catch(function (err) { console.log(err); });
Any idea how I can accomplish that?
Thanks
Alex
You were right to use Table.update or Collection.modify. They should never delete properties other than the ones specified. Can you paste a jsitor.com or jsfiddle repro of that? Then someone may help you pinpoint why the code doesn't work as expected.
Now that you mention it, I realised that the company and contact stores are created dynamically, while the editedRecords store has its indexes explicitly declared. So when I update the company or contact store, Dexie doesn't see the indexes and overwrites the record. I haven't tested it yet, but I suspect this is the behaviour.
See the print screen below:
Dexie stores overview
Basically I have raw JSON data from the DB, and in the browser I create the stores and the stores' data based on it; see the code below:
function createDexieTables(jsonData) { // jsonData - array, the JSON from the DB
    const stores = {};
    const editedRecordsTable = 'editedRecords';
    jsonData.forEach((jsonPackage) => {
        for (const table in jsonPackage) {
            if (_.find(dexieDB.tables, { 'name': table }) == undefined) {
                stores[table] = 'dexieKey'; // only the primary key is declared
            }
        }
    });
    stores[editedRecordsTable] = 'dexieKey, table'; // explicit secondary index
    addDataToDexie(stores, jsonData);
}
function addDataToDexie(stores, jsonData) {
    const dbv1 = dexieDB.version(1);
    if (jsonData.length > 0) {
        dbv1.stores(stores);
        jsonData.forEach((jsonPackage) => {
            for (const table in jsonPackage) {
                jsonPackage[table].forEach((tableRow) => {
                    dexieDB.table(table).add(tableRow)
                        .then(function () {
                            console.log(tableRow, ' added to dexie db.');
                        })
                        .catch(function () {
                            console.log(tableRow, ' already exists.');
                        });
                });
            }
        });
    }
}
This is the JSON, which I convert to an object and save to Dexie in the value column; the key is "dexieKey":
[
    {
        "company": [
            {
                "dexieKey": "{1}",
                "company": {
                    "com_pk": 1,
                    "com_name": "CloudFire",
                    "com_city": "Round Rock",
                    "serverLastEdit": [
                        {
                            "com_pk": "2021-06-02T11:30:24.774Z"
                        },
                        {
                            "com_name": "2021-06-02T11:30:24.774Z"
                        },
                        {
                            "com_city": "2021-06-02T11:30:24.774Z"
                        }
                    ],
                    "userLastEdit": []
                }
            }
        ]
    }
]
Any idea why the indexes were not populated when they were generated dynamically?
Given the JSON data, I understand what's going wrong.
Instead of passing the following to update():
{
    company: {
        com_name: "TOP SERVE 2"
    }
}
You probably meant to pass this:
{
"company.com_name": "TOP SERVE 2"
}
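The dotted keypath tells Dexie to update only the nested com_name property in place, whereas the first form replaces the whole company object with one that contains nothing but com_name. As a sketch (reusing the 'company' table and '{1}' key from the code above; adjust to your real keys), the fixed call might look like:

dexieDB.table('company')
    .update('{1}', { "company.com_name": "TOP SERVE 2" })
    .then(function (updated) {
        if (updated) console.log("Success.");
        else console.log("Nothing was updated.");
    })
    .catch(function (err) { console.log(err); });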
Another hint is to do the adds within an 'rw' transaction, or even better, use bulkAdd() instead to optimize performance.
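A minimal sketch of that hint, assuming the same dexieDB and jsonData shape as in the question:

// One 'rw' transaction around all the writes, with one bulkAdd() per table
// instead of one add() per row.
dexieDB.transaction('rw', dexieDB.tables, function () {
    jsonData.forEach(function (jsonPackage) {
        for (var table in jsonPackage) {
            dexieDB.table(table).bulkAdd(jsonPackage[table]);
        }
    });
}).catch(function (err) {
    console.log(err);
});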

IndexedDB "updates" every browser restart and erases data

I wrote a Firefox WebExtension that downloads data files from a website and uses IndexedDB to store/update the data. The .sqlite file that is created is ~2 GB in size. Whenever I restart Firefox, the extension fires the onupgradeneeded event, even though I always use version 1. I create the database object stores and indexes in that event handler, so all my data ends up getting deleted.
The only time this doesn't happen is when I close Firefox while the data is being downloaded or stored. The next time I start Firefox, it does not execute the event (as should be the case). It then continues to update the database as it was programmed to do.
I installed the SQLite Manager extension in hopes of identifying something in the database causing the issue, but nothing was obvious to me.
Here is part of my background script:
init().then(fetchData).then(addData).catch(dberror);
function init() {
    var req = indexedDB.open("db", 1);
    req.onupgradeneeded = e => {
        var name;
        var key;
        console.log("Upgrading database...", e.oldVersion, e.newVersion);
        db = e.currentTarget.result;
        var store = db.createObjectStore("db", { keyPath: "KEY" });
        db.createObjectStore("version", { keyPath: "version" });
        for (name in indexes) {
            key = ...
            store.createIndex(name, key);
        }
    };
    return new Promise((resolve, reject) => {
        req.onsuccess = e => {
            db = e.currentTarget.result;
            db.onerror = dberror;
            // Walk the STATUS_DATE index backwards to find the newest record.
            var cursor = db.transaction("MECs").objectStore("MECs").index("STATUS_DATE").openCursor(null, 'prev');
            cursor.onsuccess = e => {
                if (e.target.result) {
                    lastMod = e.target.result.key;
                    fileYear = lastMod.getFullYear();
                }
                else lastMod = new Date(startingfileYear, 0);
                resolve(lastMod);
            };
            cursor.onerror = reject;
        };
        req.onerror = e => {
            dberror(e);
            reject(e);
        };
    });
}
function fetchData(param) {
// Get data based on the param and return it
return fetchFile(filename);
}
function addData(data) {
    var trans = db.transaction("db", "readwrite");
    var store = trans.objectStore("db");
    var req;
    var n = 0;
    var data2 = [];
    var addPromise;
    trans.onerror = event => console.log("Error! Error! ", event.target.error);
    trans.onabort = event => console.log("Abort! Abort! ", event.target.error);
    data.forEach((row, index) => {
        //process data here
        data2 = ...
    });
    (function storeRegData(n) {
        var row = data[n];
        if (!row) return;
        req = store.put(row);
        req.onsuccess = event => {
            numUpdated++;
            storeRegData(++n);
        };
        req.onabort = event => console.log("Abort! Abort! ", event.target.error);
        req.onerror = event => console.log("Error! Error! ", event.target.error);
    })(0); // I'm storing one row at a time because the transaction is failing when I queue too many rows.
    addPromise = fetchData(data2).then(
        response => {
            var trans2 = db.transaction("db", "readwrite");
            var store2 = trans2.objectStore("db");
            var req2;
            response.forEach(row => {
                req2 = store2.put(row);
                req2.onsuccess = event => numUpdated++;
                req2.onerror = console.log;
            });
            return new Promise((resolve, reject) => trans2.oncomplete = e => resolve(response));
        },
        console.log);
    return new Promise((resolve, reject) => trans.oncomplete = e => {
        if (noMoreData)
            resolve(addPromise);
        else if (moreData)
            resolve(addPromise.then(fetchData).then(addData));
    });
}
And here is my manifest
{
    "author": "Name",
    "manifest_version": 2,
    "name": "Extension",
    "description": "Extension",
    "version": "3.0",
    "applications": {
        "gecko": {
            "strict_min_version": "50.0",
            "id": "myID",
            "update_url": "https://update.me"
        }
    },
    "background": {
        "scripts": [
            "js/background.js"
        ]
    },
    "content_scripts": [
        {
            "matches": [ "https://match.me/*" ],
            "js": [
                "script.js"
            ],
            "css": [
                "style.css"
            ]
        }
    ],
    "icons": {
        "48": "icon.png"
    },
    "options_ui": {
        "page": "options.html"
    },
    "page_action": {
        "browser_style": true,
        "default_icon": {
            "19": "icon-19.png",
            "38": "icon-38.png"
        },
        "default_title": "Extension",
        "default_popup": "popup.html"
    },
    "permissions": [
        "https://web.address/*",
        "downloads",
        "notifications",
        "storage",
        "tabs",
        "webRequest",
        "webNavigation"
    ],
    "web_accessible_resources": [
        "pictures.png"
    ]
}
Why does Firefox think the database is at version 0 when I restart the browser? I can use the stored data after I download it, so why does it overwrite it on every restart? I could possibly do a workaround where I only create the store and indexes on extension installation or update, but that's not a solution to the actual issue.
UPDATE: I tried the following to no avail -
Close the database and re-open after storing each data file
Create a new object store for each data file
UPDATE 2: It appears this is related to a storage issue. Apparently, 2 GB is the storage limit for non-persistent storage. In Firefox you can bypass this by making the storage persistent with the following call:
indexedDB.open("db", { version: 1, storage: "persistent" })
See the bugzilla report here.
Unfortunately, when run from a background page, the popup asking for confirmation is not handled, so you can never acknowledge it. Supposedly, when Firefox 56 comes out, you'll be able to use the "unlimitedStorage" permission, which bypasses the confirmation popup, so it should work from the background page.
Update 3: So it looks like the limit is actually ~1.5 GB. I just spent over a week re-coding the extension to create and use a different database for each year of data, making each database no larger than 150 MB. And still onupgradeneeded executes when I restart the browser and wipes all my data. If, however, I limit the total amount of data in all the databases to the above limit, it works. Unfortunately, I'm still in the same boat.
Does no one have any ideas?
As I mentioned in the updates to my question, there appears to be a limit of ~1.5GB for the "default" storage of indexedDB. Changing the storage to "persistent" will remove that limit. Because persistent storage currently requires user input, however, the database has to be opened from a window that can handle a UI response.
This can be done from the background script by creating a new window with browser.windows.create() and opening the database from there. There are security restrictions that prevent inline scripts from running in the new page, so I had to link to a local JavaScript file instead (i.e. <script src="db.js"></script>). I think you can also change the content security policy with a manifest instruction, but I didn't do that.
Hopefully, the unlimitedStorage permission will be supported in Firefox 56, which will remove the popup, allowing a persistent database to be created/accessed directly from the background script.
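A minimal sketch of that workaround (db.html and db.js are placeholder names; db.html does nothing but load db.js via <script src="db.js"></script>):

// background.js: open an extension page that can show the permission prompt
browser.windows.create({ url: browser.runtime.getURL("db.html") });

// db.js: open the database with the non-standard "persistent" storage option
var req = indexedDB.open("db", { version: 1, storage: "persistent" });
req.onsuccess = e => {
    var db = e.target.result;
    // db is now persistent and no longer subject to the default-storage limit
};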

Associate File in Parse.Cloud

I am uploading an image through the REST API and getting a response like the one below:
{
    "url": "http://files.parsetfss.com/346a0978-68c7-4d08-a446-62f7422469e7/tfss-8b131ff0-5fd0-4dce-92e8-b7b94da5db9e-pic.jpg",
    "name": "tfss-8b131ff0-5fd0-4dce-92e8-b7b94da5db9e-pic.jpg"
}
I want to associate this image with a Promotion object. I also have a Location object which has an array of Promotions. Here is my code:
function promiseToAddPromotionToLocation(locationID, promotion) {
    var query = new Parse.Query("Location");
    return query.get(locationID).then(function (location) {
        var promotionObject = promotionObjectFromJSON(promotion);
        location.add("promotions", promotionObject);
        return location.save();
    }, function (error) {
        return Parse.Promise.error(error);
    });
}
function promotionObjectFromJSON(promotion) {
    var Promotion = Parse.Object.extend("Promotion");
    var promotionObject = new Promotion();
    if ("message" in promotion) {
        promotionObject.set("message", promotion.message);
    }
    // This causes an error: Uncaught Tried to save an object containing an unsaved file.
    if ("photo" in promotion) {
        promotionObject.set("photo", promotion.photo);
    }
    return promotionObject;
}
When I comment out the part that sets the photo, it saves the promotion properly, but when I try to set the file, it gives an error saying "Uncaught Tried to save an object containing an unsaved file." How can I solve this problem?
By the way, promiseToAddPromotionToLocation is called with the parameters below:
{
    "locationID": "fvOiAsoogc",
    "promotion": {
        "message": "Some text",
        "photo": {"name": "tfss-8b131ff0-5fd0-4dce-92e8-b7b94da5db9e-pic.jpg", "__type": "File"}
    }
}
The reason was that I was passing the file info without the url. The docs on associating files show that only name and __type are needed; however, this is wrong. It also needs the url. A question on the Parse forum revealed it.
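So the photo value that worked presumably looks like this (the url being the one returned by the upload above):

"photo": {
    "name": "tfss-8b131ff0-5fd0-4dce-92e8-b7b94da5db9e-pic.jpg",
    "url": "http://files.parsetfss.com/346a0978-68c7-4d08-a446-62f7422469e7/tfss-8b131ff0-5fd0-4dce-92e8-b7b94da5db9e-pic.jpg",
    "__type": "File"
}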

dataSource.data() doesn't return the data

I'm currently testing Kendo UI and developing a little webapp.
For some reason I need to pass my dataSource data from one view to another. To do this I use sessionStorage, but when I put my dataSource.data() into sessionStorage, the value returned is empty.
Here is a log showing that dataSource.data() is not correctly inserted/returned:
However, when I log the dataSource itself, I can clearly see that _data is not empty, as shown in the following picture:
Does anyone know the origin of my problem?
EDIT
Here is the code that shows how I add my dataSource to sessionStorage:
var qui = (e.view.params.qui) ? e.view.params.qui : "";
var quoi = (e.view.params.quoi) ? e.view.params.quoi : "";
dataSourceFournisseurs = new kendo.data.DataSource({
    transport: {
        read: {
            url: "annuaire.json",
            dataType: "json"
        }
    },
    schema: {
        data: "data",
        model: {
            DISTANCE: function () {
                var lat = this.get("LATITUDE");
                var lng = this.get("LONGITUDE");
                var distance = APP.distanceBetweenCoords(lat, lng);
                return "à " + distance + "km";
            }
        }
    },
    sort: {
        field: "LIBELLE",
        dir: "asc"
    },
    filter: [
        { field: "LIBELLE", operator: "contains", value: qui },
        { field: "NAFLIBELLE", operator: "contains", value: quoi }
    ]
});
console.log(dataSourceFournisseurs);
session.setValueObject("liste", dataSourceFournisseurs.data());
And here is how I retrieve it :
var datas = session.getValueObject("liste");
console.log(datas);
PS:
setValueObject and getValueObject are two methods I wrote to stringify the data I set and parse the data I retrieve (they are fully functional; I have been using them for over a year).
The two console.log calls are the ones behind the pictures above (picture 1 from the second log and picture 2 from the first log).
EDIT END
Try using dataSourceFournisseurs.view(). This should give you all of the data. The data option is meant for initial configuration, and is not meant to be used as a method for retrieving data.
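As a sketch with the code above, the storing side could also wait for the asynchronous read to finish before stashing anything (fetch() and view().toJSON() are standard Kendo DataSource methods; session.setValueObject is your own helper):

dataSourceFournisseurs.fetch(function () {
    // By now the remote read has completed, so view() holds the loaded
    // (and sorted/filtered) data; toJSON() strips the observable wrappers
    // so the result can be stringified.
    session.setValueObject("liste", dataSourceFournisseurs.view().toJSON());
});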
Good luck!

Parse.com manipulate Response Object

I am trying to get Ember working with Parse.com using the ember-model-parse-adapter by samharnack.
I added a function to do multi-word search (like a search engine), which I defined on the cloud using Parse.Cloud.define and run from the client.
The problem is that the array my cloud response returns is not compatible with Ember Model because of two attributes, __type and className. How can I modify the response to get one similar to what I get when I run a find query from the client, i.e. without __type and className?
Example responses
For App.List.find():
{
    "results": [
        {
            "text": "zzz",
            "words": [
                "zzz"
            ],
            "createdAt": "2013-06-25T16:19:04.120Z",
            "updatedAt": "2013-06-25T16:19:04.120Z",
            "objectId": "L1X55krC8x"
        }
    ]
}
For App.List.cloudFunction("sliptSearch", {"text": this.get("searchText")}):
{
    "results": [
        {
            "text": "zzz",
            "words": [
                "zzz"
            ],
            "createdAt": "2013-06-25T16:19:04.120Z",
            "updatedAt": "2013-06-25T16:19:04.120Z",
            "objectId": "L1X55krC8x",
            "__type": "Object", //undesired
            "className": "Lists" //undesired
        }
    ]
}
Thanks Vlad, something like this worked for me for arrays:
var resultobj = [];
searchListQuery.find({
    success: function (results) {
        for (var i = 0, l = results.length; i < l; i++) {
            var temp = results.pop();
            resultobj.push({
                text: temp.get("text"),
                createdAt: temp.createdAt,
                updatedAt: temp.updatedAt,
                objectId: temp.id,
                words: "",
                hashtags: ""
            });
        }
        // respond with the cleaned-up array, as in the answer below
        response.success(resultobj);
    }
});
In your cloud code, before you send the response, create an object, copy into it only the attributes/members you need, and then respond with it, like so:
//lets say result is some Parse.User or any other Parse.Object
function (result) {
    var responseObj = {};
    responseObj.name = result.get("name");
    responseObj.age = result.get("age");
    responseObj.id = result.id;
    response.success(responseObj);
}
On the response side you will get {"result": {"name": "jhon", "age": "26", "id": "zxc123s21"}}.
Hope this helps.
