I am trying to figure out how to set and retrieve a value with app-localstorage-document. I previously worked with the iron-meta element and did it like so:
<iron-meta id="meta" key="id" value="{{meta}}"></iron-meta>
Polymer({
  is: 'login-form',
  properties: {
    meta: {
      type: String,
      value: ''
    },
  },
  getValue: function() {
    this.meta = '100';
    var savedValue = this.$.meta.byKey('id');
    console.log(savedValue);
  }
});
But when I try something similar with app-localstorage-document, it just returns: Promise {[[PromiseStatus]]: "resolved", [[PromiseValue]]: undefined}
I can't find any example of how to work with this element. Maybe someone can help me out here.
<app-localstorage-document id="meta" key="id" data="{{meta}}" storage="window.localStorage"></app-localstorage-document>
Polymer({
  is: 'login-form',
  properties: {
    meta: {
      type: String,
      value: ''
    },
  },
  getValue: function() {
    this.$.meta.setStoredValue('id', '50');
    var savedValue = this.$.meta.getStoredValue('id');
    console.log(savedValue);
  }
});
I am still studying this element myself, and the documentation is not all that direct, but this is what I understand so far.
The issue is more about changing the way you think about accessing the storage data: the app-localstorage-document element handles it for you.
<app-localstorage-document id="meta" key="id" data="{{meta}}" storage="window.localStorage"></app-localstorage-document>
The "data" attribute is synced with the storage key "id". Any time "id" is updated, the variable assigned to "data" is updated as well. This means changes to the key "id" will bubble up into the "meta" variable.
If you need to access information in storage, it should be accessible in the "meta" variable.
this.meta.name or {{meta.name}}
this.meta.id or {{meta.id}}
This would imply that the variable assigned to "data" should be of type Object.
meta:{
type:Object,
value:{},
notify:true
}
As a result, if you are using this element, there is no reason to directly access local storage. That is what this element is for.
The basic idea is to synchronize in-memory data to localStorage.
app-localstorage-document stores the 'data' in localStorage or sessionStorage, and any change to the data flows back to storage. In your case, you just need to set the {{meta}} object with the desired values, and app-localstorage-document will ensure that it is stored in local storage.
Since you can always work with the in-memory data, you usually don't need to read it back from localStorage in the same element. To read the stored data in other elements, however, you could use iron-localstorage or even read directly from localStorage with the key.
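To make this concrete, here is a minimal sketch of the approach described above, assuming the same <app-localstorage-document id="meta" key="id" data="{{meta}}"> markup from the question (the saveValue/readValue method names are just for illustration):

Polymer({
  is: 'login-form',
  properties: {
    meta: {
      type: Object,
      value: function() { return {}; },
      notify: true
    }
  },
  saveValue: function() {
    // Updating a sub-property notifies the binding, and
    // app-localstorage-document writes the change to storage for us.
    this.set('meta.id', '50');
  },
  readValue: function() {
    // The synced value is already in memory; no explicit storage call is needed.
    console.log(this.meta.id);
  }
});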
The setup:
My basic setup is a Next.js app querying data from a GraphQL API.
I am fetching an array of objects from the API and am able to display that array on the client.
I want to be able to filter the data based on Enum values that are defined in the API schema. I am able to pass these values programmatically and the data is correctly updated.
I want those filters to be persistent when a user leaves the page & come back. I was originally planning to use Redux, but then I read about apollo-link-state and the ability to store local (client) state into the Apollo store, so I set out to use that instead. So far, so good.
The problem:
When I try to combine the local query and the remote query into a single one, I get the following error: networkError: TypeError: Cannot read property 'some' of undefined
My query looks like this:
const GET_COMBINED = gql`
  {
    items {
      id
      details
    }
    filters @client
  }
`
And I use it inside a component like this:
const Items = () => (
  <Query query={GET_COMBINED}>
    {({ loading, error, data: { items, filters } }) => {
      // ...do stuff...
    }}
  </Query>
)

export default Items
If, however, I run the queries separately, like the following:
const GET_ITEMS = gql`
  {
    items {
      id
      details
    }
  }
`
const GET_FILTERS = gql`
  {
    filters @client
  }
`
And nest the queries inside the component:
const Items = () => (
  <Query query={GET_ITEMS}>
    {({ loading, error, data: { items } }) => {
      return (
        <Query query={GET_FILTERS}>
          {({ data: { filters } }) => {
            // ...do stuff...
          }}
        </Query>
      )
    }}
  </Query>
)

export default Items
Then it works as intended!
But it seems far from optimal to nest queries like this when a single query would - in theory, at least - do the job. And I truly don't understand why the combined query won't work.
I've stripped my app to its bare bones trying to understand, but the gist of it is, whenever I try to combine fetching local & remote data into a single query, it fails miserably, while in isolation both work just fine.
Is the problem coming from SSR/Next? Am I doing it wrong? Thanks in advance for your help!
Edit 2 - additional details
The error is triggered by react-apollo's getDataFromTree; however, even when I skip the query during SSR (by passing the ssr: false prop to the Query component), the combined query still fails. Besides, both the remote AND the local queries work server-side when run separately. I am puzzled.
I've put together a small repo based on NextJS's with-apollo example that reproduces the problem here: https://github.com/jaxxeh/next-with-apollo-local
Once the app is running, clicking on the Posts (combined) link straight away will trigger an error, while the Posts (split) link will display the data as intended.
Once the data has been loaded, the Posts (combined) page will show data, but attempting to load extra data will trigger an error. Reloading (i.e. server-rendering) the page will also trigger an error. The checkboxes will be functional and their state preserved across the app.
The Posts (split) page will fully function as intended. You can load extra post data, reload the page and set checkboxes.
So there is clearly an issue with the combined query, be it on the server-side (error on reload) or the client-side (unable to display additional posts). Direct writes to the local state (which bypass the query altogether) do work, however.
I've removed the Apollo init code for brevity & clarity, it is available on the repo linked above. Thank you.
Add an empty object as your resolver map to the config you pass to withClientState:
const stateLink = withClientState({
  cache,
  defaults: {
    filters: ['A', 'B', 'C', 'D']
  },
  resolvers: {},
  typeDefs: `
    type Query {
      filters: [String!]!
    }
  `,
})
There's a related issue here. It would be great if the constructor threw some kind of error when the option is missing, or if the docs were clearer about it.
I want to be able to do updates on an object while it is still being created.
For example: Say I have a to-do list where I can add items with names. I also want to be able to edit names of items.
Now say a user with a slow connection creates an item. In that case, I fire off a create-item mutation and optimistically update my UI. That works great; so far, no problem.
Now let's say the create item mutation is taking a bit of time due to a slow network. In that time, the user decides to edit the name of the item they just created. For an ideal experience:
The UI should immediately update with the new name
The new name should eventually be persisted in the server
I can achieve #2 by waiting for the create mutation to finish (so that I can get the item ID), then making an update name mutation. But that means parts of my UI will remain unchanged until the create item mutation returns and the optimistic response of the update name mutation kicks in. This means #1 won't be achieved.
So I'm wondering how can I achieve both #1 and #2 using Apollo client.
Note: I don't want to add spinners or disable editing. I want the app to feel responsive even with a slow connection.
If you have access to the server, you can implement an upsert operation and reduce both mutations to a single one:
mutation UpsertTodoItem($itemKey: ID!, $listId: ID!, $itemText: String!) {
  # Variable types are illustrative; use whatever your schema defines.
  upsertTodoItem(
    where: {
      key: $itemKey # some unique key generated on the client
    }
    update: {
      listId: $listId
      text: $itemText
    }
    create: {
      key: $itemKey
      listId: $listId
      text: $itemText
    }
  ) {
    id
    key
  }
}
This gives you a sequence of identical mutations that differ only in their variables, so an optimistic response can be configured for this single mutation. On the server, you check whether an item with that key already exists and create or update it accordingly.
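For example, a rough sketch of attaching an optimistic response to that upsert (UPSERT_TODO_ITEM and the TodoItem typename are placeholders, not from your schema):

client.mutate({
  mutation: UPSERT_TODO_ITEM,
  variables: { itemKey, listId, itemText },
  optimisticResponse: {
    upsertTodoItem: {
      __typename: 'TodoItem', // assumed typename
      id: itemKey,            // temporary id until the server responds
      key: itemKey
    }
  }
})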
Additionally, you might want to use apollo-link-debounce to reduce the number of requests while the user is typing.
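Roughly, the wiring could look like this (a sketch only; check the apollo-link-debounce README for the exact options, and note that the 300 ms delay and the debounce key below are placeholders):

import { ApolloLink } from 'apollo-link'
import { HttpLink } from 'apollo-link-http'
import DebounceLink from 'apollo-link-debounce'

const link = ApolloLink.from([
  new DebounceLink(300),             // default debounce interval in ms
  new HttpLink({ uri: '/graphql' }),
])

// Mutations that share a debounceKey in their context get debounced together:
client.mutate({
  mutation: UPSERT_TODO_ITEM,
  variables: { itemKey, listId, itemText },
  context: { debounceKey: itemKey },
})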
I think the easiest way to achieve your desired effect is to actually drop optimistic updates in favor of managing the component state yourself. I don't have the bandwidth at the moment to write out a complete example, but your basic component structure would look like this:
<ApolloConsumer>
  {(client) => (
    <Mutation mutation={CREATE_MUTATION}>
      {(create) => (
        <Mutation mutation={EDIT_MUTATION}>
          {(edit) => (
            <Form create={create} update={edit} />
          )}
        </Mutation>
      )}
    </Mutation>
  )}
</ApolloConsumer>
Let's assume we're dealing with just a single field -- name. Your Form component would start out with an initial state of
{ name: '', created: null, updates: null }
Upon submitting, the Form would do something like:
onCreate () {
  this.props.create({ variables: { name: this.state.name } })
    .then(({ data, errors }) => {
      // handle errors whichever way
      this.setState({ created: data.created })
      if (this.state.updates) {
        const id = data.created.id
        this.props.update({ variables: { ...this.state.updates, id } })
      }
    })
    .catch(errorHandler)
}
Then the edit logic looks something like this:
onEdit () {
  if (this.state.created) {
    const id = this.state.created.id
    this.props.update({ variables: { name: this.state.name, id } })
      .then(({ data, errors }) => {
        this.setState({ updates: null })
      })
      .catch(errorHandler)
  } else {
    this.setState({ updates: { name: this.state.name } })
  }
}
In effect, your edit mutation is either fired immediately when the user edits (because we already have a response back from the create mutation), or the user's changes are held in component state and sent once the create mutation completes.
That's a very rough example, but should give you some idea on how to handle this sort of scenario. The biggest downside is that there's potential for your component state to get out of sync with the cache -- you'll need to ensure you handle errors properly to prevent that.
That also means that if you want to use this form for edits only, you'll need to fetch the data out of the cache and use it to populate your initial state (i.e. this.state.created in the example above). You can use the Query component for that; just make sure you don't render the actual Form component until the Query component has provided the data prop.
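A rough sketch of that last point (GET_ITEM and the initialCreated prop are made-up names for this example):

<Query query={GET_ITEM} variables={{ id }}>
  {({ loading, data }) => {
    if (loading) return null
    // Seed the form's created state from the cached item before rendering it.
    return <Form initialCreated={data.item} />
  }}
</Query>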
The Ext.data.Model class represents the backend models, and just like in the server code, some of its fields can be of another declared model type via the reference property. I've found that a model's getAssociatedData() function returns an object with all those referenced fields. However, they only contain the referenced record's data object; they are not fully fledged, initialized Ext.data.Model instances, which forces primitive object access, and there is no way to use the model's configured proxies etc. for loading/saving. Is this the correct/only way of using this functionality? We've also been looking for a way to add columns from referenced fields to a grid, but it doesn't seem to work... I'm starting to doubt the usefulness of declaring referenced fields.
Example code:
Ext.define('MyApp.ModelA', {
    extend: 'Ext.data.Model',
    fields: [{
        name: 'modelb',
        reference: 'MyApp.ModelB'
    }]
});

Ext.define('MyApp.ModelB', {
    extend: 'Ext.data.Model',
    fields: [{
        name: 'modelId',
        type: 'int'
    }]
});
//...
var modelA = new MyApp.ModelA().load();
var modelB = modelA.getAssociatedData().modelb; // This is the only way to access it.
var modelBId = modelB.get('modelId'); // This returns undefined because the function .get doesn't exist.
var modelBId = modelB.id; // This works because it is a simple object property access.
//...
As Chad Peruggia said, it seems that ExtJS creates special getters for reference fields that match the field name. getAssociatedData() returns only the primitive form of those objects (only their data values), but the special getter (in my case getModelb()) returns a fully fledged model initialized with the given data.
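In code, that looks roughly like this (a sketch assuming the ModelA/ModelB definitions from the question and that the associated data is returned with the parent record):

MyApp.ModelA.load(1, {
    success: function(record) {
        // The reference config generates this getter on ModelA.
        var modelB = record.getModelb();
        console.log(modelB.get('modelId')); // a real Ext.data.Model, so get() works
    }
});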
I have schemas set up so that I can have an array of complex input sets. Something like:
address = {
  street: {
    type: String
  },
  city: {
    type: String
  },
  active_address: {
    type: Boolean,
    optional: true
  },
  ...
}

people = {
  name: {
    type: String
  },
  address: {
    type: [address],
    optional: true,
    defaultValue: []
  }
}
This way adding an address is optional, but if you add an address all of the address fields are required.
This worked in (I believe it was) version 4.2.2. This still works on insert type autoforms, but not on update type autoforms. Doing an update, none of the fields will submit unless all required fields in the nested schema are also valid.
For reference, I'm creating the form as such:
{{#autoForm collection="people" id=formId type="update" doc=getDocument autosave=true template="autoupdate"}}
  {{> afQuickField name='name' template="autoupdate" placeholder="schemaLabel"}}
  {{> afQuickField name='address' template="autoupdate"}}
{{/autoForm}}
For my templates (autoupdate), I copy-pasted the entirety of the bootstrap3 autoform templates and rearranged some of the HTML to fit my needs. I updated these to the best of my ability according to the 5.0.0 changelog when I upgraded. The issue could be in there, if someone can think of a template attribute that changed in 5.0.0 and would cause inconsistent behavior between insert and update.
More information
I just tried recreating all of my form templates using the bootstrap3 templates from 5.0.2. Still the same behavior.
+
I have a Boolean (checkbox) input in the address schema. Looking in a doc, the address array is populated with [0 : {active_address: false}]
active_address: {
  type: Boolean,
  optional: true
}
Not sure if that helps...
+
As per #mark's suggestion, I added defaultValue:[]. It fixed the issue... sort of. There are no "open" nested schemas in the update form now, and other values can be changed. If you "add" a nested schema to the form with the add button, that entire form becomes required even if you don't insert any value in any field. This happens regardless of the Boolean type input.
I can nail it down to the Boolean type input in the nested schema: it causes that entire nested schema to become required for the insert to go through. Removing the Boolean input made it insertable again. So there's a new problem in the same vein.
This new issue can be found here
I think the best solution is to add a defaultValue: [] to the address field in the schema. The behavior you described in the question (not allowing the update) is actually intended -- read on to see why.
The thing is, this behavior only exists if an array form element has already been added to the form. What I mean is, if you click the minus sign that removes the street, city, etc. inputs from the form, the update succeeds because AutoForm doesn't misinterpret the unchecked checkbox as the user explicitly unchecking the box (and therefore setting the value to false). Setting the defaultValue to an empty array lets AutoForm know to not present the address form unless the user has explicitly clicked the plus sign (i.e, they have an address they want to enter), in which case the behavior of making the street, city, etc. fields required is what you want.
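Concretely, that's just one extra line on the array field (same field names as in your schema):

address: {
  type: [address],
  optional: true,
  defaultValue: []  // no address sub-form is rendered until the user clicks "+"
}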
Note that this means you'll have to update the existing documents in your collection that are missing the address field and set it to an empty array. Something like this in the mongo shell:
db.people.update({ "address": { $exists: false } }, { $set: { "address": [] } }, { multi: true })
You'll probably want to make sure that the query is correct by running a find on the selector first.
Edit
If the behavior you want is to show the sub-form without making it required, you can work around the checkbox issue by using the formToDoc hook and filtering out all address objects that only have the active_address field set to false (the field that AutoForm mistakenly adds for us).
AutoForm.addHooks('yourFormId', {
  formToDoc: function (doc) {
    doc.address = _.reject(doc.address, function (a) {
      return !a.street && !a.city && !a.active_address;
    });
    return doc;
  }
});
The formToDoc hook is called every time the form is validated, so you can use it to modify the doc to make it so that AutoForm is never even aware that there is an address sub-field, unless a property of it has been set. Note that if you're using this solution you won't have to add the defaultValue: [] as stated above.
I have a store + model which is connected to a 3rd-party plugin (Ext.ux.TouchGridPanel). The plugin calls the store's sort() method properly with the relevant mapping. Everything is working fine, and the store sorts itself. However, I would prefer to add custom sorting to the store. I have tried adding a sortType to a field in my model:
Ext.regModel("Transactions", {
    fields: [
        {
            name: 'unit',
            type: 'string',
            sortType: function(value) {
                console.log('PRINT GDAMNIT');
                return 0;
            }
        },
        ...
    ]
});
This, however, is not working, and the sortType is not getting called.
TL;DR: How do I make custom sorting work for stores?
Your store needs a sorter on that field before the sortType function will be called.
var store = new Ext.data.Store({
    model: 'Transactions',
    sorters: [{
        property: 'unit',
        direction: 'DESC'
    }]
});
A sortType converts the value of a field into another value to ensure proper ordering. If you aren't sorting on that field, there is no reason to call that function. You could also add sortDir to the field, which sorts the field in ascending/descending order based on the field's type alone.
A workaround might be (I know this sounds inefficient, but bear with me) to add an extra field to your model instances (let's call it sortField) and use that for your sorting. You can then loop through the model instances in your store, apply your custom sorting algorithm, and assign sort values of 0, 1, 2, 3, 4, 5, etc. to sortField. Then in your store you can add sorters: 'sortField', as sketched below. Hope this helps a bit; I'm going through something similar at the moment.
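A rough sketch of that workaround (store and computeRank are placeholders for your own store and ranking logic):

// Assign a precomputed rank to each record, then sort on that plain field.
store.each(function(record) {
    record.set('sortField', computeRank(record));
});
store.sort('sortField', 'ASC');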
Custom sort types in Sencha Touch 2 work as follows, as per http://docs.sencha.com/touch/2-0/#!/api/Ext.data.SortTypes:
Ext.apply(Ext.data.SortTypes, {
    asPerson: function(person) {
        // expects an object with a first and last name property
        return person.lastName.toUpperCase() + person.firstName.toLowerCase();
    }
});

Ext.define('Employee', {
    extend: 'Ext.data.Model',
    config: {
        fields: [{
            name: 'person',
            sortType: 'asPerson'
        }, {
            name: 'salary',
            type: 'float' // sortType set to asFloat
        }]
    }
});
What you're attempting to do might be tricky. Calling store.sort() removes all existing sorters by default (according to Sencha Touch API documentation). To keep the existing sorters you would need to add the sorter to the MixedCollection store.sorters.
Secondly, to call the sort method with a custom sort function, you would need to pass a specific sorterFn instead of property to the Sorter (again, see the API for more details) - but this might prove tricky since the sort call is initiated from the plugin.
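For reference, a sorterFn-based sort could look something like this (a sketch only; the comparison logic is a placeholder):

store.sort(new Ext.util.Sorter({
    sorterFn: function(recordA, recordB) {
        // Compare whole records however you need; return -1, 0 or 1.
        var a = recordA.get('unit'),
            b = recordB.get('unit');
        return a < b ? -1 : (a > b ? 1 : 0);
    }
}));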
Not sure if this solves your problem, but maybe it helps point you in the right direction.