Meteor load static data once for several templates - performance

I'm working with Meteor and FlowRouter. I have a collection of a country's administrative divisions, and the data is about 2000 documents. I read this data in several routes, so at the moment I'm subscribing to the same collection every time I visit one of the routes that uses this data.
This causes slow performance and wastes resources. Given that this collection doesn't change, is there any way to load or subscribe to this data once and have it available for the whole app or for specific routes?
Would it perhaps be better to save the data in settings.json and have it available as an object?
Thanks in advance for any help.

You need to keep the subscriptions active between routes. You can do this using this package (written by the same author as FlowRouter so it all works nicely together):
https://github.com/kadirahq/subs-manager
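For example, a minimal sketch with SubsManager (the publication name 'divisions' and the template name are placeholders for whatever you actually use):
// client side; assumes the kadira:subs-manager package is installed
var divisionSubs = new SubsManager();

Template.divisionPicker.onCreated(function () {
  // subscribing through the manager keeps the subscription (and its
  // ~2000 documents) cached when you move between routes, instead of
  // tearing it down and resending the data on every visit
  divisionSubs.subscribe('divisions');
});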
Alternatively, create a Meteor method to return the data and save it in your Session. In this case it won't be reactive, so it depends on your needs.
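A rough sketch of that method/Session option (the collection, method, and Session key names here are made up):
// server
Meteor.methods({
  'divisions.fetch': function () {
    // returns a plain array once; no reactivity, no open subscription
    return Divisions.find().fetch();
  }
});

// client, run once at startup
Meteor.call('divisions.fetch', function (error, divisions) {
  if (!error) {
    Session.set('divisions', divisions);
  }
});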

Any subscription you make that's external to the routing will be in global scope, which means the data from that subscription is available everywhere. All you need to do is set up the subscription, say, in the root layout file for your site, and that data will be kept in your local Minimongo store at all times.
The todo list collection in the Todo app example here is an example of this; this is the code from that example:
Tasks = new Mongo.Collection("tasks");

if (Meteor.isServer) {
  // This code only runs on the server
  Meteor.publish("tasks", function () {
    return Tasks.find();
  });
}

if (Meteor.isClient) {
  // This code only runs on the client
  Meteor.subscribe("tasks");
}
You can then query that local data as you would normally.
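For example, a template helper can read straight from the local cache (the template and field names here are only illustrative):
Template.taskList.helpers({
  tasks: function () {
    // served from the local minimongo copy filled by the global subscription
    return Tasks.find({}, { sort: { createdAt: -1 } });
  }
});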

Related

What's the best way to propagate data through your Vue3 App?

Is there a recommended way to propagate data through your Vue components? What I'm trying to do is get the data from the backend once and propagate it everywhere in my project, but I can't find the right strategy.
sessionStorage: Works great and resets on refresh/closing the window, but as soon as you need to create target="_blank" anchor tags, it will not propagate your data to new tabs.
localStorage: Requires, in my opinion, more work than sessionStorage because you need to delete data manually to keep things tidy. One big problem for me is that it looks like you can't store markdown and arrays properly, at least not without stringifying them. I built a project with localStorage and ended up sending ajax requests from most of my components because I couldn't propagate the data through my app the way I wanted. At this point my frontend is basically my backend.
My personal problem with localStorage: I am using the marked package to display Markdown, but it throws errors if passed undefined. This gets problematic when I want to use it in a reactive state, because instead of resulting in undefined, it throws an error and crashes the whole app. The point I am trying to make is that when you pass an undefined localStorage value to marked in an either/or expression like so:
const state = reactive({
  value: marked(localStorage.value) || ""
})
it crashes your app if localStorage.value is empty.
Another problem is that I fetch text content depending on a locale and store it in localStorage. This is great until the user changes locale and all the content strings have to be replaced by the translated strings. It gets really tricky if I want to use one component as a template to load in different locales.
vuex: I've tried Vuex briefly and found it useful, but didn't see the benefit over just using localStorage for my purposes. I'll probably give it another go.
How do you propagate data through your app?
There are a few good arguments why Vuex is better than Local Storage:
https://www.quora.com/What-is-the-benefit-of-using-Vuex-over-LocalStorage-to-store-the-state-of-an-application-in-Vue-js
You can also try composables. They are reusable functions (similar to mixins) in the Composition API (you need the composition-api plugin in Vue 2; in Vue 3 it is built in). They can also be the place where you store your data, and they can be easier and more intuitive than Vuex.
First, create a /composables directory and add a JavaScript file, useState.js (it's good practice to name composable files beginning with the word use):
import { reactive, toRefs } from "vue";

const state = reactive({
  isMenuOpened: false
});

const toggleMenuState = () => {
  state.isMenuOpened = !state.isMenuOpened;
};

export default {
  ...toRefs(state),
  toggleMenuState
};
toRefs converts a reactive object to a plain object where each property is a ref pointing to the corresponding property of the original object.
Now you can use the composable in Vue components:
<script>
import useState from "./composables/useState";

export default {
  setup() {
    const { isMenuOpened, toggleMenuState } = useState;
    return {
      isMenuOpened,
      toggleMenuState,
    };
  },
};
</script>
Demo:
https://codesandbox.io/s/happy-chandrasekhar-o05uv?file=/src/App.vue
About composition api and composables:
https://v3.vuejs.org/guide/composition-api-introduction.html
Since you have mentioned local storage and session storage, I believe you must be trying to share your state across tabs/windows rather than just across different components on a single page. At that scale I don't think this is necessarily a Vue-specific issue or pattern; generally speaking, you want data to be shared across process boundaries.
Session storage used to be one of the most sensible options because it is shared between one window and all the child windows it has created, until all of them are closed, at which point the storage is discarded as well. However, depending on your use case, Chrome (within the past year) made a change to NOT inherit the session storage from the original window if the popup window is opened as noopener, so if you are relying on noopener (and its performance implications), session storage is no longer usable for this purpose.
Vuex does not solve this issue either. In fact, Vuex is pretty much irrelevant here. Given the application architecture implied, the state management capability Vuex brings to your app will likely be redundant, because any state mutation will probably be submitted to your backend anyway. In some sense the Vuex store is on your backend rather than your frontend.
So we typically do one of three approaches:
Directly broadcast from the backend to all frontend tabs. That is, there is no state syncing directly between frontend tabs. Every single tab (child window) communicates directly with the server: it mutates the state by submitting actions to the server, and only the server can change the state and broadcast the changes back to all the tabs in real time (again, conceptually it feels like the Vuex store is on your backend).
Use a SharedWorker. A SharedWorker is shared by all the browsing contexts with the same origin. It is activated the moment the first browsing context (of a certain origin) is created, and is kept alive until the last browsing context is destroyed. In some sense its sharing semantics are similar to those of the old session storage. You can use the SharedWorker as the single entity that communicates with your backend. State can be maintained by the SharedWorker and accessed from the tabs in an RPC fashion, or state can be maintained separately in each tab with the SharedWorker just broadcasting the changes to the tabs (a minimal sketch of this follows after this list).
If you actually do not have a backend, but you just want to build a multi-window single-page application, you can make one of your tabs special and have it act as the owner of the state store. For all the child windows created from this "master" window, their local store will be a proxy: the actions they perform against the local store are proxied over to the master window; the master window performs the action in its store and broadcasts the changes to all the child windows.
By the way, I have used the word "store" many times, but I do not necessarily mean the Vuex store. The store is just a shared place where you keep your state.
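To make the SharedWorker option above a bit more concrete, here is a minimal sketch; the file name, message shapes, and the idea of rebroadcasting whole state objects are assumptions for illustration, not a finished design:
// shared-state.js -- one instance per origin, shared by every tab
var ports = [];
var state = { todos: [] }; // whatever shared state you need

onconnect = function (event) {
  var port = event.ports[0];
  ports.push(port);

  // give the newly connected tab the current state right away
  port.postMessage({ type: 'state', state: state });

  port.onmessage = function (message) {
    if (message.data.type === 'update') {
      state = message.data.state;
      // rebroadcast the change to every connected tab
      ports.forEach(function (p) {
        p.postMessage({ type: 'state', state: state });
      });
    }
  };
};

// in each tab
var worker = new SharedWorker('shared-state.js');
worker.port.onmessage = function (message) {
  if (message.data.type === 'state') {
    // feed the shared state into your component state / composable here
    console.log('shared state changed', message.data.state);
  }
};
// push a change from this tab to all the others
worker.port.postMessage({ type: 'update', state: { todos: ['buy milk'] } });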

Send notifications from one laravel app to another

I have two different Laravel 5.4 apps: a restaurant menu system to receive and manage orders, and a website from which customers can place their orders. Both apps run on different servers (locally), which means that on my Windows system I can run only one app at a time (localhost:8000). Both use the same database tables. My question is: how can I notify the restaurant menu system when a user places an order from the website, i.e., when a new row is added to the Orders table in the db? I need a notification as well as an automatically generated new row in the table, like here:
Restaurant Menu System. I have tried doing it with jQuery Ajax, but failed, as there is nothing to trigger the ajax function on the order page. I tried jQuery setInterval() from here, but it seems like a very inefficient way and also gives an error of Uncaught SyntaxError: Invalid or unexpected token. I want it to be as smooth as Facebook notifications. Is there any package or trick to do it?
The website looks just like any other ecommerce website with a cart and checkout system from which the user can pay and place orders. Any leads are appreciated.
You have two options that I can think of.
One is a technique called comet, which I believe Facebook uses or at least used at one point. It basically opens an ajax connection with your server, and your server will occasionally check to see if there are any changes (in your case, new orders); when there are, it responds to the request appropriately. A very basic version of what that might look like is...
while (true) {
    $order = Order::where('new', 1)->first();

    if ($order !== null) {
        $order->new = 0;
        $order->save();

        return $order;
    }

    sleep(5); // However long you want it to sleep for each time it checks
}
When you open up an ajax connection to this, it's just going to wait for the server to respond. When an order is made and the server finally does respond, your ajax function will get a response and you will need to do two things.
Show the order and do whatever other processing you want to do on it
Re-open the connection which will start the waiting process again
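A bare-bones client loop for that, using jQuery since the question mentions it (the /orders/poll route and the showOrder() helper are hypothetical):
function pollForOrders() {
  $.getJSON('/orders/poll')      // hangs until the server responds with a new order
    .done(function (order) {
      showOrder(order);          // 1. show the order / add the new row
    })
    .always(function () {
      pollForOrders();           // 2. re-open the connection and wait again
    });
}

pollForOrders();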
The disadvantage of this approach is that it's still basically the setInterval approach, except you've moved that logic to the server. It is more efficient this way because it's just a single request instead of many, so maybe that's not a big deal. The advantage is that it's really easy.
The second way is a little bit more work, I think, but it would be even more efficient.
https://laravel.com/docs/5.4/broadcasting
You'd probably have to set up an event on your orders table so that whenever anything is created there, it uses broadcasting to reach out to whatever JavaScript code you have set up to manage that.
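On the client side that typically ends up as a Laravel Echo listener along these lines (the channel name, event name, and helper functions here are made up; the server-side event class and broadcast driver still need to be configured as the docs describe):
// assumes Laravel Echo is set up with your broadcast driver (Pusher, Redis + socket.io, etc.)
Echo.channel('orders')
    .listen('OrderCreated', function (e) {
        // e carries whatever public properties the event class broadcasts
        addOrderRow(e.order);                   // hypothetical: append the new row
        notify('New order #' + e.order.id);     // hypothetical: show the notification
    });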

Asynchronous data loading in flux stores

Say I have a TodoStore. The TodoStore is responsible for keeping my TODO items. Todo items are stored in a database.
I want to know what is the recommended way for loading all todo items into the store and how the views should interact with the store to load the TODO items on startup.
The first alternative is to create a loadTodos action that retrieves the todos from the database and emits a TODOS_LOADED event. Views call the loadTodos action, listen for the TODOS_LOADED event, and then update themselves by calling TodoStore.getTodos().
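A rough sketch of that first alternative, using the classic dispatcher/EventEmitter pattern (all names here are illustrative):
// TodoActions.js
var TodoActions = {
  loadTodos: function () {
    fetchTodoItemsFromDatabase().then(function (todoItems) {
      AppDispatcher.dispatch({ actionType: 'TODOS_LOADED', todoItems: todoItems });
    });
  }
};

// TodoStore.js
AppDispatcher.register(function (action) {
  if (action.actionType === 'TODOS_LOADED') {
    _todoItems = action.todoItems;
    TodoStore.emit('TODOS_LOADED'); // views listening for this event re-render via getTodos()
  }
});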
Another alternative is to not have a loadTodos action, and have a TodoStore.getTodos() that will return a promise with the existing TODO items. If the TodoStore has already loaded the TODO items, it just returns them; if not, then it will query from the database and return the retrieved items. In this case, even though the store now has loaded the TODO items, it will not emit a TODOS_LOADED event, since getTodos isn't an action.
function getTodos() {
  if (loaded) {
    return Promise.resolve($todoItems);
  }

  return fetchTodoItemsFromDatabase().then(function (todoItems) {
    loaded = true;
    $todoItems = todoItems;
    return $todoItems;
  });
}
I'm sure many will say that this breaks the Flux architecture, because the getTodos function is changing the store's state, and store state should only be changed through actions sent in from the dispatcher.
However, if you consider that the state of the TodoStore is the existing TODO items in the database, then getTodos isn't really changing any state. The TODO items are exactly the same, hence no view needs to be updated or notified. The only thing is that the store has now already retrieved the data, so it is cached in the store. From the view's perspective, it shouldn't really care how the store is implemented. It shouldn't care whether the store still needs to retrieve data from the database or not. All views care about is that they can use the store to get the TODO items and that the store will notify them when new TODO items are created, deleted, or changed.
Hence, in this scenario, views should just call TodoStore.getTodos() to render themselves on load, and register an event handler on TODO_CHANGE to be notified when they need to update themselves due to an addition, deletion, or change.
What do you think about these two solutions? Are there any other solutions?
The views do not have to be the entities that call loadTodos(). This can happen in a bootstrap file.
You're correct that you should try your best to restrict the data flow to actions inside the dispatch payload. Sometimes you need to derive data based on the state of other stores, and this is what Dispatcher.waitFor() is for.
What is Flux-like about your fetchTodoItemsFromDatabase() solution is that no other entity is setting data on the store. The store is updating itself. This is good.
My only serious criticism of this solution is that it could result in a delay in rendering if you are actually getting the initial data from the server. Ideally, you would send down some data with the HTML. You would also want to make sure to call for the stores' data within your controller-views' getInitialState() method.
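In other words, the bootstrap call can be as small as this (the file and component names are illustrative; React.render matches the React versions of that era):
// bootstrap.js -- kick off the initial load before mounting the controller-views
TodoActions.loadTodos();
React.render(React.createElement(TodoApp), document.getElementById('app'));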
Here is my opinion about that, very close to yours.
I maintain the state of my application in the store via Immutable.Record and Immutable.OrderedMap from Immutable.js.
I have a top controller-view component that gets its state from the store.
Something such as the following:
function getInitialState() {
  return {
    todos: TodoStore.getAll()
  };
}
The TodoStore.getAll method will retrieve the data from the server via an APIUtils.getTodos() request if its internal _todos map is empty. I advocate for reads being triggered in the store and writes being triggered in action creators.
While the request is processing, my component renders a simple loading spinner or something like that.
When the request resolves, APIUtils triggers an action such as TODO_LIST_RECEIVE_SUCCESS or TODO_LIST_RECEIVE_FAIL, depending on the status of the response.
My TodoStore responds to these actions by updating its internal state (populating its internal Immutable.OrderedMap with Immutable.Records created from the action payloads).
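The getAll read path described above looks roughly like this (a sketch reconstructed from the description, not the exact code):
// TodoStore.js (sketch)
var _todos = Immutable.OrderedMap();

var TodoStore = Object.assign({}, EventEmitter.prototype, {
  getAll: function () {
    if (_todos.size === 0) {
      // lazy read: kick off the request; the TODO_LIST_RECEIVE_* actions
      // populate _todos and emit a change event when the response arrives
      APIUtils.getTodos();
    }
    return _todos; // an Immutable.OrderedMap; empty on the first render
  }
});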
If you want to see an example through a basic implementation, take a look at this answer about React/Flux and xhr/routing/caching.
I know it's been a couple of years since this was asked, but it perfectly sums up the questions I am struggling with this week. So, to help any others that may come across this question, I found a blog post by Nick Klepinger that really helped me out: "ngrx and Tour of Heroes".
It is specifically using Angular 2 and #ngrx/store, but answers your question very well.

Difference between Dojo Data and dojo ajax

I am wondering when to use Dojo data (for example dojo.data.ItemFileReadStore) to get data from the server, and when to use Ajax (for example dojo.xhrGet).
Let me take as an example my homepage, where I give my user an overview of items. He can filter this so that he can choose to retrieve items of type A, type B, or items of type A and B.
Should I use:
dojo.xhrGet({
  url: "get-items.php", // json result
  handleAs: "json",     // parse the response as JSON
  load: function (response) {
    showItems(response.items);
  }
});
OR
dojo.data.ItemFileReadStore
Those two things have vastly different purposes:
dojo.xhr is a data transport: its main purpose is sending and receiving messages from the server.
dojo.data is a data store: its main purpose is to represent a collection of data items, supporting things like querying, monitoring for updates, etc. The fact that some data stores support being initialized directly from the server is just a coincidence; those features are there purely for convenience.
So
Use dojo.xhrGet if you just need to fetch data once and then are done with it.
Use a data store if you want to use the extra functionality from the datastore interface. (like serving as a model for Tree widgets or watching for updates in an MVC style)
Btw, since 1.6 there is a new dojo.store API as a leaner alternative to dojo.data. Keep that in mind when deciding whether to use a data store.
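For the filtering use case in the question, the store route could look roughly like this with the dojo.store API (the type field is made up; with the legacy dojo.data API you would reach for ItemFileReadStore and its fetch() method instead):
dojo.require("dojo.store.JsonRest");

// the store wraps the endpoint; you query it instead of hand-rolling xhrGet calls
var itemStore = new dojo.store.JsonRest({ target: "get-items.php" });

// fetch only items of type A; the query object is translated into request parameters
itemStore.query({ type: "A" }).then(function (items) {
  showItems(items);
});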

Storing, Loading, and Updating a Trie in ASP.NET MVC 3

I have a trie-based word detection algorithm for a custom dictionary. Note that regular expressions are too brittle with this dictionary as entries may contain spaces, periods, etc.
I've implemented the algorithm in a local C# app that reads in the dictionary from file and stores the trie in memory (it's compact, so no RAM size issues at all). Now I would like to use this algorithm in an MVC 3 app on a cloud host like AppHarbor, with the added twist that I want a web interface to enable adding/editing words.
It's fast enough that loading the dictionary from file and building the trie every time a user uploads their text would not be an issue (< 1s on my laptop). However, if I want to enable admins to edit the dictionary via the web interface, that would seem tricky since the dictionary would potentially be getting updated while a user is trying to upload text for analysis.
What is the best strategy for storing, loading, and updating the trie in an MVC 3 app?
I'm not sure if you are looking for specific implementation details or more conceptual ideas about how to handle this, but I'll throw some ideas out there for now.
Actual Trie Classes - Here is a good C# example of classes for setting up a Trie. It sounds like you already have this part figured out.
Storing: I would persist the trie data to XML unless you are already using a database and have some need to have it in a DBMS. XML will be simple to work with in the MVC application, and you don't need to worry about database connectivity issues or the added cost of a database. I would also keep two versions of the trie data on the server: a production copy and a production support copy, the latter being the one your admin performs transactions against.
Loading: In your admin module of the application, you may implement a feature for loading the trie data into memory; the frequency of data loading depends on your application's needs. It could be scheduled or available as a manual function. As with WordPress sites, if a user accesses the site while it is updating, they would receive a message that the site is undergoing maintenance. You may choose to load into memory on demand only, or keep the trie loaded at all times except when problems occur.
Updating: I'd have a second database (or XML file) that is used for applying updates. The method of applying updates to production would depend partially on the frequency, quantity, and timing of updates. One safe method might be to store the transactions entered by the admin.
For example:
trie.put("John", 112);
trie.put("Doe", 222);
trie.Remove("John");
Then apply these transactions to your production data as needed via an admin function. If needed, put your site into "maintenance" mode. If the updates are few and fast, you may be able to code the site so that it holds all work until the transactions are processed; a user might have to wait a few milliseconds longer for a result, but you wouldn't have to worry about data mutation issues.
This is pretty vague but just throwing some ideas out there... if you provide comments I'll try to give more.
1. Store the trie in cache: it is not dynamic data, and caching helps with other tasks (such as concurrent access to the trie by the admin and users).
2. Make access to the cache clear:
public class TrieHelper
{
    public Trie MyTrie
    {
        get
        {
            if (HttpContext.Current.Cache["myTrieKey"] == null)
                HttpContext.Current.Cache["myTrieKey"] = LoadTrieFromFile(); // returns a Trie object

            return (Trie)HttpContext.Current.Cache["myTrieKey"];
        }
    }
3. Lock the trie object while an add operation is in progress:
    public void AddWordToTrie(string word)
    {
        var trie = MyTrie;
        lock (HttpContext.Current.Cache["myTrieKey"])
        {
            trie.AddWord(word);
        } // note: holding the trie lock while writing the new word to the file is not required

        WriteNewWordToTrieFile(word); // should lock the FileWriter object
    }
}
4. If editing is performed by one admin at a time, store the trie in an XML file. It will be easy to implement the logic of finding the element after which your word should be added (you can create a function that uses the MyTrie object in memory) and to add it using LINQ to XML.
I've got kind of the same thing, but ten times bigger :)
The client designs its own calendar with questions and possible answers, while in the meantime another one is online and being used by normal users.
What I came up with was a test-and-deploy approach. The admin enters the calendar values and sets it up correctly, and afterwards he can use a Preview button to see if it's the way he needs/wants it; then, to make the changes valid for all end users, he needs to push Deploy.
He, as an ADMIN, will know that until he pushes the DEPLOY button, all users accessing the calendar will have the old values. As soon as he hits Deploy, everything is saved to the database, and the files he uploaded are pushed to Amazon S3 (for faster access).
I update the cache with the new calendar, and the new Calendar object stays cached until the app pool says otherwise or he hits the Deploy button again.
You could do something like this.
As you are going to run your application in a cloud environment, I'd suggest you take a look at CQRS and durable messaging, and provide some concurrency model (possibly optimistic concurrency and intelligent conflict detection http://skillsmatter.com/podcast/design-architecture/cqrs-not-just-for-server-systems 5:00).
Also, obviously, you need to analyze your business requirements more precisely because, as Udi Dahan mentioned, race conditions are a result of the lack of business analysis.
