For some reporting needs I have to use expressions like
{{ Model1.Model2.Model3.name }}
inside for loops. I know it is not the best approach (maybe even the worst), but things happened and now I have to figure out some way to make this load faster, because even though there are only 300 rows it takes nearly 10 seconds to load.
My question is: how can I cache some of these results, which are not actually queries written on the backend? Or would you suggest another way to make the page load faster?
Have you tried using a different strategy to get the model metadata? If you didn't set up anything in your config file, then every time you make a query Phalcon must first query the database to learn the "metadata" of the table (columns, column types, nullability, etc.).
You can change your strategy to annotations, or at least cache the table metadata.
Please check the Phalcon documentation.
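For example, in a typical Phalcon services file you can register a metadata adapter in the DI container. This is only a minimal sketch - the cache directory, the BASE_PATH constant and the switch to the annotations strategy are assumptions, not something from the question:

```php
<?php
// services.php - sketch only; paths and adapter choice are assumptions
use Phalcon\Mvc\Model\MetaData\Files as FilesMetaData;
use Phalcon\Mvc\Model\MetaData\Strategy\Annotations as AnnotationsStrategy;

$di->setShared('modelsMetadata', function () {
    // Cache column metadata on disk so Phalcon does not have to
    // DESCRIBE every table on every request
    $metaData = new FilesMetaData([
        'metaDataDir' => BASE_PATH . '/cache/metadata/',
    ]);

    // Optionally read the metadata from annotations on the models
    // instead of querying the database at all
    $metaData->setStrategy(new AnnotationsStrategy());

    return $metaData;
});
```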
For the last 6 months I have been developing Ember.js components.
I ran into my first performance problems when I tried to develop a table component. Each cell in this table was an Ember.View, and each cell had a binding to an object property. When the table had 6 columns and I tried to list about 100 items, the browser froze for a while. So I discovered that it is better to write a function which returns a string instead of using Handlebars, and to handle the bindings manually with observers.
So, is there any good practice for how to use a minimum of bindings? Or how to write bindings without losing a lot of performance? For example, not using bindings for a large amount of data?
How many Ember.View objects is it possible to append to a page?
Thanks for the response.
We've got an Ember application that displays hundreds of complex items on a single page. Each item uses out-of-the-box Handlebars bindings to output about a dozen properties. It renders very quickly (< 100ms) and very little of that time is spent on bindings.
If you find the browser is hanging at 6 columns and 100 items, something else is almost certainly wrong.
So, is there any good practice for how to use a minimum of bindings? Or how to write bindings without losing a lot of performance?
Try setting Ember.LOG_BINDINGS = true to see what is going on. It could be that the properties you are binding to have circular dependencies or are costly to resolve. See this post for great tips on debugging:
http://www.akshay.cc/blog/2013-02-22-debugging-ember-js-and-ember-data.html
How many Ember.View objects is it possible to append to a page?
I don't believe there is a concrete limit; it will certainly depend on the browser. I wouldn't consider optimizing unless there were more than a few thousand.
Check out this example to see how Ember can be optimized to handle very large tables:
http://addepar.github.com/ember-table/
Erik Bryn recently gave a talk about Ember.ListView which would drastically cut down the number of views on a page by reusing list cells which have gone out of the viewport.
Couple this technique with the group helper to reduce the number of metamorph script tags when your models update multiple properties at the same time. Additionally, consider using {{unbound}} if you don't need any of the properties to live-update.
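As a rough illustration (the rows and property names here are hypothetical, and {{#group}} assumes the third-party group helper mentioned above):

```handlebars
{{!-- Minimal sketch: render the table body in one batch, without live bindings --}}
{{#group}}
  {{#each row in rows}}
    <tr>
      {{!-- unbound renders once: no metamorph script tags, no live updates --}}
      <td>{{unbound row.name}}</td>
      <td>{{unbound row.price}}</td>
    </tr>
  {{/each}}
{{/group}}
```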
Suppose I have to create functionalities A, B and C through custom coding in Drupal using hooks.
Either I can combine all three of them in custom1.module, or I can create three separate modules for them, say custom1.module, custom2.module and custom3.module.
Benefits of creating three modules:
Clean code
Easily searchable
Mutually independent
Easy to commit in multi-developer projects
Cons:
Every module entry gets stored in the database and requires a query.
To what extent does it mar the performance of the site?
Is it better to create a single large custom module file for the sake of reducing database queries or break it into different smaller ones?
This issue might be negligible for small-scale websites; consider instead the case of large-scale, performance-oriented sites.
I would code it based on how often I need functions A, B and C.
Actual example:
I made a module which had three needs:
1) Send periodic emails based on user preference. Let's call this function A.
2) Custom content made in a module. Let's call this function B.
3) Social integration. Let's call this function C.
Since function A is only called once a week, I made a separate module for it.
As for functions B and C, I put them together as they would always be called together.
If you have problems with performance then check out this link. It's a good resource for performance improvement.
http://www.vmirgorod.name/10/11/5/tuning-drupal-performance
It lists a nice module called Boost. I have not used it, but I have heard good things about it.
cheers,
Vishal
Drupal .module files are all loaded on every page load, so there is very little performance to be gained or lost simply by separating functions into different .module files. If you are not using an opcode cache, you can get improved performance by creating .inc files and referencing them from the menu items in hook_menu, so that those files only get loaded when the menu items are accessed. Then infrequently called functions do not take up memory space.
File separation in general is a very small performance issue compared to how the module is designed with respect to caching, memory use, and/or database access and structure. Let the code maintenance and dependency issues drive when you do or do not create separate modules.
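As a rough Drupal 7 sketch of that pattern (the module, path, and callback names below are hypothetical):

```php
<?php
// mymodule.module - hypothetical example of pushing a rarely used page
// callback out into an .inc file.
function mymodule_menu() {
  $items['admin/reports/mymodule'] = array(
    'title' => 'My report',
    'page callback' => 'mymodule_report_page',
    'access arguments' => array('access administration pages'),
    // The callback lives in mymodule.report.inc; Drupal loads that file
    // only when this path is requested, so the function does not sit in
    // memory on every other page load.
    'file' => 'mymodule.report.inc',
  );
  return $items;
}
```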
I'm actually interested in:
Is it better to create a single large custom module file for the sake of reducing database queries or break it into different smaller ones?
I was poking around and found a few things regarding benchmarking the database. Maybe the suggestion here is to fire up the dev version and test. Check out db benchmarking.
Now, I understand that doesn't answer the question specifically, but I'd have to say it is unique to each environment. I hate to give that type of answer, but I truly believe it is. It depends on the modules installed, the versions used, hardware and OS tunables, among many other things.
First of all, let me thank you all! You really help a lot. When I finish my website and have plenty of time to watch the user base grow, I will come here again and again to answer other people's questions (if I can).
So here is the problem.
I made a website on CodeIgniter - a social network engine, something like phpFox, Classmates.com or Facebook.
Right now it is not multilingual, so the UI strings are in the view files; the next step will be to move them to language files.
I want the user to be able to change the language, so I assume the user table will have a column "lang_local", set to en by default, which can then be changed to any other language.
What is eating my nerves and energy is the following.
I will build several demographic social networks on this engine, and I would like to manage these websites in a centralized manner with one backend. So whenever I want to launch a new network, I just add the domain settings, install the script in a new folder and add it to the sites table in the database.
I see it like this:
Every table in the database (users, comments, messages, categories, etc.) will have a column site_id, and to each add/update/delete query I add a WHERE site_id = XXX.
The table sites (site_id, site_name, domain_name) will hold all the domains, so that in the backend I can filter data by website.
Is this a good way? What if I then need to go multi-server - what about load balancing? Who can tell me what the right, PROFESSIONAL way would be? My maximum user limit for a database is something like 10,000 at the start and 100,000 users within one to two years.
There are loads of ways to do multi-site, but this is a perfectly good way to handle things. I use this approach in my internal work CMS.
The only downside is that it could potentially become massive and have performance issues. You may need to write an export script so you can grab everything belonging to a site and move it to its own install.
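In CodeIgniter terms, the site_id scoping described in the question might be centralized in a base model along these lines (the class, table, and config item names are assumptions, not taken from the question):

```php
<?php
// application/core/MY_Model.php - minimal sketch of scoping every query by site
class MY_Model extends CI_Model {

    protected $table;

    protected function site_id() {
        // Assumed to be resolved once per request, e.g. by matching the
        // current domain against the `sites` table and storing the result
        // in config as 'current_site_id'.
        return $this->config->item('current_site_id');
    }

    public function get_all() {
        return $this->db
            ->get_where($this->table, array('site_id' => $this->site_id()))
            ->result();
    }

    public function insert(array $row) {
        $row['site_id'] = $this->site_id();
        return $this->db->insert($this->table, $row);
    }
}
```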
I've read a statement somewhere that generating the UI automatically from the DB layout (or business objects, or whatever other business layer) is a bad idea. I can also imagine a few good challenges one would have to face in order to build something like this.
However, I have not seen (nor could I find) any examples of people attempting it. Thus I'm wondering - is it really that bad? It's definitely not easy, but can it be done with any measure of success? What are the major obstacles? It would be great to see some examples of successes and failures.
To clarify - by "generating UI automatically" I mean that all forms with all their controls are generated completely automatically (at runtime or compile time), based perhaps on some hints in metadata about how the data should be represented. This is in contrast to designing forms by hand (as most people do).
Added: Found this somewhat related question
Added 2: OK, it seems that one way this can get pretty fair results is if enough presentation-related metadata is available. For this approach, how much would be "enough", and would it be any less work than designing the form manually? Does it also provide greater flexibility for future changes?
We had a project which would generate the database tables/stored proc as well as the UI from business classes. It was done in .NET and we used a lot of Custom Attributes on the classes and properties to make it behave how we wanted it to. It worked great though and if you manage to follow your design you can create customizations of your software really easily. We also did have a way of putting in "custom" user controls for some very exceptional cases.
All in all it worked out well for us. Unfortunately it is a commercial banking product and the source is not available.
It's OK for something tiny where all you need is a utilitarian way to get the data in.
For anything resembling a real application, though, it's a terrible idea. What makes for a good UI is the humanisation factor, the bits you tweak to ensure that this machine reacts well to a person's touch.
You just can't get that when your interface is generated mechanically... well, maybe with something approaching AI. :)
Edit - to clarify: a UI generated from code/DB is fine as a starting point; it's just a rubbish end point.
This is not difficult to achieve at all, and it is not a bad idea at all; it all depends on your project's needs. A lot of software products (mind you, not projects but products) depend on this model so that they don't have to rewrite their code / UI logic for different clients' needs. Clients can customize their UI the way they want using a designer form in the admin system.
I have used XML for preserving the metadata for this sort of thing. Some of the attributes I saved for every field were:
friendlyname (label caption)
haspredefinedvalues (yes for drop-down list / multi-checkbox list)
multiselect (if yes then checkbox list, if no then drop-down list)
datatype
maxlength
required
minvalue
maxvalue
regularexpression
enabled (to show or not to show)
sortkey (order on the web form)
Regarding positioning, I did not care much and simply generated table tr td tags one below the other. However, if you want to implement this as well, you can add one more attribute called CssClass where you can define UI-specific properties (look and feel, positioning, etc.).
UPDATE: Also note that a lot of e-commerce products follow this kind of dynamic UI when you enter product information - their clients can be selling everything under the sun, from furniture to sex toys ;-) - so instead of rewriting their code for every different industry, they simply let their clients enter metadata for product attributes via an admin form. :-)
I would also recommend that you look at the Entity-attribute-value model - it has its own pros and cons, but I feel it can work quite well with your requirements.
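To make that concrete, a per-field metadata entry along the lines described above might look roughly like this (all element and attribute names are hypothetical):

```xml
<!-- Hypothetical field definition used to render one form control -->
<field name="product_price"
       friendlyname="Price"
       datatype="decimal"
       required="true"
       minvalue="0"
       maxvalue="99999"
       maxlength="10"
       haspredefinedvalues="false"
       multiselect="false"
       enabled="true"
       sortkey="3"
       cssclass="numeric-input" />
```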
In my opinion, there are some things you should think about:
Does the customer need a function to customize his UI?
Are there a lot of different attributes or elements?
Is the effort of creating such a "rendering engine" worth it?
Okay, I think it's pretty obvious why you should think about these. It really depends on your project whether that kind of model makes sense...
If you want to create a lot of forms that can be customized at runtime, then this model could be pretty useful. Also, if you need to build a lot of smaller tools and you use this as some kind of "engine", then the effort could be worth it because you can save a lot of time.
With that kind of "rendering engine" you could automatically add error reporting, check values, or add other things that are always built up with the same pattern. But if you have too many of these things, elements or attributes, then performance can go down rapidly.
Another thing that becomes interesting in bigger projects is that changes that have to occur in every form only have to be made in the engine, not in each form. This can save A LOT of time if there is a bug in the finished application.
In our company we use a similar model for an interface generator between cash-register software and our application, except that it doesn't create a UI but rather an output file for one of the applications.
We use XML to define the structure, how the values need to be converted, and so on.
I would say that in most cases the data is not suitable for UI generation. That's why you almost always put a layer of logic in between to interpret the DB information for the user. Another thing is that when you generate the UI from the DB, you end up displaying the inner workings of the system, something you normally don't want to do.
But it depends on where the DB came from - whether it was created to exactly reflect the users' goals for the system. If the users' mental model of what the application should help them with is stored in the DB, then it might just work. But then you have to start at the users' end. If not, I suggest you don't go that way.
Can you look at your problem from an application architecture perspective? You risk becoming another database terrorist - trying to solve everything in stored procedures. Why have a UI at all? Just do it in a DB script - and what kind of composite system do you end up with then? When a system serves different businesses, try modularization, selectively discovered components, and restricted sharing of references. The UI should be replaceable and independent of the business layer. When you store this much in the DB, the UI has a hard dependency on it and the system becomes a monolith. How would you implement the MVVM pattern when the UI is generated? Designers like Blend contain lots of features that cannot be replaced by even the most futuristic UI generator - unless your development platform is Notepad only.
There is a hybrid approach where forms and the like are described in a database to ensure consistency on the server side, and are then compiled on deploy to ensure efficiency on the client side.
A real-life example is the enterprise software MS Dynamics AX.
It has a 'Data' database and a 'Model' database.
The 'Model' stores forms, classes, jobs and every artefact the application needs to run.
Deploying a new software structure used to mean dumping the model database and initiating a CIL compile (CIL stands for Common Intermediate Language, used by Microsoft in .NET).
This approach is suitable for enterprise-wide software and can handle large customizations. But keep in mind that it sets up a framework that should be well understood by whoever is going to maintain and customize the application later.
I did this (in PHP / MySQL) to automatically generate sections of a CMS that I was building for a client. It worked OK; my main problem was that the code that generates the forms became very opaque and difficult to understand, therefore difficult to reuse and modify, so I did not reuse it.
Note that the tables followed strict conventions, such as naming, which made it possible for the UI to expect particular columns and infer information from the naming of the columns and tables. There is a need for meta information to help the UI display the data.
Generally it can work; however, if your UI just mirrors the database then there is probably a lot of room for improvement. A good UI should do much more than mirror a database; it should be built around human interaction patterns and preferences, not around the database structure.
So basically, if you want to be cheap and do a quick-and-dirty interface that mirrors your DB, then go for it. The main challenge would be to find good-quality code that can do this, or to write it yourself.
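A very rough sketch of that convention-based approach (the connection details, table name, and the type-to-widget mapping are all assumptions):

```php
<?php
// Sketch: build a crude edit form straight from a table's column metadata.
$pdo = new PDO('mysql:host=localhost;dbname=cms', 'user', 'pass');
$columns = $pdo->query('SHOW COLUMNS FROM articles')->fetchAll(PDO::FETCH_ASSOC);

echo "<form method=\"post\">\n";
foreach ($columns as $col) {
    if ($col['Field'] === 'id') {
        continue; // assume the primary key is never edited directly
    }
    // Convention: derive the label from the column name
    $label = htmlspecialchars(ucwords(str_replace('_', ' ', $col['Field'])));
    // Very rough type-to-widget mapping
    if (strpos($col['Type'], 'text') !== false) {
        printf("<label>%s</label> <textarea name=\"%s\"></textarea><br>\n", $label, $col['Field']);
    } else {
        printf("<label>%s</label> <input name=\"%s\"><br>\n", $label, $col['Field']);
    }
}
echo "</form>\n";
```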
From my perspective, it was always a problem to change edit forms when a very simple change was needed in a table structure.
I always had the feeling that we had to spend too much time rewriting the CRUD forms instead of developing the useful stuff, like processing / reporting / analyzing data, giving alerts for decisions, etc.
For this reason, a long time ago I made a code generator. It became easy to re-generate the forms, with one simple restriction: keep the CSS class names. As simple as that!
The UI was always based on very "standard" code, controlled by a custom CSS.
Whenever I needed to change the database structure, and therefore update an edit form, I had to re-generate the code and redeploy.
One disadvantage I noticed was that changes (customizations, improvements, etc.) made to the previously generated code are lost when you re-generate it.
But anyway, the advantage of having a lot of the work done by the code generator was great!
I initially did it for 2000s-era Microsoft ASP (Active Server Pages) and Microsoft SQL Server... so, when that technology was replaced by .NET, my code generator became obsolete.
I made something similar for PHP but I never finished it...
Anyway, from small experiments I found that generating code ON THE FLY can be far more helpful (and this approach does not exclude saving the generated code): no worries about changing the database, etc.
So, the next step was to create something that I am very proud to show here, and I think it is a nice resolution of the issue raised in this thread.
I would start with applicable use cases: https://data-seed.tech/usecases.php.
I have worked on adding details about how to use it, but if something is still missing please let me know here!
You can change the database structure and, without a single line of code, start editing data; more than that, you have an API available for CRUD operations.
I am still a fan of the "code generator" approach, and I think it is just another flavor of the XML/XSLT approach I used for DATA-SEED. I plan to add code-generator functionality.