I am making a Notes PWA app that works offline and online.
I am saving note records in IndexedDB with Dexie.js. I am using Quasar for the frontend and Laravel 8 for the backend.
I want to sync the IndexedDB database with the remote database. How can I do this?
Look at Dexie Cloud - David (the author of Dexie) has worked this out for Postgres.
If you want to write your own, it is a LOT MORE WORK than you think.
We wrote our own to work with SQL Server and C# (it is proprietary, so I can't share it).
Months later, we are still finding edge cases - there are many edge cases.
If you still want to write your own, you are probably mistaken, but here is my advice:
If records can be created client side, use GUID identifiers.
If it is an existing database with integer identifiers, add a GUID identifier field.
Set aside a few months of time to work out the rest.
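To make the GUID advice concrete, here is a rough sketch of what a simple pull/push sync endpoint could look like on the Laravel 8 side, using last-write-wins on an updated_at timestamp. The route, table and column names are assumptions for illustration, and real conflict handling (deletions, tombstones, clock skew) needs far more care - that is exactly where the edge cases mentioned above live.

```php
<?php
// Hypothetical sketch only: a last-write-wins sync endpoint for a Laravel 8
// backend. Assumes a `notes` table with a unique `uuid` column (generated by
// the client, per the GUID advice above) and that `uuid` is fillable on the
// Note model.

use Illuminate\Http\Request;
use Illuminate\Support\Facades\Route;
use App\Models\Note;

Route::post('/api/sync', function (Request $request) {
    // Timestamp of the client's last successful sync; everything newer on the
    // server gets sent back down.
    $since   = $request->input('last_synced_at', '1970-01-01 00:00:00');
    $changes = $request->input('changes', []);   // notes created/updated offline

    foreach ($changes as $change) {
        $note = Note::firstOrNew(['uuid' => $change['uuid']]);

        // Last-write-wins: only apply the change if it is newer than what the
        // server already has.
        if (!$note->exists
            || $note->updated_at === null
            || $note->updated_at->lt($change['updated_at'])) {
            $note->title      = $change['title'];
            $note->body       = $change['body'];
            $note->updated_at = $change['updated_at'];
            $note->timestamps = false;   // keep the client's timestamp instead of "now"
            $note->save();
        }
    }

    // Return everything that changed on the server since the client's last
    // sync, so it can be written back into IndexedDB via Dexie.
    return response()->json([
        'server_time' => now()->toIso8601String(),
        'changes'     => Note::where('updated_at', '>', $since)->get(),
    ]);
});
```

On the Dexie side the client keeps its own last_synced_at, POSTs everything it changed since then, and writes the returned changes back into IndexedDB. Deletions usually need a tombstone flag rather than a real delete, otherwise they cannot be synced.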
We have developed an internal CRM and used it for the last few months. Now we have decided to open it to the public as a SaaS project, and I'm wondering what the best way is to upgrade the database structure, which is currently designed for a single company, so that it can manage multiple paying customers.
At the moment the planned solution is to add a "customer" column to every table in the database and upgrade the backend logic to use this field.
Are there more elegant solutions to this problem?
The database is MySQL and the backend is built with Laravel.
CRM data can be very sensitive, and you need to be extremely careful not to "leak" data to the wrong customers.
For an existing app, I would argue for a system that creates a fresh DB for each customer.
You would have one codebase that connects to a customer-specific DB.
This way you don't need to change too much in your current DB structure, but "just" implement the mechanism to use the correct DB according to the customer account.
That is how I would approach it.
In any case, this is a massive paradigm change from an internal app to a SaaS platform, and you should identify the necessary steps to go through to achieve the desired result.
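For what it's worth, a minimal sketch of the "use the correct DB according to customer account" mechanism in Laravel could look like the middleware below. It assumes a small central database with a tenants table that maps each customer to its own database name, plus "landlord" and "tenant" connections defined in config/database.php; all of those names are invented for illustration.

```php
<?php
// Minimal sketch (not the original answerer's implementation) of switching
// the active database per customer. Assumes a central "landlord" connection
// holding a `tenants` table, and a "tenant" connection whose database name
// is swapped in per request.

namespace App\Http\Middleware;

use Closure;
use Illuminate\Http\Request;
use Illuminate\Support\Facades\Config;
use Illuminate\Support\Facades\DB;

class UseTenantDatabase
{
    public function handle(Request $request, Closure $next)
    {
        // Look up the tenant for the logged-in user in the central DB.
        // `tenant_id` on the user is a made-up column for this example.
        $tenant = DB::connection('landlord')
            ->table('tenants')
            ->where('id', $request->user()->tenant_id)
            ->first();

        // Point the "tenant" connection at this customer's database and
        // drop any cached connection so the new settings take effect.
        Config::set('database.connections.tenant.database', $tenant->database_name);
        DB::purge('tenant');

        return $next($request);
    }
}
```

Every customer-specific model would then declare protected $connection = 'tenant';, so one codebase transparently reads and writes the right database for the logged-in customer.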
I have just developed a Human Resources web app in Laravel.
So in this app, I have my user database.
Now, I have 2 old apps that I'm about to rewrite / migrate to Laravel.
The first one is to manage employee payments.
The second app is for monitoring my sales force in the field.
So, basically, I will rewrite them both.
My question is about architecture. I will develop those 2 apps with separate APIs in Lumen/Laravel for each app, and then write the clients in Angular 2.
I will use OAuth 2.0 to set up authorization and scopes between the apps.
Now, my indecision is about the database part.
Should I keep the databases separated, put them in the same schema, or how else should I deal with my databases?
Today, I have 1 DB per app, but I also have redundancy in users: each app works with the same users, and so this info is duplicated.
Right now, when a user is created in App1, it calls APIs to create it in App2 and App3. This is not so nice, IMHO.
So, I think I would like to have a single user database, but I don't really have experience with that:
Should I extract the users' info into a central database shared by all apps?
Should I keep the users' info duplicated between my DBs?
If I keep it in 3 different DBs, how should I handle JOINs, transactions, foreign keys, etc.?
I know there is not one answer, it depends on what I want, but this is also my problem: I don't know which questions I should be asking to make my decision.
It is important to mention that I don't have much traffic, max 1000 users, and they all belong to my enterprise; it will not increase much. So really, making a scalable app should not be my priority right now.
Any advice will be appreciated!
Since you have databases shared between your apps, the common practice is to have them in the same Laravel/Lumen app.
The way you can achieve this is by separating the apps into route groups, namespacing the controller hierarchies of each app, and declaring inside each model the connection it will use.
This way you can use the same DB connections for all of your apps and share the same data without the need to duplicate it.
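As a hedged illustration of the "declare inside each model the connection" part, something like the sketch below could work. The connection names (hr_app, payments_app) and model names are invented; each connection would be defined under connections in config/database.php and point at the corresponding database.

```php
<?php
// Illustrative sketch: two models pinned to different connections so the
// shared user data lives in one place while app-specific data stays in its
// own database. Connection names here are made up.

namespace App\Models;

use Illuminate\Database\Eloquent\Model;

// Shared user data: every app reads the same table over the same connection,
// so there is no need to duplicate users between databases.
class User extends Model
{
    protected $connection = 'hr_app';
}

// Data that belongs only to the payments app stays on its own connection.
class Payslip extends Model
{
    protected $connection = 'payments_app';

    // A relation across connections still works for simple belongsTo/hasMany
    // lookups, because Eloquent runs a second query rather than a SQL JOIN.
    public function user()
    {
        return $this->belongsTo(User::class);
    }
}
```

The route groups and namespaced controllers are then just the usual Route::prefix(...)->namespace(...)->group(...) pattern, one group per app. Note that relations across connections only work when Eloquent can resolve them with separate queries; a real SQL JOIN across two databases requires them to live on the same server.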
Say I have 10K users for my app and I want to switch to my own custom server for the backend. I have seen the Parse export functionality, but I don't see how it can help me in this situation.
I mean, even if I export all the data and update the app so that it makes calls to my new custom server, it will still take months for all my users to move to the updated version of the app (many users don't update immediately; my last update of the Facebook app was a year ago).
Also, during this transition half of my users would have their data on my custom server and the other half (those who haven't updated) would still be using the Parse server, so for queries that require all the data in one place this becomes an issue (I could solve this via replication, but imagine how slow it would be in real time to push the data both to my server and to parse.com).
Has anyone thought about this?
What you could do is this: when you release a new version of your app and a user logs in while they are still on Parse, migrate their data at that point to the new server; from then on, that user uses the custom server. That way users move to the new server as they upgrade. I always have a flag that is fetched from my server to force the user to upgrade if needed. Hope that helps.
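If it helps, the "flag fetched from my server" part can be as small as a single config endpoint on the new backend. This is only a hypothetical sketch in plain PHP (the question doesn't say what the custom server is written in); the endpoint parameters and helper are all made up.

```php
<?php
// Hypothetical sketch of a tiny "app config" endpoint on the custom server.
// The mobile client fetches this on launch: if its own version is below
// minimum_version it shows a forced-upgrade screen; otherwise it can check
// whether this user's data has already been migrated off Parse.

header('Content-Type: application/json');

$userId = $_GET['user_id'] ?? null;   // assumed request parameter
$minimumVersion = '2.0.0';            // first version that talks to the new backend

// Placeholder for a lookup in the new database, e.g. a migrated_at column on
// the users table that gets set when the login-time migration finishes.
function hasBeenMigrated(?string $userId): bool
{
    return false; // stub for illustration
}

echo json_encode([
    'minimum_version' => $minimumVersion,
    'migrated'        => hasBeenMigrated($userId),
]);
```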
Copying the data over to your new backend periodically until you have finalized your mobile client code, and then letting users update the app on the App Store or Google Play Store, would provide the switch-over. Doing that elegantly depends on the type of app and the user base you have. I wrote up part 1 of a blog post on these considerations for migrating from Parse to the Couchbase Mobile stack and the reasons to consider that stack.
If you can already put a new system in place so that new data lands in two places (Parse and the custom backend), then the copy and merge in the future might be easier to handle, but this is case by case. Then, on the mobile app update, you can deprecate the old server. Or push data to a local store for those users who will stay on older versions, since Parse will eventually stop working. Any new experiences will require an update to the new app version.
It seems to me a wise idea to test-run my workflow on a local server before deploying it at the customer's. To be entirely sure, I'd like to copy all the data from their DB to my test organization (I have full access rights). The problem is that I can't see any straightforward way to export the whole shebang to an XML spreadsheet.
What's the best way to export/import everything from/to a DB? The source and the target servers are not the same.
Of course I've got the option of backing up the client's DB and restoring it should the brown stuff hit the fan, but it would be far more professional if I didn't have to.
The client's DB is in the cloud, which makes me suspect that perhaps I won't be able to access it at all, and as far as I can see, there's no way to back up the data there. Am I missing something, or is it really that bad?
I fully agree that would be sensible. We usually have a number of development and test servers for all our work; generally, though, we do not exactly mirror the data in the client database.
We create a representative sample of data on our dev servers and then just move the CRM solution across for deployment.
As far as I know there is no straightforward way to get all the data; if you really want to do this, I would suggest taking a backup of their database and importing it into yours.
(As a side note, not all clients are happy for copies of their database to be taken off site, especially if it's a live system. Personally, if it is a live database, I wouldn't put that risk on yourself; if the data gets lost or leaked, you might suffer the consequences.)
James raises good points about the business aspects of your request; however, to get hold of the record-level data there are a few options. The easiest by far is a wholesale export and import of the underlying SQL database. (For the record, the alternative is to do a data migration from live into a different DB, but this is no small task, so I won't entertain it any further here.)
You mention that the client is using CRM Online ("...client's DB is in the cloud..."). You can raise a (free) support request with CRM Online Support, who will provide you with a copy of the YourOrg_MSCRM database, which can then be re-imported into an on-premise deployment.
If you wish to simply have a test instance that has a copy of the Microsoft CRM Online organization, Microsoft does provide a means to do that. Depending on how many professional user licenses the customer has, this may be free, but it could be an extra cost, and both instances would count against the storage limit for Microsoft CRM Online. You can see full details here - https://community.dynamics.com/crm/b/crmteamblog/archive/2014/03/20/introducing-sandbox-instances-in-crm-online.aspx . You can see steps on how to set up a sandbox instance here - https://technet.microsoft.com/en-us/library/dn467371.aspx "Add an instance to your subscription".
This is something that I have used with one of our Microsoft CRM customers, as it was a very good way to help validate the Scribe Online migration and customization changes we were making before moving them into production. The nice thing about doing it this way is that everything is still contained in the same Office 365 tenant, and you can limit which users have access to the sandbox organization, which is important for customers in knowing that their data is safe and not on some unknown server or machine.
I need the document or form to be automatically updated when the data in the Oracle database is updated. If that's not possible, could anyone point me toward a similar solution?
Thanks for the help.
Oracle supports triggers written in Java, so you could execute some code when data is changed in the DB. SharePoint 2007 does support web services, so you could create a client which calls the web service to update the form.
But that sounds simpler than it is. The documentation for the web services in SharePoint ... uh ... could be better. Many installations of SharePoint insist on domain logins, so you would need to figure out a way to run Oracle with a Windows domain logon.
In the end, it's probably simpler to send an email when the data changes in Oracle and have someone manually update the form.
That said, you paid a lot of money to Microsoft for SharePoint, so they are obliged to tell you which API to use, and they can probably even provide an example in, say, C#. If all else fails, you can run a little server in C# which updates the form and listens for data packets sent by a Java trigger in Oracle.
[EDIT] [Here is a blog post][2] to get you started with Java, Apache Axis and SharePoint. Post a comment if you have any updates. Unfortunately, I'm no longer working at a company which uses SharePoint.