Using a SQLite database with a Windows Store app

I guess I'm not the first one to encounter this issue, but I can't find much information after a bit of research. Here is my question:
A Windows Store app accesses a SQLite database. The database contains a few tables and is read only; its size is 20 MB. At startup, the app copies the database to the application folder (if it is not already there). It works fine when I test it manually (although it is not lightning fast), but it always fails badly when tested against the certification test toolkit: it fails the performance test with an "app crash" or "app can't start" error.
So my questions are:
1) Is this the correct way of using a SQLite database in a Windows Store app (I mean, using a 20 MB database locally), or should I port the data to the cloud?
2) Does a failure in the certification toolkit really matter? (Will it also mean failure of the publishing process?)
Thanks in advance

You are on the right track. If your app doesn't need Internet connectivity at all, then don't go for a cloud database. You should copy the database behind an extended splash screen; don't do that work in App.xaml.cs. A cloud database would add request-response latency to every query; local SQLite transactions are faster than that.
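For reference, here is a minimal sketch of that copy-on-first-run step (the file name and the Assets folder are assumptions; adjust them to wherever the database ships in your package). Running it from the extended splash screen keeps the first launch responsive while the 20 MB copy happens:

    using System.Threading.Tasks;
    using Windows.ApplicationModel;
    using Windows.Storage;

    public static class DatabaseBootstrapper
    {
        // Hypothetical file name; use whatever your packaged database is called.
        private const string DbFileName = "app.db";

        public static async Task EnsureDatabaseCopiedAsync()
        {
            // Skip the copy if a previous run already put the database in place.
            var existing = await ApplicationData.Current.LocalFolder
                .TryGetItemAsync(DbFileName);
            if (existing != null)
                return;

            // Copy the read-only database out of the install location.
            StorageFile packaged = await Package.Current.InstalledLocation
                .GetFileAsync(@"Assets\" + DbFileName);
            await packaged.CopyAsync(ApplicationData.Current.LocalFolder,
                DbFileName, NameCollisionOption.FailIfExists);
        }
    }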
The certification may fail if you are not using the latest version of the WACK (Windows App Certification Kit). If your app fails the WACK test, it won't be published.

Related

Self-Hosted Integration Runtime times out after ADF pipeline loads a few tables

I have recently installed the Integration Runtime on a local server which hosts an Access DB. The idea is to pull data from it and store it in an Azure SQL DB. I have done the following:
Installed the Integration Runtime service on the local machine hosting the MS Access DB and connected to it using ODBC
Created linked services in ADF to connect to the DB
Created datasets for the source and destination DB for each table required: one for the source MS Access DB and one for the target Azure SQL DB
Created a pipeline to copy the data from the source and sink it into the Azure SQL DB mentioned in step #3
Basically, all the connections work. However, when I trigger my pipeline to load around 10 of these tables, it runs, loads the first two, and then fails by timing out. I must restart the Integration Runtime every time to get it back up and running; otherwise I can no longer query the tables.
To mitigate this, I figured there was too much traffic and the server needed to rest between calls, so I added wait timers between each step of the pipeline, without much success. It may have helped a bit, but that might be coincidence.
The error the monitor spits out at the failed step is:
Error: 2200
ErrorCode=UserErrorFailedToConnectOdbcSource,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=,Source=Microsoft.DataTransfer.Runtime.GenericOdbcConnectors,''Type=System.Data.Odbc.OdbcException,Message=,Source=,'
By the way, the Integration Runtime service version is 5.12.7984.1, Access is installed through Office 365 x64, and the exact MS Access driver version is 16.14430.20006. The OS is Windows Server 2019.
I am getting the exact same error. To start, I checked the Event Viewer logs and saw some access-related errors, so I gave the user running the IR more access to the registry keys and general "log on as a service" rights. This helped a little, but I am still stuck with the same problem.
When copying from an Access DB located on the SHIR itself to flat files in lake storage, I encountered the same error.
Removing Office 365 from the machine, then re-installing the Access runtime solved the problem.
This answer is from #ezaidi's comments above.

A couple of Heroku Postgres questions (just started, am lost)

I have provisioned Postgres on my Heroku app and also installed Postgres locally to maintain parity with the online database (as the documentation recommends), but I'm not understanding how this will work. Am I supposed to be accessing a local copy of the database when running on my own computer (while building, before deploying), and then using Heroku's separate Postgres database once the app is deployed? If it is parity, shouldn't they both be using the Heroku Postgres database?
In other words, will my local app (during development) and my Heroku app (deployed and live) be using the same online Postgres database?
Thanks.
Am I supposed to be accessing a local copy of the database when running on my own computer (while building, before deploying), and then using Heroku's separate Postgres database once the app is deployed?
Yes, that's exactly it. Without seeing what bit of documentation you're referencing, it's hard to say what they mean, but perhaps there's another way to explain it.
In your local development environment, you may find that you need to test database schema changes (this is just one example; there are many). If you only had the one Heroku Postgres database, you'd be forced to test those changes in production, which might result in poor usability for your users, and that doesn't even account for the possibility of making a mistake and accidentally destroying your production data. There are a number of other shortcomings and challenges with this single-database configuration.
For these reasons and more, it's best to keep your production data completely separated from your development/staging/test environments by creating a local/staging database. You might reasonably ask, "What about the data? I need data to test!" There are many ways to put together your test database, and which you choose will likely depend on your needs. A shortlist of possibilities (a configuration sketch follows the list):
Use a seed file to generate mock data in your db
Use a model factory (usually runs in conjunction with your testing framework)
Take a dump of your production database, anonymize and redact sensitive information, and use that for local testing.
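Whichever of those you choose, the usual way to keep one codebase pointing at the right database is to read the connection string from the environment. A minimal sketch, in C# purely for illustration (Heroku sets DATABASE_URL when you provision Postgres; the local fallback URL below is a placeholder):

    using System;

    public static class DbConfig
    {
        // On Heroku, DATABASE_URL is set by the Postgres add-on; in local
        // development it is normally unset, so we fall back to a dev database.
        public static string GetDatabaseUrl()
        {
            return Environment.GetEnvironmentVariable("DATABASE_URL")
                ?? "postgres://dev:dev@localhost:5432/myapp_dev"; // placeholder
        }
    }

This way the deployed app and your local app use different databases without any code changes between environments.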

Is 'quick.db' supported on Heroku?

I receive this error whenever I run an application that requires quick.db.
Here's the error:
There's nothing preventing you from building the quick.db dependency into your application, but it will not work as expected. It uses SQLite under the hood, which will attempt to store your data on the file system. On Heroku, that file system is ephemeral, and your data will be wiped at least once per day.

Laravel Restore a Backup

I'm fairly new to server administration. I have my Laravel app up and running and I want to make sure it has proper backups. I have researched some backup packages and settled on https://github.com/spatie/laravel-backup.
However, if the server ever fails, I need to know how to use the most recent backup (which will be on AWS S3) to restore the database on the rebuilt server. Are there any suggestions for guides on how to do this? I can't seem to find any, unless it doesn't really require much learning and is instead just a couple of MySQL commands.
Thanks!
I would use replication, and within Laravel I would try to switch the connection to the replica database server so things can run smoothly until the problem is resolved.
Take a look at this: Cross-Region Replication.
A typical production environment automatically runs backups of the most important things your deployment needs in order to recover from a failure. Those would commonly be your database, your storage folder, and configuration files.
Also, when you deploy a Laravel application there aren't many things that are "worth" backing up; you can mirror the entire disk somewhere, or schedule a backup script which runs periodically and backs up the things that are most important to your application.
Personally, I wouldn't rely on a Laravel package to handle my backups; you can always use other backup utilities, replication, and so on.
Update
Take a look at the link below:
User Guide » Amazon RDS DB Instance Lifecycle » Backing Up and Restoring
You can call the API function RestoreDBInstanceFromDBSnapshot, as shown in the examples there.
But I don't think anything automated exists that would auto-restore or magically make everything work; you would need to do a lot of safety checks if something like that were even attempted. Final word: I believe a good, solid solution is to enter or send the restore request manually.
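If you do go the RDS snapshot route, here is a minimal sketch of that call using the AWS SDK for .NET (the instance and snapshot identifiers are placeholders, and this assumes RDS snapshots rather than the spatie package's S3 archives):

    using System;
    using System.Threading.Tasks;
    using Amazon.RDS;
    using Amazon.RDS.Model;

    public static class RdsRestore
    {
        public static async Task RestoreFromSnapshotAsync()
        {
            // Credentials and region are resolved from the environment/profile.
            var client = new AmazonRDSClient();

            // Spins up a brand-new DB instance from an existing snapshot;
            // both identifiers below are placeholders.
            var response = await client.RestoreDBInstanceFromDBSnapshotAsync(
                new RestoreDBInstanceFromDBSnapshotRequest
                {
                    DBInstanceIdentifier = "myapp-db-restored",
                    DBSnapshotIdentifier = "myapp-db-snapshot"
                });

            Console.WriteLine("New instance status: "
                + response.DBInstance.DBInstanceStatus);
        }
    }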

How do I access a database on one PC from other PCs on a local network, using a C# Windows application in Visual Studio?

I made a piece of software using C#, and my database is MS Access. Now I want to build a gateway that will be installed on all PCs in a local network, and I want those PCs to connect to the database, which lives on a single computer. But I don't know the procedure for doing that in a C# Windows application. Please help; thanks.
I'm trying to upload data from a client PC to the database on the server, but I don't know how to write that code. Some example code would be much appreciated. Thanks in advance.
It would be helpful to know a bit more about what you're trying to achieve (you can edit your question to provide more information), but fundamentally you've got two options:
As it's an Access database, you can put the .mdb file in a shared folder and it will be accessible to multiple instances of your application (store the location in the configuration data for your application). This will work, in some cases very well, but Access can be a bit slow running over a network as it's all file based. (A connection sketch follows at the end of this answer.)
Create a self-hosted "web" service (WCF ideally, but I can't remember what your options are for VS2005) that provides the methods you need to interact with the database, and then connect from your client applications to the "server" application over the network.
The "best" solution will depend on the detail of your problem and what you're trying to achieve. If each instance of the application accessing the database directly is the preferred choice then I'd strongly advise you look at using SQL Server Express instead of access.
