Move records from Development to Production in CloudKit

I did something silly and made hundreds of records in the Development environment in CloudKit. A previous thread mentioned that the records could be downloaded into a file and re-uploaded to the Production environment. Is there any other way I could do this, and if not, how would I go about downloading the records and storing them in a file?
Thanks in advance!

There is no option to do it in one run. You need an app that is connected to the development environment for reading your records; writing to the production environment is only possible by re-signing your app. So indeed you first need to download all the data, store it somewhere, and then write it back to your production database.
Since CKRecord conforms to the NSCoding protocol, you can write the results of your query directly to a file using:
NSKeyedArchiver.archiveRootObject(records, toFile: filePath)
Then if you want to read that file you can use:
let result = NSKeyedUnarchiver.unarchiveObject(withFile: filePath)

Related

Where can I find the database Room.DataBaseBuilder(...).build() creates?

I'm quite new to Android and I am currently working on an app which should utilize a Room database. Following the documentation, a Room database can be created through the following lines:
myDatabase = Room.databaseBuilder(appContext, MyDatabase.class, "MyDB")
.build();
Now where did Room create the database file?
It can't be found in my project folder.
The documentation doesn't mention anything about it and, generally speaking, barely gives any information about how this thing works.
Where is the database?
Does databaseBuilder(...).build() manage to open the existing database created by previous app launches?
The list of questions is long.
Any information about the .build() call, as well as further information about Room (misconceptions etc.), is very much appreciated, since the documentation doesn't really make things clear for me.
Thank you!
Now where did Room create the database file?
The database (a file) will be placed at the default location on the actual device, which is /data/data/<the_package_name>/databases/MyDB.
In your case, as you have coded :-
myDatabase = Room.databaseBuilder(appContext, MyDatabase.class, "MyDB")
.build();
Then the database files will be:-
data/data/<your_package_name>/databases/MyDB
data/data/<your_package_name>/databases/MyDB-wal
data/data/<your_package_name>/databases/MyDB-shm
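If you want to confirm the location from code rather than browsing the device, Android's Context.getDatabasePath() reports where a database with a given name lives (or will live). A minimal sketch; the class name and log tag are made up for illustration:

import android.content.Context;
import android.util.Log;
import java.io.File;

public final class DbLocation {
    // Context#getDatabasePath resolves the name against
    // /data/data/<package>/databases/, whether or not the file exists yet.
    public static void log(Context context, String name) {
        File dbFile = context.getDatabasePath(name);
        Log.d("DbLocation", "path=" + dbFile.getAbsolutePath() + " exists=" + dbFile.exists());
    }
}

Calling DbLocation.log(appContext, "MyDB") after build() should print the path listed above.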
It can't be found in my project folder.
The database file is not part of the project; it is a file that is created and maintained on the actual device on which the App has been installed.
However, you can use the Database Inspector (now part of App Inspection) in Android Studio to view the database.
You can also view the files, if whatever device you test on allows access, by using the Device File Explorer.
Does databaseBuilder(...).build() manage to open the existing database created by previous app launches?
Yes, if the file exists it is opened; otherwise it is created. If you uninstall the App, this effectively deletes the file. The whole idea of a database is that it persists.
The build() method undertakes various tasks, primarily checking whether the underlying file exists and then opening it. In doing so it:
extracts the version number stored in the file and compares it against the version coded within the App (via the @Database annotation);
if the App's version number is greater, attempts to find a suitable Migration (AutoMigrations have recently been added to Room); a sketch of a hand-written Migration follows this list;
compares the expected schema (according to the entities defined as part of the @Database annotation) against what is found in the file. A mismatch will result in the app crashing, so fixes would have to be made.
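For the migration point above, here is a minimal sketch of what a hand-written Migration looks like; the entity, column and version numbers are invented for illustration:

import androidx.room.Room;
import androidx.room.migration.Migration;
import androidx.sqlite.db.SupportSQLiteDatabase;

// Hypothetical upgrade from version 1 to 2; the ALTER TABLE is an example only.
static final Migration MIGRATION_1_2 = new Migration(1, 2) {
    @Override
    public void migrate(SupportSQLiteDatabase db) {
        db.execSQL("ALTER TABLE MyEntity ADD COLUMN notes TEXT");
    }
};

// Registering it lets build() upgrade an existing file instead of crashing.
myDatabase = Room.databaseBuilder(appContext, MyDatabase.class, "MyDB")
        .addMigrations(MIGRATION_1_2)
        .build();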
Note: references to the file are a simplification; by default Room uses a journal mode called WAL (Write-Ahead Logging). In WAL mode there will be an additional two files that the SQLite routines maintain (you don't need to do anything):
the database file name suffixed with -wal is the primary WAL file into which changes are written (they are applied to the main database automatically);
the database file name suffixed with -shm is a shared-memory index file that SQLite maintains for the -wal file.
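If you would rather have a single self-contained file (no -wal/-shm companions), the builder can switch the journal mode, at the cost of WAL's concurrency benefits. A hedged sketch, using the builder call from above (RoomDatabase is in androidx.room):

// TRUNCATE journal mode keeps everything in the single MyDB file.
myDatabase = Room.databaseBuilder(appContext, MyDatabase.class, "MyDB")
        .setJournalMode(RoomDatabase.JournalMode.TRUNCATE)
        .build();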

Simple Local Database Solution for Ruby?

I'm attempting to write a simple Ruby/Nokogiri scraper to get event information from multiple pages and then output it to a CSV that is attached to an email sent out weekly.
I have completed the scraping components and the CSV component and it's working perfectly. However, I now realize that I need to know when new events are added, which means I need some sort of database. Ideally I would just store this locally.
I've dabbled a bit with using the Ruby gem 'sequel', but the data does not seem to persist beyond the running of the program. Do I need to download some database software to work with 'sequel'? Also, I'm not using the Rails framework, just Ruby.
Any and all guidance is deeply appreciated!
I'm guessing you did Sequel.sqlite, as in the first example in the Sequel README, which creates an in-memory SQLite database. To create a database in your filesystem instead of memory, just pass it a path, e.g.:
Sequel.sqlite("./my-database.db")
This is, of course, assuming that you have the sqlite3 gem installed. If the given file doesn't exist, it will be created.
This is covered in the Sequel docs.

Store data in selenium webdriver

I'm looking for a way to store data in Selenium for use in future tests.
I'm using Jenkins, Maven + Selenium and TestNG.
How can I store some data? Let's say I want to run a test, get some data from a website (a weather forecast), store it somewhere, and the next day run a test to check whether the forecast matches today's weather.
I can store it in a txt file and parse it by regex, but I'm sure there is a better way to do it?
You have to consider what "selenium webdriver" is in this context. It is a Java app; it "exists" only while it is running. Once the run stops, it is purged from memory, including all data it held. If you are using JUnit or TestNG (as you specified), then this data is purged even more frequently: after every test class.
To accomplish what you are asking, you will need something external to your tests. You can certainly utilize a txt file as you suggested. A spreadsheet might do for your purposes. Most applications utilize an entire database.
You also mentioned Jenkins. This will make the external storage a more interesting problem, as Jenkins often purges the current working directory before each run.
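If a full database feels like overkill, a plain properties file keyed by date is often enough and survives between runs. A minimal sketch; the file location and key scheme are made up for illustration:

import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.time.LocalDate;
import java.util.Properties;

public class ForecastStore {
    // Kept in the user's home directory, outside any Jenkins workspace,
    // so a workspace wipe between builds cannot delete it.
    private static final Path FILE = Paths.get(System.getProperty("user.home"), "forecasts.properties");

    public static void save(LocalDate date, String forecast) throws IOException {
        Properties props = load();
        props.setProperty(date.toString(), forecast);
        try (OutputStream out = Files.newOutputStream(FILE)) {
            props.store(out, "scraped forecast history");
        }
    }

    public static String read(LocalDate date) throws IOException {
        return load().getProperty(date.toString());
    }

    private static Properties load() throws IOException {
        Properties props = new Properties();
        if (Files.exists(FILE)) {
            try (InputStream in = Files.newInputStream(FILE)) {
                props.load(in);
            }
        }
        return props;
    }
}

Today's test would call ForecastStore.save(LocalDate.now().plusDays(1), scrapedForecast); tomorrow's test compares ForecastStore.read(LocalDate.now()) against the observed weather.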

Read application log written on Windows Azure

I have 10 applications; they all have the same logic to write the log to a text file located in the application root folder.
I have an application which reads the log files of all the applications and shows details in a web page.
Can the same be achieved on Windows Azure? I don't want to use the 'DiagnosticMonitor' APIs, as I cannot change the logging logic of the applications.
Thanks,
Aman
Even if technically this is possible, it is not advisable, as the Fabric Controller can re-create any role at whim (well, with good reasons, but unpredictably nonetheless), and whenever this happens you will lose any files stored locally on a role.
So - primarily you should be looking for a different place to store those logs, and there are many options, but all require that you change the logging logic of the application.
You could do this, but aside from the issue Yossi pointed out (the log would be ephemeral; it could get deleted at any time), you'd have a different log file on each role instance (VM). That means when you hit your web page to view the log, you'd see whatever happened to be on the log on that particular VM, instead of what you presumably want (a roll-up of the log files across all VMs).
Windows Azure Diagnostics could help, since you can configure it to copy log files off to blob storage (so no need to change the logging). But honestly I find Diagnostics a bit cumbersome for this. It will end up creating a lot of different blobs, and you'll have to change the log viewer to read all those blobs and combine them.
I personally would suggest writing a separate piece of code that monitors the log file and, for each new line, stores the line as an entity (row) in table storage. This bit of code could be launched as a startup task and just run continuously as a separate process (leaving everything else unchanged). Then modify the log viewer to read the last n entities from table storage and display them.
(I'm assuming you can modify the log viewer even if you can't modify the apps that log to the file.)
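A rough sketch of that monitoring loop in Java; storeLine() is a placeholder for whatever table-storage client you use, and the polling interval is arbitrary:

import java.io.IOException;
import java.io.RandomAccessFile;

public class LogTailer {
    public static void main(String[] args) throws IOException, InterruptedException {
        String path = args[0]; // path to the application's log file
        long offset = 0;       // how far into the file we have read so far
        while (true) {
            try (RandomAccessFile raf = new RandomAccessFile(path, "r")) {
                raf.seek(offset);
                String line;
                while ((line = raf.readLine()) != null) {
                    storeLine(line);
                }
                offset = raf.getFilePointer();
            }
            Thread.sleep(5000); // poll every few seconds for new lines
        }
    }

    // Placeholder: write the line as an entity to table storage here,
    // e.g. PartitionKey = role instance id, RowKey = a timestamp.
    private static void storeLine(String line) {
        System.out.println(line);
    }
}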
What about writing logs to something like an Azure storage table? You just need to define a unique PartitionKey/RowKey; then you can easily retrieve the log for the web page.

Good way to demo a classic ASP web site

What is the best way to save data in session variables in a classic ASP web site?
I am maintaining a classic ASP web site and want to be able to allow my users to demo all functionality of the site; this means allowing them to delete records.
The closest example I have seen so far is the demos of the Telerik controls, where they save the dataset in session on first load and allow the user to manipulate the data.
How can I achieve the same in ASP with an MS Access backend?
If you want to persist the state over multiple pages (e.g. to demo your complete application) then it's a bit tricky.
I would suggest copying the MDB file for each session and using the copied version. This would ensure that every session uses its own data.
create a version of your Access DB which will be used as a fresh template for each user
on session start, copy the template and name it after the user's session ID
use the individual MDB
Note: The only drawback I can see here is that you need to remove the unused MDB files, as they can pile up after some time. You could do it with a scheduled task, or even on session start before you create a new one.
I am not sure what you can use to check whether a copy is still in use, but you could check the file's creation date, or maybe the LDB lock file can help you as well (if it does not exist = unused).
You can store a connection or even an object in a session variable, as long as you remember what kind of variable you stored when you retrieve it. I have never stored a dataset in a session variable, but I have stored a lot of arrays in session variables, so you can use the ADO GetRows method to load a complete dataset into a session variable.
How big is the Access database? If your database is small enough (relative to the server capacity, expected number of users, and so forth) then I like the idea of using a fresh copy of the database for each user that runs the demo.
With this approach, you simplify your possible code paths. Otherwise this "are we in demo mode or not?" logic will permeate a heck of a lot of your code.
I'd do it like this...
When the user begins the demo, make a copy of the Access DB for that user to use. If your db is foo.mdb, copy it to /tempdb/foo_1234567890.mdb where 1234567890 is the user's session ID.
Alter the user's connection string to point to the fresh database copy. From this point on, your app can operate like "normal" with no further modifications.
Have a scheduled task that deletes all files in /tempdb with last-modified times more than __ hours in the past. If you don't have the ability to schedule tasks on the server (perhaps you're in a shared hosting environment, etc) then you could do this at the same time you do step #1.
