Good way to demo a classic ASP web site - session

What is the best way to save data in session variables in a classic ASP web site?
I am maintaining a classic ASP web site and want to allow my users to demo all functionality of the site; this means allowing them to delete records.
The closest example I have seen so far is the demos of the Telerik controls, where they save the dataset in session on first load and allow the user to manipulate the data.
How can I achieve the same in ASP with an MS Access backend?

If you want to persist the state over multiple pages (e.g. to demo your complete application) then it's a bit tricky.
I would suggest copying the MDB file for each session and using the copied version. This would ensure that every session uses its own data.
Create a version of your Access DB that will be used as a fresh template for each user.
On session start, copy the template and name the copy after the user's session ID.
Use that individual MDB for the rest of the session.
Note: the only drawback I can see here is that you need to remove the unused MDB files, as they can pile up after a while. You could do it with a scheduled task, or even on session start before you create a new copy.
I am not sure what you can use to check whether a copy is still in use, but the file's creation date, or maybe the .ldb lock file, can help you as well (if it does not exist, the copy is not in use).
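A rough sketch of that cleanup, run from Session_OnStart in global.asa (the tempdb folder name and the 24-hour cut-off are just assumptions; the .ldb check only tells you whether Jet currently has the copy open):

Sub CleanupOldCopies()
    Dim fso, f
    Set fso = Server.CreateObject("Scripting.FileSystemObject")
    For Each f In fso.GetFolder(Server.MapPath("tempdb")).Files
        ' Only delete MDB copies that are old and not currently locked by Jet
        If LCase(fso.GetExtensionName(f.Name)) = "mdb" Then
            If DateDiff("h", f.DateCreated, Now) > 24 And _
               Not fso.FileExists(Left(f.Path, Len(f.Path) - 3) & "ldb") Then
                f.Delete
            End If
        End If
    Next
End Sub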

You can store a connection or even an object in a session variable, as long as you remember what kind of variable you stored when you retrieve it. I have never stored a recordset in a session variable, but I have stored plenty of arrays in them, so you can use the ADO GetRows method to load a complete recordset into a session variable.
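For example, a minimal sketch (the Products table and the MDB path are hypothetical) that loads a recordset into a session variable with GetRows and reads it back on a later page:

<%
' First load: run the query once and park the results in the session as a 2-D array
Dim conn, rs
Set conn = Server.CreateObject("ADODB.Connection")
conn.Open "Provider=Microsoft.Jet.OLEDB.4.0;Data Source=" & Server.MapPath("data/demo.mdb")
Set rs = conn.Execute("SELECT ID, ProductName FROM Products")
If Not rs.EOF Then Session("Products") = rs.GetRows()   ' fields x rows
rs.Close : conn.Close

' Any later page in the same session:
Dim data, i
data = Session("Products")
For i = 0 To UBound(data, 2)               ' second dimension = rows
    Response.Write data(1, i) & "<br>"     ' field 1 = ProductName
Next
%>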

How big is the Access database? If your database is small enough (relative to the server capacity, expected number of users, and so forth) then I like the idea of using a fresh copy of the database for each user that runs the demo.
With this approach, you simplify your possible code paths. Otherwise this "are we in demo mode or not?" logic will permeate a heck of a lot of your code.
I'd do it like this...
When the user begins the demo, make a copy of the Access DB for that user to use. If your db is foo.mdb, copy it to /tempdb/foo_1234567890.mdb where 1234567890 is the user's session ID.
Alter the user's connection string to point to the fresh database copy. From this point on, your app can operate like "normal" with no further modifications.
Have a scheduled task that deletes all files in /tempdb with last-modified times more than __ hours in the past. If you don't have the ability to schedule tasks on the server (perhaps you're in a shared hosting environment, etc) then you could do this at the same time you do step #1.
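A minimal global.asa sketch of steps 1 and 2 (the file and folder names are hypothetical):

<SCRIPT LANGUAGE="VBScript" RUNAT="Server">
Sub Session_OnStart
    Dim fso, src, dst
    Set fso = Server.CreateObject("Scripting.FileSystemObject")
    src = Server.MapPath("foo.mdb")
    dst = Server.MapPath("tempdb/foo_" & Session.SessionID & ".mdb")
    If Not fso.FileExists(dst) Then fso.CopyFile src, dst
    ' Every data-access page builds its connection from this session value
    Session("ConnString") = "Provider=Microsoft.Jet.OLEDB.4.0;Data Source=" & dst
End Sub
</SCRIPT>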

Related

Store infrequently changing info in Spring App

I am working on a microservice (Spring Boot) that needs to store some static information that changes infrequently (once per quarter). The data (below) is about company reports and looks like:
reportId#1: "frequency"="daily","to":"some email ids"
reportId#2: "frequency"="weekly", "to":"some emailids"
As you can see, an entry in the data is basically a report ID, and the associated attributes are the report frequency and the receivers' email IDs.
My question is: what is the best place to store this information? I have some thoughts, and here are my views.
a) A NoSQL DB like MongoDB seems to be a good option. I can create a collection, store the data there, and retrieve it once during app startup. But then I wondered whether creating a collection just to store this static info is a good choice.
b) Redis seems to be another good option. I can create a template for the above dataset and store it there. I can query Redis by reportId to retrieve the frequency and the recipient list.
c) Store it in a file on the classpath and load it at app startup. The downside is that I will have to redeploy the app with a new version of the file whenever this report listing changes. I believe externalizing this information to either Mongo or Redis is a better option.
d) The app is running in AWS, so I could even store this in a file in an S3 bucket.
I would like to know your views.
Since the config will only change once a quarter, the overhead of a database is not required. You should consider Apache Commons Configuration. It will allow you to load config changes from files without the need for an application restart.
http://commons.apache.org/proper/commons-configuration/userguide/howto_reloading.html
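A minimal sketch with Commons Configuration 2 (the reports.properties file and the key layout are assumptions); the builder re-reads the file when the reloading trigger detects a change, so a quarterly edit is picked up without a restart:

import java.io.File;
import java.util.concurrent.TimeUnit;
import org.apache.commons.configuration2.Configuration;
import org.apache.commons.configuration2.PropertiesConfiguration;
import org.apache.commons.configuration2.builder.ReloadingFileBasedConfigurationBuilder;
import org.apache.commons.configuration2.builder.fluent.Parameters;
import org.apache.commons.configuration2.reloading.PeriodicReloadingTrigger;

public class ReportConfig {
    private final ReloadingFileBasedConfigurationBuilder<PropertiesConfiguration> builder;

    public ReportConfig(File file) {
        builder = new ReloadingFileBasedConfigurationBuilder<>(PropertiesConfiguration.class);
        builder.configure(new Parameters().fileBased().setFile(file));
        // Check the file for changes once a minute and mark it for reload when it changes
        new PeriodicReloadingTrigger(builder.getReloadingController(), null, 1, TimeUnit.MINUTES).start();
    }

    public String frequency(String reportId) throws Exception {
        Configuration config = builder.getConfiguration(); // reflects the latest reload
        return config.getString(reportId + ".frequency");  // e.g. report1.frequency=daily
    }
}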

Is it possible to make a runtime db connection and use it in Schema, DB and models without affecting configs?

I want to use dynamic databases at runtime without affecting config/database.php, because of concurrent users.
I have a main DB with a table that contains references to several other DBs. At runtime I need to not only connect to those DBs but may also want to run migrations on them.
I am aware that this is possible by having a second connection entry in config.database.connections, but I have a feeling that if two users hit the server at the same time, changes to the physical config file may create a conflict.
I also read (and also experimented) that you can edit the second connection at runtime using the code below:
\Config::set('database.connections.mysql2.database', 'somedynamicdb');
DB::purge('mysql2');
But I fear that if it persists the changes for different users, then it may conflict for concurrent users. And if it does not persist the changes, then it won't work for migrations.
I want to understand/know two things specifically:
What is the scope of the above code (i.e. the Config::set() call)? Does it persist across different user requests to the server?
If I call migrations using Artisan::call('migrate') with a --database=connectionname option, right after I change the db name in connectionname, will that use the dynamically set database or the physical config value?
UPDATE
Also worth noting that a call to Artisan::call('migrate') with --database=connectionname will make the new connection persist for the rest of your app call.
See here for details:
https://github.com/laravel/framework/issues/28253
Config::set will only apply for the request for which it was set, won't apply to any other requests, and will not persist beyond the request. If you're not processing a request (e.g. a CLI command) then it won't affect anything beyond the current PHP process.
As for Item #2, if you're invoking from the command line, you can just do DB_CONNECTION=connectionname php artisan migrate. If you need to invoke the artisan command from code, using Config::set is still the right way to go.
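For example, a sketch of the in-code route (the connection name and database name are taken from the question; --force is only needed when running in production):

Config::set('database.connections.mysql2.database', 'somedynamicdb');
DB::purge('mysql2');           // forget any cached connection so the new name is used
DB::reconnect('mysql2');

Artisan::call('migrate', [
    '--database' => 'mysql2',  // uses the value set above, for this request only
    '--force'    => true,      // skip the interactive confirmation in production
]);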
We use connections created on the fly here all the time and it works very well. We set this up in middleware that runs after authentication, so it is only valid for the current user's request, based on their login information.
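A rough sketch of that middleware approach (the tenant connection name and the database_name column on the user model are assumptions):

use Closure;
use Illuminate\Support\Facades\Config;
use Illuminate\Support\Facades\DB;

class SetTenantDatabase
{
    public function handle($request, Closure $next)
    {
        // Point the "tenant" connection at the DB tied to the logged-in user
        Config::set('database.connections.tenant.database', $request->user()->database_name);
        DB::purge('tenant');   // drop any cached connection so the new name takes effect

        return $next($request);
    }
}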

windows registry storage best practice

Background
I've recently been shunted into the world of Windows programming and I'm still trying to find my way around the best practices and ways of doing things, so I was just hoping for some pointers on use of the registry.
Not particularly relevant, but the background is that I am creating an installer in Golang. A couple of points to get out of the way on that:
I am aware MSIs would usually be best practice for an installer (I have my reasons for going with a custom exe)
I know there are more obvious language choices than golang, just go with it
Current registry use
As part of the install process, I store several pieces of data in the registry:
run once commands:
Computer\HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Windows\CurrentVersion\RunOnce
I create a few entries here: to restart the process after a system reboot and to delete some temp files on reboot after uninstall
an uninstall entry:
Computer\HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Windows\CurrentVersion\Uninstall\Vendor
Product
The content here is the same as an MSI would create; I was careful not to create any additional custom fields here (all static data until uninstall).
an application entry:
Computer\HKEY_LOCAL_MACHINE\SOFTWARE\Vendor\Product
I store some additional data about the installation here, some of which is needed for uninstall such as state info from before installation (again all static content)
a temporary entry:
Computer\HKEY_CURRENT_USER\SOFTWARE\Vendor\Product
I store some temporary data here, which can include some sensitive user-entered data (usernames/passwords). I run some symmetric encryption to obscure the data, though my understanding is that this area of the registry is encrypted so only the user could access it anyway (I would like confirmation on that).
This data is used to resume after restart and then deleted
Questions
I'm looking for confirmation of / corrections to my current use of the registry.
I now have a need to pass some data between an application and a running service; this data would be updated every 1-2 minutes and would be a few bytes of JSON. Does the registry seem like a reasonable place to store variable data like this? If so, is there a particular place that is better for variable data? I was going to add it to:
Computer\HKEY_LOCAL_MACHINE\SOFTWARE\Vendor\Product
HKCU isn't encrypted to my knowledge. It's stored in a file called NTUSER.DAT and can be loaded as a hive under HKEY_USERS, visible to other processes with sufficient rights to do so.
You would need to open up the rights on HKLM\SOFTWARE\Vendor\Product if you expect a user-privileged process to be able to write to it. If you want to pass data to a service you might want to use some sort of IPC pipe instead. Not sure what's available in Golang for this.
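As a sketch of the pipe route, the third-party github.com/Microsoft/go-winio package exposes Windows named pipes as ordinary net listeners/connections (the pipe name and the plain JSON-over-pipe framing are assumptions):

package main

import (
	"fmt"
	"net"

	winio "github.com/Microsoft/go-winio"
)

func main() {
	// The service listens on a named pipe; the application dials it every 1-2 minutes
	l, err := winio.ListenPipe(`\\.\pipe\VendorProduct`, nil)
	if err != nil {
		panic(err)
	}
	defer l.Close()

	for {
		conn, err := l.Accept()
		if err != nil {
			continue
		}
		go handle(conn)
	}
}

func handle(c net.Conn) {
	defer c.Close()
	buf := make([]byte, 4096)
	n, err := c.Read(buf) // a few bytes of JSON from the client
	if err != nil {
		return
	}
	fmt.Printf("received: %s\n", buf[:n])
}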

Neo4j in memory db

I saw that Neo4j can run as an impermanent DB for unit testing purposes, but I'm not sure if this fits my needs. I have my data stored in Neo4j the usual way (persistent) but, starting from my data, I want to let each user start an "experimental session": the users add/delete nodes and relationships, but NOT in a permanent way, just experimenting with the data (after that session the edits should be lost). The edits shouldn't be saved and obviously they shouldn't be visible to the others. What's the best way to accomplish that?
Using an impermanent database should work. You would:
need to import the data into each new database
spring-data-neo4j is not able to connect to multiple databases (in the current release), so you would need to start multiple instances of your application, e.g. in a Tomcat container
when your application stops (or crashes) you would obviously lose the data
Or you could potentially use only one database, with the base data being public (= visible to everyone), and then add an owner property to all new nodes/relationships.
When querying the data you would check that the property is either public or the current user.
At the end of the session you would just delete all nodes and relationships with the given owner.
If you also want to edit existing data then it gets more complicated: you could create a copy of the node/relationship and somehow handle that, or, if the dataset is not too large, copy the whole thing.
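A rough Cypher sketch of that owner-property idea ($user is a parameter holding the current session's user; the Item label and property names are made up):

// Experimental nodes are tagged with their owner when created
CREATE (n:Item {name: $name, owner: $user})

// Reads see the shared base data plus the current user's own edits
MATCH (n:Item)
WHERE n.owner = 'public' OR n.owner = $user
RETURN n

// End of the session: throw away everything this user created
MATCH (n {owner: $user})
DETACH DELETE n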
You can build a docker image from the neo4j base image (or build your own) and copy your graph.db into it.
Then you can have every user start a docker container from said image.
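A minimal sketch of that image (the Neo4j version tag and the /data/databases layout are assumptions; adjust them to the version you actually run):

FROM neo4j:3.5
# Bake the prepared graph into the image so every container starts from the same copy
COPY graph.db /data/databases/graph.db

Each user then gets their own container started from it, e.g. docker run -p 7474:7474 -p 7687:7687 demo-neo4j, mapping different host ports per user.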
If that doesn't answer your question, more info is needed.

Read application log written on Windows Azure

I have 10 applications; they all have the same logic to write their log to a text file located in the application root folder.
I have an application which reads the log files of all the applications and shows the details in a web page.
Can the same be achieved on Windows Azure? I don't want to use the 'DiagnosticMonitor' APIs, as I cannot change the logging logic of the applications.
Thanks,
Aman
Even if technically this is possible, it is not advisable, as the Fabric Controller can re-create any role on a whim (well, with good reasons, but unpredictable nonetheless), and whenever this happens you will lose any files stored locally on a role.
So - primarily you should be looking for a different place to store those logs, and there are many options, but all require that you change the logging logic of the application.
You could do this, but aside from the issue Yossi pointed out (the log would be ephemeral; it could get deleted at any time), you'd have a different log file on each role instance (VM). That means when you hit your web page to view the log, you'd see whatever happened to be in the log on that particular VM, instead of what you presumably want (a roll-up of the log files across all VMs).
Windows Azure Diagnostics could help, since you can configure it to copy log files off to blob storage (so no need to change the logging). But honestly I find Diagnostics a bit cumbersome for this. It will end up creating a lot of different blobs, and you'll have to change the log viewer to read all those blobs and combine them.
I personally would suggest writing a separate piece of code that monitors the log file and, for each new line, stores the line as an entity (row) in table storage. This bit of code could be launched as a startup task and just run continuously as a separate process (leaving everything else unchanged). Then modify the log viewer to read the last n entities from table storage and display them.
(I'm assuming you can modify the log viewer even if you can't modify the apps that log to the file.)
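A rough C# sketch of that monitor process using the classic storage SDK (the table name, log path, and connection-string setting are assumptions):

using System;
using System.IO;
using System.Threading;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Table;

class LogLine : TableEntity
{
    public string Text { get; set; }
}

class LogShipper
{
    static void Main()
    {
        var account = CloudStorageAccount.Parse(
            Environment.GetEnvironmentVariable("StorageConnectionString"));
        var table = account.CreateCloudTableClient().GetTableReference("applog");
        table.CreateIfNotExists();

        // Open the log without locking out the application that is writing to it
        using (var stream = new FileStream(@"E:\approot\app.log",
            FileMode.Open, FileAccess.Read, FileShare.ReadWrite))
        using (var reader = new StreamReader(stream))
        {
            while (true)
            {
                var line = reader.ReadLine();
                if (line == null) { Thread.Sleep(1000); continue; }   // wait for new lines

                table.Execute(TableOperation.Insert(new LogLine
                {
                    PartitionKey = Environment.MachineName,            // one partition per instance
                    RowKey = (DateTime.MaxValue.Ticks - DateTime.UtcNow.Ticks).ToString("d19"), // newest first
                    Text = line
                }));
            }
        }
    }
}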
What about writing the logs to something like an Azure storage table? You just need to define a unique PartitionKey/RowKey, then you can easily retrieve the log for the web page.
