I am using the Infragistics igGrid with remote paging, sorting, and filtering to display records in an MVC application. On the server, the session state mode is "SQLServer".
I want to fetch 100,000 records from the database, and because of the way igGrid's remote features work, every grid operation hits the database.
Per client policy I cannot hit the database for every operation, and I cannot store the fetched content in an external file for paging, sorting, and filtering.
For now I am storing those records in Session, and it works fine in my local environment, but after deployment on the Dev server I get an error saying my objects must be marked Serializable (the SQLServer session state mode serializes session data before persisting it).
Is there an alternative way to handle such a large dataset smoothly?
I'm building an application in Laravel that connects to several databases, reading a service audit table from each one. The application is for visualizing logs from different applications.
To improve read speed, would it be possible to download all the data from the different databases into a local Redis store every X minutes and run the queries directly against it?
You can do this via scheduled tasks:
https://laravel.com/docs/5.7/scheduling#scheduling-artisan-commands
This will allow you to run an Artisan command on a schedule:
https://laravel.com/docs/5.7/artisan
In this command you can fetch the data from your databases and save it to Redis.
To access multiple databases, follow the details here:
https://laravel.com/docs/5.7/database#read-and-write-connections
And to set up Redis, here are the docs:
https://laravel.com/docs/5.7/redis
All that is left is to track what you have already transferred:
on each run, fetch only the rows you have not yet transferred and save that data to Redis.
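For reference, once the command exists, the scheduler itself is driven by a single cron entry on the server. This is the standard entry from the Laravel docs; the project path is a placeholder you would replace with your own:

```
# Standard Laravel scheduler cron entry; /path-to-your-project is a placeholder
* * * * * cd /path-to-your-project && php artisan schedule:run >> /dev/null 2>&1
```

The scheduler then decides, each minute, which of your registered commands are due to run.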
I am using Laravel 5.5. I would like to store session data in a database table, and when the session is over the data should be deleted. Can anyone help me accomplish this task?
I would suggest using Redis for storing and destroying session data in a Laravel application. One of the most common use cases for Redis is as a session store.
If you are determined to store your session data in a database, check out the session documentation, which covers the available drivers.
You need to change the session configuration file, stored in config/session.php, to the database driver and create the needed table.
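If you go the database route, the usual steps in Laravel 5.5 (per the session docs) are to generate the sessions table migration, run it, and switch the driver; expired rows are then removed by Laravel's session garbage collection:

```
php artisan session:table
php artisan migrate
```

Then set 'driver' => 'database' in config/session.php, or SESSION_DRIVER=database in your .env file.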
I have a web service that inserts/updates data in the DB. When a client calls the web service, the UserId of the user currently logged in to the portal is sent in the request. I need to pass this UserId to the DB connection, or set it in SYS_CONTEXT, for audit purposes. We have existing audit tables, plus triggers that write to the audit tables after each insert/update on the actual tables. So to track these changes, I need to get the UserId onto the connection somehow, so that the triggers can read it from the database session (e.g. via SYS_CONTEXT or V$SESSION) and insert it into the audit table. I am currently using Spring and Hibernate transactions to process data with the DB.
I tried to set the client info on the Connection, but it's not working. Here is what I tried:
Session session=sessionFactory.getCurrentSession();
SessionImpl sImpl=(SessionImpl) session;
Connection connection=sImpl.connection();
connection.setClientInfo("ClientUser", "ABC");
I am also trying to set the client info by calling the stored procedure DBMS_APPLICATION_INFO.SET_CLIENT_INFO before every DB operation from the application code, but I am not sure that is the correct way to handle it.
I have tried both the OCI and thin JDBC drivers, but I cannot find a way to set this user id.
Can someone let me know an efficient way to pass the user id into SYS_CONTEXT or along with the Connection? I am currently using Hibernate 4, Spring, WebSphere, and an Oracle DB.
I am using Spring @Transactional to manage the Hibernate connections and transactions. Connections come from a connection pool, and I am using org.springframework.jndi.JndiObjectFactoryBean for the DataSource.
Is there any way to add an interceptor or wrapper around the connection, so the user id is set whenever a connection is taken from the pool?
Has anyone done this before?
This is described in Spring Data JDBC Extensions for the Oracle Database, chapter 8.2, "Configuration of a Custom DataSource Connection Preparer":
"...but you could implement a ConnectionPreparer that would use the current user's login id. That way you can capture user login information even if your data source is configured with a shared user name."
This is a solution for Oracle, which I think you are using, but it should also be possible to adapt it to another database.
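To make the idea concrete, here is a rough sketch of what such a connection preparer does with plain JDBC: before handing a pooled connection to the business code, it calls DBMS_SESSION.SET_IDENTIFIER, so the audit triggers can read the value via SYS_CONTEXT('USERENV', 'CLIENT_IDENTIFIER'). The class and method names are illustrative, not Spring API, and the stub Connection at the bottom exists only so the sketch runs without an Oracle instance:

```java
import java.lang.reflect.InvocationHandler;
import java.lang.reflect.Proxy;
import java.sql.CallableStatement;
import java.sql.Connection;
import java.sql.SQLException;
import java.util.ArrayList;
import java.util.List;

public class AuditConnectionDemo {
    // PL/SQL call that tags the Oracle session; the audit triggers can then
    // read it with SYS_CONTEXT('USERENV', 'CLIENT_IDENTIFIER').
    static final String SET_IDENTIFIER = "{call DBMS_SESSION.SET_IDENTIFIER(?)}";

    // "Prepare" a connection taken from the pool: push the portal user id
    // into the database session before the business transaction starts.
    static void prepare(Connection c, String userId) throws SQLException {
        try (CallableStatement cs = c.prepareCall(SET_IDENTIFIER)) {
            cs.setString(1, userId);
            cs.execute();
        }
    }

    // --- below: a stub Connection so the sketch runs without Oracle ---
    static final List<String> calls = new ArrayList<>();

    static Connection stubConnection() {
        // Statement stub: records the bound user id, returns defaults otherwise.
        InvocationHandler stmtHandler = (proxy, method, args) -> {
            if (method.getName().equals("setString")) calls.add("bind:" + args[1]);
            if (method.getReturnType() == boolean.class) return false;
            return null;
        };
        // Connection stub: records prepareCall and hands back the statement stub.
        return (Connection) Proxy.newProxyInstance(
            Connection.class.getClassLoader(), new Class<?>[]{Connection.class},
            (proxy, method, args) -> {
                if (method.getName().equals("prepareCall")) {
                    calls.add("prepareCall:" + args[0]);
                    return Proxy.newProxyInstance(
                        CallableStatement.class.getClassLoader(),
                        new Class<?>[]{CallableStatement.class}, stmtHandler);
                }
                if (method.getReturnType() == boolean.class) return false;
                return null;
            });
    }

    public static void main(String[] args) throws Exception {
        prepare(stubConnection(), "ABC");
        System.out.println(calls);
    }
}
```

In a real application you would wire this into the ConnectionPreparer (or an AOP interceptor around the DataSource) and pull the user id from the current request or security context rather than hard-coding it.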
I would like to know whether there is a way for Ehcache to detect a database update/insert (in a Spring/Java, Hibernate web application) and refresh its cache with the current (latest) data from the database. If not, what would be the best way to detect a database update so as to keep the data in the cache and the database in sync?
Caching tools (Ehcache, Infinispan) provide the solution the other way round: they refresh the cache with DB data at a configured interval, for example every 15 or 30 minutes. If the data keeps changing continuously, don't cache it at all.
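There is no built-in push notification from the database to Ehcache; the usual compromise is a time-to-live, so entries expire and are fetched fresh from the DB on the next access. A minimal Ehcache 2.x XML sketch (the cache name and sizes are illustrative, not from your setup):

```xml
<!-- Entries expire after 15 minutes; the next read misses and reloads from the DB -->
<cache name="auditLogs"
       maxEntriesLocalHeap="10000"
       timeToLiveSeconds="900"/>
```

If you control all writers, an alternative is to evict the relevant cache entries from the application code immediately after each write, which keeps the cache consistent without polling.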
I have the following issue:
Two instances of an application on two different systems should share a small database.
The main problem is that both systems can only exchange data through a network folder.
I don't have the possibility to set up a database server anywhere.
Is it possible to place an H2 database in the network folder and let both instances connect to it (also concurrently)?
Could I connect to the db from both instances using embedded mode if I disable file locking?
The instances only perform READ or INSERT operations on the db. Do I risk data corruption using multiple concurrent embedded connections?
As the documentation says (http://h2database.com/html/features.html#auto_mixed_mode):
Multiple processes can access the same database without having to start the server manually. To do that, append ;AUTO_SERVER=TRUE to the database URL. You can use the same database URL independent of whether the database is already open or not. This feature doesn't work with in-memory databases.
// Application 1:
DriverManager.getConnection("jdbc:h2:/data/test;AUTO_SERVER=TRUE");
// Application 2:
DriverManager.getConnection("jdbc:h2:/data/test;AUTO_SERVER=TRUE");
From H2 documentation:
It is also possible to open the database without file locking; in this
case it is up to the application to protect the database files.
Failing to do so will result in a corrupted database.
I think that if your application always uses the same configuration (a shared file database on a network folder) without file locking, you will need to create an application layer that manages the concurrency yourself.
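If you do go the no-file-locking route, that application layer could serialize access with an OS-level lock file placed next to the database. A rough sketch of the idea (keep in mind that file locks over network shares such as SMB/NFS are not always reliable, so the AUTO_SERVER mode above is still the safer option; class and file names are illustrative):

```java
import java.nio.channels.FileChannel;
import java.nio.channels.FileLock;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardOpenOption;
import java.util.concurrent.Callable;

public class SharedDbGuard {
    // Run 'work' while holding an exclusive lock on 'lockFile'.
    // Both application instances must use the same lock file, e.g. one
    // living next to the H2 database files in the network folder.
    public static <T> T withLock(Path lockFile, Callable<T> work) throws Exception {
        try (FileChannel ch = FileChannel.open(lockFile,
                 StandardOpenOption.CREATE, StandardOpenOption.WRITE);
             FileLock lock = ch.lock()) { // blocks until the other instance releases
            return work.call();
        } // lock and channel released here, even if 'work' throws
    }

    public static void main(String[] args) throws Exception {
        Path lock = Files.createTempFile("shared-db", ".lock");
        String result = withLock(lock, () -> "ran inside lock");
        System.out.println(result);
    }
}
```

Every database access (open connection, run statements, close connection) would then happen inside withLock, so only one instance touches the files at a time.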