For the past few months we have been running a single server that hosts all of our files and SQL data. Because the database load grew so high, we recently bought two more servers to set up replication.
We are going to use simple master-slave replication via transactional replication in MSSQL; however, the methods we use to access LINQ entities must change.
All functions that update data need to connect to the master, while all functions that select data need to query the slave.
How can we switch the connection string based on the operation being performed?
Any help would be appreciated.
Thanks
The simplest approach would be:
Create two connection strings in the web.config <connectionStrings> section, one for reads and one for writes.
When querying data, pass the read connection string name to the context's constructor,
and pass the write connection string name when updating.
If you are using LINQ to Entities, you can pass the connection string to the context instance, i.e. ModelContext ctx = new ModelContext("[edmx format connectionstring]");
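A minimal sketch of that setup, assuming an EDMX-generated ModelContext; the connection string names and the Orders entity set are made up for illustration:

// web.config -- two EDMX-format connection strings (names are illustrative):
// <add name="ReadModel"  connectionString="metadata=...;provider connection string=&quot;Data Source=SLAVE;...&quot;" />
// <add name="WriteModel" connectionString="metadata=...;provider connection string=&quot;Data Source=MASTER;...&quot;" />

// Selects query the slave:
using (var readCtx = new ModelContext("name=ReadModel"))
{
    var orders = readCtx.Orders.ToList();
}

// Updates go to the master:
using (var writeCtx = new ModelContext("name=WriteModel"))
{
    writeCtx.Orders.AddObject(newOrder);
    writeCtx.SaveChanges();
}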
I have a simple web app UI which stores certain dataset parameters (for simplicity, assume they are all data tables in a single Redshift database in AWS, though the schema/table names can vary). Tableau is installed on an EC2 instance in the same AWS account.
I am trying to determine an automated way of passing 'parameters' as a data source (i.e. within the connection string inside Tableau on EC2/AWS) rather than manually creating data source connections and entering the various customer requests.
The flow would be: say 50 users select various parameters in the UI (for simplicity, suppose the parameters are stored as a JSON file in AWS) -> the parameters are sent to Tableau and data sources are created -> the connection is established within Tableau without the customer 'seeing' anything on the back end -> the customer can explore the data in Tableau and create tables and charts accordingly.
How can I do this, at least through a batch job or CloudFormation setup? A "hacky" solution is fine.
Bonus: if the above is doable in real-time across multiple users that would be awesome.
I am open to using other dashboard UI tools which solve this problem, e.g. QuickSight.
After installing Tableau on EC2, I am having trouble finding an article or documentation on how to pass parameters into the connection string itself, or even how to parameterise it manually.
An example: customer1 selects "public_schema.dataset_currentdata" and "public_schema.dataset_yesterday", and customer2 selects "other_schema.dataset_currentdata", all of which exist in a single database.
Three data sources should be generated (one for each selection above), but each data source should only be visible to the customer who selected it, i.e. customer2 should only see the connection for other_schema.dataset_currentdata.
One hack I was considering is to spin up a CloudFormation stack with Tableau installed when a customer makes a request, create the connection accordingly, and delete the stack when they are done. I am mainly unsure how I would establish the connection, i.e. pass in the parameters. I am also not sure that spinning up 50 EC2 instances is wise. :D
An issue I have seen so far is that creating a manual extract limits the number of rows, so I think I need a live connection per customer request; hence I am trying to work around this.
You can do this with a combination of a basic embed and filters: the embed loads the Tableau workbook, and you then apply a filter based on whatever values the user selected in the JSON.
The final missing piece is to use a parameter instead of a filter and pass those values to the database via Initial SQL.
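As a hedged illustration of that last step, Tableau's Initial SQL can embed workbook parameters with its [Parameters].[...] syntax; here SchemaName is a hypothetical Tableau parameter and the statement uses Redshift's search_path:

-- Initial SQL on the Redshift data source; Tableau substitutes the
-- parameter value at connect time (SchemaName is an assumed parameter).
SET search_path TO [Parameters].[SchemaName];

With something like this in place, each customer's parameter value steers every query in the workbook to their schema over the same live connection.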
I have a web service which connects to an Oracle database. On the database side I have two databases: from the first I need to select some information, and in the second I need to perform some operations.
My question is which is better: connect only to the second database and replicate the data via a DBMS scheduler job that runs x times a day to refresh it from the first database, or create two data sources on the Java side and, after selecting the data from the first database, connect to the second one to perform the operations.
From my point of view, I'd access only the "second" database (the one in which you perform those operations) and let it acquire the data it needs from the "first" database via a database link.
That can be done directly, such as
select some_columns from db1_table@db_link where ...
or, if that turns out to be way too slow and difficult to tune, create a materialized view in the second database, which would then be refreshed using one of the available options (a scheduled refresh might be one of the choices).
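For instance, a minimal sketch (the object names are reused from above; the hourly interval is just an example):

-- Materialized view in the second database, pulled over the db link;
-- the NEXT clause schedules a complete refresh every hour.
create materialized view mv_db1_table
  refresh complete
  start with sysdate
  next sysdate + 1/24
as
select some_columns
from   db1_table@db_link;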
As this is a primarily opinion-based question, I presume you'll hear other options from someone else; further discussion will raise the most appropriate solution to the top.
I am running a raw query in Laravel using the DB::statement() method. This statement needs to fetch data from my secondary database.
I wonder if there is any clean way to specify the database connection for this method, instead of prefixing every field with the database name.
Yes, you can use the connection method, which accepts any connection name stored in the config/database.php file:
DB::connection('pgsql')->statement('your statement here...');
You can read more in the Using Multiple Database Connections Documentation.
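The connection name you pass must match an entry under connections in config/database.php; a minimal sketch of such an entry (the driver and env variable names are illustrative):

// config/database.php -- a secondary connection alongside the default
'connections' => [

    // ... the default connection ...

    'pgsql' => [
        'driver'   => 'pgsql',
        'host'     => env('SECONDARY_DB_HOST', '127.0.0.1'),
        'port'     => env('SECONDARY_DB_PORT', '5432'),
        'database' => env('SECONDARY_DB_DATABASE'),
        'username' => env('SECONDARY_DB_USERNAME'),
        'password' => env('SECONDARY_DB_PASSWORD'),
    ],
],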
We have a database where all connection strings are saved in a table. I need to use these connection strings to obtain data from their respective databases. I can do it by running raw queries, but I want to use LINQ and EF. How can I do it?
Thanks a lot.
Let me rephrase: I have access to a database which contains a table of multiple connection strings. I want to map each database at runtime and then use it to retrieve the data. Is that possible?
ObjectContext exposes this constructor, which takes a connection string argument, so you can do:
ObjectContext yourContext = new ObjectContext(yourConnectionString);
It is not possible to map a database at runtime. This will work only if your application knows the mapping for every database and has all the necessary classes prepared, i.e. you will have to create them at design time.
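Under that constraint, a sketch of iterating the stored connection strings (GetConnectionStrings is a hypothetical helper that reads your connection-string table, and YourModelContext/Customers stand in for your design-time model):

// Query each database through the same design-time model.
foreach (string connectionString in GetConnectionStrings())
{
    using (var ctx = new YourModelContext(connectionString))
    {
        var rows = ctx.Customers.ToList();
        // ... work with rows from this particular database ...
    }
}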
How do I connect to a different database depending on the request.host value?
Using Sinatra and MongoDB with Mongoid.
I need to read a Sinatra application's menu, data, etc. from different databases. I want to deploy it in only one place and serve the specific pages depending on the request.host (subdomain) value.
You're probably better off storing all your data in one database and marking/tagging/categorizing it depending on the subdomain you're on.
If you have already set up your Mongoid connection manually, you could do something like this:
connection = Mongo::Connection.new
Mongoid.database = connection.db(request.host)
But still, I think you're better off with one database.
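For completeness, a sketch of wiring that into a Sinatra before filter (Mongoid 2.x-era API assumed; the subdomain-as-database-name scheme and the Menu model are made up):

require 'sinatra'
require 'mongoid'

# One shared connection; the database is chosen per request.
CONNECTION = Mongo::Connection.new

before do
  # Use the subdomain as the database name.
  Mongoid.database = CONNECTION.db(request.host.split('.').first)
end

get '/menu' do
  # Every Mongoid query in this request now hits that subdomain's database.
  Menu.all.to_json
end

Note that Mongoid.database is a global setting, so this is not safe under concurrent requests, which is another reason the single-database approach above is usually the better choice.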