I am using Visual Studio Ultimate 2013 to do some load tests. I have some test data attached to my web test, with 10,000 rows of unique data, and I also have a web test plugin in another project, which I have referenced in the web test.
using System;
using Microsoft.VisualStudio.TestTools.WebTesting;

namespace VUControl
{
    public class VUControl : WebTestPlugin
    {
        public override void PreWebTest(object sender, PreWebTestEventArgs e)
        {
            base.PreWebTest(sender, e);
            // Move the data cursor to the row matching this virtual user's ID.
            e.WebTest.MoveDataTableCursor("testdata", "strictly#csv", e.WebTest.Context.WebTestUserId);
        }
    }
}
I have set the data source table's Access Method property to 'Do not move cursor automatically'.
The load test is set to run in the cloud, running for 5 minutes with 500 users.
On running the tests, around 9,500 tests complete successfully, but I am only getting around 10 unique sets of data generated in the database.
The page I am testing is essentially a registration page on a web site.
Can anyone tell me why I would only see 10 random selections of data appear?
I would guess that you have 10 load generators? Or maybe a max of 10 threads?
The real problem, in my humble opinion, is that the MoveDataTableCursor() method works in a stand-alone web test, but NOT when that same web or API test is included in a load test. You can work around this issue by implementing a row counter for your dataset, like so:
// datasetRowNumber is a pointer to the row number of an in-memory copy of the dataset.
// Each agent has its own copy of the dataset (and its own copy of this static field).
static int datasetRowNumber;

public override void PreWebTest(object sender, PreWebTestEventArgs e)
{
    int totalAgentCount = e.WebTest.Context.AgentCount; // used in a modulus operation to skip rows in the dataset
    int agentID = e.WebTest.Context.AgentId;            // used with totalAgentCount in the modulus operation

    // The condition post-increments datasetRowNumber; the body moves the cursor to
    // the next candidate row. The loop exits once the cursor sits on a row belonging
    // to this agent, i.e. when (row % totalAgentCount) == (agentID - 1).
    while ((datasetRowNumber++ % totalAgentCount) != (agentID - 1))
    {
        // DSName and tableName are placeholders for your data source and table names.
        e.WebTest.MoveDataTableCursor(DSName, tableName, datasetRowNumber);
    }

    string dataValue = e.WebTest.Context["DataSource1.SampleData.PRNCode"].ToString();

    // Logging. Comment this out during a load test!
    // writer.WriteToLog("Value=" + dataValue + ", TotalAgentCount=" + totalAgentCount + ", AgentID=" + agentID + ", Iteration=" + iteration);
}
The above code is an implementation of the approach in the following blog:
https://blogs.msdn.microsoft.com/slumley/2008/03/14/description-of-access-methods-in-data-sources/. Sean Lumley's "Description of Access Methods" works great for a web test, but when placed into a load test, the MoveDataTableCursor() method does not work as expected.
The above code makes use of the overload for MoveDataTableCursor() described subtly in https://blogs.msdn.microsoft.com/slumley/2010/01/04/vsts-2010-feature-data-source-enhancements.
Without the datasetRowNumber variable, Lumley's code does not advance the cursor in a load test; his cursor advancement only works in a stand-alone web test.
In an old DB application I'd like to start moving towards a code-first approach.
There are a lot of SPs, triggers, functions, etc. in the database, which make things error prone.
As a starter I'd like to have a proof of concept, so I started with a new solution, where I imported the entire database (Add new item -> ADO.NET entity data model -> Code First from database).
As a simple first shot I wanted to query one column of one table. The table contains about 5k rows and the result delivers 3k strings. This now takes over 90 seconds!
Here's the code of the query:
static void Main(string[] args)
{
    using (var db = new Model1())
    {
        var theList = db.T_MyTable.AsNoTracking()
                        .Where(t => t.SOME_UID != null)
                        .OrderBy(t => t.SOMENAME)
                        .Select(t => t.SOMENAME)
                        .ToList();

        foreach (var item in theList)
        {
            Console.WriteLine(item);
        }
        Console.WriteLine("Number of names: " + theList.Count);
    }
    Console.ReadKey();
}
In the generated table code I added the column type "VARCHAR" to all of the string fields/column properties:
[Column(TypeName = "VARCHAR")] // this I added to all of the string properties
[StringLength(50)]
public string SOME_UID { get; set; }
I assume I'm missing an important step; I can't believe a Code First query is this slow.
I figured out the root cause: it's the huge context that needs to be built, consisting of over 1,000 tables/files.
How I found the problem: using the profiler, I observed that the expected query only hits the database after about 90 seconds, telling me that the query itself is fast. Then I tried the same code in a new project, where I imported only the single table I access in the code.
Another proof that it's context related: executing the query twice in the same session, the second execution finished in milliseconds.
Key point: if you have a legacy database with a lot of tables, don't use one single DbContext that contains all the tables (except for initializing the database). Instead, use several smaller, domain-specific contexts with only the tables you need for the given domain (a minimal sketch follows below). Entities can exist in multiple DbContexts; tailor the relationships (e.g. by Ignore-ing them where not required) and use lazy loading where appropriate. These things help to boost performance.
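As an illustration, here is a minimal sketch of such a domain-specific context (EF6 Code First; the context name and the connection-string name "Model1" are assumptions based on the example above):

using System.Data.Entity;

// A small, domain-specific context: it maps only the table(s) this feature needs,
// so EF builds a tiny model instead of one spanning 1,000+ tables.
public class NamesContext : DbContext
{
    public NamesContext() : base("name=Model1") { } // reuse the existing connection string

    public DbSet<T_MyTable> T_MyTable { get; set; }

    protected override void OnModelCreating(DbModelBuilder modelBuilder)
    {
        // Cut unwanted relationships so related entities don't drag the rest
        // of the database into this model, e.g.:
        // modelBuilder.Ignore<SomeRelatedEntity>(); // hypothetical type
        base.OnModelCreating(modelBuilder);
    }
}

With a context this size, the one-time model-building cost on the first query is negligible.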
I'm using EF 5 with an Oracle database.
I'm doing a select count on a table with a specific parameter. When I use EF, the query returns the value 31, as expected, but the result takes about 10 seconds to be returned.
using (var serv = new Aperam.SIP.PXP.Negocio.Modelos.SIP_PA())
{
    var teste = (from ens in serv.PA_ENSAIOS_UM
                 where ens.COD_IDENT_UNMET == "FBLDY3840"
                 select ens).Count();
}
If I execute the simple query below, the result is the same (31), but it is returned in about 500 milliseconds.
SELECT
    COUNT(*)
FROM
    PA_ENSAIOS_UM
WHERE
    COD_IDENT_UNMET = 'FBLDY3840'
Is there a way to improve the performance when I'm using EF?
Note: there are 13,000,000 rows in this table.
Here are some things you can try:
Capture the query that is being generated and see if it is the same as the one you are using. Details can be found here, but essentially, you will instantiate your DbContext (let's call it "_context") and then set its Database.Log property to a logging method. It's fine if this method doesn't actually do anything; you can just set a breakpoint in there and see what's going on.
So, as an example, define a logging function (I have a static class called "Logging" which uses NLog to write to files):
// _sqlLogger and _genLogger are NLog loggers defined elsewhere in the class, e.g.:
// private static readonly NLog.Logger _sqlLogger = NLog.LogManager.GetLogger("sql");
// private static readonly NLog.Logger _genLogger = NLog.LogManager.GetLogger("general");
public static void LogQuery(string queryData)
{
    if (string.IsNullOrWhiteSpace(queryData))
        return;

    // Prefix multi-line queries with a newline so they stand apart in the log file.
    var message = string.Format("{0}{1}",
        queryData.Trim().Contains(Environment.NewLine) ? Environment.NewLine : "",
        queryData);

    _sqlLogger.Info(message);
    _genLogger.Trace($"EntityFW query (len {message.Length} chars)");
}
Then when you create your context, point its Database.Log property at LogQuery:
_context.Database.Log = Logging.LogQuery;
When you do your tests, remember that the first run is often the slowest because the server has to actually do the work; on subsequent runs it often uses cached data. Try running your tests 2-3 times back to back and see if they don't start to run in about the same time.
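As a quick, minimal way to see that warm-up effect (reusing the context and entity names from the question; the loop and timing code are just illustrative):

for (int run = 1; run <= 3; run++)
{
    var sw = System.Diagnostics.Stopwatch.StartNew();
    using (var serv = new Aperam.SIP.PXP.Negocio.Modelos.SIP_PA())
    {
        var teste = (from ens in serv.PA_ENSAIOS_UM
                     where ens.COD_IDENT_UNMET == "FBLDY3840"
                     select ens).Count();
        sw.Stop();
        // The first run includes one-time startup cost; later runs should be much faster.
        Console.WriteLine("Run {0}: count={1} in {2} ms", run, teste, sw.ElapsedMilliseconds);
    }
}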
I don't know if it generates the same query or not, but try this other form (which should be functionally equivalent, but may perform better):
var teste = serv.PA_ENSAIOS_UM.Count(ens=>ens.COD_IDENT_UNMET == "FBLDY3840");
I'm wondering if the version you have pulls data from the DB and THEN counts it. If so, this other syntax may leave all the work to be done at the server, where it belongs. Not sure, though, especially since I haven't ever used EF with Oracle and I don't know if it behaves the same as SQL Server or not.
I would like to get the comments for a specific article by article ID; at the moment I can only get all comments across all articles.
My plan is to write LINQ code for this.
Please check my code:
var childrenss = new List<Sitecore.Data.Items.Item>();
foreach (var child in item.GetChildren())
{
    childrenss.Add((Sitecore.Data.Items.Item)child);
}
any advice is appreciated. Thanks.
Well it looks like you should be able to use:
using System.Linq;
...
var children = item.GetChildren().ToList();
OK, I'm going to make a lot of assumptions here, so if any of these are false and you need an explanation regarding any of the following, let me know.
First of all, I'm assuming your data in Sitecore looks like this:
Video Item
    Comment 1
    Comment 2
    Comment 3
Video Item 2
    Comment 4
    Comment 5
I also assume that you created a Sublayout that is meant to show the comments, and that the Datasource of that Sublayout is a Video Item. (In case this is not true, you should consider it; integration of Sitecore DMS will be a lot easier later.)
In that case, in the code-behind of your sublayout there is no need to use any LINQ. You can simply use the following code:
protected void Page_Load(object sender, EventArgs e)
{
    Sublayout sublayout = Parent as Sublayout;
    string datasource = sublayout.DataSource; // contains the item GUID as a string (if not using queries)
    Item datasourceItem = Sitecore.Context.Database.GetItem(new ID(datasource));

    // The comments are the children of the video item, so bind them directly.
    Repeater.DataSource = datasourceItem.GetChildren();
    Repeater.DataBind();
}
So as you can see, there's little to no reason at all to use any LINQ here. For the sake of testing and argument, you could demand that the Comment items are retrieved using the Sitecore index instead. To do that you can use the code from your other question, or something like the sketch below.
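For reference, a minimal sketch of the index-based version, assuming Sitecore 7+ and the ContentSearch API, reusing the datasourceItem variable from the snippet above (the index name is a placeholder; use the index that covers your context database):

using (var searchContext = Sitecore.ContentSearch.ContentSearchManager
    .GetIndex("sitecore_web_index").CreateSearchContext())
{
    // SearchResultItem.Parent holds the parent item's ID, so this returns the
    // comment items stored directly under the video item (the datasource).
    var comments = searchContext.GetQueryable<Sitecore.ContentSearch.SearchTypes.SearchResultItem>()
                                .Where(result => result.Parent == datasourceItem.ID)
                                .ToList();
}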
I have a bit of LINQ to Entities code in a web app. It basically keeps a count of how many times an app was downloaded. I'm worried that this might happen:
Session 1 reads the download count (eg. 50)
Session 2 reads the download count (again, 50)
Session 1 increments it and writes it to the db (database stores 51)
Session 2 increments it and writes it to the db (database stores 51)
This is my code:
private void IncreaseHitCountDB()
{
    JTF.JTFContainer jtfdb = new JTF.JTFContainer();

    var app = (from a in jtfdb.Apps
               where a.Name.Equals(this.Title)
               select a).FirstOrDefault();

    if (app == null)
    {
        app = new JTF.App();
        app.Name = this.Title;
        app.DownloadCount = 1;
        jtfdb.AddToApps(app);
    }
    else
    {
        app.DownloadCount = app.DownloadCount + 1;
    }

    jtfdb.SaveChanges();
}
Is it possible that this could happen? How could I prevent it?
Thank you,
Fidel
Entity Framework, by default, uses an optimistic concurrency model. Google says optimistic means "Hopeful and confident about the future", and that's exactly how Entity Framework acts. That is, when you call SaveChanges() it is "hopeful and confident" that no concurrency issue will occur, so it just tries to save your changes.
The other model Entity Framework can use should be called a pessimistic concurrency model ("expecting the worst possible outcome"). You can enable this mode on an entity-by-entity basis. In your case, you would enable it on the App entity. This is what I do:
Step 1. Enabling concurrency checking on an Entity
Right-click the .edmx file and choose Open With...
Choose XML (Text) Editor in the popup dialog, and click OK.
Locate the App entity in the ConceptualModels. I suggest toggling outlining and just expanding tags as necessary. You're looking for something like this:
<edmx:Edmx Version="2.0" xmlns:edmx="http://schemas.microsoft.com/ado/2008/10/edmx">
  <!-- EF Runtime content -->
  <edmx:Runtime>
    <!-- SSDL content -->
    ...
    <!-- CSDL content -->
    <edmx:ConceptualModels>
      <Schema Namespace="YourModel" Alias="Self" xmlns:annotation="http://schemas.microsoft.com/ado/2009/02/edm/annotation" xmlns="http://schemas.microsoft.com/ado/2008/09/edm">
        <EntityType Name="App">
Under the EntityType you should see a bunch of <Property> tags. Find the one for the property you want checked for concurrency, here Name="DownloadCount", and modify it by adding ConcurrencyMode="Fixed", like so:
<Property Name="DownloadCount" Type="Int32" Nullable="false" ConcurrencyMode="Fixed" />
Save the file and double click the .edmx file to go back to the designer view.
Step 2. Handling concurrency when calling SaveChanges()
SaveChanges() will throw one of two exceptions: the familiar UpdateException, or an OptimisticConcurrencyException.
If you have made changes to an Entity which has ConcurrencyMode="Fixed" set, Entity Framework will first check the data store for any changes made to it. If there are changes, an OptimisticConcurrencyException will be thrown. If no changes have been made, it will continue normally.
When you catch the OptimisticConcurrencyException, you need to call the Refresh() method of your ObjectContext and redo your calculation before trying again. The call to Refresh() updates the Entity(s), and RefreshMode.StoreWins means conflicts will be resolved using the data in the data store. The DownloadCount being changed concurrently is exactly such a conflict.
Here's what I'd make your code look like. Note that this is more useful when you have a lot of operations between getting your Entity and calling SaveChanges().
// Requires: using System.Data;          (UpdateException, OptimisticConcurrencyException)
//           using System.Data.Objects;  (RefreshMode)
private void IncreaseHitCountDB()
{
    JTF.JTFContainer jtfdb = new JTF.JTFContainer();

    var app = (from a in jtfdb.Apps
               where a.Name.Equals(this.Title)
               select a).FirstOrDefault();

    if (app == null)
    {
        app = new JTF.App();
        app.Name = this.Title;
        app.DownloadCount = 1;
        jtfdb.AddToApps(app);
    }
    else
    {
        app.DownloadCount = app.DownloadCount + 1;
    }

    try
    {
        try
        {
            jtfdb.SaveChanges();
        }
        catch (OptimisticConcurrencyException)
        {
            // Someone else changed the row; reload it from the store and redo the increment.
            jtfdb.Refresh(RefreshMode.StoreWins, app);
            app.DownloadCount = app.DownloadCount + 1;
            jtfdb.SaveChanges();
        }
    }
    catch (UpdateException uex)
    {
        // Something else went wrong...
    }
}
You can prevent this from happening if you only query the download count column right before you are about to increment it: the longer the time between reading and incrementing, the longer another session has to read the count (and later rewrite a wrongly incremented number), messing it up.
Better yet, do the read and increment in a single SQL query:
UPDATE Data SET Counter = Counter + 1
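If you want to stay inside Entity Framework while doing that, one option is to send the UPDATE through the existing ObjectContext (the Apps table and Name/DownloadCount columns here are guesses based on the question's model):

using (var jtfdb = new JTF.JTFContainer())
{
    // The database performs the read-modify-write as one atomic statement,
    // so two concurrent sessions cannot overwrite each other's increment.
    jtfdb.ExecuteStoreCommand(
        "UPDATE Apps SET DownloadCount = DownloadCount + 1 WHERE Name = {0}",
        this.Title);
}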
Since it's LINQ to Entities, execution is deferred. For another session to screw up the count (incrementing from the same base value, losing one count there), it would have to increment app.DownloadCount, I believe, between these two lines:
else
{
    app.DownloadCount += 1; // first line
}
jtfdb.SaveChanges();        // second line
That means the window for the change to occur (making the previously read count stale) is so small that for an application like this it is virtually impossible.
Since I'm no LINQ pro, I don't know whether LINQ actually gets app.DownloadCount before adding one or just adds one through some SQL command, but in either case you shouldn't have to worry about that, IMHO.
You could easily test what would happen in this scenario: start a thread, sleep it, and then start another.
else
{
    app.DownloadCount = app.DownloadCount + 1;
}
// Widen the race window so a second session can read the stale count.
System.Threading.Thread.Sleep(10000);
jtfdb.SaveChanges();
But the simple answer is that no, Entity Framework does not perform any concurrency checking by default (MSDN - Saving Changes and Managing Concurrency).
That site will provide some background for you.
Your options are:
1. Enable concurrency checking, which will mean that if two users download at the same time, and the first updates after the second has read but before the second has updated, you'll get an exception.
2. Create a stored procedure that will increment the value in the table directly, and call the stored procedure from code in a single operation, e.g. IncrementDownloadCounter. This ensures that there is no separate 'read', and therefore no possibility of a lost update (a sketch follows below).
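A minimal sketch of the second option (the procedure body and the Apps/DownloadCount/Name names are assumptions based on the question's code):

// T-SQL, run once against the database:
//   CREATE PROCEDURE IncrementDownloadCounter @name NVARCHAR(255) AS
//       UPDATE Apps SET DownloadCount = DownloadCount + 1 WHERE Name = @name;

private void IncreaseHitCountDB()
{
    using (var jtfdb = new JTF.JTFContainer())
    {
        // One round trip and no separate read: the increment is atomic in the database.
        jtfdb.ExecuteStoreCommand("EXEC IncrementDownloadCounter {0}", this.Title);
    }
}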
I'm working on an existing report and I would like to test it with the database. The problem is that the catalog set during the initial report creation no longer exists. I just need to change the catalog parameter to a new database. The report uses a stored proc for its data, and it looks like if I try to remove the proc to re-add it, all the fields on the report will disappear and I'll have to start over.
I'm working in the designer in Visual Studio and just need to tweak the catalog property to get a preview. I have code working to handle things properly from the program.
If you just need to do it in the designer, then right-click in some whitespace and choose Database -> Set Datasource Location. From there you can use a current connection or add a new connection, so set up a new connection using the new catalog. Then click your current connection in the top section and click Update; your data source will change. But if you need to do this at runtime, then the following code is the best manner:
' SET REPORT CONNECTION INFO
For i = 0 To rsource.ReportDocument.DataSourceConnections.Count - 1
    rsource.ReportDocument.DataSourceConnections(i).SetConnection(crystalServer, crystalDB, crystalUser, crystalPassword)
Next
EDIT: Saw your edit, so I'll keep my original post, but I have to say: I've never had a Crystal Report in design mode in VS, so I can't be of much help there, sorry.
report.SetDatabaseLogon(UserID, Password, ServerName, DatabaseName);
After that you have to roll through all the referenced tables in the report, recursing through subreports, and reset their logon info to one based on the report's connection info.
private void FixDatabase(ReportDocument report)
{
    ConnectionInfo crystalConnectionInfo = someConnectionInfo; // your new connection details

    foreach (Table table in report.Database.Tables)
    {
        TableLogOnInfo logOnInfo = table.LogOnInfo;
        if (logOnInfo != null)
        {
            logOnInfo.ConnectionInfo = crystalConnectionInfo;
            table.LogOnInfo.TableName = table.Name;
            table.LogOnInfo.ConnectionInfo.UserID = someConnectionInfo.UserID;
            table.LogOnInfo.ConnectionInfo.Password = someConnectionInfo.Password;
            table.LogOnInfo.ConnectionInfo.DatabaseName = someConnectionInfo.DatabaseName;
            table.LogOnInfo.ConnectionInfo.ServerName = someConnectionInfo.ServerName;
            table.ApplyLogOnInfo(table.LogOnInfo);
            // Re-point the table at the new catalog.
            table.Location = someConnectionInfo.DatabaseName + ".dbo." + table.Name;
        }
    }

    // Call this method recursively for each subreport.
    foreach (ReportObject reportObject in report.ReportDefinition.ReportObjects)
    {
        if (reportObject.Kind == ReportObjectKind.SubreportObject)
        {
            this.FixDatabase(report.OpenSubreport(((SubreportObject)reportObject).SubreportName));
        }
    }
}
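For completeness, a hypothetical usage sketch (the report path and logon values are placeholders):

var report = new CrystalDecisions.CrystalReports.Engine.ReportDocument();
report.Load(@"C:\Reports\MyReport.rpt");                              // placeholder path
report.SetDatabaseLogon("user", "password", "server", "newCatalog");  // placeholder logon
FixDatabase(report); // re-points every table, and every subreport's tables, at the new catalog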