No matter what I do or which DB I connect to, EF seems to take around 15-30 minutes to generate a model. While it's doing this, I get a "Visual Studio is busy" message in the system tray.
The first DB I connected to was complex and had a lot of data and lots of views, so I thought maybe that's why. But now I have a local DB file with 1 table that has 2 columns and 3 rows, and it still takes the same amount of time.
Eventually VS crashes and restarts. Has anyone had this problem before? Any ideas?
I've looked at Resource Monitor; devenv.exe does not seem to be consuming any resources that would indicate it's doing a lot of work.
What credentials do you use to access your DB? Let's try ruling out domain latency issues first. If this is at work, can you verify the same flow on 2 separate machines using the same domain credentials? This wouldn't apply if you were using local creds.
Turns out I had a Visual Studio DVD in my DVD drive. Each time I did something with EF, VS started to read from the disc. I have no idea what or why, but the little LED would blink. Once I ejected the disc, everything ran fine. Go figure!
Related
A few days ago, one of my customers (running MariaDB 10.3.10 on Windows 10) had a power failure on his computer. After he turned the computer on again, Windows offered him the option to go back to a restore point, which he took. Before that, his database was working correctly as intended, but after he used the restore point something happened, and some stored procedures went from 5-10 seconds to 3-4 minutes.
My first thought was that some tables were corrupted, but after some tests I decided to make a dump and install the data on another computer with the same database engine version (MariaDB 10.3.10), and surprisingly everything worked perfectly. I took the same database to 2 other different computers, and everything ran correctly: no slow queries, nothing. I decided to format the computer with the issue and installed everything from scratch, but nothing changed, same issues, same problems...
Any thoughts on this?
I've recently had my laptop replaced and I've had to install Visual Studio 2015 and SQL Server 2014 Express with Management Studio.
My previous environment was Visual Studio 2015 with SQL Server 2008 R2 Express with Management Studio.
I restored the 2008 R2 databases into SQL Server 2014 Express, same database names, logins etc.
Now when I run any of my ASP.NET MVC 5 applications (using Entity Framework 6) on my laptop using Visual Studio, I'm getting sporadic timeout errors. Please see below.
Occasionally the application database calls will perform as expected, but mostly they are either very slow or timeout.
I'm finding it difficult to understand why this is happening, as I never had any of these issues on my previous laptop with SQL Server 2008 R2 Express.
Also, these applications are on a live web server and being used by 1000s of users each day without any of these problems. This makes me think there is something possibly wrong with the installation of SQL Server 2014 Express on my laptop.
I have seen others comment on extending the Command Timeout on my DbContext:

using System.Data.Entity;
using System.Data.Entity.Infrastructure;

public class MyDatabase : DbContext
{
    public MyDatabase()
        : base(ContextHelper.CreateConnection("Connection string"), true)
    {
        // Raise the command timeout (in seconds) for every query issued through this context.
        ((IObjectContextAdapter)this).ObjectContext.CommandTimeout = 180;
    }
}
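I've also seen that EF6 seems to expose the timeout directly on the context's Database property, so presumably the same thing could be written without the cast (just for reference, I haven't tested this variant myself):

using System.Data.Entity;

public class MyDatabase : DbContext
{
    public MyDatabase()
        : base(ContextHelper.CreateConnection("Connection string"), true)
    {
        // EF6: command timeout in seconds; leaving it null uses the provider default.
        Database.CommandTimeout = 180;
    }
}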
But I don't see this as a solution, as I didn't need it with my previous laptop/environment, and the current live applications don't need it either.
I'm stumped here and would really appreciate any help or guidance.
Thanks.
Update
Thanks to the suggestions from Steve Py, I decided to check the memory usage on my new laptop when running Visual Studio 2015 and SQL Server Express 2014 concurrently. I've included a screenshot below which shows that 90% of the available memory is used (3.5 GB out of 3.9 GB). I'm far from an expert in tuning a device for software development, but this seems like it may be a reason why my applications are timing out when I run them locally.
Is there anyone on Stack Overflow who can tell me whether this looks like a possible cause?
Thanks.
Firstly I'd look at hooking up a profiler to capture the queries coming from EF. For SQL Server you can use ExpressProfiler. This will give you the actual SQL EF is trying to run, the # of row reads, writes, and execution time. Copy the SQL queries and paste them into a new query window on the DB and execute them, plus have a look at the execution plan. Does the execution time correlate with EF? (change parameters and re-run in SQL to ensure you aren't getting cached results)
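If installing a profiler isn't convenient, EF6 can also log the SQL it generates (along with execution times) by itself. A minimal sketch, assuming the MyDatabase context from the question and writing to the debug output:

using System.Data.Entity;
using System.Diagnostics;

class EfSqlLoggingSketch
{
    static void Main()
    {
        using (var db = new MyDatabase())
        {
            // EF6 passes every generated SQL command and its elapsed time to this delegate.
            db.Database.Log = s => Debug.Write(s);

            // Run one of the slow queries here and compare the logged time with the
            // same SQL executed directly in Management Studio.
        }
    }
}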
Other factors are the hardware on the two laptops. You'd hope that the new laptop would have more grunt than the old one, more cores, better cores, more RAM, but how do they compare? How much memory is free when nothing is running? What kind of HDD was in the two machines? For instance dropping from an i7 with an SSD down to an i3 with 5400rpm HDD, and half the RAM will be extremely noticeable, even if the clock speed is higher.
When it comes to databases there are a number of factors that can impact performance, even when backing up and restoring. For instance the Isolation Level and recovery model settings for the database can play a part, especially for larger databases. I'd also look at server settings such as how much RAM the database server is allocated to be able to use. Feel free to paste some results from the profiler for slow queries.
Edit: Based on the screenshot of the resource use, my guess is your new laptop is potentially underpowered. 4GB of RAM is bare-bones for Windows 10, especially when running Visual Studio and SQL Server, even for just a development-use database. The history graphs for CPU and disk also show heavy activity. If that's all you've got to work with, then the next step would be to look at what is using the memory. SQL Server by default will attempt to use whatever is available, and it can be quite greedy, but it's generally a good idea to set boundaries on the server. From SQL Server Management Studio you can bring up the properties of the server and select "Memory" to set a minimum and maximum memory size. For 4GB I'd set the minimum to 500MB and the max to 2000MB. For processors you can also enable "Boost SQL Server Priority".
Next, on the database side of things, look into the file and recovery options. What is the size of the database MDF file, and of the transaction log (LDF file)? From the database properties window, under "General" you should see the "Size", which is the MDF size. For the LDF you will probably need to check on the file system. A large LDF can be bad for performance and indicates your database should be backed up and the log compressed/truncated. Log files default to grow by percent, so they can grow fast and churn the disk. In the "Options" tab you can check the "Recovery Model" and set it to "Simple" for a dev database to significantly cut down on log file churn/growth. Production databases will use "Full".
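If you prefer to script these server and database settings rather than click through the SSMS dialogs, here is a rough sketch of the equivalent commands run over ADO.NET. The instance name, database name and memory values below are assumptions; substitute your own:

using System.Data.SqlClient;

class SqlExpressDevSetup
{
    static void Main()
    {
        using (var conn = new SqlConnection(@"Server=.\SQLEXPRESS;Integrated Security=true"))
        {
            conn.Open();

            // Cap SQL Server's memory so it leaves room for Visual Studio (values in MB).
            Exec(conn, "EXEC sp_configure 'show advanced options', 1; RECONFIGURE;");
            Exec(conn, "EXEC sp_configure 'min server memory (MB)', 500; RECONFIGURE;");
            Exec(conn, "EXEC sp_configure 'max server memory (MB)', 2000; RECONFIGURE;");

            // Switch a dev database to the Simple recovery model to cut log file churn.
            // (MyDevDb is a placeholder name.)
            Exec(conn, "ALTER DATABASE [MyDevDb] SET RECOVERY SIMPLE;");
        }
    }

    static void Exec(SqlConnection conn, string sql)
    {
        using (var cmd = new SqlCommand(sql, conn))
        {
            cmd.ExecuteNonQuery();
        }
    }
}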
For development purposes it helps to have a bit more grunt from a laptop. While things like "ultrabooks" look like good options and are nice and lightweight, they rarely have the grunt and resources needed for a dev environment (plus generally poor keyboards and displays to boot! :) ). There is also a significant gap in price between ultrabooks and workstation-replacement laptops. What I've found fits in that gap and serves as an exceptional development PC replacement is a gaming laptop. They are tuned for performance and usually come with 8GB minimum, with expansion available. They also happen to come with exceptional keyboards and displays, and they're typically a fair bit cheaper than workstation replacements, which seem to price in a premium. I use an MSI GE65 series which came with 16GB, SSD+HDD, and a great keyboard, and was over $1000 cheaper than the closest "workstation" laptop. It does draw a couple of stares coming into a client site with a gaming laptop with its LED keyboard and lid badge; not a single game on it though! :)
I tried to synchronize the database in Visual Studio 2015 after creating a project, an EDT, an enum and a table in order to create a new screen in Dynamics 365.
When I try to synchronize it, it stops in the middle of the schema-checking process. Though the DB synchronization doesn't seem to have a problem for the first few minutes, it always stops during this process, as I describe below.
Log Details:
"Schema has not changed between new table 'DPT_TableDT' and old table
'DPT_TableDT' with table id '3997'. Returning from
ManagedSyncTableWorker.ExecuteModifyTable() Syncing Table Finished:
DPT_TableDT. Time elapsed: 0:00:00:00.0010010"
Could you tell me how to solve this issue?
Thanks in advance.
Full database synchronization log: DB Sync Log (screenshot attached)
From what you've described and also shown in your screenshot, this does not look like an error but is simply describing X++ and Dynamics AX/365FO behaviour.
When you say that it "doesn't have a problem for the first few minutes", I'm guessing you're just not being patient enough. Full database syncs generally take between 10 and 30 minutes, but can take less or more time depending on a variety of factors, such as how much horsepower your development environment has, how many changes are being synced, etc. I would wait at least one hour before considering the possibility that the sync engine has errors (or even run it overnight and see what information it has for you in the morning).
The message you've posted from the log ("Schema has not changed") isn't an error message; it is just an informational log from the sync engine. It is simply letting you know that the table did not have any changes to propagate to SQL Server.
Solution: Run the sync overnight and post a screenshot of the results or the error list window in Visual Studio.
I've recently been stymied by a long running application where Access v2003 replicas refused to synchronize. The message returned was "not enough memory". This was on machines running Windows 10. The only way I was able to force synchronizing was to move the replicas onto an old machine still running Windows 98 with Office XP, which allowed synchronizing and conflict resolution. When I moved the synchronized files back to the Windows 10 machine they still would not synchronize.
I finally had to create a blank database and link to a replica, then use make-table queries to select only data fields to create new tables. I was then able to create new replicas that would synchronize.
From this I've come to suspect the following:
Something in Windows 10 has changed and caused the problem with synchronizing/conflict resolution.
Something in the hidden/protected fields added to the replica sets is seen as a problem under Windows 10 that is not a problem under Windows 98.
One thing I noticed is that over the years the number of replicas in the synchronizing list had grown to over 900 sets, but the only way to clear the table was to create a new clean database.
I have a program that I run on multiple network PCs. When I compiled the most recent version, it runs extremely slowly on 2 PCs on the network, but runs fine for everyone else.
This used to happen with my old dev PC when I had an additional 2gb RAM installed. When I would remove the additional 2gb and recompile, it would then work fine for everyone.
Now, I am on a completely new machine and am having the same issue. I've tried to rebuild the project after rebooting, but still have the same issue.
For all other PCs, the program loads in about 3-5 seconds. On these 2 PCs, it takes anywhere from 45 seconds to 1.5 mins to load...
One of the PCs is an older Dell Dimension 8200, but the other is a newer OptiPlex that is identical to several other PCs on the network, so this is what is really making it so confusing.
For now, I've had to revert to the old version so it will run correctly for everyone.
Does anyone have any idea of anything to try?
Thanks in advance!!!
Edit:
Ok, it was an exhausting day yesterday trying various things to solve this issue. Here is what I tried and where the problem begins:
Using the new program
Went back to old versions of all updated components, but still had the same issue
Using the old program
I decided to go back to the drawing board and start from the old version of the application and incrementally add the new features a small piece at a time.
Recompiled the old version using the old components - program works fine
Updated to new DevExpress components - program works fine
Updated to new ESBPCS components - program works fine
Updated to new DeepSoftware components - program works fine
Ok, so now we know there is nothing wrong with the component sets I've updated...
Added 1 image to each of 2 image lists - program works fine
Added new database table - program works fine
Added code to open and close the new table - program works fine
Added new action to action list and added a menu item and toolbar button to new action (action does nothing at this point) - program works fine
Added a new BLANK form to the application and added code to open the new form - BAM!!!
So, adding just one form to the application is what's causing the issue! I removed all the code for the opening of the form, commented out the uses clauses and removed the uses entry from the project source and everything is back to normal!
Anybody have any idea about this?
Thanks!
Edit 2:
For @Warren P - here is my .DPR source:
program Scheduler;

uses
  ExceptionLog,
  Forms,
  SchedulerMainUnit in 'SchedulerMainUnit.pas' {FrmMain},
  SchedulerDBInfoUnit in 'SchedulerDBInfoUnit.pas' {FrmDBInfo},
  SchedulerHistoryUnit in 'SchedulerHistoryUnit.pas' {FrmHistory},
  SchedulerOptionsUnit in 'SchedulerOptionsUnit.pas' {FrmOptions},
  SchedulerExtVersionUnit in 'SchedulerExtVersionUnit.pas' {FrmExtVersion},
  SchedulerSplashUnit in 'SchedulerSplashUnit.pas' {FrmSplash},
  SchedulerInfoUnit in 'SchedulerInfoUnit.pas' {FrmInfo},
  SchedulerShippedUnit in 'SchedulerShippedUnit.pas' {FrmShipped}; {<-- This is the new form with the issue}

{$R *.res}

begin
  Application.Initialize;
  Application.Title := 'SmartWool WIP Scheduling Assistant';
  Application.CreateForm(TFrmMain, FrmMain);
  Application.CreateForm(TFrmDBInfo, FrmDBInfo);
  Application.CreateForm(TFrmHistory, FrmHistory);
  Application.CreateForm(TFrmOptions, FrmOptions);
  Application.CreateForm(TFrmExtVersion, FrmExtVersion);
  Application.Run;
end.
And here is the initialization section of the main form that creates the splash:
initialization
  FrmSplash := TFrmSplash.Create(Application);
  FrmSplash.Show;
  FrmSplash.Refresh;
Edit 3:
Anybody??? Please?
It could be that the program is waiting for timeouts when trying to access resources that are not available on that machine such as network drives or Internet hosts.
Try running Process Monitor when starting up your program and look for file open calls. Filter the output so it only shows your process.
http://technet.microsoft.com/en-us/sysinternals/bb896645
Performance problems can seem very daunting at first.
I have been on many teams where people have tried to guess at a reason for performance problems. This sometimes works, but is far less effective than actually measuring the code.
When reproducible on a development machine, I would recommend a profiler.
There was a previous question that asked about Delphi Profiling tools, which lists several possible tools you could use.
When you can't reproduce the problem on your development machine, then it becomes a bit more difficult, but not impossible. Typically I have found that problems are related to an application dependency that is different, and not performing well. Understanding the external influences on your application can help pinpoint the problem.
Specifically, common external problems in some of my applications have been:
Network
Database
Application Servers
Installation or Data File Location (i.e. Disk Performance)
Virus and Malware Scanners
Other applications interfering with yours, such as a virus.
To monitor for items related to the network (i.e. database, web services, etc.) I typically use Wireshark, which allows me to see if resources are responding in expected times. My most common problem is poorly performing DNS, which can be found using Wireshark.
You can use the AutoRuns program to determine everything that starts up when your computer does; it's useful for determining differences between machines.
But most of all, I have logging that can be turned on in my applications, and this allows me to isolate the problem to a specific area of code. Narrowing it down to a specific section of code reduces the guessing and lets you focus on a few possible problems.
I created a log function for this that I call at specific places (in your case especially during startup). It adds a timestamp to each log text and stores them in a TMemo that is regularly saved. Not only very helpful when debugging, but may also shed some light on your problem.
Are you using code signing - ie Microsoft Authenticode? If so, then outdated certificate authorities on the computers can cause significant delays to startup.
First, I would try to defragment the hard disk. If it is still slow, I would check the power supply. Maybe your hard disk is getting insufficient power.
Check if the same antivirus software is installed on those 2 problematic computers. If so, then your Delphi application may match a byte pattern used in some virus written in Delphi. Update the virus definitions to solve it, report the false alarm to the antivirus company, or change antivirus software.
Check if there isn't any printer installed on those 2 problematic computers. If that is the case, add any printer and try again.
Idea 1:
One reason I have seen for very slow application load times is when printing or reporting components, like the Developer Express ExpressPrint components, are in your application.
The problem I saw when using the Developer Express printing components is that I had an offline or non-responsive network printer in my list of printers (check the Control Panel printer icon). Some of those Developer Express components seem to read some information from each printer you have installed, and the solution was to go to those clients and delete the old printers that were no longer being used from their Control Panel. Each non-responding network printer added up to 60 seconds (a TCP timeout) to the startup time of my application.
Update - Idea 2:
Download MS DebugView and install it on the machine that runs slowly. Now go back to your main development PC, open the IDE, and open your main project file (right-click on the project, View Project Source in the project viewer); this will show you the contents of your main project source file (.dpr). Go to the main begin...end. block. Now set a breakpoint on the main begin statement and single-step INTO (not OVER), and you will see all the module initialization sections. In each one add this: OutputDebugString('ModuleName').
Now when you run this inside the Delphi IDE you will see the messages, see how far apart they come in, and understand what is taking a long time to initialize. Instead of installing the Delphi IDE onto the machine that runs slowly, you run DebugView (a single executable of less than 400 KB), and it will show you these debug messages along with a nice time display (##.# seconds) for each message.
MS Debug view is here.
Are you allowing the forms to be constructed on initialization within the DPR source? If so, you may do well to consider whether or not you want those forms sucking up memory the entire time, and moreover whether you want them wasting the application's time on load.
A rule of thumb: if the form is used a LOT during the application's execution, allow it to be constructed when the application loads (this will work out faster overall than constructing the instance "on demand").
If the form is not used very often at all (for example, a Dialog or an About Box), delete the "Application.CreateForm" line from the DPR source, and instead construct your instance on request...
var
  LForm: TfrmAbout;
begin
  // CreateForm takes the form class first, then the variable that receives the instance.
  Application.CreateForm(TfrmAbout, LForm);
  try
    LForm.ShowModal;
  finally
    LForm.Free;
  end;
end;
Now that form (which may not even be displayed during the program's execution) is not sucking up system resources, and will not slow down the application's load time.
It may not solve your problem 100%, but it should certainly help!
My SQL Compact database is very simple, with just three tables and a single index on one of the tables (the table with 200k rows; the other two have less than a hundred each).
The first time the .sdf file is used by my Compact Framework application on the target Windows Mobile device, the system hangs for well over a minute while "something" is done to the database: when deployed, the DB is 17 megabytes, and after this first usage, it balloons to 24 megs.
All subsequent usage is pretty fast, so I'm assuming there's some sort of initialization / index building going on during this first usage. I'd rather not subject the user to this delay, so I'm wondering what this initialization process is and whether it can be performed before deployment.
For now, I've copied the "initialized" database back to my desktop for use in the setup project, but I'd really like to have a better answer / solution. I've tried "full compact / repair" in the VS Database Properties dialog, but this made no difference. Any ideas?
For the record, I should add that the database is only read from by the device application -- no modifications are made by that code.
Yes, it recreates your indexes because the database was created or opened on a desktop computer. Copy your indexed database from the device into your setup.
more info here:
http://blogs.msdn.com/sqlservercompact/archive/2009/04/01/after-moving-the-database-from-one-platform-to-other-the-first-sqlceconnection-open-takes-more-time.aspx
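If shipping a device-initialized copy isn't practical, another option is to open the database once yourself at install time or behind a first-run splash screen, so the one-off reindex doesn't hit the user mid-session. A minimal sketch, with a made-up .sdf path (adjust to your deployment location):

using System.Data.SqlServerCe;

static class SdfWarmup
{
    // The first open on the device triggers the index rebuild; later opens are fast.
    public static void WarmUp()
    {
        using (var conn = new SqlCeConnection(@"Data Source=\Program Files\MyApp\MyData.sdf"))
        {
            conn.Open();
        }
    }
}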
Since the db is read only, and if the "initialized" db no longer inflates, I would go with simply putting it into the setup. Just confirming that your approach makes sense.