SQL Server Management Studio disable "no row was updated" popup - insert

I am doing a few experiments with a data ingestion procedure on a SQL Server development environment.
For each test I copy a few thousand rows into the DB using the SSMS GUI (not a script) because it's quicker. But every time I make a mistake in the sample data format, I have to manually close a ton of popups of the kind "No row was updated" (one for each badly formatted row).
I wonder if there is a way to not display a popup for every error inside the test batch; it drives me crazy that I have to manually close thousands of popups or just kill the SSMS process.

Press and hold the Esc key until the popups are gone for all records throwing errors. It is the easiest solution I know.

The only "solution" to reduce the popups in the first place would be to copy only the first row and try with it before pasting the whole batch of rows.
There is not yet an option for this in the designer section (Table and Database Designers options, SSMS 18).
The designer evaluates each line as its own statement with its own transaction by design; therefore it is capable of inserting the individual rows that are not affected by an error. There is also no option to change this behavior in the query execution section.
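If the popups get out of hand, one workaround is to script the inserts instead of pasting into the designer grid, so a bad value surfaces as a message in the Messages pane rather than as a modal dialog. A minimal sketch, with a hypothetical table and columns:

-- Hypothetical staging table; adjust names and types to your schema.
CREATE TABLE dbo.SampleData (
    Id        int          NOT NULL,
    Amount    decimal(9,2) NOT NULL,
    CreatedOn date         NOT NULL
);
GO
BEGIN TRY
    INSERT INTO dbo.SampleData (Id, Amount, CreatedOn)
    VALUES (1, 19.99, '2024-01-15'),
           (2,  5.00, '2024-01-16');  -- one badly formatted value fails the whole statement
END TRY
BEGIN CATCH
    -- A single message in the Messages pane instead of one popup per row.
    PRINT ERROR_MESSAGE();
END CATCH;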

Related

Oracle SQL Developer - Array Fetch Size

I thought for sure this would be an easy issue, but I haven't been able to find anything. In SQL Server SSMS, if I run a SQL statement, I get back all the records of that query, but in Oracle SQL Developer I can apparently get back at most 200 records, so I cannot really test the speed or look at the data. How can I increase this limit to be as much as I need, to match how SSMS works in that regard?
I thought this would be a quick Google search, but it seems very difficult to find, if it is even possible. I found one article on Stack Overflow that states:
You can also edit the preferences file by hand to set the Array Fetch Size to any value.
Mine is found at C:\Users\<user>\AppData\Roaming\SQL Developer\system4.0.2.15.21\o.sqldeveloper.12.2.0.15.21\product-preferences.xml on Win 7 (x64).
The value is on line 372 for me.
I have changed it to 2000 and it works for me.
But I cannot find that location. I can find the SQL Developer folder, but my system is 19.xxxx and there is no corresponding file in that location. I did a search for "product-preferences.xml" and couldn't find it in the SQL Developer folder. Not sure if Windows 10 has a different location.
As such, is there any way I can edit a config file of some sort to change this setting, or any other way to do it?
If you're testing execution times you're already good. Adding more rows to the result screen is just adding fetch time.
If you want to add fetch time to your testing, execute the query as a script (F5). However, this still has a max number of rows you can print to the screen, also set in preferences.
Your best bet I think is the AutoTrace feature. You can tell it to fetch all the rows, you'll also get a ton of performance metrics and the actual execution plan.
Check the last box in the Autotrace preferences (the one that tells it to fetch all rows), then use the Autotrace button in the worksheet toolbar to run the scenario.

Disposing the BindingSource of a ComboBox is extremely slow...

I have a main table with 4 lookup tables. All tables are bound to SQL tables accessed via stored procedures. Each lookup table has between 3 and 27 rows. The main table has about 22,000 rows (pet food & products).
My DataGridView displays rows of the main table with the lookup columns configured as ComboBoxes. I can insert, delete, edit the main table, ... all of these functions work fine. The initial loading of the form and inserting a new row each take about a second but this is not concerning. All other operations are relatively fast.
Here is a screen shot of the DataGridView:
The problem comes when I close the form...and only after having inserted one or more rows into the main table. The closing of the form can take up to a minute. In the closing of the form I now dispose of the binding sources for the 5 tables myself so that I can time them. Disposing of the binding source for the 4 lookup tables routinely takes 10-15 seconds per table. Closing the binding source for the main table takes no time at all. Again, this only happens after inserting a new row into the main table. I can edit main table rows, change lookup column values, and delete rows and closing the form in those use cases is instant.
I have tried running the program within VS, outside of VS running a debug EXE, and running outside of VS running a release version of the EXE, all with similar results.
What can I do to prevent this long delay disposing of the ComboBox binding sources? Is this typical and are there alternatives I should be considering?
After three days of pounding my head against the wall trying all kinds of unsuccessful resolutions, I rebuilt the application from scratch. That fixed the problem, and I believe I discovered the cause. At first I just created the data set with all the table adapters, which was pretty fast, and then I added a single form and grid to mimic the condition I described above. Testing confirmed no issues at all, so I continued adding more forms with the same ComboBox look-ups, and it continues to work fine. I am pretty sure that there was something screwy in the data set definitions I had previously. Hopefully this helps someone in the future.

Validation Rule in Access Not Accepting Valid Data; Can't Save Record

I am using some simple validation rules on a table in the database I manage (it is ANSI-89 at the moment, if that helps). One, for example, reads:
Like "#" Or Like "##"
As I understand it, this should allow any single- or double-digit number (1, 2, 35, 00, 99, et cetera). However, typing "1" into the field is rejected, and the validation rule keeps prompting for a correct input, as it is a required field.
Similarly, and more importantly, I have another field that is validated like so:
Like "######?"
"201620A" should be valid (as you may guess data for this field is based partially on the year). And, while not real data, "123456Z" should be accepted as well. Despite this, both are rejected.
Because they are required fields, I am then unable to save the record... usually. Bizarrely, I have sometimes been able to save the record successfully. I.e., it's not behaving consistently.
I am baffled. I wish to retain the validation rules, as these fields are essential and I would like to at least do some basic checking to ensure they have been entered correctly. I realize there must be some simple thing I am overlooking...
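(As an aside, for readers who think in SQL Server terms rather than Access: the same two checks expressed as T-SQL CHECK constraints would look like the hypothetical sketch below. This is an analogy only; Access's # matches a digit where T-SQL uses [0-9], and Access's ? matches a single character where T-SQL uses _.)

-- Analogy only: a hypothetical table with the two validation rules as CHECK constraints.
CREATE TABLE dbo.Example (
    SmallNum varchar(2)
        CHECK (SmallNum LIKE '[0-9]' OR SmallNum LIKE '[0-9][0-9]'),
    YearCode char(7)
        CHECK (YearCode LIKE '[0-9][0-9][0-9][0-9][0-9][0-9]_')  -- e.g. '201620A'
);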
As noted in the comments, HansUp's suggestion of using Compact & Repair seems to have corrected the issue; once I did that, the problem was consistently gone for over two months. While it is a simple process, in case anyone needs it, Microsoft's instructions on how to do this can be found here: https://support.office.com/en-us/article/Compact-and-repair-a-database-6ee60f16-aed0-40ac-bf22-85fa9f4005b2?ui=en-US&rs=en-US&ad=US&fromAR=1
They suggest backing up the database first. Here is an excerpt:
Compact and repair a database that you have open
NOTE: If other users are also currently using the database file, you cannot perform a compact and repair operation.
On the File tab, click Info, and then click Compact and Repair Database.
Compact and repair a database that is not open
NOTE: If other users are currently using the database file, you cannot perform a compact and repair operation. While you run the compact and repair operation, no one can use the database file.
Start Access, but do not open a database.
Point to Info, and then click Compact and Repair Database.
In the Database to Compact From dialog box, navigate to and double-click the database that you want to compact and repair.

What is the preferred method of refreshing a combo box when the data changes?

What is the preferred method of refreshing a combo box when the data changes?
If a form is open and the combo box data is already loaded, how do you refresh the contents of the combo box without the form having to be closed and reloaded?
Do you have to do something on the Click event on the combo box? This would seem to be a potential slow down for the app if there is a hit to the database every time someone clicks on a combo box.
You must determine:
1) When does your data change?
If it depends on other users' activity, so you can't tell whether it has changed without querying the DB, you can pick an optimal time for a refresh, like form loading or every click, or you can use a timer control to refresh the data at a fixed interval.
2) When does your user need to know about that change?
Try to understand how urgent it is for the user to know about a change. Talk to them. Depending on that, decide when you need to refresh your data.
Finally:
There isn't one correct way of doing this. It depends on the software's structure, the users' needs and the specific situation.
Hope it helps. Good luck!
UPDATE:
I can add a solution that I used recently. If something isn't clear, just ask.
I assume you're refreshing the combo from MS SQL Server. If so:
1. Create a table that stores the combo data's last change date or a version.
2. In the onClick event, or using a timer control that checks for changes every 5 minutes (or any other interval), compare the last change date (or version) of your combo with the last change date (or version) in that table, and refresh the combo only if it has changed.
3. Store the last date (or version) in a variable or in a textbox control, changing its value every time you refresh the combo.
4. Update the last date (or version) in that table whenever the data changes.
In this case you'll just need to check for changes, not reload the data every time; see the sketch after this answer.
P.S. If this solution doesn't fit you, just refresh on every click event. There's no better event for that case.
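A minimal T-SQL sketch of that version-table idea, with hypothetical names (dbo.LookupItems stands in for the combo's source table, dbo.ComboVersions for the tracking table):

-- Hypothetical lookup table the combo is bound to.
CREATE TABLE dbo.LookupItems (
    ItemId   int           NOT NULL PRIMARY KEY,
    ItemName nvarchar(100) NOT NULL
);

-- One row per tracked table, holding a version counter.
CREATE TABLE dbo.ComboVersions (
    TableName sysname NOT NULL PRIMARY KEY,
    Version   bigint  NOT NULL DEFAULT 0
);
INSERT INTO dbo.ComboVersions (TableName) VALUES (N'LookupItems');
GO

-- Bump the version whenever the lookup data changes.
CREATE TRIGGER dbo.trg_LookupItems_Version
ON dbo.LookupItems
AFTER INSERT, UPDATE, DELETE
AS
BEGIN
    SET NOCOUNT ON;
    UPDATE dbo.ComboVersions
    SET Version = Version + 1
    WHERE TableName = N'LookupItems';
END;
GO

-- The client caches this cheap scalar after each refresh and repopulates
-- the combo only when the value differs from the cached one.
SELECT Version FROM dbo.ComboVersions WHERE TableName = N'LookupItems';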
It depends on how many people will be using the form, but in normal circumstances using the onclick event of the select box is fine.
Using an Ajax call is good because it means you don't have to load the entire page.
One thing is clear: you are using a dropdown, which means you don't have many items to load, probably around 20 or 30.
So what is the problem with a database call?
Create a procedure that will reuse its execution plan and give you a fast result, or put the table you need to load into a cache and refill the cache at set times; if the data changes, reload the dropdown.
I work on a Windows application and face the same thing, and there is no better option than calling the database or using the cache.
I can see two ways of doing this:
Put a "Refresh" button in UI and reload data only when the user clicks the button. It should be clear to the user (descriptive label, message box or whatever) that by hitting refresh its current selection(s) might change.
Monitor data changes in the database for the combo's underlying table. When data changes, the UI may either update immediately or just store a flag about data having changed (more on this later). In order to know rapidly when data changes, a database trigger seems the best solution to me: the trigger (UPDATE, INSERT, DELETE) is set on the combo's underlying table and increments a counter (datetime, version, whatever floats your boat) in a separate table created only for this purpose. Every time the combo is repopulated (including form load), the counter's value is attached (tag?) to it, to be compared with the current database value. Getting the current counter value could be done on a timer.
Now, if the two counters are different there are two options:
A. Update the UI immediately. I would normally hate such a UI but, not knowing what your actual requirements are, this may go as an option.
B. Set a flag that the UI should be updated. On the dropdown event, check the flag: if it's set, start by repopulating the combo.
In most situations I'd go without any refresh at all or with the first solution but it really depends on requirements.
HTH.
EDIT:
The purpose of the trigger/counter setup is not only to get change info fast but to actually know if data changed, which would be much more complicated to accomplish by directly querying the underlying table. Sorry if this wasn't clear in my initial post (or even after this addition) but English is not my native tongue.
Question 1: ComboboxName.Clear and then ComboboxName.Items.AddItem for each item.
Question 2: Of course this depends on how often the data changes and how big the list is, but I would probably put a timer that is set for every minute or so. This will prevent too many hits to the DB and will make sure your form isn't taking too much time filling in values to the combobox.

Fast query runs slow in SSRS

I have an SSRS report that calls out to a stored procedure. If I run the stored procedure directly from a query window, it will return in under 2 seconds. However, the same query run from an SSRS 2005 report takes up to 5 minutes to complete. This is not just happening on the first run; it happens every time. Additionally, I don't see this same problem in other environments.
Any ideas on why the SSRS report would run so slow in this particular environment?
Thanks for the suggestions provided here. We have found a solution and it did turn out to be related to the parameters. SQL Server was producing a convoluted execution plan when the procedure was executed from the SSRS report, due to 'parameter sniffing'. The workaround was to declare variables inside the stored procedure and assign the incoming parameters to those variables; the query then used the variables rather than the parameters. This made the query perform consistently whether called from SQL Server Management Studio or through the SSRS report.
I will add that I had the same problem with a non-stored procedure query - just a plain select statement. To fix it, I declared a variable within the dataset SQL statement and set it equal to the SSRS parameter.
What an annoying workaround! Still, thank you all for getting me close to the answer!
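A minimal sketch of the variable-assignment workaround described above, with a hypothetical procedure and table:

-- Hypothetical example: copy incoming parameters into local variables so
-- the optimizer cannot sniff an atypical parameter value at compile time
-- and build a plan tuned only for it.
CREATE PROCEDURE dbo.GetOrders
    @CustomerId int
AS
BEGIN
    SET NOCOUNT ON;

    DECLARE @LocalCustomerId int = @CustomerId;

    SELECT OrderId, OrderDate, Total
    FROM dbo.Orders
    WHERE CustomerId = @LocalCustomerId;  -- uses the variable, not the parameter
END;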
Add this to the end of your proc: option(recompile)
This will make the report run almost as fast as the stored procedure
I had the same problem. Here is my description of it:
"I created a stored procedure which would generate 2,200 rows and would execute in almost 2 seconds; however, after calling the stored procedure from SSRS 2008 and running the report, it never actually finished and ultimately I had to kill BIDS (Business Intelligence Development Studio) from Task Manager."
What I tried: I tried running the SP from the reportuser login, but the SP ran normally for that user as well; I checked Profiler but nothing worked out.
Solution:
The problem is that even though the SP generates the result quickly, the SSRS engine takes time to read that many rows and render them back.
So I added the WITH RECOMPILE option to the SP and ran the report... this is when the miracle happened and my problem got resolved.
I had the same scenario occurring: a very basic report, where the SP (which only takes in 1 param) was taking 5 seconds to bring back 10K records, yet the report would take 6 minutes to run. According to Profiler and the RS ExecutionLogStorage table, the report was spending all its time on the query. Brian S.'s comment led me to the solution: I simply added WITH RECOMPILE before the AS statement in the SP, and now the report time pretty much matches the SP execution time.
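For reference, the two recompile hints mentioned in these answers go in different places. A hedged sketch with hypothetical procedure and table names:

-- 1) WITH RECOMPILE in the procedure header: the whole procedure gets
--    a fresh plan on every execution.
CREATE PROCEDURE dbo.GetOrdersProcLevel
    @CustomerId int
WITH RECOMPILE
AS
BEGIN
    SET NOCOUNT ON;
    SELECT OrderId, OrderDate, Total
    FROM dbo.Orders
    WHERE CustomerId = @CustomerId;
END;
GO

-- 2) OPTION (RECOMPILE) at the end of an individual statement: only that
--    statement is recompiled, using the actual parameter values.
CREATE PROCEDURE dbo.GetOrdersStmtLevel
    @CustomerId int
AS
BEGIN
    SET NOCOUNT ON;
    SELECT OrderId, OrderDate, Total
    FROM dbo.Orders
    WHERE CustomerId = @CustomerId
    OPTION (RECOMPILE);
END;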
I simply deselected 'Repeat header columns on each page' within the Tablix Properties.
If your stored procedure uses linked servers or openquery, they may run quickly by themselves but take a long time to render in SSRS. Some general suggestions:
Retrieve the data directly from the server where the data is stored by using a different data source instead of using the linked server to retrieve the data.
Load the data from the remote server to a local table prior to executing the report, keeping the report query simple.
Use a table variable to first retrieve the data from the remote server and then join with your local tables, instead of directly returning a join with a linked server (see the sketch below).
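A minimal sketch of that last suggestion, assuming a hypothetical linked server LINKEDSRV and remote table RemoteDb.dbo.Customers:

-- Pull the remote rows once into a local table variable, then join locally,
-- so the report never waits on a distributed join. All names are hypothetical.
DECLARE @RemoteCustomers TABLE (
    CustomerId   int PRIMARY KEY,
    CustomerName nvarchar(200)
);

INSERT INTO @RemoteCustomers (CustomerId, CustomerName)
SELECT CustomerId, CustomerName
FROM LINKEDSRV.RemoteDb.dbo.Customers;

SELECT o.OrderId, o.Total, rc.CustomerName
FROM dbo.Orders AS o
JOIN @RemoteCustomers AS rc
    ON rc.CustomerId = o.CustomerId;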
I see that the question has been answered, I'm just adding this in case someone has this same issue.
I had trouble with the report HTML output on a report retrieving 32,000 lines. The query ran fast but the output to the web browser was very slow. In my case I had to activate "Interactive Paging" to let the user see the first page and be able to generate the Excel file. The pro of this solution is that the first page appears fast and the user can generate an export to Excel or PDF; the con is that the user can scroll only the current page. If the user wants to see more content, he or she must use the navigation buttons above the grid. In my case users accepted this behavior because the export to Excel was more important.
To activate "Interactive Paging" you must click on the free area in the report pane and change the "InteractiveSize" \ "Height" property at the report level in the Properties pane. Set this property to something different from 0; I set it to 8.5 inches in my case. Also ensure that you uncheck the "Keep together on one page if possible" property at the Tablix level (right-click on the Tablix, then "Tablix Properties", then "General" \ "Page Break Options").
I came across a similar issue of my stored procedure executing quickly from Management Studio but executing very slow from SSRS. After a long struggle I solved this issue by deleting the stored procedure physically and recreating it. I am not sure of the logic behind it, but I assume it is because of the change in table structure used in the stored procedure.
I faced the same issue. For me the fix was just to uncheck the option
Tablix Properties => Page Break Options => Keep together on one page if possible
in the SSRS report. It was trying to put all the records on the same page instead of creating many pages.
Aside from the parameter-sniffing issue, I've found that SSRS is generally slower at client side processing than (in my case) Crystal reports. The SSRS engine just doesn't seem as capable when it has a lot of rows to locally filter or aggregate. Granted, these are result set design problems which can frequently be addressed (though not always if the details are required for drilldown) but the more um...mature...reporting engine is more forgiving.
In my case, I just had to disconnect and reconnect SSMS. I profiled the query and the duration of execution showed 1 minute even though the query itself runs in under 2 seconds. I restarted the connection and ran it again, and this time the duration showed the correct execution time.
I was able to solve this by removing the [&TotalPages] built-in field from the bottom of the report. The time went down from minutes to less than a second.
Something odd that I could not determine was having an impact on the calculation of total pages.
I was using SSRS 2012.
A couple of things you can do: without executing the actual report, just run the sproc from within the data tab of Reporting Services. Does it still take time?
Another option is to use SQL Profiler and determine what is coming in and out of the database system.
Another thing you can do to test it is to recreate a simple report without any parameters. Run the report and see if it makes a difference. It could be that your RS report is corrupted or badly formed, which may cause the rendering to be really slow.
Had the same problem, and fixed it by giving the shared dataset a default parameter and updating that dataset in the reporting server.
Do you use "group by" in the SSRS table?
I had a report with 3 grouped-by fields and I noticed that the report ran very slowly despite having a light query, to the point where I couldn't even type values in the search field.
Then I removed the groupings and now the report comes up in seconds and everything works in an instant.
In our case, no code was required.
Note from our Help Desk: "Clearing out your Internet Setting will fix this problem."
Maybe that means "clear cache."
