JMeter is showing a wrong result from SQL Server

I have a query to check SQL Server transaction log free space, and it returns different results from SQL Server and JMeter:
SELECT cast(round((total_log_size_in_bytes - used_log_space_in_bytes)*1.0/1024/1024,2,0) as decimal (18,2)) AS [free log SPACE IN MB]
FROM sys.dm_db_log_space_usage;
In SQL Server:
In JMeter:
Any idea?
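For comparison, the arithmetic in the query can be reproduced client-side against what JMeter's JDBC sampler reports. A minimal Java sketch (the byte counts below are made up for illustration; note that `ROUND(x, 2, 0)` in T-SQL rounds rather than truncates, which `RoundingMode.HALF_UP` matches for positive values):

```java
import java.math.BigDecimal;
import java.math.RoundingMode;

public class LogSpace {
    // Mirrors: cast(round((total - used) * 1.0 / 1024 / 1024, 2, 0) as decimal(18,2))
    static BigDecimal freeLogSpaceMb(long totalLogSizeInBytes, long usedLogSpaceInBytes) {
        return BigDecimal.valueOf(totalLogSizeInBytes - usedLogSpaceInBytes)
                .divide(BigDecimal.valueOf(1024L * 1024L), 2, RoundingMode.HALF_UP);
    }

    public static void main(String[] args) {
        // Made-up byte counts for illustration only
        System.out.println(freeLogSpaceMb(104_857_600L, 52_428_800L)); // 50.00
    }
}
```

If the client-side value matches one of the two observed results, the discrepancy is in how the value is fetched or displayed rather than in the query itself.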

Related

View SQL generated by WSO2 DSS service

We have a very large SQL statement in our WSO2 DSS service which executes a query against an Oracle database. With some parameters the query works; with others it fails, and the Oracle error indicates there is a missing right ")" in the SQL statement. How can we see the actual SQL being sent to Oracle when the DSS executes it? We've tried enabling DEBUG on every relevant-looking logger, with no luck.
You need to set the log level to DEBUG in your DSS under Configure/Logging for
org.wso2.carbon.dataservices.core.description.query.SQLQuery
Then, in Monitor/System Logs, you can see the queries that were called and their exceptions.
You can use log4jdbc to log every JDBC query that goes from the DSS to Oracle. Here is a tutorial you can follow.
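The log4jdbc wiring is mostly a URL and driver-class change: register `net.sf.log4jdbc.DriverSpy` instead of the Oracle driver, and prefix the JDBC URL with `log4jdbc:`. A minimal sketch of the URL rewrite (the host and SID are placeholders, and the log4jdbc jar must be on the DSS classpath):

```java
public class Log4jdbcUrl {
    // Wrap an existing JDBC URL so log4jdbc's DriverSpy can intercept it
    static String wrap(String jdbcUrl) {
        return "jdbc:log4jdbc:" + jdbcUrl.substring("jdbc:".length());
    }

    public static void main(String[] args) {
        String oracleUrl = "jdbc:oracle:thin:@dbhost:1521:ORCL"; // placeholder host/SID
        System.out.println(wrap(oracleUrl));
        // In the datasource config, use driver class net.sf.log4jdbc.DriverSpy
        // with the wrapped URL; statements are then logged via SLF4J.
    }
}
```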

SQL Server 2014 Linked Server Performance issue

When I add a linked server (SQL Server 2012 or an earlier version) to SQL Server 2014 and try a plain select from a linked server table, like (select id, image from [linkedserver].[dbname].[schema].[tablename]), I see a performance impact.
The image column is a blob; with 200 records of about 50 KB of blob data each (roughly 10 MB in total), the query takes about a minute to execute.
If I install SQL Server 2012 on the same server, add the same remote server as a linked server, and execute the same query, I get a result about 20 times better (2-3 seconds).
I tried to analyze the query, and also included statistics, but with no positive result.
Is it a SQL Server 2014-specific bug?

SSIS - Data flow stuck at Execution Phase while using Attunity Oracle Source

I am using the Attunity Oracle drivers to connect to an Oracle database on a remote server, retrieve data, and dump it into an Excel file.
Everything works fine in Visual Studio BIDS. From VS I can connect directly to the remote Oracle server and retrieve the data.
But when I deploy this ETL to my production server (64-bit Windows Server 2008 and SQL Server 2012), the ETL always gets stuck at the execution phase. After running for some time (20-30 minutes), it gives the following warning and still keeps running without giving any errors:
[SSIS.Pipeline] Information: The buffer manager detected that the system was low on virtual memory, but was unable to swap out any buffers. 0 buffers were considered and 0 were locked.
Either not enough memory is available to the pipeline because not enough is installed, other processes are using it, or too many buffers are locked.
Some more info:
I have checked the server memory; only 3 GB is in use out of a total of 12 GB.
I have already set SQL Server to use a maximum of 8 GB.
I am using a SQL Server Agent job to run the ETL periodically, every 15 minutes.
I have tried stopping all other ETLs on the server and running this ETL through the Execute Package Utility, but the result is still the same.
I am using a date range in the Oracle query to retrieve the data; when the query for a particular date range does not return any data, the ETL execution always succeeds.
Progress log (Execute Package Utility) -
Any pointers/suggestions?
I hope I have described the issue properly.
Update (5/Mar/2014):
I tried reducing the amount of data I am retrieving, and the ETL was successful.
I have also set the DefaultBufferSize to 10 MB (max size).
But if the query data exceeds the DefaultBufferSize, why does the package succeed on my development machine but not on the server?
Thanks,
Prateek
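One way to sanity-check the DefaultBufferSize question above: SSIS sizes each pipeline buffer as roughly min(DefaultBufferMaxRows, DefaultBufferSize / estimated row width), so data larger than one buffer simply flows through more buffers. A Java sketch of that arithmetic (the 2 KB row width is an assumption for illustration; 10,000 is the usual DefaultBufferMaxRows default):

```java
public class BufferMath {
    // Rows SSIS fits in one buffer: DefaultBufferSize / row width, capped by DefaultBufferMaxRows
    static long rowsPerBuffer(long defaultBufferSizeBytes, long estimatedRowBytes, long defaultBufferMaxRows) {
        return Math.min(defaultBufferSizeBytes / estimatedRowBytes, defaultBufferMaxRows);
    }

    public static void main(String[] args) {
        long bufferSize = 10L * 1024 * 1024; // 10 MB, as set in the package
        long rowBytes = 2048;                // assumed average row width
        System.out.println(rowsPerBuffer(bufferSize, rowBytes, 10_000)); // 5120
    }
}
```

If the rows are much wider on the production data set (e.g. wide Oracle columns mapped to large SSIS types), far fewer rows fit per buffer, which changes memory pressure between environments.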

IBM WebSphere Application Server JDBC connection pool performance issue

I am using IBM WAS 8.5 on a Windows server.
The database I am working with is DB2 9.7, and it is installed on a Windows server too (on another machine).
I have a table for logs that contains more than 4,000,000 records.
The data is growing very fast.
When I run a count query on that table, the result is very confusing:
with the WAS JDBC connection pool, the count takes more than 10 seconds to return,
but with a simple JDBC connection (in the same application, or outside it using any DB tool) the result is gained in less than 0.2 seconds!
I've tried JMeter to perform a load test and Tivoli to find the right settings, but no result.
I've tried DBPool too; the result was better, but not acceptable.
Any idea?
I would start with http://www-01.ibm.com/support/docview.wss?uid=swg21247168 and open a PMR if you are unable to analyze the data. This could be any number of problems, and without data it is very difficult to hazard a guess.
Also, are you doing the necessary DB2 maintenance on the DB2 side with runstats/reorg?
Do you have Wireshark, and are you looking at the TCP traffic between the app server and the database? Are you seeing any lag there or not?
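When comparing the pooled and plain paths, it helps to time both the same way from inside the application, so that connection setup, statement execution, and result fetch are measured consistently. A minimal harness (the actual query execution is abstracted behind a `Callable`, since the DB2 connection details depend on your setup):

```java
import java.util.concurrent.Callable;

public class QueryTimer {
    // Returns elapsed milliseconds for one invocation of the given action
    static <T> long timeMs(Callable<T> action) throws Exception {
        long start = System.nanoTime();
        action.call();
        return (System.nanoTime() - start) / 1_000_000;
    }

    public static void main(String[] args) throws Exception {
        // In the real comparison, the action would run "SELECT COUNT(*) ..." once
        // on a pooled connection and once on a DriverManager connection.
        long ms = timeMs(() -> { Thread.sleep(50); return null; });
        System.out.println(ms >= 50);
    }
}
```

Timing both paths from the same JVM rules out differences in client machine, driver version, and network route.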

Informatica: reduced performance for a simple one-to-one mapping when the target changes from Oracle to SQL Server

I have a simple Informatica (9.1) mapping (one-to-one) which loads data from a flat file to an RDBMS.
It takes 5 minutes to load into an Oracle DB and 20 minutes to load the same file into SQL Server 2008 R2.
Are there any sources/pointers for performance improvement?
A few things I can think of:
For both tests, is the file local to the app server running the mapping?
Is the connection/distance between the app server and the data servers comparable?
Is the "Target load type" of the target in the Session Properties set to "Bulk"?
Check the thread statistics in the session log to understand whether the issue is while writing to the DB or while reading from the file.
Is the PowerCenter (PC) server installed on the Oracle DB server? Is it the same case with SQL Server? Are SQL Server and the PC server on the same box?
Is the mapping using an ODBC or a native connection to the DB?
