Looking for guidance with reporting in Business Intelligence Development Studio

Until now I've been using Microsoft SQL Server Management Studio to run manual SQL queries and build Excel reports from the results. I'd now like to automate my processes further with Microsoft Business Intelligence Development Studio and generate reports there rather than in Excel.
Here's a rough overview of what I'm looking to do:
Combine multiple data sources including SQL Server and Excel data
Generate a report using this combined data
Re-use this report in the future with different criteria and Excel data. While the SQL Server data sources would remain the same, Excel data would change for every report run.
My access to the SQL servers is read-only. I'm not an administrator on the PC I have access to due to tight security.
I've played around with both SSIS packages and reports, but I'm not quite sure where to begin. I did create an SSIS package that combines the SQL Server and Excel data using a Merge Join transformation, but I don't know what to do with that package next, or whether it can even be used in a report.
Any guidance would be appreciated.
Thanks.

Since you have BIDS, you have various options to send data to your users:
Create reports in SSRS
Export the SSRS report to Excel
Create SSIS packages to export CSV files that are read by Excel (I used this model to mimic the report data caching ability of SSRS)
The SQL Server Reporting Services book is a great place to start.
http://www.amazon.com/Microsoft-Server-2005-Reporting-Services/dp/0735622507

Related

SSRS project, files structure

My job is to develop and manage SSRS reports.
On the enterprise SSRS server I created directories to manage permissions for different groups of users, but a Visual Studio SSRS project only supports a flat list of reports. That makes the reports rather awkward to deploy (and it gets worse when they link to each other).
Creating multiple projects is an incomplete solution because of cross-directory links: it disables the preview function in the report designer.
So the question is: is it possible to have the same non-flat structure both in the VS project and in the reports published to the server?
Using Geoff Hudik's method for now: http://www.geoffhudik.com/tech/2011/10/13/uploading-ssrs-reports-with-powershell.html
This is a PowerShell function that uploads multiple reports to the Report Server through its web service interface, so publication can be automated.
Usage is as simple as:
UploadReports "report-server.domain.com" "c:\temp" "/TEST1/AppName"

Visual Studio 2013 - DB Data Compare

I know you can compare and sync data between databases in Visual Studio using SSDT. But is there a way to compare DB data in DB project vs actual DB?
We currently use Redgate to sync schema and data. Normally, when someone makes a data change in their local DB, they sync it with the Redgate project scripts and check them into Git so that everyone else can apply those scripts to their own local DBs and stay up to date. Redgate became too expensive, so we are looking at alternatives, and it looks like Visual Studio's SSDT supports this kind of workflow.
I was able to create a database project in VS and import the schema from the DB, so all developers can now stay up to date with schema changes, but there doesn't seem to be an equivalent option for DB data. There is no option to create data scripts (and add them to the database project) to compare against a DB; as far as I can tell, data compare only works between two databases, not between a database and the DB project scripts. Is there a way to include data scripts in the project and keep them in sync with the databases?
You could check in MERGE scripts for static (lookup) tables and include them as post-deployment scripts.
The MERGE statements ensure that your target tables contain exactly the intended rows when you publish (the insert/update/delete clauses of the MERGE take care of this).
Make one merge script file per table, and include them all in your post-deployment script file.
The only difference/downside is that you cannot import data into the merge script; you have to type the code in to get it version controlled, or write a stored procedure that generates the SQL MERGE statement.
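For illustration, here is a minimal sketch of one such per-table merge script, assuming a hypothetical static lookup table dbo.OrderStatus (StatusId, StatusName); adjust the table and column names to your schema:

    -- Post-deployment data script: make dbo.OrderStatus contain exactly these rows.
    MERGE INTO dbo.OrderStatus AS target
    USING (VALUES
        (1, N'Open'),
        (2, N'Shipped'),
        (3, N'Closed')
    ) AS source (StatusId, StatusName)
    ON target.StatusId = source.StatusId
    WHEN MATCHED AND target.StatusName <> source.StatusName THEN
        UPDATE SET StatusName = source.StatusName
    WHEN NOT MATCHED BY TARGET THEN
        INSERT (StatusId, StatusName) VALUES (source.StatusId, source.StatusName)
    WHEN NOT MATCHED BY SOURCE THEN
        DELETE;

In an SSDT project, each such file would typically be pulled into the single post-deployment script with a SQLCMD :r include so it runs on every publish.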

Importing MS Project 2013 data into Oracle 11g database

Does anyone know if there is a straightforward way to get data out of MS Project 2013 and into Oracle 11g? We have a master schedule created in MS Project and want to create a web-based application that will perform monitoring and metrics charting of the project schedule statuses. I have successfully exported to CSV and imported into Oracle, but this was cumbersome and required a lot of reformatting of the CSV data before it could be pushed into Oracle. I'm in the beginning phases here, but wanted to hear from anyone who may have had experience with this in the past.
If you don't mind writing a little code, you could use MPXJ. You should be able to extract what you need using Java or a .Net language. You can perform the import directly in code, or just generate a suitably formatted output file for import into Oracle using other tools.
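For example, a minimal Java sketch of the MPXJ approach might look like the following (the file name schedule.mpp, the chosen task fields, and the pipe-delimited output file are assumptions to adapt to your schedule and your Oracle load process, e.g. SQL*Loader or external tables):

    // Sketch only: requires the MPXJ library (net.sf.mpxj) on the classpath.
    import java.io.PrintWriter;
    import net.sf.mpxj.ProjectFile;
    import net.sf.mpxj.Task;
    import net.sf.mpxj.reader.UniversalProjectReader;

    public class ExportSchedule {
        public static void main(String[] args) throws Exception {
            // UniversalProjectReader detects the file format (MPP, MSPDI XML, etc.).
            ProjectFile project = new UniversalProjectReader().read("schedule.mpp");

            // One delimited line per task; load the result into Oracle,
            // or replace this loop with JDBC inserts to import directly.
            try (PrintWriter out = new PrintWriter("schedule.csv")) {
                for (Task task : project.getTasks()) {
                    if (task.getName() == null) {
                        continue; // skip blank rows in the schedule
                    }
                    out.println(task.getUniqueID() + "|"
                            + task.getName() + "|"
                            + task.getStart() + "|"
                            + task.getFinish() + "|"
                            + task.getPercentageComplete());
                }
            }
        }
    }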

Sub report is taking a lot of time for rendering

I am working on SSRS with SQL Server 2008 R2. I have created a master report with sub reports, displayed through the report viewer.
Issue: on each run, the .rdlc report takes almost 5-10 minutes to render in the WinForms report viewer. I am using VS 2010 and VB.Net. I believe the sub report is what is taking so long to render.
The main stored procedure and the sub report stored procedure both execute almost instantly.
The report runs fine in the BIDS environment, but when loaded in the report viewer it takes a long time to render.
I have searched for answers to this issue but couldn't find much help. I'm sorry if my question isn't clear enough.
(P.S. I am using an embedded .rdlc report in a WinForms application with VS 2010, not a web report or an .rdl (2008).)
The standard "gotcha" in this scenario is that BIDS is using dataset cache files and not going back to SQL and re-running your query.
To avoid this you can either alter your parameters on each execution (invalidating the cache) or use this feature from the geniuses behind BIDS Helper:
http://bidshelper.codeplex.com/wikipage?title=Delete%20Dataset%20Cache%20Files&referringTitle=Documentation
If this is the case, the real issue is that your dataset queries are taking a long time to run. This is usually a SQL or database design issue, not an SSRS one.

Can you monitor the execution of an SSIS package, in BIDS, as it runs on the server?

Although my title states my current goal, I am open to alternative solutions. In short, I have a series of SSIS packages that are scheduled to run on a nightly basis, on our SQL Server machine.
Due to various updates that happen in the ether of my corporate IT, sometimes these exports break and the process stops working in the middle of the job. To troubleshoot, I fire up BIDS on my workstation and restart whichever process failed. This is useful because, in my experience, errors generated from within BIDS are much more useful than anything I've found in the SQL Server package execution history or the server's event logs.
Historically, my problem has been that not all problems occur in BIDS, but they do occur, consistently, on the server. These issues have been painful to diagnose and have cost me a lot of time.
As such, what I would like to do is publish my package to the SSIS server, start the server instance from BIDS and monitor the project as it runs. Is this possible?
If not, is there something else that I can do so that I can monitor the internal steps as the process executes?
I fear that none of this may be possible but I have to ask. It will make my debugging and troubleshooting life so much easier.
Possible option:
You need to make use of the Logging feature in SSIS. It allows you to configure the events for which you would like to capture messages. I usually log OnWarning and OnError to keep track of all the warning and error messages that occur in the package. There are various providers for saving the logging data; I prefer SQL Server so that I can query the logging information.
Logging options (the steps below are shown from SSIS 2012):
To enable logging in a package, open the package in Business Intelligence Development Studio (BIDS) if you are developing packages in SSIS 2005 - 2008 R2, or in SQL Server Data Tools (SSDT) if you are developing packages in SSIS 2012.
Click the SSIS menu and then click Logging...
You will see the Configure SSIS Logs dialog.
On the left side, you can check the package or individual tasks to log the event data.
On the Providers and Logs tab, you can select an appropriate provider to which to save the log information; for example, event information can be captured in a SQL Server database using the connection manager OLEDB_PracticeDB.
On the Details tab, you can select which events you would like to capture. In this example, I am capturing the following events:
OnError
OnInformation
OnTaskFailed
OnWarning
Thanks to @William Todd Salzman for recommending the OnTaskFailed event.
Sample package illustration:
Let's say we have a package named SO_15004109.dtsx with a Data Flow Task and a Script Task. The Data Flow Task is just a dummy with no components inside.
The Script Task has the following code in its Main method to fire custom information, warning and error messages so we can observe how they are captured in the logging data source. The code is written for SSIS 2012, so you may need to alter it for SSIS 2005. I chose VB.NET instead of C# because the question is tagged sql-server-2005 and SSIS 2005 only supports VB.NET.
Script Task code in VB.NET for SSIS 2005 and above.
#Region "Imports"
Imports System
Imports System.Data
Imports System.Math
Imports Microsoft.SqlServer.Dts.Runtime
#End Region
<Microsoft.SqlServer.Dts.Tasks.ScriptTask.SSISScriptTaskEntryPointAttribute()> _
<System.CLSCompliantAttribute(False)> _
Partial Public Class ScriptMain
Inherits Microsoft.SqlServer.Dts.Tasks.ScriptTask.VSTARTScriptObjectModelBase
Public Sub Main()
Dim fireAgain As Boolean = False
Dts.Events.FireInformation(101, "Custom Script Information", "This is a test information message.", String.Empty, 0, fireAgain)
Dts.Events.FireWarning(201, "Custom Script Warning", "This is a test warning message.", String.Empty, 0)
Dts.Events.FireError(201, "Custom Script Error", "This is a test error message.", String.Empty, 0)
Dts.TaskResult = ScriptResults.Success
End Sub
#Region "ScriptResults declaration"
Enum ScriptResults
Success = Microsoft.SqlServer.Dts.Runtime.DTSExecResult.Success
Failure = Microsoft.SqlServer.Dts.Runtime.DTSExecResult.Failure
End Enum
#End Region
End Class
If we execute the package, it will fail because we raised an error within the Script Task.
If you navigate to the data source where you logged the events, you will notice that SSIS creates a table to hold the log information if you chose the SQL Server log provider. The table below lists the logging table that SSIS creates, by version, in the SQL Server database chosen as the log destination.
SSIS Version   Log table name     Table type
------------   ----------------   ----------
SSIS 2005      dbo.sysdtslog90    User
SSIS 2008      dbo.sysdtslog100   User
SSIS 2008 R2   dbo.sysssislog     System
SSIS 2012      dbo.sysssislog     System
The query below was executed in the database to view the events captured by this sample package. You will notice that some messages appear twice because events are logged both for containers and for tasks. The table is named dbo.sysssislog because the package was created in SSIS 2012.
select id, event, source, message from dbo.sysssislog;
Personal experience with logging:
I have had fairly good success just reviewing the logged error messages to understand what went wrong. Debugging packages in a production environment is not advisable, in my opinion; capturing log events is preferable.
When I worked with SSIS 2005 and 2008, I created SSRS-based reports that queried the log table to generate a daily report of job executions and sent a PDF attachment to the people who needed it.
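As an illustration, the daily summary query behind such a report might look roughly like this (the column names are those of the standard SQL Server log provider table; swap in dbo.sysdtslog90 or dbo.sysdtslog100 for SSIS 2005 / 2008):

    -- Errors and warnings logged in the last 24 hours, newest first.
    SELECT   source, event, starttime, message
    FROM     dbo.sysssislog
    WHERE    event IN ('OnError', 'OnWarning')
      AND    starttime >= DATEADD(DAY, -1, GETDATE())
    ORDER BY starttime DESC;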
Things have improved in SSIS 2012: the tool comes with built-in reporting capabilities through the Integration Services catalog, which creates a database named SSISDB.
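If you deploy to the catalog on SSIS 2012 or later, you can also query its views directly to check recent runs, for example (status is a numeric code, e.g. 7 = succeeded, 4 = failed):

    -- Recent executions recorded in the SSIS catalog.
    SELECT   execution_id, folder_name, project_name, package_name, status, start_time, end_time
    FROM     SSISDB.catalog.executions
    WHERE    start_time >= DATEADD(DAY, -1, SYSDATETIMEOFFSET())
    ORDER BY start_time DESC;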
At my current gig, we've started using a product from Pragmatic Works called BIxPress, which includes tools to inject logging code into existing SSIS packages and monitor running packages as they execute. It's not cheap, but we have found it helpful. (Especially since it means we don't have to roll our own monitoring code...)
