Programmatically adding a datasource / dataset to a LocalReport - WebForms

Is there any way to programmatically add a data source / dataset to a Microsoft.Reporting.WebForms.LocalReport when the report definition file (*.rdlc) has no data source / dataset definitions at design time?
This works if I already have a data source / dataset definition in my *.rdlc:
C#
public byte[] RenderReport(string reportName, string reportFormat)
{
    LocalReport report = LoadReport(reportName);
    // Must have the same name as the DataSet in the *.rdlc
    ReportDataSource rds = new ReportDataSource("DataSet1", getData());
    report.DataSources.Clear();
    report.DataSources.Add(rds);
    return report.Render(reportFormat);
}

private DataTable getData()
{
    DataTable dt = new DataTable();
    dt.Columns.Add(new DataColumn("ID", typeof(System.String)));
    dt.Columns.Add(new DataColumn("NAME", typeof(System.String)));
    dt.Rows.Add(new string[] { "1", "Me" });
    return dt;
}
*.rdlc
<DataSources>
  <DataSource Name="DataSource1">
    <ConnectionProperties>
      <DataProvider>System.Data.DataSet</DataProvider>
      <ConnectString>/* Local Connection */</ConnectString>
    </ConnectionProperties>
  </DataSource>
</DataSources>
<DataSets>
  <DataSet Name="DataSet1">
    <Query>
      <DataSourceName>DataSource1</DataSourceName>
      <CommandText>/* Local Query */</CommandText>
    </Query>
    <Fields>
      <Field Name="ID">
        <DataField>ID</DataField>
        <rd:TypeName>System.String</rd:TypeName>
      </Field>
      <Field Name="NAME">
        <DataField>NAME</DataField>
        <rd:TypeName>System.String</rd:TypeName>
      </Field>
    </Fields>
  </DataSet>
</DataSets>
But if I remove the data source / dataset definition, I get:
{Microsoft.Reporting.DefinitionInvalidException: The definition of the
report '' is invalid. --->
Microsoft.ReportingServices.ReportProcessing.ReportPublishingException:
The Value expression for the text box ‘Textbox1’ refers to the field
‘ID’. Report item expressions can only refer to fields within the
current dataset scope or, if inside an aggregate, the specified
dataset scope. Letters in the names of fields must use the correct
case.}
Do I always have to create something like a "dummy" data source / dataset, or am I missing something in my code?
I hope there is another solution besides manipulating the XML before the rendering process. Any ideas?
Thanks!

You can't leave the RDLC without DataSets if you are using it and the RDLC is embedded in your project.
Either keep the DataSet definition fixed and change only its items, or load the report definition from XML:
// Valid XML with dynamic DataSources and DataSets
string s = @"<?xml version=""1.0"" encoding=""utf-8""?><Report ...>...</Report>";
report.LoadReportDefinition(new MemoryStream(Encoding.UTF8.GetBytes(s)));
return report.Render(reportFormat);
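If you go the LoadReportDefinition route, here is a rough, untested sketch of the idea: read the existing *.rdlc, inject DataSources/DataSets elements built from the DataTable's columns, and load the patched definition. The 2008/01 report-definition namespace, the file path, and the method name are assumptions you would need to adapt to your report, and whether the schema accepts the injected elements in this position may depend on the RDLC version.
using System.Data;
using System.IO;
using System.Linq;
using System.Xml.Linq;
using Microsoft.Reporting.WebForms;

public static byte[] RenderWithDynamicDataSet(string rdlcPath, string reportFormat, DataTable data)
{
    // Namespace of the report definition schema (check the Report element of your *.rdlc)
    XNamespace ns = "http://schemas.microsoft.com/sqlserver/reporting/2008/01/reportdefinition";
    XDocument rdlc = XDocument.Load(rdlcPath);

    // Build one <Field> per DataTable column
    var fields = new XElement(ns + "Fields",
        data.Columns.Cast<DataColumn>().Select(c =>
            new XElement(ns + "Field", new XAttribute("Name", c.ColumnName),
                new XElement(ns + "DataField", c.ColumnName))));

    // Inject the DataSources / DataSets blocks the report definition is missing
    rdlc.Root.AddFirst(
        new XElement(ns + "DataSources",
            new XElement(ns + "DataSource", new XAttribute("Name", "DataSource1"),
                new XElement(ns + "ConnectionProperties",
                    new XElement(ns + "DataProvider", "System.Data.DataSet"),
                    new XElement(ns + "ConnectString", "/* Local Connection */")))),
        new XElement(ns + "DataSets",
            new XElement(ns + "DataSet", new XAttribute("Name", "DataSet1"),
                new XElement(ns + "Query",
                    new XElement(ns + "DataSourceName", "DataSource1"),
                    new XElement(ns + "CommandText", "/* Local Query */")),
                fields)));

    // Load the patched definition and bind the data as usual
    var report = new LocalReport();
    using (var stream = new MemoryStream())
    {
        rdlc.Save(stream);
        stream.Position = 0;
        report.LoadReportDefinition(stream);
    }
    report.DataSources.Add(new ReportDataSource("DataSet1", data));
    return report.Render(reportFormat);
}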

Update Timestamp after jdbc:inbound-channel-adapter reads a record from DB

I am using Spring Integration's int-jdbc:inbound-channel-adapter to fetch records from the DB. However, after I fetch the records I also need to update two columns:
1) the status column
2) a timestamp column
Updating the status column is not an issue, as I can use the XML snippet below:
<int-jdbc:inbound-channel-adapter query="select * from item where status=2"
channel="target" data-source="dataSource"
update="update item set status=10 where id in (:id)" />
However, when I try to update the timestamp, it doesn't work:
<int-jdbc:inbound-channel-adapter query="select * from item where status=2"
    channel="target" data-source="dataSource"
    update="update item set status=10,timestamp=:timestamp where id in (:id)"
    update-sql-parameter-source-factory="timestampUpdaterSqlParameterSourceFactory">
</int-jdbc:inbound-channel-adapter>

<bean id="timestampUpdaterSqlParameterSourceFactory"
      class="org.springframework.integration.jdbc.ExpressionEvaluatingSqlParameterSourceFactory">
    <property name="parameterExpressions">
        <map>
            <entry key="timestamp" value="#now"/>
        </map>
    </property>
</bean>

<bean id="now" scope="prototype" class="java.sql.Timestamp">
    <constructor-arg value="#{ T(java.lang.System).currentTimeMillis()}" />
</bean>
We could use DB-level functions such as SYSDATE on Oracle to set the time, but I am not keen on using DB-specific functions because of testing (H2 is used for testing).
Any help is greatly appreciated.
Thanks
I had the same issue; the problem is that the :timestamp expression is evaluated as a collection projection (check the code here).
So my original query was something like this:
update table set status = 1, published_at = :now where id_event in (:id)
After parsing it ended up like this:
update table set status = 1, published_at = ?, ?, ? where id_event in (?, ?, ?)
The number of ? placeholders is the same as the number of results from the select statement, so if the result has more than one row you get a bad grammar exception.
I made a not-very-nice (intrusive) workaround using spring-integration-java-dsl:
protected void addNotCollectionProjectionExpression(
        ExpressionEvaluatingSqlParameterSourceFactory factory,
        String key, String expression) {
    try {
        // Access the factory's private parameterExpressions map via reflection
        Field parameterExpressionsField = factory.getClass().getDeclaredField("parameterExpressions");
        parameterExpressionsField.setAccessible(true);
        Map<String, Expression[]> parameterExpressions = (Map<String, Expression[]>) parameterExpressionsField
                .get(factory);
        // Reuse the factory's private SpEL parser to compile the expression
        Field parserField = factory.getClass().getDeclaredField("PARSER");
        parserField.setAccessible(true);
        ExpressionParser parser = (ExpressionParser) parserField.get(factory);
        Expression compiledExpression = parser.parseExpression(expression);
        // Put the same compiled expression in both slots so no collection projection is built
        Expression[] expressions = new Expression[]{
                compiledExpression,
                compiledExpression
        };
        parameterExpressions.put(key, expressions);
    } catch (NoSuchFieldException | IllegalAccessException e) {
        logger.error("Field parameterExpressions | PARSER can not be obtained", e);
    }
}
....

// How to use it
ExpressionEvaluatingSqlParameterSourceFactory factory =
        new ExpressionEvaluatingSqlParameterSourceFactory();
addNotCollectionProjectionExpression(factory, "now",
        "T(com.example.MyClass).staticMethod()");
return factory;
Notice that I avoid the collection projection by using the same expression for both elements of the array.

dbunit/unitils: how to export a multi-schema dataset?

The official DbUnit tutorial already gives a good example of exporting a dataset from a single database schema.
Is there any way to export different tables from different schemas into one single dataset (say Table_A from schema_A, Table_B from schema_B)?
The exported dataset, when written to an XML file, would look like this:
<?xml version='1.0' encoding='UTF-8'?>
<dataset schema:schemaA schema:schemaB>
    <schemaA:tableA ..... />
    <schemaA:tableA ..... />
    <schemaB:tableB ..... />
</dataset>
I've just had the same problem; to fix it you need to set the FEATURE_QUALIFIED_TABLE_NAMES property.
Below is the same example code with that change (I removed part of the code because I don't need a full database export):
public static void main(String[] args) throws Exception
{
    // database connection
    Class driverClass = Class.forName("com.microsoft.sqlserver.jdbc.SQLServerDriver");
    Connection jdbcConnection = DriverManager.getConnection(
            "jdbc:sqlserver://<server>:1433;DatabaseName=<dbName>", "<usr>", "<passwd>");
    IDatabaseConnection connection = new DatabaseConnection(jdbcConnection);

    Properties props = new Properties();
    props.put(DatabaseConfig.FEATURE_QUALIFIED_TABLE_NAMES, "true");
    connection.getConfig().setPropertiesByString(props);

    // dependent tables database export: export table X and all tables that
    // have a PK which is a FK on X, in the right order for insertion
    String[] depTableNames = TablesDependencyHelper.getAllDependentTables(connection, "vehicle.Vehicle_Series_Model_SMA");
    IDataSet depDataset = connection.createDataSet(depTableNames);
    FlatXmlDataSet.write(depDataset, new FileOutputStream("vehicle.Vehicle_Series_Model_SMA.xml"));
}

NHibernate schema export issue with Oracle Blob field

I'm having an issue creating my Oracle DB with NHibernate's SchemaExport function.
For a property defined as byte[], it creates a DB field of type RAW (which, by the way, is limited to 2000 bytes).
This field type is not enough for my needs, and I need NHibernate to create a BLOB field instead.
How can I achieve that?
I tried declaring the field in the mapping file (I use XML mapping, thus hbm files) specifying either type="Binary" or type="BinaryBlob", but neither seems to have the desired effect: the created field is always RAW.
Can anyone help me here?
<property name="prop">
    <column name="blobcolumn" sql-type="BinaryBlob" />
</property>
Update: maybe this could also do the trick
<property name="prop" type="Binary" length="1000000"/>
I had a similar problem and the solution is the length attribute:
<property name="Attachment" length="5224880"/>
If you specify no length, then whatever you write in the type attribute, the column is going to end up as RAW(2000) in Oracle, because that is RAW's maximum. But if you say you need 5 MB, i.e. 5224880 bytes, NHibernate switches to BLOB automatically because it is bigger than 2000 bytes.
So, given the .NET property
public virtual byte[] Attachment { get; set; }
the proper mapping would be
<property name="Attachment" length="5224880"/>
Or you can explore OracleLiteDialect.cs in the NHibernate source code.
If someone wants a convention-based way to map byte[] properties to BLOB columns in the database, I came up with this (a Fluent NHibernate convention):
public class ByteArrayToDbBlobConvention : IPropertyConvention, IPropertyConventionAcceptance
{
    public void Accept(IAcceptanceCriteria<IPropertyInspector> criteria)
    {
        criteria.Expect(x => x.Type == typeof(byte[]));
    }

    public void Apply(IPropertyInstance instance)
    {
        instance.CustomSqlType("BLOB");
    }
}
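For completeness, here is a minimal, untested sketch of how such a convention might be registered when the session factory is built with Fluent NHibernate; ProductMap and connectionString are placeholders for your own mapping class and connection string.
// Register the convention alongside the fluent mappings (names below are placeholders)
var sessionFactory = Fluently.Configure()
    .Database(OracleClientConfiguration.Oracle10.ConnectionString(connectionString))
    .Mappings(m => m.FluentMappings
        .AddFromAssemblyOf<ProductMap>()
        .Conventions.Add<ByteArrayToDbBlobConvention>())
    .BuildSessionFactory();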

HQL and Session.Query ignores eager fetching defined in mapping

I have a problem with NHibernate not using my mapping configuration for eager loading a collection when I query using HQL or LINQ (Session.Query). Session.Get and Session.QueryOver work as expected.
I'm using NHibernate 3.2. Here's the mapping of the collection in my Product mapping:
<bag name="OrderItems" inverse="true" cascade="none" lazy="false" fetch="join">
    <key column="order_id" />
    <one-to-many class="OrderItem" />
</bag>
and from the other side the mapping looks like this:
<many-to-one name="Product" class="Product" column="product_id" not-null="true" />
I have 4 tests; 2 succeed and 2 fail. They use Session.SessionFactory.Statistics to keep track of CollectionFetchCount (i.e. was OrderItems selected in one joined query or in a separate one). The intent is to have OrderItems selected and loaded when selecting the product, as OrderItems are almost always accessed as well.
LastCreated is a simple reference to the last product inserted into the DB.
[Test] /* Success */
public void Accessing_Collection_Using_Session_Get_Results_In_1_Select()
{
    // Get by Id
    var product = Session.Get<Product>(LastCreated.Id);
    var count = product.OrderItems.Count;
    Assert.AreEqual(0, statistics.CollectionFetchCount, "Product collectionfetchcount using Get");
}

[Test] /* Success */
public void Accessing_Collection_Using_Session_QueryOver_Results_In_1_Select()
{
    // Get by Id
    var product = Session.QueryOver<Product>().SingleOrDefault();
    var count = product.OrderItems.Count;
    Assert.AreEqual(0, statistics.CollectionFetchCount, "Product collectionfetchcount using QueryOver");
}

[Test] /* Fail */
public void Accessing_Collection_Using_Session_Query_Results_In_1_Select()
{
    // Get by IQueryable and Linq
    var product = Session.Query<Product>().Single(x => x.Id == LastCreated.Id);
    var count = product.OrderItems.Count;
    Assert.AreEqual(0, statistics.CollectionFetchCount, "Product collectionfetchcount using Linq");
}

[Test] /* Fail */
public void Accessing_Collection_Using_HQL_Results_In_1_Select()
{
    // Get by HQL
    var product = Session.CreateQuery("from Product where Id = :id")
        .SetParameter("id", LastCreated.Id)
        .UniqueResult<Product>();
    var count = product.OrderItems.Count;
    Assert.AreEqual(0, statistics.CollectionFetchCount, "Product collectionfetchcount using HQL");
}
Is this intended behaviour or am I doing something wrong?
HQL queries will not respect a fetch="join" set in the mapping. This is because they are free-form queries, making it impossible for NHibernate to guess how to transform them to add the join.
LINQ is implemented as a wrapper over HQL, and QueryOver is a wrapper over Criteria; that's why you see the different behaviors.
If you need eager loads in LINQ/HQL, you have to make them explicit in the query (using join fetch and Fetch()/FetchMany()).
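For reference, a rough, untested sketch of what the explicit versions might look like with the entities from the question (assumes using System.Linq and using NHibernate.Linq for the Fetch/FetchMany extension methods):
// HQL: ask for the collection explicitly with "join fetch";
// List().First() instead of UniqueResult(): the fetch join can return one row per order item
var hqlProduct = Session.CreateQuery(
        "from Product p left join fetch p.OrderItems where p.Id = :id")
    .SetParameter("id", LastCreated.Id)
    .List<Product>()
    .First();

// LINQ: ask for the collection explicitly with FetchMany()
var linqProduct = Session.Query<Product>()
    .Where(p => p.Id == LastCreated.Id)
    .FetchMany(p => p.OrderItems)
    .ToList()
    .First();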

parameterized oracle query with enterprise library

My .cs file contains this code:
DbCommand dbc =
    db.GetStoredProcCommand(
        string.Format(ConfigurationSettings.AppSettings["INSERT_TBLREPORTTYPE"],
            rtype, zoneid, name));
and my query in web.config is:
<add key="INSERT_TBLREPORTTYPE"
value="INSERT INTO TBLREPORT(ID,TYPE,RELATIONID,ISACTIVE,NAME)
VALUES(SEQ_REPORT.NEXTVAL,{0},{1},0,{2}) "/>
How can I add parameters so that I can prevent SQL injection on my site?
I tried:
db.AddInParameter(dbc, "NAME", DbType.String, name);
db.AddInParameter(dbc, "RELATIONID", DbType.Int32, zoneid);
db.AddInParameter(dbc, "TYPE", DbType.String, rtype);
and also
dbc.Parameters[0].DbType = DbType.String;
dbc.Parameters[0].Value = name;
dbc.Parameters[1].DbType = DbType.Int32;
dbc.Parameters[1].Value = zoneid;
dbc.Parameters[2].DbType = DbType.String;
dbc.Parameters[2].Value = rtype;
Neither approach is working. Can anyone give me suggestions?
The parameter prefix for Oracle is :. So you should change your SQL to:
<add key="INSERT_TBLREPORTTYPE"
value="INSERT INTO TBLREPORT(ID, TYPE, RELATIONID, ISACTIVE, NAME)
VALUES(SEQ_REPORT.NEXTVAL, :TYPE, :RELATIONID, 0, :NAME) "/>
Then you can add the parameters using:
db.AddInParameter(dbc, ":NAME", DbType.String, name);
