How to add date and time to the file name - spring

I am working on JasperReports with Spring. I give the customer the option to download the report in PDF and CSV formats, but my requirement is that the download prompt shows the file name with the date and time. At the moment I can only show the plain file name. How can I include the date and time in the file name when downloading?
I am using the below code in my jasper-views.xml:
<bean id="test-pdf"
      class="org.springframework.web.servlet.view.jasperreports.JasperReportsPdfView"
      p:url="classpath:reports/testreport.jrxml"
      p:reportDataKey="datasource">
    <property name="headers">
        <props>
            <prop key="Content-Disposition">
                attachment; filename=TestReport.pdf
            </prop>
        </props>
    </property>
</bean>
How can I add the date and time to the file name while downloading?

Before writing your stream, set the file name:
response.setHeader("Content-Disposition", "inline; filename=" + fileName);
You can set the date in the file name dynamically :)
This link may also be useful:
securly-download-file-inside-browser-with-correct-filename
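To build the timestamped name before setting the header, something like the following sketch works (the helper name and the date pattern are my own choices, adjust them as needed):

```java
import java.time.LocalDateTime;
import java.time.format.DateTimeFormatter;

public class ReportFileName {

    // Builds a file name like "TestReport_2024-05-01_13-45-00.pdf".
    // Avoid ':' in the time part: it is not allowed in Windows file names.
    static String withTimestamp(String base, String ext, LocalDateTime now) {
        String ts = now.format(DateTimeFormatter.ofPattern("yyyy-MM-dd_HH-mm-ss"));
        return base + "_" + ts + "." + ext;
    }

    public static void main(String[] args) {
        String fileName = withTimestamp("TestReport", "pdf", LocalDateTime.now());
        // In a controller you would then set the header before writing the stream:
        // response.setHeader("Content-Disposition", "attachment; filename=" + fileName);
        System.out.println(fileName);
    }
}
```

Since the static <prop> entry in jasper-views.xml cannot produce a dynamic value, one option (an assumption, not verified against your exact setup) is to set the header on the response in the controller or an interceptor before the view renders.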

In your example, you're able to show the filename because you can also specify what the filename should be when you actually generate the report. If you want to display the date and time the report is generated, and the report is generated when the user presses a button to download it, you're really just displaying the current time. I don't see much value in that, but if it's necessary, you could use some JavaScript to continually update the displayed time in the browser, for example with a timer that fires once a minute.

Related

Shibboleth 4 IDP: Query two different login sources with the Password flow

I have two login sources (an Active Directory and a local MySQL Database) that each contain different users. I want to configure the Password flow in this way:
query the AD first
if this succeeds, the user gets logged in
if it fails, query the local database and log the user in if this succeeds
else, authentication fails
How can I achieve that?
This is the solution I found:
Inside the file conf/authn/password-authn-config.xml, add the following lines (or replace them if they already exist):
<import resource="jaas-authn-config.xml"/>
<!-- Ordered list of CredentialValidators to apply to a request. -->
<util:list id="shibboleth.authn.Password.Validators">
    <ref bean="shibboleth.JAASValidator"/>
</util:list>
Comment out any other resources that you don't need, such as ldap-authn-config.xml or krb5-authn-config.xml.
In my case, I want the login to succeed if either of my login sources returns 'okay'. For that, you need this line:
<!-- Controls whether all validators in the above bean have to succeed, or just one. -->
<util:constant id="shibboleth.authn.Password.RequireAll" static-field="java.lang.Boolean.FALSE"/>
If you want all login sources to succeed, just replace 'FALSE' with 'TRUE'.
Next, put the following inside conf/authn/jaas-authn-config.xml:
<!-- Specify your JAAS config. -->
<bean id="JAASConfig" class="org.springframework.core.io.FileSystemResource" c:path="%{idp.home}/conf/authn/jaas.config" />
<util:property-path id="shibboleth.authn.JAAS.JAASConfigURI" path="JAASConfig.URI" />
<!-- Specify the application name(s) in the JAAS config. -->
<util:list id="shibboleth.authn.JAAS.LoginConfigNames">
    <value>ShibUserPassAuthLDAP</value>
    <value>ShibUserPassAuthJAAS</value>
</util:list>
Now open conf/authn/jaas.config and write this:
ShibUserPassAuthJAAS {
    relationalLogin.DBLogin required debug=true
    dbDriver="com.mysql.jdbc.Driver"
    userTable="login"
    userColumn="email"
    passColumn="password"
    dbURL="jdbc:mysql://localhost:3306/login"
    dbUser="your_db_user"
    dbPassword="your_db_password"
    hashAlgorithm="SHA2" // or whatever you need
    saltColumn="salt" // leave empty if you don't need this
    errorMessage="Invalid password"
    where="status < 9999"; // remove if you don't need this
};
ShibUserPassAuthLDAP {
    org.ldaptive.jaas.LdapLoginModule required
    ldapUrl="ldap://localhost:10389" // your Active Directory URL
    useStartTLS="true"
    baseDn="OU=example,OU=example,DC=example,DC=org" // change this to whatever you need
    bindDn="CN=shibboleth,OU=example,DC=example,DC=local" // change this to whatever you need
    bindCredential="your_ad_password"
    userFilter="(sAMAccountName={user})"
    credentialConfig="{trustCertificates=file:/opt/shibboleth-idp/credentials/ldap.pem}";
};
relationalLogin.DBLogin is a Java class I use to actually check the credentials. You can download it from here: download the jar
Just put it in this directory on your IdP: {shibboleth_root}/edit-webapp/WEB-INF/lib/
Now make sure you configured the password flow correctly in conf/authn/general_authn.xml:
<bean id="authn/Password" parent="shibboleth.AuthenticationFlow"
p:passiveAuthenticationSupported="true"
p:forcedAuthenticationSupported="true"/>
And to enable the Password flow change this line in idp.properties:
idp.authn.flows=
to this:
idp.authn.flows=Password
After you have completed these steps, don't forget to restart Jetty for the changes to take effect.
Explanation
The two entries called ShibUserPassAuthLDAP and ShibUserPassAuthJAAS in jaas-authn-config.xml are where the magic happens: the password flow will try to validate the credentials using those two configurations you provided. It will try the first one and finish authentication if it succeeds, or try the second configuration if the first fails.

Setting relative classpath for BIRT POJO Datasource in .rptdesign file

In the .rptdesign file, when the "pojoDataSetClassPath" property is set to an absolute path, the report is generated successfully. However, when it is set to a relative path, the data from the data source is not passed to the report, so the report contains no data.
For example:
When I set the property value to the absolute path (C:\jars\Abc.jar) in the .rptdesign file, the report is generated successfully.
<property name="pojoDataSetClassPath">C:\jars\Abc.jar;</property>
However, when I set the property value to the relative path (./jars/Abc.jar), the report does not contain any data.
<property name="pojoDataSetClassPath">./jars/Abc.jar;</property>
Can anyone suggest the correct way to set a relative classpath, so that the right POJO data sources are picked up?
Thanks in advance!

BIRT: Specifying XML Datasource file as parameter does not work

Using BIRT designer 3.7.1, it's easy enough to define a report for an XML file data source; however, the input file name is initially written into the .rptdesign file as a constant value. Nice for a start, but useless in real life. What I want is to start the BIRT ReportEngine via the genReport.bat script, specifying the name of the XML data source file as a parameter. That should be trivial, but it is surprisingly difficult...
What I found out is this: Instead of defining the XML data source file as a constant in the report definition you can use params["datasource"].value, which will be replaced by the parameter value at runtime. Also, in BIRT Designer you can define the Report Parameter (datasource) and give it a default value, say "file://d:/sample.xml".
Yet, it doesn't work. This is the result of my Preview attempt in Designer:
Cannot open the connection for the driver: org.eclipse.datatools.enablement.oda.xml.
org.eclipse.datatools.connectivity.oda.OdaException: The xml source file cannot be found or the URL is malformed.
ReportEngine, started with 'genReport.bat -p "datasource=file://d:/sample.xml" xx.rptdesign' says nearly the same.
Of course, I have made sure that the XML file exists, and tried different spellings of the file URL. So, what's wrong?
What I found out is this: Instead of defining the XML data source file as a constant in the report definition you can use params["datasource"].value, which will be replaced by the parameter value at runtime.
No, it won't - at least, if you specify the value of XML Data Source File as params["datasource"].value (instead of a valid XML file path) at design time, you will get an error when attempting to run the report. This is because BIRT tries to use the literal string params["datasource"].value as the file path, rather than the value of params["datasource"].value.
Instead, you need to use an event handler script - specifically, a beforeOpen script.
To do this:
Left-click on your data source in the Data Explorer.
In the main Report Design pane, click on the Script tab (instead of the Layout tab). A blank beforeOpen script should be visible.
Copy and paste the following code into the script:
this.setExtensionProperty("FILELIST", params["datasource"].value);
If you now run the report, you should find that the value of the parameter datasource is used for the XML file location.
You can find out more about parameter-driven XML data sources on BIRT Exchange.
Since this is an old thread but still useful, I'll add some info:
In the Edit Data Source dialog, enter a URL so you have sample data available.
Create your dataset against that sample data.
Then remove the URL from the data source.
Finally, add the beforeOpen script described above so the file is supplied at runtime.

hibernate + Oracle 11.2 + BLOB

I googled a lot and haven't found a working solution... In my web app I need to upload large files using Ajax. I use the ajaxfileupload plugin for this. In my FormBean class I mapped the file to an InputStream:
private InputStream fileData;
and
@FormParam("file")
@PartType("application/octet-stream")
@JsonIgnore
public void setFileData(InputStream fileData) {
    this.fileData = fileData;
}
It works fine. I can save this stream into a file and haven't got any problems with java heap size. Now I'm trying to save it into database using Hibernate. Like this:
repFile.setFileData(session.getLobHelper().createBlob(file.getFileData(), 1024L));
and when I save the repFile object I get ORA-01461: can bind a LONG value only for insert into a LONG column.
It works with Oracle 10. But it crashes with Oracle 11.2
I tried to add lobHandler to my configuration - didn't help
<property name="lobHandler">
<bean class="org.springframework.jdbc.support.lob.OracleLobHandler">
<property name="nativeJdbcExtractor">
<bean class="org.springframework.jdbc.support.nativejdbc.CommonsDbcpNativeJdbcExtractor"/>
</property>
</bean>
</property>
and set the batch size to 0 and enabled streams:
<prop key="hibernate.jdbc.use_streams_for_binary">true</prop>
<prop key="hibernate.jdbc.batch_size">0</prop>
That didn't help also... does anyone have a solution for this? Any help would be good.
You need to map the domain class like this:
@javax.persistence.Lob
private java.sql.Blob fileData;
Also, make sure that you create the column in the database as a BLOB.
Finally, I recommend that you do not use InputStream in your FormBean, but instead something like Spring's MultipartFile, since an InputStream can only be read once (unless you rewind/reset it). MultipartFile will also give you the file name and length.
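Putting the answer together, a minimal sketch of the entity mapping might look like this (the RepFile class name and field layout are assumptions based on the question, not confirmed details):

```java
import javax.persistence.Entity;
import javax.persistence.GeneratedValue;
import javax.persistence.Id;
import javax.persistence.Lob;
import java.sql.Blob;

@Entity
public class RepFile {

    @Id
    @GeneratedValue
    private Long id;

    // Maps to a BLOB column in Oracle; using java.sql.Blob lets Hibernate
    // stream the content instead of materializing it as a byte array.
    @Lob
    private Blob fileData;

    public Blob getFileData() { return fileData; }
    public void setFileData(Blob fileData) { this.fileData = fileData; }
}
```

The stream is then wrapped as in the question, e.g. repFile.setFileData(session.getLobHelper().createBlob(inputStream, length)) - note that the second argument should be the actual content length, not a fixed value like 1024L.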

How can I configure the indexes for using db4o with Spring?

I'm currently evaluating the Spring-db4o integration. I was impressed by the declarative transaction support as well as the ease to provide declarative configuration.
Unfortunately, I'm struggling to figure out how to create an index on specific fields. Spring prepares the database during Tomcat server startup. Here's my Spring configuration:
<bean id="objectContainer" class="org.springmodules.db4o.ObjectContainerFactoryBean">
    <property name="configuration" ref="db4oConfiguration" />
    <property name="databaseFile" value="/WEB-INF/repo/taxonomy.db4o" />
</bean>
<bean id="db4oConfiguration" class="org.springmodules.db4o.ConfigurationFactoryBean">
    <property name="updateDepth" value="5" />
    <property name="configurationCreationMode" value="NEW" />
</bean>
<bean id="db4otemplate" class="org.springmodules.db4o.Db4oTemplate">
    <constructor-arg ref="objectContainer" />
</bean>
db4oConfiguration doesn't provide any means to specify the index. I wrote a simple ServiceServletListener to set the index. Here's the relevant code:
Db4o.configure().objectClass(com.test.Metadata.class).objectField("id").indexed(true);
Db4o.configure().objectClass(com.test.Metadata.class).objectField("value").indexed(true);
I inserted around 6000 rows in this table and then used a SODA query to retrieve a row based on the key. But the performance was pretty poor. To verify that indexes have been applied properly, I ran the following program:
private static void indexTest(ObjectContainer db) {
    for (StoredClass storedClass : db.ext().storedClasses()) {
        for (StoredField field : storedClass.getStoredFields()) {
            if (field.hasIndex()) {
                System.out.println("Field " + field.getName() + " is indexed!");
            } else {
                System.out.println("Field " + field.getName() + " isn't indexed!");
            }
        }
    }
}
Unfortunately, the results show that no field is indexed.
On a similar note, in the OME browser I saw there's an option to create an index on the fields of each class. If I set the index to true and save, it appears to apply the change to db4o. But again, if I run the sample test above on the db4o file, it doesn't reveal any index.
Any pointers on this will be highly appreciated.
Unfortunately I don't know the Spring extension for db4o that well.
However, the Db4o.configure() API is deprecated and works differently than in earlier versions. Earlier versions had a global db4o configuration; that global configuration no longer exists, so a Db4o.configure() call doesn't change the configuration of already running object containers.
You could try this workaround on a running container:
container.ext().configure().objectClass(com.test.Metadata.class).objectField("id").indexed(true);
This changes the configuration of the running object container. Note that changing the configuration of a running object container can lead to dangerous side effects and should only be used as a last resort.
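Alternatively, in newer db4o versions indexes are configured before the container is opened, via an EmbeddedConfiguration. A sketch of that approach (whether org.springmodules.db4o exposes a hook to pass such a configuration is an assumption I haven't verified):

```java
import com.db4o.Db4oEmbedded;
import com.db4o.ObjectContainer;
import com.db4o.config.EmbeddedConfiguration;

public class IndexedContainer {

    // Opens the database with field indexes declared up front.
    // The index configuration must be in place BEFORE the container
    // is opened; applying it afterwards has no effect on the stored file.
    public static ObjectContainer open(String databaseFile) {
        EmbeddedConfiguration config = Db4oEmbedded.newConfiguration();
        config.common().objectClass(com.test.Metadata.class)
              .objectField("id").indexed(true);
        config.common().objectClass(com.test.Metadata.class)
              .objectField("value").indexed(true);
        return Db4oEmbedded.openFile(config, databaseFile);
    }
}
```

Running the indexTest method from the question against a container opened this way should then report both fields as indexed.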
