Informatica Java transformation to generate an output file for each MQ message in a real-time MQ schedule - informatica-powercenter

I am trying to generate a flat file as output containing MQ message data, in a workflow configured to run in real time. I need help with the Java code configuration in an Informatica PowerCenter Java transformation.
The source is an MQ message and the target is a flat file. The schedule is MQ Real-time with the Destructive Read option for MQSeries messages, and a recovery strategy is configured.
I am trying the code below, but no output is generated.
Writer writer = null;
filename_1 = o_File_Name;
try {
    writer = new BufferedWriter(new OutputStreamWriter(
            new FileOutputStream(o_File_Name), "utf-8"));
    writer.write(MESSAGE_DATA);
} catch (Exception ex) {
    // Report
} finally {
    try {
        writer.close();
    } catch (Exception e) { /* ignore */ }
}
For each MQ message, the mapping should generate a separate output file containing that message's data.

Configure the code below in the Java transformation, in the "On Input Row" box of the "Java Code" tab.
Writer writer = null;
// just writing out the filename here so that you can write it to your target for reconciling
filename1 = o_File_Name;
try {
    writer = new BufferedWriter(new OutputStreamWriter(
            new FileOutputStream(o_File_Name), "utf-8"));
    writer.write(MESSAGE_DATA);
} catch (Exception ex) {
    // Report
} finally {
    try {
        writer.close();
    } catch (Exception e) { /* ignore */ }
}
Under the Import Packages tab, add the packages below.
import java.io.Writer;
import java.io.BufferedWriter;
import java.io.OutputStreamWriter;
import java.io.FileOutputStream;
Pass MESSAGE_DATA as the input port holding the content that should go into the file, and pass o_File_Name as the location of your file -- e.g. $$TGTPATH\\FLATFILES\\xyz.txt.
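Note that with a fixed o_File_Name such as $$TGTPATH\\FLATFILES\\xyz.txt, every message overwrites the same file. One way to get one file per message is to derive a unique name per row inside "On Input Row". The snippet below is only a sketch: it assumes the file name ends in .txt, uses a nanosecond timestamp as the unique suffix, and reports failures through the Java transformation's logError call.

// Sketch for the "On Input Row" box; MESSAGE_DATA and o_File_Name are the ports described above.
// A per-row suffix keeps each MQ message in its own file instead of overwriting xyz.txt.
String uniqueFileName = o_File_Name.replace(".txt", "_" + System.nanoTime() + ".txt");

Writer writer = null;
filename1 = uniqueFileName; // expose the generated name on an output port for reconciliation
try {
    writer = new BufferedWriter(new OutputStreamWriter(
            new FileOutputStream(uniqueFileName), "utf-8"));
    writer.write(MESSAGE_DATA);
} catch (Exception ex) {
    // Report the failure instead of swallowing it silently
    logError("Could not write " + uniqueFileName + ": " + ex.getMessage());
} finally {
    try {
        if (writer != null) {
            writer.close();
        }
    } catch (Exception e) { /* ignore */ }
}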

Related

spring write string to file - spacing error

In my Spring Boot application I receive a String, and I want to save it as a file in a specific directory.
How can I do so?
I have gone through this, but that answer receives a file and saves it, whereas I want to write the string to a file myself.
I'm using this code, plain Java:
PrintWriter writer = null;
try {
    writer = new PrintWriter("file.txt", "UTF_32");
} catch (FileNotFoundException e) {
    e.printStackTrace();
} catch (UnsupportedEncodingException e) {
    e.printStackTrace();
}
writer.println(data);
writer.close();
But the output is not what anyone would want; take a look:
It looks like it's your character encoding, UTF_32.
Notepad does not support UTF-32, only ANSI, UTF-8 and UTF-16.
See:
Can Notepad read UTF-32?
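If the file only needs to open cleanly in Notepad, writing it as UTF-8 avoids the problem. A minimal sketch, assuming the received string is in a variable named data and the target is a relative path:

import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Paths;

public class WriteUtf8File {
    public static void main(String[] args) throws IOException {
        String data = "some text received by the application"; // placeholder content
        // Write the string as UTF-8, which Notepad (and most editors) can display
        Files.write(Paths.get("file.txt"), data.getBytes(StandardCharsets.UTF_8));
    }
}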

Error When trying to use XSSF on Jmeter

I am getting an error when trying to create an xlsx file using JMeter. I have already tried HSSF (for .xls) and it works fine, but when I change it to xlsx I get an error. I have already copied the jar files for poi and poi-ooxml into JMeter's lib folder. Here is my simple script:
import org.apache.poi.ss.usermodel.Cell;
import org.apache.poi.ss.usermodel.Row;
import org.apache.poi.xssf.usermodel.XSSFSheet;
import org.apache.poi.xssf.usermodel.XSSFWorkbook;
import java.io.File;
import java.io.FileNotFoundException;
import java.io.FileOutputStream;
import java.io.IOException;
import java.util.*;
import java.lang.String;
import java.lang.Object;
XSSFWorkbook workbook = new XSSFWorkbook();
XSSFSheet sheet = workbook.createSheet("Sample sheet");
Row row = sheet.createRow(0);
Cell cell = row.createCell(0);
cell.setCellValue("HENCIN");
try {
    FileOutputStream out = new FileOutputStream(new File("D:\\Jmeter\\testhencin.xlsx"));
    workbook.write(out);
    out.close();
    System.out.println("Excel written successfully..");
} catch (FileNotFoundException e) {
    e.printStackTrace();
} catch (IOException e) {
    e.printStackTrace();
}
When I tried to narrow down the error, the problem comes from these lines:
XSSFWorkbook workbook = new XSSFWorkbook();
XSSFSheet sheet = workbook.createSheet("Sample sheet");
Could someone please help me figure it out? It works with HSSF, but with XSSF it does not. I am getting this error:
Response code: 500
Response message: org.apache.jorphan.util.JMeterException: Error invoking bsh method: eval org/apache/xmlbeans/XmlObject
I would suggest:
Catching all the possible exceptions and printing the stacktrace to the jmeter.log file as well
Re-throwing the exception to make sure you don't get a false-positive sampler result, something like:
} catch (Throwable e) {
    e.printStackTrace();
    log.info("Error in Beanshell", e);
    throw e;
}
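Applied to the script above, the whole sampler might look roughly like the sketch below. It assumes the same D:\Jmeter\testhencin.xlsx path and POI classes, and that log is JMeter's built-in script logger variable.

import org.apache.poi.ss.usermodel.Cell;
import org.apache.poi.ss.usermodel.Row;
import org.apache.poi.xssf.usermodel.XSSFSheet;
import org.apache.poi.xssf.usermodel.XSSFWorkbook;
import java.io.File;
import java.io.FileOutputStream;

try {
    // Build a one-cell workbook and write it out
    XSSFWorkbook workbook = new XSSFWorkbook();
    XSSFSheet sheet = workbook.createSheet("Sample sheet");
    Row row = sheet.createRow(0);
    Cell cell = row.createCell(0);
    cell.setCellValue("HENCIN");

    FileOutputStream out = new FileOutputStream(new File("D:\\Jmeter\\testhencin.xlsx"));
    workbook.write(out);
    out.close();
    log.info("Excel written successfully");
} catch (Throwable e) {
    // Log the real cause and fail the sampler instead of hiding the error
    e.printStackTrace();
    log.info("Error in Beanshell", e);
    throw e;
}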
Regarding your question, it is most likely due to a missing XMLBeans jar in the JMeter classpath. I would suggest the following:
Get a "clean" installation of the latest JMeter version
Download the latest version of tika-app.jar and drop it into JMeter's "lib" folder
Restart JMeter to pick the jar up
With Tika you get all the necessary libraries bundled; moreover, JMeter will display the content of binary files in the View Results Tree listener. See the How to Extract Data From Files With JMeter article for more details.

How to get rid of NullPointerException in Flume Interceptor?

I have an interceptor written for Flume; the code is below:
public Event intercept(Event event) {
    byte[] xmlstr = event.getBody();
    InputStream instr = new ByteArrayInputStream(xmlstr);
    //TransformerFactory factory = TransformerFactory.newInstance(TRANSFORMER_FACTORY_CLASS, TRANSFORMER_FACTORY_CLASS.getClass().getClassLoader());
    TransformerFactory factory = TransformerFactory.newInstance();
    Source xslt = new StreamSource(new File("removeNs.xslt"));
    Transformer transformer = null;
    try {
        transformer = factory.newTransformer(xslt);
    } catch (TransformerConfigurationException e1) {
        e1.printStackTrace();
    }
    Source text = new StreamSource(instr);
    OutputStream ostr = new ByteArrayOutputStream();
    try {
        transformer.transform(text, new StreamResult(ostr));
    } catch (TransformerException e) {
        e.printStackTrace();
    }
    event.setBody(ostr.toString().getBytes());
    return event;
}
I'm removing the namespace from my source XML with the removeNs.xslt file, so that I can store the data in HDFS and later load it into Hive. When my interceptor runs, it throws the error below:
ERROR org.apache.flume.source.jms.JMSSource: Unexpected error processing events
java.lang.NullPointerException
at test.intercepter.App.intercept(App.java:59)
at test.intercepter.App.intercept(App.java:82)
at org.apache.flume.interceptor.InterceptorChain.intercept(InterceptorChain.java:62)
at org.apache.flume.channel.ChannelProcessor.processEventBatch(ChannelProcessor.java:146)
at org.apache.flume.source.jms.JMSSource.doProcess(JMSSource.java:258)
at org.apache.flume.source.AbstractPollableSource.process(AbstractPollableSource.java:54)
at org.apache.flume.source.PollableSourceRunner$PollingRunner.run(PollableSourceRunner.java:139)
at java.lang.Thread.run(Thread.java:745)
Can you suggest what the problem is and where?
I found the solution. The problem was nothing other than new File("removeNs.xslt"): the interceptor could not find the file. I was not sure where to keep it; later I found the Flume agent's directory, but whenever I restart the agent it deletes the files I put there. So I changed the code and embedded the XSLT content directly in my Java code.
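For reference, a minimal sketch of that approach: the stylesheet lives in a String constant and is fed to the TransformerFactory through a StringReader, so the interceptor no longer depends on a file on disk. The stylesheet text below is a generic namespace-stripping XSLT used only as an illustration, not the original removeNs.xslt.

import java.io.StringReader;
import javax.xml.transform.Source;
import javax.xml.transform.Transformer;
import javax.xml.transform.TransformerFactory;
import javax.xml.transform.stream.StreamSource;

public class RemoveNsXslt {
    // Namespace-stripping stylesheet embedded in the class instead of read from a file
    private static final String REMOVE_NS_XSLT =
        "<xsl:stylesheet version=\"1.0\" xmlns:xsl=\"http://www.w3.org/1999/XSL/Transform\">"
        + "<xsl:output method=\"xml\" indent=\"no\"/>"
        + "<xsl:template match=\"*\">"
        + "  <xsl:element name=\"{local-name()}\"><xsl:apply-templates select=\"@*|node()\"/></xsl:element>"
        + "</xsl:template>"
        + "<xsl:template match=\"@*|text()|comment()\"><xsl:copy/></xsl:template>"
        + "</xsl:stylesheet>";

    public static Transformer newRemoveNsTransformer() throws Exception {
        // Build the transformer from the in-memory stylesheet
        Source xslt = new StreamSource(new StringReader(REMOVE_NS_XSLT));
        return TransformerFactory.newInstance().newTransformer(xslt);
    }
}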

File Overwrite issue when trying to transfer file using FTP

I have an FTP server with only List and Put permissions, but no Delete, Overwrite or Rename permissions.
Now, when I try to transfer a file over plain FTP using the following implementation:
private boolean sendFileStreamHelper(InputStream inputStream, String nameOfFileToStore, String filetransferDestFolder) throws FileTransferException {
    Log.info("Inside SendFile inputstream method to transport the input stream of file " + nameOfFileToStore + " data to " + filetransferDestFolder);
    BufferedOutputStream os = null;
    FileObject fo = null;
    try {
        fo = getFileObject(nameOfFileToStore, filetransferDestFolder, ftpAuthDetails.getServerName(), ftpAuthDetails.getUsername(),
                ftpAuthDetails.getPassword(), ftpAuthDetails.getPort());
        fo.createFile(); // create a file on the remote server to transfer the file
        os = new BufferedOutputStream(fo.getContent().getOutputStream());
        FileUtil.readStream(inputStream, os);
        return true;
    } catch (Exception ex) {
        Log.error("File transfer exception occurred while transferring the file " + nameOfFileToStore + " to " + filetransferDestFolder, ex);
        throw new FileTransferException(ex);
    } finally {
        if (os != null) {
            try {
                os.flush();
                os.close();
            } catch (IOException e) {
                Log.warn(getClass(), " Error while closing the buffered output stream", e);
            }
        }
        if (fo != null) {
            try {
                fo.close();
            } catch (IOException e) {
                Log.warn(getClass(), " Error while closing the File object", e);
            }
        }
        closeCache(); // close the VFS Manager instance
    }
}
In the code above, the file is first created on the remote server using the FileObject instance, and then I try to write the contents through the buffered stream. The server treats this as writing to a file that already exists, and since my account has no overwrite permission it throws the following error:
29 Jul 2012 21:03:06 [ERROR] FC_ClusteredScheduler_Worker-2(1) com.abc.filetransfer.FileTransferClient - .sendFileStreamHelper(FileTransferClient.java:170) - File transfer exception occurred while transferrig the file *******.txt to / ex-org.apache.commons.vfs2.FileSystemException: Could not write to "ftp://******:***#***.***.***.***/*********.txt"
org.apache.commons.vfs2.FileSystemException: Could not write to "ftp://******:***#***.***.***.***/*********.txt".
at org.apache.commons.vfs2.provider.AbstractFileObject.getOutputStream(AbstractFileObject.java:1439)
at org.apache.commons.vfs2.provider.DefaultFileContent.getOutputStream(DefaultFileContent.java:461)
at org.apache.commons.vfs2.provider.DefaultFileContent.getOutputStream(DefaultFileContent.java:441)
at com.abc.filetransfer.FileTransferClient.sendFileStreamHelper(FileTransferClient.java:164)
at com.abc.filetransfer.FileTransferClient.sendFile(FileTransferClient.java:131)
at com.abc.filetransfer.FileTransferClient.sendFile(FileTransferClient.java:103)
at com.abc.filetransfer.client.FTPTransferClient.sendFile(FTPTransferClient.java:65)
Caused by: org.apache.commons.vfs2.FileSystemException: Cant open output connection for file "ftp://******:***#***.***.***.***/*********.txt".
Reason: "**550 File unavailable. Overwrite not allowed by user profile**^M
at org.apache.commons.vfs2.provider.ftp.FtpFileObject.doGetOutputStream(FtpFileObject.java:648)
at org.apache.commons.vfs2.provider.AbstractFileObject.getOutputStream(AbstractFileObject.java:1431)
Please let me know how I can handle the transfer using FileObject so that creating the file and writing the stream happen in a single step.
I have resolved the issue.
It's pretty straightforward. In the code below:
fo = getFileObject(nameOfFileToStore, filetransferDestFolder, ftpAuthDetails.getServerName(), ftpAuthDetails.getUsername(),
        ftpAuthDetails.getPassword(), ftpAuthDetails.getPort());
fo.createFile(); // create a file on the remote server to transfer the file
os = new BufferedOutputStream(fo.getContent().getOutputStream());
FileUtil.readStream(inputStream, os);
I was creating the file first using the FileObject and then trying to write the BufferedOutputStream into it.
Because this happens in two separate steps (create the file, then write data to it), the server treats the write as adding data to an already existing file and returns the error:
550 File unavailable. Overwrite not allowed by user profile
I removed the
fo.createFile()
call, because the output stream will create the file anyway if it does not exist (a sketch of the corrected write path follows below).
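A minimal sketch of the write path without the explicit create step, assuming the same getFileObject helper, FileUtil utility and ftpAuthDetails fields as in the original code:

fo = getFileObject(nameOfFileToStore, filetransferDestFolder, ftpAuthDetails.getServerName(), ftpAuthDetails.getUsername(),
        ftpAuthDetails.getPassword(), ftpAuthDetails.getPort());
// No fo.createFile() here: getOutputStream() creates the remote file when it does not exist,
// so the server sees a single create-and-write instead of a write to an existing file.
os = new BufferedOutputStream(fo.getContent().getOutputStream());
FileUtil.readStream(inputStream, os);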
Thanks for your time.
Purushotham Reddy

Unable to load OpenNLP sentence model in Hadoop map-reduce job

I'm trying to get OpenNLP integrated into a map-reduce job on Hadoop, starting with some basic sentence splitting. Within the map function, the following code is run:
public AnalysisFile analyze(String content) {
    InputStream modelIn = null;
    String[] sentences = null;
    // references an absolute path to en-sent.bin
    logger.info("sentenceModelPath: " + sentenceModelPath);
    try {
        modelIn = getClass().getResourceAsStream(sentenceModelPath);
        SentenceModel model = new SentenceModel(modelIn);
        SentenceDetectorME sentenceBreaker = new SentenceDetectorME(model);
        sentences = sentenceBreaker.sentDetect(content);
    } catch (FileNotFoundException e) {
        logger.error("Unable to locate sentence model.");
        e.printStackTrace();
    } catch (IOException e) {
        e.printStackTrace();
    } finally {
        if (modelIn != null) {
            try {
                modelIn.close();
            } catch (IOException e) {
            }
        }
    }
    logger.info("number of sentences: " + sentences.length);
    <snip>
}
When I run my job, I'm getting an error in the log saying "in must not be null!" (source of class throwing error), which means that somehow I can't open an InputStream to the model. Other tidbits:
I've verified that the model file exists in the location sentenceModelPath refers to.
I've added Maven dependencies for opennlp-maxent:3.0.2-incubating, opennlp-tools:1.5.2-incubating, and opennlp-uima:1.5.2-incubating.
Hadoop is just running on my local machine.
Most of this is boilerplate from the OpenNLP documentation. Is there something I'm missing, either on the Hadoop side or the OpenNLP side, that would cause me to be unable to read from the model?
Your problem is the getClass().getResourceAsStream(sentenceModelPath) line. This tries to load a file from the classpath, and neither the file in HDFS nor the one on the client's local file system is part of the classpath at mapper/reducer runtime, which is why you're seeing the null error (getResourceAsStream() returns null if the resource cannot be found).
To get around this you have a number of options:
Amend your code to load the file from HDFS (a fuller sketch follows this list):
modelIn = FileSystem.get(context.getConfiguration()).open(
new Path("/sandbox/corpus-analysis/nlp/en-sent.bin"));
Amend your code to load the file from the local directory, and use the -files GenericOptionsParser option (which copies the file from the local file system to HDFS, and back down to the local directory of the running mapper / reducer):
modelIn = new FileInputStream("en-sent.bin");
Hard-bake the file into the job jar (in the root dir of the jar), and amend your code to include a leading slash:
modelIn = getClass().getResourceAsStream("/en-sent.bin");
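For the first option, a fuller sketch of how this might sit in a mapper is below. It reuses the HDFS path from above and loads the model once in setup() rather than per record; the mapper's key/value types and class name are assumptions for illustration.

import java.io.IOException;
import java.io.InputStream;

import opennlp.tools.sentdetect.SentenceDetectorME;
import opennlp.tools.sentdetect.SentenceModel;

import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;

public class SentenceSplitMapper extends Mapper<LongWritable, Text, LongWritable, Text> {

    private SentenceDetectorME sentenceBreaker;

    @Override
    protected void setup(Context context) throws IOException, InterruptedException {
        // Open the model directly from HDFS instead of the (empty) task classpath
        FileSystem fs = FileSystem.get(context.getConfiguration());
        InputStream modelIn = fs.open(new Path("/sandbox/corpus-analysis/nlp/en-sent.bin"));
        try {
            sentenceBreaker = new SentenceDetectorME(new SentenceModel(modelIn));
        } finally {
            modelIn.close();
        }
    }

    @Override
    protected void map(LongWritable key, Text value, Context context)
            throws IOException, InterruptedException {
        // Emit each detected sentence against the original offset key
        for (String sentence : sentenceBreaker.sentDetect(value.toString())) {
            context.write(key, new Text(sentence));
        }
    }
}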
