InputStreamReader throws an NPE when I execute this code, but only on the second iteration of the for loop. The code works perfectly on the first iteration and then fails with the stack trace below. I am using this snippet to read the contents of specific files from an FTP location and display them. Note that all the lines up to the new InputStreamReader call work fine even on the second iteration. Any ideas why?
Exception in thread "main" java.lang.NullPointerException
at java.io.Reader.<init>(Reader.java:61)
at java.io.InputStreamReader.<init>(InputStreamReader.java:55)
at com.test.txtweb.server.task.CallBackRetryTask.main(CallBackRetryTask.java:229)
Here is the source code:
public static void main(String[] args) {
    String strDate = "20130805";
    FTPClient ftpClient = new FTPClient();
    try {
        ftpClient.connect(host);
        String pathToFiles = "/path/to/File";
        String ftpFileName = "";
        List<String> ftpFileNames = null;
        InputStream iStream;
        if (ftpClient.login(username, password)) {
            ftpClient.enterLocalPassiveMode();
            FTPFile[] ftpFiles = ftpClient.listFiles();
            ftpFileNames = new ArrayList<String>();
            for (FTPFile ftpFile : ftpFiles) {
                ftpFileName = ftpFile.getName();
                if (ftpFileName.contains(strDate)) {
                    iStream = ftpClient.retrieveFileStream(pathToFiles + ftpFileName);
                    System.out.println(ftpClient.getReplyString());
                    InputStreamReader isr = new InputStreamReader(iStream); //Error on this line on second iteration
                    BufferedReader br = new BufferedReader(isr);
                    String line = "";
                    while ((line = br.readLine()) != null) {
                        System.out.println(line);
                    }
                }
            }
        }
    } catch (IOException e) {
        e.printStackTrace();
    }
}
Figured out the issue after quite a lot of debugging: I was not calling completePendingCommand() after transferring each file to check the status of the transfer.
The API documentation for FTPClient.retrieveFileStream() states that to finalize the file transfer you must call FTPClient.completePendingCommand() and check its return value to verify success. Until you do, the next call to retrieveFileStream() returns null, which is exactly the null the InputStreamReader constructor was choking on. After correcting this, everything worked fine.
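A sketch of the corrected loop body (same commons-net FTPClient setup as above; the null check and the completePendingCommand() call are the changes):

```java
for (FTPFile ftpFile : ftpFiles) {
    String name = ftpFile.getName();
    if (name.contains(strDate)) {
        InputStream iStream = ftpClient.retrieveFileStream(pathToFiles + name);
        if (iStream == null) {
            // previous transfer was not finalized, or the file is unreadable
            System.err.println("retrieveFileStream failed: " + ftpClient.getReplyString());
            continue;
        }
        try (BufferedReader br = new BufferedReader(new InputStreamReader(iStream))) {
            String line;
            while ((line = br.readLine()) != null) {
                System.out.println(line);
            }
        }
        // finalize this transfer before requesting the next file
        if (!ftpClient.completePendingCommand()) {
            System.err.println("Transfer failed: " + ftpClient.getReplyString());
        }
    }
}
```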
I am using org.apache.commons.imaging to access the metadata of a JPEG file and write it to a text file. But instead of writing the metadata to the file, the program writes random characters. Can someone please help me resolve this issue? Following is the code:
//Method to access image metadata and write to text file
public void removeExifMetadata(final File jpegImageFile, final File dst)
        throws IOException, ImageReadException, ImageWriteException {
    OutputStream os = null;
    boolean canThrow = false;
    try {
        os = new FileOutputStream(dst);
        os = new BufferedOutputStream(os);
        new ExifRewriter().removeExifMetadata(jpegImageFile, os);
        canThrow = true;
    } finally {
        IoUtils.closeQuietly(canThrow, os);
        File metadata = new File("matadata.txt");
        if (!metadata.exists()) {
            metadata.createNewFile();
        }
        System.out.printf("in try block\n");
        FileOutputStream fos = new FileOutputStream(metadata);
        TeeOutputStream myOut = new TeeOutputStream(System.out, fos);
        PrintStream ps = new PrintStream(myOut);
        System.setOut(ps);
        System.out.printf("in final block\n");
    }
}
//call to removeExifMetadata from main
File imageFile = new File("toddler.jpg");
File out = new File("exif.txt");
e.removeExifMetadata(imageFile, out);
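A likely cause: ExifRewriter.removeExifMetadata() writes the re-encoded JPEG bytes (minus the EXIF segment) to the output stream you hand it, so exif.txt ends up containing image data rather than text. To dump human-readable metadata instead, a sketch along these lines (assumes commons-imaging on the classpath; the method and file names here are illustrative, not from the question):

```java
import java.io.File;
import java.io.FileWriter;
import java.io.PrintWriter;

import org.apache.commons.imaging.Imaging;
import org.apache.commons.imaging.common.ImageMetadata;
import org.apache.commons.imaging.common.ImageMetadata.ImageMetadataItem;

public void dumpMetadata(File jpegImageFile, File txt) throws Exception {
    // getMetadata() parses the EXIF/IPTC/XMP segments into readable items
    ImageMetadata metadata = Imaging.getMetadata(jpegImageFile);
    try (PrintWriter pw = new PrintWriter(new FileWriter(txt))) {
        if (metadata != null) {
            for (ImageMetadataItem item : metadata.getItems()) {
                pw.println(item); // each item prints as a "name: value" line
            }
        }
    }
}
```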
I am trying to learn how to use try-with-resources. First I tried to put java.io.File myFile = new java.io.File(filename) in the resource parentheses, but NetBeans told me that File is not AutoCloseable. Am I properly handling this exception? I was under the impression that the exception would be thrown on the line where I create the File object.
//This method writes to a csv or txt file, specify full filepath (including
//extension). Each value will be on a new line.
public void writeFile(String filename)
{
    java.io.File myFile = new java.io.File(filename);
    try (java.io.PrintWriter outfile = new java.io.PrintWriter(myFile))
    {
        for (int i = 0; i < size; i++)
        {
            //print all used elements line by line
            outfile.println(Integer.toString(this.getElement(i)));
        }
    } catch (FileNotFoundException fileNotFoundException)
    {
        //print error
    }
}//end writeFile(String)----------------------------------------------------
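For reference: creating a java.io.File never touches the disk and never throws; it is just a path wrapper. The FileNotFoundException comes from the PrintWriter constructor, which is what actually opens the file. A self-contained sketch of the same pattern (the temp-file path is hypothetical, chosen so the example runs anywhere):

```java
import java.io.File;
import java.io.FileNotFoundException;
import java.io.PrintWriter;

public class Main {
    public static void main(String[] args) {
        // new File(...) only builds a path; nothing is opened yet
        File myFile = new File(System.getProperty("java.io.tmpdir"), "numbers.txt");
        // PrintWriter implements AutoCloseable, so it can be a resource;
        // its constructor is the call that can throw FileNotFoundException
        try (PrintWriter outfile = new PrintWriter(myFile)) {
            for (int i = 1; i <= 3; i++) {
                outfile.println(i);
            }
        } catch (FileNotFoundException e) {
            System.err.println("Could not open " + myFile + ": " + e.getMessage());
        }
        System.out.println("wrote " + myFile);
    }
}
```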
I have to build a client/server database application: the server builds a database containing a single table and has to pass that table to the client, which displays it in a TableView.
The server retrieves the SQL commands from a txt file.
The question is: how do I (the server) pass the table I've created to the client?
I thought of an ArrayList, but it gets really complicated and hard to handle.
The server code:
//Load driver, connect to JDBC..
private void DBSetup()
{
    try
    {
        statement = connection.createStatement();
        FileInputStream fstream = new FileInputStream("mysql.txt");
        // Get the object of DataInputStream
        DataInputStream in = new DataInputStream(fstream);
        BufferedReader br = new BufferedReader(new InputStreamReader(in));
        String strLine;
        //Read File Line By Line
        while ((strLine = br.readLine()) != null)
        {
            System.out.println(strLine);
            //INSERT PLATFORM->RUNLATER
            status.appendText(strLine + "\n");
            if (strLine != null && !strLine.equals(""))
                statement.execute(strLine);
        }
        br.close();
    } catch (java.lang.Exception ex)
    {
        Platform.runLater(() ->
        {
            status.setText(ex.toString());
        });
    }
}
My program supports multiple clients, so I have a Hashtable that holds the socket and the output stream of each client.
P.S.: I'm working with JavaFX.
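One common approach (a sketch, not from the question): read the ResultSet into a serializable structure such as a List of String arrays, send it through an ObjectOutputStream over the socket, and read it back with an ObjectInputStream on the client, where it can populate the TableView's ObservableList. The round trip, demonstrated here with in-memory streams standing in for the socket:

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.ObjectInputStream;
import java.io.ObjectOutputStream;
import java.util.ArrayList;
import java.util.Arrays;

public class Main {
    public static void main(String[] args) throws Exception {
        // rows as produced by iterating a ResultSet on the server
        ArrayList<String[]> table = new ArrayList<>();
        table.add(new String[] {"1", "Alice"});
        table.add(new String[] {"2", "Bob"});

        // server side: serialize (in real code, wrap socket.getOutputStream())
        ByteArrayOutputStream buffer = new ByteArrayOutputStream();
        try (ObjectOutputStream oos = new ObjectOutputStream(buffer)) {
            oos.writeObject(table);
        }

        // client side: deserialize (in real code, wrap socket.getInputStream())
        try (ObjectInputStream ois = new ObjectInputStream(
                new ByteArrayInputStream(buffer.toByteArray()))) {
            @SuppressWarnings("unchecked")
            ArrayList<String[]> received = (ArrayList<String[]>) ois.readObject();
            for (String[] row : received) {
                System.out.println(Arrays.toString(row));
            }
        }
    }
}
```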
I searched around for this but could not find a solution. Sorry about my bad description; I'm not very good at this.
I have a UI class that calls a "Lotto" class.
The Lotto class's constructor calls a method named readData(), which reads from a file using a BufferedReader.
I'm not getting an error message, but it's just not reading: it gets stuck at the BufferedReader fr = new BufferedReader... line and jumps to the catch block.
If it's a file-not-found problem, how would I make it track where my file is? I'm using Eclipse and the program is stored on my USB drive. I need to hand it in to my teacher, so I can't hard-code a location. Is there code that finds where my program is and reads the file from that folder?
Here is the code being used.
import java.io.*;

//constructor
public Lotto()
{
    try
    {
        readData();
        nc = new NumberChecker();
    }
    catch (IOException e)
    {
        System.out.println("There was a problem");
    }
}

private void readData() throws IOException
{
    //this method reads winning tickets, date and pot from a file
    BufferedReader file = new BufferedReader(new FileReader("data.txt"));
    for (int i = 0; i < 5; i++)
    {
        System.out.println("in " + i);
        winningNums[i] = file.readLine();
        winningDates[i] = file.readLine();
        weeksMoney[i] = Integer.parseInt(file.readLine());
        System.out.println("out " + i);
    }
    file.close();
}
If you get an error on this line of code
BufferedReader file = new BufferedReader(new FileReader("data.txt"));
then it is probably a FileNotFoundException.
A relative path like "data.txt" is resolved against the JVM's working directory (in Eclipse, the project root by default), so make sure data.txt is there, not next to the .java source.
Failing that, use an absolute path, e.g. C:\my\path\data.txt.
And don't forget that backslashes must be escaped in Java string literals: "C:\\my\\path\\data.txt".
Try surrounding the BufferedReader in a try/catch and look for FileNotFoundException as well as IOException. Also try putting in the fully qualified path name with double backslashes.
BufferedReader file;
try {
    file = new BufferedReader(new FileReader("C:\\filepath\\data.txt"));
    for (int i = 0; i < 5; i++)
    {
        System.out.println("in " + i);
        winningNums[i] = file.readLine();
        winningDates[i] = file.readLine();
        weeksMoney[i] = Integer.parseInt(file.readLine());
        System.out.println("out " + i);
    }
} catch (FileNotFoundException e) {
    e.printStackTrace();
} catch (IOException e) {
    e.printStackTrace();
}
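As for the "is there code that tracks where my program is" part of the question: the JVM exposes its working directory, and relative paths are resolved against it, so handing the project in works as long as data.txt travels in that folder. A small sketch:

```java
import java.io.File;

public class Main {
    public static void main(String[] args) {
        // the directory the JVM was started from
        String workingDir = System.getProperty("user.dir");
        System.out.println("Working directory: " + workingDir);

        // a relative path resolves against that directory
        File data = new File("data.txt");
        System.out.println("data.txt resolves to: " + data.getAbsolutePath());
        System.out.println("exists: " + data.exists());
    }
}
```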
I'm trying to read an HDFS file line by line and then create another HDFS file and write to it line by line. The code that I use looks like this:
Path FileToRead = new Path(inputPath);
FileSystem hdfs = FileToRead.getFileSystem(new Configuration());
FSDataInputStream fis = hdfs.open(FileToRead);
BufferedReader reader = new BufferedReader(new InputStreamReader(fis));

String line;
line = reader.readLine();
while (line != null) {
    String[] lineElem = line.split(",");
    for (int i = 0; i < 10; i++) {
        MyMatrix[i][Integer.valueOf(lineElem[0]) - 1] = Double.valueOf(lineElem[i + 1]);
    }
    line = reader.readLine();
}
reader.close();
fis.close();

Path FileToWrite = new Path(outputPath + "/V");
FileSystem fs = FileSystem.get(new Configuration());
FSDataOutputStream fileOut = fs.create(FileToWrite);
BufferedWriter writer = new BufferedWriter(new OutputStreamWriter(fileOut));
writer.write("check");
writer.close();
fileOut.close();
When I run this code, the file V is never created in my outputPath. But if I remove the reading part and keep only the writing part, the file is created and check is written into it.
Can anyone please help me understand how to use these correctly, so that I can first read the whole file and then write to another file line by line?
I have also tried other code for reading from one file and writing to another; there the file gets created, but nothing is written into it!
I use something like this:
hadoop jar main.jar program2.Main input output
Then in my first job I read from args[0] and write to a file under args[1]+"/NewV" using MapReduce classes, and it works.
In my other (non-MapReduce) class I use args[1]+"/NewV" as the input path and output+"/V_0" as the output path (I pass these strings to the constructor). Here is the code for the class:
public class Init_V {
    String inputPath, outputPath;

    public Init_V(String inputPath, String outputPath) throws Exception {
        this.inputPath = inputPath;
        this.outputPath = outputPath;
        try {
            FileSystem fs = FileSystem.get(new Configuration());
            Path FileToWrite = new Path(outputPath + "/V.txt");
            Path FileToRead = new Path(inputPath);
            BufferedWriter output = new BufferedWriter(
                    new OutputStreamWriter(fs.create(FileToWrite, true)));
            BufferedReader reader = new BufferedReader(
                    new InputStreamReader(fs.open(FileToRead)));
            String data;
            data = reader.readLine();
            while (data != null) {
                output.write(data);
                data = reader.readLine();
            }
            reader.close();
            output.close();
        } catch (Exception e) {
        }
    }
}
I think you need to understand how Hadoop works. In Hadoop, many things are done for you by the framework: you just supply input and output paths, and they are opened and created by Hadoop if the paths are valid. Check the following example:
public int run(String[] args) throws Exception {
    if (args.length != 2) {
        System.err.println("Usage: MapReduce <input path> <output path>");
        ToolRunner.printGenericCommandUsage(System.err);
        return -1;
    }
    Job job = new Job();
    job.setJarByClass(MyClass.class);
    job.setNumReduceTasks(5);
    job.setJobName("myclass");
    FileInputFormat.addInputPath(job, new Path(args[0]));
    FileOutputFormat.setOutputPath(job, new Path(args[1]));
    job.setMapperClass(MyMapper.class);
    job.setReducerClass(MyReducer.class);
    job.setOutputKeyClass(Text.class);
    job.setOutputValueClass(Text.class);
    return job.waitForCompletion(true) ? 0 : 1;
}

/* ----------------------main---------------------*/
public static void main(String[] args) throws Exception {
    int exitCode = ToolRunner.run(new MyClass(), args);
    System.exit(exitCode);
}
As you can see, you only initialize the necessary job settings; the reading and writing are done by Hadoop.
Also, in your Mapper class you call context.write(key, value) inside map, and similarly in your Reducer class; it writes for you.
If you use a plain BufferedWriter/Reader, it will write to your local file system, not to HDFS. To see files in HDFS you should run hadoop fs -ls <path>; the files you are looking at with an ordinary ls command are in your local file system.
EDIT: In order to do your own reads and writes, you should know the following: say you have N machines in your Hadoop cluster. When you read, you will not know which mapper is doing the reading, and similarly for writing. So all mappers and reducers must be able to see those paths, or you will get exceptions.
I don't know if you can use any other class, but there are two methods you can use for your specific purpose: setup and cleanup. They are called exactly once per map or reduce task, so you can open and close your files there; the reading and writing itself is the same as in normal Java code. For example, if you want to record something for each key in a text file, you can do the following:
//in the Reducer
BufferedWriter bw;

@Override
protected void setup(Context context) throws IOException {
    // opened once, before the first call to reduce()
    bw = new BufferedWriter(new FileWriter("keys.txt"));
}

@Override
protected void reduce(Text key, Iterable<Text> values, Context context)
        throws IOException, InterruptedException {
    for (Text value : values) {
        // ... normal reduce work ...
    }
    bw.write(key.toString());
    bw.newLine();
}

@Override
protected void cleanup(Context context) throws IOException {
    // closed once, after all keys have been processed
    bw.close();
}