MyBatis custom type handler: call FileInputStream.close() after the query is executed - jdbc

I am trying to implement a MyBatis custom type handler for File using FileInputStream.
Here is my code for setting the parameter:
@MappedJdbcTypes(JdbcType.LONGVARBINARY)
public class FileByteaHandler extends BaseTypeHandler<File> {

    @Override
    public void setNonNullParameter(PreparedStatement ps, int i, File file, JdbcType jdbcType) throws SQLException {
        try {
            FileInputStream fis = new FileInputStream(file);
            // Use the supplied parameter index i, not a hard-coded 1.
            ps.setBinaryStream(i, fis, (int) file.length());
        } catch (FileNotFoundException ex) {
            Logger.getLogger(FileByteaHandler.class.getName()).log(Level.SEVERE, null, ex);
        }
    }
}
My question is:
I cannot close this FileInputStream at the end of this method, otherwise MyBatis will not be able to read the data from it. In fact, I do not know where I can close the FileInputStream. Is there a way to call close() after the query has been executed in MyBatis?
Thanks in advance.
UPDATE
Thanks to Jarandinor for the help. Here is my code for this type handler; hopefully it can help someone:
@MappedJdbcTypes(JdbcType.LONGVARBINARY)
public class FileByteaHandler extends BaseTypeHandler<File> {

    @Override
    public void setNonNullParameter(PreparedStatement ps, int i, File file, JdbcType jdbcType) throws SQLException {
        try {
            AutoCloseFileInputStream fis = new AutoCloseFileInputStream(file);
            ps.setBinaryStream(i, fis, (int) file.length());
        } catch (FileNotFoundException ex) {
            Logger.getLogger(FileByteaHandler.class.getName()).log(Level.SEVERE, null, ex);
        }
    }

    @Override
    public File getNullableResult(ResultSet rs, String columnName) throws SQLException {
        File file = null;
        try (InputStream input = rs.getBinaryStream(columnName)) {
            file = getResult(rs, input);
        } catch (IOException e) {
            System.out.println(e.getMessage());
        }
        return file;
    }

    public File createFile() {
        File file = new File("e:/target-file"); // your temp file path
        return file;
    }

    private File getResult(ResultSet rs, InputStream input) throws SQLException {
        File file = createFile();
        try (OutputStream output = new FileOutputStream(file)) {
            int bufSize = 0x8000; // 32 KB buffer (0x8000000 would allocate 128 MB per call)
            byte[] buf = new byte[bufSize];
            int s;
            while ((s = input.read(buf, 0, bufSize)) > 0) {
                output.write(buf, 0, s);
            }
            output.flush();
        } catch (IOException e) {
            System.out.println(e.getMessage());
        }
        return file;
    }

    @Override
    public File getNullableResult(ResultSet rs, int columnIndex) throws SQLException {
        File file = null;
        try (InputStream input = rs.getBinaryStream(columnIndex)) {
            file = getResult(rs, input);
        } catch (IOException e) {
            System.out.println(e.getMessage());
        }
        return file;
    }

    @Override
    public File getNullableResult(CallableStatement cs, int columnIndex) throws SQLException {
        throw new SQLException("getNullableResult(CallableStatement cs, int columnIndex) is called");
    }

    // FileInputStream that closes itself once it has been read to the end,
    // so the handler does not need a later hook to call close().
    private class AutoCloseFileInputStream extends FileInputStream {

        public AutoCloseFileInputStream(File file) throws FileNotFoundException {
            super(file);
        }

        @Override
        public int read() throws IOException {
            int c = super.read();
            if (available() <= 0) {
                close();
            }
            return c;
        }

        @Override
        public int read(byte[] b) throws IOException {
            int c = super.read(b);
            if (available() <= 0) {
                close();
            }
            return c;
        }

        @Override
        public int read(byte[] b, int off, int len) throws IOException {
            int c = super.read(b, off, len);
            if (available() <= 0) {
                close();
            }
            return c;
        }
    }
}

I don't know of a good way to close the stream after query execution.
Method 1:
Read the file into a byte[]
(note: in JDK 7 you can use Files.readAllBytes(Paths.get(file.getPath()));)
and use:
ps.setBytes(i, bytes);
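For example, a minimal sketch of setNonNullParameter using this approach (the surrounding handler class is assumed from the question; java.nio.file.Files and Paths need to be imported):

@Override
public void setNonNullParameter(PreparedStatement ps, int i, File file, JdbcType jdbcType) throws SQLException {
    try {
        // Reads the whole file into memory; fine for modest file sizes.
        byte[] bytes = Files.readAllBytes(Paths.get(file.getPath()));
        ps.setBytes(i, bytes);
    } catch (IOException ex) {
        throw new SQLException("Could not read file " + file, ex);
    }
}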
Method 2: create your own class inherited from FileInputStream and override the public int read() throws IOException method; when the end of the file is reached, close the stream:

@Override
public int read() throws IOException {
    int c = super.read();
    if (c == -1) {
        super.close();
    }
    return c;
}
Maybe you should override public int read(byte[] b) throws IOException too;
it depends on the JDBC implementation.
Method 3: you can change your FileByteaHandler:
1) add a list of FileInputStream as a field;
2) add each opened InputStream to that list in setNonNullParameter;
3) add a closeStreams() method that closes and removes all the InputStreams in the list.
Then invoke this method after you have invoked your mapper method:
session.getConfiguration().getTypeHandlerRegistry().getMappingTypeHandler(FileByteaHandler.class).closeStreams();
Or use the MyBatis plugin system to run the above command.
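A minimal sketch of those changes, assuming the handler from the question (the list is synchronized here because a type handler instance can be shared across sessions):

@MappedJdbcTypes(JdbcType.LONGVARBINARY)
public class FileByteaHandler extends BaseTypeHandler<File> {

    // Streams opened while binding parameters; closed explicitly after the query runs.
    private final List<FileInputStream> openStreams =
            Collections.synchronizedList(new ArrayList<FileInputStream>());

    @Override
    public void setNonNullParameter(PreparedStatement ps, int i, File file, JdbcType jdbcType) throws SQLException {
        try {
            FileInputStream fis = new FileInputStream(file);
            openStreams.add(fis);
            ps.setBinaryStream(i, fis, (int) file.length());
        } catch (FileNotFoundException ex) {
            throw new SQLException(ex);
        }
    }

    // Invoke after the mapper method has executed.
    public void closeStreams() {
        synchronized (openStreams) {
            for (FileInputStream fis : openStreams) {
                try {
                    fis.close();
                } catch (IOException ignored) {
                }
            }
            openStreams.clear();
        }
    }

    // getNullableResult(...) implementations as in the question.
}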

Related

How to sanitize the request object in a @RestController

I'm doing static code analysis using Checkmarx, which gave me medium vulnerabilities for @RequestBody.
I will place sample code below; could someone help me fix this issue?
@RestController
@RequestMapping("/api")
@Slf4j
@Validated
public class Myclass {

    @Autowired
    SplitClass split;

    @PostMapping(path = "/something", consumes = "application/json", produces = "application/json")
    public int splitter(@RequestBody SplitRequestVO **splitReq**) {
        int response = 0;
        try {
            int value = split.methodName(splitReq);
            if (value == 1) {
                response = 1;
            } else {
                response = 0;
            }
        } catch (Exception e) {
            log.error("Exception in the MyClass controller: " + e.getMessage());
        }
        return response;
    }
}
It also shows an error with this util class every time. Please check what's wrong in this class.
@Slf4j
public class FileUtil {

    private FileUtil() {
        throw new IllegalArgumentException("FileUtil.class");
    }

    public static void createFileFromBytes(String fileName, byte[] bytes) throws IOException {
        File file = new File(fileName);
        try (FileOutputStream fos = new FileOutputStream(file)) {
            fos.write(bytes);
            fos.flush();
        } catch (IOException exc) {
            log.error("Exception in creating file from bytes : " + exc.getMessage());
        }
    }

    public static void deleteFile(String fileName) {
        log.info("Entry into deleteFile method.");
        try {
            File file = new File(fileName);
            if (file.exists()) {
                Files.delete(file.toPath());
                if (log.isDebugEnabled())
                    log.debug("file deleted:" + fileName);
            }
        } catch (Exception e) {
            log.error("Exception in deleting the file : " + e.getMessage());
        }
        log.info("Exit from deleteFile method ");
    }

    public static byte[] getBytesFromFile(File file) throws IOException {
        log.info("Entry into getBytesFromFile method");
        byte[] bytes = null;
        try (InputStream is = new FileInputStream(file)) {
            long length = file.length();
            if (length > 2147483647L) {
                log.error("File Too large:" + file.getName());
            }
            bytes = new byte[(int) length];
            int offset = 0;
            int numRead = 0;
            while (offset < bytes.length && (numRead = is.read(bytes, offset, bytes.length - offset)) >= 0)
                offset += numRead;
            if (offset < bytes.length)
                throw new IOException("Could not completely read file " + file.getName());
        } catch (Exception e) {
            log.error("Exception in getBytesFromFile : " + e.getMessage());
        }
        log.info("Exit from getBytesFromFile method ");
        return bytes;
    }

    public static byte[] getBytesFromFile(String fileName) throws IOException {
        File file = new File(fileName);
        return getBytesFromFile(file);
    }
}
Here I'm facing the vulnerability issue to fix in the bold text (**splitReq**). Any help is appreciated.
Thanks in advance.
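One common way to address this kind of Checkmarx finding on a @RequestBody parameter (a sketch, not a definitive fix) is to constrain the VO with Bean Validation annotations and mark the parameter @Valid, so malformed input is rejected before it reaches the service layer. The fields below are hypothetical stand-ins for whatever SplitRequestVO actually carries:

import javax.validation.constraints.NotNull;
import javax.validation.constraints.Pattern;
import javax.validation.constraints.Size;

public class SplitRequestVO {

    // Hypothetical fields; tighten the constraints to match your real payload.
    @NotNull
    @Size(max = 255)
    @Pattern(regexp = "[A-Za-z0-9._-]+")
    private String fileName;

    @NotNull
    private Integer splitCount;

    // getters and setters omitted
}

The controller signature would then become splitter(@Valid @RequestBody SplitRequestVO splitReq).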

Spark-Streaming CustomReceiver Unknown Host Exception

I am new to Spark Streaming. I want to stream a URL in order to retrieve info from it, so I used the JavaCustomReceiver to stream the URL.
This is the code I'm using (source):
public class JavaCustomReceiver extends Receiver<String> {

    private static final Pattern SPACE = Pattern.compile(" ");

    public static void main(String[] args) throws Exception {
        SparkConf sparkConf = new SparkConf().setAppName("JavaCustomReceiver");
        JavaStreamingContext ssc = new JavaStreamingContext(sparkConf, new Duration(1000));
        JavaReceiverInputDStream<String> lines = ssc.receiverStream(
                new JavaCustomReceiver("http://stream.meetup.com/2/rsvps", 80));
        JavaDStream<String> words = lines.flatMap(new FlatMapFunction<String, String>() {
            @Override
            public Iterator<String> call(String x) {
                return Arrays.asList(SPACE.split(x)).iterator();
            }
        });
        JavaPairDStream<String, Integer> wordCounts = words.mapToPair(
                new PairFunction<String, String, Integer>() {
                    @Override
                    public Tuple2<String, Integer> call(String s) {
                        return new Tuple2<>(s, 1);
                    }
                }).reduceByKey(new Function2<Integer, Integer, Integer>() {
                    @Override
                    public Integer call(Integer i1, Integer i2) {
                        return i1 + i2;
                    }
                });
        wordCounts.print();
        ssc.start();
        ssc.awaitTermination();
    }

    String host = null;
    int port = -1;

    public JavaCustomReceiver(String host_, int port_) {
        super(StorageLevel.MEMORY_AND_DISK_2());
        host = host_;
        port = port_;
    }

    public void onStart() {
        new Thread() {
            @Override
            public void run() {
                receive();
            }
        }.start();
    }

    public void onStop() {
    }

    private void receive() {
        try {
            Socket socket = null;
            BufferedReader reader = null;
            String userInput = null;
            try {
                // connect to the server
                socket = new Socket(host, port);
                reader = new BufferedReader(
                        new InputStreamReader(socket.getInputStream(), StandardCharsets.UTF_8));
                // Until stopped or connection broken continue reading
                while (!isStopped() && (userInput = reader.readLine()) != null) {
                    System.out.println("Received data '" + userInput + "'");
                    store(userInput);
                }
            } finally {
                Closeables.close(reader, /* swallowIOException = */ true);
                Closeables.close(socket, /* swallowIOException = */ true);
            }
            restart("Trying to connect again");
        } catch (ConnectException ce) {
            // restart if could not connect to server
            restart("Could not connect", ce);
        } catch (Throwable t) {
            restart("Error receiving data", t);
        }
    }
}
However, I keep getting a java.net.UnknownHostException.
How can I fix this? What is wrong with the code I'm using?
After reading the code of the custom receiver referenced, it is clear that it is a TCP receiver that connects to a host:port, not an HTTP receiver that could take a URL. You'll have to change the code to read from an HTTP endpoint.
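As a hedged sketch, the receive() method could instead fetch lines over HTTP with a plain HttpURLConnection (this assumes the receiver keeps a url field in place of host/port, and that java.net.URL and java.net.HttpURLConnection are imported):

private void receive() {
    try {
        // e.g. url = "http://stream.meetup.com/2/rsvps"
        HttpURLConnection conn = (HttpURLConnection) new URL(url).openConnection();
        try (BufferedReader reader = new BufferedReader(
                new InputStreamReader(conn.getInputStream(), StandardCharsets.UTF_8))) {
            String line;
            // Until stopped or the stream ends, hand each line to Spark.
            while (!isStopped() && (line = reader.readLine()) != null) {
                store(line);
            }
        } finally {
            conn.disconnect();
        }
        restart("Trying to connect again");
    } catch (IOException e) {
        restart("Error receiving data", e);
    }
}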

ignore exception and continue & counting files in a directory

The following snippet of code counts the number of files in a directory:
Path path = ...;
....
long count = 0;
try {
    count = Files.walk(path).parallel().filter(p -> !Files.isDirectory(p)).count();
} catch (IOException ex) {
    System.out.println(ex.getMessage());
}
The code above fails to get the number of files if an exception is thrown.
The question is: how do I ignore the exception and continue counting the number of files?
In Java 7 I have used Files.walkFileTree(path, utils) with the following class:
public class Utils extends SimpleFileVisitor<Path> {

    private long count;

    @Override
    public FileVisitResult visitFile(Path file, BasicFileAttributes attrs) throws IOException {
        if (attrs.isRegularFile()) {
            count++;
        }
        return CONTINUE;
    }

    @Override
    public FileVisitResult visitFileFailed(Path file, IOException exc) throws IOException {
        System.err.println(file.getFileName());
        return CONTINUE;
    }

    public long countFilesInDirectoryJava7() {
        return count;
    }
}
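For reference, the visitor above would be driven roughly like this (a small sketch):

Utils visitor = new Utils();
Files.walkFileTree(path, visitor); // visitFileFailed returns CONTINUE, so unreadable entries are skipped
long count = visitor.countFilesInDirectoryJava7();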
Edit: Here is the stack trace of the exception:
Exception in thread "main" java.io.UncheckedIOException: java.nio.file.AccessDeniedException: E:\8431c36f5b6a3d7169de9cc70a\1025
at java.nio.file.FileTreeIterator.fetchNextIfNeeded(FileTreeIterator.java:88)
at java.nio.file.FileTreeIterator.hasNext(FileTreeIterator.java:104)
at java.util.Spliterators$IteratorSpliterator.trySplit(Spliterators.java:1784)
at java.util.stream.AbstractTask.compute(AbstractTask.java:297)
at java.util.concurrent.CountedCompleter.exec(CountedCompleter.java:731)
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
at java.util.concurrent.ForkJoinTask.doInvoke(ForkJoinTask.java:401)
at java.util.concurrent.ForkJoinTask.invoke(ForkJoinTask.java:734)
at java.util.stream.ReduceOps$ReduceOp.evaluateParallel(ReduceOps.java:714)
at java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:233)
at java.util.stream.LongPipeline.reduce(LongPipeline.java:438)
at java.util.stream.LongPipeline.sum(LongPipeline.java:396)
at Utils.countFilesInDirectoryJava8(Utils.java:47)
at TestPath.main(TestPath.java:27)
Caused by: java.nio.file.AccessDeniedException: E:\8431c36f5b6a3d7169de9cc70a\1025
at sun.nio.fs.WindowsException.translateToIOException(WindowsException.java:83)
at sun.nio.fs.WindowsException.rethrowAsIOException(WindowsException.java:97)
at sun.nio.fs.WindowsException.rethrowAsIOException(WindowsException.java:102)
at sun.nio.fs.WindowsDirectoryStream.<init>(WindowsDirectoryStream.java:86)
at sun.nio.fs.WindowsFileSystemProvider.newDirectoryStream(WindowsFileSystemProvider.java:518)
at java.nio.file.Files.newDirectoryStream(Files.java:457)
at java.nio.file.FileTreeWalker.visit(FileTreeWalker.java:300)
at java.nio.file.FileTreeWalker.next(FileTreeWalker.java:372)
at java.nio.file.FileTreeIterator.fetchNextIfNeeded(FileTreeIterator.java:84)
... 13 more
Java Result: 1
So far, it seems Files.walk still does not play well with streams. Instead, you might want to use Files.newDirectoryStream. Here's a code snippet that might help.
static class FN<T extends Path> implements Function<T, Stream<T>> {
    @Override
    public Stream<T> apply(T p) {
        if (!Files.isDirectory(p, LinkOption.NOFOLLOW_LINKS)) {
            return Stream.of(p);
        } else {
            try {
                return toStream(Files.newDirectoryStream(p)).flatMap(q -> apply((T) q));
            } catch (IOException ex) {
                return Stream.empty();
            }
        }
    }
}

public static void main(String[] args) throws IOException {
    Path path = Paths.get("some path");
    long count = toStream(Files.newDirectoryStream(path))
            .parallel()
            .flatMap(new FN<>()::apply)
            .count();
    System.out.println("count: " + count);
}

static <T> Stream<T> toStream(Iterable<T> iterable) {
    return StreamSupport.stream(iterable.spliterator(), false);
}

SimpleTextLoader UDF in Pig

I want to create a custom load function (UDF) for Pig. I created a SimpleTextLoader using this link:
https://pig.apache.org/docs/r0.11.0/udf.html. I successfully generated the jar file for this code, registered it in Pig, and ran a Pig script, but I am getting empty output. I don't know how to solve this issue; any help would be appreciated.
Below is my Java code:
public class SimpleTextLoader extends LoadFunc {

    protected RecordReader in = null;
    private byte fieldDel = '\t';
    private ArrayList<Object> mProtoTuple = null;
    private TupleFactory mTupleFactory = TupleFactory.getInstance();
    private static final int BUFFER_SIZE = 1024;

    public SimpleTextLoader() {
    }

    public SimpleTextLoader(String delimiter) {
        this();
        if (delimiter.length() == 1) {
            this.fieldDel = (byte) delimiter.charAt(0);
        } else if (delimiter.length() > 1 && delimiter.charAt(0) == '\\') {
            switch (delimiter.charAt(1)) {
                case 't':
                    this.fieldDel = (byte) '\t';
                    break;
                case 'x':
                    fieldDel = Integer.valueOf(delimiter.substring(2), 16).byteValue();
                    break;
                case 'u':
                    this.fieldDel = Integer.valueOf(delimiter.substring(2)).byteValue();
                    break;
                default:
                    throw new RuntimeException("Unknown delimiter " + delimiter);
            }
        } else {
            throw new RuntimeException("PigStorage delimiter must be a single character");
        }
    }

    private void readField(byte[] buf, int start, int end) {
        if (mProtoTuple == null) {
            mProtoTuple = new ArrayList<Object>();
        }
        if (start == end) {
            // NULL value
            mProtoTuple.add(null);
        } else {
            mProtoTuple.add(new DataByteArray(buf, start, end));
        }
    }

    @Override
    public Tuple getNext() throws IOException {
        try {
            boolean notDone = in.nextKeyValue();
            if (notDone) {
                return null;
            }
            Text value = (Text) in.getCurrentValue();
            System.out.println("printing value" + value);
            byte[] buf = value.getBytes();
            int len = value.getLength();
            int start = 0;
            for (int i = 0; i < len; i++) {
                if (buf[i] == fieldDel) {
                    readField(buf, start, i);
                    start = i + 1;
                }
            }
            // pick up the last field
            readField(buf, start, len);
            Tuple t = mTupleFactory.newTupleNoCopy(mProtoTuple);
            mProtoTuple = null;
            System.out.println(t);
            return t;
        } catch (InterruptedException e) {
            int errCode = 6018;
            String errMsg = "Error while reading input";
            throw new ExecException(errMsg, errCode, PigException.REMOTE_ENVIRONMENT, e);
        }
    }

    @Override
    public void setLocation(String string, Job job) throws IOException {
        FileInputFormat.setInputPaths(job, string);
    }

    @Override
    public InputFormat getInputFormat() throws IOException {
        return new TextInputFormat();
    }

    @Override
    public void prepareToRead(RecordReader reader, PigSplit ps) throws IOException {
        in = reader;
    }
}
Below is my Pig Script
REGISTER /home/hadoop/netbeans/sampleloader/dist/sampleloader.jar
a = load '/input.txt' using sampleloader.SimpleTextLoader();
store a into 'output';
You are using sampleloader.SimpleTextLoader(), which doesn't do anything, as it is just an empty constructor.
Instead, use sampleloader.SimpleTextLoader(String delimiter), which performs the actual split operation.
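For example, assuming tab-delimited input, the script would pass the delimiter explicitly:

REGISTER /home/hadoop/netbeans/sampleloader/dist/sampleloader.jar
a = load '/input.txt' using sampleloader.SimpleTextLoader('\t');
store a into 'output';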

Map-Reduce not reducing as much as expected with complex keys and values

No matter how simple I make the compareTo of my complex key, I don't get the expected results. The one exception is that if I use a key that is the same for every record, it will appropriately reduce to one record. I've also observed that this happens only when I process the full load; if I break off a few of the records that didn't reduce and run them on a much smaller scale, those records get combined.
The sum of the output records is correct, but there is duplication at the record level of items I would have expected to group together. So where I would expect, say, 500 records summing up to 5,000, I end up with 1232 records summing up to 5,000, with obvious records that should have been reduced into one.
I've read about the problems with object references and complex keys and values, but I don't see anywhere that I have potential for that left. To that end, you will find places where I'm creating new objects that I probably don't need to, but I'm trying everything at this point and will dial it back once it is working.
I'm out of ideas on what to try or where and how to poke to figure this out. Please help!
public static class Map extends Mapper<LongWritable, Text, IMSTranOut, IMSTranSums> {

    // private SimpleDateFormat dtFormat = new SimpleDateFormat("yyyyddd");

    @Override
    public void map(LongWritable key, Text value, Context context)
            throws IOException, InterruptedException {
        String line = value.toString();
        SimpleDateFormat dtFormat = new SimpleDateFormat("yyyyddd");
        IMSTranOut dbKey = new IMSTranOut();
        IMSTranSums sumVals = new IMSTranSums();
        String[] tokens = line.split(",", -1);
        dbKey.setLoadKey(-99);
        dbKey.setTranClassKey(-99);
        dbKey.setTransactionCode(tokens[0]);
        dbKey.setTransactionType(tokens[1]);
        dbKey.setNpaNxx(getNPA(dbKey.getTransactionCode()));
        try {
            dbKey.setTranDate(new Date(dtFormat.parse(tokens[2]).getTime()));
        } catch (ParseException e) {
        } // 2
        dbKey.setTranHour(getTranHour(tokens[3]));
        try {
            dbKey.setStartDate(new Date(dtFormat.parse(tokens[4]).getTime()));
        } catch (ParseException e) {
        } // 4
        dbKey.setStartHour(getTranHour(tokens[5]));
        try {
            dbKey.setStopDate(new Date(dtFormat.parse(tokens[6]).getTime()));
        } catch (ParseException e) {
        } // 6
        dbKey.setStopHour(getTranHour(tokens[7]));
        sumVals.setTranCount(1);
        sumVals.setInputQTime(Double.parseDouble(tokens[8]));
        sumVals.setElapsedTime(Double.parseDouble(tokens[9]));
        sumVals.setCpuTime(Double.parseDouble(tokens[10]));
        context.write(dbKey, sumVals);
    }
}
public static class Reduce extends Reducer<IMSTranOut, IMSTranSums, IMSTranOut, IMSTranSums> {

    @Override
    public void reduce(IMSTranOut key, Iterable<IMSTranSums> values,
            Context context) throws IOException, InterruptedException {
        int tranCount = 0;
        double inputQ = 0;
        double elapsed = 0;
        double cpu = 0;
        for (IMSTranSums val : values) {
            tranCount += val.getTranCount();
            inputQ += val.getInputQTime();
            elapsed += val.getElapsedTime();
            cpu += val.getCpuTime();
        }
        IMSTranSums sumVals = new IMSTranSums();
        IMSTranOut dbKey = new IMSTranOut();
        sumVals.setCpuTime(inputQ);
        sumVals.setElapsedTime(elapsed);
        sumVals.setInputQTime(cpu);
        sumVals.setTranCount(tranCount);
        dbKey.setLoadKey(key.getLoadKey());
        dbKey.setTranClassKey(key.getTranClassKey());
        dbKey.setNpaNxx(key.getNpaNxx());
        dbKey.setTransactionCode(key.getTransactionCode());
        dbKey.setTransactionType(key.getTransactionType());
        dbKey.setTranDate(key.getTranDate());
        dbKey.setTranHour(key.getTranHour());
        dbKey.setStartDate(key.getStartDate());
        dbKey.setStartHour(key.getStartHour());
        dbKey.setStopDate(key.getStopDate());
        dbKey.setStopHour(key.getStopHour());
        dbKey.setInputQTime(inputQ);
        dbKey.setElapsedTime(elapsed);
        dbKey.setCpuTime(cpu);
        dbKey.setTranCount(tranCount);
        context.write(dbKey, sumVals);
    }
}
Here is the implementation of the DBWritable class:
public class IMSTranOut implements DBWritable, WritableComparable<IMSTranOut> {

    private int loadKey;
    private int tranClassKey;
    private String npaNxx;
    private String transactionCode;
    private String transactionType;
    private Date tranDate;
    private double tranHour;
    private Date startDate;
    private double startHour;
    private Date stopDate;
    private double stopHour;
    private double inputQTime;
    private double elapsedTime;
    private double cpuTime;
    private int tranCount;

    public void readFields(ResultSet rs) throws SQLException {
        setLoadKey(rs.getInt("LOAD_KEY"));
        setTranClassKey(rs.getInt("TRAN_CLASS_KEY"));
        setNpaNxx(rs.getString("NPA_NXX"));
        setTransactionCode(rs.getString("TRANSACTION_CODE"));
        setTransactionType(rs.getString("TRANSACTION_TYPE"));
        setTranDate(rs.getDate("TRAN_DATE"));
        setTranHour(rs.getInt("TRAN_HOUR"));
        setStartDate(rs.getDate("START_DATE"));
        setStartHour(rs.getInt("START_HOUR"));
        setStopDate(rs.getDate("STOP_DATE"));
        setStopHour(rs.getInt("STOP_HOUR"));
        setInputQTime(rs.getInt("INPUT_Q_TIME"));
        setElapsedTime(rs.getInt("ELAPSED_TIME"));
        setCpuTime(rs.getInt("CPU_TIME"));
        setTranCount(rs.getInt("TRAN_COUNT"));
    }

    public void write(PreparedStatement ps) throws SQLException {
        ps.setInt(1, loadKey);
        ps.setInt(2, tranClassKey);
        ps.setString(3, npaNxx);
        ps.setString(4, transactionCode);
        ps.setString(5, transactionType);
        ps.setDate(6, tranDate);
        ps.setDouble(7, tranHour);
        ps.setDate(8, startDate);
        ps.setDouble(9, startHour);
        ps.setDate(10, stopDate);
        ps.setDouble(11, stopHour);
        ps.setDouble(12, inputQTime);
        ps.setDouble(13, elapsedTime);
        ps.setDouble(14, cpuTime);
        ps.setInt(15, tranCount);
    }

    public int getLoadKey() {
        return loadKey;
    }

    public void setLoadKey(int loadKey) {
        this.loadKey = loadKey;
    }

    public int getTranClassKey() {
        return tranClassKey;
    }

    public void setTranClassKey(int tranClassKey) {
        this.tranClassKey = tranClassKey;
    }

    public String getNpaNxx() {
        return npaNxx;
    }

    public void setNpaNxx(String npaNxx) {
        this.npaNxx = new String(npaNxx);
    }

    public String getTransactionCode() {
        return transactionCode;
    }

    public void setTransactionCode(String transactionCode) {
        this.transactionCode = new String(transactionCode);
    }

    public String getTransactionType() {
        return transactionType;
    }

    public void setTransactionType(String transactionType) {
        this.transactionType = new String(transactionType);
    }

    public Date getTranDate() {
        return tranDate;
    }

    public void setTranDate(Date tranDate) {
        this.tranDate = new Date(tranDate.getTime());
    }

    public double getTranHour() {
        return tranHour;
    }

    public void setTranHour(double tranHour) {
        this.tranHour = tranHour;
    }

    public Date getStartDate() {
        return startDate;
    }

    public void setStartDate(Date startDate) {
        this.startDate = new Date(startDate.getTime());
    }

    public double getStartHour() {
        return startHour;
    }

    public void setStartHour(double startHour) {
        this.startHour = startHour;
    }

    public Date getStopDate() {
        return stopDate;
    }

    public void setStopDate(Date stopDate) {
        this.stopDate = new Date(stopDate.getTime());
    }

    public double getStopHour() {
        return stopHour;
    }

    public void setStopHour(double stopHour) {
        this.stopHour = stopHour;
    }

    public double getInputQTime() {
        return inputQTime;
    }

    public void setInputQTime(double inputQTime) {
        this.inputQTime = inputQTime;
    }

    public double getElapsedTime() {
        return elapsedTime;
    }

    public void setElapsedTime(double elapsedTime) {
        this.elapsedTime = elapsedTime;
    }

    public double getCpuTime() {
        return cpuTime;
    }

    public void setCpuTime(double cpuTime) {
        this.cpuTime = cpuTime;
    }

    public int getTranCount() {
        return tranCount;
    }

    public void setTranCount(int tranCount) {
        this.tranCount = tranCount;
    }

    public void readFields(DataInput input) throws IOException {
        setNpaNxx(input.readUTF());
        setTransactionCode(input.readUTF());
        setTransactionType(input.readUTF());
        setTranDate(new Date(input.readLong()));
        setStartDate(new Date(input.readLong()));
        setStopDate(new Date(input.readLong()));
        setLoadKey(input.readInt());
        setTranClassKey(input.readInt());
        setTranHour(input.readDouble());
        setStartHour(input.readDouble());
        setStopHour(input.readDouble());
        setInputQTime(input.readDouble());
        setElapsedTime(input.readDouble());
        setCpuTime(input.readDouble());
        setTranCount(input.readInt());
    }

    public void write(DataOutput output) throws IOException {
        output.writeUTF(npaNxx);
        output.writeUTF(transactionCode);
        output.writeUTF(transactionType);
        output.writeLong(tranDate.getTime());
        output.writeLong(startDate.getTime());
        output.writeLong(stopDate.getTime());
        output.writeInt(loadKey);
        output.writeInt(tranClassKey);
        output.writeDouble(tranHour);
        output.writeDouble(startHour);
        output.writeDouble(stopHour);
        output.writeDouble(inputQTime);
        output.writeDouble(elapsedTime);
        output.writeDouble(cpuTime);
        output.writeInt(tranCount);
    }

    public int compareTo(IMSTranOut o) {
        return (Integer.compare(loadKey, o.getLoadKey()) == 0
                && Integer.compare(tranClassKey, o.getTranClassKey()) == 0
                && npaNxx.compareTo(o.getNpaNxx()) == 0
                && transactionCode.compareTo(o.getTransactionCode()) == 0
                && transactionType.compareTo(o.getTransactionType()) == 0
                && tranDate.compareTo(o.getTranDate()) == 0
                && Double.compare(tranHour, o.getTranHour()) == 0
                && startDate.compareTo(o.getStartDate()) == 0
                && Double.compare(startHour, o.getStartHour()) == 0
                && stopDate.compareTo(o.getStopDate()) == 0
                && Double.compare(stopHour, o.getStopHour()) == 0) ? 0 : 1;
    }
}
Implementation of the Writable class for the complex values:
public class IMSTranSums implements Writable {

    private double inputQTime;
    private double elapsedTime;
    private double cpuTime;
    private int tranCount;

    public double getInputQTime() {
        return inputQTime;
    }

    public void setInputQTime(double inputQTime) {
        this.inputQTime = inputQTime;
    }

    public double getElapsedTime() {
        return elapsedTime;
    }

    public void setElapsedTime(double elapsedTime) {
        this.elapsedTime = elapsedTime;
    }

    public double getCpuTime() {
        return cpuTime;
    }

    public void setCpuTime(double cpuTime) {
        this.cpuTime = cpuTime;
    }

    public int getTranCount() {
        return tranCount;
    }

    public void setTranCount(int tranCount) {
        this.tranCount = tranCount;
    }

    public void write(DataOutput output) throws IOException {
        output.writeDouble(inputQTime);
        output.writeDouble(elapsedTime);
        output.writeDouble(cpuTime);
        output.writeInt(tranCount);
    }

    public void readFields(DataInput input) throws IOException {
        inputQTime = input.readDouble();
        elapsedTime = input.readDouble();
        cpuTime = input.readDouble();
        tranCount = input.readInt();
    }
}
Your compareTo is flawed; it will totally fail the sort algorithm, because you break transitivity (and symmetry) in your ordering: it only ever returns 0 or 1, never a negative value, so for two distinct keys a and b, both a.compareTo(b) and b.compareTo(a) return 1, which violates the compareTo contract.
I would recommend using a CompareToBuilder from Apache Commons or a ComparisonChain from Guava to make your comparisons much more readable (and correct!).
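For instance, here is a sketch of the same compareTo rewritten with Guava's ComparisonChain over the same key fields; each compare() only takes effect if the previous fields were equal, and the result is a proper negative/zero/positive value:

// import com.google.common.collect.ComparisonChain;
@Override
public int compareTo(IMSTranOut o) {
    return ComparisonChain.start()
            .compare(loadKey, o.getLoadKey())
            .compare(tranClassKey, o.getTranClassKey())
            .compare(npaNxx, o.getNpaNxx())
            .compare(transactionCode, o.getTransactionCode())
            .compare(transactionType, o.getTransactionType())
            .compare(tranDate, o.getTranDate())
            .compare(tranHour, o.getTranHour())
            .compare(startDate, o.getStartDate())
            .compare(startHour, o.getStartHour())
            .compare(stopDate, o.getStopDate())
            .compare(stopHour, o.getStopHour())
            .result();
}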
