Serializing a long string in Hadoop - hadoop

I have a class which implements the WritableComparable interface in Hadoop. This class has two String variables, one short and one very long. I use writeChars to write these variables and readLine to read them back, but I keep getting errors. What is the best way to serialize such a long String in Hadoop?

I think you can use BytesWritable to make it more efficient. Check the custom key below, which has a BytesWritable field as callId.
public class CustomMRKey implements WritableComparable<CustomMRKey> {
private BytesWritable callId;
private IntWritable mapperType;
/**
* Default constructor
*/
public CustomMRKey() {
set(new BytesWritable(), new IntWritable());
}
/**
* Constructor
*
* @param callId
* @param mapperType
*/
public CustomMRKey(BytesWritable callId, IntWritable mapperType) {
set(callId, mapperType);
}
/**
* sets the call id and mapper type
*
* @param callId
* @param mapperType
*/
public void set(BytesWritable callId, IntWritable mapperType) {
this.callId = callId;
this.mapperType = mapperType;
}
/**
* This method returns the callId
*
* @return callId
*/
public BytesWritable getCallId() {
return callId;
}
/**
* This method sets the callId
*
* @param callId
*/
public void setCallId(BytesWritable callId) {
this.callId = callId;
}
/**
* This method returns the mapper type
*
* @return the mapper type
*/
public IntWritable getMapperType() {
return mapperType;
}
/**
* This method sets the mapper type
*
* @param mapperType
*/
public void setMapperType(IntWritable mapperType) {
this.mapperType = mapperType;
}
@Override
public void readFields(DataInput in) throws IOException {
callId.readFields(in);
mapperType.readFields(in);
}
@Override
public void write(DataOutput out) throws IOException {
callId.write(out);
mapperType.write(out);
}
@Override
public boolean equals(Object obj) {
if (obj instanceof CustomMRKey) {
CustomMRKey key = (CustomMRKey) obj;
return callId.equals(key.callId)
&& mapperType.equals(key.mapperType);
}
return false;
}
@Override
public int compareTo(CustomMRKey key) {
int cmp = callId.compareTo(key.getCallId());
if (cmp != 0) {
return cmp;
}
return mapperType.compareTo(key.getMapperType());
}
}
To use this in, say, your mapper code, you can build the key in its BytesWritable form with something like the following:
CustomMRKey customKey=new CustomMRKey(new BytesWritable(),new IntWritable());
customKey.setCallId(makeKey(value, this.resultKey));
customKey.setMapperType(this.mapTypeIndicator);
The makeKey method then looks something like this:
public BytesWritable makeKey(Text value, BytesWritable key) throws IOException {
try {
ByteArrayOutputStream byteKey = new ByteArrayOutputStream(Constants.MR_DEFAULT_KEY_SIZE);
for (String field : keyFields) {
// note: getString(field) is not part of the standard Hadoop Text API;
// the original answer presumably relies on a custom record type or helper here
byte[] bytes = value.getString(field).getBytes();
byteKey.write(bytes,0,bytes.length);
}
if(key==null){
return new BytesWritable(byteKey.toByteArray());
}else{
key.set(byteKey.toByteArray(), 0, byteKey.size());
return key;
}
} catch (Exception ex) {
throw new IOException("Could not generate key", ex);
}
}
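As a side note, the original error is most likely due to pairing writeChars (which writes raw UTF-16 characters with no terminator) with readLine. If the goal is simply to serialize one short and one very long String, a minimal sketch along these lines, using Hadoop's Text helpers (class and field names here are illustrative, not from the original code), also avoids the problem because the string length is written explicitly:
import java.io.DataInput;
import java.io.DataOutput;
import java.io.IOException;

import org.apache.hadoop.io.Text;
import org.apache.hadoop.io.WritableComparable;

// Sketch: a key holding one short and one very long String.
// Text.writeString/readString prefix the UTF-8 bytes with their length,
// so arbitrarily long strings round-trip correctly (unlike writeChars/readLine).
public class TwoStringKey implements WritableComparable<TwoStringKey> {
    private String shortField = "";
    private String longField = "";

    @Override
    public void write(DataOutput out) throws IOException {
        Text.writeString(out, shortField);
        Text.writeString(out, longField);
    }

    @Override
    public void readFields(DataInput in) throws IOException {
        shortField = Text.readString(in);
        longField = Text.readString(in);
    }

    @Override
    public int compareTo(TwoStringKey other) {
        int cmp = shortField.compareTo(other.shortField);
        return cmp != 0 ? cmp : longField.compareTo(other.longField);
    }
}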
Hope this helps.

Related

How to convert a file to a string using Spring Integration

I am trying to read a text file and convert it to a string using Spring Integration. I need help transforming the file to a string.
Git Link: https://github.com/ravikalla/spring-integration
Source Code -
@Bean
@InboundChannelAdapter(value = "payorFileSource", poller = @Poller(fixedDelay = "10000"))
public MessageSource<File> fileReadingMessageSource() {
FileReadingMessageSource sourceReader = new FileReadingMessageSource();
sourceReader.setDirectory(new File(INPUT_DIR));
sourceReader.setFilter(new SimplePatternFileListFilter(FILE_PATTERN));
return sourceReader;
}
@Bean
@Transformer(inputChannel="payorFileSource", outputChannel="payorFileContent")
public FileToStringTransformer transformFileToString() {
FileToStringTransformer objFileToStringTransformer = new FileToStringTransformer();
return objFileToStringTransformer;
}
Error -
SEVERE: org.springframework.integration.handler.ReplyRequiredException: No reply produced by handler 'fileCopyConfig.transformPayorStringToObject.transformer.handler', and its 'requiresReply' property is set to true., failedMessage=GenericMessage [payload=1|test1, headers={sequenceNumber=1, file_name=payor.txt, sequenceSize=4, correlationId=ff1fef7d-7011-ee99-8d71-96146ac9ea07, file_originalFile=source/payor.txt, id=fd4f950b-afcf-70e6-a053-7d59ff593add, file_relativePath=payor.txt, timestamp=1554875904858}]
at org.springframework.integration.handler.AbstractReplyProducingMessageHandler.handleMessageInternal(AbstractReplyProducingMessageHandler.java:119)
You can convert the file into an InputStream and then use
IOUtils.toString(inputStream) to convert it into a String.
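For example, a minimal sketch (assuming Apache Commons IO is on the classpath; the file path is illustrative):
import java.io.FileInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.nio.charset.StandardCharsets;
import org.apache.commons.io.IOUtils;

public class FileToStringExample {
    public static void main(String[] args) throws IOException {
        // Read the whole file into a String via an InputStream
        try (InputStream in = new FileInputStream("source/payor.txt")) {
            String content = IOUtils.toString(in, StandardCharsets.UTF_8);
            System.out.println(content);
        }
    }
}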
That error is coming from somewhere else; the FileToStringTransformer can't return null.
I haven't looked at all your code, but this looks suspicious:
@Bean
@Transformer(inputChannel="payorRawStringChannel", outputChannel="payorRawObjectChannel")
public GenericTransformer<String, Payor> transformPayorStringToObject() {
return new GenericTransformer<String, Payor>() {
@Override
public Payor transform(String strPayor) {
String[] arrPayorData = strPayor.split(",");
Payor objPayor = null;
if (null != arrPayorData && arrPayorData.length > 1)
objPayor = new Payor(Integer.parseInt(arrPayorData[0]), arrPayorData[1]);
return objPayor;
}
};
}
It can return null; transformers are not allowed to do that.
Turn on DEBUG logging and follow the message flow to see which component is at fault.
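For example, a minimal reworking of that bean (same channels and types as your config; throwing on malformed input is just one option, you could also filter such lines out before the transformer or route them to an error flow):
@Bean
@Transformer(inputChannel = "payorRawStringChannel", outputChannel = "payorRawObjectChannel")
public GenericTransformer<String, Payor> transformPayorStringToObject() {
    return strPayor -> {
        String[] arrPayorData = strPayor.split(",");
        if (arrPayorData.length < 2) {
            // A transformer must not return null; fail fast (or route to an error flow) instead.
            throw new IllegalArgumentException("Malformed payor line: " + strPayor);
        }
        return new Payor(Integer.parseInt(arrPayorData[0]), arrPayorData[1]);
    };
}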
package org.springframework.integration.samples.tcpclientserver;
import java.io.UnsupportedEncodingException;
import org.springframework.core.convert.converter.Converter;
/**
* Simple byte array to String converter; allowing the character set
* to be specified.
*
* @author Gary Russell
* @since 2.1
*
*/
public class ByteArrayToStringConverter implements Converter<byte[], String> {
private String charSet = "UTF-8";
public String convert(byte[] bytes) {
try {
return new String(bytes, this.charSet);
} catch (UnsupportedEncodingException e) {
e.printStackTrace();
return new String(bytes);
}
}
/**
* @return the charSet
*/
public String getCharSet() {
return charSet;
}
/**
* @param charSet the charSet to set
*/
public void setCharSet(String charSet) {
this.charSet = charSet;
}
}

Set Request Not Working SNMP4j

I have searched everywhere but I haven't found the solution to my problem. I've tried all the suggested fixes, changing the community name and using different set classes, and I still can't get the SET to work with SNMP4J. The code executes fine and the agent can be queried from the iReasoning MIB browser, but the value of the OID does not change after the set executes. The SNMP classes and tester classes are below.
SNMPManager code:
public class SNMPManager {
Snmp snmp = null;
String address = null;
/**
* Constructor
*
* @param add
*/
public SNMPManager(String add) {
address = add;
}
public static void main(String[] args) throws IOException {
/**
* Port 161 is used for Read and Other operations Port 162 is used for the trap
* generation
*/
SNMPManager client = new SNMPManager("udp:127.0.0.1/161");
client.start();
/**
* OID - .1.3.6.1.2.1.1.1.0 => SysDescr OID - .1.3.6.1.2.1.1.5.0 => SysName => MIB
* explorer will be useful here, as discussed in the previous article
*/
String sysDescr = client.getAsString(new OID(".1.3.6.1.2.1.1.1.0"));
System.out.println(sysDescr);
}
/**
* Start the Snmp session. If you forget the listen() method you will not get
* any answers because the communication is asynchronous and the listen() method
* listens for answers.
*
* @throws IOException
*/
public void start() throws IOException {
TransportMapping<?> transport = new DefaultUdpTransportMapping();
snmp = new Snmp(transport);
// Do not forget this line!
transport.listen();
}
/**
* Method which takes a single OID and returns the response from the agent as a
* String.
*
* @param oid
* @return
* @throws IOException
*/
public String getAsString(OID oid) throws IOException {
ResponseEvent event = get(new OID[] { oid });
return event.getResponse().get(0).getVariable().toString();
}
/**
* This method is capable of handling multiple OIDs
*
* @param oids
* @return
* @throws IOException
*/
public ResponseEvent get(OID oids[]) throws IOException {
PDU pdu = new PDU();
for (OID oid : oids) {
pdu.add(new VariableBinding(oid));
}
pdu.setType(PDU.GET);
ResponseEvent event = snmp.send(pdu, getTarget(), null);
if (event != null) {
return event;
}
throw new RuntimeException("GET timed out");
}
public ResponseEvent set(OID oid, String val) throws IOException {
PDU pdu = new PDU();
VariableBinding varBind = new VariableBinding(oid, new OctetString(val));
pdu.add(varBind);
pdu.setType(PDU.SET);
//pdu.setRequestID(new Integer32(1));
Target target = getTarget();
ResponseEvent event = snmp.set(pdu, target);
if (event != null) {
System.out.println("\nResponse:\nGot Snmp Set Response from Agent");
System.out.println("Snmp Set Request = " + event.getRequest().getVariableBindings());
PDU responsePDU = event.getResponse();
System.out.println("\nresponsePDU = " + responsePDU);
if (responsePDU != null) {
int errorStatus = responsePDU.getErrorStatus();
int errorIndex = responsePDU.getErrorIndex();
String errorStatusText = responsePDU.getErrorStatusText();
System.out.println("\nresponsePDU = " + responsePDU);
if (errorStatus == PDU.noError) {
System.out.println("Snmp Set Response = " + responsePDU.getVariableBindings());
} else {
System.out.println("errorStatus = " + responsePDU);
System.out.println("Error: Request Failed");
System.out.println("Error Status = " + errorStatus);
System.out.println("Error Index = " + errorIndex);
System.out.println("Error Status Text = " + errorStatusText);
}
}
return event;
}
throw new RuntimeException("SET timed out");
}
/**
* This method returns a Target, which contains information about where the data
* should be fetched and how.
*
* @return
*/
private Target getTarget() {
Address targetAddress = GenericAddress.parse(address);
CommunityTarget target = new CommunityTarget();
target.setCommunity(new OctetString("public"));
target.setAddress(targetAddress);
target.setRetries(2);
target.setTimeout(1500);
target.setVersion(SnmpConstants.version2c);
return target;
}
}
SNMPAgent code:
public class SNMPAgent extends BaseAgent {
private String address;
/**
*
* @param address
* @throws IOException
*/
public SNMPAgent(String address) throws IOException {
/**
* Creates a base agent with boot-counter, config file, and a CommandProcessor
* for processing SNMP requests. Parameters: "bootCounterFile" - a file with
* serialized boot-counter information (read/write). If the file does not exist
* it is created on shutdown of the agent. "configFile" - a file with serialized
* configuration information (read/write). If the file does not exist it is
* created on shutdown of the agent. "commandProcessor" - the CommandProcessor
* instance that handles the SNMP requests.
*/
super(new File("conf.agent"), new File("bootCounter.agent"),
new CommandProcessor(new OctetString(MPv3.createLocalEngineID())));
this.address = address;
}
/**
* Adds community to security name mappings needed for SNMPv1 and SNMPv2c.
*/
@Override
protected void addCommunities(SnmpCommunityMIB communityMIB) {
Variable[] com2sec = new Variable[] { new OctetString("public"), new OctetString("cpublic"), // security name
getAgent().getContextEngineID(), // local engine ID
new OctetString("public"), // default context name
new OctetString(), // transport tag
new Integer32(StorageType.nonVolatile), // storage type
new Integer32(RowStatus.active) // row status
};
MOTableRow<?> row = communityMIB.getSnmpCommunityEntry()
.createRow(new OctetString("public2public").toSubIndex(true), com2sec);
communityMIB.getSnmpCommunityEntry().addRow((SnmpCommunityEntryRow) row);
}
/**
* Adds initial notification targets and filters.
*/
@Override
protected void addNotificationTargets(SnmpTargetMIB arg0, SnmpNotificationMIB arg1) {
// TODO Auto-generated method stub
}
/**
* Adds all the necessary initial users to the USM.
*/
@Override
protected void addUsmUser(USM arg0) {
// TODO Auto-generated method stub
}
/**
* Adds initial VACM configuration.
*/
@Override
protected void addViews(VacmMIB vacm) {
vacm.addGroup(SecurityModel.SECURITY_MODEL_SNMPv2c, new OctetString("cpublic"), new OctetString("v1v2group"),
StorageType.nonVolatile);
vacm.addAccess(new OctetString("v1v2group"), new OctetString("public"), SecurityModel.SECURITY_MODEL_ANY,
SecurityLevel.NOAUTH_NOPRIV, MutableVACM.VACM_MATCH_EXACT, new OctetString("fullReadView"),
new OctetString("fullWriteView"), new OctetString("fullNotifyView"), StorageType.nonVolatile);
vacm.addViewTreeFamily(new OctetString("fullReadView"), new OID("1.3"), new OctetString(),
VacmMIB.vacmViewIncluded, StorageType.nonVolatile);
}
/**
* Unregister the basic MIB modules from the agent's MOServer.
*/
@Override
protected void unregisterManagedObjects() {
// TODO Auto-generated method stub
}
/**
* Register additional managed objects at the agent's server.
*/
@Override
protected void registerManagedObjects() {
// TODO Auto-generated method stub
}
@SuppressWarnings("unchecked")
protected void initTransportMappings() throws IOException {
transportMappings = new TransportMapping[1];
Address addr = GenericAddress.parse(address);
TransportMapping<?> tm = TransportMappings.getInstance().createTransportMapping(addr);
transportMappings[0] = tm;
}
/**
* Start method invokes some initialization methods needed to start the agent
*
* @throws IOException
*/
public void start() throws IOException {
init();
// This method reads some old config from a file and causes
// unexpected behavior.
// loadConfig(ImportModes.REPLACE_CREATE);
addShutdownHook();
getServer().addContext(new OctetString("public"));
finishInit();
run();
sendColdStartNotification();
}
/**
* Clients can register the MO they need
*/
public void registerManagedObject(ManagedObject mo) {
try {
server.register(mo, null);
} catch (DuplicateRegistrationException ex) {
throw new RuntimeException(ex);
}
}
public void unregisterManagedObject(MOGroup moGroup) {
moGroup.unregisterMOs(server, getContext(moGroup));
}
}
MOCreator code:
public class MOCreator {
public static MOScalar<Variable> createReadOnly(OID oid,Object value ){
return new MOScalar<Variable>(oid,
MOAccessImpl.ACCESS_READ_WRITE,
getVariable(value));
}
private static Variable getVariable(Object value) {
if(value instanceof String) {
return new OctetString((String)value);
}
throw new IllegalArgumentException("Unmanaged Type: " + value.getClass());
}
}
Tester class:
public class TestSNMPAgent {
String siteOIDNumber = "1";
String dataOIDNumber = "1";
String sensorOIDNumber = "1";
OID newOIDdata = new OID("1.3.6.1.4.1.1234.5." + siteOIDNumber + ".2." + dataOIDNumber + "." + sensorOIDNumber + ".0");
OID newOIDdata2 = new OID("1.3.6.1.4.1.1234.5.1.2.1.2.0");
OID newOIDdata3 = new OID("1.3.6.1.4.1.1234.5.1.2.1.3.0");
public static void main(String[] args) throws IOException {
TestSNMPAgent client = new TestSNMPAgent("udp:127.0.0.1/161");
client.init();
}
SNMPAgent agent = null;
/**
* This is the client which we have created earlier
*/
SNMPManager client = null;
String address = null;
/**
* Constructor
*
* @param add
*/
public TestSNMPAgent(String add) {
address = add;
}
/**
* Initiates the testing of the SNMP Agent.
* @throws IOException
*/
private void init() throws IOException {
/*agent = new SNMPAgent("172.21.1.103/2001");*/
agent = new SNMPAgent("172.21.1.103/2010");
agent.start();
// Since BaseAgent registers some MIBs by default we need to unregister
// one before we register our own sysDescr. Normally you would
// override that method and register the MIBs that you need
agent.unregisterManagedObject(agent.getSnmpv2MIB());
//agent.registerManagedObject(MOCreator.createReadOnly(sysDescr,"This Description is set By KGrewe"));
agent.registerManagedObject(MOCreator.createReadOnly(newOIDdata, "50"));
agent.registerManagedObject(MOCreator.createReadOnly(newOIDdata2, "NaN"));
agent.registerManagedObject(MOCreator.createReadOnly(newOIDdata3, "3"));
SNMPManager client1 = new SNMPManager("udp:127.21.1.103/2010");
client1.start();
client1.set(newOIDdata, "30");
//System.out.println(newOIDdata);
// Get back Value which is set
//System.out.println(client1.getAsString(newOIDdata));
while(true) {
}
}
}
Thanks for the help in advance.

Spring Batch to read multiple files with same extension

I have a custom reader to read data from a CSV file.
package org.kp.oppr.remediation.batch.csv;
import java.util.Arrays;
import java.util.LinkedHashMap;
import java.util.Map;
import org.apache.commons.lang.StringUtils;
import org.apache.logging.log4j.LogManager;
import org.apache.logging.log4j.Logger;
import org.remediation.batch.csv.FlatFileItemReaderNewLine;
import org.remediation.batch.model.RawItem;
import org.remediation.batch.model.RawItemLineMapper;
import org.springframework.batch.core.ExitStatus;
import org.springframework.batch.core.StepExecution;
import org.springframework.batch.core.StepExecutionListener;
import org.springframework.batch.core.annotation.BeforeStep;
import org.springframework.batch.item.file.LineCallbackHandler;
import org.springframework.batch.item.file.LineMapper;
import org.springframework.batch.item.file.mapping.DefaultLineMapper;
import org.springframework.batch.item.file.mapping.FieldSetMapper;
import org.springframework.batch.item.file.transform.DelimitedLineTokenizer;
import org.springframework.batch.item.file.transform.FieldSet;
import org.springframework.batch.item.file.transform.LineTokenizer;
import org.springframework.core.io.Resource;
import org.springframework.util.Assert;
import org.springframework.validation.BindException;
public class RawItemCsvReader extends MultiResourceItemReader<RawItem>
implements StepExecutionListener, LineCallbackHandler,
FieldSetMapper<RawItem> {
static final Logger LOGGER = LogManager.getLogger(RawItemCsvReader.class);
final private String COLUMN_NAMES_KEY = "COLUMNS_NAMES_KEY";
private StepExecution stepExecution;
private DefaultLineMapper<RawItem> lineMapper;
private String[] columnNames;
private Resource[] resources;
// = DelimitedLineTokenizer.DELIMITER_COMMA;
private char quoteCharacter = DelimitedLineTokenizer.DEFAULT_QUOTE_CHARACTER;
private String delimiter;
public RawItemCsvReader() {
setLinesToSkip(0);
setSkippedLinesCallback(this);
}
@Override
public void afterPropertiesSet() {
// not in constructor to ensure we invoke the override
final DefaultLineMapper<RawItem> lineMapper = new RawItemLineMapper();
setLineMapper(lineMapper);
}
/**
* Satisfies the {@link LineCallbackHandler} contract and acts as the
* {@code skippedLinesCallback}.
*
* @param line
*/
@Override
public void handleLine(String line) {
getLineMapper().setLineTokenizer(getTokenizer());
getLineMapper().setFieldSetMapper(this);
}
private LineTokenizer getTokenizer() {
// this.columnNames = line.split(delimiter);
DelimitedLineTokenizer lineTokenizer = new DelimitedLineTokenizer();
lineTokenizer.setQuoteCharacter(quoteCharacter);
lineTokenizer.setDelimiter(delimiter);
lineTokenizer.setStrict(true);
lineTokenizer.setNames(columnNames);
addColumnNames();
return lineTokenizer;
}
private void addColumnNames() {
stepExecution.getExecutionContext().put(COLUMN_NAMES_KEY, columnNames);
}
@Override
public void setResources(Resource[] resources) {
this.resources = resources;
super.setResources(resources);
}
/**
* Provides access to an otherwise hidden field in the parent class. We need this
* because we have to reconfigure the {@link LineMapper} based on file
* contents.
*
* @param lineMapper
*/
@Override
public void setLineMapper(LineMapper<RawItem> lineMapper) {
if (!(lineMapper instanceof DefaultLineMapper)) {
throw new IllegalArgumentException(
"Must specify a DefaultLineMapper");
}
this.lineMapper = (DefaultLineMapper) lineMapper;
super.setLineMapper(lineMapper);
}
private DefaultLineMapper getLineMapper() {
return this.lineMapper;
}
/**
* Satisfies {@link FieldSetMapper} contract.
*
* @param fs
* @return
* @throws BindException
*/
@Override
public RawItem mapFieldSet(FieldSet fs) throws BindException {
if (fs == null) {
return null;
}
Map<String, String> record = new LinkedHashMap<String, String>();
for (String columnName : this.columnNames) {
record.put(columnName,
StringUtils.trimToNull(fs.readString(columnName)));
}
RawItem item = new RawItem();
item.setResource(resources);
item.setRecord(record);
return item;
}
@BeforeStep
public void saveStepExecution(StepExecution stepExecution) {
this.stepExecution = stepExecution;
}
@Override
public void beforeStep(StepExecution stepExecution) {
//LOGGER.info("Start Raw Read Step for " + itemResource.getFilename());
}
@Override
public ExitStatus afterStep(StepExecution stepExecution) {
LOGGER.info("End Raw Read Step for lines read: " + stepExecution.getReadCount()
+ " lines skipped: " + stepExecution.getReadSkipCount());
/*
LOGGER.info("End Raw Read Step for " + itemResource.getFilename()
+ " lines read: " + stepExecution.getReadCount()
+ " lines skipped: " + stepExecution.getReadSkipCount());
*/
return ExitStatus.COMPLETED;
}
public void setDelimiter(String delimiter) {
this.delimiter = delimiter;
}
public void setQuoteCharacter(char quoteCharacter) {
this.quoteCharacter = quoteCharacter;
}
public String[] getColumnNames() {
return columnNames;
}
public void setColumnNames(String[] columnNames) {
this.columnNames = columnNames;
}
public String getDelimiter() {
return delimiter;
}
}
I want to use a MultiResourceItemReader along with this class to read multiple files with the same extension, and I am using the Spring MultiResourceItemReader to do the job. I need to know how to configure the private ResourceAwareItemReaderItemStream delegate; instance for this class:
package org.kp.oppr.remediation.batch.csv;
import java.util.Arrays;
import java.util.Comparator;
import org.apache.commons.logging.Log;
import org.apache.commons.logging.LogFactory;
import org.apache.logging.log4j.LogManager;
import org.apache.logging.log4j.Logger;
import org.springframework.batch.item.ExecutionContext;
import org.springframework.batch.item.ItemReader;
import org.springframework.batch.item.ItemStream;
import org.springframework.batch.item.ItemStreamException;
import org.springframework.batch.item.ParseException;
import org.springframework.batch.item.UnexpectedInputException;
import org.springframework.batch.item.file.LineCallbackHandler;
import org.springframework.batch.item.file.LineMapper;
import org.springframework.batch.item.file.ResourceAwareItemReaderItemStream;
import org.springframework.batch.item.util.ExecutionContextUserSupport;
import org.springframework.beans.factory.InitializingBean;
import org.springframework.core.io.Resource;
import org.springframework.util.Assert;
import org.springframework.util.ClassUtils;
public class MultiResourceItemReader <T> implements ItemReader<T>, ItemStream, InitializingBean,ResourceAwareItemReaderItemStream<T> {
static final Logger LOGGER = LogManager
.getLogger(MultipleFlatFileItemReaderNewLine.class);
private final ExecutionContextUserSupport executionContextUserSupport = new ExecutionContextUserSupport();
private ResourceAwareItemReaderItemStream<? extends T> delegate;
private Resource[] resources;
private MultiResourceIndex index = new MultiResourceIndex();
private boolean saveState = true;
// signals there are no resources to read -> just return null on first read
private boolean noInput;
private LineMapper<T> lineMapper;
private int linesToSkip = 0;
private LineCallbackHandler skippedLinesCallback;
private Comparator<Resource> comparator = new Comparator<Resource>() {
/**
* Compares resource filenames.
*/
public int compare(Resource r1, Resource r2) {
return r1.getFilename().compareTo(r2.getFilename());
}
};
public MultiResourceItemReader() {
executionContextUserSupport.setName(ClassUtils.getShortName(MultiResourceItemReader.class));
}
/**
* @param skippedLinesCallback
* will be called for each one of the initial skipped lines
* before any items are read.
*/
public void setSkippedLinesCallback(LineCallbackHandler skippedLinesCallback) {
this.skippedLinesCallback = skippedLinesCallback;
}
/**
* Public setter for the number of lines to skip at the start of a file. Can
* be used if the file contains a header without useful (column name)
* information, and without a comment delimiter at the beginning of the
* lines.
*
* @param linesToSkip
* the number of lines to skip
*/
public void setLinesToSkip(int linesToSkip) {
this.linesToSkip = linesToSkip;
}
/**
* Setter for line mapper. This property is required to be set.
*
* @param lineMapper
* maps line to item
*/
public void setLineMapper(LineMapper<T> lineMapper) {
this.lineMapper = lineMapper;
}
/**
* Reads the next item, jumping to next resource if necessary.
*/
public T read() throws Exception, UnexpectedInputException, ParseException {
if (noInput) {
return null;
}
T item;
item = readNextItem();
index.incrementItemCount();
return item;
}
/**
* Use the delegate to read the next item, jump to next resource if current
* one is exhausted. Items are appended to the buffer.
* @return next item from input
*/
private T readNextItem() throws Exception {
T item = delegate.read();
while (item == null) {
index.incrementResourceCount();
if (index.currentResource >= resources.length) {
return null;
}
delegate.close();
delegate.setResource(resources[index.currentResource]);
delegate.open(new ExecutionContext());
item = delegate.read();
}
return item;
}
/**
* Close the {@link #setDelegate(ResourceAwareItemReaderItemStream)} reader
* and reset instance variable values.
*/
public void close() throws ItemStreamException {
index = new MultiResourceIndex();
delegate.close();
noInput = false;
}
/**
* Figure out which resource to start with in case of restart, open the
* delegate and restore delegate's position in the resource.
*/
public void open(ExecutionContext executionContext) throws ItemStreamException {
Assert.notNull(resources, "Resources must be set");
noInput = false;
if (resources.length == 0) {
LOGGER.warn("No resources to read");
noInput = true;
return;
}
Arrays.sort(resources, comparator);
for(int i =0; i < resources.length; i++)
{
LOGGER.info("Resources after Sorting" + resources[i]);
}
index.open(executionContext);
delegate.setResource(resources[index.currentResource]);
delegate.open(new ExecutionContext());
try {
for (int i = 0; i < index.currentItem; i++) {
delegate.read();
}
}
catch (Exception e) {
throw new ItemStreamException("Could not restore position on restart", e);
}
}
/**
* Store the current resource index and position in the resource.
*/
public void update(ExecutionContext executionContext) throws ItemStreamException {
if (saveState) {
index.update(executionContext);
}
}
/**
* @param delegate reads items from single {@link Resource}.
*/
public void setDelegate(ResourceAwareItemReaderItemStream<? extends T> delegate) {
this.delegate = delegate;
}
/**
* Set the boolean indicating whether or not state should be saved in the
* provided {@link ExecutionContext} during the {@link ItemStream} call to
* update.
*
* @param saveState
*/
public void setSaveState(boolean saveState) {
this.saveState = saveState;
}
/**
* @param comparator used to order the injected resources, by default
* compares {@link Resource#getFilename()} values.
*/
public void setComparator(Comparator<Resource> comparator) {
this.comparator = comparator;
}
/**
* @param resources input resources
*/
public void setResources(Resource[] resources) {
this.resources = resources;
}
/**
* Facilitates keeping track of the position within multi-resource input.
*/
private class MultiResourceIndex {
private static final String RESOURCE_KEY = "resourceIndex";
private static final String ITEM_KEY = "itemIndex";
private int currentResource = 0;
private int markedResource = 0;
private int currentItem = 0;
private int markedItem = 0;
public void incrementItemCount() {
currentItem++;
}
public void incrementResourceCount() {
currentResource++;
currentItem = 0;
}
public void mark() {
markedResource = currentResource;
markedItem = currentItem;
}
public void reset() {
currentResource = markedResource;
currentItem = markedItem;
}
public void open(ExecutionContext ctx) {
if (ctx.containsKey(executionContextUserSupport.getKey(RESOURCE_KEY))) {
currentResource = ctx.getInt(executionContextUserSupport.getKey(RESOURCE_KEY));
}
if (ctx.containsKey(executionContextUserSupport.getKey(ITEM_KEY))) {
currentItem = ctx.getInt(executionContextUserSupport.getKey(ITEM_KEY));
}
}
public void update(ExecutionContext ctx) {
ctx.putInt(executionContextUserSupport.getKey(RESOURCE_KEY), index.currentResource);
ctx.putInt(executionContextUserSupport.getKey(ITEM_KEY), index.currentItem);
}
}
@Override
public void afterPropertiesSet() throws Exception {
// TODO Auto-generated method stub
}
@Override
public void setResource(Resource resource) {
// TODO Auto-generated method stub
}
}
Configuration Files for Spring is :
<batch:step id="readFromCSVFileAndUploadToDB" next="stepMovePdwFile">
<batch:tasklet transaction-manager="transactionManager">
<batch:chunk reader="multiResourceReader" writer="rawItemDatabaseWriter"
commit-interval="500" skip-policy="pdwUploadSkipPolicy" />
</batch:tasklet>
</batch:step>
<bean id="multiResourceReader"
class="org.springframework.batch.item.file.MultiResourceItemReader" scope="step">
<property name="resource" value="file:#{jobParameters[filePath]}/*.dat" />
<property name="delegate" ref="rawItemCsvReader"></property>
</bean>
<bean id="rawItemCsvReader" class="org.kp.oppr.remediation.batch.csv.RawItemCsvReader"
scope="step">
<property name="resources" value="file:#{jobParameters[filePath]}/*.dat" />
<property name="columnNames" value="${columnNames}" />
<property name="delimiter" value="${delimiter}" />
</bean>
Use a standard FlatFileItemReader (properly configured via XML) instead of your RawItemCsvReader as the delegate.
This solves your problem because FlatFileItemReader implements ResourceAwareItemReaderItemStream, which is exactly the delegate type MultiResourceItemReader expects.
Remember: Spring Batch is heavily based on delegation; writing a class like your reader is rarely necessary.
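A minimal sketch of that wiring in XML (bean ids and property placeholders reuse the names from the question; RawItemFieldSetMapper is a hypothetical class that would hold just the mapFieldSet logic from the reader above):
<bean id="multiResourceReader"
      class="org.springframework.batch.item.file.MultiResourceItemReader" scope="step">
    <property name="resources" value="file:#{jobParameters[filePath]}/*.dat" />
    <property name="delegate" ref="rawItemFileReader" />
</bean>
<bean id="rawItemFileReader"
      class="org.springframework.batch.item.file.FlatFileItemReader" scope="step">
    <property name="lineMapper">
        <bean class="org.springframework.batch.item.file.mapping.DefaultLineMapper">
            <property name="lineTokenizer">
                <bean class="org.springframework.batch.item.file.transform.DelimitedLineTokenizer">
                    <property name="delimiter" value="${delimiter}" />
                    <property name="names" value="${columnNames}" />
                </bean>
            </property>
            <property name="fieldSetMapper">
                <!-- hypothetical: maps a FieldSet to a RawItem, as in mapFieldSet above -->
                <bean class="org.kp.oppr.remediation.batch.csv.RawItemFieldSetMapper" />
            </property>
        </bean>
    </property>
</bean>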

JavaFX: ObservableMap keySet as an ObservableSet

I want to transform an ObservableMap's keySet into a read-only ObservableSet. I don't want to copy the values; any modification to the ObservableMap must affect the observable keySet. If I bind another set to the observable key set's content, its content should be updated automatically.
This is what I would like to write:
ObservableMap<String, Object> map = FXCollections.observableHashMap();
ObservableSet<String> keySet = FXCollections.observableKeySet(map);
Set<String> boundSet = new HashSet<String>();
Bindings.bindContent(boundSet, keySet);
map.put("v", new Object());
assert boundSet.contains("v");
Is this functionality available in the JavaFX SDK?
The feature you request does not need a special ObservableSet. It’s already part of the Map interface contract:
ObservableMap<String, Object> map = FXCollections.observableHashMap();
Set<String> keySet = map.keySet();
map.put("v", new Object());
assert keySet.contains("v");
A Map’s keyset always reflects the changes made to the backing map.
http://docs.oracle.com/javase/8/docs/api/java/util/Map.html#keySet--
Returns a Set view of the keys contained in this map. The set is backed by the map, so changes to the map are reflected in the set, and vice-versa.
As far as I know, there's no built-in way to do it.
Here's a utility method that I made to handle this:
/**
* Builds and returns an observable keyset of the specified map. Note that the resulting
* keyset is a new set that is guaranteed to have the same items as the map's true
* keyset, but does not make any guarantees about iteration order or implementation
* details of the keyset (for example, if the Map is a SortedMap, the keyset
* will not necessarily maintain keys in sorted order).
* @param <K> Map's key type
* @param <V> Map's value type
* @param map the ObservableMap to which the set should be bound
* @return a new observable set reflecting the map's keys
*/
public static <K, V> ObservableSet<K> getObservableKeySet(ObservableMap<K, V> map) {
ObservableSet<K> set = FXCollections.observableSet(new HashSet<>());
map.addListener(
(MapChangeListener<K, V>)(
change -> {
if(change.wasAdded() && !change.wasRemoved()) {
set.add(change.getKey());
}
if(change.wasRemoved() && !change.wasAdded()) {
set.remove(change.getKey());
}
//Note that if change was added and removed, that means that
//the key was unchanged and a value was just replaced. That
//shouldn't affect the keyset so we do nothing
})
);
return set;
}
Update:
I decided that I didn't like how the iteration order of the resulting set wouldn't match the original map's iteration order, so instead I made a new class that acts as an observable wrapper around a map's keyset. This is more like Map's built-in keySet() method where it returns a view of the actual set. This just adds in the listeners that make it observable:
/**
* Observable view of an ObservableMap's keyset
*/
public class ObservableKeySet<K, V> implements ObservableSet<K> {
/**
* The map's actual keyset object that gets wrapped
*/
private Set<K> wrappedSet;
/**
* Invalidation listeners to be notified when the set changes. Note that we end
* up calling these more than we should since invalidation listeners should only
* be called if the observable value is observed between changes and we're going
* to call them on every change. However, the ObservableSet returned from
* FXCollections.observableSet actually also does that too, so I feel
* like we can get away with it.
*/
private Collection<InvalidationListener> invalidationListeners = new ArrayList<>();
/**
* Change listeners to be notified when the set changes
*/
private Collection<SetChangeListener<? super K>> changeListeners = new ArrayList<>();
/**
* Creates an Observable Set view of the specified map's keyset
* @param map ObservableMap
*/
public ObservableKeySet(ObservableMap<K, V> map) {
this.wrappedSet = map.keySet();
map.addListener((MapChangeListener<K,V>)this::onMapChange);
}
/**
* Code to be executed on any map change. It will determine if there is a resulting
* set change and trigger listeners as appropriate
* @param change
*/
private void onMapChange(MapChangeListener.Change<? extends K, ? extends V> change) {
SetChangeListener.Change<K> setChange = null;
//Note that if the map change says that there was an add and removal, then
//that means a value was getting replaced, which doesn't result in a keySet
//change
if(change.wasAdded() && ! change.wasRemoved()) {
setChange = new BasicSetChange(true, change.getKey());
}
else if(change.wasRemoved() && ! change.wasAdded()) {
setChange = new BasicSetChange(false, change.getKey());
}
if(setChange != null) {
invalidationListeners.forEach(listener -> listener.invalidated(this));
final SetChangeListener.Change<K> finalChange = setChange;
changeListeners.forEach(listener -> listener.onChanged(finalChange));
}
}
@Override
public void addListener(InvalidationListener listener) {
invalidationListeners.add(listener);
}
@Override
public void removeListener(InvalidationListener listener) {
invalidationListeners.remove(listener);
}
@Override
public void addListener(SetChangeListener<? super K> listener) {
changeListeners.add(listener);
}
@Override
public void removeListener(SetChangeListener<? super K> listener) {
changeListeners.remove(listener);
}
//Simple wrapper methods that either just pass through to the wrapped set or
//throw an UnsupportedOperationException
@Override public int size() {return wrappedSet.size();}
@Override public boolean isEmpty() {return wrappedSet.isEmpty();}
@Override public boolean contains(Object o) {return wrappedSet.contains(o);}
@Override public Iterator<K> iterator() {return wrappedSet.iterator();}
@Override public Object[] toArray() {return wrappedSet.toArray();}
@Override public <T> T[] toArray(T[] a) {return wrappedSet.toArray(a);}
@Override public boolean containsAll(Collection<?> c) {return wrappedSet.containsAll(c);}
@Override public boolean add(K e) {throw new UnsupportedOperationException();}
@Override public boolean remove(Object o) {throw new UnsupportedOperationException();}
@Override public boolean addAll(Collection<? extends K> c) {throw new UnsupportedOperationException();}
@Override public boolean retainAll(Collection<?> c) {throw new UnsupportedOperationException();}
@Override public boolean removeAll(Collection<?> c) {throw new UnsupportedOperationException();}
@Override public void clear() {throw new UnsupportedOperationException();}
/**
* Simple implementation of {@link SetChangeListener.Change}
*/
private class BasicSetChange extends SetChangeListener.Change<K> {
/** If true, it is an add change, otherwise it is a remove change*/
private final boolean isAdd;
/** Value that was added or removed */
private final K value;
/**
* @param isAdd {@link #isAdd}
* @param value {@link #value}
*/
public BasicSetChange(boolean isAdd, K value) {
super(ObservableKeySet.this);
this.isAdd = isAdd;
this.value = value;
}
@Override
public boolean wasAdded() {
return isAdd;
}
@Override
public boolean wasRemoved() {
return !isAdd;
}
@Override
public K getElementAdded() {
return isAdd ? value : null;
}
@Override
public K getElementRemoved() {
return isAdd ? null : value;
}
}
}
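A short usage sketch of the wrapper above (the printed messages are illustrative):
ObservableMap<String, Object> map = FXCollections.observableHashMap();
ObservableSet<String> keys = new ObservableKeySet<>(map);

// React to keyset changes driven by the underlying map
keys.addListener((SetChangeListener<String>) change -> {
    if (change.wasAdded()) {
        System.out.println("key added: " + change.getElementAdded());
    } else {
        System.out.println("key removed: " + change.getElementRemoved());
    }
});

map.put("v", new Object());   // prints "key added: v"
map.remove("v");              // prints "key removed: v"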

Reading uncommitted generated IDs in Spring / JPA

I am trying to retrieve the generated ID of a newly created Entity within a transaction, but when I try to read the ID value it is null. I assume this is because the transaction has not yet been committed and the Entity's ID has yet to be created.
I am using Spring MVC with transactions (@Transactional on my service), and JPA for the data layer. I'm not an expert in transaction management, so I'm not even sure if this is possible. This is example code executing in the presentation layer (Spring portlet MVC):
Long parentId = getParentId();
Folder parentFolder = linksService.getItem(parentId, Folder.class);
Folder newFolder;
newFolder = new Folder();
newFolder.setName("new folder");
newFolder.setParent(parentFolder);
parentFolder.addItem(newFolder);
linksService.saveItem(parentFolder); // this calls entityManager.merge(parentFolder)
// this returns null
Long itemId = newFolder.getItemId();
EDIT:
Here are the entities. I am using Oracle db.
@Entity
@Table(name = "LINK_ITEM")
@DiscriminatorColumn(name = "ITEM_TYPE")
public abstract class Item {
/**
* The Id of this item
*/
@Id
@TableGenerator(name = "table_gen", allocationSize = 1)
@GeneratedValue(strategy = GenerationType.TABLE,
generator = "table_gen")
@Column(name = "ITEM_ID")
private Long itemId;
/**
* The name of this item
*/
private String name;
/**
* the parent item of this item
*/
@ManyToOne
@JoinColumn(name="PARENT_ID")
private Item parent;
/**
* The user ID that owns this item
*/
private String owner;
/**
* @return Returns the itemId.
*/
public Long getItemId() {
return itemId;
}
/**
* @param itemId
* The itemId to set.
*/
public void setItemId(Long itemId) {
this.itemId = itemId;
}
/**
* @return Returns the name.
*/
public String getName() {
return name;
}
/**
* @param name
* The name to set.
*/
public void setName(String name) {
this.name = name;
}
/**
* @return Returns the owner.
*/
public String getOwner() {
return owner;
}
/**
* @param owner
* The owner to set.
*/
public void setOwner(String owner) {
this.owner = owner;
}
/**
* @return Returns the parent.
*/
public Item getParent() {
return parent;
}
/**
* @param parent
* The parent to set.
*/
public void setParent(Item parent) {
this.parent = parent;
}
/**
* Returns the depth of this object in the folder tree. 0 is the top folder,
* 1 is one level down, etc.
*
* @return Returns the depth.
*/
@Transient
public long getDepth() {
long i = 0;
for (Item item = this; item.getParent() != null; item = item
.getParent()) {
++i;
}
return i;
}
/**
* Changes the parent folder of this item and updates links / children
* appropriately.
*
* @param parentFolder
*/
public void updateParent(Folder parentFolder) {
removeFromParent();
parentFolder.addItem(this);
}
/**
* Removes this item from its parent folder, if it has one.
*/
public void removeFromParent() {
if (getParent() != null) {
((Folder) getParent()).removeItem(this);
}
}
public void moveUp() {
if (getParent() == null) {
return;
}
Folder parent = (Folder) getParent();
List<Item> siblings = parent.getChildren();
int index = siblings.indexOf(this);
if (index > 0) {
Item previousItem = siblings.get(index - 1);
siblings.set(index, previousItem);
siblings.set(index - 1, this);
}
}
public void moveDown() {
if (getParent() == null) {
return;
}
Folder parent = (Folder) getParent();
List<Item> siblings = parent.getChildren();
int index = siblings.indexOf(this);
int numItems = siblings.size();
if ((numItems > 1) && (index < (numItems - 1))) {
Item nextItem = (Item) siblings.get(index + 1);
siblings.set(index, nextItem);
siblings.set(index + 1, this);
}
}
/**
* Returns the String representation of this Item.
*/
@Override
public String toString() {
return "itemId=" + this.getItemId() + "; name=" + this.getName()
+ "; owner=" + this.getOwner();
}
}
@Entity
@DiscriminatorValue("F")
public class Folder extends Item {
@OneToMany(cascade = CascadeType.ALL, orphanRemoval = true)
@JoinColumn(name="PARENT_ID", referencedColumnName="ITEM_ID")
private List<Item> children = new ArrayList<Item>();
@Transient
private String path;
@Transient
private boolean open;
@Transient
private Collection<Link> orderedLinks;
/**
* @return Returns the children.
*/
public List<Item> getChildren() {
return children;
}
/**
* @param children
* The children to set.
*/
public void setChildren(List<Item> children) {
this.children = children;
}
/**
* Changes the parent folder of this item and updates links / children
* appropriately.
*
* @param parentFolder
*/
public void updateParent(Folder parentFolder) {
super.updateParent(parentFolder);
// update the path since the parent folder has changed
updatePath();
}
/**
* @param newItem
*/
public void addItem(Item newItem) {
newItem.setParent(this);
getChildren().add(newItem);
}
/**
* @param item
*/
public void removeItem(Item item) {
getChildren().remove(item);
item.setParent(null);
}
/**
*
* @param items
*/
public void addItems(List<? extends Item> items) {
for (Item item : items)
addItem(item);
}
/**
*
* @param items
*/
public void removeItems(List<? extends Item> items) {
for (Item item : items)
removeItem(item);
}
/**
* Returns a list of Folder objects that are the subfolders of this folder.
* This folder is also included at the top of the list.
*
* @return
* @throws ServiceException
*/
@Transient
public List<Folder> getFolderList() {
List<Folder> folderList = new ArrayList<Folder>();
buildFolderList(folderList, null);
return folderList;
}
/**
* Returns a list of Folder objects that are the subfolders of this folder.
* This folder is also included at the top of the list. This method will
* exclude the "excludeFolder" and it's subfolders from the list.
*
* @param excludeFolder
* @return
*/
@Transient
public List<Folder> getFolderList(Folder excludeFolder) {
List<Folder> folderList = new ArrayList<Folder>();
buildFolderList(folderList, excludeFolder);
return folderList;
}
/**
* Returns a recursive list of the parent folder of this folder. Includes
* this folder in the list.
*
* @return
*/
@Transient
public List<Folder> getParentList() {
List<Folder> parentList = new ArrayList<Folder>();
Folder currentFolder = this;
parentList.add(currentFolder);
while (currentFolder.getParent() != null) {
currentFolder = (Folder) currentFolder.getParent();
parentList.add(currentFolder);
}
// reverse the ordering
Collections.reverse(parentList);
return parentList;
}
/**
* Private method called recursively to build a list of Folders and
* subfolders of the parentFolder.
*
* @param folderList
* @param parentFolder
* @param excludeFolder
*/
private void buildFolderList(List<Folder> folderList, Folder excludeFolder) {
// Don't add the exclude folder to the list
if (excludeFolder != null && this.equals(excludeFolder)) {
return;
}
folderList.add(this);
if (!isFolderEmpty()) {
for (Item item : getChildren()) {
if (item instanceof Folder) {
((Folder) item).buildFolderList(folderList, excludeFolder);
}
}
}
}
/**
* @return Returns the folderEmpty.
*/
@Transient
public boolean isFolderEmpty() {
return children == null || children.isEmpty() || children.size() == 0;
}
/**
*
*/
private void updatePath() {
StringBuffer strBuffer = new StringBuffer("");
strBuffer.append(getName());
Item parent = getParent();
while (parent != null) {
strBuffer.insert(0, parent.getName() + " > ");
parent = parent.getParent();
}
this.path = strBuffer.toString();
}
/**
* @return Returns the path of this folder.
*/
public String getPath() {
if (this.path == null || this.path.length() == 0)
updatePath();
return this.path;
}
/**
* @param path
* The path to set.
*/
protected void setPath(String path) {
this.path = path;
}
public Item find(Long itemId) {
if (itemId.equals(getItemId()))
return this;
Item item = null;
List<Item> children = getChildren();
for (Item currentItem : children) {
if (currentItem.getItemId().equals(itemId)) {
item = currentItem;
break;
} else if (currentItem instanceof Folder) {
item = ((Folder) currentItem).find(itemId);
if (item != null)
break;
}
}
return item;
}
/**
* Returns the String representation of this Folder.
*/
@Override
public String toString() {
return super.toString() + "; path=" + this.getPath();
}
/**
*
* #return a list of Link objects that this Folder holds.
*/
@Transient
public List<Link> getLinks() {
List<Item> children = getChildren();
List<Link> links = new ArrayList<Link>(children.size()
- (children.size() / 2));
for (Item item : children) {
if (item instanceof Link) {
links.add((Link) item);
}
}
return links;
}
/**
* Returns the child Folders of this Folder and their child Folders, etc.
*
* @return
*/
@Transient
public List<Folder> getChildFolders() {
List<Folder> folderList = new ArrayList<Folder>();
buildFolderList(folderList, null);
folderList.remove(this);
return folderList;
}
public boolean isOpen() {
return open;
}
@Transient
public boolean isClosed() {
return !open;
}
public void setOpen(boolean open) {
this.open = open;
}
public Collection<Link> getOrderedLinks() {
return orderedLinks;
}
public void setOrderedLinks(Collection<Link> orderedLinks) {
this.orderedLinks = orderedLinks;
}
/*
* (non-Javadoc)
*
* @see java.lang.Object#equals(java.lang.Object)
*/
@Override
public boolean equals(Object obj) {
if (this == obj) {
return true;
}
if (obj == null || (obj.getClass() != this.getClass())) {
return false;
}
Folder folder = (Folder) obj;
return this.getItemId() != null && this.getItemId().equals(folder.getItemId());
}
/*
* (non-Javadoc)
*
* @see java.lang.Object#hashCode()
*/
@Override
public int hashCode() {
int var = (int) (this.getItemId().longValue() ^ (this.getItemId()
.longValue() >>> 32));
int hash = 7;
hash = 31 * hash + var;
return hash;
}
}
@Entity
@DiscriminatorValue("L")
public class Link extends Item {
private String url;
public Link() {
}
public Link(String url) {
this.url = url;
}
/**
* @return Returns the url.
*/
public String getUrl() {
return url;
}
/**
* @param url
* The url to set.
*/
public void setUrl(String url) {
if (url != null && url.indexOf(":/") == -1)
url = "http://" + url;
this.url = url;
}
}
The Controller is calling the DAO, which calls entityManager.merge() (and I tried adding entityManager.flush()). I also use OpenEntityManagerInViewInterceptor.
Since you are using TABLE sequencing, the id will be assigned on your persist() call. Are you detaching or serializing the objects? You may need to return the id assigned in persist() to your client.
Try
entityManager.flush();
after saving the new Entity. This will force the JPA provider to synchronize your persistence context with the underlying database and as part of this any ids will be generated.
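A minimal sketch of that idea inside the transactional service (method and field names here are illustrative, not from the original code):
import javax.persistence.EntityManager;
import javax.persistence.PersistenceContext;
import org.springframework.stereotype.Service;
import org.springframework.transaction.annotation.Transactional;

@Service
public class LinksService {

    @PersistenceContext
    private EntityManager entityManager;

    @Transactional
    public Long createFolder(Long parentId, String name) {
        Folder parent = entityManager.find(Folder.class, parentId);
        Folder newFolder = new Folder();
        newFolder.setName(name);
        parent.addItem(newFolder);

        // persist() assigns the TABLE-generated id to the new entity;
        // flush() pushes the INSERT to the database within this transaction.
        entityManager.persist(newFolder);
        entityManager.flush();

        // Non-null here even though the transaction has not committed yet;
        // return it so the presentation layer does not rely on a detached copy.
        return newFolder.getItemId();
    }
}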
