Method Call With parameter - methods

This week in my college class we started our chapter on method calls, and I'm having trouble with it. This is the activity we have in class:
Define a method printFeetInchShort, with int parameters numFeet and numInches, that prints using ' and " shorthand. Ex: myPrinter.printFeetInchShort(5, 8) prints:
5' 8"
Hint: Use \" to print a double quote.
This is what I have so far. I don't know if I'm doing it right because I've been having trouble with method calling:
import java.util.Scanner;

public class HeightPrinter {
    public static void printFeetInchShort(int numFeet, int numInches) {
        System.out.println(numFeet + "" + numInches + "\"");

    public static void main(String[] args) {
        HeightPrinter myPrinter = new HeightPrinter();
        // Will be run with (5, 8), then (4, 11)
        myPrinter.printFeetInchShort(5, 8);
        System.out.println("");
    }
}

public static void printFeetInchShort(int numFeet, int numInches) {
    System.out.print(numFeet + "' " + numInches + "\"");
}

You are missing the curly braces to close the printFeetInchShort method.
import java.util.Scanner;

public class HeightPrinter {
    public static void printFeetInchShort(int numFeet, int numInches) {
        System.out.println(numFeet + "' " + numInches + "\"");
    } // <-- this is missing

    public static void main(String[] args) {
        Scanner scnr = new Scanner(System.in);
        int userFeet;
        int userInches;
        userFeet = scnr.nextInt();
        userInches = scnr.nextInt();
        printFeetInchShort(userFeet, userInches); // Will be run with (5, 8), then (4, 11)
    }
}
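For reference, here is a complete version that compiles and produces the expected output for both sample calls (hardcoded here instead of read from a Scanner). Since printFeetInchShort is static, it can be called directly from main without creating a HeightPrinter instance:

```java
public class HeightPrinter {
    public static void printFeetInchShort(int numFeet, int numInches) {
        // \" is the escape sequence for a literal double-quote character
        System.out.println(numFeet + "' " + numInches + "\"");
    }

    public static void main(String[] args) {
        // Static methods need no instance; calling them by name
        // inside the same class is enough.
        printFeetInchShort(5, 8);
        printFeetInchShort(4, 11);
    }
}
```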

Related

Drools decision table - rules not matching

I have a hello-world style Spring/Drools setup. The issue is that no rules fire when, in theory, they should.
Decision Table:
Console output - server startup:
package com.example.drools;
//generated from Decision Table
import com.example.drools.TestRules;
// rule values at B9, header at B4
rule "_9"
when
$test:TestRules(number1 == 10)
then
$test.add("10");
end
Drools Config:
@Configuration
public class DroolsConfiguration
{
    private final static String VALIDATION_RULES = "validation-rules.xls";

    @Bean
    public KieContainer validationRulesKieContainer() {
        KieServices kieServices = KieServices.Factory.get();
        Resource rules = ResourceFactory.newClassPathResource(VALIDATION_RULES);
        compileXlsToDrl(rules);
        KieFileSystem kieFileSystem = kieServices.newKieFileSystem().write(rules);
        KieBuilder kieBuilder = kieServices.newKieBuilder(kieFileSystem);
        KieBuilder builder = kieBuilder.buildAll();
        KieModule kieModule = kieBuilder.getKieModule();
        return kieServices.newKieContainer(kieModule.getReleaseId());
    }

    private static void compileXlsToDrl(Resource resource) {
        try {
            InputStream is = resource.getInputStream();
            SpreadsheetCompiler compiler = new SpreadsheetCompiler();
            String drl = compiler.compile(is, InputType.XLS);
            System.out.println(drl);
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}
Service:
@Service
public class ValidationRulesEngine
{
    @Autowired
    @Qualifier("validationRulesKieContainer")
    private KieContainer validationKieContainer;

    public void validate() {
        KieSession kieSession = validationKieContainer.newKieSession();
        kieSession.addEventListener(new DebugAgendaEventListener());
        kieSession.addEventListener(new DebugRuleRuntimeEventListener());
        TestRules tr = new TestRules(10, 20, 30);
        kieSession.insert(tr);
        int noOfRulesFired = kieSession.fireAllRules();
        System.out.println("noOfRulesFired: " + noOfRulesFired);
        System.out.println(tr);
        System.out.println(tr.getRule());
    }
}
TestRule - Fact:
public class TestRules
{
    public int number1;
    public int number2;
    public int number3;
    public List<String> rules = new ArrayList<String>();

    public TestRules() {}

    public TestRules(int number1, int number2, int number3)
    {
        super();
        this.number1 = number1;
        this.number2 = number2;
        this.number3 = number3;
    }

    public void add(String rule) {
        rules.add(rule);
    }

    public String getRule() {
        return this.rules.size() > 0 ? this.rules.get(0) : "";
    }

    @Override
    public String toString()
    {
        return "TestRules [number1=" + number1 + ", number2=" + number2 + ", number3=" + number3 + ", rules=" +
                rules.stream().map(s -> s.toString()).collect(Collectors.joining(",")) + "]";
    }
}
Console output - result:
2021-07-20 17:02:27.549 ERROR 20212 --- [nio-9016-exec-1] c.g.i.e.p.c.OfficeController : --> Rules Engine
==>[ObjectInsertedEventImpl: getFactHandle()=[fact 0:1:1539328290:1539328290:1:DEFAULT:NON_TRAIT:com.example.drools.TestRules:TestRules [number1=10, number2=20, number3=30, rules=]], getObject()=TestRules [number1=10, number2=20, number3=30, rules=], getKnowledgeRuntime()=KieSession[0], getPropagationContext()=PhreakPropagationContext [entryPoint=EntryPoint::DEFAULT, factHandle=[fact 0:1:1539328290:1539328290:1:DEFAULT:NON_TRAIT:com.example.drools.TestRules:TestRules [number1=10, number2=20, number3=30, rules=]], leftTuple=null, originOffset=-1, propagationNumber=2, rule=null, type=INSERTION]]
noOfRulesFired: 0
TestRules [number1=10, number2=20, number3=30, rules=]
2021-07-20 17:02:28.454 ERROR 20212 --- [nio-9016-exec-1] c.g.i.e.p.c.OfficeController : <-- Rules Engine
What am I missing?
This is no good:
$test:TestRules($test.number1 == 10, $test.number2 == 20)
You can't refer to $test before you declare it. The correct syntax is:
$test: TestRules( number1 == 10, number2 == 20 )
Fix your decision table so that $test.number1 == $param becomes number1 == $param. (Do the same for the adjacent number2 column.)
The rest looks fine, though I would suggest using try-with-resources instead of a try-catch in your XLS parsing method.
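As a sketch of that suggestion (with a stand-in compile step, since SpreadsheetCompiler needs the Drools jars on the classpath), try-with-resources closes the stream automatically even when compilation throws, which the original try-catch never does:

```java
import java.io.ByteArrayInputStream;
import java.io.InputStream;
import java.nio.charset.StandardCharsets;

public class TryWithResourcesSketch {

    // Stand-in for SpreadsheetCompiler.compile(is, InputType.XLS):
    // here it just decodes the bytes so the sketch is self-contained.
    static String compile(InputStream is) throws Exception {
        return new String(is.readAllBytes(), StandardCharsets.UTF_8);
    }

    static String compileXlsToDrl(InputStream resource) {
        // The stream declared in the try header is closed automatically
        // on both the normal and the exceptional exit path.
        try (InputStream is = resource) {
            return compile(is);
        } catch (Exception e) {
            e.printStackTrace();
            return "";
        }
    }

    public static void main(String[] args) {
        InputStream fake = new ByteArrayInputStream(
                "rule \"_9\" ...".getBytes(StandardCharsets.UTF_8));
        System.out.println(compileXlsToDrl(fake));
    }
}
```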

Chronicle Queue V3. Can Entries be lost on data block roll-over?

I have an application that writes entries to a Chronicle Queue (V3) and also retains excerpt entry index values in other (Chronicle)Maps, by way of providing indexed access into the queue. Sometimes we fail to find a given entry that we saved earlier, and I believe it may be related to data-block roll-over.
Below is a stand-alone test program that reproduces such use-cases at small scale. It repeatedly writes an entry and immediately attempts to look the resulting index value up using a separate ExcerptTailer. All is well for a while, until the first data block is used up and a second data file is assigned; then the retrieval failures start. If the data-block size is increased to avoid roll-overs, no entries are lost. Using a small index data-block size, which causes multiple index files to be created, doesn't cause a problem either.
The test program also runs an ExcerptListener in parallel to see if the entries apparently 'lost' by the writer are ever received by the reader thread - they're not. It also re-reads the resulting queue from start to end, which reconfirms that they really are lost.
Stepping through the code, I see that when looking up a 'missing entry' within AbstractVanillarExcerpt#index, it appears to successfully locate the correct VanillaMappedBytes object from the dataCache, but determines that there is no entry at the data offset, as len == 0. In addition to the entries not being found, at some point after the problems start occurring post-roll-over, an NPE is thrown from within VanillaMappedFile#fileChannel because it is passed a null File path. The code path assumes that when an entry is looked up successfully in the index, a file will always be found, but in this case it isn't.
Is it possible to reliably use Chronicle Queue across data-block roll-overs, and if so, what am I doing that maybe causing the problem I'm experiencing?
import java.io.IOException;
import java.util.Collection;
import java.util.HashSet;
import java.util.Iterator;
import java.util.LinkedList;
import java.util.Set;
import org.junit.Before;
import org.junit.Test;
import net.openhft.affinity.AffinitySupport;
import net.openhft.chronicle.Chronicle;
import net.openhft.chronicle.ChronicleQueueBuilder;
import net.openhft.chronicle.ExcerptAppender;
import net.openhft.chronicle.ExcerptCommon;
import net.openhft.chronicle.ExcerptTailer;
import net.openhft.chronicle.VanillaChronicle;
public class ChronicleTests {
private static final int CQ_LEN = VanillaChronicle.Cycle.DAYS.length();
private static final long CQ_ENT = VanillaChronicle.Cycle.DAYS.entries();
private static final String ROOT_DIR = System.getProperty(ChronicleTests.class.getName() + ".ROOT_DIR",
"C:/Temp/chronicle/");
private static final String QDIR = System.getProperty(ChronicleTests.class.getName() + ".QDIR", "chronicleTests");
private static final int DATA_SIZE = Integer
.parseInt(System.getProperty(ChronicleTests.class.getName() + ".DATA_SIZE", "100000"));
// Chunk file size of CQ index
private static final int INDX_SIZE = Integer
.parseInt(System.getProperty(ChronicleTests.class.getName() + ".INDX_SIZE", "10000"));
private static final int Q_ENTRIES = Integer
.parseInt(System.getProperty(ChronicleTests.class.getName() + ".Q_ENTRIES", "5000"));
// Data type id
protected static final byte FSYNC_DATA = 1;
protected static final byte NORMAL_DATA = 0;
protected static final byte TH_START_DATA = -1;
protected static final byte TH_END_DATA = -2;
protected static final byte CQ_START_DATA = -3;
private static final long MAX_RUNTIME_MILLISECONDS = 30000;
private static String PAYLOAD_STRING = "1234567890ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz";
private static byte PAYLOAD_BYTES[] = PAYLOAD_STRING.getBytes();
private Chronicle _chronicle;
private String _cqPath = ROOT_DIR + QDIR;
@Before
public void init() {
buildCQ();
}
@Test
public void test() throws IOException, InterruptedException {
boolean passed = true;
Collection<Long> missingEntries = new LinkedList<Long>();
long sent = 0;
Thread listener = listen();
try {
listener.start();
// Write entries to CQ,
for (int i = 0; i < Q_ENTRIES; i++) {
long entry = writeQEntry(PAYLOAD_BYTES, (i % 100) == 0);
sent++;
// check each entry can be looked up
boolean found = checkEntry(i, entry);
if (!found)
missingEntries.add(entry);
passed &= found;
}
// Wait awhile for the listener
listener.join(MAX_RUNTIME_MILLISECONDS);
if (listener.isAlive())
listener.interrupt();
} finally {
if (listener.isAlive()) { // => exception raised so wait for listener
log("Give listener a chance....");
sleep(MAX_RUNTIME_MILLISECONDS);
listener.interrupt();
}
log("Sent: " + sent + " Received: " + _receivedEntries.size());
// Look for missing entries in receivedEntries
missingEntries.forEach(me -> checkMissingEntry(me));
log("All passed? " + passed);
// Try to find missing entries by searching from the start...
searchFromStartFor(missingEntries);
_chronicle.close();
_chronicle = null;
// Re-initialise CQ and look for missing entries again...
log("Re-initialise");
init();
searchFromStartFor(missingEntries);
}
}
private void buildCQ() {
try {
// build chronicle queue
_chronicle = ChronicleQueueBuilder.vanilla(_cqPath).cycleLength(CQ_LEN).entriesPerCycle(CQ_ENT)
.indexBlockSize(INDX_SIZE).dataBlockSize(DATA_SIZE).build();
} catch (IOException e) {
throw new InitializationException("Failed to initialize Active Trade Store.", e);
}
}
private long writeQEntry(byte dataArray[], boolean fsync) throws IOException {
ExcerptAppender appender = _chronicle.createAppender();
return writeData(appender, dataArray, fsync);
}
private boolean checkEntry(int seqNo, long entry) throws IOException {
ExcerptTailer tailer = _chronicle.createTailer();
if (!tailer.index(entry)) {
log("SeqNo: " + seqNo + " for entry + " + entry + " not found");
return false;
}
boolean isMarker = isMarker(tailer);
boolean isFsyncData = isFsyncData(tailer);
boolean isNormalData = isNormalData(tailer);
String type = isMarker ? "MARKER" : isFsyncData ? "FSYNC" : isNormalData ? "NORMALDATA" : "UNKNOWN";
log("Entry: " + entry + "(" + seqNo + ") is " + type);
return true;
}
private void log(String string) {
System.out.println(string);
}
private void searchFromStartFor(Collection<Long> missingEntries) throws IOException {
Set<Long> foundEntries = new HashSet<Long>(Q_ENTRIES);
ExcerptTailer tailer = _chronicle.createTailer();
tailer.toStart();
while (tailer.nextIndex())
foundEntries.add(tailer.index());
Iterator<Long> iter = missingEntries.iterator();
long foundCount = 0;
while (iter.hasNext()) {
long me = iter.next();
if (foundEntries.contains(me)) {
log("Found missing entry: " + me);
foundCount++;
}
}
log("searchFromStartFor Found: " + foundCount + " of: " + missingEntries.size() + " missing entries");
}
private void checkMissingEntry(long missingEntry) {
if (_receivedEntries.contains(missingEntry))
log("Received missing entry:" + missingEntry);
}
Set<Long> _receivedEntries = new HashSet<Long>(Q_ENTRIES);
private Thread listen() {
Thread returnVal = new Thread("Listener") {
public void run() {
try {
int receivedCount = 0;
ExcerptTailer tailer = _chronicle.createTailer();
tailer.toStart();
while (receivedCount < Q_ENTRIES) {
if (tailer.nextIndex()) {
_receivedEntries.add(tailer.index());
} else {
ChronicleTests.this.sleep(1);
}
}
log("listener complete");
} catch (IOException e) {
log("Interrupted before receiving all entries");
}
}
};
return returnVal;
}
private void sleep(long interval) {
try {
Thread.sleep(interval);
} catch (InterruptedException e) {
// No action required
}
}
protected static final int THREAD_ID_LEN = Integer.SIZE / Byte.SIZE;
protected static final int DATA_TYPE_LEN = Byte.SIZE / Byte.SIZE;
protected static final int TIMESTAMP_LEN = Long.SIZE / Byte.SIZE;
protected static final int CRC_LEN = Long.SIZE / Byte.SIZE;
protected static long writeData(ExcerptAppender appender, byte dataArray[],
boolean fsync) {
appender.startExcerpt(DATA_TYPE_LEN + THREAD_ID_LEN + dataArray.length
+ CRC_LEN);
appender.nextSynchronous(fsync);
if (fsync) {
appender.writeByte(FSYNC_DATA);
} else {
appender.writeByte(NORMAL_DATA);
}
appender.writeInt(AffinitySupport.getThreadId());
appender.write(dataArray);
appender.writeLong(CRCCalculator.calcDataAreaCRC(appender));
appender.finish();
return appender.lastWrittenIndex();
}
protected static boolean isMarker(ExcerptCommon excerpt) {
if (isCqStartMarker(excerpt) || isStartMarker(excerpt) || isEndMarker(excerpt)) {
return true;
}
return false;
}
protected static boolean isCqStartMarker(ExcerptCommon excerpt) {
return isDataTypeMatched(excerpt, CQ_START_DATA);
}
protected static boolean isStartMarker(ExcerptCommon excerpt) {
return isDataTypeMatched(excerpt, TH_START_DATA);
}
protected static boolean isEndMarker(ExcerptCommon excerpt) {
return isDataTypeMatched(excerpt, TH_END_DATA);
}
protected static boolean isData(ExcerptTailer tailer, long index) {
if (!tailer.index(index)) {
return false;
}
return isData(tailer);
}
private static void movePosition(ExcerptCommon excerpt, long position) {
if (excerpt.position() != position)
excerpt.position(position);
}
private static void moveToFsyncFlagPos(ExcerptCommon excerpt) {
movePosition(excerpt, 0);
}
private static boolean isDataTypeMatched(ExcerptCommon excerpt, byte type) {
moveToFsyncFlagPos(excerpt);
byte b = excerpt.readByte();
if (b == type) {
return true;
}
return false;
}
protected static boolean isNormalData(ExcerptCommon excerpt) {
return isDataTypeMatched(excerpt, NORMAL_DATA);
}
protected static boolean isFsyncData(ExcerptCommon excerpt) {
return isDataTypeMatched(excerpt, FSYNC_DATA);
}
/**
 * Check if this entry is Data
 *
 * @param excerpt
 * @return true if the entry is data
 */
protected static boolean isData(ExcerptCommon excerpt) {
if (isNormalData(excerpt) || isFsyncData(excerpt)) {
return true;
}
return false;
}
}
The problem only occurs when initialising the data-block size with a value that is not a power of two. The built-in configurations on IndexedChronicleQueueBuilder (small(), medium(), large()) take care to initialise using powers of two, which provided the clue as to the appropriate usage.
Notwithstanding the above response regarding support, which I totally appreciate, it would be useful if a knowledgeable Chronicle user could confirm that the integrity of Chronicle Queue depends on using a data-block size of a power of two.
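Consistent with that answer, the DATA_SIZE default of 100000 in the test above is indeed not a power of two (the nearest powers of two are 65536 and 131072). A quick sanity check for such configuration values can be sketched like this (the isPowerOfTwo helper is mine, not part of the Chronicle API):

```java
public class BlockSizeCheck {

    // A positive integer is a power of two iff it has exactly one set bit,
    // i.e. clearing its lowest set bit (n & (n - 1)) yields zero.
    static boolean isPowerOfTwo(int n) {
        return n > 0 && (n & (n - 1)) == 0;
    }

    public static void main(String[] args) {
        System.out.println(isPowerOfTwo(100000));  // the failing DATA_SIZE
        System.out.println(isPowerOfTwo(1 << 17)); // 131072, a nearby safe value
    }
}
```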

Spring BeanUtils.copyProperties not working

I want to copy properties from one object to another, both are of the same class. However it's not copying the fields. Here is the demo code:
public static void main(String[] args) throws Exception {
    A from = new A();
    A to = new A();
    from.i = 123;
    from.l = 321L;
    System.out.println(from.toString());
    System.out.println(to.toString());
    BeanUtils.copyProperties(from, to);
    System.out.println(from.toString());
    System.out.println(to.toString());
}

public static class A {
    public String s;
    public Integer i;
    public Long l;

    @Override
    public String toString() {
        return "A{" +
                "s=" + s +
                ", i=" + i +
                ", l=" + l +
                '}';
    }
}
And the output is:
A{s=null, i=123, l=321}
A{s=null, i=null, l=null}
A{s=null, i=123, l=321}
A{s=null, i=null, l=null}
Looks like I have to have getters/setters for the class, since BeanUtils.copyProperties copies JavaBean properties (getter/setter pairs), not public fields:
public static void main(String[] args) throws Exception {
    A from = new A();
    A to = new A();
    from.i = 123;
    from.l = 321L;
    System.out.println(from.toString());
    System.out.println(to.toString());
    BeanUtils.copyProperties(from, to);
    System.out.println(from.toString());
    System.out.println(to.toString());
}

public static class A {
    public String s;
    public Integer i;
    public Long l;

    public String getS() {
        return s;
    }

    public void setS(String s) {
        this.s = s;
    }

    public Integer getI() {
        return i;
    }

    public void setI(Integer i) {
        this.i = i;
    }

    public Long getL() {
        return l;
    }

    public void setL(Long l) {
        this.l = l;
    }

    @Override
    public String toString() {
        return "A{" +
                "s=" + s +
                ", i=" + i +
                ", l=" + l +
                '}';
    }
}
Now the output is:
A{s=null, i=123, l=321}
A{s=null, i=null, l=null}
A{s=null, i=123, l=321}
A{s=null, i=123, l=321}
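The underlying reason is that BeanUtils.copyProperties works on JavaBean properties discovered via the java.beans Introspector, and a public field with no accessor pair is not a property. A small self-contained check (class and field names here are illustrative, not from the original post):

```java
import java.beans.BeanInfo;
import java.beans.Introspector;
import java.beans.PropertyDescriptor;

public class PropertyCheck {

    public static class A {
        public String s;   // public field only: not a JavaBean property
        public Integer i;  // has a getter/setter pair below: a property

        public Integer getI() { return i; }
        public void setI(Integer i) { this.i = i; }
    }

    public static void main(String[] args) throws Exception {
        // Stop introspection at Object.class so the built-in
        // "class" property is excluded from the listing.
        BeanInfo info = Introspector.getBeanInfo(A.class, Object.class);
        for (PropertyDescriptor pd : info.getPropertyDescriptors()) {
            System.out.println(pd.getName());
        }
        // Only "i" is printed; "s" is invisible to property-based copying.
    }
}
```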

Filehelpers Field max length

I'm working with FileHelpers and imported a CSV file. Everything works fine, but now I want to validate the length of the imported fields.
[DelimitedRecord(";")]
public class ImportFile
{
    public string Name;
    public string NameSurname;
}
Is there a way to create a MaxLength attribute that either truncates the input string or throws an exception if the input string is longer than the maximum length? The only thing I found was FieldFixedLength, but that only controls how the input file is split into fields.
You can implement an AfterRead event as follows:
[DelimitedRecord(";")]
public class ImportRecord : INotifyRead<ImportRecord>
{
    public string Name;
    public string NameSurname;

    public void BeforeRead(BeforeReadEventArgs<ImportRecord> e)
    {
    }

    public void AfterRead(AfterReadEventArgs<ImportRecord> e)
    {
        if (e.Record.Name.Length > 20)
            throw new Exception("Line " + e.LineNumber + ": First name is too long");
        if (e.Record.NameSurname.Length > 20)
            throw new Exception("Line " + e.LineNumber + ": Surname is too long");
    }
}
class Program
{
    static void Main(string[] args)
    {
        var engine = new FileHelperEngine<ImportRecord>();
        engine.ErrorMode = ErrorMode.SaveAndContinue;
        string fileAsString = "Firstname;SurnameIsAVeryLongName" + Environment.NewLine
            + "FirstName;SurnameIsShort";
        ImportRecord[] validRecords = engine.ReadString(fileAsString);

        Console.ForegroundColor = ConsoleColor.Red;
        foreach (ErrorInfo error in engine.ErrorManager.Errors)
        {
            Console.WriteLine(error.ExceptionInfo.Message);
        }

        Console.ForegroundColor = ConsoleColor.White;
        foreach (ImportRecord validRecord in validRecords)
        {
            Console.WriteLine(String.Format("Successfully read record: {0} {1}", validRecord.Name, validRecord.NameSurname));
        }

        Console.WriteLine("Press any key...");
        Console.ReadKey();
    }
}

Converting MapWritable to a string in Hadoop

When I print my output using the toString() method I am getting:
#zombie org.apache.hadoop.io.MapWritable@b779f586
#zombies org.apache.hadoop.io.MapWritable@c8008ef9
#zona org.apache.hadoop.io.MapWritable@99e061a1
#zoology org.apache.hadoop.io.MapWritable@9d0060be
#zzp org.apache.hadoop.io.MapWritable@3e52c108
Here is my reducer code, how can I get the map values to print out instead?
package sample;

import java.io.IOException;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.MapWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.io.Writable;
import org.apache.hadoop.mapreduce.Reducer;

public class IntSumReducer extends Reducer<Text, MapWritable, Text, MapWritable> {

    private MapWritable result = new MapWritable();
    String temp = "";

    public void reduce(Text key, Iterable<MapWritable> values, Context context) throws IOException, InterruptedException {
        result.clear();
        for (MapWritable val : values) {
            Iterable<Writable> keys = val.keySet();
            for (Writable k : keys) {
                IntWritable tally = (IntWritable) val.get(k);
                if (result.containsKey(k)) {
                    IntWritable tallies = (IntWritable) result.get(k);
                    tallies.set(tallies.get() + tally.get());
                    temp = toString() + " : " + tallies.get();
                    result.put(new Text(temp), tallies);
                } else {
                    temp = k.toString() + " : " + tally.get();
                    result.put(new Text(temp), tally);
                }
            }
        }
        context.write(key, result);
    }
}
Thanks for the help
Adding a class like this should work:
class MyMapWritable extends MapWritable {

    @Override
    public String toString() {
        StringBuilder result = new StringBuilder();
        Set<Writable> keySet = this.keySet();
        for (Writable key : keySet) {
            result.append("{" + key.toString() + " = " + this.get(key) + "}");
        }
        return result.toString();
    }
}
Then call it like so:
MyMapWritable mw = new MyMapWritable();
mw.toString();
Your result is a MapWritable, and toString() is not overridden in MapWritable, so you get the default Object output.
You can create a new class that extends MapWritable (like MyMapWritable above) and override toString() in it.
Change your code after that to:
public class IntSumReducer extends Reducer<Text, MapWritable, Text, MyMapWritable> {

    private MyMapWritable result = new MyMapWritable();
    String temp = "";
    ...
