Conflicting enum values java - spring-boot

I have 2 enums defined as below in 2 separate files:
MyErrorCodes.java
@Getter
public enum MyErrorCodes implements ErrorCode {
    ERROR1(90, 1, 01),
    ERROR2(90, 1, 02),
    ERROR3(90, 1, 03),
    ERROR4(90, 1, 04),
    ERROR5(90, 1, 05);
    ....
}
ErrorCategory.java
public class ErrorCategory {
    @AllArgsConstructor
    public enum ErrorCodes {
        EXECUTION_ERROR("execution.error", "Error in executing {0}.", INTERNAL_ERROR, MyErrorCodes.ERROR1),
        DESERIALIZATION_ERROR("...", "...", BAD_REQUEST_ERROR, MyErrorCodes.ERROR2),
        DEFAULT_INTERNAL_ERROR("...", "...", INTERNAL_ERROR, MyErrorCodes.ERROR3),
        INVALID_RESPONSE("...", "...", INTERNAL_ERROR, MyErrorCodes.ERROR4),
        MAPPING_ERROR("...", "...", INTERNAL_ERROR, MyErrorCodes.ERROR5);
    }
}
At runtime I am getting this error:
Conflicting enum values. Name 'ERROR4' uses ordinal value (4) that is also used for name 'MAPPING_ERROR'
ERROR4 is the 4th element in enum #1, while MAPPING_ERROR is the 5th element in enum #2.
So why am I getting a conflict error when they are completely different enums?
P.S. The interface implemented by enum #1 looks like this:
public interface ErrorCode {
    int getErrorCode();
    String getKey();
}
stacktrace:
at io.opentracing.contrib.concurrent.TracedRunnable.run(TracedRunnable.java:30)
at org.springframework.scheduling.support.DelegatingErrorHandlingRunnable.run(DelegatingErrorHandlingRunnable.java:54)
at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)
at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:304)
at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
at java.base/java.lang.Thread.run(Thread.java:829)
Caused by: org.apache.ignite.binary.BinaryObjectException: Conflicting enum values. Name 'ERROR4' uses ordinal value (4) that is also used for name 'MAPPING_ERROR'
at org.apache.ignite.internal.binary.BinaryUtils.mergeEnumValues(BinaryUtils.java:2538)
at org.apache.ignite.internal.binary.BinaryUtils.mergeMetadata(BinaryUtils.java:1028)
at org.apache.ignite.internal.processors.cache.binary.BinaryMetadataTransport.requestMetadataUpdate(BinaryMetadataTransport.java:211)
at org.apache.ignite.internal.processors.cache.binary.CacheObjectBinaryProcessorImpl.addMeta(CacheObjectBinaryProcessorImpl.java:606)
at org.apache.ignite.internal.processors.cache.binary.CacheObjectBinaryProcessorImpl$1.addMeta(CacheObjectBinaryProcessorImpl.java:288)
at org.apache.ignite.internal.binary.BinaryContext.registerUserClassDescriptor(BinaryContext.java:828)
at org.apache.ignite.internal.binary.BinaryContext.registerDescriptor(BinaryContext.java:784)
at org.apache.ignite.internal.binary.BinaryContext.registerClass(BinaryContext.java:581)
at org.apache.ignite.internal.binary.BinaryContext.registerClass(BinaryContext.java:556)
at org.apache.ignite.internal.binary.BinaryWriterExImpl.doWriteEnum(BinaryWriterExImpl.java:829)
at org.apache.ignite.internal.binary.BinaryWriterExImpl.writeEnumField(BinaryWriterExImpl.java:1323)
at org.apache.ignite.internal.binary.BinaryFieldAccessor$DefaultFinalClassAccessor.write0(BinaryFieldAccessor.java:670)
at org.apache.ignite.internal.binary.BinaryFieldAccessor.write(BinaryFieldAccessor.java:157)
... 115 common frames omitted

Okay, so here is what we did and found out regarding the issue:
We were using an Ignite cluster for our deployment; we updated the Ignite version from 6 to 7 and that solved the issue.
Ignite caches the enum metadata (the name-to-ordinal mapping), so when you introduce new enum values you should append them at the end, keeping the existing ordinals unchanged, to avoid conflicts in the future, or clear the cache.
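To make the ordinal shift concrete, here is a minimal sketch (not from the original post; the Status enum and its constants are purely illustrative) showing how inserting a constant in the middle of an enum changes the ordinals that Ignite has already cached for the names behind it:
public class EnumOrdinalDemo {
    // Previously deployed version (metadata Ignite may still have cached):
    //     enum Status { A, B, C }       // B -> ordinal 1, C -> ordinal 2
    // New version with NEW inserted in the middle:
    enum Status { A, NEW, B, C }         // NEW -> 1, B -> 2, C -> 3

    public static void main(String[] args) {
        for (Status s : Status.values()) {
            System.out.println(s.name() + " -> " + s.ordinal());
        }
        // The cached metadata still says ordinal 2 belongs to "C", while the new
        // class assigns ordinal 2 to "B"; when Ignite merges the two sets of
        // metadata it reports this as "Conflicting enum values".
    }
}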

Related

Mybatis: IllegalArgumentException: Mapped Statements collection does not contain value for xxx

I have two entities, Vendor and Goods, with a one-to-many relation between them.
I am using MyBatis with annotations; the mappers are:
GoodsMapper
public interface GoodsMapper {
    @Select("select * from goods where id=#{goodsId}")
    @Results({
        @Result(id = true, column = "id", property = "id"),
        @Result(column = "name", property = "name"),
        @Result(column = "vendor_id", property = "vendor",
            one = @One(select = "com.xxx.server.mapper.VendorMapper.getVendor"))
    })
    Goods getGoods(@Param("goodsId") String goodsId);
}
VendorMapper
public interface VendorMapper {
    @Select("select * from vendor where id=#{vendorId}")
    Vendor getVendor(@Param("vendorId") String vendorId);
}
(Entity code and other details omitted.)
When I invoked goodsMapper.getGoods(goodsId), I caught the following exception:
Caused by: org.apache.ibatis.exceptions.PersistenceException:
### Error querying database. Cause: java.lang.IllegalArgumentException: Mapped Statements collection does not contain value for com.xxx.server.mapper.VendorMapper.getVendor
### The error may exist in com/xxx/server/mapper/GoodsMapper.java (best guess)
### The error may involve com.xxx.server.mapper.GoodsMapper.getGoods
### The error occurred while handling results
### SQL: select * from goods where id=?
### Cause: java.lang.IllegalArgumentException: Mapped Statements collection does not contain value for com.xxx.server.mapper.VendorMapper.getVendor
at org.apache.ibatis.exceptions.ExceptionFactory.wrapException(ExceptionFactory.java:30)
at org.apache.ibatis.session.defaults.DefaultSqlSession.selectList(DefaultSqlSession.java:150)
at org.apache.ibatis.session.defaults.DefaultSqlSession.selectList(DefaultSqlSession.java:141)
at org.apache.ibatis.session.defaults.DefaultSqlSession.selectOne(DefaultSqlSession.java:77)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.mybatis.spring.SqlSessionTemplate$SqlSessionInterceptor.invoke(SqlSessionTemplate.java:433)
... 117 more
Caused by: java.lang.IllegalArgumentException: Mapped Statements collection does not contain value for com.xxx.server.mapper.VendorMapper.getVendor
at org.apache.ibatis.session.Configuration$StrictMap.get(Configuration.java:933)
at org.apache.ibatis.session.Configuration.getMappedStatement(Configuration.java:726)
at org.apache.ibatis.session.Configuration.getMappedStatement(Configuration.java:719)
at org.apache.ibatis.executor.resultset.DefaultResultSetHandler.getNestedQueryMappingValue(DefaultResultSetHandler.java:740)
at org.apache.ibatis.executor.resultset.DefaultResultSetHandler.getPropertyMappingValue(DefaultResultSetHandler.java:465)
at org.apache.ibatis.executor.resultset.DefaultResultSetHandler.applyPropertyMappings(DefaultResultSetHandler.java:441)
I have checked the fully qualified id com.xxx.server.mapper.VendorMapper.getVendor used in the @One select, and it is correct.
I would appreciate any kind of help.
In my case this was caused by the referenced mapper's statement collection not being initialized by Spring yet.
The solution is to add the @DependsOn annotation to the "parent" mapper:
@DependsOn("VendorMapper")
public interface GoodsMapper {
    ...
}

@Repository("VendorMapper")
public interface VendorMapper {
    ...
}
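As an aside (this is my own suggestion, not part of the answer above): when both interfaces live in the same package, registering them together with MyBatis-Spring's @MapperScan is another way to make sure VendorMapper's statements are parsed into the same MyBatis Configuration that GoodsMapper's nested @One select is resolved against. A minimal sketch, assuming the package from the question:
import org.mybatis.spring.annotation.MapperScan;
import org.springframework.context.annotation.Configuration;

// Register every mapper interface under the (hypothetical) question package so
// both GoodsMapper and VendorMapper end up in the same MyBatis Configuration.
@Configuration
@MapperScan("com.xxx.server.mapper")
public class MyBatisConfig {
}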

Syntax error on token "1", Identifier expected in Enum Java

I want to implement an enum for numbers and get their respective String values. I already followed this link: http://www.makeinjava.com/convert-enum-integer-string-value-java/.
The errors I'm getting are:
Syntax error on token "1", Identifier expected
Syntax error on token "2", Identifier expected
public enum CompanyCityType {
    1("New York"),
    2("Reston");

    private Integer companyCityType;

    CompanyCityType(Integer companyCityType) {
        this.companyCityType = companyCityType;
    }

    public Integer getCompanyAddrType() {
        return this.companyCityType;
    }
}
You cannot begin an identifier name in Java with a number; it must follow the rules for a valid variable name in Java.
As per the Oracle variable tutorial:
Variable names are case-sensitive. A variable's name can be any legal
identifier — an unlimited-length sequence of Unicode letters and
digits, beginning with a letter, the dollar sign "$", or the
underscore character "_".
Since the constants in an enum are effectively public static final fields (singleton instances), they follow the same naming rules as any other Java identifier.
You need to refactor your code to:
public enum CompanyCityType {
    NEW_YORK(1),
    RESTON(2);

    private int companyCityType;

    CompanyCityType(int companyCityType) {
        this.companyCityType = companyCityType;
    }

    public int getCompanyAddrType() {
        return this.companyCityType;
    }
}
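If the original goal of mapping a numeric code to its city name still matters, a variant like the following (my own sketch, not part of the answer; the names and codes are illustrative) keeps both the code and the display String and adds a lookup by code:
public enum CompanyCityType {
    NEW_YORK(1, "New York"),
    RESTON(2, "Reston");

    private final int code;
    private final String cityName;

    CompanyCityType(int code, String cityName) {
        this.code = code;
        this.cityName = cityName;
    }

    public int getCode() {
        return code;
    }

    public String getCityName() {
        return cityName;
    }

    // Resolve the enum constant for a numeric code, e.g. fromCode(1).getCityName() returns "New York".
    public static CompanyCityType fromCode(int code) {
        for (CompanyCityType type : values()) {
            if (type.code == code) {
                return type;
            }
        }
        throw new IllegalArgumentException("Unknown city code: " + code);
    }
}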

Phantom's generated `store` method throws a ClassCastException at runtime

I have the following Phantom table definition:
package myPackage

import com.outworkers.phantom.CassandraTable
import com.outworkers.phantom.connectors.RootConnector
import com.outworkers.phantom.dsl._
import com.outworkers.phantom.keys.{PartitionKey, PrimaryKey}

import scala.concurrent.Future

case class KeysTwoThreeAndFour(myKeyTwo: Int, myKeyThree: String, myKeyFour: Int)

abstract class MyTable extends CassandraTable[MyTable, Int] with RootConnector {
  object myKeyOne extends IntColumn(this) with PartitionKey
  object myKeyTwo extends IntColumn(this) with PrimaryKey
  object myKeyThree extends StringColumn(this) with PrimaryKey
  object myKeyFour extends IntColumn(this) with PrimaryKey
  object myValue extends IntColumn(this)

  def insertValue(myKeyOne: Int, valuesMap: Map[KeysTwoThreeAndFour, Int]): Future[Unit] = {
    val resultFutures = for ((key: KeysTwoThreeAndFour, myValue) <- valuesMap) yield {
      store(myKeyOne, key.myKeyTwo, key.myKeyThree, key.myKeyFour, myValue).future()
    }
    Future.sequence(resultFutures).map { _ => () }
  }
}
This compiles fine, but at runtime throws the following exception:
java.lang.ClassCastException: scala.Tuple5 cannot be cast to scala.runtime.Nothing$
at myPackage.MyTable$anon$macro$1$1.store(MyTable.scala:10)
at com.outworkers.phantom.CassandraTable.store(CassandraTable.scala:125) ~[com.outworkers.phantom-dsl_2.11-2.7.6.jar:2.7.6]
at myPackage.MyTable$$anonfun$2.apply(MyTable.scala:19)
at myPackage.MyTable$$anonfun$2.apply(MyTable.scala:18)
...
I am following the examples at the bottom of the Phantom table docs; what could the problem be? Is the issue perhaps that I have a plain Int as my "Record" type instead of an actual class?
I am using phantom-dsl 2.7.6, Play Framework 2.3.10 and Scala 2.11.11.
Note that the following code works fine:
insert
.value(_.myKeyOne, myKeyOne)
.value(_.myKeyTwo, key.myKeyTwo)
.value(_.myKeyThree, key.myKeyThree)
.value(_.myKeyFour, key.myKeyFour)
.value(_.myValue, myValue)
.future()
Thanks.
It seems that the issue was indeed due to the "Record" type being a primitive instead of a class; in other words, wrapping my Int in a case class with that Int as its only member solved the problem.
Additionally, there was a problem with the select statement, which was solved as well; it had been failing with:
java.util.concurrent.ExecutionException: Boxed Error
at com.outworkers.phantom.CassandraTable.fromRow(CassandraTable.scala:85) ~[com.outworkers.phantom-dsl_2.11-2.7.6.jar:2.7.6]
at com.outworkers.phantom.SelectTable$$anonfun$select$1.apply(SelectTable.scala:24) ~[com.outworkers.phantom-dsl_2.11-2.7.6.jar:2.7.6]
at com.outworkers.phantom.SelectTable$$anonfun$select$1.apply(SelectTable.scala:24) ~[com.outworkers.phantom-dsl_2.11-2.7.6.jar:2.7.6]
at com.outworkers.phantom.builder.query.SelectQuery.fromRow(SelectQuery.scala:59) ~[com.outworkers.phantom-dsl_2.11-2.7.6.jar:2.7.6]
at com.outworkers.phantom.builder.query.RootExecutableQuery$class.singleResult(ExecutableQuery.scala:176) ~[com.outworkers.phantom-dsl_2.11-2.7.6.jar:2.7.6]
at com.outworkers.phantom.builder.query.SelectQuery.singleResult(SelectQuery.scala:33) ~[com.outworkers.phantom-dsl_2.11-2.7.6.jar:2.7.6]

Spring Repository method complains about field not found

I'm using Spring Data Neo4j in my project and I'm having issues with naming conventions for repositories.
This is a simple class containing only one field and the getters/setters
@RelationshipEntity
public class ScoredRelationship
{
    protected Float score;
}
and the class below extends it with other kinds of fields
@RelationshipEntity(
    type = RecommenderRelTypes.GOV_CONSUMER_TO_GOV_CONSUMER_SIMILARITY)
public class GovConsumerToGovConsumerSimilarity extends ScoredRelationship
{
    // Other fields
}
To access the relationship, I'm using the usual repository class
public interface GovConsumerToGovConsumerSimilarityRepository extends
        GraphRepository<GovConsumerToGovConsumerSimilarity>
{
    public Set<GovConsumerToGovConsumerSimilarity> findByScoreGreaterThan(Float value);

    public Set<GovConsumerToGovConsumerSimilarity>
            findByScoreGreaterThanOrderByScoreDesc(Float value);

    public Set<GovConsumerToGovConsumerSimilarity>
            findTopXByScoreGreaterThanOrderByScoreDesc(int limit, Float score);
}
This code compiles fine. However, whenever I try to use one of the methods, Spring either throws a series of exceptions or doesn't act as intended.
For example, findByScoreGreaterThan(0.3f) always returns an empty set, even though invoking findAll() and printing all the scores shows plenty of objects with a score greater than 0.3.
In the second and third case, it always throws an exception saying:
Caused by: Unknown identifier `score`.
at org.neo4j.cypher.internal.symbols.SymbolTable.evaluateType(SymbolTable.scala:60)
at org.neo4j.cypher.internal.commands.expressions.Identifier.evaluateType(Identifier.scala:51)
at org.neo4j.cypher.internal.commands.expressions.Expression.assertTypes(Expression.scala:53)
at org.neo4j.cypher.internal.pipes.SortPipe$$anonfun$assertTypes$1.apply(SortPipe.scala:34)
at org.neo4j.cypher.internal.pipes.SortPipe$$anonfun$assertTypes$1.apply(SortPipe.scala:33)
at scala.collection.LinearSeqOptimized$class.foreach(LinearSeqOptimized.scala:59)
at scala.collection.immutable.List.foreach(List.scala:45)
at org.neo4j.cypher.internal.pipes.SortPipe.assertTypes(SortPipe.scala:33)
at org.neo4j.cypher.internal.pipes.PipeWithSource.<init>(PipeWithSource.scala:27)
at org.neo4j.cypher.internal.pipes.SortPipe.<init>(SortPipe.scala:29)
at org.neo4j.cypher.internal.executionplan.builders.SortBuilder.apply(SortBuilder.scala:33)
at org.neo4j.cypher.internal.executionplan.ExecutionPlanImpl.prepareExecutionPlan(ExecutionPlanImpl.scala:49)
at org.neo4j.cypher.internal.executionplan.ExecutionPlanImpl.<init>(ExecutionPlanImpl.scala:33)
at org.neo4j.cypher.ExecutionEngine$$anonfun$prepare$1.apply(ExecutionEngine.scala:67)
at org.neo4j.cypher.ExecutionEngine$$anonfun$prepare$1.apply(ExecutionEngine.scala:67)
at org.neo4j.cypher.internal.LRUCache.getOrElseUpdate(LRUCache.scala:37)
at org.neo4j.cypher.ExecutionEngine.prepare(ExecutionEngine.scala:67)
at org.neo4j.cypher.ExecutionEngine.execute(ExecutionEngine.scala:59)
at org.neo4j.cypher.ExecutionEngine.execute(ExecutionEngine.scala:63)
at org.neo4j.cypher.javacompat.ExecutionEngine.execute(ExecutionEngine.java:79)
at org.springframework.data.neo4j.support.query.CypherQueryEngine.parseAndExecuteQuery(CypherQueryEngine.java:61)
How can this be possible? The class obviously has the score field. Also, executing the simple findByScoreGreaterThan(Float value) doesn't throw any exception, but at the same time it always returns an empty set.
EDIT:
These are the queries used by Spring. Actually, they seem right:
Executing cypher query: START `govConsumerToGovConsumerSimilarity`=node:__types__(className="it.cerict.recommender.persistence.neo4j.GovConsumerToGovConsumerSimilarity") WHERE `govConsumerToGovConsumerSimilarity`.`score`! > {0} RETURN `govConsumerToGovConsumerSimilarity` params {0=0.3}
Executing cypher query: START `govConsumerToGovConsumerSimilarity`=node:__types__(className="it.cerict.recommender.persistence.neo4j.GovConsumerToGovConsumerSimilarity") WHERE `govConsumerToGovConsumerSimilarity`.`score`! > {0} RETURN `govConsumerToGovConsumerSimilarity` ORDER BY score DESC params {0=0.3}
EDIT2: I've also tried changing the score type from Float to float, with no improvement.
This seems to be a bug in Spring Data Neo4j. Looking at the executed query, it is clear that it searches for nodes, while it should search for relationships.
I replaced the derived findByScoreGreaterThanOrderByScoreDesc() method with a @Query annotation specifying the following Cypher query:
START `govConsumerToGovConsumerSimilarity`=rel:__rel_types__(className="it.cerict.recommender.persistence.neo4j.GovConsumerToGovConsumerSimilarity") WHERE `govConsumerToGovConsumerSimilarity`.`score`! > 0.3 RETURN `govConsumerToGovConsumerSimilarity` ORDER BY `govConsumerToGovConsumerSimilarity`.`score` DESC;
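For reference, a sketch of how the annotated repository method might be declared (this is my reconstruction, not code from the answer; the relationship identifier is shortened to `sim` for readability and the hard-coded 0.3 is replaced by the {0} parameter placeholder):
import java.util.Set;

import org.springframework.data.neo4j.annotation.Query;
import org.springframework.data.neo4j.repository.GraphRepository;

public interface GovConsumerToGovConsumerSimilarityRepository extends
        GraphRepository<GovConsumerToGovConsumerSimilarity>
{
    // Query relationships (__rel_types__) instead of nodes, ordered by score descending.
    @Query("START `sim`=rel:__rel_types__(className="
         + "\"it.cerict.recommender.persistence.neo4j.GovConsumerToGovConsumerSimilarity\") "
         + "WHERE `sim`.`score`! > {0} "
         + "RETURN `sim` ORDER BY `sim`.`score` DESC")
    Set<GovConsumerToGovConsumerSimilarity> findByScoreGreaterThanOrderByScoreDesc(Float value);
}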

JPA/Hibernate generating wrong SQL in Spring Roo finder method

I'm developing a Spring web application whose persistence layer consists of Spring Roo-generated JPA entities, with Hibernate as the persistence provider and MySQL as the underlying DB.
Among my entities I have a class Detection with a tstamp java.util.Date field generated in Roo as follows:
entity jpa --class ~.data.Detection
...
field date --fieldName tstamp --type java.util.Date
...
finder add findDetectionsByTstampBetween
(the finder method was of course chosen after executing finder list)
In my controller code, at a point I invoke:
List<Detection> detections = Detection.findDetectionsByTstampBetween(from, to).getResultList();
where from and to are two valid java.util.Date instances. When testing with sample data, though (after ensuring that for a given choice of from and to the returned list should not be empty), I got an empty list and investigated why.
I found in tomcat logs that Hibernate was generating the following SQL:
Hibernate: select detection0_.id as id1_3_, ...etc..., detection0_.tstamp as tstamp4_3_ from detection detection0_ where detection0_.tstamp>=?
I would expect the where clause to contain a trailing "AND detection0_.tstamp<=?" checking the other end of the date range. I took a look at the generated Detection.findDetectionsByTstampBetween(Date minTstamp, Date maxTstamp) method in Detection_Roo_Finder.aj, and the "AND" is indeed present in the call to createQuery.
public static TypedQuery<Detection> Detection.findDetectionsByTstampBetween(Date minTstamp, Date maxTstamp) {
    if (minTstamp == null) throw new IllegalArgumentException("The minTstamp argument is required");
    if (maxTstamp == null) throw new IllegalArgumentException("The maxTstamp argument is required");
    EntityManager em = Detection.entityManager();
    TypedQuery<Detection> q = em.createQuery("SELECT o FROM Detection AS o WHERE o.tstamp BETWEEN :minTstamp AND :maxTstamp", Detection.class);
    q.setParameter("minTstamp", minTstamp);
    q.setParameter("maxTstamp", maxTstamp);
    return q;
}
Any idea what could cause the problem?
I've finally found the solution to the riddle and, as it turned out, the issue had nothing to do with JPA.
The problem was that the call to the persistence layer sat inside a REST controller with the following mapping:
@ResponseBody
@RequestMapping(value = "/detections", method = RequestMethod.GET, params = "from, to")
public Object getDetectionsInRange(
        @RequestParam(required = true) @DateTimeFormat(pattern = "yyyy-MM-dd HH:mm") final Date from,
        @RequestParam(required = true) @DateTimeFormat(pattern = "yyyy-MM-dd HH:mm") final Date to)
{
    ...
    List<Detection> detections = Detection.findDetectionsByTstampBetween(from, to).getResultList();
    ...
}
The error was in the definition of the params argument of @RequestMapping; the correct format is as follows:
@RequestMapping(value = "/detections", method = RequestMethod.GET, params = {"from", "to"})
Because the malformed params value never matched the request, Spring dispatched to another version of the controller method for /detections. In that second version I called a different finder method, which is what produced the seemingly wrong SQL in Hibernate.
@ResponseBody
@RequestMapping(value = "/detections", method = RequestMethod.GET)
public Object getDetections(
        @RequestParam(required = false, defaultValue = "0") int days,
        @RequestParam(required = false, defaultValue = "0") int hours,
        @RequestParam(required = false, defaultValue = "0") int minutes)
{
    ...
    List<Detection> detections = Detection.findDetectionsByTstampGreaterThanEquals( ... ).getResultList();
    ...
}
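For completeness, a hedged sketch of the first handler with the corrected params attribute (it simply combines the snippets above; the elided body parts are kept as comments):
// Corrected mapping: params must be an array with one entry per expected request parameter.
@ResponseBody
@RequestMapping(value = "/detections", method = RequestMethod.GET, params = {"from", "to"})
public Object getDetectionsInRange(
        @RequestParam(required = true) @DateTimeFormat(pattern = "yyyy-MM-dd HH:mm") final Date from,
        @RequestParam(required = true) @DateTimeFormat(pattern = "yyyy-MM-dd HH:mm") final Date to)
{
    // ...
    // With params now matching, this handler (and the Between finder) is the one invoked.
    List<Detection> detections = Detection.findDetectionsByTstampBetween(from, to).getResultList();
    // ...
}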
