Jersey 2.6 with Jackson JSON deserialization

My goal is to make web calls and convert returned JSON to POJOs. I'm trying to use Jersey+Jackson for this but am getting exceptions when running.
My Maven POM includes the following dependencies:
<dependency>
    <groupId>org.glassfish.jersey.core</groupId>
    <artifactId>jersey-client</artifactId>
    <version>2.6</version>
</dependency>
<dependency>
    <groupId>org.glassfish.jersey.media</groupId>
    <artifactId>jersey-media-json-jackson</artifactId>
    <version>2.6</version>
</dependency>
The code I use to fetch some data is as follows:
Client client = ClientBuilder.newBuilder()
        .register(JacksonFeature.class)
        .build();
ClientResponse response = client.target(url)
        .request(MediaType.APPLICATION_JSON)
        .get(ClientResponse.class);
But the following exception is thrown:
javax.ws.rs.ProcessingException: Error reading entity from input stream.
at org.glassfish.jersey.message.internal.InboundMessageContext.readEntity(InboundMessageContext.java:868)
at org.glassfish.jersey.message.internal.InboundMessageContext.readEntity(InboundMessageContext.java:785)
at org.glassfish.jersey.client.ClientResponse.readEntity(ClientResponse.java:335)
...
...
Caused by: org.codehaus.jackson.map.JsonMappingException: Can not find a deserializer for non-concrete Map type [map type; class javax.ws.rs.core.MultivaluedMap, [simple type, class java.lang.String] -> [collection type; class java.util.List, contains [simple type, class java.lang.String]]]
at org.codehaus.jackson.map.deser.StdDeserializerProvider._createAndCache2(StdDeserializerProvider.java:315)
at org.codehaus.jackson.map.deser.StdDeserializerProvider._createAndCacheValueDeserializer(StdDeserializerProvider.java:290)
at org.codehaus.jackson.map.deser.StdDeserializerProvider.findValueDeserializer(StdDeserializerProvider.java:159)
at org.codehaus.jackson.map.deser.std.StdDeserializer.findDeserializer(StdDeserializer.java:620)
at org.codehaus.jackson.map.deser.BeanDeserializer.resolve(BeanDeserializer.java:379)
at org.codehaus.jackson.map.deser.StdDeserializerProvider._resolveDeserializer(StdDeserializerProvider.java:407)
at org.codehaus.jackson.map.deser.StdDeserializerProvider._createAndCache2(StdDeserializerProvider.java:352)
at org.codehaus.jackson.map.deser.StdDeserializerProvider._createAndCacheValueDeserializer(StdDeserializerProvider.java:290)
at org.codehaus.jackson.map.deser.StdDeserializerProvider.findValueDeserializer(StdDeserializerProvider.java:159)
at org.codehaus.jackson.map.deser.StdDeserializerProvider.findTypedValueDeserializer(StdDeserializerProvider.java:180)
at org.codehaus.jackson.map.ObjectMapper._findRootDeserializer(ObjectMapper.java:2829)
at org.codehaus.jackson.map.ObjectMapper._readValue(ObjectMapper.java:2699)
at org.codehaus.jackson.map.ObjectMapper.readValue(ObjectMapper.java:1315)
at org.codehaus.jackson.jaxrs.JacksonJsonProvider.readFrom(JacksonJsonProvider.java:419)
at org.glassfish.jersey.message.internal.ReaderInterceptorExecutor$TerminalReaderInterceptor.invokeReadFrom(ReaderInterceptorExecutor.java:257)
at org.glassfish.jersey.message.internal.ReaderInterceptorExecutor$TerminalReaderInterceptor.aroundReadFrom(ReaderInterceptorExecutor.java:229)
at org.glassfish.jersey.message.internal.ReaderInterceptorExecutor.proceed(ReaderInterceptorExecutor.java:149)
at org.glassfish.jersey.message.internal.MessageBodyFactory.readFrom(MessageBodyFactory.java:1124)
at org.glassfish.jersey.message.internal.InboundMessageContext.readEntity(InboundMessageContext.java:853)
... 90 more
Caused by: java.lang.IllegalArgumentException: Can not find a deserializer for non-concrete Map type [map type; class javax.ws.rs.core.MultivaluedMap, [simple type, class java.lang.String] -> [collection type; class java.util.List, contains [simple type, class java.lang.String]]]
at org.codehaus.jackson.map.deser.BasicDeserializerFactory.createMapDeserializer(BasicDeserializerFactory.java:424)
at org.codehaus.jackson.map.deser.StdDeserializerProvider._createDeserializer(StdDeserializerProvider.java:380)
at org.codehaus.jackson.map.deser.StdDeserializerProvider._createAndCache2(StdDeserializerProvider.java:310)
... 108 more
Am I missing some setup to get this to work right?
I have tried the URL via curl and a browser, and it returns JSON as expected.

What you need is a Response, not a ClientResponse: passing ClientResponse.class to get() makes Jersey treat it as the entity type, so Jackson tries to map the JSON onto that class (including its MultivaluedMap of headers), which is exactly the error above.
javax.ws.rs.core.Response jsonResponse = client.target(url).request(MediaType.APPLICATION_JSON).get();
Then you can see what comes in your response (debugging is your friend here). Is it by any chance a map of some type? If it is, you can read it by doing e.g.
Map<String, SomeClassOfYours> entitiesFromResponse = jsonResponse.readEntity(new GenericType<Map<String, SomeClassOfYours>>() {});
If you've put a normal entity in the response you can simply do something like:
SomeClassOfYours entityFromResponse = jsonResponse.readEntity(SomeClassOfYours.class);
Edit: For this to work you'd also need to define SomeClassOfYours and put the corresponding fields, constructor, getters and setters in there.
Edit 2: When in doubt, you can always read the jsonResponse as String.class and put it in a String variable.
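Putting the pieces together, here is a minimal, self-contained sketch of the whole flow; SomeClassOfYours and the URL are hypothetical placeholders:

import javax.ws.rs.client.Client;
import javax.ws.rs.client.ClientBuilder;
import javax.ws.rs.core.MediaType;
import javax.ws.rs.core.Response;
import org.glassfish.jersey.jackson.JacksonFeature;

public class JerseyJacksonExample {

    // Hypothetical POJO; field names must match the JSON keys
    public static class SomeClassOfYours {
        private String name;
        private int count;

        public SomeClassOfYours() {} // Jackson needs a no-arg constructor

        public String getName() { return name; }
        public void setName(String name) { this.name = name; }
        public int getCount() { return count; }
        public void setCount(int count) { this.count = count; }
    }

    public static void main(String[] args) {
        Client client = ClientBuilder.newBuilder()
                .register(JacksonFeature.class) // enables Jackson (de)serialization
                .build();

        Response response = client.target("http://example.com/api/thing")
                .request(MediaType.APPLICATION_JSON)
                .get(); // note: no ClientResponse.class here

        SomeClassOfYours entity = response.readEntity(SomeClassOfYours.class);
        System.out.println(entity.getName());
        client.close();
    }
}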

Response jsonResponse = getClient().target(URI).request().get();
T result = jsonResponse.readEntity(type);
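The snippet above is a fragment. Read as a generic helper, a minimal sketch might look like this (getClient() is assumed to return a Client with JacksonFeature registered, as shown earlier):

// Hypothetical generic helper around the snippet above
<T> T fetch(String uri, Class<T> type) {
    Response jsonResponse = getClient().target(uri)
            .request(MediaType.APPLICATION_JSON)
            .get();
    return jsonResponse.readEntity(type);
}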

Related

Quarkus Tika native image generation fails with an error

I am trying to create a native image with Quarkus Tika. I use the dependencies below with this code snippet.
@Inject
TikaParser parser;

@POST
@Path("/text")
@Produces(MediaType.TEXT_PLAIN)
public String extractText(InputStream stream) {
    Instant start = Instant.now();
    String text = null;
    try {
        text = parser.getText(stream);
    } catch (Exception e) {
        log.info("error" + e);
    }
    return text;
}
<dependency>
    <groupId>io.quarkiverse.tika</groupId>
    <artifactId>quarkus-tika</artifactId>
    <version>1.0.3</version>
</dependency>
<dependency>
    <groupId>io.quarkus</groupId>
    <artifactId>quarkus-resteasy-reactive</artifactId>
</dependency>
<dependency>
    <groupId>io.quarkus</groupId>
    <artifactId>quarkus-awt</artifactId>
</dependency>
quarkus-quickstarts/tika-quickstart$ mvn clean install -Dnative
I used the sample project from https://quarkiverse.github.io/quarkiverse-docs/quarkus-tika/dev/index.html
Even though I used the "initialize-at-run-time" parameter for the related classes, I got the same error:
<quarkus.native.additional-build-args>--initialize-at-run-time=org.apache.sis.internal.system.DelayedExecutor\,org.apache.sis.internal.system.ReferenceQueueConsumer\,ucar.nc2.grib.grib2.Grib2JpegDecoder\,ucar.nc2.grib.grib2.Grib2DataReader2</quarkus.native.additional-build-args>
Here are my config files:
application.properties
quarkus.tika.tika-config-path=tika-config.xml
tika-config.xml
<?xml version="1.0" encoding="UTF-8"?>
<properties>
  <parsers>
    <parser class="org.apache.tika.parser.pdf.PDFParser">
      <mime>application/pdf</mime>
    </parser>
    <parser class="org.apache.tika.parser.txt.TXTParser">
      <mime>text/plain</mime>
    </parser>
  </parsers>
</properties>
Error:
Fatal error: org.graalvm.compiler.debug.GraalError: com.oracle.graal.pointsto.constraints.UnsupportedFeatureException: Detected a started Thread in the image heap. Threads running in the image generator are no longer running at image runtime. To see how this object got instantiated use --trace-object-instantiation=org.apache.sis.internal.system.DelayedExecutor. The object was probably created by a class initializer and is reachable from a static field. You can request class initialization at image runtime by using the option --initialize-at-run-time=<class-name>. Or you can write your own initialization methods and call them explicitly from your main entry point.
at com.oracle.graal.pointsto.util.AnalysisFuture.setException(AnalysisFuture.java:49)
at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:269)
at com.oracle.graal.pointsto.util.AnalysisFuture.ensureDone(AnalysisFuture.java:63)
at com.oracle.graal.pointsto.heap.ImageHeapScanner.lambda$postTask$9(ImageHeapScanner.java:611)
at com.oracle.graal.pointsto.util.CompletionExecutor.executeCommand(CompletionExecutor.java:193)
at com.oracle.graal.pointsto.util.CompletionExecutor.lambda$executeService$0(CompletionExecutor.java:177)
at java.base/java.util.concurrent.ForkJoinTask$RunnableExecuteAction.exec(ForkJoinTask.java:1426)
at java.base/java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:290)
at java.base/java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1020)
at java.base/java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1656)
at java.base/java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1594)
at java.base/java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:183)
Caused by: com.oracle.graal.pointsto.constraints.UnsupportedFeatureException: Detected a started Thread in the image heap. Threads running in the image generator are no longer running at image runtime. To see how this object got instantiated use --trace-object-instantiation=org.apache.sis.internal.system.DelayedExecutor. The object was probably created by a class initializer and is reachable from a static field. You can request class initialization at image runtime by using the option --initialize-at-run-time=<class-name>. Or you can write your own initialization methods and call them explicitly from your main entry point.
at com.oracle.svm.hosted.image.DisallowedImageHeapObjectFeature.error(DisallowedImageHeapObjectFeature.java:173)
at com.oracle.svm.core.image.DisallowedImageHeapObjects.check(DisallowedImageHeapObjects.java:74)
at com.oracle.svm.hosted.image.DisallowedImageHeapObjectFeature.replacer(DisallowedImageHeapObjectFeature.java:149)
at com.oracle.graal.pointsto.meta.AnalysisUniverse.replaceObject(AnalysisUniverse.java:582)
at com.oracle.svm.hosted.ameta.AnalysisConstantReflectionProvider.replaceObject(AnalysisConstantReflectionProvider.java:257)
at com.oracle.svm.hosted.ameta.AnalysisConstantReflectionProvider.interceptValue(AnalysisConstantReflectionProvider.java:228)
at com.oracle.svm.hosted.heap.SVMImageHeapScanner.transformFieldValue(SVMImageHeapScanner.java:126)
at com.oracle.graal.pointsto.heap.ImageHeapScanner.onFieldValueReachable(ImageHeapScanner.java:331)
at com.oracle.graal.pointsto.heap.ImageHeapScanner.lambda$createImageHeapObject$3(ImageHeapScanner.java:272)
at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
... 10 more
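A hedged next diagnostic step, taken straight from the error message above: ask the native-image builder where the offending DelayedExecutor instance is created by adding --trace-object-instantiation to the existing build-args property. The flag is suggested by the error itself; the exact class list you end up needing in --initialize-at-run-time depends on what the trace shows:

<quarkus.native.additional-build-args>--trace-object-instantiation=org.apache.sis.internal.system.DelayedExecutor,--initialize-at-run-time=org.apache.sis.internal.system.DelayedExecutor\,org.apache.sis.internal.system.ReferenceQueueConsumer\,ucar.nc2.grib.grib2.Grib2JpegDecoder\,ucar.nc2.grib.grib2.Grib2DataReader2</quarkus.native.additional-build-args>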

Spring Boot WebClient XML

My Spring Boot application wants to use WebClient to make an HTTP request (XML request body) and receive an XML response. Hence I created another Spring Boot application with jackson-dataformat-xml and created an endpoint that receives and returns XML, as below.
spring-boot-version=2.2.5
<dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-web</artifactId>
</dependency>
<dependency>
    <groupId>com.fasterxml.jackson.dataformat</groupId>
    <artifactId>jackson-dataformat-xml</artifactId>
</dependency>
@PostMapping(value = "/api",
        consumes = MediaType.APPLICATION_XML_VALUE,
        produces = MediaType.APPLICATION_XML_VALUE)
public ResponseEntity<MyXmlResponse> trip(@RequestBody MyXmlRequest request) throws Exception {
    MyXmlResponse response = new MyXmlResponse();
    response.setStatus("SUCCESS");
    response.setTripID(request.getTripID());
    return ResponseEntity.ok().body(response);
}
It works perfectly, and obviously no JAXB annotations are required since I use jackson-dataformat-xml. Also, the request XML can be case-insensitive.
Now, in my first application I want to consume this API via WebClient. I read that Spring WebFlux does not support jackson-dataformat-xml yet. Hence I have to annotate my classes with JAXB annotations.
spring-boot-version=2.2.5
<dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-web</artifactId>
</dependency>
<dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-webflux</artifactId>
</dependency>
webClient.post()
        .uri(URI.create("url-to-api-endpoint"))
        .body(Mono.just(myXmlRequest), MyXmlRequest.class)
        .exchange()
        .doOnSuccess(response -> {
            HttpStatus statusCode = response.statusCode();
            log.info("Status code of external system request {}", statusCode);
        })
        .doOnError(onError -> {
            log.error("Error on connecting to external system {}", onError.getMessage());
        })
        .flatMap(response -> response.bodyToMono(MyXmlResponse.class))
        .subscribe(this::handleResponse);
The above code throws an exception as follows:
org.springframework.web.reactive.function.UnsupportedMediaTypeException: Content type 'application/xml' not supported for bodyType=com.example.MyXmlRequest
at org.springframework.web.reactive.function.BodyInserters.unsupportedError(BodyInserters.java:391)
I fixed this problem by annotating the class with @XmlRootElement, as follows:
@Getter @Setter @NoArgsConstructor @ToString
@XmlRootElement()
public class MyXmlRequest {
    private String attribute1;
}
On the next attempt I got another error, as follows:
reactor.core.Exceptions$ErrorCallbackNotImplemented: org.springframework.web.reactive.function.UnsupportedMediaTypeException: Content type 'application/xml' not supported for bodyType=com.example.MyXmlResponse
Caused by: org.springframework.web.reactive.function.UnsupportedMediaTypeException: Content type 'application/xml' not supported for bodyType=com.example.MyXmlResponse
This could be solved by annotating MyXmlResponse with @XmlRootElement as well:
@Getter @Setter @NoArgsConstructor @ToString
@XmlRootElement()
public class MyXmlResponse {
    private String attr1;
    private String attr2;
}
This time I get an UnmarshalException, as follows:
reactor.core.Exceptions$ErrorCallbackNotImplemented: org.springframework.core.codec.DecodingException: Could not unmarshal XML to class com.example.MyXmlResponse; nested exception is javax.xml.bind.UnmarshalException
- with linked exception:
[com.sun.istack.internal.SAXParseException2; lineNumber: 1; columnNumber: 15; unexpected element (uri:"", local:"MyXmlResponse"). Expected elements are <{}myXmlResponse>]
Caused by: org.springframework.core.codec.DecodingException: Could not unmarshal XML to class com.example.MyXmlResponse; nested exception is javax.xml.bind.UnmarshalException
- with linked exception:
I fixed it with additional attributes passed to the annotation, as follows:
@XmlRootElement(name = "MyXmlResponse", namespace = "")
public class MyXmlResponse {
    private String attr1;
    private String attr2;
}
In the future, my XML structures are going to be tremendously complex. I want to know if I am doing this the right way.

Spring Boot giving Jackson exceptions

During testing, I have faced this issue: I have published a REST API with a controller class that takes a model input.
While calling the API, instead of a single string, an array [{"a":1,"b":2}] was used, which triggered the following error:
{
  "timestamp": "2018-12-19T12:33:36.729+0000",
  "status": 400,
  "error": "Bad Request",
  "message": "JSON parse error: Cannot deserialize instance of `java.lang.String` out of START_ARRAY token; nested exception is com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot deserialize instance of `java.lang.String` out of START_ARRAY token\n at [Source: (PushbackInputStream); line: 3, column: 14] (through reference chain: com.xy.df.model.inputReq[\"req\"])",
  "path": "x/y/z"
}
We did not import the Jackson dependency explicitly in the application POM. I have noticed that the Jackson version used in the parent POM is 2.9.5:
<dependency>
    <groupId>com.fasterxml.jackson.core</groupId>
    <artifactId>jackson-databind</artifactId>
    <version>2.9.5</version>
</dependency>
1. Is it vulnerable to RCE? How do I resolve this in Spring Boot?
2. How can I suppress/override the exception message so that the client never gets to know what libraries are used underneath?
The JsonMappingException: out of START_ARRAY token exception is thrown by the Jackson object mapper because it expects an Object {} but finds an Array [{}] in the response.
This can be solved by replacing Object with Object[] in the argument for getForObject("url", Object[].class).
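As a minimal sketch of that fix (the endpoint URL and the SomeDto element type are hypothetical):

import org.springframework.web.client.RestTemplate;

public class ArrayFetchExample {

    // Hypothetical DTO matching the array elements, e.g. [{"a":1,"b":2}]
    public static class SomeDto {
        public int a;
        public int b;
    }

    public static void main(String[] args) {
        RestTemplate restTemplate = new RestTemplate();
        // The body is a JSON array, so ask Jackson for an array type,
        // not a single object:
        SomeDto[] result = restTemplate.getForObject("http://example.com/x/y/z", SomeDto[].class);
        System.out.println(result.length + " elements");
    }
}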
I have resolved the issue. Before going ahead, one needs to understand a couple of very useful annotations:
@ExceptionHandler - helps you define an error class for which you want to catch the exception.
@ControllerAdvice - caters to a cross-cutting approach: any class marked as a controller advice is available to all the controllers in your microservice.
@ControllerAdvice
public class ExceptionController {

    @Autowired
    SomeGenericResponse someGenericResponse; /* data model of common response */

    @ExceptionHandler(value = <My case Jackson Class>.class)
    public ResponseEntity<SomeGenericResponse> customException(HttpServletRequest req, HttpServletResponse res, Exception ex) {
        someGenericResponse.setMessage("Your Message");
        someGenericResponse.setStatus("false");
        return new ResponseEntity<>(someGenericResponse, HttpStatus.BAD_REQUEST);
    }
}

Elasticsearch-Spark serialization not working with inner classes

Elasticsearch/Spark serialization does not appear to play well with nested types.
For example:
public class Foo implements Serializable {
    private List<Bar> bars = new ArrayList<Bar>();
    // getters and setters

    public static class Bar implements Serializable {
    }
}
List<Foo> foos = new ArrayList<Foo>();
foos.add(new Foo());
// Note: Foo object does not contain nested Bar instances

SparkConf sc = new SparkConf();
sc.setMaster("local");
sc.setAppName("spark.app.name");
sc.set("spark.serializer", KryoSerializer.class.getName());
JavaSparkContext jsc = new JavaSparkContext(sc);
JavaRDD javaRDD = jsc.parallelize(ImmutableList.copyOf(foos));
JavaEsSpark.saveToEs(javaRDD, INDEX_NAME + "/" + TYPE_NAME);
The above code works, and documents of type Foo will be indexed in Elasticsearch.
The issue arises when the bars list in a Foo object is not empty, for instance:
Foo foo = new Foo();
Foo.Bar bar = new Foo.Bar();
foo.getBars().add(bar);
Then, when indexing to Elasticsearch, the following exception is thrown:
org.elasticsearch.hadoop.serialization.EsHadoopSerializationException:
Cannot handle type [Bar] within type [class Foo], instance [Bar ...]]
within instance [Foo@1cf628a]
using writer [org.elasticsearch.spark.serialization.ScalaValueWriter@4e635d]
at org.elasticsearch.hadoop.serialization.builder.ContentBuilder.value(ContentBuilder.java:63)
at org.elasticsearch.hadoop.serialization.bulk.TemplatedBulk.doWriteObject(TemplatedBulk.java:71)
at org.elasticsearch.hadoop.serialization.bulk.TemplatedBulk.write(TemplatedBulk.java:58)
at org.elasticsearch.hadoop.rest.RestRepository.writeToIndex(RestRepository.java:148)
at org.elasticsearch.spark.rdd.EsRDDWriter.write(EsRDDWriter.scala:47)
at org.elasticsearch.spark.rdd.EsSpark$$anonfun$saveToEs$1.apply(EsSpark.scala:68)
at org.elasticsearch.spark.rdd.EsSpark$$anonfun$saveToEs$1.apply(EsSpark.scala:68)
at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:61)
at org.apache.spark.scheduler.Task.run(Task.scala:64)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:203)
at java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source)
at java.lang.Thread.run(Unknown Source)
These are the relevant Maven dependencies:
<dependency>
    <groupId>com.sksamuel.elastic4s</groupId>
    <artifactId>elastic4s_2.11</artifactId>
    <version>1.5.5</version>
</dependency>
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.11</artifactId>
    <version>1.3.1</version>
</dependency>
<dependency>
    <groupId>org.elasticsearch</groupId>
    <artifactId>elasticsearch-hadoop-cascading</artifactId>
    <version>2.1.0.Beta4</version>
</dependency>
<dependency>
    <groupId>com.fasterxml.jackson.core</groupId>
    <artifactId>jackson-databind</artifactId>
    <version>2.1.3</version>
</dependency>
<dependency>
    <groupId>org.elasticsearch</groupId>
    <artifactId>elasticsearch-spark_2.10</artifactId>
    <version>2.1.0.Beta4</version>
</dependency>
<dependency>
    <groupId>org.scala-lang</groupId>
    <artifactId>scala-xml</artifactId>
    <version>2.11.0-M4</version>
</dependency>
What is the correct way to index when using nested types with ElasticSearch and Spark?
Thanks
A solution could be to build a JSON string from the object you're trying to save, using for example Json4s.
In this case your "JavaEsSpark" RDD would be an RDD of strings.
Then you simply have to call
JavaEsSpark.saveJsonToEs...
instead of
JavaEsSpark.saveToEs...
This workaround helped me save countless hours trying to figure out a way to Serialize nested maps.
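Along the same lines, here is a minimal sketch of this approach in Java, using Gson for the serialization step instead of Json4s (the resource name "index/type" is illustrative):

import com.google.gson.Gson;
import java.util.List;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;
import org.elasticsearch.spark.rdd.api.java.JavaEsSpark;

public class SaveAsJson {
    // Serialize each Foo (nested Bars included) to a JSON string first,
    // then hand the strings to saveJsonToEs, which skips the value
    // writer that chokes on the inner class.
    public static void save(JavaSparkContext jsc, List<Foo> foos, String resource) {
        JavaRDD<String> jsonRDD = jsc.parallelize(foos)
                .map(foo -> new Gson().toJson(foo)); // new Gson() per record: Gson is not Serializable
        JavaEsSpark.saveJsonToEs(jsonRDD, resource); // e.g. "index/type"
    }
}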
Looking at the ScalaValueWriter & JdkValueWriter code we can see that only certain types are directly supported. Most likely the inner class is not a JavaBean or other supported type.
One day ScalaValueWriter & JdkValueWriter will possibly support user defined types (like Bar in our example), other than just Java types like String, int, etc.
In the meantime, there is the following workaround. Instead of having Foo expose a List of Bar objects, internally transform the List to a Map<String, Object> and expose that.
Something like this:
private List<Map<String, Object>> bars = new ArrayList<Map<String, Object>>();

public List<Map<String, Object>> getBars() {
    return bars;
}

public void setBars(List<Bar> bars) {
    for (Bar bar : bars) {
        this.bars.add(bar.getAsMap());
    }
}
I suggest working with com.google.gson.Gson:
String foosJson = new Gson().toJson(foos);
Then:
Map map = new HashMap<>();
...
...
JavaRDD<Map<String, ?>> javaRDD = sc.parallelize(ImmutableList.of(map));
JavaEsSpark.saveToEs(javaRDD, INDEX_NAME + "/" + TYPE_NAME);

Spring AOP: exclude final classes and enums from pointcut

I am trying to implement logging using Spring AOP. I have defined the following pointcut and advice:
@Pointcut("execution(* com.mycom..*(..))")
private void framework() {}

@Around("framework()")
public Object aroundAdviceFramework(ProceedingJoinPoint jp) throws Throwable {
    if (logger.isDebugEnabled())
        logger.debug("DEBUG:: {} {} Enter", jp.getTarget().getClass().getName(), jp.getSignature().getName());
    Object returnVal = jp.proceed(jp.getArgs());
    if (logger.isDebugEnabled())
        logger.debug("DEBUG:: {} {} Out", jp.getTarget().getClass().getName(), jp.getSignature().getName());
    logger.info("INFO:: " + jp.getTarget().getClass().getName() + " " + jp.getSignature().getName() + " Finished:");
    return returnVal;
}
There are a lot of classes under the com.mycom package and its subpackages. Some of them are enums and final classes.
Because of this I am getting:
nested exception is org.springframework.aop.framework.AopConfigException:
Could not generate CGLIB subclass of class [class com.mycom.util.BancsServiceProvider]: Common causes of this problem include using a final class or a non-visible class; nested exception is java.lang.IllegalArgumentException: Cannot subclass final class class com.mycom.util.BancsServiceProvider
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:529)
Is there a way to exclude all the final classes and enum classes from logging, using some kind of regular expression?
Note: I have enum classes all over, in different packages. It would be difficult to exclude them using complete class names.
Update 2014-11-17 (trying kriegaex's solution):
I tried using
@Pointcut("!within(is(FinalType))")
but I am getting the following error:
Pointcut is not well-formed: expecting ')' at character position 10
!within(is(FinalType))
I have added this Maven dependency in the POM file:
<dependency>
    <groupId>org.aspectj</groupId>
    <artifactId>aspectjrt</artifactId>
    <version>1.8.4</version>
</dependency>
I have also added this Maven dependency:
<dependency>
    <groupId>org.aspectj</groupId>
    <artifactId>aspectjweaver</artifactId>
    <version>1.8.4</version>
</dependency>
Now everything is working like a charm. Any ideas what's happening here?
Currently you can exclude enums, aspects, interfaces, inner types and anonymous types via the is() pointcut syntax, which was introduced in AspectJ 1.6.9; see also my answer here.
What you cannot do at the moment is exclude final types via AspectJ syntax. But I think it would make sense, so I created a ticket for it.
How to exclude enums:
@Pointcut("execution(* com.mycom..*(..)) && !within(is(EnumType))")
Update: AspectJ 1.8.4 has been released; see also the overview in the official download section. On Maven Central the download is not available yet, but it will be soon, I guess. When available, this link will be valid; currently it yields a 404 error.
So why is this release interesting? Because the ticket mentioned above has been resolved and there is a new pointcut primitive is(FinalType) available as of now; see the 1.8.4 release notes.
So now the full solution you requested looks like this:
@Pointcut(
    "execution(* com.mycom..*(..)) && " +
    "!within(is(EnumType)) && " +
    "!within(is(FinalType))"
)
I verified that it works like this.
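For reference, a complete aspect combining the question's around advice with this pointcut might look like the following sketch (logger wiring and Spring setup assumed; remember the is() primitive needs AspectJ 1.8.4+ on the classpath):

import org.aspectj.lang.ProceedingJoinPoint;
import org.aspectj.lang.annotation.Around;
import org.aspectj.lang.annotation.Aspect;
import org.aspectj.lang.annotation.Pointcut;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.stereotype.Component;

@Aspect
@Component
public class LoggingAspect {

    private static final Logger logger = LoggerFactory.getLogger(LoggingAspect.class);

    // Matches everything under com.mycom, but skips enums and final
    // classes so Spring never tries to create CGLIB subclasses for them
    @Pointcut(
        "execution(* com.mycom..*(..)) && " +
        "!within(is(EnumType)) && " +
        "!within(is(FinalType))"
    )
    private void framework() {}

    @Around("framework()")
    public Object aroundAdviceFramework(ProceedingJoinPoint jp) throws Throwable {
        logger.debug("{} {} Enter", jp.getTarget().getClass().getName(), jp.getSignature().getName());
        Object returnVal = jp.proceed(jp.getArgs());
        logger.debug("{} {} Out", jp.getTarget().getClass().getName(), jp.getSignature().getName());
        return returnVal;
    }
}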
