Jenkins gives access restriction for javax/smartcardio - maven

For a Java / OSGi project I need to communicate with a smartcard plugged into my computer. I do this using the package javax.smartcardio.
When I first imported this package and wanted to use it, Eclipse announced an error: "Access restriction: The type 'CommandAPDU' is not API". As proposed on https://www.javacardos.com/javacardforum/viewtopic.php?t=918, I added an Accessibility Rule Pattern to the Build Path. After that, everything worked fine and I could use the package in my local environment.
But now I wanted to pass the project to our continuous integration system, which is Jenkins with Maven. I also committed the .classpath file, but there I get the same error:
[ERROR] import javax.smartcardio.CommandAPDU;
[ERROR] ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
[ERROR] Access restriction: The type 'CommandAPDU' is not API (restriction on classpath entry '/disc2/lunifera/server/jenkins/workspace/tools/hudson.model.JDK/JDK-8/jre/lib/rt.jar')
For some reason, the access rule does not seem to work on Jenkins. Does anybody know how to solve this problem? Thanks a lot.

Not exactly the kind of answer I hoped for, but in the end, I overcame the problem by using reflection. I wrote wrapper classes for the classes from javax.smartcardio that I needed. These wrapper classes hold instances of the original classes and operate on them purely by reflection. For example, the CardTerminal class wrapper may look like this:
import java.lang.reflect.InvocationTargetException;
import java.lang.reflect.Method;

public class CardTerminal {

    private final Object originalCardTerminal;

    public CardTerminal(Object originalCardTerminal)
    {
        this.originalCardTerminal = originalCardTerminal;
    }

    public Card connect(String protocol)
    {
        try
        {
            // Look up javax.smartcardio.CardTerminal.connect(String) reflectively,
            // so nothing from javax.smartcardio has to be imported.
            Class<?> originalCardTerminalClass = Class.forName("javax.smartcardio.CardTerminal");
            Method connectMethod = originalCardTerminalClass.getMethod("connect", String.class);
            Object originalCard = connectMethod.invoke(originalCardTerminal, protocol);
            if (originalCard == null)
            {
                return null;
            }
            return new Card(originalCard); // "Card" is another wrapper type, of course
        }
        catch (ClassNotFoundException | NoSuchMethodException | SecurityException
                | IllegalAccessException | IllegalArgumentException | InvocationTargetException e)
        {
            return null;
        }
    }
}
Of course, this is not a good solution, because all the objects are just of type "Object" and you lose type safety. But it's the best solution I found so far, and it works, because you don't need to import anything from javax.smartcardio.
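To illustrate how the wrappers get hold of the underlying objects in the first place, here is a minimal sketch. It is my own illustration, not part of the original project (the TerminalLookup helper name is made up); it walks TerminalFactory.getDefault().terminals().list() purely by reflection and wraps the first terminal it finds:

import java.lang.reflect.Method;
import java.util.List;

public class TerminalLookup {

    // Returns a wrapped CardTerminal (the wrapper class above) for the first
    // reader found, or null if no reader is attached.
    public static CardTerminal firstTerminal() throws Exception {
        // javax.smartcardio.TerminalFactory.getDefault()
        Class<?> factoryClass = Class.forName("javax.smartcardio.TerminalFactory");
        Object factory = factoryClass.getMethod("getDefault").invoke(null);

        // factory.terminals() -> javax.smartcardio.CardTerminals
        Object cardTerminals = factoryClass.getMethod("terminals").invoke(factory);

        // cardTerminals.list() -> List of javax.smartcardio.CardTerminal
        Class<?> terminalsClass = Class.forName("javax.smartcardio.CardTerminals");
        Method listMethod = terminalsClass.getMethod("list");
        List<?> terminals = (List<?>) listMethod.invoke(cardTerminals);

        return terminals.isEmpty() ? null : new CardTerminal(terminals.get(0));
    }
}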

Related

Configuring a custom Gradle sourceSet using a closure

I'm trying to develop a Gradle plugin for a language I use (SystemVerilog). I'm still experimenting and figuring things out. Before I write the entire thing as a plugin, I thought it would be best to try out the different parts I need inside a build script, to get a feel of how things should work.
I'm trying to define a container of source sets, similar to how the Java plugin does it. I'd like to be able to use a closure when configuring a source set. Concretely, I'd like to be able to do the following:
sourceSets {
    main {
        sv {
            include '*.sv'
        }
    }
}
I defined my own sourceSet class:
class SourceSet implements Named {
    final String name
    final ObjectFactory objectFactory

    @Inject
    SourceSet(String name, ObjectFactory objectFactory) {
        this.name = name
        this.objectFactory = objectFactory
    }

    SourceDirectorySet getSv() {
        SourceDirectorySet sv = objectFactory.sourceDirectorySet('sv',
            'SystemVerilog source')
        sv.srcDir("src/${name}/sv")
        return sv
    }

    SourceDirectorySet sv(@Nullable Closure configureClosure) {
        configure(configureClosure, getSv());
        return this;
    }
}
I'm using org.gradle.api.file.SourceDirectorySet because that already implements PatternFilterable, which should give me access to include, exclude, etc.
If I understand the concept correctly, the sv(@Nullable Closure configureClosure) method is the one that gives me the ability to write sv { ... } to configure via a closure.
To add the sourceSets property to the project, I did the following:
project.extensions.add("sourceSets",
project.objects.domainObjectContainer(SourceSet.class))
As per the Gradle docs, this should give me the possibility to configure sourceSets using a closure. This site, which details using custom types, states that by using NamedDomainObjectContainer, Gradle will provide a DSL that build scripts can use to define and configure elements. That covers the sourceSets { ... } part, and it should also cover the sourceSets { main { ... } } part.
If I create a sourceSet for main and use it in a task, then everything works fine:
project.sourceSets.create('main')

task compile(type: Task) {
    println 'Compiling source files'
    println project.sourceSets.main.sv.files
}
If I try to configure the main source set to only include files with the .sv extension, then I get an error:
sourceSets {
    main {
        sv {
            include '*.sv'
        }
    }
}
I get the following error:
No signature of method: build_47mnuak4y5k86udjcp7v5dkwm.sourceSets() is applicable for argument types: (build_47mnuak4y5k86udjcp7v5dkwm$_run_closure1) values: [build_47mnuak4y5k86udjcp7v5dkwm$_run_closure1@effb286]
I don't know what I'm doing wrong. I'm sure it's just a simple thing that I'm forgetting. Does anyone have an idea of what that might be?
I figured out what was going wrong. It was a combination of poor copy/paste skills and the fact that Groovy is a dynamic language.
First, let's look at the definition of the sv(Closure) function again:
SourceDirectorySet sv(@Nullable Closure configureClosure) {
    configure(configureClosure, getSv());
    return this;
}
Once I moved this code to its own Groovy file and used the IDE to show me what was getting called, I noticed that it wasn't calling the function I expected. I was expecting a call to org.gradle.util.ConfigureUtil.configure. Since this is part of the public API, I expected it to be imported by default in the build script. As this page states, this is not the case.
To solve the issue, it's enough to add the following import:
import static org.gradle.util.ConfigureUtil.configure
This will get rid of the cryptic closure-related error. It is replaced by the following error, though:
Cannot cast object 'SourceSet_Decorated@a6abab9' with class 'SourceSet_Decorated' to class 'org.gradle.api.file.SourceDirectorySet'
This is caused by the copy/paste error I mentioned. When I wrote the SourceSet class, I drew heavily from org.gradle.api.tasks.SourceSet (and org.gradle.api.internal.tasks.DefaultSourceSet). If we look at the java(Closure) method there, we'll see it has the following signature:
SourceSet java(@Nullable Closure configureClosure);
Notice that it returns SourceSet and not SourceDirectorySet like in my code. Using the proper return type fixes the issue:
SourceSet sv(@Nullable Closure configureClosure)
With this new return type, let's look again at the configuration code for the source set:
sourceSets {
    main {
        sv {
            include '*.sv'
        }
    }
}
Initially, I thought it was supposed to work as follows: pass main { ... } as a Closure to sourceSets, pass sv { ... } as a Closure to main, and handle the include ... part inside SourceDirectorySet. I banged my head against the wall for a while, because I couldn't find any code in that class hierarchy that takes closures like this.
Now, I think the flow is slightly different: main { ... } is still passed as a Closure to sourceSets (as initially thought), but then the sv(Closure) function is called on main (of type SourceSet), passing it { include ... } as the argument.
Bonus: There was one more issue that wasn't related to the "compile" errors I was having.
Even after getting the code to run without errors, it still wasn't behaving as expected. I had some files with the *.svh extension that were still getting picked up. This is because getSv() was creating a new SourceDirectorySet on every call, so any configuration that had been applied previously was thrown away.
Making the SourceDirectorySet a class member and moving its creation to the constructor fixed the issue:
private SourceDirectorySet sv

SourceSet(String name, ObjectFactory objectFactory) {
    // ...
    sv = objectFactory.sourceDirectorySet('sv',
        'SystemVerilog source')
    sv.srcDir("src/${name}/sv")
}

SourceDirectorySet getSv() {
    return sv
}

NoMethodError when calling JJWT from Kotlin

I'm working on a small Spring Boot application in Kotlin and now I want to secure it using JJWT. Roughly speaking, I am translating this tutorial to my use case: https://jakublesko.com/spring-security-with-jwt/
In the project I have this AuthenticationFilter:
class JwtAuthenticationFilter(val authManager: AuthenticationManager) : UsernamePasswordAuthenticationFilter() {

    // other stuff omitted for brevity

    override fun successfulAuthentication(request: HttpServletRequest?, response: HttpServletResponse?, chain: FilterChain?, authResult: Authentication?) {
        val user = authResult?.principal as User
        val roles = user.authorities
            .stream()
            .map(GrantedAuthority::getAuthority)

        val signingKey = JWT_SECRET.toByteArray()

        val token = Jwts.builder()
            .signWith(SignatureAlgorithm.HS512, Keys.hmacShaKeyFor(signingKey))
            .setHeaderParam("typ", TOKEN_TYPE) // validate that "typ" does not actually mean "type"
            .setIssuer(TOKEN_ISSUER)
            .setAudience(TOKEN_AUDIENCE)
            .setSubject(user.username)
            .setExpiration(Date(System.currentTimeMillis() + 864000000))
            .claim("rol", roles)
            .compact()

        response?.addHeader(TOKEN_HEADER, TOKEN_PREFIX + token)
    }
}
When I post to the authentication URL that is supposed to issue a token though, I receive:
java.lang.NoSuchMethodError: io.jsonwebtoken.SignatureAlgorithm.getMinKeyLength()I
at io.jsonwebtoken.security.Keys.hmacShaKeyFor(Keys.java:69) ~[jjwt-api-0.10.7.jar:na]
I can debug into the successfulAuthentication method and see that it is called with reasonable parameters. What catches my eye is the "I" after the parentheses at the end of getMinKeyLength()I. My googling skills apparently do not suffice to find a reason why it is there, but I strongly suspect it is related to reflection and calling Java libs from Kotlin code.
Is anyone around who can tell me how to fix this? I have run out of guesses.
I stumbled upon the same issue. The reason for this is most probably a dependency conflict. Check if you have any other dependency using jjwt. For us it was com.twilio.sdk.
You can do it using
mvn dependency:tree -Dverbose
After you've identified the conflicting dependency you can either match your jjwt version or exclude it from the dependency.
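For example, assuming the older io.jsonwebtoken:jjwt artifact comes in transitively through the Twilio SDK (the coordinates below are illustrative; use whatever dependency:tree actually reports), the exclusion in the pom.xml would look roughly like this:

<dependency>
    <groupId>com.twilio.sdk</groupId>
    <artifactId>twilio</artifactId>
    <version>${twilio.version}</version>
    <exclusions>
        <exclusion>
            <!-- the legacy all-in-one jjwt artifact that conflicts with jjwt-api 0.10.x -->
            <groupId>io.jsonwebtoken</groupId>
            <artifactId>jjwt</artifactId>
        </exclusion>
    </exclusions>
</dependency>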
To run Java code from Kotlin, please put the Java sources under the folder
main/java
Also, your Kotlin sources should be located under main/kotlin

Lagom framework / streamed response / websocket / pathCall / Descriptor / Creator instead of Function

I have my service declared this way:
public interface BlogQueryService extends Service {

    public ServiceCall<String, Source<String, ?>> tick(int interval);
    public ServiceCall<String, Source<String, ?>> tock();
    public ServiceCall<NotUsed, Source<PostSummary, ?>> newPosts();
    public ServiceCall<String, Source<PostSummary, ?>> getPostSummaries();

    @Override
    default Descriptor descriptor() {
        return named("blog-query").with(
            //pathCall("/api/bloggie/tick/:interval", this::tick),
            pathCall("/api/bloggie/tock", tock())
            //pathCall("/api/bloggie/newPosts", this::newPosts),
            //pathCall("/api/bloggie/postSummaries", this::getPostSummaries)
        ).withAutoAcl(true);
    }
}
The tick works. The tock doesn't.
When I invoke it using a websocket client (ws://localhost:9000/api/bloggie/tock), I get "undefined" as the response, indicating that no mapping was found for that URL.
After some experimenting, I found out why: tick works because it has a URL param (the :interval); tock doesn't work because it doesn't have one. Does pathCall seriously require you to have a param in your URL? So I checked the API of Service: http://www.lagomframework.com/documentation/1.0.x/api/java/com/lightbend/lagom/javadsl/api/Service.html
There are several overloaded declarations of pathCall. Apparently the tick uses this one:
static <Request,Response,A> Descriptor.Call<Request,Response> pathCall(String pathPattern, akka.japi.function.Function<A,ServiceCall<Request,Response>> methodRef)
So from the signature, yes, it requires the method to take a parameter. So, if the method (such as tock) doesn't take a param, the binding will fail at runtime. So I guess I need to use this one instead:
static <Request,Response> Descriptor.Call<Request,Response> pathCall(String pathPattern, akka.japi.function.Creator<ServiceCall<Request,Response>> methodRef)
The problem is... I don't know how. I haven't seen any example of the use of akka.japi.function.Creator in pathCall.
I tried this:
default Descriptor descriptor() {
    return named("blog-query").with(
        pathCall("/api/bloggie/tick/:interval", this::tick),
        pathCall("/api/bloggie/tock", new Creator<ServiceCall<String, Source<String, ?>>>() {
            public ServiceCall<String, Source<String, ?>> create() {
                return tock();
            }
        })
        //pathCall("/api/bloggie/newPosts", this::newPosts),
        //pathCall("/api/bloggie/postSummaries", this::getPostSummaries)
    ).withAutoAcl(true);
}
It compiles. But it throws an error at runtime:
com.google.inject.CreationException: Unable to create injector, see the following errors:
1) Error in custom provider, java.lang.IllegalStateException: Unable to resolve method for service call with ID PathCallId{pathPattern='/api/bloggie/tock'}. Ensure that the you have passed a method reference (ie, this::someMethod). Passing anything else, for example lambdas, anonymous classes or actual implementation classes, is forbidden in declaring a service descriptor.
at com.lightbend.lagom.javadsl.server.ServiceGuiceSupport.bindServices(ServiceGuiceSupport.java:43) (via modules: com.google.inject.util.Modules$OverrideModule -> sample.bloggie.impl.BlogServiceModule)
while locating com.lightbend.lagom.internal.server.ResolvedServices
Thanks in advance!
I just did some experiments... All of them compiled, but none of them worked.
namedCall("/api/bloggie/tock", this::tock)
Result: Compile success. Runtime: path unknown (no binding (?)).
Then I tried
pathCall("/api/bloggie/tock", () -> this.tock())
Result: exception.
com.google.inject.CreationException: Unable to create injector, see the following errors:
1) Error in custom provider, scala.MatchError: Request (of class sun.reflect.generics.reflectiveObjects.TypeVariableImpl)
at com.lightbend.lagom.javadsl.server.ServiceGuiceSupport.bindServices(ServiceGuiceSupport.java:43) (via modules: com.google.inject.util.Modules$OverrideModule -> sample.bloggie.impl.BlogServiceModule)
while locating com.lightbend.lagom.internal.server.ResolvedServices
for parameter 1 at com.lightbend.lagom.internal.server.ServiceRegistrationModule$RegisterWithServiceRegistry.<init>(ServiceRegistrationModule.scala:55)
at com.lightbend.lagom.internal.server.ServiceRegistrationModule.bindings(ServiceRegistrationModule.scala:29):
Binding(class com.lightbend.lagom.internal.server.ServiceRegistrationModule$RegisterWithServiceRegistry to self eagerly) (via modules: com.google.inject.util.Modules$OverrideModule -> play.api.inject.guice.GuiceableModuleConversions$$anon$1)
while locating com.lightbend.lagom.internal.server.ServiceRegistrationModule$RegisterWithServiceRegistry
Then I tried:
public ServiceCall<NotUsed, Source<String, ?>> tock(Void x);
Result: exception
com.google.inject.CreationException: Unable to create injector, see the following errors:
1) Error in custom provider, java.lang.IllegalArgumentException: Don't know how to serialize ID class java.lang.Void
at com.lightbend.lagom.javadsl.server.ServiceGuiceSupport.bindServices(ServiceGuiceSupport.java:43) (via modules: com.google.inject.util.Modules$OverrideModule -> sample.bloggie.impl.BlogServiceModule)
Update: "Solved" (partially). Figured out that this one works:
pathCall("/tock", this::tock)
I can open it using this URL: ws://localhost:9000/tock
So... I can't have a nicely structured URL for functions that return a stream when those functions take no param? At least for now (?).
UPDATE: It seems like this problem is happening not only with pathCall. I encountered the same problem with restCall. This one doesn't work (no binding):
public ServiceCall<NotUsed, PSequence<PostSummary>> getPostSummaries();
...
restCall(Method.GET, "/api/bloggie/postSummaries", this::getPostSummaries)
This one works:
public ServiceCall<NotUsed, PSequence<PostSummary>> getPostSummaries();
...
restCall(Method.GET, "/postSummaries", this::getPostSummaries)
Thanks!
So firstly, namedCall should only be used if you don't care about the path. You are invoking the service call directly, which means you do care about the path, so you have to use pathCall or restCall.
This should work:
pathCall("/api/bloggie/tock", this::tock)
Also, I think you're not pasting the full errors. Make sure you check right to the bottom of the list of Guice errors; that should explain exactly what the problem is. In many of the cases above, the problem is that you're not passing a method reference but a lambda, and the error message should say that.
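Put together, the descriptor from your question would then look roughly like this (paths taken from your question; a sketch, untested here):

@Override
default Descriptor descriptor() {
    return named("blog-query").with(
        // Method references only; no lambdas, anonymous classes or direct calls.
        pathCall("/api/bloggie/tick/:interval", this::tick),
        pathCall("/api/bloggie/tock", this::tock),
        pathCall("/api/bloggie/newPosts", this::newPosts),
        pathCall("/api/bloggie/postSummaries", this::getPostSummaries)
    ).withAutoAcl(true);
}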

Resolve actual Reference path using Microsoft.Build.Evaluation

I'm doing some introspection and analysis of csproj files using the Microsoft.Build.Evaluation tools in a small C# console app. I want to locate the actual location of Reference items, using the same heuristics as MSBuild itself, i.e. the locations described here. I'm heading towards auto-conversion of build artifacts into packages, similar to what's outlined on the JetBrains blog here.
The only examples I can find expect the HintPath to be correct, for example this project, and since I know there are some HintPaths that are not currently correct, I don't want to trust them. This project is very close to what I'm trying to do, with the added complication that I want to use the real resolution behaviour to find dependencies.
I have an instance of a Microsoft.Build.Evaluation.Project object for my csproj, and I can't see any methods available on it that could exercise the resolution for me. I think what I'm hoping for is a magic Resolve() method for a Reference or a ProjectItem, a bit like this method.
I can probably find an alternative by constraining my own search to a set of limited output paths used by this build system, but I'd like to hook into MSBuild if I can.
The reference resolution is one of the trickiest parts of MSBuild. The logic of how assemblies are located is implemented inside a standard set of tasks:
ResolveAssemblyReference, ResolveNativeReference, etc. The logic of how this works is very complicated; you can see that just by looking at the number of possible parameters to those tasks.
However, you don't need to know the exact logic to find the location of referenced files. There are standard targets called "ResolveAssemblyReferences", "ResolveProjectReferences" and some other, more specialized ones for native references and COM references. Those targets are executed as part of the normal build. If you just execute those targets separately, you can find out the return values, which is exactly what you need. The same mechanism is used by the IDE to get the location of references, for IntelliSense, introspection, etc.
Here is how you can do it in code:
using Microsoft.Build.BuildEngine;
using Microsoft.Build.Execution;
using Microsoft.Build.Framework;
using System;
using System.Collections.Generic;

class Program
{
    static int Main(string[] args)
    {
        if (args.Length < 1)
        {
            Console.WriteLine("Usage: GetReferences.exe <projectFileName>");
            return -1;
        }

        string projectFileName = args[0];

        ConsoleLogger logger = new ConsoleLogger(LoggerVerbosity.Normal);
        BuildManager manager = BuildManager.DefaultBuildManager;

        ProjectInstance projectInstance = new ProjectInstance(projectFileName);
        var result = manager.Build(
            new BuildParameters()
            {
                DetailedSummary = true,
                Loggers = new List<ILogger>() { logger }
            },
            new BuildRequestData(projectInstance, new string[]
            {
                "ResolveProjectReferences",
                "ResolveAssemblyReferences"
            }));

        PrintResultItems(result, "ResolveProjectReferences");
        PrintResultItems(result, "ResolveAssemblyReferences");

        return 0;
    }

    private static void PrintResultItems(BuildResult result, string targetName)
    {
        var buildResult = result.ResultsByTarget[targetName];
        var buildResultItems = buildResult.Items;

        if (buildResultItems.Length == 0)
        {
            Console.WriteLine("No references detected in target {0}.", targetName);
            return;
        }

        foreach (var item in buildResultItems)
        {
            Console.WriteLine("{0} reference: {1}", targetName, item.ItemSpec);
        }
    }
}
Notice that the engine is called to invoke specific targets in the project. This usually does not build the whole project, although some targets might still be invoked as prerequisites.
Just compile it and it will print a subset of all dependencies. There might be more dependencies if you use COM references or native dependencies in your project; it should be easy to modify the sample to get those as well.

How to discover types exported by an OSGi bundle without installing/activating?

Basically I want to discover whether a jar implements any number of interfaces without activating or starting the bundle. Is it possible to read the metadata from META-INF through an API, just like the container does, but without activating the bundle?
I want to use OSGi to support plugins, for which numerous interfaces will be published, and I would like to know which interfaces are implemented by a bundle when the user uploads it, without activating the bundle.
I do not think it is possible to discover what services a bundle is going to provide, because this can happen from inside the Java code, without any meta-data about it. Of course, if you use Declarative Services, there is a meta-data file. Also, the bundle needs to import (or provide) the service interface, which may give you a hint (but not more).
You can inspect what Java packages a bundle imports and exports without activating it.
If you are willing to install (not resolve, not activate) it, you can query it. The Felix or Equinox shells can list those packages, after all.
Here is the relevant source from Felix's shell. It uses the PackageAdmin service:
public void execute(String s, PrintStream out, PrintStream err)
{
    // Get package admin service.
    ServiceReference ref = m_context.getServiceReference(
        org.osgi.service.packageadmin.PackageAdmin.class.getName());
    PackageAdmin pa = (ref == null) ? null :
        (PackageAdmin) m_context.getService(ref);
    // ...
    Bundle bundle = m_context.getBundle(bundleId);
    ExportedPackage[] exports = pa.getExportedPackages(bundle);
    // ...
}
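If you don't even want to install the bundle, the same package information (plus the pointer to any Declarative Services descriptors) can be read straight from the jar's manifest. A minimal plain-Java sketch, not taken from the Felix source:

import java.util.jar.Attributes;
import java.util.jar.JarFile;
import java.util.jar.Manifest;

public class BundleHeaderDump {

    public static void main(String[] args) throws Exception {
        try (JarFile jar = new JarFile(args[0])) {
            Manifest mf = jar.getManifest();
            if (mf == null) {
                System.out.println("No manifest found");
                return;
            }
            Attributes attrs = mf.getMainAttributes();
            // Packages the bundle exports and imports (raw header values).
            System.out.println("Export-Package: " + attrs.getValue("Export-Package"));
            System.out.println("Import-Package: " + attrs.getValue("Import-Package"));
            // Points at the Declarative Services component descriptors, if any.
            System.out.println("Service-Component: " + attrs.getValue("Service-Component"));
        }
    }
}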
You may try something like the code below: find the ".class" files in the exported packages using the bundle.findEntries(...) method.
BundleContext context = bundle.getBundleContext();
ServiceReference ref = context.getServiceReference(PackageAdmin.class.getName());
PackageAdmin packageAdmin = (PackageAdmin) context.getService(ref);

List<Class> agentClasses = new ArrayList<Class>();
ExportedPackage[] exportedPackages = packageAdmin.getExportedPackages(bundle);
for (ExportedPackage ePackage : exportedPackages) {
    String packageName = ePackage.getName();
    String packagePath = "/" + packageName.replace('.', '/');
    // find all the class files in the current exported package
    Enumeration clazzes = bundle.findEntries(packagePath, "*.class", false);
    while (clazzes.hasMoreElements()) {
        URL url = (URL) clazzes.nextElement();
        String path = url.getPath();
        int index = path.lastIndexOf("/");
        int endIndex = path.length() - 6; // strip the ".class" suffix
        String className = path.substring(index + 1, endIndex);
        String fullClassName = packageName + "." + className;
        try {
            Class clazz = bundle.loadClass(fullClassName);
            // check whether the class is annotated with the Agent tag
            if (clazz.isAnnotationPresent(Agent.class))
                agentClasses.add(clazz);
        } catch (ClassNotFoundException e) {
            e.printStackTrace();
        }
    }
}
