I would like to keep business logic separate from the web application (Spring MVC + Hibernate) for better maintainability, and I do not want to restart the server for a business logic change. If I modify a DRL file for a business change, will the Drools engine pick up the latest DRL?
I built a test application, but the updated DRL file is not loaded into the KnowledgeBase. Is there any way to load updated DRL rules into the Drools engine without restarting the server?
Account.java
public class Account {
    private Integer balance;

    public Account(Integer balance) { this.balance = balance; }

    public void withdraw(Integer amount) { this.balance -= amount; }

    // setter & getter
}
And the test program has :
KnowledgeBuilder kbuilder = KnowledgeBuilderFactory.newKnowledgeBuilder();
kbuilder.add(ResourceFactory.newClassPathResource("myrule.drl"), ResourceType.DRL);
if (kbuilder.hasErrors()) {
    throw new IllegalStateException(kbuilder.getErrors().toString());
}
KnowledgeBase kbase = kbuilder.newKnowledgeBase();
StatelessKnowledgeSession ksession = kbase.newStatelessKnowledgeSession();

Account account = new Account(1000);
account.withdraw(500);
ksession.execute(account);
And DRL is
rule "belowLimit"
when
    $account : Account( balance < 700 )
then
    System.out.println("Notify user");
end
Drools 5.x (which you are apparently using) featured Change Sets, which define directories or files to be watched for changes so that the Knowledge Base is rebuilt automatically; sessions started thereafter reflect the new Knowledge Base.
Details can be found in the documentation, in the section "Knowledge Base by Configuration Using Changesets" and its successor. It's really too much to repeat it all here.
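For orientation, a minimal change-set file looks roughly like this (the file path is a placeholder; this is a sketch based on the Drools 5 documentation, not a drop-in config):

```xml
<change-set xmlns="http://drools.org/drools-5.0/change-set"
            xmlns:xs="http://www.w3.org/2001/XMLSchema-instance">
  <add>
    <!-- file: resources can be polled for changes; adjust the path to your rules -->
    <resource source="file:/path/to/rules/myrule.drl" type="DRL"/>
  </add>
</change-set>
```

You would then build the Knowledge Base through a KnowledgeAgent that applies this change set, and start the resource change scanner and notifier services so that edits to the DRL file are picked up while the server keeps running.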
6.x makes this much simpler. It's based around Maven versioning. A KieContainer (which contains the runtime bits) can be started against any version of a JAR, and you can then update it to any other version using the updateToVersion method. Finally, kie-ci provides embedded Maven support for automatic updates and resolution from external Maven repositories, following standard Maven version-update semantics.
http://docs.jboss.org/drools/release/6.0.0.Final/drools-docs/html/KIEChapter.html#BuildDeployUtilizeAndRunSection
You have to use KieScanner to achieve this. Here is a quote from the documentation:
The KieScanner allows continuous monitoring of your Maven repository
to check whether a new release of a Kie project has been installed. A
new release is deployed in the KieContainer wrapping that project. The
use of the KieScanner requires kie-ci.jar to be on the classpath.
KieServices kieServices = KieServices.Factory.get();
ReleaseId releaseId = kieServices.newReleaseId( "org.acme", "myartifact", "LATEST" );
KieContainer kContainer = kieServices.newKieContainer( releaseId );
KieScanner kScanner = kieServices.newKieScanner( kContainer );
// Start the KieScanner polling the Maven repository every 10 seconds
kScanner.start( 10000L );
You can also call kScanner.scanNow() to trigger a scan on demand.
Remember to build your project with an incremented version each time, as Drools checks whether a new version of the artifact is present; only then will it swap the rules in at runtime.
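As the quote above notes, kie-ci must be on the classpath for the scanner to work. A sketch of the Maven dependency (the version shown is just an example; match it to your Drools release):

```xml
<dependency>
  <groupId>org.kie</groupId>
  <artifactId>kie-ci</artifactId>
  <version>6.0.0.Final</version>
</dependency>
```

Publishing the rules artifact with an incremented version (or as a SNAPSHOT, or resolving it with a range such as "LATEST" as in the earlier snippet) is what gives the scanner something new to pick up.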
Related
I have a Maven multi-module Spring Boot project with modules App, Domain1 and Domain2. App depends on Domain1 and Domain2.
Flyway is auto configured in the App module, with migrations in db/migrations. It runs without problems.
In the domain modules, I have auto-configuration classes with @AutoConfigureBefore(JpaAutoConfiguration::class) to run Flyway manually 'per domain'. Each domain has its migration files in a different location (to prevent collisions):
Domain1:
@Autowired
fun migrateFlyway(dataSource: DataSource) {
    val cfg = FluentConfiguration()
        .locations("classpath:migrations/domain1")
        .baselineOnMigrate(true)
        .table("flyway_domain1_history")
        .dataSource(dataSource)
    val flyway = Flyway(cfg)
    flyway.migrate()
}
Domain2:
@Autowired
fun migrateFlyway(dataSource: DataSource) {
    val cfg = FluentConfiguration()
        .locations("classpath:migrations/domain2")
        .baselineOnMigrate(true)
        .table("flyway_domain2_history")
        .dataSource(dataSource)
    val flyway = Flyway(cfg)
    flyway.migrate()
}
The problem: both Flyway instances from the domains run and create their respective history tables. However, neither of them actually executes the SQL files, although the files exist (checked from IntelliJ in the target folder, and after building the whole thing with Maven the SQL files are present in the JARs).
Am I missing something?
Never mind; after years of using Flyway I just didn't understand the baseline property. The default is 1, and my first script has version 1, so Flyway saw it but skipped it. Setting the baseline to '000' solved the problem.
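The skip rule behind this can be sketched as a toy model (this is a simplification for illustration, not the Flyway API): with baselineOnMigrate, any migration whose version is at or below the baseline version is treated as already present in the schema and is never executed.

```java
import java.util.Arrays;
import java.util.List;
import java.util.stream.Collectors;

// Toy model of Flyway's baseline rule: migrations at or below the
// baseline version are assumed to already be in the schema and skipped.
class BaselineDemo {
    static List<Integer> applied(List<Integer> migrationVersions, int baselineVersion) {
        return migrationVersions.stream()
                .filter(v -> v > baselineVersion) // a V1 script is skipped under the default baseline of 1
                .collect(Collectors.toList());
    }
}
```

With the default baseline of 1, a V1 script is recorded but skipped; lowering the baseline version (as the author did with '000', via FluentConfiguration) lets it run.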
I have an application with an embedded Jetty server which I'm starting up like this (placed in main() and launched from Eclipse):
Server server = new Server(port);

WebAppContext context = new WebAppContext();
context.setResourceBase("web/");
context.setDescriptor("web/WEB-INF/web.xml");
context.setConfigurations(new Configuration[] {
    new AnnotationConfiguration(), new WebXmlConfiguration(),
    new WebInfConfiguration(), new TagLibConfiguration(),
    new PlusConfiguration(), new MetaInfConfiguration(),
    new FragmentConfiguration(), new EnvConfiguration() });
context.setContextPath("/");
context.setParentLoaderPriority(true);
context.setAttribute("org.eclipse.jetty.server.webapp.ContainerIncludeJarPattern", ".*/classes/.*");

URL classes = Main.class
    .getProtectionDomain()
    .getCodeSource()
    .getLocation();
URL url = Main.class.getResource("/packageX/packageY/");

context.getMetaData().setWebInfClassesDirs(Arrays.asList(Resource.newResource(classes)));
// If I replace classes with url it can't find annotated filters or listeners.
// context.getMetaData().setWebInfClassesDirs(Arrays.asList(Resource.newResource(url)));

server.setHandler(context);
server.start();
server.join();
The above code works fine, but it scans all packages in my fat JAR, which is time-consuming. So I thought of narrowing the search space by giving a package path to search, but after that it can't find any annotated classes. What am I doing wrong here?
The two attributes ...
org.eclipse.jetty.server.webapp.ContainerIncludeJarPattern
for controlling which container JARs are scanned, and
org.eclipse.jetty.server.webapp.WebInfIncludeJarPattern
for controlling which webapp JARs are scanned
... only work at the JAR level. Having a true uber-jar effectively makes both of those configurations meaningless.
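To see why, note that the include patterns are regular expressions matched against the URI of each classpath entry, so they can only select or reject whole JARs. A quick sketch (all paths here are hypothetical):

```java
import java.util.regex.Pattern;

// The include patterns select whole classpath entries by URI;
// they cannot reach inside a single uber-jar.
class JarPatternDemo {
    static boolean included(String pattern, String uri) {
        return Pattern.matches(pattern, uri);
    }
}
```

A pattern like `.*/classes/.*` matches an exploded classes directory, and a pattern like `.*/spring-[^/]*\.jar$` selects individual JARs by name; once everything is repackaged into one uber-jar, there is only a single URI left to match against.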
Have you considered just making your uber-jar a live-war instead, and separating the container and webapp portions better?
See: https://github.com/jetty-project/embedded-jetty-live-war
You could also pre-compute the bytecode/annotation-scannable components at build time and load the WebAppContext via the special QuickStartWebApp component instead, skipping the annotation-scanning step entirely when embedded.
See: https://www.eclipse.org/jetty/documentation/current/quickstart-webapp.html
I'm having trouble with clustered Ehcache in OSGi/AEM. It works fine only with the first project build/installation; with the second build/installation it stops working and generates a lot of errors. It looks like the Terracotta connection, the cache manager, or something else is not properly closed.
Even after deleting the bundles, it still tries to connect to Terracotta.
ok log, errors in log
I'm installing ehcache and ehcache-clustered as standalone bundles in OSGi. I also tried embedding them into my bundle. ehcache and ehcache-clustered are set as dependencies; I also tried org.apache.servicemix.bundles.javax-cache-api (embedded; not sure if it's needed).
The first time, all ehcache and ehcache-clustered services are active; the second time they are only satisfied.
Ehcache bundle, ehcache-clustered bundle, javax-cache-api bundle, my project bundle
pom.xml
I have tried the same code as a standalone Java app and it works perfectly fine (https://github.com/ehcache/ehcache3-samples/blob/master/clustered/src/main/java/org/ehcache/sample/ClusteredXML.java).
So I'm not sure what I have missed (dependencies, imported packages, ...)?
ehcache config, terracotta config
@Activate
private void activate() {
    LOGGER.info("Creating clustered cache manager from XML");
    URL myUrl = ClusteredService.class.getResource("/com/myco/services/ehcache-clustered-local2.xml");
    Configuration xmlConfig = new XmlConfiguration(myUrl);
    try (CacheManager cacheManager = CacheManagerBuilder.newCacheManager(xmlConfig)) {
        cacheManager.init();
        org.ehcache.Cache<String, String> basicCache = cacheManager.getCache("basicCache4", String.class, String.class);
        LOGGER.info("1. Putting to cache");
        basicCache.put("1", "abc");
        LOGGER.info("1. Getting from cache");
        String value = basicCache.get("1");
        LOGGER.info("1. Retrieved '{}'", value);
        LOGGER.info("cache manager status2, " + cacheManager.getStatus().toString());
    }
}
You also have to create a @Deactivate method in which you call cacheManager.close() (hold the CacheManager in a field so it can be closed when the bundle is redeployed).
I guess if you ran your code twice in a non-OSGi project you would experience the same error.
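The intended lifecycle can be sketched as a toy model (a simplification: the nested CacheManager class here merely stands in for org.ehcache.CacheManager, and in the real component activate()/deactivate() would carry the OSGi @Activate/@Deactivate annotations):

```java
// Toy model of the OSGi component lifecycle with an explicit shutdown step.
class LifecycleDemo {

    static class CacheManager implements AutoCloseable {
        boolean open;
        void init() { open = true; }                    // opens the Terracotta connection
        @Override public void close() { open = false; } // releases it
    }

    private CacheManager cacheManager;

    // @Activate in the real component
    void activate() {
        cacheManager = new CacheManager();
        cacheManager.init();
    }

    // @Deactivate in the real component: without it, redeploying the
    // bundle leaks the old Terracotta connection, which is what breaks
    // the second installation.
    void deactivate() {
        if (cacheManager != null) {
            cacheManager.close();
            cacheManager = null;
        }
    }

    boolean isActive() {
        return cacheManager != null && cacheManager.open;
    }
}
```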
I'm trying to test an OSGi service annotated with Declarative Services annotations (org.osgi.service.component.annotations). I generated my project based on the AEM Multi-Project Example.
public class PostServiceTest {

    @Rule
    public AemContext context = new AemContext((AemContextCallback) context -> {
        context.registerInjectActivateService(new PostService());
    }, ResourceResolverType.RESOURCERESOLVER_MOCK);

    @Test
    public void shouldFetchRandomPosts() {
        final PostService postsService = context.getService(PostService.class);
        final List<Post> posts = postsService.randomPosts(100);
        assertEquals(100, posts.size());
    }
}
Whenever I run this test in IntelliJ, OSGi Mocks complains about the absence of SCR metadata on the tested class.
org.apache.sling.testing.mock.osgi.NoScrMetadataException: No OSGi SCR metadata found for class com.example.PostServiceTest
at org.apache.sling.testing.mock.osgi.OsgiServiceUtil.injectServices(OsgiServiceUtil.java:381)
at org.apache.sling.testing.mock.osgi.MockOsgi.injectServices(MockOsgi.java:148)
at org.apache.sling.testing.mock.osgi.context.OsgiContextImpl.registerInjectActivateService(OsgiContextImpl.java:153)
at org.apache.sling.testing.mock.osgi.context.OsgiContextImpl.registerInjectActivateService(OsgiContextImpl.java:168)
at com.example.PostServiceTest.shouldReturnTheEnpointNamesWithValidConfigurationAsTheListOfAcceptableKeys(PostServiceTest.java:23)
Does this mean I can only test classes annotated with the older SCR annotations that come with Apache Felix? The documentation for OSGi Mocks says that Declarative Services annotations are supported in version 2.0.0 and higher, and the version I'm using meets this criterion.
Interestingly enough, this only happened when I ran the test directly from the IDE. It turns out that IntelliJ was not generating the SCR metadata when compiling my classes for tests.
When I compile the class under test with Gradle, the 'com.cognifide.aem.bundle' plugin generates the SCR descriptor and places it in the resulting Java archive. That's why unit tests executed by Gradle work fine; just clicking the Run button in IntelliJ skipped this step.
In order to get this working, I ended up setting up IntelliJ to allow me to run unit tests via Gradle.
I went to Settings > Build, Execution, Deployment > Gradle > Runner and used the dropdown menu so that I could decide whether to use Gradle on a test-by-test basis.
I want to write a Maven plugin which will explore the classpath of my application at build time, search for classes with a certain annotation, and generate some Java code adding utilities for these classes, which should get compiled into the JAR of the application.
So I wrote a mojo, inheriting from AbstractMojo, and getting the project through:
@Parameter(defaultValue = "${project}", readonly = true, required = true)
private MavenProject project;
I have most of the code, and my mojo does get executed, but I'm having trouble inserting my mojo at the right build phase.
If I plug it like that:
@Mojo(name = "generate", defaultPhase = LifecyclePhase.GENERATE_SOURCES,
      requiresDependencyResolution = ResolutionScope.COMPILE)
then the Java code which I generate is compiled into the JAR file.
Note that I use project.addCompileSourceRoot to register the output folder.
But that isn't enough for me, because it's too early in the build: I cannot read the classpath and find my project's classes. I think they're not compiled yet.
I search for classes like so:
final List<URL> urls = List.ofAll(project.getCompileClasspathElements())
        .map(element -> Try.of(() -> new File(element).toURI().toURL()).get());
final URLClassLoader classLoader = new URLClassLoader(urls.toJavaList().toArray(new URL[0]),
        Thread.currentThread().getContextClassLoader());
final Set<Class<?>> entities = HashSet.ofAll(new Reflections(classLoader).getTypesAnnotatedWith(MyAnnotation.class));
(I'm using vavr but you get the gist in any case)
So, by plugging my code at the GENERATE_SOURCES phase, this code doesn't work and I don't find any classes.
However, if I plug my mojo at the PROCESS_CLASSES phase:
@Mojo(name = "generate", defaultPhase = LifecyclePhase.PROCESS_CLASSES,
      requiresDependencyResolution = ResolutionScope.COMPILE)
Then my classes are found and I can access the rest of the application's code, but the code that I generate is not taken into account in the build despite using addCompileSourceRoot, presumably because compilation has already happened by that phase.
How do I get both features working at the same time: ability to explore code from the rest of the application and ability to generate code which will be compiled with the rest of the JAR?
I guess a possible answer would be "you can't", but as far as I can tell, querydsl and Immutables are doing it (I tried reading their sources but couldn't find the relevant code).
So @khmarbaise was right: what I wanted was not a Maven mojo but a Java annotation processor.
I found that this walkthrough was very helpful in creating one, and also this stackoverflow answer came in handy.
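For illustration, a minimal annotation processor looks roughly like this (the annotation name com.example.MyAnnotation and the class name are placeholders). Because javac runs processors during compilation, sources generated through the Filer are compiled into the same JAR, which is how both requirements are met at once:

```java
import javax.annotation.processing.AbstractProcessor;
import javax.annotation.processing.RoundEnvironment;
import javax.lang.model.SourceVersion;
import javax.lang.model.element.Element;
import javax.lang.model.element.TypeElement;
import java.util.Collections;
import java.util.Set;

// Sketch of a compile-time annotation processor; the annotation FQN
// below is a placeholder for the real one.
class CompanionProcessor extends AbstractProcessor {

    @Override
    public Set<String> getSupportedAnnotationTypes() {
        return Collections.singleton("com.example.MyAnnotation");
    }

    @Override
    public SourceVersion getSupportedSourceVersion() {
        return SourceVersion.latestSupported();
    }

    @Override
    public boolean process(Set<? extends TypeElement> annotations, RoundEnvironment roundEnv) {
        for (TypeElement annotation : annotations) {
            for (Element element : roundEnv.getElementsAnnotatedWith(annotation)) {
                // Here you would generate a companion source file with
                // processingEnv.getFiler().createSourceFile(...); javac then
                // compiles it in a later round of the same compilation.
            }
        }
        return false; // let other processors also see the annotation
    }
}
```

The processor is registered by listing its class name in META-INF/services/javax.annotation.processing.Processor, so no plugin binding or build-phase juggling is needed.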