So I was loosely following the Amdatu JPA video tutorial and I almost got it working...
At a glance everything seems fine, but the DataSource service is not resolved and I don't know why. It looks to me like it is registered. So how would I go about debugging this? There should be some way to debug it, right?
When starting, I have this in the message log:
[CM Configuration Updater (Update: pid=org.amdatu.jpa.datasourcefactory.dd8bf61e-01b1-4732-9b0c-bba96e1f5aff)] DEBUG org.amdatu.jpa.datasourcefactory - ServiceEvent REGISTERED - [javax.sql.DataSource] - org.amdatu.jpa.datasourcefactory
Output of "dm":
[5] org.amdatu.jpa.datasourcefactory
org.osgi.service.cm.ManagedServiceFactory(service.pid=org.amdatu.jpa.datasourcefactory) registered
org.osgi.service.log.LogService service optional available
javax.sql.DataSource(validationQuery=SELECT 1,name=ManagedDS,driverName=postgresql,serviceName=ManagedDS) registered
org.osgi.service.log.LogService service optional available
org.osgi.service.jdbc.DataSourceFactory (osgi.jdbc.driver.class=org.postgresql.Driver) service required available
javax.transaction.TransactionManager service required available
So the output above should mean that DataSource is registered, right?
[31] org.amdatu.jpa.extender
org.amdatu.jpa.extender.PersistenceBundleManager() registered
org.osgi.service.log.LogService service optional available
javax.persistence.spi.PersistenceProvider service required available
active (Meta-Persistence=*) bundle optional available
java.lang.Object(bundle=32) registered
org.osgi.service.log.LogService service optional available
org.amdatu.jpa.extender.PersistenceBundleManager service required available
org.amdatu.jpa.extender.PersistenceUnitInfoImpl#7175ee92 unregistered
javax.persistence.spi.PersistenceProvider (javax.persistence.provider=org.eclipse.persistence.jpa.PersistenceProvider) service required available
javax.sql.DataSource (name=ManagedDS) service required unavailable
Everything further down that depends on the DataSource is obviously not resolved:
javax.persistence.EntityManager service required unavailable
So what I don't get is why the DataSource is not resolved there. I checked, and it seems it is registered with the property name=ManagedDS, but I am quite new to Felix DS, so I am not really sure what is happening here.
I also tried adding this
@ServiceDependency(filter = "(name=ManagedDS)")
private volatile DataSource ds;
to one of my services, but that too cannot be resolved. Thanks for any help regarding this, but what I would be most grateful for is a way to debug and solve this myself.
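(For context, a component with such a filtered dependency typically looks roughly like the sketch below; it uses the Felix Dependency Manager annotations, and the class name is made up:)

import javax.sql.DataSource;

import org.apache.felix.dm.annotation.api.Component;
import org.apache.felix.dm.annotation.api.ServiceDependency;
import org.apache.felix.dm.annotation.api.Start;

@Component
public class DataSourceClient {

    // The filter has to match the service properties the DataSource was registered with,
    // e.g. name=ManagedDS from the datasourcefactory configuration.
    @ServiceDependency(filter = "(name=ManagedDS)")
    private volatile DataSource dataSource;

    @Start
    public void start() {
        // Only called once the required DataSource dependency is satisfied.
        System.out.println("DataSource injected: " + dataSource);
    }
}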
So, the Amdatu video tutorial suggested I should add
Import-Package: javax.sql;version=1.0.0
to my bundles. I tried removing that and it works. (I did that when it stopped resolving that import after I set all versions to small ranges. I still don't know why it did that, and I wish I had tried this sooner.)
So my guess as to why it works now: the bundles in my OSGi container were probably using two different versions/instances of javax.sql.DataSource, probably one from the PostgreSQL bundle and the other from someplace else (the system bundle?). Maybe one of the OSGi gurus can comment on this and clear it up?
Another sub-question: since that video suggested it is a good thing to add that import, what can I do to make it work? Or, if it is not important, should I just not bother?
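(If you do want to keep the import, a hedged option is to leave the version attribute off so the package can be wired to whichever bundle actually exports javax.sql, for example:

Import-Package: javax.sql

Whether a version or range is appropriate depends on which bundle exports the package in your framework; the important part is that every bundle using javax.sql ends up wired to the same exporter, so only one copy of the class exists.)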
Related
I have an interface which I have implemented. I have annotated the implementation with @Component and @Service from the package org.apache.felix.scr.annotations.
I wrote a simple constructor for my implementation:
public MyImpl() {
    LOG.info("New instance created!!");
}
I also added loggers in the @activate and @deactivate methods.
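(Presumably something along these lines, using the lifecycle annotations from the same SCR package; the exact method bodies are assumed:)

@Activate
protected void activate() {
    LOG.info("Activated!!");
}

@Deactivate
protected void deactivate() {
    LOG.info("Deactivated!!");
}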
I expected to see "New instance created!!" only once, BUT I can see the activate and deactivate methods being called for every request I make on a page (this service is invoked by a Sling Model which is used on that page).
What I saw was "New instance created!!" logged several times.
This means the OSGi container created multiple instances of my service and called the activate and deactivate methods every time.
This shows that it is not a singleton.
The object should be discarded only when I uninstall my bundle.
Please help me understand what is going on here.
I WANT TO IMPLEMENT A TRUE SINGLETON IN AEM
I have implemented this in an AEM 6.5 instance, which uses Apache Felix.
Edit:
Adding Service properties:
aemRootUrl http://localhost:8080
api.http.connections_manager.timeout 60000
api.http.cookie_max.age 18000
api.http.max_connections 200
api.http.max_connections_per_host 20
api.http.timeout.connection 300000
api.http.timeout.socket 300000
api.server.ssl.trust_all_certs true
api.server.url https://10asdasdsad
api.server.username admin
component.id 3925
component.name com.example.foundation.core.connection.impl.HybrisConnectionImpl
non_akamai.api.server.url hadasdadasd
service.bundleid 585
Service PID com.example.foundation.core.connection.impl.HybrisConnectionImpl
service.scope bundle
Using Bundles com.example.dumb-foundation.core (585)
Values altered to hide client specific information
EDIT:
I've removed the SCR annotations and replaced them with OSGi annotations, where I've explicitly specified
@Component(service = HybrisConnection.class, immediate = true, scope = ServiceScope.SINGLETON)
But it still shows up as scope=bundle.
Should I enforce singleton scope and OSGi annotations on its dependencies as well for this to be a proper singleton?
In Declarative Services (which is what you are using behind the scenes) there are some cases in which a component (and its service) is unpublished.
By default a simple component with immediate=true will come up when the bundle starts and go down when it stops.
If your component has any mandatory service dependencies (@Reference), then it will only be active while all dependencies are present. So if at least one dependent service goes away, the component will be deactivated.
In addition, the component might get restarted when its configuration is not present at start but is added later. If you want to avoid this, make the configuration required.
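(A minimal sketch of what that looks like with the standard OSGi Declarative Services annotations; the dependency on javax.sql.DataSource is just an illustrative example:)

import javax.sql.DataSource;

import org.osgi.service.component.annotations.Activate;
import org.osgi.service.component.annotations.Component;
import org.osgi.service.component.annotations.ConfigurationPolicy;
import org.osgi.service.component.annotations.Deactivate;
import org.osgi.service.component.annotations.Reference;

@Component(immediate = true, configurationPolicy = ConfigurationPolicy.REQUIRE)
public class MyComponent {

    // Mandatory reference: the component is only active while this service is present.
    @Reference
    private DataSource dataSource;

    @Activate
    protected void activate() {
        // Called once the required configuration exists and all mandatory references are satisfied.
    }

    @Deactivate
    protected void deactivate() {
        // Called when the configuration is deleted or a mandatory reference goes away.
    }
}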
Everything @Christian Schneider said is true.
The AEM services are singletons, but they are deactivated/unpublished at times. This can happen for various reasons.
I faced a horrible issue because of the ConfigurationAdmin service. Using this service caused our OSGi config files to be bound to the wrong bundle, i.e. the Sling Models bundle within AEM.
The only way to fix this is to get the configuration and call configAdmin.getConfiguration(PID).setBundleLocation(null);
BUT doing this causes the service that is linked to this configuration to restart.
So every time I did config.setBundleLocation(null), the service restarted.
The best and most awesome way to resolve this is to use an OCD (ObjectClassDefinition) to define the configuration for OSGi services linked to OSGi config XMLs,
AND NEVER EVER EVER use ConfigurationAdmin.
If you want to access the properties of another service, say ServiceA wants to read ServiceB's title property set in com.example.serivce.impl.ServiceB.xml,
then in ServiceB, in the @activate method, read the props from the OCD config and store them at instance level, and have ServiceA inject ServiceB as its dependency and use the property it needs.
E.g.:
class ServiceA {

    @Reference
    private ServiceB serviceB;

    public void someMethod() {
        // Successfully reads a property of another service, i.e. ServiceB,
        // without using ConfigurationAdmin.
        serviceB.getTitle();
    }
}
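(For completeness, a rough sketch of what the ServiceB side might look like with the OSGi metatype annotations; the configuration annotation and the property name are illustrative:)

import org.osgi.service.component.annotations.Activate;
import org.osgi.service.component.annotations.Component;
import org.osgi.service.metatype.annotations.Designate;
import org.osgi.service.metatype.annotations.ObjectClassDefinition;

@ObjectClassDefinition(name = "ServiceB Configuration")
@interface ServiceBConfig {
    String title() default "Default title";
}

@Component(service = ServiceB.class)
@Designate(ocd = ServiceBConfig.class)
class ServiceBImpl implements ServiceB {

    private String title;

    @Activate
    protected void activate(ServiceBConfig config) {
        // Read the OCD-backed configuration once and keep it at instance level.
        this.title = config.title();
    }

    @Override
    public String getTitle() {
        return title;
    }
}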
I'm using the Spring Framework and I'm trying to call setBeanDefinitions; the problem is that this method needs a Map of bean definitions as a parameter... Could you please tell me how I can instantiate this parameter?
NullPointerException at com.liferay.portal.spring.util.SpringFactoryImpl.setBeanDefinitions(SpringFactoryImpl.java:56)
Additional information:
I am trying to deploy a Liferay project without using the Liferay configuration files (only Spring Framework libraries); I created my own sessionFactory, my own dataSource... etc.
When I run the program, I'm able to create the database schema based on the portlet-hbm.xml information. Now I'm trying to instantiate the beans from portal-spring.xml (which are xxxxpersistance.java). Those beans tell me that they use 'com.liferay.portal.kernel.dao.orm.SessionFactory' as a required type and that a property value of type 'org.hibernate.impl.SessionFactoryImpl' cannot be converted to it. So I tried to use the Liferay libraries only for those beans and to instantiate them manually, but I wasn't able to call setBeanDefinitions because I need a Map of bean definitions as a parameter. I don't know if there is a way to get them using the sessionFactory or not.
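(If the parameter is indeed a map of bean definitions keyed by bean name, here is a hedged sketch of how such a map could be built with Spring's own API; the bean and class names are made up:)

import java.util.HashMap;
import java.util.Map;

import org.springframework.beans.factory.config.BeanDefinition;
import org.springframework.beans.factory.config.RuntimeBeanReference;
import org.springframework.beans.factory.support.GenericBeanDefinition;

Map<String, BeanDefinition> beanDefinitions = new HashMap<String, BeanDefinition>();

// Describe the bean instead of instantiating it yourself.
GenericBeanDefinition userPersistence = new GenericBeanDefinition();
userPersistence.setBeanClassName("com.example.UserPersistenceImpl"); // hypothetical class
// Reference another bean (e.g. your session factory) by name rather than by instance.
userPersistence.getPropertyValues().add("sessionFactory", new RuntimeBeanReference("sessionFactory"));

beanDefinitions.put("userPersistence", userPersistence);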
Thanks again
You only mention JUnit in the tags of your question. I'd recommend writing unit tests without relying on the whole Liferay infrastructure. That will tremendously lower your required setup effort and simplify your life a lot.
I'm developing my first Tapestry application with a login system based on a Hibernate database.
On one page with a session object, I want to call my Authenticator service class, which also gets the session injected and does some stuff. My problem is that I can't get any services to run; it's been very frustrating, despite my following simple guides like this one: http://code.google.com/p/shams/wiki/Service
In my services package, I have the Authenticator.java interface and the AuthenticatorImpl.java implementation class. In the AppModule class, I call
binder.bind(Authenticator.class, AuthenticatorImpl.class);
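(For reference, a minimal sketch of how such a module is usually structured, assuming the AppModule lives in the services package that Tapestry scans; the package and class names follow the ones mentioned in the error below:)

package de.webtech2.services;

import org.apache.tapestry5.ioc.ServiceBinder;

public class AppModule {

    public static void bind(ServiceBinder binder) {
        // Tells Tapestry-IoC which implementation backs the Authenticator interface.
        binder.bind(Authenticator.class, AuthenticatorImpl.class);
    }
}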
And in my page 'ShowAllUsers' I inject my Authenticator service object:
...
public class ShowAllUsers {

    @Inject
    private Session session;

    @Inject
    private Authenticator authenticator;
    ...
}
But when I load the page on my server, I receive the following error:
org.apache.tapestry5.ioc.internal.OperationException
Error obtaining injected value for field de.webtech2.pages.user.ShowAllUsers.authenticator: No service implements the interface de.webtech2.services.Authenticator.
trace:
- Creating instantiator for component class de.webtech2.pages.user.ShowAllUsers
- Running component class transformations on de.webtech2.pages.user.ShowAllUsers
- Injecting field de.webtech2.pages.user.ShowAllUsers.authenticator
But my AppModule does bind the class to the interface successfully. In the Maven build console I can read "Authenticator: DEFINED", and if I try to bind it in another module, it complains because it's already bound in AppModule.
Why doesn't Tapestry see the implementation? What am I doing wrong?
Glad you checked the startup log output; that's certainly the first "sanity check" towards addressing this problem.
I think uklance has the right idea: do a clean build, make sure you don't have multiple classes named Authenticator floating around ... perhaps from a 3rd party library. I'm always having problems where I accidentally import a non-Tapestry class that happens to be named "Resource" or something.
I haven't solved the issue itself, but I found a workaround that fixes it. As you might find on the internet, Tapestry allows for auto-reloading of classes. Pages and components work fine, but services have some limitations -- this is where the issues seem to arise. Tomcat doesn't link the interface to the implementation.
Fix: a simple restart of Eclipse solves this. Meh.
(This also fixes the "method not found" error if you added a new method to an existing service.)
Also, when I execute mvn clean, everything gets screwed up many times over. Eclipse can no longer resolve the simplest class and package references. Classes in the same package can no longer be found, or references to the javax.internet package lead into eternal nothingness -- whereas everything was working just fine a moment ago.
Fix:
Right-click the Eclipse project -> Properties -> Maven
Tick the checkbox for "Resolve dependencies from Workspace projects" and hit Apply.
If it is already checked, uncheck -> apply, then recheck -> apply. Eclipse should go sane again -- until next time...
When starting WebSphere, I get this exception:
Could not instantiate bean class [org.springframework.web.servlet.mvc.annotation.AnnotationMethodHandlerAdapter]:
Constructor threw exception; nested exception is java.lang.ClassCastException:
com.ibm.xtq.xslt.jaxp.compiler.TransformerFactoryImpl incompatible with
javax.xml.transform.TransformerFactory
Caused by: java.lang.ClassCastException: com.ibm.xtq.xslt.jaxp.compiler.TransformerFactoryImpl
incompatible with javax.xml.transform.TransformerFactory
at javax.xml.transform.TransformerFactory.newInstance(Unknown Source)
at org.springframework.http.converter.xml.AbstractXmlHttpMessageConverter.<init>(AbstractXmlHttpMessageConverter.java:47)
at org.springframework.http.converter.xml.SourceHttpMessageConverter.<init>(SourceHttpMessageConverter.java:45)
at org.springframework.web.servlet.mvc.annotation.AnnotationMethodHandlerAdapter.<init>(AnnotationMethodHandlerAdapter.java:197)
This doesn't seem to have any impact on any beans in my applicationContext.xml, but it's still odd. To me, this looks as if IBM classes are leaking into my application.
How can I fix this? I already set the option "Access to internal server classes" to "Restrict".
It was indeed a class-loading issue; however, it could not be solved by changing class-loader settings.
The problem was that the xml-apis and javax.xml jars were being pulled in via some Maven dependencies.
Since we had already set the class loader policy for the application to PARENT_LAST, javax.xml.transform.TransformerFactory was being loaded by the WebApp class loader from our jar files.
However, its implementation 'com.ibm.xtq.xslt.jaxp.compiler.TransformerFactoryImpl' was coming from the server class loader, and that one was linked to the javax.xml.transform.TransformerFactory provided by the JDK/JRE.
Since the classes were loaded from different sources, a ClassCastException was thrown.
Removing all dependencies on the xml-apis / xerces / javax.xml jars solved the problem.
Since these APIs are now part of the JDK, they no longer need to be imported.
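(If you need to verify this kind of mismatch yourself, a quick diagnostic -- just a sketch -- is to print where the API class and the chosen implementation actually come from:)

import javax.xml.transform.TransformerFactory;

public class TransformerFactoryDebug {
    public static void main(String[] args) {
        TransformerFactory factory = TransformerFactory.newInstance();
        // Which implementation was picked, and which class loaders / code sources are involved?
        System.out.println("Impl class:  " + factory.getClass().getName());
        System.out.println("Impl loader: " + factory.getClass().getClassLoader());
        System.out.println("API loader:  " + TransformerFactory.class.getClassLoader());
        System.out.println("API source:  " + TransformerFactory.class.getProtectionDomain().getCodeSource());
    }
}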
... and if you wonder why I know so much about this issue: I work together with Aaron. ;)
I can't speak for "Restrict" as I have no personal experience with it, but I think the problem has more to do with the IBM class loader. The class you are referring to is part of IBM's Java implementation of TransformerFactory. I think you can try one of the following to solve the issue at hand:
Either change the server class loader policy to PARENT_LAST (this way the class loader will find the class on the application's local class path before going up the chain all the way to the Java runtime),
or look at the jaxp.properties file; I think it is located in (was_root\java\jre\lib). I have only read about this option and never actually used it.
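(Roughly, that file lets you pin the factory by its fully qualified class name; the value below is only an illustrative example of selecting a different XSLT processor:

javax.xml.transform.TransformerFactory=org.apache.xalan.processor.TransformerFactoryImpl

The exact value to use depends on which implementation is actually available on your server's classpath.)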
Why do you say IBM classes are leaking into your application?
The TransformerFactory is asked to create a new instance. It follows a sequence of steps to determine which TransformerFactory to use. If no configuration is specified, it simply chooses the default factory.
Here is the javadoc for TransformerFactory:
http://download.oracle.com/javase/1.5.0/docs/api/javax/xml/transform/TransformerFactory.html#newInstance()
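(If you do need to override that lookup, one hedged option is to set the corresponding system property before the factory is first used; the implementation class below is only an example value:)

import javax.xml.transform.TransformerFactory;

public class ForceTransformerFactory {
    public static void main(String[] args) {
        // Force a specific TransformerFactory implementation for the whole JVM (example value).
        System.setProperty("javax.xml.transform.TransformerFactory",
                "org.apache.xalan.processor.TransformerFactoryImpl");

        TransformerFactory factory = TransformerFactory.newInstance();
        System.out.println(factory.getClass().getName());
    }
}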
What is the OS? Is it AIX?
http://www.ibm.com/developerworks/java/jdk/aix/j664/sdkguide.aix64.html
Looking at this doc (link above) for AIX, it tells me that this is the default implementation:
javax.xml.transform.TransformerFactory
Selects the XSLT processor. Possible values are:
com.ibm.xtq.xslt.jaxp.compiler.TransformerFactoryImpl
Use the XL TXE-J compiler. This value is the default.
Post back additional information so that we can try and troubleshoot this.
HTH
Manglu
I'm using the Tapestry5 tapx template library to send an HTML email, as per this example.
When I run the example I get the following error:
Caused by: java.lang.RuntimeException: No service implements the interface org.springframework.context.ApplicationContext.
at org.apache.tapestry5.ioc.internal.RegistryImpl.getService(RegistryImpl.java:560)
at org.apache.tapestry5.ioc.internal.ObjectLocatorImpl.getService(ObjectLocatorImpl.java:44)
All the tapestry-* jars, including tapestry-spring-5.1.05.jar, are on my classpath.
Any clues as to what I'm missing?
Figured it out. Tapestry IoC loads all the modules it finds on the classpath. The SpringModule, in tapestry-spring.jar, attempts to initialise the ApplicationContext service, which causes the problem.
Removing tapestry-spring.jar from the classpath fixes the problem.
Follow the directions on the web site carefully; my guess is that you are not using the special TapestrySpringFilter (instead of the normal TapestryFilter).
It's been a while since I looked at this code; I can't remember if the ApplicationContext is exposed as a service or injectable object. Seems like it should be.
Fair enough; not sure what your situation is, but you should look in more detail at what TapestrySpringFilter does in terms of setup and replicate it in your standalone app's startup. There's some special bootstrapping magic that you will want to leverage.