WebLogic, EJB, $Proxy99 class cast exception - ejb-3.0

The following are the modules in my project:
1. EJB module (EJB 3.0): We package this module as an EJB JAR and deploy it on a WebLogic 11g server. It handles database operations. It has @Local and @Remote interfaces and @Stateless classes implementing those interfaces.
2. Web application: It takes input from users (a user uploads a file), validates the file and inserts the data into the database. It calls the EJBs over RMI.
Problem: In production (WebLogic 11g) we sometimes see an exception saying $Proxy99 cannot be cast to the remote interface (for different classes), e.g. com.xyz.fileProcessSetting.FileProcessSttgFacadeRemote.
But when we upload the file again after some time, it succeeds without any error.
I do not understand how these remote objects become temporarily unavailable. We have never faced this issue in the development/UAT environments, and I have no idea how to reproduce or fix it.
Please help. Thanks in advance.
@Remote
public interface FileProcessSttgFacadeRemote {
//methods
}
@Local
public interface FileProcessSttgFacadeLocal {
//methods
}
@Stateless
public class FileProcessSttgFacade implements FileProcessSttgFacadeLocal, FileProcessSttgFacadeRemote {
//methods
}
in weblogic-ejb-jar.xml
<weblogic-enterprise-bean>
<ejb-name>FileProcessSttgFacade</ejb-name>
<stateless-session-descriptor>
<business-interface-jndi-name-map>
<business-remote>com.xyz.fileProcessSetting.FileProcessSttgFacadeRemote</business-remote>
<jndi-name>FileProcessSttgFacade</jndi-name>
</business-interface-jndi-name-map>
</stateless-session-descriptor>
</weblogic-enterprise-bean>
In the web application, and also within the EJB module, whenever we want to call a method we use the following lookup to get the remote object:
public class someclass extends EjbLocator {
public void someMethod(){
FileProcessSttgFacadeRemote fpfr = (FileProcessSttgFacadeRemote) getService("FileProcessSttgFacade");
//other code
}
}
Following is the class used for JNDI lookup:
import java.util.Hashtable;
import javax.naming.Context;
import javax.naming.InitialContext;
import javax.naming.NamingException;

public class EjbLocator {
// The original snippet used obj and defaultContext without declaring them.
private Context defaultContext;

public Object getService(final String jndiName) throws Exception {
Object obj = null;
try {
obj = getDefaultContext().lookup(jndiName);
} catch (final Exception exp) {
exp.printStackTrace();
}
return obj;
}

protected Context getDefaultContext() {
try {
final Hashtable<String, String> env = new Hashtable<String, String>();
env.put(Context.INITIAL_CONTEXT_FACTORY, "weblogic.jndi.WLInitialContextFactory");
env.put(Context.SECURITY_PRINCIPAL, "weblogic");
env.put(Context.SECURITY_CREDENTIALS, "password");
env.put(Context.PROVIDER_URL, "t3://<ip>:<port>");
defaultContext = new InitialContext(env);
return defaultContext;
} catch (final NamingException nExp) {
nExp.printStackTrace();
}
return null;
}
}

Related

java.lang.NullPointerException: When saving jpa data using Gemfire cachewriter

JPA repository save works in all other classes, but when trying to save from the CacheWriter it throws a NullPointerException at personRepository.save(entryEvent.getNewValue()). Any idea why? MySQL is configured in the application properties.
java.lang.NullPointerException
at com.javasampleapproach.gemfirerestapi.GemfireWriter.beforeCreate(GemfireWriter.java:28)
at com.gemstone.gemfire.internal.cache.LocalRegion.cacheWriteBeforePut(LocalRegion.java:3131)
at com.gemstone.gemfire.internal.cache.AbstractRegionMap.invokeCacheWriter(AbstractRegionMap.java:3145)
at com.gemstone.gemfire.internal.cache.AbstractRegionMap.basicPut(AbstractRegionMap.java:2909)
at com.gemstone.gemfire.internal.cache.LocalRegion.virtualPut(LocalRegion.java:5821)
at com.gemstone.gemfire.internal.cache.LocalRegionDataView.putEntry(LocalRegionDataView.java:118)
at com.gemstone.gemfire.internal.cache.LocalRegion.basicPut(LocalRegion.java:5211)
at com.gemstone.gemfire.internal.cache.LocalRegion.validatedPut(LocalRegion.java:1597)
at com.gemstone.gemfire.internal.cache.LocalRegion.put(LocalRegion.java:1580)
at com.gemstone.gemfire.internal.cache.AbstractRegion.put(AbstractRegion.java:327)
Controller:
@GetMapping(value = "/getPerson")
public Iterable<Person> getPerson(@RequestParam("id") long personId, @RequestParam("age") int age, @RequestParam("name") String name) {
try{
Person bob = new Person();
bob.setPersonId(personId);
bob.setAge(age);
bob.setName(name);
Region<Long,Person> region=gemfireCache.getRegion("person");
region.put(personId, bob);
}catch(Exception e){
e.printStackTrace();
}
return personRepository.findAll();
}
Cachewriter:
public class GemfireCacheWriter implements CacheWriter<Long, Person> {
@Autowired
PersonRepository personRepository;
@Override
public void beforeCreate(EntryEvent<Long, Person> entryEvent) throws CacheWriterException {
// TODO Auto-generated method stub
personRepository.save(entryEvent.getNewValue());
}
}
CacheWriter Config:
@Bean
LocalRegionFactoryBean<Long, Person> personRegion(final GemFireCache cache) {
LocalRegionFactoryBean<Long, Person> personRegion = new LocalRegionFactoryBean<>();
personRegion.setCache(cache);
personRegion.setName("person");
personRegion.setPersistent(false);
personRegion.setCacheWriter(new GemfireWriter());
personRegion.setCacheLoader(new GemfireLoader());
return personRegion;
}
Looking at the source code for LocalRegion, I don't think the entryEvent received by your CacheWriter could be null, so the actual null reference is probably personRepository. Have you correctly configured spring-data-gemfire to autowire the PersonRepository? Is the CacheWriter configured as a Spring bean (annotated with @Component, for example)?
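If the writer is not currently a Spring-managed bean, a minimal sketch of that wiring could look like the following. Two assumptions for illustration: the writer extends GemFire's CacheWriterAdapter (rather than implementing CacheWriter directly) purely so the sketch compiles without the remaining callbacks, and Person/PersonRepository are the classes from your question:

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Component;

import com.gemstone.gemfire.cache.CacheWriterException;
import com.gemstone.gemfire.cache.EntryEvent;
import com.gemstone.gemfire.cache.util.CacheWriterAdapter;

// Spring-managed writer: letting Spring construct it is what populates the repository.
@Component
public class GemfireWriter extends CacheWriterAdapter<Long, Person> {

    private final PersonRepository personRepository;

    @Autowired // constructor injection, so personRepository can never be null here
    public GemfireWriter(PersonRepository personRepository) {
        this.personRepository = personRepository;
    }

    @Override
    public void beforeCreate(EntryEvent<Long, Person> entryEvent) throws CacheWriterException {
        personRepository.save(entryEvent.getNewValue());
    }
}

The region configuration then receives the managed writer instead of creating one with new, for example:

@Bean
LocalRegionFactoryBean<Long, Person> personRegion(final GemFireCache cache, GemfireWriter gemfireWriter) {
    LocalRegionFactoryBean<Long, Person> personRegion = new LocalRegionFactoryBean<>();
    personRegion.setCache(cache);
    personRegion.setName("person");
    personRegion.setPersistent(false);
    personRegion.setCacheWriter(gemfireWriter); // the Spring bean, not new GemfireWriter()
    return personRegion;
}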
You can use the Write Through Example as a good starting point for implementing this use case.
Hope this helps. Cheers.

JAXBElement: providing codec (/converter?) for class java.lang.Class

I have been evaluating spring-data-mongodb for adoption in a project. In summary, my aim is:
Using existing XML schema files to generate Java classes.
This is achieved using JAXB xjc
The root class is TSDProductDataType and is further modeled as below:
The thing to note here is that ExtensionType contains protected List<Object> any;, allowing it to store objects of any class. In my case these are the classes named TSDModule_Name_HereModuleType and can be browsed here.
Use spring-data-mongodb as persistence store
This is achieved using a simple ProductDataRepository
@RepositoryRestResource(collectionResourceRel = "product", path = "product")
public interface ProductDataRepository extends MongoRepository<TSDProductDataType, String> {
TSDProductDataType queryByGtin(@Param("gtin") String gtin);
}
The unmarshalled TSDProductDataType, however, contains JAXBElement fields, which spring-data-mongodb doesn't seem to handle by itself; it throws a CodecConfigurationException: org.bson.codecs.configuration.CodecConfigurationException: Can't find a codec for class java.lang.Class.
Here is the faulty statement:
TSDProductDataType tsdProductDataType = jaxbElement.getValue();
repository.save(tsdProductDataType);
I tried playing around with Converters for spring-data-mongodb as explained here, however, it seems I am missing something since the exception is about "Codecs" and not "Converters".
Any help is appreciated.
EDIT:
Adding converters for JAXBElement
Note: Works with version 1.5.6.RELEASE of org.springframework.boot::spring-boot-starter-parent. With version 2.0.0.M3, hell breaks loose
It seems that I missed something while trying to add the converters earlier. So I added them like below for testing:
@Component
@ReadingConverter
public class JAXBElementReadConverter implements Converter<DBObject, JAXBElement> {
//@Autowired
//MongoConverter converter;
@Override
public JAXBElement convert(DBObject dbObject) {
Class declaredType, scope;
QName name = qNameFromString((String)dbObject.get("name"));
Object rawValue = dbObject.get("value");
try {
declaredType = Class.forName((String)dbObject.get("declaredType"));
} catch (ClassNotFoundException e) {
if (rawValue.getClass().isArray()) declaredType = List.class;
else declaredType = LinkedHashMap.class;
}
try {
scope = Class.forName((String) dbObject.get("scope"));
} catch (ClassNotFoundException e) {
scope = JAXBElement.GlobalScope.class;
}
//Object value = rawValue instanceof DBObject ? converter.read(declaredType, (DBObject) rawValue) : rawValue;
Object value = "TODO";
return new JAXBElement(name, declaredType, scope, value);
}
QName qNameFromString(String s) {
String[] parts = s.split("[{}]");
if (parts.length > 2) return new QName(parts[1], parts[2], parts[0]);
if (parts.length == 1) return new QName(parts[0]);
return new QName("undef");
}
}
@Component
@WritingConverter
public class JAXBElementWriteConverter implements Converter<JAXBElement, DBObject> {
//@Autowired
//MongoConverter converter;
@Override
public DBObject convert(JAXBElement jaxbElement) {
DBObject dbObject = new BasicDBObject();
dbObject.put("name", qNameToString(jaxbElement.getName()));
dbObject.put("declaredType", jaxbElement.getDeclaredType().getName());
dbObject.put("scope", jaxbElement.getScope().getCanonicalName());
//dbObject.put("value", converter.convertToMongoType(jaxbElement.getValue()));
dbObject.put("value", "TODO");
dbObject.put("_class", JAXBElement.class.getName());
return dbObject;
}
public String qNameToString(QName name) {
if (name.getNamespaceURI() == XMLConstants.NULL_NS_URI) return name.getLocalPart();
return name.getPrefix() + '{' + name.getNamespaceURI() + '}' + name.getLocalPart();
}
}
@SpringBootApplication
public class TsdApplication {
public static void main(String[] args) {
SpringApplication.run(TsdApplication.class, args);
}
@Bean
public CustomConversions customConversions() {
return new CustomConversions(Arrays.asList(
new JAXBElementReadConverter(),
new JAXBElementWriteConverter()
));
}
}
So far so good. However, how do I instantiate the MongoConverter converter?
MongoConverter is an interface so I guess I need an instantiable class adhering to this interface. Any suggestions?
I understand the desire for the convenience of mapping an existing domain object straight to the database layer with no boilerplate, but even if you weren't having the JAXB class-structure issue, I would still recommend against persisting it verbatim. Unless this is a simple one-off project, you will almost certainly hit a point where your domain model needs to change while the persisted data must remain in its existing state. If you just persist the data straight, you have no mechanism to convert between a newer domain schema and an older persisted data schema. Versioning the persisted data schema would be wise too.
The link you posted for writing the custom converters is one way to achieve this and fits in nicely with the Spring ecosystem. That method should also solve the issue you are experiencing (the underlying messy JAXB data structure not converting cleanly).
Are you unable to get that method working? Ensure the converters are loaded into the Spring context, either with @Component plus auto class scanning or manually via a configuration class.
EDIT to address your EDIT:
Add the following to each of your converters:
private final MongoConverter converter;
public JAXBElement____Converter(MongoConverter converter) {
this.converter = converter;
}
Try changing your bean definition to:
@Bean
public CustomConversions customConversions(@Lazy MongoConverter converter) {
return new CustomConversions(Arrays.asList(
new JAXBElementReadConverter(converter),
new JAXBElementWriteConverter(converter)
));
}
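For completeness, a sketch of the read converter with that constructor in place (the write converter is analogous). It reuses the DBObject-based code from your question, so it assumes the Spring Boot 1.5.x / spring-data-mongodb 1.x API, and only the value handling changes:

import java.util.LinkedHashMap;
import java.util.List;

import javax.xml.bind.JAXBElement;
import javax.xml.namespace.QName;

import org.springframework.core.convert.converter.Converter;
import org.springframework.data.convert.ReadingConverter;
import org.springframework.data.mongodb.core.convert.MongoConverter;

import com.mongodb.DBObject;

// No @Component needed if customConversions(...) constructs the instances itself.
@ReadingConverter
public class JAXBElementReadConverter implements Converter<DBObject, JAXBElement> {

    private final MongoConverter converter;

    public JAXBElementReadConverter(MongoConverter converter) {
        this.converter = converter;
    }

    @Override
    public JAXBElement convert(DBObject dbObject) {
        Class declaredType, scope;
        QName name = qNameFromString((String) dbObject.get("name"));
        Object rawValue = dbObject.get("value");
        try {
            declaredType = Class.forName((String) dbObject.get("declaredType"));
        } catch (ClassNotFoundException e) {
            declaredType = rawValue.getClass().isArray() ? List.class : LinkedHashMap.class;
        }
        try {
            scope = Class.forName((String) dbObject.get("scope"));
        } catch (ClassNotFoundException e) {
            scope = JAXBElement.GlobalScope.class;
        }
        // The injected MongoConverter replaces the "TODO" placeholder:
        Object value = rawValue instanceof DBObject
                ? converter.read(declaredType, (DBObject) rawValue)
                : rawValue;
        return new JAXBElement(name, declaredType, scope, value);
    }

    QName qNameFromString(String s) {
        String[] parts = s.split("[{}]");
        if (parts.length > 2) return new QName(parts[1], parts[2], parts[0]);
        if (parts.length == 1) return new QName(parts[0]);
        return new QName("undef");
    }
}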

Custom JASPIC on WebSphere error message

Though similar, the specific problem I have is not addressed in Use JASPIC auth module on WebSphere 8.5
I am getting the following error message:
SECJ8027E: The path and name of file where JASPI persistent registrations are stored must be specified using property com.ibm.websphere.jaspi.configuration.
I can set the custom property in the administration console to some existing folder, but I wanted to make sure that is the right approach, or whether there is a step I am missing.
Note that I am specifically using the "embedded in application" approach rather than a server-installed JASPIC module, so I have something like this:
@WebListener
public class JaspicInitializer implements ServletContextListener {
@Override
public void contextInitialized(final ServletContextEvent sce) {
final Map<String, String> options = new HashMap<>();
AuthConfigFactory.getFactory()
.registerConfigProvider(AuthModuleConfigProvider.class.getName(), options, "HttpServlet", null, null);
}
}
I had the error on both WebSphere 8.5.5.11 and 9.0.0.3
Following @Uux's comment, I changed the way I do the registration, so it no longer gives the error.
@WebListener
public class JaspicInitializer implements ServletContextListener {
private String registrationID;
@Override
public void contextDestroyed(final ServletContextEvent sce) {
AuthConfigFactory.getFactory().removeRegistration(registrationID);
}
@Override
public void contextInitialized(final ServletContextEvent sce) {
final ServletContext context = sce.getServletContext();
registrationID = AuthConfigFactory.getFactory()
.registerConfigProvider(new AuthModuleConfigProvider(), "HttpServlet",
context.getVirtualServerName() + " " + context.getContextPath(), "JEE Sample");
}
}
Also, WebSphere Global Security needs to be configured with:
Enable application security
Enable Java Authentication SPI (JASPI)

CGLIB proxy not getting created for Transactional Proxies

Here is what I am doing:
@Component("jdbcBookDao")
public class JdbcBookDao extends JdbcDaoSupport implements BookDao {
@Autowired
public void injectDataSource(DataSource dataSource) {
setDataSource(dataSource);
}
@Transactional
public int getStock(int isbn){
String sql = "SELECT bs.STOCK FROM BOOK b, BOOK_STOCK bs WHERE b.id=bs.book_id AND b.isbn=?";
return getJdbcTemplate().queryForInt(sql, isbn);
}
}
And in the application context, I have declared:
<tx:annotation-driven proxy-target-class="true"/>
With this config, I expected that when I fetch jdbcBookDao from the context it would be a CGLIB proxy (as I have set proxy-target-class to true). But when I debug, it comes back as an instance of JdkDynamicAopProxy. Can someone please explain why a JDK proxy is created even though I requested a CGLIB proxy?
Thanks.
Looking at the Spring source code: if the proxied type is an interface you get the JDK proxy, and if it is a normal class you get CGLIB.
public AopProxy createAopProxy(AdvisedSupport config) throws AopConfigException {
if (config.isOptimize() || config.isProxyTargetClass() || hasNoUserSuppliedProxyInterfaces(config)) {
Class targetClass = config.getTargetClass();
if (targetClass == null) {
throw new AopConfigException("TargetSource cannot determine target class: " +
"Either an interface or a target is required for proxy creation.");
}
if (targetClass.isInterface()) {
return new JdkDynamicAopProxy(config);
}
if (!cglibAvailable) {
throw new AopConfigException(
"Cannot proxy target class because CGLIB2 is not available. " +
"Add CGLIB to the class path or specify proxy interfaces.");
}
return CglibProxyFactory.createCglibProxy(config);
}
else {
return new JdkDynamicAopProxy(config);
}
}
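As a side note, the branching above can be observed outside the container with a small standalone sketch (hypothetical class names, and it assumes a Spring version with CGLIB available on the classpath); setProxyTargetClass(true) is the programmatic equivalent of proxy-target-class="true":

import org.springframework.aop.framework.ProxyFactory;

public class ProxyTypeDemo {

    interface BookDao {
        int getStock(int isbn);
    }

    static class JdbcBookDao implements BookDao {
        public int getStock(int isbn) {
            return 42;
        }
    }

    public static void main(String[] args) {
        // Target implements an interface and proxy-target-class is left false: JDK proxy,
        // which implements BookDao but is not a subclass of JdbcBookDao.
        ProxyFactory jdkFactory = new ProxyFactory(new JdbcBookDao());
        System.out.println(jdkFactory.getProxy() instanceof JdbcBookDao); // false

        // Forcing proxy-target-class: CGLIB proxy, a generated subclass of JdbcBookDao.
        ProxyFactory cglibFactory = new ProxyFactory(new JdbcBookDao());
        cglibFactory.setProxyTargetClass(true);
        System.out.println(cglibFactory.getProxy() instanceof JdbcBookDao); // true
    }
}

This only illustrates the decision logic in createAopProxy; it does not by itself explain why a particular transactional bean ends up behind a JDK proxy in your application context.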

ClassCastException when using embedded glassfish for unit tests

I'm running some unit tests on some EJBs via Maven and an embedded GlassFish container. One of my tests works, but all subsequent attempts to test a different EJB result in the same error:
java.lang.ClassCastException: $Proxy81 cannot be cast to
Followed by whatever bean I'm attempting to test. I'm confident my setup is good since, as I say, one of my beans can be tested properly.
Examples of working code:
@Stateful
public class LayoutManagerBean implements LayoutManager {
private final Log LOG = LogFactory.getLog(LayoutManagerBean.class);
public List<Menu> getMenus(User currentUser) {
...
}
}
@Local
public interface LayoutManager {
public List<Menu> getMenus(User user);
}
And the test:
public class LayoutManagerTest {
private static EJBContainer ejbContainer;
private static Context ctx;
@BeforeClass
public static void setUp() {
ejbContainer = EJBContainer.createEJBContainer();
ctx = ejbContainer.getContext();
}
@AfterClass
public static void tearDown() {
ejbContainer.close();
}
@Test
public void getMenus() {
LayoutManager manager = null;
try {
manager = (LayoutManager) ctx.lookup("java:global/classes/LayoutManagerBean!uk.co.monkeypower.openchurch.core.layout.beans.LayoutManager");
} catch (NamingException e) {
System.out.println("Failed to lookup the gosh darned bean!");
}
assertNotNull(manager);
//Menu[] menus = manager.getMenus();
//assertTrue(menus.length > 1);
}
}
And an example of a failure:
@Singleton
public class OpenChurchPortalContext implements PortalContext {
private Set<PortletMode> portletModes = Collections.emptySet();
private Set<WindowState> windowStates = Collections.emptySet();
private Properties portalProperties = new Properties();
public OpenChurchPortalContext() {
portletModes.add(PortletMode.VIEW);
portletModes.add(PortletMode.HELP);
portletModes.add(PortletMode.EDIT);
portletModes.add(new PortletMode("ABOUT"));
windowStates.add(WindowState.MAXIMIZED);
windowStates.add(WindowState.MINIMIZED);
windowStates.add(WindowState.NORMAL);
}
...
}
And the test:
public class OpenChurchPortalContextTest {
private static EJBContainer ejbContainer;
private static Context ctx;
@BeforeClass
public static void setUp() {
ejbContainer = EJBContainer.createEJBContainer();
ctx = ejbContainer.getContext();
}
@AfterClass
public static void tearDown() {
ejbContainer.close();
}
@Test
public void test() {
OpenChurchPortalContext context = null;
try {
context = (OpenChurchPortalContext) ctx.lookup("java:global/classes/OpenChurchPortalContext");
} catch (NamingException e) {
System.out.println("Failed to find the bean in the emebedded jobber");
}
assertNotNull(context);
Set<PortletMode> modes = (Set<PortletMode>) context.getSupportedPortletModes();
assertTrue(modes.size() > 1);
Set<WindowState> states = (Set<WindowState>) context.getSupportedWindowStates();
assertTrue(states.size() > 1);
}
}
Any ideas as to why this may not be working?
You often get this problem if you are proxying a class, not an interface. Assuming that it's this line which is failing:
context = (OpenChurchPortalContext) ctx.lookup("java:global/classes/OpenChurchPortalContext");
OpenChurchPortalContext is a class, but it is being wrapped by a proxy class to implement the EJB specific functionality. This proxy class isn't a subclass of OpenChurchPortalContext, so you're getting a ClassCastException.
You aren't getting this with the first example, because the LayoutManager is an interface.
LayoutManager manager = null; // INTERFACE, so it works
try {
manager = (LayoutManager) ctx.lookup("java:global/classes/LayoutManagerBean!uk.co.monkeypower.openchurch.core.layout.beans.LayoutManager");
} catch (NamingException e) {
System.out.println("Failed to lookup the gosh darned bean!");
}
First, to test whether this really is your problem, change context to be a PortalContext rather than an OpenChurchPortalContext:
PortalContext context = null;
try {
context = (PortalContext) ctx.lookup("java:global/classes/OpenChurchPortalContext");
} catch (NamingException e) {
System.out.println("Failed to find the bean in the emebedded jobber");
}
If your problem really is the Proxy, then the above code should work. If this is the case, you have two potential solutions:
When you do the ctx.lookup, always use an interface. This can be a bit of a pain, because you need to define an interface specifically for each EJB.
You may be able to configure your EJB container to proxy the classes instead of just the interfaces, similar to proxyTargetClass for Spring AOP. You'll need to check with the documentation for your container for that.
Your singleton EJB has a default local business interface by virtue of implementing the PortalContext interface. The test client should know it only by its business interface, and the actual bean class (OpenChurchPortalContext) should not be referenced directly by the client. So the fix is to look it up by its business interface, PortalContext.
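As a minimal sketch, the interface-based lookup could be written as an extra test method in the OpenChurchPortalContextTest class above. The portable bean!interface JNDI name used here is an assumption modeled on the LayoutManager lookup, on the premise that javax.portlet.PortalContext is registered as the bean's local business interface:

@Test
public void lookupByBusinessInterface() throws NamingException {
    // Cast to the business interface, never the bean class, so the container's
    // proxy ($Proxy81) satisfies the cast.
    PortalContext context = (PortalContext) ctx.lookup(
            "java:global/classes/OpenChurchPortalContext!javax.portlet.PortalContext");
    assertNotNull(context);
}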
