What is the difference between these two code snippets? (Spring Boot)

These two snippets should do exactly the same thing, but the first one works and the second one doesn't. Can anyone review the code and explain why the second approach fails?
The first code:
@Component
public class AdminSqlUtil implements SqlUtil {

    @Autowired private ApplicationContext context;
    DataSource dataSource = (DataSource) context.getBean("adminDataSource");

    public void runSqlFile(String SQLFileName) {
        Resource resource = context.getResource(SQLFileName);
        EncodedResource encodedResource = new EncodedResource(resource, Charset.forName("UTF-8"));
        try {
            ScriptUtils.executeSqlScript(dataSource.getConnection(), encodedResource);
        } catch (SQLException ex) {
            throw new RuntimeException(ex);
        }
    }
}
The second code:
@Component
public class AdminSqlUtil implements SqlUtil {

    @Autowired private ApplicationContext context;

    public void runSqlFile(String SQLFileName) {
        Resource resource = context.getResource(SQLFileName);
        EncodedResource encodedResource = new EncodedResource(resource, Charset.forName("UTF-8"));
        try {
            ScriptUtils.executeSqlScript((DataSource) context.getBean("adminDataSource").getConnection(), encodedResource);
        } catch (SQLException ex) {
            throw new RuntimeException(ex);
        }
    }
}

The first one has a private field that the framework cannot access directly. You could add @Inject before the private variable so the framework can initialize it, but the best practice is to define a public setter for the dependency.
The second one, on the other hand, initializes the value up front, which is not dependency injection at all. I am not talking about good versus bad practice; it is simply wrong. We don't initialize a variable that is supposed to be initialized by the framework.
So let's go with the first one; try adding a setter for it.
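For illustration, here is a minimal sketch of the setter-based variant suggested above. The bean name "adminDataSource" comes from the question; the import list and everything else about the surrounding class are assumptions.

import java.nio.charset.StandardCharsets;
import java.sql.SQLException;
import javax.sql.DataSource;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.context.ApplicationContext;
import org.springframework.core.io.Resource;
import org.springframework.core.io.support.EncodedResource;
import org.springframework.jdbc.datasource.init.ScriptUtils;
import org.springframework.stereotype.Component;

@Component
public class AdminSqlUtil implements SqlUtil {

    private ApplicationContext context;
    private DataSource dataSource;

    // A public setter lets the framework supply the dependency after the
    // object is constructed; the DataSource lookup happens here, not in a
    // field initializer that would run before injection.
    @Autowired
    public void setContext(ApplicationContext context) {
        this.context = context;
        this.dataSource = (DataSource) context.getBean("adminDataSource");
    }

    public void runSqlFile(String sqlFileName) {
        Resource resource = context.getResource(sqlFileName);
        EncodedResource encodedResource = new EncodedResource(resource, StandardCharsets.UTF_8);
        try {
            ScriptUtils.executeSqlScript(dataSource.getConnection(), encodedResource);
        } catch (SQLException ex) {
            throw new RuntimeException(ex);
        }
    }
}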
Take a look at this link.

Related

Why is exception in Spring Batch AsyncItemProcessor caught by SkipListener's onSkipInWrite method?

I'm writing a Spring Boot application that starts up, gathers and converts millions of database entries into a new streamlined JSON format, and then sends them all to a GCP PubSub topic. I'm attempting to use Spring Batch for this, but I'm running into trouble implementing fault tolerance for my process. The database is rife with data quality issues, and sometimes my conversions to JSON will fail. When failures occur, I don't want the job to quit immediately; I want it to continue processing as many records as it can and, before completion, report exactly which records failed so that I and/or my team can examine these problematic database entries.
To achieve this, I've attempted to use Spring Batch's SkipListener interface. But I'm also using an AsyncItemProcessor and an AsyncItemWriter in my process, and even though the exceptions are occurring during the processing, the SkipListener's onSkipInWrite() method is catching them - rather than the onSkipInProcess() method. And unfortunately, the onSkipInWrite() method doesn't have access to the original database entity, so I can't store its ID in my list of problematic DB entries.
Have I misconfigured something? Is there any other way to gain access to the objects from the reader that failed the processing step of an AsyncItemProcessor?
Here's what I've tried...
I have a singleton Spring Component where I store how many DB entries I've successfully processed along with up to 20 problematic database entries.
@Component
@Getter // lombok
public class ProcessStatus {

    private int processed;
    private int failureCount;
    private final List<UnexpectedFailure> unexpectedFailures = new ArrayList<>();

    public void incrementProgress() { processed++; }

    public void logUnexpectedFailure(UnexpectedFailure failure) {
        failureCount++;
        unexpectedFailures.add(failure);
    }

    @Getter
    @AllArgsConstructor
    public static class UnexpectedFailure {
        private Throwable error;
        private DbProjection dbData;
    }
}
I have a Spring Batch SkipListener that's supposed to catch failures and update my status component accordingly:
@AllArgsConstructor
public class ConversionSkipListener implements SkipListener<DbProjection, Future<JsonMessage>> {

    private ProcessStatus processStatus;

    @Override
    public void onSkipInRead(Throwable error) {}

    @Override
    public void onSkipInProcess(DbProjection dbData, Throwable error) {
        processStatus.logUnexpectedFailure(new ProcessStatus.UnexpectedFailure(error, dbData));
    }

    @Override
    public void onSkipInWrite(Future<JsonMessage> messageFuture, Throwable error) {
        // This is getting called instead!! Even though the exception happened during processing :(
        // But I have no access to the original DbProjection data here, and messageFuture.get() gives me null.
    }
}
And then I've configured my job like this:
@Configuration
public class ConversionBatchJobConfig {

    @Autowired
    private JobBuilderFactory jobBuilderFactory;

    @Autowired
    private StepBuilderFactory stepBuilderFactory;

    @Autowired
    private TaskExecutor processThreadPool;

    @Bean
    public SimpleCompletionPolicy processChunkSize(@Value("${commit.chunk.size:100}") Integer chunkSize) {
        return new SimpleCompletionPolicy(chunkSize);
    }

    @Bean
    @StepScope
    public ItemStreamReader<DbProjection> dbReader(
            MyDomainRepository myDomainRepository,
            @Value("#{jobParameters[pageSize]}") Integer pageSize,
            @Value("#{jobParameters[limit]}") Integer limit) {
        RepositoryItemReader<DbProjection> myDomainRepositoryReader = new RepositoryItemReader<>();
        myDomainRepositoryReader.setRepository(myDomainRepository);
        myDomainRepositoryReader.setMethodName("findActiveDbDomains"); // A native query
        myDomainRepositoryReader.setArguments(new ArrayList<Object>() {{
            add("ACTIVE");
        }});
        myDomainRepositoryReader.setSort(new HashMap<String, Sort.Direction>() {{
            put("update_date", Sort.Direction.ASC);
        }});
        myDomainRepositoryReader.setPageSize(pageSize);
        myDomainRepositoryReader.setMaxItemCount(limit);
        // myDomainRepositoryReader.setSaveState(false); <== haven't figured out what this does yet
        return myDomainRepositoryReader;
    }

    @Bean
    @StepScope
    public ItemProcessor<DbProjection, JsonMessage> dataConverter(DataRetrievalService dataRetrievalService) {
        // Sometimes throws exceptions when DB data is exceptionally weird, bad, or missing
        return new DbProjectionToJsonMessageConverter(dataRetrievalService);
    }

    @Bean
    @StepScope
    public AsyncItemProcessor<DbProjection, JsonMessage> asyncDataConverter(
            ItemProcessor<DbProjection, JsonMessage> dataConverter) throws Exception {
        AsyncItemProcessor<DbProjection, JsonMessage> asyncDataConverter = new AsyncItemProcessor<>();
        asyncDataConverter.setDelegate(dataConverter);
        asyncDataConverter.setTaskExecutor(processThreadPool);
        asyncDataConverter.afterPropertiesSet();
        return asyncDataConverter;
    }

    @Bean
    @StepScope
    public ItemWriter<JsonMessage> jsonPublisher(GcpPubsubPublisherService publisherService) {
        return new JsonMessageWriter(publisherService);
    }

    @Bean
    @StepScope
    public AsyncItemWriter<JsonMessage> asyncJsonPublisher(ItemWriter<JsonMessage> jsonPublisher) throws Exception {
        AsyncItemWriter<JsonMessage> asyncJsonPublisher = new AsyncItemWriter<>();
        asyncJsonPublisher.setDelegate(jsonPublisher);
        asyncJsonPublisher.afterPropertiesSet();
        return asyncJsonPublisher;
    }

    @Bean
    public Step conversionProcess(SimpleCompletionPolicy processChunkSize,
                                  ItemStreamReader<DbProjection> dbReader,
                                  AsyncItemProcessor<DbProjection, JsonMessage> asyncDataConverter,
                                  AsyncItemWriter<JsonMessage> asyncJsonPublisher,
                                  ProcessStatus processStatus,
                                  @Value("${conversion.failure.limit:20}") int maximumFailures) {
        return stepBuilderFactory.get("conversionProcess")
                .<DbProjection, Future<JsonMessage>>chunk(processChunkSize)
                .reader(dbReader)
                .processor(asyncDataConverter)
                .writer(asyncJsonPublisher)
                .faultTolerant()
                .skipPolicy(new MyCustomConversionSkipPolicy(maximumFailures))
                // ^ for now this returns true for everything until 20 failures
                .listener(new ConversionSkipListener(processStatus))
                .build();
    }

    @Bean
    public Job conversionJob(Step conversionProcess) {
        return jobBuilderFactory.get("conversionJob")
                .start(conversionProcess)
                .build();
    }
}
This is because the future wrapped by the AsyncItemProcessor is only unwrapped in the AsyncItemWriter, so any exception that might occur at that time is seen as a write exception instead of a processing exception. That's why onSkipInWrite is called instead of onSkipInProcess.
This is actually a known limitation of this pattern which is documented in the Javadoc of the AsyncItemProcessor, here is an excerpt:
Because the Future is typically unwrapped in the ItemWriter, there are lifecycle and stats limitations (since the framework doesn't know what the result of the processor is). While not an exhaustive list, things like StepExecution.filterCount will not reflect the number of filtered items and itemProcessListener.onProcessError(Object, Exception) will not be called.
The Javadoc states that the list is not exhaustive, and the side effect regarding the SkipListener that you are experiencing is one of these limitations.
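One common way to work within this limitation (a sketch, not part of the original answer; type names follow the question's code) is to move the error handling into the delegate processor itself, where the original database entity is still in scope, and return null so the failed item is filtered rather than skipped:

import org.springframework.batch.item.ItemProcessor;

public class FaultRecordingConverter implements ItemProcessor<DbProjection, JsonMessage> {

    private final ItemProcessor<DbProjection, JsonMessage> delegate;
    private final ProcessStatus processStatus;

    public FaultRecordingConverter(ItemProcessor<DbProjection, JsonMessage> delegate,
                                   ProcessStatus processStatus) {
        this.delegate = delegate;
        this.processStatus = processStatus;
    }

    @Override
    public JsonMessage process(DbProjection item) throws Exception {
        try {
            return delegate.process(item);
        } catch (Exception e) {
            // Unlike onSkipInWrite, the original DbProjection is available here.
            processStatus.logUnexpectedFailure(new ProcessStatus.UnexpectedFailure(e, item));
            return null; // a null result filters the item out of the chunk
        }
    }
}

Wrapped in the AsyncItemProcessor as before, a failed item resolves its Future to null, which the AsyncItemWriter discards, so the job keeps running and the failure is recorded together with its original entity.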

java.lang.NullPointerException: When saving jpa data using Gemfire cachewriter

JPA repository save works in all other classes, but when trying to save inside a CacheWriter it throws a NullPointerException at personRepository.save(entryEvent.getNewValue()). Any idea on this? MySQL is configured in the application properties.
java.lang.NullPointerException
at com.javasampleapproach.gemfirerestapi.GemfireWriter.beforeCreate(GemfireWriter.java:28)
at com.gemstone.gemfire.internal.cache.LocalRegion.cacheWriteBeforePut(LocalRegion.java:3131)
at com.gemstone.gemfire.internal.cache.AbstractRegionMap.invokeCacheWriter(AbstractRegionMap.java:3145)
at com.gemstone.gemfire.internal.cache.AbstractRegionMap.basicPut(AbstractRegionMap.java:2909)
at com.gemstone.gemfire.internal.cache.LocalRegion.virtualPut(LocalRegion.java:5821)
at com.gemstone.gemfire.internal.cache.LocalRegionDataView.putEntry(LocalRegionDataView.java:118)
at com.gemstone.gemfire.internal.cache.LocalRegion.basicPut(LocalRegion.java:5211)
at com.gemstone.gemfire.internal.cache.LocalRegion.validatedPut(LocalRegion.java:1597)
at com.gemstone.gemfire.internal.cache.LocalRegion.put(LocalRegion.java:1580)
at com.gemstone.gemfire.internal.cache.AbstractRegion.put(AbstractRegion.java:327)
Controller:
@GetMapping(value = "/getPerson")
public Iterable<Person> getPerson(@RequestParam("id") long personId, @RequestParam("age") int age, @RequestParam("name") String name) {
    try {
        Person bob = new Person();
        bob.setPersonId(personId);
        bob.setAge(age);
        bob.setName(name);
        Region<Long, Person> region = gemfireCache.getRegion("person");
        region.put(personId, bob);
    } catch (Exception e) {
        e.printStackTrace();
    }
    return personRepository.findAll();
}
CacheWriter:
public class GemfireWriter implements CacheWriter<Long, Person> {

    @Autowired
    PersonRepository personRepository;

    @Override
    public void beforeCreate(EntryEvent<Long, Person> entryEvent) throws CacheWriterException {
        personRepository.save(entryEvent.getNewValue());
    }
}
CacheWriter Config:
@Bean
LocalRegionFactoryBean<Long, Person> personRegion(final GemFireCache cache) {
    LocalRegionFactoryBean<Long, Person> personRegion = new LocalRegionFactoryBean<>();
    personRegion.setCache(cache);
    personRegion.setName("person");
    personRegion.setPersistent(false);
    personRegion.setCacheWriter(new GemfireWriter());
    personRegion.setCacheLoader(new GemfireLoader());
    return personRegion;
}
Looking at the source code for LocalRegion, I don't think the entryEvent received by your CacheWriter could be null, so the actual null reference is probably personRepository. Have you correctly configured spring-data-gemfire to autowire the PersonRepository? Is the CacheWriter configured as a Spring bean (with @Component, for example)? In the configuration shown, the writer is instantiated directly with new GemfireWriter(), so Spring never processes its @Autowired field and personRepository remains null.
You can use the Write Through Example as a good starting point for implementing this use case.
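For reference, a minimal sketch of that fix, assuming the classes from the question: declare the writer as a bean so its @Autowired field is actually populated, and hand the managed instance to the region factory instead of calling new.

@Bean
GemfireWriter gemfireWriter() {
    // Returned from a @Bean method, so Spring post-processes the instance
    // and injects the PersonRepository into its @Autowired field.
    return new GemfireWriter();
}

@Bean
LocalRegionFactoryBean<Long, Person> personRegion(GemFireCache cache, GemfireWriter gemfireWriter) {
    LocalRegionFactoryBean<Long, Person> personRegion = new LocalRegionFactoryBean<>();
    personRegion.setCache(cache);
    personRegion.setName("person");
    personRegion.setPersistent(false);
    personRegion.setCacheWriter(gemfireWriter);
    personRegion.setCacheLoader(new GemfireLoader()); // the loader can be treated the same way
    return personRegion;
}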
Hope this helps. Cheers.

JAXBElement: providing codec (/converter?) for class java.lang.Class

I have been evaluating spring-data-mongodb for adoption in a project. In summary, my aims are:
Using existing XML schema files to generate Java classes.
This is achieved using JAXB xjc
The root class is TSDProductDataType. The thing to note here is that ExtensionType contains protected List<Object> any;, allowing it to store objects of any class. In my case, these are the classes named TSDModule_Name_HereModuleType and can be browsed here.
Use spring-data-mongodb as persistence store
This is achieved using a simple ProductDataRepository
@RepositoryRestResource(collectionResourceRel = "product", path = "product")
public interface ProductDataRepository extends MongoRepository<TSDProductDataType, String> {
    TSDProductDataType queryByGtin(@Param("gtin") String gtin);
}
The unmarshalled TSDProductDataType, however, contains JAXBElement fields, which spring-data-mongodb doesn't seem to handle by itself; it throws a CodecConfigurationException: org.bson.codecs.configuration.CodecConfigurationException: Can't find a codec for class java.lang.Class.
Here is the faulty statement:
TSDProductDataType tsdProductDataType = jaxbElement.getValue();
repository.save(tsdProductDataType);
I tried playing around with Converters for spring-data-mongodb as explained here, however, it seems I am missing something since the exception is about "Codecs" and not "Converters".
Any help is appreciated.
EDIT:
Adding converters for JAXBElement
Note: Works with version 1.5.6.RELEASE of org.springframework.boot::spring-boot-starter-parent. With version 2.0.0.M3, hell breaks loose
It seems that I missed something while trying to add converter earlier. So, I added it like below for testing:
@Component
@ReadingConverter
public class JAXBElementReadConverter implements Converter<DBObject, JAXBElement> {

    // @Autowired
    // MongoConverter converter;

    @Override
    public JAXBElement convert(DBObject dbObject) {
        Class declaredType, scope;
        QName name = qNameFromString((String) dbObject.get("name"));
        Object rawValue = dbObject.get("value");
        try {
            declaredType = Class.forName((String) dbObject.get("declaredType"));
        } catch (ClassNotFoundException e) {
            if (rawValue.getClass().isArray()) declaredType = List.class;
            else declaredType = LinkedHashMap.class;
        }
        try {
            scope = Class.forName((String) dbObject.get("scope"));
        } catch (ClassNotFoundException e) {
            scope = JAXBElement.GlobalScope.class;
        }
        // Object value = rawValue instanceof DBObject ? converter.read(declaredType, (DBObject) rawValue) : rawValue;
        Object value = "TODO";
        return new JAXBElement(name, declaredType, scope, value);
    }

    QName qNameFromString(String s) {
        String[] parts = s.split("[{}]");
        if (parts.length > 2) return new QName(parts[1], parts[2], parts[0]);
        if (parts.length == 1) return new QName(parts[0]);
        return new QName("undef");
    }
}
@Component
@WritingConverter
public class JAXBElementWriteConverter implements Converter<JAXBElement, DBObject> {

    // @Autowired
    // MongoConverter converter;

    @Override
    public DBObject convert(JAXBElement jaxbElement) {
        DBObject dbObject = new BasicDBObject();
        dbObject.put("name", qNameToString(jaxbElement.getName()));
        dbObject.put("declaredType", jaxbElement.getDeclaredType().getName());
        dbObject.put("scope", jaxbElement.getScope().getCanonicalName());
        // dbObject.put("value", converter.convertToMongoType(jaxbElement.getValue()));
        dbObject.put("value", "TODO");
        dbObject.put("_class", JAXBElement.class.getName());
        return dbObject;
    }

    public String qNameToString(QName name) {
        // compare by value, not reference
        if (XMLConstants.NULL_NS_URI.equals(name.getNamespaceURI())) return name.getLocalPart();
        return name.getPrefix() + '{' + name.getNamespaceURI() + '}' + name.getLocalPart();
    }
}
@SpringBootApplication
public class TsdApplication {

    public static void main(String[] args) {
        SpringApplication.run(TsdApplication.class, args);
    }

    @Bean
    public CustomConversions customConversions() {
        return new CustomConversions(Arrays.asList(
                new JAXBElementReadConverter(),
                new JAXBElementWriteConverter()
        ));
    }
}
So far so good. However, how do I instantiate the MongoConverter converter?
MongoConverter is an interface, so I guess I need an instantiable class implementing it. Any suggestions?
I understand the desire for the convenience of mapping an existing domain object straight to the database layer with no boilerplate, but even if you weren't having the JAXB class structure issue, I would still recommend against using it verbatim. Unless this is a simple one-off project, you will almost certainly hit a point where your domain models need to change but your persisted data needs to remain in an existing state. If you just persist the objects directly, you have no mechanism to convert between a newer domain schema and an older persisted data schema. Versioning the persisted data schema would be wise too.
The link you posted for writing the custom converters is one way to achieve this, and it fits in nicely with the Spring ecosystem. That method should also solve the issue you are experiencing (the underlying messy JAXB data structure not converting cleanly).
Are you unable to get that method working? Ensure you are loading the converters into the Spring context, either with @Component plus auto-class scanning or manually via some Configuration class.
EDIT to address your EDIT:
Add the following to each of your converters:
private final MongoConverter converter;

public JAXBElement____Converter(MongoConverter converter) {
    this.converter = converter;
}
Try changing your bean definition to:
@Bean
public CustomConversions customConversions(@Lazy MongoConverter converter) {
    return new CustomConversions(Arrays.asList(
            new JAXBElementReadConverter(converter),
            new JAXBElementWriteConverter(converter)
    ));
}
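With the converter injected, the commented-out lines from the question can be restored. A minimal sketch of the read converter's value handling under that assumption (the @Lazy above matters because MongoConverter itself depends on CustomConversions, so a lazy proxy is one way to break the circular reference):

// Inside JAXBElementReadConverter.convert(...), using the injected converter:
Object value = rawValue instanceof DBObject
        ? converter.read(declaredType, (DBObject) rawValue) // recursively map nested documents
        : rawValue;                                         // primitives pass through unchanged
return new JAXBElement(name, declaredType, scope, value);

Symmetrically, converter.convertToMongoType(jaxbElement.getValue()) replaces the "TODO" placeholder in the write converter.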

Can I programmatically add a qualifier to a bean?

I am registering transaction managers in my code. I would normally use annotation-based configuration, but as I don't know until runtime how many data sources (and hence transaction managers) there will be, I have to register these programmatically, as follows:
private final void registerTransactionManagerBean(final DataSource dataSource, ConfigurableApplicationContext context) {
    String transactionManagerName = this.getName() + "-transactionManager";
    context.getBeanFactory().registerSingleton(transactionManagerName, new DataSourceTransactionManager(dataSource));
    LOG.info("Registering transaction manager under name : " + transactionManagerName);
}
Assuming this.getName() returned 'mydb', I originally expected to be able to qualify a transaction manager like this:
@Transactional("mydb-transactionManager")
What I've realised, however, is that the value of that annotation refers to the qualifier, not the bean name. I did a quick test by declaring a bean as below, and it works:
@Bean
@Qualifier("mydb-transactionManager")
public PlatformTransactionManager test() {
    return new DataSourceTransactionManager(new EmbeddedDatabaseBuilder().build());
}
My question is, is there a way I can programmatically add a qualifier when registering a bean?
UPDATE
I've worked this out: I'm falling foul of this catch block in BeanFactoryAnnotationUtils.isQualifierMatch:
catch (NoSuchBeanDefinitionException ex) {
    // ignore - can't compare qualifiers for a manually registered singleton object
}
I am manually registering my transaction manager bean, so I presume this is why I'm stuck. I'm not really sure what options that gives me, apart from not registering transaction managers programmatically at runtime, sadly.
Raised as a JIRA issue: https://jira.spring.io/browse/SPR-11915
One workaround is to register a full BeanDefinition, which can carry qualifier metadata, alongside the manually registered singleton, as the following test demonstrates:
public class RuntimeRegistrationWithQualifierTest {

    private AnnotationConfigApplicationContext context;

    @Test
    public void beanWithQualifier() {
        final GenericBeanDefinition helloBeanDefinition = new GenericBeanDefinition();
        helloBeanDefinition.addQualifier(new AutowireCandidateQualifier(Hello.class));

        final GenericBeanDefinition worldBeanDefinition = new GenericBeanDefinition();
        worldBeanDefinition.addQualifier(new AutowireCandidateQualifier(World.class));

        final DefaultListableBeanFactory factory = context.getDefaultListableBeanFactory();

        factory.registerBeanDefinition("helloBean", helloBeanDefinition);
        factory.registerSingleton("helloBean", "hello");

        factory.registerBeanDefinition("worldBean", worldBeanDefinition);
        factory.registerSingleton("worldBean", "world");

        context.register(Foo.class);
        context.refresh();

        final Foo foo = context.getBean(Foo.class);

        assertThat(foo.hello).isEqualTo("hello");
        assertThat(foo.world).isEqualTo("world");
    }

    @Before
    public void newContext() {
        context = new AnnotationConfigApplicationContext();
    }

    @Qualifier
    @Retention(RUNTIME)
    @Target({FIELD, PARAMETER})
    @interface Hello {}

    @Qualifier
    @Retention(RUNTIME)
    @Target({FIELD, PARAMETER})
    @interface World {}

    static class Foo {
        final String hello;
        final String world;

        Foo(@Hello final String hello, @World final String world) {
            this.hello = hello;
            this.world = world;
        }
    }
}
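Applying the same idea back to the registration method from the question; this is a sketch built on the test above, and whether @Transactional("mydb-transactionManager") then resolves through the qualifier still needs verifying against your Spring version:

private void registerTransactionManagerBean(final DataSource dataSource, ConfigurableApplicationContext context) {
    String transactionManagerName = this.getName() + "-transactionManager";

    // A full BeanDefinition (unlike registerSingleton) can carry the
    // qualifier metadata that BeanFactoryAnnotationUtils compares.
    GenericBeanDefinition definition = new GenericBeanDefinition();
    definition.setBeanClass(DataSourceTransactionManager.class);
    definition.getConstructorArgumentValues().addGenericArgumentValue(dataSource);
    definition.addQualifier(new AutowireCandidateQualifier(Qualifier.class, transactionManagerName));

    // The default bean factory implements BeanDefinitionRegistry, hence the cast.
    ((BeanDefinitionRegistry) context.getBeanFactory())
            .registerBeanDefinition(transactionManagerName, definition);
}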

ClassCastException when using embedded glassfish for unit tests

I'm running some unit tests on some EJBS via maven and an embedded glassfish container. One of my tests works, but all subsequent attempts to test a different EJB result in the same error:
java.lang.ClassCastException: $Proxy81 cannot be cast to
Followed by whatever bean I'm attempting to test. I'm confident my setup is good since, as I say, one of my beans can be tested properly.
Example of working code:
@Stateful
public class LayoutManagerBean implements LayoutManager {

    private final Log LOG = LogFactory.getLog(LayoutManagerBean.class);

    public List<Menu> getMenus(User currentUser) {
        ...
    }
}

@Local
public interface LayoutManager {
    public List<Menu> getMenus(User user);
}
And the test:
public class LayoutManagerTest {

    private static EJBContainer ejbContainer;
    private static Context ctx;

    @BeforeClass
    public static void setUp() {
        ejbContainer = EJBContainer.createEJBContainer();
        ctx = ejbContainer.getContext();
    }

    @AfterClass
    public static void tearDown() {
        ejbContainer.close();
    }

    @Test
    public void getMenus() {
        LayoutManager manager = null;
        try {
            manager = (LayoutManager) ctx.lookup("java:global/classes/LayoutManagerBean!uk.co.monkeypower.openchurch.core.layout.beans.LayoutManager");
        } catch (NamingException e) {
            System.out.println("Failed to lookup the gosh darned bean!");
        }
        assertNotNull(manager);
        //Menu[] menus = manager.getMenus();
        //assertTrue(menus.length > 1);
    }
}
And an example of a failure:
@Singleton
public class OpenChurchPortalContext implements PortalContext {

    private Set<PortletMode> portletModes = Collections.emptySet();
    private Set<WindowState> windowStates = Collections.emptySet();
    private Properties portalProperties = new Properties();

    public OpenChurchPortalContext() {
        portletModes.add(PortletMode.VIEW);
        portletModes.add(PortletMode.HELP);
        portletModes.add(PortletMode.EDIT);
        portletModes.add(new PortletMode("ABOUT"));
        windowStates.add(WindowState.MAXIMIZED);
        windowStates.add(WindowState.MINIMIZED);
        windowStates.add(WindowState.NORMAL);
    }
    ...
}
And the test:
public class OpenChurchPortalContextTest {

    private static EJBContainer ejbContainer;
    private static Context ctx;

    @BeforeClass
    public static void setUp() {
        ejbContainer = EJBContainer.createEJBContainer();
        ctx = ejbContainer.getContext();
    }

    @AfterClass
    public static void tearDown() {
        ejbContainer.close();
    }

    @Test
    public void test() {
        OpenChurchPortalContext context = null;
        try {
            context = (OpenChurchPortalContext) ctx.lookup("java:global/classes/OpenChurchPortalContext");
        } catch (NamingException e) {
            System.out.println("Failed to find the bean in the emebedded jobber");
        }
        assertNotNull(context);
        Set<PortletMode> modes = (Set<PortletMode>) context.getSupportedPortletModes();
        assertTrue(modes.size() > 1);
        Set<WindowState> states = (Set<WindowState>) context.getSupportedWindowStates();
        assertTrue(states.size() > 1);
    }
}
Any ideas as to why this may not be working?
You often get this problem if you are proxying a class, not an interface. Assuming that it's this line which is failing:
context = (OpenChurchPortalContext) ctx.lookup("java:global/classes/OpenChurchPortalContext");
OpenChurchPortalContext is a class, but it is being wrapped by a proxy class to implement the EJB specific functionality. This proxy class isn't a subclass of OpenChurchPortalContext, so you're getting a ClassCastException.
You aren't getting this with the first example, because the LayoutManager is an interface.
LayoutManager manager = null; // INTERFACE, so it works
try {
    manager = (LayoutManager) ctx.lookup("java:global/classes/LayoutManagerBean!uk.co.monkeypower.openchurch.core.layout.beans.LayoutManager");
} catch (NamingException e) {
    System.out.println("Failed to lookup the gosh darned bean!");
}
First, you can test whether this really is your problem: change context to be a PortalContext rather than an OpenChurchPortalContext:
PortalContext context = null;
try {
    context = (PortalContext) ctx.lookup("java:global/classes/OpenChurchPortalContext");
} catch (NamingException e) {
    System.out.println("Failed to find the bean in the emebedded jobber");
}
If your problem really is the proxy, then the above code should work. In that case, you have two potential solutions:
1. When you do the ctx.lookup, always use an interface (see the sketch below). This can be a bit of a pain, because you need to define an interface specifically for each EJB.
2. You may be able to configure your EJB container to proxy classes instead of just interfaces, similar to proxyTargetClass for Spring AOP. You'll need to check the documentation for your container for that.
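A sketch of option 1 applied to the failing bean; the interface name is hypothetical, and the bean body is unchanged from the question:

// Extracted local business interface; the container proxy will implement it,
// so casting the lookup result to this interface succeeds.
@Local
public interface OpenChurchPortal extends PortalContext {
}

@Singleton
public class OpenChurchPortalContext implements OpenChurchPortal {
    ...
}

// In the test, look up and cast via the interface only, using the portable
// "bean!fully-qualified-interface" JNDI form:
// OpenChurchPortal portal = (OpenChurchPortal) ctx.lookup(
//         "java:global/classes/OpenChurchPortalContext!" + OpenChurchPortal.class.getName());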
Your singleton EJB has a default local business interface by means of implementing the PortalContext interface. The test client should know it only by its business interface; the actual bean class (OpenChurchPortalContext) should not be referenced directly by the client. So the fix is to look it up by its business interface, PortalContext.
