I have a naming-builder class that registers objects in the JNDI directory from a map. Spring recommends replacing its own deprecated JNDI mock implementation: SimpleNamingContextBuilder is deprecated as of Spring Framework 5.2 in favor of complete solutions from third parties such as Simple-JNDI.
import org.springframework.mock.jndi.SimpleNamingContextBuilder;
public class SMNContextBuilder implements InitializingBean{
private Map ncMap;
public void afterPropertiesSet() throws Exception {
if (ncMap == null) {
throw new IllegalStateException("ncMap is null!");
}
SimpleNamingContextBuilder.emptyActivatedContextBuilder();
bindObjects();
}
protected void bindObjects() {
for (Iterator iter = ncMap.entrySet().iterator(); iter.hasNext();) {
Map.Entry entryTmp = (Map.Entry) iter.next();
SimpleNamingContextBuilder.getCurrentContextBuilder().bind("" + entryTmp.getKey(), entryTmp.getValue());
}
}
public void setNamingContextMap(Map ncMapPar) {
ncMap = ncMapPar;
}
}
Then, I have a Test Config where this is used.
@Configuration
public class MQTestConfig {
@Bean
public SMNContextBuilder jndiInitializingBean() throws JMSException {
SMNContextBuilder builder = new SMNContextBuilder();
Map<String, Object> map = new HashMap<>();
map.put("java:comp/env/jms/My_ConFac", myConnectionFactory());
map.put("jms/My_Queue", myQueue());
builder.setNamingContextMap(map);
return builder;
}
}
What would be the alternative to SimpleNamingContextBuilder?
I have tried the following:
import javax.naming.InitialContext;
public class SMNContextBuilder implements InitializingBean{
private Map ncMap;
InitialContext ctx;
public SMNContextBuilder() {
try {
this.ctx = new InitialContext();
} catch (NamingException e) {
e.printStackTrace();
}
}
public void afterPropertiesSet() throws Exception {
if (ncMap == null) {
throw new IllegalStateException("ncMap is null!");
}
// note: SMNContextBuilder has no equivalent of emptyActivatedContextBuilder()
bindObjects();
}
protected void bindObjects() {
for (Iterator iter = ncMap.entrySet().iterator(); iter.hasNext();) {
Map.Entry entryTmp = (Map.Entry) iter.next();
try {
ctx.bind("" + entryTmp.getKey(), entryTmp.getValue());
} catch (NamingException e) {
e.printStackTrace();
}
}
}
public void setNamingContextMap(Map ncMapPar) {
ncMap = ncMapPar;
}
}
As you point out in the question, Spring's recommendation is to use Simple-JNDI.
Combing through the documentation, it doesn't look like the library offers a drop-in replacement class for your current code, but you can still accomplish what you want by loading your beans (in this case myConnectionFactory() and myQueue()) into your InitialContext using Properties. See this section of the documentation.
The more common avenue seems to be configuring your JNDI resources with .xml/.properties/.ini files. There is a lot of documentation on how to do this on the Simple-JNDI GitHub page (linked above).
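For illustration only, here is a rough sketch of how the bind loop from the question could be kept while Simple-JNDI provides the in-memory context. The factory class name (org.osjava.sj.MemoryContextFactory) and the org.osjava.sj.jndi.shared property are my assumptions based on the documentation of the h-thurow Simple-JNDI fork, so verify them against the version you use; names under java:comp/env may additionally need the library's ENC settings.
import java.util.Hashtable;
import java.util.Map;
import javax.naming.Context;
import javax.naming.InitialContext;
import javax.naming.Name;
import javax.naming.NamingException;

public class SimpleJndiBinder {

    // Binds every map entry into JNDI, creating intermediate sub-contexts
    // (e.g. "jms" in "jms/My_Queue") on the way down.
    public static void bindAll(Map<String, Object> bindings) throws NamingException {
        Hashtable<String, String> env = new Hashtable<>();
        // Assumption: in-memory factory of the h-thurow Simple-JNDI fork.
        env.put(Context.INITIAL_CONTEXT_FACTORY, "org.osjava.sj.MemoryContextFactory");
        // Share the in-memory context across InitialContext instances,
        // so the code under test sees the same bindings.
        env.put("org.osjava.sj.jndi.shared", "true");

        InitialContext ctx = new InitialContext(env);
        for (Map.Entry<String, Object> entry : bindings.entrySet()) {
            Name name = ctx.getNameParser("").parse(entry.getKey());
            Context current = ctx;
            for (int i = 0; i < name.size() - 1; i++) {
                try {
                    current = (Context) current.lookup(name.get(i));
                } catch (NamingException e) {
                    current = current.createSubcontext(name.get(i));
                }
            }
            current.bind(name.get(name.size() - 1), entry.getValue());
        }
    }
}
In afterPropertiesSet() you could then call SimpleJndiBinder.bindAll(ncMap) instead of the old emptyActivatedContextBuilder()/bind loop.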
Related
How to load a custom ApplicationContextInitializer in Spring Boot on AWS Lambda?
I have an AWS Lambda application using Spring Boot, and I would like to write an ApplicationContextInitializer for decrypting database passwords. I have the following code, which works when I run it as a Spring Boot application locally, but not when I deploy it to AWS as a Lambda.
Here is my code
1. application.properties
spring.datasource.url=url
spring.datasource.username=testuser
CIPHER.spring.datasource.password=encryptedpassword
The following code is the ApplicationContextInitializer; the password is only Base64 encoded for testing (in the actual case it will be encrypted with AWS KMS). The idea is that if a key starts with 'CIPHER.' (as in CIPHER.spring.datasource.password), its value needs to be decrypted, and another key/value pair with the actual key (here spring.datasource.password) and the decrypted value is added at context initialization.
The result will be like spring.datasource.password=<decrypted password>.
@Component
public class DecryptedPropertyContextInitializer
implements ApplicationContextInitializer<ConfigurableApplicationContext> {
private static final String CIPHER = "CIPHER.";
@Override
public void initialize(ConfigurableApplicationContext applicationContext) {
ConfigurableEnvironment environment = applicationContext.getEnvironment();
for (PropertySource<?> propertySource : environment.getPropertySources()) {
Map<String, Object> propertyOverrides = new LinkedHashMap<>();
decodePasswords(propertySource, propertyOverrides);
if (!propertyOverrides.isEmpty()) {
PropertySource<?> decodedProperties = new MapPropertySource("decoded "+ propertySource.getName(), propertyOverrides);
environment.getPropertySources().addBefore(propertySource.getName(), decodedProperties);
}
}
}
private void decodePasswords(PropertySource<?> source, Map<String, Object> propertyOverrides) {
if (source instanceof EnumerablePropertySource) {
EnumerablePropertySource<?> enumerablePropertySource = (EnumerablePropertySource<?>) source;
for (String key : enumerablePropertySource.getPropertyNames()) {
Object rawValue = source.getProperty(key);
if (rawValue instanceof String && key.startsWith(CIPHER)) {
String cipherRemovedKey = key.substring(CIPHER.length());
String decodedValue = decode((String) rawValue);
propertyOverrides.put(cipherRemovedKey, decodedValue);
}
}
}
}
public String decode(String encodedString) {
byte[] valueDecoded = org.apache.commons.codec.binary.Base64.decodeBase64(encodedString);
return new String(valueDecoded);
}
}
Here is the Spring Boot initializer:
@SpringBootApplication
@ComponentScan(basePackages = "com.amazonaws.serverless.sample.springboot.controller")
public class Application extends SpringBootServletInitializer {
@Bean
public HandlerMapping handlerMapping() {
return new RequestMappingHandlerMapping();
}
@Bean
public HandlerAdapter handlerAdapter() {
return new RequestMappingHandlerAdapter();
}
@Bean
public HandlerExceptionResolver handlerExceptionResolver() {
return new HandlerExceptionResolver() {
@Override
public ModelAndView resolveException(HttpServletRequest request, HttpServletResponse response, Object handler, Exception ex) {
return null;
}
};
}
//loading the initializer here
public static void main(String[] args) {
SpringApplication application=new SpringApplication(Application.class);
application.addInitializers(new DecryptedPropertyContextInitializer());
application.run(args);
}
}
This works when run as a Spring Boot application, but when it is deployed as a Lambda into AWS, the main() method in my SpringBootServletInitializer is never called by Lambda. Here is my Lambda handler:
public class StreamLambdaHandler implements RequestStreamHandler {
private static Logger LOGGER = LoggerFactory.getLogger(StreamLambdaHandler.class);
private static SpringBootLambdaContainerHandler<AwsProxyRequest, AwsProxyResponse> handler;
static {
try {
handler = SpringBootLambdaContainerHandler.getAwsProxyHandler(Application.class);
handler.onStartup(servletContext -> {
FilterRegistration.Dynamic registration = servletContext.addFilter("CognitoIdentityFilter", CognitoIdentityFilter.class);
registration.addMappingForUrlPatterns(EnumSet.of(DispatcherType.REQUEST), true, "/*");
});
} catch (ContainerInitializationException e) {
e.printStackTrace();
throw new RuntimeException("Could not initialize Spring Boot application", e);
}
}
@Override
public void handleRequest(InputStream inputStream, OutputStream outputStream, Context context)
throws IOException {
handler.proxyStream(inputStream, outputStream, context);
outputStream.close();
}
}
What change needs to be made in the code so that Lambda loads the ApplicationContextInitializer? Any help will be highly appreciated.
I was able to nail it in the following way.
First, change the property value to a placeholder with a prefix, where the prefix denotes that the value needs to be decrypted, e.g.
spring.datasource.password=${MY_PREFIX_placeHolder}
The AWS Lambda environment variable name should match the placeholder ('MY_PREFIX_placeHolder'), and its value is encrypted using AWS KMS (this sample uses Base64 decoding).
Then create an ApplicationContextInitializer which will decrypt the property value:
public class DecryptedPropertyContextInitializer
implements ApplicationContextInitializer<ConfigurableApplicationContext> {
private static final String CIPHER = "MY_PREFIX_";
@Override
public void initialize(ConfigurableApplicationContext applicationContext) {
ConfigurableEnvironment environment = applicationContext.getEnvironment();
for (PropertySource<?> propertySource : environment.getPropertySources()) {
Map<String, Object> propertyOverrides = new LinkedHashMap<>();
decodePasswords(propertySource, propertyOverrides);
if (!propertyOverrides.isEmpty()) {
PropertySource<?> decodedProperties = new MapPropertySource("decoded "+ propertySource.getName(), propertyOverrides);
environment.getPropertySources().addBefore(propertySource.getName(), decodedProperties);
}
}
}
private void decodePasswords(PropertySource<?> source, Map<String, Object> propertyOverrides) {
if (source instanceof EnumerablePropertySource) {
EnumerablePropertySource<?> enumerablePropertySource = (EnumerablePropertySource<?>) source;
for (String key : enumerablePropertySource.getPropertyNames()) {
Object rawValue = source.getProperty(key);
if (rawValue instanceof String && key.startsWith(CIPHER)) {
String decodedValue = decode((String) rawValue);
propertyOverrides.put(key, decodedValue);
}
}
}
}
public String decode(String encodedString) {
byte[] valueDecoded = org.apache.commons.codec.binary.Base64.decodeBase64(encodedString);
return new String(valueDecoded);
}
}
The above code will decrypt all the values with the prefix MY_PREFIX_ and add them ahead of the original property source.
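If the real value is encrypted with AWS KMS rather than just Base64-encoded, only decode() needs to change. The sketch below uses the AWS SDK for Java v1 KMS client and assumes the environment variable holds the KMS ciphertext as a Base64 string; the class name KmsPropertyDecoder and that assumption are mine, not part of the original answer.
import java.nio.ByteBuffer;
import java.nio.charset.StandardCharsets;
import java.util.Base64;
import com.amazonaws.services.kms.AWSKMS;
import com.amazonaws.services.kms.AWSKMSClientBuilder;
import com.amazonaws.services.kms.model.DecryptRequest;

public class KmsPropertyDecoder {

    private final AWSKMS kms = AWSKMSClientBuilder.defaultClient();

    // Assumes the environment variable holds the KMS ciphertext blob as a Base64 string.
    public String decode(String encodedCiphertext) {
        ByteBuffer ciphertext = ByteBuffer.wrap(Base64.getDecoder().decode(encodedCiphertext));
        DecryptRequest request = new DecryptRequest().withCiphertextBlob(ciphertext);
        ByteBuffer plaintext = kms.decrypt(request).getPlaintext();
        return StandardCharsets.UTF_8.decode(plaintext).toString();
    }
}
The decode() method in the initializer would then delegate to this class instead of Base64-decoding.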
Because the Spring Boot application is deployed as an AWS Lambda, Lambda will not invoke the main() method, so an ApplicationContextInitializer registered in main() will not work. To make it work, override the createSpringApplicationBuilder() method of SpringBootServletInitializer, so the SpringBootServletInitializer looks like this:
@SpringBootApplication
@ComponentScan(basePackages = "com.amazonaws.serverless.sample.springboot.controller")
public class Application extends SpringBootServletInitializer {
@Bean
public HandlerMapping handlerMapping() {
return new RequestMappingHandlerMapping();
}
@Bean
public HandlerAdapter handlerAdapter() {
return new RequestMappingHandlerAdapter();
}
@Bean
public HandlerExceptionResolver handlerExceptionResolver() {
return new HandlerExceptionResolver() {
@Override
public ModelAndView resolveException(HttpServletRequest request, HttpServletResponse response, Object handler, Exception ex) {
return null;
}
};
}
@Override
protected SpringApplicationBuilder createSpringApplicationBuilder() {
SpringApplicationBuilder builder = new SpringApplicationBuilder();
builder.initializers(new DecryptedPropertyContextInitializer());
return builder;
}
public static void main(String[] args) {
SpringApplication.run(Application.class, args);
}
}
No changes are needed to the Lambda handler.
Currently I have an AuthFilter, and in it I receive a UserState. I need to pass it on to the next filter. How do I do that correctly? Or are there other practices to solve this?
public class AuthFilter extends ZuulFilter {
@Autowired
private AuthService authService;
@Autowired
private ApplicationContext appContext;
@Override
public String filterType() {
return PRE_TYPE;
}
@Override
public int filterOrder() {
return PRE_DECORATION_FILTER_ORDER - 2;
}
@Override
public boolean shouldFilter() {
RequestContext context = RequestContext.getCurrentContext();
String requestURI = context.getRequest().getRequestURI();
for (String authPath : authPaths) {
if (requestURI.contains(authPath)) return true;
}
return false;
}
@Override
public Object run() throws ZuulException {
try {
UserState userState = authService.getUserData();
DefaultListableBeanFactory context = new DefaultListableBeanFactory();
GenericBeanDefinition beanDefinition = new GenericBeanDefinition();
beanDefinition.setBeanClass(UserState.class);
beanDefinition.setPropertyValues(new MutablePropertyValues() {
{
add("user", userState);
}
});
context.registerBeanDefinition("userState", beanDefinition);
} catch (UndeclaredThrowableException e) {
if (e.getUndeclaredThrowable().getClass() == UnauthorizedException.class) {
throw new UnauthorizedException(e.getMessage());
}
if (e.getUndeclaredThrowable().getClass() == ForbiddenException.class) {
throw new ForbiddenException(e.getMessage(), "The user is not allowed to make this request");
}
}
return null;
}
}
I'm pretty sure filters are chained together and the request/response is passed through them. You can add the data to the request context and have the next filter look for it, as in the sketch below.
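Concretely: AuthFilter stores the UserState in Zuul's RequestContext, which is shared by every filter handling the same request, and a later pre filter reads it back. The UserStateRelayFilter class and the X-User-Id header below are hypothetical; UserState is the question's own class.
import static org.springframework.cloud.netflix.zuul.filters.support.FilterConstants.PRE_DECORATION_FILTER_ORDER;
import static org.springframework.cloud.netflix.zuul.filters.support.FilterConstants.PRE_TYPE;

import com.netflix.zuul.ZuulFilter;
import com.netflix.zuul.context.RequestContext;

// In AuthFilter.run(), after authService.getUserData():
//     RequestContext.getCurrentContext().set("userState", userState);

public class UserStateRelayFilter extends ZuulFilter {

    @Override
    public String filterType() {
        return PRE_TYPE;
    }

    @Override
    public int filterOrder() {
        // any order later than AuthFilter's, so it runs after it
        return PRE_DECORATION_FILTER_ORDER - 1;
    }

    @Override
    public boolean shouldFilter() {
        // only run when AuthFilter actually stored something
        return RequestContext.getCurrentContext().containsKey("userState");
    }

    @Override
    public Object run() {
        RequestContext ctx = RequestContext.getCurrentContext();
        UserState userState = (UserState) ctx.get("userState");
        // e.g. forward it to the proxied service as a request header (illustrative)
        ctx.addZuulRequestHeader("X-User-Id", String.valueOf(userState));
        return null;
    }
}
Registering bean definitions per request, as in the question, is not needed for this; the RequestContext entry lives only for the duration of the request, which is usually what you want for per-user data.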
In my application, I need to use pure JDBC together with Guice. However, Guice doesn't provide any built-in support to manage transactions. guice-persist only provides support based on JPA, which I cannot use.
So I tried to implement a simple solution to manage transactions with Guice and JDBC. Here is the first version:
Use a TransactionHolder to store the transaction per thread:
public class JdbcTransactionHolder {
private static ThreadLocal<JdbcTransaction> currentTransaction = new ThreadLocal<JdbcTransaction>();
public static void setCurrentTransaction(JdbcTransaction transaction) {
currentTransaction.set(transaction);
}
public static JdbcTransaction getCurrentTransaction() {
return currentTransaction.get();
}
public static void removeCurrentTransaction() {
currentTransaction.remove();
}
}
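The JdbcTransaction class itself is not shown in the question. For readers following along, a minimal version consistent with how it is used by the holder above and by the transaction manager below would be a thin wrapper that hands out the Connection and delegates commit/rollback to it; this sketch is my assumption, not the poster's actual class (which presumably also implements javax.transaction.Transaction, since getTransaction() returns that type).
import java.sql.Connection;

// Hypothetical sketch - the JTA Transaction plumbing is omitted here.
public class JdbcTransaction {

    private final Connection connection;

    public JdbcTransaction(Connection connection) {
        this.connection = connection;
    }

    public Connection getConnection() {
        return connection;
    }

    public void commit() throws Exception {
        try {
            connection.commit();
        } finally {
            connection.close(); // hand the connection back to the pool
        }
    }

    public void rollback() throws Exception {
        try {
            connection.rollback();
        } finally {
            connection.close();
        }
    }
}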
Implement a transaction manager for JDBC, for now with only the begin(), getTransaction(), commit() and rollback() methods:
public class JdbcTransactionManager implements TransactionManager {
@Inject
private DataSource dataSource;
@Override
public void begin() throws NotSupportedException, SystemException {
logger.debug("Start the transaction");
try {
JdbcTransaction tran = JdbcTransactionHolder.getCurrentTransaction();
Connection conn = null;
if(tran == null) {
conn = dataSource.getConnection();
}
else {
conn = tran.getConnection();
}
// We have to put the connection in the holder so that we can get later
// from the holder and use it in the same thread
logger.debug("Save the transaction for thread: {}.", Thread.currentThread());
JdbcTransactionHolder.setCurrentTransaction(new JdbcTransaction(conn));
} catch (Exception e) {
throw new RuntimeException(e);
}
}
@Override
public void commit() throws RollbackException, HeuristicMixedException,
HeuristicRollbackException, SecurityException,
IllegalStateException, SystemException {
logger.debug("Commit the transaction");
try {
logger.debug("Get the connection for thread: {}.", Thread.currentThread());
Transaction transaction = JdbcTransactionHolder.getCurrentTransaction();
transaction.commit();
}
catch(Exception e) {
throw new RuntimeException(e);
}
finally {
JdbcTransactionHolder.removeCurrentTransaction();
}
}
@Override
public Transaction getTransaction() throws SystemException {
logger.debug("Get transaction.");
final JdbcTransaction tran = JdbcTransactionHolder.getCurrentTransaction();
if(tran == null) {
throw new DBException("No transaction is available. TransactionManager.begin() is probably not yet called.");
}
return tran;
}
@Override
public void rollback() throws IllegalStateException, SecurityException,
SystemException {
logger.debug("Rollback the transaction");
try {
logger.debug("Get the transaction for thread: {}.", Thread.currentThread());
Transaction transaction = JdbcTransactionHolder.getCurrentTransaction();
transaction.rollback();
}
catch(Exception e) {
throw new RuntimeException(e);
}
finally {
JdbcTransactionHolder.removeCurrentTransaction();
}
}
}
Implement a wrapper for DataSource that gets the current connection from the transaction holder if a transaction has been started:
public class JdbcDataSource implements DataSource {
private final static org.slf4j.Logger logger = LoggerFactory.getLogger(JdbcDataSource.class);
private DataSource dataSource;
public JdbcDataSource(DataSource dataSource) {
this.dataSource = dataSource;
}
@Override
public PrintWriter getLogWriter() throws SQLException {
return dataSource.getLogWriter();
}
@Override
public int getLoginTimeout() throws SQLException {
return dataSource.getLoginTimeout();
}
@Override
public Logger getParentLogger() throws SQLFeatureNotSupportedException {
return dataSource.getParentLogger();
}
@Override
public void setLogWriter(PrintWriter out) throws SQLException {
this.dataSource.setLogWriter(out);
}
@Override
public void setLoginTimeout(int seconds) throws SQLException {
this.dataSource.setLoginTimeout(seconds);
}
@Override
public boolean isWrapperFor(Class<?> arg0) throws SQLException {
return this.dataSource.isWrapperFor(arg0);
}
@Override
public <T> T unwrap(Class<T> iface) throws SQLException {
return this.dataSource.unwrap(iface);
}
@Override
public Connection getConnection() throws SQLException {
JdbcTransaction transaction = JdbcTransactionHolder.getCurrentTransaction();
if(transaction != null) {
// we get the connection from the transaction
logger.debug("Transaction exists for the thread: {}.", Thread.currentThread());
return transaction.getConnection();
}
Connection conn = this.dataSource.getConnection();
conn.setAutoCommit(false);
return conn;
}
@Override
public Connection getConnection(String username, String password)
throws SQLException {
JdbcTransaction transaction = JdbcTransactionHolder.getCurrentTransaction();
if(transaction != null) {
// we get the connection from the transaction
logger.debug("Transaction exists for the thread: {}.", Thread.currentThread());
return transaction.getConnection();
}
return this.dataSource.getConnection(username, password);
}
}
Then create a DataSourceProvider so that we can inject a DataSource into any POJO using Guice:
public class DataSourceProvider implements Provider<DataSource> {
private static final Logger logger = LoggerFactory.getLogger(DataSourceProvider.class);
private DataSource dataSource;
public DataSourceProvider() {
JdbcConfig config = getConfig();
ComboPooledDataSource pooledDataSource = new ComboPooledDataSource();
try {
pooledDataSource.setDriverClass(config.getDriver());
} catch (Exception e) {
throw new RuntimeException(e);
}
pooledDataSource.setJdbcUrl(config.getUrl());
pooledDataSource.setUser(config.getUsername());
pooledDataSource.setPassword(config.getPassword() );
pooledDataSource.setMinPoolSize(config.getMinPoolSize());
pooledDataSource.setAcquireIncrement(5);
pooledDataSource.setMaxPoolSize(config.getMaxPoolSize());
pooledDataSource.setMaxStatements(config.getMaxStatementSize());
pooledDataSource.setAutoCommitOnClose(false);
this.dataSource = new JdbcDataSource(pooledDataSource);
}
private JdbcConfig getConfig() {
JdbcConfig config = new JdbcConfig();
Properties prop = new Properties();
try {
//load a properties file from class path, inside static method
prop.load(JdbcConfig.class.getResourceAsStream("/database.properties"));
//get the property value and print it out
config.setDriver(prop.getProperty("driver"));
config.setUrl(prop.getProperty("url"));
config.setUsername(prop.getProperty("username"));
config.setPassword(prop.getProperty("password"));
String maxPoolSize = prop.getProperty("maxPoolSize");
if(maxPoolSize != null) {
config.setMaxPoolSize(Integer.parseInt(maxPoolSize));
}
String maxStatementSize = prop.getProperty("maxStatementSize");
if(maxStatementSize != null) {
config.setMaxStatementSize(Integer.parseInt(maxStatementSize));
}
String minPoolSize = prop.getProperty("minPoolSize");
if(minPoolSize != null) {
config.setMinPoolSize(Integer.parseInt(minPoolSize));
}
}
catch (Exception ex) {
logger.error("Failed to load the config file!", ex);
throw new DBException("Cannot read the config file: database.properties. Please make sure the file is present in classpath.", ex);
}
return config;
}
@Override
public DataSource get() {
return dataSource;
}
}
And then implement a TransactionalMethodInterceptor to manage the transaction for methods carrying the Transactional annotation:
public class TransactionalMethodInterceptor implements MethodInterceptor {
private final static Logger logger = LoggerFactory.getLogger(TransactionalMethodInterceptor.class);
@Inject
private JdbcTransactionManager transactionManager;
@Override
public Object invoke(MethodInvocation method) throws Throwable {
try {
// Start the transaction
transactionManager.begin();
logger.debug("Start to invoke the method: " + method);
Object result = method.proceed();
logger.debug("Finish invoking the method: " + method);
transactionManager.commit();
return result;
} catch (Exception e) {
logger.error("Failed to commit transaction!", e);
try {
transactionManager.rollback();
}
catch(Exception ex) {
logger.warn("Cannot roll back transaction!", ex);
}
throw e;
}
}
}
Finally, the module code that puts it all together so that Guice can inject the instances:
bind(DataSource.class).toProvider(DataSourceProvider.class).in(Scopes.SINGLETON);
bind(TransactionManager.class).to(JdbcTransactionManager.class);
TransactionalMethodInterceptor transactionalMethodInterceptor = new TransactionalMethodInterceptor();
requestInjection(transactionalMethodInterceptor);
bindInterceptor(Matchers.any(), Matchers.annotatedWith(Transactional.class), transactionalMethodInterceptor);
bind(TestDao.class).to(JdbcTestDao.class);
bind(TestService.class).to(TestServiceImpl.class);
I use c3p0 for the DataSource pool, and it works just fine in my tests.
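To make the wiring concrete, this is roughly what a service looks like when it is driven by the interceptor above. TestService/TestServiceImpl are the question's own classes, so the method below is only an illustration, and @Transactional stands for whichever annotation you bound the interceptor to.
import java.sql.Connection;
import java.sql.PreparedStatement;
import javax.inject.Inject;
import javax.sql.DataSource;
// plus the Transactional annotation you bound in the module

public class TestServiceImpl implements TestService {

    @Inject
    private DataSource dataSource; // the wrapping JdbcDataSource bound above

    @Transactional // intercepted: begin() before the call, commit()/rollback() after
    public void insertTwoRows() throws Exception {
        // Both statements run on the same connection, because the wrapper
        // returns the connection held by the current thread's JdbcTransaction.
        Connection conn = dataSource.getConnection();
        try (PreparedStatement ps = conn.prepareStatement("insert into test(name) values (?)")) {
            ps.setString(1, "first");
            ps.executeUpdate();
            ps.setString(1, "second");
            ps.executeUpdate();
        }
        // no commit/close here: the interceptor commits (or rolls back) and
        // the transaction manager releases the connection
    }
}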
I found another related question: Guice, JDBC and managing database connections.
But so far I haven't found any similar approach, except something in the Spring Framework, and even the implementation in Spring seems quite complex.
I would like to ask if anyone has any suggestions for this solution.
Thanks.
I am trying to configure Spring Cache, but the method is still executed every time. I have the code below, and the civilStatus cache is not working: getCivilStatus() is always executed. Does anybody know the reason?
@Configuration
@EnableCaching
public class ApplicationConfig {
@Autowired
private SocioDemographicInfoService socioDemographicInfo;
@Bean
public CacheManager cacheManager() {
SimpleCacheManager cacheManager = new SimpleCacheManager();
cacheManager.setCaches(Arrays.asList(new ConcurrentMapCache("civilStatus")));
return cacheManager;
}
}
@Service
public class SocioDemographicInfoService {
@Cacheable(value="civilStatus")
public Map<String, String> getCivilStatus(){
log.info("Retrieving civilStatus");
Map<String, String> civilStatus = new HashMap<String, String>();
BufferedReader br = null;
String line = "";
String cvsSplitBy = ",";
try {
ClassLoader classLoader = getClass().getClassLoader();
File file = new File(classLoader.getResource("CatalogoEstadoCivil.csv").getFile());
br = new BufferedReader(new FileReader(file));
while ((line = br.readLine()) != null) {
String[] cod = line.split(cvsSplitBy);
civilStatus.put(cod[0].trim(), cod[1]);
}
} catch (FileNotFoundException e) {
e.printStackTrace();
} catch (IOException e) {
e.printStackTrace();
} finally {
if (br != null) {
try {
br.close();
} catch (IOException e) {
e.printStackTrace();
}
}
}
return civilStatus;
}
}
I believe you are using Spring Boot and setting up a server using a class something like the one given below. Add the @EnableCaching annotation on that same class and define the CacheManager there, instead of in a separate configuration class. That will make sure caching is enabled before your class gets initialized.
@Configuration
@EnableAutoConfiguration
@ComponentScan
@EnableCaching
@PropertySource(ignoreResourceNotFound = true, value = {"classpath:application.properties"})
@ImportResource(value = { "classpath*:spring/*.xml" })
public class MyBootServer{
public static void main(String args[]){
ApplicationContext ctx = SpringApplication.run(MyBootServer.class, args);
}
@Bean(name="cacheManager")
public CacheManager getCacheManager() {
...// Your code
}
}
There is nothing wrong in your overall code; I tested your configuration in my Spring Boot sample code and it works.
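One way to double-check that the cache is actually hit (assuming a Spring Boot test setup; the class and method names here are only a sketch) is to call the service twice and verify the second call returns the cached map without logging "Retrieving civilStatus" again:
import static org.junit.Assert.assertSame;

import java.util.Map;
import org.junit.Test;
import org.junit.runner.RunWith;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.test.context.SpringBootTest;
import org.springframework.test.context.junit4.SpringRunner;

@RunWith(SpringRunner.class)
@SpringBootTest
public class CivilStatusCacheTest {

    @Autowired
    private SocioDemographicInfoService service;

    @Test
    public void secondCallIsServedFromCache() {
        Map<String, String> first = service.getCivilStatus();
        Map<String, String> second = service.getCivilStatus();
        // With @Cacheable working, the cached instance is returned and
        // "Retrieving civilStatus" appears in the log only once.
        assertSame(first, second);
    }
}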
You don't need the AOP and caching complexity; your use case is a lot simpler. Just create a method that loads the file at startup and let getCivilStatus() return that map.
@Service
public class SocioDemographicInfoService implements ResourceLoaderAware {
private final Map<String, String> civilStatus = new HashMap<String, String>();
private ResourceLoader loader;
@PostConstruct
public void init() {
log.info("Retrieving civilStatus");
String line = "";
String cvsSplitBy = ",";
Resource input = loader.getResource("classpath:CatalogoEstadoCivil.csv");
if (input.isReadable()) {
BufferedReader br = null;
try {
File file = input.getFile();
br = new BufferedReader(new FileReader(file));
while ((line = br.readLine()) != null) {
String[] cod = line.split(cvsSplitBy);
civilStatus.put(cod[0].trim(), cod[1]);
}
} catch (IOException e) {
log.error("Error reading file", e);
} finally {
if (br != null) {
try { br.close(); } catch (IOException e) {}
}
}
}
}
public Map<String, String> getCivilStatus() {
return this.civilStatus;
}
public void setResourceLoader(ResourceLoader loader) {
this.loader=loader;
}
}
Something like this should work. It loads your file after the bean is constructed (this code can probably be optimized by using something like commons-io). Note that I used Spring's ResourceLoader to load the file.
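As a small variation on the same idea (not something the answer above posted), the manual reader handling can be trimmed with try-with-resources and Spring's ClassPathResource; the resource name is the one from the question:
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;
import java.nio.charset.StandardCharsets;
import java.util.HashMap;
import java.util.Map;
import org.springframework.core.io.ClassPathResource;

public class CivilStatusLoader {

    // Reads "code,label" lines from the classpath CSV into a map.
    public static Map<String, String> load() {
        Map<String, String> civilStatus = new HashMap<>();
        ClassPathResource resource = new ClassPathResource("CatalogoEstadoCivil.csv");
        try (BufferedReader br = new BufferedReader(
                new InputStreamReader(resource.getInputStream(), StandardCharsets.UTF_8))) {
            String line;
            while ((line = br.readLine()) != null) {
                String[] cod = line.split(",");
                if (cod.length >= 2) {
                    civilStatus.put(cod[0].trim(), cod[1].trim());
                }
            }
        } catch (IOException e) {
            throw new IllegalStateException("Cannot read CatalogoEstadoCivil.csv", e);
        }
        return civilStatus;
    }
}
The behaviour is the same as in the answer above: the map is filled once and getCivilStatus() just returns it.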
How do I write a unit test to verify async behavior using Spring 4 and annotations?
Since I'm used to Spring's (old) XML style, it took me some time to figure this out, so I thought I'd answer my own question to help others.
First the service that exposes an async download method:
@Service
public class DownloadService {
// note: placing this async method in its own dedicated bean was necessary
// to circumvent inner bean calls
@Async
public Future<String> startDownloading(final URL url) throws IOException {
return new AsyncResult<String>(getContentAsString(url));
}
private String getContentAsString(URL url) throws IOException {
try {
Thread.sleep(1000); // To demonstrate the effect of async
InputStream input = url.openStream();
return IOUtils.toString(input, StandardCharsets.UTF_8);
} catch (InterruptedException e) {
throw new IllegalStateException(e);
}
}
}
Next the test:
@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration
public class DownloadServiceTest {
@Configuration
@EnableAsync
static class Config {
@Bean
public DownloadService downloadService() {
return new DownloadService();
}
}
@Autowired
private DownloadService service;
@Test
public void testIndex() throws Exception {
final URL url = new URL("http://spring.io/blog/2013/01/16/next-stop-spring-framework-4-0");
Future<String> content = service.startDownloading(url);
assertThat(false, equalTo(content.isDone()));
final String str = content.get();
assertThat(true, equalTo(content.isDone()));
assertThat(str, JUnitMatchers.containsString("<html"));
}
}
If you are using the same example in Java 8 you could also use the CompletableFuture class as follows:
@Service
public class DownloadService {
@Async
public CompletableFuture<String> startDownloading(final URL url) throws IOException {
CompletableFuture<String> future = new CompletableFuture<>();
Executors.newCachedThreadPool().submit(() -> {
future.complete(getContentAsString(url));
return null;
});
return future;
}
private String getContentAsString(URL url) throws IOException {
try {
Thread.sleep(1000); // To demonstrate the effect of async
InputStream input = url.openStream();
return IOUtils.toString(input, StandardCharsets.UTF_8);
} catch (InterruptedException e) {
throw new IllegalStateException(e);
}
}
}
Now the test:
@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration
public class DownloadServiceTest {
@Configuration
@EnableAsync
static class Config {
@Bean
public DownloadService downloadService() {
return new DownloadService();
}
}
@Autowired
private DownloadService service;
@Test
public void testIndex() throws Exception {
final URL url = new URL("http://spring.io/blog/2013/01/16/next-stop-spring-framework-4-0");
CompletableFuture<String> content = service.startDownloading(url);
content.thenAccept(str -> {
assertThat(true, equalTo(content.isDone()));
assertThat(str, JUnitMatchers.containsString("<html"));
});
// wait for completion
content.get(10, TimeUnit.SECONDS);
}
}
Please note that when the time-out is not specified and anything goes wrong, the test will run "forever" until the CI server (or you) shuts it down.