Is creating a genericBeanDefinition of the AmazonS3ClientBuilder class equivalent to creating an AmazonS3 bean?

If we created a separate bean for each case, we could do something like this:
public AmazonS3 CASE_1_AmazonS3(STSAssumeRoleSessionCredentialsProvider assumedCredentials, Regions region) {
    return AmazonS3ClientBuilder.standard()
            .withCredentials(assumedCredentials)
            .withRegion(region)
            .build();
}
But how to make it generic?
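For illustration, one way to read "generic" (an assumption on my part, since no accepted answer is shown here) is a single factory method parameterized on the credentials provider and region, instead of one bean per case:

// Hypothetical generic factory (sketch): AmazonS3ClientBuilder.build() returns an AmazonS3,
// so a single method taking any AWSCredentialsProvider and Regions value can replace the per-case beans.
public AmazonS3 buildAmazonS3(AWSCredentialsProvider credentialsProvider, Regions region) {
    return AmazonS3ClientBuilder.standard()
            .withCredentials(credentialsProvider)
            .withRegion(region)
            .build();
}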

Related

How to pass a custom argument to Spring Integration SftpOutboundGateway setFilter?

How can I pass a parameter as a filter condition when getting the file list of the SFTP server from MessagingGateway?
My SftpMessageGateway code
@MessagingGateway
public interface SftpMessageGateway {
    @Gateway(requestChannel = "getSftpChannel")
    List<SftpFileInfo> getIconListByProductUiId(@Payloads("productUiId") String productUiId);
}
Integration Config
@Bean
public SessionFactory<ChannelSftp.LsEntry> sftpSessionFactory() {
    DefaultSftpSessionFactory factory = new DefaultSftpSessionFactory(true);
    factory.setHost(host);
    factory.setPort(port);
    factory.setUser(id);
    factory.setPassword(password);
    factory.setAllowUnknownKeys(true);
    return new CachingSessionFactory<>(factory);
}

@Bean
@ServiceActivator(inputChannel = "getSftpChannel")
public MessageHandler getMessageHandler() {
    SftpOutboundGateway outboundGateway = new SftpOutboundGateway(sftpSessionFactory(), "ls", "'" + uploadPath + "'");
    outboundGateway.setOption(AbstractRemoteFileOutboundGateway.Option.NAME_ONLY);
    outboundGateway.setFilter(new SftpSimplePatternFileListFilter("*alpha*"));
    outboundGateway.setFilter(new SftpSimplePatternFileListFilter("I want to pass a custom argument here")); // <----
    return outboundGateway;
}
You can set only one filter on a gateway; however, there is a CompositeFileListFilter where you can combine a set of filters, including any custom implementation of the FileListFilter.
See more info in the docs: https://docs.spring.io/spring-integration/docs/current/reference/html/file.html#remote-persistent-flf
You can refer to the following code snippet for implementing a FileListFilter. My use case was to fetch the most recently uploaded file in an SFTP directory.
@Component
public class LastModifiedFileFilter implements FileListFilter<LsEntry> {

    @Override
    public List<LsEntry> filterFiles(LsEntry[] files) {
        List<LsEntry> result = new ArrayList<LsEntry>();
        Vector<LsEntry> list = new Vector<LsEntry>();
        Collections.addAll(list, files);
        ChannelSftp.LsEntry lastModifiedEntry = Collections.max(list,
                Comparator.comparingInt(entry -> entry.getAttrs().getMTime()));
        result.add(lastModifiedEntry);
        return result;
    }
}
Once you have your own custom filter in place, you need to 'chain' it with your other filters in the SftpOutboundGateway object. For your reference, I did it this way:
ChainFileListFilter<LsEntry> filterList = new ChainFileListFilter<LsEntry>();
filterList.addFilter(new SftpSimplePatternFileListFilter("*alpha*"));
filterList.addFilter(new LastModifiedFileFilter());
setFilter(filterList);
For me, it will now fetch the latest file having the "alpha" string in its name. Hope this helps.
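For completeness, a minimal sketch of wiring that chain into the gateway bean from the question (names such as uploadPath and sftpSessionFactory() are taken from the question's config):

@Bean
@ServiceActivator(inputChannel = "getSftpChannel")
public MessageHandler getMessageHandler() {
    SftpOutboundGateway outboundGateway = new SftpOutboundGateway(sftpSessionFactory(), "ls", "'" + uploadPath + "'");
    outboundGateway.setOption(AbstractRemoteFileOutboundGateway.Option.NAME_ONLY);

    // Chain the static pattern filter with the custom LastModifiedFileFilter shown above
    ChainFileListFilter<ChannelSftp.LsEntry> filters = new ChainFileListFilter<>();
    filters.addFilter(new SftpSimplePatternFileListFilter("*alpha*"));
    filters.addFilter(new LastModifiedFileFilter());
    outboundGateway.setFilter(filters);

    return outboundGateway;
}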

Connect to Amazon S3 without Access-Key and Secret-Key from Spring Boot

I cannot connect to Amazon S3 through the IAM role. I have been told that I cannot use the secret key or the access key, but I can't find any way to do it without them.
What I currently have is this:
public class S3Config {

    @Autowired
    Environment env;

    @Value("${amazon.aws.accesskey}")
    private String awsId;

    @Value("${amazon.aws.secretkey}")
    private String awsKey;

    @Value("${amazon.aws.role}")
    private String roleArn;

    @Value("${amazon.aws.region}")
    private String region;

    @Bean
    @Scope("prototype")
    @Primary
    public AmazonS3Client s3client() {
        BasicAWSCredentials awsCreds = new BasicAWSCredentials("", "");
        AWSSecurityTokenService stsClient = AWSSecurityTokenServiceClientBuilder.standard()
                .withCredentials(new AWSStaticCredentialsProvider(awsCreds))
                .withRegion(Regions.fromName(region))
                .build();
        AssumeRoleRequest assumeRequest = new AssumeRoleRequest().withRoleArn(roleArn).withDurationSeconds(3600)
                .withRoleSessionName("Test");
        AssumeRoleResult roleResponse = stsClient.assumeRole(assumeRequest);
        Credentials sessionCredentials = roleResponse.getCredentials();
        BasicSessionCredentials awsCredentials = new BasicSessionCredentials(
                sessionCredentials.getAccessKeyId(),
                sessionCredentials.getSecretAccessKey(),
                sessionCredentials.getSessionToken());
        AmazonS3Client s3ClientRole = (AmazonS3Client) AmazonS3ClientBuilder.standard()
                .withCredentials(new AWSStaticCredentialsProvider(awsCredentials))
                .withRegion(Regions.fromName(region))
                .build();
        AmazonS3Client s3Client = s3ClientRole;
        return s3Client;
    }
}
If, on the line where I create the awsCreds object, I fill it in with the values I get from application.properties, everything works correctly. But if I leave it empty, I never get to execute the line stsClient.assumeRole(assumeRequest); it gives me a connection timeout error.
I think you should follow this doc: https://docs.aws.amazon.com/sdk-for-java/v1/developer-guide/java-dg-roles.html#default-provider-chain
It should walk you step by step through resolving your problem.
Basically, an IAM role allows a specific "resource", such as an EC2 instance, to connect to the service; you configure your client to get temporary credentials just to "authenticate" and "authorize" it.
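As a rough sketch of what that looks like in code (assuming the application runs on an EC2 instance whose instance profile grants the S3 permissions, and reusing the region property from the question), no keys are supplied at all and the default provider chain resolves the credentials:

@Bean
public AmazonS3 s3Client() {
    // No access key / secret key anywhere: DefaultAWSCredentialsProviderChain checks
    // environment variables, system properties, the shared credentials file and,
    // finally, the EC2/ECS instance profile attached to the resource.
    return AmazonS3ClientBuilder.standard()
            .withCredentials(DefaultAWSCredentialsProviderChain.getInstance())
            .withRegion(Regions.fromName(region))
            .build();
}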

Spring Integration with SFTP

I'm building a little microservice to access files from an SFTP file server. I decided to use Spring Integration SFTP to get the job done. I'm new to Spring Integration and confused about how it all works.
My goal is to get a list of files in a directory on an SFTP server and present them to the user interface. From there a user will select a file for download, and I'll use the filename to stream the file from the SFTP server to the user interface.
I'm using the following code which does work.
Entire class to handle SFTP with SSH
@Slf4j
@Configuration
public class SftpConfig {

    @Value("${sftp.host}")
    private String sftpHost;

    @Value("${sftp.port:22}")
    private int sftpPort;

    @Value("${sftp.user}")
    private String sftpUser;

    @Value("${sftp.privateKey:#{null}}")
    private Resource sftpPrivateKey;

    @Value("${sftp.privateKeyPassphrase:}")
    private String sftpPrivateKeyPassphrase;

    @Value("${sftp.password:#{null}}")
    private String sftpPassword;

    @Value("${sftp.remote.directory:/}")
    private String sftpRemoteDirectory;

    @Bean
    public SessionFactory<LsEntry> sftpSessionFactory() {
        DefaultSftpSessionFactory factory = new DefaultSftpSessionFactory(true);
        factory.setHost(sftpHost);
        factory.setPort(sftpPort);
        factory.setUser(sftpUser);
        if (sftpPrivateKey != null) {
            factory.setPrivateKey(sftpPrivateKey);
            factory.setPrivateKeyPassphrase(sftpPrivateKeyPassphrase);
        } else {
            factory.setPassword(sftpPassword);
        }
        factory.setAllowUnknownKeys(true);
        return new CachingSessionFactory<>(factory);
    }

    @ServiceActivator(inputChannel = "ftpLS")
    @Bean
    public SftpOutboundGateway getFtpLS() {
        SftpOutboundGateway gateway = new SftpOutboundGateway(sftpSessionFactory(), "ls", "'" + sftpRemoteDirectory + "' + payload");
        gateway.setOption(AbstractRemoteFileOutboundGateway.Option.NAME_ONLY);
        return gateway;
    }

    @ServiceActivator(inputChannel = "ftpGet")
    @Bean
    public SftpOutboundGateway getFtpGet() {
        SftpOutboundGateway gateway = new SftpOutboundGateway(sftpSessionFactory(), "get", "'" + sftpRemoteDirectory + "' + payload");
        gateway.setOption(AbstractRemoteFileOutboundGateway.Option.STREAM);
        return gateway;
    }

    @MessagingGateway(defaultRequestChannel = "ftpLS")
    public interface FtpLS {
        List list(String directory);
    }

    @MessagingGateway(defaultRequestChannel = "ftpGet")
    public interface FtpGet {
        InputStream get(String fileName);
    }
}
Run
@Bean
public ApplicationRunner runner(SftpConfig.FtpLS ftpLS, SftpConfig.FtpGet ftpGet) {
    return args -> {
        List<String> list = ftpLS.list("139");
        System.out.println("Result:" + list);
        InputStream is = ftpGet.get("139/" + list.get(0));
        String theString = IOUtils.toString(is, "UTF-8");
        System.out.println("Result:" + theString);
    };
}
My number one question: is this the correct approach?
Secondly, do I need two interfaces in order to use the two different SftpOutboundGateways?
Lastly, is there a better way to pass in a dynamic directory name when doing an FtpGet? Right now I'm concatenating 139 with the base directory in a string and passing it in through the interface.
is this the correct approach?
Yes, the approach is correct, although I would suggest not using isSharedSession for the gateway, since it might be used from different threads by different users.
do I need two interfaces?
No, you really can have one @MessagingGateway, but with several methods marked with their own @Gateway annotations.
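For illustration, a sketch of what that single gateway could look like, folding the two interfaces from the question into one (the combined interface name is hypothetical; the method names and channels are from the question):

// Hypothetical combined gateway replacing FtpLS and FtpGet
@MessagingGateway
public interface FtpGateway {

    @Gateway(requestChannel = "ftpLS")
    List<String> list(String directory);

    @Gateway(requestChannel = "ftpGet")
    InputStream get(String fileName);
}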
is there a better way to pass in a dynamic directory?
No, your approach is correct. There is nothing like a working directory to switch to automatically, as we can do with FTP.

JAXBElement: providing codec (/converter?) for class java.lang.Class

I have been evaluating adopting spring-data-mongodb for a project. In summary, my aim is:
Using existing XML schema files to generate Java classes.
This is achieved using JAXB xjc.
The root class is TSDProductDataType and is further modeled as below:
The thing to note here is that ExtensionType contains protected List<Object> any;, allowing it to store objects of any class. In my case, it is among the classes named TSDModule_Name_HereModuleType and can be browsed here
Use spring-data-mongodb as a persistence store
This is achieved using a simple ProductDataRepository
@RepositoryRestResource(collectionResourceRel = "product", path = "product")
public interface ProductDataRepository extends MongoRepository<TSDProductDataType, String> {
    TSDProductDataType queryByGtin(@Param("gtin") String gtin);
}
The unmarshalled TSDProductDataType, however, contains JAXBElement which spring-data-mongodb doesn't seem to handle by itself and throws a CodecConfigurationException org.bson.codecs.configuration.CodecConfigurationException: Can't find a codec for class java.lang.Class.
Here is the faulty statement:
TSDProductDataType tsdProductDataType = jaxbElement.getValue();
repository.save(tsdProductDataType);
I tried playing around with Converters for spring-data-mongodb as explained here; however, it seems I am missing something, since the exception is about "Codecs" and not "Converters".
Any help is appreciated.
EDIT:
Adding converters for JAXBElement
Note: Works with version 1.5.6.RELEASE of org.springframework.boot::spring-boot-starter-parent. With version 2.0.0.M3, hell breaks loose
It seems that I missed something while trying to add the converter earlier. So, I added it like below for testing:
@Component
@ReadingConverter
public class JAXBElementReadConverter implements Converter<DBObject, JAXBElement> {

    //@Autowired
    //MongoConverter converter;

    @Override
    public JAXBElement convert(DBObject dbObject) {
        Class declaredType, scope;
        QName name = qNameFromString((String) dbObject.get("name"));
        Object rawValue = dbObject.get("value");
        try {
            declaredType = Class.forName((String) dbObject.get("declaredType"));
        } catch (ClassNotFoundException e) {
            if (rawValue.getClass().isArray()) declaredType = List.class;
            else declaredType = LinkedHashMap.class;
        }
        try {
            scope = Class.forName((String) dbObject.get("scope"));
        } catch (ClassNotFoundException e) {
            scope = JAXBElement.GlobalScope.class;
        }
        //Object value = rawValue instanceof DBObject ? converter.read(declaredType, (DBObject) rawValue) : rawValue;
        Object value = "TODO";
        return new JAXBElement(name, declaredType, scope, value);
    }

    QName qNameFromString(String s) {
        String[] parts = s.split("[{}]");
        if (parts.length > 2) return new QName(parts[1], parts[2], parts[0]);
        if (parts.length == 1) return new QName(parts[0]);
        return new QName("undef");
    }
}
@Component
@WritingConverter
public class JAXBElementWriteConverter implements Converter<JAXBElement, DBObject> {

    //@Autowired
    //MongoConverter converter;

    @Override
    public DBObject convert(JAXBElement jaxbElement) {
        DBObject dbObject = new BasicDBObject();
        dbObject.put("name", qNameToString(jaxbElement.getName()));
        dbObject.put("declaredType", jaxbElement.getDeclaredType().getName());
        dbObject.put("scope", jaxbElement.getScope().getCanonicalName());
        //dbObject.put("value", converter.convertToMongoType(jaxbElement.getValue()));
        dbObject.put("value", "TODO");
        dbObject.put("_class", JAXBElement.class.getName());
        return dbObject;
    }

    public String qNameToString(QName name) {
        if (XMLConstants.NULL_NS_URI.equals(name.getNamespaceURI())) return name.getLocalPart();
        return name.getPrefix() + '{' + name.getNamespaceURI() + '}' + name.getLocalPart();
    }
}
@SpringBootApplication
public class TsdApplication {

    public static void main(String[] args) {
        SpringApplication.run(TsdApplication.class, args);
    }

    @Bean
    public CustomConversions customConversions() {
        return new CustomConversions(Arrays.asList(
                new JAXBElementReadConverter(),
                new JAXBElementWriteConverter()
        ));
    }
}
So far so good. However, how do I instantiate MongoConverter converter?
MongoConverter is an interface, so I guess I need an instantiable class adhering to this interface. Any suggestions?
I understand the desire for convenience in being able to just map an existing domain object to the database layer with no boilerplate, but even if you weren't having the JAXB class structure issue, I would still recommend against using it verbatim. Unless this is a simple one-off project, you will almost certainly hit a point where your domain model needs to change but your persisted data needs to remain in its existing state. If you are just straight persisting the data, you have no mechanism to convert between a newer domain schema and an older persisted data schema. Versioning the persisted data schema would be wise too.
The link you posted for writing the custom converters is one way to achieve this and fits in nicely with the Spring ecosystem. That method should also solve the issue you are experiencing (the underlying messy JAXB data structure not converting cleanly).
Are you unable to get that method working? Ensure you are loading the converters into the Spring context with @Component plus auto class scanning, or manually via some Configuration class.
EDIT to address your EDIT:
Add the following to each of your converters:
private final MongoConverter converter;

public JAXBElement____Converter(MongoConverter converter) {
    this.converter = converter;
}
Try changing your bean definition to:
@Bean
public CustomConversions customConversions(@Lazy MongoConverter converter) {
    return new CustomConversions(Arrays.asList(
            new JAXBElementReadConverter(converter),
            new JAXBElementWriteConverter(converter)
    ));
}
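With the MongoConverter injected that way, the commented-out lines in the two converters can be switched on; roughly (a sketch based on the commented code above, not a complete class):

// In JAXBElementReadConverter.convert(...): delegate nested documents to the injected converter
Object value = rawValue instanceof DBObject
        ? converter.read(declaredType, (DBObject) rawValue)
        : rawValue;
return new JAXBElement(name, declaredType, scope, value);

// In JAXBElementWriteConverter.convert(...): let the converter map the value to a Mongo type
dbObject.put("value", converter.convertToMongoType(jaxbElement.getValue()));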

How to consume a Spring Data REST service with Java?

I have the following Spring Boot + Spring Data REST repository:
@RepositoryRestResource(collectionResourceRel = "dto", path = "produtos")
public interface ProdutoRepository extends CrudRepository<Produto, Integer> {
    @Query("SELECT p FROM Produto p where descricao LIKE CONCAT(UPPER(:like),'%')")
    List<Produto> findByLike(@Param("like") String like);
}
I also have a Java client that accesses this method (this is my example of doing it):
String url = "http://localhost:8080/produtos/search/findByLike?like={like}";
RestTemplate t = new RestTemplate();
ProdutoDto resp = t.getForObject(url, ProdutoDto.class, txtLoc.getText());
ProdutoDto (this one is not totally necessary):
public class ProdutoDto extends HalDto<Produto> {}
HalDto:
public class HalDto<T extends ResourceSupport> extends ResourceSupport {

    @JsonProperty("_embedded")
    private EmbeddedDto<T> embedded;

    public EmbeddedDto<T> getEmbedded() {
        return embedded;
    }

    public void setEmbedded(EmbeddedDto<T> embedded) {
        this.embedded = embedded;
    }
}
EmbeddedDto:
public class EmbeddedDto<T> {

    @JsonProperty("dto")
    private List<T> dtoList;

    public List<T> getDtoList() {
        return dtoList;
    }

    public void setDto(List<T> dtoList) {
        this.dtoList = dtoList;
    }
}
Those classes are necessary (I think) because Spring Data returns data in the HAL (https://en.wikipedia.org/wiki/Hypertext_Application_Language) format.
Note: Produto must extend ResourceSupport.
Caveats: All collectionResourceRel must be named "dto" and it only works for collections (may be adjusted).
Is this the proper way to do this?
I have googled around and found plenty of examples of doing the server side, but almost nothing on building clients.
Thanks.
This is a solution that I have found, and it seems to work well.
First, set up your RestTemplate so that it expects JSON/HAL and knows what to do with it:
@Bean
public RestTemplate restTemplate() {
    ObjectMapper objectMapper = new ObjectMapper();
    objectMapper.registerModule(new Jackson2HalModule());
    objectMapper.configure(DeserializationFeature.FAIL_ON_UNKNOWN_PROPERTIES, false);
    MappingJackson2HttpMessageConverter messageConverter = new MappingJackson2HttpMessageConverter();
    messageConverter.setObjectMapper(objectMapper);
    messageConverter.setSupportedMediaTypes(Arrays.asList(MediaTypes.HAL_JSON, MediaType.APPLICATION_JSON_UTF8));
    return new RestTemplate(Arrays.asList(messageConverter));
}
Then you can use the exchange method of the RestTemplate to specify that you want your result to be ResponseEntity<PagedResources<Produto>>:
ResponseEntity<PagedResources<Produto>> resultResponse = restTemplate.exchange(uri, HttpMethod.GET, HttpEntity.EMPTY, new ParameterizedTypeReference<PagedResources<Produto>>() {});
if (resultResponse.getStatusCode() == HttpStatus.OK) {
    Collection<Produto> results = resultResponse.getBody().getContent();
    log.info("{} results obtained", results.size());
}
You can obtain the restTemplate either by calling the restTemplate() method defined above or by injecting (autowiring) it.
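Putting it together with the findByLike endpoint from the original question (a sketch; it assumes the HAL-aware restTemplate above and that the response still deserializes into PagedResources even though the query method is not paged):

String url = "http://localhost:8080/produtos/search/findByLike?like={like}";
ResponseEntity<PagedResources<Produto>> response = restTemplate.exchange(
        url, HttpMethod.GET, HttpEntity.EMPTY,
        new ParameterizedTypeReference<PagedResources<Produto>>() {},
        txtLoc.getText()); // value for the {like} URI variable
Collection<Produto> produtos = response.getBody().getContent();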
