I have a @RestController method like below:
public AppUserDocumentUploadResponse uploadFile(@RequestParam("file") MultipartFile file,
        @RequestParam HashMap<Object, Object> docPropertiesMap)
In the SwaggerUI screen it shows like this:
I need it to show the following instead of "additionalProp1": {}:
{
"key" : "value"
}
I tried this SO thread and similar ones, but it looks like they are meant for Swagger 2.0/springfox and not OpenAPI (which I am using):
<dependency>
<groupId>org.springdoc</groupId>
<artifactId>springdoc-openapi-ui</artifactId>
<version>1.6.9</version>
</dependency>
Any pointers to an official reference or a solution would be greatly appreciated.
I am using Spring Boot to create REST services. I need to validate the passed parameter. I have a service like below:
@GetMapping(value="/employee/{Id}")
public EmployeeDTO getEmployeeDetails(@PathVariable String Id) {
    ...
}
I need to throw an error if Id is not passed in the URL, like "Missing Id in request". I was able to achieve this using the below:
@GetMapping(value={"/employee", "/employee/{Id}"})
public EmployeeDTO getEmployeeDetails(@PathVariable String Id) {
    ...
}
And I handled MissingPathVariableException in an @ExceptionHandler inside a class annotated with @ControllerAdvice.
But I wanted to know: is this the right way to check?
You can use @ControllerAdvice to handle exceptions that are generated while executing your actual code.
For path variable validation, you can make use of spring-boot-starter-validation.
Add this Maven dependency:
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-validation</artifactId>
</dependency>
Then your controller will look like:
@GetMapping(value={"/employee", "/employee/{Id}"})
public EmployeeDTO getEmployeeDetails(
        @NotBlank(message = "Missing Id in request")
        @PathVariable String Id) {
    ...
}
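Note that for the @NotBlank constraint on a @PathVariable to be enforced at all, the controller class itself typically needs to be annotated with @Validated, and the path variable has to be declared optional so that the /employee mapping reaches the method instead of failing earlier. A minimal sketch of that wiring, with the exception handling kept inline for brevity (the controller class name is just illustrative):
import javax.validation.ConstraintViolationException;
import javax.validation.constraints.NotBlank;
import org.springframework.http.HttpStatus;
import org.springframework.http.ResponseEntity;
import org.springframework.validation.annotation.Validated;
import org.springframework.web.bind.annotation.*;

@Validated // enables validation of method parameters such as @PathVariable
@RestController
public class EmployeeController {

    @GetMapping(value = {"/employee", "/employee/{Id}"})
    public EmployeeDTO getEmployeeDetails(
            @NotBlank(message = "Missing Id in request")
            @PathVariable(name = "Id", required = false) String Id) {
        // ... look up and return the employee ...
        return new EmployeeDTO();
    }

    // A violated constraint surfaces as a ConstraintViolationException;
    // it can be mapped to a 400 here or in a @ControllerAdvice class.
    @ExceptionHandler(ConstraintViolationException.class)
    public ResponseEntity<String> onValidationError(ConstraintViolationException ex) {
        return ResponseEntity.status(HttpStatus.BAD_REQUEST).body(ex.getMessage());
    }
}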
I recommend reading this: Validating Form Input
I'm trying to expose a simple REST controller that takes a multipart file as input and uploads it to S3, plus a download API that takes the file key as input, downloads the file from S3, and sends it to the front end.
This API should support all standard file formats.
Is there a generic implementation for this, as it looks like a pretty standard feature? I could not find any.
Why don't you try Spring Content? It does exactly what you need.
Assuming Maven, Spring Boot and Spring Data (let me know if you are using something else):
pom.xml
<dependencies>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-data-jpa</artifactId>
</dependency>
<!-- HSQL -->
<dependency>
<groupId>org.hsqldb</groupId>
<artifactId>hsqldb</artifactId>
<scope>runtime</scope>
</dependency>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-web</artifactId>
</dependency>
<dependency>
<groupId>org.springframework.hateoas</groupId>
<artifactId>spring-hateoas</artifactId>
</dependency>
<dependency>
<groupId>com.github.paulcwarren</groupId>
<artifactId>content-s3-spring-boot-starter</artifactId>
<version>${spring-content-version}</version>
</dependency>
<dependency>
<groupId>com.github.paulcwarren</groupId>
<artifactId>content-rest-spring-boot-starter</artifactId>
<version>${spring-content-version}</version>
</dependency>
...
</dependencies>
Update your entity with the managed Spring Content annotations.
Document.java
@Entity
public class Document {
    ...existing fields...

    @ContentId
    private String contentId;

    @ContentLength
    private Long contentLen;

    @MimeType
    private String mimeType;

    ...getters and setters...
}
Create a connection to your S3 store. The S3 store has been implemented to use a SimpleStorageResourceLoader, so this bean will ultimately be used by your store.
S3Config.java
@Configuration
@EnableS3Stores
public class S3Config {

    @Autowired
    private Environment env;

    public Region region() {
        return Region.getRegion(Regions.fromName(System.getenv("AWS_REGION")));
    }

    @Bean
    public BasicAWSCredentials basicAWSCredentials() {
        return new BasicAWSCredentials(env.getProperty("AWS_ACCESS_KEY_ID"), env.getProperty("AWS_SECRET_KEY"));
    }

    @Bean
    public AmazonS3 client(AWSCredentials awsCredentials) {
        AmazonS3Client amazonS3Client = new AmazonS3Client(awsCredentials);
        amazonS3Client.setRegion(region());
        return amazonS3Client;
    }

    @Bean
    public SimpleStorageResourceLoader simpleStorageResourceLoader(AmazonS3 client) {
        return new SimpleStorageResourceLoader(client);
    }
}
Define a Store typed to Document - as that is what you are associating content with.
DocumentContentStore.java
@StoreRestResource
public interface DocumentStore extends ContentStore<Document, String> {
}
When you run this you also need to set the bucket for your store. This can be done by specifying spring.content.s3.bucket in application.properties/yaml or by setting the AWS_BUCKET environment variable.
This is enough to create a REST-based content service for storing content in S3 and associating that content with your Document entity. Spring Content will see the Store interface and the S3 dependencies, assume you want to store content in S3, and inject an implementation of your interface for you, meaning you don't have to implement it yourself. You will be able to store content by POSTing a multipart-form-data request to:
POST /documents/{documentId}/content
and fetching it again with:
GET /documents/{documentId}/content
(The service supports full CRUD, by the way, as well as video streaming, in case that might be important.)
You'll see that Spring Content associates content with your entity by managing the content related annotations for you.
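If you would rather expose your own upload/download endpoints (as in the original question) instead of, or alongside, the generated REST API, the injected store can also be used programmatically. Below is a rough sketch only, assuming a Spring Data DocumentRepository for the entity and a Long id (both of those names/types are my assumptions, not part of the answer above):
import java.io.IOException;
import java.io.InputStream;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.core.io.InputStreamResource;
import org.springframework.http.MediaType;
import org.springframework.http.ResponseEntity;
import org.springframework.web.bind.annotation.*;
import org.springframework.web.multipart.MultipartFile;

@RestController
public class DocumentContentController {

    @Autowired
    private DocumentRepository repo;   // assumed Spring Data repository for Document

    @Autowired
    private DocumentStore store;       // the ContentStore defined above

    @PostMapping("/documents/{id}/file")
    public ResponseEntity<?> upload(@PathVariable Long id,
                                    @RequestParam("file") MultipartFile file) throws IOException {
        Document doc = repo.findById(id)
                .orElseThrow(() -> new IllegalArgumentException("No document " + id));
        doc.setMimeType(file.getContentType());
        store.setContent(doc, file.getInputStream()); // writes the bytes to the S3 bucket
        repo.save(doc);                               // persists the managed contentId/length
        return ResponseEntity.ok().build();
    }

    @GetMapping("/documents/{id}/file")
    public ResponseEntity<InputStreamResource> download(@PathVariable Long id) {
        Document doc = repo.findById(id)
                .orElseThrow(() -> new IllegalArgumentException("No document " + id));
        InputStream content = store.getContent(doc);  // streams the bytes back from S3
        return ResponseEntity.ok()
                .contentType(MediaType.parseMediaType(doc.getMimeType()))
                .body(new InputStreamResource(content));
    }
}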
This can be used with or without Spring Data; the dependencies and the Store interface differ slightly depending on which you use. I assume you have entities that you want to associate data with, as you added a spring-data tag, but let me know if not and I can adapt the answer.
There is a video of this here; the demo starts about halfway through. It uses the Filesystem module rather than S3, but they are interchangeable: you just need to pick the right dependencies for the type of store you are using, S3 in your case.
HTH
I have a controller in Spring Boot. I want to get the formId from the form data (see the image above). @RequestHeader(value="formId") doesn't work. How do I get the value?
formId does not come from the header but from the form data, which is the request body.
You can get it as in this example:
#GetMapping("foo)
public String foo(#RequestBody MultiValueMap<String, String> formData) {
String formId = formData.get("formId");
// your code
}
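If the form is posted as regular application/x-www-form-urlencoded data, you can also let Spring bind the single field directly with @RequestParam, which avoids dealing with the map at all (the endpoint name here is just an example):
@PostMapping("/foo")
public String foo(@RequestParam("formId") String formId) {
    // formId now holds the value sent in the form data
    return formId;
}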
First you need the below dependency:
<dependency>
<groupId>javax.ws.rs</groupId>
<artifactId>javax.ws.rs-api</artifactId>
<version>2.1</version>
</dependency>
Then you can get the form data value using the below example code:
#PostMapping("/foo")
#ResponseBody
public ResponseEntity<?> getData(#FormParam("formId") String formId) {
System.out.println(formId);
}
Here, the @FormParam value and the form field name should be equal.
I'm currently looking into Spring Cloud Function and its possibilities to deploy one function on different cloud environments (AWS Lambda and Azure Functions).
My function looks like this (of course very simplified):
@Component
public class EchoFunction implements Function<String, String> {
    @Override
    public String apply(String m) {
        String message = "Received message: " + m;
        return message;
    }
}
When deploying that on AWS Lambda, it works perfectly (the full project can be found here).
However, if I run the same function as local Azure Functions deployment using the Azure Functions Core Tools, I get the following exception when calling the function:
[24.01.19 21:58:50] Caused by: java.lang.ClassCastException: reactor.core.publisher.FluxJust cannot be cast to java.lang.String
[24.01.19 21:58:50] at de.margul.awstutorials.springcloudfunction.function.EchoFunction.apply(EchoFunction.java:9)
[24.01.19 21:58:50] at org.springframework.cloud.function.adapter.azure.AzureSpringBootRequestHandler.handleRequest(AzureSpringBootRequestHandler.java:56)
[24.01.19 21:58:50] at de.margul.awstutorials.springcloudfunction.azure.handler.FunctionHandler.execute(FunctionHandler.java:19)
[24.01.19 21:58:50] ... 16 more
For some reason, the function seems to expect a Flux instead of a String.
I think this might be related to what the documentation (https://cloud.spring.io/spring-cloud-static/spring-cloud-function/2.0.0.RELEASE/single/spring-cloud-function.html#_function_catalog_and_flexible_function_signatures) says about this:
One of the main features of Spring Cloud Function is to adapt and support a range of type signatures for user-defined functions, while providing a consistent execution model. That’s why all user defined functions are transformed into a canonical representation by FunctionCatalog, using primitives defined by the Project Reactor (i.e., Flux and Mono). Users can supply a bean of type Function<String, String>, for instance, and the FunctionCatalog will wrap it into a Function<Flux<String>, Flux<String>>.
So the problem might be related to this:
If I change the function in the following way, it works:
@Component
public class EchoFunction implements Function<String, Flux<String>> {
    @Override
    public Flux<String> apply(String m) {
        String message = "Received message: " + m;
        return Flux.just(message);
    }
}
My function handler looks like this:
public class FunctionHandler extends AzureSpringBootRequestHandler<String, String> {

    @FunctionName("createEntityFunction")
    public String execute(@HttpTrigger(name = "req", methods = {
            HttpMethod.POST }, authLevel = AuthorizationLevel.ANONYMOUS) HttpRequestMessage<String> entity,
            ExecutionContext context) {
        return handleRequest(entity.getBody(), context);
    }

    @Bean
    public EchoFunction createEntityFunction() {
        return new EchoFunction();
    }
}
For the AWS deployment, I had the following dependencies:
<dependencies>
<dependency>
<groupId>org.springframework.cloud</groupId>
<artifactId>spring-cloud-function-adapter-aws</artifactId>
<version>2.0.0</version>
</dependency>
<dependency>
<groupId>com.amazonaws</groupId>
<artifactId>aws-lambda-java-core</artifactId>
<version>1.2.0</version>
</dependency>
<dependency>
<groupId>com.amazonaws</groupId>
<artifactId>aws-lambda-java-events</artifactId>
<version>2.2.5</version>
</dependency>
</dependencies>
For the Azure deployment, I have only one dependency:
<dependency>
<groupId>org.springframework.cloud</groupId>
<artifactId>spring-cloud-function-adapter-azure</artifactId>
<version>2.0.0</version>
</dependency>
I've already looked into the source code of both adapters:
On AWS, the SpringBootRequestHandler invokes the target function (in line 48).
On Azure, the AzureSpringBootRequestHandler invokes the target function (in line 56).
To me, it looks like a Flux is handed over in both cases.
However, for the AWS adapter, the object is obviously unwrapped somewhere in between.
But this is not the case with the Azure adapter.
Any ideas why?
@margul Sorry for the late reply. Without the newly created spring-cloud-function tag your question was kind of lost.
I just looked at it, and also at the issue you opened in GH, and it appears to be a bug on our side.
Basically, it seems that if we can't find the function in the catalog, we fall back on the bean factory. The problem with this approach is that the bean factory has the raw function bean (the function is not fluxified yet), hence the ClassCastException.
Anyway, I'll address the rest in GH.
Just to close this off, please see this issue.
Please help me solve the following issue:
I have a class where several fields are marked as @NotNull:
public class SearchCommentRequest {

    @NotNull
    private Date fromDate;

    @NotNull
    private Date toDate;

    //...
}
An object of this class is passed to the controller as @RequestBody, annotated also with @Valid:
#PostMapping(value = "/comment/search", consumes="application/json", produces = "text/csv")
public ResponseEntity<byte[]> searchComments(#RequestBody #Valid SearchCommentRequest searchRequest) {
List<SearchCommentResult> comments = commentService.searchComments(searchRequest);
So I expect that if either fromDate or toDate is null, an exception will be thrown.
Writing my integration tests, I decided to check this validation case as well:
@Test
public void searchCommentsValidateRequest() throws Exception {
ObjectMapper mapper = new ObjectMapper();
SearchCommentRequest request = new SearchCommentRequest();
// toDate = null;
request.setFromDate(new Date());
String requestBody = mapper.writer().writeValueAsString(request);
mockMvc.perform(post(COMMENT_SEARCH_ENDPOINT)
.contentType("application/json")
.content(requestBody))
.andDo(MockMvcResultHandlers.print())
.andExpect(status().is(400));
}
But it looks like MockMvc is ignoring validation. Searching for the same issue, I found several sources where the solution was adding the following dependencies:
<dependency>
<groupId>javax.el</groupId>
<artifactId>javax.el-api</artifactId>
<version>2.2.4</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.glassfish</groupId>
<artifactId>javax.el</artifactId>
<version>3.0.0</version>
</dependency>
But it didn't help.
I'm using Spring 4.3.3.RELEASE and manually added the following dependency to pom.xml:
<dependency>
<groupId>javax.validation</groupId>
<artifactId>validation-api</artifactId>
<version>2.0.1.Final</version>
</dependency>
Actually, @NotNull is often used at the entity level.
It will automatically validate and throw an exception when you persist the entity.
If you want to validate at the controller level, use @Valid.
You should also declare a BindingResult result parameter and check it for errors:
if (result.hasErrors()) {
    // do something or throw exceptions
}
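A minimal sketch of how that could look in the controller from the question; note that the BindingResult parameter must come immediately after the @Valid argument, and the 400 response is just one way to react to validation errors:
@PostMapping(value = "/comment/search", consumes = "application/json", produces = "text/csv")
public ResponseEntity<byte[]> searchComments(@RequestBody @Valid SearchCommentRequest searchRequest,
                                             BindingResult result) {
    if (result.hasErrors()) {
        // reject the request instead of calling the service
        return ResponseEntity.badRequest().build();
    }
    List<SearchCommentResult> comments = commentService.searchComments(searchRequest);
    // ... serialize the comments to CSV and return them ...
    return ResponseEntity.ok(new byte[0]); // placeholder body
}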