I successfully set up AWS Elasticsearch and I can query it using curl from the terminal:
curl -XPUT -u 'admin:xxxx' \
  'https://search-xxxxxxx-xxxxxxu.eu-west-1.es.amazonaws.com/xxxxx/_doc/1' \
  -H 'Content-Type: application/json' \
  -d '{"name": "product", "description": "crema..."}'
In the project I have the following dependency:
<dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-data-elasticsearch</artifactId>
</dependency>
and, running an Elasticsearch instance on my Mac, I can connect to it using the following configuration:
@Configuration
public class ElasticClient extends ElasticsearchConfiguration {

    @Override
    public ClientConfiguration clientConfiguration() {
        return ClientConfiguration.builder()
                .connectedTo("localhost:9200")
                .build();
    }
}
In order to connect to Elasticsearch on AWS I tried several configurations, including the deprecated RestHighLevelClient, without success.
Is it possible to connect to AWS using Spring Boot and basic HTTP authentication, like I've done with the curl command from the terminal? I would like to implement public ClientConfiguration clientConfiguration() with the right configuration in order to connect to my Elasticsearch domain hosted on AWS.
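Something along these lines is what I have in mind. This is only a rough sketch (untested against AWS), reusing the placeholder endpoint and the admin credentials from the curl example above:

@Configuration
public class ElasticClient extends ElasticsearchConfiguration {

    @Override
    public ClientConfiguration clientConfiguration() {
        return ClientConfiguration.builder()
                // AWS domain endpoint from the curl example; HTTPS port assumed to be 443
                .connectedTo("search-xxxxxxx-xxxxxxu.eu-west-1.es.amazonaws.com:443")
                .usingSsl()
                // same basic-auth credentials as in the curl command
                .withBasicAuth("admin", "xxxx")
                .build();
    }
}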
Thanks
I basically followed the steps described here: https://docs.spring.io/spring-data/neo4j/docs/current/reference/html/#configure-spring-boot-project
My application.properties contains the following:
spring.neo4j.uri=neo4j://localhost:7687
spring.neo4j.authentication.username=neo4j
spring.neo4j.authentication.password=verySecret357
I have a Neo4jConfiguration bean which only specifies the TransactionManager; the rest is (supposedly) taken care of by spring-boot-starter-data-neo4j:
@Configuration
public class Neo4jConfiguration {

    @Bean
    public ReactiveNeo4jTransactionManager reactiveTransactionManager(Driver driver,
            ReactiveDatabaseSelectionProvider databaseNameProvider) {
        return new ReactiveNeo4jTransactionManager(driver, databaseNameProvider);
    }
}
Neo4j (5.3.0) runs in a Docker container I started with
docker run -d --name neo4j -p 7474:7474 -p 7687:7687 -e 'NEO4J_AUTH=neo4j/verySecret357' neo4j:4.4.11-community
I can access it through HTTP on my localhost:7474 and can authenticate using the credentials above.
Now, when I run my Spring Boot app and try to create nodes in Neo4j, I keep getting the same exception:
org.neo4j.driver.exceptions.AuthenticationException: The client is unauthorized due to authentication failure.
Running in debug, however, it seems the client authentication scheme is correctly set.
Any thoughts on what I might be doing wrong?
Edit: one thing though: I would assume that the authToken would contain a base64-encoded String (username:password), since the scheme is basic auth. It looks like that's not the case (using neo4j-java-driver:5.2.0).
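For reference, this is a minimal standalone check of the credentials, reusing the URI and values from application.properties above. As far as I can tell, AuthTokens.basic keeps the raw username and password in the token rather than base64-encoding them the way HTTP Basic auth does:

import org.neo4j.driver.AuthTokens;
import org.neo4j.driver.Driver;
import org.neo4j.driver.GraphDatabase;

public class Neo4jAuthCheck {
    public static void main(String[] args) {
        // AuthTokens.basic stores the username and password as plain token fields
        try (Driver driver = GraphDatabase.driver(
                "neo4j://localhost:7687",
                AuthTokens.basic("neo4j", "verySecret357"))) {
            // throws AuthenticationException if the credentials are rejected
            driver.verifyConnectivity();
        }
    }
}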
Edit: it seems to be related to the Docker image. A standalone Neo4j instance works fine.
I'm new to Spring and I have a problem when I run Spring Web in a VM.
The test on my local computer works:
I run the app with mvn spring-boot:run -Dspring-boot.run.profiles=test and check the request:
curl -X POST http://localhost:8080/api/v1/dictionary/yandex-alice-skill -H "Content-Type: application/json" -d "{}"
and get {"response":{"text":"Hi! I can help you to learn words.","end_session":false},"version":"1.0"}
When I run the app in the VM and try to get the JSON from within the VM, everything works fine too.
But when I try to get the JSON from my local computer, I get a 404.
The page returns a 404, but why can't I get the JSON? How do I properly connect to my virtual machine?
Controller:
@RestController
@Slf4j
@RequestMapping("/api/v1/dictionary/yandex-alice-skill")
@RequiredArgsConstructor
public class DictionaryController {

    @Autowired
    private DictionaryService dictionaryService;

    @PostMapping
    public @ResponseBody YandexAliceResponse talkYandexAlice(  // return type assumed; it was missing in the original snippet
            @RequestBody YandexAliceRequest request) {
        return dictionaryService.talkYandexAlice(request);
    }
}
I want to get JSON, but I get HTML from the server.
I have a simple REST controller:
public void getMyIp(HttpServletRequest request)
{
    final var ip = request.getRemoteAddr();
    ....
}
And I emulate a request through a proxy server:
curl --location --request GET 'localhost:8080/api/myIp' \
--header 'X-Forwarded-For: 10.10.10.10' \
--header 'X-Real-Ip: 10.10.10.10'
I changed the strategy in application.yml:
server:
  forward-headers-strategy: FRAMEWORK
The application runs from the IDE with the built-in Tomcat server.
Why do I still get my real IP address instead of the forwarded one?
Update: I changed the strategy to NATIVE, and it works now!
You need to use the NATIVE strategy. As far as I understand, with NATIVE the embedded Tomcat itself processes the X-Forwarded-For header and rewrites getRemoteAddr(), whereas the FRAMEWORK strategy's filter rewrites scheme/host/port but does not touch the remote address:
server:
  forward-headers-strategy: NATIVE
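As a side note, if changing the strategy is not an option, a controller can also read the forwarded header itself. This is only a rough alternative sketch, not the fix above; the mapping path is taken from the curl example and the controller class name is made up:

import jakarta.servlet.http.HttpServletRequest;  // use javax.servlet.http.HttpServletRequest on Spring Boot 2.x
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RestController;

@RestController
public class MyIpController {  // hypothetical class name

    @GetMapping("/api/myIp")   // path assumed from the curl example above
    public String getMyIp(HttpServletRequest request) {
        final String forwardedFor = request.getHeader("X-Forwarded-For");
        // X-Forwarded-For may hold a comma-separated proxy chain; the first entry is the original client
        return (forwardedFor != null && !forwardedFor.isEmpty())
                ? forwardedFor.split(",")[0].trim()
                : request.getRemoteAddr();
    }
}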
I'm facing an issue while using DynamoDB with Spring Boot for storing data.
It gives me the following error:
com.amazonaws.services.dynamodbv2.model.ResourceNotFoundException: Cannot do operations on a non-existent table (Service: AmazonDynamoDBv2; Status Code: 400; Error Code: ResourceNotFoundException; Request ID: 7ffd4509-e444-4569-8c81-d4e7a1c218ef)
I have started a local instance of DynamoDB using the following command on a Windows machine:
java -Djava.library.path=./DynamoDBLocal_lib -jar DynamoDBLocal.jar -port 8001 -sharedDb
I created a DynamoDBMapper for interacting with the DB:
@Bean
public DynamoDBMapper mapper() {
    return new DynamoDBMapper(amazonDynamoDBConfig());
}

public AmazonDynamoDB amazonDynamoDBConfig() {
    return AmazonDynamoDBClientBuilder.standard()
            .withEndpointConfiguration(
                    new AwsClientBuilder.EndpointConfiguration(awsDynamoDBEndPoint, awsRegion))
            .withCredentials(
                    new AWSStaticCredentialsProvider(new BasicAWSCredentials(awsAccessKey, awsSecretKey)))
            .build();
}
And injected the mapper using @Autowired:
@Autowired
private DynamoDBMapper mapper;
When I try to add data using
mapper.save(person);
it gives an error saying Cannot do operations on a non-existent table
Please give me some idea of where I'm going wrong here.
Thanks in advance.
The root cause might be that the aws-cli and the application use different AWS profiles (credentials and region). DynamoDB Local will then create and use different DB files when the aws-cli and the application connect to it.
Please use the below approach to debug.
You must use -sharedDb when starting your Docker instance:
docker run -p 8000:8000 -v $(pwd)/local/dynamodb:/data/ amazon/dynamodb-local -jar DynamoDBLocal.jar -sharedDb -dbPath /data
Check the AWS profile you have created (aws_access_key_id & aws_secret_access_key). Use these same values in your application to connect to the Docker DynamoDB instance. A quick way to verify which tables the application actually sees is sketched after the next point.
In your Person.java (model class), check the table name. Table names are case-sensitive for DynamoDB:
@DynamoDBTable(tableName = "Person")
Or
@DynamoDBTable(tableName = "person")
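To see which tables the application's client actually sees (and therefore which DB file/profile it is hitting), a minimal sketch like the one below can help. It reuses the client built by amazonDynamoDBConfig() above; creating the table from the mapper metadata is an optional extra step, and Person is the model class from the question:

import com.amazonaws.services.dynamodbv2.AmazonDynamoDB;
import com.amazonaws.services.dynamodbv2.datamodeling.DynamoDBMapper;
import com.amazonaws.services.dynamodbv2.model.CreateTableRequest;
import com.amazonaws.services.dynamodbv2.model.ProvisionedThroughput;

public class DynamoDbLocalCheck {

    // amazonDynamoDB is the client built by amazonDynamoDBConfig() above
    static void debugTables(AmazonDynamoDB amazonDynamoDB) {
        // Lists the tables visible to this endpoint/credential combination; if "Person" is missing,
        // the app and aws-cli are likely using different profiles or DB files.
        System.out.println(amazonDynamoDB.listTables().getTableNames());

        // Optional: create the table from the mapper's model metadata if it does not exist yet
        // (table name and keys come from the @DynamoDBTable/@DynamoDBHashKey annotations on Person).
        DynamoDBMapper mapper = new DynamoDBMapper(amazonDynamoDB);
        CreateTableRequest createTableRequest = mapper.generateCreateTableRequest(Person.class)
                .withProvisionedThroughput(new ProvisionedThroughput(1L, 1L));
        if (!amazonDynamoDB.listTables().getTableNames().contains(createTableRequest.getTableName())) {
            amazonDynamoDB.createTable(createTableRequest);
        }
    }
}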
Is it possible to refresh the configuration by calling a Java method instead of using the REST API:
curl localhost:8080/actuator/refresh -d {} -H "Content-Type: application/json"
You can use the RestartEndpoint class from spring-cloud-context:
@Autowired
private RestartEndpoint restartEndpoint;
...
Thread restartThread = new Thread(() -> restartEndpoint.restart());
restartThread.setDaemon(false);
restartThread.start();
This is how @alexbt suggests doing it. But note that the Spring Cloud documentation also says you can refresh individual beans, provided they are annotated with @RefreshScope.
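To illustrate that last point, here is a rough sketch (the bean and property names are made up) of a bean that would be rebound with fresh property values when a refresh is triggered. Programmatically, the same refresh can also be triggered by injecting ContextRefresher from spring-cloud-context and calling its refresh() method, which is what the /actuator/refresh endpoint does under the hood:

import org.springframework.beans.factory.annotation.Value;
import org.springframework.cloud.context.config.annotation.RefreshScope;
import org.springframework.stereotype.Component;

@Component
@RefreshScope  // the bean is re-created with updated configuration when a refresh occurs
public class MessageProvider {

    @Value("${app.message:default}")  // hypothetical property key
    private String message;

    public String getMessage() {
        return message;
    }
}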