Apache Superset OAuth2 with a custom Spring Security OAuth2 server

I am using Apache Superset and trying to configure its OAuth2 capability to connect to my (custom) Spring Security OAuth2 server. Unfortunately, it isn't working right now. The stack trace begins with this:
15:09:16.584 [qtp1885996206-21] ERROR org.springframework.boot.web.support.ErrorPageFilter - Forwarding to error page from request [/oauth/authorize] due to exception [Could not resolve view with name 'forward:/oauth/confirm_access' in servlet with name 'dispatcherServlet']
javax.servlet.ServletException: Could not resolve view with name 'forward:/oauth/confirm_access' in servlet with name 'dispatcherServlet'
    at org.springframework.web.servlet.DispatcherServlet.render(DispatcherServlet.java:1262) ~[spring-webmvc-4.3.8.RELEASE.jar:4.3.8.RELEASE]
    at ...
Here is the relevant portion of my config.py from Superset.
AUTH_TYPE = AUTH_OAUTH
OAUTH_PROVIDERS = [
    {
        "name": "MY-OAUTH",
        "icon": APP_ICON,
        "token_key": "password",
        "remote_app": {
            "consumer_key": "my_dashboard",
            "consumer_secret": "my_secret",
            "base_url": "http://localhost:8088/myoauth",
            "request_token_params": {
                "scope": "my_dashboard read write",
                "grant_type": "password"
            },
            "request_token_url": None,
            "access_token_url": "http://localhost:8088/myoauth/oauth/token",
            "access_token_params": {
                "scope": "my_dashboard read write",
                "grant_type": "password",
                "response_type": "authorization_code"
            },
            "access_token_method": "POST",
            "authorize_url": "http://localhost:8088/myoauth/oauth/authorize"
        }
    }
]
A nice gentleman suggested that I have somehow disabled the servlet handler for /oauth/confirm_access, but I am not sure how to check on that or fix such a problem.
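From what I can tell, that view would normally be served by Spring Security OAuth2's whitelabel approval endpoint, so I assume something on my authorization server needs to handle /oauth/confirm_access again. A rough sketch of a custom approval endpoint, with class and view names that are purely illustrative, would look something like this, if I understand it correctly:

import java.util.Map;

import org.springframework.stereotype.Controller;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.SessionAttributes;
import org.springframework.web.servlet.ModelAndView;

// Illustrative replacement for the whitelabel approval page; not taken from my actual project.
@Controller
@SessionAttributes("authorizationRequest")
public class OAuthApprovalController {

    // Handles the forward from /oauth/authorize when user approval is required.
    @RequestMapping("/oauth/confirm_access")
    public ModelAndView confirmAccess(Map<String, Object> model) {
        // "approve" is an assumed template name; it should POST user_oauth_approval back to /oauth/authorize.
        return new ModelAndView("approve", model);
    }
}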
Do you know what is going on here, what I can do to fix this or where I can start looking?
Thanks,
Matt

Related

hadoop.security.authentication: simple

I am running Hadoop 3.3.x and have the following configuration in core-site.xml:
hadoop.http.authentication.simple.anonymous.allowed: "false"
hadoop.http.authentication.type: simple
hadoop.http.filter.initializers: org.apache.hadoop.security.AuthenticationFilterInitializer
I am getting the following error when starting the bootstrap NameNode:
"caught exception initializing http://apache-hadoop-journalnode-2.apache-hadoop-journalnode.nom-backend.svc.cluster.local:8480/getJournal?jid=apache-hadoop-namenode&segmentTxId=1&storageInfo=-66%3A1139640761%3A1664180726776%3ACID-9f32eb78-6efb-46ed-afd4-715e6c598e01&inProgressOk=true","exceptionclass":"org.apache.hadoop.hdfs.server.common.HttpGetFailedException","stack":["org.apache.hadoop.hdfs.server.common.HttpGetFailedException: Fetch of http://apache-hadoop-journalnode-2.apache-hadoop-journalnode.nom-backend.svc.cluster.local:8480/getJournal?jid=apache-hadoop-namenode&segmentTxId=1&storageInfo=-66%3A1139640761%3A1664180726776%3ACID-9f32eb78-6efb-46ed-afd4-715e6c598e01&inProgressOk=true failed with status code 401","Response message:","Authentication required","\tat org.apache.hadoop.hdfs.server.namenode.EditLogFileInputStream$URLLog$1.run(EditLogFileInputStream.java:489)","\tat org.apache.hadoop.hdfs.server.namenode.EditLogFileInputStream$URLLog$1.run(EditLogFileInputStream.java:474)","\tat java.security.AccessController.doPrivileged(Native Method)","\tat javax.security.auth.Subject.doAs(Subject.java:422)","\tat org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1878)","\tat org.apache.hadoop.security.SecurityUtil.doAsUser(SecurityUtil.java:536)","\tat org.apache.hadoop.security.SecurityUtil.doAsCurrentUser(SecurityUtil.java:530)","\tat org.apache.hadoop.hdfs.server.namenode.EditLogFileInputStream$URLLog.getInputStream(EditLogFileInputStream.java:473)","\tat org.apache.hadoop.hdfs.server.namenode.EditLogFileInputStream.init(EditLogFileInputStream.java:157)","\tat org.apache.hadoop.hdfs.server.namenode.EditLogFileInputStream.nextOpImpl(EditLogFileInputStream.java:218)","\tat org.apache.hadoop.hdfs.server.namenode.EditLogFileInputStream.nextOp(EditLogFileInputStream.java:276)","\tat org.apache.hadoop.hdfs.server.namenode.EditLogInputStream.readOp(EditLogInputStream.java:85)","\tat org.apache.hadoop.hdfs.server.namenode.EditLogInputStream.skipUntil(EditLogInputStream.java:151)"
It works fine when hadoop.http.authentication.simple.anonymous.allowed is set to "true", but not when it is set to "false".
curl -s http://localhost:9870/jmx?qry=Hadoop:service=NameNode,name=NameNodeStatus
{
  "beans" : [ {
    "name" : "Hadoop:service=NameNode,name=NameNodeStatus",
    "modelerType" : "org.apache.hadoop.hdfs.server.namenode.NameNode",
    "NNRole" : "NameNode",
    "HostAndPort" : "apache-hadoop-namenode-0.apache-hadoop-namenode.nom-backend.svc.cluster.local:8020",
    "SecurityEnabled" : false,
    "LastHATransitionTime" : 1664300715338,
    "BytesWithFutureGenerationStamps" : 0,
    "SlowPeersReport" : null,
    "SlowDisksReport" : null,
    "State" : "active"
  } ]
}
but it fails to show details when I use curl -s 'http://localhost:9870/jmx?user.name=hdfs&qry=Hadoop:service=NameNode,name=NameNodeStatus'
A similar authentication failure is seen on the bootstrapStandby node when it calls the imagetransfer?getimage endpoint:
message":"Failed to start namenode.","exceptionclass":"java.io.IOException","stack":["java.io.IOException: java.lang.RuntimeException: org.apache.hadoop.hdfs.server.common.HttpGetFailedException: Image transfer servlet at http://apache-hadoop-namenode-0.apache-hadoop-namenode.nom-backend.svc.cluster.local:9870/imagetransfer?getimage=1&txid=0&storageInfo=-66:1919632792:1664433837683:CID-462fba6c-76bc-4c4f-958e-c4906374d825&bootstrapstandby=true failed with status code 401","Response message:","Authentication required","\tat org.apache.hadoop.hdfs.server.namenode.ha.BootstrapStandby.run(BootstrapStandby.java:549)"

Why is Spring returning a HAL/HATEOAS response when calling a non-existent URL?

I am getting a response from localhost:8085/ponds when I have not set a controller for it... there is no mapping anywhere.
The even weirder thing is that when I go to localhost:8085/ponds, I can see in the application that it is executing an SQL command.
I have searched online for where this _embedded is coming from and I have found something regarding HAL and HATEOAS; however, I have not explicitly implemented these anywhere.
I have run a Maven clean install and deploy in the terminal and I see this:
INFO 16328 --- [ main] o.s.d.r.w.RepositoryRestHandlerMapping : Mapped "{[/{repository}/{id}],methods=[GET],produces=[application/hal+json || application/json]}" onto public org.springframework.http.ResponseEntity<org.springframework.hateoas.Resource> org.springframework.data.rest.webmvc.RepositoryEntityController.getItemResource(org.springframework.data.rest.webmvc.RootResourceInformation,java.io.Serializable,org.springframework.data.rest.webmvc.PersistentEntityResourceAssembler,org.springframework.http.HttpHeaders) throws org.springframework.web.HttpRequestMethodNotSupportedException
This is the response in the browser:
{
  "_embedded" : {
    "ponds" : [ ]
  },
  "_links" : {
    "self" : {
      "href" : "http://localhost:8085/ponds"
    },
    "profile" : {
      "href" : "http://localhost:8085/profile/ponds"
    }
  }
}
The last time I used Spring Boot was a good few months ago; back then, if I tried to access a link/mapping/HTML page that I did not write a controller for, I would get a "page not found" page, not a response.
I am using Spring Boot 2.0.5.RELEASE.
I do not understand why it's giving a response and not an error...
So I have commented:
<!--<dependency>-->
    <!--<groupId>org.springframework.boot</groupId>-->
    <!--<artifactId>spring-boot-starter-data-rest</artifactId>-->
<!--</dependency>-->
And the response went away. I still don't understand why it's using the ResponseEntity, or why the ResponseEntity is actually returning that JSON...
There is no need for the spring-boot-starter-data-rest dependency; I have removed it.
https://spring.io/guides/gs/accessing-data-rest/
This guide walks you through the process of creating an application that accesses relational JPA data through a hypermedia-based RESTful front end.
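For what it's worth, my understanding now is that with spring-boot-starter-data-rest on the classpath, Spring Data REST exports every Spring Data repository it finds as a HAL collection resource, which is where /ponds and its _embedded/_links JSON come from. Here is a minimal sketch of the kind of repository that triggers this; the Pond entity and repository names simply mirror the /ponds path and are my assumption:

// Pond.java
import javax.persistence.Entity;
import javax.persistence.GeneratedValue;
import javax.persistence.Id;

@Entity
public class Pond {
    @Id
    @GeneratedValue
    private Long id;
    private String name;
    // getters and setters omitted for brevity
}

// PondRepository.java
import org.springframework.data.repository.CrudRepository;

// With spring-boot-starter-data-rest present, this interface alone is exported at /ponds,
// served as application/hal+json, and backed by the SQL query seen in the logs.
public interface PondRepository extends CrudRepository<Pond, Long> {
}

If the dependency were actually needed but the endpoint should not be exposed, annotating the repository with @RepositoryRestResource(exported = false) would hide it instead of removing the starter.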

How to set a base URL in a Spring Boot GraphQL app

How can we set a base URL in a Spring Boot GraphQL server app?
By default the GraphiQL API console opens at http://localhost:8080/graphiql.
Trying to access http://localhost:8080 through Postman with a POST query as below:
{
  bookings {
    name
  }
}
gives an error saying:
{
  "timestamp": 1549913598497,
  "status": 404,
  "error": "Not Found",
  "message": "No message available",
  "path": "/"
}
Q1: What path on the server should I be using to invoke it?
Q2: Is there a way to provide a custom base path, something like http://localhost:8080/service/api/query?
Usually the GraphQL endpoint is at http://localhost:8080/graphql. If not, inspect the network tab in your browser when you run a query in the GraphiQL interface; it will show the endpoint the query is actually sent to.
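Also note that when calling the endpoint directly from Postman, the query normally has to be wrapped in a JSON request body rather than sent as a raw query document, something along these lines:
{
  "query": "{ bookings { name } }"
}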
In order to change the base path you would need to change application.properties into something like:
graphql.servlet.mapping: /service/api/query
graphiql.mapping: /graphiql
graphiql.endpoint: /service/api/query
If you are using the Spring Boot property file, you can change the base URL like below:
spring.graphql.path=/service/api/query
Example:
When I changed the base URL like below:
spring.graphql.path=/api/projects/graphql
It was reflected in the console like this:
2022-11-05 08:58:14.964 INFO 17336 --- [ main] s.b.a.g.s.GraphQlWebMvcAutoConfiguration : GraphQL endpoint HTTP POST /api/projects/graphql
More can be found in the official documentation below:
https://docs.spring.io/spring-boot/docs/current/reference/html/application-properties.html#appendix.application-properties.web

Why are my Spring Data repositories not found if moved into another package of a Spring Boot application?

I created a new project in Eclipse from these helpful Spring examples (Import Getting Started Content). It is called "gs-accessing-data-rest-complete".
The reference and full code can be found at spring-guides/gs-accessing-data-rest.
When leaving the example unchanged, except for using WAR instead of JAR packaging, everything works well. When calling $ curl http://localhost:8080/, I get a listing of the usable resources:
$ curl http://localhost:8080/
{
  "_links" : {
    "people" : {
      "href" : "http://localhost:8080/name{?page,size,sort}",
      "templated" : true
    },
    "profile" : {
      "href" : "http://localhost:8080/alps"
    }
  }
}
But when I move the PersonRepository into another package, e.g. myRepos, via Eclipse's Refactor --> Move command, the resource is not accessible anymore.
The response from curl is then:
$ curl http://localhost:8080/
{
  "_links" : {
    "profile" : {
      "href" : "http://localhost:8080/alps"
    }
  }
}
As far as I understand, Spring scans for repositories automatically. Because the main class uses the @SpringBootApplication annotation, everything should be found by Spring itself.
What am I missing? Do I have to add some special XML configuration file or another configuration class somewhere? Or do I have to update application.properties somehow?
Perhaps somebody has some useful experience they might share with me. Thank you.
Try specifying the base package to use when scanning for repositories by using this annotation on your config class: @EnableJpaRepositories(basePackages = "your.base.repository.package")
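A minimal sketch of what that could look like on the application class, assuming the repository was moved to a top-level package named myRepos as described in the question (the class name here is illustrative):

import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.data.jpa.repository.config.EnableJpaRepositories;

@SpringBootApplication
// Point repository scanning at the package the PersonRepository was moved to;
// "myRepos" matches the package mentioned in the question.
@EnableJpaRepositories(basePackages = "myRepos")
public class Application {

    public static void main(String[] args) {
        SpringApplication.run(Application.class, args);
    }
}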

Call Spring Batch jobs on a remote server

I use Spring Batch Admin to manage and monitor jobs and executions. How can I call a job and launch it from a standalone Java application, given an HTTP connection to the server hosting the Spring Batch Admin web app?
Thank you for any help
You can use the Spring Batch Admin JSON API to do so; it is possible to list jobs as well as to run them. Additionally, you can expose JMX beans to monitor and manage batch jobs remotely.
Below is an example of a JSON POST request to the job service, launching the job named 'job1':
$ curl -d jobParameters=fail=false http://localhost:8080/spring-batch-admin-sample/batch/jobs/job1.json
{"jobExecution" : {
"resource" : "http://localhost:8080/spring-batch-admin-sample/batch/jobs/executions/2.json",
"id" : "2",
"status" : "STARTING",
"startTime" : "",
"duration" : "",
"exitCode" : "UNKNOWN",
"exitDescription" : "",
"jobInstance" : { "resource" : "http://localhost:8080/spring-batch-admin-sample/batch/jobs/job1/1.json" },
"stepExecutions" : {
}
}
}
You can simply use HttpURLConnection with the job URL and its parameters.
The URL construct would be like:
"http://<host>:8080/spring-batch-admin-sample/batch/jobs/yourJob?jobParameters=" + URLEncoder.encode("param1=value,param2=value2", "UTF-8")
Let me know if you need any clarification.
