Grails + hibernate + controller sessionFactory null object? - spring

I am new to using Hibernate and Grails. I have multiple Java objects that I need to persist. To learn how it works, I'm using a simple example of an Employee class. It's in my src/java with the corresponding XML mappings. I think I need to make a session instance from my session factory, but I don't know what I'm doing wrong. I followed a tutorial for setting up Hibernate (hibernate tut) and tried to translate it for Grails. Any thoughts?
package com.turingpages.Matrix.view

import org.springframework.dao.DataIntegrityViolationException
import org.hibernate.HibernateException;
import org.hibernate.Session;
import org.hibernate.Transaction;
import org.hibernate.SessionFactory;
import org.hibernate.cfg.Configuration;
import org.codehaus.groovy.grails.commons.ApplicationHolder as AH

class MatrixController {
    def ctx = AH.application.mainContext
    def sessionFactory = ctx.sessionFactory

    static allowedMethods = [save: "POST", update: "POST", delete: "POST"]

    ...

    def create() {
        def session = sessionFactory.currentSession
        Transaction tx = null;
        Integer employeeID = null;
        try {
            tx = session.beginTransaction();
            Employee employee = new Employee("fname", "lname", 100);
            employeeID = (Integer) session.save(employee);
            tx.commit();
        } catch (HibernateException e) {
            if (tx != null) tx.rollback();
            e.printStackTrace();
        } finally {
            session.close();
        }
        return employeeID;
    }
}
My stack trace:
ERROR errors.GrailsExceptionResolver - NullPointerException occurred when processing request: [POST] /turingpages/matrix/create
Cannot get property 'currentSession' on null object. Stacktrace follows:
Message: Cannot get property 'currentSession' on null object
Line | Method
->> 33 | create in com.turingpages.Matrix.view.MatrixController$$ENvP7skK
- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
| 195 | doFilter in grails.plugin.cache.web.filter.PageFragmentCachingFilter
| 63 | doFilter in grails.plugin.cache.web.filter.AbstractFilter
| 1110 | runWorker in java.util.concurrent.ThreadPoolExecutor
| 603 | run . . . in java.util.concurrent.ThreadPoolExecutor$Worker
^ 722 | run in java.lang.Thread

There's no reason you should be writing code like that with Grails. If you really need to manage the transaction in a controller, you should do it this way.
def create() {
    Integer employeeID = null
    Employee.withTransaction { status ->
        Employee employee = new Employee(firstName: "fname", lastName: "lname", noIdea: 100)
        employee.save()
        if (employee.hasErrors()) {
            status.setRollbackOnly()
        } else {
            employeeID = employee.id
        }
    }
    return employeeID
}
That said, when dealing with a single Domain like this, you don't really need to worry about it at all:
def create() {
    Employee employee = new Employee(firstName: "fname", lastName: "lname", noIdea: 100)
    employee.save(flush: true)
    [employee: employee] // generally you want to pass the object back to a view this way
    // deal with errors in the domain on the view
}
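To act on that "deal with errors on the view" comment: a minimal GSP fragment (a sketch using Grails' built-in error tags; the view itself is yours to design) could render those validation errors like so:
<g:hasErrors bean="${employee}">
    <g:renderErrors bean="${employee}" as="list"/>
</g:hasErrors>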
And even better would be to use a Service class. But that can be your homework assignment.
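For the homework-inclined, here is a minimal sketch of that service approach. EmployeeService and its method name are illustrative, not from the question; what is standard Grails is that services are transactional by default and get injected into controllers by naming convention:
class EmployeeService {
    // Grails services are transactional by default
    Employee createEmployee(String firstName, String lastName, int noIdea) {
        Employee employee = new Employee(firstName: firstName, lastName: lastName, noIdea: noIdea)
        employee.save()
        return employee
    }
}

class MatrixController {
    def employeeService // injected by Grails because of the property name

    def create() {
        Employee employee = employeeService.createEmployee("fname", "lname", 100)
        [employee: employee]
    }
}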

You need to move the code
def ctx = AH.application.mainContext
def sessionFactory = ctx.sessionFactory
into the create() method, or replace both lines with a plain property declaration:
def sessionFactory
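Declared at the class level like that, Grails wires in the Hibernate SessionFactory bean by name when the controller is instantiated. A minimal sketch using the names from the question:
class MatrixController {
    // injected by Grails because the property name matches the sessionFactory bean
    def sessionFactory

    def create() {
        def session = sessionFactory.currentSession
        // ... work with the session as before
    }
}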
However, Grails provides a withTransaction method to serve your purpose in a simpler way:
def create() {
    Employee.withTransaction { status ->
        Employee employee = new Employee("fname", "lname", 100)
        employee.save()
        .....
        if (employee.id) {
            return employee.id
        } else {
            status.setRollbackOnly()
        }
    }
}

Related

consume/return a JSON response when executing flow using Spring boot API

I'm a beginner in Corda and I'm trying to execute flows using a Spring Boot API. When I used:
@PostMapping(value = [ "create-iou" ], produces = [ TEXT_PLAIN_VALUE ] , headers = [ "Content-Type=application/x-www-form-urlencoded" ])
my flow gets executed (tested using Insomnia). But when I changed it to
@PostMapping(value = [ "create-iou" ], produces = [ APPLICATION_JSON_VALUE ], headers = [ "Content-Type=application/json" ])
it gives me a 406 Not Acceptable error: No body returned for response.
Here's the API I've created/copied:
#PostMapping(value = [ "create-iou" ], produces = [ TEXT_PLAIN_VALUE ] , headers = [ "Content-Type=application/x-www-form-urlencoded" ])
fun createIOU(request: HttpServletRequest): ResponseEntity<String> {
val iouValue = request.getParameter("iouValue").toInt()
val partyName = request.getParameter("partyName")
?: return ResponseEntity.badRequest().body("Query parameter 'partyName' must not be null.\n")
if (iouValue <= 0 ) {
return ResponseEntity.badRequest().body("Query parameter 'iouValue' must be non-negative.\n")
}
val partyX500Name = CordaX500Name.parse(partyName)
val otherParty = proxy.wellKnownPartyFromX500Name(partyX500Name) ?: return ResponseEntity.badRequest().body("Party named $partyName cannot be found.\n")
return try {
val signedTx = proxy.startTrackedFlow(::Initiator, iouValue, otherParty).returnValue.getOrThrow()
ResponseEntity.status(HttpStatus.CREATED).body("Transaction id ${signedTx.id} committed to ledger.\n")
} catch (ex: Throwable) {
logger.error(ex.message, ex)
ResponseEntity.badRequest().body(ex.message!!)
}
}
I would like to return something like this:
{
    iouValue: 99,
    lender: PartyA,
    borrower: PartyB
}
when executing the flow via the HTTP endpoint.
You need to use the RPC connection libraries provided by Corda:
import net.corda.client.rpc.CordaRPCClient
import net.corda.client.rpc.CordaRPCConnection
Take a look at this example to see how to use them.
You are not showing how your proxy is instantiated, but you need a proxy to connect to the node via RPC, like so:
val rpcAddress = NetworkHostAndPort(host, rpcPort)
val rpcClient = CordaRPCClient(rpcAddress)
val rpcConnection = rpcClient.start(username, password)
proxy = rpcConnection.proxy
and once you have the proxy, you can create Spring Boot endpoints that use it to make the RPC calls:
@RestController
@RequestMapping("/")
class StandardController(rpc: NodeRPCConnection) {
    private val proxy = rpc.proxy

    @GetMapping(value = ["/addresses"], produces = arrayOf("text/plain"))
    private fun addresses() = proxy.nodeInfo().addresses.toString()

    @GetMapping(value = ["/identities"], produces = arrayOf("text/plain"))
    private fun identities() = proxy.nodeInfo().legalIdentities.toString()
}

Flink is not adding any data to Elasticsearch but no errors

Folks, I'm new to all this data streaming business, but I was able to build and submit a Flink job that reads some CSV data from Kafka, aggregates it, and then puts it in Elasticsearch.
I got the first two parts working and can print my aggregation to STDOUT. But when I added the code to write to Elasticsearch, nothing seems to happen there (no data is added). The Flink job manager log looks fine (no errors) and says:
2020-03-03 16:18:03,877 INFO
org.apache.flink.streaming.connectors.elasticsearch7.Elasticsearch7ApiCallBridge
- Created Elasticsearch RestHighLevelClient connected to [http://elasticsearch-elasticsearch-coordinating-only.default.svc.cluster.local:9200]
Here is my code at this point:
/*
* This Scala source file was generated by the Gradle 'init' task.
*/
package flinkNamePull
import java.time.LocalDateTime
import java.util.Properties
import org.apache.flink.api.common.serialization.SimpleStringSchema
import org.apache.flink.streaming.api.scala._
import org.apache.flink.streaming.connectors.kafka.{FlinkKafkaConsumer010, FlinkKafkaProducer010}
import org.apache.flink.api.common.functions.RichMapFunction
import org.apache.flink.configuration.Configuration
import org.apache.flink.streaming.api.scala.StreamExecutionEnvironment
import org.apache.flink.table.api.{DataTypes, Table}
import org.apache.flink.table.api.scala.StreamTableEnvironment
import org.apache.flink.table.descriptors.{Elasticsearch, Json, Schema}
object Demo {

    /**
     * MapFunction to generate Transfers POJOs from parsed CSV data.
     */
    class TransfersMapper extends RichMapFunction[String, Transfers] {
        private var formatter = null

        @throws[Exception]
        override def open(parameters: Configuration): Unit = {
            super.open(parameters)
            //formatter = DateTimeFormat.forPattern("yyyy-MM-dd HH:mm:ss")
        }

        @throws[Exception]
        override def map(csvLine: String): Transfers = {
            //var splitCsv = csvLine.stripLineEnd.split("\n")(1).split(",")
            var splitCsv = csvLine.stripLineEnd.split(",")
            val arrLength = splitCsv.length
            val i = 0
            if (arrLength != 13) {
                for (i <- arrLength + 1 to 13) {
                    if (i == 13) {
                        splitCsv = splitCsv :+ "0.0"
                    } else {
                        splitCsv = splitCsv :+ ""
                    }
                }
            }
            var trans = new Transfers()
            trans.rowId = splitCsv(0)
            trans.subjectId = splitCsv(1)
            trans.hadmId = splitCsv(2)
            trans.icuStayId = splitCsv(3)
            trans.dbSource = splitCsv(4)
            trans.eventType = splitCsv(5)
            trans.prev_careUnit = splitCsv(6)
            trans.curr_careUnit = splitCsv(7)
            trans.prev_wardId = splitCsv(8)
            trans.curr_wardId = splitCsv(9)
            trans.inTime = splitCsv(10)
            trans.outTime = splitCsv(11)
            trans.los = splitCsv(12).toDouble
            return trans
        }
    }

    def main(args: Array[String]) {
        // Create streaming execution environment
        val env = StreamExecutionEnvironment.getExecutionEnvironment
        env.setParallelism(1)

        // Set properties per KafkaConsumer API
        val properties = new Properties()
        properties.setProperty("bootstrap.servers", "kafka.kafka:9092")
        properties.setProperty("group.id", "test")

        // Add Kafka source to environment
        val myKConsumer = new FlinkKafkaConsumer010[String]("raw.data3", new SimpleStringSchema(), properties)
        // Read from beginning of topic
        myKConsumer.setStartFromEarliest()

        val streamSource = env
            .addSource(myKConsumer)

        // Transform CSV (with a header row) per Kafka event into a Transfers object
        val streamTransfers = streamSource.map(new TransfersMapper())

        // create a TableEnvironment
        val tEnv = StreamTableEnvironment.create(env)
        println("***** NEW EXECUTION STARTED AT " + LocalDateTime.now() + " *****")

        // register a Table
        val tblTransfers: Table = tEnv.fromDataStream(streamTransfers)
        tEnv.createTemporaryView("transfers", tblTransfers)

        tEnv.connect(
            new Elasticsearch()
                .version("7")
                .host("elasticsearch-elasticsearch-coordinating-only.default.svc.cluster.local", 9200, "http") // required: one or more Elasticsearch hosts to connect to
                .index("transfers-sum")
                .documentType("_doc")
                .keyNullLiteral("n/a")
        )
            .withFormat(new Json().jsonSchema("{type: 'object', properties: {curr_careUnit: {type: 'string'}, sum: {type: 'number'}}}"))
            .withSchema(new Schema()
                .field("curr_careUnit", DataTypes.STRING())
                .field("sum", DataTypes.DOUBLE())
            )
            .inUpsertMode()
            .createTemporaryTable("transfersSum")

        val result = tEnv.sqlQuery(
            """
              |SELECT curr_careUnit, sum(los)
              |FROM transfers
              |GROUP BY curr_careUnit
              |""".stripMargin)
        result.insertInto("transfersSum")

        // Elasticsearch elasticsearch-elasticsearch-coordinating-only.default.svc.cluster.local:9200
        env.execute("Flink Streaming Demo Dump to Elasticsearch")
    }
}
I'm not sure how I can debug this beast... Wondering if somebody can help me figure out why the Flink job is not adding data to Elasticsearch :(
From my Flink cluster, I'm able to query Elasticsearch just fine (manually) and add records to my index:
curl -XPOST "http://elasticsearch-elasticsearch-coordinating-only.default.svc.cluster.local:9200/transfers-sum/_doc" -H 'Content-Type: application/json' -d'{"curr_careUnit":"TEST123","sum":"123"}'
A kind soul on the Flink mailing list pointed out that it could be Elasticsearch buffering my records... Well, it was. ;)
I added the following options to the Elasticsearch descriptor (chained onto the connect(...) configuration shown above):
.bulkFlushMaxActions(2)
.bulkFlushInterval(1000L)
Flink Elasticsearch Connector 7 using Scala
Please find a working and detailed answer which I have provided here.

Weird validation error handling in Micronaut

I have a controller action to serve my React front-end. It needs to return the validation messages in a special format:
@Transactional
@Post( uri = '{/id}', consumes = MediaType.APPLICATION_JSON, produces = MediaType.APPLICATION_JSON )
HttpResponse save( @PathVariable @Nullable Long id, @Body Map body ){
    def o = bindFromIdAndBody id, body
    if( o.save( flush:true ) ){
        log.info "version >> $o.version"
        HttpResponse.ok o
    }else{
        log.info '-------------------------'
        List errors = o.errors.fieldErrors.collect{ FieldError fe ->
            fe.codes.findResult{ String c ->
                messageSource.getMessage c, fe.arguments, null, Locale.default
            } ?: fe.codes.last()
        }
        log.info "save failed for $o: $errors"
        HttpResponse.badRequest( errors:errors )
    }
}
When I call the action, I'm getting 400 Bad Request in my client, but instead of JSON in the { errors:[ {..}, {..}, {..} ] } style, I see this:
{
    "message":"Validation Error(s) occurred during save() : Field error in object ... default message [Property [{0}] of class [{1}] cannot be blank]\r\n",
    "path":"fullName",
    "_links":{"self":{"href":"/person/42","templated":false}}
}
Also, the else{} block is never reached; I don't get any further logs.
Any hints?
It appears that in the GORM configuration for Micronaut, pulled in via
compile 'io.micronaut.configuration:micronaut-hibernate-gorm'
failOnError is set to true by default. That leads to a ValidationException being thrown on save() instead of o.errors being populated.
To fix the issue I added the line
grails.gorm.failOnError: false
to my application.yml and now it works like a charm.
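Alternatively, if you want to keep the global default, GORM's save() accepts a per-call failOnError argument that overrides the configuration for that single invocation, so the error-collecting branch is actually reached. A sketch of the action from the question under that assumption (bindFromIdAndBody and messageSource as above):
@Transactional
@Post( uri = '{/id}', consumes = MediaType.APPLICATION_JSON, produces = MediaType.APPLICATION_JSON )
HttpResponse save( @PathVariable @Nullable Long id, @Body Map body ){
    def o = bindFromIdAndBody id, body
    // failOnError: false overrides the global setting for this call only
    if( o.save( flush: true, failOnError: false ) ){
        HttpResponse.ok o
    }else{
        // validation failures now populate o.errors instead of throwing
        List errors = o.errors.fieldErrors.collect{ FieldError fe ->
            fe.codes.findResult{ String c ->
                messageSource.getMessage c, fe.arguments, null, Locale.default
            } ?: fe.codes.last()
        }
        HttpResponse.badRequest( errors: errors )
    }
}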

Kamon MDC propagation doesn't work

I am trying to do MDC propagation with Kamon as shown in this documentation, but it does not seem to work as described.
Play framework - 2.5
kamon-core - 0.6.2
kamon-play-25 - 0.6.2
My logback pattern:
<pattern>%d{HH:mm:ss.SSS} [%thread] [%level] [%traceToken]- %logger{36}\(%L\) %X{X-ApplicationId} - %message%n%xException</pattern>
I have created a filter:
class AccessLoggingFilter @Inject() (implicit val mat: Materializer, ec: ExecutionContext) extends Filter with LazyLogging {

    val ApplicationIdKey = AvailableToMdc("X-ApplicationId")

    def apply(next: (RequestHeader) => Future[Result])(request: RequestHeader): Future[Result] = {
        TraceLocal.storeForMdc("X-ApplicationId", request.id.toString)
        logger.error("first Location")
        withMdc {
            logger.error("Second location")
            next(request)
        }
    }
}
And added it like so:
class MyFilters @Inject() (accessLoggingFilter: AccessLoggingFilter) extends DefaultHttpFilters(accessLoggingFilter)
Now when I make an HTTP call to the server I get the following output:
c.v.i.utils.AccessLoggingFilter(24) - first Location
c.v.i.utils.AccessLoggingFilter(26) 1 - Second location
And all log prints afterwards do not show the '1' X-ApplicationId.
Can't figure out what I am doing wrong.
Here is an (almost) complete example:
build.sbt:
name := "kamon-play-example"
version := "1.0"
scalaVersion := "2.11.7"
val kamonVersion = "0.6.2"
val resolutionRepos = Seq("Kamon Repository Snapshots" at "http://snapshots.kamon.io")
val dependencies = Seq(
    "io.kamon" %% "kamon-play-25" % kamonVersion,
    "io.kamon" %% "kamon-log-reporter" % kamonVersion
)

lazy val root = (project in file(".")).enablePlugins(PlayScala)
    .settings(resolvers ++= resolutionRepos)
    .settings(libraryDependencies ++= dependencies)
basic filter:
class TraceLocalFilter @Inject() (implicit val mat: Materializer, ec: ExecutionContext) extends Filter {
    val logger = Logger(this.getClass)

    val TraceLocalStorageKey = "MyTraceLocalStorageKey"
    val userAgentHeader = "User-Agent"

    // this value will be available in the MDC whenever Logger.*() is called
    val UserAgentHeaderAvailableToMDC = AvailableToMdc(userAgentHeader)

    override def apply(next: (RequestHeader) ⇒ Future[Result])(header: RequestHeader): Future[Result] = {
        def onResult(result: Result) = {
            val traceLocalContainer = TraceLocal.retrieve(TraceLocalKey).getOrElse(TraceLocalContainer("unknown", "unknown"))
            result.withHeaders(TraceLocalStorageKey -> traceLocalContainer.traceToken)
        }

        // update the TraceLocalStorage
        TraceLocal.store(TraceLocalKey)(TraceLocalContainer(header.headers.get(TraceLocalStorageKey).getOrElse("unknown"), "unknown"))
        TraceLocal.store(UserAgentHeaderAvailableToMDC)(header.headers.get(userAgentHeader).getOrElse("unknown"))

        // call the action
        next(header).map(onResult)
    }
}
we need to add the filter:
class Filters @Inject() (traceLocalFilter: TraceLocalFilter) extends HttpFilters {
    val filters = Seq(traceLocalFilter)
}
a really simple controller and action:
class KamonPlayExample @Inject() (kamon: Kamon) extends Controller {
    def sayHello = Action.async {
        Future {
            logger.info("Say hello to Kamon")
            Ok("Say hello to Kamon")
        }
    }
}
in logback.xml add the following pattern:
<pattern>%date{HH:mm:ss.SSS} %-5level [%traceToken][%X{User-Agent}] [%thread] %logger{55} - %msg%n</pattern>
add the sbt-aspectj-runner plugin in order to run the application with the AspectJ weaver in DEV mode:
addSbtPlugin("io.kamon" % "aspectj-play-runner" % "0.1.3")
run the application with aspectj-runner:run and make some curls:
curl -i -H 'X-Trace-Token:kamon-test' -H 'User-Agent:Super-User-Agent' -X GET "http://localhost:9000/helloKamon"
curl -i -H 'X-Trace-Token:kamon-test' -X GET "http://localhost:9000/helloKamon"
in the console:
15:09:16.027 INFO [kamon-test][Super-User-Agent] [application-akka.actor.default-dispatcher-8] controllers.KamonPlayExample - Say hello to Kamon
15:09:24.034 INFO [kamon-test][curl/7.47.1] [application-akka.actor.default-dispatcher-8] controllers.KamonPlayExample - Say hello to Kamon
Hope this helps.

ValidationException on Update: Validation error whilst flushing entity on AbstractPersistenceEventListener

In my environment, I have grails.gorm.failOnError = true in Config.groovy.
package org.example
class Book {
    String title
    String author
    String email

    static constraints = {
        title nullable: false, blank: false
        email nullable: false, blank: false, unique: true //apparently this is the problem..
    }
}
And in the controller I have:
package org.example
class BookController {

    def update() {
        def bookInstance = Book.get(params.id)
        if (!bookInstance) {
            flash.message = message(code: 'default.not.found.message', args: [message(code: 'book.label', default: 'Book'), params.id])
            redirect(action: "list")
            return
        }

        if (params.version) {
            def version = params.version.toLong()
            if (bookInstance.version > version) {
                bookInstance.errors.rejectValue("version", "default.optimistic.locking.failure",
                    [message(code: 'book.label', default: 'Book')] as Object[],
                    "Another user has updated this Book while you were editing")
                render(view: "edit", model: [bookInstance: bookInstance])
                return
            }
        }

        bookInstance.properties = params
        bookInstance.validate()

        if (bookInstance.hasErrors()) {
            render(view: "edit", model: [bookInstance: bookInstance])
        } else {
            bookInstance.save(flush: true)
            flash.message = message(code: 'default.updated.message', args: [message(code: 'book.label', default: 'Book'), bookInstance.id])
            redirect(action: "show", id: bookInstance.id)
        }
    }
}
Saving works fine. But when updating without setting the title field, I get:
Message: Validation error whilst flushing entity [org.example.Book]:
- Field error in object 'org.example.Book' on field 'title': rejected value []; codes [org.example.Book.title.blank.error.org.example.Book.title,org.example.Book.title.blank.error.title,org.example.Book.title.blank.error.java.lang.String,org.example.Book.title.blank.error,book.title.blank.error.org.example.Book.title,book.title.blank.error.title,book.title.blank.error.java.lang.String,book.title.blank.error,org.example.Book.title.blank.org.example.Book.title,org.example.Book.title.blank.title,org.example.Book.title.blank.java.lang.String,org.example.Book.title.blank,book.title.blank.org.example.Book.title,book.title.blank.title,book.title.blank.java.lang.String,book.title.blank,blank.org.example.Book.title,blank.title,blank.java.lang.String,blank]; arguments [title,class org.example.Book]; default message [Property [{0}] of class [{1}] cannot be blank]
Line | Method
->> 46 | onApplicationEvent in org.grails.datastore.mapping.engine.event.AbstractPersistenceEventListener
- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
| 895 | runTask in java.util.concurrent.ThreadPoolExecutor$Worker
| 918 | run . . . . . . . in ''
^ 680 | run in java.lang.Thread
As I understand it, the problem occurs when Hibernate flushes the session: Hibernate tries to save the object again, and then the exception is thrown...
When the object is saved again, book.validate() is called again, which runs a new query against the database to check the uniqueness of the email field. At that moment the ValidationException is thrown.
But when I remove the unique constraint from the email property, the update is performed normally.
My question is: is this behavior correct? Does Hibernate call book.save() automatically?
This is the sample project, and the steps to reproduce the error are:
source: https://github.com/roalcantara/grails_app_validation_exception
grails run-app
navigate to http://localhost:8080/book/book/create
create a new instance, filling in all fields
then edit this instance at http://localhost:8080/book/book/edit/1
finally, clear the 'Title' field and click Update; the exception is thrown
In my environment this behavior occurs on Grails versions 2.0.3 and 2.2.1.
Thanks for any help! And sorry for my poor English.
You are essentially validating twice, first with:
bookInstance.validate()
and second with:
bookInstance.save(flush: true)
When you call bookInstance.save(flush: true), the instance is returned on success and null is returned when validation fails, so the result can be used directly in a boolean test. Grails takes advantage of this in the controllers it generates by default, but it appears you have changed the generated controller for some reason.
Just replace this:
bookInstance.validate()
if (bookInstance.hasErrors()) {
    render(view: "edit", model: [bookInstance: bookInstance])
} else {
    bookInstance.save(flush: true)
    flash.message = message(code: 'default.updated.message', args: [message(code: 'book.label', default: 'Book'), bookInstance.id])
    redirect(action: "show", id: bookInstance.id)
}
With this:
if (!bookInstance.save(flush: true)) {
    render(view: "edit", model: [bookInstance: bookInstance])
    return
}
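One caveat, given the grails.gorm.failOnError = true setting from the question: with that flag on, save() throws a ValidationException instead of returning null, so the boolean-style check only works if the flag is overridden for this call. A hedged sketch:
if (!bookInstance.save(flush: true, failOnError: false)) {
    // validation failed; the errors are on bookInstance.errors rather than in an exception
    render(view: "edit", model: [bookInstance: bookInstance])
    return
}
flash.message = message(code: 'default.updated.message', args: [message(code: 'book.label', default: 'Book'), bookInstance.id])
redirect(action: "show", id: bookInstance.id)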
