I am trying to generate Slick 3.0 code for Oracle. The DB user points to two schemas that have tables with the same name, so the generated code contains duplicate classes. I would like to filter out the tables belonging to the schema whose name ends in "STAGE".
Here is the code:
object CodeGen2 extends App {
  // https://stackoverflow.com/questions/28285129/slick-code-generation-for-only-a-single-schema
  val slickDriver = "com.typesafe.slick.driver.oracle.OracleDriver"
  val jdbcDriver = "oracle.jdbc.OracleDriver"
  val url = "jdbc:oracle:thin:@dbhost:1521:dbsid"
  val user = "dbuser"
  val password = "dbpassword"
  val destDir = "src/main/scala"
  val destPackage = "com.mycompany.mypackage"

  import scala.concurrent.{ExecutionContext, Await, Future}
  import scala.concurrent.duration.Duration
  import slick.codegen.SourceCodeGenerator
  import scala.concurrent.ExecutionContext.Implicits.global
  import slick.jdbc.JdbcModelBuilder
  import slick.jdbc.meta.MTable
  import com.typesafe.slick.driver.oracle.OracleDriver
  import slick.jdbc.JdbcBackend.DatabaseFactoryDef

  println("Starting codegen...")
  val db = OracleDriver.simple.Database.forURL(url, user = user, password = password, driver = jdbcDriver)

  val filteredTables = OracleDriver.defaultTables.filter(
    (t: MTable) => !t.name.schema.get.endsWith("STAGE")
  )
  val modelAction = OracleDriver.createModel(filteredTables, true)

  println("Generating model...")
  val model = Await.result(db.run(modelAction), Duration.Inf)

  val codegen = new SourceCodeGenerator(model) {
    // for illustration
    val noStage = model.tables.filter { table => !table.name.schema.get.endsWith("STAGE") }
    noStage.foreach { table => println(table.name.schema.get) }
  }

  println("Generating files...")
  codegen.writeToFile(
    slickDriver, destDir, destPackage, "Tables", "Tables.scala"
  )

  // slick.codegen.SourceCodeGenerator.main(
  //   Array(slickDriver, jdbcDriver, url, destDir, destPackage, user, password)
  // )
  println("Finished codegen.")
}
I try to filter defaultTables, but the signature of the predicate is Seq[MTable] => Boolean, so I have no idea how to deal with that. Is filtering the tables passed to driver.createModel the correct approach? I also looked at and tried other code that overrides generator methods, but nothing seemed workable.
Here is my build.sbt:
name := "slick-test"
version := "1.0"
scalaVersion := "2.11.6"
libraryDependencies ++= Seq(
"com.typesafe.slick" %% "slick" % "3.0.0",
"org.slf4j" % "slf4j-nop" % "1.6.4",
"com.typesafe.slick" %% "slick-extensions" % "3.0.0",
"com.typesafe.slick" %% "slick-codegen" % "3.0.0"
)
resolvers += "Typesafe Releases" at "http://repo.typesafe.com/typesafe/maven-releases/"
Thanks.
Try it this way:
val filteredTables = OracleDriver.defaultTables.map(seq => seq.filter(t => !t.name.schema.get.endsWith("STAGE")))
val modelAction = OracleDriver.createModel(Option(filteredTables), true)
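A note on why this works: defaultTables yields a DBIO[Seq[MTable]], so mapping over the action filters the sequence inside it before createModel consumes it. If you want to avoid the unsafe .get on the schema Option, a small variation of the snippet above (same idea, just treating a missing schema as "keep the table") could look like this:

val filteredTables = OracleDriver.defaultTables.map { tables =>
  tables.filterNot(_.name.schema.exists(_.endsWith("STAGE")))
}
val modelAction = OracleDriver.createModel(Some(filteredTables), true)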
Update: after I posted the above, a workaround was found for the Oracle code generation problem; it should be fixed properly in Slick 3.2. Replace the codegen above with this SourceCodeGenerator. See https://github.com/slick/slick/issues/1000
val codegen = new SourceCodeGenerator(model) {
  // slated for fix in 3.2
  override def Table = new Table(_) { table =>
    // override generator responsible for columns
    override def Column = new Column(_) {
      override def defaultCode = v => {
        def raw(v: Any) = rawType match {
          case "String" => "\"" + v + "\""
          case "Long" => v + "L"
          case "Float" => v + "F"
          case "Char" => "'" + v + "'"
          case "scala.math.BigDecimal" => v.toString.trim + "d"
          case "Byte" | "Short" | "Int" | "Double" | "Boolean" => v.toString
        }
        v match {
          case Some(x) => s"Some(${raw(x)})"
          case None => "None"
          case x => raw(x)
        }
      }
    }
  }
}
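For completeness, the pieces are then wired together just as in the original question: build the model from the filtered table action, construct this generator from it, and write the sources out (a sketch reusing the values already defined above):

val model = Await.result(db.run(modelAction), Duration.Inf)
val codegen = new SourceCodeGenerator(model) { /* defaultCode override as above */ }
codegen.writeToFile(slickDriver, destDir, destPackage, "Tables", "Tables.scala")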
I am trying to solve the Reconstruct Itinerary problem (https://leetcode.com/problems/reconstruct-itinerary/) in Scala using a functional approach. The Java solution works but the Scala one doesn't. One reason I found is that the hash map is updated in place, so every iteration sees the latest map (even when popping back out of the recursion), which is weird.
Here is the solution in Java:
import java.util.ArrayList;
import java.util.HashMap;
import java.util.LinkedList;
import java.util.List;
import java.util.Map;
import java.util.PriorityQueue;

public class Solution1 {

    private void dfg(Map<String, PriorityQueue<String>> adj, LinkedList<String> result, String vertex) {
        PriorityQueue<String> pq = adj.get(vertex);
        while (pq != null && !pq.isEmpty()) {
            System.out.println("Before :" + adj.get(vertex));
            String v = pq.poll();
            System.out.println("After :" + adj.get(vertex));
            dfg(adj, result, v);
        }
        result.addFirst(vertex);
    }

    public List<String> findItinerary(List<List<String>> tickets) {
        Map<String, PriorityQueue<String>> adj = new HashMap<>();
        for (List<String> ticket : tickets) {
            adj.putIfAbsent(ticket.get(0), new PriorityQueue<>());
            adj.get(ticket.get(0)).add(ticket.get(1));
        }
        LinkedList<String> result = new LinkedList<>();
        dfg(adj, result, "JFK");
        // not reversed yet
        return result;
    }

    public static void main(String[] args) {
        List<List<String>> tickets = new ArrayList<>();
        List t1 = new ArrayList();
        t1.add("JFK");
        t1.add("SFO");
        tickets.add(t1);
        List t2 = new ArrayList();
        t2.add("JFK");
        t2.add("ATL");
        tickets.add(t2);
        List t3 = new ArrayList();
        t3.add("SFO");
        t3.add("ATL");
        tickets.add(t3);
        List t4 = new ArrayList();
        t4.add("ATL");
        t4.add("JFK");
        tickets.add(t4);
        List t5 = new ArrayList();
        t5.add("ATL");
        t5.add("SFO");
        tickets.add(t5);
        System.out.println();
        Solution1 s1 = new Solution1();
        List<String> finalRes = s1.findItinerary(tickets);
        for (String model : finalRes) {
            System.out.print(model + " ");
        }
    }
}
Here is my solution in Scala which is not working:
package graph

class Itinerary {
}

case class Step(g: Map[String, List[String]], sort: List[String] = List())

object Solution {
  def main(arr: Array[String]) = {
    val tickets = List(List("JFK", "SFO"), List("JFK", "ATL"), List("SFO", "ATL"), List("ATL", "JFK"), List("ATL", "SFO"))
    println(findItinerary(tickets))
  }

  def findItinerary(tickets: List[List[String]]): List[String] = {
    val g = tickets.foldLeft(Map[String, List[String]]())((m, t) => {
      val key = t(0)
      val value = t(1)
      m + (key -> (m.getOrElse(key, Nil) :+ value).sorted)
    })
    println(g)
    // g.keys.foldLeft(Step())((s,n)=> dfs(n,g,s)).sort.toList
    dfs("JFK", Step(g)).sort.toList
  }

  def dfs(vertex: String, step: Step): Step = {
    println("Input vertex " + vertex)
    println("Input map " + step.g)
    val updatedStep = step.g.getOrElse(vertex, Nil).foldLeft(step)((s, n) => {
      // println("Processing "+n+" of vertex "+vertex)
      // delete link
      val newG = step.g + (vertex -> step.g.getOrElse(vertex, Nil).filter(v => v != n))
      // println(newG)
      dfs(n, step.copy(g = newG))
    })
    println("adding vertex to result " + vertex)
    updatedStep.copy(sort = updatedStep.sort :+ vertex)
  }
}
Scala is sometimes approached as a "better" Java, but that's really very limiting. If you can get into the FP mindset, and study the Standard Library, you'll find that it's a whole new world.
def findItinerary(tickets: List[List[String]]): List[String] = {
  def loop(from: String
          ,jump: Map[String, List[String]]
          ,acc: List[String]): List[String] = jump.get(from) match {
    case None => if (jump.isEmpty) from :: acc else Nil
    case Some(next :: Nil) => loop(next, jump - from, from :: acc)
    case Some(nLst) =>
      nLst.view.map { next =>
        loop(next, jump + (from -> (nLst diff next :: Nil)), from :: acc)
      }.find(_.lengthIs > 0).getOrElse(Nil)
  }
  loop("JFK"
      ,tickets.groupMap(_(0))(_(1)).map(kv => kv._1 -> kv._2.sorted)
      ,Nil).reverse
}
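A note and a usage sketch (not part of the original answer): groupMap and lengthIs require Scala 2.13 or newer. With the sample tickets from the question, the call looks like this:

val tickets = List(
  List("JFK", "SFO"), List("JFK", "ATL"), List("SFO", "ATL"),
  List("ATL", "JFK"), List("ATL", "SFO")
)
println(findItinerary(tickets))
// prints List(JFK, ATL, JFK, SFO, ATL, SFO), the expected LeetCode answer for this example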
I am going to be honest: I didn't look through your code to see where the problem was. But I got caught by the problem and decided to give it a go; here is the code (I hope it helps you):
type Airport = String // Refined 3 upper case letters.

final case class AirlineTiket(from: Airport, to: Airport)

object ReconstructItinerary {
  // I am using cats NonEmptyList to improve type safety, but you can easily remove it from the code.
  private final case class State(
      currentAirport: Airport,
      availableDestinations: Map[Airport, NonEmptyList[Airport]],
      solution: List[Airport]
  )

  def apply(tickets: List[AirlineTiket])(start: Airport): Option[List[Airport]] = {
    @annotation.tailrec
    def loop(currentState: State, checkpoints: List[State]): Option[List[Airport]] = {
      if (currentState.availableDestinations.isEmpty) {
        // We used all the tickets, so we can return this solution.
        Some((currentState.currentAirport :: currentState.solution).reverse)
      } else {
        val State(currentAirport, availableDestinations, solution) = currentState
        availableDestinations.get(currentAirport) match {
          case None =>
            // We got into nowhere, lets see if we can return to a previous state...
            checkpoints match {
              case checkpoint :: remaining =>
                // If we can return from there
                loop(currentState = checkpoint, checkpoints = remaining)
              case Nil =>
                // If we can't, then we can say that there is no solution.
                None
            }
          case Some(NonEmptyList(destination, Nil)) =>
            // If from the current airport we can only travel to one destination, we will just follow that.
            loop(
              currentState = State(
                currentAirport = destination,
                availableDestinations - currentAirport,
                currentAirport :: solution
              ),
              checkpoints
            )
          case Some(NonEmptyList(destination, destinations @ head :: tail)) =>
            // If we can travel to more than one destination, we are going to try all in order.
            val newCheckpoints = destinations.map { altDestination =>
              val newDestinations = NonEmptyList(head = destination, tail = destinations.filterNot(_ == altDestination))
              State(
                currentAirport = altDestination,
                availableDestinations.updated(key = currentAirport, value = newDestinations),
                currentAirport :: solution
              )
            }
            loop(
              currentState = State(
                currentAirport = destination,
                availableDestinations.updated(key = currentAirport, value = NonEmptyList(head, tail)),
                currentAirport :: solution
              ),
              newCheckpoints ::: checkpoints
            )
        }
      }
    }

    val availableDestinations = tickets.groupByNel(_.from).view.mapValues(_.map(_.to).sorted).toMap
    loop(
      currentState = State(
        currentAirport = start,
        availableDestinations,
        solution = List.empty
      ),
      checkpoints = List.empty
    )
  }
}
You can see the code running here.
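For anyone trying that snippet: it assumes cats on the classpath (NonEmptyList comes from cats.data, and groupByNel from the cats list syntax, e.g. import cats.implicits._) and Scala 2.13 for .view.mapValues. A small usage sketch with the sample tickets from the question:

val tickets = List(
  AirlineTiket("JFK", "SFO"), AirlineTiket("JFK", "ATL"),
  AirlineTiket("SFO", "ATL"), AirlineTiket("ATL", "JFK"), AirlineTiket("ATL", "SFO")
)
println(ReconstructItinerary(tickets)("JFK"))
// prints Some(List(JFK, ATL, JFK, SFO, ATL, SFO))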
I use Impala (via JDBC) twice in foreachRDD: once to read the Kafka offsets and once to save the data. But Impala and Kudu keep shutting down. Now I want to set up a connection pool, but there is little material on doing that in Scala.
Here is my pseudo-code:
// node-1
val newOffsets = getNewOffset() // JDBC read of the Kafka offsets stored in Kudu
val messages = KafkaUtils.createDirectStream(*, newOffsets, )
messages.foreachRDD(rdd => {
  val spark = SparkSession.builder.config(rdd.sparkContext.getConf).getOrCreate()

  // node-2
  Class.forName(jdbcDriver)
  val con = DriverManager.getConnection("impala url")
  val stmt = con.createStatement()
  stmt.executeUpdate(sql)

  // node-3
  val offsetRanges = rdd.asInstanceOf[HasOffsetRanges].offsetRanges
  offsetRanges.foreach { r => {
    val rt_upsert = s"UPSERT into ${execTable} values('${r.topic}',${r.partition},${r.untilOffset})"
    stmt.executeUpdate(rt_upsert)
    stmt.close()
    con.close()
  }}
})
How can I write this with c3p0 or another connection pool in Scala? I'd appreciate your help.
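One common pattern (not from the original post) is to keep a single pool per JVM by putting the data source in a lazily initialized object: that is the driver process in this pseudo-code, or each executor if the writes move into rdd.foreachPartition. A minimal sketch, shown with HikariCP rather than c3p0; the object name, driver class and URL below are illustrative assumptions:

import java.sql.Connection
import com.zaxxer.hikari.{HikariConfig, HikariDataSource}

// One pool per JVM: the lazy val is initialized on first use and reused by every batch.
object ImpalaPool {
  private lazy val dataSource: HikariDataSource = {
    val config = new HikariConfig()
    config.setDriverClassName("com.cloudera.impala.jdbc41.Driver") // assumed Impala JDBC driver class
    config.setJdbcUrl("jdbc:impala://impala-host:21050")           // placeholder URL
    config.setMaximumPoolSize(8)
    new HikariDataSource(config)
  }

  def withConnection[A](f: Connection => A): A = {
    val con = dataSource.getConnection()
    try f(con)
    finally con.close() // returns the connection to the pool rather than closing it
  }
}

The offset upserts in node-3 would then borrow one connection per batch instead of opening and closing one per record (messages and execTable are the names from the pseudo-code above):

messages.foreachRDD { rdd =>
  val offsetRanges = rdd.asInstanceOf[HasOffsetRanges].offsetRanges
  ImpalaPool.withConnection { con =>
    val stmt = con.createStatement()
    try offsetRanges.foreach { r =>
      stmt.executeUpdate(s"UPSERT INTO ${execTable} VALUES('${r.topic}',${r.partition},${r.untilOffset})")
    } finally stmt.close()
  }
}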
Below is the code for reading data from Kafka and inserting it into Kudu.
import kafka.message.MessageAndMetadata
import kafka.serializer.StringDecoder
import org.apache.kudu.client.KuduClient
import org.apache.kudu.client.KuduSession
import org.apache.kudu.client.KuduTable
import org.apache.spark.SparkConf
import org.apache.spark.SparkContext
import org.apache.spark.streaming.Milliseconds
import org.apache.spark.streaming.StreamingContext
import org.apache.spark.streaming.kafka.KafkaUtils
import scala.collection.immutable.List
import scala.collection.mutable
import scala.util.control.NonFatal
import org.apache.spark.sql.{Row, SQLContext}
import org.apache.spark.sql.types._
import org.apache.kudu.Schema
import org.apache.kudu.Type._
import org.apache.kudu.spark.kudu.KuduContext
import scala.collection.mutable.ArrayBuffer

object KafkaKuduConnect extends Serializable {
  def main(args: Array[String]): Unit = {
    try {
      val TopicName = "TestTopic"
      val kafkaConsumerProps = Map[String, String]("bootstrap.servers" -> "localhost:9092")
      val KuduMaster = ""
      val KuduTable = ""

      val sparkConf = new SparkConf().setAppName("KafkaKuduConnect")
      val sc = new SparkContext(sparkConf)
      val sqlContext = new SQLContext(sc)
      import sqlContext.implicits._

      val ssc = new StreamingContext(sc, Milliseconds(1000))
      val kuduContext = new KuduContext(KuduMaster, sc)
      val kuduclient: KuduClient = new KuduClient.KuduClientBuilder(KuduMaster).build()
      // Opening the table
      val kudutable: KuduTable = kuduclient.openTable(KuduTable)
      // Getting the table schema
      val tableschema: Schema = kudutable.getSchema

      // To create the schema for creating the data frame from the rdd
      def generateStructure(tableSchema: Schema): StructType = {
        var structFieldList: List[StructField] = List()
        for (index <- 0 until tableSchema.getColumnCount) {
          val col = tableSchema.getColumnByIndex(index)
          val coltype = col.getType.toString
          println(coltype)
          col.getType match {
            case INT32 =>
              structFieldList = structFieldList :+ StructField(col.getName, IntegerType)
            case STRING =>
              structFieldList = structFieldList :+ StructField(col.getName, StringType)
            case _ =>
              println("No Class Type Found")
          }
        }
        StructType(structFieldList)
      }

      // To create the Row object with values type cast according to the schema
      def getRow(schema: StructType, Data: List[String]): Row = {
        var RowData = ArrayBuffer[Any]()
        schema.zipWithIndex.foreach(each => {
          val Index = each._2
          each._1.dataType match {
            case IntegerType =>
              // Guard against empty/null values before converting to Int
              if (Data(Index) == null || Data(Index) == "") RowData += null
              else RowData += Data(Index).toInt
            case StringType =>
              RowData += Data(Index)
            case _ =>
              RowData += Data(Index)
          }
        })
        Row.fromSeq(RowData.toList)
      }

      // Creating the schema for the data frame using the table schema
      val FinalTableSchema = generateStructure(tableschema)

      val messages = KafkaUtils.createDirectStream[String, String, StringDecoder, StringDecoder](ssc, kafkaConsumerProps, Set(TopicName))
      messages.foreachRDD(
        // We are looping through each rdd
        eachrdd => {
          // We are creating the Rdd[Row] to create a dataframe with our schema
          val StructuredRdd = eachrdd.map(eachmessage => {
            val record = eachmessage._2
            getRow(FinalTableSchema, record.split(",").toList)
          })
          // DataFrame with the required structure according to the table.
          val DF = sqlContext.createDataFrame(StructuredRdd, FinalTableSchema)
          kuduContext.upsertRows(DF, KuduTable)
        }
      )

      // Start the streaming context and keep the job running
      ssc.start()
      ssc.awaitTermination()
    }
    catch {
      case NonFatal(e) =>
        print("Error in main : " + e)
    }
  }
}
I am writing a macro that needs to create a class that rewrites a trait, with the same methods/arguments as the trait but a different return type.
So say we have:
trait MyTrait[T]
{
  def x(t1: T)(t2: T): T
}
@AnnProxy
class MyClass[T] extends MyTrait[T]
MyClass will be rewritten to:
class MyClass[T] {
  def x(t1: T)(t2: T): R[T]
}
(so x will now return R[T] instead of T)
I wrote the macro, and when debugging it, it produces this code:
Expr[Any](class MyClass[T] extends scala.AnyRef {
  def <init>() = {
    super.<init>();
    ()
  };
  def x(t1: T)(t2: T): macrotests.R[T] = $qmark$qmark$qmark
})
As you can see, the signature seems OK. But when trying to use the macro, I get a compilation error:
val my = new MyClass[Int]
my.x(5)(6)
Error:(14, 7) type mismatch;
found : Int(5)
required: T
x.x(5)(6)
^
So it seems the method's generic T is not the same as the class's [T]. Any ideas how to fix this?
This is my macro so far. I am not any good with macros (I cobbled this together with a lot of help from Stack Overflow), but this is the current state:
#compileTimeOnly("enable macro paradise to expand macro annotations")
class AnnProxy extends StaticAnnotation
{
def macroTransform(annottees: Any*): Any = macro IdentityMacro.impl
}
trait R[T]
object IdentityMacro
{
private val SDKClasses = Set("java.lang.Object", "scala.Any")
def impl(c: whitebox.Context)(annottees: c.Expr[Any]*): c.Expr[Any] = {
import c.universe._
def showInfo(s: String) = c.info(c.enclosingPosition, s.split("\n").mkString("\n |---macro info---\n |", "\n |", ""), true)
val classDef = annottees.map(_.tree).head.asInstanceOf[ClassDef]
val clazz = c.typecheck(classDef).symbol.asClass
val tparams = clazz.typeParams
val baseClasses = clazz.baseClasses.tail.filter(clz => !SDKClasses(clz.fullName))
val methods = baseClasses.flatMap {
base =>
base.info.decls.filter(d => d.isMethod && d.isPublic).map { decl =>
val termName = decl.name.toTermName
val method = decl.asMethod
val params = method.paramLists.map(_.map {
s =>
val vd = internal.valDef(s)
val f = tparams.find(_.name == vd.tpt.symbol.name)
val sym = if (f.nonEmpty) f.get else vd.tpt.symbol
q"val ${vd.name} : $sym "
})
val paramVars = method.paramLists.flatMap(_.map(_.name))
q""" def $termName (...$params)(timeout:scala.concurrent.duration.FiniteDuration) : macrotests.R[${method.returnType}] = {
???
}"""
}
}
val cde = c.Expr[Any] {
q"""
class ${classDef.name} [..${classDef.tparams}] {
..$methods
}
"""
}
showInfo(show(cde))
cde
}
}
EDIT: I managed to work around it by building the class as a string and then using c.parse to compile it. It feels like a hack, but it works. There must be a better way by manipulating the tree.
package macrotests

import scala.annotation.{StaticAnnotation, compileTimeOnly}
import scala.language.experimental.macros
import scala.reflect.macros.whitebox

@compileTimeOnly("enable macro paradise to expand macro annotations")
class AnnProxy extends StaticAnnotation
{
  def macroTransform(annottees: Any*): Any = macro AnnProxyMacro.impl
}

trait R[T]

trait Remote[T]

object AnnProxyMacro
{
  private val SDKClasses = Set("java.lang.Object", "scala.Any")

  def impl(c: whitebox.Context)(annottees: c.Expr[Any]*): c.Expr[Any] = {
    import c.universe._

    val classDef = annottees.map(_.tree).head.asInstanceOf[ClassDef]
    val clazz = c.typecheck(classDef).symbol.asClass
    val baseClasses = clazz.baseClasses.tail.filter(clz => !SDKClasses(clz.fullName))

    val methods = baseClasses.flatMap { base =>
      base.info.decls.filter(d => d.isMethod && d.isPublic).map { decl =>
        val termName = decl.name.toTermName
        val method = decl.asMethod
        val params = method.paramLists.map(_.map { s =>
          val vd = internal.valDef(s)
          val tq = vd.tpt
          s"${vd.name} : $tq"
        })
        val paramVars = method.paramLists.flatMap(_.map(_.name))
        val paramVarsArray = paramVars.mkString("Array(", ",", ")")
        val paramsStr = params.map(_.mkString("(", ",", ")")).mkString(" ")
        val retTpe = method.returnType.typeArgs.mkString("-unexpected-")
        s""" def $termName $paramsStr (timeout: scala.concurrent.duration.FiniteDuration): macrotests.Remote[$retTpe] = {
          println($paramVarsArray.toList)
          new macrotests.Remote[$retTpe] {}
        }"""
      }
    }

    val tparams = clazz.typeParams.map(_.name)
    val tparamsStr = if (tparams.isEmpty) "" else tparams.mkString("[", ",", "]")
    val code =
      s"""
         |class ${classDef.name}$tparamsStr (x: Int) {
         |${methods.mkString("\n")}
         |}
      """.stripMargin
    // print(code)
    val cde = c.Expr[Any](c.parse(code))
    cde
  }
}
The code is very long; you can look at it on GitHub: https://github.com/1178615156/scala-macro-example/blob/master/stackoverflow/src/main/scala/so/AnnotationWithTrait.scala
import scala.annotation.StaticAnnotation
import scala.language.experimental.macros
import scala.reflect.macros.blackbox.Context

/**
 * Created by yu jie shui on 2015/12/2.
 */
class AnnotationWithTrait extends StaticAnnotation {
  def macroTransform(annottees: Any*): Any = macro AnnotationWithTraitImpl.apply
}

class AnnotationWithTraitImpl(val c: Context) {

  import c.universe._

  val SDKClasses = Set("java.lang.Object", "scala.Any")

  def showInfo(s: String) = c.info(c.enclosingPosition, s.split("\n").mkString("\n |---macro info---\n |", "\n |", ""), true)

  def apply(annottees: c.Expr[Any]*) = {
    val classDef = annottees.map(_.tree).head.asInstanceOf[ClassDef]
    val superClassSymbol = c.typecheck(classDef).symbol.asClass.baseClasses.tail
      .filterNot(e => SDKClasses.contains(e.fullName)).reverse
    val superClassTree = classDef match {
      case q"$mod class $name[..$t](..$params) extends ..$superClass { ..$body }" =>
        (superClass: List[Tree]).filterNot(e =>
          typeOf[Object].members.exists(_.name == e.children.head.toString())
        )
    }
    showInfo(show(superClassSymbol))
    showInfo(show(superClassTree))

    val impl = q"private[this] object ${TermName("impl")} extends ..${superClassTree}"

    // collect every callable method of the super classes
    val methods = superClassSymbol.map(_.info.members
      .filterNot(_.isConstructor)
      .filterNot(e => typeOf[Object].members.exists(_.name == e.name)).map(_.asMethod)).toList

    case class ReplaceTypeParams(from: String, to: String)
    type ClassReplace = List[ReplaceTypeParams]

    // trait a[A]
    // class b[B] extends a[B]
    // we need to replace type param A with B
    val classReplaceList: List[ClassReplace] = superClassTree zip superClassSymbol map {
      case (superClassTree, superClassSymbol) =>
        superClassSymbol.asClass.typeParams.map(_.name) zip superClassTree.children.tail map
          (e => ReplaceTypeParams(e._1.toString, e._2.toString()))
    }

    val out = classReplaceList zip methods map {
      case (classReplace, func) =>
        func map { e => {
          val funcName = e.name
          val funcTypeParams = e.typeParams.map(_.name.toString).map(name => {
            TypeDef(Modifiers(Flag.PARAM), TypeName(name), List(), TypeBoundsTree(EmptyTree, EmptyTree))
          })
          val funcParams = e.paramLists.map(_.map(e => q"${e.name.toTermName}:${
            TypeName(
              classReplace.find(_.from == e.info.toString).map(_.to).getOrElse(e.info.toString)
            )} "))
          val funcResultType = TypeName(
            classReplace.find(_.from == e.returnType.toString).map(_.to).getOrElse(e.info.toString)
          )
          q"""
            def ${funcName}[..${funcTypeParams}](...$funcParams): ${funcResultType} =
              impl.${funcName}[..${funcTypeParams}](...$funcParams)
          """
        }
        }
    }
    showInfo(show(out))

    q"""
      class ${classDef.name}[..${classDef.tparams}]{
        $impl
        ..${out.flatten}
      }
    """
  }
}
Test:
trait MyTrait[MT1] {
  def x(t1: MT1)(t2: MT1): MT1 = t1
}

trait MyTrait2[MT2] {
  def t(t2: MT2): MT2 = t2
}

@AnnotationWithTrait
class MyClass[MCT1, MCT2] extends MyTrait[MCT1] with MyTrait2[MCT2]

object AnnotationWithTraitUsing extends App {
  assert(new MyClass[Int, String].x(1)(2) == 1)
  assert(new MyClass[Int, String].t("aaa") == "aaa")
}
This can't be too uncommon: I want to get a "fullPathAndFileName" value that's not in my form into a validator.
In one of my controllers I have a file renaming action:
def renameAction(fullPathAndFileName: String) = Action { implicit request =>
  val newRenameForm = renameForm.bindFromRequest()
  newRenameForm.fold(
    hasErrors = { form =>
      Redirect(routes.TestApp.renderFormAction(fullPathAndFileName)).flashing("error" -> "New filename must be unused and cannot be empty.")
    },
    success = { newFileName =>
      ...
Here's my validator:
private val renameForm: Form[String] = Form("newFileName" -> nonEmptyText.verifying({txt => dupeNotFound(txt)}))
private def dupeNotFound(newFileName: String) = { !Asset.findAsset(replaceFileNameOfAsset(fullPathAndFileName, newFileName)) }
So this code won't compile, because fullPathAndFileName is not defined. How can I let the validator use that value?
Posting as answer instead of just commenting...anyway, this should work if I understand things correctly...
val newRenameForm = renameForm(fullPathAndFileName).bindFromRequest()
And the validator...
private val renameForm: (String) => Form[String] = (fullPathAndFileName: String) => Form("newFileName" -> nonEmptyText.verifying({txt => dupeNotFound(fullPathAndFileName,txt)}))
private def dupeNotFound(fullPathAndFileName: String, newFileName: String) = { !Asset.findAsset(replaceFileNameOfAsset(fullPathAndFileName, newFileName)) }
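To spell out how the pieces fit together (a sketch based on the two snippets above, keeping the names from the question): the form is now a function of the path, so the action builds it per request and fullPathAndFileName is captured by the validator's closure.

def renameAction(fullPathAndFileName: String) = Action { implicit request =>
  renameForm(fullPathAndFileName).bindFromRequest().fold(
    hasErrors = { _ =>
      Redirect(routes.TestApp.renderFormAction(fullPathAndFileName))
        .flashing("error" -> "New filename must be unused and cannot be empty.")
    },
    success = { newFileName =>
      // rename the file here, then redirect (or render) as appropriate
      Redirect(routes.TestApp.renderFormAction(fullPathAndFileName))
    }
  )
}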
I think I'm saving the image to Postgres correctly, but I get unexpected results when trying to load it. I don't really know whether the error is in the save or the load.
Here is my Anorm code for saving the image:
def storeBadgeImage(badgeHandle: String, imgFile: File) = {
  val cmd = """
    |update badge
    |set img={imgBytes}
    |where handle = {badgeHandle}
    """
  var fis = new FileInputStream(imgFile)
  var imgBytes: Array[Byte] = Resource.fromInputStream(fis).byteArray
  // at this point I see the image in my browser if I return the imgBytes in the HTTP response, so I'm good so far
  DB.withConnection { implicit c =>
    {
      try {
        SQL(cmd stripMargin).on("badgeHandle" -> badgeHandle, "imgBytes" -> imgBytes).executeUpdate() match {
          case 0 => "update failed for badge " + badgeHandle + ", image " + imgFile.getCanonicalPath
          case _ => "Update Successful"
        }
      } catch {
        case e: SQLException => e.toString()
      }
    }
  }
}
...I get "update succesful", so I presume the save is working (I could be wrong). Here is my code for loading the image:
def fetchBadgeImage(badgeHandle: String) = {
  val cmd = """
    |select img from badge
    |where handle = {badgeHandle}
    """
  DB.withConnection { implicit c =>
    SQL(cmd stripMargin).on("badgeHandle" -> badgeHandle)().map {
      case Row(image: Array[Byte]) => {
        "image = " + image
      }
      case Row(Some(unknown: Any)) => {
        println(unknown + " unknown type is " + unknown.getClass.getName) // [B#11be1c6 unknown type is [B
        "unknown"
      }
    }
  }
}
...rather than going into the Row(image: Array[Byte]) case as hoped, it goes into the Row(Some(unknown: Any)) case. My println outputs "[B#11be1c6 unknown type is [B". I don't know what type [B is, or where I may have gone wrong...
Regarding "I don't know what type [B is": it's an array of bytes in Java (byte[]). You can also write match { case Row(Some(image: Array[Byte])) => ... } in this case, and that might be better. Or you might be able to do it as follows.
val results: Stream[Array[Byte]] = SQL(cmd stripMargin)
.on("badgeHandle" -> "name")().map { row => row[Array[Byte]]("img") }
...Oops, got the following compile error.
<console>:43: error: could not find implicit value for parameter c: anorm.Column[Array[Byte]]
val res: Stream[Array[Byte]] = SQL(cmd stripMargin).on("badgeHandle" -> "name")().map { row => row[Array[Byte]]("img") }
Unfortunately, scala.Array is not supported by default. If you imitate the way other types are handled, it works:
implicit def rowToByteArray: Column[Array[Byte]] = {
  Column.nonNull[Array[Byte]] { (value, meta) =>
    val MetaDataItem(qualified, nullable, clazz) = meta
    value match {
      case bytes: Array[Byte] => Right(bytes)
      case _ => Left(TypeDoesNotMatch("..."))
    }
  }
}

val results: Stream[Array[Byte]] = SQL(cmd stripMargin)
  .on("badgeHandle" -> "name")().map { row => row[Array[Byte]]("img") }
https://github.com/playframework/Play20/blob/master/framework/src/anorm/src/main/scala/anorm/Anorm.scala
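For completeness, a small sketch (not from the original answer) of the fetch method rewritten to return the raw bytes using that implicit; the column and handle names are the ones from the question:

def fetchBadgeImage(badgeHandle: String): Option[Array[Byte]] = {
  val cmd = """
    |select img from badge
    |where handle = {badgeHandle}
    """
  DB.withConnection { implicit c =>
    SQL(cmd stripMargin)
      .on("badgeHandle" -> badgeHandle)()
      .headOption
      .map(row => row[Array[Byte]]("img")) // relies on the implicit Column[Array[Byte]] above
  }
}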