Background: I have several months' experience using Gremlin and Faunus, including the ScriptMap step.
Problem: User-defined Gremlin steps work fine when loaded in the shell as part of a script. However, the same steps apparently have no effect when defined in a Faunus ScriptMap script.
/***********Faunus Driver*************/
//usage: gremlin -e <this file>  NOTE: to run in the Gremlin REPL, remove .submit() at the end of the pipe
import java.io.Console
//get args
console = System.console()
mapperpath=console.readLine('> <map script path>: ')
refns=console.readLine('> <reference namespace>: ')
refinterestkey=console.readLine('> <interest field>: ')
//currently not in use
refinterestval=console.readLine('> <interest value>: ')
mainpropkey=console.readLine('> <main field>: ')
delim=console.readLine('> <main delimiter>: ')
args=[]
args[0]=refns
args[1]=refinterestkey
args[2]=refinterestval
args[3]=mainpropkey
args[4]=delim
args=(String[]) args.toArray()
f=FaunusFactory.open('propertyfile')
f.V().filter('{it.getProperty("_namespace")=="streamernamespace" && it.getProperty("_entity")=="selector"}').script(mapperpath, args).submit()
f.shutdown()
/***********Script Mapper*************/
Gremlin.defineStep("findMatch", [Vertex, Pipe],
    {streamer, interestindicator, fieldofinterest, fun ->
        _().has(interestindicator, true).has(fieldofinterest, fun(streamer))
    }
)
Gremlin.defineStep("connectMatch", [Vertex, Pipe], {streamer ->
    // copy and link streaming vertices to matching vertices in the main graph
    _().transform({main -> if(main != null) {
        mylog.info("reference vertex "+main.id
            +" & streaming vertex "+streamer.id+" match on main "+main.getProperty(fieldofinterest));
        clone=g.addVertex(null);
        ElementHelper.copyProperties(streamer, clone);
        clone.setProperty("_namespace", main.getProperty("_namespace"));
        mylog.info("create clone "+clone.id+" in "+clone.getProperty("_namespace"));
        e=g.addEdge(null, main, clone, streamer.getProperty("source"));
        mylog.info("created edge "+e);
        g.commit()
    }})
})
def g
def refns
def refinterestkey
def refinterestval
def mainpropkey
def delim
def normValue
def setup(args) {
refns=args[0]
refinterestkey=args[1]
refinterestval=args[2]
mainpropkey=args[3]
delim=args[4]
normValue = {obj -> seltype=obj.getProperty("type");
    seltypenorm=seltype.trim().toUpperCase();
    desc=obj.getProperty("description");
    if(desc.contains(delim)) {
        selnum=desc.split(delim)[1].trim()
    } else selnum=desc.trim();
    selnorm=seltypenorm.concat(delim).concat(selnum);
    mylog.info("streamer selector ("+seltype+", "+desc+") normalized as "+selnorm);
    return selnorm
}
mylog=java.util.logging.Logger.getLogger("script_map")
mylog.info("configuring connection to reference graph")
conf=new BaseConfiguration()
conf.setProperty("storage.backend", "cassandra")
conf.setProperty("storage.keyspace", "titan")
conf.setProperty("storage.index.index-name", "titan")
conf.setProperty("storage.hostname", "localhost")
g=TitanFactory.open(conf)
isstepsloaded = Gremlin.getStepNames().contains("findMatch") &&
    Gremlin.getStepNames().contains("connectMatch")
mylog.info("custom steps available?: "+isstepsloaded)
}
def map(v, args) {
try{
incoming=g.v(v.id)
mylog.info("current streamer id: "+incoming.id)
if(incoming.getProperty("_entity")=="selector") {
mylog.info("process incoming vertex "+incoming.id)
g.V("_namespace", refns).findMatch(incoming, refinterestkey, mainpropkey, normValue).connectMatch(incoming).iterate()
}
}catch(Exception e) {
mylog.info("map method exception raised");
mylog.severe(e.getMessage())
}
g.commit()
}
def cleanup(args) { g.shutdown()}
I just tested Faunus with user-defined steps over the Graph of the Gods and it seems to work just fine. Here's what I did:
father.groovy
Gremlin.defineStep('father', [Vertex, Pipe], {_().out('father')})
def g
def setup(args) {
conf = new org.apache.commons.configuration.BaseConfiguration()
conf.setProperty('storage.backend', 'cassandrathrift')
conf.setProperty('storage.hostname', '192.168.2.110')
g = com.thinkaurelius.titan.core.TitanFactory.open(conf)
}
def map(v, args) {
u = g.v(v.id)
pipe = u.father().name
if (pipe.hasNext()) u.fathersName = pipe.next()
u.name + "'s father's name is " + u.fathersName
}
def cleanup(args) {
g.shutdown()
}
In Faunus' Gremlin REPL:
gremlin> g.V.has('type','demigod','god').script('father.groovy')
...
==>jupiter's father's name is saturn
==>hercules's father's name is jupiter
==>neptune's father's name is null
==>pluto's father's name is null
If this doesn't help to solve your problem, please provide more details, so we can reproduce the errors you see.
Cheers,
Daniel
The root problem was that I had set an obsolete value for the "storage.index.index-name" property (see the Titan graph configuration under setup()). Disregard the discussion about getOrCreate methods/Blueprints: apparently a broad range of mutations on existing graphs can be achieved at scale using custom Gremlin steps defined inside a script referenced by the Faunus script step, with the Faunus output format NoOpOutputFormat. Lesson learned: instead of configuring Titan graphs inline in the script, distribute a (centrally maintained) graph properties file and reference it when configuring the Titan graph; CDH5 has simplified distributed cache management.
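Following that lesson, the distributed properties file might look something like this (a sketch only: the file name and values are examples, and the obsolete storage.index.index-name key is deliberately left out). The script's setup() would then call TitanFactory.open() on the file path instead of building a BaseConfiguration inline:

```properties
# titan-reference.properties -- centrally maintained graph configuration (example values)
storage.backend=cassandra
storage.hostname=localhost
storage.keyspace=titan
```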
Related
I have been reading as many posts as possible about this topic, but none of them suggests a working solution for me, so I'm throwing it out to the community again:
In a Jenkinsfile pipeline I have
steps {
(...)
sh script: '''
$pkgname #existing var
export report_filename=$pkgname'_report.txt'
(stuff is being written to the $report_filename file...)
'''
}
post {
always {
script {
//want to read the file with name carried by $report_filename
def report = readFile(file: env.report_filename, encoding: 'utf-8').trim()
buildDescription(report)
}
}
}
I can't manage to pass the value of the report_filename bash variable on to the post > always > script section. I tried ${env.report_filename} (with/without single/double quotes), with/without env., and some other crazy things.
What am I doing wrong here?
Thanks.
It may not be quite right, but:
create a variable: def var
use the option returnStdout: true and parse the output: var = sh(script: "echo <existing var>", returnStdout: true).split("\n")
use var[0] in the stage: readFile(file: var[0]...)
If you can use env, add:
environment {
VAR = sh(script: "echo <existing var>", returnStdout: true).split("\n")[0]
}
script {
//want to read the file with name carried by $report_filename
def report = readFile(file: env.VAR, encoding: 'utf-8').trim()
buildDescription(report)
}
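One more workaround along those lines (just a sketch; the marker file name .report_filename and the surrounding stage syntax are illustrative, not from the original pipeline): have the sh step write the computed name into a small file in the workspace, then read that file back in post, where readFile is available:

```groovy
steps {
    sh '''
        report_filename="${pkgname}_report.txt"
        # ... stuff is written to "$report_filename" here ...
        # persist the name so the Groovy side of the pipeline can recover it
        echo "$report_filename" > .report_filename
    '''
}
post {
    always {
        script {
            // read back the name the shell step computed
            def reportName = readFile('.report_filename').trim()
            def report = readFile(file: reportName, encoding: 'utf-8').trim()
            buildDescription(report)
        }
    }
}
```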
I don't see why you don't simply declare the variables in Groovy right at the start.
I'm not too familiar with the language, and don't currently have a way to test this; but something like this:
def pkgname = "gunk"
def report_filename = "${pkgname}_report.txt"
steps {
(...)
sh script: """
# use triple double quotes so that Groovy variables are interpolated
# $pkgname #syntax error, take it out
(stuff is being written to the $report_filename file...)
"""
}
post {
always {
script {
//want to read the file with name carried by $report_filename
def report = readFile(file: env.report_filename, encoding: 'utf-8').trim()
buildDescription(report)
}
}
}
I've written some code with Akka Streams and Alpakka that reads from Amazon SQS and indexes the events in Elasticsearch. Everything works smoothly and the performance is awesome, but I have a problem with index names. I have this code:
class ElasticSearchIndexFlow(restClient: RestClient) {
private val elasticSettings = ElasticsearchSinkSettings(bufferSize = 10)
def flow: Flow[IncomingMessage[DomainEvent, NotUsed], Seq[IncomingMessageResult[DomainEvent, NotUsed]], NotUsed] =
ElasticsearchFlow.create[DomainEvent](index, "domain-event", elasticSettings)(
restClient,
DomainEventMarshaller.domainEventWrites
)
private def index = {
val now = DateTime.now()
s"de-${now.getYear}.${now.getMonthOfYear}.${now.getDayOfMonth}"
}
}
The problem is that after some days running the flow, the index name does not change. I imagine that Akka Streams creates a fused actor under the hood, and that the index function for computing the index name is only evaluated at the beginning of execution.
Any idea what I can do to index events in ES with an index name that follows the current date?
The solution to the problem is setting the index name in a previous step with IncomingMessage.withIndexName
So:
def flow: Flow[(DomainEvent, Message), IncomingMessage[DomainEvent, Message], NotUsed] =
Flow[(DomainEvent, Message)].map {
case (domainEvent, message) =>
IncomingMessage(Some(domainEvent.eventId), domainEvent, message)
.withIndexName(indexName(domainEvent.ocurredOn))
}
And:
def flow: Flow[IncomingMessage[DomainEvent, NotUsed], Seq[IncomingMessageResult[DomainEvent, NotUsed]], NotUsed] =
ElasticsearchFlow.create[DomainEvent]("this-index-name-is-not-used", "domain-event", elasticSettings)(
restClient,
DomainEventMarshaller.domainEventWrites
)
I am a newbie to Gatling and am trying to read some fields from a CSV and use them in my Gatling scenario, but I am facing a
No attribute name 'CSVFieldName' is defined
issue;
some details:
Gatling Version : bundle-2.2.3
CSV Name : memId.csv
CSV contents :
memid
CKABC123
Scala File contents :
//Class Declaration
{
//some http configuration
val memId_feeder = csv("memId.csv").circular
val scn = scenario("Scn name").during( 10 seconds ) {
feed(memId_feeder)
exec(http("Req_01_Auth")
.post("/auth")
.check(status.is(200))
.headers(header_1)
.formParam("memberId","${memid}"))
}
setup(scn.inject(atOnceUsers(1)).protocols(httpConf))
}
Any help or clue to resolve this issue would be much appreciated.
P.S.: There is no whitespace in the input CSV file.
Oh, I can feel your pain…
It's been a while since I played with Gatling. As far as I remember, you have to provide a "chain" of actions in the scenario definition by chaining the calls.
This all means: placing a dot before exec should do it.
val scn = scenario("Scn name").during( 10 seconds ) {
feed(memId_feeder)
.exec(http("Req_01_Auth")
.post("/auth")
.check(status.is(200))
.headers(header_1)
.formParam("memberId","${memid}"))
}
In my case, adding the dot results in an error.
import com.shutterfly.loadtest.commerce.webcartorch.simulations.AbstractScenarioSimulation
import com.shutterfly.loadtest.siteServices.services.MyProjectsService
import com.shutterfly.loadtest.siteServices.util.{Configuration, HttpConfigs}
import io.gatling.core.Predef._
import com.shutterfly.loadtest.siteServices.services.MyProjectsService._
import io.gatling.http.config.HttpProtocolBuilder
class MetaDataApiSimulation extends Simulation {
def scenarioName = "MetaData Flow. Get All Projects"
def userCount = Configuration.getNumUsers(20)
def rampUpTime = Configuration.getRampUpTime(60)
def httpConf: HttpProtocolBuilder = HttpConfigs.newConfig(Configuration.siteServicesServer.hostname)
def getMetadata = exec(MyProjectsService.getAllProjectsForUser("${userId}"))
def dataFileName = "MetadataSimulationData.csv"
def Photobook_AddToCartDataFile="Photobook_AddToCartData.csv"
def Calendar_AddToCartDataFile="Calendar_AddToCartData.csv"
def dataFileName4="AddToCartData.csv"
def assertions = List(
details("First Assertion").responseTime.percentile3.lessThan(1000)
)
val scn = scenario(scenarioName).during(5) {
.exec().feed(csv(dataFileName).circular).exec(getMetadata)
}
setUp(scn.inject(rampUsers(userCount) over rampUpTime))
.protocols(httpConf)
.assertions(assertions)
}
Is there a way to authorize destinations with Apache Apollo MQ?
What I would like is to make it so that:
1) users may write to a shared topic, but reading is restricted to the server/admin. This topic is used to send messages to the server.
2) users may read from their own private topic, but no one but the server/admin may write to it.
For Example:
Topic              User rights                     Server/Admin rights
/public            Write only                      Read only
/user/foo          ONLY the user foo may read      Write only
/user/bar          ONLY the user bar may read      Write only
/user/<username>   ONLY <username> may read        Write only
Now for the interesting part. This must work with dynamic topics. The user's name is NOT known ahead of time.
I had this working with Apache ActiveMQ using a custom BrokerFilter, but I am not sure how to do this with Apollo.
Thanks for any help.
After a lot of head scratching I figured it out.
In apollo.xml:
<broker xmlns="http://activemq.apache.org/schema/activemq/apollo" security_factory="com.me.MyAuthorizationPlugin">
In com.me.MyAuthorizationPlugin:
package com.me
import org.fusesource.hawtdispatch.DispatchQueue.QueueType
import org.apache.activemq.apollo.broker.security._
import org.apache.activemq.apollo.broker.{ Queue, Broker, VirtualHost }
import java.lang.Boolean
class MyAuthorizationPlugin extends SecurityFactory {
def install(broker: Broker) {
DefaultSecurityFactory.install(broker)
}
def install(virtual_host: VirtualHost) {
DefaultSecurityFactory.install(virtual_host)
val default_authorizer = virtual_host.authorizer
virtual_host.authorizer = new Authorizer() {
def can(ctx: SecurityContext, action: String, resource: SecuredResource): Boolean = {
println("Resource: " + resource.id + " User: " + ctx.user)
resource.resource_kind match {
case SecuredResource.TopicKind =>
val id = resource.id
println("Topic Resource: " + id + " User: " + ctx.user)
var result : Boolean = id.startsWith("user." + ctx.user) || id.startsWith("MDN." + ctx.user + ".")
println("Result: " + result)
return result
case _ =>
return default_authorizer.can(ctx, action, resource)
}
}
}
}
}
The following URLs seemed VERY useful and indeed nearly a perfect match:
https://github.com/apache/activemq-apollo/blob/trunk/apollo-stomp/src/test/resources/apollo-stomp-custom-security.xml#L18
https://github.com/apache/activemq-apollo/blob/trunk/apollo-stomp/src/test/scala/org/apache/activemq/apollo/stomp/test/UserOwnershipSecurityFactory.scala#L29
Now I only need to clean up my nasty scala and put it in Git.
I am thinking of doing two tests:
Speed of EXACTLY what I need
A regex pattern matcher with username/clientID replacements and +/*/?/etc. This pattern will be pulled from the config file.
If they are nearly identical, I may see about adding it to Apollo by contacting the committers.
I found here a very good example of what I want:
Basically to be able to execute a String as a groovy script with an expression, but if the condition is false, I want to show detailed information about why it was evaluated as false.
EDIT
I want a utility method that works like this:
def expression = "model.book.title == \"The Shining\""
def output = magicMethod(expression)
// output.result: the exact result of executing the expression
// output.detail: could be a string telling me why this expression returned true or false, similar to the image
I think it may be a combination of Eval.me + assert, catching the exception in order to get the details
Yeah, it works with assert, thanks for the idea @Justin Piper
here is the snippet:
def model = [model:[book:[title:"The Shinning"]]]
def magicMethod= { String exp ->
def out = [:]
out.result = Eval.x(model,"x.with{${exp}}")
try{
if(out.result){
Eval.x(model,"x.with{assert !(${exp})}")
}else{
Eval.x(model,"x.with{assert ${exp}}")
}
}catch(Throwable e){
out.detail = e.getMessage()
}
return out
}
def expression = "model.book.title == \"The Shining\""
def output = magicMethod(expression)
println "result: ${output.result}"
println "detail: ${output.detail}"