I am creating a Gatling script to create a new user.
I am passing user-defined values (fName = "newuser", emailDomain = "@perftestorg.com") into the next request.
But the response does not contain the defined values, and the timestamp is not printed either. Please suggest what I am doing wrong here.
import scala.concurrent.duration._
import io.gatling.core.Predef._
import io.gatling.http.Predef._
import io.gatling.jdbc.Predef._
import java.util.Calendar
import java.util.Date
import java.text.SimpleDateFormat
class CreateNewUser extends Simulation {
val fName = "newuser"
val emailDomain = "@perftestorg.com"
val sdfDate = new SimpleDateFormat("ddMMyy.HH.mm.ss.SSS")
val now = new Date()
val timeStamp = sdfDate.format(now)
val httpProtocol = http
.baseURL("https://abcxyz.com")
.acceptHeader("application/json, text/javascript, */*; q=0.01")
.acceptEncodingHeader("gzip, deflate")
.acceptLanguageHeader("en-US")
.userAgentHeader("Mozilla/5.0 (Windows NT 6.1; WOW64; Trident/7.0; rv:11.0) like Gecko")
.disableCaching
.doNotTrackHeader("1")
val scn = scenario("CreateNewUser")
.exec(http("LogIn")
.post("https://" + uri1 + "/login-page")
.headers(headers_0)
.formParam("_58_redirect", "")
.formParam("_58_rememberMe", "false")
.formParam("_58_login", "newuser002#perftestorg.com")
.formParam("_58_password", "scOrpiO")
.exec(http("checkUserAlreadyExist")
.post("/api/checkUserAlreadyExist")
.headers(headers_2)
.formParam("emailAdd", "$fName+timeStamp+$emailDomain"))
.exec(http("CreateNewUser")
.post("/api/UserSubscription")
.headers(headers_2)
.formParam("UserSubscription", """{"userEmail":"$fName+timeStamp+$emailDomain", "fName":"$fName","lName":""""+timeStamp+"""","languageId":"en_GB","subscription_id":"1","contractNumber":"1234567890","emailNotification":true,"orgName":"Performance Testing Organisation"}""")
.formParam("sendLoginCredential", "true"))
setUp(scn.inject(atOnceUsers(1))).protocols(httpProtocol)
}
Your help would be much appreciated.
Thanks,
Praveen
When you reference something as ${fName} in a request, the value has to be stored in the Gatling session; plain Scala vals are not resolved by Gatling's Expression Language, so the literal string is sent as-is.
Use
exec(session => session.set("fName", "newuser")).exec(http...
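For illustration, here is a minimal sketch (Gatling 2.x syntax, matching the .baseURL call above) of how the values defined in the simulation can be pushed into the session and then referenced with ${...} in the checkUserAlreadyExist request; this is only a sketch of the relevant part, not the complete scenario:
// Sketch only: fName, emailDomain and timeStamp are the vals already defined in the simulation.
val scn = scenario("CreateNewUser")
  // put the values into the virtual user's session so Gatling EL can resolve them
  .exec(session => session
    .set("fName", fName)
    .set("emailDomain", emailDomain)
    .set("timeStamp", timeStamp))
  .exec(http("checkUserAlreadyExist")
    .post("/api/checkUserAlreadyExist")
    // Gatling EL needs ${name}; the literal "$fName+timeStamp+$emailDomain" is sent unchanged
    .formParam("emailAdd", "${fName}${timeStamp}${emailDomain}"))
Alternatively, since fName, timeStamp and emailDomain are plain Scala vals, you can skip the session entirely and concatenate in Scala: .formParam("emailAdd", fName + timeStamp + emailDomain).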
Folks, I'm new to all this data streaming, but I was able to build and submit a Flink job that reads some CSV data from Kafka, aggregates it, and then puts it in Elasticsearch.
I was able to do the first two parts and print my aggregation to STDOUT. But when I added the code to write to Elasticsearch, nothing seems to happen there (no data is being added). I looked at the Flink job manager log and it looks fine (no errors) and says:
2020-03-03 16:18:03,877 INFO
org.apache.flink.streaming.connectors.elasticsearch7.Elasticsearch7ApiCallBridge
- Created Elasticsearch RestHighLevelClient connected to [http://elasticsearch-elasticsearch-coordinating-only.default.svc.cluster.local:9200]
Here is my code at this point:
/*
* This Scala source file was generated by the Gradle 'init' task.
*/
package flinkNamePull
import java.time.LocalDateTime
import java.util.Properties
import org.apache.flink.api.common.serialization.SimpleStringSchema
import org.apache.flink.streaming.api.scala._
import org.apache.flink.streaming.connectors.kafka.{FlinkKafkaConsumer010, FlinkKafkaProducer010}
import org.apache.flink.api.common.functions.RichMapFunction
import org.apache.flink.configuration.Configuration
import org.apache.flink.streaming.api.scala.StreamExecutionEnvironment
import org.apache.flink.table.api.{DataTypes, Table}
import org.apache.flink.table.api.scala.StreamTableEnvironment
import org.apache.flink.table.descriptors.{Elasticsearch, Json, Schema}
object Demo {
/**
* MapFunction to generate Transfers POJOs from parsed CSV data.
*/
class TransfersMapper extends RichMapFunction[String, Transfers] {
private var formatter = null
@throws[Exception]
override def open(parameters: Configuration): Unit = {
super.open(parameters)
//formatter = DateTimeFormat.forPattern("yyyy-MM-dd HH:mm:ss")
}
@throws[Exception]
override def map(csvLine: String): Transfers = {
//var splitCsv = csvLine.stripLineEnd.split("\n")(1).split(",")
var splitCsv = csvLine.stripLineEnd.split(",")
val arrLength = splitCsv.length
val i = 0
if (arrLength != 13) {
for (i <- arrLength + 1 to 13) {
if (i == 13) {
splitCsv = splitCsv :+ "0.0"
} else {
splitCsv = splitCsv :+ ""
}
}
}
var trans = new Transfers()
trans.rowId = splitCsv(0)
trans.subjectId = splitCsv(1)
trans.hadmId = splitCsv(2)
trans.icuStayId = splitCsv(3)
trans.dbSource = splitCsv(4)
trans.eventType = splitCsv(5)
trans.prev_careUnit = splitCsv(6)
trans.curr_careUnit = splitCsv(7)
trans.prev_wardId = splitCsv(8)
trans.curr_wardId = splitCsv(9)
trans.inTime = splitCsv(10)
trans.outTime = splitCsv(11)
trans.los = splitCsv(12).toDouble
return trans
}
}
def main(args: Array[String]) {
// Create streaming execution environment
val env = StreamExecutionEnvironment.getExecutionEnvironment
env.setParallelism(1)
// Set properties per KafkaConsumer API
val properties = new Properties()
properties.setProperty("bootstrap.servers", "kafka.kafka:9092")
properties.setProperty("group.id", "test")
// Add Kafka source to environment
val myKConsumer = new FlinkKafkaConsumer010[String]("raw.data3", new SimpleStringSchema(), properties)
// Read from beginning of topic
myKConsumer.setStartFromEarliest()
val streamSource = env
.addSource(myKConsumer)
// Transform each CSV line (with a header row per Kafka event) into a Transfers object
val streamTransfers = streamSource.map(new TransfersMapper())
// create a TableEnvironment
val tEnv = StreamTableEnvironment.create(env)
println("***** NEW EXECUTION STARTED AT " + LocalDateTime.now() + " *****")
// register a Table
val tblTransfers: Table = tEnv.fromDataStream(streamTransfers)
tEnv.createTemporaryView("transfers", tblTransfers)
tEnv.connect(
new Elasticsearch()
.version("7")
.host("elasticsearch-elasticsearch-coordinating-only.default.svc.cluster.local", 9200, "http") // required: one or more Elasticsearch hosts to connect to
.index("transfers-sum")
.documentType("_doc")
.keyNullLiteral("n/a")
)
.withFormat(new Json().jsonSchema("{type: 'object', properties: {curr_careUnit: {type: 'string'}, sum: {type: 'number'}}}"))
.withSchema(new Schema()
.field("curr_careUnit", DataTypes.STRING())
.field("sum", DataTypes.DOUBLE())
)
.inUpsertMode()
.createTemporaryTable("transfersSum")
val result = tEnv.sqlQuery(
"""
|SELECT curr_careUnit, sum(los)
|FROM transfers
|GROUP BY curr_careUnit
|""".stripMargin)
result.insertInto("transfersSum")
// Elasticsearch elasticsearch-elasticsearch-coordinating-only.default.svc.cluster.local:9200
env.execute("Flink Streaming Demo Dump to Elasticsearch")
}
}
I'm not sure how I can debug this beast... Wondering if somebody can help me figure out why the Flink job is not adding data to Elasticsearch :(
From my Flink cluster, I'm able to query Elasticsearch just fine (manually) and add records to my index:
curl -XPOST "http://elasticsearch-elasticsearch-coordinating-only.default.svc.cluster.local:9200/transfers-sum/_doc" -H 'Content-Type: application/json' -d'{"curr_careUnit":"TEST123","sum":"123"}'
A kind soul on the Flink mailing list pointed out that it could be Elasticsearch buffering my records... Well, it was. ;)
I have added the following options to the Elasticsearch connector:
.bulkFlushMaxActions(2)
.bulkFlushInterval(1000L)
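For context, this is where those two options slot into the connector descriptor from the question (only the connect(...) call is shown, with the same host, index and schema as above; the rest of the job is unchanged):
tEnv.connect(
  new Elasticsearch()
    .version("7")
    .host("elasticsearch-elasticsearch-coordinating-only.default.svc.cluster.local", 9200, "http")
    .index("transfers-sum")
    .documentType("_doc")
    .keyNullLiteral("n/a")
    // flush small bulks frequently so rows become visible in Elasticsearch quickly
    .bulkFlushMaxActions(2)
    .bulkFlushInterval(1000L)
)
  .withFormat(new Json().jsonSchema("{type: 'object', properties: {curr_careUnit: {type: 'string'}, sum: {type: 'number'}}}"))
  .withSchema(new Schema()
    .field("curr_careUnit", DataTypes.STRING())
    .field("sum", DataTypes.DOUBLE())
  )
  .inUpsertMode()
  .createTemporaryTable("transfersSum")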
Flink Elasticsearch Connector 7 using Scala
Please find a working and detailed answer which I have provided here.
We are manually setting a custom Message-ID header while sending emails using Java MimeMessage. The Message-ID format follows the RFC 822 standard. However, on sending the mail via the Gmail API, the Message-ID header gets overwritten with a new one from Gmail.
If instead we use JavaMail and send the email via SMTP, the custom Message-ID is retained by Gmail.
Is there a way to have a custom Message-ID while sending email via the Gmail API?
I have checked the following question, but I am not sure if it's still the case. (RFC822 Message-Id in new Gmail API)
[UPDATE]
EmailMimeMessage.scala
package utils.email
import javax.mail._
import javax.mail.internet._
import play.api.Logger
class EmailMimeMessage(session: Session, messageId: String) extends MimeMessage(session) {
@throws(classOf[MessagingException])
override def updateMessageID(): Unit = {
Logger.info(s"[EmailMimeMessage] before sending add message id: $messageId")
setHeader("Message-ID", messageId)
}
}
GmailApiService.scala
package utils.email
import java.io.ByteArrayOutputStream
import java.util.Properties
import javax.mail.Session
import javax.mail.internet.{InternetAddress, MimeMessage}
import com.google.api.client.auth.oauth2.{BearerToken, Credential}
import com.google.api.client.googleapis.javanet.GoogleNetHttpTransport
import com.google.api.client.http.HttpTransport
import com.google.api.client.json.JsonFactory
import com.google.api.client.json.jackson2.JacksonFactory
import com.google.api.client.repackaged.org.apache.commons.codec.binary.Base64
import com.google.api.services.gmail.Gmail
import scala.util.Try
case class EmailToBeSent(
to_email: String,
from_email: String,
from_name: String,
reply_to_email: String,
subject: String,
textBody: String,
htmlBody: String,
message_id: String
)
object GmailApiService {
private val APPLICATION_NAME: String = "Gmail API Java Quickstart"
private val JSON_FACTORY: JsonFactory = JacksonFactory.getDefaultInstance
private val HTTP_TRANSPORT: HttpTransport = GoogleNetHttpTransport.newTrustedTransport()
def createEmail(emailToBeSent: EmailToBeSent): Try[MimeMessage] = Try {
val props = new Properties()
val session = Session.getDefaultInstance(props, null)
val email = new EmailMimeMessage(session, emailToBeSent.message_id)
email.setFrom(new InternetAddress(emailToBeSent.from_email))
email.addRecipient(javax.mail.Message.RecipientType.TO, new InternetAddress(emailToBeSent.to_email))
email.setSubject(emailToBeSent.subject)
email.setText(emailToBeSent.textBody)
email
}
def createMessageWithEmail(email: MimeMessage) = Try {
val baos = new ByteArrayOutputStream
email.writeTo(baos)
val encodedEmail = Base64.encodeBase64URLSafeString(baos.toByteArray)
val message = new com.google.api.services.gmail.model.Message()
message.setRaw(encodedEmail)
message
}
def sendGmailService(emailToBeSent: EmailToBeSent, accessToken: String) = Try {
val credential = new Credential(BearerToken.authorizationHeaderAccessMethod)
.setAccessToken(accessToken)
val service = new Gmail.Builder(HTTP_TRANSPORT, JSON_FACTORY, credential).setApplicationName(APPLICATION_NAME).build
val user = "me"
val message = createEmail(emailToBeSent) flatMap { email => createMessageWithEmail(email) }
val sentMessage = service.users().messages().send(user, message.get).execute()
sentMessage
}
}
On calling GmailApiService.sendGmailService as follows (with Message-ID: "<1495728783999.123.456.local@examplegmail.com>"), in the sent email the Message-ID is overwritten by Gmail with something like "YYfdasCAN=-fdas432HFD43FD_THD@mail.gmail.com":
val emailToBeSent = EmailToBeSent(
to_email = "mary_to#gmail.com",
from_email = "john_from#examplegmail.com",
from_name = "John Doe",
reply_to_email = "john_from#gmail.com",
subject = "How are you ?",
textBody = "Hey, how are you ?",
htmlBody = "<strong>Hey, how are you ?</strong>",
message_id = "<1495728783999.123.456.local@examplegmail.com>",
in_reply_to_id = None,
sender_email_settings_id = 0
)
val sentMsg = GmailApiService.sendGmailService(emailToBeSent, GOOGLE_OAUTH_ACCESS_TOKEN).get
My answer from 2014 is still correct: RFC822 Message-Id in new Gmail API
Gmail always sets the RFC822 Message-Id header on outgoing emails.
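If what you actually need is the Message-ID that Gmail assigned (for example, for threading), one workaround is to read it back from the sent message's metadata. The sketch below is only an illustration: fetchAssignedMessageId is a hypothetical helper, service is the Gmail client built in sendGmailService above, and sentMessageId is the id returned by messages().send(...).execute().getId:
import scala.collection.JavaConverters._
import com.google.api.services.gmail.Gmail

// Hypothetical helper: look up the Message-ID header Gmail put on a sent message.
def fetchAssignedMessageId(service: Gmail, sentMessageId: String): Option[String] = {
  val sent = service.users().messages()
    .get("me", sentMessageId)
    .setFormat("metadata")
    .setMetadataHeaders(java.util.Collections.singletonList("Message-ID"))
    .execute()
  Option(sent.getPayload)
    .flatMap(part => part.getHeaders.asScala.find(_.getName.equalsIgnoreCase("Message-ID")))
    .map(_.getValue)
}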
I am new to Spark and Scala. I am trying to run an example I found on Google, but I encounter the following exception when running the program.
The exception is:
17/05/25 11:13:42 ERROR ReceiverTracker: Deregistered receiver for stream 0: Restarting receiver with delay 2000ms: Error starting Twitter stream - java.lang.IllegalStateException: Authentication credentials are missing.
Code that I am executing is as follows:
PrintTweets.scala
package example
import org.apache.spark._
import org.apache.spark.SparkContext._
import org.apache.spark.streaming._
import org.apache.spark.streaming.twitter._
import org.apache.spark.streaming.StreamingContext._
import org.apache.log4j.Level
import Utilities._
object PrintTweets {
def main(args: Array[String]) {
// Configure Twitter credentials using twitter.txt
setupTwitter()
val appName = "TwitterData"
val conf = new SparkConf()
conf.setAppName(appName).setMaster("local[3]")
val ssc = new StreamingContext(conf, Seconds(5))
//val ssc = new StreamingContext("local[*]", "PrintTweets", Seconds(10))
setupLogging()
// Create a DStream from Twitter using our streaming context
val tweets = TwitterUtils.createStream(ssc, None)
// Now extract the text of each status update into RDD's using map()
val statuses = tweets.map(status => status.getText())
statuses.print()
ssc.start()
ssc.awaitTermination()
}
}
Utilities.scala
package example
import org.apache.log4j.Level
import java.util.regex.Pattern
import java.util.regex.Matcher
object Utilities {
/** Makes sure only ERROR messages get logged to avoid log spam. */
def setupLogging() = {
import org.apache.log4j.{Level, Logger}
val rootLogger = Logger.getRootLogger()
rootLogger.setLevel(Level.ERROR)
}
/** Configures Twitter service credentials using twitter.txt in the main workspace directory */
def setupTwitter() = {
import scala.io.Source
for (line <- Source.fromFile("../twitter.txt").getLines) {
val fields = line.split(" ")
if (fields.length == 2) {
System.setProperty("twitter4j.oauth." + fields(0), fields(1))
}
}
}
/** Retrieves a regex Pattern for parsing Apache access logs. */
def apacheLogPattern():Pattern = {
val ddd = "\\d{1,3}"
val ip = s"($ddd\\.$ddd\\.$ddd\\.$ddd)?"
val client = "(\\S+)"
val user = "(\\S+)"
val dateTime = "(\\[.+?\\])"
val request = "\"(.*?)\""
val status = "(\\d{3})"
val bytes = "(\\S+)"
val referer = "\"(.*?)\""
val agent = "\"(.*?)\""
val regex = s"$ip $client $user $dateTime $request $status $bytes $referer $agent"
Pattern.compile(regex)
}
}
When I check using print statements, I find the exception is happening at this line:
val tweets = TwitterUtils.createStream(ssc, None)
I am providing the credentials in the twitter.txt file, which is read properly by the program. When I don't place twitter.txt in the appropriate directory it shows an explicit error, and it shows an explicit "unauthorized access" error when I give blank keys for the consumer key, consumer secret, etc. in twitter.txt.
If you need more details about the error or the software versions, let me know.
Thanks,
Madhu.
I could reproduce the issue with your code. I believe this is your problem:
you might not have configured twitter.txt properly. Your twitter.txt file should look like this:
consumerKey your_consumerKey
consumerSecret your_consumerSecret
accessToken your_accessToken
accessTokenSecret your_accessTokenSecret
I hope it helps.
After changing the twitter.txt file syntax to the following (a single space between key and value), it worked; a more tolerant parsing variant is sketched after this list:
consumerKey your_consumerKey
consumerSecret your_consumerSecret
accessToken your_accessToken
accessTokenSecret your_accessTokenSecret
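For completeness, a slightly more forgiving variant of setupTwitter (a sketch, not part of the original course utilities) could split the key and value on any run of whitespace instead of a single space:
import scala.io.Source

/** Sketch: like setupTwitter, but tolerates tabs or multiple spaces between key and value. */
def setupTwitterLenient(path: String = "../twitter.txt"): Unit = {
  for (line <- Source.fromFile(path).getLines if line.trim.nonEmpty) {
    val fields = line.trim.split("\\s+", 2)
    if (fields.length == 2) {
      System.setProperty("twitter4j.oauth." + fields(0), fields(1).trim)
    }
  }
}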
I have an integration test in my Grails 3.2.2 application that is supposed to check that CORS support is operational. When I start the application and use something like Paw or Postman to make a request, the breakpoint I have set in CorsFilter shows that my headers are set properly. But when I make the same request from an integration test using RestBuilder with the following code:
void "Test request http OPTIONS"() {
given: "JSON content request"
when: "OPTIONS are requested"
def rest = new RestBuilder()
def optionsUrl = url(path)
def resp = rest.options(optionsUrl) {
header 'Origin', 'http://localhost:4200'
header 'Access-Control-Request-Method', 'GET'
}
then: "they are returned"
resp.status == HttpStatus.SC_OK
!resp.json
}
The breakpoint in CorsFilter shows that both headers are null.
And the weird thing is that when I put a breakpoint in RestTemplate, right before the request is executed, the headers are there.
I don't get how those headers can disappear. Any ideas?
I was working on this problem recently, and while I don't know where RestBuilder is suppressing the Origin header, I did come up with a workaround for testing that Grails' CORS support is operating as configured: using HTTPBuilder instead of RestBuilder to invoke the service.
After adding org.codehaus.groovy.modules.http-builder:http-builder:0.7.1 as a testCompile dependency in build.gradle, and with grails.cors.allowedOrigins set to http://localhost, the following tests both worked as desired:
import geb.spock.GebSpec
import grails.test.mixin.integration.Integration
import groovyx.net.http.HTTPBuilder
import groovyx.net.http.HttpResponseException
import groovyx.net.http.Method
@Integration
class ExampleSpec extends GebSpec {
def 'verify that explicit, allowed origin works'() {
when:
def http = new HTTPBuilder("http://localhost:${serverPort}/todo/1")
def result = http.request(Method.GET, "application/json") { req ->
headers.'Origin' = "http://localhost"
}
then:
result.id == 1
result.name == "task 1.1"
}
def 'verify that explicit, disallowed origin is disallowed'() {
when:
def http = new HTTPBuilder("http://localhost:${serverPort}/todo/1")
http.request(Method.GET, "application/json") { req ->
headers.'Origin' = "http://foobar.com"
}
then:
HttpResponseException e = thrown()
e.statusCode == 403
}
}
I had the same problem. After some research I found http://hc.apache.org/; it supports sending the 'Origin' header and OPTIONS requests.
import grails.test.mixin.integration.Integration
import grails.transaction.Rollback
import groovy.util.logging.Slf4j
import org.apache.http.client.HttpClient
import org.apache.http.client.methods.HttpOptions
import org.apache.http.impl.client.MinimalHttpClient
import org.apache.http.impl.conn.BasicHttpClientConnectionManager
import spock.lang.Specification
@Integration
@Rollback
@Slf4j
class CorsIntegrationSpec extends Specification {
def 'call with origin'() {
when:
def response = call(["Origin":"test","Content-Type":"application/json"])
then:
response != null
response.getStatusLine().getStatusCode() == 200
response.containsHeader("Access-Control-Allow-Origin")
response.containsHeader("Access-Control-Allow-Credentials")
response.containsHeader("Access-Control-Allow-Headers")
response.containsHeader("Access-Control-Allow-Methods")
response.containsHeader("Access-Control-Max-Age")
}
private call (Map<String, String> headers) {
HttpOptions httpOptions = new HttpOptions("http://localhost:${serverPort}/authz/token")
headers.each { k,v ->
httpOptions.setHeader(k,v)
}
BasicHttpClientConnectionManager manager = new BasicHttpClientConnectionManager()
HttpClient client = new MinimalHttpClient(manager)
return client.execute(httpOptions)
}
}
I am trying to create a connection, but an error is displayed. Below is the BeanShell sampler code.
import org.jivesoftware.smack.ConnectionConfiguration;
import org.jivesoftware.smack.ConnectionListener;
import org.jivesoftware.smack.tcp.XMPPTCPConnection;
import org.jivesoftware.smack.tcp.XMPPTCPConnectionConfiguration;
import org.jivesoftware.smack.SASLAuthentication;
import org.jivesoftware.smack.SmackException;
import org.jivesoftware.smack.XMPPException;
import org.jivesoftware.smack.XMPPException.XMPPErrorException;
String jabberId = "admin";
String jabberPass = "12345";
String SERVER_ADDRESS = "xxx.xxx.xxx.xxx";
int PORT = 5222; // or any other port
String ServiceName = "Smack";
SASLAuthentication.blacklistSASLMechanism("DIGEST-MD5");
SASLAuthentication.unBlacklistSASLMechanism("PLAIN");
XMPPTCPConnectionConfiguration config = XMPPTCPConnectionConfiguration.builder()
.setCompressionEnabled(false)
.setHost(SERVER_ADDRESS)
.setServiceName(ServiceName)
.setPort(DEFAULT_PORT)
.setSecurityMode(ConnectionConfiguration.SecurityMode.disabled)
.setSendPresence(true)
.setDebuggerEnabled(true)
.build();
XMPPTCPConnection con = new XMPPTCPConnection(config);
int REPLY_TIMEOUT = 50000; // 50 seconds, but can be shorter
con.setPacketReplyTimeout(REPLY_TIMEOUT);
//con = getConnection();
con.connect();
//con.login(jabberId,jabberPass);
Below is the error:
Response code: 500 Response message:
org.apache.jorphan.util.JMeterException: Error invoking bsh method: eval
In file: inline evaluation of: ``public XMPPConnection doConnect(String userName,String password) { XMPPConne . . . '' Encountered "( xxx.xxx .xxx" at line 7, column 60.
Please tell me what's wrong with it, or give me the correct code to connect JMeter to an XMPP server.