Unable to perform click on Katalon Studio - xpath

Problem
I am new to Katalon and automation testing. I have tried everything but am still unable to perform a click on a simple button.
My script doesn't fail either, but the click is never actually performed on the page.
Webpage
https://www.ratesupermarket.ca/term_life_insurance
HTML
<button class="cta-primary" id="submit" type="submit"> Get Quotes <i class="icon-entity" aria-hidden="true"></i>
</button>
xpath
//button[@id='submit']
Script-Code
WebUI.openBrowser('')
'go to URL'
WebUI.navigateToUrl('https://www.ratesupermarket.ca/term_life_insurance')
'Enter Post code - Go to next Page'
WebUI.setText(findTestObject('Object Repository/Rates Page 1/Page_Life Insurance Comparison/input_postal_code'), 'M12W22')
WebUI.click(findTestObject('Object Repository/Rates Page 1/Page_Life Insurance Comparison/button_Get Quotes'))
Console Log
09-19-2018 12:58:17 PM - [START] - Start Test Case : Test Cases/Perform click and Get Quote
09-19-2018 12:58:17 PM - [INFO] - Evaluating variables for test case
09-19-2018 12:58:18 PM - [START] - Start action : openBrowser
09-19-2018 12:58:18 PM - [INFO] - Opening browser
09-19-2018 12:58:18 PM - [INFO] - Starting 'IE' driver
09-19-2018 12:58:18 PM - [INFO] - Action delay is set to 0 seconds
Started InternetExplorerDriver server (32-bit)
3.6.0.0
Listening on port 2893
Log level is set to TRACE
Log file is set to C:\Users\cnawork\AppData\Local\Temp\Katalon\Test Cases\Perform click and Get Quote\20180919_125813\IEDriverServer.log
Only local connections are allowed
Sep 19, 2018 12:58:21 PM org.openqa.selenium.remote.ProtocolHandshake createSession
INFO: Detected dialect: W3C
09-19-2018 12:58:21 PM - [RUN_DATA] - Logging run data 'sessionId' with value '50bd252d-7549-4b1a-9a8c-d9fa80318749'
09-19-2018 12:58:21 PM - [RUN_DATA] - Logging run data 'browser' with value 'IE 11'
09-19-2018 12:58:21 PM - [RUN_DATA] - Logging run data 'platform' with value 'Windows 8.1'
09-19-2018 12:58:21 PM - [RUN_DATA] - Logging run data 'seleniumVersion' with value '3.7.1'
09-19-2018 12:58:21 PM - [RUN_DATA] - Logging run data 'proxyInformation' with value 'ProxyInformation{proxyOption=NO_PROXY, proxyServerType=HTTP, password=, proxyServerAddress=, proxyServerPort=0}'
09-19-2018 12:58:21 PM - [PASSED] - Browser is opened with url: ''
09-19-2018 12:58:21 PM - [END] - End action : openBrowser
09-19-2018 12:58:21 PM - [START] - Start action : navigateToUrl
09-19-2018 12:58:21 PM - [INFO] - Checking url
09-19-2018 12:58:21 PM - [INFO] - Navigating to 'https://www.ratesupermarket.ca/term_life_insurance'
09-19-2018 12:58:27 PM - [PASSED] - Navigate to 'https://www.ratesupermarket.ca/term_life_insurance' successfully
09-19-2018 12:58:27 PM - [END] - End action : navigateToUrl
09-19-2018 12:58:27 PM - [START] - Start action : setText
09-19-2018 12:58:27 PM - [INFO] - Finding Test Object with id 'Object Repository/Rates Page 1/Page_Life Insurance Comparison/input_postal_code'
09-19-2018 12:58:27 PM - [INFO] - Checking object
09-19-2018 12:58:27 PM - [INFO] - Checking text
09-19-2018 12:58:27 PM - [INFO] - Checking timeout
09-19-2018 12:58:27 PM - [INFO] - Finding web element with id: 'Object Repository/Rates Page 1/Page_Life Insurance Comparison/input_postal_code' located by 'By.xpath: //input[@id='postal_code']' in '30' second(s)
09-19-2018 12:58:27 PM - [INFO] - Found 1 web elements with id: 'Object Repository/Rates Page 1/Page_Life Insurance Comparison/input_postal_code' located by 'By.xpath: //input[@id='postal_code']' in '30' second(s)
09-19-2018 12:58:27 PM - [INFO] - Clearing text of object 'Object Repository/Rates Page 1/Page_Life Insurance Comparison/input_postal_code'
09-19-2018 12:58:27 PM - [INFO] - Checking timeout
09-19-2018 12:58:27 PM - [INFO] - Finding web element with id: 'Object Repository/Rates Page 1/Page_Life Insurance Comparison/input_postal_code' located by 'By.xpath: //input[@id='postal_code']' in '30' second(s)
09-19-2018 12:58:27 PM - [INFO] - Found 1 web elements with id: 'Object Repository/Rates Page 1/Page_Life Insurance Comparison/input_postal_code' located by 'By.xpath: //input[@id='postal_code']' in '30' second(s)
09-19-2018 12:58:27 PM - [INFO] - Setting text of object 'Object Repository/Rates Page 1/Page_Life Insurance Comparison/input_postal_code' to value 'M12W22'
09-19-2018 12:58:27 PM - [PASSED] - Text 'M12W22' is set on object 'Object Repository/Rates Page 1/Page_Life Insurance Comparison/input_postal_code'
09-19-2018 12:58:27 PM - [END] - End action : setText
09-19-2018 12:58:27 PM - [START] - Start action : click
09-19-2018 12:58:27 PM - [INFO] - Finding Test Object with id 'Object Repository/Rates Page 1/Page_Life Insurance Comparison/button_Get Quotes'
09-19-2018 12:58:27 PM - [INFO] - Checking object
09-19-2018 12:58:27 PM - [INFO] - Checking timeout
09-19-2018 12:58:27 PM - [INFO] - Finding web element with id: 'Object Repository/Rates Page 1/Page_Life Insurance Comparison/button_Get Quotes' located by 'By.xpath: //button[@id='submit']' in '30' second(s)
09-19-2018 12:58:28 PM - [INFO] - Found 1 web elements with id: 'Object Repository/Rates Page 1/Page_Life Insurance Comparison/button_Get Quotes' located by 'By.xpath: //button[@id='submit']' in '30' second(s)
09-19-2018 12:58:28 PM - [INFO] - Clicking on object: 'Object Repository/Rates Page 1/Page_Life Insurance Comparison/button_Get Quotes'
09-19-2018 12:58:29 PM - [PASSED] - Object: 'Object Repository/Rates Page 1/Page_Life Insurance Comparison/button_Get Quotes' is clicked on
09-19-2018 12:58:29 PM - [END] - End action : click
09-19-2018 12:58:29 PM - [PASSED] - Test Cases/Perform click and Get Quote
09-19-2018 12:58:29 PM - [END] - End Test Case : Test Cases/Perform click and Get Quote

In Katalon there is a WebUI keyword called "Wait For Element Visible". Call it before the click; hopefully this will work for you.
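For example, a minimal sketch inside the test case, reusing the same test object as the script above (the 30-second timeout is an arbitrary choice):

import static com.kms.katalon.core.testobject.ObjectRepository.findTestObject
import com.kms.katalon.core.webui.keyword.WebUiBuiltInKeywords as WebUI

// Wait (up to 30 seconds) for the button to become visible, then click it
WebUI.waitForElementVisible(findTestObject('Object Repository/Rates Page 1/Page_Life Insurance Comparison/button_Get Quotes'), 30)
WebUI.click(findTestObject('Object Repository/Rates Page 1/Page_Life Insurance Comparison/button_Get Quotes'))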

Try using this keyword:
import org.openqa.selenium.JavascriptExecutor
import org.openqa.selenium.WebDriver
import org.openqa.selenium.WebElement
import com.kms.katalon.core.annotation.Keyword
import com.kms.katalon.core.testobject.TestObject
import com.kms.katalon.core.webui.common.WebUiCommonHelper
import com.kms.katalon.core.webui.driver.DriverFactory

@Keyword
def clickUsingJS(TestObject to, int timeout) {
    WebDriver driver = DriverFactory.getWebDriver()
    // Resolve the test object to a WebElement and click it via JavaScript
    WebElement element = WebUiCommonHelper.findWebElement(to, timeout)
    JavascriptExecutor executor = (driver as JavascriptExecutor)
    executor.executeScript('arguments[0].click()', element)
}
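Then call the custom keyword from the test case in place of WebUI.click. The package and class names below are placeholders for wherever you save the keyword file:

CustomKeywords.'yourpackage.ClickHelpers.clickUsingJS'(findTestObject('Object Repository/Rates Page 1/Page_Life Insurance Comparison/button_Get Quotes'), 30)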

Related

Schema validation errors while parsing config.xml - Invalid xsi:type qname: error: failed to load java type corresponding to t=ashland-authenticatorT

Below is the error I get while starting the WebLogic Admin Server after creating the domain from the domain template builder .jar file:
<Jul 20, 2021 9:44:51 AM EDT> <Schema validation errors while parsing /oracle/user_projects/domains/awt-dv/config/config.xml<43:7> - Invalid xsi:type qname: 'ext:ashland-authenticatorType' in element realm#http://xmlns.oracle.com/weblogic/domain.>
<Jul 20, 2021 9:44:51 AM EDT> <Schema validation errors while parsing /oracle/user_projects/domains/awt-dv/config/config.xml - /oracle/user_projects/domains/awt-dv/:43:7: error: failed to load java type corresponding to t=ashland-authenticatorType#http://www.ashland.com/weblogic12c/security/extension.>
<Jul 20, 2021 9:44:51 AM EDT> <Schema validation errors while parsing /oracle/user_projects/domains/awt-dv/config/config.xml<53:7> - Invalid xsi:type qname: 'ext:ashland-authorizerType' in element realm#http://xmlns.oracle.com/weblogic/domain.>
<Jul 20, 2021 9:44:51 AM EDT> <Schema validation errors while parsing /oracle/user_projects/domains/awt-dv/config/config.xml - /oracle/user_projects/domains/awt-dv/:53:7: error: failed to load java type corresponding to t=ashland-authorizerType#http://www.ashland.com/weblogic12c/security/extension.>

Accept retrieving fewer fields than requested in MARS Web API?

I'm trying to download a 25-day-ahead forecast from the ECMWF MARS Web API for all of 2018. These forecasts (WAEF Control Forecast) are only published on Mondays and Thursdays, and this is where I run into problems fetching the data via the MARS Web API.
I tried requesting the intuitive 2018-01-01/to/2018-12-31, but since there are five days a week with no fields to retrieve, the request fails.
My MARS request file is as follows:
retrieve,
class=od,
date=2018-01-01/to/2018-12-31,
expver=1,
param=229.140/245.140,
step=600/624/648/672,
stream=waef,
time=00:00:00,
type=cf,
target="output.grib"
Which results in the following response:
...
mars - INFO - 20190215.100826 - Welcome to MARS
mars - INFO - 20190215.100826 - MARS Client build stamp: 20190130224336
mars - INFO - 20190215.100826 - MARS Client version: 6.23.3
mars - INFO - 20190215.100826 - MIR version: 1.1.2
mars - INFO - 20190215.100826 - Using ecCodes version 2.10.1
mars - INFO - 20190215.100826 - Using odb_api version: 0.15.9 (file format version: 0.5)
mars - INFO - 20190215.100826 - Maximum retrieval size is 30.00 G
retrieve,target="output.grib",stream=waef,param=229.140/245.140,padding=0,step=600/624/648/672,expver=1,time=00:00:00,date=2018-01-01/to/2018-12-31,type=cf,class=od
mars - WARN - 20190215.100826 - For wave data, LEVTYPE forced to Surface
mars - INFO - 20190215.100826 - Automatic split by date is on
mars - INFO - 20190215.100826 - Request has been split into 12 monthly retrievals
mars - INFO - 20190215.100826 - Processing request 1
RETRIEVE,
CLASS = OD,
TYPE = CF,
STREAM = WAEF,
EXPVER = 0001,
REPRES = SH,
LEVTYPE = SFC,
PARAM = 229.140/245.140,
TIME = 0000,
STEP = 600/624/648/672,
DOMAIN = G,
TARGET = "output.grib",
PADDING = 0,
DATE = 20180101/20180102/20180103/20180104/20180105/20180106/20180107/20180108/20180109/20180110/20180111/20180112/20180113/20180114/20180115/20180116/20180117/20180118/20180119/20180120/20180121/20180122/20180123/20180124/20180125/20180126/20180127/20180128/20180129/20180130/20180131
mars - INFO - 20190215.100826 - Web API request id: xxx
mars - INFO - 20190215.100826 - Requesting 248 fields
mars - INFO - 20190215.100826 - Calling mars on 'marsod', callback on 36551
mars - INFO - 20190215.100827 - Server task is 228 [marsod]
mars - INFO - 20190215.100827 - Request cost: 72 fields, 17.2754 Mbytes on 1 tape, nodes: hpss [marsod]
2019-02-15 11:08:59 Request is active
mars - INFO - 20190215.102300 - Transfering 18114554 bytes
mars - WARN - 20190215.102301 - Visiting database marsod : expected 248, got 72
mars - ERROR - 20190215.102301 - Expected 248, got 72.
mars - ERROR - 20190215.102301 - Request failed
...
Is there any way to allow receiving fewer fields than requested, or any other elegant solution to this problem besides requesting only the dates that fall on Mondays and Thursdays?
I managed to find the answer in the MARS documentation after all. Using expect = any in the control section solved the issue. More information can be found here: https://confluence.ecmwf.int/pages/viewpage.action?pageId=43521134
retrieve,
class=od,
date=2018-01-01/to/2018-12-31,
expver=1,
param=229.140/245.140,
step=600/624/648/672,
stream=waef,
time=00:00:00,
type=cf,
expect=any,
target="output.grib"

StreamSets upgrade and LDAP authentication

I just upgraded StreamSets from 2.1.0.2 to 2.4.0.0 using Cloudera Manager (5.8.2). I can no longer log in to StreamSets - I get "login failed". The new version seems to be using a different LDAP lookup method.
My logs BEFORE the update look like this:
Mar 15, 10:42:07.799 AM INFO com.streamsets.datacollector.http.LdapLoginModule
Searching for users with filter: '(&(objectClass={0})({1}={2}))' from base dn: DC=myComp,DC=Statistics,DC=ComQ,DC=uk
Mar 15, 10:42:07.826 AM INFO com.streamsets.datacollector.http.LdapLoginModule
Found user?: true
Mar 15, 10:42:07.826 AM INFO com.streamsets.datacollector.http.LdapLoginModule
Attempting authentication: CN=UserDV,OU=London,OU=ComQ,DC=ComQ,DC=Statistics,DC=comQ,DC=uk
My logs AFTER the update look like this:
Mar 15, 11:10:21.406 AM INFO com.streamsets.datacollector.http.LdapLoginModule
Accessing LDAP Server: ldaps://comQ.statisticsxxx.com:3269 startTLS: false
Mar 15, 11:10:22.086 AM INFO org.ldaptive.auth.SearchDnResolver
search for user=[org.ldaptive.auth.User@1573608120::identifier= userdv, context=null] failed using filter=[org.ldaptive.SearchFilter@1129802876::filter=(&(objectClass=user)(uid={user})), parameters={context=null, user=userdv}]
Mar 15, 11:10:22.087 AM INFO com.streamsets.datacollector.http.LdapLoginModule
Found user?: false
Mar 15, 11:10:22.087 AM ERROR com.streamsets.datacollector.http.LdapLoginModule
Result code: null - DN cannot be null
You should change ldap.userFilter in Cloudera Manager from uid={user} to name={user}.
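For reference, a sketch of what the changed property would look like (the exact field name and its location inside the Cloudera Manager StreamSets configuration page may differ by version):

# StreamSets LDAP user filter - match accounts by name instead of uid
ldap.userFilter=name={user}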

LogStash failed action with response of 500, dropping action

I am trying to configure Logstash to watch a file and send events to an Elasticsearch server.
When I start Logstash with output to stdout, it runs fine:
stdout {
codec => rubydebug
}
But when I add elasticsearch output:
elasticsearch {
cluster => 'myclustername'
host => 'myip'
node_name => 'Aragorn'
}
Logstash starts up:
Mar 16, 2015 3:44:24 PM org.elasticsearch.node.internal.InternalNode <init>
INFO: [Aragorn] version[1.4.0], pid[7136], build[bc94bd8/2014-11-05T14:26:12Z]
Mar 16, 2015 3:44:24 PM org.elasticsearch.node.internal.InternalNode <init>
INFO: [Aragorn] initializing ...
Mar 16, 2015 3:44:24 PM org.elasticsearch.plugins.PluginsService <init>
INFO: [Aragorn] loaded [], sites []
Mar 16, 2015 3:44:25 PM org.elasticsearch.node.internal.InternalNode <init>
INFO: [Aragorn] initialized
Mar 16, 2015 3:44:25 PM org.elasticsearch.node.internal.InternalNode start
INFO: [Aragorn] starting ...
Mar 16, 2015 3:44:25 PM org.elasticsearch.transport.TransportService doStart
INFO: [Aragorn] bound_address {inet[/0:0:0:0:0:0:0:0:9300]}, publish_address {inet[/10.98.134.83:9300]}
Mar 16, 2015 3:44:25 PM org.elasticsearch.discovery.DiscoveryService doStart
INFO: [Aragorn] myclustername/RjasP2X0ShKXEl0f2WRxBA
Mar 16, 2015 3:44:30 PM org.elasticsearch.cluster.service.InternalClusterService$UpdateTask run
INFO: [Aragorn] detected_master [Aragorn][0YytUoWlQ2qgw2_0i5V4mQ][SOMEMACHINE][inet[/myip:9300]], added {[Aragorn][0YytUoWlQ2qgw2_0i5V4mQ][
SOMEMACHINE][inet[/myip:9300]],}, reason: zen-disco-receive(from master [[Aragorn][0YytUoWlQ2qgw2_0i5V4mQ][SOMEMACHINE][inet[/myip:9300]]])
Mar 16, 2015 3:44:30 PM org.elasticsearch.node.internal.InternalNode start
INFO: [Aragorn] started
But when messages start coming in, nothing is actually sent to Elasticsearch, and warnings like these start to appear in the Logstash output:
WARNING: [Aragorn] Message not fully read (response) for [28] handler org.elasticsearch.action.support.master.TransportMasterNodeOperationAction$6@17b531e, error [true], resetting
Mar 16, 2015 3:44:54 PM org.elasticsearch.transport.netty.MessageChannelHandler messageReceived
WARNING: [Aragorn] Message not fully read (response) for [29] handler org.elasticsearch.action.support.master.TransportMasterNodeOperationAction$6@13082f0, error [true], resetting
and
{:timestamp=>"2015-03-16T15:44:54.377000+0100", :message=>"failed action with response of 500, dropping action...
(the above message is much longer but does not seem to contain any useful diagnostics)
What might be wrong?

hadoop too many logs on screen

I recently started learning Hadoop with Hive. As a beginner I am not familiar with all the logs shown on the screen, so it would be better to see a clean version with only the important logs. I am learning Hive from Rutherglen's "Programming Hive" book.
Just after starting, I got numerous logs for the first command, whereas in the book it's just "OK, Time taken: 3.543 seconds".
Does anyone have a solution to reduce these logs?
PS: below are the logs I got from the command "create table x (a int);"
WARNING: org.apache.hadoop.metrics.jvm.EventCounter is deprecated. Please use org.apache.hadoop.log.metrics.EventCounter in all the log4j.properties files.
Sep 28, 2014 12:10:28 AM org.apache.hadoop.hive.conf.HiveConf <clinit>
WARNING: hive-site.xml not found on CLASSPATH
Logging initialized using configuration in jar:file:/Users/admin/Documents/Study/software /Programming/Hive/hive-0.9.0-bin/lib/hive-common-0.9.0.jar!/hive-log4j.properties
Sep 28, 2014 12:10:28 AM SessionState printInfo
INFO: Logging initialized using configuration in jar:file:/Users/admin/Documents/Study/software/Programming/Hive/hive-0.9.0-bin/lib/hive-common-0.9.0.jar!/hive-log4j.properties
Hive history file=/tmp/admin/hive_job_log_admin_201409280010_720612579.txt
Sep 28, 2014 12:10:28 AM hive.ql.exec.HiveHistory printInfo
INFO: Hive history file=/tmp/admin/hive_job_log_admin_201409280010_720612579.txt
hive> CREATE TABLE x (a INT);
Sep 28, 2014 12:10:31 AM org.apache.hadoop.hive.ql.Driver PerfLogBegin
INFO: <PERFLOG method=Driver.run>
Sep 28, 2014 12:10:31 AM org.apache.hadoop.hive.ql.Driver PerfLogBegin
INFO: <PERFLOG method=compile>
Sep 28, 2014 12:10:31 AM hive.ql.parse.ParseDriver parse
INFO: Parsing command: CREATE TABLE x (a INT)
Sep 28, 2014 12:10:31 AM hive.ql.parse.ParseDriver parse
INFO: Parse Completed
Sep 28, 2014 12:10:31 AM org.apache.hadoop.hive.ql.parse.SemanticAnalyzer analyzeInternal
INFO: Starting Semantic Analysis
Sep 28, 2014 12:10:31 AM org.apache.hadoop.hive.ql.parse.SemanticAnalyzer analyzeCreateTable
INFO: Creating table x position=13
Sep 28, 2014 12:10:31 AM org.apache.hadoop.hive.ql.Driver compile
INFO: Semantic Analysis Completed
Sep 28, 2014 12:10:31 AM org.apache.hadoop.hive.ql.Driver getSchema
INFO: Returning Hive schema: Schema(fieldSchemas:null, properties:null)
Sep 28, 2014 12:10:31 AM org.apache.hadoop.hive.ql.Driver PerfLogEnd
INFO: </PERFLOG method=compile start=1411877431127 end=1411877431388 duration=261>
Sep 28, 2014 12:10:31 AM org.apache.hadoop.hive.ql.Driver PerfLogBegin
INFO: <PERFLOG method=Driver.execute>
Sep 28, 2014 12:10:31 AM org.apache.hadoop.hive.ql.Driver execute
INFO: Starting command: CREATE TABLE x (a INT)
Sep 28, 2014 12:10:31 AM hive.ql.exec.DDLTask createTable
INFO: Default to LazySimpleSerDe for table x
Sep 28, 2014 12:10:31 AM hive.log getDDLFromFieldSchema
INFO: DDL: struct x { i32 a}
Sep 28, 2014 12:10:31 AM org.apache.hadoop.hive.metastore.HiveMetaStore newRawStore
INFO: 0: Opening raw store with implemenation class:org.apache.hadoop.hive.metastore.ObjectStore
Sep 28, 2014 12:10:31 AM org.apache.hadoop.hive.metastore.ObjectStore initialize
INFO: ObjectStore, initialize called
Sep 28, 2014 12:10:32 AM org.apache.hadoop.hive.metastore.ObjectStore getPMF
INFO: Setting MetaStore object pin classes with hive.metastore.cache.pinobjtypes="Table,StorageDescriptor,SerDeInfo,Partition,Database,Type,FieldSchema,Order"
Sep 28, 2014 12:10:32 AM org.apache.hadoop.hive.metastore.ObjectStore setConf
INFO: Initialized ObjectStore
Sep 28, 2014 12:10:33 AM org.apache.hadoop.hive.metastore.HiveMetaStore logInfo
INFO: 0: create_table: db=default tbl=x
Sep 28, 2014 12:10:34 AM org.apache.hadoop.hive.ql.Driver PerfLogEnd
INFO: </PERFLOG method=Driver.execute start=1411877431389 end=1411877434527 duration=3138>
OK
Sep 28, 2014 12:10:34 AM org.apache.hadoop.hive.ql.Driver printInfo
INFO: OK
Sep 28, 2014 12:10:34 AM org.apache.hadoop.hive.ql.Driver PerfLogBegin
INFO: <PERFLOG method=releaseLocks>
Sep 28, 2014 12:10:34 AM org.apache.hadoop.hive.ql.Driver PerfLogEnd
INFO: </PERFLOG method=releaseLocks start=1411877434529 end=1411877434529 duration=0>
Sep 28, 2014 12:10:34 AM org.apache.hadoop.hive.ql.Driver PerfLogEnd
INFO: </PERFLOG method=Driver.run start=1411877431126 end=1411877434530 duration=3404>
Time taken: 3.407 seconds
Sep 28, 2014 12:10:34 AM CliDriver printInfo
INFO: Time taken: 3.407 seconds
Try starting the Hive shell as follows:
hive --hiveconf hive.root.logger=WARN,console
If you want to make this change persistent, modify the logger property file HIVE_CONF_DIR/hive-log4j.properties. If you don't have this file in your HIVE_CONF_DIR, create it by copying the contents of hive-log4j.default into the HIVE_CONF_DIR.
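For example, the relevant line in HIVE_CONF_DIR/hive-log4j.properties would look roughly like this (assuming the Log4j layout that ships with Hive 0.9):

# hive-log4j.properties: raise the root logger so only warnings and errors reach the console
hive.root.logger=WARN,console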
