Quarkus MongoDB Java client cannot log find queries

I am using Quarkus 2.7.5.Final with the MongoDB Java sync client (mongodb-driver-sync 4.3.4), and I would like to log all queries sent to the database.
I am using the following property:
quarkus.log.category."org.mongodb.driver.protocol.command".level=DEBUG
Update requests get logged, and the output is crystal clear. My find queries, however, are not logged at all. Right now I am logging them with a wrapper, but the result is not as nice as the log produced by the Mongo Java client.
What am I doing wrong?
Is there a way to log find queries as well? I am using a MongoClient that connects to a Cosmos DB database (imposed by company policy). Could it be a Cosmos DB problem?
It appears liveness queries get logged:
2022-08-29 11:54:33,690+0200 DEBUG traceId=7cfc802e51ca4b1b, parentId=0, spanId=7cfc802e51ca4b1b, sampled=true [or.mo.dr.pr.command] (cluster-rtt-ClusterId{value='630c8ad5149959716236660b', description='null'}-tpaycosmosmngrpp01azne.documents.azure.com:10255) Sending command '{"ismaster": 1}' with request id 112 to database admin on connection [connectionId{localValue:2, serverValue:1023289710}] to server tpaycosmosmngrpp01azne.documents.azure.com:10255
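One way to capture every command the driver sends, including find, without depending on logger categories (or on how Cosmos DB answers) is to register a CommandListener on the client. This is a minimal sketch against the 4.x sync driver; the class and method names around it are made up for illustration, and wiring the customized MongoClient into Quarkus's managed client is not shown here:

```java
import com.mongodb.ConnectionString;
import com.mongodb.MongoClientSettings;
import com.mongodb.client.MongoClient;
import com.mongodb.client.MongoClients;
import com.mongodb.event.CommandFailedEvent;
import com.mongodb.event.CommandListener;
import com.mongodb.event.CommandStartedEvent;
import com.mongodb.event.CommandSucceededEvent;

public class CommandLoggingClient {

    // Fires client-side for every command the driver sends, "find" included.
    static class LoggingCommandListener implements CommandListener {
        @Override
        public void commandStarted(CommandStartedEvent event) {
            System.out.printf("Sending command '%s' to database %s: %s%n",
                    event.getCommandName(),
                    event.getDatabaseName(),
                    event.getCommand().toJson());
        }

        @Override
        public void commandSucceeded(CommandSucceededEvent event) { }

        @Override
        public void commandFailed(CommandFailedEvent event) { }
    }

    public static MongoClient create(String uri) {
        MongoClientSettings settings = MongoClientSettings.builder()
                .applyConnectionString(new ConnectionString(uri))
                .addCommandListener(new LoggingCommandListener())
                .build();
        return MongoClients.create(settings);
    }
}
```

Because the listener sits on the client side of the wire protocol, it should log find commands even if the server on the other end (Cosmos DB's MongoDB API) behaves differently from a real MongoDB server.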

Related

Spring Boot service cannot return payload in AWS - unknown 500 status

I have multiple spring boot services, running in Docker on AWS ECS. Pretty typical: Spring Cloud Server (Eureka), Config Server, Postgres in RDS. However, I have one GET call that fails every time. I have logging that shows me the data pulled from the database, logged, and returned. No error logs in the CloudWatch, but the curl results in
{"timestamp":"2022-12-15T14:10:22.333+00:00","status":500,"error":"Internal Server Error","path":"/person/aab3f9c6-bcf0-48f9-93f1-9c0b67bb5a39"}
And here is the value retrieved from the db:
Person person = Person(personId=3, internalId=64ffc6b8-ccd9-4c6c-89d3-411729aa8155, recordIndicator=null, personType=RESIDENT, firstName=Annabeth Jones, lastName=Smith, lastFourSSN=1234, gender=Female, race=null, ethnicity=White, blood=A+, imagePath=null, activationDate=2020-06-07, deactivationDate=null, dateOfBirth=1966-08-26, address=Address(addressId=10, addressLine1=1234 Maple Lane, addressLine2=, city=Colorado Springs, state=CO, zip=77777-1234, latitude=null, longitude=null), subscriptionDate=null, subscriptionId=null, officeInternalId=null)
which is then returned this way:
ResponseEntity.ok().contentType(MediaType.APPLICATION_JSON).body(person);
There are no errors in the logs, and all other similar calls work correctly. I am not sure where to look to try and find the problem.
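A 500 with no controller-side error log often means the exception is thrown after your code has returned, for instance while Jackson serializes the response body. As a hedged starting point, these standard Spring Boot properties (2.3+ names) surface the underlying exception in the error response and in the logs:

```properties
# Include the exception message in the error JSON
server.error.include-message=always
# Include the stack trace when the request carries ?trace=true
server.error.include-stacktrace=on_param
# Log Spring MVC's exception resolution path
logging.level.org.springframework.web=DEBUG
```

With these in place, repeating the failing curl (optionally with ?trace=true) should reveal which step of response handling is actually blowing up.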

Kafka Connect JDBC Source running even when query fails

I'm running a JDBC source connector and trying to monitor its status via the exposed JMX metrics and a Prometheus exporter. However, the status of the connector and all its tasks stays in the RUNNING state when the query fails or the DB can't be reached.
In earlier versions, it seems no value for source-record-poll-total in the source-task-metrics was exported when the query failed; in the versions I use (connect-runtime-6.2.0-ccs, confluentinc-kafka-connect-jdbc-10.2.0, jmx_prometheus_javaagent-0.14.0), the metric is exported with value 0.0 even when the query fails.
Any ideas how I could detect such a failing query or DB connection?
This is resolved in version 10.2.4 of the jdbc connector. Tasks now fail when a SQLNonTransientException occurs and this can be detected using the exported metrics. See https://github.com/confluentinc/kafka-connect-jdbc/pull/1096
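As a complement to the JMX metrics, task state can also be polled from Kafka Connect's REST API; once a task has failed (as it does on SQLNonTransientException from 10.2.4 on), its state there flips to FAILED and includes the stack trace. The connector name and host below are examples:

```shell
# /status reports both the connector state and each task's state and trace
curl -s http://localhost:8083/connectors/my-jdbc-source/status
```

This can be scraped or alerted on independently of the Prometheus exporter version in use.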

LDAP Authentication/Bind tracking via Windows Events and/or Splunk

I'd like to have a Splunk query that shows LDAP authentication/binds to a group of AD servers. If this can be found via Windows events, I can then write the query in Splunk. I'm a bit new to LDAP and Splunk...
Current search (50 or so results in 15 mins):
index="winevent" host="AD Servers" serviceBindingInformation | stats count by Account_Name
This seems to show only "Message=A directory service object was modified." which is not what we are looking for.
Another search (over 6000 results in 15 mins):
index="winevent" host="AD Servers" LDAP
While I get far more results, I don't seem to have any that show authentication or LDAP binds. The event code for all of the results is:
5136: A directory service object was modified
Is there a different way to search for LDAP authentication than how I am going about it, or is there a change that should be made on AD or in Splunk to make LDAP authentication visible?
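Event 5136 only records directory object modifications, so bind activity won't appear there. Domain controllers can log LDAP bind events in the Directory Service log (for example Event ID 2889 for simple/unsigned binds, which requires raising the "16 LDAP Interface Events" diagnostics level on the DCs), and 4624/4625 in the Security log cover authentication generally. Assuming those events are being forwarded to the winevent index, a search might look like this; the extracted field names are assumptions and may differ in your environment:

```
index="winevent" host="AD Servers" EventCode=2889
| stats count by Account_Name, Client_IP_Address
```

If 2889 returns nothing, check the DC diagnostics setting first, since the event is not emitted at the default logging level.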
Thanks, C

Mule Connect to remote flat files

I am new to Mule and I have been struggling with a simple issue for a while now. I am trying to connect to flat files (.MDB, .DBF) located on a remote desktop through my Mule application using the generic database connector of Mule. I have tried different things here:
I am using StelsDBF and StelsMDB drivers for the JDBC connectivity. I tried connecting directly using jdbc URL - jdbc:jstels:mdb:host/path
I have also tried to access through FTP by using FileZilla server on remote desktop and using jdbc URL in my app - jdbc:jstels:dbf:ftp://user:password#host:21/path
None of these seem to be working, as I am always getting connection exceptions. If anyone has tried this before, what is the best way to go about connecting to a remote flat file with Mule? Your response will be greatly appreciated!
If you want to load the contents of the file inside a Mule flow, you should use the File or FTP connector; I don't know for sure about your JDBC option.
With the File connector you can access local files (files on the server where Mule is running); you could try to mount the folders as a share.
Or run an FTP server like you already tried, that should work.
There is probably an error in your syntax / connection.
Please paste the complete XML of your Mule flow so we can see what you are trying to do.
Your use case is still not really clear to me. Are you really planning to use HTTP to trigger the DB every time? Anyway, did you try putting the file on a local path and using that path in your database URL? Here is someone who says he had it working; he created a separate bean:
http://forums.mulesoft.com/questions/6422/setting_property_dynamically_on_jdbcdatasource.html
I think a local path may be possible, and it's better to test that first.
Also take note of how to refer to a file path, look at the examples for the file connector: https://docs.mulesoft.com/mule-user-guide/v/3.7/file-transport-reference#namespace-and-syntax
If you manage to get it working and you can use the path directly in the JDBC url, you should have a look at the poll scope.
https://docs.mulesoft.com/mule-user-guide/v/3.7/poll-reference
You can use your DB connector as an inbound endpoint when wrapped in a poll scope.
I experienced the same issue when connecting to a Microsoft Access database (*.mdb, *.accdb) using the Mule Database Connector. After further investigation, it was solved by installing the Microsoft Access Database Engine.
Another issue: I couldn't pass a parameter to construct a query the same way I do for other databases, e.g.: SELECT * FROM emplcopy WHERE id = #[payload.id]
To solve this issue:
1. I changed the query type from Parameterized to Dynamic.
2. I generated the query inside a Set Payload transformer (building the query as a String, e.g.: SELECT * FROM emplcopy WHERE id = '1').
3. Finally, I put it into the dynamic query area: #[payload]
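The steps above could look roughly like this in a Mule 3 flow; the flow name, config-ref, and flow variable are made-up examples, and concatenating user input into SQL like this is of course open to injection, so it is only a sketch of the workaround:

```xml
<flow name="accessQueryFlow">
    <!-- Step 2: build the SQL as a plain string -->
    <set-payload value="#[&quot;SELECT * FROM emplcopy WHERE id = '&quot; + flowVars.id + &quot;'&quot;]"/>
    <!-- Steps 1 and 3: run the string as a dynamic query -->
    <db:select config-ref="Access_Database_Config">
        <db:dynamic-query><![CDATA[#[payload]]]></db:dynamic-query>
    </db:select>
</flow>
```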

Configuration Issue for IBM Filenet 5.2

I installed IBM FileNet Content Engine 5.2 on my machine. I am getting a problem while configuring GCD data sources for a new profile.
Let me first explain the steps I did, then I will mention the problem I am getting.
First, I created the GCD database in DB2, then I created the data sources required for configuration of the profile in the WAS Admin Console. I created a J2C authentication alias for the user which has access to the GCD database and configured it with the data sources. The test database connection succeeds, but when I run the task of configuring GCD data sources, it fails with the following error:
Starting to run Configure GCD JDBC Data Sources
Configure GCD JDBC Data Sources ******
Finished running Configure GCD JDBC Data Sources
An error occurred while running Configure GCD JDBC Data Sources
Running the task failed with the following message: The data source configuration failed:
WASX7209I: Connected to process "server1" on node Poonam-PcNode01 using SOAP connector; The type of process is: UnManagedProcess
testing Database connection
DSRA8040I: Failed to connect to the DataSource. Encountered java.sql.SQLException: [jcc][t4][2013][11249][3.62.56] Connection authorization failure occurred. Reason: User ID or Password invalid. ERRORCODE=-4214, SQLSTATE=28000 DSRA0010E: SQL State = 28000, Error Code = -4,214.
It looks like a simple error of an invalid user ID and password. I am using the same alias for other data sources as well and they are working fine, so I am not sure why I am getting this error. I have also tried changing the scope of the data sources, but with no success. Can somebody please help?
Running the "FileNet Configuration Manager" task that configures the GCD data sources will create everything it needs in WAS (including the alias), so do not create it manually beforehand.
I suspect it had an issue with the existing JDBC data sources or a differently named alias.
It seems from your message that you are running it from the FileNet Configuration Manager. Could you please double-check in your database whether the user ID is authorized to execute queries in the GCD database? It definitely has to do with a permission issue.
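To rule out the credentials independently of WAS and the Configuration Manager, it may help to connect with the same user from the DB2 command line; the database and user names below are examples:

```shell
# Connect as the alias user (prompts for the password)
db2 connect to GCDDB user ceuser
# If the connection succeeds, verify the user can actually run a query
db2 "select tabname from syscat.tables fetch first 1 rows only"
```

If this fails with the same SQLSTATE 28000, the problem is the credentials themselves rather than the WAS data source configuration.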
