Unable to connect to Cosmos DB Table API from Databricks, throws error - azure-databricks

I loaded the proper library at the cluster level:
com.microsoft.azure:azure-cosmosdb-spark_2.4.0_2.11:3.7.0
and gave the proper connection settings for the Cosmos Table API:
cosmosConfig = {
    "Endpoint": "https://cosmos-account-name.table.cosmos.azure.com:443/",
    "Masterkey": "PrimaryKey",
    "Database": "TablesDB",
    "Collection": "Deals_Metadata"
}
Then I started reading it using the Spark API:
cosmosdbConnection = spark.read.format("com.microsoft.azure.cosmosdb.spark").options(**cosmosConfig).load()
When I execute this, it throws the error below:
java.lang.NoSuchMethodError: scala.Predef$.refArrayOps([Ljava/lang/Object;)Lscala/collection/mutable/ArrayOps;

I tried to reproduce the same in my environment and got the same error. A NoSuchMethodError on scala.Predef usually points to a Scala version mismatch: azure-cosmosdb-spark_2.4.0_2.11 is built for Spark 2.4 / Scala 2.11, while current Databricks runtimes run Spark 3.x / Scala 2.12.
To resolve this error, install the Spark 3 OLTP connector (com.azure.cosmos.spark) that matches your runtime, check that the jar is installed properly, and follow the code below.
Endpoint = "https://xxxx.documents.azure.com:443/"
MasterKey = "cosmos_db_key"
DatabaseName = "<dbname>"
ContainerName = "container"
spark.conf.set("spark.sql.catalog.cosmosCatalog", "com.azure.cosmos.spark.CosmosCatalog")
spark.conf.set("spark.sql.catalog.cosmosCatalog.spark.cosmos.accountEndpoint", Endpoint)
spark.conf.set("spark.sql.catalog.cosmosCatalog.spark.cosmos.accountKey", MasterKey)
spark.sql("CREATE DATABASE IF NOT EXISTS cosmosCatalog.{};".format(DatabaseName))
spark.sql("CREATE TABLE IF NOT EXISTS cosmosCatalog.{}.{} using cosmos.oltp TBLPROPERTIES(partitionKeyPath = '/id', manualThroughput = '1100')".format(DatabaseName, ContainerName))
Reading the data into a Spark DataFrame:
Cfg1 = {
    "spark.cosmos.accountEndpoint": Endpoint,
    "spark.cosmos.accountKey": MasterKey,
    "spark.cosmos.database": DatabaseName,
    "spark.cosmos.container": ContainerName,
    "spark.cosmos.read.inferSchema.enabled": "false"
}
df = spark.read.format("cosmos.oltp").options(**Cfg1).load()
print(df.count())
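Once the read works, writing back through the same connector uses the same options. A minimal sketch, not part of the original answer, assuming the Cfg1 config above and that the DataFrame rows carry the id field Cosmos DB requires:
# Append rows into the Cosmos container using the same connector and config.
(df.write
    .format("cosmos.oltp")
    .options(**Cfg1)
    .mode("APPEND")  # the Spark 3 OLTP connector expects APPEND mode for writes
    .save())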
Reference:
Manage data with Azure Cosmos DB Spark 3 OLTP Connector for SQL API | Microsoft

Related

Error executing query in Java code to connect to Presto

We are trying to connect to Presto from Java code and execute some queries. The catalog we are using is MySQL.
Presto is installed on a Linux server and started there; the Presto CLI works fine on Linux.
MySQL is also installed on the Linux machine, and we can access it from Windows using DbVisualizer.
I created a MySQL connector catalog for Presto, and I can query MySQL data successfully from the Presto CLI: presto --server localhost:8080 --catalog mysql --schema tutorials.
Executing the Java code on the Windows machine, I can access MySQL and run queries directly, but we cannot query data through Presto: it gives us "Error executing query". In the example below, I have used the JDBC jar from Trino.
package testdbPresto;

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Statement;
import java.util.Properties;

public class PrestoJdbc {
    public static void main(String args[]) throws SQLException, ClassNotFoundException {
        try {
            // connect to the mysql catalog's tutorials schema via Presto
            Class.forName("com.facebook.presto.jdbc.PrestoDriver");
            String url = "jdbc:trino://35.173.241.37:8080/mysql/tutorials";
            Properties properties = new Properties();
            properties.setProperty("user", "root");
            properties.setProperty("password", "Redcar88!");
            properties.setProperty("SSL", "true");
            Connection connection = DriverManager.getConnection(url, properties);
            Statement statement = connection.createStatement();

            // select two columns from the author table
            String sql = "select auth_id, auth_name from mysql.tutorials.author";
            ResultSet resultSet = statement.executeQuery(sql);

            // extract data from the result set
            while (resultSet.next()) {
                // retrieve by column name
                String name = resultSet.getString("auth_name");
                // display values
                System.out.println("name : " + name);
            }

            // clean up the environment
            resultSet.close();
            statement.close();
            connection.close();
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}
Output:
java.sql.SQLException: Error executing query
at io.trino.jdbc.TrinoStatement.internalExecute(TrinoStatement.java:274)
at io.trino.jdbc.TrinoStatement.execute(TrinoStatement.java:227)
at io.trino.jdbc.TrinoStatement.executeQuery(TrinoStatement.java:76)
at testdbPresto.PrestoJdbc.main(PrestoJdbc.java:29)
Caused by: java.io.UncheckedIOException: javax.net.ssl.SSLException: Unsupported or unrecognized SSL message
at io.trino.jdbc.$internal.client.JsonResponse.execute(JsonResponse.java:154)
at io.trino.jdbc.$internal.client.StatementClientV1.<init>(StatementClientV1.java:110)
at io.trino.jdbc.$internal.client.StatementClientFactory.newStatementClient(StatementClientFactory.java:24)
at io.trino.jdbc.QueryExecutor.startQuery(QueryExecutor.java:46)
at io.trino.jdbc.TrinoConnection.startQuery(TrinoConnection.java:728)
at io.trino.jdbc.TrinoStatement.internalExecute(TrinoStatement.java:239)
... 3 more
Caused by: javax.net.ssl.SSLException: Unsupported or unrecognized SSL message
at sun.security.ssl.SSLSocketInputRecord.handleUnknownRecord(SSLSocketInputRecord.java:448)
at sun.security.ssl.SSLSocketInputRecord.decode(SSLSocketInputRecord.java:174)
at sun.security.ssl.SSLTransport.decode(SSLTransport.java:110)
at sun.security.ssl.SSLSocketImpl.decode(SSLSocketImpl.java:1279)
at sun.security.ssl.SSLSocketImpl.readHandshakeRecord(SSLSocketImpl.java:1188)
at sun.security.ssl.SSLSocketImpl.startHandshake(SSLSocketImpl.java:401)
at sun.security.ssl.SSLSocketImpl.startHandshake(SSLSocketImpl.java:373)
at io.trino.jdbc.$internal.okhttp3.internal.connection.RealConnection.connectTls(RealConnection.java:299)
at io.trino.jdbc.$internal.okhttp3.internal.connection.RealConnection.establishProtocol(RealConnection.java:268)
at io.trino.jdbc.$internal.okhttp3.internal.connection.RealConnection.connect(RealConnection.java:160)
at io.trino.jdbc.$internal.okhttp3.internal.connection.StreamAllocation.findConnection(StreamAllocation.java:256)
at io.trino.jdbc.$internal.okhttp3.internal.connection.StreamAllocation.findHealthyConnection(StreamAllocation.java:134)
at io.trino.jdbc.$internal.okhttp3.internal.connection.StreamAllocation.newStream(StreamAllocation.java:113)
at io.trino.jdbc.$internal.okhttp3.internal.connection.ConnectInterceptor.intercept(ConnectInterceptor.java:42)
at io.trino.jdbc.$internal.okhttp3.internal.http.RealInterceptorChain.proceed(RealInterceptorChain.java:147)
at io.trino.jdbc.$internal.okhttp3.internal.http.RealInterceptorChain.proceed(RealInterceptorChain.java:121)
at io.trino.jdbc.$internal.okhttp3.internal.cache.CacheInterceptor.intercept(CacheInterceptor.java:93)
at io.trino.jdbc.$internal.okhttp3.internal.http.RealInterceptorChain.proceed(RealInterceptorChain.java:147)
at io.trino.jdbc.$internal.okhttp3.internal.http.RealInterceptorChain.proceed(RealInterceptorChain.java:121)
at io.trino.jdbc.$internal.okhttp3.internal.http.BridgeInterceptor.intercept(BridgeInterceptor.java:93)
at io.trino.jdbc.$internal.okhttp3.internal.http.RealInterceptorChain.proceed(RealInterceptorChain.java:147)
at io.trino.jdbc.$internal.okhttp3.internal.http.RetryAndFollowUpInterceptor.intercept(RetryAndFollowUpInterceptor.java:125)
at io.trino.jdbc.$internal.okhttp3.internal.http.RealInterceptorChain.proceed(RealInterceptorChain.java:147)
at io.trino.jdbc.$internal.okhttp3.internal.http.RealInterceptorChain.proceed(RealInterceptorChain.java:121)
at io.trino.jdbc.$internal.client.OkHttpUtil.lambda$basicAuth$1(OkHttpUtil.java:85)
at io.trino.jdbc.$internal.okhttp3.internal.http.RealInterceptorChain.proceed(RealInterceptorChain.java:147)
at io.trino.jdbc.$internal.okhttp3.internal.http.RealInterceptorChain.proceed(RealInterceptorChain.java:121)
at io.trino.jdbc.$internal.client.OkHttpUtil.lambda$userAgent$0(OkHttpUtil.java:71)
at io.trino.jdbc.$internal.okhttp3.internal.http.RealInterceptorChain.proceed(RealInterceptorChain.java:147)
at io.trino.jdbc.$internal.okhttp3.internal.http.RealInterceptorChain.proceed(RealInterceptorChain.java:121)
at io.trino.jdbc.$internal.okhttp3.RealCall.getResponseWithInterceptorChain(RealCall.java:200)
at io.trino.jdbc.$internal.okhttp3.RealCall.execute(RealCall.java:77)
at io.trino.jdbc.$internal.client.JsonResponse.execute(JsonResponse.java:131)
... 8 more
It is quite an old question, but it might still be relevant.
You are trying to connect to Trino with the Presto JDBC driver. PrestoSQL was rebranded as Trino, so in order to access Trino via JDBC you should use the Trino JDBC driver. (Note also that the stack trace bottoms out in javax.net.ssl.SSLException: Unsupported or unrecognized SSL message, which typically means SSL=true was set against a plain-HTTP port such as 8080; either put TLS in front of the server or drop the SSL property for a plain-HTTP endpoint.)
Add the Trino dependency to your classpath.
If you use Maven, add this dependency to the pom:
<dependency>
    <groupId>io.trino</groupId>
    <artifactId>trino-jdbc</artifactId>
    <version>${trino-jdbc.version}</version>
</dependency>
Then load the Trino driver class:
Class.forName("io.trino.jdbc.TrinoDriver");
Here is code that works with Trino.
import java.sql.DriverManager
import java.util.Properties
import kotlin.system.exitProcess

fun main() {
    val trinoUrl = "jdbc:trino://myDomain:443"
    val properties = Properties()
    properties.setProperty("user", "noUserS")
    // properties.setProperty("password", "noPass")
    properties.setProperty("SSL", "true")
    DriverManager.getConnection(trinoUrl, properties).use { trinoConn ->
        trinoConn.createStatement().use { statement ->
            statement.connection.catalog = "catalog1"
            statement.connection.schema = "default"
            println("Executing query...")
            statement.executeQuery(
                """
                select
                    restaurantId,
                    type,
                    time
                from table1
                where time > CURRENT_TIMESTAMP - INTERVAL '1' hour
                """.trimIndent()
            ).use { resultSet ->
                val list = mutableListOf<Map<String, String>>()
                while (resultSet.next()) {
                    val data = mapOf(
                        "restaurantId" to resultSet.getString("restaurantId"),
                        "type" to resultSet.getString("type"),
                        "time" to resultSet.getString("time")
                    )
                    list.add(data)
                }
                println("Records returned: ${list.size}")
                println(list)
            }
        }
    }
    exitProcess(0)
}
It is Kotlin, but it's easy to understand.
The .use {..} construct is Kotlin's equivalent of try-with-resources in Java.
Hope this helps.
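A quick way to sanity-check the endpoint outside the JVM is the Trino Python client; a minimal sketch, not part of the original answer, assuming pip install trino and reusing the hypothetical host, catalog, and schema from the Kotlin example:
import trino

# Connect over HTTPS, mirroring SSL=true in the JDBC examples above.
conn = trino.dbapi.connect(
    host="myDomain",
    port=443,
    user="noUserS",
    http_scheme="https",
    catalog="catalog1",
    schema="default",
)

cur = conn.cursor()
cur.execute("select 1")
print(cur.fetchall())  # [[1]] once the endpoint, TLS, and credentials line up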

Prevent KeyVault from updating secrets using Terraform

I'm building a Terraform template to create Azure resources, including Key Vault secrets. The customer's subscription policy doesn't allow anyone to update/delete/view Key Vault secrets.
If I run terraform apply for the first time, it works perfectly. However, running the same template again gives the following error:
Error updating Key Vault "####" (Resource Group "####"): keyvault.VaultsClient#Update: Failure responding to request: StatusCode=403 --
Original Error: autorest/azure: Service returned an error. Status=403 Code="RequestDisallowedByPolicy" Message="Resource '###' was disallowed by policy. Policy identifiers: '[{\"policyAssignment\":{\"name\":\"###nis-deny-keyvault-acl\", ...
on ..\..\modules\azure\keyvault\main.tf line 15, in resource "azurerm_key_vault" "keyvault":
15: resource "azurerm_key_vault" "keyvault" {
How can I get my CI/CD working, given that terraform apply will be run continuously?
Is there a way to get past this policy in Terraform?
Is there a way to prevent Terraform from updating the Key Vault once it is created (other than locking the resource)?
Here is the Keyvault module:
variable "keyvault_id" {
type = string
}
variable "secrets" {
type = map(string)
}
locals {
secret_names = keys(var.secrets)
}
resource "azurerm_key_vault_secret" "secret" {
count = length(var.secrets)
name = local.secret_names[count.index]
value = var.secrets[local.secret_names[count.index]]
key_vault_id = var.keyvault_id
}
data "azurerm_key_vault_secret" "secrets" {
count = length(var.secrets)
depends_on = [azurerm_key_vault_secret.secret]
name = local.secret_names[count.index]
key_vault_id = var.keyvault_id
}
output "keyvault_secret_attributes" {
value = [for i in range(length(azurerm_key_vault_secret.secret.*.id)) : data.azurerm_key_vault_secret.secrets[i]]
}
And here is the module from my template:
locals {
  secrets_map = {
    appinsights-key     = module.app_insights.app_insights_instrumentation_key
    storage-account-key = module.storage_account.primary_access_key
  }
  output_secret_map = {
    for secret in module.keyvault_secrets.keyvault_secret_attributes :
    secret.name => secret.id
  }
}

module "keyvault" {
  source              = "../../modules/azure/keyvault"
  keyvault_name       = local.kv_name
  resource_group_name = azurerm_resource_group.app_rg.name
}

module "keyvault_secrets" {
  source      = "../../modules/azure/keyvault-secret"
  keyvault_id = module.keyvault.keyvault_id
  secrets     = local.secrets_map
}

module "app_service_keyvault_access_policy" {
  source                  = "../../modules/azure/keyvault-policy"
  vault_id                = module.keyvault.keyvault_id
  tenant_id               = module.app_service.app_service_identity_tenant_id
  object_ids              = module.app_service.app_service_identity_object_ids
  key_permissions         = ["get", "list"]
  secret_permissions      = ["get", "list"]
  certificate_permissions = ["get", "list"]
}
Using Terraform to provision and manage a Key Vault with that kind of limitation sounds like a bad idea. Terraform's main job is to track the state of your resources; if it is not allowed to read a resource, it becomes pretty useless. Your problem is not even that Terraform is trying to update something: it fails because it wants to check the current state of the resource and is not allowed to.
If your goal is just to create secrets in a Key Vault, I would just use the az keyvault commands, like this:
az login
az keyvault secret set --name mySecret --vault-name myKeyvault --value mySecretValue
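Equivalently, if you would rather script it than shell out to the CLI, the Azure SDK for Python can set the same secret; a minimal sketch, not part of the original answer, assuming the azure-identity and azure-keyvault-secrets packages and the hypothetical vault name from the CLI example:
from azure.identity import DefaultAzureCredential
from azure.keyvault.secrets import SecretClient

# Authenticates via environment variables, a managed identity, or an az login session.
credential = DefaultAzureCredential()
client = SecretClient(vault_url="https://myKeyvault.vault.azure.net", credential=credential)

# Equivalent of: az keyvault secret set --name mySecret --vault-name myKeyvault --value mySecretValue
client.set_secret("mySecret", "mySecretValue")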
An optimal solution would of course be for the service principal you use to execute Terraform commands to have sufficient rights to perform the actions it was created for.
I know this is a late answer, but for future visitors:
The pipeline running the Terraform Plan and Apply will need to have proper access to the key vault.
So, if you are running your CI/CD from Azure Pipelines, you would typically have a service connection that your pipeline uses for authentication.
The service connection you use for Terraform is most likely based on a service principal that has Contributor rights (at least at the resource-group level) so it can provision anything at all.
If that is the case, then you must add an access policy giving that same service principal (use the service principal's Enterprise Application object ID) at least list, get, and set permissions for secrets.

Google Cloud - Connect Sql Server to Google Apps Script

I am having problems connecting SQL Server on Google Cloud to Google Apps Script. I have tried many options for the connection URL, like: Jdbc.getCloudSqlConnection("jdbc:google:mysql://apis-para-pap:southamerica-east1:revistamarcasserver", "sqlserver", "*****"); but it does not connect: Exception: Failed to establish a database connection. Check connection string.
Can you help me solve this problem connecting SQL Server to Google Apps Script?
Information about the Google Cloud SQL Server instance:
DB Type: SQL Server 2017 Standard
Location: southamerica-east1-b
Instance name: apis-para-pap:southamerica-east1:revistamarcasserver
Public address: 34.95.157.142
White list: (72.14.192.0/18) (64.233.160.0/19) (209.85.128.0/17) (66.102.0.0/20) (74.125.0.0/16) (173.194.0.0/16) (66.249.80.0/20) (64.18.0.0/20) (216.239.32.0/19) (207.126.144.0/20)
(Note: using SQL Server Management Studio, I have tested and connected successfully with this information.)
Thank you so much
I created a Cloud SQL instance, authorized the IP ranges above, and was able to connect to it using the public IP address. I could not connect using the instance connection name.
var db = 'mydatabase';
var instanceUrl = 'jdbc:mysql://Public_IP_address_SQL';
var dbUrl = instanceUrl + '/' + db;
// Placeholder credentials for the user created in createUser().
var user = 'new_user';
var userPwd = '****';

/**
 * Create a new database within a Cloud SQL instance.
 */
function createDatabase() {
  var conn = Jdbc.getConnection(instanceUrl, {user: 'root', password: '****'});
  conn.createStatement().execute('CREATE DATABASE ' + db);
}

/**
 * Create a new user for your database with full privileges.
 */
function createUser() {
  var conn = Jdbc.getConnection(dbUrl, {user: 'root', password: '****'});
  var stmt = conn.prepareStatement('CREATE USER ? IDENTIFIED BY ?');
  stmt.setString(1, user);
  stmt.setString(2, userPwd);
  stmt.execute();
  conn.createStatement().execute('GRANT ALL ON `%`.* TO ' + user);
}

/**
 * Create a new table in the database.
 */
function createTable() {
  var conn = Jdbc.getConnection(dbUrl, {user: user, password: userPwd});
  conn.createStatement().execute('CREATE TABLE entries '
      + '(guestName VARCHAR(255), content VARCHAR(255), '
      + 'entryID INT NOT NULL AUTO_INCREMENT, PRIMARY KEY(entryID));');
}

createDatabase();
createUser();
createTable();

path.home is not configured in elasticsearch

Exception in thread "main" java.lang.IllegalStateException: path.home is not configured
at org.elasticsearch.env.Environment.<init>(Environment.java:101)
at org.elasticsearch.node.internal.InternalSettingsPreparer.prepareEnvironment(InternalSettingsPreparer.java:81)
at org.elasticsearch.node.Node.<init>(Node.java:128)
at org.elasticsearch.node.NodeBuilder.build(NodeBuilder.java:145)
at org.elasticsearch.node.NodeBuilder.node(NodeBuilder.java:152)
at JavaAPIMain.main(JavaAPIMain.java:43)
//adding document to elasticsearch using java
Node node = nodeBuilder().clusterName("myapplication").node();
Client client = node.client();
client.prepareIndex("kodcucom", "article", "1")
.setSource(putJsonDocument("ElasticSearch: Java",
"ElasticSeach provides Java API, thus it executes all operations " +
"asynchronously by using client object..",
new Date(),
new String[]{"elasticsearch"},
"Hüseyin Akdoğan")).execute().actionGet();
How about trying this one:
NodeBuilder.nodeBuilder()
    .settings(Settings.builder()
        .put("path.home", "/path/to/elasticsearch/home/dir")
        .build())
    .node();
Credits: https://github.com/elastic/elasticsearch/issues/15325
Always ask Google about your error message first. There are more than 5k results for your problem.
If you are using IntelliJ or Eclipse, edit the run configuration and add the line below to your VM options:
-Des.path.home={elasticsearch home directory}
For example, on my Mac:
-Des.path.home=/Users/supreeth.vp/elasticsearch-2.3.4/bin

Building a JMX client in a servlet installed on the Deployment Manager

I'm building a monitoring application as a servlet running on my WebSphere 7 ND deployment manager. The tool uses JMX to query the deployment manager for various data. Global security is enabled on the dmgr.
I'm having problems getting this to work, however. My first attempt was to use the WebSphere admin client code:
String sslProps = "file:" + base + "/properties/ssl.client.props";
System.setProperty("com.ibm.SSL.ConfigURL", sslProps);
String soapProps = "file:" + base + "/properties/soap.client.props";
System.setProperty("com.ibm.SOAP.ConfigURL", soapProps);

Properties connectProps = new Properties();
connectProps.setProperty(AdminClient.CONNECTOR_TYPE, AdminClient.CONNECTOR_TYPE_SOAP);
connectProps.setProperty(AdminClient.CONNECTOR_HOST, dmgrHost);
connectProps.setProperty(AdminClient.CONNECTOR_PORT, soapPort);
connectProps.setProperty(AdminClient.CONNECTOR_SECURITY_ENABLED, "true");
AdminClient adminClient = AdminClientFactory.createAdminClient(connectProps);
This results in the following exception:
Caused by: com.ibm.websphere.management.exception.ConnectorNotAvailableException: ADMC0016E: The system cannot create a SOAP connector to connect to host ssunlab10.apaceng.net at port 13903.
at com.ibm.ws.management.connector.soap.SOAPConnectorClient.getUrl(SOAPConnectorClient.java:1306)
at com.ibm.ws.management.connector.soap.SOAPConnectorClient.access$300(SOAPConnectorClient.java:128)
at com.ibm.ws.management.connector.soap.SOAPConnectorClient$4.run(SOAPConnectorClient.java:370)
at com.ibm.ws.security.util.AccessController.doPrivileged(AccessController.java:118)
at com.ibm.ws.management.connector.soap.SOAPConnectorClient.reconnect(SOAPConnectorClient.java:363)
... 22 more
Caused by: java.net.ConnectException: Connection refused
at java.net.PlainSocketImpl.socketConnect(Native Method)
at java.net.PlainSocketImpl.doConnect(PlainSocketImpl.java:333)
at java.net.PlainSocketImpl.connectToAddress(PlainSocketImpl.java:195)
at java.net.PlainSocketImpl.connect(PlainSocketImpl.java:182)
at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:366)
at java.net.Socket.connect(Socket.java:519)
at java.net.Socket.connect(Socket.java:469)
at java.net.Socket.<init>(Socket.java:366)
at java.net.Socket.<init>(Socket.java:209)
at com.ibm.ws.management.connector.soap.SOAPConnectorClient.getUrl(SOAPConnectorClient.java:1286)
... 26 more
So I then tried to do it via RMI, adding sas.client.props to the environment and setting the connector type in the code to CONNECTOR_TYPE_RMI. Now, though, I got a NameNotFoundException out of CORBA:
Caused by: javax.naming.NameNotFoundException: Context: , name: JMXConnector: First component in name JMXConnector not found. [Root exception is org.omg.CosNaming.NamingContextPackage.NotFound: IDL:omg.org/CosNaming/NamingContext/NotFound:1.0]
To see if it was an IBM issue, I tried using the standard JMX connector as well with the same result (substitute AdminClient for JMXConnector in the above error)
JMXServiceURL url = new JMXServiceURL("service:jmx:rmi:///jndi/JMXConnector");
Hashtable h = new Hashtable();
String providerUrl = "corbaloc:iiop:" + dmgrHost + ":" + rmiPort + "/WsnAdminNameService";
h.put(Context.PROVIDER_URL, providerUrl);
// Specify the user ID and password for the server if security is enabled on server.
String[] credentials = new String[] { "***", "***" };
h.put("jmx.remote.credentials", credentials);
// Establish the JMX connection.
JMXConnector jmxc = JMXConnectorFactory.connect(url, h);
// Get the MBean server connection instance.
mbsc = jmxc.getMBeanServerConnection();
At this point, in desperation, I wrote a wsadmin script to run both the RMI and SOAP methods. To my amazement, this works fine. So my question is: why does the code not work in a servlet installed on the dmgr?
regards,
Trevor
For the SOAP error, the ConnectException looks like the wrong SOAP host/port was used for the dmgr. I would double-check the server logs for the SOAP port. For the RMI error (NameNotFoundException), it looks like you're trying to use JMXConnectorFactory, which isn't supported by WAS.
If your application is installed on the dmgr, it's probably easiest to just use AdminServiceFactory.getAdminService to get an in-process reference to the AdminService rather than trying to open a new connection to the same process:
http://publib.boulder.ibm.com/infocenter/wasinfo/fep/topic/com.ibm.websphere.javadoc.doc/web/apidocs/com/ibm/websphere/management/AdminServiceFactory.html
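For comparison, the wsadmin route the question already confirmed working can be scripted in a few lines. A minimal Jython sketch, not part of the original answer, with hypothetical host and port values; wsadmin supplies the AdminControl object:
# Run with: wsadmin.sh -lang jython -conntype SOAP -host dmgrHost -port soapPort -f list_servers.py
# AdminControl.queryNames returns matching MBean ObjectNames as a newline-separated string.
servers = AdminControl.queryNames('WebSphere:type=Server,*')
for name in servers.splitlines():
    print name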
