I created a MongoDB account on the MongoLab cloud server, and I have a DB created that is currently empty (no collections, just one user).
This is the command given by MongoLab to connect to the DB:
mongo dbh13.mongolab.com:27137/myDB -u <username> -p <password>
Is the username here my MongoLab account credential or the user I created in myDB? I tried both; neither authorizes.
But if I try to connect directly without authorization (from Windows), it works with this command:
mongo dbh13.mongolab.com:27137/myDB
But after that, if I try to do something like show dbs or show collections, it fails with the following message:
> show dbs
assert failed : listDatabases failed:{
"assertion" : "unauthorized db:admin lock type:-1 client:38.117.159.162"
,
"assertionCode" : 10057,
"errmsg" : "db assertion failure",
"ok" : 0
}
Fri Aug 12 16:30:50 uncaught exception: assert failed : listDatabases failed:{
"assertion" : "unauthorized db:admin lock type:-1 client:38.117.159.162"
,
"assertionCode" : 10057,
"errmsg" : "db assertion failure",
"ok" : 0
}
Any ideas?
I got authorization working from the Windows shell by connecting first and then authenticating with db.auth(). (The credentials are the database user created in myDB, not the MongoLab account login.)
> mongo "dbh13.mongolab.com:27137/myDB"
MongoDB shell version: 1.6.5
connecting to: dbh13.mongolab.com:27137/myDB
> db.auth("<username>","<password>")
http://support.mongolab.com/entries/20177338-i-m-using-the-windows-mongo-shell-and-can-t-connect-help
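For what it's worth, the same database-user credentials also work from a driver, which is one way to confirm which account is being used. A minimal sketch with the Node.js mongodb package (a modern driver API shown only for illustration; the username and password are placeholders):

import { MongoClient } from "mongodb";

async function main(): Promise<void> {
  // The credentials in the URI are the database user created in myDB,
  // not the MongoLab website login.
  const uri = "mongodb://<username>:<password>@dbh13.mongolab.com:27137/myDB";
  const client = new MongoClient(uri);
  await client.connect();
  // List the collections in myDB to confirm the login is authorized.
  const collections = await client.db("myDB").listCollections().toArray();
  console.log(collections);
  await client.close();
}

main().catch(console.error);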
I'm debugging some tests for the .NET SurrealDB library. I can open connections to the database just fine, but when I send a create RPC message to the db (a Docker container), it returns an error that reads "There was a problem with the database: The table does not exist"
TRACE tungstenite::protocol Received message {"id":"02B70C1AFE5D","async":true,"method":"create","params":["users",{"username":"john","password":"test123"}]}
...
surreal_1 | [2022-09-16 13:46:45] DEBUG surrealdb::dbs Executing: CREATE $what CONTENT $data RETURN AFTER
surreal_1 | [2022-09-16 13:46:45] TRACE surrealdb::dbs Iterating: CREATE $what CONTENT $data RETURN AFTER
code: -32000, message: "There was a problem with the database: The table does not exist"
Any idea why that would happen? The table, of course, doesn't exist, since I'm trying to create it. Could there be another reason in the Surreal code that such an error would be returned?
The error message was a red herring. The actual issue was that the client had a bug that prevented it from signing in correctly, so it wasn't authorized to make changes to the database.
Offending code:
// The table doesn't exist
Err(Error::TbNotFound) => match opt.auth.check(Level::Db) {
    // We can create the table automatically
    true => {
        run.add_and_cache_ns(opt.ns(), opt.strict).await?;
        run.add_and_cache_db(opt.ns(), opt.db(), opt.strict).await?;
        run.add_and_cache_tb(opt.ns(), opt.db(), &rid.tb, opt.strict).await
    }
    // We can't create the table, so error
    false => Err(Error::TbNotFound), // Wrong error message
},
This has since been fixed and should now return a query permission error if the client is unauthorized.
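For reference, the root cause on the client side is an unauthenticated session. A minimal sketch of the correct sequence with the JavaScript client (assuming the surrealdb.js v0.x API; the URL, credentials, and namespace/database names are placeholders):

import Surreal from "surrealdb.js";

async function main(): Promise<void> {
  const db = new Surreal("http://127.0.0.1:8000/rpc");
  // Sign in before issuing any create; an unauthenticated session is what
  // surfaced the misleading "table does not exist" error here.
  await db.signin({ user: "root", pass: "root" });
  await db.use("test", "test");
  // With an authorized session, create defines the table implicitly.
  const created = await db.create("users", {
    username: "john",
    password: "test123",
  });
  console.log(created);
}

main().catch(console.error);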
I am a beginner with Metaplex.
What I want to achieve is a presale feature, which means a whitelist.
I followed the instructions in Metaplex to set up config.json:
"whitelistMintSettings": {
  "mode": { "neverBurn": true },
  "mint": "xxxxKnH5",
  "presale": true,
  "discountPrice": 0.5
},
Here, I set xxxxKnH5 as a member of the whitelist to let him mint before the public mint.
Then I run update_candy_machine (this works fine).
But in the UI, I always get this error message:
There was a problem fetching whitelist token balance
Home.tsx:184 Error: failed to get token account balance: Invalid param: could not find account
Any idea why I receive this message and how I can fix it?
Make sure that the "mint": "XXX...X" line contains a 0-decimals SPL token; a minimal sketch of creating one follows.
Moreover, for airdropping it to users, join the Metaplex Discord server and check the #airdrop-bash thread. It contains a JS script that makes airdropping the token to wallets a lot easier. :)
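As a rough illustration of what a 0-decimals token means, here is a sketch that creates such a mint with @solana/spl-token (assuming a v0.2+ API and a funded payer keypair; the cluster and authority choices are placeholders):

import { clusterApiUrl, Connection, Keypair } from "@solana/web3.js";
import { createMint } from "@solana/spl-token";

async function createWhitelistMint(payer: Keypair): Promise<void> {
  const connection = new Connection(clusterApiUrl("devnet"), "confirmed");
  // A whitelist token must be an SPL mint with 0 decimals.
  const mint = await createMint(
    connection,
    payer,           // pays the transaction fees
    payer.publicKey, // mint authority
    null,            // no freeze authority
    0                // decimals: must be 0 for a whitelist token
  );
  // Use this address as the "mint" value in whitelistMintSettings.
  console.log("Whitelist mint:", mint.toBase58());
}

The same thing can also be done with the CLI: spl-token create-token --decimals 0.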
I am trying to connect to an Azure SQL DB from Databricks with AAD Password authentication. I imported the Azure SQL DB and adal4j libs, but I am still getting the error below:
java.lang.NoClassDefFoundError: com/nimbusds/oauth2/sdk/AuthorizationGrant
stack trace:
at com.microsoft.sqlserver.jdbc.SQLServerADAL4JUtils.getSqlFedAuthToken(SQLServerADAL4JUtils.java:24)
at com.microsoft.sqlserver.jdbc.SQLServerConnection.getFedAuthToken(SQLServerConnection.java:3609)
at com.microsoft.sqlserver.jdbc.SQLServerConnection.onFedAuthInfo(SQLServerConnection.java:3580)
at com.microsoft.sqlserver.jdbc.SQLServerConnection.processFedAuthInfo(SQLServerConnection.java:3548)
at com.microsoft.sqlserver.jdbc.TDSTokenHandler.onFedAuthInfo(tdsparser.java:261)
at com.microsoft.sqlserver.jdbc.TDSParser.parse(tdsparser.java:103)
at com.microsoft.sqlserver.jdbc.SQLServerConnection.sendLogon(SQLServerConnection.java:4290)
at com.microsoft.sqlserver.jdbc.SQLServerConnection.logon(SQLServerConnection.java:3157)
at com.microsoft.sqlserver.jdbc.SQLServerConnection.access$100(SQLServerConnection.java:82)
at com.microsoft.sqlserver.jdbc.SQLServerConnection$LogonCommand.doExecute(SQLServerConnection.java:3121)
at com.microsoft.sqlserver.jdbc.TDSCommand.execute(IOBuffer.java:7151)
at com.microsoft.sqlserver.jdbc.SQLServerConnection.executeCommand(SQLServerConnection.java:2478)
at com.microsoft.sqlserver.jdbc.SQLServerConnection.connectHelper(SQLServerConnection.java:2026)
at com.microsoft.sqlserver.jdbc.SQLServerConnection.login(SQLServerConnection.java:1687)
at com.microsoft.sqlserver.jdbc.SQLServerConnection.connectInternal(SQLServerConnection.java:1528)
at com.microsoft.sqlserver.jdbc.SQLServerConnection.connect(SQLServerConnection.java:866)
at com.microsoft.sqlserver.jdbc.SQLServerDriver.connect(SQLServerDriver.java:569)
at org.apache.spark.sql.execution.datasources.jdbc.JdbcUtils$$anonfun$createConnectionFactory$1.apply(JdbcUtils.scala:63)
at org.apache.spark.sql.execution.datasources.jdbc.JdbcUtils$$anonfun$createConnectionFactory$1.apply(JdbcUtils.scala:54)
at org.apache.spark.sql.execution.datasources.jdbc.JDBCRDD$.resolveTable(JDBCRDD.scala:56)
at org.apache.spark.sql.execution.datasources.jdbc.JDBCRelation.<init>(JDBCRelation.scala:115)
at org.apache.spark.sql.execution.datasources.jdbc.JdbcRelationProvider.createRelation(JdbcRelationProvider.scala:5
at com.databricks.backend.daemon.driver.DriverWrapper.tryExecutingCommand(DriverWrapper.scala:590)
at com.databricks.backend.daemon.driver.DriverWrapper.getCommandOutputAndError(DriverWrapper.scala:474)
at com.databricks.backend.daemon.driver.DriverWrapper.executeCommand(DriverWrapper.scala:548)
at com.databricks.backend.daemon.driver.DriverWrapper.runInnerLoop(DriverWrapper.scala:380)
at com.databricks.backend.daemon.driver.DriverWrapper.runInner(DriverWrapper.scala:327)
at com.databricks.backend.daemon.driver.DriverWrapper.run(DriverWrapper.scala:215)
at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.ClassNotFoundException: com.nimbusds.oauth2.sdk.AuthorizationGrant
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:338)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
I imported the nimbusds lib into my workspace.
Here is the config:
import com.microsoft.azure.sqldb.spark.config.Config
import com.microsoft.azure.sqldb.spark.connect._
import org.apache.spark.sql.SparkSession

val spark: SparkSession = SparkSession.builder().getOrCreate()

val config = Config(Map(
  "url" -> "ServerName.database.windows.net",
  "databaseName" -> "dbname",
  "dbTable" -> "dbo.test",
  "user" -> "alias#domain.com",
  "password" -> "pwd",
  "authentication" -> "ActiveDirectoryPassword",
  "encrypt" -> "true",
  "trustServerCertificate" -> "false",
  "hostNameInCertificate" -> "*.database.windows.net"
))

val collection = spark.read.sqlDB(config)
collection.show()
Please help if anyone has resolved this issue.
Create a Databricks Cluster
Known working configuration - Databricks Runtime 5.2 (includes Apache Spark 2.4.0, Scala 2.11)
Install the Spark Connector for Microsoft Azure SQL Database and SQL Server
Navigate to Cluster > Library > Install New > Maven > Search Packages
Switch to Maven Central
Search for azure-sqldb-spark (com.microsoft.azure:azure-sqldb-spark)
Click Select
Click Install
Known working version - com.microsoft.azure:azure-sqldb-spark:1.0.2
Update Variables
Update variable values (clusterName, server, database, table, username, password)
Run the Initialisation command (ONCE ONLY)
This will do the following:
Create a folder called init under dbfs:/databricks/init/
Create a sub-folder with the name of the Databricks cluster
Create a bash script per dependency
Bash Script Commands:
* wget: Retrieve content from a web server
* --quiet: Turn off wget's output
* -O: Write output to the specified file
Dependencies:
http://central.maven.org/maven2/com/microsoft/azure/adal4j/1.6.0/adal4j-1.6.0.jar
http://central.maven.org/maven2/com/nimbusds/oauth2-oidc-sdk/5.24.1/oauth2-oidc-sdk-5.24.1.jar
http://central.maven.org/maven2/net/minidev/json-smart/1.1.1/json-smart-1.1.1.jar
http://central.maven.org/maven2/com/nimbusds/nimbus-jose-jwt/7.0.1/nimbus-jose-jwt-7.0.1.jar
Restart the Databricks Cluster
This is needed to execute the init script.
Run the last cell in this Notebook
This will test the ability to connect to an Azure SQL Database via Active Directory authentication.
Init Command
// Initialisation
// This code block only needs to be run once to create the init script for the cluster (file remains on restart)
// Get the cluster name
var clusterName = dbutils.widgets.get("cluster")
// Create dbfs:/databricks/init/ if it doesn't exist.
dbutils.fs.mkdirs("dbfs:/databricks/init/")
// Create a directory named (clusterName) using Databricks File System - DBFS.
dbutils.fs.mkdirs(s"dbfs:/databricks/init/$clusterName/")
// Create the adal4j script.
dbutils.fs.put(s"/databricks/init/$clusterName/adal4j-install.sh","""
#!/bin/bash
wget --quiet -O /mnt/driver-daemon/jars/adal4j-1.6.0.jar http://central.maven.org/maven2/com/microsoft/azure/adal4j/1.6.0/adal4j-1.6.0.jar
wget --quiet -O /mnt/jars/driver-daemon/adal4j-1.6.0.jar http://central.maven.org/maven2/com/microsoft/azure/adal4j/1.6.0/adal4j-1.6.0.jar""", true)
// Create the oauth2 script.
dbutils.fs.put(s"/databricks/init/$clusterName/oauth2-install.sh","""
#!/bin/bash
wget --quiet -O /mnt/driver-daemon/jars/oauth2-oidc-sdk-5.24.1.jar http://central.maven.org/maven2/com/nimbusds/oauth2-oidc-sdk/5.24.1/oauth2-oidc-sdk-5.24.1.jar
wget --quiet -O /mnt/jars/driver-daemon/oauth2-oidc-sdk-5.24.1.jar http://central.maven.org/maven2/com/nimbusds/oauth2-oidc-sdk/5.24.1/oauth2-oidc-sdk-5.24.1.jar""", true)
// Create the json script.
dbutils.fs.put(s"/databricks/init/$clusterName/json-smart-install.sh","""
#!/bin/bash
wget --quiet -O /mnt/driver-daemon/jars/json-smart-1.1.1.jar http://central.maven.org/maven2/net/minidev/json-smart/1.1.1/json-smart-1.1.1.jar
wget --quiet -O /mnt/jars/driver-daemon/json-smart-1.1.1.jar http://central.maven.org/maven2/net/minidev/json-smart/1.1.1/json-smart-1.1.1.jar""", true)
// Create the jwt script.
dbutils.fs.put(s"/databricks/init/$clusterName/jwt-install.sh","""
#!/bin/bash
wget --quiet -O /mnt/driver-daemon/jars/nimbus-jose-jwt-7.0.1.jar http://central.maven.org/maven2/com/nimbusds/nimbus-jose-jwt/7.0.1/nimbus-jose-jwt-7.0.1.jar
wget --quiet -O /mnt/jars/driver-daemon/nimbus-jose-jwt-7.0.1.jar http://central.maven.org/maven2/com/nimbusds/nimbus-jose-jwt/7.0.1/nimbus-jose-jwt-7.0.1.jar""", true)
// Check that the cluster-specific init script exists.
display(dbutils.fs.ls(s"dbfs:/databricks/init/$clusterName/"))
Test Command
// Connect to Azure SQL Database via Active Directory Password Authentication
import com.microsoft.azure.sqldb.spark.config.Config
import com.microsoft.azure.sqldb.spark.connect._
// Get Widget Values
var server = dbutils.widgets.get("server")
var database = dbutils.widgets.get("database")
var table = dbutils.widgets.get("table")
var username = dbutils.widgets.get("user")
var password = dbutils.widgets.get("password")
val config = Config(Map(
"url" -> s"$server.database.windows.net",
"databaseName" -> s"$database",
"dbTable" -> s"$table",
"user" -> s"$username",
"password" -> s"$password",
"authentication" -> "ActiveDirectoryPassword",
"encrypt" -> "true",
"ServerCertificate" -> "false",
"hostNameInCertificate" -> "*.database.windows.net"
))
val collection = sqlContext.read.sqlDB(config)
collection.show()
As a 2020 update:
I did cluster init scripts as described, but in the end my working setup didn't seem to require that.
I ended up using Scala 2.11 (note 2.11) and these libraries installed through the UI: com.microsoft.azure:azure-sqldb-spark:1.0.2 and mssql_jdbc_8_2_2_jre8.jar (note jre8). I also had to explicitly mention the driver class in the config:
import com.microsoft.azure.sqldb.spark.config.Config
import com.microsoft.azure.sqldb.spark.connect._
val config = Config(Map(
"url" -> "....database.windows.net",
"databaseName" -> "...",
"dbTable" -> "...",
"accessToken" -> "...",
"hostNameInCertificate" -> "*.database.windows.net",
"encrypt" -> "true",
"ServerCertificate" -> "false",
"driver" -> "com.microsoft.sqlserver.jdbc.SQLServerDriver"
))
val collection = spark.read.sqlDB(config)
collection.show()
Token acquisition was done with msal (python):
import msal

TenantId = "...guid..."
authority = "https://login.microsoftonline.com/" + TenantId
scope = "https://database.windows.net//.default"  # 🤦 yes, with double "//"
ServicePrincipalId = "...guid..."
ServicePrincipalPwd = "secret"

app = msal.ConfidentialClientApplication(
    client_id=ServicePrincipalId,
    authority=authority,
    client_credential=ServicePrincipalPwd,
)

result = app.acquire_token_silent(scopes=[scope], account=None)
if not result:
    result = app.acquire_token_for_client(scopes=[scope])
if "access_token" in result:
    sqlAzureAccessToken = result["access_token"]
I have defined the participant as:
participant SupplierChainParticipant identified by participantId {
  o String participantId
  o String identity
  o String type
  o String secret
}
I have also created a participant using the composer-rest-server LoopBack API. Participant created:
{
  "$class": "org.suppchain.SupplierChainParticipant",
  "participantId": "Deepak",
  "identityName": "Parmar",
  "type": "OEM",
  "secret": "not defined"
}
Now, I am trying to issue an identity using the Composer CLI with this command:
composer identity issue -n 'supplychain-network' -p hlfv1 -i admin -s adminpw -u dvparmar -a "org.suppchain.SupplierChainParticipant#Deepak"
But I am getting the following error:
Error: fabric-ca request register failed with errors [[{"code":0,"message":"User 'dvparmar' is already registered"}]]
Command failed.
Any input or suggestions would be helpful.
Please file an issue if you can reproduce this and we will look at the order in which we perform operations and the error handling.
We use Windows Server 2012 on our machine.
When I try to create a text index, an error occurs after 50%.
MongoDB v3.0.4
290 GB database size, 30M rows in the collection
How can I solve this problem?
> db.myCollection.ensureIndex({myField:1}, {name: "myIndex"})
{
"createdCollectionAutomatically" : false,
"numIndexesBefore" : 1,
"ok" : 0,
"errmsg" : "error opening file \"E:\\MongoDB\\base1\\_tmp\extsort.509\":errno:24 Too many open files",
"code" : 16818
}