RSQLite does not create local database

Reprex:
con <- DBI::dbConnect(RSQLite::SQLite(), path = "test.sqlite")
dbWriteTable(con, "mtcars", mtcars)
dbListTables(con)
[1] "mtcars"
dbDisconnect(con)
When I come back:
con <- DBI::dbConnect(RSQLite::SQLite(), path = "test.sqlite")
dbListTables(con)
character(0)
I thought dbConnect should create a database if none exists. I don't know what is going on.

To create a local database you still need to supply parameters specific to your machine. This fixed the issue for me:
con <- DBI::dbConnect(RSQLite::SQLite(),
                      user = 'root',
                      password = '',
                      dbname = 'test.sqlite',
                      host = 'localhost')
This is poorly documented if you do not have much understanding of SQLite databases. It seems the database was only being created in memory. Maybe someone else can expound on this; I think a warning would help guide users in this situation.
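For comparison, the same file-versus-memory distinction is easy to see with Python's built-in sqlite3 module (a sketch in Python rather than R, since the behavior being illustrated is SQLite's own): connecting with a file path creates the database file on disk if it does not exist, whereas an in-memory database vanishes as soon as the connection is closed.
import os
import sqlite3

# Connecting with a file path creates the database file if it does not already exist.
con = sqlite3.connect("test.sqlite")
con.execute("CREATE TABLE IF NOT EXISTS mtcars (model TEXT, mpg REAL)")
con.execute("INSERT INTO mtcars VALUES ('Mazda RX4', 21.0)")
con.commit()
con.close()
print(os.path.exists("test.sqlite"))  # True: the data survives on disk

# An in-memory database lives only as long as the connection.
mem = sqlite3.connect(":memory:")
mem.execute("CREATE TABLE mtcars (model TEXT, mpg REAL)")
mem.close()  # everything created above is gone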

Related

Failed to run spark query in databricks notebook after storage configurations

I already set up key vault scope in the notebooks and I established the connection to the storage account using the following steps:
spark.conf.set("fs.azure.account.auth.type."+StorageAccountName+".dfs.core.windows.net", "OAuth")
spark.conf.set("fs.azure.account.oauth.provider.type."+StorageAccountName+".dfs.core.windows.net","org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider")
spark.conf.set("fs.azure.account.oauth2.client.id."+StorageAccountName+".dfs.core.windows.net",clientId)
spark.conf.set("fs.azure.account.oauth2.client.secret."+StorageAccountName+".dfs.core.windows.net",clientSecret)
spark.conf.set("fs.azure.account.oauth2.client.endpoint."+StorageAccountName+".dfs.core.windows.net","https://login.microsoftonline.com/mytenantid/oauth2/token")
The values of "StorageAccountName", "clientId", and "clientSecret" all come from the key vault, and I am able to retrieve them properly. In my storage account's access control I also assigned the Storage Blob Data Contributor role to my service principal.
After these configurations, I assigned a connection variable:
var apptable = "abfss://container@"+StorageAccountName+".dfs.core.windows.net/path/to/data"
If I run the following command, I am able to see the files in the blob storage
display(dbutils.fs.ls(apptable))
I am also able to check the schema:
var df = spark.read.format("delta").load(apptable)
df.printSchema()
But when I try to run the following query:
var last_appt = spark.sql(s"""select max(updateddate) from apptable""").collect()(0).getTimestamp(0)
I got the error:
KeyProviderException: Failure to initialize configuration
Caused by: InvalidConfigurationValueException: Invalid configuration value detected for fs.azure.account.key
I researched online and it seems there are some issues in the Spark configs. But if it failed to get access to the storage, how come the display command above runs fine? What could possibly be missing in this scenario?
I have limited experience with Databricks. Appreciate any help.
I tried to reproduce the same in my environment, configured it as mentioned above, and got the results below.
Please follow the code below:
Read the Spark DataFrame df:
var df = spark.read.format("delta").load(apptable)
Create a temp view:
%scala
val temp_table_name = "demtb"
df.createOrReplaceTempView(temp_table_name)
Now, using the code below, I got this output.
%scala
val aa= spark.sql("""select max(marks) from demtb""")
display(aa)
Update:
As mentioned in the comment below, it's working fine for me.
df1.write.mode("overwrite").format("parquet").option("path","/FileStore/dd/").option("overwriteschema","true").saveAsTable("app")
Also, you can try this syntax for configuring Azure Gen2. You can change the file format as required; for the demo I'm using CSV.
spark.conf.set("fs.azure.account.key.<storage_account_name>.dfs.core.windows.net","Access_key")
Scala
%scala
val df1 = spark.read.format("csv").option("header", "true").load("abfss://pool@vamblob.dfs.core.windows.net/")
display(df1)
Python
df1 = spark.read.format("csv").option("header", "true").load("abfss://pool@vamblob.dfs.core.windows.net/")
display(df1)
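Tying the answer back to the original question, here is a minimal PySpark sketch of the same temp-view approach (spark and StorageAccountName come from the notebook context as in the question; the path and the updateddate column are assumptions taken from the question, not verified against real data):
# Load the Delta table from ADLS Gen2 and register it as a temp view
apptable = "abfss://container@" + StorageAccountName + ".dfs.core.windows.net/path/to/data"
df = spark.read.format("delta").load(apptable)
df.createOrReplaceTempView("appt")

# Query the view name (not the path variable) with spark.sql
last_appt = spark.sql("SELECT max(updateddate) AS last_appt FROM appt").collect()[0][0]
print(last_appt)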

Access Based Share Enum in samba on Macos

I'm trying to set up a Samba server on an M1 Mac. I have installed Samba via Homebrew.
Users and groups have been added to the OS, and then the users were added to Samba via smbpasswd.
Everything works as expected. Users are allowed into shares to which their groups have permissions (using groups on the directory) and denied when they don't. So far, so good.
Everything breaks when I try to hide shares the user doesn't have access to, by adding 'access based share enum = yes' to the global section in smb.conf.
Then the client returns an error saying "There are no shares available or you are not allowed to access them on the server."
The frustrating bit is that I have had it working, but I'm starting to lose the will!
Any suggestions?
smb.conf below
[global]
workgroup = WORKGROUP
security = user
passdb backend = tdbsam
inherit permissions = yes
inherit owner = yes
ea support = yes
min protocol = SMB2
vfs objects = fruit streams_xattr
#fruit:metadata = stream
#fruit:model = MacSamba
#fruit:posix_rename = yes
#fruit:wipe_intentionally_left_blank_rfork = yes
#fruit:veto_appledouble = no
#fruit:delete_empty_adfiles = yes
access based share enum = yes
max log size = 100000
[IT_application]
path = /Volumes/WORKFLOW/data/shares/IT
valid users = @workflow_it
force group = workflow_it
read only = no
browseable = yes
public = no
writeable = yes
What happens when you remove the valid users = @workflow_it and force group = workflow_it options?
There was a regression related to reading the share_info.tdb file. It was fixed a few weeks ago so it should show up in the 4.16.9 and 4.17.5 releases.
https://bugzilla.samba.org/show_bug.cgi?id=15265
While I wouldn't recommend it in a production system, I can confirm that chmod 666 on share_info.tdb does in fact restore the expected behavior.

Linking to Oracle tables in Access using VB6: Error 3000?

I am trying to link an Oracle table to Access using the following Visual Basic 6.0 code:
Dim objApp, objDB, objTable As Object
Dim strFile, strConnect, strLocalTable, strServerTable As String
strFile = "C:\path\to\base.mdb"
strLocalTable = "local"
strServerTable = "BASE.TABLE_NAME"
strConnect = "ODBC;Driver={Microsoft ODBC for Oracle};ConnectString=name.world;Uid=username;Pwd=password;"
Set objApp = CreateObject("Access.Application")
objApp.OpenCurrentDatabase strFile
Set objDB = objApp.CurrentDb()
Set objTable = objDB.CreateTableDef(strLocalTable)
objTable.Connect = strConnect
objTable.SourceTableName = strServerTable
objDB.TableDefs.Append objTable 'Generates 3000 Error
objDB.TableDefs.Refresh
On the second-to-last line I get (loosely translated from Swedish by me) "Run-time error 3000: Reserved error (-7778). There is no message for this error."
Any ideas on why this may be? I am told this code has worked before, so it could possibly be some kind of version conflict with updated software. The database is in Access 2000 format, and Access 2013 is installed on the computer (however, saving the database as Access 2013 does not help). Or is there something wrong with the connection string perhaps?
EDIT: I tried using a DSN in the connection string:
strConnect = "ODBC;Driver={Microsoft ODBC for Oracle};DSN='test';"
I get the same error, even though I can use that very DSN to link the tables manually in Access.
Also (as I stated in the comments) changing some of the information in the connection string (like deliberately providing an incorrect username) leads to a different error (3146: Connection failed). This leads me to believe that the connection to the database works, since it seems to be able to differentiate between good and bad credentials.
Try this connection string and leave out the 'world.' part
ODBC;DRIVER={Oracle in orahome32};UID=userId;PWD=password;SERVER=servername;dbq=servername
(I was having trouble earlier today with connections that left the dbq out)
Or maybe your existing one will work, but regardless... I think Access likes you to create the TableDef in one swoop and not break things up, so...
Instead of this:
Set objTable = objDB.CreateTableDef(strLocalTable)
objTable.Connect = strConnect
objTable.SourceTableName = strServerTable
Try This:
Set objTable = objDB.CreateTableDef(strLocalTable, dbAttachSavePWD, strServerTable, strConnect)
(NOTE: the dbAttachSavePWD will help avoid users getting prompted for password every time they touch the table; leave it out if that is not desired)

Mongo::OperationFailure - need to login when using from_uri

My goal is to connect with my heroku/mongolab database but I keep getting this error:
Mongo::OperationFailure at /mongotest/a/b
: need to login
file: networking.rb
location: send_message_with_gle
line: 89
The code I'm using is:
client = Mongo::MongoClient.from_uri(ENV['MONGOLAB_URI'])
db = client.db('test')
testcoll = db['testcoll']
testcoll.insert({:'_id' => "def", :'test' => "woop de doop"})
testcoll.find()
ENV['MONGOLAB_URI']=mongodb://heroku_app########:password@ds0xxxxx.mongolab.com:xxxxx/heroku_app########
I know that the uri is correct and contains the username and password, so why the error? Also, the error occurs on the insert() line, not the line where I authenticate.
Welp, turns out the URI connects me to the heroku_app######## database, but I'm then trying to access the database called test, so obviously I'm not authenticated there. It would have been nice if Mongo had returned an error specifying that I had logged in, but not to the right database. Oh well.
I hadn't paid enough attention to the format of the uri, which is
mongodb://username:password@host:port/database
The database part is... pretty important, it turns out.
(I actually found the answer to this while writing the test, but if this answer had existed it might have saved me an embarrassingly large amount of time, so I'm writing it again and answering it myself.)
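The same URI rule applies outside Ruby. As an illustration, here is a small PyMongo sketch (Python rather than the Ruby driver from the question, with a placeholder URI) showing that the database named in the URI path is the one your credentials apply to:
from pymongo import MongoClient

# The trailing "/heroku_appXXXX" path segment selects the database the credentials belong to.
uri = "mongodb://username:password@host:27017/heroku_appXXXX"
client = MongoClient(uri)

db = client.get_default_database()  # the database named in the URI
db["testcoll"].insert_one({"_id": "def", "test": "woop de doop"})
# Asking for a different database (e.g. client["test"]) would fail, because the login is not valid there.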

ResultSet.getString() on VARCHAR2 column returns empty string

Disclaimer: I don't actually know anything about either Oracle or Java. The issue is in a project that some other developer completed at some point in time and then left the company. Now I have to set up the web server and database and get it all up and running.
The code is approximately this:
OracleDataSource ods = new OracleDataSource();
ods.setURL("jdbc:oracle:thin:<user>/<password>@localhost:1521:xe");
OracleConnection ocon = (OracleConnection)ods.getConnection();
OracleStatement stmt = (OracleStatement)ocon.createStatement();
OracleResultSet rs = (OracleResultSet)stmt.executeQuery("SELECT POLLID, QUESTION, ISMULTISELECT FROM POLL WHERE POLLID = " + pollID);
if (!rs.next()) {
System.out.println("No rows found.");
return false;
}
this._PollID = rs.getInt("POLLID");
this._Question = rs.getString("QUESTION");
this._IsMultiSelect = rs.getBoolean("ISMULTISELECT");
The POLLID and ISMULTISELECT columns return correct values as expected. The QUESTION column seems to always return an empty string. The value in the DB is obviously not empty.
The rs.getAsciiStream("QUESTION").available() also returns zero.
Am I missing something completely obvious here?
EDIT:
sqlplus returns the VARCHAR2 value just fine
connecting via ODBC (as opposed to thin) also makes things work
So no exceptions, and you are not using reserved words... Maybe try another driver, or select into another table and experiment: start with an empty QUESTION column, then add some value and debug.
Thanks to everyone who replied. At this point it seems the issue is between the thin driver and the XE version of Oracle. Unfortunately we don't have a full version kickin' around (we are primarily ASP.NET/MS SQL developers), so we'll have to stick with the ODBC driver for now and hope the issue will magically resolve itself when we push it to the live environment (hosted by a third party). Very crappy assumption to make, but at this point I see no other options...
I had the same exact issue and found that the root of the problem was orai18n.jar. Once I removed it from my classpath, the issue went away.
I have the same problem. I do not have access to the driver that is used because the connection is taken from a WebLogic server using JNDI. I cannot remove any .jar from the server either.
The workaround I found:
String value = new String(resultset.getBytes("QUESTION"));
Make sure you use the right encoding if required:
String value = new String(resultset.getBytes("QUESTION"), [CHARSET]);
I had this same issue with Eclipse GCJ (stock CentOS 6) and mysql-connector with the same concatenated queries. The problem was solved by reverting back to OpenJDK.
I had the same issue. "getInt()" would return the correct value from the Oracle 9i DB, but "getString()" would result in an empty string, no matter how many times I ran it, within Eclipse or outside on a separate Tomcat or other servers.
After going through a lot of various threads, and quite a few trials, I came to the conclusion that the issue was with the version of ojdbc6.jar that I was using. Earlier, I was using the ojdbc6.jar from Oracle version 12.1.0.1, which is not so good for connecting to an old Oracle 9i DB. After realising this, I switched to the ojdbc6.jar from Oracle 11.2.0.3 and it worked like a charm.
Hope it helps. Cheers!
