Google Cloud SQL + RETURN_GENERATED_KEYS - jdbc

I've got a table in my Google Cloud SQL database with an auto-incrementing column.
How do I execute an INSERT query via google-apps-script/JDBC and get back the value for the newly incremented column?
For example, my column is named ticket_id. I want to INSERT and have the new ticket_id value be returned in the result set.
In other words, given the following structure, what would I need to modify, and how, so that I can do something like rs = stmt.getGeneratedKeys()?
var conn = Jdbc.getCloudSqlConnection("jdbc:google:rdbms:.......
var stmt = conn.createStatement();
//build my INSERT sql statement
var sql = "insert into ......
var rs = stmt.executeUpdate(sql);
I see that there is a JDBC statement class with a member called RETURN_GENERATED_KEYS but I have so far not been smart enough to figure out how to properly manipulate that and get what I need. Is RETURN_GENERATED_KEYS a constant, is it an attribute, or how can I make use of it?

The documentation for the Apps Script JDBC service is a bit lacking; I've created an internal task item for that. Thankfully, the Apps Script JDBC API follows the Java JDBC API pretty closely. The key is to get the result set back using the stmt.getGeneratedKeys() call.
I built a sample table using the animals example from the MySQL docs, and the sample below works nicely against it, logging the newly generated ID.
function foo() {
  var conn = Jdbc.getCloudSqlConnection("jdbc:google:rdbms://<instance>/<db>");
  var stmt = conn.createStatement();
  var sql = "INSERT INTO animals (name) VALUES ('dog')";
  // The second argument requests generated keys; 1 is the value of the Java
  // JDBC int constant Statement.RETURN_GENERATED_KEYS (so yes, it is a constant).
  var count = stmt.executeUpdate(sql, 1);
  var rs = stmt.getGeneratedKeys();
  // If you are only expecting one row back, skip the loop and just call rs.next().
  while (rs.next()) {
    Logger.log(rs.getString(1));
  }
  rs.close();
  stmt.close();
  conn.close();
}

Related

Multiple SQL Queries with Google Script and JDBC connectors in a single Sheet

I'm using an Apps Script with the JDBC connector to create a connection to my MySQL stats database.
I would like to be able to run multiple queries in a single script to pull specific info from my database into a single sheet.
All my queries' results have the same structure (all are grouped by date): YEAR, MONTH, DAY, VALUE. I was wondering if it was possible to keep the YEAR, MONTH, DAY columns fixed, with the results of each query populating the following columns.
Something like
Col A: YEAR, Col B: MONTH, Col C: DAY, Col D: Result from Query 1, Col E: Result from Query 2....
Here is what I started with
function loadData() {
  var sheet = SpreadsheetApp.getActiveSheet();
  // address, db, user and userPwd are defined elsewhere
  var instanceUrl = 'jdbc:mysql://' + address;
  var dbUrl = instanceUrl + '/' + db;
  var conn = Jdbc.getConnection(dbUrl, user, userPwd);
  var stmt = conn.createStatement();
  stmt.setMaxRows(10);
  var results = stmt.executeQuery('SELECT YEAR(dateCreated) AS yearAdded, MONTH(dateCreated) AS monthAdded, DAY(dateCreated) AS dayAdded, SUM(amount)/100 FROM stripeTransactions GROUP BY yearAdded, monthAdded, dayAdded;');
  var numCols = results.getMetaData().getColumnCount();
  while (results.next()) {
    var rowArray = [];
    for (var col = 0; col < numCols; col++) {
      rowArray.push(results.getString(col + 1));
    }
    sheet.appendRow(rowArray);
  }
  results.close();
  stmt.close();
  var stmt2 = conn.createStatement();
  stmt2.setMaxRows(10);
  var results2 = stmt2.executeQuery('SELECT YEAR(dateAdded) AS yearAdded, MONTH(dateAdded) AS monthAdded, DAY(dateAdded) AS dayAdded, SUM(amount) FROM itunestransactions GROUP BY yearAdded, monthAdded, dayAdded;');
  var numCols2 = results2.getMetaData().getColumnCount();
  while (results2.next()) {
    var rowArray2 = [];
    for (var col = 0; col < numCols2; col++) {
      rowArray2.push(results2.getString(col + 1));
    }
    sheet.appendRow(rowArray2);
  }
  results2.close();
  stmt2.close();
}
Thanks
(This script only appends rows for each result set. I guess I could have a separate sheet per query and then merge them all into one, but I'd rather do it in a single sheet from the beginning.)
You'll need to know somewhat more advanced MySQL to pull this off, but you can nest subqueries into your main query (see the sketch after the links below).
Moreover, if you want your queries to be more efficient, you can wrap them in a MySQL stored procedure and then call that procedure from Apps Script via JDBC.
Here are links to some tutorials to get you started:
MySQL Subquery
MySQL Stored Procedures
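For example, here is an untested sketch of a combined query, reusing the table and column names from the question, that joins the two grouped result sets on their date parts so each source gets its own value column:
SELECT s.yearAdded, s.monthAdded, s.dayAdded,
       s.stripeTotal,  -- Col D: result from query 1
       i.itunesTotal   -- Col E: result from query 2
FROM (SELECT YEAR(dateCreated) AS yearAdded, MONTH(dateCreated) AS monthAdded,
             DAY(dateCreated) AS dayAdded, SUM(amount)/100 AS stripeTotal
      FROM stripeTransactions
      GROUP BY yearAdded, monthAdded, dayAdded) s
LEFT JOIN (SELECT YEAR(dateAdded) AS yearAdded, MONTH(dateAdded) AS monthAdded,
                  DAY(dateAdded) AS dayAdded, SUM(amount) AS itunesTotal
           FROM itunestransactions
           GROUP BY yearAdded, monthAdded, dayAdded) i
  ON i.yearAdded = s.yearAdded AND i.monthAdded = s.monthAdded AND i.dayAdded = s.dayAdded;
Note that the LEFT JOIN drops dates that appear only in itunestransactions; if you need every date from either table, union the two date sets first. If you wrap a query like this in a stored procedure, a single stmt.executeQuery('CALL yourProcName()') from Apps Script returns the whole grid in one result set.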

How to Insert a row and return autoincrement value in sqlite in wp?

I have an app using a SQLite client. In the app I insert data into a table and I need the Id, which is autoincrement. Is there any way to get the id from ExecuteNonQuery, something like a SqlParameter or ...?
I am using the following method to insert data, and I thought the rec variable held the id, but it is always rec = 1 and I don't know what it is good for.
int x = (System.Windows.Application.Current as App).db.Insert
    <CsWidget>(ObjWidget, @"Insert into Tbl_Widget (Name) values(@Name)");
public int Insert<T>(T obj, string statement) where T : new()
{
    Open();
    SQLiteCommand cmd = db.CreateCommand(statement);
    int rec = cmd.ExecuteNonQuery(obj);
    return rec;
}
Try the ExecuteScalar method instead of ExecuteNonQuery. It can work that way depending on your SQLite wrapper. If not, a separate query after the insert is your only way.
As @gleb.kudr said, a separate query after the insert is the only way. (ExecuteNonQuery returns the number of rows affected, which is why rec is always 1.)
cmd.ExecuteNonQuery();
cmd.CommandText = "SELECT LAST_INSERT_ROWID()";
object r = cmd.ExecuteScalar();
int id = 0;
int.TryParse( r.ToString(), out id );

Linq to CRM - Invalid operation exception

I'm using LINQ to CRM from the Advanced Developer Extensions for MS CRM 4.0. It works fine with direct queries, but I've got a problem when the query looks like this:
var connectionString = @"User ID=u; Password=p; Authentication Type=AD; Server=http://crm:5555/UO";
var connection = CrmConnection.Parse(connectionString);
var dataContext = new CrmDataContext(connection);
var data = from u in dataContext.Accounts
           select new
           {
               Id = u.AccountID,
               Name = u.AccountName,
           };
var r = from n in data
        where n.Name.StartsWith("test")
        select new
        {
            Id = n.Id
        };
r.Dump();
it throws an InvalidOperationException: "Cannot determine the attribute name."
It works fine when the condition is directly in the first query:
var data = from n in dataContext.Accounts
           where n.AccountName.StartsWith("test")
           select new
           {
               Id = n.AccountID,
               Name = n.AccountName,
           };
I cannot find any useful information about this kind of error. Is this a bug in the Xrm LINQ provider?
Thanks in advance for any help.
Try eager loading the initial query with a ToList() so that the latter query over your anonymous type is evaluated locally. I get that this is far from ideal if you have a lot of accounts, but it'll prove the point; you essentially have a solution anyway in the last statement.
This happens because the first query isn't executed at all until you call .Dump(), at which point the entire expression, including the second query, is evaluated as one (deferred execution) by the provider, which then looks for an attribute named Name.
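An untested sketch of that change, reusing the dataContext from the question: the ToList() forces the first projection to execute against CRM, so the StartsWith filter below runs in memory via LINQ to Objects instead of being translated by the provider.
var data = (from u in dataContext.Accounts
            select new
            {
                Id = u.AccountID,
                Name = u.AccountName,
            }).ToList();
var r = from n in data
        where n.Name.StartsWith("test")
        select new
        {
            Id = n.Id
        };
r.Dump();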

How to return a RefCursor from Oracle function?

I am trying to execute a user-defined Oracle function that returns a RefCursor using ODP.NET. Here is the function:
CREATE OR REPLACE FUNCTION PKG.FUNC_TEST (ID IN TABLE.ID%type)
RETURN SYS_REFCURSOR
AS
REF_TEST SYS_REFCURSOR;
BEGIN
OPEN REF_TEST FOR
SELECT *
FROM TABLE;
RETURN REF_TEST;
END;
/
I can call this function in Toad (select func_test(7) from dual) and get back a CURSOR. But when I try to get the cursor using C# and ODP.NET to fill a DataSet, I keep getting a NullReferenceException - "Object reference not set to an instance of an object". Here is what I have for that:
OracleConnection oracleCon = new OracleConnection(ConfigurationManager.ConnectionStrings["MyConnectionString"].ConnectionString);
OracleCommand sqlCom = new OracleCommand("select func_test(7) from dual", oracleCon);
sqlCom.Parameters.Add("REF_TEST", OracleDbType.RefCursor, ParameterDirection.ReturnValue);
OracleDataAdapter dataAdapter = new OracleDataAdapter();
dataAdapter.SelectCommand = sqlCom;
DataSet dataSet = new DataSet();
dataAdapter.Fill(dataSet); //FAILS HERE with NullReferenceException
I was able to find lots of info and samples on using stored procedures and ODP.NET, but not so much for returning RefCursors from functions.
EDIT: I do not want to explicitly add input parameters to the OracleCommand object (i.e. sqlCom.Parameters.Add("id", OracleDbType.Int32, ParameterDirection.Input).Value = 7;) as that makes it difficult to implement this as a generic RESTful web service. I'm reserving that as a last resort, in which case I would use stored procedures instead.
Any help is much appreciated!
I think you are missing the sqlCom.ExecuteNonQuery(); call.
Also, instead of running select func_test(7) from dual, let's switch it to run the function directly and pass in the parameter:
OracleConnection oracleCon = new OracleConnection(ConfigurationManager.ConnectionStrings["MyConnectionString"].ConnectionString);
// Set the command: wrap the function call in an anonymous PL/SQL block
string anonymous_block = "begin " +
                         "  :refcursor1 := func_test(7); " +
                         "end;";
// fill in your own function and variables via the above example
OracleCommand sqlCom = oracleCon.CreateCommand();
sqlCom.CommandText = anonymous_block;
// Bind the ref cursor parameter
sqlCom.Parameters.Add("refcursor1", OracleDbType.RefCursor);
sqlCom.Parameters[0].Direction = ParameterDirection.ReturnValue;
try
{
    oracleCon.Open();
    // Execute command; have the parameter populated
    sqlCom.ExecuteNonQuery();
    // Create the OracleDataAdapter
    OracleDataAdapter da = new OracleDataAdapter(sqlCom);
    // Populate a DataSet with refcursor1
    DataSet ds = new DataSet();
    da.Fill(ds, "refcursor1", (OracleRefCursor)(sqlCom.Parameters["refcursor1"].Value));
    // Print out the field count of the REF CURSOR
    Console.WriteLine("Field count: " + ds.Tables["refcursor1"].Columns.Count);
}
catch (Exception e)
{
    Console.WriteLine("Error: {0}", e.Message);
}
finally
{
    // Dispose the OracleCommand object
    sqlCom.Dispose();
    // Close and dispose the OracleConnection object
    oracleCon.Close();
    oracleCon.Dispose();
}
This is based on the ODP sample that can be found at %ora_home%\Client_1\ODP.NET\samples\RefCursor\Sample5.csproj.
If you want to avoid (for better or worse!) building a custom parameter collection for each proc/function call, you can get around that by utilizing anonymous blocks in your code. I have amended (once again untested!) the code above to reflect this technique.
Here is a nice blog (from none other than Mark Williams) showing this technique.
http://oradim.blogspot.com/2007/04/odpnet-tip-anonymous-plsql-and.html

Reading/Writing DataTables to and from an OleDb Database LINQ

My current project is to take information from an OleDb database and .CSV files and place it all into a larger OleDb database.
I have currently read in all the information I need from both the .CSV files and the OleDb database into DataTables... where it is getting hairy is writing all of that information back to another OleDb database.
Right now my current method is to do something like this:
OleDbTransaction myTransaction = null;
try
{
    // Database, FirstName and LastName are defined elsewhere
    OleDbConnection conn = new OleDbConnection("PROVIDER=Microsoft.Jet.OLEDB.4.0;" +
                                               "Data Source=" + Database);
    conn.Open();
    OleDbCommand command = conn.CreateCommand();
    string strSQL;
    myTransaction = conn.BeginTransaction();
    command.Transaction = myTransaction;
    strSQL = "Insert into TABLE " +
             "(FirstName, LastName) values ('" +
             FirstName + "', '" + LastName + "')";
    command.CommandType = CommandType.Text;
    command.CommandText = strSQL;
    command.ExecuteNonQuery();
    myTransaction.Commit();
    conn.Close();
}
catch (Exception)
{
    // If invalid data is entered, roll the transaction back
    if (myTransaction != null)
        myTransaction.Rollback();
}
Of course, this is very basic, and I'm using a SQL command to commit my transactions to a connection. My problem is that I could do this, but I have about 200 fields that need to be inserted across several tables. I'm willing to do the legwork if that's the only way to go, but I feel like there is an easier method. Is there anything in LINQ that could help me out with this?
If the column names in the DataTable match exactly the column names in the destination table, then you might be able to use an OleDbCommandBuilder (warning: I haven't tested this). One area where you may run into problems is if the data types of the source DataTable do not match those of the destination table (e.g. if the source column data types are all strings).
EDIT
I revised my original code in a number of ways. First, I switched to using the Merge method on a DataTable. This allowed me to skip calling LoadDataRow in a loop.
using ( var conn = new OleDbConnection( destinationConnString ) )
{
//query off the destination table. Could also use Select Col1, Col2..
//if you were not going to insert into all columns.
const string selectSql = "Select * From [DestinationTable]";
using ( var adapter = new OleDbDataAdapter( selectSql, conn ) )
{
using ( var builder = new OleDbCommandBuilder( adapter ) )
{
conn.Open();
var destinationTable = new DataTable();
adapter.Fill( destinationTable );
//if the column names do not match exactly, then they
//will be skipped
destinationTable.Merge( sourceDataTable, true, MissingSchemaAction.Ignore );
//ensure that all rows are marked as Added.
destinationTable.AcceptChanges();
foreach ( DataRow row in destinationTable.Rows )
row.SetAdded();
builder.QuotePrefix = "[";
builder.QuoteSuffix = "]";
//forces the builder to rebuild its insert command
builder.GetInsertCommand();
adapter.Update( destinationTable );
}
}
}
ADDITION An alternate solution would be to use a framework like FileHelpers to read the CSV file and post it into your database. It does have an OleDbStorage DataLink for posting into OleDb sources. See the SqlServerStorage InsertRecord example to see how (in the end substitute OleDbStorage for SqlServerStorage).
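For illustration, here is an untested sketch of the CSV-reading half with FileHelpers; the CustomerRecord class and its field layout are assumptions based on the CSV structure used elsewhere in this thread, and the date format string must be adjusted to your files.
using FileHelpers;

[DelimitedRecord(",")]
public class CustomerRecord
{
    public int CustomerID;
    public string LastName;
    [FieldConverter(ConverterKind.Date, "yyyy-MM-dd")] // assumed date format
    public DateTime BirthDate;
}

// Each CSV row becomes a strongly typed CustomerRecord
var engine = new FileHelperEngine<CustomerRecord>();
CustomerRecord[] records = engine.ReadFile(@"c:\some.csv");
From there you can hand the records to whichever storage mechanism you choose (the OleDbStorage route mentioned above, or a plain OleDbCommand loop).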
It sounds like you have many .mdb and .csv files that you need to merge into a single .mdb. This answer runs with that assumption, and with the assumption that you have SQL Server available to you. If you don't, consider downloading SQL Server Express.
Use SQL Server to act as the broker between your multiple datasources and your target datastore. Script each datasource as an insert into a SQL Server holding table. When all data is loaded into the holding table, perform a final push into your target Access datastore.
Consider these steps:
In SQL Server, create a holding table for the imported CSV data.
CREATE TABLE CsvImport
(CustomerID smallint,
LastName varchar(40),
BirthDate smalldatetime)
Create a stored proc whose job will be to read a given CSV filepath, and insert into a SQL Server table.
CREATE PROC ReadFromCSV
    @CsvFilePath varchar(1000)
AS
BEGIN
    -- BULK INSERT will not accept a variable as the file path,
    -- so build and execute the statement dynamically
    DECLARE @sql nvarchar(2000);
    SET @sql = N'BULK INSERT CsvImport
        FROM ''' + @CsvFilePath + '''
        WITH
        (
            FIELDTERMINATOR = '','', -- your own specific terminators should go here
            ROWTERMINATOR = ''\n''
        )';
    EXEC (@sql);
END
GO
Create a script to call this stored proc for each .csv file you have on disk. Perhaps some Excel trickery, or a dir command piped to a file, can help you create these statements.
exec ReadFromCSV 'c:\1.csv'
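For example, from a Windows command prompt, something along these lines (the folder and output file names are illustrative) writes one exec line per CSV:
for %f in (C:\csv\*.csv) do @echo exec ReadFromCSV '%f' >> RunImports.sql
(Inside a batch file, double the percent signs: %%f.)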
For each .mdb datasource, create a temp linked server.
DECLARE @MdbFilePath varchar(1000);
SELECT @MdbFilePath = 'C:\MyMdb1.mdb';
EXEC master.dbo.sp_addlinkedserver @server = N'MY_ACCESS_DB_', @srvproduct=N'Access', @provider=N'Microsoft.Jet.OLEDB.4.0', @datasrc=@MdbFilePath
-- grab the relevant data; it lands in the holding table
INSERT CsvImport (CustomerID, LastName, BirthDate)
SELECT [CustomerID]
      ,[LastName]
      ,[BirthDate]
FROM [MY_ACCESS_DB_]...[Customers]
-- remove the linked server
EXEC master.dbo.sp_dropserver @server=N'MY_ACCESS_DB_', @droplogins='droplogins'
When you're done importing data into that holding table, create a Linked Server in your SQL Server instance. This is the target datastore. SELECT the data from SQL Server into Access.
EXEC master.dbo.sp_addlinkedserver @server = N'MY_ACCESS_TARGET', @srvproduct=N'Access', @provider=N'Microsoft.Jet.OLEDB.4.0', @datasrc='C:\Target.mdb'
INSERT INTO [MY_ACCESS_TARGET]...[Customer]
       ([CustomerID]
       ,[LastName]
       ,[BirthDate])
SELECT CustomerID,
       LastName,
       BirthDate
FROM CsvImport
