Using Rails Update to Append to a Text Column in Postgresql - activerecord

Thanks in advance for any help on this one.
I have a model in rails that includes a postgresql text column.
I want to append data to the existing column (i.e. mycolumn = mycolumn || newdata). The SQL I want to generate would look like:
update MyObjs set mycolumn = mycolumn || newdata where id = 12;
I would rather not select the data, update the attribute and then write the new data back to the database. The text column could grow relatively large and I'd rather not read that data if I don't need to.
I DO NOT want to do this:
@myinstvar = MyObj.select(:mycolumn).find(12)
newdata = @myinstvar.mycolumn.to_s + newdata
@myinstvar.update_attribute(:mycolumn, newdata)
Do I need to do a raw sql transaction to accomplish this?

I think you could solve this problem by writing the query directly with the Arel gem, which already ships with Rails.
Given that you have these values:
column_id = 12
newdata = "a custom string"
you can update the table this way:
# Initialize the Table and UpdateManager objects
table = MyObj.arel_table
update_manager = Arel::UpdateManager.new Arel::Table.engine
update_manager.table(table)
# Compose the concat() function; build_quoted ensures the new string is quoted safely
concat = Arel::Nodes::NamedFunction.new 'concat', [table[:mycolumn], Arel::Nodes.build_quoted(newdata)]
concat_sql = Arel::Nodes::SqlLiteral.new concat.to_sql
# Set up the update manager
update_manager.set(
[[table[:mycolumn], concat_sql]]
).where(
table[:id].eq(column_id)
)
# Execute the update
ActiveRecord::Base.connection.execute update_manager.to_sql
This will generate a SQL string like this one:
UPDATE "MyObjs" SET "mycolumn" = concat("MyObjs"."mycolumn", 'a custom string') WHERE "MyObjs"."id" = 12"

Related

Laravel Save Multiple Data to 1 column

So I have 2 variables for storing the user's selected times ('time_to' and 'time_from'), with sample data like '7:30' and '8:00'.
How can I save these two into 1 column ('c_time') so it would look like this: '7:30-8:00'?
If I understand you correctly, you can create a column of string (varchar) type and then build the value for your column like this:
$time_to = '7:30';
$time_from= '8:00';
$colValueToBeStored = "(".$time_to."-".$time_from.")";
Then just put $colValueToBeStored into your column.
And to reverse it:
$colValueToBeStored = "(7:30-8:00)";
$res = explode("-",str_replace([")","("],"",$colValueToBeStored));
$time_to = $res[0];
$time_from = $res[1];
Alternatively, define your c_time column as JSON type; that way you can store multiple values, and it will be easier to retrieve as well. Like:
...
$cTime['time_to'] = "7:30";
$cTime['time_from'] = "8:00";
$cTimeJson = json_encode($cTime);
// save to db
...
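Reading it back is then just a decode. A quick sketch, assuming $row->c_time holds the stored JSON string:
$cTime = json_decode($row->c_time, true); // back to an associative array
echo $cTime['time_to'];   // "7:30"
echo $cTime['time_from']; // "8:00"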

How to create a temporary column + when + order by with Criteria Builder

Here is the SQL statement I am trying to translate to JPA:
select
id,
act_invalidation_id,
last_modification_date,
title,
case when act_invalidation_id is null then 1 else 0 end as test
from act order by test, last_modification_date desc
The actual translation:
Root<Act> actRoot = query.from(Act.class);
builder.selectCase()
.when(builder.isNull(actRoot.get("actInvalidation")), 1)
.otherwise(0).as(Integer.class);
Expression<?> actInvalidationPath = actRoot.get("actInvalidation");
Order byInvalidationOrder = builder.asc(actInvalidationPath);
Path<Date> publicationDate = actRoot.get("metadata").get("publicationDate");
Order byLastModificationDate = builder.desc(publicationDate);
query.select(actRoot).orderBy(byInvalidationOrder, byLastModificationDate);
entityManager.createQuery(query).getResultList();
I am trying to create a temporary column (named test) of Integer type, order by this column, and then order by last_modification_date. The content of this new column is determined by the value of the actInvalidation field.
In short: how do I create a temp column with integer values and then order by that column in JPA?
Thank you
I didn't test this but it should work like this:
Root<Act> actRoot = query.from(Act.class);
Expression<?> test = builder.selectCase()
.when(builder.isNull(actRoot.get("actInvalidation")), 1)
.otherwise(0).as(Integer.class);
Expression<?> actInvalidationPath = actRoot.get("actInvalidation");
Order byInvalidationOrder = builder.asc(actInvalidationPath);
Path<Date> publicationDate = actRoot.get("metadata").get("publicationDate");
Order byLastModificationDate = builder.desc(publicationDate);
Order byTest = builder.asc(test);
query.select(actRoot).orderBy(byTest, byInvalidationOrder, byLastModificationDate);
entityManager.createQuery(query).getResultList();
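Note that the original SQL orders only by test and last_modification_date, so if you want to match it exactly you can drop byInvalidationOrder from the final call:
query.select(actRoot).orderBy(byTest, byLastModificationDate);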

NHibernate querying with WHERE conditions in pairs

I have a collection of NHibernate objects, ConcurrentBag<Event>, with properties project_id and name. I want to retrieve a set of unrelated (schema-wise) objects from another table which also match these properties.
The SQL I would expect would look something like:
SELECT * FROM table WHERE (project_id = 1 AND name = 'foo')
OR (project_id = 2 AND name = 'foo')
OR (project_id = 1 AND name = 'bar')
... etc
The pairs of values in the WHERE clause are based on the values from each Event in the ConcurrentBag<Event>.
I am not sure how to query this with NHibernate (ideally with LINQ). Is this even possible?
This would be difficult with LINQ I think, but if you use QueryOver it's easy to build a WHERE clause dynamically using Restrictions.Disjunction:
var disjunction = Restrictions.Disjunction();
foreach (Event evt in events)
{
disjunction.Add(Restrictions.Where<Table>(
t => t.project_id == evt.project_id && t.name == evt.name));
}
var rows = session.QueryOver<Table>()
.Where(disjunction)
.List<Table>();

Update entity columns iterating through col list using LINQ

I can get column list from the table using LINQ like this:
OrderDataContext ctx = new OrderDataContext();
var cols = ctx.Mapping.MappingSource
.GetModel( typeof( OrderDataContext ) )
.GetMetaType( typeof( ProductInformation ) )
.DataMembers;
This gives me the list of columns, so I can do this:
foreach ( var col in cols )
{
// Get the value of this column from another table
GetPositionForThisField( col.Name );
}
So this all works: I can iterate through the column list and pull the values for those columns from another table (since the column names are the keys in that other table), and I don't have to write a switch or a lot of if...then... blocks.
Now the question:
After I get these values, how do I populate the entity in order to save it back? I would normally go like this:
ProductInformation info = new ProductInformation();
info.SomeField1 = val1;
info.SomeField2 = val2;
ctx.ProductInformation.InsertOnSubmit( info );
ctx.SubmitChanges();
But how do I use the same column collection from above to populate the columns while iterating over it, when there is no such thing as:
info["field1"].Value = val1;
Thanks.
Just fetch the object that you want to modify, set the property, and call SubmitChanges. There is no need to create a new object and insert it. The DataContext tracks your changed properties and generates the update statement accordingly. In your case you may want to set the properties via reflection rather than manually, since you are reading them from another table.
You'll need to use reflection. Assuming you can get the PropertyInfo from the metadata:
PropertyInfo property = GetPropertyForThisField(col.Name);
property.SetValue(info, val1, null);
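Tying the two answers together, a rough sketch of the loop (assuming col is a System.Data.Linq.Mapping.MetaDataMember, whose Member property exposes the mapped member on the entity, and that GetPositionForThisField returns the value to store):
foreach ( var col in cols )
{
    // Look up the value for this column from the other table
    object val = GetPositionForThisField( col.Name );
    // MetaDataMember.Member is the property on the entity class
    var property = col.Member as PropertyInfo;
    if ( property != null && property.CanWrite )
        property.SetValue( info, val, null );
}
ctx.SubmitChanges();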

Reading/Writing DataTables to and from an OleDb Database LINQ

My current project is to take information from an OleDb database and .CSV files and place it all into a larger OleDb database.
I have currently read all the information I need from both the .CSV files and the OleDb database into DataTables. Where it is getting hairy is writing all of that information back to another OleDb database.
Right now my current method is to do something like this:
OleDbTransaction myTransaction = null;
try
{
OleDbConnection conn = new OleDbConnection("PROVIDER=Microsoft.Jet.OLEDB.4.0;" +
"Data Source=" + Database);
conn.Open();
// Start the transaction so it can be rolled back on failure
myTransaction = conn.BeginTransaction();
OleDbCommand command = conn.CreateCommand();
string strSQL;
command.Transaction = myTransaction;
strSQL = "Insert into TABLE " +
"(FirstName, LastName) values ('" +
FirstName + "', '" + LastName + "')";
command.CommandType = CommandType.Text;
command.CommandText = strSQL;
command.ExecuteNonQuery();
myTransaction.Commit();
conn.Close();
}
catch (Exception)
{
// If invalid data is entered, roll back the database
if (myTransaction != null)
myTransaction.Rollback();
}
Of course, this is very basic and I'm using an SQL command to commit my transactions to a connection. My problem is I could do this, but I have about 200 fields that need to be inserted over several tables. I'm willing to do the leg work if that's the only way to go. But I feel like there is an easier method. Is there anything in LINQ that could help me out with this?
If the column names in the DataTable match exactly to the column names in the destination table, then you might be able to use an OleDbCommandBuilder (Warning: I haven't tested this yet). One area you may run into problems is if the data types of the source data table do not match those of the destination table (e.g. if the source column data types are all strings).
EDIT
I revised my original code in a number of ways. First, I switched to using the Merge method on a DataTable. This allowed me to skip using the LoadDataRow in a loop.
using ( var conn = new OleDbConnection( destinationConnString ) )
{
//query off the destination table. Could also use Select Col1, Col2..
//if you were not going to insert into all columns.
const string selectSql = "Select * From [DestinationTable]";
using ( var adapter = new OleDbDataAdapter( selectSql, conn ) )
{
using ( var builder = new OleDbCommandBuilder( adapter ) )
{
conn.Open();
var destinationTable = new DataTable();
adapter.Fill( destinationTable );
//if the column names do not match exactly, then they
//will be skipped
destinationTable.Merge( sourceDataTable, true, MissingSchemaAction.Ignore );
//ensure that all rows are marked as Added.
destinationTable.AcceptChanges();
foreach ( DataRow row in destinationTable.Rows )
row.SetAdded();
builder.QuotePrefix = "[";
builder.QuoteSuffix = "]";
//forces the builder to rebuild its insert command
builder.GetInsertCommand();
adapter.Update( destinationTable );
}
}
}
ADDITION An alternate solution would be to use a framework like FileHelpers to read the CSV file and post it into your database. It does have an OleDbStorage DataLink for posting into OleDb sources. See the SqlServerStorage InsertRecord example to see how (in the end substitute OleDbStorage for SqlServerStorage).
It sounds like you have many .mdb and .csv files that you need to merge into a single .mdb. This answer runs with that assumption, and assumes you have SQL Server available to you. If you don't, then consider downloading SQL Express.
Use SQL Server to act as the broker between your multiple datasources and your target datastore. Script each datasource as an insert into a SQL Server holding table. When all data is loaded into the holding table, perform a final push into your target Access datastore.
Consider these steps:
In SQL Server, create a holding table for the imported CSV data.
CREATE TABLE CsvImport
(CustomerID smallint,
LastName varchar(40),
BirthDate smalldatetime)
Create a stored proc whose job will be to read a given CSV filepath, and insert into a SQL Server table.
CREATE PROC ReadFromCSV
@CsvFilePath varchar(1000)
AS
BULK
INSERT CsvImport
FROM @CsvFilePath --'c:\some.csv'
WITH
(
FIELDTERMINATOR = ',', --your own specific terminators should go here
ROWTERMINATOR = '\n'
)
GO
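One caveat: BULK INSERT expects a literal file path, so the @CsvFilePath parameter can't be used directly in the FROM clause as written. A rough sketch of a dynamic-SQL version of the same proc (replacing the definition above, same terminator assumptions):
CREATE PROC ReadFromCSV
@CsvFilePath varchar(1000)
AS
BEGIN
-- Build the BULK INSERT statement as a string so the file path can vary per call
DECLARE @sql nvarchar(max) =
N'BULK INSERT CsvImport FROM ''' + @CsvFilePath + N'''
WITH (FIELDTERMINATOR = '','', ROWTERMINATOR = ''\n'');';
EXEC sp_executesql @sql;
END
GO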
Create a script to call this stored proc for each .csv file you have on disk. Perhaps some Excel trickery or filesystem dir piped commands can help you create these statements.
exec ReadFromCSV 'c:\1.csv'
For each .mdb datasource, create a temp linked server.
DECLARE @MdbFilePath varchar(1000);
SELECT @MdbFilePath = 'C:\MyMdb1.mdb';
EXEC master.dbo.sp_addlinkedserver @server = N'MY_ACCESS_DB_', @srvproduct=N'Access', @provider=N'Microsoft.Jet.OLEDB.4.0', @datasrc=@MdbFilePath
-- grab the relevant data
INSERT CsvImport (CustomerID, LastName, BirthDate)
SELECT [CustomerID]
,[LastName]
,[BirthDate]
FROM [MY_ACCESS_DB_]...[Customers]
--your data's now in the holding table
--remove the linked server
EXEC master.dbo.sp_dropserver @server=N'MY_ACCESS_DB_', @droplogins='droplogins'
When you're done importing data into that holding table, create a Linked Server in your SQL Server instance. This is the target datastore. SELECT the data from SQL Server into Access.
EXEC master.dbo.sp_addlinkedserver @server = N'MY_ACCESS_TARGET', @srvproduct=N'Access', @provider=N'Microsoft.Jet.OLEDB.4.0', @datasrc='C:\Target.mdb'
INSERT INTO [MY_ACCESS_TARGET]...[Customer]
([CustomerID]
,[LastName]
,[BirthDate])
SELECT CustomerID,
LastName,
BirthDate
FROM CsvImport
