Read multiple rows from Oracle table - oracle

I am trying to read a table in Oracle from PowerShell and want to save the rows in an ArrayList. The connection is working, but reading any rows after the first doesn't work.
Here's what I'm trying to do:
$rows = New-Object System.Collections.ArrayList

class Table {
    [String] $name
    [String] $type
}

try {
    $oraConn.Open()
    $sql = "select name, type from source_table where type = 'running'"
    $oraCmd = New-Object System.Data.OracleClient.OracleCommand($sql, $oraConn)
    $reader = $oraCmd.ExecuteReader()

    # add tables to arraylist
    while ($reader.Read()) {
        $table = New-Object Table
        $table.name = $reader["name"]
        $table.type = $reader["type"]
        [void]$rows.Add($table)
    }
    Write-Host "rows collected"
}
finally {
    if ($reader) { $reader.Close() }
}
My problem is that I only read the first row of the table. How can I tell Oracle to read them all? Would I have to count them first and then query for each row?
I check the contents of $rows later in the code; that part works, so I left it out since it's not really relevant to the question.
I know that my query returns something because I tried it in Oracle.
Do I need a foreach loop? It would make sense, but how would I tell Oracle to do that? Would I have to query for each row of the table and keep a counter so that I only fetch one row at a time?
I hope someone can help me and point me in the right direction, since I have already been trying for a long time to get my script working. I have most of the logic for my script, but if I can't load the rows into my list, that logic doesn't help me at all.

Use the following code snippet as a base for your own solution:
$cs = 'data source=oradb;user id=/;dba privilege=sysdba'
$oc = new-object oracle.dataaccess.client.oracleconnection $cs
$oc.open()
$cm = new-object oracle.dataaccess.client.oraclecommand
$cm.connection = $oc
$cm.commandtext = "select name, type from source_table where type = 'running'"
# a data adapter fills a DataTable with every row the query returns
$da = new-object oracle.dataaccess.client.oracledataadapter
$da.selectcommand = $cm
$tbl = new-object data.datatable
$da.fill($tbl)
$tbl | %{ "$($_.name) = $($_.type)" }
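If you want to keep the question's ArrayList of Table objects, a minimal sketch built on the same filled DataTable (assuming the same column names and the Table class defined above) could look like this:
# hedged sketch: convert the filled DataTable rows into Table objects
foreach ($r in $tbl.Rows) {
    $table = New-Object Table
    $table.name = $r["name"]
    $table.type = $r["type"]
    [void]$rows.Add($table)
}
Write-Host "$($rows.Count) rows collected"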

Related

getting answer from sql query about specific command

I am new here; I tried to find my problem here but I couldn't.
I have two databases and I am querying two different pieces of information and comparing them. In many other languages I would use something like "Contains" for this.
I need to find which names rapor_one has and rapor_two does not have. I probably have a problem with the Contains part. How can I use it?
for($i = 1; $i -le $rapor_one.Rows.Count; $i++){
    if($rapor_two.Contains($row)){
        Write-Host $row
    }
}
If it matters, I am using a connection string and a data adapter, which I am also adding below:
$connectionString =
$connection =
$connection.ConnectionString =
$connection.Open()
$command =
$command.CommandText =
$DataAdapter = new-object System.Data.SqlClient.SqlDataAdapter $
$Dataset = new-object System.Data.Dataset
$DataAdapter.Fill($Dataset)
I am sure my query works; I get a response from the database, I just have a problem with comparing the results.
ADDED: Also, if there were two or more attributes in my variable, how could I compare them?
ADDED:
This algorithm is better. Can't we use Contains the way other modern languages do (I know we have it in C#, Java, etc.)?
foreach($row in $rapor_two)
{
    if ($rapor_one -contains $row){
        Write-Host "True"
    }
}
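A minimal sketch of the comparison, assuming both $rapor_one and $rapor_two were filled into DataTables and each row exposes a name column (the column name is an assumption):
# hypothetical sketch: collect the names from rapor_two once,
# then use -notcontains to find names present only in rapor_one
$namesTwo = $rapor_two.Rows | ForEach-Object { $_.name }

foreach ($row in $rapor_one.Rows) {
    if ($namesTwo -notcontains $row.name) {
        Write-Host $row.name   # in rapor_one but not in rapor_two
    }
}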

Writing 100 rows from an Oracle database using PowerShell

I have a problem using PowerShell.
First I would like to explain what the situation is:
I have an Oracle database running with a table NAMES. The table has about 10000 rows of data. I would like to count up to 100 rows and then "echo" them at my PowerShell prompt. Here is where the problem comes in, because I can count the rows using the following script:
$query = "SELECT * FROM NAMES"
$command = $con.CreateCommand()
$command.CommandText = $query
$result = $command.ExecuteReader()
$test= $result|Measure-Object
$i=$test.Count
The number that returns is correct, but then it goes wrong: when I want to use a foreach loop I can't get the names from my table.
Here is what I've got, maybe it helps:
foreach ($Row in $query.Tables[0].Rows)
{
    write-host "value is : $($Row[0])"
}
hope someone finds an answer
You are missing strict mode: Set-StrictMode -Version latest. By setting it, you'd get a much more informative error message:
$query = "SELECT * FROM NAMES"
foreach ($Row in $query.Tables[0].Rows) { ... }
Property 'Tables' cannot be found on this object. Make sure that it exists.
+ foreach ($Row in $query.Tables[0].Rows) { ... }
The $query variable doesn't contain a Tables member, so attempting to read it is futile. It would seem likely that the $result variable contains the Tables member; that depends on the data provider you are using and on what's missing from the code sample.
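To actually print the first 100 names, a minimal sketch that loops over the data reader returned by ExecuteReader (assuming $con is an open Oracle connection and the first column holds the name) could be:
# hedged sketch: read rows one by one and stop after 100
$command = $con.CreateCommand()
$command.CommandText = "SELECT * FROM NAMES"
$result = $command.ExecuteReader()

$i = 0
while ($i -lt 100 -and $result.Read()) {
    Write-Host "value is : $($result[0])"
    $i++
}
$result.Close()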

Bind params for the bulk INSERT query? (avoid SQL injections)

I want to do some mass DB population from an Excel file.
The most time-efficient way is to use an INSERT INTO statement with many rows of values stored in one transaction:
INSERT INTO `assortment`(`id`, `sku`, `agroup`, `subgroup`, `title`, `measure_unit`, `price`, `discount`, `imageUrl`, `fileUrl`)
VALUES ([value-1],[value-2],[value-3],[value-4],[value-5],[value-6],[value-7],[value-8],[value-9],[value-10]),
([value-1],[value-2],[value-3],[value-4],[value-5],[value-6],[value-7],[value-8],[value-9],[value-10]),
([value-1],[value-2],[value-3],[value-4],[value-5],[value-6],[value-7],[value-8],[value-9],[value-10]),
([value-1],[value-2],[value-3],[value-4],[value-5],[value-6],[value-7],[value-8],[value-9],[value-10]),
...
Yet, to avoid SQL injection I want to bind params, and Yii provides functionality for that. However, it seems impossible for me to do that for hundreds/thousands of values. Isn't it?
To keep SQL hygiene I did the simple insert through Active Record attributes (Yii AR functionality sanitizes input data by default):
$auxarr = array();
for ($i = 0; $sheetData[$i]; $i++)
{
    $model = new Assortment();
    $j = 0;
    foreach ($labels as $label)
    {
        $auxarr[$label] = $sheetData[$i][$j++];
    }
    $model->attributes = $auxarr;
    if (!$model->save())
        throw new CHttpException(400, 'Error db storing');
}
This approach is obviously not time-efficient.
Is there any way that offers both security and time efficiency for bulk SQL inserts?
Yii uses conventional PDO in CDbCommand.
So, you can build a string consisting of a series of placeholder groups like this:
(?,?,?),(?,?,?),(?,?,?),(?,?,?),(?,?,?),(?,?,?)
then create an array with the values for all these placeholders,
and finally execute the whole thing.
My approach is:
$sql = "INSERT INTO `assortment`(`id`, `sku`, `agroup`, `subgroup`, `title`, `measure_unit`, `price`, `discount`, `imageUrl`, `fileUrl`) VALUES "
$params = array();
$cntRows = count($sheetData);
for ($i = 0; $i < $cntRows; $i++)
{
$j = 0;
$rowParams = array();
foreach ($labels as $label)
{
$rowParams[":{$label}_{$i}_{j}"] = $sheetData[$i][$j++];
}
$params = array_merge($params, $rowParams);
$sql . = "(" . implode(",", array_keys($rowParams) ) .")"
}
/*
Sql now is : INSERT INTO assortment (....) VALUES ( :id_1_1 , sku_1_1 , ... ) (:id_2_1 , sku_2_1 , ...)
AND $params is { :id_1_1 => [value] ........ }
*/
$cmd = Yii::app()->db->createCommand($sql);
$cmd->execute($params);
We execute the insert SQL in one transaction, rather than many transactions or ActiveRecord (which wastes memory and runs many extra functions), and we avoid SQL injection. If your data is large you can split it into multiple transactions.

Doctrine ORM Update Row with ID

Is there a way to update a row directly with an ID? I just want the ability to update a table row field without querying for the object first. I tried this...
$id = 1;
$s = new Sandbox();
$s->setId($id);
$s->setFname('moon');
$e = $em->merge($s);
$em->flush($e);
and it tried to do an update to the database; however, it failed because it tried to update all the undefined fields as well, whereas I just want to update the fname field.
Thanks
$id = 1;
// getReference() gives you a reference/proxy for the given id
// instead of loading the full entity first
$s = $em->getReference('Sandbox', $id);
$s->setFname('moon');
$em->persist($s);
$em->flush($s);

Laravel query optimization

I have a query in laravel:
...
$query = $model::group_by($model->table().'.'.$model::$key);
$selects = array(DB::raw($model->table().'.'.$model::$key));
...
$rows = $query->distinct()->get($selects);
this works fine and gives me the key fields that I need, but the problem is that I need to get all the columns and not just the key.
using this:
$selects = array(DB::raw($model->table().'.'.$model::$key), DB::raw($model->table().'.*'));
is not an option, because it doesn't work with PostgreSQL, so I used $rows to get the rest of the columns:
for ($i = 0; $i<count($rows); $i++)
{
$rows[$i] = $model::find($rows[$i]->key);
}
but as you can see this is very inefficient, so what can I do to make it faster and more efficient?
you can find the whole code here: https://gist.github.com/neo13/5390091
PS: I would use a join but I don't know how.
Just don't pass anything to get() and it will return all the columns. Also, the key is presumably unique in the table, so I don't exactly understand why you need the group by.
$models = $model::group_by( $model->table() . '.'. $model::$key )->get();
