Getting an answer from a SQL query about a specific command - algorithm

I am new here; I tried to find my problem here but I couldn't.
I have two databases; I am retrieving two different result sets and comparing them. I was thinking of what many other languages call "Contains".
I need to find which names rapor_one has and rapor_two does not have. I probably have a problem with the Contains part. How can I use it?
for ($i = 0; $i -lt $rapor_one.Rows.Count; $i++) {
    $row = $rapor_one.Rows[$i]   # $row was never assigned in the original loop
    if ($rapor_two.Contains($row)) {
        Write-Host $row
    }
}
If it matters, I am using a connection string and a data adapter, which I am also adding below:
$connectionString =
$connection =
$connection.ConnectionString =
$connection.Open()
$command =
$command.CommandText =
$DataAdapter = new-object System.Data.SqlClient.SqlDataAdapter $
$Dataset = new-object System.Data.Dataset
$DataAdapter.Fill($Dataset)
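For context, a hypothetical minimal version of that setup; the server, database, and query names below are placeholders I've invented, since the original values were elided:
$connectionString = "Server=myServer;Database=myDb;Integrated Security=True"   # hypothetical values
$connection = New-Object System.Data.SqlClient.SqlConnection
$connection.ConnectionString = $connectionString
$connection.Open()
$command = $connection.CreateCommand()
$command.CommandText = "SELECT name FROM rapor_table"   # hypothetical query
$DataAdapter = New-Object System.Data.SqlClient.SqlDataAdapter $command
$Dataset = New-Object System.Data.DataSet
[void]$DataAdapter.Fill($Dataset)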
I am sure my query works, because I get the database response; I just have a problem with comparing the results.
****ADDED: I also want to ask: if there were two or more attributes in my variable, how could I compare on them?
****ADDED: This algorithm is better. Can't we use Contains like in other modern languages? (I know we have it in C#, Java, etc.)
foreach ($row in $rapor_two)
{
    if ($rapor_one -contains $row) {
        Write-Host "True"
    }
}
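One way to answer both questions: DataRow objects compare by reference, so Contains against another table's rows will not match even identical data; compare the column values instead. A minimal sketch, assuming each report is a DataTable and the columns are named "name" and "type" (both column names are assumptions):
$namesOne = $rapor_one.Rows | ForEach-Object { $_["name"] }   # "name" is an assumed column
$namesTwo = $rapor_two.Rows | ForEach-Object { $_["name"] }
# names rapor_one has that rapor_two does not
$namesOne | Where-Object { $namesTwo -notcontains $_ }
# for two or more attributes, compare on a composite key per row ("type" is also assumed)
$keysTwo = $rapor_two.Rows | ForEach-Object { "$($_['name'])|$($_['type'])" }
$rapor_one.Rows | Where-Object { $keysTwo -notcontains "$($_['name'])|$($_['type'])" } |
    ForEach-Object { Write-Host $_["name"] }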

Related

Powershell question - Looking for fastest method to loop through 500k objects looking for a match in another 500k object array

I have two large .csv files that I've imported using the import-csv cmdlet. I've done a lot of searching and trying and am finally posting to ask for some help to make this easier.
I need to move through the first array, which will have anywhere from 80k to 500k rows. Each object in these arrays has multiple properties, and I then need to find the corresponding entry in a second array of the same size, matching on one of those properties.
I'm importing them as [System.Collections.ArrayList] and I've tried to place them in hashtables too. I have even tried to muck with LINQ, which was mentioned in several other posts.
Any chance anyone can offer advice or insight how to make this run faster? It feels like I'm looking in one haystack for matching hay in a different stack.
$ImportTime1 = Measure-Command {
    [System.Collections.ArrayList]$fileList1 = Import-Csv file1.csv
    [System.Collections.ArrayList]$fileSorted1 = ($fileList1 | Sort-Object -Property 'Property1' -Unique -Descending)
    Remove-Variable fileList1
}
$ImportTime2 = Measure-Command {
    [System.Collections.ArrayList]$fileList2 = Import-Csv file2.csv
    [System.Collections.ArrayList]$fileSorted2 = ($fileList2 | Sort-Object -Property 'Property1' -Unique -Descending)
    Remove-Variable fileList2
}
$fileSorted1.foreach({
    $variable1 = $_   # fixed: was misspelled $varible1, so the lookup below always compared against $null
    $target = $fileSorted2.where({ $_ -eq $variable1 })
    ### do some other stuff
})
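Since a linear .where() scan over 500k rows for each of 500k rows is O(n²), the usual fix is to index one side by the match property in a hashtable, so each lookup is O(1). A minimal sketch, assuming 'Property1' is the shared key (as in the Sort-Object calls above):
$index = @{}
foreach ($row in $fileSorted2) {
    $index[$row.Property1] = $row    # index file2 by its key once; keys are unique after Sort-Object -Unique
}
foreach ($row in $fileSorted1) {
    $target = $index[$row.Property1] # O(1) lookup instead of scanning all of file2
    if ($null -ne $target) {
        ### do some other stuff with $row and $target
    }
}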
This may be of use: https://powershell.org/forums/topic/comparing-two-multi-dimensional-arrays/
The updated solution from comment #27359, with the change suggested by Max Kozlov in comment #27380:
Function RJ-CombinedCompare() {
    [CmdletBinding()]
    PARAM(
        [Parameter(Mandatory=$True)]$List1,
        [Parameter(Mandatory=$True)]$L1Match,
        [Parameter(Mandatory=$True)]$List2,
        [Parameter(Mandatory=$True)]$L2Match
    )
    $hash = @{}
    foreach ($data in $List1) { $hash[$data.$L1Match] += ,[pscustomobject]@{Owner=1;Value=$($data)} }
    foreach ($data in $List2) { $hash[$data.$L2Match] += ,[pscustomobject]@{Owner=2;Value=$($data)} }
    foreach ($kv in $hash.GetEnumerator()) {
        $m1, $m2 = $kv.Value.where({$_.Owner -eq 1}, 'Split')
        [PSCustomObject]@{
            MatchValue = $kv.Key
            L1Matches = $m1.Count
            L2Matches = $m2.Count
            L1MatchObject = $L1Match
            L2MatchObject = $L2Match
            List1 = $m1.Value
            List2 = $m2.Value
        }
    }
}
$fileList1 = Import-csv file1.csv
$fileList2 = Import-csv file2.csv
$newList = RJ-CombinedCompare -List1 $fileList1 -L1Match $(yourcolumnhere) -List2 $fileList2 -L2Match $(yourothercolumnhere)
foreach ($item in $newList) {
# your logic here
}
It should be fast to pass the lists into this hashtable and it's fast to iterate through as well.
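For the original ask (rows in one file that have no match in the other), a hedged usage sketch; 'Property1' is assumed to be the shared column:
$newList = RJ-CombinedCompare -List1 $fileList1 -L1Match 'Property1' -List2 $fileList2 -L2Match 'Property1'
# keys present in file1 but absent from file2
$newList | Where-Object { $_.L1Matches -gt 0 -and $_.L2Matches -eq 0 } |
    ForEach-Object { $_.MatchValue }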

Read multiple rows from Oracle table

I am trying to read a table in Oracle from PowerShell and want to save the rows in an ArrayList. The connection works, but reading any rows after the first doesn't.
Here's what I'm trying to do.
$rows = New-Object System.Collections.ArrayList
class Table {
    [String] $name
    [String] $type
}
try {
    $oraConn.Open()
    $sql = "select name, type from source_table where type = 'running'"
    $oraCmd = New-Object System.Data.OracleClient.OracleCommand($sql, $oraConn)
    $reader = $oraCmd.ExecuteReader()
    # add each row to the ArrayList
    while ($reader.Read()) {
        $table = New-Object Table
        $table.name = $reader["name"]
        $table.type = $reader["type"]
        [void]$rows.Add($table)
    }
    Write-Host "rows collected"
}
catch {
    Write-Error $_   # added: a try block needs a catch or finally to be valid
}
My problem is, I only read the first row of the table. How can I tell Oracle to read them all? Would I have to count them first and then query for each row?
I check the contents of $rows later in the code; it's not really relevant to the question, since I know that part works, so I left it out.
I know that my query returns something, because I tried it in Oracle.
Do I need a foreach loop? It would make sense, but how can I tell Oracle to do that? Would I have to query for each row of the table and set a counter to query only one row at a time?
I hope someone can help me and point me in the right direction, since I've already been trying for a long time to get my script working. I have most of the logic for my script, but if I can't load the rows into my list, the logic doesn't help me at all.
Use the following code snippet as a base for your own solution:
$cs = 'data source=oradb;user id=/;dba privilege=sysdba'
$oc = new-object oracle.dataaccess.client.oracleconnection $cs
$oc.open()
$cm = new-object oracle.dataaccess.client.oraclecommand
$cm.connection = $oc
$cm.commandtext = "select name, type from source_table where type = 'running'"
$da = new-object oracle.dataaccess.client.oracledataadapter
$da.selectcommand = $cm
$tbl = new-object data.datatable
[void]$da.fill($tbl)   # [void] keeps Fill's row count from being echoed
$tbl | %{ "$($_.name) = $($_.type)" }   # fixed: the original interpolation assigned $_.name = $_.type instead of printing both
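If you want to keep the Table class and ArrayList from the question, a sketch (assuming the class and $rows from the question's snippet) that fills them from the DataTable:
$rows = New-Object System.Collections.ArrayList
foreach ($r in $tbl.Rows) {
    $table = New-Object Table   # the Table class defined in the question
    $table.name = $r["name"]
    $table.type = $r["type"]
    [void]$rows.Add($table)
}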

Writing 100 rows from an Oracle database using PowerShell

I have a problem using PowerShell.
First I would like to explain what the situation is:
I have an Oracle database running with a table NAMES. The table has about 10000 rows of data. I would like to count up to 100 rows and then "echo" them at my PowerShell prompt. Here is where the problem comes in, because I can count the rows using the following script:
$query = "SELECT * FROM NAMES"
$command = $con.CreateCommand()
$command.CommandText = $query
$result = $command.ExecuteReader()
$test= $result|Measure-Object
$i=$test.Count
The number that comes back is correct, but then it goes wrong: when I want to use a foreach loop, I can't get the names from my table.
Here is what I got; maybe it helps:
foreach ($Row in $query.Tables[0].Rows)
{
    write-host "value is : $($Row[0])"
}
I hope someone finds an answer.
You are missing strict mode: Set-StrictMode -Version Latest. By setting it, you'd get a much more informative error message:
$query = "SELECT * FROM NAMES"
foreach ($Row in $query.Tables[0].Rows) { ... }
Property 'Tables' cannot be found on this object. Make sure that it exists.
+ foreach ($Row in $query.Tables[0].Rows) { ... }
The $query variable doesn't contain a member Tables, so attempting to read it is futile. It would seem likely that the $result variable holds the data you want to iterate; exactly how depends on the data provider you are using and on what's missing from the code sample.
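Building on that: $result is a data reader, not a DataSet, so iterate the reader itself. Note that piping it through Measure-Object, as above, consumes it, so you cannot count first and then loop over the same reader. A minimal sketch under those assumptions:
$query = "SELECT * FROM NAMES"
$command = $con.CreateCommand()
$command.CommandText = $query
$result = $command.ExecuteReader()
$i = 0
while ($result.Read() -and $i -lt 100) {
    Write-Host "value is : $($result.GetValue(0))"   # first column of the current row
    $i++
}
$result.Close()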

Laravel query optimization

I have a query in Laravel:
...
$query = $model::group_by($model->table().'.'.$model::$key);
$selects = array(DB::raw($model->table().'.'.$model::$key));
...
$rows = $query->distinct()->get($selects);
this works fine and gives me the keys that I need, but the problem is that I need to get all the columns and not just the key.
using this:
$selects = array(DB::raw($model->table().'.'.$model::$key), DB::raw($model->table().'.*'));
is not an option, because it doesn't work with PostgreSQL, so I used $rows to get the rest of the columns:
for ($i = 0; $i < count($rows); $i++)
{
    $rows[$i] = $model::find($rows[$i]->key);
}
but as you can see, this is very inefficient. So what can I do to make it faster and more efficient?
you can find the whole code here: https://gist.github.com/neo13/5390091
PS: I would use a join, but I don't know how.
Just don't pass anything in to get() and it will return all the columns. Also the key is presumably unique in the table so I don't exactly understand why you need to do the group by.
$models = $model::group_by( $model->table() . '.'. $model::$key )->get();

How to optimize saving multiple models?

I need to manually save a very large number of order items to the database. I'm currently doing something like this:
for ($x = 0; $x < 250000; $x++)
{
    $orderItem = Mage::getModel('sales/order_item');
    $data = array('a' => 1, 'b' => 2, 'c' => 3);
    $orderItem->setData($data)->save();
}
Attempting to run this code from a shell script takes forever. What strategies can I use to speed up this code?
I'm not sure, but take a look at Mage::getModel('core/resource_transaction').
See https://stackoverflow.com/a/4879133/1191288
You could try doing a batch save every X items:
$batch = 500;
$transactionSave = Mage::getModel('core/resource_transaction');
for ($x = 0; $x < 250000; $x++) {
    // create a fresh model per row: addObject() stores object references,
    // so reusing a single reset() instance would save the last row's data N times
    $orderItem = Mage::getModel('sales/order_item');
    $orderItem->setData(array('a' => 1, 'b' => 2, 'c' => 3));
    $transactionSave->addObject($orderItem);
    if (($x + 1) % $batch == 0) {   // fixed: save after each full batch, not at $x == 0
        $transactionSave->save();
        $transactionSave = Mage::getModel('core/resource_transaction');
    }
}
if ($x % $batch != 0) {   // fixed: flush the final partial batch, if any
    $transactionSave->save();
}
I think in this situation, your best bet to speed this up significantly would be to create custom database queries.
Edit: Related option: try to rewrite the model and resource model to implement a method bulkSave() which is like save() but only creates the corresponding query object and returns it, so it does not actually use the database. Then collect all the queries and run them in big transactions every thousand items or so.
