I have a Laravel application in which I integrated the PHP Graph SDK to use the Facebook Graph API. I have a stats page where I display the number of posts per type, the top 3 posts, and some insight metrics such as "page_post_engagements" in Morris charts.
This is my controller:
public function stats($id,Facebook $fb)
{
$page = Page::find($id);
$page_fb_id = $page->fb_id;
$page_access_token = $page->access_token;
$oAuth2Client = $fb->getOAuth2Client();
$fb->setDefaultAccessToken($oAuth2Client->getLongLivedAccessToken($page_access_token)->getValue());
$nb_photos = '0';
$nb_videos = '0';
$nb_links = '0';
$nb_likes = '0';
$nb_comments = '0';
$nb_shares = '0';
$count_posts = '0';
$startDate = Carbon::now()->subDays('29');
$endDate = Carbon::now();
$numberOfDays = $endDate->diffInDays($startDate);
$classement = array();
$posts = $fb->get('/'.$page_fb_id.'/posts?fields=type&since='.Carbon::parse($startDate).'&until='.Carbon::parse($endDate))->getGraphEdge();
if(empty($posts))
{
$classement = ['total'=>0,'likes'=>0,'comments'=>0,'shares'=>0];
}
else
{
foreach ($posts as $key => $post)
{
if($post['type'] == 'photo')
{
$nb_photos++;
}
else if($post['type'] == 'video')
{
$nb_videos++;
}
else if($post['type'] == 'link')
{
$nb_links++;
}
$count_posts++;
$impress = $fb->get('/'.$post['id'].'/insights?metric=post_impressions_unique')->getGraphEdge();
$engage = $fb->get('/'.$post['id'].'/insights?metric=post_engaged_users')->getGraphEdge();
$classement[] = ['total'=>($engage[0]['values'][0]['value']/$impress[0]['values'][0]['value'])*100,'id'=>$post['id'],'impressions'=>$impress[0]['values'][0]['value'],'engage'=>$engage[0]['values'][0]['value']];
}
if(is_array($classement))
{
asort($classement);
$tops = array_slice($classement, -3, 3);
$start_elem = array_slice($tops, 0, 1);
$mid_elem = array_slice($tops, 1, 1);
$end_elem = end($tops);
}
if($end_elem['id'])
{
$top1 = $fb->get('/'.$end_elem['id'].'/?fields=id,message,full_picture,source,type,created_time,from{name,picture}')->getGraphNode();
$likes1 = $fb->get('/'.$end_elem['id'].'/likes?limit=1000000')->getGraphEdge()->count();
$comments1 = $fb->get('/'.$end_elem['id'].'/comments?limit=1000000')->getGraphEdge()->count();
$shares1 = $fb->get('/'.$end_elem['id'].'/sharedposts?limit=1000000')->getGraphEdge()->count();
}
if($mid_elem[0]['id'])
{
$top2 = $fb->get('/'.$mid_elem[0]['id'].'/?fields=id,message,full_picture,source,type,created_time,from{name,picture}')->getGraphNode();
$likes2 = $fb->get('/'.$mid_elem[0]['id'].'/likes?limit=1000000')->getGraphEdge()->count();
$comments2 = $fb->get('/'.$mid_elem[0]['id'].'/comments?limit=1000000')->getGraphEdge()->count();
$shares2 = $fb->get('/'.$mid_elem[0]['id'].'/sharedposts?limit=1000000')->getGraphEdge()->count();
}
if($start_elem[0]['id'])
{
$top3 = $fb->get('/'.$start_elem[0]['id'].'/?fields=id,message,full_picture,source,type,created_time,from{name,picture}')->getGraphNode();
$likes3 = $fb->get('/'.$start_elem[0]['id'].'/likes?limit=1000000')->getGraphEdge()->count();
$comments3 = $fb->get('/'.$start_elem[0]['id'].'/comments?limit=1000000')->getGraphEdge()->count();
$shares3 = $fb->get('/'.$start_elem[0]['id'].'/sharedposts?limit=1000000')->getGraphEdge()->count();
}
}
foreach (range(0, $numberOfDays) as $day)
{
$a[] = ['year'=>$endDate->copy()->subDays($day)->format('Y-m-d')];
}
foreach (array_reverse($a) as $key => $value)
{
$page_fans = $fb->get('/'.$page_fb_id.'/insights?metric=page_fans&since='.Carbon::parse($value['year']).'&until='.Carbon::parse($value['year'])->addDays('2'))->getGraphEdge()[0]['values'][0]['value'];
$fans[] = ['year'=>$value['year'],'value'=>$page_fans];
$page_post_engagements = $fb->get('/'.$page_fb_id.'/insights?metric=page_post_engagements&since='.Carbon::parse($value['year']).'&until='.Carbon::parse($value['year'])->addDays('2'))->getGraphEdge()[0]['values'][0]['value'];
$post_engagements[] = ['year'=>$value['year'],'value'=>$page_post_engagements];
$page_impressions = $fb->get('/'.$page_fb_id.'/insights?metric=page_impressions&since='.Carbon::parse($value['year']).'&until='.Carbon::parse($value['year'])->addDays('2'))->getGraphEdge()[0]['values'][0]['value'];
$impressions[] = ['year'=>$value['year'],'value'=>$page_impressions];
$page_actions_post_reactions_like_total = $fb->get('/'.$page_fb_id.'/insights?metric=page_actions_post_reactions_like_total&since='.Carbon::parse($value['year']).'&until='.Carbon::parse($value['year'])->addDays('2'))->getGraphEdge()[0]['values'][0]['value'];
$post_reactions_like_total[] = ['year'=>$value['year'],'value'=>$page_actions_post_reactions_like_total];
/*$page_engaged_users = $fb->get('/'.$page_fb_id.'/insights?metric=page_engaged_users&since='.Carbon::parse($value['year']).'&until='.Carbon::parse($value['year'])->addDays('2'))->getGraphEdge()[0]['values'][0]['value'];
$engaged_users[] = ['year'=>$value['year'],'value'=>$page_engaged_users];
$page_views_total = $fb->get('/'.$page_fb_id.'/insights?metric=page_views_total&since='.Carbon::parse($value['year']).'&until='.Carbon::parse($value['year'])->addDays('2'))->getGraphEdge()[0]['values'][0]['value'];
$views_total[] = ['year'=>$value['year'],'value'=>$page_views_total];*/
}
/*$fans_total = end($fans)['value'];
$moyenne_interaction = array_sum(array_column($engaged_users,'value'))/count(array_column($engaged_users,'value'));
$average_interaction = array_sum(array_column($engaged_users,'value'))/count(array_column($engaged_users,'value'))/(end($fans)['value']);*/
$fans = json_encode($fans);
$post_engagements = json_encode($post_engagements);
$impressions = json_encode($impressions);
$post_reactions_like_total = json_encode($post_reactions_like_total);
/*$engaged_users = json_encode($engaged_users);
$views_total = json_encode($views_total);*/
return view('stats',with(compact('randon','id','tasks','page','fans','impressions','post_engagements','post_reactions_like_total','engaged_users','views_total','nb_photos','nb_videos','nb_links','count_posts','fans_total','top1','likes1','comments1','shares1','top2','likes2','comments2','shares2','top3','likes3','comments3','shares3')));
}
Is there a way to optimize my code so that I get a better response time and, above all, avoid the "Connection timed out after 10000 milliseconds" exception?
I'm assuming that fetching and parsing the API data is what takes longer than 10 seconds?
You have two things to do to improve your system: 1) use a cache, and 2) use queues.
Cache! (Save the API/parsed result for 15 minutes or so.)
Queues! (The API fetching and parsing is done in the background on the server.)
Your system workflow will be (a minimal sketch follows below):
User hits /stats.
Is there a cached version? Use it. Response time: barely ~500 ms.
No cache, or expired? Fire a queued job to fetch the new API data.
Show the user "Data is being refreshed!" and display the latest cached data.
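A minimal Laravel sketch of that flow, assuming a hypothetical RefreshPageStats queued job and a per-page cache key (these names are illustrative, not taken from the code above):

use Illuminate\Support\Facades\Cache;
use App\Jobs\RefreshPageStats;   // hypothetical job that runs the Graph API calls

public function stats($id)
{
    $cacheKey = "page_stats_{$id}";

    if (Cache::has($cacheKey)) {
        // Cached stats exist: respond immediately, no Graph API calls on this request.
        return view('stats', Cache::get($cacheKey));
    }

    // No cache, or it has expired: refresh in the background and show a placeholder.
    RefreshPageStats::dispatch($id);   // picked up by a queue worker

    return view('stats_refreshing');   // "Data is being refreshed!" page (hypothetical view)
}

Inside the job's handle() method, the existing Graph API code would run and finish with something like Cache::put("page_stats_{$id}", $stats, now()->addMinutes(15)); so that the next visit is served straight from the cache.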
I imported my large Excel file, with a lot of information, in about 1 second, and I also get all the information I need to display within 1 second. But I have a big problem when displaying the array in the view: because I'm testing whether the cells are correct or not, I'm using 5 ifs and 3 foreach loops, and it takes more than 2 minutes. I need help displaying all the information in a short time. This is my array, and thanks.
Here is the code of my view, which takes too much time to display the information.
Thanks.
//here we get our final result of true and false fields
$finale_array = [];
// here we get all our working/temporary arrays
$current_table2 = [];
$results_applied = [];
$current_result = [];
$columns = Schema::getColumnListing('imports');
// here we get all our conditions
$conditions = DB::table('conditions')->select('number', 'field', 'value')->get();
// here we get all our data
$imports = DB::table('imports')->get();
$results = DB::table('results')->get();
$x = 0;
$default_value = 0;
foreach ($imports as $key => $imported) {
$res = get_object_vars($imported);
foreach ($conditions as $value) {
$array = get_object_vars($value);
$result = $this->test($columns, $array['field']); // the result of our test function
if ($result == "ok") {
if ($res[$array['field']] == $array['value']) {
foreach ($results as $value_result) {
$res_resultat = get_object_vars($value_result);
$test_field = $this->test($columns, $res_resultat['field']);
// testing if the condition number matches the result number
if (($res_resultat['condition_number'] == $array['number'])) {
if (($test_field == 'ok')) {
if ($res['id'] != $default_value) {
// test whether the difference between the id and the default value differs from the current id (a previous row is pending and must be flushed)
if (($res['id'] - $default_value) != $res['id']) {
array_push($current_table2, $results_applied);
array_push($finale_array, $current_table2);
$current_table2 = [];
$results_applied = [];
$default_value = $res['id'];
}
$current_table2 = [$res, $res['id']];
}
$current_result = [$res_resultat['field'], $res[$res_resultat['field']]];
if ($res_resultat['value'] == $res[$res_resultat['field']]) {
$current_result[2] = 'true';
$current_result[3] = $res_resultat['value'];
} else {
$current_result[2] = 'false';
$current_result[3] = $res_resultat['value'];
}
$current_result[4] = $array['number'];
array_push($results_applied, $current_result);
}
}
}
}
}
}
$default_value = $res['id'];
}
array_push($current_table2, $results_applied);
array_push($finale_array, $current_table2);
dd($finale_array);
return view('Appliedconditions', ['imports' => $finale_array, 'columns' => $columns]);
}
From what I can see, you are retrieving all the data without paginating it.
You should paginate the data using the paginate(#elements_per_page) method in your query, where #elements_per_page is the number of elements you want to display per page.
For example:
$elements = Elements::select('*')->paginate(10);
In your Blade view you can then retrieve the pagination links after the closing table tag like this: {{ $elements->links() }}. A sketch applied to the imports query from the question follows.
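For instance, applied to the imports query from the previous question (illustrative only; the existing per-row condition checks would then run on one page of rows at a time):

// Controller: fetch one page of the imports table instead of the whole table.
$imports = DB::table('imports')->paginate(10);

// ... run the existing condition checks against this page of $imports only ...

return view('Appliedconditions', [
    'imports'   => $finale_array,   // built from the current page only
    'columns'   => $columns,
    'paginator' => $imports,        // illustrative name: the paginator, used to render the links
]);

In the Blade view, {{ $paginator->links() }} then renders the navigation between pages.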
I'm looking for the best method to export some data from Elasticsearch.
My index structure is very simple: an ID plus 4 or 5 simple fields.
The ID is the identifier of French companies.
The other fields are, for example, the activity code, the legal form code, the employee size, the zip code, and so on.
The total number of records is about 30M.
The count is very fast, but if I want to extract only the IDs, it takes a long time.
I need this list "online" for further processing in a web page.
How can I do this?
My search query is:
{"query":{"bool":{"must":[{"bool":{"should":[{"match":
{"etatadministratifunitelegale":"A"}}]}},{"bool":{"should":[{"match":
{"etatadministratifetablissement":"A"}}]}},{"bool":{"should":[{"match":
{"activiteprincipaleunitelegale":"68.20A"}}]}}], "must_not":[], "should":[] } } }
It is $params in my PHP code.
This query matches more than 650,000 IDs. To extract them:
$hosts = [
    'http://xxxxxxxxx:9200'
];
$client = Elasticsearch\ClientBuilder::create()
    ->setHosts($hosts)
    ->build();

// Initial search: only the _id of each hit is needed, so no stored fields are requested.
$xparams['index'] = 'vecteur';
$xparams['type'] = '_doc';
$xparams['stored_fields'] = [];
$xparams['scroll'] = '1s';    // keep-alive of the scroll context between requests
$xparams['size'] = '50000';   // batch size per scroll request
$xparams['body'] = $params;
$res = $client->search($xparams);
$scroll_id = $res['_scroll_id'];

// Write the IDs of the first batch ($myfile is assumed to be opened with fopen() earlier).
$zzz = $res['hits']['hits'];
$xstr = "";
foreach ($zzz as $elt) {
    $xstr .= $elt['_id']."\n";
}
fwrite($myfile, $xstr);

// Keep scrolling until a batch comes back empty.
while (true) {
    $params2['scroll'] = '1s';
    $params2['scroll_id'] = $scroll_id;
    $response = $client->scroll($params2);
    $zzz = $response['hits']['hits'];
    $xstr = "";
    foreach ($zzz as $elt) {
        $xstr .= $elt['_id']."\n";
    }
    fwrite($myfile, $xstr);
    if (count($response['hits']['hits']) > 0) {
        $scroll_id = $response['_scroll_id'];
    } else {
        break;
    }
}
fclose($myfile);
$myfile is a CSV file. Is that clear?
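For comparison, here is a hedged sketch of the same extraction with a longer scroll keep-alive and the duplicated write logic folded into a single loop. It assumes the same elasticsearch-php client and query as above and that $myfile has already been opened with fopen(); note that newer client versions expect scroll_id inside the request body rather than as a top-level parameter.

$xparams['scroll'] = '2m';    // keep the scroll context alive long enough to process each batch
$xparams['size']   = 10000;   // batch size; very large batches tend to be slower, not faster

$res = $client->search($xparams);

while (!empty($res['hits']['hits'])) {
    // collect the _id of every hit in this batch and write them in one go
    $ids = array_column($res['hits']['hits'], '_id');
    fwrite($myfile, implode("\n", $ids) . "\n");

    // fetch the next batch, refreshing the keep-alive each time
    $res = $client->scroll([
        'scroll_id' => $res['_scroll_id'],
        'scroll'    => '2m',
    ]);
}

// free the scroll context on the server once the export is finished
$client->clearScroll(['scroll_id' => $res['_scroll_id']]);

fclose($myfile);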
I am making a report in which the user can check the due balance of their invoices by age: how much is due between 1 and 30 days, how much between 31 and 60 days, and so on. For that I have written the function below.
I don't know how to optimize its processing time.
public function yajraAgingARSummaryByDueDate()
{
$customerData = GenerateInvoice::select('orders.customer_id','customers.first_name', 'customers.last_name','customers.terms','customers.account_id','customers.account_suffix','invoice.due_date','invoice.invoice_number')
->leftJoin('orders','orders.id','=','invoice.order_id')
->leftJoin('customers', 'customers.id', 'orders.customer_id')
->whereDate('invoice.due_date','<=',date('Y-m-d'))
->whereNotNull('orders.customer_id')
->where('invoice.is_invoice_paid','=','no')
->whereNull('invoice.deleted_at')
->groupBy('orders.customer_id')
->get();
$resultArr = [];
if(count($customerData))
{
foreach($customerData as $key => $row)
{
$current = 0;
$b_1_30 = 0;
$b_31_60 = 0;
$b_61_90 = 0;
$b_over_90 = 0;
$total_ar = 0;
$balance = 0;
$invoiceData = GenerateInvoice::select('invoice.id')
->leftJoin('orders','orders.id','=','invoice.order_id')
->where('orders.customer_id',$row['customer_id'])
->whereDate('invoice.due_date','<=',date('Y-m-d'))
->whereNotNull('orders.customer_id')
->whereNull('invoice.deleted_at')
->where('invoice.is_invoice_paid','=','no')
->get();
if(count($invoiceData))
{
foreach($invoiceData as $row1)
{
$data = GenerateInvoice::select(DB::raw("abs(DATEDIFF(STR_TO_DATE(due_date, '%Y-%m-%d'),CURDATE())) AS Days"),'orders.total_order_value','payments.payment_amount')
->selectRaw(' (select sum(auto_cash_distributions.payment_amount) from auto_cash_distributions where auto_cash_distributions.order_id = invoice.order_id and payment_amount is not null ) as acd_payament_amount ')
->leftJoin('orders','orders.id','=','invoice.order_id')
->leftJoin('payments','payments.order_id','=','orders.id')
->where('invoice.id',$row1['id'])
->whereNotNull('orders.customer_id')
->whereNull('invoice.deleted_at')
->whereNull('payments.deleted_at')
->where('invoice.is_invoice_paid','=','no')
// ->groupBy('payments.invoice_id')
->first();
$days = $data['Days'];
$acd_payament_amount = !(empty($data['acd_payament_amount'])) ? $data['acd_payament_amount'] : 0;
$payment_amount = $data['payment_amount'] + $acd_payament_amount;
$amount = $data['total_order_value'] - $payment_amount;
// H 30/12
$amount = $amount;
$balance =$balance + $amount;
$total_ar = $total_ar + $amount;
// $total_ar = $amount;
if($days>90)
{
$b_over_90 = $b_over_90 + $amount;
}
else if($days <=90 && $days>=61)
{
$b_61_90 = $b_61_90 + $amount;
}
else if($days <=60 && $days>=31)
{
$b_31_60 = $b_31_60 + $amount;
}
else if($days <=30 && $days>=1)
{
$b_1_30 = $b_1_30 + $amount;
}
else
{
$current = $current + $amount;
}
}
}
$resultArr[$key]['account_id'] = $row['account_id'];
$resultArr[$key]['account_suffix'] = $row['account_suffix'];
$resultArr[$key]['first_name'] = $row['first_name'];
$resultArr[$key]['terms'] = $row['terms'];
$resultArr[$key]['due_date'] = $row['due_date'];
$resultArr[$key]['invoice_number'] = $row['invoice_number'];
$resultArr[$key]['balance'] = number_format($balance,2,'.','');
$resultArr[$key]['current'] = number_format($current,2,'.','');
$resultArr[$key]['b_1_30'] = number_format($b_1_30,2,'.','');
$resultArr[$key]['b_31_60'] = number_format($b_31_60,2,'.','');
$resultArr[$key]['b_61_90'] = number_format($b_61_90,2,'.','');
$resultArr[$key]['b_over_90'] = number_format($b_over_90,2,'.','');
$resultArr[$key]['total_ar'] = number_format($total_ar,2,'.','');
}
}
return Datatables::of($resultArr)
->addColumn('due_date',function($sql){
return !empty($sql['due_date']) ? date(get_config('date_format'),strtotime($sql['due_date'])) : '';
})
->addIndexColumn()
->rawColumns(['due_date'])
->make(true);
}
But this code takes too much time to load. Can anybody help me optimize it?
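For what it's worth, the function above issues one query per customer and then another query per invoice (a classic N+1 pattern), so the database round-trips dominate the time. Below is a hedged sketch of the same bucketing done with a single query plus PHP grouping, reusing the table and column names from the code above; the payments / auto_cash_distributions subtraction is omitted for brevity and would have to be added back (ideally as joined aggregates).

// Fetch every unpaid, due invoice for all customers in one query.
$invoices = GenerateInvoice::select(
        'orders.customer_id',
        'orders.total_order_value',
        'invoice.due_date',
        DB::raw('DATEDIFF(CURDATE(), invoice.due_date) AS days_overdue')
    )
    ->leftJoin('orders', 'orders.id', '=', 'invoice.order_id')
    ->whereDate('invoice.due_date', '<=', date('Y-m-d'))
    ->whereNotNull('orders.customer_id')
    ->where('invoice.is_invoice_paid', '=', 'no')
    ->whereNull('invoice.deleted_at')
    ->get()
    ->groupBy('customer_id');   // Collection::groupBy runs in PHP, no extra queries

$resultArr = [];
foreach ($invoices as $customerId => $rows) {
    $buckets = ['current' => 0, 'b_1_30' => 0, 'b_31_60' => 0, 'b_61_90' => 0, 'b_over_90' => 0];
    foreach ($rows as $row) {
        $amount = $row->total_order_value;   // minus payments in the real report
        $days   = (int) $row->days_overdue;
        if ($days > 90) {
            $buckets['b_over_90'] += $amount;
        } elseif ($days >= 61) {
            $buckets['b_61_90'] += $amount;
        } elseif ($days >= 31) {
            $buckets['b_31_60'] += $amount;
        } elseif ($days >= 1) {
            $buckets['b_1_30'] += $amount;
        } else {
            $buckets['current'] += $amount;
        }
    }
    $resultArr[] = ['customer_id' => $customerId] + $buckets;
}

Because everything arrives in one result set, the per-customer loop never touches the database again; the customer names and other display fields can then be merged in from one additional, equally small query if needed.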
I get some data from the database:
$searchModel = new InventorySearch();
$dataProvider = $searchModel->search(Yii::$app->request->queryParams);
The revenue and the cost of each product are calculated using a complex algorithm. After that, the $dataProvider is updated with the new costs and revenues as below:
foreach ($dataProvider->models as $model) {
$model->cost = 0;
$model->revenue = 0;
$model->profit = 0;
$model->profit_margin = 0;
foreach($calculated_costs as $item)
{
if($model->id == $item['product_id'])
{
$model->cost = $item['total_cost'];
break;
}
}
foreach($calculated_revenue as $rev_item)
{
if($model->id == $rev_item['product_id'])
{
$model->revenue = $rev_item['total_sales'];
break;
}
}
$model->profit = floatval($model->revenue) - floatval($model->cost) ;
if($model->revenue != 0)
{
$model->profit_margin = ( floatval($model->profit) ) / floatval($model->revenue) ;
}
}
I need to sort the data provider by profit after updating it. Is there a way to do that?
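One hedged option, since the profits are only known after the loop above has run, is to sort the models that have already been fetched and push them back into the data provider. Note that this only reorders the models the provider currently holds (i.e. the current page):

// Sort the computed models by profit, highest first, and hand them back to the provider.
$models = $dataProvider->getModels();

usort($models, function ($a, $b) {
    return $b->profit <=> $a->profit;   // descending by profit
});

$dataProvider->setModels($models);

getModels() and setModels() come from yii\data\BaseDataProvider, so this works with the provider returned by the search model. Sorting across all pages would require computing the profit before pagination, for example in SQL or by building an ArrayDataProvider over the full, already-computed result set.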
I've been using Subsonic 3.0.0.4 (ActiveRecord approach) for a while, and I recently coded a small page that retrieves about 500 records for a given year; I then loop through each of them, creating a new instance of the ActiveRecord class, modifying just the Period field, and saving each instance inside the loop.
The issue is that after executing that page, a LOT of SQL connections are left hanging/open in SQL Server (judging by sp_who2). Before the page finishes executing, I get the "Timeout expired. The timeout period elapsed prior to obtaining a connection from the pool. This may have occurred because all pooled connections were in use and max pool size was reached." error.
The code is the following:
if (string.IsNullOrEmpty(tbPeriodoAGenerar.Text)) return;
var idPeriodo = Convert.ToInt32(tbPeriodoAGenerar.Text);
var nuevaEncuesta = new Encuesta();
nuevaEncuesta.IdPeriodo = idPeriodo;
nuevaEncuesta.IdResponsable = 1;
nuevaEncuesta.fechaCierre1 = Convert.ToDateTime(dpFechaCierre1.Value);
nuevaEncuesta.fechaCierre2 = Convert.ToDateTime(dpFechaCierre2.Value);
nuevaEncuesta.IdTipoEncuesta = (int)ETipoEncuesta.PorAnio;
nuevaEncuesta.nombreEncuesta = NombresEncuestas.COVA;
nuevaEncuesta.nombrePublico = NombresEncuestas.COVA_PUBLICO;
nuevaEncuesta.Save();
var empresasActivas = Empresa.Find(x => x.activo == 1);
foreach (var empresa in empresasActivas)
{
EmpresaEncuesta ee = new EmpresaEncuesta();
ee.IdEmpresa = empresa.IdEmpresa;
ee.IdEncuesta = nuevaEncuesta.IdEncuesta;
ee.IdEstatusContestado = (int)EEstatusEmpresaEncuesta.SinContestar;
ee.fechaMod = DateTime.Now;
ee.IdUsuario = 1;
ee.ipMod = IpUsuarioActual;
ee.Save();
}
if (chkMigrarRespuestas.Checked)
{
var periodosAnteriores = new EncuestaBO().ObtenerPeriodosAnteriores(NombresEncuestas.COVA, idPeriodo);
int? periodoAnterior = null;
if (periodosAnteriores.Tables[0].Rows.Count > 0)
{
periodoAnterior = Convert.ToInt32(periodosAnteriores.Tables[0].Rows[0][Columnas.ID_PERIODO]);
}
if (!periodoAnterior.HasValue) return;
var respuestasCortoPlazo = COVACortoPlazo.Find(x => x.Periodo == (periodoAnterior));
COVACortoPlazo ccp;
foreach (var ccpAnterior in respuestasCortoPlazo)
{
if (!empresasActivas.Where(emp => emp.IdEmpresa == ccpAnterior.IdEmpresa).Any()) continue;
ccp = new COVACortoPlazo();
ccp.IdEmpresa = ccpAnterior.IdEmpresa;
ccp.CuentaCortoPlazo = ccpAnterior.CuentaCortoPlazo;
ccp.ComentariosAdicionales = ccpAnterior.ComentariosAdicionales;
ccp.RetiroVoluntarioOpcionId = ccpAnterior.RetiroVoluntarioOpcionId;
ccp.RetiroVoluntarioOtroDesc = ccpAnterior.RetiroVoluntarioOtroDesc;
ccp.RetiroEmpresaOpcionId = ccpAnterior.RetiroEmpresaOpcionId;
ccp.RetiroEmpresaOtroDesc = ccpAnterior.RetiroEmpresaOtroDesc;
ccp.Periodo = idPeriodo;
ccp.Save();
}
var tablaCortoPlazoAnterior = COVATablaCortoPlazo.Find(x => x.Periodo == (periodoAnterior));
COVATablaCortoPlazo ctcp;
foreach (var ctcpAnterior in tablaCortoPlazoAnterior)
{
if (!empresasActivas.Where(emp => emp.IdEmpresa == ctcpAnterior.IdEmpresa).Any()) continue;
ctcp = new COVATablaCortoPlazo();
ctcp.IdEmpresa = ctcpAnterior.IdEmpresa;
ctcp.Periodo = idPeriodo;
ctcp.COVASegmentoOpcionId = ctcpAnterior.COVASegmentoOpcionId;
ctcp.NivelDinamicaMin = ctcpAnterior.NivelDinamicaMin;
ctcp.NivelDinamicaMax = ctcpAnterior.NivelDinamicaMax;
ctcp.NombreBono = ctcpAnterior.NombreBono;
ctcp.COVAPeriodicidadOpcionId = ctcpAnterior.COVAPeriodicidadOpcionId;
ctcp.MetodoCalculo = ctcpAnterior.MetodoCalculo;
ctcp.COVABaseCalculoOpcionId = ctcpAnterior.COVABaseCalculoOpcionId;
ctcp.RealAnualizado = ctcpAnterior.RealAnualizado;
ctcp.Save();
}
var respuestasAnual = COVAAnual.Find(x => x.Periodo == (periodoAnterior));
COVAAnual ca;
foreach (var caAnterior in respuestasAnual)
{
if (!empresasActivas.Where(emp => emp.IdEmpresa == caAnterior.IdEmpresa).Any()) continue;
ca = new COVAAnual();
ca.IdEmpresa = caAnterior.IdEmpresa;
ca.CuentaAnual = caAnterior.CuentaAnual;
ca.NombreBono = caAnterior.NombreBono;
ca.FechaPago = caAnterior.FechaPago;
ca.ComentariosAdicionales = caAnterior.ComentariosAdicionales;
ca.RetiroVoluntarioOpcionId = caAnterior.RetiroVoluntarioOpcionId;
ca.RetiroVoluntarioOtroDesc = caAnterior.RetiroVoluntarioOtroDesc;
ca.RetiroEmpresaOpcionId = caAnterior.RetiroEmpresaOpcionId;
ca.RetiroEmpresaOtroDesc = caAnterior.RetiroEmpresaOtroDesc;
ca.Periodo = idPeriodo;
ca.Save();
}
var tablaAnualAnterior = COVATablaAnual.Find(x => x.Periodo == (periodoAnterior));
COVATablaAnual cta;
foreach (var ctaAnterior in tablaAnualAnterior)
{
if (!empresasActivas.Where(emp => emp.IdEmpresa == ctaAnterior.IdEmpresa).Any()) continue;
cta = new COVATablaAnual();
cta.IdEmpresa = ctaAnterior.IdEmpresa;
cta.Periodo = idPeriodo;
cta.COVASegmentoOpcionId = ctaAnterior.COVASegmentoOpcionId;
cta.NivelDinamicaMin = ctaAnterior.NivelDinamicaMin;
cta.NivelDinamicaMax = ctaAnterior.NivelDinamicaMax;
cta.Minimo = ctaAnterior.Minimo;
cta.Target = ctaAnterior.Target;
cta.Maximo = ctaAnterior.Maximo;
cta.RealAnualPagado = ctaAnterior.RealAnualPagado;
cta.MetodoCalculo = ctaAnterior.MetodoCalculo;
cta.COVABaseCalculoOpcionId = ctaAnterior.COVABaseCalculoOpcionId;
cta.Save();
}
var respuestasLargoPlazo = COVALargoPlazo.Find(x => x.Periodo == (periodoAnterior));
COVALargoPlazo clp;
foreach (var clpAnterior in respuestasLargoPlazo)
{
if (!empresasActivas.Where(emp => emp.IdEmpresa == clpAnterior.IdEmpresa).Any()) continue;
clp = new COVALargoPlazo();
clp.IdEmpresa = clpAnterior.IdEmpresa;
clp.CuentaLargoPlazo = clpAnterior.CuentaLargoPlazo;
clp.ComentariosAdicionales = clpAnterior.ComentariosAdicionales;
clp.RetiroVoluntarioOpcionId = clpAnterior.RetiroVoluntarioOpcionId;
clp.RetiroVoluntarioOtroDesc = clpAnterior.RetiroVoluntarioOtroDesc;
clp.RetiroEmpresaOpcionId = clpAnterior.RetiroEmpresaOpcionId;
clp.RetiroEmpresaOtroDesc = clpAnterior.RetiroEmpresaOtroDesc;
clp.PermiteCompraAcciones = clpAnterior.PermiteCompraAcciones;
clp.Periodo = idPeriodo;
clp.Save();
}
var tablaLargoPlazoAnterior = COVATablaLargoPlazo.Find(x => x.Periodo == (periodoAnterior));
COVATablaLargoPlazo ctlp;
foreach (var ctlpAnterior in tablaLargoPlazoAnterior)
{
if (!empresasActivas.Where(emp => emp.IdEmpresa == ctlpAnterior.IdEmpresa).Any()) continue;
ctlp = new COVATablaLargoPlazo();
ctlp.IdEmpresa = ctlpAnterior.IdEmpresa;
ctlp.Periodo = idPeriodo;
ctlp.NombrePlan = ctlpAnterior.NombrePlan;
ctlp.COVATipoPlanOpcionId = ctlpAnterior.COVATipoPlanOpcionId;
ctlp.COVASegmentoOpcionId = ctlpAnterior.COVASegmentoOpcionId;
ctlp.NivelDinamicaMin = ctlpAnterior.NivelDinamicaMin;
ctlp.NivelDinamicaMax = ctlpAnterior.NivelDinamicaMax;
ctlp.RealPagadoFinalPlan = ctlpAnterior.RealPagadoFinalPlan;
ctlp.AniosEjerce = ctlpAnterior.AniosEjerce;
ctlp.MetodoCalculo = ctlpAnterior.MetodoCalculo;
ctlp.BaseCalculo = ctlpAnterior.BaseCalculo;
ctlp.Save();
}
var respuestasVentas = COVAVentas.Find(x => x.Periodo == (periodoAnterior));
COVAVentas cv;
foreach (var cvAnterior in respuestasVentas)
{
if (!empresasActivas.Where(emp => emp.IdEmpresa == cvAnterior.IdEmpresa).Any()) continue;
cv = new COVAVentas();
cv.IdEmpresa = cvAnterior.IdEmpresa;
cv.CuentaVentas = cvAnterior.CuentaVentas;
cv.ComentariosAdicionales = cvAnterior.ComentariosAdicionales;
cv.RetiroVoluntarioOpcionId = cvAnterior.RetiroVoluntarioOpcionId;
cv.RetiroVoluntarioOtroDesc = cvAnterior.RetiroVoluntarioOtroDesc;
cv.RetiroEmpresaOpcionId = cvAnterior.RetiroEmpresaOpcionId;
cv.RetiroEmpresaOtroDesc = cvAnterior.RetiroEmpresaOtroDesc;
cv.Periodo = idPeriodo;
cv.Save();
}
var tablaVentasAnterior = COVATablaVentas.Find(x => x.Periodo == (periodoAnterior));
COVATablaVentas ctv;
foreach (var ctvAnterior in tablaVentasAnterior)
{
if (!empresasActivas.Where(emp => emp.IdEmpresa == ctvAnterior.IdEmpresa).Any()) continue;
ctv = new COVATablaVentas();
ctv.IdEmpresa = ctvAnterior.IdEmpresa;
ctv.Periodo = idPeriodo;
ctv.COVASegmentoOpcionId = ctvAnterior.COVASegmentoOpcionId;
ctv.COVAPeriodicidadOpcionId = ctvAnterior.COVAPeriodicidadOpcionId;
ctv.Minimo = ctvAnterior.Minimo;
ctv.Target = ctvAnterior.Target;
ctv.Maximo = ctvAnterior.Maximo;
ctv.RealAnualizado = ctvAnterior.RealAnualizado;
ctv.MetodoCalculo = ctvAnterior.MetodoCalculo;
ctv.BaseCalculo = ctvAnterior.BaseCalculo;
ctv.Save();
}
var respuestasGenerales = COVAGenerales.Find(x => x.Periodo == (periodoAnterior));
COVAGenerales cg;
foreach (var cgAnterior in respuestasGenerales)
{
if (!empresasActivas.Where(emp => emp.IdEmpresa == cgAnterior.IdEmpresa).Any()) continue;
cg = new COVAGenerales();
cg.IdEmpresa = cgAnterior.IdEmpresa;
cg.AccionesPorSituacionActual = cgAnterior.AccionesPorSituacionActual;
cg.ComentariosAccionesSituacionActual = cgAnterior.ComentariosAccionesSituacionActual;
cg.TomaCuentaSituacionDefinicionObjetivos = cgAnterior.TomaCuentaSituacionDefinicionObjetivos;
cg.Periodo = idPeriodo;
cg.Save();
}
}
Am I doing this the wrong way? At this point I am not sure whether this is a Subsonic bug or whether I need to close the connections myself somehow.
I've googled for posts about similar problems with Subsonic, but none have come up. The usual cause of the error I'm getting is not closing the SqlDataReader, but I honestly don't believe Subsonic is leaving it open, and I'm using the latest version.
Any ideas? Any help is greatly appreciated.
Any time you have a loop over ORM-based objects, you have to consider that there's probably an N+1 issue. I can't see your model, but I'll bet that in your loop you're executing a number of additional queries.
I know that Save() fires and closes an ExecuteScalar(), so that by itself should not leave a connection open. However, if you're fetching related records inside that loop, then yes, that could cause problems.
So I would recommend using a profiler of some kind, together with the debugger, to step through your loop and see what queries are made.
Alternatively, this is ripe for the BatchInsert stuff, which would keep everything in a nice, tidy, single-connection transaction.
Read here for more:
http://subsonic.wekeroad.com/docs/Linq_Inserts