In MVC 6/EF 7, should the order in which I call Include() to load navigation properties make a difference to a query?
This query works:
var vt = await db.VehicleTypes
    .Include(t => t.Photos)
    .Include(t => t.VehicleModels)
        .ThenInclude(m => m.Units)
    .Include(t => t.Rates)
        .ThenInclude(r => r.DailyPrice.Currency)
    .ToListAsync();
But this query throws an exception at ToListAsync():
var vt = await db.VehicleTypes
    .Include(t => t.Photos)
    .Include(t => t.Rates)
        .ThenInclude(r => r.DailyPrice.Currency)
    .Include(t => t.VehicleModels)
        .ThenInclude(m => m.Units)
    .ToListAsync();
The error is
ArgumentOutOfRangeException: Index was out of range. Must be non-negative and less than the size of the collection.
Parameter name: index
I understand it's a beta and there may be bugs. In this case, is it a bug or designed behavior?
Looks like a bug; the order shouldn't matter. Would you mind creating an issue?
The following code works most of the time but sometimes it throws an exception with this message:
Invalid NEST response built from a unsuccessful () low level call on POST: /queries2020-09/_search?typed_keys=true
var response = await client.SearchAsync<LogEntry>(s => s
    .Query(q => q
        .Bool(b => b
            .Must(
                m => m.DateRange(r => r.Field(l => l.DateTimeUTC)
                    .GreaterThanOrEquals(new DateMathExpression(since))),
                m => m.Term(term)
            )))
    .Aggregations(a => a
        .Sum("total-cost", descriptor => descriptor
            .Field(f => f.Cost)
            .Missing(1)))
    .Size(0));

if (!response.IsValid)
{
    throw new Exception("Elasticsearch response error. " + response.ToString());
}
This seems to be a very generic message that pops up a lot on Q&A websites. How do I debug it to see the root cause?
Using NEST 7.6.1.
It may be better to write out the debug information rather than calling .ToString():
if (!response.IsValid)
{
    throw new Exception("Elasticsearch response error. " + response.DebugInformation);
}
The debug information includes the audit trail and details about an error/exception, if there is one. It's a convenient way of collecting the pertinent information available on IResponse in a human-readable form.
If a response is always checked for validity and an exception thrown when it isn't valid, you may want to call ThrowExceptions() on ConnectionSettings so that the client itself throws when an error occurs.
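As a rough sketch only (the node URI and default index below are assumptions for illustration, not taken from the question), enabling this looks something like:
// Hypothetical node address and default index, for illustration only.
var settings = new ConnectionSettings(new Uri("http://localhost:9200"))
    .DefaultIndex("queries2020-09")
    .ThrowExceptions(); // failed calls now throw instead of returning an invalid response

var client = new ElasticClient(settings);
With this enabled, an unsuccessful low-level call surfaces as an exception, so the manual IsValid check becomes unnecessary.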
I'm working on a project at work that uses the Facebook API to build ads (yes, I know you can do it from Facebook Ads Manager, but my company wants me to create our own interface). I'm able to create campaigns and ad sets through the API, but I can't seem to have any luck with ads or ad creatives.
{message: "Invalid parameter", exception: "FacebookAds\Http\Exception\AuthorizationException",…}
exception
:
"FacebookAds\Http\Exception\AuthorizationException"
file
:
"/Users/bradgoldsmith/Desktop/SquibLib/vendor/facebook/php-ads-sdk/src/FacebookAds/Http/Exception/RequestException.php"
line
:
144
message
:
"Invalid parameter"
trace
:
[{,…}, {,…},…]
That's the error I get. From the looks of it, it has something to do with authorization, but since I'm able to create campaigns and ad sets, I figured I was authorized. I'm just testing dummy data on a page that I'm an admin of. Any advice or pointers in the right direction would be greatly appreciated.
It turns out the authorization issue has something to do with the fact that our app is still in review. I was, however, able to create a carousel ad creative through the API with this bit of code, which I also got from their documentation. The code that worked for me is here:
$product1 = (new AdCreativeLinkDataChildAttachment())->setData(array(
    AdCreativeLinkDataChildAttachmentFields::LINK => 'https://www.link.com/product1',
    AdCreativeLinkDataChildAttachmentFields::NAME => 'Product 1',
    AdCreativeLinkDataChildAttachmentFields::DESCRIPTION => '$8.99',
    AdCreativeLinkDataChildAttachmentFields::IMAGE_HASH => '<IMAGE_HASH>',
));

$product2 = (new AdCreativeLinkDataChildAttachment())->setData(array(
    AdCreativeLinkDataChildAttachmentFields::LINK => 'https://www.link.com/product2',
    AdCreativeLinkDataChildAttachmentFields::NAME => 'Product 2',
    AdCreativeLinkDataChildAttachmentFields::DESCRIPTION => '$9.99',
    AdCreativeLinkDataChildAttachmentFields::IMAGE_HASH => '<IMAGE_HASH>',
));

$product3 = (new AdCreativeLinkDataChildAttachment())->setData(array(
    AdCreativeLinkDataChildAttachmentFields::LINK => 'https://www.link.com/product3',
    AdCreativeLinkDataChildAttachmentFields::NAME => 'Product 3',
    AdCreativeLinkDataChildAttachmentFields::DESCRIPTION => '$10.99',
    AdCreativeLinkDataChildAttachmentFields::IMAGE_HASH => '<IMAGE_HASH>',
));

$link_data = new AdCreativeLinkData();
$link_data->setData(array(
    AdCreativeLinkDataFields::LINK => '<URL>',
    AdCreativeLinkDataFields::CHILD_ATTACHMENTS => array(
        $product1, $product2, $product3,
    ),
));

$object_story_spec = new AdCreativeObjectStorySpec();
$object_story_spec->setData(array(
    AdCreativeObjectStorySpecFields::PAGE_ID => <PAGE_ID>,
    AdCreativeObjectStorySpecFields::LINK_DATA => $link_data,
));

$creative = new AdCreative(null, 'act_<AD_ACCOUNT_ID>');
$creative->setData(array(
    AdCreativeFields::NAME => 'Sample Creative',
    AdCreativeFields::OBJECT_STORY_SPEC => $object_story_spec,
));
$creative->create();
I was trying to understand whether RxJS would be a good fit for solving the problem that this Node module addresses: https://github.com/ericdolson/debouncing-batch-queue
Its description says: "A queue which will emit and clear its contents when its size or timeout is reached. Ideal for aggregating data for bulk apis where batching in a timely manner is best. Or anything really where batching data is needed."
If so, could someone walk me through how to implement the simple example in this npm module with RxJS? Ideally with ES5 if possible.
There's an operator for that™: bufferWithTimeOrCount (RxJS 4). If you need it to be truly equivalent, the input stream would be a Subject, with groupBy for namespaces, like the following:
var dbbq$ = new Rx.Subject();

dbbq$.groupBy(function(v_ns) { return v_ns[1]; })
    .flatMap(function(S) {
        return S.bufferWithTimeOrCount(1000, 2);
    });

dbbq$.onNext([ 'ribs 0' ]);
dbbq$.onNext([ 'more ribs', 'bbq1' ]);

// is analogous to
var dbbq = new DBBQ(1000, 2);
dbbq.add('ribs 0');
dbbq.add('more ribs', 'bbq1');
No way I'm doing this with ES5 :)
const dataWithNamespace = (data, namespace) => ({data, namespace});
const source = [
dataWithNamespace('ribs 0'),
dataWithNamespace('ribs 1'),
dataWithNamespace('ribs 2'),
dataWithNamespace('ribs 3'),
dataWithNamespace('ribs 4'),
dataWithNamespace('more ribs', 'bbq1'),
dataWithNamespace('more ribs', 'bbq1'),
dataWithNamespace('brisket', 'best bbq namespace')
];
const DBBQ = (debounceTimeout, maxBatchSize) =>
source$ => source$
.groupBy(x => x.namespace)
.mergeMap(grouped$ => grouped$
.switchMap(x =>
Rx.Observable.of(x.data)
.concat(Rx.Observable.of(undefined)
.delay(debounceTimeout)
)
)
.bufferCount(maxBatchSize)
.filter(x => x.length == maxBatchSize)
.map(x => x.filter(x => x !== undefined))
);
const source$ = Rx.Observable.from(source);
DBBQ(1000, 2)(source$).subscribe(console.log)
<script src="https://cdnjs.cloudflare.com/ajax/libs/rxjs/5.5.6/Rx.js"></script>
I have the following query, which I build piecemeal/dynamically using "&=". I'm on Elasticsearch 5.x and NEST 5.x.
QueryContainer qfilter = null;
qfilter = Query<ClassEL>.Term(m => m.OrderID, iOrderID);
qfilter &= Query<ClassEL>
    .Range(r => r
        .Field(f => f.Distance)
        .GreaterThan(100))
    && Query<ClassEL>.Term(t => t.Active, true);

var searchDes = new SearchDescriptor<ClassEL>()
    .From(0)
    .Size(10)
    .Query(qfilter); // <===== *** ERROR HERE ***
In Visual Studio, it shows the following error message:
Error: Cannot convert from 'Nest.QueryContainer' to 'System.Func<Nest.QueryContainerDescriptor<ClassEL>, Nest.QueryContainer>'
The problem is that I can't get the SearchDescriptor to accept the query I built. Examples online show Search and Query rolled into one, which differs from what I'm trying to accomplish. Below is a common example of what I want to avoid:
var response = client.Search<Tweet>(s => s
.From(0)
.Size(10)
.Query(q =>
q.Term(t => t.User, "kimchy")
|| q.Match(mq => mq.Field(f => f.User).Query("nest"))
)
);
EDIT: Using Andrey's answer works just fine. However, a problem arises when I try to get the results back from the search query:
List<ClassViewEL> listDocuments = response.Documents.ToList();
Visual Studio doesn't highlight it as an error immediately, but at compile time it reports a problem:
error CS0570: 'Nest.ISearchResponse.Documents' is not supported by the language
Debugging and choosing to ignore the above error works fine; the code executes just as expected with no problems. However, the compile-time error will prevent code deployments. How can this error be fixed?
Solution to EDIT: One of the dependencies in my project (Newtonsoft.Json.dll) was targeting an older version, causing the error to appear. Cleaning the solution and rebuilding fixes it.
You should use Func<SearchDescriptor<ClassEL>, ISearchRequest> or pass the descriptor to a separate method. For example:
var queryContainer = Query<ClassEL>.Term(x => x.Field(f => f.FirstName).Value("FirstName"));
queryContainer &= Query<ClassEL>.Term(x => x.Field(f => f.LastName).Value("LastName"));
Func<SearchDescriptor<ClassEL>, ISearchRequest> searchFunc = searchDescriptor => searchDescriptor
.Index("indexName")
.Query(q => queryContainer);
var response = _client.Search<ClassEL>(searchFunc);
Or like this:
ISearchRequest ExecuteQuery(SearchDescriptor<ClassEL> searchDescriptor, QueryContainer queryContainer)
{
    return searchDescriptor.Index("indexName")
        .Query(q => queryContainer);
}

public void GetResults()
{
    var queryContainer = Query<ClassEL>.Term(x => x.Field(f => f.FirstName).Value("FirstName"));
    queryContainer &= Query<ClassEL>.Term(x => x.Field(f => f.LastName).Value("LastName"));

    var response = _client.Search<ClassEL>(s => ExecuteQuery(s, queryContainer));
}
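Another option, as a sketch under the assumption that the rest of the setup matches the question (the index name below is a placeholder), is NEST's object initializer syntax, which accepts the prebuilt QueryContainer directly on a request object:
// Sketch: assign the QueryContainer built with &= straight onto a SearchRequest.
var request = new SearchRequest<ClassEL>("indexName")
{
    From = 0,
    Size = 10,
    Query = qfilter
};

var response = _client.Search<ClassEL>(request);
This avoids the descriptor/Func conversion entirely, since Search<T> also has an overload that takes an ISearchRequest.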
I am using Ruby 1.9.3 without Rails and version 1.0.4 of the Gibbon gem.
I have referrals populated with my list and can send the following to MailChimp with Gibbon. However, only the email address and email type fields are populated in the list in MailChimp. What am I doing wrong that is preventing the merge fields from being imported via the API?
Here is the batching and mapping of the list:
referrals.each_slice(3) do |batch|
  begin
    prepared_batch = batch.map do |referral|
      {
        :EMAIL => {:email => referral['client_email']},
        :EMAIL_TYPE => 'html',
        :MMERGE6 => referral['field_1'],
        :MMERGE7 => referral['field_2'],
        :MMERGE8 => referral['field_3'],
        :MMERGE9 => referral['field_4'],
        :MMERGE11 => referral['field_5'],
        :MMERGE12 => referral['field_6'],
        :MMERGE13 => referral['field_7'],
        :MMERGE14 => referral['field_8'],
        :MMERGE15 => referral['field_9'],
        :FNAME => referral['client_first_name']
      }
    end

    @log.info("prepared_batch : #{prepared_batch}")

    result = @gibbon.lists.batch_subscribe(
      :id => @mc_list_id,
      :batch => prepared_batch,
      :double_optin => false,
      :update_existing => true
    )

    @log.info("#{result}")
  rescue Exception => e
    @log.warn("Unable to load batch into mailchimp because #{e.message}")
  end
end
The above executes successfully. However, only the email address and email type are populated, even though most of the other fields should be populated as well.
Here is my log output for one of the prepared_batches. I replaced the real values with Value. I used my own email for testing.
I, [2013-11-11T09:01:14.778907 #70827] INFO -- : prepared_batch : [
  {:EMAIL=>{:email=>"jason+6@marketingscience.co"}, :EMAIL_TYPE=>"html", :MMERGE6=>"Value", :MMERGE7=>"Value", :MMERGE8=>nil, :MMERGE9=>nil, :MMERGE11=>"8/6/13 0:00", :MMERGE12=>"Value", :MMERGE13=>nil, :MMERGE14=>"10/18/13 19:09", :MMERGE15=>"Value", :FNAME=>"Value"},
  {:EMAIL=>{:email=>"jason+7@marketingscience.co"}, :EMAIL_TYPE=>"html", :MMERGE6=>"Value", :MMERGE7=>"Value", :MMERGE8=>nil, :MMERGE9=>nil, :MMERGE11=>"8/6/13 0:00", :MMERGE12=>"Value", :MMERGE13=>nil, :MMERGE14=>nil, :MMERGE15=>"Value", :FNAME=>"Value"},
  {:EMAIL=>{:email=>"jason+8@marketingscience.co"}, :EMAIL_TYPE=>"html", :MMERGE6=>"Value", :MMERGE7=>"Value", :MMERGE8=>nil, :MMERGE9=>nil, :MMERGE11=>"8/7/13 0:00", :MMERGE12=>"Value", :MMERGE13=>nil, :MMERGE14=>nil, :MMERGE15=>"Value", :FNAME=>"Value"}
]
Here is the log output of the result from the MailChimp call:
I, [2013-11-11T09:01:14.778691 #70827] INFO -- : {"add_count"=>3, "adds"=>
  [{"email"=>"jason+3@marketingscience.co", "euid"=>"ab512177b4", "leid"=>"54637465"},
   {"email"=>"jason+4@marketingscience.co", "euid"=>"eeb8388524", "leid"=>"54637469"},
   {"email"=>"jason+5@marketingscience.co", "euid"=>"7dbc84cb75", "leid"=>"54637473"}],
  "update_count"=>0, "updates"=>[], "error_count"=>0, "errors"=>[]}
Any advice on how to get all the fields to update in MailChimp is appreciated. Thanks.
It turns out the documentation for using the Gibbon gem to batch subscribe is not correct. You need to add a :merge_vars hash to contain the fields other than email address and email type. My final code looks like the following. I'm also going to post this code in its entirety at: https://gist.github.com/analyticsPierce/7434085.
referrals.each_slice(3) do |batch|
  begin
    prepared_batch = batch.map do |referral|
      {
        :EMAIL => {:email => referral['email']},
        :EMAIL_TYPE => 'html',
        :merge_vars => {
          :MMERGE6 => referral['field_1'],
          :MMERGE7 => referral['field_2'],
          :MMERGE8 => referral['field_3'],
          :MMERGE9 => referral['field_4'],
          :MMERGE11 => referral['field_5'],
          :MMERGE12 => referral['field_6'],
          :MMERGE13 => referral['field_7'],
          :MMERGE14 => referral['field_8'],
          :MMERGE15 => referral['field_9'],
          :FNAME => referral['first_name']
        }
      }
    end

    @log.info("prepared_batch : #{prepared_batch}")

    result = @gibbon.lists.batch_subscribe(
      :id => @mc_list_id,
      :batch => prepared_batch,
      :double_optin => false,
      :update_existing => true
    )

    @log.info("#{result}")
  rescue Exception => e
    @log.warn("Unable to load batch into mailchimp because #{e.message}")
  end
end