Enabling nested any queries on OData V4 endpoints with WebAPI - asp.net-web-api

I'm trying to build a nested any query like this ...
~/Api/Calendar?$filter=Roles/any(r:r/User/any(u:u/Name eq 'Joe Bloggs'))
If I remove the inner any clause, leaving me with ...
~/Api/Calendar?$filter=Roles/any(r:r/User/any())
... then the endpoint returns ...
{
"error": {
"code": "",
"message": "The query specified in the URI is not valid. The Any/All nesting limit of '1' has been exceeded. 'MaxAnyAllExpressionDepth' can be configured on ODataQuerySettings or EnableQueryAttribute."
}
}
... which I think sheds some light on the problem I actually have here.
So far I have tried to raise this limit with the following during my context initialisation, but it doesn't appear to be working:
config.AddODataQueryFilter(new EnableQueryAttribute { MaxAnyAllExpressionDepth = 3 });
Does anyone have any ideas how I can do this globally? (I don't want to have to go to every GET action on every controller and set the depth.)
UPDATE:
So it turns out that where I inherit from my own base entity controller, the actions had the EnableQuery attribute on them, which superseded my global config change, hence my changes were not respected.
Simply removing the attribute from the actions themselves has all controllers that inherit from my base working with this new nested any/all limit, but I now seem to have the side effect that expands don't work any more ...
var query = new EnableQueryAttribute {
MaxExpansionDepth = 8,
PageSize = 100,
MaxAnyAllExpressionDepth = 3,
AllowedFunctions = System.Web.OData.Query.AllowedFunctions.All,
AllowedLogicalOperators = System.Web.OData.Query.AllowedLogicalOperators.All,
AllowedQueryOptions = System.Web.OData.Query.AllowedQueryOptions.All,
AllowedArithmeticOperators = System.Web.OData.Query.AllowedArithmeticOperators.All,
MaxTop = 1000
};
config.AddODataQueryFilter(query);
... As you can see, I tried throwing lots of extras in there, but it's not having any of it!

The simplest way I found to do this and have everything work is to apply the attribute on the base controller actions; it then applies everything correctly to the actions on that controller and any of its derived types.
It wasn't ideal, but I couldn't find a way to get a global fix for this to work as part of initialising the OData context.
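For reference, this is roughly what that ended up looking like; a minimal sketch assuming a hypothetical BaseEntityController<TEntity> with a conventional Get action (derived controllers inherit the attribute along with the action):
using System.Linq;
using System.Web.OData;

public abstract class BaseEntityController<TEntity> : ODataController where TEntity : class
{
    // Query settings applied here are inherited by every derived controller's Get.
    [EnableQuery(MaxAnyAllExpressionDepth = 3, MaxExpansionDepth = 8, PageSize = 100, MaxTop = 1000)]
    public virtual IQueryable<TEntity> Get()
    {
        return GetEntities(); // hypothetical hook supplying the entity set
    }

    protected abstract IQueryable<TEntity> GetEntities();
}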
Hopefully this will help someone out there.

Microsoft.OData.Client $expand does not populate the model

I am using Microsoft.OData.Client based on the Microsoft sample application.
Here's my simple WebAPI Controller:
[Route("test")]
[HttpGet]
public IHttpActionResult Test()
{
var context = _dynamicsContextFactory.CreateContext();
// adding this had no effect // context.MergeOption = MergeOption.AppendOnly;
// adding this had no effect // context.MergeOption = MergeOption.OverwriteChanges;
// adding this had no effect // context.MergeOption = MergeOption.NoTracking;
// adding this had no effect // context.MergeOption = MergeOption.PreserveChanges;
var result = context.SalesOrderHeadersV2.Expand("SalesOrderLines").Take(1).ToList();
return Ok(result);
}
The client generates the correct URL.
https://example.com/data/SalesOrderHeadersV2?$top=1&$expand=SalesOrderLines
I can see in fiddler the SalesOrderLines property returned in the JSON.
However, when I inspect the result variable (or view the output) there is no SalesOrderLines property, so the order lines have not been mapped into my result object from the data downloaded from the OData source.
Important note: I am using EDMXTrimmer to reduce the number of entities in my client. Could this be an issue if I'm missing a joining entity? (It seems unlikely there's a joining entity in this case.)
Clue?
When I try to change this line:
var result = context.SalesOrderHeadersV2.Expand(x=>x.SalesOrderLines).Take(1).ToList();
It will not compile because 'SalesOrderHeaderV2' does not contain a definition for 'SalesOrderLines' ...
Note: context.SalesOrderLines does exist.
The issue was that EDMXTrimmer removed the navigation properties.
EDMXTrimmer has since been fixed.

Passing parameters from Command to Converter

I defined a new type of model element as a plug-in; let's refer to it as Foo. A Foo node in the model should translate to a section element in the view. So far, so good. I managed to do that by defining simple conversion rules. I also managed to define a new FooCommand that transforms (renames) selected blocks to Foo.
I got stuck trying to have attributes on those Foo model nodes be translated to attributes on the view elements (and vice-versa). Suppose Foos have an attribute named fooClass which should map to the view element's class attribute.
<Foo fooClass="green-foo"> should map to/from <section class="green-foo">
I can successfully receive parameters in FooCommand, but I can't seem to set them on the blocks being processed by the command:
execute(options = {}) {
const document = this.editor.document;
const fooClass = options.fooClass;
document.enqueueChanges(() => {
const batch = options.batch || document.batch();
const blocks = (options.selection || document.selection).getSelectedBlocks();
for (const block of blocks) {
if (!block.is('foo')) {
batch.rename(block, 'foo');
batch.setAttribute(block, 'fooClass', fooClass);
}
}
});
}
Below is the code for the init function in the Foo plugin, including the model→view and view→model conversions:
init() {
const editor = this.editor;
const doc = editor.document;
const data = editor.data;
const editing = editor.editing;
editor.commands.add('foo', new FooCommand(editor));
doc.schema.registerItem('foo', '$block');
buildModelConverter().for(data.modelToView, editing.modelToView)
.fromElement('foo')
.toElement(modelElement => {
const fooClass = modelElement.item.getAttribute('fooClass');
return new ContainerElement('section', {'class': fooClass});
});
buildViewConverter().for(data.viewToModel)
.fromElement('section')
.toElement(viewElement => {
let classes = Array.from(viewElement.getClassNames());
let modelElement = new ModelElement('foo', {'fooClass': classes[0]});
return modelElement;
});
}
When I try to run the command via
editor.execute('foo', { fooClass: 'green-foo' })
I can see that the green-foo value is available to FooCommand, but the modelElement in the model→view conversion, on the other hand, has no fooClass attribute.
I'm sure I'm missing the point here and misusing the APIs. I'd be really thankful if someone could shed some light on this issue. I can provide more details, as needed.
Follow-up after initial suggestions
Thanks to @Reinmar and @jodator for their suggestion regarding configuring the document schema to allow for the custom attribute. I really thought that would have taken care of it, but no. It may have been a necessary step anyway, but I'm still unable to get the attribute value from the model element during the model→view conversion.
First, let me add an important piece of information I had left out: the CKEditor 5 version I'm working with is 1.0.0-alpha2. I am aware several of the APIs are bound to change, but I would still like to get things working with the present version.
Model→view conversion
If I understand it correctly, one can either pass a string or a function to the toElement call. A question about using the latter: what exactly are the parameters passed to the function? I assumed it would be the model element (node?) to be converted. Is that the case? If so, why is the attribute set on that node via batch.setAttribute (inside a document.enqueueChanges) not available when requested? Should it be?
A sequencing problem?
Additional testing seems to indicate there's some kind of order-of-execution issue happening. I've observed that, even though the attribute is not available when I first try to read it from the modelElement parameter, it will be so if I read it again later. Let me try to illustrate the situation below. First, I'll modify the conversion code to make it use some dummy value in case the attribute value is not available when read:
buildModelConverter().for(data.modelToView, editing.modelToView)
.fromElement('foo')
.toElement(modelElement => {
let fooClass = modelElement.item.getAttribute('fooClass') || 'naught';
let viewElement = new ContainerElement('section');
viewElement.setAttribute('class', fooClass);
return viewElement;
});
Now I reload the page and execute the following instructions on the console:
c = Array.from(editor.document.getRoot().getChildren());
c[1].is('paragraph'); // true
// Changing the node from 'paragraph' to 'foo' and adding an attribute
// 'fooClass' with value 'green-foo' to it.
editor.document.enqueueChanges(() => {
const batch = editor.document.batch();
batch.rename(c[1], 'foo');
batch.setAttribute(c[1], 'fooClass', 'green-foo');
return batch;
});
c[1].is('paragraph'); // false
c[1].is('foo'); // true
c[1].hasAttribute('fooClass'); // true
c[1].getAttribute('fooClass'); // 'green-foo'
Even though it looks like the expected output is being produced, a glance at the generated view element shows the problem:
<section class="naught"/>
Lastly, even if I try to reset the fooClass attribute on the model element, the change is not reflected on the view element. Why is that? Shouldn't changes made via enqueueChanges cause the view to update?
Sorry for the very long post, but I'm trying to convey as many details as I can. Here's hoping someone will spot my mistake or misunderstanding of how the CKEditor 5's API actually works.
View not updating?
I turned to Document's events and experimented with the changesDone event. It successfully addresses the "timing" issue, as it consistently triggers only after all changes have been processed. Still, the problem of the view not updating in response to a change in the model remains. To make it clear, the model does change, but the view does not reflect that. Here is the call:
editor.document.enqueueChanges(() => editor.document.batch().setAttribute(c[1], 'fooClass', 'red-foo'));
To be 100% sure I wrote the whole feature myself. I use the 1.0.0-beta.1 API which is completely different than what you had.
Basically – it works. It isn't 100% correct yet, but I'll get to that.
How to convert an element+attribute pair?
The thing about implementing a feature which needs to convert an element + attribute pair is that the element and the attribute conversion have to be handled separately, because CKEditor 5 treats them separately.
Therefore, in the code below you'll find that I used elementToElement():
editor.conversion.elementToElement( {
model: 'foo',
view: 'section'
} );
So a converter between model's <foo> element and view's <section> element. This is a two-way converter so it handles upcasting (view -> model) and downcasting (model -> view) conversion.
NOTE: It doesn't handle the attribute.
Theoretically, as the view property you could write a callback which would read the model element's attribute and create a view element with this attribute set too. But that wouldn't work, because such a configuration would only make sense for downcasting (model -> view). How could we use that callback to upcast a view structure?
NOTE: You can write converters for downcast and upcast pipelines separately (by using editor.conversion.for()), in which case you could really use callbacks. But it doesn't really make sense in this case.
The attribute may change independently!
The other problem is this: let's say you wrote an element converter which sets the attribute at the same time. Ta-da, you load <section class=ohmy> and get <foo class=ohmy> in your model.
But then... what if the attribute changes in the model?
In the downcast pipeline CKEditor 5 treats element changes separately from attribute changes. It fires them as separate events. So, when your FooCommand is executed on a heading it calls writer.rename() and we get the following events in DowncastDispatcher:
remove with <heading>
insert:section
But then the attribute is changed too (writer.setAttribute()), so we also get:
attribute:class:section
The elementToElement() conversion helper listens to the insert:section event, so it is blind to attribute:class:section.
Therefore, when you change the value of the attribute, you need the attributeToAttribute() conversion.
Sequencing
I didn't want to reply to your question before we released 1.0.0-beta.1 because 1.0.0-beta.1 brought the Differ.
Before 1.0.0-beta.1 all changes were converted immediately when they were applied. So, rename() would cause immediate remove and insert:section events. At this point, the element that you got in the latter one wouldn't have the class attribute set yet.
Thanks to the Differ we're able to start the conversion once all the changes are applied (after the change() block is executed). This means that the insert:section event is fired once the model <foo> element already has the class attribute set. That's why you could write callback-based converters... but you shouldn't :D
The code
import { downcastAttributeToAttribute } from '@ckeditor/ckeditor5-engine/src/conversion/downcast-converters';
import { upcastAttributeToAttribute } from '@ckeditor/ckeditor5-engine/src/conversion/upcast-converters';
class FooCommand extends Command {
execute( options = {} ) {
const model = this.editor.model;
const fooClass = options.class;
model.change( writer => {
const blocks = model.document.selection.getSelectedBlocks();
for ( const block of blocks ) {
if ( !block.is( 'foo' ) ) {
writer.rename( block, 'foo' );
writer.setAttribute( 'class', fooClass, block );
}
}
} );
}
}
class FooPlugin extends Plugin {
init() {
const editor = this.editor;
editor.commands.add( 'foo', new FooCommand( editor ) );
editor.model.schema.register( 'foo', {
allowAttributes: 'class',
inheritAllFrom: '$block'
} );
editor.conversion.elementToElement( {
model: 'foo',
view: 'section'
} );
editor.conversion.for( 'upcast' ).add(
upcastAttributeToAttribute( {
model: 'class',
view: 'class'
} )
);
editor.conversion.for( 'downcast' ).add(
downcastAttributeToAttribute( {
model: 'class',
view: 'class'
} )
);
// This should work but it does not due to https://github.com/ckeditor/ckeditor5-engine/issues/1379 :(((
// EDIT: The above issue is fixed and will be released in 1.0.0-beta.2.
// editor.conversion.attributeToAttribute( {
// model: {
// name: 'foo',
// key: 'class'
// },
// view: {
// name: 'section',
// key: 'class'
// }
// } );
}
}
This code works quite well, except for the fact that it converts the class attribute on any possible element that has it. That's because I had to use the very generic downcastAttributeToAttribute() and upcastAttributeToAttribute() converters because of a bug that I found (EDIT: it's fixed and will be available in 1.0.0-beta.2). The commented-out piece of code is how it should be defined if everything worked fine, and it will work in 1.0.0-beta.2.
It's sad that we missed such a simple case, but that's mainly due to the fact that all our features... are much more complicated than this.
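With the plugin above loaded, the command is run with the class option (note the key changed from fooClass in the question to class, matching FooCommand); the comments show the expected model and view output:
editor.execute( 'foo', { class: 'green-foo' } );

// Model: <foo class="green-foo">...</foo>
// View:  <section class="green-foo">...</section>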

Spring REST and PATCH method

I'm using SpringBoot and Spring REST.
I would like to understand how to use the HTTP PATCH method to update properties of my model.
Is there any good tutorial explaining how to make it work? In particular:
- the HTTP PATCH method and the body to be sent
- the controller method and how to manage the update operation
I've noticed that many of the provided answers cover only JSON Patch or are incomplete. Below is a full explanation and example of what you need, with functioning real-world code.
First, PATCH is a selective PUT. You use it to update any number of fields for an object or list of objects. In a PUT you typically send the entire object with whatever updates.
PATCH /object/7
{
"objId":7,
"objName": "New name"
}
PUT /object/7
{
"objId":7,
"objName": "New name",
"objectUpdates": true,
"objectStatus": "ongoing",
"scoring": null,
"objectChildren":[
{
"childId": 1
},
............
}
This allows you to update records without a huge number of endpoints. For example, with the above, to update scoring you would need object/{id}/scoring, and to update the name you would need object/{id}/name: literally one endpoint for every property, or else you require the front end to send the entire object for every update. If you have a huge object, that can take a lot of unnecessary network time or mobile data. PATCH lets you have one endpoint with the minimal property payload that a mobile platform should use.
Here is an example of a real-world use of PATCH:
@ApiOperation(value = "Patch an existing claim with partial update")
@RequestMapping(value = CLAIMS_V1 + "/{claimId}", method = RequestMethod.PATCH)
ResponseEntity<Claim> patchClaim(@PathVariable Long claimId, @RequestBody Map<String, Object> fields) {
// Sanitize and validate the data
if (claimId <= 0 || fields == null || fields.isEmpty() || !claimId.equals(fields.get("claimId"))) {
return new ResponseEntity<>(HttpStatus.BAD_REQUEST); // 400 Invalid claim object received or invalid id or id does not match object
}
Claim claim = claimService.get(claimId);
// Does the object exist?
if( claim == null){
return new ResponseEntity<>(HttpStatus.NOT_FOUND); // 404 Claim object does not exist
}
// Remove id from request, we don't ever want to change the id.
// This is not necessary, you can just do it to save time on the reflection
// loop used below since we checked the id above
fields.remove("claimId");
fields.forEach((k, v) -> {
// use reflection to get field k on the object and set it to value v
// Change Claim.class to whatever your object is: Object.class
Field field = ReflectionUtils.findField(Claim.class, k); // find field in the object class
if (field == null) return; // skip keys that don't map to a field instead of throwing
field.setAccessible(true);
ReflectionUtils.setField(field, claim, v); // set given field for defined object to value v
});
claimService.saveOrUpdate(claim);
return new ResponseEntity<>(claim, HttpStatus.OK);
}
The above can be confusing for some people, as newer devs don't normally deal with reflection like that. Basically, whatever you pass to this endpoint in the body, it will find the associated claim using the given ID and then ONLY update the fields you pass in as key/value pairs.
Example body:
PATCH /claims/7
{
"claimId":7,
"claimTypeId": 1,
"claimStatus": null
}
The above will update claimTypeId and claimStatus to the given values for claim 7, leaving all other values untouched.
So the return would be something like:
{
"claimId": 7,
"claimSrcAcctId": 12345678,
"claimTypeId": 1,
"claimDescription": "The vehicle is damaged beyond repair",
"claimDateSubmitted": "2019-01-11 17:43:43",
"claimStatus": null,
"claimDateUpdated": "2019-04-09 13:43:07",
"claimAcctAddress": "123 Sesame St, Charlotte, NC 28282",
"claimContactName": "Steve Smith",
"claimContactPhone": "777-555-1111",
"claimContactEmail": "steve.smith#domain.com",
"claimWitness": true,
"claimWitnessFirstName": "Stan",
"claimWitnessLastName": "Smith",
"claimWitnessPhone": "777-777-7777",
"claimDate": "2019-01-11 17:43:43",
"claimDateEnd": "2019-01-11 12:43:43",
"claimInvestigation": null,
"scoring": null
}
As you can see, the full object comes back without changing any data other than what you wanted to change. I know there is a bit of repetition in the explanation here; I just wanted to outline it clearly.
There is nothing inherently different about the PATCH method, as far as Spring is concerned, compared with PUT and POST. The challenge is what you pass in your PATCH request and how you map the data in the controller. If you map to your value bean using @RequestBody, you'll have to figure out what is actually set and what null values mean. Other options would be to limit PATCH requests to one property and specify it in the URL, or to map the values to a Map.
See also Spring MVC PATCH method: partial updates
Create a RestTemplate using HttpComponentsClientHttpRequestFactory (the default request factory, based on HttpURLConnection, does not support the PATCH verb):
import org.springframework.http.client.HttpComponentsClientHttpRequestFactory;
RestTemplate rest = new RestTemplate(new HttpComponentsClientHttpRequestFactory());
Now make the PATCH call (api, request, and responseType are supplied by the caller):
ResponseEntity<Map<String, Object>> response = rest.exchange(api, HttpMethod.PATCH, request,
responseType);
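For completeness, here is a self-contained sketch of that call against the claims endpoint from the earlier answer; the URL, headers, and field values are illustrative assumptions, and Apache HttpClient must be on the classpath for HttpComponentsClientHttpRequestFactory:
import java.util.HashMap;
import java.util.Map;

import org.springframework.core.ParameterizedTypeReference;
import org.springframework.http.HttpEntity;
import org.springframework.http.HttpHeaders;
import org.springframework.http.HttpMethod;
import org.springframework.http.MediaType;
import org.springframework.http.ResponseEntity;
import org.springframework.http.client.HttpComponentsClientHttpRequestFactory;
import org.springframework.web.client.RestTemplate;

public class PatchClientExample {
    public static void main(String[] args) {
        RestTemplate rest = new RestTemplate(new HttpComponentsClientHttpRequestFactory());

        // Only the fields to change, mirroring the example PATCH body above.
        Map<String, Object> fields = new HashMap<>();
        fields.put("claimId", 7);
        fields.put("claimTypeId", 1);
        fields.put("claimStatus", null);

        HttpHeaders headers = new HttpHeaders();
        headers.setContentType(MediaType.APPLICATION_JSON);
        HttpEntity<Map<String, Object>> request = new HttpEntity<>(fields, headers);

        String api = "https://example.com/claims/7"; // illustrative URL
        ResponseEntity<Map<String, Object>> response = rest.exchange(
                api, HttpMethod.PATCH, request,
                new ParameterizedTypeReference<Map<String, Object>>() {});

        System.out.println(response.getStatusCode() + " " + response.getBody());
    }
}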

Is it a good practice to build a collection cache?

I'm experimenting with MongoDB in Node.js using the node-mongodb-native driver. A problem I'm running into is the amount of nested callbacks. I'm trying to simplify a few things by reducing the code required for a query.
Instead of this ...
db.collection("test", function(err, collection) {
collection.find(...).toArray(function(err, results) {
// ...
});
});
... I was thinking of building an object which acts as a cache of collections so that the first callback is not necessary. I'm using the following code to build the object:
var collections = {};
["test", "foo"].forEach(function(name) {
db.collection(name, function(err, coll) {
collections[name] = coll;
});
});
With it, I'm able to clean up the first code snippet to:
collections.test.find(...).toArray(function(err, results) {
// ...
});
I was wondering whether this is a good practice. It works just fine, but I guess the callback of getting a collection is there for a reason. Does it make sense to build a collection cache as I'm doing now?
That completely depends on what a collection object is.
- Is it live?
- Is it connected to the database?
- Does it do any internal caching?
- Does it reflect new data?
Without knowing those details I recommend you create a lazy evaluation proxy.
Mongo.collection("test").find(...).toArray(function(err, results) {
// ...
});
The idea here is that you internally store the find command, and when you call toArray you get the collection, invoke the stored find command on it, and then invoke toArray.
This means you're getting a new collection every time and avoid the "is caching safe" problem, but you still have a nice API.
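As a rough sketch of that idea against the callback-style node-mongodb-native API used above (the Mongo wrapper name is made up, and db is assumed to be the already-opened database object from the question):
var Mongo = {
  collection: function (name) {
    return {
      find: function () {
        var args = Array.prototype.slice.call(arguments); // remember the find() arguments
        return {
          toArray: function (callback) {
            // Look the collection up only when the results are actually needed.
            db.collection(name, function (err, coll) {
              if (err) return callback(err);
              coll.find.apply(coll, args).toArray(callback);
            });
          }
        };
      }
    };
  }
};

// Usage, matching the suggested API:
Mongo.collection("test").find({ active: true }).toArray(function (err, results) {
  // ...
});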

MVP with Moq - loading a mocked view

I've read a lot about mocking/stubbing/faking - and still hit my mental roadblocks.
I'm trying to adopt MVP (Model View Presenter) in a "fun" weight-loss tracking system I'm building for my own Fatty McFatter self. I'm trying to TDD this and do it 'by the book', but I hit many mental blocks and stall out.
I am building my Presenter and mocking my Service and View at the moment. Here's my test (again, note that service and view are mocked with Moq):
[Test]
public void GetLog_WithExistingDate_ViewSetWithExistingLog()
{
WeightLogModel model = new WeightLogModel
{
EntryDate = DateTime.Now,
Waist = 42,
Weight = 242
};
service.Setup(x => x.GetLog(It.IsAny<DateTime>())).Returns(model);
presenter.Display(DateTime.Now);
IWeightLogView myView = view.Object;
Assert.AreEqual(model.Weight, myView.Weight);
}
and in my Presenter - this is my Display method:
public void Display(DateTime date)
{
var weightLog = service.GetLog(date);
if(weightLog == null) return;
View.EntryDate = weightLog.EntryDate;
View.Waist = weightLog.Waist;
View.Weight = weightLog.Weight;
}
Now, if I debug as Display is being called, I see that weightLog is filled with the correct info I've set up in the mock. But as it's supposed to set View.EntryDate, View.Waist, etc., the View values never change. They stay zero or 0001/1/1.
Is there some way to make it work? Or is this just a bad test and I'm floundering in confusion?
Thanks to Phil for starting me in motion. I didn't want to explicitly set what I was going to return; I wanted the mock view to behave like my view. You can have the mocked setter behave as normal by calling SetupProperty, e.g. view.SetupProperty(x => x.Weight) in my case. Here's the test that will now pass, asserting that the weight was set:
[Test]
public void GetLog_WithExistingDate_ViewSetWithExistingLog()
{
WeightLogModel model = new WeightLogModel
{
EntryDate = DateTime.Now,
Waist = 42,
Weight = 242
};
service.Setup(x => x.GetLog(It.IsAny<DateTime>())).Returns(model);
// I ADDED THIS ONE LINE
view.SetupProperty(x => x.Weight);
presenter.Display(DateTime.Now);
IWeightLogView myView = view.Object;
Assert.AreEqual(model.Weight, myView.Weight);
}
You are not showing all your setup code here, nor the dependencies between classes.
However, if you are indeed mocking the view called "myView", it's going to return what you have set the mock up to return, or the default for each type if you haven't specified anything for it to return (which sounds like what is happening).
From your comment:
I am trying to setup the service.GetLog(date) to return the WeightLogModel I have in the test. My thinking is that doing so - would make that WeightLogModel available in my presenter
So far that seems like it is working from your original question.
to assign to my mocked view - where View.EntryDate = weightLog.EntryDate .... in this case weightLog is what is setup in the test.... I hope I'm clear as to where my head is... I'm not saying I'm right - this is what my thinking is though.
Where you are going wrong is where you say "to assign to my mocked view". It's not clear from your code whether or not the View property is in fact your mocked view (because your code is incomplete).
Although, in this case, it actually doesn't matter. If the View property is in fact a mock, it will only return what you tell it to return; its properties are not going to behave like "normal" properties.
So the following will fail without explicit setup:
mockView.MyProperty = "hello";
Assert.AreEqual("hello", mockView.MyProperty);
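To make the difference concrete, here is a minimal Moq sketch assuming the IWeightLogView interface from the question with a numeric Weight property:
// Arrange: a loose mock of the view interface from the question.
var view = new Mock<IWeightLogView>();

// Without SetupProperty, assignments to the property are not stored:
view.Object.Weight = 242;
Assert.AreEqual(0, view.Object.Weight);      // the getter still returns the default

// With SetupProperty, the mock tracks assigned values like a normal property:
view.SetupProperty(x => x.Weight);
view.Object.Weight = 242;
Assert.AreEqual(242, view.Object.Weight);    // the value now round-trips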
