I'm using the Bing Maps API to generate images for certain locations using the BotFramework-Location NuGet package (whose code can be found here).
Sometimes it works, but sometimes the images are not loaded due to an error in the REST call (which is made by the BotFramework, not by me).
This is my code:
IGeoSpatialService geoService = new AzureMapsSpatialService(this.azureApiKey);
List<Location> concreteLocations = new List<Location>();
foreach (String location in locations)
{
    LocationSet locationSet = await geoService.GetLocationsByQueryAsync(location);
    concreteLocations.AddRange(locationSet?.Locations);
}
// Filter out duplicates
var seenKeys = new HashSet<String>();
var uniqueLocations = new List<Location>();
foreach (Location location in concreteLocations)
{
    if (seenKeys.Add(location.Address.AddressLine))
    {
        uniqueLocations.Add(location);
    }
}
concreteLocations = new List<Location>(uniqueLocations.Take(5));
if (concreteLocations.Count == 0)
{
    await context.PostAsync("No dealers found.");
    context.Done(true);
}
else
{
    var locationsCardReply = context.MakeMessage();
    locationsCardReply.Attachments = new LocationCardBuilder(this.bingApiKey, new LocationResourceManager())
        .CreateHeroCards(concreteLocations)
        .Select(card => card.ToAttachment())
        .ToList();
    locationsCardReply.AttachmentLayout = AttachmentLayoutTypes.Carousel;
    await context.PostAsync(locationsCardReply);
}
The bot's reply renders a carousel, but some of the map images fail to load.
The reason not all images are shown is that the REST call to the Bing Maps API returns this:
mapArea: This parameter value is out of range.
Here is one of the image uris that fail (I removed my key):
https://dev.virtualearth.net/REST/V1/Imagery/Map/Road?form=BTCTRL&mapArea=49.5737,5.53792,49.57348,5.53744&mapSize=500,280&pp=49.5737,5.53744;1;1&dpi=1&logo=always&key=NOT_REAL_12758FDLKJLDKJO8769KLJDLKJF
Anyone know what I've been doing wrong?
I think I understood why you got that error. I tried to get this map and got the same result as yours; as you said:
mapArea: This parameter value is out of range.
If you look at the sample URL you provided, the mapArea is equal to 49.5737,5.53792,49.57348,5.53744
So I just inverted the coordinates of the 2 points defining this area, putting the one with the smaller latitude value first, and I got a valid reply.
EDIT:
As you commented, this call is made inside the BotBuilder-Location code, not in yours. I had a look at it; this is the method that generates the map, called inside the LocationCardBuilder class that you are instantiating:
public string GetLocationMapImageUrl(Location location, int? index = null)
{
    if (location == null)
    {
        throw new ArgumentNullException(nameof(location));
    }

    var point = location.Point;
    if (point == null)
    {
        throw new ArgumentNullException(nameof(point));
    }

    if (location.BoundaryBox != null && location.BoundaryBox.Count >= 4)
    {
        return string.Format(
            CultureInfo.InvariantCulture,
            ImageUrlByBBox,
            location.BoundaryBox[0],
            location.BoundaryBox[1],
            location.BoundaryBox[2],
            location.BoundaryBox[3],
            point.Coordinates[0],
            point.Coordinates[1],
            index,
            this.apiKey);
    }
    else
    {
        return string.Format(
            CultureInfo.InvariantCulture,
            ImageUrlByPoint,
            point.Coordinates[0],
            point.Coordinates[1],
            index,
            apiKey);
    }
}
As you can see, there is no restriction on the order of the BoundaryBox values. But if you look at the documentation about location data here:
BoundingBox: A geographic area that contains the location. A bounding box contains SouthLatitude, WestLongitude, NorthLatitude, and EastLongitude values in units of degrees.
As you may know, SouthLatitude is smaller than NorthLatitude, as latitudes are expressed in code with positive and negative values depending on the location relative to the equator: 43N is 43, 43S is -43. So the problem seems to be here.
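As a minimal sketch of a workaround, the bounding-box values could be reordered before they are formatted into the URL, so that the south/west corner always comes first. The `NormalizeBoundingBox` helper below is my own illustration, not part of the BotBuilder-Location API:

```csharp
// Hypothetical helper: reorder a mapArea so the smaller latitude and
// longitude come first, as the Bing Maps Imagery API expects the box as
// (south latitude, west longitude, north latitude, east longitude).
static IList<double> NormalizeBoundingBox(IList<double> box)
{
    double south = Math.Min(box[0], box[2]);
    double north = Math.Max(box[0], box[2]);
    double west  = Math.Min(box[1], box[3]);
    double east  = Math.Max(box[1], box[3]);
    return new List<double> { south, west, north, east };
}
```

For the URL in the question this would turn 49.5737,5.53792,49.57348,5.53744 into 49.57348,5.53744,49.5737,5.53792, which is the inversion that made the call succeed.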
I made a quick test to see if I could reproduce the error based on the call you are doing before (to GetLocationsByQueryAsync), but I wasn't able to reproduce this case. Can you share the query that led to this problem?
Related
I am fairly new to unit testing and I am trying to filter an XML file in Java Spring Boot. The filtering function looks like this:
public Document filterRecordsByReleaseDate(Document document, String dateString, RDSymbol symbol) throws ParseException {
    Document newDocument = builder.newDocument();
    Node root = newDocument.createElement("records");
    newDocument.appendChild(root);
    Date comparisonDate = new SimpleDateFormat("yyyy-MM-dd").parse(dateString);
    NodeList nodeList = document.getElementsByTagName("record");
    for (int i = 0; i < nodeList.getLength(); i++) {
        Node node = newDocument.adoptNode(nodeList.item(i));
        Element element = (Element) node;
        String releaseDateString = element.getElementsByTagName("releasedate").item(0).getTextContent();
        Date releaseDate = new SimpleDateFormat("yyyy.MM.dd").parse(releaseDateString);
        if (releaseDate.after(comparisonDate) && symbol.toString().equals("GT")) {
            root.appendChild(node);
        } else if (releaseDate.before(comparisonDate) && symbol.toString().equals("LT")) {
            root.appendChild(node);
        }
    }
    return newDocument;
}
The function itself is working fine, but I was thinking about how I might unit test this code. Currently I only have one file, which is supplied from the src/main/resources folder. The data will at some point come from some external service/DB, and it will follow the same format.
In my head I have several questions:
how do I mock the input Document for the file?
what should I compare the output of the function to?
Concerning question 1: would it be OK to just use the RecordRepository.getXML function as a dependency, as the only thing I could do otherwise would be to replicate its code anyway?
Concerning question 2: would it be OK to create a mocks folder in the src/test/resources directory, to which I save the outputs of previous successful filters to compare against? I feel like this would make the test kind of redundant, but I also don't see any other way. Is there something I am not seeing?
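One common alternative to mocking here is to build the input Document in memory from a small XML string, since org.w3c.dom.Document is plain data and a real instance is simpler than a mock. The sketch below shows the idea; the commented-out call to filterRecordsByReleaseDate and the RDSymbol.GT value stand in for the code from the question:

```java
import java.io.ByteArrayInputStream;
import java.nio.charset.StandardCharsets;
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;

public class FilterTestSketch {

    // Parse a hand-crafted XML string into a Document; no file or mock needed.
    static Document parse(String xml) throws Exception {
        return DocumentBuilderFactory.newInstance()
                .newDocumentBuilder()
                .parse(new ByteArrayInputStream(xml.getBytes(StandardCharsets.UTF_8)));
    }

    public static void main(String[] args) throws Exception {
        Document input = parse(
            "<records>"
          + "<record><releasedate>2020.01.15</releasedate></record>"
          + "<record><releasedate>2022.06.01</releasedate></record>"
          + "</records>");

        // Because the input is constructed by hand, the expected output is
        // known by design rather than copied from a previous run, e.g.:
        // Document filtered = filterRecordsByReleaseDate(input, "2021-01-01", RDSymbol.GT);
        // assertEquals(1, filtered.getElementsByTagName("record").getLength());
        System.out.println(input.getElementsByTagName("record").getLength());
    }
}
```

This also answers the redundancy worry in question 2: asserting against a hand-derived expectation tests the filtering logic itself, whereas comparing to a saved previous output only tests that the behavior has not changed.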
I've tried a few different ways to do this, and each way has a piece of code I am just not getting right. I need to update a custom field, UsrCustomerShipAccount, in Customer Locations when it is updated on the Customer Delivery tab. I tried SetValueExt and creating a graph instance. Sorry about the dumb question.
The way that seemed to get me the closest is below:
protected void LocationExtAddress_UsrCustomerShipAccnt_FieldUpdated(PXCache cache, PXFieldUpdatedEventArgs e, PXFieldUpdated InvokeBaseHandler)
{
    if (InvokeBaseHandler != null)
        InvokeBaseHandler(cache, e);
    var row = (LocationExtAddress)e.Row;
    if (row == null) return;
    PXSelectBase<Location> locationObj = new PXSelect<Location,
        Where<Location.bAccountID, Equal<Required<Location.bAccountID>>>>(Base);
    Location deliveryLocation = locationObj.Select(row.LocationBAccountID);
    // This generates an error that there is no LocationExt:
    var locationExt = PXCache<Location>.GetExtension<LocationExt>(location);
    // This needs to be the value that changed (LocationExtAddress.UsrCustomerShipAccount),
    // but I don't see how to get it:
    deliveryLocation.Cache.SetValueExt(deliveryLocation, "UsrCustomerShipAccount", ...);
    deliveryLocation.Cache.IsDirty = true;
    // I don't know if this doesn't work because it is wrong, or because
    // "UsrCustomerShipAccount" is not in deliveryLocation:
    deliveryLocation.Update(deliveryLocation);
}
You have
var locationExt = PXCache<Location>.GetExtension<LocationExt>(location);
shouldn't this be
var locationExt = PXCache<Location>.GetExtension<LocationExt>(deliveryLocation);
?
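Putting that fix together with the rest of the handler, a corrected version might look like the sketch below. This is untested against a real Acumatica instance; the LocationExt extension with a UsrCustomerShipAccount field is assumed from the question, and the changed value is read from the event's own cache:

```csharp
protected void LocationExtAddress_UsrCustomerShipAccnt_FieldUpdated(PXCache cache, PXFieldUpdatedEventArgs e, PXFieldUpdated InvokeBaseHandler)
{
    if (InvokeBaseHandler != null)
        InvokeBaseHandler(cache, e);
    var row = (LocationExtAddress)e.Row;
    if (row == null) return;

    PXSelectBase<Location> locationObj = new PXSelect<Location,
        Where<Location.bAccountID, Equal<Required<Location.bAccountID>>>>(Base);
    Location deliveryLocation = locationObj.Select(row.LocationBAccountID);
    if (deliveryLocation == null) return;

    // The value that changed is on the row that raised the event,
    // readable through the event's cache.
    object newValue = cache.GetValue(row, "UsrCustomerShipAccnt");

    // Write through the view's cache (a DAC instance has no Cache property),
    // then register the update so it is persisted with the graph.
    locationObj.Cache.SetValueExt(deliveryLocation, "UsrCustomerShipAccount", newValue);
    locationObj.Cache.Update(deliveryLocation);
}
```

The key differences from the original attempt: the cache comes from the view (locationObj.Cache) rather than from the Location record, and the new value is pulled from the event's cache instead of being hard-coded.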
I'm using version 8 of Kentico, and I have a custom document/page type that has a unique numeric identity field. Unfortunately this data comes from an existing source, and because I cannot set the primary key ID of the page's coupled data when using the API, I was forced to have this separate field.
I ensure the field is new and unique during the DocumentEvents.Insert.Before event, using node.SetValue("ItemIdentifier", newIdentifier); if the node's class name matches, etc. So that workflow is handled as well, I also implemented the same method for WorkflowEvents.SaveVersion.Before.
This works great when creating a new item; however, if we attempt to copy an existing node, the source identifier remains unchanged. I was hoping I could exclude the field from being copied, but have yet to find an example of that.
So I went ahead and implemented a solution to ensure a new identifier is created when a node is being copied by handling the DocumentEvents.Copy.Before and DocumentEvents.Copy.After.
Unfortunately, in my case the e.Node from these event args is useless: I could not for the life of me get the field modified. When I opened ILSpy I realized why — the node copy method always grabs a fresh copy of the node from the database! This renders DocumentEvents.Copy.Before useless if you want to modify fields before a node is copied.
So I instead pass the identifier along in a RequestStockHelper entry, which the Insert event, further down the cycle, uses to generate a new identifier for the cloned node.
Unfortunately, unbeknownst to me, if we copy a published node, the value in the database is correct, but its NodeXML value is not.
This sounds to me like a Kentico bug: it's either retaining the source node's NodeXML/version, or for some reason node.SetValue("ItemIdentifier", newIdentifier); is not working properly in WorkflowEvents.SaveVersion.Before, since it's a published and workflowed node.
Anyone come across a similar issue to this? Is there any other way I can configure a field to be a unique numeric identity field, that is not the primary key, and is automatically incremented when inserted? Or exclude a field from the copy procedure?
As a possible solution, could you create a new document in DocumentEvents.Copy.Before and copy the values over from the copied document, then cancel the copy event itself?
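Roughly, that suggestion could look like the sketch below. This is untested; the exact members available on the Copy.Before event args and the insert call should be checked against the Kentico 8 API, and the class and field names are taken from the question:

```csharp
DocumentEvents.Copy.Before += (sender, e) =>
{
    TreeNode source = e.Node;
    if (source == null || source.ClassName != CineDigitalAV.CLASS_NAME)
        return;

    // Create a fresh node of the same type and copy the fields over,
    // skipping the unique identifier so Insert.Before can assign a new one.
    TreeNode clone = TreeNode.New(source.ClassName, source.TreeProvider);
    foreach (string column in source.ColumnNames)
    {
        if (column != "AVCreation_Identifier")
        {
            clone.SetValue(column, source.GetValue(column));
        }
    }

    // The copy target would need to come from the event args or context.
    TreeNode parent = source.Parent;
    DocumentHelper.InsertDocument(clone, parent, source.TreeProvider);

    // Cancel the built-in copy, since we handled it ourselves.
    e.Cancel();
};
```

This sidesteps the problem that the built-in copy re-reads the node from the database, at the cost of having to reproduce whatever else the copy operation normally does (attachments, ACLs, etc.).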
OK, it turns out this is not a Kentico issue, but rather the way versions are saved.
If you want to compute a unique value in DocumentEvents.Insert.Before, you need to pass it along to WorkflowEvents.SaveVersion.Before, because the node sent to the latter is the same as the original from the former; i.e. whatever changes you make to the node in Insert are not carried over to SaveVersion, so you need to handle this manually.
So here's the pseudo code that handles the copy scenario and insert of a new item of compiled type CineDigitalAV:
protected override void OnInit()
{
    base.OnInit();
    DocumentEvents.Insert.Before += Insert_Before;
    DocumentEvents.Copy.Before += Copy_Before;
    WorkflowEvents.SaveVersion.Before += SaveVersion_Before;
}

private void Copy_Before(object sender, DocumentEventArgs e)
{
    if (e.Node != null)
    {
        SetCopyCineDigitalIdentifier(e.Node);
    }
}

private void SaveVersion_Before(object sender, WorkflowEventArgs e)
{
    if (e.Document != null)
    {
        EnsureCineDigitalIdentifier(e.Document);
    }
}

private void Insert_Before(object sender, DocumentEventArgs e)
{
    if (e.Node != null)
    {
        EnsureCineDigitalIdentifier(e.Node);
    }
}

private void SetCopyCineDigitalIdentifier(TreeNode node)
{
    int identifier = 0;
    if (node.ClassName == CineDigitalAV.CLASS_NAME)
    {
        identifier = node.GetValue<int>("AVCreation_Identifier", 0);
        // flag next insert to create a new identifier
        if (identifier > 0)
            RequestStockHelper.Add("Copy-Identifier-" + identifier, true);
    }
}

private void EnsureCineDigitalIdentifier(TreeNode node)
{
    int identifier = 0;
    if (node.ClassName == CineDigitalAV.CLASS_NAME)
    {
        identifier = node.GetValue<int>("AVCreation_Identifier", 0);
    }
    if (identifier == 0 || (identifier != 0 && RequestStockHelper.Contains("Copy-Identifier-" + identifier)))
    {
        // generate a new identifier for new items or those being copied
        RequestStockHelper.Remove("Copy-Identifier-" + identifier);
        int newIdentifier = GetNewCineDigitalIdentifierAV(node.NodeSiteName);
        node.SetValue("AVCreation_Identifier", newIdentifier);
        // store the new identifier so that SaveVersion includes it
        RequestStockHelper.Add("Version-Identifier-" + identifier, newIdentifier);
    }
    else if (RequestStockHelper.Contains("Version-Identifier-" + identifier))
    {
        // handle SaveVersion with the value from the insert
        int newIdentifier = ValidationHelper.GetInteger(RequestStockHelper.GetItem("Version-Identifier-" + identifier), 0);
        RequestStockHelper.Remove("Version-Identifier-" + identifier);
        node.SetValue("AVCreation_Identifier", newIdentifier);
    }
}

private int GetNewCineDigitalIdentifierAV(string siteName)
{
    return (DocumentHelper.GetDocuments<CineDigitalAV>()
        .OnSite(siteName)
        .Published(false)
        .Columns("AVCreation_Identifier")
        .OrderByDescending("AVCreation_Identifier")
        .FirstObject?
        .AVCreation_Identifier ?? 0) + 1;
}
I have code that generates records based on my DataGridView. These records are temporary because some of them already exist in the database.
Crop_Variety v = new Crop_Variety();
v.Type_ID = currentCropType.Type_ID;
v.Variety_ID = r.Cells[0].Value.ToString();
v.Description = r.Cells[1].Value.ToString();
v.Crop = currentCrop;
v.Crop_ID = currentCrop.Crop_ID;
Unfortunately, in this little bit of code, because I say v.Crop = currentCrop, currentCrop.Crop_Varieties now includes this temporary record. When I go to insert the new records from this grid, they all hold a reference to the same Crop record, so the temporary records that already exist in the database show up twice, causing duplicate key errors when I submit.
I have a whole system for detecting which records need to be added and which need to be deleted based on what the user has done, but it's getting gummed up by this relentless tracking of references.
Is there a way I can stop Linq-To-Sql from automatically adding these temporary records to its table collections?
I would suggest revisiting the code that populates the DataGridView (grid) with records.
Then revisit the code that operates on items from the grid, keeping in mind that you can grab the bound item from a grid row using the following code:
public object GridSelectedItem
{
    get
    {
        try
        {
            if (_grid == null || _grid.SelectedCells.Count < 1) return null;
            DataGridViewCell cell = _grid.SelectedCells[0];
            DataGridViewRow row = _grid.Rows[cell.RowIndex];
            if (row.DataBoundItem == null) return null;
            return row.DataBoundItem;
        }
        catch { }
        return null;
    }
}
It is also hard to understand the nature of the Crop_Variety code you have posted, as Crop_Variety seems to be a subclass of Crop. This leads to problems when the Crop is not yet bound to the database, and can potentially lead to problems when you're adding a Crop_Variety to the context.
For this type of form application I normally have a List<T> _dataList inside the form class, with the main grid bound to that list through an ObjectBindingList or another mechanism. That way _dataList holds all the data that needs to be persisted when needed (the user clicked save).
When you assign an entity object reference, you create a link between the two objects. Here you are doing that:
v.Crop = currentCrop;
There is only one way to avoid this: modify the generated code, or generate/write your own. I would never do this.
I think you will be better off writing a custom DTO class instead of reusing the generated entities. I have done both approaches and I like the latter far better.
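As a sketch of the DTO idea (the class and property names below are illustrative, based on the fields in the question, not an existing API):

```csharp
// Hypothetical DTO: holds the grid's values as plain data, with no entity
// references, so Linq-to-Sql never tracks it as part of a change set.
public class CropVarietyDto
{
    public string VarietyId { get; set; }
    public string Description { get; set; }
    public int TypeId { get; set; }
    public int CropId { get; set; }   // plain foreign-key value, not a Crop reference

    // Map to a real entity only for the rows that actually need inserting.
    public Crop_Variety ToEntity()
    {
        return new Crop_Variety
        {
            Variety_ID = VarietyId,
            Description = Description,
            Type_ID = TypeId,
            Crop_ID = CropId   // set the FK column, not the Crop property
        };
    }
}
```

Because the DTO sets Crop_ID rather than Crop, no association is wired up, so currentCrop.Crop_Varieties is never silently extended.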
Edit: Here is some sample generated code:
[global::System.Data.Linq.Mapping.AssociationAttribute(Name="RssFeed_RssFeedItem", Storage="_RssFeed", ThisKey="RssFeedID", OtherKey="ID", IsForeignKey=true, DeleteOnNull=true, DeleteRule="CASCADE")]
public RssFeed RssFeed
{
    get
    {
        return this._RssFeed.Entity;
    }
    set
    {
        RssFeed previousValue = this._RssFeed.Entity;
        if (((previousValue != value)
            || (this._RssFeed.HasLoadedOrAssignedValue == false)))
        {
            this.SendPropertyChanging();
            if ((previousValue != null))
            {
                this._RssFeed.Entity = null;
                previousValue.RssFeedItems.Remove(this);
            }
            this._RssFeed.Entity = value;
            if ((value != null))
            {
                value.RssFeedItems.Add(this);
                this._RssFeedID = value.ID;
            }
            else
            {
                this._RssFeedID = default(int);
            }
            this.SendPropertyChanged("RssFeed");
        }
    }
}
As you can see, the generated code establishes the link by calling value.RssFeedItems.Add(this);.
In case you have many entities for which you would need many DTOs, you could code-generate the DTO classes using reflection.
I'm still having a hard time with Linq.
I need to write an update function that receives an object that has a list. Specifically, a region has a list of cities: I want to pass an object "Region" that has a name field and a list of cities. The problem is that the city objects came from another context, and I am unable to attach them to this context. I have tried several approaches and always get an error like "EntitySet was modified during enumeration" or similar. I am trying to make the code below work, but if anyone has a different approach, please help.
public int Updateregion(region E)
{
    try
    {
        using (var ctx = new AppDataDataContext())
        {
            var R = (from edt in ctx.regiaos
                     where edt.ID == E.ID
                     select edt).SingleOrDefault();
            if (R != null)
            {
                R.name = E.name;
                R.description = E.description;
            }
            R.cities = null;
            R.cities.AddRange(Edited.Cities);
            ctx.SubmitChanges();
            return 0; // OK!
        }
    }
    catch (Exception e)
    {
        ......
    }
}
You can't attach objects retrieved from one DataContext to another; it's not supported by Linq-to-SQL. You need to somehow detach the objects from their original context, but this isn't supported either. One can wonder why a detach method isn't available, but at least you can fake it by mapping the list to new objects:
var cities = Edited.Cities.Select(city => new City {
    ID = city.ID,
    Name = city.Name,
    /* etc */
});
The key here is to remember to map the primary key and NOT to map any of the relation properties; they must be left null. After this, you should be able to attach the new cities list and have it work as expected.
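Attaching the mapped objects could then look something like the sketch below. This is a sketch under assumptions: ctx.Cities stands in for whatever the cities table is called on the question's AppDataDataContext, and Attach/AttachAll are the standard Linq-to-SQL Table<TEntity> methods:

```csharp
using (var ctx = new AppDataDataContext())
{
    // Map to fresh, context-free City objects first (as above):
    // primary key copied, relation properties left null.
    var cities = Edited.Cities.Select(city => new City
    {
        ID = city.ID,
        Name = city.Name
    }).ToList();

    // Attach them to the new context as existing rows. If the rows carry
    // changes that should be written back, Attach(entity, true) can mark
    // them as modified instead (this requires a timestamp/version column
    // or UpdateCheck settings that allow it).
    ctx.Cities.AttachAll(cities);
    ctx.SubmitChanges();
}
```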