Before grouping:
After grouping:
Code snippet for grid:
"controls": [
{
"control_key": "siteassignment_grid",
"control_type_key": "grid",
"target_device_key": "normal",
"configuration": {
"AssignmentGrid": {
"options": {
"AllowSorting": {
"mode": "single",
"allowUnsort": "false"
},
"AllowFiltering": "true",
"AllowGrouping": "true",
"AllowReordering": "true",
"Selection": "false"
},
"schema": {
"fields": [
"string",
"string",
"string",
"string",
"string",
"string",
"string",
"string",
"string",
"string",
"string",
"string"
]
},
"columns": [
{
"headerTemplate": "<input type='checkbox' id='chkSelectAll' class='chkSelectAll' onclick='SiteAssignment.Widget.SearchSitesAssign.checkAll(this)'/>",
"template": "<input type='checkbox' class='checkbox' onclick='SiteAssignment.Widget.UpdateAssignment.LoadWidget()'/>", "width": "25px"
},
{
"field": "element_code",
"title": "Config Item Id",
"min_width": "75",
"max_width": "75",
},
{
"field": "element_desc",
"title": "Config Item Name",
"min_width": "110",
"max_width": "110",
"template": "#= SiteAssignment.Widget.SearchSitesAssign.GridColumnToolTip(element_desc) #"
},
...etc...
Knowing that the change event is fired when grouping, I checked the code to see if anything there might be causing this. I disabled it and still have the problem. Any help would be greatly appreciated.
Could setting min_width and max_width be causing problems? I'm fairly new to Kendo UI, but I wonder if it recalculates widths after grouping, or something like that. I would start by hard-coding the width values and seeing if the grid cooperates then.
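For instance (assuming Kendo's standard `width` column option in place of `min_width`/`max_width`), a hard-coded column definition might look like:

```json
{
    "field": "element_code",
    "title": "Config Item Id",
    "width": "75px"
}
```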
I am currently getting data from an external API for use in my Laravel API. I have everything working, but it feels slow.
I'm fetching the data with Http::get('url'), and that part is fast. Things only slow down when I start looping through the data and making edits.
I don't need all the data, but it would still be nice to clean it up before writing it to the database, since the source isn't very consistent. I also derive a few columns from the data with some logic, so each app/site doesn't have to do it.
I am saving to the database inside the foreach loop with Eloquent's Model::updateOrCreate() method, which works, but these JSON files can easily be 6,000 lines long or more, so it obviously takes time to loop through each set, modify values, and save to the database each time. There usually aren't more than 200 or so entries, but it still takes time. I will probably eventually switch to the newer upsert() method to make fewer queries to the database. Running on my localhost it currently takes about a minute and a half, which just seems way too long.
Here is a shortened version of how I was looping through the data.
$json = json_decode($contents, true);
$features = $json['features'];
foreach ($features as $feature){
// Get ID
$id = $feature['id'];
// Get primary condition data
$geometry = $feature['geometry'];
$properties = $feature['properties'];
// Get secondary geometry data
$geometryType = $geometry['type'];
$coordinates = $geometry['coordinates'];
Model::updateOrCreate(
[
'id' => $id,
],
[
'coordinates' => $coordinates,
'geometry_type' => $geometryType,
]);
}
Most of what I'm doing to the data before it goes into the database is cleaning up text strings, plus a few bits of logic to normalize or prep the data for websites and apps.
Is there a more efficient way to get the same result? This will ultimately be used in a scheduler and run on an interval.
Example Data structure from API documentation
{
"$schema": "http://json-schema.org/draft-04/schema#",
"additionalProperties": false,
"properties": {
"features": {
"items": {
"additionalProperties": false,
"properties": {
"attributes": {
"type": [
"object",
"null"
]
},
"geometry": {
"additionalProperties": false,
"properties": {
"coordinates": {
"items": {
"items": {
"type": "number"
},
"type": "array"
},
"type": "array"
},
"type": {
"type": "string"
}
},
"required": [
"coordinates",
"type"
],
"type": "object"
},
"properties": {
"additionalProperties": false,
"properties": {
"currentConditions": {
"items": {
"properties": {
"additionalData": {
"type": "string"
},
"conditionDescription": {
"type": "string"
},
"conditionId": {
"type": "integer"
},
"confirmationTime": {
"type": "integer"
},
"confirmationUserName": {
"type": "string"
},
"endTime": {
"type": "integer"
},
"id": {
"type": "integer"
},
"sourceType": {
"type": "string"
},
"startTime": {
"type": "integer"
},
"updateTime": {
"type": "integer"
}
},
"required": [
"id",
"userName",
"updateTime",
"startTime",
"conditionId",
"conditionDescription",
"confirmationUserName",
"confirmationTime",
"sourceType",
"endTime"
],
"type": "object"
},
"type": "array"
},
"id": {
"type": "string"
},
"name": {
"type": "string"
},
"nameId": {
"type": "string"
},
"parentAreaId": {
"type": "integer"
},
"parentSubAreaId": {
"type": "integer"
},
"primaryLatitude": {
"type": "number"
},
"primaryLongitude": {
"type": "number"
},
"primaryMP": {
"type": "number"
},
"routeId": {
"type": "integer"
},
"routeName": {
"type": "string"
},
"routeSegmentIndex": {
"type": "integer"
},
"secondaryLatitude": {
"type": "number"
},
"secondaryLongitude": {
"type": "number"
},
"secondaryMP": {
"type": "number"
},
"sortOrder": {
"type": "integer"
}
},
"required": [
"id",
"name",
"nameId",
"routeId",
"routeName",
"primaryMP",
"secondaryMP",
"primaryLatitude",
"primaryLongitude",
"secondaryLatitude",
"secondaryLongitude",
"sortOrder",
"parentAreaId",
"parentSubAreaId",
"routeSegmentIndex",
"currentConditions"
],
"type": "object"
},
"type": {
"type": "string"
}
},
"required": [
"type",
"geometry",
"properties",
"attributes"
],
"type": "object"
},
"type": "array"
},
"type": {
"type": "string"
}
},
"required": [
"type",
"features"
],
"type": "object"
}
Second, related question.
Since this is being updated on an interval I have it updating and creating records from the json data, but is there an efficient way to delete old records that are no longer in the json file? I currently get an array of current ids and compare them to the new ids and then loop through each and delete them. There has to be a better way.
I have no idea what to say about your first question, but regarding the second one, you could try something like this:
SomeModel::query()->whereNotIn('id', $newIds)->delete();
You can collect $newIds during the first loop.
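The idea is a plain set difference: collect the ids seen in the fresh feed, then delete every stored row whose id is not in that set. A language-neutral sketch (Python, with an in-memory dict standing in for the table; in Laravel this is the single `whereNotIn(...)->delete()` query above):

```python
# Existing rows keyed by id (stands in for the database table).
table = {"a": 1, "b": 2, "c": 3}

# Ids present in the fresh JSON feed.
features = [{"id": "a"}, {"id": "c"}]

# Collected during the import loop (the $newIds array).
new_ids = {f["id"] for f in features}

# Rows that are no longer in the feed.
stale = [row_id for row_id in table if row_id not in new_ids]

# In SQL this is one bulk DELETE ... WHERE id NOT IN (...) query.
for row_id in stale:
    del table[row_id]

print(sorted(table))  # prints ['a', 'c']
```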
I have a dataobject.json and a corresponding example.json. I'd like to compare the two to check that everything in the example uses the same notation as the dataobject.
I added the dataobject file to the example as a schema to validate against. This works, but only for the required fields, not for the optional properties; there the validation doesn't find a problem even when there is one.
To catch those I added the "additionalProperties": false line. In general this works, and I now find all the deviations, but I also get an error that the property $schema is not allowed.
How can I solve this?
the dataobject
{
"$schema": "http://json-schema.org/draft-07/schema",
"type": "object",
"title": "GroupDO",
"required": [
"id",
"name"
],
"additionalProperties": false,
"properties": {
"id": {
"type": "string",
"format": "uuid",
"example": "5dd6c80a-3376-4bce-bc47-8t41b3565325",
"description": "Unique id ."
},
"name": {
"type": "string",
"example": "ABD",
"description": "The name."
},
"GroupSort": {
"type": "integer",
"format": "int32",
"example": 1,
"description": "Defines in which order the groups should appear."
},
"GroupTextList": {
"type": "array",
"description": "A descriptoin in multiple languages.",
"items": {
"$ref": "../../common/dataobjects/Description_1000_DO.json"
}
},
"parentGroupId": {
"type": "string",
"format": "uuid",
"example": "8e590f93-1ab6-40e4-a5f4-aa1eeb2b6a80",
"description": "Unique id for the parent group."
}
},
"description": "DO representing a group object. "}
the example
{ "$schema": "../dataobjects/GroupDO.json",
"id": "18694b46-0833-4790-b780-c7897ap08500",
"version": 1,
"lastChange": "2020-05-12T13:57:39.935305",
"sort": 3,
"name": "STR",
"parentGroupId": "b504273e-61fb-48d1-aef8-c289jk779709",
"GroupTexts": [
{
"id": "7598b668-d9b7-4d27-a489-19e45h2bdad0",
"version": 0,
"lastChange": "2020-03-09T14:14:25.491787",
"languageIsoCode": "de_DE",
"description": "Tasche"
},
{
"id": "376e82f8-837d-4bb2-a21f-a9e0ebd59e23",
"version": 0,
"lastChange": "2020-03-09T14:14:25.491787",
"languageIsoCode": "en_GB",
"description": "Bag"
}
]
}
the problem message:
property $schema is not allowed
Thanks in advance for your help.
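One way around the "property $schema is not allowed" message is to declare $schema as an explicitly allowed (but otherwise unconstrained) property in the dataobject schema, so that "additionalProperties": false no longer rejects it. A sketch, with the other properties elided:

```json
{
  "additionalProperties": false,
  "properties": {
    "$schema": { "type": "string" },
    "id": { "type": "string", "format": "uuid" }
  }
}
```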
I've got an ASP.NET Core 5 Web API that uses Swagger to generate an OpenAPI 3 JSON file. Using Visual Studio 2019 (16.8.2), I've added a Service Reference to that JSON file in my Blazor WASM client.
All of this is working fine, except that for some reason the POCOs generated by Visual Studio (NSwag, maybe, based on the references) all use DateTimeOffset.
I think I need to set a setting like useDateTimeOffset based on this, but I have no clue where or how to do so.
EDIT: It looks like DateTimeOffset is the recommended data type for general-purpose dates. But I still want to know how to change the options!
In case it's a problem with the OpenAPI3 JSON: targetDate should be date only, and dateRaised should be a date and time:
"InitialRequestDTO": {
"type": "object",
"properties": {
"id": {
"type": "integer",
"format": "int32"
},
"requesterName": {
"type": "string",
"nullable": true
},
"sponsorName": {
"type": "string",
"nullable": true
},
"requestTitle": {
"type": "string",
"nullable": true
},
"ownReference": {
"type": "string",
"nullable": true
},
"dateRaised": {
"type": "string",
"format": "date-time"
},
"details": {
"type": "string",
"nullable": true
},
"existingApplicationId": {
"type": "integer",
"format": "int32",
"nullable": true
},
"targetDate": {
"type": "string",
"format": "date",
"nullable": true
},
"benefits": {
"type": "string",
"nullable": true
},
"externalDependencies": {
"type": "string",
"nullable": true
},
"nameOfLargerChange": {
"type": "string",
"nullable": true
}
},
"additionalProperties": false
},
In the end I worked around this by using a partial class to add an additional property:
[JsonIgnore]
public DateTime ClosedOnDT
{
get => ClosedOn.HasValue ? ClosedOn.Value.LocalDateTime : default;
set => ClosedOn = new DateTimeOffset(value, TimeZoneInfo.Local.GetUtcOffset(value));
}
I have a Shopware 6.3 shop and need to migrate images into it using the integration API.
How should I construct the body for a media upload? Do I need to put the file somewhere, or just pass in a link?
I have managed to push new products into Shopware via the guide here: https://docs.shopware.com/en/shopware-platform-dev-en/admin-api-guide/writing-entities?category=shopware-platform-dev-en/admin-api-guide#creating-entities but I am not sure how to handle media. The guide only explains how to link already-uploaded media files to products (https://docs.shopware.com/en/shopware-platform-dev-en/admin-api-guide/writing-entities?category=shopware-platform-dev-en/admin-api-guide#media-handling), with no examples of how to actually push the media files themselves.
I have URLs for each image I need (in the database, along with product ids and image positions).
The entity schema describes media as:
"media": {
"name": "media",
"translatable": [
"alt",
"title",
"customFields"
],
"properties": {
"id": {
"type": "string",
"format": "uuid"
},
"userId": {
"type": "string",
"format": "uuid"
},
"mediaFolderId": {
"type": "string",
"format": "uuid"
},
"mimeType": {
"type": "string",
"readOnly": true
},
"fileExtension": {
"type": "string",
"readOnly": true
},
"uploadedAt": {
"type": "string",
"format": "date-time",
"readOnly": true
},
"fileName": {
"type": "string",
"readOnly": true
},
"fileSize": {
"type": "integer",
"format": "int64",
"readOnly": true
},
"metaData": {
"type": "object",
"readOnly": true
},
"mediaType": {
"type": "object",
"readOnly": true
},
"alt": {
"type": "string"
},
"title": {
"type": "string"
},
"url": {
"type": "string"
},
"hasFile": {
"type": "boolean"
},
"private": {
"type": "boolean"
},
"customFields": {
"type": "object"
},
"createdAt": {
"type": "string",
"format": "date-time",
"readOnly": true
},
"updatedAt": {
"type": "string",
"format": "date-time",
"readOnly": true
},
"translated": {
"type": "object"
},
"tags": {
"type": "array",
"entity": "tag"
},
"thumbnails": {
"type": "array",
"entity": "media_thumbnail"
},
"user": {
"type": "object",
"entity": "user"
},
"categories": {
"type": "array",
"entity": "category"
},
"productManufacturers": {
"type": "array",
"entity": "product_manufacturer"
},
"productMedia": {
"type": "array",
"entity": "product_media"
},
"avatarUser": {
"type": "object",
"entity": "user"
},
"mediaFolder": {
"type": "object",
"entity": "media_folder"
},
"propertyGroupOptions": {
"type": "array",
"entity": "property_group_option"
},
"mailTemplateMedia": {
"type": "array",
"entity": "mail_template_media"
},
"documentBaseConfigs": {
"type": "array",
"entity": "document_base_config"
},
"shippingMethods": {
"type": "array",
"entity": "shipping_method"
},
"paymentMethods": {
"type": "array",
"entity": "payment_method"
},
"productConfiguratorSettings": {
"type": "array",
"entity": "product_configurator_setting"
},
"orderLineItems": {
"type": "array",
"entity": "order_line_item"
},
"cmsBlocks": {
"type": "array",
"entity": "cms_block"
},
"cmsSections": {
"type": "array",
"entity": "cms_section"
},
"cmsPages": {
"type": "array",
"entity": "cms_page"
},
"documents": {
"type": "array",
"entity": "document"
}
}
},
but it is not clear which fields are crucial. Do I need to create a product-media folder first and then use its id when making a POST request to the media endpoint? Can I just specify the URL, and will Shopware download the image itself into a folder, or keep pointing at the URL I used? I need to house the images inside Shopware.
Downloading the images from the URLs and pushing them to Shopware is no problem for me, but I am not sure how to use the API for it (there are a lot of images and they need to be done in bulk).
One possible solution:
FIRST: create a new media POST /api/{apiVersion}/media?_response=true
SECOND: "Upload Image" /api/{apiVersion}/_action/media/{mediaId}/upload?extension={extension}&fileName={imgName}&_response=true
More information can be found here: https://forum.shopware.com/discussion/comment/278603/#Comment_278603
In case the images are for products, use the endpoint POST /api/{apiVersion}/product-media and set the coverId.
A complete listing of all routes is available via the OpenAPI schema: [your-domain/localhost]/api/v3/_info/openapi3.json
It's also possible to set all the media and the cover/coverId during product creation in one request. To do so, set the product cover and product media:
{
"coverId":"3d5ebde8c31243aea9ecebb1cbf7ef7b",
"productNumber":"SW10002","active":true,"name":"Test",
"description":"fasdf",
"media":[{
"productId":"94786d894e864783b546fbf7c60a3640",
"mediaId":"084f6aa36b074130912f476da1770504",
"position":0,
"id":"3d5ebde8c31243aea9ecebb1cbf7ef7b"
},
{
"productId":"94786d894e864783b546fbf7c60a3640",
"mediaId":"4923a2e38a544dc5a7ff3e26a37ab2ae",
"position":1,
"id":"600999c4df8b40a5bead55b75efe688c"
}],
"id":"94786d894e864783b546fbf7c60a3640"
}
Keep in mind to check that the bearer token is still valid, for example like this:
// Re-authenticate when the token is (nearly) expired, then send the request.
if (JwtToken.ValidTo < DateTime.Now.ToUniversalTime() - new TimeSpan(0, 5, 0))
{
    // refresh the token by re-authenticating
    IntegrationAuthenticator(this.key, this.secret);
}
return Client.Get(request);
This will work for Shopware 6.4.
As general advice: it depends. The APIs changed a little with 6.4, and there is also official documentation available at https://shopware.stoplight.io/docs/admin-api/docs/guides/media-handling.md.
However, I think it is always a little easier to have a real-life example. What I do in our production environment is basically these steps:
(Optional) Check if the media object already exists
Create a media-file object using the endpoint GET /media-files/
If it does not exist yet, upload the image using the new media-id reference.
Let us assume the filename is yourfilename.jpg. You will also need a media-folder-id, which references the image folder within Shopware. This can be obtained in Shopware via Admin > Content > Media > Product Media.
Step 0
Before uploading an image to Shopware, you want to ensure that the image does not already exist, so that you can skip it.
This step is optional, as it is not mandatory before creating an image. However, you will want some sort of validation mechanism in a production environment.
Request-Body
POST api/search/media
This will run a search request against the Shopware API and return the matching media ids.
{
"filter":[
{
"type":"equals",
"field":"fileName",
"value":"yourfilename"
},
{
"type":"equals",
"field":"fileExtension",
"value":"jpg"
},
{
"type":"equals",
"field":"mediaFolderId",
"value":"d798f70b69f047c68810c45744b43d6f"
}
],
"includes":{
"media":[
"id"
]
}
}
Step 1
Create a new media-file
Request-Body
POST api/_action/sync
This request will create a new media-object in Shopware.
The value for media_id must be a new UUID. I will use this value: 94f83a75669647288d4258f670a53e69
The customFields property is optional. I just use it to keep a reference hash, which I can use to detect changed files.
The value for the media folder id is the one you get from your Shopware backend.
{
"create-media": {
"entity": "media",
"action": "upsert",
"payload": [
{
"id": "{{media_id}}",
"customFields": {"hash": "{{file.hash}}"},
"mediaFolderId": "{{mediaFolderId}}"
}
]
}
}
Response
The response tells you whether everything worked as expected.
{
"success":true,
"data":{
"create-media":{
"result":[
{
"entities":{
"media":[
"94f83a75669647288d4258f670a53e69"
],
"media_translation":[
{
"mediaId":"94f83a75669647288d4258f670a53e69",
"languageId":"2fbb5fe2e29a4d70aa5854ce7ce3e20b"
}
]
},
"errors":[
]
}
],
"extensions":[
]
}
},
"extensions":[
]
}
Step 2
This is the step where we upload the image to Shopware. We use the variant with content type image/jpeg. However, a payload with a url attribute would also work; see the details in the official documentation.
Request-Body
POST api/_action/media/94f83a75669647288d4258f670a53e69/upload?extension=jpg&fileName=yourfilename
Note that the media id is part of the URL, and so is the filename, but without the .jpg file extension!
The body is pretty straightforward; in our case there is no payload, as we upload with Content-Type: "image/jpeg".
This would be the payload if you want to use a URL as the resource:
{
"url": "<url-to-your-image>"
}
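Putting the three steps together, the requests can be sketched as follows (Python, as an illustration; base_url, the folder id, and the filename are placeholders, and the actual HTTP calls are left as a comment so the sketch only shows how the URLs and bodies are built):

```python
import json
import uuid

base_url = "https://your-shop.example/api"            # placeholder
media_folder_id = "d798f70b69f047c68810c45744b43d6f"  # your Product Media folder id

# Step 0: search body to check whether the file already exists.
search_body = {
    "filter": [
        {"type": "equals", "field": "fileName", "value": "yourfilename"},
        {"type": "equals", "field": "fileExtension", "value": "jpg"},
        {"type": "equals", "field": "mediaFolderId", "value": media_folder_id},
    ],
    "includes": {"media": ["id"]},
}
search_url = f"{base_url}/search/media"

# Step 1: sync payload that creates the media entity under a fresh UUID.
media_id = uuid.uuid4().hex
sync_body = {
    "create-media": {
        "entity": "media",
        "action": "upsert",
        "payload": [{"id": media_id, "mediaFolderId": media_folder_id}],
    }
}
sync_url = f"{base_url}/_action/sync"

# Step 2: upload URL; the filename goes in the query string WITHOUT extension.
upload_url = (f"{base_url}/_action/media/{media_id}/upload"
              f"?extension=jpg&fileName=yourfilename")
upload_body = {"url": "https://example.com/path/to/image.jpg"}  # URL variant

# Each step would then be sent with an authenticated POST, e.g.:
# requests.post(sync_url, json=sync_body,
#               headers={"Authorization": f"Bearer {token}"})
print(json.dumps(search_body, indent=2))
```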
I'm getting an error when I do a full-text search for the full-width number "9" in Couchbase 6.0.3. The exception thrown is: err: bleve: QueryBleve validating request, err: parse error: error parsing number: strconv.ParseFloat: parsing.
If I search for a string such as "9abc", the search succeeds, so I think Couchbase's search library recognizes "9" as a number and fails to parse it. I don't know how to resolve the problem. Please help!
Couchbase 6.0.3
ConjunctionQuery fts = SearchQuery.conjuncts(SearchQuery.queryString(source));
fts = fts.and(SearchQuery.matchPhrase("123").field("tm"));
fts = fts.and(SearchQuery.booleanField(true).field("active"));
SearchQuery query = new SearchQuery("segmentIndex", fts);
SearchQueryResult result = bucket.query(query);
{
"name": "tmSegmentIndex",
"type": "fulltext-index",
"params": {
"doc_config": {
"docid_prefix_delim": "",
"docid_regexp": "",
"mode": "type_field",
"type_field": "type"
},
"mapping": {
"analysis": {
"analyzers": {
"remove_fullsize_number": {
"char_filters": [
"remove_fullsize_number"
],
"token_filters": [
"cjk_bigram",
"cjk_width"
],
"tokenizer": "whitespace",
"type": "custom"
}
},
"char_filters": {
"remove_fullsize_number": {
"regexp": "9",
"replace": "9",
"type": "regexp"
}
}
},
"default_analyzer": "cjk",
"default_datetime_parser": "dateTimeOptional",
"default_field": "_all",
"default_mapping": {
"default_analyzer": "cjk",
"dynamic": true,
"enabled": true
},
"default_type": "_default",
"docvalues_dynamic": true,
"index_dynamic": true,
"store_dynamic": false,
"type_field": "_type"
},
"store": {
"indexType": "scorch",
"kvStoreName": "mossStore"
}
},
"sourceType": "couchbase",
"sourceName": "tm-segment",
"sourceUUID": "973fdbffc567cdfe8f423289b9700f19",
"sourceParams": {},
"planParams": {
"maxPartitionsPerPIndex": 171,
"numReplicas": 0
},
"uuid": "1265a6bedbfd027c"
}
Can you try a custom analyzer with an asciifolding character filter, like below?
Also, when you search directly from the UI box without a field name, the search goes against the "_all" field, which won't use the right/intended analyzer for parsing the query text.
You may want to field-scope the query there, like: field:"9"
{
"type": "fulltext-index",
"name": "FTS",
"uuid": "401ee8132818cee3",
"sourceType": "couchbase",
"sourceName": "sample",
"sourceUUID": "6bd6d0b1c714fcd7697a349ff8166bf8",
"planParams": {
"maxPartitionsPerPIndex": 171,
"indexPartitions": 6
},
"params": {
"doc_config": {
"docid_prefix_delim": "",
"docid_regexp": "",
"mode": "type_field",
"type_field": "type"
},
"mapping": {
"analysis": {
"analyzers": {
"custom": {
"char_filters": [
"asciifolding"
],
"tokenizer": "unicode",
"type": "custom"
}
}
},
"default_analyzer": "standard",
"default_datetime_parser": "dateTimeOptional",
"default_field": "_all",
"default_mapping": {
"dynamic": false,
"enabled": true,
"properties": {
"id": {
"dynamic": false,
"enabled": true,
"fields": [
{
"analyzer": "custom",
"docvalues": true,
"include_in_all": true,
"include_term_vectors": true,
"index": true,
"name": "id",
"type": "text"
}
]
}
}
},
"default_type": "_default",
"docvalues_dynamic": true,
"index_dynamic": true,
"store_dynamic": false,
"type_field": "_type"
},
"store": {
"indexType": "scorch"
}
},
"sourceParams": {}
}
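For reference, a field-scoped match query sent against the FTS index would look roughly like this (the field name id matches the mapping above; adjust it to your own document shape):

```json
{
  "query": {
    "field": "id",
    "match": "9"
  }
}
```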
Asciifolding filters are part of the Couchbase 6.5.0 release; it's available in beta for trials.