Oracle Cloud: How to fill a JSON file for custom metrics

I am trying to send custom t2 telemetry metrics to Oracle Cloud. Using the command below, I am able to generate a param JSON file.
oci monitoring metric-data post --generate-param-json-input metric-data > metric-data.json
Below is the generated metric-data.json file:
[
  {
    "compartmentId": "string",
    "datapoints": [
      { "count": 0, "timestamp": "2017-01-01T00:00:00+00:00", "value": 0.0 },
      { "count": 0, "timestamp": "2017-01-01T00:00:00+00:00", "value": 0.0 }
    ],
    "dimensions": { "string1": "string", "string2": "string" },
    "metadata": { "string1": "string", "string2": "string" },
    "name": "string",
    "namespace": "string",
    "resourceGroup": "string"
  },
  {
    "compartmentId": "string",
    "datapoints": [
      { "count": 0, "timestamp": "2017-01-01T00:00:00+00:00", "value": 0.0 },
      { "count": 0, "timestamp": "2017-01-01T00:00:00+00:00", "value": 0.0 }
    ],
    "dimensions": { "string1": "string", "string2": "string" },
    "metadata": { "string1": "string", "string2": "string" },
    "name": "string",
    "namespace": "string",
    "resourceGroup": "string"
  }
]
My metrics requirement is as follows. I need to send the information below whenever any agent ID is late or missing.
MetricsName: [late/missing]
Hostname: somexyz.oraclecloud.com
agentid: asdfkjgsjdg723
category: custom/DB/Webserver
Region:
AD:
Information1:
Information2:
So my questions are:
1. How do I accommodate my information in the metric-data.json file?
2. How do I visualise my data in the cloud?
3. Do I need to register my service on the cloud before sending data?

How to visualise my data in the cloud:
Use Metrics Explorer in the OCI Console.
Do I need to register my service before sending data:
No, there is no need to register.
Sample data -
[
  {
    "namespace": "monitoring",
    "compartmentId": "$compartmentID",
    "resourceGroup": "gpu_0_monitoring",
    "name": "gpuTemperature",
    "dimensions": {
      "resourceId": "$instanceOCID",
      "instanceName": "$instanceName"
    },
    "metadata": {
      "unit": "degrees Celsius",
      "displayName": "GPU Temperature"
    },
    "datapoints": [
      { "timestamp": "2022-12-06T12:43:40Z", "value": 43 }
    ]
  }
]
Save this data in a metric-data.json file. The above is sample data that you post to the monitoring service:
oci monitoring metric-data post --metric-data file://metric-data.json
For visualisation you can refer to the document below.
https://docs.oracle.com/en-us/iaas/Content/Monitoring/Tasks/publishingcustommetrics.htm
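To map the requirement from the original question onto this format, the hostname, agent ID, and category fit naturally as dimensions, and the free-form fields as metadata; the late/missing state can be another dimension, with a numeric datapoint value to aggregate on. A sketch (the namespace, metric name, compartment OCID, and field values are illustrative placeholders, not from the post):

```json
[
  {
    "namespace": "agent_monitoring",
    "compartmentId": "ocid1.compartment.oc1..exampleuniqueid",
    "name": "agentStatus",
    "dimensions": {
      "hostname": "somexyz.oraclecloud.com",
      "agentid": "asdfkjgsjdg723",
      "category": "custom/DB/Webserver",
      "status": "late"
    },
    "metadata": {
      "information1": "free-form detail",
      "information2": "free-form detail"
    },
    "datapoints": [
      { "timestamp": "2022-12-06T12:43:40Z", "value": 1 }
    ]
  }
]
```

Posting a value of 1 whenever an agent is late or missing lets you sum or count the datapoints per dimension in Metrics Explorer and build alarms on them.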

Related

Best way for message structure for Kafka Events for CUD operations to a table

I am creating three different event types for Kafka (or any queue) for an insert, update, or delete to a table in SQL Server. I am thinking about how the messages should be structured. What is the best way to structure these messages in Kafka or any streaming queue like Azure Event Hubs or RabbitMQ?
Message Value for Update
{
  "tableName": "string",
  "tableKey": [
    { "key": "string", "value": "string" }
  ],
  "columns": [
    { "columnName": "string", "columnValue": "string" },
    { "columnName": "string", "columnValue": "string" }
  ]
}
Message Value for Delete
{
  "tableName": "string",
  "tableKey": [
    { "key": "string", "value": "string" }
  ]
}
Message Value for Insert
{
  "tableName": "string",
  "tableKey": [
    { "key": "string", "value": "string" }
  ],
  "Not sure what should be there because there can be 100 columns in a table"
}
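For the insert case, one common convention (a sketch of one option, not from the question) is to reuse the same columns array shape as the update message and include every column of the new row, so consumers handle inserts and updates uniformly:

```json
{
  "tableName": "string",
  "tableKey": [
    { "key": "string", "value": "string" }
  ],
  "columns": [
    { "columnName": "string", "columnValue": "string" }
  ]
}
```

If rows are wide (the 100-column case), sending the key plus only the populated columns keeps the payload self-describing while still allowing consumers to rebuild the full row.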

Add images via Shopware 6 API

I have a Shopware 6.3 shop and need to migrate images to it using the integration API.
How should I construct the body for a media upload? Do I need to put the file somewhere, or just pass in a link?
I have managed to push new products into Shopware via the guide here: https://docs.shopware.com/en/shopware-platform-dev-en/admin-api-guide/writing-entities?category=shopware-platform-dev-en/admin-api-guide#creating-entities but I am not sure how to handle media. That guide only explains how to link already uploaded media files to products (https://docs.shopware.com/en/shopware-platform-dev-en/admin-api-guide/writing-entities?category=shopware-platform-dev-en/admin-api-guide#media-handling) but gives no examples of how to actually push the media files themselves.
I have URLs for each image I need (in a database, along with product IDs and image positions).
The entity schema describes media as:
"media": {
  "name": "media",
  "translatable": ["alt", "title", "customFields"],
  "properties": {
    "id": { "type": "string", "format": "uuid" },
    "userId": { "type": "string", "format": "uuid" },
    "mediaFolderId": { "type": "string", "format": "uuid" },
    "mimeType": { "type": "string", "readOnly": true },
    "fileExtension": { "type": "string", "readOnly": true },
    "uploadedAt": { "type": "string", "format": "date-time", "readOnly": true },
    "fileName": { "type": "string", "readOnly": true },
    "fileSize": { "type": "integer", "format": "int64", "readOnly": true },
    "metaData": { "type": "object", "readOnly": true },
    "mediaType": { "type": "object", "readOnly": true },
    "alt": { "type": "string" },
    "title": { "type": "string" },
    "url": { "type": "string" },
    "hasFile": { "type": "boolean" },
    "private": { "type": "boolean" },
    "customFields": { "type": "object" },
    "createdAt": { "type": "string", "format": "date-time", "readOnly": true },
    "updatedAt": { "type": "string", "format": "date-time", "readOnly": true },
    "translated": { "type": "object" },
    "tags": { "type": "array", "entity": "tag" },
    "thumbnails": { "type": "array", "entity": "media_thumbnail" },
    "user": { "type": "object", "entity": "user" },
    "categories": { "type": "array", "entity": "category" },
    "productManufacturers": { "type": "array", "entity": "product_manufacturer" },
    "productMedia": { "type": "array", "entity": "product_media" },
    "avatarUser": { "type": "object", "entity": "user" },
    "mediaFolder": { "type": "object", "entity": "media_folder" },
    "propertyGroupOptions": { "type": "array", "entity": "property_group_option" },
    "mailTemplateMedia": { "type": "array", "entity": "mail_template_media" },
    "documentBaseConfigs": { "type": "array", "entity": "document_base_config" },
    "shippingMethods": { "type": "array", "entity": "shipping_method" },
    "paymentMethods": { "type": "array", "entity": "payment_method" },
    "productConfiguratorSettings": { "type": "array", "entity": "product_configurator_setting" },
    "orderLineItems": { "type": "array", "entity": "order_line_item" },
    "cmsBlocks": { "type": "array", "entity": "cms_block" },
    "cmsSections": { "type": "array", "entity": "cms_section" },
    "cmsPages": { "type": "array", "entity": "cms_page" },
    "documents": { "type": "array", "entity": "document" }
  }
},
but it is not clear which fields are crucial. Do I need to create a product-media folder first and then use its ID when making a POST request to the media endpoint? Can I just specify the URL, and will Shopware download the image itself to a folder or keep pointing to the URL I provided? I need to house the images inside Shopware.
There is no problem for me to download the images from the URLs and push them to Shopware, but I am not sure how to use the API for it (there are a lot of images and they need to be done in bulk).
One possible solution:
FIRST: create a new media entity: POST /api/{apiVersion}/media?_response=true
SECOND: upload the image: POST /api/{apiVersion}/_action/media/{mediaId}/upload?extension={extension}&fileName={imgName}&_response=true
More information can be found here: https://forum.shopware.com/discussion/comment/278603/#Comment_278603
In case the images are for products, use the endpoint POST /api/{apiVersion}/product-media and set the coverId.
A complete listing of all routes is available via the OpenAPI schema: [your-domain/localhost]/api/v3/_info/openapi3.json
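The first two requests above can be sketched as raw HTTP (host, token, and folder ID are placeholders; `_response=true` asks the API to return the created entity):

```http
POST https://your-shop.example/api/v3/media?_response=true
Authorization: Bearer <token>
Content-Type: application/json

{ "mediaFolderId": "<product-media-folder-id>" }

POST https://your-shop.example/api/v3/_action/media/<mediaId>/upload?extension=jpg&fileName=my-image&_response=true
Authorization: Bearer <token>
Content-Type: image/jpeg

<raw image bytes>
```

The `<mediaId>` in the second request is the ID returned by (or passed into) the first one.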
It is also possible to set all the media and the cover/coverId during product creation in one request. To do so, set the product cover and product media:
{
  "coverId": "3d5ebde8c31243aea9ecebb1cbf7ef7b",
  "productNumber": "SW10002",
  "active": true,
  "name": "Test",
  "description": "fasdf",
  "media": [
    {
      "productId": "94786d894e864783b546fbf7c60a3640",
      "mediaId": "084f6aa36b074130912f476da1770504",
      "position": 0,
      "id": "3d5ebde8c31243aea9ecebb1cbf7ef7b"
    },
    {
      "productId": "94786d894e864783b546fbf7c60a3640",
      "mediaId": "4923a2e38a544dc5a7ff3e26a37ab2ae",
      "position": 1,
      "id": "600999c4df8b40a5bead55b75efe688c"
    }
  ],
  "id": "94786d894e864783b546fbf7c60a3640"
}
Keep in mind to check that the bearer token is still valid, for example by refreshing it shortly before it expires:
// Re-authenticate if the token expires within the next five minutes.
if (JwtToken.ValidTo < DateTime.UtcNow + TimeSpan.FromMinutes(5))
{
    // Refresh the token by authenticating again.
    IntegrationAuthenticator(this.key, this.secret);
}
return Client.Get(request);
This will work for Shopware 6.4
As general advice: it depends. The APIs changed a little with 6.4, and there is also official documentation available at https://shopware.stoplight.io/docs/admin-api/docs/guides/media-handling.md.
However, I think it is always a little easier to have a real-life example. What I do in our production environment is basically these steps:
(Optional) Check if the media object already exists.
Create a media object via the sync endpoint.
If it does not exist yet, upload the image using the new media ID as a reference.
Let us assume the filename is yourfilename.jpg. You will also need a media-folder ID, which references the image folder within Shopware. This can be obtained in Shopware via Admin > Content > Media > Product Media.
Step 0
Before uploading an image to Shopware, you want to ensure that the image does not already exist, so that you can skip it.
This step is optional, as it is not mandatory before creating an image. However, you will want some sort of validation mechanism in a production environment.
Request-Body
POST api/search/media
This will run a search request against the Shopware API and return a response.
{
  "filter": [
    { "type": "equals", "field": "fileName", "value": "yourfilename" },
    { "type": "equals", "field": "fileExtension", "value": "jpg" },
    { "type": "equals", "field": "mediaFolderId", "value": "d798f70b69f047c68810c45744b43d6f" }
  ],
  "includes": {
    "media": ["id"]
  }
}
Step 1
Create a new media-file
Request-Body
POST api/_action/sync
This request will create a new media object in Shopware.
The value for media_id can be any UUID. I will use this value: 94f83a75669647288d4258f670a53e69
The customFields property is optional. I just use it to keep a reference hash value which I can use to detect changed files.
The value for the media-folder ID is the one you get from your Shopware backend.
{
  "create-media": {
    "entity": "media",
    "action": "upsert",
    "payload": [
      {
        "id": "{{media_id}}",
        "customFields": { "hash": "{{file.hash}}" },
        "mediaFolderId": "{{mediaFolderId}}"
      }
    ]
  }
}
Response
The response tells you that everything worked as expected.
{
  "success": true,
  "data": {
    "create-media": {
      "result": [
        {
          "entities": {
            "media": ["94f83a75669647288d4258f670a53e69"],
            "media_translation": [
              {
                "mediaId": "94f83a75669647288d4258f670a53e69",
                "languageId": "2fbb5fe2e29a4d70aa5854ce7ce3e20b"
              }
            ]
          },
          "errors": []
        }
      ],
      "extensions": []
    }
  },
  "extensions": []
}
Step 2
This is the step where we upload the image to Shopware. We use the variant with the content type image/jpeg; however, a payload with a URL attribute would also work. See the details in the official documentation.
Request-Body
POST api/_action/media/94f83a75669647288d4258f670a53e69/upload?extension=jpg&fileName=yourfilename
Note that the media ID is part of the URL, and so is the filename, but without the .jpg file extension!
The body is pretty straightforward, and in our case there is no payload, as we upload with Content-Type: image/jpeg.
This would be the payload if you want to use a URL as the resource:
{
  "url": "<url-to-your-image>"
}

Nifi JoltTransformRecord UUID in default transform not working as expected

I have a NiFi workflow which uses JoltTransformRecord for doing some manipulation of record-based data. I have to create a default UUID value in each message in the flow file.
My JoltTransformRecord configuration is as below.
Jolt specification:
[
  {
    "operation": "shift",
    "spec": {
      "payload": "data.payload"
    }
  },
  {
    "operation": "default",
    "spec": {
      "header": {
        "source": "${source}",
        "client_id": "${client_id}",
        "uuid": "${UUID()}",
        "payload_type": "${payload_type}"
      }
    }
  }
]
The shift operation and all the other default operations are working as expected, but the UUID comes out the same for all messages. I need a different UUID for each message. I don't want to add another processor just for this purpose.
My workflow is below:
Reader and writer configurations for the JoltTransformRecord processor are:
IngestionSchemaJsonTreeReader (a JsonTreeReader service):
IngestionSchemaAvroRecordSetWriter (an AvroRecordSetWriter service):
The configured schema registry has the schemas below defined in it.
com.xyz.ingestion.pre_json
{
  "type": "record",
  "name": "event",
  "namespace": "com.xyz.ingestion.raw",
  "doc": "Event ingested to kafka",
  "fields": [
    {
      "name": "payload",
      "type": ["null", "string"],
      "default": "null"
    }
  ]
}
com.xyz.ingestion.raw -
{
  "type": "record",
  "name": "event",
  "namespace": "com.xyz.ingestion.raw",
  "doc": "Event ingested to kafka",
  "fields": [
    {
      "name": "header",
      "type": {
        "type": "record",
        "name": "header",
        "namespace": "com.xyz.ingestion.raw.header",
        "doc": "Header data for event ingested",
        "fields": [
          { "name": "payload_type", "type": "string" },
          { "name": "uuid", "type": "string", "size": "36" },
          { "name": "client_id", "type": "string" },
          { "name": "source", "type": "string" }
        ]
      }
    },
    {
      "name": "data",
      "type": {
        "type": "record",
        "name": "data",
        "namespace": "com.xyz.ingestion.raw.data",
        "doc": "Payload for event ingested",
        "fields": [
          {
            "name": "payload",
            "type": ["null", "string"],
            "default": "null"
          }
        ]
      }
    }
  ]
}
The expression language is evaluated per record, and UUID() is executed for each evaluation, so the UUID should be unique for each record. From the information you provided I cannot see why you are getting duplicate UUIDs.
I tried to reproduce your problem with the following flow:
GenerateFlowFile:
SplitJson: configure $ as the JsonPathExpression to split the JSON array into records.
JoltTransformRecord:
As you can see, the way I am adding the UUID is no different from how you do it, but I am getting different UUIDs as expected:

Azure Service Bus - ARM Template (Update existing Topic parameters)

I need help with the Azure Service Bus service. I need to create a new topic in an existing Service Bus namespace using Visual Studio; is there any way to achieve this? When I try, I get the following error:
Template deployment returned the following errors:
15:28:14 - 15:28:13 - Resource Microsoft.ServiceBus/namespaces "#######" failed with message '{
15:28:14 - "error": {
15:28:14 - "message": "Namespace update failed with conflict in backend. CorrelationId: 3c155444-2c1e-525d-943f-8b25d0a1da7e",
15:28:14 - "code": "Conflict"
15:28:14 - }
Any help would be appreciated.
The Microsoft.Resources/deployments template allows you to update an existing resource.
Modifying the quickstart template to allow updates looks something like this (I removed the subscription deployment to make it a bit shorter). Keep in mind that some parameters, like partitioning, cannot be modified; you must delete the resource and redeploy to change them.
{
  "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentTemplate.json#",
  "contentVersion": "1.0.0.0",
  "parameters": {
    "service_BusNamespace_Name": {
      "type": "String",
      "metadata": { "description": "Name of the Service Bus namespace" }
    },
    "serviceBusTopicName": {
      "type": "String",
      "metadata": { "description": "Name of the Topic" }
    },
    "serviceBusSubscriptionName": {
      "type": "String",
      "metadata": { "description": "Name of the Subscription" }
    },
    "location": {
      "defaultValue": "[resourceGroup().location]",
      "type": "String",
      "metadata": { "description": "Location for all resources." }
    }
  },
  "variables": {
    "defaultSASKey_Name": "RootManageSharedAccessKey",
    "authRuleResource_Id": "[resourceId('Microsoft.ServiceBus/namespaces/authorizationRules', parameters('service_BusNamespace_Name'), variables('defaultSASKey_Name'))]",
    "sbVersion": "2017-04-01"
  },
  "resources": [
    {
      "type": "Microsoft.Resources/deployments",
      "apiVersion": "2015-01-01",
      "name": "updateTopic",
      "properties": {
        "mode": "Incremental",
        "parameters": {},
        "template": {
          "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentTemplate.json#",
          "contentVersion": "1.0.0.0",
          "parameters": {},
          "variables": {},
          "resources": [
            {
              "apiVersion": "2017-04-01",
              "name": "[parameters('service_BusNamespace_Name')]",
              "type": "Microsoft.ServiceBus/namespaces",
              "location": "[parameters('location')]",
              "sku": { "name": "Standard" },
              "resources": [
                {
                  "apiVersion": "2017-04-01",
                  "name": "[parameters('serviceBusTopicName')]",
                  "type": "Topics",
                  "dependsOn": [
                    "[concat('Microsoft.ServiceBus/namespaces/', parameters('service_BusNamespace_Name'))]"
                  ],
                  "properties": {
                    "defaultMessageTimeToLive": "P10675199DT2H48M5.4775807S",
                    "maxSizeInMegabytes": "1024",
                    "requiresDuplicateDetection": "false",
                    "duplicateDetectionHistoryTimeWindow": "PT10M",
                    "enableBatchedOperations": "false",
                    "supportOrdering": "false",
                    "autoDeleteOnIdle": "P10675199DT2H48M5.4775807S",
                    "enablePartitioning": "false",
                    "enableExpress": "false"
                  }
                }
              ]
            }
          ]
        }
      }
    }
  ],
  "outputs": {
    "NamespaceConnectionString": {
      "type": "String",
      "value": "[listkeys(variables('authRuleResource_Id'), variables('sbVersion')).primaryConnectionString]"
    },
    "SharedAccessPolicyPrimaryKey": {
      "type": "String",
      "value": "[listkeys(variables('authRuleResource_Id'), variables('sbVersion')).primaryKey]"
    }
  }
}
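Alternatively, if you only need to add or update the topic without redeclaring the namespace, a child-resource declaration should also work (a sketch reusing the parameters above; not part of the original answer):

```json
{
  "type": "Microsoft.ServiceBus/namespaces/topics",
  "apiVersion": "2017-04-01",
  "name": "[concat(parameters('service_BusNamespace_Name'), '/', parameters('serviceBusTopicName'))]",
  "properties": {
    "defaultMessageTimeToLive": "P10675199DT2H48M5.4775807S",
    "maxSizeInMegabytes": 1024
  }
}
```

Because the namespace itself is not part of the deployment, its current SKU and settings are left untouched, which avoids touching properties the backend refuses to change.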
Usually you get "conflict in backend" errors when you request an operation that isn't allowed in the resource's current state.
For example, once you upgrade a Service Bus namespace to Standard, deployments will fail if they try to downgrade it to Basic.
Effectively, the only solution in those cases is to re-provision the resource from scratch.

Azure DF CopyFromBlob Failing

I am trying to copy files from an Azure Blob to an Azure Data Lake using a data factory. I keep running into this error and am not finding any information on what the parameter 'baseUri' maps to:
"errorCode": "2200",
"message": "ErrorCode=InvalidParameter,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=The value of the property 'baseUri' is invalid: 'Value cannot be null.\r\nParameter name: baseUri'.,Source=,''Type=System.ArgumentNullException,Message=Value cannot be null.\r\nParameter name: baseUri,Source=Microsoft.DataTransfer.MsiStoreServiceClient,'",
"failureType": "UserError",
"target": "CopyFromBlob"
I am using PowerShell with JSON files. Anything obvious that I am missing here?
Azure Data Lake Linked Service
{
  "name": "<redacted>",
  "properties": {
    "type": "AzureDataLakeStore",
    "typeProperties": {
      "dataLakeStoreUri": "<redacted>",
      "tenant": "<redacted>",
      "subscriptionId": "<redacted>",
      "resourceGroupName": "<redacted>"
    },
    "connectVia": {
      "referenceName": "<redacted>",
      "type": "IntegrationRuntimeReference"
    }
  }
}
Azure Blob Linked Service:
{
  "name": "<redacted>",
  "properties": {
    "type": "AzureStorage",
    "typeProperties": {
      "connectionString": {
        "type": "SecureString",
        "value": "DefaultEndpointsProtocol=https;AccountName=<redacted>;AccountKey=<redacted>"
      }
    },
    "connectVia": {
      "referenceName": "<redacted>",
      "type": "IntegrationRuntimeReference"
    }
  }
}
Data Lake Dataset
{
  "name": "<redacted>",
  "properties": {
    "type": "AzureDataLakeStoreFile",
    "linkedServiceName": {
      "referenceName": "<redacted>",
      "type": "LinkedServiceReference"
    },
    "typeProperties": {
      "folderPath": "<redacted>"
    }
  }
}
Blob DataSet
{
  "name": "<redacted>",
  "properties": {
    "type": "AzureBlob",
    "linkedServiceName": {
      "referenceName": "<redacted>",
      "type": "LinkedServiceReference"
    },
    "typeProperties": {
      "folderPath": "<redacted>"
    }
  }
}
Pipeline
{
  "name": "<redacted>",
  "properties": {
    "activities": [
      {
        "name": "CopyFromBlob",
        "type": "Copy",
        "inputs": [
          { "referenceName": "<redacted>", "type": "DatasetReference" }
        ],
        "outputs": [
          { "referenceName": "<redacted>", "type": "DatasetReference" }
        ],
        "typeProperties": {
          "source": { "type": "BlobSource" },
          "sink": { "type": "AzureDataLakeStoreSink" }
        }
      }
    ]
  }
}
Powershell does the following:
1. Create Data Factory
2. Create Azure Integration Runtime
3. Create Azure Data Lake Linked Service
4. Create Azure Blob Linked Service
5. Create Azure Blob Dataset
6. Create Azure Data Lake Dataset
7. Create pipeline
8. Invoke pipeline
Check your Azure integration runtime PowerShell command: are you defining a location and type? I ran into a very similar error (my null parameter was 'dictionary') when I forgot to define a location for my integration runtime.
Hope this helps!
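For reference, creating the Azure integration runtime with an explicit type and location would look something like this (Az.DataFactory cmdlet; the resource names and region are placeholders):

```powershell
Set-AzDataFactoryV2IntegrationRuntime `
    -ResourceGroupName "my-resource-group" `
    -DataFactoryName "my-data-factory" `
    -Name "AzureIR" `
    -Type Managed `
    -Location "East US"
```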
