Slack slash command - Accept media files (image, video, voice memo)

I've built a slash command on Slack that uses Slack's built-in file upload and accepts text together with media files, and posts the message to another channel if specific keywords are present.
However, the payload I receive for the command contains only the textual part of the message; the image/video/voice memo is left out.
Is it possible to get user-uploaded files through a slash command or a Slack bot?
How should I go about it?
I've tried adding the files:read and files:write scopes (together with the standard commands scope) and sent a message with an uploaded image and another with a voice memo (recorded via Slack).
In both cases I received only the text part of the command:
token=<TOKEN>&team_id=<TEAM_ID>&team_domain=<DOMAIN>&channel_id=<CHANNEL_ID>&channel_name=directmessage&user_id=<USER_ID>&
user_name=<USERNAME>&command=%2Fcreate&text=can+I+send+a+%23voice+%23memo&
api_app_id=<APP_ID>&is_enterprise_install=false&
response_url=https%3A%2F%2Fhooks.slack.com%2F<...>&trigger_id=<TRIGGER_ID>

I ended up using the Slack Bolt library, which supports all the newer Slack features and events.
With it you get access to the full message body, including the event attribute, whose files array contains the uploaded media.
Code sample:
const { App } = require('@slack/bolt');

const app = new App({
  token: process.env.SLACK_BOT_TOKEN,
  signingSecret: process.env.SLACK_SIGNING_SECRET,
  socketMode: true,
  appToken: process.env.SLACK_APP_TOKEN
});

// Listens to incoming messages that contain "hello"
app.message('hello', async ({ message, say, logger, body, event }) => {
  // notice event.files
  logger.info('Body', body, 'Files', event.files);
  await say(`Hey there <@${message.user}>!`);
});
This listener reacts to any message containing "hello", logs the files array within event, and replies with a friendly message.
Example files array logged by the logger (a 3-second voice memo in this case):
[
  {
    id: '<file-id>',
    created: 1671452738,
    timestamp: 1671452738,
    name: 'audio_message.webm',
    title: 'audio_message.webm',
    mimetype: 'audio/webm',
    filetype: 'webm',
    pretty_type: 'WebM',
    user: '<user-id>',
    user_team: '<team-id>',
    editable: false,
    size: 39033,
    mode: 'hosted',
    is_external: false,
    external_type: '',
    is_public: false,
    public_url_shared: false,
    display_as_bot: false,
    username: '',
    subtype: 'slack_audio',
    transcription: { status: 'processing' },
    url_private: 'https://files.slack.com/files-tmb/<ids>/audio_message_audio.mp4',
    url_private_download: 'https://files.slack.com/files-tmb/<ids>/download/audio_message_audio.mp4',
    duration_ms: 2421,
    aac: 'https://files.slack.com/files-tmb/<ids>/audio_message_audio.mp4',
    audio_wave_samples: [
      1, 0, 0, 0, 1, 1, 1, 0, 0, 0, 0, 0,
      0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0,
      0, 0, 0, 12, 37, 47, 54, 62, 68, 79, 80, 58,
      39, 30, 22, 18, 46, 82, 87, 90, 94, 100, 96, 72,
      51, 76, 77, 62, 38, 47, 51, 51, 60, 68, 70, 60,
      52, 42, 54, 66, 62, 61, 63, 50, 53, 55, 58, 47,
      40, 36, 32, 27, 23, 16, 11, 6, 4, 4, 4, 3,
      2, 2, 4, 5, 4, 3, 3, 3, 2, 2, 2, 2,
      1, 1, 1, 1
    ],
    media_display_type: 'audio',
    permalink: 'https://<workplace-id>.slack.com/files/<file-id>/audio_message.webm',
    permalink_public: 'https://slack-files.com/<file-id>',
    has_rich_preview: false,
    file_access: 'visible'
  }
]
I couldn't find any specific documentation about sending files to a Slack bot, but hopefully this helps.
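If you also need the file contents and not just the metadata, the entries in event.files can usually be fetched over HTTP by passing the bot token as a bearer token (files:read is required anyway). A minimal sketch, reusing the listener above; the axios package and the helper name downloadSlackFile are assumptions:

const axios = require('axios');

// Hypothetical helper: fetch the raw bytes of a file from event.files.
// Assumes the bot token has the files:read scope.
async function downloadSlackFile(file) {
  const response = await axios.get(file.url_private_download, {
    headers: { Authorization: `Bearer ${process.env.SLACK_BOT_TOKEN}` },
    responseType: 'arraybuffer' // raw bytes (image, video, voice memo)
  });
  return Buffer.from(response.data);
}

// Usage inside the message listener:
// const bytes = await downloadSlackFile(event.files[0]);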

Related

How to verify with Python (PyNaCl) a message signed by Solana wallet adapter (javascript)

I have signed a message using Solana's wallet adapter example:
import { useWallet } from '@solana/wallet-adapter-react';
import bs58 from 'bs58';
import React, { FC, useCallback } from 'react';
import { sign } from 'tweetnacl';

export const SignMessageButton: FC = () => {
  const { publicKey, signMessage } = useWallet();

  const onClick = useCallback(async () => {
    try {
      // `publicKey` will be null if the wallet isn't connected
      if (!publicKey) throw new Error('Wallet not connected!');
      // `signMessage` will be undefined if the wallet doesn't support it
      if (!signMessage) throw new Error('Wallet does not support message signing!');

      // Encode anything as bytes
      const message = new TextEncoder().encode("hello");
      // Sign the bytes using the wallet
      const signature = await signMessage(message);
      // Verify that the bytes were signed using the private key that matches the known public key
      if (!sign.detached.verify(message, signature, publicKey.toBytes())) throw new Error('Invalid signature!');

      alert(`Message signature: ${bs58.encode(signature)}`);
    } catch (error: any) {
      alert(`Signing failed: ${error?.message}`);
    }
  }, [publicKey, signMessage]);

  return signMessage ? (<button onClick={onClick} disabled={!publicKey}>Sign Message</button>) : null;
};
Pubkey used: DKpHyR1WjWE23E3xizPUhefZKmpMrMXNBVfoxQ7WXCRR
Message signed: hello
Signature received: 3EWDdtU1w8pWkr6fg8faJvKn1wBZmNjgf5kUx4Pn5gw4HeBPYVDm7cTHNqpRVMami6yX36jdaeZacv9GXR19Jzye
But I am not able to verify the signed message using Python 3.9 with PyNaCl and solana-py. I have tried the following:
from nacl.signing import VerifyKey
from solana.publickey import PublicKey
import base58

pubkey = bytes(PublicKey("DKpHyR1WjWE23E3xizPUhefZKmpMrMXNBVfoxQ7WXCRR"))
msg = bytes("hello", 'utf8')
signed = bytes("3EWDdtU1w8pWkr6fg8faJvKn1wBZmNjgf5kUx4Pn5gw4HeBPYVDm7cTHNqpRVMami6yX36jdaeZacv9GXR19Jzye", 'utf8')

result = VerifyKey(
    pubkey
).verify(
    smessage=base58.b58decode(msg),
    signature=base58.b58decode(signed)
)
But verification returns: nacl.exceptions.BadSignatureError: Signature was forged or corrupt.
Does anybody know what is wrong? Could it be an encoding problem? It seems JS uses the following byte values:
pubkey: Uint8Array(32) [144, 188, 240, 167, 187, 75, 30, 17, 232, 175, 91, 222, 73, 68, 183, 218, 108, 56, 249, 64, 250, 61, 111, 168, 194, 233, 159, 2, 247, 5, 175, 124, buffer: ArrayBuffer(32), byteLength: 32, byteOffset: 0, length: 32]
message: Uint8Array(44) [57, 85, 65, 81, 76, 53, 81, 68, 67, 89, 122, 70, 112, 107, 119, 70, 88, 52, 88, 75, 53, 70, 119, 107, 66, 54, 67, 57, 116, 57, 116, 120, 65, 89, 52, 102, 102, 122, 69, 52, 114, 97, 113, 84, buffer: ArrayBuffer(44), byteLength: 44, byteOffset: 0, length: 44]
signed: Uint8Array(64) [111, 173, 219, 10, 169, 113, 163, 35, 30, 162, 250, 243, 191, 106, 195, 99, 238, 34, 49, 192, 19, 92, 111, 142, 57, 31, 158, 235, 65, 219, 146, 176, 174, 48, 30, 255, 160, 90, 174, 179, 219, 197, 252, 189, 150, 225, 160, 133, 163, 109, 159, 80, 56, 191, 11, 1, 91, 111, 196, 214, 231, 84, 11, 1, buffer: ArrayBuffer(64), byteLength: 64, byteOffset: 0, length: 64]
And in Python:
pubkey: b'\xb7\x1e+\xef\xe19#y}\xa4L\xf2K\rK\xc3\xbby\x93\x1c\x00L\xe1<\x19g`-\x9d\xd5\xee\x94'
msg: b'Cn8eVZg'
signed: b'3EWDdtU1w8pWkr6fg8faJvKn1wBZmNjgf5kUx4Pn5gw4HeBPYVDm7cTHNqpRVMami6yX36jdaeZacv9GXR19Jzye'
Do I need to use a different encoding in Python?
Thanks for providing a concrete example; you're very close! The encoding is indeed the issue, but the pubkey is already correctly encoded in Python as bytes: its first byte \x90 is the same value as 144 in JS, which you can check in Python with int('90', 16) == 144. The problem is the message, which must not be base58-decoded. To verify the signature, you can use the base58 package (https://github.com/keis/base58) and do:
from nacl.signing import VerifyKey
from solana.publickey import PublicKey
import base58

pubkey = bytes(PublicKey("DKpHyR1WjWE23E3xizPUhefZKmpMrMXNBVfoxQ7WXCRR"))
msg = bytes("hello", 'utf8')
signed = bytes("3EWDdtU1w8pWkr6fg8faJvKn1wBZmNjgf5kUx4Pn5gw4HeBPYVDm7cTHNqpRVMami6yX36jdaeZacv9GXR19Jzye", 'utf8')

result = VerifyKey(
    pubkey
).verify(
    smessage=msg,
    signature=base58.b58decode(signed)
)
Note: you don't need base58 for smessage, because the message was encoded as plain UTF-8 bytes with new TextEncoder().encode("hello").
Second option: if you already have the Uint8Array values from JS, you can pass the raw bytes directly:
result = VerifyKey(
    bytes(PublicKey("HERE_THE_PUB_KEY"))
).verify(
    smessage=bytes([byte1, byte2, byte3, ...]),
    signature=bytes([byte1, byte2, byte3, ...])
)

Managing workflows with nodejs

I am trying to use Node.js to manage my workflows instead of using the console. The createWorkflow method returns a 200 code, but I cannot find the workflow in the console or when listing workflows. Any idea what is missing? The TypeScript typedef has too many partials to be useful for telling what is required and what is not. In the code below, the list method correctly returns the workflow names I created in the console, which suggests I am using the parent parameter correctly. However, when I call the create method with the name of a new workflow and some source code, I get the response below with a return code of 200, which should mean it was successful, yet the workflow is nowhere to be found. Trying to execute it also returns "not found".
Return value:
{
"_events": {},
"_eventsCount": 2,
"completeListeners": 0,
"hasActiveListeners": false,
"latestResponse": {
"name": "projects/xxxxx/locations/us-central1/operations/operation-1607572432499-5b6141fca342d-32b153bf-bfce0551",
"metadata": {
"type_url": "type.googleapis.com/google.cloud.workflows.v1beta.OperationMetadata",
"value": {
"type": "Buffer",
"data": [
10, 12, 8, 208, 183, 198, 254, 5, 16, 236, 212, 251, 180, 2, 26, 76,
112, 114, 111, 106, 101, 99, 116, 115, 47, 98, 117, 116, 116, 101, 114, 102,
108, 121, 45, 105, 116, 45, 55, 102, 55, 55, 98, 47, 108, 111, 99, 97,
116, 105, 111, 110, 115, 47, 117, 115, 45, 99, 101, 110, 116, 114, 97, 108,
49, 47, 119, 111, 114, 107, 102, 108, 111, 119, 115, 47, 116, 101, 115, 116,
95, 119, 111, 114, 107, 102, 108, 111, 119, 95, 118, 49, 34, 6, 99, 114,
101, 97, 116, 101, 42, 6, 118, 49, 98, 101, 116, 97
]
}
},
"done": false
},
"name": "projects/xxxxxxxxxxxx/locations/us-central1/operations/operation-1607572432499-5b6141fca342d-32b153bf-bfce0551",
"done": false,
"longrunningDescriptor": {
"operationsClient": {
"auth": {
"checkIsGCE": true,
"jsonContent": null,
"cachedCredential": {
"_events": {},
"_eventsCount": 0,
"transporter": {},
"credentials": {
"access_token": "ya29.c.KpcB6AdCAMGMr_FqY7veU-uQTAP2cenQDWOh3Msaw-CPjdodjeYKAEf7lw-m1joxmam06_4QgRJ5Atnlpcm7db37CAi0lz4LS5_KPkvaodE6oefkDChOly92BxyCfaJClrKqklcEbSt1yg-2iVwngXccgwtdko9sbM4UeUihNPYScdGY0bGT484x7Ai6e2vtAfZMC2r-DqGE-g",
"token_type": "Bearer",
"expiry_date": 1607573907456,
"refresh_token": "compute-placeholder"
},
"certificateCache": {},
"certificateExpiry": null,
"certificateCacheFormat": "PEM",
"refreshTokenPromises": {},
"eagerRefreshThresholdMillis": 300000,
"forceRefreshOnFailure": false,
"serviceAccountEmail": "default",
"scopes": [
"https://www.googleapis.com/auth/cloud-platform"
]
},
"_cachedProjectId": "xxxxxxxxxxxx",
"defaultScopes": [
"https://www.googleapis.com/auth/cloud-platform"
],
"_getDefaultProjectIdPromise": {}
},
"innerApiCalls": {},
"descriptor": {
"listOperations": {
"requestPageTokenField": "pageToken",
"responsePageTokenField": "nextPageToken",
"resourceField": "operations"
}
}
}
},
"result": null,
"metadata": {
"createTime": {
"seconds": "1607572432",
"nanos": 647948908
},
"target": "projects/xxxxxxxxxxxx/locations/us-central1/workflows/test_workflow_v1",
"verb": "create",
"apiVersion": "v1beta"
},
"backoffSettings": {
"initialRetryDelayMillis": 100,
"retryDelayMultiplier": 1.3,
"maxRetryDelayMillis": 60000,
"initialRpcTimeoutMillis": null,
"rpcTimeoutMultiplier": null,
"maxRpcTimeoutMillis": null,
"totalTimeoutMillis": null
},
"_callOptions": {
"timeout": 30000,
"retry": {
"retryCodes": [],
"backoffSettings": {
"initialRetryDelayMillis": 100,
"retryDelayMultiplier": 1.3,
"maxRetryDelayMillis": 60000,
"initialRpcTimeoutMillis": 60000,
"rpcTimeoutMultiplier": 1,
"maxRpcTimeoutMillis": 60000,
"totalTimeoutMillis": 600000
}
},
"autoPaginate": true,
"otherArgs": {
"headers": {
"x-goog-request-params": "parent=projects%2Fxxxxxxxxxxxx%2Flocations%2Fus-central1"
}
},
"bundleOptions": null,
"isBundling": true,
"apiName": "google.cloud.workflows.v1beta.Workflows"
}
}
Code:
engine: ExecutionsClient;
builder: WorkflowsClient;
location = 'us-central1';
projectId: string;

constructor() {
  this.engine = new ExecutionsClient();
  this.builder = new WorkflowsClient();
  this.projectId = serviceAccount.project_id;
}

async list(): Promise<any> {
  console.log(`list workflows`);
  const [resp] = await this.builder.listWorkflows({
    parent: this.builder.locationPath(this.projectId, this.location),
  });
  console.log(`listWorkflow: ${JSON.stringify(resp)}`);
  return resp;
}

async create(name: string, code: string): Promise<any> {
  console.log(`creating workflow named: ${name}`);
  const [resp] = await this.builder.createWorkflow({
    parent: this.builder.locationPath(this.projectId, this.location),
    workflow: {
      sourceContents: code,
    },
    workflowId: name
  });
  console.log(`createWorkflow: ${JSON.stringify(resp)}`);
  return resp;
}

async execute(name: string): Promise<any> {
  const [resp] = await this.engine.createExecution({
    parent: this.engine.workflowPath(this.projectId, this.location, name),
  });
  return resp;
}
After consulting the source for @google-cloud/workflows I found where the problem was: createWorkflow is a long-running operation, so it actually takes two successive awaits:
const [operation] = await client.createWorkflow(request);
const [response] = await operation.promise();
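Applied to the create method shown above, that means awaiting the returned operation before treating the workflow as created. A minimal sketch of the adjusted method, with error handling omitted and field names kept as in the original code:

async create(name: string, code: string): Promise<any> {
  // createWorkflow only starts a long-running operation
  const [operation] = await this.builder.createWorkflow({
    parent: this.builder.locationPath(this.projectId, this.location),
    workflow: {
      sourceContents: code,
    },
    workflowId: name,
  });
  // Wait for the operation to complete; only then does the workflow
  // actually exist, show up in listWorkflows, and become executable.
  const [workflow] = await operation.promise();
  return workflow;
}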

Page length not showing after adding export option in DataTables library

The page length dropdown is not showing after adding the export option in the DataTables library. This is my configuration:
dom: 'Bfrtip',
lengthMenu: [[10, 25, 50, -1], [10, 25, 50, "All"]], // page length options
buttons: [
  {
    extend: 'copy',
    exportOptions: {
      columns: [0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22]
    }
  },
Screenshot: https://nimb.ws/OlUrQ6
Change dom: 'Bfrtip' to 'lBfrtip'. The l character in the dom option is what renders the page-length control:
dom: 'lBfrtip',
lengthMenu: [[10, 25, 50, -1], [10, 25, 50, "All"]], // page length options
buttons: [
  {
    extend: 'copy',
    exportOptions: {
      columns: [0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22]
    }
  },
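For reference, a complete minimal initialization combining the length menu and the copy button could look like the sketch below; the table selector #example is a placeholder, and it assumes the DataTables Buttons extension is loaded:

$('#example').DataTable({
  // l = length menu, B = buttons, f = filter, r = processing,
  // t = table, i = info, p = pagination
  dom: 'lBfrtip',
  lengthMenu: [[10, 25, 50, -1], [10, 25, 50, "All"]],
  buttons: [
    {
      extend: 'copy',
      exportOptions: {
        columns: ':visible' // or an explicit list of column indexes, as above
      }
    }
  ]
});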

I need to filter elements in google.visualization.datatable used in Google Charts

I'm populating my Google DataTable using this code:
$.ajax({
  url: "Default2.aspx/GetChartData",
  data: "",
  dataType: "json",
  type: "POST",
  contentType: "application/json; charset=utf-8",
  success: function (data) {
    chartData = data.d;
  },
  error: function () {
    alert("Error loading data! Please try again.");
  }
}).done(function () {
  google.charts.setOnLoadCallback(drawChart);
});

function drawChart() {
  var data = google.visualization.arrayToDataTable(chartData);
Now I want to delete rows based on filters that check a particular value (a date) in each row of the DataTable.
The problem is that I can't find a documented method that lets me iterate over the row elements.
You can use a DataView to show only certain rows from the DataTable, using the getFilteredRows and setRows methods.
The getFilteredRows method returns an array of row indexes that meet certain criteria.
The criteria are an array of conditions; you pass the column index and the condition, e.g.:
{column: 0, minValue: 2016} --> first column must be >= 2016
{column: 0, value: 2017} --> first column must equal 2017
{column: 0, maxValue: 2015} --> first column must be <= 2015
See the following DataTable:
var data = google.visualization.arrayToDataTable([
  ['X', 'Y1', 'Y2'],
  [2010, 10, 14],
  [2011, 14, 22],
  [2012, 16, 24],
  [2013, 22, 30],
  [2014, 28, 36],
  [2015, 30, 44],
  [2016, 34, 42],
  [2017, 36, 44],
  [2018, 42, 50],
  [2019, 48, 56]
]);
To filter on the 'X' column (column 0), we could say:
data.getFilteredRows([{column: 0, minValue: 2016}])
You can use multiple criteria as well:
data.getFilteredRows([
  {column: 0, minValue: 2016},
  {column: 1, minValue: 40}
])
Then pass the returned row indexes to setRows on the DataView:
var view = new google.visualization.DataView(data);
view.setRows(data.getFilteredRows([
  {column: 0, minValue: 2016},
  {column: 1, minValue: 40}
]));
See the following working snippet:
google.charts.load('current', {
  callback: drawChart,
  packages: ['table']
});

function drawChart() {
  var data = google.visualization.arrayToDataTable([
    ['X', 'Y1', 'Y2'],
    [2010, 10, 14],
    [2011, 14, 22],
    [2012, 16, 24],
    [2013, 22, 30],
    [2014, 28, 36],
    [2015, 30, 44],
    [2016, 34, 42],
    [2017, 36, 44],
    [2018, 42, 50],
    [2019, 48, 56]
  ]);

  var view = new google.visualization.DataView(data);
  view.setRows(data.getFilteredRows([
    {column: 0, minValue: 2016},
    {column: 1, minValue: 40}
  ]));

  var container = document.getElementById('chart_div');
  var chart = new google.visualization.Table(container);
  chart.draw(view);
}
<script src="https://www.gstatic.com/charts/loader.js"></script>
<div id="chart_div"></div>
EDIT
You can filter on any type, including dates. Here is an example of filtering on an exact date:
google.charts.load('current', {
  callback: drawChart,
  packages: ['table']
});

function drawChart() {
  var data = google.visualization.arrayToDataTable([
    ['X', 'Y1', 'Y2'],
    [new Date(2016, 7, 1), 10, 14],
    [new Date(2016, 8, 1), 14, 22],
    [new Date(2016, 9, 1), 16, 24],
    [new Date(2016, 10, 1), 22, 30],
    [new Date(2016, 11, 1), 28, 36],
    [new Date(2017, 0, 1), 30, 44],
    [new Date(2017, 1, 1), 34, 42],
    [new Date(2017, 2, 1), 36, 44],
    [new Date(2017, 3, 1), 42, 50],
    [new Date(2017, 4, 1), 48, 56]
  ]);

  var view = new google.visualization.DataView(data);
  view.setRows(data.getFilteredRows([
    {column: 0, value: new Date(2016, 11, 1)}
  ]));

  var container = document.getElementById('chart_div');
  var chart = new google.visualization.Table(container);
  chart.draw(view);
}
<script src="https://www.gstatic.com/charts/loader.js"></script>
<div id="chart_div"></div>
Note: if the date values include specific time values, then you'll need to use a range to filter for a specific date:
google.charts.load('current', {
  callback: drawChart,
  packages: ['table']
});

function drawChart() {
  var data = google.visualization.arrayToDataTable([
    ['X', 'Y1', 'Y2'],
    [new Date(2017, 0, 1, 12, 35, 16), 30, 44],
    [new Date(2017, 0, 1, 14, 46, 10), 34, 42],
    [new Date(2017, 0, 1, 16, 12, 44), 36, 44],
    [new Date(2017, 0, 1, 17, 20, 47), 42, 50],
    [new Date(2017, 0, 1, 18, 23, 59), 48, 56],
    [new Date(2017, 0, 2, 12, 35, 16), 30, 44],
    [new Date(2017, 0, 2, 14, 46, 10), 34, 42],
    [new Date(2017, 0, 2, 16, 12, 44), 36, 44],
    [new Date(2017, 0, 2, 17, 20, 47), 42, 50],
    [new Date(2017, 0, 2, 18, 23, 59), 48, 56]
  ]);

  var view = new google.visualization.DataView(data);
  view.setRows(data.getFilteredRows([{
    column: 0,
    minValue: new Date(2017, 0, 1, 0, 0, 0),
    maxValue: new Date(2017, 0, 1, 23, 59, 59)
  }]));

  var container = document.getElementById('chart_div');
  var chart = new google.visualization.Table(container);
  chart.draw(view);
}
<script src="https://www.gstatic.com/charts/loader.js"></script>
<div id="chart_div"></div>

Set sparkline in Kendo grid

I want to use a sparkline in the "Usage" column, in such a way that the two sparkline series are drawn over each other.
There is a problem: when I click the Edit button, the sparkline disappears.
The same thing happens when I click on the "Usage" column.
Also, the tooltip is displayed badly and not in its regular position.
And why does the sparkline in the "Usage" column appear in only one row instead of in all rows?
jsFiddle code:
$(document).ready(function () {
//var ds = new kendo.data.DataSource({
// transport: {
// read: {
// url: '/api/clientssnapshot',
// dataType: 'json',
// type: 'get',
// cache: false
// },
// },
// batch: true,
// pageSize: 10,
// schema: {
// model: {
// fields: {
// Mac: { editable: false, nullable: true },
// RadioName: { type: "string", validation: { required: true } },
// ApName: { type: "string", validation: { required: true, min: 1 } },
// RemoteIp: { type: "boolean" },
// TX: { type: "number", validation: { min: 0, required: true } },
// RX: { type: "number", validation: { min: 0, required: true } },
// Signal: { type: "number", validation: { min: 0, required: true } },
// Uptime: { type: "number", validation: { min: 0, required: true } },
// }
// }
// }
//});
$('.table').kendoGrid({
dataSource: {
data: [{
"Mac": "D4:CA:6D:28:D1:05",
"RadioName": "D4CA6D28D105",
"ApName": "Om11",
"ApIp": "10.20.0.100",
"TX": 48,
"RX": 54,
"Signal": -64,
"Uptime": 797452,
"InRate": 0,
"OutRate": 0,
"AccountingId": 759,
"AccountingName": "فرشاد صفایی زاده",
"RemoteIp": "188.121.123.56",
"IsValidInScan": true,
"Comments": null,
"ApScanId": 26173,
"InRateHistory": "0, 0, 0, 0, 2, 0, 2, 16, 96, 16, 96, 16, 96, 113, 31, 113, 31, 113, 31, 0",
"OutRateHistory": "0, 5, 3, 5, 2, 5, 2, 35, 136, 35, 136, 35, 136, 164, 51, 164, 51, 164, 51, 4"
}, {
"Mac": "00:15:6D:BD:64:92",
"RadioName": "Behrooz Hoseyn",
"ApName": "Om11",
"ApIp": "10.20.0.100",
"TX": 48,
"RX": 18,
"Signal": -65,
"Uptime": 797446,
"InRate": 2,
"OutRate": 2,
"AccountingId": 750,
"AccountingName": "بهروز حسینی",
"RemoteIp": "188.121.123.48",
"IsValidInScan": true,
"Comments": null,
"ApScanId": 26173,
"InRateHistory": "0, 0, 0, 0, 2, 0, 2, 16, 96, 16, 96, 16, 96, 113, 31, 113, 31, 113, 31, 0",
"OutRateHistory": "0, 5, 3, 5, 2, 5, 2, 35, 136, 35, 136, 35, 136, 164, 51, 164, 51, 164, 51, 4"
}, {
"Mac": "00:15:6D:1E:B3:6C",
"RadioName": "UBNT",
"ApName": "Om11",
"ApIp": "10.20.0.100",
"TX": 54,
"RX": 24,
"Signal": -65,
"Uptime": 310336,
"InRate": 0,
"OutRate": 0,
"AccountingId": 820,
"AccountingName": "******",
"RemoteIp": "10.10.15.129",
"IsValidInScan": true,
"Comments": null,
"ApScanId": 26173,
"InRateHistory": "0, 0, 0, 0, 2, 0, 2, 16, 96, 16, 96, 16, 96, 113, 31, 113, 31, 113, 31, 0",
"OutRateHistory": "0, 5, 3, 5, 2, 5, 2, 35, 136, 35, 136, 35, 136, 164, 51, 164, 51, 164, 51, 4"
}, {
"Mac": "00:15:6D:1C:B1:89",
"RadioName": "Grous Tajhiz P",
"ApName": "Om11",
"ApIp": "10.20.0.100",
"TX": 48,
"RX": 6,
"Signal": -62,
"Uptime": 122116,
"InRate": 0,
"OutRate": 0,
"AccountingId": 595,
"AccountingName": "حمید شمس لواسانی",
"RemoteIp": "188.121.124.17",
"IsValidInScan": true,
"Comments": null,
"ApScanId": 26173,
"InRateHistory": "0, 0, 0, 0, 2, 0, 2, 16, 96, 16, 96, 16, 96, 113, 31, 113, 31, 113, 31, 0",
"OutRateHistory": "0, 5, 3, 5, 2, 5, 2, 35, 136, 35, 136, 35, 136, 164, 51, 164, 51, 164, 51, 4"
}, {
"Mac": "00:27:22:3E:91:12",
"RadioName": "Anbar Aminzade",
"ApName": "Om1",
"ApIp": "10.20.0.101",
"TX": 36,
"RX": 36,
"Signal": -68,
"Uptime": 1131461,
"InRate": 4,
"OutRate": 4,
"AccountingId": 977,
"AccountingName": "انبار شهید امین زاده ",
"RemoteIp": "188.121.123.31",
"IsValidInScan": true,
"Comments": null,
"ApScanId": 26173,
"InRateHistory": "0, 0, 0, 0, 2, 0, 2, 16, 96, 16, 96, 16, 96, 113, 31, 113, 31, 113, 31, 0",
"OutRateHistory": "0, 5, 3, 5, 2, 5, 2, 35, 136, 35, 136, 35, 136, 164, 51, 164, 51, 164, 51, 4"
}, {
"Mac": "00:15:6D:1A:59:D0",
"RadioName": "UBNT",
"ApName": "Om1",
"ApIp": "10.20.0.101",
"TX": 36,
"RX": 12,
"Signal": -73,
"Uptime": 734737,
"InRate": 2,
"OutRate": 2,
"AccountingId": 820,
"AccountingName": "******",
"RemoteIp": "10.10.15.76",
"IsValidInScan": true,
"Comments": null,
"ApScanId": 26173,
"InRateHistory": "0, 0, 0, 0, 2, 0, 2, 16, 96, 16, 96, 16, 96, 113, 31, 113, 31, 113, 31, 0",
"OutRateHistory": "0, 5, 3, 5, 2, 5, 2, 35, 136, 35, 136, 35, 136, 164, 51, 164, 51, 164, 51, 4"
}, {
"Mac": "00:15:6D:E2:2D:13",
"RadioName": "UBNT",
"ApName": "Om1",
"ApIp": "10.20.0.101",
"TX": 54,
"RX": 36,
"Signal": -72,
"Uptime": 848,
"InRate": 0,
"OutRate": 0,
"AccountingId": 820,
"AccountingName": "******",
"RemoteIp": "10.10.15.67",
"IsValidInScan": true,
"Comments": null,
"ApScanId": 26173,
"InRateHistory": "0, 0, 0, 0, 2, 0, 2, 16, 96, 16, 96, 16, 96, 113, 31, 113, 31, 113, 31, 0",
"OutRateHistory": "0, 5, 3, 5, 2, 5, 2, 35, 136, 35, 136, 35, 136, 164, 51, 164, 51, 164, 51, 4"
}, {
"Mac": "00:27:22:32:24:C9",
"RadioName": "UBNT",
"ApName": "Om7",
"ApIp": "10.20.0.100",
"TX": 36,
"RX": 24,
"Signal": -78,
"Uptime": 731588,
"InRate": 0,
"OutRate": 0,
"AccountingId": 820,
"AccountingName": "******",
"RemoteIp": "10.10.15.188",
"IsValidInScan": true,
"Comments": null,
"ApScanId": 26173,
"InRateHistory": "0, 0, 0, 0, 2, 0, 2, 16, 96, 16, 96, 16, 96, 113, 31, 113, 31, 113, 31, 0",
"OutRateHistory": "0, 5, 3, 5, 2, 5, 2, 35, 136, 35, 136, 35, 136, 164, 51, 164, 51, 164, 51, 4"
}, {
"Mac": "00:15:6D:FE:BB:E2",
"RadioName": "ketabforooshie",
"ApName": "Om7",
"ApIp": "10.20.0.100",
"TX": 54,
"RX": 36,
"Signal": -72,
"Uptime": 240361,
"InRate": 0,
"OutRate": 0,
"AccountingId": 533,
"AccountingName": "قاسم رضاپور",
"RemoteIp": "188.121.124.214",
"IsValidInScan": true,
"Comments": null,
"ApScanId": 26173,
"InRateHistory": "0, 0, 0, 0, 2, 0, 2, 16, 96, 16, 96, 16, 96, 113, 31, 113, 31, 113, 31, 0",
"OutRateHistory": "0, 5, 3, 5, 2, 5, 2, 35, 136, 35, 136, 35, 136, 164, 51, 164, 51, 164, 51, 4"
}, {
"Mac": "00:27:22:D2:86:56",
"RadioName": "UBNT",
"ApName": "Om7",
"ApIp": "10.20.0.100",
"TX": 48,
"RX": 12,
"Signal": -72,
"Uptime": 126430,
"InRate": 0,
"OutRate": 0,
"AccountingId": 1453,
"AccountingName": "حسن قربانی",
"RemoteIp": "188.121.123.154",
"IsValidInScan": true,
"Comments": null,
"ApScanId": 26173,
"InRateHistory": "0, 0, 0, 0, 2, 0, 2, 16, 96, 16, 96, 16, 96, 113, 31, 113, 31, 113, 31, 0",
"OutRateHistory": "0, 5, 3, 5, 2, 5, 2, 35, 136, 35, 136, 35, 136, 164, 51, 164, 51, 164, 51, 4"
}, {
"Mac": "00:27:22:78:A3:19",
"RadioName": "UBNT",
"ApName": "Om7",
"ApIp": "10.20.0.100",
"TX": 54,
"RX": 54,
"Signal": -56,
"Uptime": 58617,
"InRate": 0,
"OutRate": 0,
"AccountingId": 820,
"AccountingName": "******",
"RemoteIp": "10.10.15.39",
"IsValidInScan": true,
"Comments": null,
"ApScanId": 26173,
"InRateHistory": "0, 0, 0, 0, 2, 0, 2, 16, 96, 16, 96, 16, 96, 113, 31, 113, 31, 113, 31, 0",
"OutRateHistory": "0, 5, 3, 5, 2, 5, 2, 35, 136, 35, 136, 35, 136, 164, 51, 164, 51, 164, 51, 4"
}
]
},
sortable: true,
groupable: true,
selectable: true,
navigatable: true,
height: 500,
scrollable: true,
pageable: true,
columns: [{
field: "Mac",
title: "Mac",
width: 170
}, {
field: "RadioName",
title: "Radio",
width: 150
}, {
field: "ApName",
title: "Ap",
width: 80,
template: '#=ApName#'
}, {
field: "RemoteIp",
title: "Remote IP",
width: 160,
template: '#=RemoteIp#'
}, {
field: "AccountingName",
title: "Name",
width: 130,
template: ' #= AccountingName # '
}, {
field: "TX",
title: "TX",
width: 44
}, {
field: "RX",
title: "RX",
width: 50
}, {
field: "Signal",
title: "Signal",
width: 50
}, {
field: "Uptime",
title: "Uptime",
width: 78
}, {
field: "Usage",
title: "Usage",
template: '<span id="sparkline"></span>'
}, {
command: ["edit"],
title: " "
}],
editable: "popup",
});
$(".ref").click(function () {
$(".table").data("kendoGrid").dataSource.read();
});
$("#sparkline").kendoSparkline({
type: "area",
series: [{
name: "World",
data: [15.7, 16.7, 20, 23.5, 26.6, 24.8, 24.1, 20.1, 14.1, 8.6, 2.5, 3.5],
}, {
name: 'New York',
data: [0.7, 0.8, 5.7, 11.3, 17.0, 22.0, 24.8, 24.1, 20.1, 14.1, 8.6, 2.5],
}],
categoryAxis: {
categories: [2005, 2006, 2007, 2008, 2009, 2010, 2011, 2012, 2013, 2014, 2015]
}
});
<div class="span6 box gradient main_stting">
<div class="dataTables_filter" id="txtSearch">
<label>Search:
<input type="text" aria-controls="DataTables_Table_0">
</label>
</div>
<div class="title">
<button class="btn ref" type="submit">Refresh</button>
<h3></h3>
</div>
<div class="content">
<div class="table"></div>
</div>
Thank you.
After an edit, the HTML elements are destroyed and recreated when the Grid updates, so you will need to recreate your sparklines. It is basically the same as this issue.
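One way to do that, sketched below assuming the grid above stays otherwise unchanged, is to render the sparkline container with a class instead of a repeated id (a repeated id is also the likely reason only one row shows a sparkline) and to (re)build the sparklines in the grid's dataBound event, which fires again after every edit or refresh. The class name usage-spark is hypothetical, and feeding the per-row InRateHistory/OutRateHistory strings into the series is an assumption:

// In the grid configuration above, change the Usage column template so each row
// gets its own element:
//   { field: "Usage", title: "Usage", template: '<span class="usage-spark"></span>' }
// and add a dataBound handler next to sortable/groupable/etc.:
dataBound: function (e) {
  var grid = e.sender;
  grid.tbody.find(".usage-spark").each(function () {
    // Look up the data item for this row and build its sparkline
    var dataItem = grid.dataItem($(this).closest("tr"));
    $(this).kendoSparkline({
      type: "area",
      series: [
        { name: "In", data: dataItem.InRateHistory.split(",").map(Number) },
        { name: "Out", data: dataItem.OutRateHistory.split(",").map(Number) }
      ]
    });
  });
},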
