I am trying to use the 'crypto' module in IBM API Connect GatewayScript. When I tested whether the crypto module is supported in GatewayScript or not, I got the response below.
Code in GatewayScript:
var crypto = require('crypto');
session.output.write(crypto);
Output:
{
"getHashes": {},
"getCiphers": {},
"createHash": {},
"createHmac": {},
"createSign": {},
"createVerify": {},
"createCipheriv": {},
"createDecipheriv": {},
"randomBytes": {}
}
But when I tried to make use of it, I got a 500 Internal Server Error:
Code:
var crypto = require('crypto');
var key = "Alice";
var hmac = crypto.createHmac('hmac-sha256', key);
var input = "This is plaintext to hash";
var result = hmac.update(input).digest('base64');
session.output.write(result);
Output:
{
"httpCode": "500",
"httpMessage": "Internal Server Error",
"moreInformation": "Internal Error"
}
I am not sure where things are going wrong; I am copy-pasting the exact example from the IBM website. Here is the reference for crypto: https://www.ibm.com/support/knowledgecenter/SS9H2Y_7.7.0/com.ibm.dp.doc/crypto_js.html#crypto.createHmac
By using var key = "Alice"; you tell DataPower to use the shared key stored under the alias 'Alice'.
If you want to use the literal 'Alice' string, then you need to use a Buffer, like var key = new Buffer("Alice");
Nevertheless, it won't work, as HMAC expects a 160-bit key for hmac-sha1.
You can generate one like this:
$ dd if=/dev/random count=20 bs=1 | xxd -ps
a73e3406e7dcc5fc168d9ae9954ec6e0d85e4444
count=20 means 20 bytes (20 × 8 bits = 160 bits).
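For example, a minimal sketch that passes such a raw key directly as a Buffer (the hex value is just the sample output of the dd command above, and the Buffer 'hex' encoding plus the chained digest call are assumed to work as in the question's own example):
var crypto = require('crypto');
// 20-byte (160-bit) key from the dd command above, passed as raw bytes rather than as a key alias
var key = new Buffer('a73e3406e7dcc5fc168d9ae9954ec6e0d85e4444', 'hex');
var hmac = crypto.createHmac('hmac-sha1', key);
session.output.write(hmac.update('This is plaintext to hash').digest('base64'));
If the same length rule applies to hmac-sha256 as used in the question, generate a correspondingly longer key (count=32 for 256 bits) the same way.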
If you want to store it in a shared secret object instead, you can follow what's described here: http://rcbj.net/blog01/2012/03/17/generating-and-uploading-a-shared-key-symmetric-key-to-datapower-appliances/
Put the hex string generated by this command into a file called secret.key.
Upload the key to the cert:/// directory on the appliance.
Navigate to Objects->Crypto Configuration->Crypto Shared Secret Key.
Click Add.
Enter a name for the shared key.
From the drop-down, choose the secret.key file that was uploaded a moment ago.
Click Apply.
If no errors are displayed, the key was successfully read.
Click Save.
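Once the shared secret key object exists, the GatewayScript from the question can reference it by the object name, as described at the top of this answer. A minimal sketch, assuming you named the key object 'Alice':
var crypto = require('crypto');
// 'Alice' is the name of the Crypto Shared Secret Key object configured above
var hmac = crypto.createHmac('hmac-sha256', 'Alice');
session.output.write(hmac.update('This is plaintext to hash').digest('base64'));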
I have a Nuxt application that needs to retrieve some information from a Spring Boot-based auth service.
Right now I sign a text message in the Nuxt app (the auth server knows that text message) using node-forge, and then I send it, encrypted and with the signature, to the auth service for verification.
The problem is that the auth service keeps telling me that the size of the signature is wrong, with a java.security.SignatureException: Signature length not correct: got 3XX but was expecting 256.
Here is the code generating the encrypted message and signature on the Nuxt side:
var md = forge.md.sha256.create();
md.update("123"); // for example purposes
var sign = pPrivateKey.sign(md);
var digestBytes = md.digest().bytes();
console.log("Signature:", sign );
console.log("Encrypted:", digestBytes);
console.log("Encrypted B64:", Buffer.from(digestBytes).toString("base64"));
var keyAuthB64Url = Buffer.from(digestBytes).toString("base64url");
var signB64Url = Buffer.from(sign).toString("base64url");
var jwt = await axios.get(process.env.URL + "/auth", { params: { encrypted: keyAuthB64Url, signature: signB64Url } });
On the auth service I have the following code:
byte[] messageBytes = Base64.getUrlDecoder().decode(encryptedMessage);
byte[] signatureBytes = Base64.getUrlDecoder().decode(signature);
Signature sign = Signature.getInstance("SHA256withRSA");
sign.initVerify(certPublicKey);
sign.update(messageBytes);
boolean verified = sign.verify(signatureBytes);
if (!verified) {
throw new Exception("Not verified!");
}
From all the debugging I have done, it seems like the Spring Boot app has a problem with the signature generated by node-forge on the Nuxt side; with a signature generated in the Spring Boot app itself, the verification works.
There are several issues:
First, the bug that was already mentioned in the comment: while the NodeJS code hashes explicitly, the Java side hashes implicitly (SHA256withRSA). Therefore, hashing must not be done explicitly on the Java side, and the raw message must be passed to update():
byte[] messageBytes = "123".getBytes("utf-8");
...
sign.update(messageBytes); // Fix 1: Don't hash
Also, in the NodeJS code, sign() (and likewise md.digest().bytes()) returns the data as a binary ('bytes') string, which must therefore be imported into a NodeJS Buffer with the 'binary' encoding:
var keyAuthB64Url = Buffer.from(digestBytes, "binary").toString("base64url"); // Fix 2: Import via 'binary' encoding
var signB64Url = Buffer.from(sign, "binary").toString("base64url"); // the same applies to the signature
Without explicit specification of the encoding, a UTF-8 encoding is performed by default, which irreversibly corrupts the data.
And third, latin1 is implicitly used as the encoding when generating the hash in the NodeJS code. Other encodings must be specified explicitly, e.g. utf8 for the common UTF-8:
md.update("123", "utf8"); // Fix 3: Specify the encoding
For the example data 123 used here, this fix has no effect, which changes as soon as characters with a Unicode value larger than 0x7f are included, e.g. 123§. Note that there is little margin for error in the specification of the encoding, e.g. utf-8 would be ignored (because of the hyphen) and latin1 would be used silently.
With these fixes, verification with the Java code works.
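Putting the three fixes together, the Nuxt-side signing code might look like this sketch (pPrivateKey, axios and process.env.URL are taken as given from the question):
var md = forge.md.sha256.create();
md.update("123", "utf8"); // Fix 3: specify the encoding
var sign = pPrivateKey.sign(md); // binary ("bytes") string
var digestBytes = md.digest().bytes(); // also a binary string
// Fix 2: import the binary strings into Buffers with the 'binary' encoding
var keyAuthB64Url = Buffer.from(digestBytes, "binary").toString("base64url");
var signB64Url = Buffer.from(sign, "binary").toString("base64url");
var jwt = await axios.get(process.env.URL + "/auth", { params: { encrypted: keyAuthB64Url, signature: signB64Url } });
On the Java side, remember Fix 1: verify against the raw message bytes ("123".getBytes("utf-8")), not against the decoded digest.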
In creating a simple program, I can't get Solana to use the devnet for its RPC connection. I keep getting the following error:
{
blockhash: '7TTVjRKApwAqP1SA7vZ2tQHuh6QbnToSmVUA9kc7amEY',
lastValidBlockHeight: 129662699
}
Error: failed to get recent blockhash: FetchError: request to http://localhost:8899/ failed, reason: connect ECONNREFUSED 127.0.0.1:8899
at Connection.getRecentBlockhash (/home/simeon/dev/freelance/niels_vacancies/node_modules/@solana/web3.js/lib/index.cjs.js:6584:13)
even though I have set all of my settable constants like ANCHOR_PROVIDER_URL=https://api.devnet.solana.com, or the relevant entries in my Anchor.toml file. I also explicitly specify the following:
const connection = new anchor.web3.Connection("https://api.devnet.solana.com/", {commitment: "max"});
const wallet = anchor.Wallet.local();
const provider = new anchor.Provider(
    connection,
    wallet,
    {
        commitment: "max",
        preflightCommitment: "max",
        skipPreflight: false
    }
)
I even test console.log(await anchor.getProvider().connection.getLatestBlockhash()); to ensure that I can, in fact, get a blockhash from the devnet. What can I do to force the RPC calls to do so too?
You just have to set the Anchor.toml cluster to devnet and [programs.devnet], and then deploy the program using a wallet with devnet SOL. I will drop an Anchor.toml for devnet below.
[features]
seeds = false
[programs.devnet]
first_program = "FPT...bd3"
[registry]
url = "https://anchor.projectserum.com"
[provider]
cluster = "devnet"
wallet = "PATH/TO/WALLET/WHO/WILL/PAY/FOR/DEPLOY.json"
[scripts]
test = "yarn run ts-mocha -p ./tsconfig.json -t 1000000 tests/**/*.ts"
In this case, first_program is the program ID declared in the declare_id! macro.
Then you can use your test file as normal with anchor.setProvider(anchor.Provider.env());
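With that Anchor.toml in place, the standard Anchor CLI flow should target devnet automatically (assuming the wallet configured above holds devnet SOL):
anchor build
anchor deploy
anchor test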
If you have already updated the Anchor.toml to use devnet and are still having this issue with program.provider.connection.whatever or program.account.whatever.fetch.whatever, make sure that you have set the Anchor provider BEFORE creating the program, e.g.:
const provider = AnchorProvider.env();
anchor.setProvider(provider);
must come before the line
const program: Program<Whatever> = workspace.Whatever;
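For example, the top of a test file might look like this sketch (Whatever is a placeholder for your program's name/IDL type, and the package import assumes a recent @project-serum/anchor release):
import * as anchor from "@project-serum/anchor";
import { Program, AnchorProvider, workspace } from "@project-serum/anchor";

// Reads ANCHOR_PROVIDER_URL and ANCHOR_WALLET from the environment
const provider = AnchorProvider.env();
anchor.setProvider(provider);

// Only after setProvider; otherwise the program is built against the default provider
const program: Program<Whatever> = workspace.Whatever;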
I have been trying to upload an image in chunks with client-side streaming using grpcurl. The service is working without error, except that, at the server, the image data received is 0 bytes.
The command I am using is:
grpcurl -proto image_service.proto -v -d @ -plaintext localhost:3010 imageservice.ImageService.UploadImage < out
This link mentions that the chunk data should be base64 encoded, so the contents of my out file are:
{"chunk_data": "<base64 encoded image data>"}
This is exactly what I am trying to achieve, but using grpcurl.
Please tell me what is wrong with my command and what the best way is to achieve streaming via grpcurl.
I have 2 more questions:
Does gRPC handle the splitting of data into chunks?
How can I first send a meta-data chunk (ImageInfo type) and then the actual image data via grpcurl?
Here is my proto file:
syntax = "proto3";
package imageservice;
import "google/protobuf/wrappers.proto";
option go_package = "...";
service ImageService {
  rpc UploadImage(stream UploadImageRequest) returns (UploadImageResponse) {}
}
message UploadImageRequest {
  oneof data {
    ImageInfo info = 1;
    bytes chunk_data = 3;
  }
}
message ImageInfo {
  string unique_id = 1;
  string image_type = 2;
}
message UploadImageResponse {
  string url = 1;
}
Interesting question. I've not tried streaming messages with (the excellent) grpcurl.
The documentation does not explain how to do this, but this issue shows how to stream using stdin.
I recommend you try it that way first to ensure that it works for you.
If it does, then bundling the various messages into a file (out) should also work.
Your follow-on questions suggest you're doing this incorrectly.
chunk_data is the result of having split the file into chunks; i.e. each of these base64-encoded strings should be a subset of your overall image file (i.e. a chunk).
Your first message should be { "info": { ... } }, and subsequent messages will be { "chunk_data": "<base64-encoded chunk>" } until EOF.
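For example, with the proto above, the out file would contain one JSON object per stream message: the first carrying info, the rest carrying chunk_data (placeholders below, base64 content elided):
{"info": {"unique_id": "<unique id>", "image_type": "<image type>"}}
{"chunk_data": "<base64-encoded chunk 1>"}
{"chunk_data": "<base64-encoded chunk 2>"}
grpcurl reads the successive JSON messages from stdin (-d @) and closes the request stream at EOF.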
I have a webpage where the user can upload a PDF file and send it using AJAX to my Flask application. Here is the ajax code:
var formData = new FormData();
formData.append('attachment', document.getElementById("attachment").files[0]);
$.ajax({
    type: 'POST',
    url: 'process_award_storage',
    contentType: false,
    processData: false,
    data: formData
})
I do not think this has any problems, because I can print out the content and title of the file in the Python code. My Flask model is then defined as follows, using LargeBinary for the PDF attachment:
class AwardStore(db.Model):
    __tablename__ = 'awards_store'
    id = db.Column(db.Integer, primary_key=True)
    ...
    file = db.Column(db.LargeBinary, nullable=True)  # I am sure this is the right file type for PDF saving
Lastly, here is how the AJAX file is saved to the database:
award = AwardStore(name=name, file=request.files['attachment'].read())
db.session.add(award)
db.session.commit()
Using MySQL Workbench, I can see that it is saved. However, when I try to download the BLOB from there to the desktop and open it, it says "Failed to load PDF document". The same happens when I use Flask's send_file. It seems like I did everything as I saw online, but something is wrong.
Maybe these warnings are related?:
C:\Users\msolonko\Desktop\NICKFI~1\CACHTM~1\APP_DE~1\virt\lib\site-packages\pymysql\cursors.py:170: Warning: (1300, "Invalid utf8 character string: 'C4E5F2'")
result = self._query(query)
C:\Users\msolonko\Desktop\NICKFI~1\CACHTM~1\APP_DE~1\virt\lib\site-packages\pymysql\cursors.py:170: Warning: (1265, "Data truncated for column 'file' at row 1")
result = self._query(query)
I tried googling them and did not find anything. I appreciate any assistance.
EDIT:
I noticed that small files are typically uploaded and displayed properly. The issue is with files larger than roughly 50 kB. The database I am using is AWS RDS. Is there a setting I can change somewhere to enable greater sizes?
LargeBinary also accepts a length argument, which maps to MySQL's BLOB, whose default maximum length is 65535 bytes (64 KB).
Try increasing the length so that the column becomes (see the sketch below):
a MEDIUMBLOB for 16777215 bytes (16 MB), or
a LONGBLOB for 4294967295 bytes (4 GB).
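For example, a sketch of the changed column on the model from the question (whether MySQL ends up with a MEDIUMBLOB or a LONGBLOB for a given length is dialect behaviour, so verify the resulting schema):
file = db.Column(db.LargeBinary(length=(2**32) - 1), nullable=True)  # large enough to map to LONGBLOB
Alternatively, the MySQL-specific LONGBLOB type from sqlalchemy.dialects.mysql can be used directly. Either way, remember to migrate or recreate the table so the column type actually changes.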
Hope this will help
I am trying to integrate QnAmaker knowledge base with Azure Bot Service.
I am unable to find knowledge base id on QnAMaker portal.
How to find the kbid in QnAPortal?
The Knowledge Base ID can be located in Settings under "Deployment details" in your knowledge base. It is the GUID nestled between "knowledgebases" and "generateAnswer" in the POST request.
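For reference, the POST shown under Deployment details looks roughly like this (resource name, GUID and key are placeholders), and the GUID in the path is the knowledge base ID:
POST /knowledgebases/<knowledge-base-id>/generateAnswer
Host: https://<your-resource-name>.azurewebsites.net/qnamaker
Authorization: EndpointKey <endpoint-key>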
Hope this helps!
You can also use Python to get this; take a look at the following code. That is, if you want to write a program to dynamically fetch the KB IDs.
import http.client, json, sys

# Represents the various elements used to create the HTTP request path for QnA Maker operations.
# Replace these with your QnA Maker resource name (host) and a valid subscription key.
host = '<your-resource-name>.cognitiveservices.azure.com'
subscription_key = '<QnA-Key>'
get_kb_method = '/qnamaker/v4.0/knowledgebases/'

try:
    headers = {
        'Ocp-Apim-Subscription-Key': subscription_key,
        'Content-Type': 'application/json'
    }
    conn = http.client.HTTPSConnection(host)
    conn.request("GET", get_kb_method, None, headers)
    response = conn.getresponse()
    data = response.read().decode("UTF-8")

    result = None
    if len(data) > 0:
        result = json.loads(data)
        # print(json.dumps(result, sort_keys=True, indent=2))

    # Note: a 200 status code means the request succeeded.
    KB_id = result["knowledgebases"][0]["id"]
    print(response.status)
    print(KB_id)
except Exception:
    print("Unexpected error:", sys.exc_info()[0])
    print("Unexpected error:", sys.exc_info()[1])