Chainlink Node Job: requestData: while interpolating variables in JSON payload: invalid character '_' after top-level value: bad input for task - chainlink

The fetch statement in my job is causing an error.
Log of the job run:
fetch bridge
requestData: while interpolating variables in JSON payload: invalid character '_' after top-level value: bad input for task
name: graphql
requestData: {"jobRunId": $(jobSpec.externalJobId), "data": {"query": "{ reserve (id: $(decode_cbor.reserveId)) { paramsHistory(orderby: timestamp, orderDirection: desc, where: { timestamp_gt: $(decode_cbor.timestamp) }) { liquidityRate, } } }", "variables": null, "graphqlEndpoint": $(decode_cbor.graphqlEndpoint)}}
TOML Job:
type = "directrequest"
schemaVersion = 1
name = "Get 30-day Average Liquidity Rate 2"
contractAddress = "0x6eFc5873cB4eB9CE5024E9DeBA2139Aab235D84C"
maxTaskDuration = "0s"
observationSource = """
decode_log [type="ethabidecodelog"
abi="OracleRequest(bytes32 indexed specId, address requester, bytes32 requestId, uint256 payment, address callbackAddr, bytes4 callbackFunctionId, uint256 cancelExpiration, uint256 dataVersion, bytes data)"
data="$(jobRun.logData)"
topics="$(jobRun.logTopics)"]
decode_cbor [type="cborparse" data="$(decode_log.data)"]
fetch [type="bridge" name="graphql" requestData="{\\"jobRunId\\": $(jobSpec.externalJobId), \\"data\\": {\\"query\\": \\"
{
reserve (id: $(decode_cbor.reserveId)) {
paramsHistory(orderby: timestamp, orderDirection: desc, where: { timestamp_gt: $(decode_cbor.timestamp) }) {
liquidityRate,
}
}
}\\", \\"variables\\": null, \\"graphqlEndpoint\\": $(decode_cbor.graphqlEndpoint)}}"]
parse [type="jsonparse" path="data,liquidityRate" data="$(fetch)"]
mean [type="mean" values="$(parse)" precision=0]
encode_data [type="ethabiencode" abi="(uint256 value)" data="{ \\"value\\": $(mean) }"]
encode_tx [type="ethabiencode"
abi="fulfillOracleRequest(bytes32 requestId, uint256 payment, address callbackAddress, bytes4 callbackFunctionId, uint256 expiration, bytes32 data)"
data="{\\"requestId\\": $(decode_log.requestId), \\"payment\\": $(decode_log.payment), \\"callbackAddress\\": $(decode_log.callbackAddr), \\"callbackFunctionId\\": $(decode_log.callbackFunctionId), \\"expiration\\": $(decode_log.cancelExpiration), \\"data\\": $(encode_data)}"
]
submit_tx [type="ethtx" to="0x6eFc5873cB4eB9CE5024E9DeBA2139Aab235D84C" data="$(encode_tx)"]
decode_log -> decode_cbor -> fetch -> parse -> mean -> encode_data -> encode_tx -> submit_tx
"""
It seems the requestData part of my fetch statement is hitting an _ and rejecting it. I am not sure how to escape _ within the job.
Any help would be appreciated.
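For what it's worth, the failure is reproducible outside the node: the bridge's requestData template interpolates variables verbatim, so a bare string value containing _ yields invalid JSON; string variables generally need to be wrapped in escaped quotes (\\"$(decode_cbor.reserveId)\\") in the template. A minimal Python sketch, using a hypothetical reserveId value:

```python
import json

# Hypothetical value for $(decode_cbor.reserveId); any bare string
# containing '_' makes the interpolated payload invalid JSON.
reserve_id = "0xb5_usdc"

unquoted = '{"reserveId": %s}' % reserve_id   # what an unquoted $(...) produces
quoted = '{"reserveId": "%s"}' % reserve_id   # template written as \\"$(...)\\"

try:
    json.loads(unquoted)
except json.JSONDecodeError as exc:
    print("rejected:", exc)  # bare value is not valid JSON

print(json.loads(quoted)["reserveId"])  # 0xb5_usdc
```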

Related

One to many inserting

I have two models: Order and OrderItem. I need to insert an order and its items from the same request, e.g.:
{
    "user_id": "1",
    "total_price": "200",
    "items": [
        {"product_id": 1, "quantity": 10},
        {"product_id": 2, "quantity": 5},
        {"product_id": 3, "quantity": 3}
    ]
}
These are the Order and OrderItem models:
=========================== Order ===========================
type Order struct {
    ID          int       `boil:"id" json:"id" toml:"id" yaml:"id"`
    OrderNumber string    `boil:"order_number" json:"order_number" toml:"order_number" yaml:"order_number"`
    OrderDate   time.Time `boil:"order_date" json:"order_date" toml:"order_date" yaml:"order_date"`
    Status      string    `boil:"status" json:"status" toml:"status" yaml:"status"`
    Note        string    `boil:"note" json:"note" toml:"note" yaml:"note"`
    UserID      int       `boil:"user_id" json:"user_id" toml:"user_id" yaml:"user_id"`
    CreatedAt   time.Time `boil:"created_at" json:"created_at" toml:"created_at" yaml:"created_at"`
    UpdatedAt   time.Time `boil:"updated_at" json:"updated_at" toml:"updated_at" yaml:"updated_at"`

    R *orderR `boil:"-" json:"-" toml:"-" yaml:"-"`
    L orderL  `boil:"-" json:"-" toml:"-" yaml:"-"`
}
=========================== OrderItem ===========================
type OrderItem struct {
    ID           int       `boil:"id" json:"id" toml:"id" yaml:"id"`
    OrderID      int       `boil:"order_id" json:"order_id" toml:"order_id" yaml:"order_id"`
    ProductID    int       `boil:"product_id" json:"product_id" toml:"product_id" yaml:"product_id"`
    ProductPrice float64   `boil:"product_price" json:"product_price" toml:"product_price" yaml:"product_price"`
    ProductName  string    `boil:"product_name" json:"product_name" toml:"product_name" yaml:"product_name"`
    Quantity     int       `boil:"quantity" json:"quantity" toml:"quantity" yaml:"quantity"`
    Discount     float64   `boil:"discount" json:"discount" toml:"discount" yaml:"discount"`
    Note         string    `boil:"note" json:"note" toml:"note" yaml:"note"`
    CreatedAt    time.Time `boil:"created_at" json:"created_at" toml:"created_at" yaml:"created_at"`
    UpdatedAt    time.Time `boil:"updated_at" json:"updated_at" toml:"updated_at" yaml:"updated_at"`

    R *orderItemR `boil:"-" json:"-" toml:"-" yaml:"-"`
    L orderItemL  `boil:"-" json:"-" toml:"-" yaml:"-"`
}
What do people usually do? Is there a way to do this quickly with sqlboiler?
I don't think you need a single combined insert: you can insert the Order model and the OrderItem models in separate steps. This also gives the client more options; if you later need to update an order or add a new item to an existing order, you will need such an API anyway.
Create some models like
type OrderItemInput struct {
    ProductId int `json:"product_id"`
    Quantity  int `json:"quantity"`
}

type OrderInsertInput struct {
    UserID     int              `json:"user_id"`
    TotalPrice float64          `json:"total_price"`
    Items      []OrderItemInput `json:"items"`
}
Create the new Order from the UserID and TotalPrice fields of OrderInsertInput. Once you have the OrderID, create an OrderItem for each OrderItemInput using that OrderID.
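The two steps above (insert the order, then insert each item with the returned order id, inside one transaction) are language-agnostic; here is a minimal Python sketch using the stdlib sqlite3 module, with table and column names as hypothetical stand-ins for the sqlboiler models:

```python
import sqlite3

# Insert the parent order, reuse its generated id for the child items,
# all inside one transaction so a failed item insert rolls back the order.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (id INTEGER PRIMARY KEY, user_id INT, total_price REAL);
    CREATE TABLE order_items (id INTEGER PRIMARY KEY, order_id INT,
                              product_id INT, quantity INT);
""")

payload = {"user_id": 1, "total_price": 200.0,
           "items": [{"product_id": 1, "quantity": 10},
                     {"product_id": 2, "quantity": 5}]}

with conn:  # commits on success, rolls back on any exception
    cur = conn.execute(
        "INSERT INTO orders (user_id, total_price) VALUES (?, ?)",
        (payload["user_id"], payload["total_price"]))
    order_id = cur.lastrowid  # the generated OrderID
    conn.executemany(
        "INSERT INTO order_items (order_id, product_id, quantity) VALUES (?, ?, ?)",
        [(order_id, it["product_id"], it["quantity"]) for it in payload["items"]])

print(conn.execute("SELECT COUNT(*) FROM order_items").fetchone()[0])  # 2
```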

How to deduct funds from users withdrawals in Django Rest Framework

I am building a logistics web application using DRF, and I want to deduct a fee from each withdrawal. Say a user withdraws $200: they should receive $180, i.e. $20 is deducted from each withdrawal. But when I try it through the API, I get $220 instead of $180, which means my code is adding the fee to withdrawals instead of subtracting it.
Below is the code from my models:
class UserWallet(models.Model):
    wallet_id = models.UUIDField(unique=True, default=uuid.uuid4)
    user = models.OneToOneField("accounts.User", on_delete=models.CASCADE, related_name="wallet")
    currency = models.CharField(max_length=10, default="NGN")
    created_at = models.DateTimeField(default=timezone.now)

    def get_balance(self):
        query = (Q(status="success") | Q(status="processing")) & Q(wallet=self)
        balance = WalletTransaction.objects.filter(query).aggregate(Sum('amount'))
        return balance

    def to_dict(self):
        balance = self.get_balance()["amount__sum"]
        return {
            "wallet_id": self.wallet_id,
            "balance": f"{balance:.2f}" if balance else "0.00",
            "currency": self.currency,
            "currency_symbol": "₦"
        }

    def get_earnings(self):
        total = WalletTransaction.objects.filter(
            wallet=self, status="success", transaction_type="payment").aggregate(Sum('amount'))
        return total

    def get_withdrawals(self):
        total = WalletTransaction.objects.filter(
            wallet=self, status="success", transaction_type="withdrawal").aggregate(Sum('amount'))
        return total

    def get_transfers(self):
        total = WalletTransaction.objects.filter(
            wallet=self, status="success", transaction_type="transfer").aggregate(Sum('amount'))
        return total

    def get_deposits(self):
        total = WalletTransaction.objects.filter(
            wallet=self, status="success", transaction_type="deposit").aggregate(Sum('amount'))
        return total
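Note that get_balance() is just aggregate(Sum('amount')) over the wallet's transactions, so the balance only drops if debit rows are stored with negative amounts. A minimal sketch of that invariant in plain Python (figures are hypothetical):

```python
# Balance = sum of signed transaction amounts, mirroring the
# aggregate(Sum('amount')) in get_balance().
transactions = [
    {"type": "deposit", "amount": 500.0},      # credit: positive
    {"type": "withdrawal", "amount": -200.0},  # debit: must be negative
]

balance = sum(t["amount"] for t in transactions)
print(balance)  # 300.0
```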
Code from the views:
@api_view(["POST"])
@transaction.atomic
def initialize_transaction(request):
    payload = request.data
    user = request.user
    wallet = UserWallet.objects.get(user=user)

    if payload["transaction_type"] == "deposit":
        ser = DepositSerializer(data=request.data)
        if not ser.is_valid():
            return ApiResponse(message=error_to_string(ser.errors), data=ser.errors, status_code=400).response()
        transaction = WalletTransaction.objects.create(
            wallet=wallet,
            amount=payload["amount"],
            description="WALLET TOP UP",
            transaction_type="deposit"
        )
        return Response({
            "status": True,
            "data": {
                "email": user.email,
                "amount": payload["amount"],
                "reference": transaction.reference
            }
        })

    if payload['transaction_type'] == "transfer":
        ser = TransferSerializer(data=request.data, context={"request": request})
        if not ser.is_valid():
            return ApiResponse(message=error_to_string(ser.errors), data=ser.errors, status_code=400).response()
        transaction = WalletTransaction.objects.create(
            wallet=wallet,
            amount=ser.validated_data["amount"]*-1,
            description=payload['description'],
            transaction_type="transfer",
            extras=payload
        )
        otp = OTP.generate_otp(wallet)
        task.otp_mail(request.user.email, {"code": otp})
        data = {
            "transaction_id": transaction.transaction_id,
        }
        return ApiResponse(data=data, message="otp sent").response()

    if payload['transaction_type'] == "withdrawal":
        ser = WithdrawSerializer(data=request.data, context={"request": request})
        if not ser.is_valid():
            return ApiResponse(message=error_to_string(ser.errors), data=ser.errors, status_code=400).response()
        payload['bank'] = ser.validated_data['bank'].json()
        transaction = WalletTransaction.objects.create(
            wallet=wallet,
            amount=ser.validated_data["amount"]*-1,
            description=payload['description'],
            transaction_type="withdrawal",
            extras=payload
        )
        otp = OTP.generate_otp(wallet)
        task.otp_mail(request.user.email, {"code": otp})
        data = {
            "transaction_id": transaction.transaction_id,
        }
        return ApiResponse(data=data, message="otp sent").response()

    return ApiResponse(status_code=400, message="Invalid TransactionType").response()
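Getting $220 instead of $180 suggests the fee is being added somewhere rather than subtracted. The intended arithmetic, as a hedged Python sketch (the flat $20 fee comes from the question's $200 to $180 example; the real fee rule may differ):

```python
WITHDRAWAL_FEE = 20.0  # implied by the question's $200 -> $180 example

def withdrawal_amounts(requested):
    """Return (payout_to_user, signed_wallet_entry) for a withdrawal."""
    payout = requested - WITHDRAWAL_FEE  # subtract the fee, never add it
    wallet_entry = -requested            # stored negative, like amount * -1
    return payout, wallet_entry

payout, entry = withdrawal_amounts(200.0)
print(payout, entry)  # 180.0 -200.0
```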

Chainlink: Contract cannot retrieve large-response data type from external adapter

I tried to use the large-response type to fulfil the request, but somehow the result does not show up in my contract. The job runs to completion on my Chainlink node, yet the status value never changes; it stays 0x. So I wonder: is my contract or my job spec wrong?
This is my contract:
// SPDX-License-Identifier: MIT
pragma solidity ^0.8.7;

import "@chainlink/contracts/src/v0.8/ChainlinkClient.sol";

contract APIConsumer is ChainlinkClient {
    using Chainlink for Chainlink.Request;

    bytes public status;
    string public statusString;

    address private oracle;
    bytes32 private jobId;
    uint256 private fee;

    event RequestFulfilled(bytes32 indexed requestId, bytes indexed data);

    /**
     * Network: Kovan
     * Oracle: 0xc57B33452b4F7BB189bB5AfaE9cc4aBa1f7a4FD8 (Chainlink Devrel Node)
     * Job ID: d5270d1c311941d0b08bead21fea7747
     * Fee: 0.1 LINK
     */
    constructor() {
        setPublicChainlinkToken();
        oracle = 0xDFE5e6C5C624724384b55719b7da79d3EbB60057;
        fee = 1 * 10 ** 18; // (Varies by network and job)
    }

    function requesData(string memory _jobId) public returns (bytes32 requestId)
    {
        Chainlink.Request memory request = buildChainlinkRequest(stringToBytes32(_jobId), address(this), this.fulfill.selector);
        // Set the URL to perform the GET request on
        request.add("trackingNo", "HF123456789DL");
        return sendChainlinkRequestTo(oracle, request, fee);
    }

    /**
     * Receive the response in the form of bytes
     */
    function fulfill(bytes32 _requestId, bytes memory bytesData) public recordChainlinkFulfillment(_requestId)
    {
        emit RequestFulfilled(_requestId, bytesData);
        status = bytesData;
        statusString = string(status);
    }

    // function withdrawLink() external {} - Implement a withdraw function to avoid locking your LINK in the contract

    // function getStatus() public view returns (string memory) {
    //     return bytes32ToString(status);
    // }

    function bytes32ToString(bytes32 _bytes32)
        public
        pure
        returns (string memory)
    {
        uint8 i = 0;
        while (i < 32 && _bytes32[i] != 0) {
            i++;
        }
        bytes memory bytesArray = new bytes(i);
        for (i = 0; i < 32 && _bytes32[i] != 0; i++) {
            bytesArray[i] = _bytes32[i];
        }
        return string(bytesArray);
    }

    function stringToBytes32(string memory source)
        public
        pure
        returns (bytes32 result)
    {
        bytes memory tempEmptyStringTest = bytes(source);
        if (tempEmptyStringTest.length == 0) {
            return 0x0;
        }
        assembly {
            // solhint-disable-line no-inline-assembly
            result := mload(add(source, 32))
        }
    }
}
This is my job spec.
type = "directrequest"
schemaVersion = 1
name = "Halffin-Data-EA-Create-Tracking8"
externalJobID = "3f706a6b-efdd-44ac-8167-f880a6ca63ac"
maxTaskDuration = "0s"
contractAddress = "0xDFE5e6C5C624724384b55719b7da79d3EbB60057"
minIncomingConfirmations = 0
observationSource = """
decode_log [type=ethabidecodelog
abi="OracleRequest(bytes32 indexed specId, address requester, bytes32 requestId, uint256 payment, address callbackAddr, bytes4 callbackFunctionId, uint256 cancelExpiration, uint256 dataVersion, bytes data)"
data="$(jobRun.logData)"
topics="$(jobRun.logTopics)"]
decode_cbor [type=cborparse data="$(decode_log.data)"]
fetch [type=bridge name="halffin-data" requestData="{\\"id\\": $(jobSpec.externalJobID), \\"data\\": { \\"trackingNo\\": $(decode_cbor.trackingNo)}}"]
parse [type=jsonparse path="data,tracking,slug" data="$(fetch)"]
encode_data [type=ethabiencode abi="(bytes value)" data="{ \\"value\\": $(parse) }"]
encode_tx [type=ethabiencode
abi="fulfillOracleRequest(bytes32 requestId, uint256 payment, address callbackAddress, bytes4 callbackFunctionId, uint256 expiration, bytes data)"
data="{\\"requestId\\": $(decode_log.requestId), \\"payment\\": $(decode_log.payment), \\"callbackAddress\\": $(decode_log.callbackAddr), \\"callbackFunctionId\\": $(decode_log.callbackFunctionId), \\"expiration\\": $(decode_log.cancelExpiration), \\"data\\": $(encode_data)}"
]
submit_tx [type=ethtx to="0xDFE5e6C5C624724384b55719b7da79d3EbB60057" data="$(encode_tx)"]
decode_log -> decode_cbor -> fetch -> parse -> encode_data -> encode_tx -> submit_tx
"""
These are the logs from the completed job:
fetch
"{\"jobRunID\":\"3f706a6b-efdd-44ac-8167-f880a6ca63ac\",\"data\":{\"tracking\":{\"id\":2,\"slug\":\"halffin-logistics\",\"tracking_number\":\"HF123456789DL\"},\"result\":null},\"result\":null}"
name: halffin-data
requestData: {"id": $(jobSpec.externalJobID), "data": { "trackingNo": $(decode_cbor.trackingNo)}}
parse
"halffin-logistics"
path: data,tracking,slug
data: $(fetch)
encode_data
"0x0000000000000000000000000000000000000000000000000000000000000020000000000000000000000000000000000000000000000000000000000000001168616c6666696e2d6c6f67697374696373000000000000000000000000000000"
abi: (bytes value)
data: { "value": $(parse) }
encode_tx
"0x728853aa63b008d8b908b2d431b9ea703268ba10e60ab40603941ec91a2955278f219c1e0000000000000000000000000000000000000000000000000de0b6b3a7640000000000000000000000000000136e61cdeae727926aa768574e2f979c724d6cad7c1de7e1000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000062586d6800000000000000000000000000000000000000000000000000000000000000c000000000000000000000000000000000000000000000000000000000000000600000000000000000000000000000000000000000000000000000000000000020000000000000000000000000000000000000000000000000000000000000001168616c6666696e2d6c6f67697374696373000000000000000000000000000000"
abi: fulfillOracleRequest(bytes32 requestId, uint256 payment, address callbackAddress, bytes4 callbackFunctionId, uint256 expiration, bytes data)
data: {"requestId": $(decode_log.requestId), "payment": $(decode_log.payment), "callbackAddress": $(decode_log.callbackAddr), "callbackFunctionId": $(decode_log.callbackFunctionId), "expiration": $(decode_log.cancelExpiration), "data": $(encode_data)}
submit_tx
"{\"logs\": [], \"root\": \"0x\", \"status\": \"0x0\", \"gasUsed\": \"0x5c49\", \"blockHash\": \"0x5b55db677b2776bb637fdb9ba2077e7db21de8e8beba60fb79e1384ae51f39a8\", \"logsBloom\": \"0x00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000\", \"blockNumber\": \"0x1d94cd5\", \"contractAddress\": \"0x0000000000000000000000000000000000000000\", \"transactionHash\": \"0xc5371c5ce1692b39835e44fb61c47d2deeeb509f71e32fccb0c5d42eec3be443\", \"transactionIndex\": \"0x1\", \"cumulativeGasUsed\": \"0x39418\"}"
to: 0xDFE5e6C5C624724384b55719b7da79d3EbB60057
data: $(encode_tx)
In case you wonder why I use the large-response data type: I followed this link.
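As a sanity check on the job side, the encode_data output in the log above really does carry the string: ABI encoding for (bytes value) is a 32-byte offset, a 32-byte length, then the right-padded data. Decoding it by hand in Python (hex taken from the encode_data log line):

```python
# ABI-encoded `(bytes value)` payload from the encode_data task log.
payload = bytes.fromhex(
    "0000000000000000000000000000000000000000000000000000000000000020"
    "0000000000000000000000000000000000000000000000000000000000000011"
    "68616c6666696e2d6c6f67697374696373000000000000000000000000000000"
)

offset = int.from_bytes(payload[0:32], "big")                 # 32: where the bytes start
length = int.from_bytes(payload[offset:offset + 32], "big")   # 17: payload length
value = payload[offset + 32:offset + 32 + length]             # the actual bytes

print(value.decode())  # halffin-logistics
```

So the pipeline encoded the value correctly; the failure is on the delivery side, which is what the answer below addresses.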
The contractAddress specified in the (Get > Large bytes) job must point at Operator.sol, not Oracle.sol; (Get > Uint256) points at Oracle.sol. Oracle.sol is not meant to handle large bytes, nor multi-variable uint256 output.
Here is the code to deploy the current version of Operator.sol in Remix, to then obtain the correct contractAddress for the associated job spec in the Chainlink node GUI.
// SPDX-License-Identifier: MIT
pragma solidity ^0.7.0;
import "@chainlink/contracts/src/v0.7/Operator.sol";

Django Rest Framework: Serializing splits list of strings into list of characters

I am setting up an API with DRF. Everything is going smoothly, but I am having a problem when passing a field with a list of strings.
Json object
{
    "postID": 1,
    "index": 0,
    "text": "For years you had a President who apologized for America – now you have a President who is standing up for America, and standing up for PENNSYLVANIA. Tomorrow, you have the power, with your vote, to save AMERICA! GET OUT AND VOTE!! #MAGA ",
    "date": "2020-11-02",
    "likesCount": 145000,
    "commentsCount": 4500,
    "sharesCount": 3500,
    "hashtags": [
        "[", "'", "M", "A", "G", "A", "'", " ", "'",
        "A", "G", "A", "I", "N", "'", "]"
    ]
}
Whereas the hashtags field value should be: ['MAGA', 'AGAIN'].
How could I override the serializer to prevent the strings from being split into characters?
models.py
from djongo import models
from django.contrib.postgres.fields import ArrayField

# Create your models here.
class Post(models.Model):
    index = models.IntegerField()
    postID = models.IntegerField(primary_key=True)
    text = models.CharField(max_length=500)
    date = models.DateField()
    likesCount = models.IntegerField()
    commentsCount = models.IntegerField()
    sharesCount = models.IntegerField()
    hashtags = ArrayField(models.CharField(max_length=20, blank=True), size=50)
    tokens = ArrayField(models.CharField(max_length=20, blank=True), size=50)
    tagged = ArrayField(models.CharField(max_length=20, blank=True), size=50)
    keywords = ArrayField(models.CharField(max_length=20, blank=True), size=50)
    entities = ArrayField(models.CharField(max_length=20, blank=True), size=50)
    noun_phrases = ArrayField(models.CharField(max_length=20, blank=True), size=50)
    noun_phrases_keywords = ArrayField(models.CharField(max_length=20, blank=True), size=50)
serializer.py
from rest_framework import serializers
from .models import Post, Comments

class PostSerializer(serializers.ModelSerializer):
    class Meta:
        model = Post
        fields = '__all__'
Serializer inspection
PostSerializer():
    postID = IntegerField(label='PostID', max_value=2147483647, min_value=-2147483648, validators=[<UniqueValidator(queryset=Post.objects.all())>])
    index = IntegerField(max_value=2147483647, min_value=-2147483648)
    text = CharField(max_length=500)
    date = DateField()
    likesCount = IntegerField(label='LikesCount', max_value=2147483647, min_value=-2147483648)
    commentsCount = IntegerField(label='CommentsCount', max_value=2147483647, min_value=-2147483648)
    sharesCount = IntegerField(label='SharesCount', max_value=2147483647, min_value=-2147483648)
    hashtags = ListField(allow_empty=False, child=CharField(allow_blank=True, label='Hashtags', max_length=20, required=False), validators=[<django.contrib.postgres.validators.ArrayMaxLengthValidator object>])
    tokens = ListField(allow_empty=False, child=CharField(allow_blank=True, label='Tokens', max_length=20, required=False), validators=[<django.contrib.postgres.validators.ArrayMaxLengthValidator object>])
    tagged = ListField(allow_empty=False, child=CharField(allow_blank=True, label='Tagged', max_length=20, required=False), validators=[<django.contrib.postgres.validators.ArrayMaxLengthValidator object>])
    keywords = ListField(allow_empty=False, child=CharField(allow_blank=True, label='Keywords', max_length=20, required=False), validators=[<django.contrib.postgres.validators.ArrayMaxLengthValidator object>])
    entities = ListField(allow_empty=False, child=CharField(allow_blank=True, label='Entities', max_length=20, required=False), validators=[<django.contrib.postgres.validators.ArrayMaxLengthValidator object>])
    noun_phrases = ListField(allow_empty=False, child=CharField(allow_blank=True, label='Noun phrases', max_length=20, required=False), validators=[<django.contrib.postgres.validators.ArrayMaxLengthValidator object>])
    noun_phrases_keywords = ListField(allow_empty=False, child=CharField(allow_blank=True, label='Noun phrases keywords', max_length=20, required=False), validators=[<django.contrib.postgres.validators.ArrayMaxLengthValidator object>])
I got to a solution using "DRF serialize ArrayField as string" as a reference :).
serializer.py
class StringArrayField(ListField):
    def to_representation(self, obj):
        # Convert the stored list to a single comma-separated string.
        return ",".join(str(element) for element in obj)

    def to_internal_value(self, data):
        # Convert the incoming string back to a list.
        data = data.split(",")
        return super().to_internal_value(data)
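Stripped of DRF, the field just converts between a list in the model and one comma-separated string on the wire; a standalone sketch of that round trip:

```python
# Round trip behind StringArrayField: the model stores a list,
# the API exchanges one comma-separated string.
def to_representation(obj):
    return ",".join(str(element) for element in obj)  # list -> string

def to_internal_value(data):
    return data.split(",")                            # string -> list

tags = ["MAGA", "AGAIN"]
wire = to_representation(tags)
print(wire)                     # MAGA,AGAIN
print(to_internal_value(wire))  # ['MAGA', 'AGAIN']
```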

Solidity mapping returning null values

So basically, I created a mapping within a smart contract to store hashes of user data. It maps a user id to the hash itself (a bytes32 value). I apply a double sha256 hash and store it in the mapping under the aforementioned id. The storing function returns the hash by returning the value at that id in the mapping; this hash is correct, meaning at the very least it's initially stored correctly. However, I have another function that gets the hash from the id, and it always returns a null value in the JavaScript tests. I am wondering if it's a problem with the test or with the contract itself.
pragma solidity ^0.4.22;

contract UserStore {
    mapping(uint => bytes32) UserHashes; // User id to hash

    event HashStored (
        uint id,
        bytes32 original,
        bytes32 hash
    );

    function HashData(bytes32 data) returns (bytes32) {
        return sha256(abi.encodePacked(sha256(abi.encodePacked(data))));
    }

    function StoreHash(uint user_id, bytes32 data) external view returns (bytes32) {
        UserHashes[user_id] = HashData(data);
        HashStored(user_id, data, UserHashes[user_id]);
        return UserHashes[user_id];
    }

    /*
        Gets the hash from the blockchain.
    */
    function GetHash(uint u_id) view public returns (bytes32) {
        return UserHashes[u_id];
    }
}
Every time I run this test, GetHash returns a 0 value:
contract("Storage_Test", function(accounts) {
    const args = {
        user_id: 0,
        data: "This is some security data",
        group_id: 15,
        user_ids: [1, 2, 3, 4, 5],
        num_accounts: 2
    }

    it("Hash Test: Multiple Storage and retrieving", async function() {
        return await UserStore.deployed()
            .then(async function(instance) {
                var temp = args.data;
                var _temp;
                for (i = 1; i < args.num_accounts; i++) {
                    _temp = temp;
                    temp = await instance.HashData.call(temp);
                    // console.log("Datahash: " + temp);
                    result = await instance.StoreHash.call(i, _temp);
                    // console.log("Result: " + result);
                    assert.equal(result, temp, "Hash at " + i + " wasn't returned correctly");
                }
                temp = args.data;
                for (i = 1; i < args.num_accounts; i++) {
                    temp = await instance.HashData.call(temp);
                    result = await instance.GetHash.call(i);
                    assert.equal(result, temp, "Hash at " + i + " wasn't stored correctly");
                }
            })
    });
});
Change instance.StoreHash.call(...) to instance.StoreHash.sendTransaction(...). call() runs the function locally instead of submitting a transaction, so any state change is not persisted.
