I'm attempting to grab data in RethinkDB by the primary key. However, my code cannot find the data.
I'm using the RethinkDBDash driver to grab the data, and I have defined Rethink as client.db. The ID is a sliced array from data I'm grabbing from an API; when I console.log it, the value is 1. I tested this by simply entering the integer 1 and it was able to grab the data, however, when I use the data from the API to grab it, it returns a null error. The table I am trying to grab data from is named suggestions, with a primary key of sid (originally id). The table looks like this:
{
    "author": {
        "id": "535986058991501323"
    },
    "sid": 1,
    "status": "PENDING",
    "suggestion": "wtf"
}
On past projects, I have been able to grab data by the primary key with no problem using this exact method.
const id = args.slice(0).join(' ');
console.log(id);
const data = await client.db.table('suggestions').get(id).run();
console.log(data.sid);
From grabbing the data, I expected a console log with the value 1. However, what I got was an error that says this:
(node:34984) UnhandledPromiseRejectionWarning: TypeError: Cannot read property 'sid' of null
at Object.run (C:\Users\facto\Desktop\Projects\JavaScript\Bots\CountSill\commands\deny.js:7:26)
(node:34984) UnhandledPromiseRejectionWarning: Unhandled promise rejection. This error originated either by throwing inside of an async function without a catch block, or by rejecting a promise which was not handled with .catch(). (rejection id: 1)
(node:34984) [DEP0018] DeprecationWarning: Unhandled promise rejections are deprecated. In the future, promise rejections that are not handled will terminate the Node.js process with a non-zero exit code.
I figured out the issue: the primary key expects an integer rather than a string, so it looks for an integer, and since it can't find a match for the string it throws an error. To fix it, I just had to change the stored primary key values from integers to strings. Alternatively, you can use parseInt to convert the joined argument string into an integer so it matches the existing data.
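For example, a minimal sketch of the parseInt approach (based on the snippet above, assuming args holds the command arguments as strings):
// Convert the joined argument string to a number so it matches the
// integer primary key (sid) stored in the table.
const id = parseInt(args.slice(0).join(' '), 10);
console.log(id); // 1 as a number, not the string "1"
const data = await client.db.table('suggestions').get(id).run();
console.log(data ? data.sid : 'No suggestion found with that sid');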
So, the thing is, I'm trying to query a GSI on a DynamoDB table and I'm getting really weird behaviour.
The main table schema is as follows
- Partition key: test_id (string)
- Sort key: version (string)
- Other attributes (createdAt, tags, etc)
I want to obtain every entry that has a sort key equal to v0_test WITHOUT filtering by partition key. To do this and avoid a full scan, I created a GSI (Global Secondary Index) as follows:
- Partition key: version (string)
- Sort key: createdAt (number)
- Other attributes (test_id, tags, etc)
When querying this from the AWS console I can query for every partition key equal to v0_test and I get the expected results, but when I query from inside a Lambda function (runtime: nodejs16.x) I get an error.
The code for the query is as follows:
const dynamoDb = new AWS.DynamoDB.DocumentClient();
let params = {
    TableName: dynamoTable,
    IndexName: dynamoTableIndex,
    KeyConditionExpression: 'version = :v0 AND createdAt BETWEEN :tLower AND :tUpper',
    ExpressionAttributeValues: {
        ':v0': 'v0_test',
        ':tUpper': Math.floor(Date.now() / 1000).toString(),
        ':tLower': '0'
    }
};
let result = await dynamoDb.query(params).promise();
console.log("Success", result);
And the error I get is
ERROR ValidationException: Query condition missed key schema element: test_id
As you can see, it's asking for the partition key of the main table.
Things I've tried:
- Using AWS.DynamoDB instead of AWS.DynamoDB.DocumentClient. Same error.
- Using test_id instead of version in the query. Got ERROR ValidationException: Query condition missed key schema element: version
- Sending both version and test_id in the KeyConditionExpression. Got the following error: ERROR ValidationException: KeyConditionExpressions must only contain one condition per key
- A new DynamoDB table with a new GSI. Same errors.
I wasn't expecting that at all. It's my first time using DynamoDB, but as I understand it, the idea behind GSIs (or one of them) is to be able to query a DynamoDB table by an attribute other than its main partition key without having to do a full scan.
Any help appreciated and if you need more details just ask! It's my first time asking on StackOverflow too so I'm sure I'll miss something.
EDIT
I tested the suggested solution and got an error suggesting that the schema was wrong, which was a new error and got me thinking, so I tried specifying ExpressionAttributeNames and it WORKED!
I created the request as follows using DocumentClient:
const dynamoDb = new AWS.DynamoDB.DocumentClient();
let params = {
    TableName: 'CLASSIFIER_TESTS_DEV_us-east-1',
    IndexName: 'version-createdAt-index',
    KeyConditionExpression: '#versionAttr = :version AND #ca BETWEEN :tLower AND :tUpper',
    ExpressionAttributeNames: {
        "#versionAttr": "version",
        "#ca": "createdAt"
    },
    ExpressionAttributeValues: {
        ":version": "v0_test",
        ":tUpper": Date.now(),
        ":tLower": 0
    }
};
let result = await dynamoDb.query(params).promise();
Thanks everyone!
I still think that it should have worked the way I did it the first time as that's the way everyone does it on tutorials/examples/documentation, but oh well, got it working and that's what matters for now.
This should not be an issue. I would suggest the following:
- Test locally: run your DynamoDB API call from your local machine (see the sketch below). This will ensure that no issues are being caused by the Lambda invocation.
- Hard-code your TableName and IndexName as strings in your parameters.
- If the above still causes an issue, try the same logic, but this time as a Query against the base table.
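A minimal sketch of such a local test, for reference (it reuses the table/index names and the ExpressionAttributeNames fix from the edit above; local AWS credentials are assumed to be configured, and the region is only a guess based on the table name):
// Run from the local machine with hard-coded table and index names.
// Numeric values are used for createdAt since the GSI sort key is a number.
const AWS = require('aws-sdk');
const dynamoDb = new AWS.DynamoDB.DocumentClient({ region: 'us-east-1' });
const params = {
    TableName: 'CLASSIFIER_TESTS_DEV_us-east-1',
    IndexName: 'version-createdAt-index',
    KeyConditionExpression: '#v = :version AND #ca BETWEEN :tLower AND :tUpper',
    ExpressionAttributeNames: { '#v': 'version', '#ca': 'createdAt' },
    ExpressionAttributeValues: {
        ':version': 'v0_test',
        ':tLower': 0,
        ':tUpper': Date.now()
    }
};
dynamoDb.query(params).promise()
    .then(result => console.log('Success', result.Items))
    .catch(err => console.error('Query failed', err));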
Let me know if you still face issues after that; happy to help.
I thought I got the hang of Dexie, but now I'm flabbergasted:
Two tables, each with a handful of records: Komps & Bretts.
Output all Bretts:
rdb.Bretts.each(brett => {
console.log(brett);
})
Output all Komps:
rdb.Komps.each(komp=> {
console.log(komp);
})
BUT: this only outputs the Bretts; for some weird reason, Komps is empty:
rdb.Bretts.each(brett => {
    console.log(brett);
    rdb.Komps.each(komp => {
        console.log(komp);
    })
})
I've tried all kinds of combinations with async/await, then(), etc.; the inner loop cannot find any data in the inner table, whichever table I use there.
Second example. This works:
await rdb.Komps.get(163);
This produces an error ("Failed to execute 'objectStore' on 'IDBTransaction…ction': The specified object store was not found.")
rdb.Bretts.each(async brett => {
await rdb.Komps.get(163);
})
Is there some kind of locking going on? Something that can be disabled?
Thank you!
Calling rdb.Bretts.each() will implicitly launch a readOnly transaction limited to 'Bretts' only. This means that within the callback you can only reach that table, and that's the reason why it doesn't find the Komps table at that point. To get access to the Komps table from within the each() callback, you would need to include it in an explicit transaction block:
rdb.transaction('r', 'Komps', 'Bretts', () => {
    rdb.Bretts.each(brett => {
        console.log(brett);
        rdb.Komps.each(komp => {
            console.log(komp);
        });
    });
});
However, each() does not respect promises returned by the callback, so even this fix is not something I would recommend, even if it would solve your problem. You could easily get race conditions, as you lose control of the flow when launching a new each() from within an each() callback.
I would recommend using toArray(), get(), bulkGet() and other methods rather than each() where possible. toArray() is also faster than each() as it can utilize the faster IndexedDB APIs IDBObjectStore.getAll() and IDBIndex.getAll() when possible. And you don't necessarily need to encapsulate the code in a transaction block (unless you really need that atomicity).
const komps = await rdb.Komps.toArray();
await Promise.all(
    komps.map(async komp => {
        // Do some async call per komp:
        const brett = await rdb.Bretts.get(163);
        console.log("brett with id 163", brett);
    })
);
Now, this example is a bit silly as it does the exact same rdb.Bretts.get(163) for each komp it finds, but you could replace 163 with some dynamic value there.
Conclusion: There are two issues.
The callback to each() lives within the implicit transaction of Dexie's operation (tied to one single table only) unless you surround the call with a bigger explicit transaction block.
Try to avoid starting new async operations within the callback of Dexie's db.Table.each(), as it does not expect promises to be returned from its callback. You can do it, but it is better to stick with methods where you can keep control of the async flow.
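For completeness, here is a small sketch combining both points: an explicit transaction spanning both tables, using toArray() instead of each() (table names taken from the question):
// Read both tables inside one explicit read-only transaction,
// using toArray() so the async flow stays under control.
await rdb.transaction('r', rdb.Komps, rdb.Bretts, async () => {
    const bretts = await rdb.Bretts.toArray();
    const komps = await rdb.Komps.toArray();
    console.log('Bretts:', bretts);
    console.log('Komps:', komps);
});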
I have a table I created in the Hasura console. A few of the columns are integer types and I get the expected data type: Maybe<Scalars['Int']>.
However, a few needed to be an array of integers so I created those in the Hasura SQL tab:
ALTER TABLE my_table
add my_ids integer[];
If I populate those in GraphiQL with the following query variable everything works just fine:
{
"my_ids": "{1200, 1201, 1202}",
}
However, when I try to make the same request from my front-end client, I receive the following error: A string is expected for type : _int4. Looking at the data type, it is slightly different from the preset (in the data type dropdown) integer types: Maybe<Scalars['_int4']>.
Is there a way to get the array of integers to be Maybe<Scalars['Int']> like the preset integer types? Or if there isn't, how can resolve the issue with my request throwing errors for this _int4 type?
Hasura cannot process a plain JSON array here, but it can process Postgres array literals.
I've written a function to help you transform your array into an array literal before the mutation:
const toArrayLiteral = (arr) => (JSON.stringify(arr).replace('[', '{').replace(']', '}'))
...
myObject.array = toArrayLiteral(myObject.array) // turns ['friend'] into '{"friend"}'
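To illustrate, this is roughly how it could look when sending the mutation from a client (the mutation and field names below are hypothetical placeholders, and graphqlClient stands in for whatever GraphQL client you use):
// Convert the JS array to a Postgres array literal before passing it
// as a variable for the _int4 column (table and field names are placeholders).
const variables = {
    my_ids: toArrayLiteral([1200, 1201, 1202]) // -> '{1200,1201,1202}'
};
await graphqlClient.request(
    `mutation InsertRow($my_ids: _int4) {
        insert_my_table_one(object: { my_ids: $my_ids }) {
            id
        }
    }`,
    variables
);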
Good luck
I am trying to append an object to an array in RethinkDB. Here is how I am trying to append it:
rethink('shifts')
    .get(shiftId)
    .update(row => row("milestones").default([]).append({
        dateAchieved: date,
        phaseType: phasetype.toUpperCase()
    })).run(rethinkConnection)
The error I get is this:
UnhandledPromiseRejectionWarning: Unhandled promise rejection (rejection id: 1): ReqlQueryLogicError: Expected type TABLE but found DATUM:
"shifts" in:
r("shifts").get("5d1b607f-e670-4eb5-b873-6e800f8ae8f8").update(function(var_15) { return var_15("milestones").default([]).append({"dateAchieved": r.ISO8601("2012-04-12T06:00:00.000Z"), "phaseType": "PILOT"}); })
^^^^^^^^
What does it mean that it is expecting a 'TABLE', but found a 'DATUM'? How do I get this to insert an object into the array?
You need to do
rethink.table('shifts')
Calling the rethink namespace with a string just returns an expression.
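Applied to the snippet from the question, the fixed query would look roughly like this:
// Start the query from the table, not from a plain string expression.
rethink.table('shifts')
    .get(shiftId)
    .update(row => row("milestones").default([]).append({
        dateAchieved: date,
        phaseType: phasetype.toUpperCase()
    }))
    .run(rethinkConnection);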
I'm using a Typed DataSet with an Insert statement; I have a table that has a smalldatetime field defined to accept null values. When I insert from a .NET 2.0 FormView, I get a "SqlDateTime overflow. Must be between 1/1/1753 12:00:00 AM and 12/31/9999 11:59:59 PM." error.
Now, I've read this post, and the parameter as sent to the class constructor is defined as
global::System.Nullable<global::System.DateTime> DoB
So, it looks like it should accept a nullable object. Additionally, the generated code is testing the value sent:
if ((DoB.HasValue == true)) {
    command.Parameters[6].Value = ((System.DateTime)(DoB.Value));
}
else {
    command.Parameters[6].Value = global::System.DBNull.Value;
}
Specifically, the error occurs when the generated SqlClient.SqlCommand.ExecuteScalar() runs:
try {
returnValue = command.ExecuteScalar();
}
So, I guess my question is: how do I use a Typed DataSet to set a blank value (passed from a FormView on CommandName=Insert) to a null in a database?
OK, so here's what worked for me. First, to reiterate, I've got a Typed DataSet with DataAdapters that's generating the ADO objects. So, on my page, I can create an ObjectDataSource with the type that points to my adapter, and then name the different access methods housed therein.
Now, I have an Insert to a table where basically all the columns are nullable; some varchar, some smalldatetime.
When I submit an empty form, I'd like nulls to be entered. They're not, and lots of various errors are thrown. What I ended up doing is subclassing the ObjectDataSource (subclassed for reusability) to gain access to the Inserting event. In the Inserting event, I looped through the InputParameters, and if a parameter was a string and == "", I set it to null. Also, you cannot set ConvertNullToDBNull to true; that causes the strings to fail. This successfully allowed the Nullable to remain null.