I am new to DynamoDB and only want to create a new object if the primary sort key (name) does not already exist. I tried it like this:
params.id = randomId();
var item = {
  TableName: tableName,
  Item: params,
  ConditionExpression: "#na <> :n",
  ExpressionAttributeNames: { "#na": "name" },
  ExpressionAttributeValues: {
    ":n": params.name
  }
};
docClient.put(item, function(err, data) {
  console.log("Data:", data);
  console.log("Err:", err);
});
But the item is still created :/ Is it even possible to create a condition expression on the primary sort key?
Actually, I just ran into this issue myself. As explained here, it looks like you can't: a condition expression is only evaluated against the single item that matches the full primary key of the item you are putting, so you'll have to use a Global Secondary Index for the 'sort' key.
You will have to do a separate query against the GSI first to see if "name" already exists, e.g.:
function checkNameDoesNotExist(name, fn) {
  var query = {
    TableName: tableName,
    IndexName: 'nameInUsers',
    // "name" is a DynamoDB reserved word, so alias it
    KeyConditionExpression: '#n = :n',
    ExpressionAttributeNames: { '#n': 'name' },
    ExpressionAttributeValues: {
      ':n': name
    }
  };
  dynamodb.query(query, function(err, data) {
    if (err) {
      return fn(err);
    }
    fn(null, data);
  });
}
Disclaimer: wrote the code off the top of my head, don't know if it works but should give you a good starting point
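For the narrower goal of not overwriting an item that already has the same full primary key, the usual pattern is a ConditionExpression with attribute_not_exists() on a key attribute. A minimal sketch (the table name and the 'id' partition key are assumptions, not from the question):

```javascript
// Sketch only: prevents overwriting an item with the same full primary key.
// It does NOT enforce table-wide uniqueness on a sort key attribute.
function buildConditionalPut(tableName, item) {
  return {
    TableName: tableName,
    Item: item,
    // put fails with ConditionalCheckFailedException if an item with this key exists
    ConditionExpression: "attribute_not_exists(id)",
  };
}

var params = buildConditionalPut("users", { id: "123", name: "alice" });
console.log(params.ConditionExpression);
// params would then be passed to docClient.put(params, callback).
```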
You can use the Exists condition. The put will then return an error saying that the object already exists:
var item = {
  TableName: tableName,
  Item: params,
  Expected: {
    name: {
      Exists: false
    }
  }
};
docClient.put(item, function(err, data) {
  console.log("Data:", data);
  console.log("Err:", err);
});
My table looks like [alias, inheritedLdap, ldapGroup], where alias is a string and ldapGroup is a list, e.g. [{S: aws}]. My use case is to get the list of aliases whose ldapGroup is aws. Here alias is the partition key; we don't have a sort key. So I need to write a method that takes the ldapGroup as a parameter and filters the list of aliases where the ldapGroup is aws. But ldapGroup doesn't contain scalar values. I tried to implement the code, but it fails when I try to compile:
public async getMemberList(): Promise<any> {
  const input: any = {
    TableName: UserInfoDao.TABLE_NAME, // use this from the constants
    ProjectionExpression: "alias",
    FilterExpression: "#l = :ldapGroups",
    ExpressionAttributeNames: {
      "#l": "ldapGroups"
    },
    ExpressionAttributeValues: {
      ":ldapGroups": "PPOA"
    }
  };
  try {
    const ddbClient = DynamDBClient.getInstance();
    return await ddbClient.scan(input);
  } catch (error) {
    const message = `ERROR: Failed to retrieve alias for given ldapGroups:
      ERROR: ${JSON.stringify(error)}`;
    error.message = message;
    throw error;
  }
}
But when I use ScanCommandOutput and ScanCommandInput in my code instead of any, it shows these errors:
Type 'Record<string, AttributeValue>[] | undefined' is not assignable to type 'ScanCommandInput'. Type 'undefined' is not assignable to type 'ScanCommandInput'
Property '$metadata' is missing in type 'Request<ScanOutput, AWSError>' but required in type 'ScanCommandOutput'.
Can someone help me with this one? I'd like to know whether my approach is correct or not.
This works for me; I made some edits to your example:
import { DynamoDBClient } from "@aws-sdk/client-dynamodb";
import { ScanCommand, ScanCommandInput } from "@aws-sdk/lib-dynamodb";
const client = new DynamoDBClient({
  region: 'eu-west-1',
});

class MyClass {
  public getMemberList(): Promise<any> {
    const input: ScanCommandInput = {
      TableName: 'Test1',
      // ProjectionExpression: "alias",
      FilterExpression: "contains(#l, :ldapGroups)",
      ExpressionAttributeNames: {
        "#l": "ldapGroups"
      },
      ExpressionAttributeValues: {
        ":ldapGroups": "aws"
      }
    };
    try {
      return client.send(new ScanCommand(input));
    } catch (error) {
      const message = `ERROR: Failed to retrieve alias for given ldapGroups: ERROR: ${JSON.stringify(error)}`;
      error.message = message;
      throw error;
    }
  }
}

const c = new MyClass();
c.getMemberList().then(res => console.log(res)).catch(err => console.log(err));
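One thing worth noting about Scan: it returns at most 1 MB of data per call, so a complete member list needs to follow LastEvaluatedKey across pages. A minimal sketch of that loop, with a stubbed send function standing in for client.send so it runs without a live table:

```javascript
// Sketch: follow LastEvaluatedKey until the table is exhausted.
// "send" is any function resolving a scan page, so the loop is testable offline.
async function scanAll(send, input) {
  const items = [];
  let ExclusiveStartKey;
  do {
    const page = await send({ ...input, ExclusiveStartKey });
    items.push(...(page.Items || []));
    ExclusiveStartKey = page.LastEvaluatedKey;
  } while (ExclusiveStartKey);
  return items;
}

// Stub that serves two pages, as a live Scan would for a large table:
function makeFakeSend(pages) {
  let i = 0;
  return async () => pages[i++];
}

const fakeSend = makeFakeSend([
  { Items: [{ alias: 'a' }], LastEvaluatedKey: { alias: 'a' } },
  { Items: [{ alias: 'b' }] },
]);

scanAll(fakeSend, { TableName: 'Test1' }).then(all => console.log(all.length));
```

With the real SDK, send would be `input => client.send(new ScanCommand(input))`.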
My problem is with creating a database table: it is created too late, which causes problems with further queries. I tried to use async and await, but that doesn't seem to solve the problem.
async function storeDailyDealToDB(dailyDeal) {
  const db = new sqlite3.Database('database.db');
  await new Promise((resolve) => {
    const QUERY_CREATE_TABLE =
      "CREATE TABLE IF NOT EXISTS daily_deal ( id INTEGER PRIMARY KEY AUTOINCREMENT, title TEXT,)";
    db.run(QUERY_CREATE_TABLE);
    resolve("done");
  });
  await new Promise((resolve) => {
    const insert =
      "INSERT INTO daily_deal (title) VALUES (?)";
    const stmt = db.prepare(insert);
    stmt.run([dailyDeal['title']]);
    stmt.finalize();
    resolve("done");
  });
  let lastRow = await new Promise((resolve) => {
    db.each("SELECT * FROM daily_deal ORDER BY id DESC LIMIT 1", function (err, row) {
      resolve(err == null ? {} : row);
    });
  });
  db.close();
  return lastRow;
}
Here is the error I get:
[Error: SQLITE_ERROR: no such table: daily_deal
Emitted 'error' event on Statement instance at:
] {
errno: 1,
code: 'SQLITE_ERROR'
}
Node.js v17.9.0
I did a lot of research and I am stuck. I read that I should use a Promise, but it only works partially. I am not sure how to tackle this problem.
After looking at the reference docs for Database#run, you should pass a callback to the run method. Inside this callback you want to either resolve or reject the promise depending on the outcome:
await new Promise((res, rej) => {
  db.run(..., function (err) {
    if (err) rej(err);
    else res(this);
  });
});
I think this is correct (untested, however). Note that your CREATE TABLE statement also has a trailing comma after `title TEXT`, which makes the statement fail, so the table is never created.
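The same resolve-inside-the-callback pattern can be factored into a small helper. A minimal, self-contained sketch (fakeRun is a stand-in for db.run so the snippet runs without sqlite3):

```javascript
// Sketch: wrap any Node-style callback API in a Promise, resolving or
// rejecting inside the callback, which is the fix suggested above.
function runAsync(fn, ...args) {
  return new Promise((resolve, reject) => {
    fn(...args, (err, result) => {
      if (err) reject(err);
      else resolve(result);
    });
  });
}

// fakeRun stands in for db.run so the snippet runs without sqlite3:
function fakeRun(sql, cb) {
  if (sql.includes('CREATE')) cb(null, 'table created');
  else cb(new Error('no such table'));
}

runAsync(fakeRun, 'CREATE TABLE t (id INTEGER)').then(res => console.log(res));
```

With the real driver, each step (`CREATE`, `INSERT`, `SELECT`) would be awaited via runAsync so the next query only runs after the previous one actually finished.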
I have a mutation
mutation createQuoteLineMutation {
createQuoteLine {
quoteLine {
name
price
product {
name
}
}
}
}
My updater function is as below.
updater: (store) => {
const payload = store.getRootField('createQuoteLine');
const newQuoteLine = payload.getLinkedRecord('quoteLine');
const quote = store.getRoot().getLinkedRecord('getQuote');
const quoteLines = quote.getLinkedRecords('quoteLines') || [];
const newQuoteLines = [...quoteLines, newQuoteLine];
quote.setLinkedRecords(newQuoteLines, 'quoteLines');
}
This works fine the first time, but on subsequent mutations all the previously added quoteLines change to the new one. I'm assuming this is because newQuoteLine points to the same object every time.
Adding the line below at the end of the updater function, to unlink quoteLine from createQuoteLine, also does not work:
payload.setValue(null, 'quoteLine');
Any help in this regard is highly appreciated.
I have seen a quite similar problem, but I am not sure if it's the same. Try passing a clientMutationId to the mutation, and increment it along:
let temp = 0;

const commit = (
  input,
  onCompleted: (response) => void,
) => {
  const variables = {
    input: {
      ...input,
      clientMutationId: temp++,
    },
  };
  commitMutation(Environment, {
    mutation,
    variables,
    onCompleted,
    onError: null,
    updater: store => {
      // ....
    },
  });
};
Try something like this and let me know if it fixes the issue :).
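Another angle on the duplicate-entries symptom: guard the updater so the same record is never linked twice. A minimal sketch of the dedup check, using plain objects as stand-ins for Relay store records (real code would compare record dataIDs via getDataID()):

```javascript
// Sketch: only append a record to the linked list if its id is not
// already present, so re-running the updater cannot duplicate entries.
function appendIfNew(existing, record, getId) {
  return existing.some(r => getId(r) === getId(record))
    ? existing
    : [...existing, record];
}

const getId = r => r.id;
let lines = [{ id: 'q1' }];
lines = appendIfNew(lines, { id: 'q2' }, getId);
lines = appendIfNew(lines, { id: 'q2' }, getId); // second call is a no-op
console.log(lines.length);
```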
I have been trying to update the DB index dynamically and it keeps failing; I've been stuck for a few days.
I'm using Angular 7 & TypeScript and the latest Dexie version. When I try to use the sample code, it gives me an error.
Is there anything I should do to get it working? Thanks!
ERROR Error: Uncaught (in promise): UpgradeError: Dexie specification of currently installed DB version is missing
UpgradeError: Dexie specification of currently installed DB version is missing
I literally just copy pasted the sample code here:
changeSchema(db, schemaChanges) {
  db.close();
  const newDb = new Dexie(db.name);
  newDb.version(db.verno + 1).stores(schemaChanges);
  return newDb.open();
}

// Open database dynamically:
async playAround() {
  let db = new Dexie('FriendsDatabase');
  if (!(await Dexie.exists(db.name))) {
    db.version(1).stores({});
  }
  await db.open();
  // Add a table with some indexes:
  db = await this.changeSchema(db, { friends: 'id, name' });
  // Add another index in the friends table
  db = await this.changeSchema(db, { friends: 'id, name, age' });
  // Remove the age index again:
  db = await this.changeSchema(db, { friends: 'id, name' });
  // Remove the friends table
  db = await this.changeSchema(db, { friends: null });
}
This sample was faulty. I've updated the docs with a working sample:
async function changeSchema(db, schemaChanges) {
  db.close();
  const newDb = new Dexie(db.name);
  newDb.on('blocked', () => false); // Silence console warning of blocked event.
  // Workaround: If DB is empty from tables, it needs to be recreated
  if (db.tables.length === 0) {
    await db.delete();
    newDb.version(1).stores(schemaChanges);
    return await newDb.open();
  }
  // Extract current schema in dexie format:
  const currentSchema = db.tables.reduce((result, { name, schema }) => {
    result[name] = [
      schema.primKey.src,
      ...schema.indexes.map(idx => idx.src)
    ].join(',');
    return result;
  }, {});
  console.log("Version: " + db.verno);
  console.log("Current Schema: ", currentSchema);
  // Tell Dexie about current schema:
  newDb.version(db.verno).stores(currentSchema);
  // Tell Dexie about next schema:
  newDb.version(db.verno + 1).stores(schemaChanges);
  // Upgrade it:
  return await newDb.open();
}

// Open database dynamically:
async function playAround() {
  let db = new Dexie('FriendsDatabase2');
  if (!(await Dexie.exists(db.name))) {
    console.log("Db does not exist");
    db.version(1).stores({});
  }
  await db.open();
  console.log("Could open DB");
  // Add a table with some indexes:
  db = await changeSchema(db, { friends: 'id, name' });
  console.log("Could enforce friends table with id and name");
  // Add another index in the friends table
  db = await changeSchema(db, { friends: 'id, name, age' });
  console.log("Could add the age index");
  // Remove the age index again:
  db = await changeSchema(db, { friends: 'id, name' });
  console.log("Could remove age index");
  // Remove the friends table
  db = await changeSchema(db, { friends: null });
  console.log("Could delete friends table");
}

playAround().catch(err => console.error(err));
Fiddle:
https://jsfiddle.net/dfahlander/jzf2mc7n/
I really can't understand why, when I run a bulk insert, I lose the previous data in the same collection without executing any delete operation. This is weird. Any idea?
var client = new elasticsearch.Client( {
hosts: [
'http://localhost:9200/'
]
})
.
.
.
InserTweets: function (arrayobj, callback) {
  var items = [];
  var count = 1;
  arrayobj.forEach(element => {
    items.push({ index: { _index: 'twitter', _type: 'tweet', _id: count } }, element);
    count++;
  });
  client.bulk({ body: items }, function (err, resp, status) {
    if (err) {
      console.log(err);
    }
    callback(err, resp, status);
  });
}
You are setting the _id to the count, so on the second operation it's overwriting/updating the existing records with the new ones.
The _id needs to be unique for each record.
Does element have anything unique, like its own id, which you could use?