I'm using Ruby on Jets with Dynomite to work with DynamoDB, and I have a problem with a GSI (global secondary index).
I have a table with three fields: display, value, and title_hilight. I need to search across all three fields, so I decided to use global secondary indexes. For testing purposes, I started by adding a GSI for the "display" field.
I created a migration:
class SomeTableMigration < Dynomite::Migration
  def up
    create_table 'table-name' do |t|
      t.partition_key "id: string: hash" # required
      t.gsi do |i|
        i.partition_key "display: string"
      end
    end
  end
end
Then I created a model:
require "active_model"

class ModelName < ApplicationItem
  self.set_table_name 'some-model-name'
  column :id, :display, :val, :title_hilight
end
Now I'm trying to find a record by the value of the "display" field:
ModelName.where({display: 'asd'})
and I'm getting this error:
Aws::DynamoDB::Errors::ValidationException (Query condition missed key schema element)
Here is the output of aws dynamodb describe-table --table-name table-name --endpoint-url http://localhost:8000
{
"Table": {
"AttributeDefinitions": [
{
"AttributeName": "id",
"AttributeType": "S"
},
{
"AttributeName": "display",
"AttributeType": "S"
}
],
"TableName": "some-table-name",
"KeySchema": [
{
"AttributeName": "id",
"KeyType": "HASH"
}
],
"TableStatus": "ACTIVE",
"CreationDateTime": "2020-10-26T14:52:59.589000+03:00",
"ProvisionedThroughput": {
"LastIncreaseDateTime": "1970-01-01T03:00:00+03:00",
"LastDecreaseDateTime": "1970-01-01T03:00:00+03:00",
"NumberOfDecreasesToday": 0,
"ReadCapacityUnits": 5,
"WriteCapacityUnits": 5
},
"TableSizeBytes": 112,
"ItemCount": 1,
"TableArn": "arn:aws:dynamodb:ddblocal:000000000000:table/some-table-name",
"GlobalSecondaryIndexes": [
{
"IndexName": "display-index",
"KeySchema": [
{
"AttributeName": "display",
"KeyType": "HASH"
}
],
"Projection": {
"ProjectionType": "ALL"
},
"IndexStatus": "ACTIVE",
"ProvisionedThroughput": {
"ReadCapacityUnits": 5,
"WriteCapacityUnits": 5
},
"IndexSizeBytes": 112,
"ItemCount": 1,
"IndexArn": "arn:aws:dynamodb:ddblocal:000000000000:table/some-table-name/index/display-index"
}
]
}
}
I replaced the real table name with some-table-name (sometimes just table-name); the rest of the code is unchanged. Thanks for any help.
As the DynamoDB documentation mentions:
In DynamoDB, you can optionally create one or more secondary indexes on a table and query those indexes in the same way that you query a table.
You need to specify the GSI name explicitly in your query.
jny's answer is correct: he told me to use the index explicitly. I don't know how to use a different model (see the comments on his answer), but the idea of querying the index is exactly right. This is how everything works for me now:
ModelName.query(
index_name: 'display-index',
expression_attribute_names: { "#display_name" => "display" },
expression_attribute_values: { ":display_value" => "das" },
key_condition_expression: "#display_name = :display_value",
)
Related
I have a "users" table that is connected to an "interestTags" table. I would like to search users' interestTags and return all users that match one or more tags; in this example, all users that have an interestTag of either "dog" or "apple".
The code below only returns matches for "apple" and leaves out the "dog" users. I would like to get both the "dog" users and the "apple" users instead of one or the other. How would I go about doing this? Here is my code:
users(offset: $offset, limit: 30, order_by: {lastRequest: asc}, where: {dob: {_gte: $fromDate, _lte: $toDate}, interestTagsFromSenderId: {_or: [{tag: $tagList}]}}) {
id
displayName
profilePhotoUrl
dob
bio
location
interestTags: interestTagsFromSenderId {
tag
}
created_at
}
}
GraphQL query variables:
{
"offset": 0,
"fromDate": "1999-07-01",
"toDate": "2024-01-01",
"tagList":
{
"_eq": "dog", "_eq": "apple"
}
}
This is what GraphQL returns:
{
"data": {
"users": [
{
"id": 31,
"displayName": "n00b account",
"profilePhotoUrl": "default.jpg",
"dob": "2021-07-15",
"bio": null,
"location": null,
"interestTags": [
{
"tag": "apple"
}
],
"created_at": "2021-07-15T06:57:23.068243+00:00"
}
]
}
}
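The root cause is the variables object: a JSON object cannot carry the "_eq" key twice, and most parsers (Ruby's included) silently keep only the last value, so the query only ever matched "apple". A quick illustration in Ruby:

```ruby
require "json"

# Duplicate keys in a JSON object: the last occurrence wins
vars = JSON.parse('{"_eq": "dog", "_eq": "apple"}')
puts vars.inspect # => {"_eq"=>"apple"}
```

This matches the observed behavior: only the "apple" users came back, because the "dog" condition was discarded before the query ever ran.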
To fix the issue:
I added $tagList: [interestTags_bool_exp!] to the query function
I changed the query to interestTagsFromSenderId: {_or: $tagList}}
And changed the variable query to { "tagList": [{"tag": {"_eq": "dog"}}, {"tag": {"_eq": "apple"}}]}
I'm new to DynamoDB. I am trying to create a DynamoDB table in a SAM project. I know I can only use "S", "N", and "B" for AttributeType, but I want to use something like the below:
{
"TableName": "xyz",
"KeySchema": [
{
"AttributeName": "uid",
"KeyType": "HASH"
}
],
"AttributeDefinitions": [
{
"AttributeName": "uid",
"AttributeType": "S"
},
{
"AttributeName": "email",
"AttributeType": "S"
},
{
"AttributeName": "postal_code",
"AttributeType": "S"
},
{
"AttributeName": "bookmark",
"AttributeType": "L" ← (I want to use List)
},
{
"AttributeName": "children",
"AttributeType": "M" ← (I want to use Map)
}
],
"ProvisionedThroughput": {
"ReadCapacityUnits": 2,
"WriteCapacityUnits": 2
}
This is my table.json, and I want to create the table with this AWS CLI command:
aws dynamodb --profile local --endpoint-url http://localhost:8000 create-table --cli-input-json file://./testdata/table.json
How do you store list and map data in DynamoDB?
Lists and maps are set when you add items to the table, not in the table definition. AttributeDefinitions only describes attributes that are used in a key schema (of the table or of an index), and key attributes must be scalar, which is why only "S", "N", and "B" are accepted there. Beyond the primary key, DynamoDB has a flexible schema and does not enforce one: item A might have attribute1, while item B omits it.
When you create your table, define just the primary key (either a partition key, or partition key + sort key) and that's it. Then add your items with whatever data-typed attributes each item needs, including lists ("L") and maps ("M").
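To make the distinction concrete, here is a small, hypothetical helper (not part of any SDK) that builds the low-level DynamoDB AttributeValue representation for an item. The "L" and "M" type tags appear here, on the item being written, never in AttributeDefinitions:

```ruby
# Hypothetical helper: convert plain Ruby values into DynamoDB's
# low-level AttributeValue JSON. "L" and "M" are item-level types.
def to_attribute_value(value)
  case value
  when String      then { "S" => value }
  when Numeric     then { "N" => value.to_s }  # DynamoDB sends numbers as strings
  when Array       then { "L" => value.map { |v| to_attribute_value(v) } }
  when Hash        then { "M" => value.transform_values { |v| to_attribute_value(v) } }
  when true, false then { "BOOL" => value }
  when nil         then { "NULL" => true }
  else raise ArgumentError, "unsupported type: #{value.class}"
  end
end

item = {
  "uid"      => "u-1",
  "bookmark" => ["https://a.example", "https://b.example"],  # a List
  "children" => { "name" => "kid", "age" => 4 }              # a Map
}
wire = item.transform_values { |v| to_attribute_value(v) }
# wire["bookmark"] => {"L"=>[{"S"=>"https://a.example"}, {"S"=>"https://b.example"}]}
# wire["children"] => {"M"=>{"name"=>{"S"=>"kid"}, "age"=>{"N"=>"4"}}}
```

The table definition never sees "bookmark" or "children" at all; only "uid" (the key attribute) belongs in AttributeDefinitions.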
I thought SS (String Set) could be used for a list of unique strings, e.g.:
...
{
"AttributeName": "bookmark",
"AttributeType": "SS"
},
...
Ref: DynamoDB Supported Data Types
Note, though, that create-table rejects this too: AttributeDefinitions only accepts "S", "N", and "B", because it describes key attributes. Set types, like "L" and "M", can only appear on items.
I'm new to DynamoDB.
When I read data from the table with AWS.DynamoDB.DocumentClient class, the query works but I get the result in the wrong format.
Query:
{
TableName: "users",
ExpressionAttributeValues: {
":param": event.pathParameters.cityId,
":date": moment().tz("Europe/London").format()
},
FilterExpression: ":date <= endDate",
KeyConditionExpression: "cityId = :param"
}
Expected:
{
"user": "boris",
"phones": ["+23xxxxx999", "+23xxxxx777"]
}
Actual:
{
"user": "boris",
"phones": {
"type": "String",
"values": ["+23xxxxx999", "+23xxxxx777"],
"wrapperName": "Set"
}
}
Thanks!
The unmarshall function from AWS.DynamoDB.Converter is one solution if your data comes back like this:
{
"Attributes": {
"last_names": {
"S": "UPDATED last name"
},
"names": {
"S": "I am the name"
},
"vehicles": {
"NS": [
"877",
"9801",
"104"
]
},
"updatedAt": {
"S": "2018-10-19T01:55:15.240Z"
},
"createdAt": {
"S": "2018-10-17T11:49:34.822Z"
}
}
}
Notice the object/map per attribute holding the attribute type: it means you are using the low-level AWS.DynamoDB client and not the DynamoDB.DocumentClient.
unmarshall converts a DynamoDB record into a JavaScript object, as documented by AWS: https://docs.aws.amazon.com/AWSJavaScriptSDK/latest/AWS/DynamoDB/Converter.html#unmarshall-property
Nonetheless, I faced the exact same use case as yours: with only one set-typed attribute (NS in my case), I had to unwrap it manually. Here's a snippet:
// setName holds your set attribute's name, e.g. "phones"
ddbTransHandler.update(params).promise().then((value) => {
  value.Attributes[setName] = value.Attributes[setName].values;
  return value; // or value.Attributes
});
Cheers,
Hamlet
Suppose I have these documents in a Things table:
{
"name": "Cali",
"state": "CA"
},
{
"name": "Vega",
"state": "NV",
},
{
"name": "Wash",
"state": "WA"
}
My UI is a state-picker where the user can select multiple states. I want to display the appropriate results. The SQL equivalent would be:
SELECT * FROM Things WHERE state IN ('CA', 'WA')
I have tried:
r.db('test').table('Things').filter(r.expr(['CA', 'WA']).contains(r('state')))
but that doesn't return anything, and I don't understand why it doesn't work.
This works for getting a single state:
r.db('test').table('Things').filter(r.row('state').eq('CA'))
r.db('test').table('Things').filter(r.expr(['CA', 'WA']).contains(r.row('state')))
seems to work in some versions, returning:
[
{
"id": "b20cdcab-35ab-464b-b10b-b2f644df73e6" ,
"name": "Cali" ,
"state": "CA"
} ,
{
"id": "506a4d1f-3752-409a-8a93-83385eb0a81b" ,
"name": "Wash" ,
"state": "WA"
}
]
Anyway, you can use a function instead of r.row:
r.db('test').table('Things').filter(function(row) {
return r.expr(['CA', 'WA']).contains(row('state'))
})
I am trying to move data between services and need to remove a recurring hash from a large record that contains both hashes and arrays.
The hash to remove from every section of the record is:
{
"description": "simple identifier",
"name": "id",
"type": "id"
},
Here's example data:
{"stuff": { "defs": [
{
"description": "simple identifiery",
"name": "id",
"type": "id"
},
{
"name": "aDate",
"type": "date"
},
{
"defs": [
{
"description": "simple identifier",
"name": "id",
"type": "id"
},
{
"case-sensitive": true,
"length": null,
"name": "Id",
"type": "string"
},
{
"name": "anotherDate",
"type": "dateTime"
}
],
},
{
"defs": [
{
"description": "simple identifier",
"name": "id",
"type": "id"
},
...lots more....
I created a couple of recursive functions to remove the element(s), but I was left with an empty hash '{}'. I also tried removing the parent, but found that I removed the hash's parent rather than the hash itself.
I'm pretty sure I could create a new hash and populate it with the data I want, but there must be a better way to do this.
I am not working in Rails and would like to avoid Rails gems.
I figured this out by looking more closely at the data structure. The hashes that need to be removed are always inside an array, so before recursing, check whether an array element matches the key/value and delete it if so. I'm sure this could be coded better, so let me know what you think:
def recursive_delete!(node, key, value)
  if node.is_a?(Array)
    # Remove matching hashes at this level, then recurse into what's left
    node.delete_if { |elm| elm[key] == value }
    node.each do |elm|
      recursive_delete!(elm, key, value)
    end
  elsif node.is_a?(Hash)
    # Walk every value looking for nested arrays/hashes
    node.each_value do |v|
      recursive_delete!(v, key, value)
    end
  end
end
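For example, applied to a trimmed-down version of the structure from the question (repeating the method so the snippet runs standalone):

```ruby
# Same method as above, included so this example is self-contained
def recursive_delete!(node, key, value)
  if node.is_a?(Array)
    node.delete_if { |elm| elm[key] == value }
    node.each { |elm| recursive_delete!(elm, key, value) }
  elsif node.is_a?(Hash)
    node.each_value { |v| recursive_delete!(v, key, value) }
  end
end

record = {
  "stuff" => {
    "defs" => [
      { "description" => "simple identifier", "name" => "id", "type" => "id" },
      { "name" => "aDate", "type" => "date" },
      { "defs" => [
          { "description" => "simple identifier", "name" => "id", "type" => "id" },
          { "name" => "anotherDate", "type" => "dateTime" }
        ] }
    ]
  }
}

recursive_delete!(record, "name", "id")
# record is now:
# {"stuff"=>{"defs"=>[{"name"=>"aDate", "type"=>"date"},
#                     {"defs"=>[{"name"=>"anotherDate", "type"=>"dateTime"}]}]}}
```

Matching on "name" => "id" removes the identifier hash at every nesting level while leaving its siblings intact.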
If you are looking for a way to delete a given hash from inside a complex Array/Hash data structure, it's easy:
def remove_hash_from(source, hsh)
return unless source.is_a?(Hash) || source.is_a?(Array)
source.each do |*args|
if args.last == hsh
source.delete(args.first)
elsif args.last.is_a?(Hash) || args.last.is_a?(Array)
remove_hash_from(args.last, hsh)
end
end
source
end
data = [
{h: 'v',
j: [{h: 'v'},
{a: 'c'},
8,
'asdf']
},
asdf: {h: 'v', j: 'c'}
]
remove_hash_from(data, {h: 'v'})
# => [{:h=>"v", :j=>[{:a=>"c"}, 8, "asdf"]}, {:asdf=>{:h=>"v", :j=>"c"}}]
You may need to adjust the method above for your needs, but I hope the common idea is clear.