This is related to another StackOverflow question from a year ago, but is a bit different.
This is in Ruby with Mongoid 2.2.6.
When I got going with MongoDB, it looks like we stored the ID of a MongoDB document incorrectly: a simple find on the ID is not working, yet the document is returned when we run a where query on other attributes.
I've tried "resetting" the id field by using the object returned from the where query and setting both "id" and "_id" to a BSON::ObjectId version of the stored string. This did not work; the record is still not queryable by this field afterwards.
Any other suggestions before I completely wipe the disk (losing months of production data) and start over?
Edit: An example of a document that is returned in the loop but is not retrievable by its ID:
{"_id"=>"4f47267193546d160b0171a2", "attribute_tags"=>[{"tag"=>"website"}, {"tag"=>"twitter"}, {"tag"=>"website"}, {"tag"=>"twitter"}], "contact_info"=>{"facebook"=>[], "success"=>true, "created_at"=>2012-02-24 05:58:06 UTC, "tags"=>[], "twitter"=>[], "email"=>[], "phone"=>[], "linkedin"=>[], "google_plus"=>[], "youtube"=>[], "contact_form"=>false}, "created_at"=>2012-02-24 05:56:01 UTC, "data"=>{"twitter_followers_count"=>112, "twitter_is_translator"=>112, "twitter_protected"=>false, "twitter_url"=>"http://www.bettyunderground.com", "twitter_verified"=>false, "twitter_statuses_count"=>2040, "twitter_listed_count"=>14, "twitter_geo_enabled"=>true, "twitter_friends_count"=>124, "twitter_created_at"=>"Fri Jul 17 21:41:00 +0000 2009", "twitter_contributors_enabled"=>false, "enriched_at"=>2012-02-24 05:58:09 UTC}, "demographics"=>{}, "description"=>"The trials and tribulations of a polemicist", "directory_ids"=>[], "forums"=>[], "found_at_url"=>"http://www.bettyunderground.com", "geographics"=>{"language"=>"en", "location"=>"San Francisco, CA"}, "hashtags"=>{"tag"=>{"website"=>true, "twitter"=>true}, "reachable_via"=>{"twitter"=>true}}, "host_names"=>[], "ignore_project_ids"=>[], "keyword_scores"=>{"return policy"=>0.0}, "keywords"=>["return policy"], "last_contact_info_update"=>2012-02-24 05:58:09 UTC, "name"=>"Betty Underground", "new_profiles"=>[{"service"=>"twitter", "user_id"=>"BettyUndergrnd", "score"=>1.0}, {"service"=>"twitter", "username"=>"BettyUndergrnd", "score"=>1.0}], "presence_score"=>0, "profile_url"=>"http://a2.twimg.com/profile_images/1459407098/image_normal.jpg", "profiles_retrieved"=>true, "references"=>[], "share_counts"=>{}, "tags"=>["website", "twitter"], "twitter"=>"BettyUndergrnd", "updated_at"=>2012-03-17 10:08:09 UTC, "wordsmaster_ids"=>[], "reachable_via"=>[], "read_project_ids"=>[]}
It doesn't have an ObjectId for the _id field. Not sure how it got busted this way, but that's the way it is.
The code I'm using to modify it is:
#if d is the document
old_id = d._id
d["_id"] = BSON::ObjectId(old_id)
d.save
I have put up a gist of doing this from my console, so you can see exactly what I'm doing: https://gist.github.com/2087011
Any thoughts would be appreciated.
The _id field is immutable. You have to insert a new document with a new _id value and delete the old one.
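A minimal sketch of that from the console, assuming d is the broken document found via the where query and that the model exposes its raw driver collection (Mongoid 2.x does, via Model.collection); names here are illustrative:
# d was found via Model.where(...), not by id
doc = d.attributes.dup
old_id = doc["_id"]                              # the stored string id
doc["_id"] = BSON::ObjectId.from_string(old_id)  # a proper ObjectId
Model.collection.insert(doc)                     # re-insert under the new _id
Model.collection.remove("_id" => old_id)         # drop the old string-keyed copy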
In Mongoid, there is a rake task to convert the ids to ObjectIds.
If you use this, you'll have a mirror of your collection. Then, simply rename and you'll be set.
It will error if you have duplicate object_ids though, so you might need to run it a few times.
And, it's SLOW.
I'm sending LUIS a query that is based on a time value (e.g. "what is the time 10 minutes from now" - just an example). I want the time to come back in the local timezone, so on the LuisPredictionOptions object (C#) I set the TimezoneOffset (as an example I set it to 2 hours ahead, or 120 minutes).
In Fiddler I can see when it calls the LUIS endpoint it's correctly adding "timezoneOffset=120.0".
However, the time comes back as UTC. It doesn't matter whether timezoneOffset is set, or what it is set to; the time always comes back in UTC when using the built-in datetimeV2 entity.
Does anyone know what the TimezoneOffset property is for? Am I just using it incorrectly? Is there another way perhaps to get a local time from LUIS?
[Update]: Here are some examples: https://westus.api.cognitive.microsoft.com/luis/v2.0/apps/[AppId]?verbose=true&timezoneOffset=0&subscription-key=[subscriptionkey]&q=/luis/v2.0/apps/c1be57f4-3850-489e-8266-db376b82c011?timezoneOffset=120&log=true
https://westus.api.cognitive.microsoft.com/luis/v2.0/apps/[AppId]?verbose=true&timezoneOffset=0&subscription-key=[subscriptionkey]&q=/luis/v2.0/apps/c1be57f4-3850-489e-8266-db376b82c011?timezoneOffset=240&log=true
and I'm trying the following example utterance: "in 10 minutes".
When I do this, the timex is in UTC (e.g. timex=2020-01-11T16:08:25) and the "value" comes back with the same value, minus the "T", as follows: value=2020-01-11 16:08:25
I could perhaps understand the timex being in UTC, but then shouldn't "value" be adjusted by the timezoneOffset?
It looks like there's an incorrect question mark in your URL, right before timezoneOffset.
Using the same query I was able to get the expected behavior, where the returned value is different by 10 minutes.
Which SDK are you using? Perhaps you're using the V3 Runtime SDK, which calls the V3 endpoint; that endpoint doesn't use timezoneOffset but instead uses datetimeReference, so you may need to use the V2 Runtime SDK instead.
https://westus.api.cognitive.microsoft.com/luis/v2.0/apps/[app-id]?verbose=true&timezoneOffset=10&subscription-key=[key]&q=in 10 minutes
The TimeZoneInfo class's FindSystemTimeZoneById method can be used to determine the correct timezoneOffset based on system time. An example in C# is shown below:
// Get CST zone id
TimeZoneInfo targetZone = TimeZoneInfo.FindSystemTimeZoneById("Central Standard Time");
// Get local machine's value of Now
DateTime utcDatetime = DateTime.UtcNow;
// Get Central Standard Time value of Now
DateTime cstDatetime = TimeZoneInfo.ConvertTimeFromUtc(utcDatetime, targetZone);
// Find timezoneOffset
int timezoneOffset = (int)((cstDatetime - utcDatetime).TotalMinutes);
Reference:
https://learn.microsoft.com/en-us/azure/cognitive-services/luis/luis-concept-data-alteration?tabs=V2#c-code-determines-correct-value-of-timezoneoffset
I have this hash, which is built dynamically:
additional_values = {"grouping_id"=>1}
I want to merge it with this record object after creation via first_or_create:
result = model.where(name: 'test').first_or_create do |record|
  # I'm trying to merge any record attributes that exist in my hash:
  record.attributes.merge(additional_values)
  # This works, but it sucks:
  # record.grouping_id = data['grouping_id'] if model.name == 'Grouping'
end

# Not working:
# result.attributes >> {"id"=>1, "name"=>"Test", "grouping_id"=>nil}
I understand that if the record already exists (returned via 'first'), it won't be updated. That would be a nice option, and any recommendations on it are welcome, but the table was just dropped and recreated, so that's not the issue here.
What am I missing?
I also tried symbol keys via to_sym, resulting in:
additional_values = {:grouping_id=>1}
...just in case there was some weirdness I didn't know about. It didn't make a difference.
The problem is that Hash#merge returns a new hash, and you aren't doing anything with that hash; you're just throwing it away. I would also suggest sticking to the ActiveRecord methods for updating attributes instead of manipulating the underlying hash: assign_attributes, or update if you want to save the record. That said, you may find create_with, which can be used with find_or_create_by, useful here:
model.create_with(additional_values).find_or_create_by(name: 'test')
I can't find any documentation I like (if any exists) for first_or_create in recent Rails versions, but if you prefer it to find_or_create_by, then going by the Rails 3 documentation for first_or_create you should be able to do without create_with:
model.where(name: 'test').first_or_create(additional_values)
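If you'd rather keep the first_or_create block from the question, a short sketch of the same idea, using assign_attributes (Rails 3.1+) so the merged attributes actually land on the record being built:
result = model.where(name: 'test').first_or_create do |record|
  # this block only runs when the record is being created
  record.assign_attributes(additional_values)
end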
1) I have an object (JB8XGctiZw) that has two keys: "patternName" and "tempoIntensities". When I run the following code, it fetches the correct values of both keys.
ParseObject parseObject=ParseObject.createWithoutData("AudioPattern", "JB8XGctiZw");
parseObject=parseObject.fetch();
parseObject.pin();
Log.d("pattern",parseObject.getJSONArray("tempoIntensities").toString());
Log.d("pattern",parseObject.getString("patternName"));
2) From the web interface I change the values of BOTH keys.
3) Running the above code again successfully brings me the new value for "patternName", but "tempoIntensities", the JSONArray, is not updated.
The only way to get the JSONArray updated is to clear local storage.
Is this a bug? Is this behavior normal?
Regards
It seems to be an issue with using ParseObject.getJSONArray and ParseObject.getJSONObject. For the moment I fixed this by using ParseObject.get() and then casting the result.
The version of Parse this applies to is 1.8.2, and there is an assigned issue here (you need to be logged in to Facebook to see it):
https://developers.facebook.com/bugs/846833412048862/
I'm trying to use the Ruby Azure SDK to query an Azure Table. I can get the call to work, and if I look at the Wireshark capture it's returning tons of results, but I can't figure out how to iterate through them.
query = {:filter => "Timestamp ge datetime'2015-01-01T00:00:00Z'", :select => ["FileName"]}
result, token = azure_table_service.query_entities("ActivityTable", query)
p result
p token
Shows this as the output.
#<Azure::Table::Entity:0xb8f74fdc #properties={"FileName"=>"LOCKINFO.DAT"}, #table="ActivityTable", #updated=2015-01-06 20:22:14 UTC, #etag=nil>
#<Azure::Table::Entity:0xb8f74f3c #properties={"FileName"=>"Scan000.pdf"}, #table="ActivityTable", #updated=2015-01-06 20:22:14 UTC, #etag=nil>
I tried result.count, result.pop, and others. The documentation really sucks too: https://github.com/Azure/azure-sdk-for-ruby/blob/master/lib/azure/table/table_service.rb. It looks like I'm getting an array of EnumerationResults back, but none of the array calls work.
I also can't figure out how to use the token to get the next set of results but that's after I can figure out how to use the ones I have.
-Update-
p result.class
p token.class
Shows that both are Azure::Table::Entity
Okay! I found an issue with the documentation I guess. I shouldn't use their example since
result = azure_table_service.query_entities("XASActivityTable", query)
returns an array of the expected values. Adding that token variable turns the assignment into Ruby multiple assignment, so the first and second entities are put into the two variables and the rest are discarded.
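For completeness, a sketch of iterating that array and, assuming the 2015-era gem hands back an Azure::Service::EnumerationResults with a continuation_token accessor and accepts :continuation_token as a query option (worth verifying against your gem version), paging through the rest:
results = azure_table_service.query_entities("ActivityTable", query)
# the result behaves like an Array, so normal iteration works
results.each do |entity|
  puts entity.properties["FileName"]
end
# hypothetical paging loop -- only if your gem version exposes #continuation_token
while results.continuation_token
  results = azure_table_service.query_entities(
    "ActivityTable",
    query.merge(:continuation_token => results.continuation_token))
  results.each { |entity| puts entity.properties["FileName"] }
end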
You can get the status of an entity by using the statement below:
status = result.properties['status']
I'm trying to embed a document inside an existing document using the Ruby Driver.
Here's what my primary document looks like:
db = Mongo::Connection.new.db("Portfolios")
project_collection = db.collection("Projects")
new_Project = { :url => 'http://www.tekfolio.me/billy/portfolio/focus', :author => 'Billy'}
project_collection.insert(new_Project)
After I've created my new_Project and added it to my project_collection, I may or may not later add another embedded collection, called assets, to the same document. This is where I'm stuck. The following code doesn't seem to do anything:
new_asset = { :image_url => 'http://assets.tekfolio.me/portfolios/68fbb25a-8353-41a8-a779-4bd9762b00f2/projects/13/assets/20/focus2.PNG'}
new_Project.assest.insert(new_asset)
I'm certain I've butchered my understanding of MongoDB, the Ruby driver, and the embedded document concept, and I would appreciate your help getting me out of this wet paper bag I can't seem to get out of ;)
Have you tried just setting the value of asset without insert and instead using update?
new_Project["asset"] = new_asset
project_collection.update({"_id" => new_Project["_id"]}, new_Project)
I think you are trying to "update" the new_Project record with the asset.
It doesn't work because you are only updating the hash in Ruby, not in Mongo. You have to first get a reference to the object in Mongo, update it, and then save it; check this info:
http://www.mongodb.org/display/DOCS/Updating+Data+in+Mongo
(If you can, assign the asset before inserting, and it should work.)
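A sketch of that with the 1.x Ruby driver, using an atomic $push so the asset lands in an embedded array on the stored document without resending the whole thing (the "assets" field name is just illustrative):
# insert returns the generated _id in the 1.x driver
project_id = project_collection.insert(new_Project)
new_asset = { :image_url => 'http://assets.tekfolio.me/portfolios/68fbb25a-8353-41a8-a779-4bd9762b00f2/projects/13/assets/20/focus2.PNG' }
# append the asset to an embedded "assets" array on the stored project
project_collection.update(
  { "_id" => project_id },
  { "$push" => { "assets" => new_asset } })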