How to convert cqlengine resultset objects into JSON format - cassandra-2.0

I am writing an API which queries a Cassandra 2.1.2 based database and returns results in JSON format. I am using cqlengine for this.
Here is the simplified schema -
class Checkins(Model):
    Model.__comment__ = "Table mapping for submit datastore"
    changelist = columns.Integer(primary_key=True)    # Changelist number
    checkin_date = columns.DateTime()                 # Submit time
    stream_name = columns.Ascii(primary_key=True)     # Stream name
    creator = columns.Ascii()                         # Creator
My query is this
clobj = Checkins.objects(changelist=changelist).get()
How do I convert the result set into JSON format?

You can create dictionaries from models as of cqlengine 0.12, and from there you can use the json module to produce a JSON string. You do have to be careful because datetimes are not JSON serializable, so you will need to convert them to strings first (or look at this question for other ways to fix the datetime serialization problem).
import json

clobj = Checkins.objects(changelist=changelist).get()
clobj_dict = dict(clobj)
clobj_dict['checkin_date'] = str(clobj_dict['checkin_date'])
json_string = json.dumps(clobj_dict)
Or you could add it as a property on the class:
import json

class Checkins(Model):
    # Define your model as before
    # ...

    @property
    def json(self):
        # Same idea as before: build a dict and stringify the datetime.
        json_dict = dict(self)
        json_dict['checkin_date'] = str(json_dict['checkin_date'])
        return json.dumps(json_dict)

# Just call the property.
clobj = Checkins.objects(changelist=changelist).get()
json_string = clobj.json
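Alternatively, a small sketch (not specific to cqlengine) that avoids converting fields by hand is to pass a default callable to json.dumps, so anything it cannot serialize natively is stringified:

import json

# Sketch: json.dumps calls `default` for any value it cannot serialize
# natively (datetimes, UUIDs, ...), so no manual str() conversion is needed.
clobj = Checkins.objects(changelist=changelist).get()
json_string = json.dumps(dict(clobj), default=str)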

Related

Use generic types in Python to refactor code

I have been trying several ways to refactor the following code as these classes are recurring in my app:
class CreateRecord(Mutation):
    record = Field(lambda: Unit)

    class Arguments:
        input = CreateInput(required=True)

    def mutate(self, info, input):
        data = input_to_dictionary(input)
        data['createdAt'] = datetime.utcnow()
        # data['createdBy'] = <user> # TODO: <user> input
        record = UnitModel(**data)
        db_session.add(record)
        db_session.commit()
        return CreateRecord(record=record)


class UpdateRecord(Mutation):
    record = Field(lambda: Unit)

    class Arguments:
        input = UpdateInput(required=True)

    def mutate(self, info, input):
        data = input_to_dictionary(input)
        data['updatedAt'] = datetime.utcnow()
        # data['updatedBy'] = <user> # TODO: <user> input
        record = db_session.query(UnitModel).filter_by(id=data['id'])
        record.update(data)
        db_session.commit()
        record = db_session.query(UnitModel).filter_by(id=data['id']).first()
        return UpdateRecord(record=record)


class DeleteRecord(Mutation):
    record = Field(lambda: Unit)

    class Arguments:
        input = DeleteInput(required=True)

    def mutate(self, info, input):
        data = input_to_dictionary(input)
        data['deletedAt'] = datetime.utcnow()
        # data['deletedBy'] = <user> # TODO: <user> input
        data['isDeleted'] = True
        record = db_session.query(UnitModel).filter_by(id=data['id'])
        record.update(data)
        db_session.commit()
        record = db_session.query(UnitModel).filter_by(id=data['id']).first()
        return DeleteRecord(record=record)
I was thinking of using generic types but I'm kind of stuck on how to implement them. I've tried creating a master class whose mutate method just checks whether the action is a create, update, or delete, but I'd still like to explore generic types before I settle on that.
Any help is highly appreciated. TIA.
I've solved this particular problem for myself with a mixin class that includes the following method:
from graphene.utils.str_converters import to_snake_case

class MutationResponseMixin(object):

    @classmethod
    def get_operation_type(cls):
        """
        Determine the CRUD type from the mutation class name.

        Uses the mutation's class name to determine the correct operation
        (create / update / delete).
        """
        return to_snake_case(cls.__name__).split('_')[0]
This allows me to include a mutate method in the mixin that is shared by the create, update, and delete mutations and takes conditional action based on the value of get_operation_type.
I also needed a way to determine the base record from the mixin's mutate (which in your case would be UnitModel), so in my case I ended up declaring it explicitly as an attribute of each mutation class.
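As a rough sketch of how that can fit together (this is an assumption about the shape of the shared mutate, not the exact code; Unit, UnitModel, CreateInput, input_to_dictionary, and db_session are the names from the question, and it assumes graphene accepts mutate as a classmethod):

from datetime import datetime

from graphene import Field, Mutation
from graphene.utils.str_converters import to_snake_case


class MutationResponseMixin(object):
    # Each concrete mutation declares the SQLAlchemy model it operates on.
    model = None

    @classmethod
    def get_operation_type(cls):
        """Derive 'create' / 'update' / 'delete' from the class name."""
        return to_snake_case(cls.__name__).split('_')[0]

    @classmethod
    def mutate(cls, root, info, input):
        data = input_to_dictionary(input)
        operation = cls.get_operation_type()

        if operation == 'create':
            data['createdAt'] = datetime.utcnow()
            record = cls.model(**data)
            db_session.add(record)
            db_session.commit()
        else:
            if operation == 'delete':
                data['deletedAt'] = datetime.utcnow()
                data['isDeleted'] = True
            else:
                data['updatedAt'] = datetime.utcnow()
            db_session.query(cls.model).filter_by(id=data['id']).update(data)
            db_session.commit()
            record = db_session.query(cls.model).filter_by(id=data['id']).first()

        # cls(record=record) builds the payload object for the subclass.
        return cls(record=record)


class CreateRecord(MutationResponseMixin, Mutation):
    model = UnitModel
    record = Field(lambda: Unit)

    class Arguments:
        input = CreateInput(required=True)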

How can I create a complete_name field in a custom module for a custom hierarchy like used on product categories in Odoo?

I'm trying to create a field "complete_name" that displays a hierarchy name similar to what's done on the product categories grid, but I can't seem to get it to work. It just puts Odoo in an endless loading screen when I access the relevant view using the new field "complete_name".
I have tried to copy the code used in addons/product/product.py and migrate to work with Odoo 9 API by using compute instead of .function type but it did not work.
Can someone help me understand what's wrong? Below is my model class, which works fine without the complete_name field in my view.
class cb_public_catalog_category(models.Model):
    _name = "cb.public.catalog.category"
    _parent_store = True

    parent_left = newFields.Integer(index=True)
    parent_right = newFields.Integer(index=True)
    name = newFields.Char(string='Category Name')
    child_id = newFields.One2many('catalog.category', 'parent_id', string='Child Categories')
    complete_name = newFields.Char(compute='_name_get_fnc', string='Name')

    def _name_get_fnc(self):
        res = self.name_get(self)
        return dict(res)
Your compute method is supposed to assign a value to the field on the record, not return a value. Ensure the value you are assigning to complete_name is a string.
Also, name_get() returns a list of (id, name) tuples. I am not sure if you really want a string representation of that or just the actual name value.
Try this
def _name_get_fnc(self):
    self.complete_name = self.name_get()[0][1]
If you really want what is returned by name_get() then try this.
def _name_get_fnc(self):
    self.complete_name = str(self.name_get())
If you are still having issues I would incorporate some logging to get a better idea of what you are setting the value of complete_name to.
import logging

_logger = logging.getLogger(__name__)

def _name_get_fnc(self):
    _logger.info("COMPUTING COMPLETE NAME")
    _logger.info("COMPLETE NAME: " + str(self.name_get()))
    self.complete_name = self.name_get()
If this does not make it apparent what the issue is you could always try statically assigning it a value in the off chance that there is a problem with your view.
def _name_get_fnc(self):
    self.complete_name = "TEST COMPLETE NAME"
After further review I think I have the answer to my own question. It turns out, as with a lot of things, it's very simple.
Simply use "_inherit" to inherit from the product.category model. This gives access to all the functions and fields of product.category, including the complete_name field, and computes the name from my custom model data. I was able to remove my _name_get_fnc and just use the inherited function.
The final model definition is below. Once this update was complete I was able to add a "complete_name" field to my view and the results were as desired!
class cb_public_catalog_category(models.Model):
    _name = "cb.public.catalog.category"
    _inherit = 'product.category'
    _parent_store = True

    parent_left = newFields.Integer(index=True)
    parent_right = newFields.Integer(index=True)
    name = newFields.Char(string='Category Name')
    child_id = newFields.One2many('catalog.category', 'parent_id', string='Child Categories')

Update hash with Rails 4

I have a hash object:
@chosen_opportunity = {"id"=>66480, "prize_id"=>4, "admin_user_id"=>1, "created_at"=>2015-09-20 18:37:29 +0200, "updated_at"=>2015-09-20 18:37:29 +0200, "opportunity_available"=>true}
How do I update the value of deal_available to false?
I tried this but it fails:
@chosen_opportunity['deal_available'] = false
@chosen_opportunity.save
controllers/deal_controller.rb:
def show_opportunity
  @deal = Deal.friendly.find(params[:id])
  @chosen_opportunity = Opportunity.find_by_sql(
    " SELECT \"opportunities\".*
      FROM \"opportunities\"
      WHERE (deal_id = #{@deal.id}
        AND opportunity_available = true)
      ORDER BY \"opportunities\".\"id\" ASC LIMIT 1"
  )
  # comes from http://apidock.com/rails/ActiveRecord/Base/find_by_sql/class
  @chosen_opportunity[0].attributes['opportunity_available'] = false
  @chosen_opportunity[0].save

  respond_to do |format|
    format.js
  end
end
Can I update the value of opportunity_available from the Opportunity model inside a Deal controller? Is that why it's not working?
I know I could use Active Record, but I need to use raw PostgreSQL for the first query. Thanks for your understanding of this not very Rails-y approach.
You can change your code to:
@chosen_opportunity = Opportunity.find_by(deal_id: @deal.id, opportunity_available: true)
@chosen_opportunity.opportunity_available = false
@chosen_opportunity.save!
This is much more Rails compatible. In addition, if I'm not mistaken, Rails won't let you save an object that you got through find_by_sql, so at the least you'd need to get a proper model object from the result. You can write (very ugly) code like:
Opportunity.where(id: @chosen_opportunity[0].attributes['id'])
           .update_all(opportunity_available: false)
Warning: this will update the database, but not the @chosen_opportunity[0] object.
There is no save method on a hash. You have to call save on the model object that the hash of attributes comes from.

Use JSON result in query using Ruby, Sinatra and PostgreSQL

I am using Ruby and the Plivo API to create a subaccount.
The code is:
AUTH_ID = "my_id"
AUTH_TOKEN = "my_token"

p = RestAPI.new(AUTH_ID, AUTH_TOKEN)
params = {'name' => 'thegreatone'}
response = p.create_subaccount(params)
# Up to here it is fine: without my attempts below to access the JSON
# response, it works and the account is created.
The JSON response is:
[{"auth_token"=>"ZjgxMGQwMTY2NGY3Nzk3ZmM3ZGE3ZmIxMGQyZWYy",
"message"=>"created",
"api_id"=>"2c1eff4a-b955-11e2-8361-123141011ae6",
"auth_id"=>"SAMZBJOGZKZDIXMMEXNJ"}]
I would like to "extract" the "auth_token" and "auth_id" so that I can insert them into my database.
So I have tried (among other things):
obj = JSON.parse(response)
:user_key = obj['auth_token']
The message in my terminal is:
syntax error, unexpected '=', expecting $end
:user_key = obj['auth_token']
How can I extract these variables and then pass them to my insert query?
I am using Postgres with Sequel, Ruby, and Sinatra.
You're trying to assign a value to a symbol and the response returns an array:
obj = JSON.parse(response)
:user_key = obj['auth_token']
should be
obj = JSON.parse(response).first
user_key = obj['auth_token']
Symbols are not variables; they are immutable values, so you cannot assign to them.
The response is an array containing the status code and then the data, so your code should be:
obj = JSON.parse(response).last
user_key = obj['auth_token']
Edit: it turns out the gem already does the parsing for you (see plivo.rb), so the code would be:
obj = response.last
user_key = obj['auth_token']

Django - JSON response with serialized object

I am trying to send a serialized object using JSON. Here is my view code:
if request.is_ajax():
    resp = {}
    if request.POST:
        if form.is_valid():
            g = form.save()
            resp['graph'] = serializers.serialize('json', [g, ])
            resp['success'] = True
        else:
            resp['errors'] = form.errors
            resp['success'] = False
    return HttpResponse(simplejson.dumps(resp), mimetype='application/javascript')
return render(request, 'graph/inlines/create.html', {'form': form})
The problem is (rather obviously) that the 'graph' object I am trying to return is being serialized twice (once with serializers.serialize and again when I use simplejson.dumps), so the client receives it as a JSON string rather than an object.
I tried just doing this:
resp['graph'] = g
But it throws a server error as the object obviously isn't serialized when I try to use simplejson.dumps.
Is there a way I can tell it to ignore this key when dumping the data? Would appreciate any help.
Rather than serializing the graph object to JSON the first time, use serializers.serialize('python', [g]) to convert it to plain Python dictionaries first. Then the whole response is converted to JSON once, at the end.
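A minimal sketch of the AJAX branch with that change (the form variable is the same hypothetical ModelForm instance as in the question's view):

from django.core import serializers
from django.http import HttpResponse
import json


def ajax_response(form):
    resp = {}
    if form.is_valid():
        g = form.save()
        # The 'python' serializer yields plain dicts/lists instead of a JSON
        # string, so the payload is encoded to JSON exactly once below.
        resp['graph'] = serializers.serialize('python', [g])
        resp['success'] = True
    else:
        resp['errors'] = form.errors
        resp['success'] = False
    # default=str keeps values such as datetimes from breaking json.dumps.
    return HttpResponse(json.dumps(resp, default=str),
                        content_type='application/json')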
