I have a serializer like this:
class SurveyResponseSerializer(serializers.ModelSerializer):
    respondent_profile = RespondentsProfileSerializer(read_only=True)

    def create(self, validated_data):
        # ... some stuff ...
        return survey_response

    class Meta:
        model = SurveyResponse
        fields = '__all__'
My problem here is dealing with creating and reading objects using the same serializer. For reading, I want to show the nested instance of respondent_profile with all its fields.
When creating a SurveyResponse (and attaching it to an existing respondent_profile), I want to simply pass the UUID of the existing respondent_profile.
Is this possible or do I need two different serializers?
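One common way to get both behaviours from a single serializer (a sketch, not from the original post; the respondent_profile_id field name and the RespondentProfile model name are assumptions) is to keep the nested serializer read-only and add a write-only primary-key field that targets the same model field:

class SurveyResponseSerializer(serializers.ModelSerializer):
    # read: full nested representation of the profile
    respondent_profile = RespondentsProfileSerializer(read_only=True)
    # write: accept the UUID (primary key) of an existing profile
    respondent_profile_id = serializers.PrimaryKeyRelatedField(
        queryset=RespondentProfile.objects.all(),  # assumed model name
        source='respondent_profile',
        write_only=True,
    )

    class Meta:
        model = SurveyResponse
        fields = '__all__'

With this, writes take respondent_profile_id and validated_data['respondent_profile'] arrives in create() as the already-resolved profile instance, while reads still render the nested respondent_profile.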
I have two Recipe objects in my Recipe model. I am trying to understand what happens when I pass the recipe objects to my serializer like this. What does the system do with those objects, and what does it return?
recipes = Recipe.objects.all().order_by('-id')
serializer = RecipeSerializer(recipes, many=True)
And here is the serializer:
class RecipeSerializer(serializers.ModelSerializer):
    class Meta:
        model = Recipe
        fields = '__all__'
serializer = RecipeSerializer(recipes, many=True)
This means that you will take the data from the RecipeSerializer, not from the Recipe model directly. In your case the content will be the same, only the form differs (the serializer returns the data as OrderedDicts), but when the serializer defines more data than the model, calling serializer.data becomes the more useful option.
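For illustration, assuming Recipe has just a name field (a hypothetical field for this example), the difference in form looks like this:

recipes = Recipe.objects.all().order_by('-id')
serializer = RecipeSerializer(recipes, many=True)

recipes[0].name    # plain model instance attribute access
serializer.data    # a list of dict-like items, one per object, e.g.
# [OrderedDict([('id', 2), ('name', 'Pancakes')]),
#  OrderedDict([('id', 1), ('name', 'Omelette')])]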
I am using a model that consists of many fields. There is one field that is a property, and it returns an instance of a model. Something like the following:
class A(Model):
    @property
    def last_obj(self):
        # Returns an object
        ...
The issue I'm having is that this property can return two different model types: either an object of type one, or an object of type two. This creates complications in the serializer. I have a serializer that consists of nested serializers. The two objects are similar enough that one serializer could be used in place of the other, but then the fields unique to each are not serialized.
class A_Serializer(Serializer):
    class SerializerOne(CustomSerializer):
        # Serializes certain fields in a custom manner
        class Meta:
            model = models.one
            exclude = ('id',)
            base_name = 'one'

    class SerializerTwo(CustomSerializer):
        # Serializes certain fields in a custom manner
        class Meta:
            model = models.two
            exclude = ('id',)
            base_name = 'two'

    last_obj = SerializerOne()  # This works, but is not viable because of what I stated above
So my solution, in order to dynamically call the correct serializer, was to conditionally serialize the property within a serializer method field:
class A_Serializer(Serializer):
    class SerializerOne(CustomSerializer):
        # Serializes certain fields in a custom manner
        class Meta:
            model = models.one
            exclude = ('id',)
            base_name = 'one'

    class SerializerTwo(CustomSerializer):
        # Serializes certain fields in a custom manner
        class Meta:
            model = models.two
            exclude = ('id',)
            base_name = 'two'

    def get_last_obj(self, instance):
        if isinstance(instance.last_obj, models.one):
            return self.SerializerOne(instance.last_obj).data
        else:
            return self.SerializerTwo(instance.last_obj).data

    last_obj = SerializerMethodField()  # Does not work
However, this solution raises the error "'NoneType' object is not iterable". It happens at super(ReturnDict, self).__init__(*args, **kwargs) in rest_framework/utils/serializer_helpers.py, which in turn is triggered by return ReturnDict(ret, serializer=self) in the data property in rest_framework/serializers.py.
I do not understand why calling a nested serializer like obj = Serializer() works, but calling the serializer explicitly like obj = Serializer(instance).data does not work in this situation. Can anyone figure out what I have been doing wrong? Thank you.
I have found out from here that when working with hyperlinked relations (which in my case was the CustomSerializer that SerializerOne and SerializerTwo inherit from), you must pass the request object through the serializer context. The reason obj = Serializer() works while obj = Serializer(instance).data does not is that in the former case DRF adds the request object to the context automatically, whereas in the latter the serializer is instantiated explicitly, so you must pass a context containing the request yourself. To get it working, I used:
return self.SerializerOne(instance.last_obj, context={'request': self.context['request']}).data
inside the serializer method field.
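Put together, the method field from the question ends up looking roughly like this (same serializers and models as above, just forwarding the view's request):

def get_last_obj(self, instance):
    # forward the request so hyperlinked fields can build absolute URLs
    context = {'request': self.context['request']}
    if isinstance(instance.last_obj, models.one):
        return self.SerializerOne(instance.last_obj, context=context).data
    return self.SerializerTwo(instance.last_obj, context=context).data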
I've implemented GraphQL and I'm migrating to Relay. I already have a UUID for every table, and it is named 'id'. I found this GitHub thread that talks about possibly changing the spec, but it feels like a rabbit hole.
Is there a simple way that I can use my own custom IDs with Relay?
If you've already implemented a default Relay endpoint, then you should have some TableNameNode classes with a nested Meta class, and a separate Query class.
class ExampleTableNameNode(DjangoObjectType):
    class Meta:
        model = ExampleTableName
        interfaces = (relay.Node,)

class Query(object):
    example_table_name = relay.Node.Field(ExampleTableNameNode)
    all_example_table_names = DjangoFilterConnectionField(ExampleTableNameNode)

    def resolve_example_table_name(self, info, **kwargs):
        pass

    def resolve_all_example_table_names(self, info, **kwargs):
        pass
The interfaces = (relay.Node,) line is what defines how the IDs are generated and how they are used to fetch data. If we create a relay.Node subclass that redefines these two features, then we can use our custom IDs.
class CustomNode(relay.Node):
    class Meta:
        name = 'Node'

    @staticmethod
    def to_global_id(type, id):
        # returns a non-encoded ID
        return id

    @staticmethod
    def get_node_from_global_id(info, global_id, only_type=None):
        model = getattr(Query, info.field_name).field_type._meta.model
        return model.objects.get(id=global_id)
Here we implemented two functions, to_global_id and get_node_from_global_id. The model = ... line is a bit of magic to go from the GraphQL query field name to the actual model. If that doesn't work, you'll just need to make a dictionary that maps something like example_table_name to the actual ExampleTableName Django model.
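If that lookup doesn't work for your schema, a fallback (the mapping below is illustrative, not from the original answer) is an explicit dictionary from query field name to model, with get_node_from_global_id reading from it:

# maps each GraphQL field name to the Django model it serves
NODE_MODELS = {
    'example_table_name': ExampleTableName,
    'all_example_table_names': ExampleTableName,
}

class CustomNode(relay.Node):
    class Meta:
        name = 'Node'

    @staticmethod
    def to_global_id(type, id):
        return id

    @staticmethod
    def get_node_from_global_id(info, global_id, only_type=None):
        # look the model up by field name instead of via Query
        model = NODE_MODELS[info.field_name]
        return model.objects.get(id=global_id)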
Once you do that, you'll have to replace the two references to relay.Node with CustomNode, like so:
class ExampleTableNameNode(DjangoObjectType):
    class Meta:
        model = ExampleTableName
        interfaces = (CustomNode,)

class Query(object):
    example_table_name = CustomNode.Field(ExampleTableNameNode)
    all_example_table_names = DjangoFilterConnectionField(ExampleTableNameNode)

    def resolve_example_table_name(self, info, **kwargs):
        pass

    def resolve_all_example_table_names(self, info, **kwargs):
        pass
The answer is in the graphene docs. I read them when I was implementing graphene and Relay, but there is so much to learn at once that it's easy to read through the custom node section and not remember later that you need a custom node solution.
Store has a foreign key to SimilarStore. Normally there are about a hundred similar stores in similarstore_set. Is there a way to limit the number of similar stores in similarstore_set when I build an API with Django REST Framework?
serializer.py
class SimilarStoreSerializer(ModelSerializer):
    class Meta:
        model = SimilarStore
        fields = ('domain',)

class StoreSerializer(ModelSerializer):
    similarstore_set = SimilarStoreSerializer(many=True)

    class Meta:
        model = Store
        fields = '__all__'
UPDATE
The following code throws 'Store' object has no attribute 'similarstores_set'. The model actually has similarstore_set, so why is it throwing the error?
class StoreSerializer(ModelSerializer):
    image_set = ImageSerializer(many=True)
    promotion_set = PromotionSerializer(many=True)
    similar_stores = SerializerMethodField()

    def get_similar_stores(self, obj):
        # get 10 similar stores for this store
        stores = obj.similarstores_set.all()[:10]  # <-- this line throws the error
        return SimilarStoreSerializer(stores, many=True).data

    class Meta:
        model = Store
        fields = '__all__'
You can use a SerializerMethodField to perform a custom lookup and limit the number of records:
class StoreSerializer(ModelSerializer):
    similar_stores = serializers.SerializerMethodField()

    def get_similar_stores(self, obj):
        # get 10 similar stores for this store
        stores = obj.similarstore_set.all()[:10]
        return SimilarStoreSerializer(stores, many=True).data

    class Meta:
        model = Store
        fields = '__all__'
You could add a serializers.SerializerMethodField() for similarstore_set and define a method that queries the SimilarStore data and populates similarstore_set. You can pass the number of elements you want in similarstore_set through the serializer's context; see https://www.django-rest-framework.org/api-guide/serializers/#including-extra-context
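A minimal sketch of that context-based approach, assuming the view passes the desired limit under a similar_store_limit key (the key name is an assumption for this example):

class StoreSerializer(ModelSerializer):
    similarstore_set = serializers.SerializerMethodField()

    def get_similarstore_set(self, obj):
        # fall back to 10 if the view did not supply a limit
        limit = self.context.get('similar_store_limit', 10)
        stores = obj.similarstore_set.all()[:limit]
        return SimilarStoreSerializer(stores, many=True).data

    class Meta:
        model = Store
        fields = '__all__'

# in the view:
serializer = StoreSerializer(store, context={'similar_store_limit': 10})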
I have two related models and serializers for both of them. When I am serializing one of these models (the serializer has a depth of 1), the result includes some fields from the related object that shouldn't be visible. How can I specify which serializer to use for the relation? Or is there any way to tell REST Framework to exclude some fields from the related object?
Thank you,
I think one way would be to create an extra serializer for the model where you want to return only a limited set of fields, and then use this serializer inside the serializer of the other model. Something like this:
class MyModelSerializerLimited(serializers.ModelSerializer):
    class Meta:
        model = MyModel
        fields = ('field1', 'field2')  # fields that you want to display
Then, in the other serializer, use MyModelSerializerLimited:
class OtherModelSerializer(serializers.ModelSerializer):
    myfield = MyModelSerializerLimited()

    class Meta:
        model = OtherModel
        fields = ('myfield', ...)
        depth = 1
You could override the restore_fields method on the serializer. In restore_fields you can modify the list of fields (serializer.fields): pop, push, or modify any of them.
E.g., the workspace field is read_only when the action is not 'create':
class MyPostSerializer(ModelSerializer):
    def restore_fields(self, data, files):
        if self.context.get('view').action != 'create':
            self.fields.get('workspace').read_only = True
        return super(MyPostSerializer, self).restore_fields(data, files)

    class Meta:
        model = MyPost
        fields = ('id', 'name', 'workspace')
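Note that restore_fields comes from older DRF versions; on DRF 3.x a comparable approach (my assumption, not part of the original answer) is to override get_fields instead:

class MyPostSerializer(ModelSerializer):
    def get_fields(self):
        fields = super(MyPostSerializer, self).get_fields()
        view = self.context.get('view')
        # make workspace read-only for every action except 'create'
        if view is not None and view.action != 'create':
            fields['workspace'].read_only = True
        return fields

    class Meta:
        model = MyPost
        fields = ('id', 'name', 'workspace')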