Can you extend Playwright's Page class (Python sync API)?

I'm testing out Playwright (Python) and there are a bunch of additional methods that I want to attach to Page. I tried inheriting from the class but keep getting __init__() missing 1 required positional argument: 'impl_obj'. What is this impl_obj, and what is the best way to add methods to Page?
from playwright.sync_api import Page

class CustomPage(Page):
    def __init__(self) -> None:
        super().__init__()

    def test(self):
        print('test')

page = CustomPage()

TypeError: __init__() missing 1 required positional argument: 'impl_obj'
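For context, Playwright constructs Page internally from an implementation object (that is the impl_obj argument), so subclassing it directly is awkward. A common workaround is composition: wrap the real Page and delegate to it. A minimal sketch (PageWrapper is a hypothetical name, not part of Playwright's API):

```python
class PageWrapper:
    """Wraps a Playwright Page and adds custom helper methods.

    Playwright builds the real Page for us (e.g. browser.new_page()),
    so we never have to supply impl_obj ourselves.
    """

    def __init__(self, page):
        self._page = page

    def __getattr__(self, name):
        # Anything not defined on the wrapper falls through to the Page
        return getattr(self._page, name)

    def test(self):
        print('test')
```

With this, page = PageWrapper(browser.new_page()) lets both page.test() and the normal Page methods like page.goto(...) work on the same object.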

Related

Python doubly linked list: I am getting the error 'insert_at_begining() missing 1 required positional argument: 'data''. Here's my code

Please help resolve this issue; I have no idea why I am getting the error.
class Node:
    def __init__(self, prev=None, data=None, next=None):
        self.prev = prev
        self.data = data
        self.next = next

class dll:
    def __init__(self):
        self.head = None

    def insert_at_begining(self, data):
        node = Node(None, data, self.head)
        self.head = node

    def insert_at_end(self, data):
        if self.head is None:  # guard against an empty list
            self.head = Node(None, data, None)
            return
        itr = self.head
        while itr.next:
            itr = itr.next
        node = Node(itr, data, None)
        itr.next = node

    def printlist(self):
        llstr = ''
        itr = self.head
        while itr:
            llstr += str(itr.data) + '-->'
            itr = itr.next
        print(llstr)
Please help me understand why my linked-list code is not working; I have no idea.
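For what it's worth, that particular error usually means the method was called on the class itself rather than on an instance, so the first argument gets consumed as self. A standalone demonstration of the mechanism (the Box class here is hypothetical, just to illustrate):

```python
class Box:
    def put(self, data):
        self.data = data

# Called on the class: 'x' binds to self, so 'data' appears to be missing
try:
    Box.put('x')
except TypeError as e:
    print(e)  # put() missing 1 required positional argument: 'data'

# Called on an instance: Python supplies self automatically
b = Box()
b.put('x')
```

So make sure you first create an instance (lst = dll()) and then call lst.insert_at_begining(...) on it.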

How can I DRY up SemanticLogger::Loggable and SemanticLogger::Loggable::ClassMethods by extending classes?

Summary
I'm trying to use SemanticLogger 4.10.0 as part of a dynamic tracing solution in a Ruby 3.1.1 app. However, I seem to be misunderstanding how or where to access the logger instance that SemanticLogger::Loggable is supposed to create on the class: I get NameError exceptions (documented below) when calling #logger_measure_method from inside the extended class, because I can't seem to implement the class extension properly.
Code That Demonstrates the Problem
Module with CustomLogger
require 'json'
require 'semantic_logger'

module SomeGem
  module CustomLogger
    include SemanticLogger

    class MyFormat < SemanticLogger::Formatters::Default
      def call(log, logger, machine_name: nil, json: nil)
        self.log = log
        self.logger = logger
        message = "[#{machine_name}] #{message}" if machine_name
        payload = JSON.parse(json)
        [time, level, process_info, tags, named_tags, duration, name, message, payload, exception].compact!
      end
    end

    def self.extended(klass)
      prepend SemanticLogger, SemanticLogger::Loggable, SemanticLogger::Loggable::ClassMethods
      SemanticLogger.default_level = :debug
      SemanticLogger.add_appender(file_name: "/dev/stderr", formatter: SomeGem::CustomLogger::MyFormat.new,
                                  level: :warn, filter: proc { %i[warn fatal unknown].include? _1.level })
      SemanticLogger.add_appender(file_name: "/dev/stdout", formatter: SomeGem::CustomLogger::MyFormat.new,
                                  level: :trace, filter: proc { %i[trace debug info error].include? _1.level })
      define_method(:auto_measure_method_calls) do
        public_instance_methods(false).
          reject { _1.to_s.match?(/^logger/) }.
          each { logger_measure_method _1, level: :trace }
      end
    end
  end
end
Module Extended by CustomLogger
module SomeGem
  class Foo
    extend CustomLogger

    def alpha = 1
    def beta = 2
    def charlie = 3

    auto_measure_method_calls if ENV['LOG_LEVEL'] == 'trace'
  end
end
Exercising the Code
ENV['LOG_LEVEL'] = 'trace'
f = SomeGem::Foo.new
pp f.alpha, f.beta, f.charlie
Problems and Issues with Code
My issues are:
The underlying (and possibly X/Y) issue is that I want to DRY up including SemanticLogger::Loggable and SemanticLogger::Loggable::ClassMethods in every class I'm tracing, and to auto-measure the extended classes' methods in development.
When I try to extend a class with a module that is intended to pull in SemanticLogger::Loggable, I don't seem to always have logger available as an accessor throughout the class.
I'm also concerned that including the module in multiple classes would result in duplicate appenders being added to the #appenders array, wherever that's actually stored.
Most importantly though, when I try to automagically add logging and method measurement through extending a class, I get errors like the following:
~/.gem/ruby/3.1.1/gems/semantic_logger-4.10.0/lib/semantic_logger/loggable.rb:96:in `alpha': undefined local variable or method `logger' for #<SomeGem::Foo:0x00000001138f6a38> (NameError)
    from foo.rb:51:in `<main>'
Am I missing something obvious about how to extend the classes with SemanticLogger? If there's a better way to accomplish what I'm trying to do, that's great! If I'm doing something wrong, understanding that would be great, too. If it's a bug, or I'm using the feature wrong, that's useful information as well.

Decent way to add a validator for DRF Field class child

I have a custom DRF field element that looks like this:
class TimestampField(serializers.DateTimeField):
    def __init__(self, allow_future=True, *args, **kwargs):
        self.allow_future = allow_future
        super().__init__(*args, **kwargs)

    def some_sort_of_validator(...):  # Don't know how to do that
        if not self.allow_future:
            if value > timezone.now():
                raise ValidationError('...')
Basically, I want to do some custom validation for this field. For example, I want to ensure that future dates are prohibited. It looks like I need to add something the docs refer to as a validator, and I wonder how to do that correctly, so as not to clobber the native validators. I found nothing on this in either the DRF docs or SO.
There is an article in the docs about writing validators and a section about writing custom validators.
In your case, something like this should work:

from django.utils import timezone
from rest_framework.serializers import ValidationError

class TimestampValidator:
    def __init__(self, allow_future):
        self.allow_future = allow_future

    def __call__(self, value):
        if not self.allow_future:
            if value > timezone.now():
                raise ValidationError('...')
and to use it in your actual serializer:

class MySerializer(serializers.Serializer):
    timestamp = serializers.DateTimeField(validators=[TimestampValidator(allow_future=True)])
    # ... the rest of your serializer goes here
You can also check the code for the built-in validators to see how they are done.
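An alternative, if you want the flag to live on the field itself, is to append the custom validator inside the field's __init__, which keeps the built-in validators intact. A framework-free sketch of that pattern (DummyField, no_future, and this ValidationError are hypothetical stand-ins; in DRF the list is the field's real validators attribute):

```python
from datetime import datetime, timedelta

class ValidationError(Exception):
    """Stand-in for DRF's ValidationError."""

class DummyField:
    """Framework-free stand-in for a DRF field: validators live in a
    list, and custom ones are appended so the native ones survive."""

    def __init__(self, allow_future=True):
        self.validators = []  # DRF fields carry a list like this
        if not allow_future:
            self.validators.append(self.no_future)

    def no_future(self, value):
        if value > datetime.now():
            raise ValidationError('Future dates are prohibited.')

    def run_validation(self, value):
        # Run every validator, built-in and custom alike
        for validate in self.validators:
            validate(value)
        return value
```

A real TimestampField would call super().__init__(**kwargs) first (which populates the inherited validators) and only then append its own.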

Django Forms - is it advisable to change user submitted data in the clean method

I have the following code in my Django project:
def my_view(request):
    form = MyForm(request.POST)
    if form.is_valid():
        instance = form.save(commit=False)
        instance.some_field = 'foo'
        instance.save()
        # ...
The question is: is it advisable to rewrite this in the following way?
forms.py
class MyForm(forms.ModelForm):
    def __init__(self, *args, **kwargs):
        # ...

    def clean(self):
        self.cleaned_data['some_field'] = 'foo'
I wonder whether the clean method should be used exclusively for data validation, or whether I could also perform some business logic here, thus making my views more concise and devoid of business-logic details.
One possible advantage of making the assignment in the clean method is that sometimes data validation can depend on some_field.
Have you considered putting it in .save?
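In practice that would mean overriding ModelForm.save. A minimal framework-free sketch of the idea (BaseForm and FakeInstance are hypothetical stand-ins for Django's machinery, mimicking how save(commit=False) returns an unsaved instance):

```python
class FakeInstance:
    """Stand-in for a model instance."""
    some_field = None
    saved = False

    def save(self):
        self.saved = True

class BaseForm:
    """Stand-in for forms.ModelForm."""
    def save(self, commit=True):
        instance = FakeInstance()
        if commit:
            instance.save()
        return instance

class MyForm(BaseForm):
    def save(self, commit=True):
        # The business rule lives here, so every caller gets it for free
        instance = super().save(commit=False)
        instance.some_field = 'foo'
        if commit:
            instance.save()
        return instance
```

The view then shrinks to a plain form.save() call, and clean stays reserved for validation.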

How can a ChoiceField.choices callable know what choices to return?

In Django 1.8, the ChoiceField's choices argument can accept a callable:
def get_choices():
    return [(1, "one"), (2, "two")]

class MyForm(forms.Form):
    my_choice_field = forms.ChoiceField(choices=get_choices)
In the above example, get_choices() always returns the same choices. However, being able to assign a callable to choices does not make much sense unless that callable knows something like, say, an object id, each time it is called. How can I pass such a thing to it?
You can't do it in the form declaration, because CallableChoiceIterator calls the function without arguments (you can see this in the Django source).
Doing it in the form's __init__ method is easier than creating your own ChoiceField, I guess. Here is what I suggest:
class MyForm(forms.Form):
    my_choice_field = forms.ChoiceField(choices=())

    def __init__(self, *args, **kwargs):
        # Let's pass the object id as a form kwarg
        self.object_id = kwargs.pop('object_id')
        # Django metaclass magic to construct fields
        super().__init__(*args, **kwargs)
        # Now you can get your choices based on that object id
        self.fields['my_choice_field'].choices = your_get_choices_function(self.object_id)
This assumes you have some class-based view with a method like this:
class MyFormView(FormView):
    # ...
    def get_form_kwargs(self):
        kwargs = super().get_form_kwargs()
        kwargs['object_id'] = 'YOUR_OBJECT_ID_HERE'
        return kwargs
    # ...
P.S.: The bare super() call assumes you are using Python 3.
The reason it's possible to set a callable like that is to avoid situations where you're using models before they're ready.
forms.py
class Foo(ModelForm):
    choice_field = ChoiceField(choices=[
        user.username.lower() for user in User.objects.all()
    ])
If forms.py were imported before the models were ready (which it probably is, because views.py generally imports it, urls.py generally imports views.py, and urls.py is imported by the startup machinery), this would raise an exception for doing ORM work before all the apps are loaded.
The correct way is to use a callable like so:
def lower_case_usernames():
    return [user.username.lower() for user in User.objects.all()]

class Foo(ModelForm):
    choice_field = ChoiceField(choices=lower_case_usernames)
This also has the benefit of being able to change without restarting the server.
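That re-evaluation behaviour is easy to demonstrate without Django. FakeChoiceField below is a hypothetical stand-in that mimics how a callable passed as choices is invoked each time the choices are read:

```python
choices_source = [(1, "one")]

def get_choices():
    # Re-evaluated on every access, so edits to choices_source show up
    return list(choices_source)

class FakeChoiceField:
    """Stand-in mimicking lazy handling of callable choices."""
    def __init__(self, choices):
        self._choices = choices

    @property
    def choices(self):
        if callable(self._choices):
            return self._choices()
        return self._choices

field = FakeChoiceField(choices=get_choices)
print(field.choices)              # [(1, 'one')]
choices_source.append((2, "two"))
print(field.choices)              # now includes (2, 'two'), no restart needed
```

Had a plain list been passed instead of the function, the second read would still show the stale snapshot.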
