
I am designing a validation API where callbacks are used to check values. There are two variants of callback signatures:

def check(self, value):
    pass

def check(self, value, domain_object):          
    pass

Example of calling the callback implementations:

for constraint in constraints:
    constraint.check(value) 
    # or constraint.check(value, domain_object) depending on the implementation

For now I reflectively count the number of arguments before the method is called and, depending on the result, pass one or two parameters to it. But is this good style?
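The reflective dispatch described above could look roughly like this (a sketch using `inspect.signature`; `call_check` and the constraint classes are made-up names):

```python
import inspect

def call_check(constraint, value, domain_object):
    # inspect.signature on a bound method already excludes "self",
    # so the count is just the number of remaining parameters.
    n_params = len(inspect.signature(constraint.check).parameters)
    if n_params >= 2:
        return constraint.check(value, domain_object)
    return constraint.check(value)

class LengthConstraint:
    def check(self, value):                  # one-argument variant
        return len(value) > 0

class DomainConstraint:
    def check(self, value, domain_object):   # two-argument variant
        return value in domain_object
```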

Would it be better to

  • always use the signature with three arguments: check(self, value, domain_object) or
  • use a different name like check_with_domain_object for the second case?

I think that in terms of OOP the cleanest way would be to always use the three-argument variant. What do you think?

  • Can't you assign some default value to domain_object? Commented May 20, 2011 at 8:29
  • ... default values, *args/**kws, unique method names ... depends. Keep it simple. That's the "Python way", right? ;-) Commented May 20, 2011 at 8:36
  • Maybe you can add an example of how the functions are used, in order to clarify the question. Commented May 20, 2011 at 8:42
  • @Evpok My intention was to avoid the domain_object param at all on the implementation side, if it is not needed. Commented May 20, 2011 at 10:47
  • @Space_C0wb0y: I've added an example for calling the callbacks. Commented May 20, 2011 at 14:42

3 Answers


The most idiomatic way would be to first try with two arguments, and if it fails, try with one:

try:
    callback(value_param, domain_object_param)
except TypeError:
    callback(value_param)
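A minimal demonstration of this fallback (the callback names are invented); note that the except clause also catches TypeErrors raised *inside* the callback, which the comments below discuss:

```python
def check_value(value):                      # one-argument callback
    return value is not None

def check_in_domain(value, domain_object):   # two-argument callback
    return value in domain_object

def invoke(callback, value, domain_object):
    try:
        return callback(value, domain_object)  # try the two-argument form
    except TypeError:
        return callback(value)                 # fall back to one argument
```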

6 Comments

Uh. I completely disagree. Ick, ick and more ick. (I think I was reading the question differently: both methods/functionality will always be present, it's not a check for a given method, but rather a question of simulating method overloading.)
@pst: The functions are callbacks. This means I assume that some other function gets one of them as a parameter, but doesn't know which one. So it requires some way to find out whether to use the domain_object parameter or not. The above does just that with minimal overhead.
@Space_C0wb0y: This swallows any TypeError raised inside callback(), making debugging a lot harder.
@lunaryorn: That is correct. However, TypeError usually indicates a programmer's fault, and that should have been covered by independent testing of the callback.
@Space_C0wb0y: Agreed, but you can't rely on test coverage, especially not if the callbacks might come from 3rd party modules. Explicitly checking the signature isn't much harder than try-except, but can make life a lot easier if there is a bug in a callback that wasn't caught by tests.

I like @Space_C0wb0y's answer, it is similar to code Raymond Hettinger sent to me to address a similar situation in pyparsing (see below). For your simple case, try using this normalizer class to wrap the given callbacks:

class _ArityNormalizer(object):
    """Wrap a one- or two-argument callback so that it can always be
    called with two arguments."""
    def __init__(self, fn):
        self.baseFn = fn
        self.wrapper = None   # resolved lazily on the first call

    def __call__(self, value, domain_object):
        if self.wrapper is None:
            # First call: try the two-argument form, fall back to one.
            try:
                self.wrapper = self.baseFn
                return self.baseFn(value, domain_object)
            except TypeError:
                self.wrapper = lambda v, d: self.baseFn(v)
                return self.baseFn(value)
        else:
            return self.wrapper(value, domain_object)

Your code can now wrap the callbacks in an _ArityNormalizer, and at callback time, always call with 2 arguments. _ArityNormalizer will do the trial-and-error "call with 2 args and if that fails call with 1 arg" logic only once, and from then on will go directly to the correct form.
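Usage could look like this (the class is repeated so the sketch runs on its own; the callback names are invented):

```python
class _ArityNormalizer(object):
    """Wrap a one- or two-argument callback so that it can always be
    called with two arguments."""
    def __init__(self, fn):
        self.baseFn = fn
        self.wrapper = None   # resolved lazily on the first call

    def __call__(self, value, domain_object):
        if self.wrapper is None:
            try:
                self.wrapper = self.baseFn
                return self.baseFn(value, domain_object)
            except TypeError:
                self.wrapper = lambda v, d: self.baseFn(v)
                return self.baseFn(value)
        return self.wrapper(value, domain_object)

def check_positive(value):                   # one-argument callback
    return value > 0

def check_in_domain(value, domain_object):   # two-argument callback
    return value in domain_object

callbacks = [_ArityNormalizer(check_positive),
             _ArityNormalizer(check_in_domain)]
# Every call site can now pass two arguments unconditionally.
results = [cb(5, {5, 6}) for cb in callbacks]
```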

In pyparsing, I wanted to support callbacks that may be defined to take 0, 1, 2, or 3 arguments, and wrote code that would wrap the called function with one of several decorators depending on what the callback function's signature was. This way, at run/callback time I'd just always call with 3 arguments, and the decorator took care of making the actual call with the correct number of args.

My code did a lot of fragile/non-portable/version-sensitive signature introspection to do this (sounds like what the OP is currently doing), until Raymond Hettinger sent me a nice arity-trimming method that does essentially what @Space_C0wb0y's answer proposes. RH's code used some very neat decorator wrapping with a nonlocal variable to record the arity of the successful call, so that you only have to go through the trial-and-error once, instead of every time you call the callback. You can see his code in the pyparsing SVN repository on SourceForge, in the function _trim_arity - note that his code has Py2/Py3 variants, due to the use of the "nonlocal" keyword.

The _ArityNormalizer code above was inspired by RH's code, before I fully understood his code's magic.


"Space_C0wb0y"'s idea to use try ... except TypError looks nice, but I don't like the fact that this could swallow other exceptions. "Paul McGuire"'s suggestion with the _ArityNormalizer is essentially the same with a nice interface.

In the end I decided to keep things as simple and object-oriented as possible and always go with two parameters, even though in several cases the second parameter will be unused:

Implementation side:

def check(self, value, domain_object):          
    pass

Calling side:

constraint.check(value, domain_object)
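Put together, the chosen design might look like this (a sketch; the constraint classes are invented names). Every constraint accepts both parameters, even if it ignores domain_object, so the call site needs no reflection or try/except:

```python
class NotNoneConstraint:
    def check(self, value, domain_object):   # domain_object deliberately unused
        return value is not None

class InDomainConstraint:
    def check(self, value, domain_object):
        return value in domain_object

constraints = [NotNoneConstraint(), InDomainConstraint()]
# One uniform call site for all constraints.
ok = all(c.check(3, {1, 2, 3}) for c in constraints)
```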
