I'd love it if testify had the ability to report on skipped tests, like unittest: http://docs.python.org/library/unittest.html#skipping-tests-and-expected-failures
The idea behind skipped tests is to tell the user, yeah, those other tests passed, but I couldn't run this test at all because your system doesn't support it.
It's different from expected failures (which indicate that tests are known to be broken and not to worry about it) or disabling tests (which causes the tests to be silently not run).
I think we could basically make a PEP8 version of the unittest decorators (skip, skip_if, and skip_unless). I'm thinking the reason argument should be optional.
Also, it'd be nice if instead of using a decorator, the test could raise a SkipTest exception (or whatever you want to call it).