The query in test case cbcl-codepoints-to-string-021 is:

  let $y := 65536*65536
  return
    for $x in $y to $y+10
    return codepoints-to-string(65 to $x)

In my implementation this builds large intermediate results and takes a long time to run. I'm hoping that we can test FOCH0001 (a codepoint in $arg is not a permitted XML character) in a less resource-intensive way.
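For what it's worth, a much cheaper query that still exercises FOCH0001 might be something along these lines (just a sketch of the idea, not a proposed replacement test):

  codepoints-to-string((65, 0))

Codepoint 0 is not a permitted XML character, so this should raise FOCH0001 without constructing any large sequence.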
I'll probably just remove this test. Mike has kindly been importing CBCL's old XQTS tests. Some of them are designed to hit specific code paths in our product. This is most likely one of them.
There are a few regex tests in the CBCL collection that similarly blow my implementation out of the water. Personally, I think having such tests is good; it challenges implementors to get better. I was wondering though if we couldn't have a standard error code for "resource limits exceeded" and add that as an acceptable result.
Some additional test cases that I'm finding resource intensive:

  cbcl-subsequence-010
  cbcl-subsequence-011
  cbcl-subsequence-012
  cbcl-subsequence-013
  cbcl-subsequence-014

The query in cbcl-subsequence-010 is:

  count(subsequence(1 to 3000000000, -2147483648, 2147483647))
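For reference, assuming I'm reading the spec correctly, the expected result here is 0. fn:subsequence is defined as equivalent to:

  $sourceSeq[fn:round($startingLoc) le position() and
             position() lt fn:round($startingLoc) + fn:round($length)]

and -2147483648 + 2147483647 = -1, so no position() can satisfy the upper bound. The test appears to probe overflow in that bound computation rather than requiring 3000000000 items to be materialised.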
If I switch off our optimiser, I'd expect quite a few tests to run slowly. It's handy to be able to run the test suite that way, to make sure an optimization isn't causing a test to pass which would otherwise fail. So I'd agree with Mike, but I'd suggest adding an attribute to tests such as these warning of potentially slow queries. That would be generally useful for filtering out tests which can, in some circumstances, be slow. Mike - if you agree, would you like to make a modification to the schema? I'd be happy to add the annotations to the identified tests.
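Purely as an illustration of what I have in mind (the attribute name is invented, not a concrete schema proposal), the catalog entry might carry something like:

  <test-case name="cbcl-subsequence-010" slow="true">
     ...
  </test-case>

so that test drivers can skip or time-box such cases without each implementor maintaining a separate list.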
We currently maintain a secondary catalog (an exceptions file) for this kind of thing - basically metadata that's particular to our own use of the test suite, with extra "private" information about how to run the tests - including tests that we don't (normally) run because they are resource hogs. Because this kind of detail is implementation-specific, I think that a local exceptions file is the right way to manage it.
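I don't know the exact format others use, but conceptually an entry in such an exceptions file amounts to something like (element and attribute names invented):

  <exception test-case="cbcl-codepoints-to-string-021" reason="resource-intensive" run="false"/>

i.e. purely local metadata layered on top of the shared catalog.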
We do a similar thing. Marking as RESOLVED WONTFIX. Please mark as CLOSED if you agree with the resolution, otherwise REOPEN.