Experience on the Daffodil project is that allowing the argument of fn:count to be a path to a non-array, non-optional element, where fn:count always returns 1, just hides errors that are very hard to find, and this situation comes up often as a schema is written. Usually the fn:count expression is initially correct, with an array/optional element as its argument. But element nesting evolves and the paths need updating; a path can end up referring not to the array/optional element, because that name now belongs to a scalar element enclosing the array. The fn:count is then always 1, and the schema is incorrect because the expression is not doing what was intended, yet no error is detected. This is quite hard to isolate and fix.
A concrete example of this experience: you start with a schema like:
<element name="record" maxOccurs="unbounded">
  <complexType>
    <sequence>
      .... elements of the record
But then you need the valueLength of the whole array of records, for example to store the length for unparsing, so you revise this to:
<element name="record">
  <complexType>
    <sequence>
      <element name="item" maxOccurs="unbounded">
        <complexType>
          <sequence>
            .... elements of each record 'item'.
And now, paths you had like fn:count(foo/bar/record) no longer refer to an array; they refer to a scalar, so they always return 1. This is decidedly unhelpful in a large schema.
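To make the pitfall concrete, here is a minimal sketch (not Daffodil, and not DFDL expression evaluation) using Python's standard-library ElementTree. The 'foo'/'bar' nesting and the child element 'a' are hypothetical; the point is only that the same path expression silently changes meaning after the refactor:

```python
# Sketch: the same path counts 3 elements before the refactor,
# but only 1 afterward, mimicking fn:count(foo/bar/record).
import xml.etree.ElementTree as ET

# Instance document before the refactor: 'record' is an array.
before = ET.fromstring(
    "<foo><bar>"
    "<record><a>1</a></record>"
    "<record><a>2</a></record>"
    "<record><a>3</a></record>"
    "</bar></foo>"
)

# Instance document after the refactor: 'record' is a scalar
# wrapper and the array moved down to 'item'.
after = ET.fromstring(
    "<foo><bar><record>"
    "<item><a>1</a></item>"
    "<item><a>2</a></item>"
    "<item><a>3</a></item>"
    "</record></bar></foo>"
)

# findall paths are relative to the element they are called on,
# so from the 'foo' root we search 'bar/record'.
count_before = len(before.findall("bar/record"))  # 3: record is an array
count_after = len(after.findall("bar/record"))    # 1: record is now a scalar

print(count_before, count_after)  # prints: 3 1
```

The path still evaluates without complaint after the refactor; it just quietly counts the one wrapper element instead of the data you meant to count.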
It is far better if fn:count(foo/bar/record) becomes a Schema Definition Error (SDE) because record is now scalar.
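The kind of static check being proposed can be sketched as follows. This is a hypothetical illustration, not Daffodil's implementation: it walks element declarations by name, deliberately ignoring complexType/sequence wrappers for brevity, and flags any fn:count argument whose target is statically a required scalar:

```python
# Sketch of a compile-time check: reject fn:count over a path whose
# target element declaration is a required scalar, since the count
# could only ever be 1. All names here are hypothetical.
import xml.etree.ElementTree as ET

def is_array_or_optional(decl):
    """True if the declaration can occur 0 times or more than once."""
    max_occurs = decl.get("maxOccurs", "1")
    min_occurs = decl.get("minOccurs", "1")
    if max_occurs == "unbounded":
        return True
    return int(max_occurs) > 1 or int(min_occurs) == 0

def check_count_arg(schema_root, path):
    """Raise if the path's target element is a required scalar."""
    current = schema_root
    decl = None
    for step in path.split("/"):
        decl = current.find(f".//element[@name='{step}']")
        if decl is None:
            raise ValueError(f"no element declaration for step '{step}'")
        current = decl
    if not is_array_or_optional(decl):
        raise ValueError(
            f"SDE: fn:count argument '{path}' refers to a required "
            "scalar; the count would always be 1"
        )

# Simplified stand-in for the revised schema: 'record' is now scalar
# and the array is its child 'item'.
schema = ET.fromstring(
    "<schema><element name='foo'><element name='bar'>"
    "<element name='record'>"
    "<element name='item' maxOccurs='unbounded'/>"
    "</element></element></element></schema>"
)

try:
    check_count_arg(schema, "foo/bar/record")
except ValueError as e:
    print(e)  # the check flags the now-scalar 'record' at compile time
```

The error surfaces when the schema is compiled, at the exact path that went stale, instead of as a mysterious always-1 count at runtime.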
So the clarification I'm seeking is whether Section 35 was simply missed when the updates about this node-sequence behavior were made, or whether it is reasonable to implement the restrictions in Section 35.
I am biased. I want the restrictions in Section 35, but this was muddy enough that I thought we should get a clarification first.
Daffodil already doesn't implement any query-style expressions, so the fn:count(b) example above would be an SDE in Daffodil.
Mike Beckerle