[ https://issues.jboss.org/browse/CDI-480?page=com.atlassian.jira.plugin.sy... ]
Antoine Sabot-Durand updated CDI-480:
-------------------------------------
Description:
One of the nicest things in the CDI TCK is the audit files. These XML files (like https://github.com/cdi-spec/cdi-tck/blob/master/impl/src/main/resources/t...) list all the assertions coming from the specification and allow a given test to be bound to a given assertion thanks to the {{@SpecAssertion}} annotation (see https://github.com/cdi-spec/cdi-tck/blob/master/impl/src/main/java/org/jb... for instance).
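For the record, the binding on the test side looks roughly like this (illustrative only: the class name and the section/id values are made up, and a real TCK test also extends the TCK base classes and runs inside an Arquillian deployment):

{code:java}
import org.jboss.test.audit.annotations.SpecAssertion;
import org.jboss.test.audit.annotations.SpecVersion;
import org.testng.annotations.Test;

import static org.testng.Assert.assertNotNull;

// Illustrative only: the section/id values are invented and the body is a placeholder.
@SpecVersion(spec = "cdi", version = "1.1.Final")
public class ProducerMethodLifecycleTest {

    @Test
    // Binds this test to assertion "aa" of spec section "7.3.2",
    // as declared in the corresponding tck-audit.xml section.
    @SpecAssertion(section = "7.3.2", id = "aa")
    public void producerMethodAssertionIsCovered() {
        assertNotNull(lookupSomeBean()); // placeholder check
    }

    private Object lookupSomeBean() {
        // placeholder for a real container lookup (e.g. via BeanManager)
        return new Object();
    }
}
{code}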
This system is great for knowing which assertions are broken, finding from the code which spec rule is tested, and having a nice report on spec coverage in the TCK.
The only weakness here is that these audit files are maintained "by hand", when we could (perhaps) have an automatic or semi-automatic way of generating them, or at least of checking them against the spec.
Since the spec doc is generated with Asciidoctor, this could be done by creating an Asciidoctor macro or extension (not sure of the terminology) and adding a set of metadata to each rule written in the spec (allowing one or more tests per rule). This metadata could be used to help generate the XML files for the TCK, or at least to check that they contain all the rules from the spec.
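To make the idea a bit more concrete, here is a rough sketch of such a check (purely hypothetical: the {{tck-assertion:7.3.2[aa]}} inline marker and the class name are invented, and the {{<section>}}/{{<assertion>}} structure of the audit file is only assumed), comparing the assertion ids declared in the spec sources with those present in a {{tck-audit.xml}} file:

{code:java}
import org.w3c.dom.Document;
import org.w3c.dom.Element;
import org.w3c.dom.NodeList;

import javax.xml.parsers.DocumentBuilderFactory;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.util.HashSet;
import java.util.Set;
import java.util.TreeSet;
import java.util.regex.Pattern;
import java.util.stream.Collectors;
import java.util.stream.Stream;

// Hypothetical checker: assumes spec rules are tagged in the .adoc sources with an
// inline marker such as "tck-assertion:7.3.2[aa]" and that the audit file uses a
// <section id=".."> / <assertion id=".."> structure.
public class AuditCoverageCheck {

    public static void main(String[] args) throws Exception {
        Set<String> specIds = idsFromSpec(Paths.get(args[0]));   // directory of .adoc files
        Set<String> auditIds = idsFromAudit(Paths.get(args[1])); // tck-audit.xml

        Set<String> missing = new TreeSet<>(specIds);
        missing.removeAll(auditIds);
        missing.forEach(id -> System.out.println("Assertion missing from audit file: " + id));
    }

    // Collect "section[id]" markers from every .adoc file in the spec sources.
    static Set<String> idsFromSpec(Path specDir) throws Exception {
        Pattern marker = Pattern.compile("tck-assertion:([\\w.]+)\\[(\\w+)]");
        try (Stream<Path> files = Files.walk(specDir)) {
            return files.filter(p -> p.toString().endsWith(".adoc"))
                    .flatMap(AuditCoverageCheck::lines)
                    .flatMap(line -> marker.matcher(line).results()
                            .map(m -> m.group(1) + "[" + m.group(2) + "]"))
                    .collect(Collectors.toSet());
        }
    }

    static Stream<String> lines(Path p) {
        try { return Files.readAllLines(p).stream(); } catch (Exception e) { return Stream.empty(); }
    }

    // Collect "section[id]" pairs from the audit XML.
    static Set<String> idsFromAudit(Path auditFile) throws Exception {
        Document doc = DocumentBuilderFactory.newInstance()
                .newDocumentBuilder().parse(auditFile.toFile());
        Set<String> ids = new HashSet<>();
        NodeList sections = doc.getElementsByTagName("section");
        for (int i = 0; i < sections.getLength(); i++) {
            Element section = (Element) sections.item(i);
            NodeList assertions = section.getElementsByTagName("assertion");
            for (int j = 0; j < assertions.getLength(); j++) {
                Element assertion = (Element) assertions.item(j);
                ids.add(section.getAttribute("id") + "[" + assertion.getAttribute("id") + "]");
            }
        }
        return ids;
    }
}
{code}

The same walk over the spec sources could just as well emit a skeleton audit file instead of only reporting gaps.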
The immediate benefits I see here would be:
# make spec contributors more involved with the TCK
# reduce the risk of forgetting rules in the TCK
# produce a version of the spec with links to the TCK tests
# reduce the fear of refactoring the spec if needed
In short, if this "unification" could be done, it would probably make us more efficient and improve both spec and TCK quality.
Introduce real link between spec doc and TCK
--------------------------------------------
Key: CDI-480
URL: https://issues.jboss.org/browse/CDI-480
Project: CDI Specification Issues
Issue Type: Feature Request
Affects Versions: 1.1.Final
Reporter: Antoine Sabot-Durand