Inline.
On 03/18/2011 07:10 AM, Jaikiran Pai wrote:
On Friday 18 March 2011 12:51 PM, Andrew Lee Rubinger wrote:
>
> I suspect this breaks down into two categories, which may be modelled as
> separate modules under the existing "testsuite" aggregator parent:
>
> * Specification
> * AS-specific APIs
On a slightly different note about categorization: we also have to come
up with stress tests. These could fall under either the "Specification"
tests or the "AS-specific APIs" tests. Perhaps test "profiles" (which
need _not_ be Maven profiles) within those 2 modules would be better
than a separate module?
On the stress tests: I've kept that out of scope for this discussion,
though sure, those are needed.
And I personally like the stress tests in a separate module, set to a
non-executing-by-default profile (so we don't slow down the default
build). For instance in ShrinkWrap before a release I do "mvn test
-Pstress".
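A minimal sketch of what that could look like in the stress module's
pom.xml (the "stress" profile id and the skipTests toggle are just
illustrative assumptions, not an agreed convention):

```xml
<!-- Hypothetical stress-tests pom.xml fragment: tests are skipped by
     default and only run when the "stress" profile is activated,
     e.g. via "mvn test -Pstress". -->
<properties>
  <skipTests>true</skipTests>
</properties>
<profiles>
  <profile>
    <id>stress</id>
    <properties>
      <skipTests>false</skipTests>
    </properties>
  </profile>
</profiles>
```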
>
> [End Goal]
>
> 1) No compile-time dependencies in the module except for what's
> absolutely necessary.
>
> For the spec suite, this means: JDK and EE Spec APIs only in the
> compilation classpath.
We might also have to include JBoss-specific APIs which are exposed to
users (for example, org.jboss.ejb3.annotation, which currently isn't
there but might be introduced soon in AS7).
> Testable asset sources and resources (ie. EJBs,
> Servlets, etc) would live under src/main/* to enforce that. Only the
> tests themselves would be located under "src/test/*".
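One way to keep the compile classpath that clean would be to mark the
spec APIs as provided-scope in the module's pom (the artifact
coordinates and version below are illustrative, not a decided choice):

```xml
<!-- Hypothetical spec-tests pom.xml fragment: only JDK + EE spec APIs
     on the compilation classpath; "provided" keeps them out of any
     packaged/runtime artifacts. -->
<dependencies>
  <dependency>
    <groupId>org.jboss.spec</groupId>
    <artifactId>jboss-javaee-6.0</artifactId>
    <version>1.0.0.Final</version>
    <type>pom</type>
    <scope>provided</scope>
  </dependency>
</dependencies>
```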
Sounds fine. Although, do we want to end up with all the tech-specific
tests (EJB3, web, etc.) in one source path? i.e.
spec-tests/src/main/org/jboss/spec/<ejb3, web, jsf>....? Or should we
plan to have a different Maven module under the "Spec" tests for each
individual tech?
I'm not opposed to drilling down into finer-grained modules for each
spec type to keep things organized.
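If we go that route, the "Spec" aggregator could look something like
this (module names are purely a sketch):

```xml
<!-- Hypothetical spec aggregator pom.xml: one child module per spec
     area, each with its own narrow compile classpath. -->
<modules>
  <module>spec-ejb3</module>
  <module>spec-web</module>
  <module>spec-jsf</module>
</modules>
```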
>
> 2) Every single new test created is to have an associated JIRA.
I'm not quite sure that this will work out. I'm not against it, but it
might not be practical.
>
> By linking to JIRA we get history of intent, which acts as a
> nice record even in the case that the test isn't so well-documented.
> I'd argue that tests are a bigger asset than our code, and we should be
> thinking about these in terms of long-term maintenance to outlive any
> specific impl.
I agree about the well-documented tests, but JIRA linking shouldn't
always be an absolute necessity.
Given that the AS tests and their docs have historically degraded over
time, I like JIRA as a rule. Also it's easy to be lazy and
insufficiently document, but harder to ignore a policy to associate w/ a
JIRA (also it's easier for upstream committers to enforce this).
>
> 3) Documentation
>
> Alongside the JIRA reference, a quick note about what we're looking to
> accomplish is something I find very helpful. I don't personally buy the
> argument that code is self-documenting if written well. It gets
> refactored and stale over time.
+1000. A simple brief explanation on what the test is supposed to test
is really useful.
I hope folks take this to heart.
>
> 4) Run-mode profiles
>
> Arquillian provides a wonderful abstraction such that we can get
> coverage for AS in both remote managed *and* embedded modes without
> changing the test itself. To certify that everything is working as
> advertised no matter the runtime, we should be able to run the same
> suite in standalone, domain, and embedded modes (generally speaking).
Are these container-specific modes that we are talking about? Or are
these more like test profiles (a.k.a. groups) wherein we could perhaps
add a test under a "stress-test" profile?
Well, the "stress" aspect is orthogonal to this. I'm thinking more
along the lines that any given test might need to be executed in N
runtimes to ensure consistent behaviour: Embedded, Domain, Standalone,
Full clustered, etc.
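As a rough sketch of how that might be wired up without touching the
tests themselves, Arquillian's arquillian.xml supports multiple
container entries selected via the arquillian.launch system property
(the qualifier names and the jbossHome property here are assumptions,
not settled configuration):

```xml
<!-- Hypothetical arquillian.xml: one container entry per run mode;
     the suite picks one with -Darquillian.launch=<qualifier> while
     the test classes stay unchanged. -->
<arquillian xmlns="http://jboss.org/schema/arquillian">
  <container qualifier="as7-managed" default="true">
    <configuration>
      <property name="jbossHome">${jboss.home}</property>
    </configuration>
  </container>
  <container qualifier="as7-embedded"/>
  <container qualifier="as7-domain"/>
</arquillian>
```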
S,
ALR
-Jaikiran
_______________________________________________
jboss-as7-dev mailing list
jboss-as7-dev(a)lists.jboss.org
https://lists.jboss.org/mailman/listinfo/jboss-as7-dev