Thank you for your feedback,
we will look into these points very quickly (reporting and tear-down scripts...) and share our results.
Smoke tests are on their way and shouldn't even take the 15 minutes you mention.
KR,
Arthur PELTIER
eXo Southeast Asia - QA Team
Vietnam : +84 1 253 988 745
France : +33 6 26 36 64 84
arthur.peltier@gmail.com
Skype ID : arthur.peltier
On Wed, Sep 8, 2010 at 7:54 PM, Arthur Peltier <arthur.peltier@gmail.com> wrote:
Some remarks in-line.

2010/9/7 Marko Strukelj <marko.strukelj@gmail.com>

In order to run the selenium tests with JBoss I have to make the following modification to the current pom.xml:

Index: testsuite/selenium-snifftests/pom.xml
===================================================================
--- testsuite/selenium-snifftests/pom.xml (revision 4045)
+++ testsuite/selenium-snifftests/pom.xml (working copy)
@@ -13,12 +13,12 @@
 <properties>
 <org.selenium.server.version>1.0.1</org.selenium.server.version>
- <selenium.port>4444</selenium.port>
+ <selenium.port>8444</selenium.port>
 <selenium.browser>firefox</selenium.browser>
 <selenium.timeout>10000</selenium.timeout>
 <selenium.speed>300</selenium.speed>
 <selenium.host>localhost</selenium.host>
- <org.selenium.maven-plugin.version>1.0</org.selenium.maven-plugin.version>
+ <org.selenium.maven-plugin.version>1.0.1</org.selenium.maven-plugin.version>
 </properties>
 <dependencies>

Could you explain why you need this maintenance version exactly?

Without this version upgrade the Firefox windows deadlock when opening, so they fail to open, which causes all the tests to fail. I see this on Windows; I didn't try on unixes.

I've been running these a lot over the last two days. They aren't very useful at the moment for doing pre-commit checks to catch any introduction of systemic issues, primarily for three reasons:
- One, it takes 1 hour 40 minutes on my laptop to run the suite. If I want a 'before my change' and an 'after my change' run, I have to run it twice so I can see a diff in test failures. The name 'snifftests' gives the impression that this is a quick test suite to be run before doing a commit :)
- Two, at the moment many tests are failing - my last run: Tests run: 248, Failures: 74, Errors: 21, Skipped: 0 - which makes it difficult to determine if some might be due to my code changes. The same tests sometimes fail with a 'Failure' and sometimes with an 'Error', so the end report always looks different, making it a challenge to find the effective differences even with a diff tool.
- Three, some of the tests seem to fail randomly - more likely they are sensitive to initial conditions, which can change if some other test fails to do a proper cleanup. It can also happen by killing a test in the middle of execution. The situation is exacerbated by the fact that the tests are run in random order...

Point number one could be addressed by making a set of simple-to-maintain tests that perform a few operations touching many aspects of the portal. These would go into the 'snifftests' module. The exhaustive mass of other detailed tests - which are undoubtedly also a burden to maintain - would go into 'alltests'.

In fact there are differences between functional tests and sniff tests:
- Sniff tests have "SNF" in the name of the test
- These tests can be run specifically by using "-Dtest=Test*SNF*" with your Maven command
- We will shortly be implementing what we call "Smoke tests", which will exercise most GateIn functionality in <20 steps (these will be for testing installations on many different configurations)
- The name of the folder and how to run the different tests will also be changed in SELEGEN 1.1 (expected in a week or so... see more improvements in the next points)
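As a quick illustration of the naming convention above, the same glob that "-Dtest=Test*SNF*" uses for Surefire test selection can be exercised in a shell one-liner (a sketch; Test_FCT_WSR_01_AddGadget is a hypothetical non-sniff name, the others follow the SNF convention mentioned in this thread):

```shell
# Only names matching the pattern Test*SNF* are selected,
# mirroring what: mvn test -Dtest=Test*SNF*  would pick up.
for t in Test_SNF_PRL_03_ChangeDisplayLanguage \
         Test_FCT_WSR_01_AddGadget \
         Test_SNF_NAV_01_CreateNode; do
  case "$t" in
    Test*SNF*) echo "selected: $t" ;;
    *)         echo "skipped:  $t" ;;
  esac
done
```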
I gave this a try. SNF tests take 40 minutes to run for me. In my opinion that's still a bit too long for smoke tests; I feel 15 minutes is the maximum. Running it twice then makes it 30 minutes.

Point number two could be addressed by creating some kind of final report that would throw the 'failures' and 'errors' into a single set and sort it alphabetically.

- This will become a reality with SELEGEN 1.1 (we will use the Doxia Maven framework to create HTML reports)...
- see attached "surefire-report.html" for an old example (I've been using it on my own for some time). The format is not perfect, but better than nothing for now.

This is better (alphabetical ordering), but not ideal. There is data that changes on every run (the Time column). Ideally there would also be a .txt file with proper formatting so it looks like a table. Then, when doing a diff, everything is clearly visible. Right now, if I copy-paste the HTML into a text file and then do a diff, every line has changed due to the different Time values. Even if I use a more advanced diff tool that gives me a precise inline diff, in text form the formatting is all skewed and hard to read...

For point number three, a slow workaround is to run another packaging/pkg/mvn install. Maybe we could add some cleanup mechanism, so each test can define a tear-down sequence which would go through all the steps, ignoring any errors. Removing test artifacts created during a test is already part of the existing tests. But if it were separated, it could be run explicitly:

- We have a problem using test suites with our Maven approach; that is why each script is independent and cleans up after itself.
- This also permits us to keep the artifacts of failed tests to better analyse what happened
- we would need to pass a specific scenario between each test... (any ideas on this would be helpful)

Actually, I was thinking more like having complementary test files:

Test_SNF_PRL_03_ChangeDisplayLanguage.html
Test_CLEAN_SNF_PRL_03_ChangeDisplayLanguage.html

And then run: -Dtest=Test_CLEAN*

The CLEAN test would be automatically executed right after its complementary non-CLEAN test. But you could also run all the CLEAN tests directly, and these would not be sensitive to errors: they would run all the way through on a best-effort basis.

mvn -Pselenium-cleanup integration-test

Otherwise, thumbs up for the sheer number of these tests and the systematic approach...

For that we can thank Hang Nguyen for her work on this!
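The complementary-file idea discussed above can be sketched in shell. This is only an illustration of the naming and best-effort semantics: run_test here is a stand-in that just echoes the script name (the real runner would be the Selenium Maven plugin), and the second test name is hypothetical.

```shell
#!/bin/sh
# Stand-in for the real Selenium test runner (hypothetical).
run_test() {
  echo "running $1"
}

# For each sniff test, derive the complementary CLEAN script name by
# inserting CLEAN_ after the Test_ prefix, run the test, then run the
# cleanup best-effort: a cleanup failure is reported but never stops the sweep.
for t in Test_SNF_PRL_03_ChangeDisplayLanguage.html \
         Test_SNF_NAV_01_CreateNode.html; do
  clean="Test_CLEAN_${t#Test_}"
  run_test "$t"
  run_test "$clean" || echo "cleanup of $t failed (ignored)"
done
```

The `|| echo ... (ignored)` is what makes the CLEAN pass insensitive to errors, matching the "run all the way through on a best-effort basis" behaviour described above.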
I attached 2 docs that can give you more visibility into how we work in eXo Vietnam.

- marko
_______________________________________________
gatein-dev mailing list
gatein-dev@lists.jboss.org
https://lists.jboss.org/mailman/listinfo/gatein-dev