On 13 Jan 2016, at 6:09, John Sanda wrote:
I am currently working on accounts integration in metrics. The
integration code lives in the hawkular-component module in the metrics
repo. Now I want to write some integration tests to verify that things
work as expected. I don’t really think that integration/glue code
belongs in the component repo. I want to make sure that wherever it
lives I can still get feedback from CI builds when changes are made.
When I make a change to component code in the metrics repo, the CI
build runs and checks for regressions by running tests. There is a
pretty quick feedback loop which is very important. If the glue code
is living elsewhere, I would like a CI build to get kicked off at
some point after the metrics CI build completes successfully, so that
I can find out if any of my changes to the component code caused a
regression in the glue code. And of course if I make a change directly
in the glue code, I should get feedback from the check-in CI job.
Is that metrics-accounts-specific glue code? By which I mean:
is it also needed for Hawkular? If so, is that test then
enough to verify functioning for all of Hawkular?
In RHQ we had a build pipeline with Jenkins that handled this for us.
Is this possible with Travis?
Travis apparently does not have build pipelines yet. They seem
to be in the pipeline, but it looks like in the first iteration they
will not be as powerful as in Jenkins.
In theory we could do that with Travis by having an overall
repo/build script
foo:
- build metrics
- test integration
where 'build metrics' would push the generated artifact somewhere
and 'test integration' would load it from this somewhere and run the
tests against it. IIRC we already push all the build results
to the Nexus snapshot repo, so we already have the 'somewhere'.
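As a rough illustration, the overall repo's .travis.yml could look something like the sketch below. This is only an assumption of how the two steps might be wired up; the Maven coordinates and profile name are hypothetical, and it assumes the metrics snapshot has already been deployed to Nexus by the metrics CI build.

```yaml
# Hypothetical .travis.yml for an overall "integration" repo.
# Assumes the hawkular-metrics snapshot is already in the Nexus
# snapshot repository, pushed there by the metrics CI build.
language: java
jdk: oraclejdk8
script:
  # 'build metrics' step: instead of rebuilding metrics here, resolve
  # the latest snapshot artifact from Nexus (coordinates illustrative).
  - mvn dependency:get -Dartifact=org.hawkular.metrics:hawkular-metrics-api:0.13.0-SNAPSHOT
  # 'test integration' step: run the glue-code integration tests
  # against the artifact that was just resolved (profile name assumed).
  - mvn verify -Pintegration-tests
```

The point of resolving rather than rebuilding is that the integration build then tests exactly the artifact the metrics CI produced.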
For some "larger scale" itests, I think they should go into
hawkular-main, but I understand very well that this does not
solve your issue, especially as there is currently no way of
triggering a hawkular-main build when e.g. metrics makes a
change.
We need to keep an eye though on the duplication of code
to stand up the itests. It may not be good to have that in all
projects in various permutations, as it will be a maintenance
nightmare.