Well that's an interesting finding. I guess that's better than it
happening for all DBs. But still a bit concerning. :(
-Eric
On 8/15/2015 2:42 AM, Brandon Gaisford wrote:
Yo,
Some additional findings regarding “… cursory tests against H2 using the existing PR
changelogs and then updating the JPA entities and then doing a diff to see the level of
effort to reconcile the changelogs. The results that come back are somewhat perplexing.
All the unique constraints come back as new adds, the hibernate_sequence comes back as a
drop, etc, etc.”
It turns out that this odd behavior of reporting existing db objects as new may be
specific to H2.
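
The kind of diff I’ve been running looks roughly like this, against two local H2
databases, one built from the existing changelogs and one updated from the JPA entities
(URLs and file names below are placeholders, not the actual test setup):

    liquibase --driver=org.h2.Driver \
              --url=jdbc:h2:./target/apiman-from-changelogs \
              --username=sa --password= \
              --changeLogFile=reconcile.changelog.xml \
              diffChangeLog \
              --referenceUrl=jdbc:h2:./target/apiman-from-entities \
              --referenceUsername=sa --referencePassword=

The changesets written to reconcile.changelog.xml are where the spurious “new” unique
constraints and the hibernate_sequence drop show up.
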
Brandon
On Jul 7, 2015, at 5:12 PM, Brandon Gaisford <bgaisford(a)punagroup.com> wrote:
>
> Hey all,
>
> I don’t have much time of late to work on this task, but I have found some cycles here
> and there. I was able to hook up the liquibase “offline” mode, pretty straightforward.
> I did some cursory tests against H2 using the existing PR changelogs and then updating
> the JPA entities and then doing a diff to see the level of effort to reconcile the
> changelogs. The results that come back are somewhat perplexing. All the unique
> constraints come back as new adds, the hibernate_sequence comes back as a drop, etc,
> etc. The devil is in the details. :)
>
> What do you guys think about introducing the “apiman” schema? It’s probably a good idea
> to get away from the default public schema. I’m not sure of your install base and what
> downstream effects this has. Think about it.
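>
> Roughly what I’m picturing, just as a sketch (the changeset id/author here are made up,
> and this would only apply to DBs that support schemas):
>
>     <changeSet id="000-create-apiman-schema" author="apiman" dbms="postgresql">
>       <sql>CREATE SCHEMA IF NOT EXISTS apiman</sql>
>     </changeSet>
>
> and then point liquibase at the new schema via its defaultSchemaName setting when
> running updates.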
>
> I’ll keep plugging away as I have time.
>
> Best regards,
>
> Brandon
>
>
> On Jun 29, 2015, at 6:42 AM, Brandon Gaisford <bgaisford(a)punagroup.com> wrote:
>
>>
>> Hey Brett,
>>
>> Thanks for the feedback! Excellent find on the Liquibase “offline” mode, thank you.
>> I’ll give it a try as soon as I can get to it.
>>
>> Regarding your item 3, many production operations guys will only allow vetted SQL
>> scripts to progress onto their production systems. In my opinion, it would be unwise
>> not to support SQL DDLs and SQL migration scripts. I like the Liquibase-managed
>> approach as well; I think some users would really appreciate it. A hybrid approach,
>> perhaps?
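>>
>> For instance, something along these lines could emit a script for the ops folks to
>> vet before anything touches production (URL, credentials, and file names are just
>> placeholders):
>>
>>     liquibase --url=jdbc:postgresql://staging-host/apiman \
>>               --username=apiman --password=CHANGEME \
>>               --changeLogFile=master.changelog.xml \
>>               updateSQL > apiman-upgrade.sql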
>>
>> Brandon
>>
>> On Jun 29, 2015, at 5:04 AM, Brett Meyer <brmeyer(a)redhat.com> wrote:
>>
>>> Hey guys, apologies for the delayed response! Brandon, thanks for taking a look at
>>> Liquibase. Artificer is going to need a similar strategy...
>>>
>>> Some thoughts:
>>>
>>> 1.) Although I understand the desire to lean on Hibernate's SchemaCreator,
>>> SchemaUpdater, and other tools, I'd highly advise against relying on them in
>>> production environments. (I'm one of the Hibernate ORM core devs.) The tools have to
>>> make quite a few assumptions, many of which end up being less than ideal. Even with
>>> additional annotations (column sizes, column types, indexes, etc.), you still end up
>>> with something that needs a lot of optimization help. It's far better to control a
>>> set of manually-maintained DDL SQL scripts, as apiman and Artificer have done. That
>>> said, SchemaCreator can fit into that picture: I'll typically let it generate DDL for
>>> all my supported dialects, then update them by hand. It helps with some of the busy
>>> work...
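>>>
>>> As a rough sketch of that workflow, one option is the standard JPA 2.1
>>> schema-generation properties, which dump the DDL to a file for whichever dialect is
>>> configured (the file name and dialect below are just examples); the generated script
>>> then gets hand-tuned:
>>>
>>>     <property name="javax.persistence.schema-generation.scripts.action" value="create"/>
>>>     <property name="javax.persistence.schema-generation.scripts.create-target" value="apiman_h2.ddl"/>
>>>     <property name="hibernate.dialect" value="org.hibernate.dialect.H2Dialect"/>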
>>>
>>> 2.) As I understand it, Liquibase does *not* require a live DB. See
>>> http://www.liquibase.org/documentation/offline.html.
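>>>
>>> If I'm reading those docs right, you point it at an "offline" URL instead of a JDBC
>>> one and generate the SQL without ever connecting (the DB type and file names below
>>> are just examples):
>>>
>>>     liquibase --url="offline:postgresql?changeLogFile=changelog-state.csv" \
>>>               --changeLogFile=master.changelog.xml \
>>>               updateSQL > apiman_postgresql.sql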
>>>
>>> 3.) What do you guys think about forcing Liquibase as the only DDL installation
>>> route, as opposed to also including the literal, most-up-to-date SQL scripts in the
>>> distro?
>>>
>>> ----- Original Message -----
>>>> From: "Eric Wittmann" <eric.wittmann(a)redhat.com>
>>>> To: "Brandon Gaisford" <bgaisford(a)punagroup.com>
>>>> Cc: "Marc Savy" <marc.savy(a)redhat.com>, "Brett Meyer" <brmeyer(a)redhat.com>
>>>> Sent: Thursday, June 25, 2015 7:11:50 AM
>>>> Subject: Re: APIMAN-451 Doc
>>>>
>>>> OK that does make it clearer, thanks.
>>>>
>>>> On 6/24/2015 6:13 PM, Brandon Gaisford wrote:
>>>>>
>>>>> I don’t think I answered your question #3 very well. I’ll take another stab at it
>>>>> here just to be clear. The process pretty much goes like this:
>>>>>
>>>>> 1) We have a set of changelogs that describe the desired end state of our database
>>>>> (all the changelogs in the current folder).
>>>>> 2) Given an existing database (empty for DDL creation), we ask liquibase to update
>>>>> the existing database so it is in sync with the current changelogs.
>>>>> 3) Liquibase queries the existing database and asks, “what changeset are you
>>>>> currently on?”
>>>>> 4) Given the existing changeset the db is at, and the final changeset the current
>>>>> changelogs are at, liquibase knows what changesets to apply to update the database.
>>>>> 5) We can instruct liquibase to not actually update the database, but instead emit
>>>>> the SQL that would be used to make the update.
>>>>>
>>>>> That’s how the migration scripts (and DDLs) would be created.
>>>>> Additionally, given two databases built using liquibase changelogs, one could diff
>>>>> those databases and also generate a migration SQL script.
>>>>>
>>>>> Hope that’s clearer. Liquibase can only diff two databases. Update and diff
>>>>> operations are different.
>>>>>
>>>>> Brandon
>>>>>
>>>>> On Jun 24, 2015, at 9:30 AM, Eric Wittmann <eric.wittmann(a)redhat.com> wrote:
>>>>>
>>>>>> Thanks - I think I understand now (regarding point #3). Basically the downside
>>>>>> to using liquibase is that to generate the DB-specific DDLs we need live
>>>>>> databases.
>>>>>>
>>>>>> -Eric
>>>>>>
>>>>>> On 6/24/2015 3:14 PM, Brandon Gaisford wrote:
>>>>>>> Hey Eric,
>>>>>>>
>>>>>>> Regarding your queries below:
>>>>>>>
>>>>>>> 1) Have a look at 000-apiman-manager-api.db.sequences.changelog.xml. That file
>>>>>>> contains the DBMS-specific sequence creation changesets. We should create a new
>>>>>>> file along the same lines to deal with indexes (rough sketch below). I’ll add
>>>>>>> that to my TODO list.
>>>>>>>
>>>>>>> 2) Yes, if we manage the changelogs (specifically the meta-data within the
>>>>>>> changesets) correctly, we can produce db version x to version y migration
>>>>>>> scripts on demand.
>>>>>>>
>>>>>>> 3) Currently liquibase can only compare two databases. So to produce the update
>>>>>>> SQL (migration script) to go from one database version to another requires two
>>>>>>> databases. Those databases would be created using the appropriate changelogs for
>>>>>>> that specific database version.
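>>>>>>>
>>>>>>> The indexes file mentioned in 1) could look something like this (the file name,
>>>>>>> changeset id/author, and the table/index/column names are made up for
>>>>>>> illustration):
>>>>>>>
>>>>>>>     <changeSet id="addIndex-users-username" author="apiman">
>>>>>>>       <createIndex tableName="users" indexName="IDX_users_username">
>>>>>>>         <column name="username"/>
>>>>>>>       </createIndex>
>>>>>>>     </changeSet>
>>>>>>>
>>>>>>> with a dbms attribute on any changeset that only applies to certain databases,
>>>>>>> the same way the sequences changelog handles it.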
>>>>>>>
>>>>>>> Hope this makes sense.
>>>>>>>
>>>>>>> Brandon
>>>>>>>
>>>>>>>
>>>>>>> On Jun 24, 2015, at 6:29 AM, Eric Wittmann <eric.wittmann(a)redhat.com> wrote:
>>>>>>>
>>>>>>>> Hey Brandon.
>>>>>>>>
>>>>>>>> Thanks for the high-level doc - very informative.
>>>>>>>>
>>>>>>>> I have a couple of questions (I haven't looked deeply into the PR yet).
>>>>>>>>
>>>>>>>> 1) Where/how do we manage indexes? Currently we are adding a variety of indexes
>>>>>>>> in the manual DDLs.
>>>>>>>> 2) Can differential migration scripts be generated on-demand between any two
>>>>>>>> arbitrary versions of apiman?
>>>>>>>> 3) What files are used when generating migration scripts? I assume the changelog
>>>>>>>> (xml) files?
>>>>>>>>
>>>>>>>> -Eric
>>>>>>>>
>>>>>>>> On 6/20/2015 9:37 PM, Brandon Gaisford wrote:
>>>>>>>>> Here you go, guys. I think this doc will help you better understand the
>>>>>>>>> project. I don’t have Brett’s email, so perhaps one of you could forward it
>>>>>>>>> along.
>>>>>>>>>
>>>>>>>>> Laters,
>>>>>>>>>
>>>>>>>>> Brandon
>>>>>>>>>
>>>>>>>
>>>>>
>>>>
>>
>
>
_______________________________________________
Apiman-dev mailing list
Apiman-dev(a)lists.jboss.org
https://lists.jboss.org/mailman/listinfo/apiman-dev