Splitting out dml/ddl so one could build something like a "structural
compare of configurations" to create a better migration would be
interesting; though I really don't believe dynamic mappings that also
dynamically decide the db layout and statements are going to be more
than a quick prototyping tool, i.e. don't go into production with it :)
But something like this and/or a real Hibernate Migrations API would
definitely be useful.
/max
Steve Ebersole wrote:
1) A DML/DDL split is not going to happen. Too broad.
2) keep this discussion on list please.
On Wed, 2009-06-10 at 14:49 +0200, Francis Galiegue wrote:
> 2009/6/10 Steve Ebersole<steve(a)hibernate.org>:
>
>> On Tue, 2009-06-09 at 17:11 +0200, Francis Galiegue wrote:
>>
> [...]
>
>>> Steven has pointed to a Jira task talking about an overhaul of the
>>> Dialect abstract class and all its derivatives, because for one, the
>>> Dialect doesn't provide "purpose oriented" capabilities, just one big
>>> lump of methods. After looking at the code (3.3.1), I can see that
>>> this is the case: for instance, there's no separation between DML and
>>> DDL.
>>>
>> Well the JIRA[1] to which I pointed you does not really go into this
>> per-se. It is more talking about potentially changing the way DDL is
>> created to use a delegate. Currently what happens is that the mapping
>> objects (Table, Index, etc) all know how to create their own DDL.
>> The tie-in with Dialect here is that they coordinate with the Dialect to
>> figure out its capabilities and syntax. What I'd rather see is an
>> externalization of this DDL rendering. There are many parts to this and
>> it is in no way trivial. But the basic "externalize the DDL generation
>> code" should not be overly complex.
>>
>> I too had been considering some similar form of cleanup. The ones I had
>> considered were things like grouping together various "purpose
>> oriented" methods into componentized contracts. The ones I mentioned to
>> you as examples were the methods pertaining to a dialect's IDENTITY
>> support and its sequence support. So I'd imagine something like:
>>
>> public interface IdentitySupport {
>>     public boolean supportsIdentityColumns();
>>     public boolean supportsInsertSelectIdentity();
>>     ...
>> }
>>
>> public interface SequenceSupport {
>>     public boolean supportsSequences();
>>     public boolean supportsPooledSequences();
>>     ...
>> }
>>
>> The Dialect contract could then be changed to return these contracts.
>> But this is all really secondary to what you want to achieve. For your
>> purposes, you are really just interested in the DDL generator piece.
>> For that I had envisioned asking the Dialect to get its DDL generation
>> delegate; maybe something like:
>> class Dialect {
>>     ...
>>     public DDLGenerator getDDLGenerator() {
>>         return ...;
>>     }
>> }
>>
>> But I never got the chance to completely think this all through. There
>> are lots of design questions that need to be addressed. Stuff like does
>> the delegate contract allow for both create/drop and alter usages? Or
>> do we ask the Dialect for a delegate for each usage? At what
>> granularity do we generate DDL commands? And if say at the granularity
>> of each Table etc, how do we account for say a table DDL command which
>> includes creating the PK versus one that does not (and therefore needs
>> separate DDL)?
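One hypothetical way those design questions could shake out: a single delegate contract with explicit create/drop/alter entry points, generating at Table granularity, with the PK split controlled by a flag so a table command with an inline PK and one needing separate DDL are both expressible. Every name here (DDLGenerator, SimpleDDLGenerator, the method signatures) is an illustration, not a proposed final API:

```java
import java.util.List;

// Sketch: one possible shape of the externalized DDL generation
// delegate. Names and granularity choices are illustrative only.
interface Table { String getName(); }

interface DDLGenerator {
    // Explicit entry points answer "create/drop vs alter" by naming
    // each usage, rather than handing out one delegate per usage.
    List<String> sqlCreate(Table table, boolean includePrimaryKey);
    List<String> sqlDrop(Table table);
    List<String> sqlAlter(Table oldTable, Table newTable);
}

class SimpleDDLGenerator implements DDLGenerator {
    public List<String> sqlCreate(Table t, boolean includePrimaryKey) {
        String create = "create table " + t.getName() + " (id bigint)";
        // When the PK cannot be part of the CREATE, emit it separately.
        return includePrimaryKey
            ? List.of(create,
                      "alter table " + t.getName() + " add primary key (id)")
            : List.of(create);
    }
    public List<String> sqlDrop(Table t) {
        return List.of("drop table " + t.getName());
    }
    public List<String> sqlAlter(Table oldT, Table newT) {
        return List.of(); // structural diffing elided in this sketch
    }
}
```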
>>
>> Anyway, that's about as far as I got.
>>
>>
> Well, my primary interest is in DDL, admittedly, but I'm interested in
> the whole Dialect in general.
>
> I still see as necessary a distinction between DML and DDL. I imagine
> one could do such a thing as:
>
> DMLDialect dml = Dialect.getDMLDialect();
>
> if (dml.supportsWhatever()) ...
>
> I haven't had a deep enough look to separate DML from DDL usage yet.
> Most importantly, I don't know where the mechanism to generate DML
> statements is, and how it works.
>
> As to DDL itself, it is basically stateless. If it were only for me
> and my crazy usage scenario (dynamic mappings at runtime), this is the
> way I envision it:
>
> // oldcfg the old Configuration, newcfg the new Configuration - note
> // that the basic settings don't change, only the mappings can change
> DDLExecutor ddlexec =
> newcfg.getDDLDialect().generateDDLExecutor(oldcfg, newcfg);
> ddlexec.before();
> // renew SessionFactory with newcfg
> ddlexec.after();
>
> The DDLDialect would have a TableSupport, ColumnSupport,
> SequenceSupport, IdentityColumnSupport, IndexSupport..., and
> .generateDDLExecutor() would generate a DDLExecutor object (an
> interface, really) with DDL statements that must be executed before
> renewal of the SessionFactory (.before()), because it has been
> detected that the DDL changes between oldcfg and newcfg _would_
> affect existing Sessions, and DDL statements that can be postponed
> until after the new SessionFactory has been spawned (.after()).
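A minimal sketch of that two-phase executor, under the assumptions in the mail; none of these names (DDLExecutor, ScriptedDDLExecutor) exist in Hibernate, and real statement execution over JDBC is stubbed out with a list:

```java
import java.util.ArrayList;
import java.util.List;

// Sketch of the two-phase executor: DDL that must run before the
// SessionFactory is renewed vs. DDL that can wait until after.
interface DDLExecutor {
    void before(); // DDL that would affect existing Sessions
    void after();  // DDL that can be postponed past the renewal
}

class ScriptedDDLExecutor implements DDLExecutor {
    private final List<String> beforeScripts = new ArrayList<>();
    private final List<String> afterScripts = new ArrayList<>();
    final List<String> executed = new ArrayList<>(); // stand-in for JDBC

    void addBefore(String sql) { beforeScripts.add(sql); }
    void addAfter(String sql)  { afterScripts.add(sql); }

    public void before() { executed.addAll(beforeScripts); }
    public void after()  { executed.addAll(afterScripts); }
}
```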
>
> The .generateDDLExecutor would then be in charge of detecting
> changes, calling upon the appropriate subdialects (the appropriate
> DDL "fragments", say, IndexSupport if it sees that an index must be
> added/removed), generating the appropriate scripts, and filling the
> DDLExecutor with said scripts (the ones that must be executed before
> the SessionFactory renewal and the ones that may be delayed until
> after it has been spawned).