Hibernate Search for Hibernate 5 - status
by Sanne Grinovero
I'm concerned about seeing issues like this one being reported:
- https://hibernate.atlassian.net/browse/HHH-9832
I don't think it's acceptable that we withhold a Hibernate 5 compatible
version of Hibernate Search for much longer.
I have had a working branch of Hibernate Search "on hold" for a while;
it compiles and passes all tests successfully up to the WildFly
integration tests, as I have not yet found a way to run those tests
using modules while overriding the Hibernate ORM version (see the
wildfly-dev mailing list for details).
It was also not viable to keep this Hibernate Search branch backwards
compatible with Hibernate ORM 4.3 as included in the available WildFly
releases.
Also, since I have been focusing on the WildFly roadblock so far, the
branch still needs a bit of work; for example, I haven't tackled the
OSGi tests yet.
The "theme" for current development branch - Hibernate Search 5.3 - so
far was about the great new faceting improvements. As I discussed with
Hardy, we'd like to polish that branch for a release, so include only
necessary wrap up and fixing any reported regression, but focus
aggressively already on a new minor branch so that we could release an
ORM5 compatible Alpha in short time.
We could even consider working on 5.3 and 5.4 in parallel.
Would you all be ok with a quick release - probably named 5.4.0.Alpha1 -
of Hibernate Search that is compatible with ORM 5 but omits publishing
the modules for WildFly? That would be useful for WildFly 10 too, so
they can include the new libraries and ORM 5, breaking the cycle.
As a next step we'll also look into breaking those cycles permanently;
e.g. Hardy suggested releasing the modules and feature packs for WildFly
on an independent cycle, which I like, although we can defer that
decision until after 5.4.
Thanks,
Sanne
Re: [hibernate-dev] HSearch + Tika bridge using Wildfly modules
by Sanne Grinovero
Hi Brett,
we don't include all existing analysers and extensions within the
WildFly modules. In particular, the Apache Tika libraries have a huge
number of dependencies, so you should choose the ones you need
depending on what kind of media you intend to parse.
Include any such extension in your "application": we use the Hibernate
ORM classloader to look up extensions, so they should be discoverable
as long as they are visible to the same classloader that holds your
entities and other extensions.
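As a concrete sketch of what I mean (hypothetical entity and property
names; only the annotations are real, and tika-core plus the parser
jars you actually need would sit in your deployment, e.g. WEB-INF/lib):

    import javax.persistence.Entity;
    import javax.persistence.Id;
    import javax.persistence.Lob;

    import org.hibernate.search.annotations.Field;
    import org.hibernate.search.annotations.Indexed;
    import org.hibernate.search.annotations.TikaBridge;

    @Entity
    @Indexed
    public class MediaFile {

        @Id
        private Long id;

        // Tika extracts indexable text from the binary content at indexing
        // time; the org.apache.tika.parser.Parser implementations must be
        // visible to the classloader of this deployment.
        @Field
        @TikaBridge
        @Lob
        private byte[] content;
    }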
Sanne
On 29 May 2015 at 15:28, Brett Meyer <brmeyer(a)redhat.com> wrote:
> Hey Sanne! Artificer has '<module name="org.hibernate.search.orm" services="export" />' defined in its jboss-deployment-structure dependencies. But, when we try to use it, the following happens.
>
> Caused by: java.lang.ClassNotFoundException: org.apache.tika.parser.Parser from [Module "org.hibernate.search.engine:main" from local module loader @6cf76647 (finder: local module finder @665bf734 (roots: /home/brmeyer/IdeaProjects/artificer/installer/target/wildfly-8.2.0.Final/modules,/home/brmeyer/IdeaProjects/artificer/installer/target/wildfly-8.2.0.Final/modules/system/layers/base))]
>
> One of our entities uses the built-in TikaBridge. I figured the search.orm module would bring the necessary Tika jars in with it. Is there something else we need to add?
Changelog file in Hibernate ORM
by Sanne Grinovero
The file changelog.txt in the root of the Hibernate ORM project seems outdated.
Is it not maintained anymore? I found it handy.
Sanne
hibernate-osgi JPA bootstrap & classloader
by Steve Ebersole
Brett,
As part of HHH-7527 (Enterprise OSGi support) you changed
the org.hibernate.jpa.boot.spi.Bootstrap contract, basically overloading
each method to additionally accept a "providedClassLoader".
Every one of those methods, however, also accepts
an org.hibernate.jpa.boot.spi.PersistenceUnitDescriptor, which already
exposes 2 ClassLoaders.
Additionally, this ClassLoader is ultimately just used to build the
ClassLoaderService which hibernate-osgi overrides anyway.
Just curious if I missed something. Unless I did, it seems to me that we
really do not need these overloads on Bootstrap to support Enterprise
OSGi. This dovetails with a discussion from the Karaf user list
ultimately about OsgiClassLoaderService and "holding bundles" that are
being re-installed or upgraded. Ultimately I am thinking through ways to
support being able to release OSGi bundle references from the
OsgiClassLoaderService...
Release announcements
by Steve Ebersole
At the moment we write release announcements on in.relation.to and then
announce in other media by posting that link. We all agree (more or
less) that the wiki editor and rendering there leave much to be desired.
Brett mentioned GitHub and its release capabilities a long time ago,
and tonight I went back and looked at them again. For this 5.0.0.CR1
release I created a more descriptive release announcement on GitHub
than its default of just using the tag message, just to see how it
worked out. Here are the 2 links for comparison:
* http://in.relation.to/Bloggers/HibernateORM500CR1Release
* https://github.com/hibernate/hibernate-orm/releases/tag/5.0.0.CR1
There are a few things to notice here. First, I think we can all agree
that the second looks considerably better. Also, there is a lot to be
said for these being directly available from the source repo.
Another thing to note is that a GitHub release has the ability to attach
arbitrary zips and tgz files. I have not taken advantage of that, as I
felt bad uploading our 62 and 97 MB release bundles there just for a PoC.
I do have a task for myself post 5.0 to re-think how we build these
release bundles[1]. But ultimately whether it makes sense to attach them
here comes down to whether we think SF and its FRS have any advantage. I
think download statistics will be the only discussion point there.
Anyway, moving forward I plan to move to this approach for release
announcements.
The only sticky point is the sourcing for the banners on hibernate.org.
What drives that? RSS?
Thoughts?
[1] https://hibernate.atlassian.net/browse/HHH-9828
HCANN, AnnotationFactory and TCCL
by Steve Ebersole
WildFly consuming ORM 5.0 is still hitting one last TCCL issue with HCANN.
It happens in
the org.hibernate.annotations.common.annotationfactory.AnnotationFactory#create
method when trying to build the "annotation proxy class".
There are a few possible approaches to resolve this...
The simplest potentially affects other HCANN consumers, so we'd obviously
all need to agree. Anyway, the simplest approach is to use the ClassLoader
of the annotation @interface Class rather than the TCCL. I do not see a
problem with that, but it would change the semantics somewhat.
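Roughly, the change amounts to something like this (a simplified sketch
of what an annotation proxy factory does, not the actual HCANN code):

    import java.lang.annotation.Annotation;
    import java.lang.reflect.InvocationHandler;
    import java.lang.reflect.Proxy;

    public final class AnnotationProxySketch {

        // Current behaviour (simplified): resolve the proxy class against the
        // TCCL, which breaks under WildFly when the TCCL cannot see the
        // annotation type.
        static <A extends Annotation> A proxyWithTccl(Class<A> annotationType, InvocationHandler handler) {
            ClassLoader cl = Thread.currentThread().getContextClassLoader();
            return annotationType.cast(
                    Proxy.newProxyInstance(cl, new Class<?>[] { annotationType }, handler));
        }

        // Proposed behaviour: use the ClassLoader that defined the @interface
        // itself, which by definition can see the annotation type.
        static <A extends Annotation> A proxyWithOwnLoader(Class<A> annotationType, InvocationHandler handler) {
            ClassLoader cl = annotationType.getClassLoader();
            return annotationType.cast(
                    Proxy.newProxyInstance(cl, new Class<?>[] { annotationType }, handler));
        }
    }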
The only other workable approach (without a redesign of HCANN) I could
think of is to basically make a copy of AnnotationFactory inside ORM.
Sure, I could overload AnnotationFactory#create to optionally accept a
ClassLoader, but that introduces a hard dependency on a new specific
version of HCANN.
Open to other suggestions. Thoughts?
ORM and database testing
by Steve Ebersole
Now that CR1 is out one of my tasks is to start setting up the database
specific jobs on CI. Initially I will just work with MySQL and PostgreSQL
(and maybe HSQLDB).
Part of this will be auditing how we do database testing (matrix plugin)
and what does/doesn't work there. I definitely like the idea of "database
profiles". Overall I am not sure that dynamically generating tasks
specific to each database profile was a great idea. It was predicated on
the idea that I might want to run tests against all database profiles. But
that simply has not been the case in practice to date. I have to look
again at the complexity the actual matrix stuff adds. If it is a lot of
complexity, I might just remove that part and have this be driven by a
single build parameter (e.g. `gradle test -PdbProfile=mysql`).
I'd also like to look at specific hooks for the profiles in terms of
pre-/post-events at the suite, class and test levels. For example, we
might have the H2 profile set up pre-/post-suite hooks to manage the
database rather than each test building one itself. This would have a lot
of benefits. Presumably it would help speed up the build somewhat. It
would also behave more like the non-in-memory test builds and possibly
help shake out test problems with regard to db-object conflicts earlier.
Also, in general we might decide to hook in after each class to drop
schemas (ultimately HHH-8451 is a better solution to both of these, I
think).
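As a rough illustration of the suite-level hook idea (hypothetical class
names, plain JUnit 4 style; not how the build would actually wire this
up):

    import org.junit.AfterClass;
    import org.junit.BeforeClass;
    import org.junit.runner.RunWith;
    import org.junit.runners.Suite;

    @RunWith(Suite.class)
    @Suite.SuiteClasses({ /* the database-sensitive test classes */ })
    public class H2ProfileSuite {

        // "Pre" hook: create the shared schema once for the whole suite,
        // instead of each test class building its own database.
        @BeforeClass
        public static void createSharedSchema() {
            // e.g. run the schema export against the profile's database
        }

        // "Post" hook: drop the schema so db-object conflicts surface here
        // rather than leaking between runs.
        @AfterClass
        public static void dropSharedSchema() {
        }
    }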
Anything else to consider here?
More Dialect and quoting fun
by Steve Ebersole
What follows is solely an issue with schema update and schema validation, 2
tools that to date we have not "really" supported, but I am trying to
change that with 5.0.
We discussed before the idea of auto-quoting and who should be the
authority with regard to keywords. We decided that it would be the
Dialect, for Dialects that choose to do it, rather than the JDBC
DatabaseMetaData. However, there is another piece to this that we
currently miss. It has to do with the following methods:
* java.sql.DatabaseMetaData#storesMixedCaseQuotedIdentifiers
* java.sql.DatabaseMetaData#storesLowerCaseQuotedIdentifiers
* java.sql.DatabaseMetaData#storesUpperCaseQuotedIdentifiers
* java.sql.DatabaseMetaData#storesMixedCaseIdentifiers
* java.sql.DatabaseMetaData#storesUpperCaseIdentifiers
* java.sql.DatabaseMetaData#storesLowerCaseIdentifiers
We already have bug reports that come directly from drivers not
implementing these "properly".
They come into play in the implementation of the following methods (on
org.hibernate.engine.jdbc.env.spi.IdentifierHelper):
* toMetaDataCatalogName
* toMetaDataSchemaName
* toMetaDataObjectName
* fromMetaDataCatalogName
* fromMetaDataSchemaName
* fromMetaDataObjectName
The to* methods are used when binding the Identifiers to the metadata
queries (e.g. the DatabaseMetaData#getTables method). The from* methods
are used when extracting values from the results. We currently rely on
the answers to the referenced DatabaseMetaData methods to determine the
quoting->case and case->quoting conversions.
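To make the to* side concrete, this is roughly the decision involved (a
simplified sketch, not the actual IdentifierHelper code; the method name
is made up):

    import java.sql.DatabaseMetaData;
    import java.sql.SQLException;

    final class IdentifierCaseSketch {

        // Unquoted identifiers are folded to whatever case the driver claims
        // the database stores them in; quoted identifiers pass through as-is.
        static String toMetaDataCase(String name, boolean quoted, DatabaseMetaData meta)
                throws SQLException {
            if (quoted) {
                return name;
            }
            if (meta.storesUpperCaseIdentifiers()) {
                return name.toUpperCase();
            }
            if (meta.storesLowerCaseIdentifiers()) {
                return name.toLowerCase();
            }
            return name;
        }
    }

So e.g. an unquoted "customer_order" would be passed to getTables as
"CUSTOMER_ORDER" when the driver answers storesUpperCaseIdentifiers() ==
true - and a driver answering these methods incorrectly breaks exactly
this step.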
My proposal is that we go a step further than what we did for Dialect and
auto-quoting. For that, we defined a
Dialect#determineKeywordsForAutoQuoting method and allowed Dialects to
override it if they wanted. The only place that method is actually used is
in building the IdentifierHelper instance. So my proposal is that we drop
Dialect#determineKeywordsForAutoQuoting and instead define a
Dialect#buildIdentifierHelper method. This could work one of 2 ways.
First, the Dialect#buildIdentifierHelper method accepts a DatabaseMetaData
and the base implementation would do what we do today. However,
DatabaseMetaData can very well be null; when we initialize
IdentifierHelper atm we assume some (H2 based) defaults. So going this
first route would mean that each Dialect impl wanting to override this
method has to handle nulls there. Not ideal.
A second approach would be to have Dialect#buildIdentifierHelper accept
either no parameters or just one parameter, the same as what is passed to
Dialect#determineKeywordsForAutoQuoting. This would work such that a null
return from this method means falling back to an approach based on
DatabaseMetaData.
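To make the shape of the second option concrete (everything here is
hypothetical - the method does not exist yet, and I used the
no-parameter variant):

    import org.hibernate.dialect.H2Dialect;
    import org.hibernate.engine.jdbc.env.spi.IdentifierHelper;

    // Sketch only: a Dialect either builds an IdentifierHelper itself, or
    // returns null to ask the engine to fall back to the
    // DatabaseMetaData-driven behaviour we have today (or the H2-ish
    // defaults when DatabaseMetaData is not available).
    public class ExampleDialect extends H2Dialect {

        public IdentifierHelper buildIdentifierHelper() {
            // A dialect with fixed, well-known identifier and keyword rules
            // would construct and return its helper here, never touching
            // DatabaseMetaData. This example has no such opinion:
            return null;
        }
    }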
What do y'all think?