[
http://opensource.atlassian.com/projects/hibernate/browse/HHH-1501?page=c...
]
Steve Ebersole commented on HHH-1501:
-------------------------------------
Hibernate does not take the length/scale/precision or the dialect into account when
determining the Hibernate Type to use for a property that did not specify one explicitly.
It simply chooses a default based solely on the Java type, performing the lookup via a
call to TypeFactory.heuristicType.
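To make the gap concrete, here is a toy sketch (not Hibernate's actual code; all names besides TypeFactory.heuristicType's general idea are made up) of a heuristic lookup keyed only on the Java type, so a declared length never influences the result:

```java
import java.util.Map;

public class HeuristicTypeDemo {
    // Hypothetical mapping from Java type to a default SQL type name.
    static final Map<Class<?>, String> DEFAULTS = Map.of(
        String.class,  "VARCHAR",
        Integer.class, "INTEGER",
        byte[].class,  "VARBINARY"
    );

    static String heuristicType(Class<?> javaType, int length) {
        // The length parameter is ignored entirely -- this is the
        // gap the comment describes: the choice is Java-type-only.
        return DEFAULTS.getOrDefault(javaType, "OTHER");
    }

    public static void main(String[] args) {
        // Even a 40 000-character String still maps to VARCHAR,
        // which Derby caps at 32 672 characters.
        System.out.println(heuristicType(String.class, 40_000)); // VARCHAR
    }
}
```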
What we'd need to do here is inject a step that consults the dialect, given the
sql-type (from the resolved Hibernate Type) and the length, and asks whether an alternate
Hibernate Type should be used (i.e., MaterializeClobType [String->CLOB] instead of
StringType [String->VARCHAR], because the length exceeds some limit for that dialect).
That I'd be OK with. The problem is that the dialect is not known at the point where we
do this resolution :( This is something I plan on addressing, but not until 3.7 or later.
See HHH-2578 for the details.
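A minimal sketch of the dialect-consultation step described above, under the assumption that the dialect is available at resolution time (it is not today, per the comment); none of these method names are real Hibernate APIs:

```java
public class DialectRemapDemo {
    // Derby's documented VARCHAR maximum (characters).
    static final int DERBY_VARCHAR_MAX = 32_672;

    // Stand-in for a hypothetical dialect hook: given the sql-type
    // resolved from the default Hibernate Type and the column length,
    // decide whether an alternate sql-type (and hence Hibernate Type,
    // e.g. MaterializeClobType instead of StringType) should be used.
    static String remapForDialect(String sqlType, long length) {
        if ("VARCHAR".equals(sqlType) && length > DERBY_VARCHAR_MAX) {
            return "CLOB";
        }
        return sqlType;
    }

    public static void main(String[] args) {
        System.out.println(remapForDialect("VARCHAR", 40_000)); // CLOB
        System.out.println(remapForDialect("VARCHAR", 255));    // VARCHAR
    }
}
```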
insert long string (more than 32700) fails on derby
---------------------------------------------------
Key: HHH-1501
URL:
http://opensource.atlassian.com/projects/hibernate/browse/HHH-1501
Project: Hibernate Core
Issue Type: Bug
Components: core
Affects Versions: 3.5.0-Beta-2
Environment: Derby 10
Reporter: Sergey Vladimirov
Priority: Trivial
http://issues.apache.org/jira/browse/DERBY-102
VARCHAR maximum length 32 672
LONG VARCHAR maximum length 32 700
CLOB maximum length 2 147 483 647
BLOB maximum length 2 147 483 647
This issue can be reproduced with org.hibernate.test.lob.TextTest (in 3.5 Beta 1/2) by
changing org.hibernate.test.lob.LongStringTest.LONG_STRING_SIZE to a value greater
than 32700.
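Until the dialect-aware resolution lands, a possible workaround is to bypass the heuristic default by declaring the type explicitly in the mapping. A hypothetical hbm.xml fragment (property and column names are made up for illustration):

```xml
<!-- Hypothetical workaround: name the Hibernate type explicitly so the
     heuristic default (String -> StringType -> VARCHAR) is never consulted,
     and the column is created as a CLOB, which Derby allows up to
     2 147 483 647 characters. -->
<property name="longText" type="materialized_clob">
    <column name="LONG_TEXT" sql-type="clob"/>
</property>
```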
--
This message is automatically generated by JIRA.
-
If you think it was sent incorrectly contact one of the administrators:
http://opensource.atlassian.com/projects/hibernate/secure/Administrators....
-
For more information on JIRA, see:
http://www.atlassian.com/software/jira