[
http://opensource.atlassian.com/projects/hibernate/browse/HHH-2304?page=c...
]
Julien Kronegg commented on HHH-2304:
-------------------------------------
A cleaner workaround is to define a new Dialect (imports added for completeness):

import java.sql.Types;
import org.hibernate.Hibernate;
import org.hibernate.dialect.DB2390Dialect;

public class DialectHhh2304 extends DB2390Dialect {
    public DialectHhh2304() {
        super();
        // workaround for
        // http://opensource.atlassian.com/projects/hibernate/browse/HHH-2304
        registerHibernateType( Types.CHAR, 1, Hibernate.CHARACTER.getName() );
        registerHibernateType( Types.CHAR, 255, Hibernate.STRING.getName() );
    }
}
Then use it instead of the original dialect (e.g., in the persistence.xml file).
I think defining a new dialect is safer than using "char(xxx as char)" (I did
not test it, BTW).
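For example, the custom dialect could be selected in persistence.xml via the hibernate.dialect property (a minimal sketch; the persistence-unit name "example" is a placeholder, and the value must be the fully qualified class name if DialectHhh2304 lives in a package):

```xml
<persistence xmlns="http://java.sun.com/xml/ns/persistence" version="1.0">
  <!-- "example" is a placeholder persistence-unit name -->
  <persistence-unit name="example">
    <properties>
      <!-- point Hibernate at the custom dialect instead of DB2390Dialect -->
      <property name="hibernate.dialect" value="DialectHhh2304"/>
    </properties>
  </persistence-unit>
</persistence>
```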
Wrong type detection for sql type char(x) columns
-------------------------------------------------
Key: HHH-2304
URL:
http://opensource.atlassian.com/projects/hibernate/browse/HHH-2304
Project: Hibernate Core
Issue Type: Bug
Components: core
Affects Versions: 3.2.0.ga
Environment: Hibernate 3.2.0, Oracle 9.2, Oracle JDBC driver 10.2
Reporter: Markus Heiden
Attachments: hibernate.zip
When executing an SQL query which returns columns of SQL type char(x), a
java.lang.Character is returned. This leads to returning just the first character of the
value. In my eyes a String should be returned when the char(x) type has a width > 1. I
wasn't able to determine whether this is a JDBC driver issue or a Hibernate issue.
When using SQL type char(x) for columns of entities, no such problems occur.
Test case is attached.
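For reference, a per-query workaround (a sketch not taken from the report; the session variable, table, and char(x) column name are hypothetical) is to declare the scalar's Hibernate type explicitly with addScalar, which bypasses the dialect's type detection:

// "session" is an open org.hibernate.Session;
// "CODE" is a hypothetical char(3) column of SOME_TABLE
java.util.List results = session.createSQLQuery( "select CODE from SOME_TABLE" )
    .addScalar( "CODE", Hibernate.STRING )  // force String instead of Character
    .list();

This only fixes the one query, whereas the dialect override above fixes all scalar queries at once.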