Reverse engineering of Oracle DB finds wrong lengths for Unicode CHAR columns
-----------------------------------------------------------------------------
Key: HBX-1027
URL:
http://opensource.atlassian.com/projects/hibernate/browse/HBX-1027
Project: Hibernate Tools
Issue Type: Bug
Components: reverse-engineer
Affects Versions: 3.2.cr1
Environment: Oracle 9i
Reporter: Matthew Lieder
This is a shortened version of the query currently used by OracleMetaDataDialect:
select decode(a.data_type,
              'FLOAT',  decode(a.data_precision, null, a.data_length, a.data_precision),
              'NUMBER', decode(a.data_precision, null, a.data_length, a.data_precision),
              a.data_length) as COLUMN_SIZE
  from all_tab_columns a
That works fine for CHAR(n BYTE) columns, but returns n*4 for CHAR(n CHAR) columns (where
n is any integer). I'm guessing that occurs because a.data_length reports the storage size
in bytes: a single-byte character takes up 1 byte, while a Unicode character can take up
to 4 bytes. COLUMN_SIZE, however, is meant to represent the number of characters in a
string column, not the number of bytes used to store the string in the database.
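
To reproduce, here is a minimal sketch (the table name DEMO_CHAR_SEMANTICS is just an
example I made up) that should show the discrepancy on a database with a multi-byte
character set such as AL32UTF8:

    -- Hypothetical repro table: one column with BYTE semantics, one with CHAR semantics
    create table demo_char_semantics (
        col_byte char(10 byte),
        col_char char(10 char)
    );

    -- Assuming an AL32UTF8 database character set, data_length should be 10 for
    -- COL_BYTE but 40 for COL_CHAR, while char_length is 10 for both.
    -- char_used reports 'B' or 'C' depending on the declared semantics.
    select a.column_name, a.data_length, a.char_length, a.char_used
      from all_tab_columns a
     where a.table_name = 'DEMO_CHAR_SEMANTICS';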
The solution is to use "a.char_length" for string-based data types (CHAR,
VARCHAR2, etc.):
decode(a.data_type,
       'FLOAT',    decode(a.data_precision, null, a.data_length, a.data_precision),
       'NUMBER',   decode(a.data_precision, null, a.data_length, a.data_precision),
       'CHAR',     a.char_length,
       'VARCHAR2', a.char_length,
       a.data_length) as COLUMN_SIZE
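
For completeness, here is a sketch of the full shortened query with that fix applied.
The NCHAR/NVARCHAR2 branches are my own untested addition; they would presumably need
the same treatment if OracleMetaDataDialect reverse-engineers those types:

    select decode(a.data_type,
                  'FLOAT',     decode(a.data_precision, null, a.data_length, a.data_precision),
                  'NUMBER',    decode(a.data_precision, null, a.data_length, a.data_precision),
                  'CHAR',      a.char_length,
                  'VARCHAR2',  a.char_length,
                  'NCHAR',     a.char_length,  -- my addition, untested
                  'NVARCHAR2', a.char_length,  -- my addition, untested
                  a.data_length) as COLUMN_SIZE
      from all_tab_columns a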