OIP-04126: Invalid argument value error with Unicode data - OO4O and Wizards (Archived)

We're using OO4O (v 90143) from VB to hit an Oracle 9 database (the NLS character set is UTF-8). We use OO4O to build parameterized INSERT and/or UPDATE statements in some of our high-transaction areas. I've noticed some strange behavior with multi-byte data (Japanese in this case) and a VARCHAR2 parameter. I have a column defined as 50 (CHAR), so my understanding is that it should hold up to 50 characters, irrespective of the byte length of those characters (I assume Oracle makes the behind-the-scenes width of the column 50 x 3 bytes to handle Unicode).
Anyway, I have a string that comes into my VB app with a length greater than 50, so I truncate it and end up with a 50-character string. When I check the byte length of the string it returns 100, so it would appear that each character in my string is 2 bytes long. Seems innocuous enough, but when I try to assign the value to my input parameter I receive the OIP-04126: Invalid argument value error. I don't think it's strictly a string-length issue either, since truncating several more characters still produces the same error.
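The character/byte mismatch described above can be sketched outside of OO4O (a Python illustration, not the OP's actual code): VB strings are UTF-16 internally, so each Japanese character costs 2 bytes there (matching the reported byte length of 100), while a UTF-8 database needs 3 bytes per Japanese character, so a 50-character string can still overflow a byte-sized buffer.

```python
# Illustration only (Python, not OO4O): why a 50-character Japanese
# string reports 100 bytes in VB but needs 150 bytes in a UTF-8 database.
s = "テスト" * 20        # 60 Japanese characters
truncated = s[:50]       # truncate by character count, as in the VB app

char_len = len(truncated)                      # 50 characters
vb_bytes = len(truncated.encode("utf-16-le"))  # VB/UTF-16: 2 bytes each
db_bytes = len(truncated.encode("utf-8"))      # UTF-8: 3 bytes each
```

So a column sized in bytes would need 150 bytes to hold those 50 characters, even though the VB side only ever sees 100.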
Has anyone seen similar behavior with OO4O and multi-byte data? Thanks in advance for any information you have.
Scott Dafforn


ORA-01461 with ODP.NET

My application experiences ORA-01461 (can bind a LONG value only for insert into a LONG column) from time to time when updating a row.
The environment: ODAC/ODP.NET version, used with VS2005. Database: Oracle Std, 10g2 patch 4
The (trimmed down) SQL is: update $RD$MngdItem set dgxrefid = :p_DgXrefID where ....
The C# code goes like this:
OracleParameter p_dgXrefID = new OracleParameter();
p_dgXrefID.ParameterName = "p_dgXrefID";
p_dgXrefID.OracleDbType = OracleDbType.Varchar2;
p_dgXrefID.Value = MIprop.dgXREFID;
- For all added parameters I explicitly set the OracleDbType.
- I created a log function that runs whenever an ORA-01461 exception is raised and logs the text stored in all varchar fields - nothing: the length of the strings is always below the declared length of the column.
My questions are:
- Which characteristic exactly makes a value a "LONG value" in this case - more than 4000 bytes?
- Is there a way to identify which column caused ORA-01461?
- In which case exactly is ORA-01461 triggered, and when ORA-12899 "value too large for column"? (I assume ORA-01461 is more about binding values to parameters and not about the size of a column?)
- Which characteristic exactly makes a value a "LONG value" in this case - more than 4000 bytes?
Usually this happens when you try to insert/update into a VARCHAR2 a value that is longer than 4000 bytes (obvious), or when the database uses a character set in which one character can take more than 1 byte (like AL32UTF8, which can use up to 4 bytes per character), so a string well under 4000 characters can still exceed 4000 bytes (not so obvious).
To check the character set of your database, try: SELECT * FROM nls_database_parameters; and look for the NLS_CHARACTERSET value.
To correct this, it is usually enough to ensure that the client and the database are using the same character set.
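For illustration, here is the encoding arithmetic behind that answer (a Python sketch, not ODP.NET): in AL32UTF8 a single character occupies anywhere from 1 to 4 bytes.

```python
# AL32UTF8 is Oracle's UTF-8: a character occupies 1 to 4 bytes, so a
# string well under 4000 characters can still exceed the 4000-byte
# VARCHAR2 bind limit and be treated as a LONG bind (ORA-01461).
samples = {
    "a": 1,    # ASCII
    "é": 2,    # Latin-1 supplement
    "あ": 3,   # Japanese hiragana (BMP)
    "𝄞": 4,    # supplementary plane
}
widths = {ch: len(ch.encode("utf-8")) for ch in samples}
```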
Dany, thanks for the clarification.
The DB is using AL32UTF8. This makes it tricky to calculate how many bytes a string will finally take, as only characters whose code points require more than 7 bits are encoded as multiple bytes.
So I have included some logging code to catch all text values with a length exceeding 3500 bytes. The text content is in German, using only umlauts as non-ASCII characters - actually only about 2% of the characters. But: whenever an ORA-01461 is thrown, my log of "text values with more than 3500 characters" is empty.
But the totally confusing issue is: the procedure encountering this error collects textual data from existing text columns and saves the collection into a special field of the same row. Thus the data processed for each row is stable. Now here's the kicker: if I encounter this error for a row and do exactly the same processing a second time, no exception is thrown?!
This leads to the question: under which conditions exactly is ORA-01461 thrown by ODP.NET? Could this depend on conditions in the .NET Framework and not only on the Oracle client or database?
Currently I feel like a ghostbuster.
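A side note on the byte arithmetic (a Python sketch, assuming AL32UTF8 storage; the exact umlaut ratio is hypothetical): even with only ~2% umlauts, a German string can stay under 4000 characters yet exceed 4000 bytes, which is exactly the gap a character-based log threshold cannot see.

```python
# 2-byte umlauts inflate the UTF-8 byte length past the character length.
chunk = "ü" + "a" * 49      # 50 characters, 51 UTF-8 bytes (2% umlauts)
text = chunk * 79           # 3950 characters

char_len = len(text)                   # under the 4000-character mark
byte_len = len(text.encode("utf-8"))   # 79 * 51 = 4029 bytes, over 4000
```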

ORA-12899 error when using sql loader import multibyte characters

We have a table with a column DESCRIPTION defined as NVARCHAR2(4000). We are trying to import data into it from a data file. When the field in the data file contains 4000 Chinese characters, it cannot be imported; the error message is: ORA-12899: value too large for column DESCRIPTION (actual: 4000, maximum: 4000). Can anyone advise how to fix this? Thanks in advance!
You haven't said what your database version is. Prior to 12c, the maximum size for a database VARCHAR2 column was 4000 bytes (NOT 4000 characters). So, if you are trying to insert characters that take up multiple bytes (as Chinese characters are likely to do), you will not fit 4000 of those multi-byte characters into 4000 bytes. It's a common misconception that declaring a column as 4000 in size means 4000 characters. Even if you explicitly declare it with character semantics, it will still be capped at 4000 bytes, as that is the physical limit.

From 12c onwards, the maximum size of VARCHAR2 columns in the database and within SQL can be up to 32767 bytes, bringing it in line with the PL/SQL VARCHAR2 datatype, though it may require a parameter to be set by the DBA (MAX_STRING_SIZE=EXTENDED) if the feature is not enabled when the database is installed/upgraded.

I don't know which character set you are using, but I suspect each character may require up to 4 bytes (if not, at least 2 bytes each).
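The arithmetic behind that answer, sketched in Python (UTF-8 standing in for the database character set, UTF-16 as a stand-in for the AL16UTF16 national character set used by NVARCHAR2):

```python
# 4000 Chinese characters cannot fit in 4000 bytes under either encoding.
text = "中" * 4000

utf8_bytes = len(text.encode("utf-8"))       # 3 bytes per character
utf16_bytes = len(text.encode("utf-16-le"))  # 2 bytes per character
```

Either way (12000 or 8000 bytes), the value is well over the 4000-byte limit, hence ORA-12899.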
Hi. Additionally, a character is technically a code point of the database character set. You can use the BYTE qualifier, for example VARCHAR2(10 BYTE), to explicitly give the maximum length in bytes. If no explicit qualifier is included in a column or attribute definition when a database object with that column or attribute is created, then the length semantics are determined by the value of NLS_LENGTH_SEMANTICS: select value from nls_session_parameters where parameter = 'NLS_LENGTH_SEMANTICS'; What result do you get? And what are your database and SQL*Loader client versions? Kind regards
The OP's length semantics are unlikely to matter in this case.  The declaration was for 4000 in size, so it will be 4000 bytes regardless, unless the OP is using 12c with extended varchar2 enabled - though the error indicates not.
Thanks for your reply. My Oracle version is 12c, and CHARACTERSET in the SQL*Loader control file is set to UTF8. I just checked the DB parameter max_string_size and its value is STANDARD. So the max length for NVARCHAR2(4000) is still 4000 bytes, right?
user8877753 wrote: Thanks for your reply. My Oracle version is 12c, and CHARACTERSET in the SQL*Loader control file is set to UTF8. I just checked the DB parameter max_string_size and its value is STANDARD. So the max length for NVARCHAR2(4000) is still 4000 bytes, right?  Indeed, that is the case. A quick Google would tell you, by linking to the documentation... https://docs.oracle.com/database/121/REFRN/GUID-D424D23B-0933-425F-BC69-9C0E6724693C.htm#REFRN10321

ORA-64203 - doing a select substr(clob_col,1,4000)

Environment: Oracle on Exadata Linux. A database was converted from an 11g single-byte character set to a 12c multi-byte character set. A table with a CLOB column is used to cut/paste various document types (Word, Excel, etc.) into that CLOB column. As I understand it, in the single-byte character set (e.g. US7ASCII) a multi-byte character would be replaced with a blank in the CLOB. Several documents have been added in the 12c multi-byte character set database. The application code does a SELECT SUBSTR(<clob_column>,1,4000) STUFF, and when it finds a multi-byte character in the first 4000 characters it returns more than 4000 bytes and raises the ORA-64203 error above. I confirmed this by lowering the requested length from 4000 characters to 3990 and printing out both the LENGTH and the LENGTHB, which showed in several cases a LENGTH of, say, 3990 and a LENGTHB of 3998. How should the application get around this error other than lowering the requested length below 4000? Would DBMS_LOB.SUBSTR work? For some reason I don't have access to the DBMS_LOB package yet or I would have tried it. Thanks in advance for your comments!! -gary
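The LENGTH/LENGTHB gap measured above can be reproduced outside the database (a Python sketch; the specific character mix is hypothetical, chosen to match the observed 3990/3998 numbers):

```python
# A 3990-character slice with 8 two-byte characters: the character
# count stays at 3990 while the byte count comes out at 3998,
# matching the LENGTH/LENGTHB observation in the post above.
text = "a" * 3982 + "é" * 8

length = len(text)                   # Oracle LENGTH (characters)
lengthb = len(text.encode("utf-8"))  # Oracle LENGTHB (bytes)
```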
If you have a string that has 4000 characters and any of the characters is multi-byte (2 or more bytes), then obviously the string must have more than 4000 bytes. As 4000 bytes is the old absolute limit for VARCHAR2 values, you cannot exceed it in any way. Hence, you have two options, assuming you want to stay with VARCHAR2:
- Lower the number of selected characters to 1000. As the maximum Oracle-supported character width is 4 bytes, 1000 characters is never more than 4000 bytes.
- Migrate to extended data types, aka VARCHAR2(32K), aka MAX_STRING_SIZE=EXTENDED. This will increase the maximum VARCHAR2 width to 32767 bytes and allow you to safely select up to floor(32767/4)=8191 characters. Note, you will need to make sure that all client applications can cope with the increased limit as well. Extended data types are available since Oracle Database 12.1.
Thanks, Sergiusz
Hi Sergiusz, Thanks for your quick response! I understand what you're saying and will check with the developers to see which option they'd prefer. A couple of quick questions:
- This is a .NET application that displays the results of the SELECT on the user's screen. The field they define on the screen must only be 4000 characters as well. If they increase the size of that field, would the query work successfully, assuming again they don't exceed the size of the increased field?
- Is SUBSTR by default a character-based function or a byte-based function?
- Would it make a difference to use DBMS_LOB.SUBSTR?
Thanks very much! -gary
If a field has a character length limit of 4000 characters, then they can migrate to extended data types but also switch to character length semantics by declaring table columns as, for example, 4000 CHAR instead of the usual 4000 [BYTE]. SUBSTR is a character-based function, so SUBSTR(x,1,4000) returns at most 4000 characters, fitting the character-semantics paradigm. FYI, SUBSTRB(x,1,4000) is the byte-based equivalent of SUBSTR that returns at most 4000 bytes (and possibly fewer characters). It is, however, not supported with CLOBs. DBMS_LOB.SUBSTR works like SUBSTR; there should be no significant difference in their behavior. By the way, if you use SUBSTR or DBMS_LOB.SUBSTR in PL/SQL instead of SQL, then you assign the result to a PL/SQL variable. PL/SQL VARCHAR2 variables have a limit of 32767 bytes without migration to extended data types. Thanks, Sergiusz
Awesome! I'll have to work with the developers to see how they want to resolve the issue. Thanks very much for your help!! -gary

Varchar storage with multi byte character sets

The user_tab_columns.data_length provides us with the means to determine the maximum number of bytes that a varchar column can hold. However, unless we are using a single-byte database character set, this does not equate to the maximum number of characters that a varchar column can hold.
This issue has been addressed in 9i with the user_tab_columns.char_length field.
We are searching for a means to dynamically find the maximum number of characters that a column can hold that is independent of the database version and the database character set.
How can we:
a) determine the maximum number of characters that a column can hold
or alternatively
b) determine the number of bytes required to store a character using the database character set
Again the solution would ideally work the same in both 8i and 9i. Thanks.
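One conservative approach to (a), sketched in Python (not an Oracle API; the per-character maxima below are illustrative and should be confirmed against the documentation for your character set): derive a guaranteed character capacity from user_tab_columns.data_length by dividing by the maximum bytes per character of the database character set.

```python
# Hypothetical lookup of maximum bytes per character for a few Oracle
# character sets; real values should be verified against the docs.
MAX_BYTES_PER_CHAR = {
    "US7ASCII": 1,
    "WE8ISO8859P1": 1,
    "AL32UTF8": 4,
}

def guaranteed_chars(data_length_bytes: int, charset: str) -> int:
    """Characters guaranteed to fit in the column, worst case."""
    return data_length_bytes // MAX_BYTES_PER_CHAR[charset]
```

For example, guaranteed_chars(4000, "AL32UTF8") gives 1000, while a single-byte character set yields the full 4000. This is pessimistic (most strings fit more), but it works the same regardless of database version.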

PLSQLAssociativeArray - ArraybindSize

Hello all,
Does anybody know the maximum size of an output parameter for ArrayBindSize when accessing a table of VARCHAR2? And if so, is there a way to set the limit?
Right now I can go to a maximum of 2000 characters.
It depends on the column. 2000 characters (assuming ASCII) is 2000 bytes.
Actually odp.net limits us:
"ArrayBindSize is used only for parameter types that have variable length such as Clob, Blob and Varchar2. The size is represented in bytes for binary datatypes and characters for the Unicode string types. The count for string types does not include the terminating character. The size is inferred from the actual size of the value, if it is not explicitly set. For an output parameter, the size of each value is set by ODP.NET. The ArrayBindSize property is ignored for fixed length datatypes. "
But the doc doesn't state what the size is, or whether we can change it. I know Microsoft's driver supports 4000 bytes (so I don't know why Oracle would lessen the number of bytes returned).
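One possible (purely speculative) explanation for the 2000-character ceiling, sketched in Python: if the provider reserves a worst case of 2 bytes per character (UTF-16 code units) for Unicode bind buffers, then 2000 characters exactly fill a 4000-byte buffer, which would line up with the 4000-byte limit mentioned for Microsoft's driver.

```python
# Speculative arithmetic only: 2000 characters at 2 bytes per UTF-16
# code unit (for BMP characters) exactly fill a 4000-byte buffer.
chars = 2000
worst_case_bytes = len(("あ" * chars).encode("utf-16-le"))
```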