Posts Tagged ‘ora-01438’

Errors are never really welcome 😉 . Yesterday, I was importing a dump file (from 8.1.7.4) into 10.2.0.3 and hit:

IMP-00020: long column too large for column buffer size (number).

After some googling and searching on Metalink, I came to know (not sure though) that it was a bug in 8.1.7.4, and the suggested solution was to try different values of the BUFFER parameter on the import (which actually didn't help). That's what oerr also has to say:

    *Cause: The column buffer is too small. This usually occurs when importing LONG data.
    *Action: Increase the insert buffer size 10,000 bytes at a time (for example). Use this step-by-step approach because a buffer size that is too large may cause a similar problem.
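The step-wise buffer increase that oerr suggests looks roughly like this on the command line. The user, file, and table names below are placeholders, not the ones from my actual run:

```sh
# Retry the import with a progressively larger BUFFER, roughly
# 10,000 bytes at a time, as the oerr action text suggests.
# scott/tiger, exp817.dmp and big_table are illustrative placeholders.
imp scott/tiger file=exp817.dmp tables=big_table buffer=30720 log=imp1.log
imp scott/tiger file=exp817.dmp tables=big_table buffer=40960 log=imp2.log
```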

And probably the issue was also that the export had been run with compress=N. So the guys ran the export of the table again with compress=Y (which is the default). Hopes got some oxygen and we ran the import once more. The IMP-00020 was gone, but a new baby struck:
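The re-export with extent compression enabled would have looked something like this (names are again placeholders; compress=y tells exp to generate DDL that merges the table's extents into one initial extent):

```sh
# Re-run the table export with compress=y, which is exp's default.
# scott/tiger, exp817.dmp and big_table are illustrative placeholders.
exp scott/tiger file=exp817.dmp tables=big_table compress=y log=exp.log
```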

IMP-00058: ORACLE error 1438 encountered
ORA-01438: value larger than specified precision allowed for this column
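To make the ORA-01438 condition concrete: Oracle raises it when a value has more significant digits before the decimal point than the column's declared precision allows. Here is a toy Python sketch of that check (my own simplification for illustration, not Oracle's actual logic):

```python
def fits_number_column(value, precision, scale=0):
    """Toy check: does `value` fit an Oracle NUMBER(precision, scale) column?

    A NUMBER(p, s) column allows at most p - s digits before the decimal
    point; exceeding that is what raises ORA-01438. This is a simplified
    illustration, not Oracle's real validation code.
    """
    if int(value) == 0:
        # Zero has no integer digits to check; keep the toy simple.
        return True
    integer_digits = len(str(abs(int(value))))
    return integer_digits <= precision - scale

# A 12-digit value fits NUMBER(12); a 30-digit value (like the row that
# broke our import) does not.
print(fits_number_column(123456789012, 12))       # True
print(fits_number_column(int("9" * 30), 12))      # False -> ORA-01438
```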

Again the googling and Metalink'ing session started, and it turned out to be some bug 😦 . Import itself was creating the table and then complaining with ORA-01438. Complete nonsense, isn't it? Just hitting and trying, what we did was pre-create the table with all the NUMBER columns defined simply as NUMBER, without any precision, and wow, the import completed without any errors. Now that's seriously stupid.
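The workaround can be sketched like this. The table and column names are made up for illustration; the real DDL comes from the source schema. With the table pre-created, imp needs ignore=y so it loads data into the existing table instead of failing on its own CREATE TABLE:

```sql
-- Illustrative only: pre-create the table with unconstrained NUMBER
-- columns, dropping the precision that the import kept tripping over.
CREATE TABLE big_table (
    col1 NUMBER,   -- was NUMBER(12) in the 8.1.7.4 schema
    col2 NUMBER    -- was NUMBER(12) in the 8.1.7.4 schema
);
-- Then run: imp ... tables=big_table ignore=y
```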

So then we did some research on the data in the table and found that there was one row causing all the trouble. Two columns were defined as NUMBER(12), but the values in them were 30 and 60 digits long. So what was it? Probably data corruption, or what? Otherwise, how would the table definition have allowed such garbage into the table? We couldn't ascertain the exact reason, but a happy ending it was 😉
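For anyone hunting a similar row, a query along these lines against the pre-created (unconstrained) copy of the table will surface values too wide for the original precision. Table and column names are again illustrative:

```sql
-- Find rows whose integer part is wider than the original NUMBER(12)
-- precision; LENGTH implicitly converts the number to a string.
SELECT *
FROM   big_table
WHERE  LENGTH(TRUNC(ABS(col1))) > 12
   OR  LENGTH(TRUNC(ABS(col2))) > 12;
```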
