DataPump limit on number of tables in the tables= parameter.

I ran across this the other day. If you try to list too many tables in the DataPump TABLES= parameter, you can exceed the limit. One way around this is to invert the logic and use EXCLUDE=TABLE to name the tables you don't want. Another is simply to split your import into two or more smaller lists.
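For the split approach, here is a rough sketch (the file name `all_tables.txt`, the chunk size, and the impdp connection are all hypothetical): break the full comma-separated table list into fixed-size chunks, write one parameter file per chunk, and run one smaller import per file.

```shell
# all_tables.txt holds one long comma-separated table list (hypothetical file).
# Break it into chunks of 50 tables and write a TABLES= parfile for each.
tr ',' '\n' < all_tables.txt | split -l 50 - chunk_
for f in chunk_*; do
  printf 'TABLES=%s\n' "$(paste -sd, "$f")" > "$f.par"
  # impdp system@db parfile="$f.par"   # one smaller import per chunk
done
```

Each generated parfile stays well under any parameter-length limit, and a failed chunk can be re-run on its own.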

ORA-31693: Table data object "schema"."table" failed to load/unload and is being skipped due to error:

I often get ORA-31693 during DataPump imports. It can be caused by many things, and the solution is to fix the underlying problem and re-import the affected tables. Sifting through hundreds of lines in an import log file to find the skipped tables is time-consuming, so I use this one-liner to generate a list of tables that I can drop into the next import's TABLES= parameter.

grep failed import.log | awk '{ print $5 }' | sed s/\"//g | awk '{ print $1 " ,"}'
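The one-liner leaves a trailing comma on every entry, including the last. As a variant (a sketch, assuming the log lines follow the ORA-31693 format shown above), this joins the tables into a single comma-separated value with no trailing comma, ready to paste straight after TABLES=:

```shell
# Pull the quoted "schema"."table" token (5th field) out of each
# ORA-31693 line, strip the quotes, and join the results with commas.
grep 'failed to load/unload' import.log \
  | awk '{ gsub(/"/, "", $5); print $5 }' \
  | paste -sd, -
```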

ORA-31696: unable to export/import TABLE_DATA:"schema"."table" using client specified AUTOMATIC method

Part of my job is to support a Business Intelligence development team. They often need certain test schemas refreshed from production. I've been using DataPump, and on one schema I always get ORA-31696 for the same three tables. It's due to the way I refresh them: I truncate the tables first, then load them with the CONTENT=DATA_ONLY parameter. That works for most tables, but these three each contain a LONG column. The fix is to run a second import with TABLE_EXISTS_ACTION=REPLACE and TABLES=(list of skipped tables).
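Putting it together, the refresh ends up as two passes. This is only a sketch of the parameter files involved; the connection string, schema name, and table names are placeholders, and the impdp invocations are left commented out:

```shell
# Pass 1: normal data-only refresh into the pre-truncated test schema.
#   impdp system@testdb parfile=refresh.par
# refresh.par:
#   SCHEMAS=BI_TEST
#   CONTENT=DATA_ONLY
#
# Pass 2: re-import only the LONG-column tables that hit ORA-31696,
# dropping and re-creating them rather than loading into truncated copies.
#   impdp system@testdb parfile=fixup.par
# fixup.par:
#   TABLES=BI_TEST.T1,BI_TEST.T2,BI_TEST.T3
#   TABLE_EXISTS_ACTION=REPLACE
```

Pass 2 is safe to run after pass 1 completes, since TABLE_EXISTS_ACTION=REPLACE rebuilds those tables from the dump regardless of what pass 1 left behind.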