I needed to generate a large volume of test data in a table (dean_table) from an existing table (prod_table). The two tables were clones of each other, each with a unique index on (TIME_STAMP, PHYSICAL_ID). prod_table held about 15k rows, but I needed a few million for my test case. The first time I ran
insert into dean_table select * from owner.prod_table ;
it copied over the 15k rows. However, any subsequent execution would fail with a duplicate-key error (ORA-00001), because the copied rows would collide on the unique index.
To work around this, I created a SQL file containing the statement below, with the SQL copied and pasted roughly 100 times. Multiplying PHYSICAL_ID by a random value gives each inserted row a new key, so repeated runs don't collide on the unique index:
insert into dean_table
select TIME_STAMP
     , PHYSICAL_ID * (select dbms_random.value(1, 1000) num from dual)
     , TIME_STAMP_ESTIMATED
     , PRIORITY
     , ALARM_LEVEL
     , SUBSYSTEM
     , INSTANCE
     , SUBSYSTEM_ID
     , MESSAGE_NUMBER
     , SLOT
     , PORT
     , STRING_PARAMETER0
     , PARAMETER_LIST0
     , PARAMETER_LIST1
     , PARAMETER_LIST2
     , PARAMETER_LIST3
     , PARAMETER_LIST4
     , PARAMETER_LIST5
     , PARAMETER_LIST6
     , PARAMETER_LIST7
  from owner.prod_table;
By executing this .sql file several times, I was able to rapidly generate about 5 million rows of data in dean_table.
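As an alternative to pasting the statement 100 times, the same effect could be achieved with a PL/SQL loop. This is a sketch under the same assumptions as above (same tables, same column list, same random-multiplier trick); the loop count and commit frequency are arbitrary choices, not part of the original approach:

begin
  -- Each pass copies all prod_table rows with a randomized PHYSICAL_ID,
  -- so the unique index on (TIME_STAMP, PHYSICAL_ID) is not violated.
  for i in 1 .. 100 loop
    insert into dean_table
    select TIME_STAMP
         , PHYSICAL_ID * (select dbms_random.value(1, 1000) num from dual)
         , TIME_STAMP_ESTIMATED
         , PRIORITY
         , ALARM_LEVEL
         , SUBSYSTEM
         , INSTANCE
         , SUBSYSTEM_ID
         , MESSAGE_NUMBER
         , SLOT
         , PORT
         , STRING_PARAMETER0
         , PARAMETER_LIST0
         , PARAMETER_LIST1
         , PARAMETER_LIST2
         , PARAMETER_LIST3
         , PARAMETER_LIST4
         , PARAMETER_LIST5
         , PARAMETER_LIST6
         , PARAMETER_LIST7
      from owner.prod_table;
    commit;  -- commit per pass to keep undo usage modest
  end loop;
end;
/

Adjust the loop bound to reach the desired row count (roughly source-row-count times the loop bound per run of the block).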