At my source, around 3 million records are inserted into a table in one transaction.
Is it possible to replicate 3 million records using GoldenGate?
Source and target databases are both Teradata.
Yes, it is possible.
The Extract process is running fine, but the sort is failing, as I am running in maximum protection mode.
Tuning and sizing a replication system can be complex. You should contact your Teradata or Oracle GoldenGate support representative; this forum is not usually the best venue for timely, critical support.
You could use the ETL software Talend.
Grao, for larger inserts/updates, you should consider Teradata's Data Mover. It is a misnamed product (it should be called DataCopier), but it will serve you well when it comes to larger replication jobs.
Thanks! Yes, those tools can handle huge volumes of data, no doubt. But they can't handle deletes. Also, the data at the target would only be loaded after the complete data load is done at the source, unlike GoldenGate, where the Extract/Replicat processes run continuously.
Is there any way to use Data Mover along similar lines as GoldenGate?
How can we capture deletes using Data Mover?
Good question. When it comes to change data capture and deletes, I don't think there is an out-of-the-box way to make Data Mover behave exactly like GoldenGate.
However, you could consider loading into a staging table and performing your own change data capture. It is not an easy task, but there are many well-known techniques out there.
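One common technique is a snapshot diff: keep the previous load keyed by primary key, compare it against the new staging load, and classify each row as an insert, update, or delete (deletes being the rows that vanished from the new load). Here is a minimal sketch of that idea; the function name and sample data are hypothetical, and in practice you would do this comparison in SQL against the staging table rather than in memory.

```python
# Minimal snapshot-diff sketch for do-it-yourself change data capture.
# 'previous' and 'current' are snapshots of the table keyed by primary key;
# comparing them yields the inserts, updates, and deletes to apply downstream.

def diff_snapshots(previous, current):
    """Return (inserts, updates, deletes) as lists of primary keys."""
    inserts = [pk for pk in current if pk not in previous]
    deletes = [pk for pk in previous if pk not in current]
    updates = [pk for pk in current
               if pk in previous and current[pk] != previous[pk]]
    return inserts, updates, deletes

# Example: row 1 unchanged, row 2 updated, row 3 deleted, row 4 inserted.
prev = {1: ("alice", 10), 2: ("bob", 20), 3: ("carol", 30)}
curr = {1: ("alice", 10), 2: ("bob", 25), 4: ("dave", 40)}
ins, upd, dele = diff_snapshots(prev, curr)
print(ins, upd, dele)
```

The key point for the deletes question above: anything present in the previous snapshot but missing from the new staging load is a delete, which a full-copy tool like Data Mover never tells you directly.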
Vague but hope this helps :)