Import Table Data -- specify transaction batch size

[This topic is migrated from our old forums. The original author name has been removed]

I hope you agree that Import Table Data is not the fastest way to import data. What would definitely speed it up is letting the user specify an optional transaction batch size, e.g.: import table data from this file using a transaction batch size of ( ### ), where the user selects the size most appropriate for the transaction. Do you see where I am going with this? Thanks!
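
To illustrate the idea, here is a minimal JDBC sketch of committing in user-sized batches (not DbVisualizer's actual implementation; the table import_target, the in-memory rows standing in for the import file, and the connection details are all made up):

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.PreparedStatement;

    public class BatchImportSketch {
        public static void main(String[] args) throws Exception {
            // Hypothetical connection details; substitute your own server and login.
            try (Connection con = DriverManager.getConnection(
                    "jdbc:sybase:Tds:localhost:5000/mydb", "user", "secret")) {
                con.setAutoCommit(false);

                int batchSize = 5000;    // the user-selected transaction batch size
                String[][] rows = {      // stand-in for rows read from the import file
                    { "1", "alpha" }, { "2", "beta" }, { "3", "gamma" }
                };

                try (PreparedStatement ps = con.prepareStatement(
                        "INSERT INTO import_target (id, name) VALUES (?, ?)")) {
                    int count = 0;
                    for (String[] row : rows) {
                        ps.setInt(1, Integer.parseInt(row[0]));
                        ps.setString(2, row[1]);
                        ps.addBatch();
                        if (++count % batchSize == 0) {
                            ps.executeBatch();
                            con.commit();   // one commit per batch, not per row
                        }
                    }
                    ps.executeBatch();      // flush and commit the final partial batch
                    con.commit();
                }
            }
        }
    }

The point is simply that one commit() per batch replaces one commit per row, which is where the speed-up would come from.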

[This reply is migrated from our old forums.]

Re: Import Table Data -- specify transaction batch size
Alex, The 100, 200, 300 and so on progress indicator is not related to how often a commit is made. It is just a simple progress indicator. Is this what is causing the confusion? Regards Roger
[This reply is migrated from our old forums. The original author name has been removed]

Re: Import Table Data -- specify transaction batch size
Hi --- can we revisit this, please? My connection properties are as follows (Sybase ASE 12.5.4):
Commit Batch Size: 5000
Auto Commit: On
When Uncommitted Updates: ON
Isolation level: transaction_none
When I watch the import dialog, "Successful" is only incremented in batches of 100, and there is also a slight delay before it displays the next incremented value. I never see it increment the numbers smoothly, like 1, 3, 4, or at least increment at the Commit Batch Size value.
Product: DbVisualizer Personal 7.1.5
Build: #1590 (2011/02/24 17:23)
Java VM: Java HotSpot(TM) Client VM
Java Version: 1.6.0_13
Java Vendor: Sun Microsystems Inc.
OS Name: Windows XP
OS Arch: x86
OS Version: 5.1
[This reply is migrated from our old forums.]

Re: Import Table Data -- specify transaction batch size
Alex, Are you searching in the Debug Window? That may be the problem, since it truncates the output. Try directing the output to a file instead, or set the batch size to a lower value just so that you can see the commits being made. (Sorry, I forgot to mention this.) Regards Roger
[This reply is migrated from our old forums. The original author name has been removed]

Re: Import Table Data -- specify transaction batch size
Roger, Waited until the counter reached 8,000, then clicked Stop. Searched the debug output for "commit", which appeared as the last line, right after the rollback was issued, but I do not think that this is the commit we're looking for:
09:46:57 [DEBUG AWT-EventQueue-1 C.invoke] RootConnection: SybConnection.rollback()
09:46:58 [DEBUG AWT-EventQueue-1 C.invoke] RootConnection: SybConnection.setAutoCommit(true)
[This reply is migrated from our old forums.]

Re: Import Table Data -- specify transaction batch size
Alex, You should see entries like this in the debug output:
15:40:52 [DEBUG Thread-23 H.?] Commit: MySQL 5.0.26: localhost
15:40:52 [DEBUG Thread-23 C.invoke] RootConnection: JDBC4Connection.commit()
Note that this is for MySQL. Search for commit() in your log. Regards Roger
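
If you directed the debug output to a file and it is large, a little sketch like this will pull out just the commit lines (the file name dbvis-debug.log is just an example; an ordinary text search works as well):

    import java.io.IOException;
    import java.nio.file.Files;
    import java.nio.file.Paths;

    public class FindCommits {
        public static void main(String[] args) throws IOException {
            // "dbvis-debug.log" is a made-up name for the debug output saved to file.
            Files.lines(Paths.get("dbvis-debug.log"))
                 .filter(line -> line.contains("commit()"))
                 .forEach(System.out::println);
        }
    }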
[This reply is migrated from our old forums. The original author name has been removed]

Re: Import Table Data -- specify transaction batch size
Roger, I am not sure if there is another configuration that I need to set, but the import still inserts data in 100-row batches, ignoring my setting of 5,000. How can I tell by looking at the debug output that a commit was issued?
Product: DbVisualizer Personal 6.5.12
Build: #1378 (2009/12/22 19:31)
Java VM: Java HotSpot(TM) Client VM
Java Version: 1.6.0_13
Java Vendor: Sun Microsystems Inc.
OS Name: Windows XP
OS Arch: x86
OS Version: 5.1
[This reply is migrated from our old forums.]

Re: Import Table Data -- specify transaction batch size
Alex, This is fixed in the 6.5.12 version. Regards Roger
[This reply is migrated from our old forums.]

Re: Import Table Data -- specify transaction batch size
Alex, A workaround until the fix is available is to enable "Use a Single Physical Database Connection" in Connection Properties. If you do this, the import feature uses the specified "Commit Batch Size" setting, so you will be able to test whether increasing the commit size gives any performance improvement. Regards Roger
[This reply is migrated from our old forums.]

Re: Import Table Data -- specify transaction batch size
Alex, I found the problem and a fix will be in the 6.5.12 version and in the next EA. Thanks for reporting it! Best Regards Roger
[This reply is migrated from our old forums. The original author name has been removed]

Re: Import Table Data -- specify transaction batch size
Using 6.5.11 with Sybase ASE (jTDS).
Driver Class: com.sybase.jdbc3.jdbc.SybDataSource v6 (from Sybase's jconn3.jar)
Is there a debug option that I can enable that would help you?
Product: DbVisualizer Personal 6.5.11
Build: #1376 (2009/12/05 10:46)
Java VM: Java HotSpot(TM) Client VM
Java Version: 1.6.0_13
Java Vendor: Sun Microsystems Inc.
OS Name: Windows XP
OS Arch: x86
OS Version: 5.1
[This reply is migrated from our old forums.]

Re: Import Table Data -- specify transaction batch size
Looking at it right now... :-) (I was unable to reproduce it with the latest EA, so I am now giving 6.5.11 a try.) Which version did you use? Roger
[This reply is migrated from our old forums. The original author name has been removed]

Re: Import Table Data -- specify transaction batch size
Roger, Were you able to replicate the issue?
[This reply is migrated from our old forums. The original author name has been removed]

Re: Import Table Data -- specify transaction batch size
Roger, Changed this value to 5000 (from the default of 100), but the import function still imports in increments of 100... Thanks.
[This reply is migrated from our old forums.]

Re: Import Table Data -- specify transaction batch size
Hi Alex, In Connection Properties, under Database->Physical Connection->Transaction, you will find the "Commit Batch Size" setting that controls the size of each chunk that is committed during import (and in the table data editor). The default is 100. You can modify this to a suitable value. Regards Roger
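
To get a feel for why the value matters, here is a back-of-the-envelope sketch (the one-million-row import size is made up): at the default of 100, such an import commits 10,000 times; at 5,000, only 200 times.

    public class CommitCount {
        public static void main(String[] args) {
            long rows = 1_000_000L;  // hypothetical number of rows to import
            for (int batchSize : new int[] { 100, 1_000, 5_000 }) {
                // ceiling division: number of commits needed to cover all rows
                long commits = (rows + batchSize - 1) / batchSize;
                System.out.printf("Commit Batch Size %,5d -> %,6d commits%n",
                        batchSize, commits);
            }
        }
    }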