[This topic is migrated from our old forums. The original author name has been removed]
Hi,
I see that when I view a table that contains BLOBs in the grid, it takes a long time to view even a single row. Judging by memory usage, dbvis does not hold a copy of the BLOB in memory, which is good, especially for large BLOBs that can be many GB.
Why does it take so long to view even 1 row of data?
When I use Save Cell to copy a BLOB to the client, I see that the BLOB is first copied into memory and only then written to disk. During the Save Cell action, dbvis is fully frozen and does not respond to anything. Can this be changed to read/write the BLOB in parts, so the GUI does not freeze and BLOBs don't eat all the memory? (Larger BLOBs won't fit at all.) This would also allow for some progress indication.
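[Editor's note: for illustration, here is a minimal server-side PL/SQL sketch of the chunked-read idea described above, assuming an Oracle database and the test_blob table created later in this thread. DbVisualizer itself would have to do the equivalent client-side over JDBC; this is not its actual implementation.]

declare
  l_blob   blob;
  l_buffer raw(32767);
  l_chunk  constant binary_integer := 32767;  -- chunk size in bytes
  l_amount binary_integer;
  l_offset integer := 1;                      -- BLOB offsets are 1-based
  l_total  integer;
begin
  -- test_blob is the table from the reproduction later in this thread
  select d into l_blob from test_blob where a = 1;
  l_total := dbms_lob.getlength(l_blob);
  while l_offset <= l_total loop
    l_amount := l_chunk;  -- request a full chunk
    dbms_lob.read(l_blob, l_amount, l_offset, l_buffer);  -- l_amount returns the bytes actually read
    -- hand l_buffer to the consumer here (write to a file, update a progress bar)
    l_offset := l_offset + l_amount;
  end loop;
end;
/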
thanks,
Ronald.
Ronald,
Can you please clarify what steps you perform in DbVisualizer when you experience this performance issue?
When performance is bad, what is the size of the BLOB file?
Also, in Tool Properties->General->Grid->Binary/BLOB and CLOB Data, what settings are used for the BLOB/Binary properties?
Regards
Roger
anonymous said over 11 years ago
[This reply is migrated from our old forums. The original author name has been removed]
Re: big blobs in grid?
Hi Roger,
Just create a table:
create table test_blob (a number, b number, c number, d blob);
insert into test_blob (a,b,c,d) values (1,2,3,null);
commit;
Select the table in the object tree,
view the data,
and double-click the BLOB column.
In the edit window that pops up, use the file browser to select a file.
If the file is small, about 300 KB, it is copied to the grid cell in under a second.
Commit/writing to the database is fast.
If the file is a bit larger, say 50 MB, the process seems to never end.
I understand that 50 MB needs more time to process, but to me it looks like there is an infinite loop.
(I tried this while working toward a reproduction of the original issue, where the route goes the other way: from DB to client.)
Ronald.
Roger Bjärevall said over 11 years ago
[This reply is migrated from our old forums.]
Re: big blobs in grid?
Hi Ronald,
Support for large BLOB/binary data is currently limited in the table editing feature due to memory issues. We have an open ticket to look into this.
There is an alternative for inserting large data: use variables in the SQL Commander.
Table being used:
create table test_blob (a number, b number, c number, d blob);
Inserting large BLOBs is done with:
insert into test_blob (a, b, c, d) values (1, 2, 3, ${data||/Users/xxx/out/bigfile||BinaryData||noshow vl=file}$);
This will insert the bigfile file into the d column.
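[Editor's note: a quick way to verify such an insert without pulling the BLOB contents into the grid, assuming Oracle as in the create table statement above, is a length check:]

-- report the size of each stored BLOB in bytes, without fetching its contents
select a, dbms_lob.getlength(d) as blob_bytes from test_blob;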
The following will extract all columns in the table and generate a SQL INSERT file (big.sql). Binary data is exported to individual files:
@export on;
@export set Filename="/Users/xxx/out/big.sql" Format="sql" BinaryFileDir="/Users/xxx/out" BinaryFormat="File";
select * from test_blob;
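[Editor's note: as a variation aimed at Ronald's original DB-to-client case, and assuming the same @export settings apply to any result set, a single BLOB could presumably be written straight to a file by filtering the query, sidestepping the frozen Save Cell path:]

@export on;
@export set Filename="/Users/xxx/out/one_row.sql" Format="sql" BinaryFileDir="/Users/xxx/out" BinaryFormat="File";
select d from test_blob where a = 1;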
Regards
Roger
anonymous said over 11 years ago
[This reply is migrated from our old forums. The original author name has been removed]
Re: big blobs in grid?
Thanks Roger,
Again, the tool keeps amazing me. Lots of power still to be found, after quite a few years of using it with great pleasure.
Thanks,
Ronald.