in reply to Re: Re: InterBase MEMO/BLOB fields, data with size > 1MB
in thread InterBase MEMO/BLOB fields, data with size > 1MB
You're right that I'd missed the important detail that you were already compressing the data. In that case, compressing again won't help significantly; compression is not magic. If it could have done better the first time around, it would have...
I don't understand your objection to splitting one large field across rows. Binary data is not a problem: use substr to cut the one large string into several pieces, and insert them into a table with three fields: an external ID so that you can join to other tables, a sequence number so that you know what order to put the pieces back in, and a data field holding the piece itself. To read the value back, select the pieces ordered by sequence number and join them together.
This strategy will work with arbitrary binary data of arbitrary size, as long as your database handles binary data and you have enough memory for the string manipulations.
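For what it's worth, here is a minimal sketch of that approach using DBI. The table name (blob_chunks), column names, and chunk size are all made up for illustration; adjust them to whatever your InterBase schema and segment limits actually allow:

```perl
#!/usr/bin/perl
use strict;
use warnings;
use DBI;

# Hypothetical schema, for illustration only:
#   CREATE TABLE blob_chunks (
#       ext_id  INTEGER,        -- joins to the owning row elsewhere
#       seq     INTEGER,        -- reassembly order
#       chunk   VARCHAR(32000)  -- one piece of the binary data
#   );

my $CHUNK_SIZE = 32_000;    # assumed size; tune to your field type's limit

# Split one large binary string into numbered pieces and store them.
sub store_blob {
    my ($dbh, $ext_id, $data) = @_;
    my $sth = $dbh->prepare(
        'INSERT INTO blob_chunks (ext_id, seq, chunk) VALUES (?, ?, ?)'
    );
    my $seq = 0;
    for (my $pos = 0; $pos < length $data; $pos += $CHUNK_SIZE) {
        $sth->execute($ext_id, $seq++, substr($data, $pos, $CHUNK_SIZE));
    }
}

# Fetch the pieces in sequence order and concatenate them back together.
sub fetch_blob {
    my ($dbh, $ext_id) = @_;
    my $rows = $dbh->selectcol_arrayref(
        'SELECT chunk FROM blob_chunks WHERE ext_id = ? ORDER BY seq',
        undef, $ext_id
    );
    return join '', @$rows;
}
```

The join in fetch_blob is Perl's string join, not a SQL join; the ORDER BY on the sequence column is what guarantees the pieces come back in the right order.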
Replies are listed 'Best First'.
Re: Re: Re: Re: InterBase MEMO/BLOB fields, data with size > 1MB
by pet (Novice) on Jun 01, 2004 at 11:31 UTC