Recovery of a database can be as easy as restoring your latest backup and restarting the business, if you are well organized. If you are using binary logs, the system can recover fairly easily. BLOBs are not a problem here; they are just more data in your database.

About organizing yourself: you might have noticed that I added a timestamp field to my table. This way, I can take an incremental backup of the rows that were modified in a given timeframe, to integrate with a full weekly backup.

Personally, I would say that storing BLOBs in sparse files makes your task more difficult, but TMTOWTDI, after all, and I might be wrong. Let's just say that I am more comfortable with my current architecture.

In reply to OT - database recovery - Re: Handling huge BLOBs
by gmax
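A minimal sketch of the timestamp-based incremental backup described above, assuming MySQL. The table and column names (`documents`, `body`, `ts`) and the cutoff date are illustrative, not taken from the original post:

```sql
-- Hypothetical table: a BLOB column plus a timestamp that MySQL
-- refreshes automatically whenever the row is inserted or updated.
CREATE TABLE documents (
    id   INT UNSIGNED NOT NULL AUTO_INCREMENT PRIMARY KEY,
    body LONGBLOB NOT NULL,
    ts   TIMESTAMP NOT NULL DEFAULT CURRENT_TIMESTAMP
                   ON UPDATE CURRENT_TIMESTAMP
);

-- Incremental backup: select only the rows touched since the last
-- full (e.g. weekly) backup, using that backup's time as the cutoff.
SELECT id, body, ts
  FROM documents
 WHERE ts >= '2004-01-04 00:00:00';
```

In practice the same condition can be handed to `mysqldump` through its `--where` option for the incremental run, alongside the plain weekly full dump.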