I’m finding that once a database grows past 100 MB or so, full-database dumps and restores stop being practical. Given that I normally work with Drupal, what can I do to avoid a full dump-and-restore cycle every time? I don’t want to have to figure out how all the tables relate to one another each time, but some sort of differential dump (only dump what’s changed since the last dump) would be nice.
The idea is that if I only run one update script, which only touches a few tables, then I only want to restore those tables to how they were. I shouldn’t have to sit through 20 minutes of restoration because of a mistake that took a few seconds to happen.
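To make it concrete, here’s roughly the workflow I’m imagining (table names below are just placeholders, not a claim about my actual schema), shown as echoed commands rather than run for real:

```shell
# Rough sketch of what I'm after: before running an update script,
# dump only the tables it touches, then restore just those if the
# update goes wrong. Table names are placeholders.
DB="drupal"
TABLES="node node_revisions users"

# mysqldump accepts an optional table list after the database name,
# so a partial dump would just be:
echo "mysqldump $DB $TABLES > partial.sql"

# Restoring should replay only those tables, since the dump file
# contains DROP TABLE / CREATE TABLE statements for each listed one:
echo "mysql $DB < partial.sql"
```

The pain point is knowing which tables to put in that list without tracing the schema by hand every time.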
Tips veeeery welcome!
(And I’m using MySQL.)