List: General Discussion
From: Jerry Schwartz
Date: April 21 2010 2:32pm
Subject: RE: better way to backup 50 Gig db?
>-----Original Message-----
>From: Claudio Nanni [mailto:claudio.nanni@stripped]
>Sent: Wednesday, April 21, 2010 2:12 AM
>Cc: mysql@stripped
>Subject: Re: better way to backup 50 Gig db?
>

[JS] <snip>

[JS] Unless I've forgotten something from earlier in my career (what day is 
it, anyways?), there are three aspects to this problem:

1. Ensuring that your databases, slave and master individually, are internally 
consistent;
2. Ensuring that your master has captured the latest externally-supplied data; 
and
3. Ensuring that your slave and your master are totally in sync.

#1 is the proper goal for the master. That's the whole point of ACID. For the 
master database, #2 is unattainable. You can buffer as many times, in as many
ways, and in as many places as you like, but there is always going to be the
**possibility** that some incoming data will be lost. Even if you push the
problem all the way back to a human user, it will still be possible to lose 
data. If something is possible, it will happen: perhaps not for millennia, but 
more likely as soon as you leave on vacation.

Similarly, #1 is an attainable and necessary goal for a slave; and #2 is just 
as unattainable for a slave as for a master. The only way to guarantee #3 is 
to include the replication somewhere in the ACID transaction. The penalty for 
that is going to be a loss of throughput, possibly a horrendous loss of 
throughput. That is where somebody needs to do a cost/benefit analysis.
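
For what it's worth, MySQL 5.5's semi-synchronous replication plugin gets part 
of the way toward #3: the master's commit does not return to the client until 
at least one slave has written the transaction to its relay log. It does not 
wait for the slave to actually apply the transaction, so it narrows the window 
rather than closing it, and the throughput hit is exactly what that 
cost/benefit analysis has to weigh. A minimal sketch, assuming the plugin 
libraries are installed and treating the values below as placeholders:

    -- On the master (needs SUPER; semisync_master.so ships with MySQL 5.5+):
    INSTALL PLUGIN rpl_semi_sync_master SONAME 'semisync_master.so';
    SET GLOBAL rpl_semi_sync_master_enabled = 1;
    SET GLOBAL rpl_semi_sync_master_timeout = 1000;  -- ms to wait before falling back to async

    -- On the slave:
    INSTALL PLUGIN rpl_semi_sync_slave SONAME 'semisync_slave.so';
    SET GLOBAL rpl_semi_sync_slave_enabled = 1;
    STOP SLAVE IO_THREAD;
    START SLAVE IO_THREAD;  -- reconnect so the slave registers as semi-sync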

>Just my two cents
>
[JS] ... and mine ...

>Claudio
>
>
>Gavin Towey wrote:
>
>You can make binary backups from the master using filesystem snapshots.  You
>only need to hold a global read lock for a split second.
>
>Regards,
>Gavin Towey
>
>
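
[JS] To make Gavin's snapshot suggestion concrete: you hold the global read 
lock only long enough to create the snapshot, then release it and copy the 
frozen image at leisure. Here is a rough sketch of one way to do it with LVM; 
the volume names, sizes, mount points, and backup host are made up, and with 
InnoDB you are relying on its crash recovery to make the copied files usable.

    mysql> FLUSH TABLES WITH READ LOCK;  -- keep this session open; disconnecting releases the lock
    mysql> SHOW MASTER STATUS;           -- note File/Position if the copy will seed a slave

    (from a root shell, while the lock is still held)
    root# lvcreate --snapshot --size 5G --name mysql-snap /dev/vg0/mysql-data

    mysql> UNLOCK TABLES;

    (then copy the frozen image at leisure and drop the snapshot)
    root# mount /dev/vg0/mysql-snap /mnt/mysql-snap
    root# rsync -a /mnt/mysql-snap/ backuphost:/backups/mysql/
    root# umount /mnt/mysql-snap && lvremove -f /dev/vg0/mysql-snap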



