List: General Discussion
From: Dan Buettner  Date: November 13 2006 5:38pm
Subject:Re: Backing up large dbs with tar
Van, I'll second what Gerald said about mysqlhotcopy.

When we first began using MySQL at my last job, we had terrible
problems with MySQL crashing.  It turned out to be a third-party
backup process attempting to lock and read the database files while
MySQL was attempting to use them.

Using mysqlhotcopy to copy the files elsewhere, and excluding the data
directory from the backup software, gave us a stable solution.
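For reference, a minimal cron-style sketch of that setup.  The user,
password, paths, and database name here are placeholders, not from the
original thread; the mysqlhotcopy call is guarded so the script is a
no-op on a host without MySQL installed.

```shell
#!/bin/sh
# Hypothetical locations -- adjust for your installation.
DB=my_db_name
HOTCOPY_DIR=/backups/mysqlhotcopy
ARCHIVE="$DB-$(date +%Y%m%d).tar.gz"

# mysqlhotcopy flushes and read-locks the tables, copies the MyISAM
# .frm/.MYD/.MYI files, then releases the locks, so the copied files
# are internally consistent.
if command -v mysqlhotcopy >/dev/null 2>&1; then
    mysqlhotcopy --user=backup --password=secret "$DB" "$HOTCOPY_DIR"

    # Archive the *copied* files; point the backup software at this
    # archive and exclude /data/mysql itself from its file list.
    tar czf "/backups/$ARCHIVE" -C "$HOTCOPY_DIR" "$DB"
fi
echo "$ARCHIVE"
```

Because tar only ever sees the locked-and-released copy, the "file
changed as we read it" error can no longer occur.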

mysqldump might also work well for you, as it can lock
tables/databases and give you a consistent snapshot.  Restoring from
a mysqldump file can take longer, though.
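A sketch of the mysqldump alternative, again with placeholder
credentials and paths, and guarded so it is a no-op where MySQL is
not installed:

```shell
#!/bin/sh
# Hypothetical names -- adjust for your installation.
DB=my_db_name
DUMP="/backups/$DB.sql"

if command -v mysqldump >/dev/null 2>&1; then
    # --lock-tables read-locks all tables in the database for the
    # duration of the dump, giving a consistent snapshot.
    mysqldump --user=backup --password=secret --lock-tables "$DB" > "$DUMP"

    # Restoring replays every statement in the dump, which is why it
    # can be much slower than copying table files back into place:
    # mysql --user=backup --password=secret "$DB" < "$DUMP"
fi
echo "$DUMP"
```

The dump file is plain SQL, so it is also portable across MySQL
versions in a way that raw MyISAM files are not.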

HTH,
Dan


On 11/13/06, Van <van@stripped> wrote:
> Greetings:
>
> I have a 600M data file that never gets backed up.  The following error
> occurs in the cron job:
>
> tar: /data/mysql/"my_db_name"/"my_large_table_name".MYI: file changed as we read it
>
> Is there a way I can set this one table to read-only prior to the backup
> without affecting other db writes during this operation?
>
> Thanks,
> Van
>
> --
> MySQL General Mailing List
> For list archives: http://lists.mysql.com/mysql
> To unsubscribe:    http://lists.mysql.com/mysql?unsub=1
>
>