Results 1 to 18 of 18

Thread: Downloading big databases?

  1. #1
    Registered
    Join Date
    Dec 2003
    Location
    San Francisco
    Posts
    20
    Rep Power
    0

    Downloading big databases?

    What are people's techniques for downloading big databases (around 1GB)? Exporting from phpMyAdmin just times out (I don't want to have to do every table separately). I tried to get a backup from the MySQL backups tool on the control panel, but it doesn't seem to actually do anything when I click on the backup (this is the new beta control panel).

    What I really want is to be able to run some equivalent of 'mysqldump | gzip > dump.sql.gz' and just be able to grab the file - hmm, maybe I can do something like that from php...
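
    Something along these lines is what I'm picturing from php (host, user, password and database name below are made up, not real details):

    PHP Code:
        // Naive single-call version: dump the whole db and gzip it in one shell command
        system('mysqldump -h myhost.powwebmysql.com -u myuser -pmypassword --opt mydbname | gzip > dump.sql.gz');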

  2. #2
    Former Spam Filter (EU) IanS's Avatar
    Join Date
    Mar 2004
    Location
    Washington (THE original UK one!)
    Posts
    12,964
    Rep Power
    30
    You will probably find that you need to ask support to make a copy. One other technique mentioned elsewhere is to export various sections (e.g. tables) separately and re-assemble them afterwards.
    This is a Powweb customer
    helping Powweb customer forum.

    I am a customer just like you!!

    Some matters can only be answered by staff or support.
    Give it a go - ask here first!

  3. #3
    YvetteKuhns's Avatar
    Join Date
    Feb 2003
    Location
    Allentown, PA USA
    Posts
    15,244
    Rep Power
    35
    Did you try BigDump? It is a php script. I had tried downloading a few tables at a time from phpmyadmin before finding this script.
    Yvette Kuhns
    Power Pages Web Design
    Customized Internet Advertising Solutions

  4. #4
    Registered
    Join Date
    Dec 2003
    Location
    San Francisco
    Posts
    20
    Rep Power
    0
    I'm already using bigdump for uploading to powweb - are you also able to use it to download?

  5. #5
    Registered
    Join Date
    Dec 2003
    Location
    San Francisco
    Posts
    20
    Rep Power
    0
    Hmm - you can run mysqldump from php, but there is a 10 second cpu limit on processes so it can't dump the whole db.

    However, a php script to dump the tables individually and cat the dumps together should be easy to knock up...

  6. #6
    satis's Avatar
    Join Date
    Oct 2002
    Location
    Dallas
    Posts
    2,914
    Rep Power
    21
    very cool. If you manage to get it working, do you mind posting the code? Are you just using a system call to run mysqldump?

  7. #7
    satis's Avatar
    Join Date
    Oct 2002
    Location
    Dallas
    Posts
    2,914
    Rep Power
    21
    ok, I admit this captured my attention so I've been playing with it. I'm currently writing a script for use here, but thought I'd share how to successfully force a mysqldump. This assumes mysqldump is in the system path.

    PHP Code:
        function ajaxDump($hostname, $db, $username, $password, $table) {
            // Build the mysqldump command; output goes to <db>_<table>.sql in the current directory
            $command = 'mysqldump --opt --host=' . $hostname . ' --user=' . $username
                     . ' --password=' . $password . ' ' . $db . ' ' . $table
                     . ' > ' . $db . '_' . $table . '.sql';
            // Run it and capture the exit status (0 means mysqldump succeeded)
            $result = 0;
            system($command, $result);
            if ($result == 0)
                return true;
            return false;
        }
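
    For example, calling it with placeholder connection details (substitute your own host, user, password, database and table - these aren't real credentials):

    PHP Code:
        // Hypothetical values for illustration only
        if (ajaxDump('cust.powwebmysql.com', 'mydb', 'db_user', 'db_password', 'mytable')) {
            echo 'Dumped mydb_mytable.sql';
        } else {
            echo 'mysqldump failed';
        }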

  8. #8
    satis's Avatar
    Join Date
    Oct 2002
    Location
    Dallas
    Posts
    2,914
    Rep Power
    21
    Have I ever mentioned that I can get obsessive about things that interest me? I've written a sql dump script that leverages mysqldump, dumping of individual tables, and ajax to make something that I think people may find useful. I consider this release 0.1a.

    I've attached a zip with the php file. Basically, drop it into your webspace, browse to it, and you can probably figure out the rest. I would post the code below, but it makes the post too long for the forums.

    I did not try catting together the sql files... I suppose that's a possibility for a future version.

    The only bug I found so far is that the rapid firing of ajax requests seems to cause some xmlhttprequest weirdness in FF. At least according to firebug, the xmlhttprequest just keeps going and going and going for each table. However, results are definitely returned, so that may be a problem with firebug. This is not tested in any browser other than FF, but it should work in any newer browser. *crosses fingers*

    Currently I'm handling table dumps individually, i.e. one goes until it's done, then the next one goes. There's nothing to stop me from making several go in parallel. I actually designed it to do that, but that can probably wait until any bugs I've missed are ironed out.

    In order to keep things all in a single file, I stuck all the javascript and css into the php class. That makes things a bit dirty, but it means you only have to upload a single page to get this to work. I figured it was worth the trade off.

    This will work in php 5.0+ only.
    Last edited by IanS; 4-23-10 at 01:22 PM. Reason: Removed attachment - v0.1

  9. #9
    Registered
    Join Date
    Dec 2003
    Location
    San Francisco
    Posts
    20
    Rep Power
    0
    mysqldumping the individual tables worked a charm. Concatenating the gzipped files works too, as gunzip is happy unzipping them. Here's the script I wrote to dump all the tables from a database:

    PHP Code:
    $host = 'myhost.powwebmysql.com';
    $username = 'myuser';
    $password = 'mypassword';
    $database = 'mydbname';

    $file = 'export.sql.gz';

    // Connect
    $db = mysqli_connect($host, $username, $password, $database);

    // List the tables and remove any previous export file
    $q = mysqli_query($db, 'show tables');
    unlink($file);

    // Dump each table separately (so each mysqldump stays under the CPU limit)
    // and append its gzipped output to a single file
    while ($t = mysqli_fetch_row($q)) {
        $table = $t[0];
        system("mysqldump -h $host -u $username -p$password --opt $database $table | gzip >> $file");
    }

    No fancy ajax - it just dumps it all to a file in the same dir as the script. And it assumes no single table is so big that it'll take more than 10 seconds to dump it (otherwise the mysqldump will be terminated and the file will be screwed up).

    Now I'm having fun trying to import the dump into a different version of mysql on my home machine...

  10. #10
    Former Spam Filter (EU) IanS's Avatar
    Join Date
    Mar 2004
    Location
    Washington (THE original UK one!)
    Posts
    12,964
    Rep Power
    30
    It seems that this is developing into a really useful thread (at least for some!). I've made it sticky for the moment while it is still in development. I'll then prune it a little later.
    This is a Powweb customer
    helping Powweb customer forum.

    I am a customer just like you!!

    Some matters can only be answered by staff or support.
    Give it a go - ask here first!

  11. #11
    Registered
    Join Date
    Dec 2003
    Location
    San Francisco
    Posts
    20
    Rep Power
    0
    Aha, seems my import problem is one of my tables hitting the 10 second cpu limit. Going to think on how to get around that one...

  12. #12
    satis's Avatar
    Join Date
    Oct 2002
    Location
    Dallas
    Posts
    2,914
    Rep Power
    21
    oh, interesting, I didn't know you could just concat the gzipped files and actually have it work. That's pretty neat. I'm going to gank that functionality and include it in my script, if you have no objections.

    Regarding inserts timing out, that's a sticky problem. If you can think of an elegant solution that doesn't rely on a php script parsing the dump file and running the inserts until it gets close to timing out, please let me know.
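
    In case anyone does end up going the parse-and-resume route, here's a rough sketch of the idea - purely illustrative, not the attached script or BigDump. The file name, credentials and offset parameter are made up, it assumes a plain uncompressed .sql dump, and it uses wall-clock time as a stand-in for the CPU limit:

    PHP Code:
        // Staggered import sketch: run part of the dump per request, stop well
        // before the ~10 second limit, then redirect to resume from a byte offset
        $dumpFile = 'export.sql';                                   // hypothetical dump file
        $offset   = isset($_GET['offset']) ? (int)$_GET['offset'] : 0;
        $start    = time();                                         // wall clock, not true CPU time

        $db = mysqli_connect('myhost.powwebmysql.com', 'myuser', 'mypassword', 'mydbname');

        $fh = fopen($dumpFile, 'r');
        fseek($fh, $offset);

        $buffer = '';
        while (($line = fgets($fh)) !== false) {
            $trimmed = trim($line);
            // skip blank lines and mysqldump comments
            if ($trimmed === '' || substr($trimmed, 0, 2) === '--') {
                continue;
            }
            $buffer .= $line;
            // treat a trailing semicolon as the end of a statement (naive, but
            // matches what mysqldump --opt produces most of the time)
            if (substr($trimmed, -1) === ';') {
                mysqli_query($db, $buffer);
                $buffer = '';
                // stop early and send the browser back to continue from here
                if (time() - $start > 5) {
                    header('Location: ' . $_SERVER['PHP_SELF'] . '?offset=' . ftell($fh));
                    exit;
                }
            }
        }
        echo 'Import complete';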

  13. #13
    satis's Avatar
    Join Date
    Oct 2002
    Location
    Dallas
    Posts
    2,914
    Rep Power
    21
    I've made some changes to my script. Version 0.2 is now attached.

    Adding gzip is still in the future.

    changelog:
    Added simultaneous dumping of 3 tables
    Changed name of script
    Altered writing to update table to be synchronous so simultaneous writes don't lock the browser up
    On successful dump, the table name in the "tables to dump" table now becomes the download link
    If a second dump request is made after a table was successfully dumped, the table parser strips off the previous link
    Any existing file is now unlinked before being redumped
    Progress Messages dialog improved
    Incremented version to v0.2
    Last edited by IanS; 4-23-10 at 01:23 PM. Reason: Removed attachment - v0.2

  14. #14
    satis's Avatar
    Join Date
    Oct 2002
    Location
    Dallas
    Posts
    2,914
    Rep Power
    21
    ok, I'm having gzip issues... I'm executing the following command in an exec shell:

    mysqldump --opt --host=cust.powwebmysql.com --user=db_user --password=db_password database table | gzip > filename.gz

    filename.gz is created as you might expect, but it's not gzipped. It's just plain text. From everything I can find online (and this thread), it should work. Any ideas what might be the matter? I could probably break it into 2 synchronous commands by doing a mysqldump into a text file, then gzip the text file, but I'd rather make a single system call.

  15. #15
    Rick
    Join Date
    May 2002
    Location
    Minneapolis, MN
    Posts
    1,753
    Rep Power
    19
    Timeouts come to mind. GZIP is pretty fast, but the combined tasks might cross the threshold. You should definitely try splitting the command just to check things are working step-by-step.
    Rick Trethewey

  16. #16
    satis's Avatar
    Join Date
    Oct 2002
    Location
    Dallas
    Posts
    2,914
    Rep Power
    21
    haha... I just made a really stupid mistake. Turns out everything was working exactly as it's supposed to. For some reason I was assuming that clicking on a link to a .gz file should prompt a download, like a .zip would, but that's not the case. I can't believe how much time I spent troubleshooting something that was working. *sigh*

  17. #17
    satis's Avatar
    Join Date
    Oct 2002
    Location
    Dallas
    Posts
    2,914
    Rep Power
    21
    Updated script to 0.3

    Added scrolling to progress messages
    Clear progress messages every time dump is clicked
    On file dump failure, unlink any existing file
    Added gzip support and detection
    Attached Files

  18. #18
    Registered
    Join Date
    Dec 2003
    Location
    San Francisco
    Posts
    20
    Rep Power
    0
    Yeah, that happens to me too - something is automatically unzipping the file. Is that a powweb thing or is it my browser (firefox) doing that? Safari just downloads the file like I'd expect…
