
Backing Up Important EC2 Files and a MySQL Database to S3 with Python

程序员文章站 2022-06-07 14:18:38

Today I tried using the boto library to back up files to S3 with Python. Without further ado, here's the code:

1. Back up important files to S3:


[python] 
import os
from boto.s3.connection import S3Connection
from boto.s3.key import Key

connected = False

def connect():
    access_key = 'yourkey'
    secret_key = 'yourkey'
    global conn, connected
    conn = S3Connection(access_key, secret_key)
    connected = True

def put(filename, bucketname):
    # Upload a local file to the given S3 bucket,
    # using the file's local path as the object key
    if not connected:
        print('not connected!')
        return
    local_file = filename.strip()
    bucket = bucketname.strip()
    b = conn.get_bucket(bucket)
    k = Key(b)
    k.key = local_file
    k.set_contents_from_filename(local_file)

if __name__ == '__main__':
    connect()
    sourcefolder = '/var/www/www.ttgrow.com/ttgrow/photos/storyphotos'
    print('story photo sync in progress')
    for root, dirs, files in os.walk(sourcefolder):
        for name in files:
            path = os.path.join(root, name)
            print('  ' + path)
            put(path, 'ttgrow-photo')
    sourcefolder = '/var/www/www.ttgrow.com/ttgrow/photos/thumbnails'
    print('thumbnail sync in progress')
    for root, dirs, files in os.walk(sourcefolder):
        for name in files:
            path = os.path.join(root, name)
            print('  ' + path)
            put(path, 'ttgrow-photo')
    print('finished')
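One thing worth noting: the script above uses each file's full local path as its S3 key, so objects end up under keys like `/var/www/...`. If you would rather have keys relative to the source folder, `os.path.relpath` can derive them. A minimal stdlib sketch (the helper name `iter_backup_keys` is my own, not from the original script):

```python
import os

def iter_backup_keys(sourcefolder):
    # Yield (local_path, s3_key) pairs, with keys relative to sourcefolder
    # and normalized to forward slashes as S3 key conventions expect.
    for root, dirs, files in os.walk(sourcefolder):
        for name in files:
            path = os.path.join(root, name)
            key = os.path.relpath(path, sourcefolder).replace(os.sep, '/')
            yield path, key
```

Each yielded `s3_key` can then be assigned to `k.key` before calling `set_contents_from_filename(path)`.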

2. Back up MySQL to S3:

[python] 
import os
from datetime import datetime
from boto.s3.connection import S3Connection
from boto.s3.key import Key

connected = False

def connect():
    access_key = 'yourkey'
    secret_key = 'yourkey'
    global conn, connected
    conn = S3Connection(access_key, secret_key)
    connected = True

def put(filename, bucketname):
    # Upload a local file to the given S3 bucket,
    # using the file's local path as the object key
    if not connected:
        print('not connected!')
        return
    local_file = filename.strip()
    bucket = bucketname.strip()
    b = conn.get_bucket(bucket)
    k = Key(b)
    k.key = local_file
    k.set_contents_from_filename(local_file)

if __name__ == '__main__':
    now = datetime.today()
    filename = ('/tmp/dbbak-' + str(now.year) + '-' + str(now.month) + '-'
                + str(now.day) + '-' + str(now.hour) + '-' + str(now.minute) + '.sql')
    # Dump the database to a local file, then push it to S3
    os.system('mysqldump -h your_rds_location -u usrname -ppassword dbname > ' + filename)
    print('backup db finished')
    connect()
    put(filename, 'ttgrow-db')
    print('upload to s3 finished')
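A small wrinkle with the hand-concatenated timestamp: months, days, hours, and minutes below 10 come out without zero padding (e.g. `dbbak-2022-6-7-9-5.sql`), so the dump filenames don't sort chronologically. `datetime.strftime` produces a fixed-width stamp instead; a short sketch (the helper name `backup_filename` is my own):

```python
from datetime import datetime

def backup_filename(now=None):
    # Build a zero-padded, lexicographically sortable dump path
    # like /tmp/dbbak-2022-06-07-14-18.sql
    now = now or datetime.today()
    return now.strftime('/tmp/dbbak-%Y-%m-%d-%H-%M.sql')
```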
Then add the script to a scheduled job and it will run automatically every day :)
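For that scheduling step, a crontab entry along these lines would do (the script path and log path here are placeholders — adjust them to wherever your script actually lives):

```shell
# Edit with `crontab -e`: run the S3 backup script every day at 03:00,
# appending both stdout and stderr to a log file
0 3 * * * /usr/bin/python /path/to/backup_to_s3.py >> /var/log/s3backup.log 2>&1
```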