
Introduction to the example of joining the master-slave synchronization cluster with python mysql library

Mar 22, 2017 10:06 AM

The script can be run on any machine that has MySQL installed (at least the MySQL client; the servers must be MySQL 5.6 or later). It first prompts for the source IP and checks that MySQL on the source is running normally, then dumps the source databases from this machine, transfers the dump file to the destination server, imports the databases there, and finally adds the new slave to the replication cluster.

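The final step uses MASTER_AUTO_POSITION=1, which relies on GTID-based replication; that is why MySQL 5.6 or later is required. As a rough sketch (the helper name and the version-string format are assumptions, not part of the original script), the requirement can be checked up front:

```python
def supports_gtid_auto_position(version_string):
    """Return True when the server version is at least 5.6 (GTID auto-positioning).
    Version strings typically look like '5.6.35-log' or '5.7.17'."""
    parts = version_string.split('.')
    major = int(parts[0])
    minor = int(parts[1].split('-')[0])  # drop suffixes such as '-log'
    return (major, minor) >= (5, 6)
```

On a live server the version string would come from running SELECT VERSION(); against the source and destination before starting the dump.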
[root@master test]# cat finaly_mysql.py

#!/usr/bin/env python

# -*- coding: utf-8 -*-

import MySQLdb, socket, paramiko, sys, os, datetime, time

sour_db = raw_input('please input the source mysql database ip:')
dest_db = raw_input('please input the destination mysql database ip:')
password = raw_input('please input the sour_db root user password:')

def check_port(ip, port):
    print "test whether source mysql db is running!"
    res = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    res.settimeout(3)
    try:
        res.connect((ip, port))
        print("\033[41;36m Server port 3306 OK! \033[0m")
    except Exception, e:
        print Exception, ":", e
        print "break this program"
        sys.exit()
    res.close()
# Check whether port 3306 on the source database is reachable

def begin_dump():
    print("\033[41;36m begin dump remote mysql,please waiting...! \033[0m")
    dump = "mysqldump -uroot -pXp29at5F37 -h %s -A -B > /tmp/dump.sql && echo $?" % sour_db
    if os.popen(dump).read().strip() == '0':
        print("\033[41;36m dump result is 0, means dump success \033[0m")
    else:
        print "\033[1;31;40m%s\033[0m" % "dump error, exit python file"
        sys.exit()
# Dump all databases from the source server to a local file

def trans_dump():
    local_dir = '/tmp'
    dest_dir = '/tmp'
    files = 'dump.sql'
    print "begin transfer mysql dump file from local server to destination mysql database server, please waiting...!"
    try:
        t = paramiko.Transport((dest_db, 22))
        t.connect(username='root', password=password)
        sftp = paramiko.SFTPClient.from_transport(t)
        print ' Beginning to transfer file to %s %s' % (dest_db, datetime.datetime.now())
        print ' Transfering file:', dest_db + ':' + os.path.join(local_dir, files)
        sftp.put(os.path.join(local_dir, files), os.path.join(dest_dir, files))
        t.close()
        print 'Transfer all dump file success %s' % datetime.datetime.now()
    except Exception, e:
        print Exception, ":", e
        sys.exit()
# Copy the dump file from the local machine to the destination server, i.e. dest_db

def import_dump():
    conn = MySQLdb.connect(host=dest_db, user='root', passwd='Xp29at5F37', db='test')
    cur1 = conn.cursor()
    cur1.execute("stop slave;")
    cur1.close()
    cur2 = conn.cursor()
    cur2.execute("reset master;")
    cur2.close()
    cur3 = conn.cursor()
    cur3.execute("reset slave all;")
    cur3.close()
    conn.close()
    print "begin to import mysql dump file, please waiting...!"
    import_command = "mysql -uroot -pleyou < /tmp/dump.sql"
    print ' begin import dump file, it may take a long time, please be patient !!!'
    try:
        ssh = paramiko.SSHClient()
        ssh.load_system_host_keys()
        ssh.connect(hostname=dest_db, username='root', password=password)
        stdin, stdout, stderr = ssh.exec_command(import_command)
        print stderr.read()
        ssh.close()
    except Exception, e:
        print Exception, ":", e
    print "import over"
# Before importing, stop and reset any replication on dest_db: its previous role
# is unknown. It may have been a master before, or a slave of another machine.

def final_check_mysql():
    print "finally check mysql service"
    status = True
    try:
        conn = MySQLdb.connect(host=dest_db, user='root', passwd='Xp29at5F37', db='test')
        cur1 = conn.cursor()
        cur1.execute("CHANGE MASTER TO MASTER_HOST='192.168.3.10', MASTER_USER='root', MASTER_PASSWORD='Xp29at5F37', MASTER_AUTO_POSITION=1;")
        cur1.execute("start slave;")
        cur1.close()
        time.sleep(10)
        cur2 = conn.cursor()
        cur2.execute("show slave status;")
        result = cur2.fetchall()
        io_thread = result[0][10]
        sql_thread = result[0][11]
        print io_thread, sql_thread
        if io_thread == "Yes" and sql_thread == "Yes":
            print 'MySQL master/slave replication status is successful'
        else:
            print 'MySQL master/slave replication failed, please check it'
        cur2.close()
        conn.close()
    except Exception, e:
        print Exception, "\033[1;31;40m%s\033[0m" % ":", "\033[1;31;40m%s\033[0m" % e
        status = False
    return status
# Finally, check the running status of the new slave and verify that replication is normal

if __name__ == "__main__":
    check_port(sour_db, 3306)
    begin_dump()
    trans_dump()
    import_dump()
    final_check_mysql()
    print "dump file ok!!!!!!!!!!!"
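One fragile spot in begin_dump is the `&& echo $?` trick: it infers success by parsing the echoed status from stdout rather than asking for the process's real exit code. A safer variant (a sketch; run_shell is a hypothetical helper, not part of the original script) uses subprocess:

```python
import subprocess

def run_shell(cmd):
    """Run a shell command and return True when it exits with status 0."""
    return subprocess.call(cmd, shell=True) == 0
```

The dump step would then become run_shell("mysqldump -uroot -p... -A -B > /tmp/dump.sql"), with no output parsing needed.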

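Similarly, final_check_mysql reads Slave_IO_Running and Slave_SQL_Running by positional index (result[0][10] and result[0][11]), which silently breaks if the column layout of SHOW SLAVE STATUS changes between MySQL versions. A more robust sketch (the helper name is an assumption) looks the columns up by name, for example from a cursor created with MySQLdb.cursors.DictCursor:

```python
def replication_healthy(slave_status_row):
    """slave_status_row: one SHOW SLAVE STATUS row as a dict,
    e.g. from a MySQLdb.cursors.DictCursor. Returns True only when
    both the IO thread and the SQL thread are running."""
    return (slave_status_row.get('Slave_IO_Running') == 'Yes'
            and slave_status_row.get('Slave_SQL_Running') == 'Yes')
```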
The above is the detailed content of Introduction to the example of joining the master-slave synchronization cluster with python mysql library. For more information, please follow other related articles on the PHP Chinese website!
