Quickly Building a MySQL Slave with mysqldump
Broadly speaking, setting up MySQL master-slave replication takes three steps: first, add the replication parameters to the master and slave instances and create a replication account; second, take a snapshot on the master; third, on the slave, point replication at the master's IP, port, user name, password, binlog position, and so on. The snapshot can be taken in many ways, for example with InnoDB Hot Backup, xtrabackup, mysqldump, or a plain tar of the data directory. This article covers the mysqldump approach, which is practical for databases up to roughly 20GB.
Related references:
Exporting databases with mysqldump
A brief description of MySQL replication with examples
Configuring multiple MySQL instances
1. Instance-level master-slave setup
-- Demo environment. Note that this article uses multiple instances on a single host: the master runs on port 3406 and the slave on port 3506
master@localhost[(none)]> show variables like 'version';
+---------------+------------+
| Variable_name | Value |
+---------------+------------+
| version | 5.6.12-log |
+---------------+------------+
master@localhost[(none)]> system cat /etc/issue
CentOS release 5.4 (Final)
Kernel \r on an \m
--For the master/slave parameter configuration, see "A brief description of MySQL replication with examples" above
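--For reference, a minimal sketch of the replication-related settings in each instance's config file;
--the master's config path and both server-id values are assumptions, while the log names and the
--slave's options match the output shown later in this article
# /data/inst3406/data3406/my3406.cnf (master) -- illustrative
[mysqld]
server-id = 3406
log-bin   = inst3406bin
# /data/inst3506/data3506/my3506.cnf (slave) -- illustrative
[mysqld]
server-id = 3506
relay-log = relay-bin
skip-slave-start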
--Create the replication account
master@localhost[(none)]> grant replication slave on *.* to 'repl'@'192.168.1.177' identified by 'xxx';
Query OK, 0 rows affected (0.01 sec)
--Take a global read lock
master@localhost[(none)]> flush tables with read lock;
Query OK, 0 rows affected (0.02 sec)
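--Note: the global read lock lasts only as long as this session, so the session is kept open and the
--dump below is run from it via the client's system command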
master@localhost[(none)]> system pwd
/data/inst3406
--Get the master's binlog position
master@localhost[(none)]> show master status;
+--------------------+----------+--------------+------------------+-------------------+
| File | Position | Binlog_Do_DB | Binlog_Ignore_DB | Executed_Gtid_Set |
+--------------------+----------+--------------+------------------+-------------------+
| inst3406bin.000001 | 2169 | | | |
+--------------------+----------+--------------+------------------+-------------------+
1 row in set (0.00 sec)
--Export the entire instance with mysqldump
master@localhost[(none)]> system mysqldump -uroot -pxxx -S /tmp/mysql3406.sock --routines --all-databases --opt >alldb.sql
master@localhost[(none)]> system ls
alldb.sql data3406
--Release the lock
master@localhost[(none)]> unlock tables;
master@localhost[(none)]> exit
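--As an aside: when all tables are InnoDB, the global read lock above can be avoided entirely by
--letting mysqldump take a consistent snapshot and record the binlog coordinates itself, as section 2
--below does; a sketch, assuming the same connection parameters:
[mysql@app inst3406]$ mysqldump -uroot -pxxx -S /tmp/mysql3406.sock --single-transaction --master-data=2 --routines --all-databases >alldb.sql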
--Import the dump on the slave
[mysql@app inst3406]$ mysql -uroot -pxxx -S /tmp/mysql3506.sock <alldb.sql
--Log in to the slave (mysqls is presumably a shell alias for the slave's mysql client connection)
[mysql@app inst3506]$ mysqls
--Set the master info on the slave (host, port, user, password, binlog coordinates)
slave@localhost[(none)]> change master to
-> MASTER_HOST='192.168.1.177',
-> MASTER_USER='repl',
-> MASTER_PASSWORD='xxx',
-> MASTER_PORT=3406,
-> MASTER_LOG_FILE='inst3406bin.000001',
-> MASTER_LOG_POS=2169;
Query OK, 0 rows affected, 2 warnings (0.01 sec)
--Start the slave
slave@localhost[(none)]> start slave;
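--To confirm replication is running, check that both threads are up (full sample output appears at the
--end of section 2); if either shows No, inspect Last_IO_Error and Last_SQL_Error
slave@localhost[(none)]> show slave status \G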
2. Partial-database slave setup
--The following demonstrates building a slave that replicates only the sakila and tempdb databases
--Reset the slave
slave@localhost[(none)]> stop slave;
Query OK, 0 rows affected (0.01 sec)
slave@localhost[(none)]> reset slave all;
Query OK, 0 rows affected (0.01 sec)
--Add replication filters so the slave syncs only the chosen databases; the modified config:
[mysql@app ~]$ grep replicate /data/inst3506/data3506/my3506.cnf
replicate-do-db=test
replicate-do-db=sakila
[mysql@app ~]$ grep skip-slave /data/inst3506/data3506/my3506.cnf
skip-slave-start
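--Note: with statement-based replication, replicate-do-db filters on the statement's default (USE)
--database, so cross-database statements may be skipped; a common alternative when that matters is a
--table-level wildcard filter, e.g. (illustrative):
replicate-wild-do-table=sakila.%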
--Restart instance 3506 so the new replication filters take effect
[mysql@app ~]$ mysqladmin -uroot -pxxx -S /tmp/mysql3506.sock shutdown
[mysql@app ~]$ mysqld_safe --defaults-file=/data/inst3506/data3506/my3506.cnf &
--Dump only sakila and tempdb from the master
[mysql@app ~]$ mysqldump -uroot -pxxx -S /tmp/mysql3406.sock --single-transaction --master-data=2 -R --databases sakila tempdb >multidb.sql
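--Note: --single-transaction takes a consistent InnoDB snapshot without a global read lock, and --master-data=2 records the matching binlog coordinates in the dump as a commented-out CHANGE MASTER statement, which is read back with grep below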
--Log in on the slave side and execute the dump file
[mysql@app ~]$ mysqls
slave@localhost[(none)]> source multidb.sql
--Check the master binlog position recorded during the dump
slave@localhost[tempdb]> system grep -i "change master" multidb.sql
-- CHANGE MASTER TO MASTER_LOG_FILE='inst3406bin.000001', MASTER_LOG_POS=3293117;
--Set the master info on the slave (host, port, etc.)
slave@localhost[tempdb]> change master to
-> MASTER_HOST='192.168.1.177',
-> MASTER_USER='repl',
-> MASTER_PASSWORD='xxx',
-> MASTER_PORT=3406,
-> MASTER_LOG_FILE='inst3406bin.000001',
-> MASTER_LOG_POS=3293117;
Query OK, 0 rows affected, 2 warnings (0.01 sec)
--Start the slave
slave@localhost[tempdb]> start slave;
Query OK, 0 rows affected (0.01 sec)
--Verify the result
slave@localhost[tempdb]> show slave status \G
*************************** 1. row ***************************
Slave_IO_State: Waiting for master to send event
Master_Host: 192.168.1.177
Master_User: repl
Master_Port: 3406
Connect_Retry: 60
Master_Log_File: inst3406bin.000001
Read_Master_Log_Pos: 3293117
Relay_Log_File: relay-bin.000002
Relay_Log_Pos: 285
Relay_Master_Log_File: inst3406bin.000001
Slave_IO_Running: Yes
Slave_SQL_Running: Yes
Replicate_Do_DB: test,sakila
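--As a final sanity check, a minimal sketch (the table name repl_check is hypothetical): a change made
--with a filtered database as the default should replicate, while a database outside the
--replicate-do-db list should not
master@localhost[(none)]> use sakila
master@localhost[sakila]> create table repl_check(id int);
slave@localhost[tempdb]> show tables in sakila like 'repl_check';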
