
A Guide to Backing Up TiDB to S3 Storage on Domestic Public Clouds and in Private Deployments

2023-05-26

Author: pepezzzz. Original source: https://tidb.net/blog/2ba94301

Background

S3 storage services became popular quickly in the cloud era thanks to their easy-to-use RESTful API, scalability, and pay-as-you-go billing model. TiDB clusters deployed on domestic public clouds commonly use S3 storage for backups, and as open-source S3 stores such as MinIO gain traction in private on-premises deployments, on-premises TiDB clusters are gradually moving to open-source S3 storage as well.

Key points and recommended practices for using S3 storage

Path style and virtual hosted style

Path style and virtual hosted style are the two ways of constructing URLs for an S3 storage service.


  • Path Style


The URL structure is <Schema>://<S3 Endpoint>/<Bucket>/<Object>, where Schema is HTTP or HTTPS, Bucket is the name of the storage space, S3 Endpoint is the endpoint through which the data store hosting the bucket is accessed, and Object is the access path of the file, for example: https://minio.example.com:9000/examplebucket/destfolder/example.txt


  • Virtual hosted style


Virtual hosted style places the bucket in the Host header. The URL structure is <Schema>://<Bucket>.<public Endpoint>/<Object>, with each part meaning the same as above, for example: https://examplebucket.oss-cn-hangzhou.aliyuncs.com/destfolder/example.txt


When an object is referenced by URL, DNS resolution maps the subdomain to an IP address. Under path style, the subdomain is always the public cloud domain or one of its regional endpoints; under virtual hosted style, the subdomain is specific to the bucket. Path style is the original S3 access scheme; when it is used on a public cloud, all S3 users access the same domain (for example s3.Region.amazonaws.com), which makes traffic management and access control increasingly hard to handle.


Today, public cloud S3-compatible object stores such as Alibaba Cloud OSS support only virtual hosted access, most privately deployed S3 stores support only path-style access, and some services support both. A backup client using S3 storage must be configured according to the service provider's requirements; otherwise, when using a public cloud S3 service such as OSS, you will hit the error "Please use virtual hosted style to access".
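
To make the difference concrete, the fragment below shows how the same bucket might be described in an s3cmd configuration (.s3cfg) under each style; the hostnames are illustrative placeholders and must be replaced with your own endpoints:

# Path style (typical for a private MinIO deployment): the bucket appears in the path
#   https://minio.example.com:9000/examplebucket/destfolder/example.txt
host_base   = minio.example.com:9000
host_bucket = minio.example.com:9000

# Virtual hosted style (required by OSS and most public clouds): the bucket appears in the hostname
#   https://examplebucket.oss-cn-hangzhou.aliyuncs.com/destfolder/example.txt
host_base   = oss-cn-hangzhou.aliyuncs.com
host_bucket = %(bucket)s.oss-cn-hangzhou.aliyuncs.com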


References:


https://help.aliyun.com/document_detail/31834.html


https://cloud.tencent.com/document/product/436/41284


https://docs.aws.amazon.com/zh_cn/AmazonS3/latest/userguide/VirtualHosting.html

Internal domains

When using a public cloud S3 service, you need to distinguish whether the S3 endpoint domain refers to the internal network or the public network.


  • The internal network is the network between products in the same Alibaba Cloud region, for example an ECS instance accessing OSS in the same region.

  • The public network is the Internet, for example a private on-premises data center downloading a backup file from OSS over HTTP. Public-network access may additionally require configuring public-read or public-read-write permissions on the bucket.


Internal and public traffic is billed differently depending on the cloud vendor's pricing policy. For example, inbound and outbound traffic on the internal network may be free while requests are still billed; over the public network, inbound traffic (writes) is free, but outbound traffic (reads) is charged.


Cloud vendors usually document the different domains in their product documentation; configure them according to where the backup server sits relative to the OSS service. Alibaba Cloud OSS requires explicitly specifying different domains for internal and public access, whereas Tencent Cloud COS distinguishes the access path automatically by returning an internal or public IP through DNS resolution.
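
For example, with Alibaba Cloud OSS in the Hangzhou region, the br/dumpling commands shown later in this article would point --s3.endpoint at different domains depending on where the backup node runs (substitute your own region):

# Backup node is an ECS instance in the same region: use the internal endpoint (no public traffic charges)
--s3.endpoint="http://oss-cn-hangzhou-internal.aliyuncs.com"
# Backup node is outside the cloud (e.g. a local data center): use the public endpoint
--s3.endpoint="https://oss-cn-hangzhou.aliyuncs.com"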


References:


https://help.aliyun.com/document_detail/31837.htm


https://cloud.tencent.com/document/product/436/6224

Bucket data archiving and retention

When using an S3 service, the bucket lifecycle management capability can implement tiered storage and automatic deletion of backup data, for example moving backups from a warm tier to a cold tier after 7 days and deleting them after 30 days.
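
A rough sketch of such a rule, assuming an S3-compatible lifecycle API and the AWS CLI installed on the operations node (the endpoint, bucket, prefix, and the STANDARD_IA storage class name are examples only; each vendor uses its own tier names and tooling):

# Transition backups under the br/ prefix to a colder tier after 7 days and expire them after 30 days
cat > lifecycle.json <<'EOF'
{
  "Rules": [
    {
      "ID": "backup-retention",
      "Status": "Enabled",
      "Filter": { "Prefix": "br/" },
      "Transitions": [ { "Days": 7, "StorageClass": "STANDARD_IA" } ],
      "Expiration": { "Days": 30 }
    }
  ]
}
EOF
aws s3api put-bucket-lifecycle-configuration \
  --endpoint-url http://minio.example.com:9000 \
  --bucket examplebucket \
  --lifecycle-configuration file://lifecycle.json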


References:


https://help.aliyun.com/document_detail/169911.html


https://cloud.tencent.com/document/product/436/14605


https://min.io/docs/minio/kubernetes/upstream/administration/object-management/object-lifecycle-management.html

Automatic bucket replication

When using S3 storage, and especially when a single-site private MinIO cluster has no disaster recovery of its own, cross-region or cross-cluster bucket replication can provide site-level redundancy for backup data, for example replicating backups from the MinIO cluster in local data center A to the MinIO cluster in local data center B.
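
As a sketch of what this can look like for two MinIO clusters using the mc client (exact flags differ between mc releases, so treat this as an outline and check the replication documentation linked below; versioning must be enabled on the buckets first):

# Register the clusters, enable versioning, and replicate bucket "backup" from site A to site B
mc alias set minio-a http://minio-a.example.com:9000 ACCESS_KEY_A SECRET_KEY_A
mc version enable minio-a/backup
mc replicate add minio-a/backup \
  --remote-bucket "http://ACCESS_KEY_B:SECRET_KEY_B@minio-b.example.com:9000/backup" \
  --priority 1
# Verify the replication rule
mc replicate ls minio-a/backup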


References:


https://www.minio.org.cn/docs/minio/kubernetes/upstream/administration/bucket-replication.html


https://help.aliyun.com/document_detail/254865.html


https://cloud.tencent.com/document/product/436/19235

Orchestrating backup jobs

Public clouds usually provide operations orchestration platforms, such as Alibaba Cloud OOS or Tencent Cloud's BlueKing platform. Backup jobs can be scheduled through such a platform together with scripts on the nodes.
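
Where no orchestration platform is available, an ordinary crontab entry on the control machine achieves the same scheduling. A minimal sketch using the brbackupfull.sh script from a later section (the script and log paths are hypothetical):

# crontab -e
# Run the full backup at 02:00 every day and append the output to a log
0 2 * * * /bin/bash /root/scripts/brbackupfull.sh >> /root/scripts/brbackupfull_cron.log 2>&1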

Verification with s3cmd

Before configuring backup commands against an S3 bucket, it is strongly recommended to first verify the bucket's access permissions with s3cmd.
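
A quick verification sequence might look like the following (bucket name and prefix are placeholders); it confirms list, write, read, and delete permissions before any backup is attempted:

s3cmd ls s3://examplebucket/                                      # list permission
echo "probe" > /tmp/s3probe.txt
s3cmd put /tmp/s3probe.txt s3://examplebucket/br/s3probe.txt      # write permission
s3cmd get s3://examplebucket/br/s3probe.txt /tmp/s3probe.back.txt # read permission
s3cmd rm s3://examplebucket/br/s3probe.txt                        # delete permission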

Backing up with AK/SK

Taking Alibaba Cloud OSS as an example, the following shows TiDB backup operations using AK/SK access. Note that, following the rclone s3-force-path-style convention, path style is the default; this can be overridden either through parameters in the S3 URL or through the command-line setting s3.provider="alibaba".


The specific control logic can be found at: https://github.com/pingcap/tidb/blob/master/br/pkg/storage/s3.go#L167


        // In some cases, we need to set ForcePathStyle to false.
        // Refer to: https://rclone.org/s3/#s3-force-path-style
        if options.Provider == "alibaba" || options.Provider == "netease" ||
                options.UseAccelerateEndpoint {
                options.ForcePathStyle = false
        }
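
In practice this means there are two equivalent ways to tell BR (and Dumpling/Lightning) that the target service requires virtual hosted style. The flag form is used in the br and dumpling scripts below, and the URL-parameter form in the lightning command; for example:

# 1. Command-line flags
--storage "s3://${Bucket}/br/${CURDATE}" --s3.endpoint="http://${Endpoint}" --s3.provider="alibaba"
# 2. Query parameters on the storage URL
--storage "s3://${Bucket}/br/${CURDATE}?endpoint=http://${Endpoint}&provider=alibaba"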

br full backup script

# vi brbackupfull.sh
AccessKey=
SecretKey=
Bucket=
Endpoint=oss-cn-REGION-internal.aliyuncs.com
PDIP=
export AWS_ACCESS_KEY_ID=$AccessKey
export AWS_SECRET_ACCESS_KEY=$SecretKey
CURDATE=$(date +%Y%m%d%H%M%S)
/root/.tiup/components/br/{version}/br backup full --pd "${PDIP}" --storage "s3://${Bucket}/br/${CURDATE}" --s3.endpoint="http://${Endpoint}" --s3.provider="alibaba" --send-credentials-to-tikv=true --ratelimit 128 --log-file brbackupfull.log
echo s3://${Bucket}/br/${CURDATE}

br full restore script

# vi brrestorefull.sh
AccessKey=
SecretKey=
Bucket=
Endpoint=oss-cn-REGION-internal.aliyuncs.com
PDIP=
dir=
export AWS_ACCESS_KEY_ID=$AccessKey
export AWS_SECRET_ACCESS_KEY=$SecretKey
/root/.tiup/components/br/{version}/br restore full --pd "${PDIP}" --storage "s3://${Bucket}/br/${dir}" --s3.endpoint="http://${Endpoint}" --s3.provider="alibaba" --send-credentials-to-tikv=true --ratelimit 128 --log-file brrestorefull.log

dumpling full backup script

# vi dumplingfull.sh
AccessKey=
SecretKey=
Bucket=
Endpoint=oss-cn-REGION-internal.aliyuncs.com
TIDBIP=
BAKUSER=
BAKPW=
export AWS_ACCESS_KEY_ID=$AccessKey
export AWS_SECRET_ACCESS_KEY=$SecretKey
CURDATE=$(date +%Y%m%d%H%M%S)
/root/.tiup/components/dumpling/{version}/dumpling -u "${BAKUSER}" -P4000 -h ${TIDBIP} -p "${BAKPW}" --filetype sql -t 8 -o "s3://${Bucket}/dumpling/${CURDATE}" --s3.endpoint="http://${Endpoint}" --s3.provider="alibaba" -r 200000 -F 256MiB
# The abbreviated log is as follows:
[2022/03/24 13:57:42.941 +08:00] [INFO] [dump.go:103] ["begin to run Dump"] [conf="{\"s3\":{\"endpoint\":\"http://oss-cn-hangzhou-internal.aliyuncs.com\",\"region\":\"us-east-1\",\"storage-class\":\"\",\"sse\":\"\",\"sse-kms-key-id\":\"\",\"acl\":\"\",\"access-key\":\"\",\"secret-access-key\":\"\",\"provider\":\"alibaba\",\"force-path-style\":false,\"use-accelerate-endpoint\":false},\"gcs\":{\"endpoint\":\"\",\"storage-class\":\"\",\"predefined-acl\":\"\",\"credentials-file\":\"\"},\"azblob\":{\"endpoint\":\"\",\"account-name\":\"\",\"account-key\":\"\",\"access-tier\":\"\"},\"AllowCleartextPasswords\":false,\"SortByPk\":true,\"NoViews\":true,\"NoHeader\":false,\"NoSchemas\":false,\"NoData\":false,\"CompleteInsert\":false,\"TransactionalConsistency\":true,\"EscapeBackslash\":true,\"DumpEmptyDatabase\":true,\"PosAfterConnect\":false,\"CompressType\":0,\"Host\":\"10.0.1.32\",\"Port\":4000,\"Threads\":8,\"User\":\"root\",\"Security\":{\"CAPath\":\"\",\"CertPath\":\"\",\"KeyPath\":\"\"},\"LogLevel\":\"info\",\"LogFile\":\"\",\"LogFormat\":\"text\",\"OutputDirPath\":\"s3://tidb-test-a/dumpling/20220324135742\",\"StatusAddr\":\":8281\",\"Snapshot\":\"432039909896486919\",\"Consistency\":\"snapshot\",\"CsvNullValue\":\"\\\\N\",\"SQL\":\"\",\"CsvSeparator\":\",\",\"CsvDelimiter\":\"\\\"\",\"Databases\":[],\"Where\":\"\",\"FileType\":\"sql\",\"ServerInfo\":{\"ServerType\":3,\"ServerVersion\":\"5.4.0\",\"HasTiKV\":true},\"Rows\":200000,\"ReadTimeout\":900000000000,\"TiDBMemQuotaQuery\":0,\"FileSize\":268435456,\"StatementSize\":1000000,\"SessionParams\":{\"tidb_snapshot\":\"432039909896486919\"},\"Tables\":null,\"CollationCompatible\":\"loose\"}"]

lightning restore script

The parameters in the command can be converted into a toml file if needed.


It is recommended to provision the node running lightning with at least a 16-core specification, and to make sure the temporary directory sortedkvdir has enough capacity to hold one full copy of the data.
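
For reference, the command-line parameters of the local-backend invocation below could be expressed as a toml file roughly like this (a sketch only: the values are placeholders taken from the scripts and logs in this article, and the key names and accepted values should be checked against the tidb-lightning version in use):

# Write a tidb-lightning.toml equivalent to the local-backend command line, then run:
#   tidb-lightning --config tidb-lightning.toml
cat > tidb-lightning.toml <<'EOF'
[lightning]
level = "info"
file = "tidb-lightning-full.log"
status-addr = ":8239"

[tikv-importer]
backend = "local"
sorted-kv-dir = "/data/sorted-kv-dir"

[mydumper]
data-source-dir = "s3://examplebucket/dumpling/20220324135742?endpoint=http://oss-cn-hangzhou-internal.aliyuncs.com&provider=alibaba"

[tidb]
host = "10.0.1.32"
port = 4000
user = "root"
password = ""
pd-addr = "10.0.1.32:2379"

[post-restore]
checksum = "required"
analyze = "required"
EOF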


# vi lightning.sh
AccessKey=
SecretKey=
Bucket=
Endpoint=oss-cn-REGION-internal.aliyuncs.com
TIDBIP=
TIDBPORT=4000
BAKUSER=
BAKPW=
PDIP=
dir=
sortedkvdir=/data/sorted-kv-dir/${dir}
export AWS_ACCESS_KEY_ID=$AccessKey
export AWS_SECRET_ACCESS_KEY=$SecretKey
# tidb backend (commented out)
#/root/.tiup/components/tidb-lightning/{version}/tidb-lightning --backend tidb --log-file tidb-lightning-full.log --status-addr ":8239" -d "s3://${Bucket}/dumpling/${dir}/?endpoint=http://${Endpoint}&provider=alibaba" --tidb-host ${TIDBIP} --tidb-port ${TIDBPORT} --tidb-user ${BAKUSER} --tidb-password ${BAKPW} --pd-urls "${PDIP}:2379" --analyze required --checksum required
# local backend
/root/.tiup/components/tidb-lightning/{version}/tidb-lightning --backend local -sorted-kv-dir "${sortedkvdir}" --log-file tidb-lightning-full.log --status-addr ":8239" -d "s3://${Bucket}/dumpling/${dir}/?endpoint=http://${Endpoint}&provider=alibaba" --tidb-host ${TIDBIP} --tidb-user ${BAKUSER} --tidb-password ${BAKPW} --pd-urls "${PDIP}:2379" --analyze required --checksum required

Backing up with Alibaba Cloud RAM

Storing the AK/SK in an application's configuration files and using them to call Alibaba Cloud APIs has two problems: (1) confidentiality, since the keys may leak along with snapshots, images, and instances created from those images; and (2) operability, since rotating the AK means updating and redeploying every object that holds it. On Alibaba Cloud, attaching a RAM role to the ECS instance avoids both the leakage risk and the operational burden.


Starting with TiDB 6.1, the Dumpling and Lightning tools support Alibaba Cloud RAM; BR, whose workflow involves the send-credentials-to-tikv logic, does not support it yet.


The steps, in brief:


  1. Attach a RAM role granting access to the S3 storage to the ECS host.

  2. Use a backup script that requires no AK/SK configuration, as sketched below.
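
A sketch of what step 2 might look like for dumpling: it is the same script as the AK/SK version above, except that no access key or secret key is exported (this assumes the ECS instance's RAM role already grants read/write access to the bucket):

# vi dumplingfull_ram.sh
Bucket=
Endpoint=oss-cn-REGION-internal.aliyuncs.com
TIDBIP=
BAKUSER=
BAKPW=
CURDATE=$(date +%Y%m%d%H%M%S)
# No AWS_ACCESS_KEY_ID / AWS_SECRET_ACCESS_KEY exports: credentials come from the instance RAM role
/root/.tiup/components/dumpling/{version}/dumpling -u "${BAKUSER}" -P4000 -h ${TIDBIP} -p "${BAKPW}" \
  --filetype sql -t 8 -o "s3://${Bucket}/dumpling/${CURDATE}" \
  --s3.endpoint="http://${Endpoint}" --s3.provider="alibaba" -r 200000 -F 256MiB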

Summary

TiDB supports both NFS and S3 as backup storage. NFS has drawbacks such as strict requirements on operating-system user permissions and the difficulty of maintaining kernel-mode mounts. It is recommended to gradually move TiDB backups to S3 storage and, combined with bucket replication and data archiving, achieve better backup management.

Appendix: using s3cmd

Downloading s3cmd

For CentOS 7, the following address is recommended. Note that s3cmd is a noarch package, so in theory it is also compatible with arm64.


https://rhel.pkgs.org/7/epel-aarch64/s3cmd-2.0.2-1.el7.noarch.rpm.html


Official s3cmd download page:


https://s3tools.org/download

Common s3cmd commands

The commonly used commands are listed below:


# List all buckets
s3cmd ls
# Create a bucket (bucket names must be globally unique)
s3cmd mb s3://{BUCKETNAME}
# List the files in a bucket
s3cmd ls s3://{BUCKETNAME}/dumpling/
# Recursively delete a directory
s3cmd rm -r s3://{BUCKETNAME}/dumpling/
# Download backup files to a local directory in batch
s3cmd get s3://{BUCKETNAME}/dumpling/* ./
# Upload local files in batch
s3cmd put ./* s3://{BUCKETNAME}/dumpling/
# Show space usage
s3cmd du -H s3://{BUCKETNAME}/dumpling/

Typical s3cfg configuration

Taking Alibaba Cloud OSS as an example, a typical configuration produced by s3cmd --configure looks like the following.


# s3cmd --configure
Access Key: xxxx
Secret Key: xxxx
Default Region: US
S3 Endpoint: oss-cn-hangzhou-internal.aliyuncs.com
DNS-style bucket+hostname:port template for accessing a bucket: %(bucket)s.oss-cn-hangzhou-internal.aliyuncs.com
Encryption password:
Path to GPG program: /usr/bin/gpg
Use HTTPS protocol: True
HTTP Proxy server name:
HTTP Proxy server port: 0
Test access with supplied credentials? [Y/n] y
Please wait, attempting to list all buckets...
Success. Your access key and secret key worked fine :-)
Now verifying that encryption works...
Not configured. Never mind.
Save settings? [y/N] y
Configuration saved to '/root/.s3cfg'

Installation via the Python setup script

Download the s3cmd installation package and the offline dependency packages (whl files):
https://sourceforge.net/projects/s3tools/files/s3cmd/
https://pypi.org/project/python-magic/#files
https://pypi.org/project/six/#files
https://pypi.org/project/python-dateutil/#files


[root@host S3CMD]# yum install python-setuptools
Last metadata expiration check: 2:01:08 ago on Thu 25 May 2023 07:44:15 AM CST.
Package python-setuptools-44.1.1-1.oe1.noarch is already installed.
Dependencies resolved.
Nothing to do.
Complete!
[root@host S3CMD]# pip install python_magic-0.4.27-py2.py3-none-any.whl
DEPRECATION: Python 2.7 reached the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 is no longer maintained. pip 21.0 will drop support for Python 2.7 in January 2021. More details about Python 2 support in pip can be found at https://pip.pypa.io/en/latest/development/release-process/#python-2-support
WARNING: Running pip install with root privileges is generally not a good idea. Try pip install --user instead.
Processing ./python_magic-0.4.27-py2.py3-none-any.whl
Installing collected packages: python-magic
ERROR: After October 2020 you may experience errors when installing or updating packages. This is because pip will change the way that it resolves dependency conflicts.
We recommend you use --use-feature=2020-resolver to test your packages with the new resolver before it becomes the default.
s3cmd 2.1.0 requires python-dateutil, which is not installed.
Successfully installed python-magic-0.4.27
[root@host S3CMD]# pip install six-1.16.0-py2.py3-none-any.whl
DEPRECATION: Python 2.7 reached the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 is no longer maintained. pip 21.0 will drop support for Python 2.7 in January 2021. More details about Python 2 support in pip can be found at https://pip.pypa.io/en/latest/development/release-process/#python-2-support
WARNING: Running pip install with root privileges is generally not a good idea. Try pip install --user instead.
Processing ./six-1.16.0-py2.py3-none-any.whl
Installing collected packages: six
Successfully installed six-1.16.0
[root@host S3CMD]# pip install python_dateutil-2.8.2-py2.py3-none-any.whl
DEPRECATION: Python 2.7 reached the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 is no longer maintained. pip 21.0 will drop support for Python 2.7 in January 2021. More details about Python 2 support in pip can be found at https://pip.pypa.io/en/latest/development/release-process/#python-2-support
WARNING: Running pip install with root privileges is generally not a good idea. Try pip install --user instead.
Processing ./python_dateutil-2.8.2-py2.py3-none-any.whl
Requirement already satisfied: six>=1.5 in /usr/lib/python2.7/site-packages (from python-dateutil==2.8.2) (1.16.0)
Installing collected packages: python-dateutil
Successfully installed python-dateutil-2.8.2
[root@host S3CMD]# cd s3cmd-2.1.0/
[root@host s3cmd-2.1.0]# python setup.py install
Using xml.etree.ElementTree for XML processing
running install
running bdist_egg
running egg_info
writing requirements to s3cmd.egg-info/requires.txt
writing s3cmd.egg-info/PKG-INFO
writing top-level names to s3cmd.egg-info/top_level.txt
writing dependency_links to s3cmd.egg-info/dependency_links.txt
reading manifest file 's3cmd.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
writing manifest file 's3cmd.egg-info/SOURCES.txt'
installing library code to build/bdist.linux-x86_64/egg
running install_lib
running build_py
creating build/bdist.linux-x86_64/egg
creating build/bdist.linux-x86_64/egg/S3
copying build/lib/S3/Exceptions.py -> build/bdist.linux-x86_64/egg/S3
copying build/lib/S3/CloudFront.py -> build/bdist.linux-x86_64/egg/S3
copying build/lib/S3/Custom_httplib3x.py -> build/bdist.linux-x86_64/egg/S3
copying build/lib/S3/Config.py -> build/bdist.linux-x86_64/egg/S3
copying build/lib/S3/ACL.py -> build/bdist.linux-x86_64/egg/S3
copying build/lib/S3/HashCache.py -> build/bdist.linux-x86_64/egg/S3
copying build/lib/S3/ExitCodes.py -> build/bdist.linux-x86_64/egg/S3
copying build/lib/S3/Progress.py -> build/bdist.linux-x86_64/egg/S3
copying build/lib/S3/S3.py -> build/bdist.linux-x86_64/egg/S3
copying build/lib/S3/init.py -> build/bdist.linux-x86_64/egg/S3
copying build/lib/S3/Utils.py -> build/bdist.linux-x86_64/egg/S3
copying build/lib/S3/SortedDict.py -> build/bdist.linux-x86_64/egg/S3
copying build/lib/S3/S3Uri.py -> build/bdist.linux-x86_64/egg/S3
copying build/lib/S3/FileDict.py -> build/bdist.linux-x86_64/egg/S3
copying build/lib/S3/Crypto.py -> build/bdist.linux-x86_64/egg/S3
copying build/lib/S3/PkgInfo.py -> build/bdist.linux-x86_64/egg/S3
copying build/lib/S3/ConnMan.py -> build/bdist.linux-x86_64/egg/S3
copying build/lib/S3/FileLists.py -> build/bdist.linux-x86_64/egg/S3
copying build/lib/S3/AccessLog.py -> build/bdist.linux-x86_64/egg/S3
copying build/lib/S3/Custom_httplib27.py -> build/bdist.linux-x86_64/egg/S3
copying build/lib/S3/MultiPart.py -> build/bdist.linux-x86_64/egg/S3
copying build/lib/S3/BidirMap.py -> build/bdist.linux-x86_64/egg/S3
byte-compiling build/bdist.linux-x86_64/egg/S3/Exceptions.py to Exceptions.pyc
byte-compiling build/bdist.linux-x86_64/egg/S3/CloudFront.py to CloudFront.pyc
byte-compiling build/bdist.linux-x86_64/egg/S3/Custom_httplib3x.py to Custom_httplib3x.pyc
byte-compiling build/bdist.linux-x86_64/egg/S3/Config.py to Config.pyc
byte-compiling build/bdist.linux-x86_64/egg/S3/ACL.py to ACL.pyc
byte-compiling build/bdist.linux-x86_64/egg/S3/HashCache.py to HashCache.pyc
byte-compiling build/bdist.linux-x86_64/egg/S3/ExitCodes.py to ExitCodes.pyc
byte-compiling build/bdist.linux-x86_64/egg/S3/Progress.py to Progress.pyc
byte-compiling build/bdist.linux-x86_64/egg/S3/S3.py to S3.pyc
byte-compiling build/bdist.linux-x86_64/egg/S3/init.py to init.pyc
byte-compiling build/bdist.linux-x86_64/egg/S3/Utils.py to Utils.pyc
byte-compiling build/bdist.linux-x86_64/egg/S3/SortedDict.py to SortedDict.pyc
byte-compiling build/bdist.linux-x86_64/egg/S3/S3Uri.py to S3Uri.pyc
byte-compiling build/bdist.linux-x86_64/egg/S3/FileDict.py to FileDict.pyc
byte-compiling build/bdist.linux-x86_64/egg/S3/Crypto.py to Crypto.pyc
byte-compiling build/bdist.linux-x86_64/egg/S3/PkgInfo.py to PkgInfo.pyc
byte-compiling build/bdist.linux-x86_64/egg/S3/ConnMan.py to ConnMan.pyc
byte-compiling build/bdist.linux-x86_64/egg/S3/FileLists.py to FileLists.pyc
byte-compiling build/bdist.linux-x86_64/egg/S3/AccessLog.py to AccessLog.pyc
byte-compiling build/bdist.linux-x86_64/egg/S3/Custom_httplib27.py to Custom_httplib27.pyc
byte-compiling build/bdist.linux-x86_64/egg/S3/MultiPart.py to MultiPart.pyc
byte-compiling build/bdist.linux-x86_64/egg/S3/BidirMap.py to BidirMap.pyc
installing package data to build/bdist.linux-x86_64/egg
running install_data
creating build/bdist.linux-x86_64/egg/share
creating build/bdist.linux-x86_64/egg/share/doc
creating build/bdist.linux-x86_64/egg/share/doc/packages
creating build/bdist.linux-x86_64/egg/share/doc/packages/s3cmd
copying README.md -> build/bdist.linux-x86_64/egg/share/doc/packages/s3cmd
copying INSTALL.md -> build/bdist.linux-x86_64/egg/share/doc/packages/s3cmd
copying LICENSE -> build/bdist.linux-x86_64/egg/share/doc/packages/s3cmd
copying NEWS -> build/bdist.linux-x86_64/egg/share/doc/packages/s3cmd
creating build/bdist.linux-x86_64/egg/share/man
creating build/bdist.linux-x86_64/egg/share/man/man1
copying s3cmd.1 -> build/bdist.linux-x86_64/egg/share/man/man1
creating build/bdist.linux-x86_64/egg/EGG-INFO
installing scripts to build/bdist.linux-x86_64/egg/EGG-INFO/scripts
running install_scripts
running build_scripts
creating build/bdist.linux-x86_64/egg/EGG-INFO/scripts
copying build/scripts-2.7/s3cmd -> build/bdist.linux-x86_64/egg/EGG-INFO/scripts
changing mode of build/bdist.linux-x86_64/egg/EGG-INFO/scripts/s3cmd to 755
copying s3cmd.egg-info/PKG-INFO -> build/bdist.linux-x86_64/egg/EGG-INFO
copying s3cmd.egg-info/SOURCES.txt -> build/bdist.linux-x86_64/egg/EGG-INFO
copying s3cmd.egg-info/dependency_links.txt -> build/bdist.linux-x86_64/egg/EGG-INFO
copying s3cmd.egg-info/requires.txt -> build/bdist.linux-x86_64/egg/EGG-INFO
copying s3cmd.egg-info/top_level.txt -> build/bdist.linux-x86_64/egg/EGG-INFO
zip_safe flag not set; analyzing archive contents...
creating 'dist/s3cmd-2.1.0-py2.7.egg' and adding 'build/bdist.linux-x86_64/egg' to it
removing 'build/bdist.linux-x86_64/egg' (and everything under it)
Processing s3cmd-2.1.0-py2.7.egg
Removing /usr/lib/python2.7/site-packages/s3cmd-2.1.0-py2.7.egg
Copying s3cmd-2.1.0-py2.7.egg to /usr/lib/python2.7/site-packages
s3cmd 2.1.0 is already the active version in easy-install.pth
Installing s3cmd script to /usr/bin
Installed /usr/lib/python2.7/site-packages/s3cmd-2.1.0-py2.7.egg
Processing dependencies for s3cmd==2.1.0
Searching for python-magic==0.4.27
Best match: python-magic 0.4.27
Adding python-magic 0.4.27 to easy-install.pth file
Using /usr/lib/python2.7/site-packages
Searching for python-dateutil==2.8.2
Best match: python-dateutil 2.8.2
Adding python-dateutil 2.8.2 to easy-install.pth file
Using /usr/lib/python2.7/site-packages
Searching for six==1.16.0
Best match: six 1.16.0
Adding six 1.16.0 to easy-install.pth file
Using /usr/lib/python2.7/site-packages
Finished processing dependencies for s3cmd==2.1.0
[root@host s3cmd-2.1.0]#

