AWS S3 Bucket Replication and Permission Synchronization


1. Bucket Replication

S3 replication comes in two types: SRR (Same-Region Replication) and CRR (Cross-Region Replication).

The setup steps for SRR and CRR are covered in the official AWS documentation, so they are not repeated here:

Replicating objects - Amazon Simple Storage Service

Replicating existing objects with S3 Batch Replication - Amazon Simple Storage Service

Granting permissions for Amazon S3 Batch Operations - Amazon Simple Storage Service
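As a rough illustration of the API involved, a replication rule can be applied with boto3's `put_bucket_replication`. This is a minimal sketch: the bucket name, destination ARN, IAM role ARN, and rule ID below are placeholders, and versioning must already be enabled on both buckets.

```python
# Placeholder values -- replace with your own bucket, role ARN, and rule ID.
SOURCE_BUCKET = 'my-source-bucket'
TARGET_BUCKET_ARN = 'arn:aws:s3:::my-target-bucket'
REPLICATION_ROLE_ARN = 'arn:aws:iam::123456789012:role/my-replication-role'

replication_config = {
    'Role': REPLICATION_ROLE_ARN,
    'Rules': [
        {
            'ID': 'replication-rule-1',
            'Status': 'Enabled',
            'Priority': 1,
            'Filter': {},  # empty filter: replicate the whole bucket
            'DeleteMarkerReplication': {'Status': 'Disabled'},
            'Destination': {'Bucket': TARGET_BUCKET_ARN},
        }
    ],
}


def apply_replication(s3_client, bucket, config):
    """Apply the replication configuration to the source bucket."""
    s3_client.put_bucket_replication(
        Bucket=bucket, ReplicationConfiguration=config)

# Example (requires real credentials, buckets, and an IAM role):
# import boto3
# apply_replication(boto3.client('s3'), SOURCE_BUCKET, replication_config)
```

The `Destination.Bucket` value is what decides SRR vs CRR: pointing it at a bucket in another region makes the same rule a cross-region one.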

SRR can replicate object permissions, so no extra permission-sync step is needed.

CRR cannot replicate any permissions other than the owner's, so the remaining grants have to be synchronized separately, which means writing a batch permission-sync script.
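The core decision such a script has to make is whether a given object is already public-read. A minimal helper for that check is sketched below; the grant dicts follow the shape that boto3's `get_object_acl()['Grants']` (or the `grants` attribute of an `ObjectAcl` resource) returns:

```python
ALL_USERS_URI = 'http://acs.amazonaws.com/groups/global/AllUsers'


def is_public_read(grants):
    """Return True if any grant gives READ to the AllUsers group.

    `grants` is a list of grant dicts as returned by boto3's
    get_object_acl()['Grants'].
    """
    for grant in grants or []:
        grantee = grant.get('Grantee', {})
        if (grant.get('Permission') == 'READ'
                and grantee.get('Type') == 'Group'
                and grantee.get('URI') == ALL_USERS_URI):
            return True
    return False


# Example grant lists:
public = [{'Grantee': {'Type': 'Group', 'URI': ALL_USERS_URI},
           'Permission': 'READ'}]
private = [{'Grantee': {'Type': 'CanonicalUser', 'ID': 'abc'},
            'Permission': 'FULL_CONTROL'}]
```

Checking the grantee URI, not just the permission string, avoids mistaking a READ grant to a specific user for a public-read grant.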

Below is a sample script that synchronizes the public READ grant, for reference:

#!/usr/bin/python3
# -*- coding: utf-8 -*-

# Copyright WUZL. or its affiliates. All Rights Reserved.
# SPDX-License-Identifier: Apache-2.0

"""
Purpose
Show how to use AWS SDK for Python (Boto3) with Amazon Simple Storage Service
(Amazon S3) to perform basic object acl operations, Synchronize the public read permissions of the source and target buckets. 
"""

import json
import logging

# The AWS boto3 SDK package must be installed first: pip3 install boto3
import boto3
from botocore.exceptions import ClientError

logger = logging.getLogger(__name__)
logger.setLevel(logging.DEBUG)
# File handler: write DEBUG and above to a log file
fh = logging.FileHandler("boto3_s3_object_acl_modi.log")
fh.setLevel(logging.DEBUG)
# Stream handler: print ERROR and above to the console
ch = logging.StreamHandler()
ch.setLevel(logging.ERROR)
# Set the log format
formatter = logging.Formatter("%(asctime)s - %(name)s - %(levelname)s - %(lineno)s %(message)s", datefmt="%Y-%m-%d %H:%M:%S")
ch.setFormatter(formatter)
fh.setFormatter(formatter)
# Attach both handlers to the logger
logger.addHandler(ch)
logger.addHandler(fh)
# Logging usage examples:
# logger.debug("debug message")
# logger.info("info message")
# logger.warn("warn message")
# logger.error("error message")
# logger.critical("critical message")

# snippet-start:[python.example_code.s3.helper.ObjectWrapper]

# Adjust these values for your environment
s_region_name = 'eu-west-2'
s_aws_access_key_id = 'xxx'
s_aws_secret_access_key = 'xxx'
s_bucket = 'xxx'

t_region_name = 'us-west-2'
t_awt_accest_key_id = 'xxx'
t_awt_secret_accest_key = 'xxx'
target_bucket = 'xxx'


class ObjectWrapper:
    """Encapsulates S3 object actions."""
    def __init__(self, s3_object):
        """
        :param s3_object: A Boto3 Object resource. This is a high-level resource in Boto3
                          that wraps object actions in a class-like structure.
        """
        self.object = s3_object
        self.key = self.object.key
# snippet-end:[python.example_code.s3.helper.ObjectWrapper]

# snippet-start:[python.example_code.s3.GetObject]
    def get(self):
        """
        Gets the object.

        :return: The object data in bytes.
        """
        try:
            body = self.object.get()['Body'].read()
            logger.info(
                "Got object '%s' from bucket '%s'.",
                self.object.key, self.object.bucket_name)
        except ClientError:
            logger.exception(
                "Couldn't get object '%s' from bucket '%s'.",
                self.object.key, self.object.bucket_name)
            raise
        else:
            return body
# snippet-end:[python.example_code.s3.GetObject]

# snippet-start:[python.example_code.s3.ListObjects]
    @staticmethod
    def list(bucket, prefix=None):
        """
        Lists the objects in a bucket, optionally filtered by a prefix.

        :param bucket: The bucket to query. This is a Boto3 Bucket resource.
        :param prefix: When specified, only objects that start with this prefix are listed.
        :return: The list of objects.
        """
        try:
            if not prefix:
                objects = list(bucket.objects.all())
            else:
                objects = list(bucket.objects.filter(Prefix=prefix))
            # logger.info("Got objects %s from bucket '%s'", [o.key for o in objects], bucket.name)
            logger.info("Got objects from bucket '%s'", bucket.name)
        except ClientError:
            logger.exception("Couldn't get objects for bucket '%s'.", bucket.name)
            raise
        else:
            return objects
# snippet-end:[python.example_code.s3.ListObjects]

# snippet-start:[python.example_code.s3.ListObjectsKeys]
    @staticmethod
    def list_all_keys(bucket, prefix=None):
        """
        Lists the ListObjectsKeys in a bucket, optionally filtered by a prefix.

        :param bucket: The bucket to query. This is a Boto3 Bucket resource.
        :param prefix: When specified, only objects that start with this prefix are listed.
        :return: The list of objects.
        """
        try:
            if not prefix:
                objects = list(bucket.objects.all())
            else:
                objects = list(bucket.objects.filter(Prefix=prefix))
            all_keys = [o.key for o in objects]
            # logger.info("Got objects %s from bucket '%s'", [o.key for o in objects], bucket.name)
            logger.info("Got objects list from bucket '%s'", bucket.name)
        except ClientError:
            logger.exception("Couldn't get objects for bucket '%s'.", bucket.name)
            raise
        else:
            return all_keys
# snippet-end:[python.example_code.s3.ListObjectsKeys]

# snippet-start:[python.example_code.s3.GetObjectAcl]
    def get_acl(self):
        """
        Gets the ACL of the object.

        :return: The ACL of the object.
        """
        try:
            acl = self.object.Acl()
            # logger.info("Got ACL for object %s owned by %s.", self.object.key, acl.owner['DisplayName'])
        except ClientError:
            logger.exception("Couldn't get ACL for object %s.", self.object.key)
            raise
        else:
            return acl
# snippet-end:[python.example_code.s3.GetObjectAcl]

# snippet-start:[python.example_code.s3.PutObjectAcl]
    def put_acl(self, uri):
        """
        Applies an ACL to the object that grants READ access to the group
        identified by the given URI.

        :param uri: The URI of the group to grant READ access, for example
                    'http://acs.amazonaws.com/groups/global/AllUsers'.
        """
        try:
            acl = self.object.Acl()
            # Putting an ACL overwrites the existing ACL, so append new grants
            # if you want to preserve existing grants.
            grants = acl.grants if acl.grants else []
            grants.append({'Grantee': {'Type': 'Group', 'URI': uri}, 'Permission': 'READ'})
            acl.put(
                AccessControlPolicy={
                    'Grants': grants,
                    'Owner': acl.owner
                }
            )
            # logger.info("Granted read access to %s.", uri)
        except ClientError:
            logger.exception("Couldn't add ACL to object '%s'.", self.object.key)
            raise
# snippet-end:[python.example_code.s3.PutObjectAcl]


# snippet-start:[python.example_code.s3.Scenario_ObjectManagement]
def usage_demo():
    # print('-'*88)
    # print("Welcome to the Amazon S3 object acl modi demo!")
    # print('-'*88)

    # logging.basicConfig(level=logging.INFO, format='%(levelname)s: %(message)s')
    # LOG_FORMAT = "%(asctime)s - %(levelname)s - %(message)s"
    # logging.basicConfig(filename='boto3_s3_object_acl_modi.log', level=logging.DEBUG, format=LOG_FORMAT)
    
    # s3_client = boto3.client('s3', region_name=s_region_name, aws_access_key_id=s_aws_access_key_id, aws_secret_access_key=s_aws_secret_access_key)
    # response = s3_client.list_buckets()
    # print('Existing buckets:')
    # for bucket in response['Buckets']:
    #     print(f'  {bucket["Name"]}')

    s3_resource = boto3.resource('s3', region_name=s_region_name, aws_access_key_id=s_aws_access_key_id, aws_secret_access_key=s_aws_secret_access_key)
    bucket = s3_resource.Bucket(s_bucket)
    # print(dir(bucket))

    t_s3_resource = boto3.resource('s3', region_name=t_region_name, aws_access_key_id=t_awt_accest_key_id, aws_secret_access_key=t_awt_secret_accest_key)
    t_bucket = t_s3_resource.Bucket(target_bucket)
    # print(dir(t_bucket))
    # t_objects = ObjectWrapper.list(t_bucket)
    # print(t_objects)

    # Collect the source object keys whose ACLs grant public READ
    objects = ObjectWrapper.list(bucket)
    all_keys = ObjectWrapper.list_all_keys(bucket)
    # print(objects)
    try:
        keys=[]
        len_all_keys = len(all_keys)
        logger.info("len_all_keys: %s", len_all_keys)
        for object_summary in objects:
            len_all_keys = len_all_keys - 1
            logger.info("left_keys: %s", len_all_keys)
            key=str(object_summary.key)
            # logger.info("object_key: '%s'", key)
            # print(key+':')
            object_acl = object_summary.Acl()
            # print(object_acl)
            # print(object_acl.grants)
            # logger.info("object_grants: '%s'", str(object_acl.grants))
            for grant in object_acl.grants:
                # Only collect keys whose ACL grants READ to the AllUsers group (public read)
                if (grant['Permission'] == 'READ'
                        and grant.get('Grantee', {}).get('URI') == 'http://acs.amazonaws.com/groups/global/AllUsers'):
                    keys.append(key)
                    break
    except ClientError as error:
        print(error)
        
    # print(keys)
    logger.info("keys list len: %s", len(keys))
    logger.info("source keys: %s", keys)
    
    logger.info("Modi target bucket object grants:")
    
    # Determine which target object keys need their ACL modified
    t_objects = ObjectWrapper.list(t_bucket)
    # print(t_objects)
    # exit()
    t_all_keys = ObjectWrapper.list_all_keys(t_bucket)
    logger.info("t_all_keys list len: %s", len(t_all_keys))
    try:
        modi_keys=[]
        t_keys=[]
        tmp_keys = []
        for tmp_key in keys:
            tmp_keys.append(tmp_key)
        len_left_t_keys = len(keys)
        logger.info("len_left_t_keys: %s", len_left_t_keys)
        for key in keys:
            # logger.info("len of keys: %s, keys: %s", len(keys), keys)
            len_left_t_keys = len_left_t_keys - 1
            logger.info("len_left_t_keys: %s", len_left_t_keys)
            if key in t_all_keys:
                t_key=key
                object_summary = t_s3_resource.ObjectSummary(target_bucket,t_key)                
                # logger.info("t_object_key: '%s'", t_key)
                # print(key+':')
                object_acl = object_summary.Acl()
                # print(object_acl)
                # print(object_acl.grants)
                # logger.info("object_grants: '%s'", str(object_acl.grants))
                # t_keys.append(t_key)
                # logger.info("len of t_keys: %s, t_keys: %s", len(t_keys), t_keys)
                for grant in object_acl.grants:
                    # Skip target objects that already carry the public READ grant
                    if (grant['Permission'] == 'READ'
                            and grant.get('Grantee', {}).get('URI') == 'http://acs.amazonaws.com/groups/global/AllUsers'):
                        tmp_keys.remove(t_key)
                        break
                    
            # logger.info("len of tmp_keys: %s, keys: %s", len(tmp_keys), tmp_keys)
            modi_keys=tmp_keys
        logger.info("len of modi_keys: %s ,modi_keys: '%s'", len(modi_keys), str(modi_keys))
    except ClientError as error:
        print(error)
    
    len_left_modi_keys = len(modi_keys)
    for key in modi_keys:
        len_left_modi_keys = len_left_modi_keys - 1
        logger.info("len_left_modi_keys: %s", len_left_modi_keys)
        object_key = key
        # print(object_key)
        obj_wrapper = ObjectWrapper(t_bucket.Object(object_key))
        # print(t_bucket.Object(object_key))
        object_acl = t_bucket.Object(object_key).Acl()
        # print(object_acl)
        # print(object_acl.grants)
        try:
            obj_wrapper.put_acl(uri='http://acs.amazonaws.com/groups/global/AllUsers')
            acl = obj_wrapper.get_acl()
            # logger.info("Put ACL grants on object '%s': '%s'", str(obj_wrapper.key), str(json.dumps(acl.grants)))
            logger.info("Put ACL grants on object '%s'", obj_wrapper.key)
        except ClientError:
            # put_acl() has already logged the failure details; re-raise to stop.
            raise


# snippet-end:[python.example_code.s3.Scenario_ObjectManagement]


if __name__ == '__main__':
    usage_demo()
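The key-diff step in `usage_demo` (walking `keys` and pruning `tmp_keys` with `list.remove`) can be expressed more compactly with set operations. This is a sketch, not part of the original script; it additionally restricts the result to keys that actually exist in the target, so that `put_acl` is never called on a missing key:

```python
def keys_to_modify(source_public_keys, target_all_keys, target_public_keys):
    """Keys that are public-read in the source, exist in the target,
    and are not yet public-read in the target."""
    return sorted(
        (set(source_public_keys) & set(target_all_keys))
        - set(target_public_keys)
    )


# Example:
src_pub = ['a.txt', 'b.txt', 'c.txt']   # public-read in the source bucket
tgt_all = ['a.txt', 'b.txt', 'd.txt']   # all keys present in the target bucket
tgt_pub = ['a.txt']                     # already public-read in the target
# keys_to_modify(src_pub, tgt_all, tgt_pub) -> ['b.txt']
```

Set intersection and difference are O(n) on average, which matters when buckets hold many thousands of objects.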

Code references:

S3 — Boto3 Docs 1.26.26 documentation

aws-doc-sdk-examples/object_wrapper.py at main · awsdocs/aws-doc-sdk-examples · GitHub
