
Migrating S3 Buckets Between AWS Accounts Like a Pro (Without Losing Your Sanity)

News Room · Published 11 August 2025 · Last updated 6:57 AM

Moving an Amazon S3 bucket from one AWS account to another sounds simple… until you try it with 4–5 terabytes of data.

At that scale, the usual methods — aws s3 cp or cross-account replication — feel like mailing every single file via carrier pigeon. They’re fine for gigabytes, but you’ll be waiting days (and paying more than you’d like) for terabytes.

That’s where AWS DataSync comes in. In my case, it was 10× faster than the “normal” way, fully automated, and secure. Here’s how I did it.


Why DataSync for Large Migrations?

AWS DataSync is built for bulk movement of data — think terabytes to petabytes. It:

  • Moves data directly between AWS services (or from on-premises to AWS) with no intermediate staging storage.
  • Parallelizes transfers, making them much faster.
  • Handles metadata, object tags, and ACLs automatically.
  • Can be run incrementally so you can do a cutover with minimal downtime.

When you’re moving 4–5 TB between AWS accounts, those benefits matter.


Step 1: Prepare the Buckets

We need two buckets:

  • Source bucket in the original AWS account (the one holding your 5 TB of data).
  • Destination bucket in the target AWS account.

Both must:

  • Exist before starting (a creation sketch follows this list).
  • Be in the same AWS region if you want max speed & lower cost.
  • Versioning is optional (DataSync doesn't require it, unlike replication).
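
If the destination bucket doesn't exist yet, creating it is one command. A minimal sketch, assuming the destination account's credentials and the same placeholder names used in the policies below (note that real bucket names can't contain underscores):

# Run with destination-account credentials
aws s3api create-bucket \
  --bucket dist_bucket \
  --region us-east-1
# Outside us-east-1, also pass:
#   --create-bucket-configuration LocationConstraint=<region>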

Step 2: IAM Roles and Permissions

DataSync needs permission to read from the source and write to the destination — across accounts. This is where IAM roles and bucket policies come in.


Source Bucket Policy (Allow destination account’s DataSync role and logged-in user to read & list objects)

{
 "Version": "2012-10-17",
 "Statement": [
  {
   "Effect": "Allow",
   "Principal": {
    "AWS": [
     "arn:aws:iam::DIST_ACCOUNT_ID:role/datasync-role",
     "arn:aws:iam::DIST_ACCOUNT_ID:user/distention_account_logged_in_user"
    ]
   },
   "Action": [
    "s3:GetBucketLocation",
    "s3:ListBucket",
    "s3:ListBucketMultipartUploads"
   ],
   "Resource": "arn:aws:s3:::source_bucket"
  },
  {
   "Effect": "Allow",
   "Principal": {
    "AWS": [
     "arn:aws:iam::DIST_ACCOUNT_ID:role/datasync-role",
     "arn:aws:iam::DIST_ACCOUNT_ID:user/distention_account_logged_in_user"
    ]
   },
   "Action": [
    "s3:AbortMultipartUpload",
    "s3:DeleteObject",
    "s3:GetObject",
    "s3:ListMultipartUploadParts",
    "s3:PutObjectTagging",
    "s3:GetObjectTagging",
    "s3:PutObject"
   ],
   "Resource": "arn:aws:s3:::source_bucket/*"
  }
 ]
}

Destination Account Role Policy (This is the role DataSync uses to access the source bucket)

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Action": [
                "s3:GetBucketLocation",
                "s3:ListBucket",
                "s3:ListBucketMultipartUploads"
            ],
            "Effect": "Allow",
            "Resource": "arn:aws:s3:::source_bucket"
        },
        {
            "Action": [
                "s3:AbortMultipartUpload",
                "s3:DeleteObject",
                "s3:GetObject",
                "s3:ListMultipartUploadParts",
                "s3:PutObject",
                "s3:GetObjectTagging",
                "s3:ListBucket",
                "s3:PutObjectTagging"
            ],
            "Effect": "Allow",
            "Resource": "arn:aws:s3:::source_bucket/*"
        }
    ]
}
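
One easily missed piece: the datasync-role also needs a trust policy so the DataSync service can assume it, and the access policy above has to be attached to the role. A minimal sketch, run in the destination account (the policy name and file name are placeholders):

# Create the role with a trust policy for the DataSync service
aws iam create-role \
  --role-name datasync-role \
  --assume-role-policy-document '{
    "Version": "2012-10-17",
    "Statement": [
      {
        "Effect": "Allow",
        "Principal": { "Service": "datasync.amazonaws.com" },
        "Action": "sts:AssumeRole"
      }
    ]
  }'

# Attach the access policy shown above (saved to a local file)
aws iam put-role-policy \
  --role-name datasync-role \
  --policy-name datasync-source-access \
  --policy-document file://datasync-role-policy.json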

Destination Bucket Policy (Allow DataSync role and destination account user to write data)

{
 "Version": "2008-10-17",
 "Statement": [
  {
   "Sid": "DataSyncCreateS3LocationAndTaskAccess",
   "Effect": "Allow",
   "Principal": {
    "AWS": [
     "arn:aws:iam::DIST_ACCOUNT_ID:role/datasync-role",
     "arn:aws:iam::DIST_ACCOUNT_ID:user/distention_account_logged_in_user"
    ]
   },
   "Action": [
    "s3:GetBucketLocation",
    "s3:ListBucket",
    "s3:ListBucketMultipartUploads",
    "s3:AbortMultipartUpload",
    "s3:DeleteObject",
    "s3:GetObject",
    "s3:ListMultipartUploadParts",
    "s3:PutObject",
    "s3:GetObjectTagging",
    "s3:PutObjectTagging"
   ],
   "Resource": [
    "arn:aws:s3:::dist_bucket",
    "arn:aws:s3:::dist_bucket/*"
   ]
  }
 ]
}
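
To apply the two bucket policies, save each JSON document to a file and attach it with put-bucket-policy; run the first command with source-account credentials and the second with destination-account credentials (file names are placeholders):

# In the source account
aws s3api put-bucket-policy \
  --bucket source_bucket \
  --policy file://source-bucket-policy.json

# In the destination account
aws s3api put-bucket-policy \
  --bucket dist_bucket \
  --policy file://dist-bucket-policy.json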

Step 3: Create the DataSync Locations

We need two “locations” — one for the source bucket, one for the destination.

Example AWS CLI command for the source bucket (run it from the destination account, since that's where the datasync-role lives):

aws datasync create-location-s3 \
  --s3-bucket-arn arn:aws:s3:::source_bucket \
  --s3-storage-class STANDARD \
  --s3-config BucketAccessRoleArn="arn:aws:iam::DIST_ACCOUNT_ID:role/datasync-role" \
  --region us-east-1

Repeat for the destination bucket with its own bucket ARN.
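
For example, the destination location might look like this (a sketch using the same placeholders as above):

aws datasync create-location-s3 \
  --s3-bucket-arn arn:aws:s3:::dist_bucket \
  --s3-storage-class STANDARD \
  --s3-config BucketAccessRoleArn="arn:aws:iam::DIST_ACCOUNT_ID:role/datasync-role" \
  --region us-east-1

Each call returns a LocationArn; keep both, because the task in Step 4 references them.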


Step 4: Create the DataSync Task

Once both locations are created, set up a DataSync task to copy objects from the source location to the destination location; a CLI sketch follows the list below.

You can:

  • Enable metadata copy (preserves timestamps, tags).
  • Run incremental syncs until final cutover.
  • Use the console to monitor transfer speed and completion.
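
A minimal sketch of the task creation, assuming the two LocationArn values returned in Step 3 (the ARNs here are placeholders):

aws datasync create-task \
  --source-location-arn arn:aws:datasync:us-east-1:DIST_ACCOUNT_ID:location/loc-source-id \
  --destination-location-arn arn:aws:datasync:us-east-1:DIST_ACCOUNT_ID:location/loc-dist-id \
  --name s3-migration-task \
  --options VerifyMode=POINT_IN_TIME_CONSISTENT,OverwriteMode=ALWAYS,TransferMode=CHANGED \
  --tags Key=project,Value=s3-migration

TransferMode=CHANGED (the default) is what makes incremental re-runs cheap: only objects that differ get copied again.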

Step 5: Run and Monitor the Transfer

For 4–5 TB, you’re not finishing in minutes, but you’ll still see a huge improvement over traditional methods. In my migration, the speed difference was dramatic — hours instead of days.
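
Starting a run and polling its progress can both be done from the CLI; a sketch with placeholder ARNs:

# Kick off one execution of the task
aws datasync start-task-execution \
  --task-arn arn:aws:datasync:us-east-1:DIST_ACCOUNT_ID:task/task-id

# Check status, bytes transferred, and throughput
aws datasync describe-task-execution \
  --task-execution-arn arn:aws:datasync:us-east-1:DIST_ACCOUNT_ID:task/task-id/execution/exec-id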


Pro Tips for Large Transfers

  1. Same region is cheaper & faster — cross-region transfers can double the bill.
  2. Incremental runs are your friend — run DataSync multiple times before final cutover so you only copy changes on the last run.
  3. Keep IAM tight — you’re opening cross-account access; remove it after the migration.
  4. Tag the migration so you can track transfer costs in AWS Cost Explorer (see the sketch after this list).
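
Tags can also be added to an existing task after the fact; a sketch with a placeholder ARN:

aws datasync tag-resource \
  --resource-arn arn:aws:datasync:us-east-1:DIST_ACCOUNT_ID:task/task-id \
  --tags Key=project,Value=s3-migration

Remember to activate the tag as a cost allocation tag in the Billing console, or it won't appear in Cost Explorer.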

Final Thoughts

Migrating 5 TB of S3 data between AWS accounts used to mean choosing between slow and free-ish or fast and expensive. DataSync changes that equation — giving you fast, secure, repeatable migrations without manual scripting or downtime.

Think of it as upgrading from carrying buckets of water to installing a high-pressure pipeline. Once you’ve used it, you won’t go back.
