Working With Files and Folders in S3, Using AWS SDK for .NET

This tutorial explains some basic file/folder operations in an AWS S3 bucket using the AWS SDK for .NET (C#). First, we create a directory in S3 and upload a file to it; then we list the contents of the directory and finally delete the file and the folder. We show these operations using both the low-level and the high-level APIs.

Introduction 

 
In this tutorial, I am going to show you how to use AWS SDK for .NET to do some basic file operations on an S3 bucket. AWS provides both low-level and high-level APIs. First, we are going to see how to use low-level APIs and then we will perform the same operations using high-level APIs.
 
In order to run the following code, you need to install the AWSSDK.S3 NuGet package.
 
I have created an S3 bucket named my-bucket-name-123, and inside it I have created a folder named my-folder.
 
 

Low-Level APIs

 
The low-level APIs map closely to the underlying REST API: we provide the request information in a Request object, and AWS responds with a Response object.
 

Understanding the S3 path (when using the low-level API)

 
First, we need to understand that there is no concept of a folder in S3; everything is an object. If I want to create a folder called sub-folder, I need to append a / to the folder name to let AWS know that what I want is a folder, not a file... so I need to create,
 
my-bucket-name-123/my-folder/sub-folder/
 
If I don't include the trailing slash, AWS will create an object called sub-folder instead of a folder.
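To make these two rules concrete, here is a tiny helper (the name ToFolderKey is mine, not part of the SDK) that normalizes a key into a folder path by trimming a leading slash and appending a trailing one:

```csharp
// Hypothetical helper -- not part of the AWS SDK.
// Normalizes an S3 key so that it denotes a folder.
static string ToFolderKey(string key)
{
    key = key.TrimStart('/');   // a leading slash would create a folder with an empty name
    if (!key.EndsWith("/"))
        key += "/";             // the trailing slash marks the key as a folder
    return key;
}

// ToFolderKey("/my-folder/sub-folder") --> "my-folder/sub-folder/"
```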
 

Initializing AmazonS3Client 

 
This is how we initialize the S3 client, which we are going to use for the remaining examples,

string bucketName = "my-bucket-name-123";
string awsAccessKey = "AKI............";
string awsSecretKey = "+8Bo..................................";

IAmazonS3 client = new AmazonS3Client(awsAccessKey, awsSecretKey, RegionEndpoint.APSoutheast2);

Note: AmazonS3Client is thread-safe; you can make it static or use a singleton instance.
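For example, a shared client could be exposed through a simple static holder (a sketch; here the credentials are resolved from the SDK's default credential chain rather than passed in explicitly):

```csharp
public static class S3
{
    // One thread-safe client shared by the whole application.
    public static readonly IAmazonS3 Client =
        new AmazonS3Client(RegionEndpoint.APSoutheast2);
}
```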

Creating a folder

 
Here we are going to create a folder called sub-folder inside my-folder,

string folderPath = "my-folder/sub-folder/";

PutObjectRequest request = new PutObjectRequest()
{
    BucketName = bucketName,
    Key = folderPath // <-- in S3, the key represents the path
};

PutObjectResponse response = client.PutObject(request);
Note 1
If you forget the trailing slash in the path (i.e. "my-folder/sub-folder"), it will create an object called sub-folder.
 
Note 2
If you include a slash at the beginning of the path (i.e. "/my-folder/sub-folder/"), it will create a folder whose name is an empty string and put the remaining folders inside it.
 

Copying a file into a folder

 
The following code copies test.txt into sub-folder,

FileInfo file = new FileInfo(@"c:\test.txt");
string path = "my-folder/sub-folder/test.txt";

PutObjectRequest request = new PutObjectRequest()
{
    InputStream = file.OpenRead(), // the SDK closes this stream after the upload by default
    BucketName = bucketName,
    Key = path // <-- in S3, the key represents the path
};

PutObjectResponse response = client.PutObject(request);

Listing the contents of a folder

 
The following code lists the contents of sub-folder,

ListObjectsRequest request = new ListObjectsRequest
{
    BucketName = bucketName,
    Prefix = "my-folder/sub-folder/"
};

ListObjectsResponse response = client.ListObjects(request);
foreach (S3Object obj in response.S3Objects)
{
    Console.WriteLine(obj.Key);
}

// result:
// my-folder/sub-folder/
// my-folder/sub-folder/test.txt
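Keep in mind that ListObjects returns at most 1,000 keys per call, so for larger folders you need to page through the results. A sketch of the paging loop, using the same request as above:

```csharp
ListObjectsRequest request = new ListObjectsRequest
{
    BucketName = bucketName,
    Prefix = "my-folder/sub-folder/"
};

ListObjectsResponse response;
do
{
    response = client.ListObjects(request);
    foreach (S3Object obj in response.S3Objects)
    {
        Console.WriteLine(obj.Key);
    }
    request.Marker = response.NextMarker; // continue from where this page stopped
} while (response.IsTruncated);
```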

Deleting file/folder

 
In the following code, we first delete test.txt and then sub-folder,

// delete the test.txt file
string filePath = "my-folder/sub-folder/test.txt";
var deleteFileRequest = new DeleteObjectRequest
{
    BucketName = bucketName,
    Key = filePath
};
DeleteObjectResponse fileDeleteResponse = client.DeleteObject(deleteFileRequest);

// delete sub-folder
string folderPath = "my-folder/sub-folder/";
var deleteFolderRequest = new DeleteObjectRequest
{
    BucketName = bucketName,
    Key = folderPath
};
DeleteObjectResponse folderDeleteResponse = client.DeleteObject(deleteFolderRequest);

High-level APIs

 
The high-level APIs are designed to mimic the semantics of file I/O operations. They are very similar to working with FileInfo and DirectoryInfo.
 

Understanding the S3 path (when using high-level APIs)

 
When using the high-level APIs, we need to use Windows-style paths, so use a backslash (NOT a forward slash) in your path,
 
   "my-folder\sub-folder\test.txt" 
 
Also note that, similar to the low-level APIs, we need a trailing backslash to indicate a folder. For example, "my-folder\sub-folder\" indicates that sub-folder is a folder, whereas "my-folder\sub-folder" indicates that sub-folder is an object inside my-folder.
 

Initializing AmazonS3Client 

 
Use the same code as for the low-level APIs (above) to initialize AmazonS3Client.
 

Creating a folder

 
Here we are going to create a folder called high-level-folder and create another folder called sub-folder inside it.

string path = @"high-level-folder";

S3DirectoryInfo di = new S3DirectoryInfo(client, bucketName, path);
if (!di.Exists)
{
    di.Create();
    di.CreateSubdirectory("sub-folder");
}

Copying a file into a folder

 
The following code copies test.txt into sub-folder,

FileInfo localFile = new FileInfo(@"c:\test.txt");
string path = @"high-level-folder\sub-folder\test.txt";

S3FileInfo s3File = new S3FileInfo(client, bucketName, path);
if (!s3File.Exists)
{
    using (var localStream = localFile.OpenRead())
    using (var s3Stream = s3File.Create()) // <-- create the file in S3
    {
        localStream.CopyTo(s3Stream); // <-- copy the content to S3
    }
}

Listing the contents of a folder

 
The following code lists the contents of sub-folder,

string path = @"high-level-folder\sub-folder\";

S3DirectoryInfo di = new S3DirectoryInfo(client, bucketName, path);
IS3FileSystemInfo[] files = di.GetFileSystemInfos();
foreach (IS3FileSystemInfo file in files)
{
    Console.WriteLine(file.Name);
}

// result:
// test.txt

Note
Unlike the low-level API, here the folder name (sub-folder) is not listed.
 

Deleting file/folder

 
In the following code, we first delete test.txt and then sub-folder,

// delete the test.txt file
string filePath = @"high-level-folder\sub-folder\test.txt";
S3FileInfo s3File = new S3FileInfo(client, bucketName, filePath);
if (s3File.Exists)
{
    s3File.Delete();
}

// delete sub-folder
string folderPath = @"high-level-folder\sub-folder\";
S3DirectoryInfo directory = new S3DirectoryInfo(client, bucketName, folderPath);
if (directory.Exists)
{
    directory.Delete();
}
     

Wrapping up

 
When designing information systems, we follow a practice called single source of truth (SSOT), which ensures that every data element is edited in only one place. SSOT simplifies an information system and makes working with it a lot easier. Personally, I like to extend this practice to every aspect of design... when designing a UI, there is no point in giving users two different ways to buy a product; it complicates the UI and confuses the user... and in my opinion, the same is true for designing a software library. I think that by providing two different APIs (low-level and high-level), AWS has unnecessarily complicated the process of communicating with an S3 bucket, especially because these APIs use different path styles... I personally spent hours on an error because I was using a high-level-style path with the low-level APIs.
 
Now, to make matters worse, AWS provides yet another way of interacting with S3 buckets: AWS TransferUtility, which runs on top of the low-level API and is the recommended way of reading/writing large objects (larger than a few GB). Have a look at the AWS documentation and see which option is best for your needs.
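For reference, a minimal TransferUtility upload looks roughly like this (the local file path and key are illustrative; client is the AmazonS3Client initialized earlier):

```csharp
using Amazon.S3.Transfer;

var transferUtility = new TransferUtility(client);

// TransferUtility switches to multipart uploads automatically for large files.
transferUtility.Upload(@"c:\large-file.zip", bucketName, "my-folder/large-file.zip");
```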