Writing a Simple C# File Backup Solution for a File Server

Github: https://github.com/adriancs2/auto-folder-backup

This article provides an introduction to building a simple C# file backup program.

The code shown in this article is extracted from the main project and illustrates the core ideas of how the whole thing works. For a complete working solution, please download the source code or visit the GitHub page.

If you have experience with similar file and folder backup solutions, you're more than welcome to share your ideas on the GitHub page or in the CodeProject comment section.

The first thing that comes to mind for backing up files is simply copying them to another folder.

The Basics

using System.IO;

string sourceFolder = @"D:\fileserver";
string destinationFolder = @"E:\backup";

// Get all sub-folders
string[] folders = Directory.GetDirectories(sourceFolder);

// Get all files within the folder
string[] files = Directory.GetFiles(sourceFolder);

// To create folders
foreach (string folder in folders)
{
    // Get the folder name
    string foldername = Path.GetFileName(folder);

    // Get the destination folder path
    string dest = Path.Combine(destinationFolder, foldername);

    // Create the folder
    Directory.CreateDirectory(dest);
}

// To copy files
foreach (string file in files)
{
    // Get the filename
    string filename = Path.GetFileName(file);

    // Get the destination file path
    string dest = Path.Combine(destinationFolder, filename);

    // Copy the file
    File.Copy(file, dest, true);
}

Next, in order to cover all sub-folders, the code above can be refactored into a recursive method:

static void Main(string[] args)
{
    string sourceFolder = @"D:\fileserver";
    string destinationFolder = @"E:\backup";

    BackupDirectory(sourceFolder, destinationFolder);
}

static void BackupDirectory(string sourceFolder, string destinationFolder)
{
    if (!Directory.Exists(destinationFolder))
    {
        Directory.CreateDirectory(destinationFolder);
    }

    string[] files = Directory.GetFiles(sourceFolder);

    foreach (string file in files)
    {
        string filename = Path.GetFileName(file);
        string dest = Path.Combine(destinationFolder, filename);
        File.Copy(file, dest, true);
    }

    string[] folders = Directory.GetDirectories(sourceFolder);

    foreach (string folder in folders)
    {
        string name = Path.GetFileName(folder);
        string dest = Path.Combine(destinationFolder, name);

        // recursive call
        BackupDirectory(folder, dest);
    }
}

Here’s a different version that does not require a recursive call.

Version 2

static void Main(string[] args)
{
    string sourceFolder = @"D:\fileserver";
    string destinationFolder = @"E:\backup";

    BackupFolder(sourceFolder, destinationFolder);
}

static void BackupFolder(string source, string destination)
{
    if (!Directory.Exists(destination))
    {
        Directory.CreateDirectory(destination);
    }

    // Create folders, including sub-folders
    foreach (string dirPath in Directory.GetDirectories(source, "*", SearchOption.AllDirectories))
    {
        Directory.CreateDirectory(dirPath.Replace(source, destination));
    }

    // Copy all files, including files in sub-folders
    foreach (string newPath in Directory.GetFiles(source, "*.*", SearchOption.AllDirectories))
    {
        File.Copy(newPath, newPath.Replace(source, destination), true);
    }
}
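
A small caveat with Version 2: the string Replace call rewrites every occurrence of the source path within each full path, which can misbehave in the rare case where the source path text appears again deeper inside a path. A minimal sketch of a safer variant (not part of the original project) computes the relative part explicitly:

using System.IO;

static void BackupFolderSafe(string source, string destination)
{
    if (!Directory.Exists(destination))
    {
        Directory.CreateDirectory(destination);
    }

    // Re-create the folder structure using each path relative to the source root
    foreach (string dirPath in Directory.GetDirectories(source, "*", SearchOption.AllDirectories))
    {
        string relative = dirPath.Substring(source.Length).TrimStart('\\');
        Directory.CreateDirectory(Path.Combine(destination, relative));
    }

    // Copy the files the same way
    foreach (string filePath in Directory.GetFiles(source, "*.*", SearchOption.AllDirectories))
    {
        string relative = filePath.Substring(source.Length).TrimStart('\\');
        File.Copy(filePath, Path.Combine(destination, relative), true);
    }
}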

An even simpler version uses FileSystem.CopyDirectory from the Microsoft.VisualBasic.FileIO namespace (which requires a reference to the Microsoft.VisualBasic assembly).

Version 3

using System;
using Microsoft.VisualBasic.FileIO;

static void Main(string[] args)
{
    string sourceFolder = @"D:\fileserver";
    string destinationFolder = @"E:\destination";

    FileSystem.CopyDirectory(sourceFolder, destinationFolder, UIOption.DoNotDisplayUI);
}

Capturing Read And Write Errors

The next concern is read and write errors, which may be caused by user account access rights. It is therefore highly recommended to run the program as Administrator or as "Local System", which has full access rights to everything. Nonetheless, to handle other circumstances that might still result in read or write errors, every read and write action should be wrapped in a try-catch block, so that the backup process is not terminated halfway when an error occurs. Logging can be implemented so that any error can be traced afterwards.

The typical try-catch block:

try
{
    // success
    File.Copy(sourceFile, destinationFile, true);
    // write log (success)
}
catch (Exception ex)
{
    // failed
    // write log (fail)
}

Now, combining the logging and try-catch, the block will look something like this:

static void Main(string[] args)
{
    try
    {
        string sourceFolder = @"D:\fileserver";
        string destinationFolder = @"E:\2023-11-01";

        // the file path of the logging files
        string successLogPath = @"E:\2023-11-01\log-success.txt";
        string failLogPath = @"E:\2023-11-01\log-fail.txt";

        // Make sure the destination folder exists before creating the log files in it
        Directory.CreateDirectory(destinationFolder);

        // Supplying stream writers for logging
        // A stream writer writes text into a text file
        using (StreamWriter successWriter = new StreamWriter(successLogPath, true))
        {
            using (StreamWriter failWriter = new StreamWriter(failLogPath, true))
            {
                BackupDirectory(sourceFolder, destinationFolder, successWriter, failWriter);
            }
        }
    }
    catch (Exception ex)
    {
        // logging...
    }
}

static void BackupDirectory(string sourceFolder, string destinationFolder, 
                StreamWriter successWriter, StreamWriter failWriter)
{
    // Stage 1: Create the destination folder

    if (!Directory.Exists(destinationFolder))
    {
        try
        {
            Directory.CreateDirectory(destinationFolder);
            successWriter.WriteLine($"Create folder: {destinationFolder}");
        }
        catch (Exception ex)
        {
            // Cannot create folder
            failWriter.WriteLine($"Failed to create folder: {sourceFolder}\r\nAccess Denied\r\n");
            return;
        }
    }

    // Stage 2: Get all files from the source folder

    string[] files = null;

    try
    {
        files = Directory.GetFiles(sourceFolder);
    }
    catch (UnauthorizedAccessException)
    {
        // Access denied, cannot read folder or files
        failWriter.WriteLine($"{sourceFolder}\r\nAccess Denied\r\n");
    }
    catch (Exception e)
    {
        // Other unknown read errors
        failWriter.WriteLine($"{sourceFolder}\r\n{e.Message}\r\n");
    }

    // Stage 3: Copy all files from source to destination folder
    if (files != null && files.Length > 0)
    {
        foreach (string file in files)
        {
            try
            {
                string name = Path.GetFileName(file);
                string dest = Path.Combine(destinationFolder, name);
    
                File.Copy(file, dest, true);
            }
            catch (UnauthorizedAccessException)
            {
                // Access denied, cannot write file
                failWriter.WriteLine($"{file}\r\nAccess Denied\r\n");
                TotalFailed++; // TotalFailed: failure counter declared outside this snippet (see the full project)
            }
            catch (Exception e)
            {
                // Other unknown error
                TotalFailed++;
                failWriter.WriteLine($"{file}\r\n{e.Message}\r\n");
            }
        }
    }

    // Stage 4: Get all sub-folders

    string[] folders = null;

    try
    {
        folders = Directory.GetDirectories(sourceFolder);
    }
    catch (UnauthorizedAccessException)
    {
        // Access denied, cannot read folders
        failWriter.WriteLine($"{sourceFolder}\r\nAccess denied\r\n");
    }
    catch (Exception e)
    {
        // Other unknown read errors
        failWriter.WriteLine($"{sourceFolder}\r\nAccess {e.Message}\r\n");
    }

    // Stage 5: Backup files and "sub-sub-folders" in the sub-folder
    if (folders != null && folders.Length > 0)
    {
        foreach (string folder in folders)
        {
            try
            {
                string name = Path.GetFileName(folder);
                string dest = Path.Combine(destinationFolder, name);
    
                // recursive call
                BackupDirectory(folder, dest, successWriter, failWriter);
            }
            catch (UnauthorizedAccessException)
            {
                // Access denied, cannot read folders
                failWriter.WriteLine($"{folder}\r\nAccess denied\r\n");
            }
            catch (Exception e)
            {
                // Other unknown read errors
                failWriter.WriteLine($"{folder}\r\n{e.Message}\r\n");
            }
        }
    }
}

The backup process will be executed on a daily or weekly basis, depending on your preference. For each backup, the destination folder can be named with a timestamp, for example:

// for daily basis
E:\2023-11-01 030000\ << day 1
E:\2023-11-02 030000\ << day 2
E:\2023-11-03 030000\ << day 3

// for weekly basis
E:\2023-11-01 030000\ << week 1
E:\2023-11-08 030000\ << week 2
E:\2023-11-15 030000\ << week 3
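
As a quick sketch (illustrative, not taken from the project as-is), the folder name can be generated and parsed back with a fixed format string, which is also the format the code later in this article relies on:

using System;
using System.Globalization;

// Build a destination folder name from the current time
string timeNowStr = DateTime.Now.ToString("yyyy-MM-dd HHmmss");
string destinationFolder = $@"E:\{timeNowStr}";   // e.g. E:\2023-11-01 030000

// Later, recover the backup date from an existing folder name
if (DateTime.TryParseExact("2023-11-01 030000", "yyyy-MM-dd HHmmss",
        CultureInfo.InvariantCulture, DateTimeStyles.None, out DateTime backupDate))
{
    // backupDate = 2023-11-01 03:00:00
}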

Full Backup

The backup code explained above executes a “Full Backup,” copying everything.

There is a very high likelihood that more than 90% of the content being backed up is the same as in the previous backup. Imagine if the backup is performed on a daily basis; there would be a lot of redundant, identical copies.

How about this: instead of repeatedly copying the same large set of files every time, the program copies only the files that are new or have been modified. This is the "Incremental Backup."

So, the backup strategy will now look something like this: perform an incremental backup on a daily basis and a full backup on a weekly basis (perhaps every Sunday), every 15 days, or perhaps once a month.

An incremental backup is resource-efficient: it saves space (obviously), saves CPU resources, and takes far less time.
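
As a rough sketch of that schedule (the variable and method names here are placeholders, not the project's actual code), the decision boils down to how many days have passed since the last full backup:

// lastFullBackupDate: date of the most recent full backup (e.g. parsed from the
// newest backup folder's name); daysBetweenFullBackups: e.g. 7 for weekly
double daysSinceLastFullBackup = (DateTime.Now - lastFullBackupDate).TotalDays;

if (daysSinceLastFullBackup >= daysBetweenFullBackups)
{
    // Time for a fresh full backup into a new timestamped folder
    PerformFullBackup();
}
else
{
    // Otherwise, only copy new or modified files into the latest backup folder
    PerformIncrementalBackup();
}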

Incremental Backup

Identifying new files is easy: a new file does not yet exist in the destination folder, so just copy it.

foreach (string file in files)
{
    string name = Path.GetFileName(file);
    string dest = Path.Combine(destinationFolder, name);

    if (File.Exists(dest))
    {
        // The file already exists in the backup; compare it (explained next)
    }
    else
    {
        // The file does not exist yet; copy it
        File.Copy(file, dest, true);
    }
}

If the file exists, then some kind of file comparison will need to be carried out to determine whether both the source file and the destination file are exactly the same.

The most accurate way to identify whether they are identical copies is by calculating the HASH signature of both files.

using System;
using System.IO;
using System.Security.Cryptography;

public static string ComputeFileHash(string filePath)
{
    using (FileStream fs = new FileStream(filePath, FileMode.Open, FileAccess.Read))
    {
        using (SHA256 sha256 = SHA256.Create())
        {
            byte[] hashBytes = sha256.ComputeHash(fs);
            return BitConverter.ToString(hashBytes).Replace("-", "");
        }
    }
}

Example of output:

string hash = ComputeFileHash(@"C:\path\to\your\file.txt");
// output: 3D4F2BF07DC1BE38B20CD6E46949A1071F9D0E3D247F1F3DEDF73E3D4F2BF07D

During the first backup, compute the SHA-256 hash signature of all files and cache (save) it into a text file (or a portable database such as SQLite), which will look something like this:

A3A67E8FBEC365CDBB63C09A5790810A247D692E96360183C67E3E72FFDF6FE9|\somefile.pdf
7F83B1657FF1FC53B92DC18148A1D65DFC2D4B1FA3D677284ADDD200126D9069|\somefolder\letter.docx
C7BE1ED902FB8DD4D48997C6452F5D7E509FB598F541B12661FEDF5A0A522670|\account.xlsx
A54D88E06612D820BC3BE72877C74F257B561B19D15BB12D0061F16D454082F4|\note.txt

The hash and the relative file path are separated by a vertical bar ("|").

Then, at the second backup, re-compute the SHA-256 hash of every source file and match it against the cached (saved) hashes from the destination folder.

By comparing the hash signatures of the old and new files, the program can detect even the slightest difference between them.
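
A minimal sketch of how such a cache could be written and read back, assuming the plain "HASH|relative path" text format described above (the method names here are illustrative, not taken from the project):

using System;
using System.Collections.Generic;
using System.IO;

// Save the cache after a backup: one "HASH|relative path" line per file
static void SaveHashCache(string cacheFile, Dictionary<string, string> hashByRelativePath)
{
    using (StreamWriter writer = new StreamWriter(cacheFile, false))
    {
        foreach (var entry in hashByRelativePath)
        {
            // entry.Key = relative path, entry.Value = SHA-256 hash
            writer.WriteLine($"{entry.Value}|{entry.Key}");
        }
    }
}

// Load the cache at the next backup
static Dictionary<string, string> LoadHashCache(string cacheFile)
{
    var hashByRelativePath = new Dictionary<string, string>(StringComparer.OrdinalIgnoreCase);

    if (File.Exists(cacheFile))
    {
        foreach (string line in File.ReadAllLines(cacheFile))
        {
            int separator = line.IndexOf('|');
            if (separator > 0)
            {
                string hash = line.Substring(0, separator);
                string relativePath = line.Substring(separator + 1);
                hashByRelativePath[relativePath] = hash;
            }
        }
    }

    return hashByRelativePath;
}

// A source file is considered unchanged when its freshly computed hash
// matches the cached one:
// bool unchanged = cache.TryGetValue(relativePath, out string oldHash)
//                  && oldHash == ComputeFileHash(fullPath);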

BUT, computing a SHA-256 hash means reading every byte of the file, which makes this approach very resource-intensive. Yes, it still saves space, but the time and CPU power required can be more than quadruple what a plain full backup needs.

Another much faster alternative for identifying a newer version of a file is to compare the "last modified time" of the two files. Although comparing the "last modified time" is not as accurate as comparing hashes, it is good enough in this context.

DateTime srcWriteTime = File.GetLastWriteTime(file);
DateTime destWriteTime = File.GetLastWriteTime(dest);

// If the source file has a later write time, copy it over
if (srcWriteTime > destWriteTime)
{
    // Overwrite the existing file in the backup
    File.Copy(file, dest, true);
}
else
{
    // Else, skip copying the file
}
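
Putting the two checks together, an incremental copy of a folder might look like the following sketch (a simplified illustration; the try-catch and logging from the earlier section can be layered on top of it):

using System.IO;

static void IncrementalCopyFolder(string sourceFolder, string destinationFolder)
{
    if (!Directory.Exists(destinationFolder))
    {
        Directory.CreateDirectory(destinationFolder);
    }

    foreach (string file in Directory.GetFiles(sourceFolder))
    {
        string name = Path.GetFileName(file);
        string dest = Path.Combine(destinationFolder, name);

        if (!File.Exists(dest))
        {
            // New file: copy it
            File.Copy(file, dest, true);
        }
        else if (File.GetLastWriteTime(file) > File.GetLastWriteTime(dest))
        {
            // Existing file that has been modified since the last backup: overwrite it
            File.Copy(file, dest, true);
        }
        // Otherwise the backup copy is already up to date; skip it
    }

    foreach (string folder in Directory.GetDirectories(sourceFolder))
    {
        string name = Path.GetFileName(folder);

        // recursive call into each sub-folder
        IncrementalCopyFolder(folder, Path.Combine(destinationFolder, name));
    }
}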

And this concludes the basic idea of how an incremental backup can be done.

Next Problem: The Backup Drive Will Eventually Fill Up

If the backup drive becomes full, simply delete the oldest backup folder. If deleting one backup folder is not enough, delete two folders.

The problem is… deleting a large number of files and sub-folders can be very slow.

What is a fast way to clear off files in a short amount of time?

One solution that comes to mind is to format the drive.

Here's a code snippet that formats a drive by calling "format.com":

using System.Diagnostics;

var process = new Process()
{
    StartInfo = new ProcessStartInfo
    {
        FileName = "format.com",
        Arguments = $"E: /FS:NTFS /V:backup /Q /Y /A:4096",
        Verb = "runas",  // Run the process with administrative permissions
        RedirectStandardOutput = true,
        RedirectStandardInput = true,
        UseShellExecute = false,
        CreateNoWindow = true,
    }
};

process.Start();

// Automatically confirm the warning prompt by writing "Y" to the standard input
process.StandardInput.WriteLine("Y");

string output = process.StandardOutput.ReadToEnd();
process.WaitForExit();

The above method has a high chance of being flagged as malware by antivirus software.

A better way is to use ManagementObjectSearcher from the System.Management namespace:

using System.Management;

ManagementObjectSearcher searcher = new ManagementObjectSearcher("\\\\.\\ROOT\\CIMV2", 
    "SELECT * FROM Win32_Volume WHERE DriveLetter = 'E:'");
    
foreach (ManagementObject volume in searcher.Get())
{
    // The parameters are: file system, quick format, cluster size, label, enable compression
    volume.InvokeMethod("Format", new object[] { "NTFS", true, 4096, "Backup", false });
}

However, if we format the backup drive that is currently in use, all backups will be lost, which makes this not an option.

How about using two backup drives? …

This means that when the first drive is full, the backup can continue on the second drive. And when the second drive is full, since the latest backups are on the second drive, the first drive can safely be formatted. Sounds good? Yep, that will do.

And one more thing: the main storage of the file server has to be easy to move to another computer, which means it has to live on a dedicated physical hard disk. The main Windows operating system will therefore use its own dedicated hard disk too. The same goes for the backup folders; they will use another dedicated hard disk, or they can share a hard disk with the Windows OS.

So, three hard disks: one for Windows, one for the file server, and one (or two) for backup.

The following images illustrate the scenario:

  • The first hard disk runs Windows (Drive C).
  • The second hard disk is used as the main file server storage (Drive D).
  • The third and fourth hard disks (or a third hard disk partitioned into two drives) are dedicated to backup (Drives E, F).

The image below shows the software starting to run the backup. In this example, the program is set to perform a full backup every 7 days; on the other days, it performs an incremental backup.

The next image shows the first backup drive becoming full. The program then switches to the next backup drive.

And lastly, when the second drive is full, the program goes back to the first backup drive. Since the first backup drive is already full, the program formats it and starts using it for the new backup.

Writing the Final Code Logic

Now for the last part: coding the logic that decides which drive to use and which backup type to perform (incremental or full).

Before starting the logic, we need to obtain the total size of the source folder.

static long GetDirectorySize(DirectoryInfo dirInfo)
{
    long size = 0;

    // Calculate size of all files in the directory.
    FileInfo[] files = dirInfo.GetFiles();

    foreach (FileInfo file in files)
    {
        size += file.Length;
    }

    // Calculate size of all sub-folders.
    DirectoryInfo[] subDirs = dirInfo.GetDirectories();

    foreach (DirectoryInfo subDir in subDirs)
    {
        // recursive call
        size += GetDirectorySize(subDir);
    }

    return size;
}

DirectoryInfo dirInfo = new DirectoryInfo(sourceFolder);
long totalSize = GetDirectorySize(dirInfo);

The main logic to get the destination folder:

static string GetDestinationFolder(long totalSize)
{
    // These values are usually loaded from a config/settings file
    string[] lstMyDrive = new string[] { "E", "F" };
    double TotalDaysForFullBackup = 7;

    // TotalDaysForFullBackup = The interval of days to do full backup

    // this will be used as the name for the new destination folder
    string timeNowStr = DateTime.Now.ToString("yyyy-MM-dd HHmmss");

    // example: "2023-11-02 010000"
    
    
    // ===================================================
    // Stage 1: Collecting all found drives that match with the settings
    // ===================================================
    
    DriveInfo[] allDrives = DriveInfo.GetDrives();
    List<DriveInfo> matchingDrives = allDrives.Where(d => lstMyDrive.Contains(d.Name[0].ToString())).ToList();

    // output drive collections: E:\ and F:\


    // ===================================================
    // Stage 2: Get the latest backup date from folder's name
    // ===================================================

    // if a backup has ever been executed before,
    // there will be at least one folder in one of the backup drives

    // declare a dictionary to store the folders and their corresponding dates
    Dictionary<string, DateTime> dicFolderDate = new Dictionary<string, DateTime>();

    // the matchingDrives here refers to E:\ and F:\
    // which obtained from the previous lines

    foreach (var drive in matchingDrives)
    {
        // get all previous created backup folders
        string[] backupFolders = Directory.GetDirectories(drive.Name);

        // for example:

        // E:\2023-11-01 030000
        // E:\2023-11-07 030000
        // E:\2023-11-14 030000
        // F:\2023-11-21 030000
        // F:\2023-11-28 030000

        foreach (var dir in backupFolders)
        {
            string folderName = new DirectoryInfo(dir).Name;
            
            // example:
            // folderName = "2023-11-01 030000"

            // get the backup dates from the folder name
            if (DateTime.TryParseExact(folderName, "yyyy-MM-dd HHmmss", 
                                    CultureInfo.InvariantCulture, DateTimeStyles.None, 
                                    out DateTime folderDate))
            {
                // collecting the folders and their dates
                dicFolderDate[dir] = folderDate;
                
                // "E:\2023-11-01 030000" = "2023-11-01 03:00:00"
            }
        }
    }


    // ===================================================
    // Stage 3: Check whether any backup folder exists
    // ===================================================

    // No backup folder exists, which means no backup has ever been executed.
    // Start from the first drive
    if (dicFolderDate.Count == 0)
    {
        return $"{lstMyDrive[0]}:\\{timeNowStr}";
        // example: E:\2023-11-02 010000
        // this is the first folder
    }

    // if there are folders found, continue the following
    // assuming some backups have already been executed

    // ===================================================
    // Stage 4: Get the latest backup date and folder
    // ===================================================

    string latestBackupFolder = "";
    DateTime latestBackupDate = DateTime.MinValue;

    foreach (var keyValuePair in dicFolderDate)
    {
        // key = backup folder
        // value = date

        // example: [key: "E:\2023-11-01 030000"], [value: date]

        // finding the latest date and the folder
        if (keyValuePair.Value > latestBackupDate)
        {
            latestBackupDate = keyValuePair.Value;
            latestBackupFolder = keyValuePair.Key;
        }
    }

    // ===================================================
    // Stage 5: There is no recorded backup
    // ===================================================

    // None of the found folders were recognizable as backup folders
    if (latestBackupDate == DateTime.MinValue)
    {
        // which means, no backup is ever executed
        // Begin full backup, start at the first drive

        return $"{lstMyDrive[0]}:\\{timeNowStr}";

        // example: E:\2023-11-02 010000
        // this is the first folder
    }

    // ===================================================
    // Stage 6: Check if the total number of days since 
    //          the last full backup exceeds the threshold
    // ===================================================

    // calculating the age of the last backup
    var timespanTotalDaysOld = DateTime.Now - latestBackupDate;

    // TotalDaysForFullBackup:
    // - a number defined by the user
    // - preloaded during program start
    // - usually loaded from a config file

    // the age of the last backup is still within the defined days
    // Perform incremental backup, continue to use the latest used backup folder

    if (timespanTotalDaysOld.TotalDays < TotalDaysForFullBackup)
    {
        // use the old folder, performing incremental backup

        return latestBackupFolder;

        // example: E:\2023-11-02 010000
    }

    // the age of the last backup has reached or exceeded the defined number of days
    else
    {
        // Perform a full backup on new folder

        bool requireFormat = false;

        // GetSuitableDrive - Find the correct drive for saving new backup
        // this method will be explained the next section
        string newTargetedDriveLetter = 
               GetSuitableDrive(totalSize, lstMyDrive, out requireFormat);

        // Format Drive here...............
        if (requireFormat)
        {
            FormatDrive(newTargetedDriveLetter);
        }

        return $"{newTargetedDriveLetter}:\\{timeNowStr}";

        // example: E:\2023-11-02 010000
    }
}

The GetSuitableDrive method:

static string GetSuitableDrive(long totalSize, 
                     string[] lstMyDrive, out bool requireFormat)
{
    // lstMyDrive = { "E", "F" };
    // this value is usually loaded from a config/text file

    requireFormat = false;

    // the following 2 lines are somewhat redundant; the result could be
    // passed down from the parent method,
    // but anyway, let's keep it this way for the moment

    DriveInfo[] allDrives = DriveInfo.GetDrives();
    List<DriveInfo> matchingDrives = allDrives.Where(d => 
                   lstMyDrive.Contains(d.Name[0].ToString())).ToList();

    // as before, the above 2 lines get the drive info for E:\ and F:\

    // Condition 1: Check available free space

    // loop through all drives and find whether one of them 
    // has enough space for backup

    foreach (DriveInfo drive in matchingDrives)
    {
        if (drive.IsReady)
        {
            bool enoughSpace = drive.AvailableFreeSpace > totalSize;

            // this drive has enough space, use this
            if (enoughSpace)
            {
                return drive.Name[0].ToString();
            }
        }
    }

    // none of the drives has enough space, continue the following

    // now the program will look for the latest backup drive
    // and identify its drive letter

    // at this point, a format will be required
    // the question is which drive will be formatted
    requireFormat = true;

    DateTime latestDate = DateTime.MinValue;
    string latestDrive = "";

    foreach (DriveInfo drive in matchingDrives)
    {
        if (drive.IsReady)
        {
            // this step seems redundant, as this info has already been
            // collected in the parent method,
            // anyway, let's keep it this way for the moment

            // get all the backup folders on this drive
            string[] directories = Directory.GetDirectories(drive.Name);

            foreach (string dir in directories)
            {
                string folderName = new DirectoryInfo(dir).Name;
                if (DateTime.TryParseExact(folderName, "yyyy-MM-dd HHmmss", 
                    CultureInfo.InvariantCulture, DateTimeStyles.None, 
                    out DateTime folderDate))
                {
                    if (folderDate > latestDate)
                    {
                        // obtain the latest date and its corresponding drive letter
                        latestDate = folderDate;
                        latestDrive = drive.Name[0].ToString();
                    }
                }
            }
        }
    }

    // Find the next drive after the one with the latest folder

    // Get the position of that drive letter in the array (list)
    int latestDriveIndex = Array.IndexOf(lstMyDrive, latestDrive);

    // Move towards the next drive
    latestDriveIndex++;

    // The index position falls outside the array (list)
    if (latestDriveIndex >= lstMyDrive.Length)
    {
        // Going back to the first drive
        latestDriveIndex = 0;
    }

    return lstMyDrive[latestDriveIndex];
}

Up to this point, the drive letter has been chosen and a new backup folder will be created.

By piecing all of this together, a simple file and folder backup solution can be built.
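
For example, the top-level flow could be wired up roughly like this (a sketch based only on the pieces shown above; paths and log file names are illustrative):

using System;
using System.IO;

static void Main(string[] args)
{
    string sourceFolder = @"D:\fileserver";

    // 1. Measure the source folder so a suitable backup drive can be chosen
    long totalSize = GetDirectorySize(new DirectoryInfo(sourceFolder));

    // 2. Decide the destination: the latest folder (incremental) or a new
    //    timestamped folder (full), possibly after formatting the other drive
    string destinationFolder = GetDestinationFolder(totalSize);
    Directory.CreateDirectory(destinationFolder);

    // 3. Run the copy (with the incremental checks applied inside the copy loop),
    //    logging successes and failures as shown earlier
    string successLogPath = Path.Combine(destinationFolder, "log-success.txt");
    string failLogPath = Path.Combine(destinationFolder, "log-fail.txt");

    using (StreamWriter successWriter = new StreamWriter(successLogPath, true))
    using (StreamWriter failWriter = new StreamWriter(failLogPath, true))
    {
        BackupDirectory(sourceFolder, destinationFolder, successWriter, failWriter);
    }
}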

Let the Program Run Automatically

The program can be set to run automatically by using Windows Task Scheduler.

  • Open Windows Task Scheduler and create a task.
  • Set the task's action to run this program.
  • Run the task as an administrative user or as SYSTEM.
  • Run it whether the user is logged on or not.
  • Run it with the highest privileges.
  • Set a trigger with your preferred execution time (e.g., 3 AM).

Here is a C# code snippet that can install the scheduled task programmatically.

First, install the TaskScheduler NuGet package (provided by David Hall).

Install Task

using Microsoft.Win32.TaskScheduler;
using System.Text.RegularExpressions;

// Get the service on the local machine
using (TaskService ts = new TaskService())
{
    // Check if the task already exists
    var existingTask = ts.GetTask("Auto Folder Backup");
    if (existingTask != null)
    {
        // Task already existed
        return;
    }

    // Create a new task definition and assign properties
    TaskDefinition td = ts.NewTask();
    td.RegistrationInfo.Description = "Automated folder backup task";
    td.Principal.RunLevel = TaskRunLevel.Highest;  // Run with the highest privileges

    // Create a daily trigger that fires at the configured time
    // (nmTaskHour / nmTaskMinute are numeric inputs from the program's settings UI)
    DateTime triggertime = DateTime.Today.AddHours((int)nmTaskHour.Value)
                        .AddMinutes((int)nmTaskMinute.Value);
    td.Triggers.Add(new DailyTrigger { StartBoundary = triggertime });

    // Create an action that will launch the program whenever the trigger fires
    string p = @"D:\auto_folder_backup\auto_folder_backup.exe";
    td.Actions.Add(new ExecAction(p, null, null));

    // Register the task in the root folder
    ts.RootFolder.RegisterTaskDefinition(@"Auto Folder Backup", td,
        TaskCreation.CreateOrUpdate,
        "SYSTEM",   // Specify the "SYSTEM" user (or an administrator username)
        null,       // No password is needed when using the SYSTEM user
        TaskLogonType.ServiceAccount,
        null);      // No SDDL-defined security descriptor needed
}

Remove Task

// Get the service on the local machine
using (TaskService ts = new TaskService())
{
    // Get the tasks that match the regex
    var tasks = ts.RootFolder.GetTasks(new Regex("Auto Folder Backup"));

    // Delete the tasks
    foreach (var task in tasks)
    {
        ts.RootFolder.DeleteTask(task.Name);
    }
}

That is all for now. Cheers  😀

