🔹 Introduction
In programming, arrays often contain repeated elements. Removing duplicates ensures that only unique elements remain. This is essential in real-world tasks such as data cleaning and search optimization, and it can noticeably improve performance.
For example:
Input: [1, 2, 2, 3, 4, 4, 5]
Output: [1, 2, 3, 4, 5]
There are several ways to remove duplicates from an array, each with different time and space complexities.
🛠️ Method 1. Using a Set (Recommended)
Most modern programming languages provide a Set data structure that automatically stores unique elements.
🔹 Java Example
import java.util.*;

public class RemoveDuplicates {
    public static void main(String[] args) {
        int[] arr = {1, 2, 2, 3, 4, 4, 5};
        Set<Integer> set = new LinkedHashSet<>(); // preserves order
        for (int num : arr) {
            set.add(num);
        }
        System.out.println("Unique elements: " + set);
    }
}
✅ Output
Unique elements: [1, 2, 3, 4, 5]
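If you are on Java 8 or newer, the same result can be obtained in one line with the Streams API: IntStream.distinct() removes repeated values while keeping the first occurrence of each. A minimal sketch using the same input array (the class name here is just for illustration):

import java.util.Arrays;

public class RemoveDuplicatesStream {
    public static void main(String[] args) {
        int[] arr = {1, 2, 2, 3, 4, 4, 5};
        // distinct() keeps the first occurrence of each value, in encounter order
        int[] unique = Arrays.stream(arr).distinct().toArray();
        System.out.println("Unique elements: " + Arrays.toString(unique));
    }
}

Internally this is still hash-based, so its behavior is comparable to the explicit Set loop above.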
🔹 Python Example
arr = [1, 2, 2, 3, 4, 4, 5]
unique_arr = list(set(arr))
print("Unique elements:", unique_arr)
Note: In Python, converting to a set does not guarantee the original order. To keep the first-occurrence order, use list(dict.fromkeys(arr)) instead (or collections.OrderedDict on older Python versions).
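The same ordering caveat exists in Java: a plain HashSet makes no ordering guarantee, which is why the example above uses LinkedHashSet. A small illustrative sketch (the class name is just for the demo):

import java.util.*;

public class SetOrderDemo {
    public static void main(String[] args) {
        Integer[] arr = {5, 1, 4, 1, 2, 5};

        // HashSet: removes duplicates, but iteration order is undefined
        Set<Integer> hashSet = new HashSet<>(Arrays.asList(arr));

        // LinkedHashSet: removes duplicates and keeps first-insertion order
        Set<Integer> linkedSet = new LinkedHashSet<>(Arrays.asList(arr));

        System.out.println("HashSet:       " + hashSet);
        System.out.println("LinkedHashSet: " + linkedSet); // [5, 1, 4, 2]
    }
}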
🛠️ Method 2. Using Sorting
Another approach is to sort the array first and then remove consecutive duplicates.
🔹 Java Example
import java.util.Arrays;

public class RemoveDuplicatesSorted {
    public static void main(String[] args) {
        int[] arr = {1, 2, 2, 3, 4, 4, 5};
        Arrays.sort(arr);
        int j = 0; // index for unique elements
        for (int i = 1; i < arr.length; i++) {
            if (arr[i] != arr[j]) {
                j++;
                arr[j] = arr[i];
            }
        }
        int[] result = Arrays.copyOf(arr, j + 1);
        System.out.println("Unique elements: " + Arrays.toString(result));
    }
}
✅ Output
Unique elements: [1, 2, 3, 4, 5]
Time Complexity: O(n log n) (due to sorting)
Space Complexity: O(1) extra space (the deduplication is done in place; Arrays.copyOf is only used to print the trimmed result)
🛠️ Method 3. Using a Temporary Array (Brute Force)
This method works by checking each element before adding it to a new array.
🔹 Java Example
public class RemoveDuplicatesBruteForce {
    public static void main(String[] args) {
        int[] arr = {1, 2, 2, 3, 4, 4, 5};
        int n = arr.length;
        int[] temp = new int[n]; // holds the unique elements found so far
        int j = 0;               // count of unique elements
        for (int i = 0; i < n; i++) {
            boolean isDuplicate = false;
            // linear scan over the elements already kept
            for (int k = 0; k < j; k++) {
                if (arr[i] == temp[k]) {
                    isDuplicate = true;
                    break;
                }
            }
            if (!isDuplicate) {
                temp[j++] = arr[i];
            }
        }
        int[] result = new int[j];
        System.arraycopy(temp, 0, result, 0, j);
        System.out.println("Unique elements: " + java.util.Arrays.toString(result));
    }
}
Time Complexity: O(n²)
Space Complexity: O(n)
This method is less efficient but useful to understand the basic logic behind removing duplicates.
💡 Tips for Removing Duplicates
Use a Set if you want simplicity and efficiency.
Use sorting if maintaining the original order is not strictly necessary, or when you need an in-place solution.
For large datasets, prefer hashing-based approaches such as Set or HashMap (see the sketch below).
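As a sketch of the hashing-based approach from the last tip, the snippet below (dedupe is a hypothetical helper, not a library method) keeps the first occurrence of each element; HashSet.add returns false for elements it has already seen, so each check takes O(1) time on average:

import java.util.*;

public class HashBasedDeduplication {
    // Hypothetical helper: returns a new list containing only the first
    // occurrence of each element, using a HashSet for O(1) average lookups.
    static List<String> dedupe(List<String> items) {
        Set<String> seen = new HashSet<>();
        List<String> unique = new ArrayList<>();
        for (String item : items) {
            if (seen.add(item)) { // add() returns false for duplicates
                unique.add(item);
            }
        }
        return unique;
    }

    public static void main(String[] args) {
        List<String> data = Arrays.asList("apple", "banana", "apple", "cherry", "banana");
        System.out.println(dedupe(data)); // [apple, banana, cherry]
    }
}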
📌 Conclusion
Removing duplicates is a fundamental DSA problem that helps improve your coding and problem-solving skills. Understanding multiple approaches (Set, sorting, brute force) prepares you for interviews and real-world projects.
By practicing this problem, you also learn about time-space tradeoffs, which is a key concept in efficient programming.