Algorithms in C#  

Time Complexity and Space Complexity, Explained with Examples

When we write a program, it’s not enough for it to just work. It also needs to be efficient. That’s where time complexity and space complexity come in. These two concepts help us measure how much time and memory an algorithm uses as the input grows.

Let’s break them down in simple terms.

⚡ 1. What is Time Complexity?

Time complexity tells us how an algorithm's running time grows as the input size increases.

👉 Instead of using a stopwatch, we use Big-O notation such as O(1), O(n), and O(log n) to describe the growth rate.

🔑 Examples

  • O(1): Constant Time

          
        int GetFirstElement(int[] arr) {
            return arr[0];  // Always takes the same time, no matter how big arr is
        }

    ✅ No matter the array size, accessing the first element is instant.

  • O(n): Linear Time

          
        void PrintArray(int[] arr) {
            for (int i = 0; i < arr.Length; i++) {
                Console.Write(arr[i] + " ");  // One step per element → O(n)
            }
        }

    ✅ If the array has 10 elements, 10 steps; if 1000 elements, 1000 steps.

  • O(log n): Logarithmic Time (like Binary Search)
    ✅ Each step cuts the problem in half → much faster for large inputs (see the sketch below).
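
    To make the halving concrete, here's a minimal binary search sketch in C# (the method name and parameters are illustrative; it assumes the array is already sorted):

        // Returns the index of target in a sorted array, or -1 if absent.
        // Each iteration halves the remaining range → O(log n) time.
        int BinarySearch(int[] sorted, int target) {
            int low = 0, high = sorted.Length - 1;
            while (low <= high) {
                int mid = low + (high - low) / 2;  // midpoint without overflow
                if (sorted[mid] == target) return mid;
                if (sorted[mid] < target) low = mid + 1;  // discard the left half
                else high = mid - 1;                      // discard the right half
            }
            return -1;
        }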

💾 2. What is Space Complexity?

Space complexity tells us how much memory an algorithm uses. It includes:

  • Variables

  • Data structures (arrays, linked lists, etc.)

  • Function call stack (especially in recursion)

🔑 Example

  • O(1): Constant Space

          
        int Sum(int a, int b) {
            return a + b;   // Uses only a fixed number of variables
        }

    ✅ Always uses the same amount of memory.

  • O(n): Linear Space

          
        void StoreNumbers(int n) {
            int[] arr = new int[n];  // Memory grows as n grows
        }

    ✅ If n=100, memory for 100 integers is used.
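
Recursion also counts toward space complexity through the call stack. As a sketch (the function name is illustrative): summing 1..n recursively allocates no array, yet it still uses O(n) space, because n stack frames are alive at the deepest call.

    // O(n) space: each recursive call adds a stack frame,
    // so n frames exist at the deepest point of the recursion.
    int SumTo(int n) {
        if (n == 0) return 0;     // base case: recursion stops here
        return n + SumTo(n - 1);  // one pending frame per value of n
    }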

🔍 3. Why Do We Care About Complexity?

  • Time complexity ensures programs don’t take forever when the input grows.

  • Space complexity ensures programs don’t run out of memory.

  • Often, there’s a trade-off:

    • Faster algorithms may use more memory.

    • Memory-efficient algorithms may take longer.

👉 Example

  • Hash tables: very fast lookups, but extra memory for the table.

  • Binary search on a sorted array: slightly slower lookups, but no extra memory.
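
A minimal C# sketch of that trade-off (the class and variable names are illustrative): HashSet&lt;int&gt; answers membership questions in O(1) on average but keeps its own hash table in memory, while Array.BinarySearch works in place on the sorted array in O(log n).

    using System;
    using System.Collections.Generic;

    class TradeOffDemo {
        static void Main() {
            int[] sorted = { 2, 3, 5, 7, 11, 13 };

            // Hash-based lookup: O(1) on average, but the set
            // stores its own copy of the data.
            var set = new HashSet<int>(sorted);
            Console.WriteLine(set.Contains(7));                     // True

            // Binary search: O(log n), no memory beyond the sorted array.
            Console.WriteLine(Array.BinarySearch(sorted, 7) >= 0);  // True
        }
    }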

📊 4. Big-O Notation Summary

Here’s a quick table to understand common complexities:

Complexity | Name          | Example Use Case ⚡
-----------|---------------|-----------------------------------
O(1)       | Constant      | Accessing an array element
O(log n)   | Logarithmic   | Binary Search
O(n)       | Linear        | Traversing an array
O(n log n) | Linearithmic  | Merge Sort, Quick Sort
O(n²)      | Quadratic     | Nested loops, Bubble Sort
O(2^n)     | Exponential   | Recursive Fibonacci
O(n!)      | Factorial     | Travelling Salesman (brute force)
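
To illustrate the quadratic row, a classic nested loop (the helper below is hypothetical) touches every pair of elements, so the work grows as n × n:

    // O(n²) time: for each of the n elements, the inner loop runs n times.
    void PrintAllPairs(int[] arr) {
        for (int i = 0; i < arr.Length; i++) {
            for (int j = 0; j < arr.Length; j++) {
                Console.Write($"({arr[i]}, {arr[j]}) ");
            }
        }
    }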

🚀 5. Real-Life Analogy

Imagine you are searching for a book in a library:

  • O(1): You already know the exact shelf → instant.

  • O(n): You check every book one by one.

  • O(log n): You follow a directory, cutting the search space in half each time.

👉 That’s how complexity feels in real life!

🎯 Conclusion

Time complexity = how fast.
Space complexity = how much memory.

Together, they tell us whether an algorithm is efficient and practical for large-scale problems. Understanding them is the foundation of mastering data structures and algorithms (DSA) and cracking coding interviews.

So, next time you solve a problem, don’t just ask “Does it work?” — also ask:
👉 “How fast and memory-friendly is it?”