How to Use Python’s asyncio for Concurrent HTTP Requests?

Introduction

When a Python program makes many HTTP requests sequentially, it can become slow because each request waits for the previous one to finish. This is especially noticeable when calling APIs or downloading data from multiple URLs.

In simple terms, your program waits even when it does not need to. Python's asyncio module solves this problem by allowing multiple HTTP requests to run concurrently. In this article, you will learn how to use asyncio to make concurrent HTTP requests, with clear explanations and simple examples.

What Is asyncio in Python?

asyncio is a Python library used for writing asynchronous code. It allows your program to continue processing while waiting for slow operations, such as network calls.

Key ideas:

  • async defines an asynchronous function

  • await pauses the current coroutine until the awaited operation finishes, letting other tasks run in the meantime

  • An event loop manages all async tasks

asyncio is ideal for I/O-bound tasks such as HTTP requests.
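
A minimal sketch showing all three ideas together, using asyncio.sleep() to stand in for a slow I/O operation:

import asyncio

async def slow_operation():  # async def defines a coroutine
    await asyncio.sleep(1)   # await pauses here without blocking the event loop
    return 'done'

result = asyncio.run(slow_operation())  # asyncio.run() starts the event loop
print(result)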

Why Use asyncio for HTTP Requests?

HTTP requests spend most of their time waiting for server responses. With asyncio, you can send many requests together instead of waiting for each one.

Benefits include:

  • Faster execution

  • Better resource usage

  • Scalable API calls

Example scenario:

  • Fetching data from 10 APIs

  • Downloading data from multiple services

If each request takes about one second, ten sequential requests take about ten seconds. Run concurrently, the same ten requests may finish in one to two seconds, roughly the time of the slowest response.
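
You can see this effect without touching the network; here is a small sketch that uses asyncio.sleep() to simulate ten one-second requests:

import asyncio
import time

async def fake_request():
    await asyncio.sleep(1)  # simulate one second of network waiting

async def main():
    start = time.perf_counter()
    await asyncio.gather(*(fake_request() for _ in range(10)))
    print(f'10 concurrent requests took {time.perf_counter() - start:.1f}s')  # ~1.0s, not 10s

asyncio.run(main())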

Required Library for Async HTTP Requests

The standard requests library is blocking and does not work with asyncio. For async HTTP calls, you should use aiohttp.

Install it using:

pip install aiohttp

aiohttp works smoothly with asyncio and supports async/await syntax.

Making a Single Async HTTP Request

Let’s start with a simple example of making one async HTTP request.

import asyncio
import aiohttp

async def fetch_data(url):
    # a session manages the connection pool and cookies
    async with aiohttp.ClientSession() as session:
        async with session.get(url) as response:
            return await response.text()  # read the response body asynchronously

async def main():
    data = await fetch_data('https://example.com')
    print(data)

asyncio.run(main())

Explanation:

  • async def defines async functions

  • await waits for the HTTP response

  • asyncio.run() starts the event loop

Making Multiple HTTP Requests Concurrently

To run multiple requests at the same time, create tasks and run them together.

async def fetch_data(session, url):
    async with session.get(url) as response:
        return await response.text()

async def main():
    urls = [
        'https://example.com',
        'https://httpbin.org/get',
        'https://jsonplaceholder.typicode.com/posts/1'
    ]

    # reuse one session for all requests instead of opening one per call
    async with aiohttp.ClientSession() as session:
        tasks = [fetch_data(session, url) for url in urls]
        results = await asyncio.gather(*tasks)  # run all tasks concurrently

    for result in results:
        print(result[:100])  # print the first 100 characters of each response

asyncio.run(main())

Explanation:

  • All requests start together

  • asyncio.gather() waits for all tasks

  • Results are returned in order

Handling Errors in Async HTTP Requests

Errors must be handled properly to avoid crashes.

async def fetch_data(session, url):
    try:
        async with session.get(url) as response:
            return await response.text()
    except Exception as e:
        return f"Error fetching {url}: {e}"

This ensures one failed request does not stop others.
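
Another option, building on the urls and tasks from the earlier example, is to let asyncio.gather() collect exceptions instead of raising the first one, using its return_exceptions flag:

results = await asyncio.gather(*tasks, return_exceptions=True)

for url, result in zip(urls, results):
    if isinstance(result, Exception):
        print(f'{url} failed: {result}')      # this request raised an error
    else:
        print(f'{url} returned {len(result)} characters')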

Limiting Concurrent Requests

Sending too many requests at once can overload servers. Use a semaphore to limit concurrency.

semaphore = asyncio.Semaphore(5)  # allow at most 5 requests at a time

async def fetch_with_limit(session, url):
    async with semaphore:  # waits here if 5 requests are already in flight
        async with session.get(url) as response:
            return await response.text()

This limits the number of active requests at any time.
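
A fuller sketch of how the semaphore fits into a complete program; the URLs are placeholders, and the semaphore is created inside main() so it binds to the running event loop (a requirement on Python versions before 3.10):

import asyncio
import aiohttp

async def fetch_with_limit(semaphore, session, url):
    async with semaphore:  # at most 5 coroutines pass this point at once
        async with session.get(url) as response:
            return await response.text()

async def main():
    urls = [f'https://httpbin.org/get?page={i}' for i in range(20)]  # placeholder URLs
    semaphore = asyncio.Semaphore(5)  # created inside the running event loop
    async with aiohttp.ClientSession() as session:
        tasks = [fetch_with_limit(semaphore, session, url) for url in urls]
        results = await asyncio.gather(*tasks)
    print(f'Fetched {len(results)} responses')

asyncio.run(main())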

When asyncio Is the Right Choice

asyncio works best for:

  • API calls

  • Web scraping

  • Microservices communication

  • I/O-bound workloads

It is not ideal for CPU-heavy tasks like image processing or machine learning.

Comparison Table: requests vs asyncio vs threading

The table below compares common approaches for making HTTP requests in Python.

Aspect                 requests (sync)    asyncio (async)             threading
Concurrency model      Blocking           Non-blocking event loop     OS threads
Performance for I/O    Low                High                        Medium
Resource usage         Low                Very low                    Higher (threads)
Complexity             Very simple        Moderate                    Moderate
Best use case          Few requests       Many concurrent requests    Mixed workloads

Real-World API Batching Examples

In real applications, APIs often limit how many requests you can send at once. Batching helps control load and improves reliability.

Example: Fetch user details in batches of 5 URLs at a time.

async def fetch_batch(session, urls):
    # start every request in the batch together and wait for all of them
    tasks = [fetch_data(session, url) for url in urls]
    return await asyncio.gather(*tasks)

async def main():
    all_urls = [f'https://api.example.com/users/{i}' for i in range(1, 21)]

    async with aiohttp.ClientSession() as session:
        for i in range(0, len(all_urls), 5):  # step through the URLs 5 at a time
            batch = all_urls[i:i+5]
            results = await fetch_batch(session, batch)
            print(f'Batch {i//5 + 1} completed')

asyncio.run(main())

This approach avoids overwhelming the API while still running requests concurrently.
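
If the API also enforces a rate limit, a short pause between batches helps you stay under it. Only the loop inside main() changes, and the one-second delay below is an arbitrary example:

for i in range(0, len(all_urls), 5):
    batch = all_urls[i:i+5]
    results = await fetch_batch(session, batch)
    print(f'Batch {i//5 + 1} completed')
    await asyncio.sleep(1)  # arbitrary pause before the next batch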

Timeouts, Retries, and Backoff Strategies

Network calls can fail due to slow servers or temporary issues. Adding timeouts and retries makes your code more reliable.

Adding Timeouts

timeout = aiohttp.ClientTimeout(total=10)  # give up after 10 seconds overall
async with aiohttp.ClientSession(timeout=timeout) as session:
    async with session.get(url) as response:
        data = await response.text()
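
A timeout can also be applied to an individual request rather than the whole session; a per-request timeout overrides the session-level one:

async with session.get(url, timeout=aiohttp.ClientTimeout(total=5)) as response:
    data = await response.text()  # this request alone gets a 5-second budget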

Simple Retry Logic

async def fetch_with_retry(session, url, retries=3):
    for attempt in range(retries):
        try:
            async with session.get(url) as response:
                return await response.text()
        except Exception:
            if attempt == retries - 1:
                raise  # out of attempts, propagate the last error

Exponential Backoff

import asyncio

async def fetch_with_backoff(session, url, retries=3):
    delay = 1
    for attempt in range(retries):
        try:
            async with session.get(url) as response:
                return await response.text()
        except Exception:
            if attempt == retries - 1:
                raise  # give up after the final attempt
            await asyncio.sleep(delay)  # wait before retrying
            delay *= 2  # double the wait: 1s, 2s, 4s, ...

Timeouts and backoff strategies prevent your application from failing under temporary network issues.

Common Mistakes to Avoid

Beginners often make these mistakes:

  • Using requests with asyncio

  • Forgetting await (see the snippet after this list)

  • Creating a new session for every request

  • Not handling exceptions

Avoiding these mistakes improves reliability and performance.
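
The forgetting-await mistake is easy to miss because the code runs without an immediate error; the call simply returns an unawaited coroutine object instead of the response text:

data = fetch_data(session, url)        # wrong: data is a coroutine object
data = await fetch_data(session, url)  # right: runs the coroutine and returns the text

Python usually prints a "coroutine ... was never awaited" warning in this situation, which is the clue to look for.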

Summary

Using Python’s asyncio for concurrent HTTP requests allows your programs to run faster and more efficiently. By combining asyncio, async/await, and aiohttp, you can send multiple HTTP requests at the same time, handle errors safely, and control concurrency. This approach is ideal for modern Python applications that depend on network calls and external APIs.