Demystifying Asynchronous Programming in Python: Practical Examples with Async and Await
Asynchronous programming has become increasingly popular in recent years, particularly for web development and other I/O-bound tasks. With the introduction of the async/await syntax in Python 3.5, writing asynchronous code has become much more convenient. In this article, we'll demystify asynchronous programming in Python by focusing on the fundamentals of async/await and examining practical examples that use only the built-in language features and FastAPI.
What is Asynchronous Programming?
Asynchronous programming is a paradigm that enables the concurrent execution of tasks without blocking the flow of execution. It contrasts with synchronous programming, where tasks are executed sequentially, which often results in longer response times and wasted resources while waiting for external data, such as network requests or file I/O.
Asynchronous programming is particularly beneficial for web servers that handle multiple simultaneous requests. These servers can process requests concurrently, improving overall performance and responsiveness. When a request requires access to an external resource, the server can continue processing other requests instead of waiting.
In summary, asynchronous programming allows for more efficient and responsive applications by executing multiple tasks concurrently without blocking. Web servers can greatly benefit from this approach, as it enables them to handle multiple requests simultaneously, improving performance and responsiveness.
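As a rough illustration of the difference, here is a minimal sketch that uses asyncio.sleep (introduced in the next section) as a stand-in for real I/O such as a network call; two waits that would take three seconds back to back finish in about two seconds when run concurrently:
import asyncio
import time

async def fake_io(seconds: float):
    # Stand-in for an I/O-bound operation (network request, file read, ...)
    await asyncio.sleep(seconds)

async def main():
    start = time.perf_counter()
    # Run both waits concurrently instead of one after the other
    await asyncio.gather(fake_io(1), fake_io(2))
    print(f"Done in {time.perf_counter() - start:.1f}s")  # ~2.0s instead of 3.0s

asyncio.run(main())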
Async and Await in Python
The async/await syntax in Python provides a simple and powerful way to write asynchronous code. The async keyword is used to define a coroutine function, and the await keyword is used to call a coroutine (and wait for its result) from within another coroutine.
Here's a simple example to illustrate the concept:
import asyncio

async def say_hello():
    print("Hello,")
    await asyncio.sleep(1)
    print("world!")

async def main():
    await say_hello()

asyncio.run(main())
In this example, we first import the asyncio module, which is part of the Python standard library and provides tools for writing asynchronous code.
We then define an asynchronous function, or coroutine, called say_hello using the async keyword. Inside this coroutine, we print "Hello," and then use the await keyword to call the asynchronous asyncio.sleep function. This function puts the current coroutine on hold for the specified duration (1 second in this case) while allowing other coroutines to continue executing. After the sleep duration is over, execution resumes and "world!" is printed.
Next, we define another coroutine called main, which serves as the entry point for our program. Inside main, we use the await keyword to call the say_hello coroutine. This means that the main coroutine will wait for say_hello to complete before continuing.
Finally, we use the asyncio.run function to execute the main coroutine. This function takes a coroutine as its argument and runs it as the main entry point of the asynchronous program.
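To see that awaiting asyncio.sleep really does yield control to other coroutines, here is a small variation (a minimal sketch; the greet coroutine and its arguments are made up for illustration) that schedules two coroutines with asyncio.create_task and lets their sleeps overlap:
import asyncio

async def greet(name: str):
    print(f"Hello, {name}!")
    await asyncio.sleep(1)  # yields control so the other coroutine can run
    print(f"Goodbye, {name}!")

async def main():
    # Schedule both coroutines on the event loop, then wait for both to finish
    task_one = asyncio.create_task(greet("Alice"))
    task_two = asyncio.create_task(greet("Bob"))
    await task_one
    await task_two

asyncio.run(main())
Both greetings are printed immediately, and both goodbyes appear roughly one second later, because the two sleeps overlap instead of running back to back.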
Asynchronous Task Scheduling in FastAPI
You can also schedule asynchronous tasks to run in the background in FastAPI. In this example, we'll define two tasks and schedule them with FastAPI's built-in BackgroundTasks feature so that the endpoint can respond without waiting for them to finish.
import asyncio
from fastapi import FastAPI, BackgroundTasks

app = FastAPI()

async def task_a():
    await asyncio.sleep(2)
    print("Task A completed.")

async def task_b():
    await asyncio.sleep(3)
    print("Task B completed.")

@app.get("/run-tasks")
async def run_tasks(background_tasks: BackgroundTasks):
    background_tasks.add_task(task_a)
    background_tasks.add_task(task_b)
    return {"message": "Tasks scheduled"}
Here we demonstrate asynchronous task scheduling in FastAPI using the built-in BackgroundTasks feature. We define two coroutines, task_a and task_b, which simulate tasks of different durations using asyncio.sleep. In the FastAPI endpoint run_tasks, we accept a background_tasks parameter and use it to schedule both tasks. When the /run-tasks endpoint is called, the tasks are queued to run after the response has been sent, allowing the endpoint to return immediately without waiting for the tasks to complete.
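Note that BackgroundTasks runs the queued functions after the response, one after another, rather than concurrently with each other. If you do want the two coroutines to overlap and are willing to wait for them before responding, you can await them together inside the endpoint; here is a minimal sketch (reusing task_a and task_b from above, with a hypothetical /run-tasks-together route):
@app.get("/run-tasks-together")
async def run_tasks_together():
    # Run both coroutines concurrently and wait for both (~3 seconds, not 5)
    await asyncio.gather(task_a(), task_b())
    return {"message": "Tasks completed"}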
This asynchronous task scheduling can be used in various real-world scenarios where tasks need to be performed in the background without blocking the main application flow. These tasks might include:
- Sending emails or notifications: When a user submits a form or triggers an event, you may need to send emails or notifications to other users. Scheduling these tasks in the background allows the web application to respond quickly to the user's request while the emails or notifications are sent asynchronously (see the sketch after this list).
- Data processing or analysis: For applications that require processing large datasets or performing complex calculations, background tasks can be used to offload this work from the main request handler. This ensures that the application remains responsive to user requests while the data processing is being carried out.
- Periodic tasks or cron jobs: In some cases, applications may need to perform periodic tasks like cleaning up old data, generating reports, or updating external systems. By scheduling these as background tasks, you can ensure they run without affecting the application's responsiveness.
- Web scraping or data fetching: If your application relies on data fetched from external sources, such as APIs or websites, you can schedule these data-fetching tasks in the background to avoid blocking the main application flow.
- File processing: When users upload large files or your application needs to generate files (e.g., PDFs, images), you can process these files in the background while allowing the application to continue handling user requests.
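For example, the email case from the first bullet might look like the following minimal sketch (send_welcome_email, the /signup route, and its email parameter are hypothetical placeholders rather than a real email integration):
from fastapi import FastAPI, BackgroundTasks

app = FastAPI()

async def send_welcome_email(address: str):
    # Placeholder: a real implementation would talk to an SMTP server or email API
    print(f"Sending welcome email to {address}")

@app.post("/signup")
async def signup(email: str, background_tasks: BackgroundTasks):
    # Queue the email and respond right away; it is sent after the response
    background_tasks.add_task(send_welcome_email, email)
    return {"message": "Signed up"}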
By leveraging FastAPI's BackgroundTasks feature for these use cases, you can create more efficient and responsive web applications that handle background work without impacting the user experience.
Asynchronous File Upload in FastAPI
Asynchronous programming can also be useful for handling file uploads efficiently. In this example, we'll create an asynchronous file upload endpoint using FastAPI.
import aiofiles
from fastapi import FastAPI, UploadFile

app = FastAPI()

async def save_file(file: UploadFile):
    # Write the uploaded contents to disk without blocking the event loop
    async with aiofiles.open(f"bucket/{file.filename}", "wb") as f:
        await f.write(await file.read())

@app.post("/upload-file")
async def upload_file(file: UploadFile):
    await save_file(file)
    return {"filename": file.filename}
In this example, we define a coroutine function save_file that saves an uploaded file to disk using the aiofiles library, which provides an asynchronous file API. We then create an asynchronous endpoint upload_file in FastAPI that awaits the save_file coroutine and returns the uploaded file's name.
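For large uploads you may not want to read the entire file into memory at once. Here is a minimal sketch of a chunked variant (the save_file_chunked name and the 1 MiB chunk size are arbitrary choices for illustration):
import aiofiles
from fastapi import UploadFile

async def save_file_chunked(file: UploadFile):
    # Copy the upload to disk in 1 MiB chunks instead of reading it all at once
    async with aiofiles.open(f"bucket/{file.filename}", "wb") as f:
        while chunk := await file.read(1024 * 1024):
            await f.write(chunk)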
Rate Limiting with Asynchronous Semaphores
Asynchronous programming can also help manage resource usage by implementing rate limiting. In this example, we'll use an asynchronous semaphore to limit the number of concurrent requests to an external API.
import asyncio
from fastapi import FastAPI, HTTPException
from httpx import AsyncClient

app = FastAPI()

# Allow at most 5 concurrent requests to the external API
semaphore = asyncio.Semaphore(5)

async def fetch_data():
    async with AsyncClient() as client:
        async with semaphore:
            response = await client.get("https://jsonplaceholder.typicode.com/todos/1")
            if response.status_code != 200:
                raise HTTPException(status_code=503, detail="External API request failed")
            return response.json()

@app.get("/data")
async def get_data():
    data = await fetch_data()
    return data
This example demonstrates how to implement rate limiting for concurrent requests to an external API using asynchronous semaphores. Semaphores are synchronization primitives that can be used to limit access to a shared resource, in this case, the number of concurrent API requests.
We create an asynchronous semaphore with a limit of 5 concurrent requests. The fetch_data coroutine acquires the semaphore before making an external API request using the httpx library, which ensures that no more than 5 requests are made concurrently. If the semaphore limit is reached, additional requests wait for an available slot. If the request fails, an HTTP exception is raised to inform the client.
By implementing rate limiting with async semaphores, you can manage the number of concurrent requests to external APIs or other shared resources, helping to prevent overloading these systems and ensuring your application remains stable and efficient.
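To watch the semaphore at work outside of FastAPI, the following minimal sketch (the limited_job coroutine and its timings are made up for illustration) fires ten jobs at once but lets only three run at a time:
import asyncio

semaphore = asyncio.Semaphore(3)  # at most three jobs in flight

async def limited_job(job_id: int):
    async with semaphore:
        print(f"Job {job_id} started")
        await asyncio.sleep(1)  # stand-in for a slow external call
        print(f"Job {job_id} finished")

async def main():
    await asyncio.gather(*(limited_job(i) for i in range(10)))

asyncio.run(main())  # jobs complete in waves of three, about 4 seconds in total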
Conclusions
In conclusion, asynchronous programming in Python, utilizing the async/await syntax, enables the development of more efficient and responsive applications, particularly when working with I/O-bound tasks. The practical examples provided in this article demonstrate the versatility of async/await in various use cases, such as FastAPI endpoints, task scheduling, file uploads, and rate limiting.
Implementing asynchronous programming in your projects can lead to significant performance improvements, especially when dealing with web servers or applications that require access to external resources. By understanding and employing asynchronous programming techniques, you can optimize your applications and deliver a better user experience.
It is essential to continue exploring the world of asynchronous programming in Python, as there are more tools and libraries available to help you harness the full potential of async/await. Mastering these concepts will enable you to tackle complex projects with ease and create highly efficient, scalable, and responsive applications.
tl;dr;
- Asynchronous programming enables concurrent execution of tasks, improving efficiency and responsiveness.
- Async/await syntax in Python simplifies writing asynchronous code.
- FastAPI is a modern web framework that supports asynchronous programming.
- Practical examples include asynchronous FastAPI endpoints, task scheduling, file uploads, and rate limiting.
- Asynchronous programming is particularly beneficial for web servers and applications dealing with external resources.
- Mastering async/await concepts allows for the creation of highly efficient, scalable, and responsive applications.