{"id":3397,"date":"2025-10-28T09:58:14","date_gmt":"2025-10-28T09:58:14","guid":{"rendered":"https:\/\/www.certkiller.com\/blog\/?p=3397"},"modified":"2025-10-28T09:58:14","modified_gmt":"2025-10-28T09:58:14","slug":"mastering-async-io-in-python-how-event-loops-coroutines-and-tasks-enable-efficient-non-blocking-code","status":"publish","type":"post","link":"https:\/\/www.certkiller.com\/blog\/mastering-async-io-in-python-how-event-loops-coroutines-and-tasks-enable-efficient-non-blocking-code\/","title":{"rendered":"Mastering Async IO in Python: How Event Loops, Coroutines, and Tasks Enable Efficient Non-Blocking Code"},"content":{"rendered":"<p><span style=\"font-weight: 400;\">Concurrency is a fundamental concept in computing. It is the process where a system can manage multiple tasks in an overlapping manner. This does not mean the tasks are all running at the exact same instant. Instead, it means a task can be started, it can run for a while, and then it can be paused. While it is paused, another task can be started and allowed to run. The system effectively juggles these tasks, switching between them to make progress on all of them. Their execution periods overlap, giving the illusion of simultaneous execution.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">This is different from parallelism. Parallelism is when multiple tasks are running at the exact same physical time. This requires hardware with multiple processing units, such as a multi-core CPU. If you have four CPU cores, you can run four tasks in true parallel. Concurrency, on the other hand, can be achieved even on a single-core CPU. It is a structural property of a program, while parallelism is a runtime behavior dependent on hardware. The source article correctly notes that concurrency is a broader term than parallelism.<\/span><\/p>\n<h2><b>The Problem: I\/O Bound vs. 
CPU Bound<\/b><\/h2>\n<p><span style=\"font-weight: 400;\">To understand why we need async IO, we must first understand the two main types of tasks a program performs. The first type is &#8220;CPU-bound.&#8221; These are tasks that involve heavy computation. Think of resizing a large image, encrypting a file, or calculating a complex mathematical model. The task is limited by the speed of the CPU. To make these tasks faster, you generally need a faster CPU or multiple CPUs (parallelism).<\/span><\/p>\n<p><span style=\"font-weight: 400;\">The second type is &#8220;I\/O-bound.&#8221; I\/O stands for &#8220;Input\/Output.&#8221; These are tasks that spend most of their time waiting for something else. This &#8220;something else&#8221; could be a response from a web server, a query to a database, or reading a large file from a slow hard drive. While the program is waiting for the network or the disk, the CPU is sitting idle, doing nothing. This is the specific problem that asynchronous programming is designed to solve. It aims to put the CPU to work while it would otherwise be waiting.<\/span><\/p>\n<h2><b>Traditional Solutions: Threads and Multiprocessing<\/b><\/h2>\n<p><span style=\"font-weight: 400;\">Before async IO became popular, Python offered two primary tools for concurrency. The first is threading. A thread is a separate flow of execution. Your operating system can manage multiple threads, switching between them. If one thread is blocked waiting for a network request, the operating system can switch to another thread to keep the CPU busy. However, due to Python&#8217;s Global Interpreter Lock (GIL), only one thread can execute Python bytecode at a time, so it does not provide true parallelism for CPU-bound tasks.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">The second tool is multiprocessing. This module bypasses the GIL by creating entirely new processes. Each process gets its own Python interpreter and memory space. 
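<\/span><\/p>
<p><span style=\"font-weight: 400;\">The thread-based approach described above can be sketched with the standard concurrent.futures module. This is an illustrative snippet (not from the source article); time.sleep stands in for a blocking I\/O call:<\/span><\/p>

```python
import time
from concurrent.futures import ThreadPoolExecutor

def slow_io(n):
    # time.sleep stands in for a blocking call such as a network request
    time.sleep(0.2)
    return n * 2

start = time.time()
with ThreadPoolExecutor(max_workers=4) as pool:
    # four blocking waits run in four OS threads, so they overlap
    results = list(pool.map(slow_io, range(4)))
elapsed = time.time() - start

print(results)        # [0, 2, 4, 6]
print(elapsed < 0.7)  # True: roughly 0.2 s total, not 0.8 s
```

<p><span style=\"font-weight: 400;\">Because the waits overlap, the total time is close to the longest single wait, which is exactly the behavior asyncio later achieves without extra threads.<\/span><\/p>
<p><span style=\"font-weight: 400;\">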
This allows for true parallelism and is excellent for CPU-bound tasks. However, creating and managing processes is &#8220;heavy.&#8221; It uses more memory, and communicating between processes is more complex than communicating between threads. Both of these solutions are powerful, but they also introduce complexity and resource overhead.<\/span><\/p>\n<h2><b>What is Async IO in Python?<\/b><\/h2>\n<p><span style=\"font-weight: 400;\">Async IO, provided by the asyncio library, is Python&#8217;s third approach to concurrency. It is a framework for writing single-threaded, concurrent code using a concept called &#8220;cooperative multitasking.&#8221; Instead of relying on the operating system to switch between threads, or creating new processes, asyncio manages everything within a single thread. It does this by using an &#8220;event loop&#8221; and special functions called &#8220;coroutines.&#8221; This approach is particularly well-suited for high-performance I\/O-bound applications.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">As the source article states, asyncio is a library package used to write concurrent code using the async and await syntax. It can be imported simply with import asyncio. This model allows a program to manage tens of thousands of concurrent I\/O operations (like network connections) far more efficiently than using a separate thread for each one. It achieves concurrency without the overhead of threads or the complexity of multiprocessing.<\/span><\/p>\n<h2><b>Setting up Your Environment<\/b><\/h2>\n<p><span style=\"font-weight: 400;\">The first step to using asyncio is ensuring you have the correct Python version. As the source material notes, you will need Python 3.7 or a newer version. 
These versions have greatly simplified the asyncio API, particularly with the introduction of asyncio.run(), which makes it much easier to get started.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">It is a universal best practice in Python development to use a virtual environment for every project. A virtual environment is an isolated directory that contains a specific version of Python and its own set of installed libraries. This prevents conflicts between the dependencies of different projects. As shown in the source, you can create one using the venv module that comes built-in with Python.<\/span><\/p>\n<h2><b>Creating and Activating a Virtual Environment<\/b><\/h2>\n<p><span style=\"font-weight: 400;\">To create a virtual environment, open your terminal or command prompt, navigate to your project&#8217;s folder, and run the command: python -m venv myenv. This command creates a new directory named &#8220;myenv&#8221; which will hold your environment&#8217;s files. This only needs to be done once per project.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">To &#8220;activate&#8221; or start using the environment, you must run a special script. On Windows, as the source article mentions, the command is myenv\\Scripts\\activate. On Unix-based systems like macOS or Linux, the command is source myenv\/bin\/activate. Once activated, your terminal prompt will usually change to show the environment&#8217;s name. Now, any pip commands you run will install packages into this isolated environment, leaving your global Python installation clean.<\/span><\/p>\n<h2><b>Installing Asynchronous Libraries<\/b><\/h2>\n<p><span style=\"font-weight: 400;\">With your virtual environment activated, you can now install the libraries you need. The asyncio library itself is part of Python&#8217;s standard library, so it does not need to be installed. 
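<\/span><\/p>
<p><span style=\"font-weight: 400;\">The setup steps above can be condensed into a few commands (the myenv name comes from the article; the pip line is shown as a comment because it needs network access):<\/span><\/p>

```shell
python3 -m venv myenv            # create the environment (once per project)
. myenv/bin/activate             # on Windows: myenv\Scripts\activate
# pip install aiohttp aioredis   # installs into myenv only, not globally
python -c "import sys; print(sys.prefix != sys.base_prefix)"   # prints True inside a venv
```

<p><span style=\"font-weight: 400;\">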
However, to do anything useful with it, you need third-party libraries that <\/span><i><span style=\"font-weight: 400;\">support<\/span><\/i><span style=\"font-weight: 400;\"> asyncio. A normal library, like the popular requests library for HTTP, will not work. Calling a blocking function from a synchronous library will freeze your entire asyncio application.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">You must use &#8220;async-native&#8221; libraries. The source article suggests two popular ones. aiohttp is an &#8220;asynchronous HTTP client\/server&#8221; framework. It is the asyncio equivalent of requests. aioredis is an asynchronous library for &#8220;Redis,&#8221; a popular in-memory database. To install them, you use pip, as shown in the source: pip install aiohttp aioredis. These libraries are built from the ground up to be non-blocking and to integrate perfectly with asyncio.<\/span><\/p>\n<h2><b>Async IO &#8220;Hello, World&#8221;<\/b><\/h2>\n<p><span style=\"font-weight: 400;\">Let us look at the simple example from the source article. This is the &#8220;Hello, World&#8221; of asyncio.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">Python<\/span><\/p>\n<p><span style=\"font-weight: 400;\">import asyncio<\/span><\/p>\n<p>&nbsp;<\/p>\n<p><span style=\"font-weight: 400;\">async def hello():<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0print(\"Hello\")<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0await asyncio.sleep(1)<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0print(\"World\")<\/span><\/p>\n<p>&nbsp;<\/p>\n<p><span style=\"font-weight: 400;\">asyncio.run(hello())<\/span><\/p>\n<p>&nbsp;<\/p>\n<p><span style=\"font-weight: 400;\">This short program demonstrates all the core concepts. The import asyncio line brings in the library. The async def syntax defines a &#8220;coroutine&#8221; function named hello. 
Inside this function, print(&#8220;Hello&#8221;) is a normal, synchronous function call.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">The most important line is await asyncio.sleep(1). This is the asynchronous, non-blocking part. asyncio.sleep() is an async function that tells the program to &#8220;pause here for one second.&#8221; The await keyword gives control back to the asyncio event loop. The loop can then go and do other work during that one second instead of just freezing. After the second is up, the loop resumes the hello function, which then runs print(&#8220;World&#8221;). Finally, asyncio.run(hello()) is the main entry point that starts the event loop and runs the hello coroutine.<\/span><\/p>\n<h2><b>Understanding Coroutines<\/b><\/h2>\n<p><span style=\"font-weight: 400;\">As the source article highlights, coroutines are a critical part of asyncio. But what are they? A coroutine is, in essence, a function that can be paused and resumed. When you define a function using async def, you are not creating a normal function. When you call it, like hello(), it does not immediately run the code inside. Instead, it returns a &#8220;coroutine object.&#8221;<\/span><\/p>\n<p><span style=\"font-weight: 400;\">This object is like a blueprint for the work that needs to be done. It contains the code, the current state, and its context. It does not do anything until you hand it over to the asyncio event loop. The loop is responsible for actually executing, pausing, and resuming the coroutine. This ability to pause is what enables cooperative multitasking. A coroutine can run until it hits an await on a slow I\/O operation, and it &#8220;cooperatively&#8221; pauses itself, yielding control back to the loop so another coroutine can run.<\/span><\/p>\n<h2><b>Introducing the async and await Keywords<\/b><\/h2>\n<p><span style=\"font-weight: 400;\">The async and await keywords are the special syntax that makes all of this work. 
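<\/span><\/p>
<p><span style=\"font-weight: 400;\">Before looking at each keyword in turn, the earlier point that calling an async def function only builds a coroutine object is easy to verify (a small sketch; the greet name is just for illustration):<\/span><\/p>

```python
import asyncio

async def greet():
    return "Hello"

coro = greet()               # nothing has run yet; this only creates a coroutine object
print(type(coro).__name__)   # coroutine
print(asyncio.run(coro))     # Hello -- the event loop actually executes the body
```

<p><span style=\"font-weight: 400;\">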
The async keyword is used in two places. First, as we have seen, async def is used to declare a coroutine function. This flags it to Python as a special function that must be managed by an event loop.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">Second, async for and async with are used to create asynchronous loops and context managers, which are more advanced features. The await keyword, as its name suggests, is used to &#8220;await&#8221; the result of another asynchronous operation. You can only use the await keyword <\/span><i><span style=\"font-weight: 400;\">inside<\/span><\/i><span style=\"font-weight: 400;\"> an async def function. When you await something, you are telling the asyncio loop, &#8220;I am pausing my execution here until this other thing is complete. You can go run other tasks in the meantime.&#8221; This is the explicit &#8220;pause&#8221; point.<\/span><\/p>\n<h2><b>A Deeper Dive into Coroutines<\/b><\/h2>\n<p><span style=\"font-weight: 400;\">In the first part, we introduced coroutines as special functions defined with async def. Now, let us explore them more deeply. The concept of a coroutine actually predates asyncio. It comes from the idea of &#8220;generators&#8221; in Python. Generators, which use the yield keyword, are functions that can be paused. When a generator &#8220;yields&#8221; a value, its state is frozen, and it returns the value. It can then be resumed from that exact point later.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">Coroutines are a supercharged version of this idea. A coroutine defined with async def is a function that can pause its execution when it encounters an await expression. It pauses, allowing other code to run, and then resumes when the awaited operation is complete. 
This &#8220;pause&#8221; is the cooperative part of &#8220;cooperative multitasking.&#8221; The coroutine explicitly and voluntarily gives up control, rather than being forcibly interrupted by an operating system scheduler like a thread is.<\/span><\/p>\n<h2><b>What is an &#8220;Awaitable&#8221;?<\/b><\/h2>\n<p><span style=\"font-weight: 400;\">The await keyword is used to pause a coroutine, but you cannot await just anything. You can only await an object that is &#8220;awaitable.&#8221; In the world of asyncio, there are three main types of awaitable objects: coroutines, Tasks, and Futures.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">A coroutine object is what you get when you call an async def function. As we saw in await asyncio.sleep(1), you can await another coroutine. This is the most common case. You await an I\/O operation from a library (like aiohttp) that is itself an async def function. When you await another coroutine, asyncio transparently schedules it to run, waits for it to finish, and then gives you the result.<\/span><\/p>\n<h2><b>Tasks: The Schedulers of Work<\/b><\/h2>\n<p><span style=\"font-weight: 400;\">A coroutine object on its own does not run. You need to tell asyncio to run it. When you pass a coroutine to asyncio.run(), it gets wrapped in a Task and scheduled. A Task is the primary way asyncio manages the execution of a coroutine: an object that &#8220;wraps&#8221; the coroutine and handles its lifecycle. It schedules the coroutine on the event loop, runs it to its first await, and then resumes it when its awaited operation is complete.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">You can create tasks manually using asyncio.create_task(). This is how you tell asyncio, &#8220;I want this coroutine to start running in the background, concurrently with my current coroutine.&#8221; The Task object is also an awaitable, so you can await it later to get its result or to just make sure it has finished. 
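<\/span><\/p>
<p><span style=\"font-weight: 400;\">A minimal sketch of that result-passing behavior (the work name and the 0.1-second sleep are illustrative, not from the source):<\/span><\/p>

```python
import asyncio

async def work():
    await asyncio.sleep(0.1)  # stand-in for real I/O
    return 42

async def main():
    task = asyncio.create_task(work())  # starts running in the background
    # ...other work could happen here while `work` runs...
    result = await task                 # awaiting the Task gives its result
    print(result)                       # 42

asyncio.run(main())
```

<p><span style=\"font-weight: 400;\">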
Tasks are what make concurrent execution possible.<\/span><\/p>\n<h2><b>Futures: The Low-Level Placeholders<\/b><\/h2>\n<p><span style=\"font-weight: 400;\">A Future is a low-level awaitable object that represents the <\/span><i><span style=\"font-weight: 400;\">eventual result<\/span><\/i><span style=\"font-weight: 400;\"> of an asynchronous operation. When you create a Future, it is like an empty box or a placeholder. You can await this empty box, and your coroutine will pause. At some point, some other part of the system (usually a low-level I\/O component) will put a value into the box (or an error) and mark it as &#8220;done.&#8221; When that happens, any coroutine that was awaiting the Future will be woken up and resumed.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">You will rarely create Future objects manually in modern asyncio code. Task is actually a <\/span><i><span style=\"font-weight: 400;\">subclass<\/span><\/i><span style=\"font-weight: 400;\"> of Future. When you create a Task, you are creating a special kind of Future that knows how to run a coroutine to get its own result. Understanding Futures is helpful for understanding how asyncio works under the hood and for interfacing with older, callback-based code.<\/span><\/p>\n<h2><b>The Central Role of the Event Loop<\/b><\/h2>\n<p><span style=\"font-weight: 400;\">The &#8220;event loop&#8221; is the heart of every asyncio application. It is the central scheduler that manages all the tasks, I\/O operations, and events. You can think of it as the juggler from our earlier analogy. The event loop has a queue of tasks that are ready to run. It picks a task from the queue and runs it. That task&#8217;s coroutine executes its code until it hits an await keyword.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">When a coroutine awaits an operation (like asyncio.sleep(1) or a network request), it is essentially telling the event loop, &#8220;I am going to be busy waiting for 1 second. 
Please pause me and go do something else.&#8221; The event loop pauses that task, places it in a &#8220;waiting&#8221; state, and then moves on to the next task in the &#8220;ready&#8221; queue. This is a single, continuous loop: run a task, pause it at an await, run another task, check if any waiting tasks are done, move them to the ready queue, and repeat.<\/span><\/p>\n<h2><b>What asyncio.run() Really Does<\/b><\/h2>\n<p><span style=\"font-weight: 400;\">In modern Python, you will most often interact with the event loop through a single, high-level function: asyncio.run(). As the source example asyncio.run(hello()) shows, this is the main entry point for an asyncio program. This simple function performs several steps for you.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">First, it creates a brand new event loop. Second, it takes the coroutine object you passed it (e.g., hello()) and wraps it in a Task to schedule it. Third, it starts the event loop, which begins running that task. The loop will continue to run until that initial task, and any other tasks it creates, are all complete. Finally, once the main task is finished, asyncio.run() will shut down the event loop, clean up any resources, and then return the result from the main coroutine.<\/span><\/p>\n<h2><b>Running Tasks Concurrently: asyncio.gather()<\/b><\/h2>\n<p><span style=\"font-weight: 400;\">The hello() example from Part 1 is not concurrent. It does one thing, sleeps, and then does another. The real power of asyncio is shown when you run multiple tasks at once. The source article provides a perfect example of this using asyncio.gather(). 
Let us analyze that example in detail.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">Python<\/span><\/p>\n<p><span style=\"font-weight: 400;\">import asyncio<\/span><\/p>\n<p>&nbsp;<\/p>\n<p><span style=\"font-weight: 400;\">async def task1():<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0print(\"Event 1 started\")<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0await asyncio.sleep(2)<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0print(\"Event 1 completed\")<\/span><\/p>\n<p>&nbsp;<\/p>\n<p><span style=\"font-weight: 400;\">async def task2():<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0print(\"Event 2 started\")<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0await asyncio.sleep(1)<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0print(\"Event 2 completed\")<\/span><\/p>\n<p>&nbsp;<\/p>\n<p><span style=\"font-weight: 400;\">async def main():<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0await asyncio.gather(task1(), task2())<\/span><\/p>\n<p>&nbsp;<\/p>\n<p><span style=\"font-weight: 400;\">asyncio.run(main())<\/span><\/p>\n<p>&nbsp;<\/p>\n<p><span style=\"font-weight: 400;\">Here, we have a main coroutine that we use as our entry point. Inside main, we call asyncio.gather(). This function takes one or more awaitables (in this case, the two coroutine objects task1() and task2()) and schedules them to run concurrently.<\/span><\/p>\n<h2><b>Walking Through the gather() Example<\/b><\/h2>\n<p><span style=\"font-weight: 400;\">When asyncio.run(main()) starts, the main coroutine begins. It immediately hits await asyncio.gather(task1(), task2()). The gather function tells the event loop to start running both task1 and task2. The loop starts task1 first. 
task1 prints &#8220;Event 1 started&#8221; and then immediately hits await asyncio.sleep(2). It pauses and tells the loop, &#8220;Wake me up in 2 seconds.&#8221;<\/span><\/p>\n<p><span style=\"font-weight: 400;\">The event loop, now free, starts task2. task2 prints &#8220;Event 2 started&#8221; and then hits await asyncio.sleep(1). It also pauses, telling the loop, &#8220;Wake me up in 1 second.&#8221; Now the event loop is idle, with two tasks &#8220;sleeping.&#8221; After one second passes, the loop wakes up task2, which prints &#8220;Event 2 completed&#8221; and finishes. After another second (two seconds total), the loop wakes up task1, which prints &#8220;Event 1 completed&#8221; and finishes. The entire operation takes only 2 seconds, not 3 (2 + 1), because the tasks ran concurrently.<\/span><\/p>\n<h2><b>The asyncio.gather() Output<\/b><\/h2>\n<p><span style=\"font-weight: 400;\">The output from the source&#8217;s example is key:<\/span><\/p>\n<p><span style=\"font-weight: 400;\">Event 1 started<\/span><\/p>\n<p><span style=\"font-weight: 400;\">Event 2 started<\/span><\/p>\n<p><span style=\"font-weight: 400;\">Event 2 completed<\/span><\/p>\n<p><span style=\"font-weight: 400;\">Event 1 completed<\/span><\/p>\n<p>&nbsp;<\/p>\n<p><span style=\"font-weight: 400;\">This output perfectly demonstrates the concurrent, overlapping execution. The program starts task1, then immediately starts task2 <\/span><i><span style=\"font-weight: 400;\">before<\/span><\/i><span style=\"font-weight: 400;\"> task1 is finished. task2 finishes first because its sleep duration was shorter. Then, task1 finishes. This is the non-blocking, concurrent behavior in action. 
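<\/span><\/p>
<p><span style=\"font-weight: 400;\">A related detail: besides waiting, gather collects each coroutine&#8217;s return value, ordered by argument position rather than by completion time (an illustrative sketch):<\/span><\/p>

```python
import asyncio

async def square(n):
    # later arguments sleep less here, so they finish sooner
    await asyncio.sleep(0.3 - n * 0.1)
    return n * n

async def main():
    # results come back in argument order: square(1) first, then square(2),
    # even though square(2) completes first
    return await asyncio.gather(square(1), square(2))

print(asyncio.run(main()))  # [1, 4]
```

<p><span style=\"font-weight: 400;\">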
The await asyncio.gather(&#8230;) line will not complete until <\/span><i><span style=\"font-weight: 400;\">all<\/span><\/i><span style=\"font-weight: 400;\"> the tasks passed to it have finished.<\/span><\/p>\n<h2><b>asyncio.create_task() for True Concurrency<\/b><\/h2>\n<p><span style=\"font-weight: 400;\">While asyncio.gather() is great for &#8220;run these things and wait for them all,&#8221; sometimes you want to start a task in the background and continue doing other work. This is done with asyncio.create_task(). This function takes a coroutine and schedules it to run on the loop immediately. It returns a Task object, which you can await later.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">Here is an alternative way to write the main function:<\/span><\/p>\n<p><span style=\"font-weight: 400;\">Python<\/span><\/p>\n<p><span style=\"font-weight: 400;\">async def main():<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0task_1 = asyncio.create_task(task1())<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0task_2 = asyncio.create_task(task2())<\/span><\/p>\n<p>&nbsp;<\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0# We can do other work here while task_1 and task_2 run<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0print(\"Tasks have been created and are running\")<\/span><\/p>\n<p>&nbsp;<\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0# Now, wait for them to finish<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0await task_1<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0await task_2<\/span><\/p>\n<p>&nbsp;<\/p>\n<p><span style=\"font-weight: 400;\">This approach gives you more flexibility. You can create tasks, let them run, do other processing, and then await them at the very end to ensure they are complete before the main function exits. 
This is a more common and powerful pattern for building complex applications.<\/span><\/p>\n<h2><b>Why We Need Async Libraries<\/b><\/h2>\n<p><span style=\"font-weight: 400;\">We have established that asyncio is a framework for managing concurrent I\/O-bound tasks. However, asyncio itself does not know <\/span><i><span style=\"font-weight: 400;\">how<\/span><\/i><span style=\"font-weight: 400;\"> to perform I\/O. It does not come with a built-in HTTP client or a database driver. It provides the event loop and the async\/await syntax, but it relies on a rich ecosystem of third-party libraries to perform the actual I\/O operations.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">A critical, non-negotiable rule of asyncio is that you must <\/span><i><span style=\"font-weight: 400;\">never<\/span><\/i><span style=\"font-weight: 400;\"> call a blocking, synchronous function. If you use the standard requests.get() library in an async function, you will freeze the entire event loop. The requests library is not designed for asyncio; it blocks the thread while waiting for the network. To perform I\/O, you <\/span><i><span style=\"font-weight: 400;\">must<\/span><\/i><span style=\"font-weight: 400;\"> use libraries specifically designed with async def functions, like aiohttp, which was mentioned in the source article.<\/span><\/p>\n<h2><b>The Synchronous Problem: A requests Example<\/b><\/h2>\n<p><span style=\"font-weight: 400;\">Let&#8217;s first look at a synchronous program that fetches data from three different URLs. 
We will use the popular requests library.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">Python<\/span><\/p>\n<p><span style=\"font-weight: 400;\">import requests<\/span><\/p>\n<p><span style=\"font-weight: 400;\">import time<\/span><\/p>\n<p>&nbsp;<\/p>\n<p><span style=\"font-weight: 400;\">def fetch_url(url):<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0print(f\"Fetching {url}...\")<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0requests.get(url)\u00a0 # This is a blocking call<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0print(f\"Done fetching {url}\")<\/span><\/p>\n<p>&nbsp;<\/p>\n<p><span style=\"font-weight: 400;\">start = time.time()<\/span><\/p>\n<p><span style=\"font-weight: 400;\">fetch_url(\"https:\/\/example.com\") # A placeholder, real URLs would be used<\/span><\/p>\n<p><span style=\"font-weight: 400;\">fetch_url(\"https:\/\/example.com\")<\/span><\/p>\n<p><span style=\"font-weight: 400;\">fetch_url(\"https:\/\/example.com\")<\/span><\/p>\n<p><span style=\"font-weight: 400;\">end = time.time()<\/span><\/p>\n<p><span style=\"font-weight: 400;\">print(f\"Finished in {end - start:.2f} seconds\")<\/span><\/p>\n<p>&nbsp;<\/p>\n<p><span style=\"font-weight: 400;\">If each request takes 1 second, this program will run in 3 seconds. First, it fetches URL 1 (1 second). Then, it fetches URL 2 (1 second). Then, it fetches URL 3 (1 second). The execution is sequential. The CPU is idle for most of those 3 seconds, just waiting for the network.<\/span><\/p>\n<h2><b>The Asynchronous Solution: An aiohttp Example<\/b><\/h2>\n<p><span style=\"font-weight: 400;\">Now, let&#8217;s convert that synchronous code into a high-performance asynchronous version using aiohttp. 
First, you must install it: pip install aiohttp.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">Python<\/span><\/p>\n<p><span style=\"font-weight: 400;\">import aiohttp<\/span><\/p>\n<p><span style=\"font-weight: 400;\">import asyncio<\/span><\/p>\n<p><span style=\"font-weight: 400;\">import time<\/span><\/p>\n<p>&nbsp;<\/p>\n<p><span style=\"font-weight: 400;\">async def fetch_url(session, url):<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0print(f\"Fetching {url}...\")<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0async with session.get(url) as response:<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0await response.text() # We must await the I\/O<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0print(f\"Done fetching {url}\")<\/span><\/p>\n<p>&nbsp;<\/p>\n<p><span style=\"font-weight: 400;\">async def main():<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0async with aiohttp.ClientSession() as session:<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0tasks = [<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0fetch_url(session, \"https:\/\/example.com\"),<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0fetch_url(session, \"https:\/\/example.com\"),<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0fetch_url(session, \"https:\/\/example.com\")<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0]<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0await 
asyncio.gather(*tasks)<\/span><\/p>\n<p>&nbsp;<\/p>\n<p><span style=\"font-weight: 400;\">start = time.time()<\/span><\/p>\n<p><span style=\"font-weight: 400;\">asyncio.run(main())<\/span><\/p>\n<p><span style=\"font-weight: 400;\">end = time.time()<\/span><\/p>\n<p><span style=\"font-weight: 400;\">print(f\"Finished in {end - start:.2f} seconds\")<\/span><\/p>\n<p>&nbsp;<\/p>\n<p><span style=\"font-weight: 400;\">If each request takes 1 second, this entire program will run in just over 1 second. It starts all three requests concurrently. It awaits all of them at once. All three &#8220;waiting&#8221; periods overlap, and the program finishes when the <\/span><i><span style=\"font-weight: 400;\">slowest<\/span><\/i><span style=\"font-weight: 400;\"> of the three requests is complete. This is a massive performance gain for I\/O-bound workloads.<\/span><\/p>\n<h2><b>Breaking Down the aiohttp Example<\/b><\/h2>\n<p><span style=\"font-weight: 400;\">Let us analyze the new asynchronous code. We define an async def fetch_url. It now takes a session object. aiohttp recommends using a single ClientSession for making multiple requests, as it efficiently manages the underlying connection pool.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">Inside fetch_url, the line async with session.get(url) as response: is the asynchronous, non-blocking way to make a GET request. We use async with because setting up and tearing down the request are themselves async operations. We then await response.text() to read the content. This is also an I\/O operation, so it must be awaited. In the main function, we create the ClientSession and then build a list of coroutine objects. We use asyncio.gather(*tasks) to run them all concurrently.<\/span><\/p>\n<h2><b>Working with Asynchronous Files<\/b><\/h2>\n<p><span style=\"font-weight: 400;\">Network requests are not the only form of I\/O. What about reading and writing files from the disk? 
By default, Python&#8217;s standard open(), read(), and write() functions are synchronous and blocking. If you try to read a very large 1GB file, your entire asyncio event loop will freeze until the read is complete. This is another trap that can kill your application&#8217;s performance.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">The solution is to use a library designed for asynchronous file operations. The most popular one is aiofiles. You can install it with pip install aiofiles. This library provides an async version of the standard file API, allowing you to await file operations and keep the event loop unblocked.<\/span><\/p>\n<h2><b>A Synchronous File Operation<\/b><\/h2>\n<p><span style=\"font-weight: 400;\">First, let us look at the blocking problem.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">Python<\/span><\/p>\n<p><span style=\"font-weight: 400;\">import asyncio<\/span><\/p>\n<p><span style=\"font-weight: 400;\">import time<\/span><\/p>\n<p>&nbsp;<\/p>\n<p><span style=\"font-weight: 400;\">def write_file_sync():<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0print(\"Writing file...\")<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0with open(\"test.txt\", \"w\") as f:<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0f.write(\"This is a large file.\") # Blocks here<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0time.sleep(2) # Simulating a slow disk write<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0print(\"File write complete\")<\/span><\/p>\n<p>&nbsp;<\/p>\n<p><span style=\"font-weight: 400;\">async def do_something_else():<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0print(\"Doing something else...\")<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0await asyncio.sleep(1)<\/span><\/p>\n<p><span 
style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0print(&#8220;Done with something else&#8221;)<\/span><\/p>\n<p>&nbsp;<\/p>\n<p><span style=\"font-weight: 400;\">If you tried to run write_file_sync() and do_something_else() concurrently, the blocking open() and write() would freeze the loop, and do_something_else would be unable to run until the 2-second simulated write was finished.<\/span><\/p>\n<h2><b>The aiofiles Solution<\/b><\/h2>\n<p><span style=\"font-weight: 400;\">Now, let us fix this using aiofiles.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">Python<\/span><\/p>\n<p><span style=\"font-weight: 400;\">import aiofiles<\/span><\/p>\n<p><span style=\"font-weight: 400;\">import asyncio<\/span><\/p>\n<p><span style=\"font-weight: 400;\">import time<\/span><\/p>\n<p>&nbsp;<\/p>\n<p><span style=\"font-weight: 400;\">async def write_file_async():<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0print(&#8220;Writing file async&#8230;&#8221;)<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0async with aiofiles.open(&#8220;test.txt&#8221;, &#8220;w&#8221;) as f:<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0await f.write(&#8220;This is a large file.&#8221;) # Non-blocking<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0await asyncio.sleep(2) # Simulating slow I\/O<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0print(&#8220;File write complete async&#8221;)<\/span><\/p>\n<p>&nbsp;<\/p>\n<p><span style=\"font-weight: 400;\">async def do_something_else():<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0print(&#8220;Doing something else&#8230;&#8221;)<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0await asyncio.sleep(1)<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0print(&#8220;Done with something 
else&#8221;)<\/span><\/p>\n<p>&nbsp;<\/p>\n<p><span style=\"font-weight: 400;\">async def main():<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0await asyncio.gather(<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0write_file_async(),<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0do_something_else()<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0)<\/span><\/p>\n<p>&nbsp;<\/p>\n<p><span style=\"font-weight: 400;\">asyncio.run(main())<\/span><\/p>\n<p>&nbsp;<\/p>\n<p><span style=\"font-weight: 400;\">In this version, the write_file_async coroutine uses async with aiofiles.open(). When it hits await f.write(), it yields control. The event loop is now free and can immediately start the do_something_else() task, which prints its message and starts its own 1-second sleep. The output will show the tasks running concurrently, and the total time will be around 2 seconds (the duration of the longest task), not 3.<\/span><\/p>\n<h2><b>Working with Asynchronous Databases<\/b><\/h2>\n<p><span style=\"font-weight: 400;\">Databases are the most common source of I\/O blocking in web applications. Just like requests, standard database drivers like psycopg2 (for PostgreSQL) or mysql-connector are synchronous. Calling cursor.execute() will block the event loop. To build a high-performance async application, you must use an async database driver.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">The source article mentioned aioredis, which is an excellent driver for the Redis in-memory database. For traditional SQL databases, you have options like asyncpg for PostgreSQL (known for its incredible speed) or aiomysql for MySQL. 
These libraries provide async def functions for connecting, querying, and fetching results.<\/span><\/p>\n<h2><b>An aioredis Example<\/b><\/h2>\n<p><span style=\"font-weight: 400;\">Let&#8217;s expand on the source&#8217;s suggestion and see a simple aioredis example. First, install it: pip install aioredis. You also need a Redis server running.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">Python<\/span><\/p>\n<p><span style=\"font-weight: 400;\">import aioredis<\/span><\/p>\n<p><span style=\"font-weight: 400;\">import asyncio<\/span><\/p>\n<p>&nbsp;<\/p>\n<p><span style=\"font-weight: 400;\">async def main():<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0# Connect to the Redis server<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0redis = await aioredis.from_url(&#8220;redis:\/\/localhost&#8221;)<\/span><\/p>\n<p>&nbsp;<\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0# Set a value, this is a non-blocking I\/O operation<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0await redis.set(&#8220;my_key&#8221;, &#8220;hello_async&#8221;)<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0print(&#8220;Set &#8216;my_key&#8217; in Redis&#8221;)<\/span><\/p>\n<p>&nbsp;<\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0# Get the value back, also non-blocking<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0value = await redis.get(&#8220;my_key&#8221;)<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0print(f&#8221;Got value: {value.decode(&#8216;utf-8&#8217;)}&#8221;)<\/span><\/p>\n<p>&nbsp;<\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0# Close the connection<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0await redis.close()<\/span><\/p>\n<p>&nbsp;<\/p>\n<p><span style=\"font-weight: 
400;\">asyncio.run(main())<\/span><\/p>\n<p>&nbsp;<\/p>\n<p><span style=\"font-weight: 400;\">Every interaction with the Redis server, such as await redis.set(&#8230;) and await redis.get(&#8230;), is an await point. This means that while your program is waiting for the Redis server to respond over the network, the event loop is free to run other tasks. This allows a single server to handle thousands of concurrent database operations efficiently.<\/span><\/p>\n<h2><b>Why This Ecosystem Matters<\/b><\/h2>\n<p><span style=\"font-weight: 400;\">These examples\u2014aiohttp, aiofiles, and aioredis\u2014illustrate the core pattern of asyncio development. The asyncio library itself is the engine, and the aio* libraries are the car&#8217;s components (the wheels, the transmission, the steering). You cannot build a useful application with <\/span><i><span style=\"font-weight: 400;\">just<\/span><\/i><span style=\"font-weight: 400;\"> asyncio. You must build it with the rich ecosystem of libraries that have been designed to be non-blocking and to speak the async\/await language.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">When choosing a new database, message queue, or API client for your asyncio project, the very first question you must ask is, &#8220;Does it have an asyncio-native library?&#8221; If the answer is no, you will either need to find an alternative or use advanced techniques (like asyncio.to_thread(), which we will cover later) to avoid blocking the event loop.<\/span><\/p>\n<h2><b>Beyond asyncio.gather()<\/b><\/h2>\n<p><span style=\"font-weight: 400;\">In Part 2, we introduced asyncio.gather() as the primary way to run a list of tasks and wait for them all to complete. This is a fantastic tool for the common batch scenario, where you want to launch a group of jobs together and wait for all of their results. 
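<\/span><\/p>\n<p><span style=\"font-weight: 400;\">One detail worth keeping in mind: gather() returns its results in the order the awaitables were passed in, not the order in which they finished. A minimal, self-contained sketch (the job names and delays are invented for illustration):<\/span><\/p>

```python
import asyncio

async def job(name: str, delay: float) -> str:
    # Simulate an I/O-bound job that takes `delay` seconds.
    await asyncio.sleep(delay)
    return name

async def main() -> list:
    # "slow" finishes last, but its result still comes first,
    # because gather() orders results by argument position.
    return await asyncio.gather(job("slow", 0.2), job("fast", 0.1))

results = asyncio.run(main())
print(results)  # ['slow', 'fast']
```

<p><span style=\"font-weight: 400;\">This ordering guarantee means you can safely zip the results back up with the inputs that produced them.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">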
However, more complex applications require more fine-grained control over their concurrent tasks.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">What if you want to set a timeout on a task? What if you want to stop a task that is running for too long? What if you do not want to wait for <\/span><i><span style=\"font-weight: 400;\">all<\/span><\/i><span style=\"font-weight: 400;\"> tasks, but just the <\/span><i><span style=\"font-weight: 400;\">first one<\/span><\/i><span style=\"font-weight: 400;\"> to finish? asyncio provides a powerful set of functions for these advanced management scenarios, including asyncio.wait_for(), asyncio.wait(), and the Task object&#8217;s cancel() method.<\/span><\/p>\n<h2><b>Handling Timeouts with asyncio.wait_for()<\/b><\/h2>\n<p><span style=\"font-weight: 400;\">A very common problem in I\/O operations is that a remote server might be slow or completely unresponsive. Your program could await a network request and get stuck waiting forever, freezing that part of your application. To prevent this, you should almost always wrap your I\/O calls in a timeout. The asyncio.wait_for() function is the tool for this.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">asyncio.wait_for() takes an awaitable (like a coroutine) and a timeout value in seconds. It runs the coroutine, but if the timeout is exceeded before the coroutine finishes, it raises an asyncio.TimeoutError. 
You can catch this error to handle the timeout gracefully.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">Python<\/span><\/p>\n<p><span style=\"font-weight: 400;\">import asyncio<\/span><\/p>\n<p>&nbsp;<\/p>\n<p><span style=\"font-weight: 400;\">async def slow_operation():<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0print(&#8220;Starting slow operation&#8230;&#8221;)<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0await asyncio.sleep(5) # Simulating a 5-second task<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0print(&#8220;Slow operation complete&#8221;)<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0return &#8220;Done&#8221;<\/span><\/p>\n<p>&nbsp;<\/p>\n<p><span style=\"font-weight: 400;\">async def main():<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0try:<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0result = await asyncio.wait_for(slow_operation(), timeout=3.0)<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0print(f&#8221;Operation succeeded: {result}&#8221;)<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0except asyncio.TimeoutError:<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0print(&#8220;The operation timed out!&#8221;)<\/span><\/p>\n<p>&nbsp;<\/p>\n<p><span style=\"font-weight: 400;\">asyncio.run(main())<\/span><\/p>\n<p>&nbsp;<\/p>\n<p><span style=\"font-weight: 400;\">In this example, wait_for is given a 3-second timeout for a 5-second operation. After 3 seconds, wait_for will raise TimeoutError, which we catch. 
The slow_operation task is also automatically cancelled.<\/span><\/p>\n<h2><b>Understanding Task Cancellation<\/b><\/h2>\n<p><span style=\"font-weight: 400;\">When asyncio.wait_for() times out, it <\/span><i><span style=\"font-weight: 400;\">cancels<\/span><\/i><span style=\"font-weight: 400;\"> the underlying task. Cancellation is an explicit action in asyncio. You can also do it manually. When you have a Task object (from asyncio.create_task()), you can call its cancel() method. This will cause an asyncio.CancelledError to be injected into the coroutine at its <\/span><i><span style=\"font-weight: 400;\">next await point<\/span><\/i><span style=\"font-weight: 400;\">.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">This is a critical concept. A task does not just stop immediately. It continues running until it next awaits something. At that point, instead of pausing, it will raise CancelledError. Your coroutine can, and should, use a try&#8230;finally block to catch this error and clean up any resources (like closing a file or a network connection).<\/span><\/p>\n<p><span style=\"font-weight: 400;\">Python<\/span><\/p>\n<p><span style=\"font-weight: 400;\">async def cancellable_task():<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0try:<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0print(&#8220;Task started&#8230;&#8221;)<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0await asyncio.sleep(10) # Pauses here<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0print(&#8220;Task finished&#8221;)<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0except asyncio.CancelledError:<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0print(&#8220;Task was cancelled!&#8221;)<\/span><\/p>\n<p><span style=\"font-weight: 
400;\">\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0# Perform cleanup here<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0raise<\/span><\/p>\n<p>&nbsp;<\/p>\n<p><span style=\"font-weight: 400;\">async def main():<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0task = asyncio.create_task(cancellable_task())<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0await asyncio.sleep(1) # Let the task start<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0task.cancel()<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0try:<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0await task # We must await the task to see the error<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0except asyncio.CancelledError:<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0print(&#8220;Main caught the cancellation.&#8221;)<\/span><\/p>\n<p>&nbsp;<\/p>\n<p><span style=\"font-weight: 400;\">asyncio.run(main())<\/span><\/p>\n<p>&nbsp;<\/p>\n<p><span style=\"font-weight: 400;\">The output will show &#8220;Task started&#8230;&#8221; and then &#8220;Task was cancelled!&#8221; This shows the try&#8230;except block inside the coroutine successfully caught the error.<\/span><\/p>\n<h2><b>Fine-Grained Control with asyncio.wait()<\/b><\/h2>\n<p><span style=\"font-weight: 400;\">asyncio.gather() is an all-or-nothing function. asyncio.wait() is its more complex, low-level, and powerful alternative. asyncio.wait() takes a set of tasks and allows you to specify <\/span><i><span style=\"font-weight: 400;\">when<\/span><\/i><span style=\"font-weight: 400;\"> it should return. 
You can control this with the return_when parameter.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">The options for return_when are asyncio.ALL_COMPLETED (the default, which makes it similar to gather), asyncio.FIRST_COMPLETED, and asyncio.FIRST_EXCEPTION. The FIRST_COMPLETED option is particularly useful. It tells asyncio.wait() to return as soon as <\/span><i><span style=\"font-weight: 400;\">any<\/span><\/i><span style=\"font-weight: 400;\"> one of the tasks in the set has finished.<\/span><\/p>\n<h2><b>Using asyncio.wait() for the First Result<\/b><\/h2>\n<p><span style=\"font-weight: 400;\">Imagine you are querying three redundant API servers for the same piece of data. You do not care which one answers; you just want the <\/span><i><span style=\"font-weight: 400;\">fastest<\/span><\/i><span style=\"font-weight: 400;\"> response. This is a perfect use case for asyncio.wait().<\/span><\/p>\n<p><span style=\"font-weight: 400;\">Python<\/span><\/p>\n<p><span style=\"font-weight: 400;\">import asyncio<\/span><\/p>\n<p>&nbsp;<\/p>\n<p><span style=\"font-weight: 400;\">async def query_server(name, delay):<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0await asyncio.sleep(delay)<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0return f&#8221;Response from {name}&#8221;<\/span><\/p>\n<p>&nbsp;<\/p>\n<p><span style=\"font-weight: 400;\">async def main():<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0tasks = {<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0asyncio.create_task(query_server(&#8220;Server A&#8221;, 2)),<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0asyncio.create_task(query_server(&#8220;Server B&#8221;, 1)),<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0asyncio.create_task(query_server(&#8220;Server C&#8221;, 
3)),<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0}<\/span><\/p>\n<p>&nbsp;<\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0done, pending = await asyncio.wait(tasks, return_when=asyncio.FIRST_COMPLETED)<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0# &#8216;done&#8217; is a set of the tasks that finished<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0first_result = done.pop().result()<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0print(f&#8221;Got first result: {first_result}&#8221;)<\/span><\/p>\n<p>&nbsp;<\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0# &#8216;pending&#8217; is a set of tasks still running<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0for task in pending:<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0task.cancel() # Clean up the other tasks<\/span><\/p>\n<p>&nbsp;<\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0await asyncio.gather(*pending, return_exceptions=True) # Wait for cleanup<\/span><\/p>\n<p>&nbsp;<\/p>\n<p><span style=\"font-weight: 400;\">asyncio.run(main())<\/span><\/p>\n<p>&nbsp;<\/p>\n<p><span style=\"font-weight: 400;\">This program will print &#8220;Got first result: Response from Server B&#8221; after just one second. asyncio.wait() returns two sets: done and pending. We get the result from the done set and then immediately cancel all the pending tasks, saving network resources.<\/span><\/p>\n<h2><b>asyncio.TaskGroup (Python 3.11+)<\/b><\/h2>\n<p><span style=\"font-weight: 400;\">The patterns with create_task and gather are so common that Python 3.11 introduced a new, cleaner, and safer way to manage concurrent tasks: asyncio.TaskGroup. 
A TaskGroup is a context manager that waits for all tasks spawned within it to complete. It is like asyncio.gather(), but the syntax is often clearer.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">A TaskGroup also provides stronger guarantees. If any task within the group fails with an exception, all other tasks in the group are automatically cancelled. This concept is called &#8220;structured concurrency&#8221; and it makes programs much more robust and easier to reason about.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">Python<\/span><\/p>\n<p><span style=\"font-weight: 400;\"># This code requires Python 3.11 or newer<\/span><\/p>\n<p>&nbsp;<\/p>\n<p><span style=\"font-weight: 400;\">async def task1():<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0await asyncio.sleep(1)<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0print(&#8220;Task 1 done&#8221;)<\/span><\/p>\n<p>&nbsp;<\/p>\n<p><span style=\"font-weight: 400;\">async def task2():<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0await asyncio.sleep(2)<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0print(&#8220;Task 2 done&#8221;)<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0raise ValueError(&#8220;Task 2 failed!&#8221;)<\/span><\/p>\n<p>&nbsp;<\/p>\n<p><span style=\"font-weight: 400;\">async def main():<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0try:<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0async with asyncio.TaskGroup() as tg:<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0tg.create_task(task1())<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0tg.create_task(task2())<\/span><\/p>\n<p><span style=\"font-weight: 
400;\">\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0# The &#8216;async with&#8217; block will not exit until both tasks are done<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0except* ValueError as e:<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0print(f&#8221;TaskGroup caught exception(s): {e.exceptions}&#8221;)<\/span><\/p>\n<p>&nbsp;<\/p>\n<p><span style=\"font-weight: 400;\"># asyncio.run(main())<\/span><\/p>\n<p>&nbsp;<\/p>\n<p><span style=\"font-weight: 400;\">This example shows how the TaskGroup automatically manages the tasks. In this particular run, task1 completes before task2 raises; if task1 were still running when task2 failed, the TaskGroup would cancel it. The ValueError is wrapped in an ExceptionGroup, which is why it is caught with except* after the block exits.<\/span><\/p>\n<h2><b>Chaining Coroutines<\/b><\/h2>\n<p><span style=\"font-weight: 400;\">In Part 1, the source article showed a simple demo function that awaited another.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">Python<\/span><\/p>\n<p><span style=\"font-weight: 400;\">async def demo1():<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0await asyncio.sleep(1)<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0return &#8220;Result from demo1&#8221;<\/span><\/p>\n<p>&nbsp;<\/p>\n<p><span style=\"font-weight: 400;\">async def demo():<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0# Execution will be paused here and return to demo() when demo1() is ready<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0asy = await demo1()<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0return asy<\/span><\/p>\n<p>&nbsp;<\/p>\n<p><span style=\"font-weight: 400;\">This is called &#8220;chaining&#8221; coroutines, and it is the fundamental building block of an async program. The demo coroutine is <\/span><i><span style=\"font-weight: 400;\">dependent<\/span><\/i><span style=\"font-weight: 400;\"> on the result of demo1. 
When demo awaits demo1, demo itself is paused. The event loop then runs demo1. demo1 awaits asyncio.sleep(1), so <\/span><i><span style=\"font-weight: 400;\">it<\/span><\/i><span style=\"font-weight: 400;\"> is also paused.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">The event loop is now free to run other tasks. After 1 second, the loop wakes up demo1, which returns its string. This unblocks the demo coroutine, which receives the return value into the asy variable and then returns it. This shows how await propagates the &#8220;pausing&#8221; state up the call stack, allowing the event loop to take control.<\/span><\/p>\n<h2><b>Running Blocking Code in an Async World<\/b><\/h2>\n<p><span style=\"font-weight: 400;\">What happens if you <\/span><i><span style=\"font-weight: 400;\">must<\/span><\/i><span style=\"font-weight: 400;\"> use a synchronous, blocking library (like requests)? You cannot call it directly, or it will block the loop. asyncio provides a critical escape hatch for this: asyncio.to_thread(). This function takes a blocking synchronous function (and its arguments) and runs it in a <\/span><i><span style=\"font-weight: 400;\">separate thread pool<\/span><\/i><span style=\"font-weight: 400;\">.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">This frees the main event loop to continue running other coroutines. 
From the perspective of your coroutine, you just await asyncio.to_thread() as if it were a true async function.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">Python<\/span><\/p>\n<p><span style=\"font-weight: 400;\">import asyncio<\/span><\/p>\n<p><span style=\"font-weight: 400;\">import requests # The blocking library<\/span><\/p>\n<p>&nbsp;<\/p>\n<p><span style=\"font-weight: 400;\">def blocking_http_call():<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0print(&#8220;Starting blocking call in thread&#8230;&#8221;)<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0response = requests.get(&#8220;https:\/\/example.com&#8221;) # Blocks this thread<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0print(&#8220;Blocking call finished&#8221;)<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0return response.status_code<\/span><\/p>\n<p>&nbsp;<\/p>\n<p><span style=\"font-weight: 400;\">async def main():<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0print(&#8220;Starting main coroutine&#8221;)<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0status = await asyncio.to_thread(blocking_http_call)<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0print(f&#8221;Got status: {status}&#8221;)<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0print(&#8220;Main coroutine finished&#8221;)<\/span><\/p>\n<p>&nbsp;<\/p>\n<p><span style=\"font-weight: 400;\">asyncio.run(main())<\/span><\/p>\n<p>&nbsp;<\/p>\n<p><span style=\"font-weight: 400;\">This pattern is the correct and safe way to mix synchronous, blocking I\/O code with an asyncio application. 
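<\/span><\/p>\n<p><span style=\"font-weight: 400;\">Because the requests example needs real network access, here is a self-contained sketch of the same pattern that swaps the HTTP call for time.sleep(); the elapsed time shows the event loop stays responsive while the blocking work runs in its thread (asyncio.to_thread() requires Python 3.9 or newer):<\/span><\/p>

```python
import asyncio
import time

def blocking_work() -> str:
    # Stands in for a blocking call such as requests.get():
    # it ties up its thread for a full second.
    time.sleep(1.0)
    return "blocking done"

async def ticker() -> str:
    # This coroutine can only make progress if the event loop stays free.
    for _ in range(5):
        await asyncio.sleep(0.15)
    return "ticker done"

async def main():
    start = time.monotonic()
    # to_thread() runs blocking_work in a worker thread, so the
    # event loop keeps driving ticker() at the same time.
    results = await asyncio.gather(asyncio.to_thread(blocking_work), ticker())
    return results, time.monotonic() - start

results, elapsed = asyncio.run(main())
print(results, f"{elapsed:.2f}s")
```

<p><span style=\"font-weight: 400;\">The two jobs overlap, so the total is about 1 second rather than the 1.75 seconds a sequential run would take; calling blocking_work() directly inside the coroutine would freeze the loop and force exactly that sequential behavior.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">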
It bridges the gap between the old synchronous world and the new asynchronous one.<\/span><\/p>\n<h2><b>Shielding a Task from Cancellation<\/b><\/h2>\n<p><span style=\"font-weight: 400;\">Sometimes, you may have a critical operation that you do not want to be cancelled, even if the parent task is cancelled. For example, you might be writing a final &#8220;I&#8217;m done&#8221; status to a database. If the parent task is cancelled, you still want this write operation to complete.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">For this, asyncio provides asyncio.shield(). This function &#8220;shields&#8221; an awaitable from cancellation. If the main coroutine containing the shield is cancelled, the shielded task <\/span><i><span style=\"font-weight: 400;\">will not<\/span><\/i><span style=\"font-weight: 400;\"> be cancelled. It will continue running in the background. The main coroutine will still raise CancelledError, but the protected operation will run to completion.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">Python<\/span><\/p>\n<p><span style=\"font-weight: 400;\">async def critical_operation():<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0print(&#8220;Critical write started&#8230;&#8221;)<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0await asyncio.sleep(3) # Cannot be cancelled<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0print(&#8220;Critical write complete!&#8221;)<\/span><\/p>\n<p>&nbsp;<\/p>\n<p><span style=\"font-weight: 400;\">async def main():<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0try:<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0await asyncio.wait_for(<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0asyncio.shield(critical_operation()),<\/span><\/p>\n<p><span style=\"font-weight: 
400;\">\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0timeout=1<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0)<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0except asyncio.TimeoutError:<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0print(&#8220;Main task timed out (and was cancelled).&#8221;)<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0# The critical operation is still running!<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0await asyncio.sleep(3) # Wait for it to finish<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0print(&#8220;Main exiting&#8221;)<\/span><\/p>\n<p>&nbsp;<\/p>\n<p><span style=\"font-weight: 400;\"># asyncio.run(main())<\/span><\/p>\n<p>&nbsp;<\/p>\n<p><span style=\"font-weight: 400;\">The output of this program will show the timeout error, but 3 seconds later, it will also print &#8220;Critical write complete!&#8221; This demonstrates the shield was successful.<\/span><\/p>\n<h2><b>The Problem of Shared State<\/b><\/h2>\n<p><span style=\"font-weight: 400;\">When you run multiple tasks concurrently, you will eventually face a new problem: what if two tasks need to access or modify the same piece of data at the same time? This is known as a &#8220;race condition.&#8221; For example, if you have a variable counter = 0 and two tasks try to increment it ( counter += 1 ) at the same time, the final result might be 1 instead of 2. This is because both tasks might read the value 0 before either one has a chance to write the new value 1.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">In asyncio, this is slightly less of a problem than in threading because a coroutine can only be &#8220;interrupted&#8221; at an await point. 
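<\/span><\/p>\n<p><span style=\"font-weight: 400;\">The lost-update race described above can be reproduced deterministically with nothing but asyncio: in this sketch, every task reads the counter, hits an await, and only then writes back its now-stale value:<\/span><\/p>

```python
import asyncio

counter = 0

async def unsafe_increment() -> None:
    global counter
    temp = counter          # read the current value
    await asyncio.sleep(0)  # await point: yields control to the event loop
    counter = temp + 1      # write back a stale value

async def main() -> None:
    # All 100 tasks read counter == 0 before any of them writes,
    # so every single write stores 1.
    await asyncio.gather(*(unsafe_increment() for _ in range(100)))

asyncio.run(main())
print(counter)  # 1, not 100
```

<p><span style=\"font-weight: 400;\">Ninety-nine increments are silently lost. The asyncio.Lock introduced below is the standard fix for exactly this pattern.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">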
However, if you have an await <\/span><i><span style=\"font-weight: 400;\">between<\/span><\/i><span style=\"font-weight: 400;\"> reading a value and writing it back, you have a race condition. To solve this, asyncio provides synchronization primitives that are very similar to those in the threading module.<\/span><\/p>\n<h2><b>asyncio.Lock: Protecting Critical Sections<\/b><\/h2>\n<p><span style=\"font-weight: 400;\">The most fundamental synchronization tool is the asyncio.Lock. A Lock is an object that can be &#8220;acquired&#8221; by only one task at a time. It is like a bathroom key in an office. Only one person can have the key (acquire the lock) at a time. Anyone else who wants to use the bathroom (access the shared resource) must wait until the first person is done and returns the key (releases the lock).<\/span><\/p>\n<p><span style=\"font-weight: 400;\">In asyncio, a coroutine awaits the lock.acquire() method. If the lock is available, the coroutine acquires it and continues. If it is held by another task, the coroutine <\/span><i><span style=\"font-weight: 400;\">pauses<\/span><\/i><span style=\"font-weight: 400;\"> at the await until the lock is released. 
The best way to use a lock is with the async with statement.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">Python<\/span><\/p>\n<p><span style=\"font-weight: 400;\">import asyncio<\/span><\/p>\n<p>&nbsp;<\/p>\n<p><span style=\"font-weight: 400;\">counter = 0<\/span><\/p>\n<p><span style=\"font-weight: 400;\">lock = asyncio.Lock()<\/span><\/p>\n<p>&nbsp;<\/p>\n<p><span style=\"font-weight: 400;\">async def increment_counter():<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0global counter<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0async with lock:<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0# This is the &#8220;critical section&#8221;<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0# Only one task can be in here at a time<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0temp = counter<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0await asyncio.sleep(0.01) # Simulate I\/O<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0counter = temp + 1<\/span><\/p>\n<p>&nbsp;<\/p>\n<p><span style=\"font-weight: 400;\">async def main():<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0tasks = [increment_counter() for _ in range(100)]<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0await asyncio.gather(*tasks)<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0print(f&#8221;Final counter: {counter}&#8221;) # Will be 100<\/span><\/p>\n<p>&nbsp;<\/p>\n<p><span style=\"font-weight: 400;\">asyncio.run(main())<\/span><\/p>\n<p>&nbsp;<\/p>\n<p><span style=\"font-weight: 400;\">Without the lock, the await between reading temp and writing counter would create a race condition, and the final counter 
would be far less than 100. In this example it would most likely end at 1, because every task reads the old counter value before its await, and only writes its result back afterward. With the lock, the final counter is correctly 100.<\/span><\/p>\n<h2><b>asyncio.Queue: Producer-Consumer Problems<\/b><\/h2>\n<p><span style=\"font-weight: 400;\">A Queue is a data structure designed for safely passing data between concurrent tasks. It is the core of the &#8220;producer-consumer&#8221; pattern. You have one or more &#8220;producer&#8221; tasks that generate data and put it into the queue. You also have one or more &#8220;consumer&#8221; tasks that get data from the queue and process it.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">An asyncio.Queue is &#8220;async-aware.&#8221; If a producer tries to put an item into a full queue, it will pause until a slot is free. If a consumer tries to get an item from an empty queue, it will pause until an item becomes available. This waiting is non-blocking and allows the event loop to run other tasks.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">Python<\/span><\/p>\n<p><span style=\"font-weight: 400;\">import asyncio<\/span><\/p>\n<p><span style=\"font-weight: 400;\">import random<\/span><\/p>\n<p>&nbsp;<\/p>\n<p><span style=\"font-weight: 400;\">async def producer(queue):<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0for i in range(5):<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0data = f&#8221;data_item_{i}&#8221;<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0await asyncio.sleep(random.random()) # Simulate work<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0await queue.put(data)<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0print(f&#8221;Produced: {data}&#8221;)<\/span><\/p>\n<p>&nbsp;<\/p>\n<p><span style=\"font-weight: 400;\">async def consumer(queue):<\/span><\/p>\n<p><span style=\"font-weight: 
400;\">\u00a0\u00a0\u00a0\u00a0while True:<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0data = await queue.get() # Pauses if queue is empty<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0print(f&#8221;Consumed: {data}&#8221;)<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0# Process the data&#8230;<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0queue.task_done() # Signal that this item is done<\/span><\/p>\n<p>&nbsp;<\/p>\n<p><span style=\"font-weight: 400;\">async def main():<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0queue = asyncio.Queue(maxsize=2)<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0pro = asyncio.create_task(producer(queue))<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0con = asyncio.create_task(consumer(queue))<\/span><\/p>\n<p>&nbsp;<\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0await pro # Wait for producer to finish<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0await queue.join() # Wait for all items to be processed<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0con.cancel() # Stop the consumer<\/span><\/p>\n<p>&nbsp;<\/p>\n<p><span style=\"font-weight: 400;\">asyncio.run(main())<\/span><\/p>\n<p>&nbsp;<\/p>\n<p><span style=\"font-weight: 400;\">The queue.join() method is important. 
It blocks until task_done() has been called for every item that was put into the queue, ensuring all work is finished.<\/span><\/p>\n<h2><b>asyncio.Semaphore: Limiting Concurrency<\/b><\/h2>\n<p><span style=\"font-weight: 400;\">Sometimes you do not want to <\/span><i><span style=\"font-weight: 400;\">stop<\/span><\/i><span style=\"font-weight: 400;\"> concurrent access, but you want to <\/span><i><span style=\"font-weight: 400;\">limit<\/span><\/i><span style=\"font-weight: 400;\"> it. A common example is interacting with a third-party API that has a rate limit. The API might only allow you to make 10 concurrent connections. If your program spawns 1000 tasks to hit this API, you will be blocked or banned.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">An asyncio.Semaphore is the tool for this. A semaphore is initialized with a number, for example, 10. It is like a club with room for exactly 10 people. It will allow 10 tasks to &#8220;acquire&#8221; it and enter the club (run the code). The 11th task that tries to acquire it will have to wait outside (pause) until one of the first 10 tasks &#8220;releases&#8221; the semaphore and leaves.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">Python<\/span><\/p>\n<p><span style=\"font-weight: 400;\">import asyncio<\/span><\/p>\n<p>&nbsp;<\/p>\n<p><span style=\"font-weight: 400;\">async def fetch_api(session, url, semaphore):<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0async with semaphore:<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0# Only 5 tasks can be in this block at a time<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0print(f&#8221;Fetching {url}&#8230;&#8221;)<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0# response = await session.get(url)<\/span><\/p>\n<p><span style=\"font-weight: 
400;\">\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0await asyncio.sleep(1) # Simulate request<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0print(f&#8221;Done with {url}&#8221;)<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0return url<\/span><\/p>\n<p>&nbsp;<\/p>\n<p><span style=\"font-weight: 400;\">async def main():<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0semaphore = asyncio.Semaphore(5) # Limit to 5 concurrent tasks<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0urls = [f&#8221;url_{i}&#8221; for i in range(20)]<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0# We create 20 tasks, but the semaphore<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0# will ensure only 5 run at any given time.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0tasks = [fetch_api(None, url, semaphore) for url in urls]<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0await asyncio.gather(*tasks)<\/span><\/p>\n<p>&nbsp;<\/p>\n<p><span style=\"font-weight: 400;\">asyncio.run(main())<\/span><\/p>\n<p>&nbsp;<\/p>\n<p><span style=\"font-weight: 400;\">This program will run in batches of 5, taking about 4 seconds total (20 tasks \/ 5 concurrent = 4 batches of 1 second each).<\/span><\/p>\n<h2><b>asyncio.Event: Coordinating Tasks<\/b><\/h2>\n<p><span style=\"font-weight: 400;\">An asyncio.Event is a simple but powerful tool for communication <\/span><i><span style=\"font-weight: 400;\">between<\/span><\/i><span style=\"font-weight: 400;\"> coroutines. It is a boolean flag that tasks can wait() for. An Event starts in the &#8220;cleared&#8221; state (False). Any task that awaits event.wait() will pause. 
Another task can then call event.set(), which sets the flag to &#8220;set&#8221; (True). When this happens, all tasks that were waiting for the event are immediately woken up and allowed to proceed.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">This is perfect for &#8220;startup&#8221; or &#8220;shutdown&#8221; logic. You might have a main task that needs to set up a database connection, and several worker tasks that must not start until that connection is ready.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">Python<\/span><\/p>\n<p><span style=\"font-weight: 400;\">import asyncio<\/span><\/p>\n<p>&nbsp;<\/p>\n<p><span style=\"font-weight: 400;\">db_ready = asyncio.Event()<\/span><\/p>\n<p>&nbsp;<\/p>\n<p><span style=\"font-weight: 400;\">async def setup_database():<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0print(&#8220;Connecting to database&#8230;&#8221;)<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0await asyncio.sleep(2) # Simulate connection setup<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0print(&#8220;Database is ready!&#8221;)<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0db_ready.set() # Set the event, waking up workers<\/span><\/p>\n<p>&nbsp;<\/p>\n<p><span style=\"font-weight: 400;\">async def worker(name):<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0print(f&#8221;Worker {name} is waiting for database&#8230;&#8221;)<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0await db_ready.wait() # Pauses here until event is set<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0print(f&#8221;Worker {name} is starting work.&#8221;)<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0# &#8230; do work &#8230;<\/span><\/p>\n<p>&nbsp;<\/p>\n<p><span style=\"font-weight: 400;\">async def main():<\/span><\/p>\n<p><span style=\"font-weight: 
400;\">\u00a0\u00a0\u00a0\u00a0await asyncio.gather(setup_database(), worker(&#8220;A&#8221;), worker(&#8220;B&#8221;))<\/span><\/p>\n<p>&nbsp;<\/p>\n<p><span style=\"font-weight: 400;\">asyncio.run(main())<\/span><\/p>\n<p>&nbsp;<\/p>\n<p><span style=\"font-weight: 400;\">The output shows both workers waiting, and as soon as the database is ready, both start their work simultaneously.<\/span><\/p>\n<h2><b>asyncio.Barrier: Waiting for Each Other<\/b><\/h2>\n<p><span style=\"font-weight: 400;\">A Barrier is a more complex primitive. It is like a &#8220;checkpoint.&#8221; You can create a barrier for a specific number of tasks (e.g., 3). All 3 tasks will run until they await barrier.wait(). The first two tasks to arrive will pause. When the third and final task hits the barrier, all three tasks are unblocked at the same time and allowed to proceed.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">This is useful in multi-stage computations where you need to ensure all tasks have completed &#8220;Stage 1&#8221; before <\/span><i><span style=\"font-weight: 400;\">any<\/span><\/i><span style=\"font-weight: 400;\"> of them are allowed to begin &#8220;Stage 2.&#8221;<\/span><\/p>\n<p><span style=\"font-weight: 400;\">Python<\/span><\/p>\n<p><span style=\"font-weight: 400;\">import asyncio<\/span><\/p>\n<p>&nbsp;<\/p>\n<p><span style=\"font-weight: 400;\">async def worker(barrier, delay):<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0print(f&#8221;Task started, working for {delay}s (Stage 1)&#8221;)<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0await asyncio.sleep(delay)<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0print(&#8220;Task at the barrier, waiting&#8230;&#8221;)<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0await barrier.wait() # Pauses here<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0print(&#8220;Barrier passed! 
(Stage 2)&#8221;)<\/span><\/p>\n<p>&nbsp;<\/p>\n<p><span style=\"font-weight: 400;\">async def main():<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0# Create a barrier for 3 tasks<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0barrier = asyncio.Barrier(3)<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0tasks = [<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0asyncio.create_task(worker(barrier, 1)),<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0asyncio.create_task(worker(barrier, 2)),<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0asyncio.create_task(worker(barrier, 3)),<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0]<\/span><\/p>\n<p>&nbsp;<\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0await asyncio.gather(*tasks)<\/span><\/p>\n<p>&nbsp;<\/p>\n<p><span style=\"font-weight: 400;\">asyncio.run(main())<\/span><\/p>\n<p>&nbsp;<\/p>\n<p><span style=\"font-weight: 400;\">The output will show all three tasks arriving at the barrier at different times. After 3 seconds, when the last task arrives, all three will print &#8220;Barrier passed!&#8221; at the same time.<\/span><\/p>\n<h2><b>asyncio.Condition: Advanced Lock and Event<\/b><\/h2>\n<p><span style=\"font-weight: 400;\">A Condition combines a Lock with an Event. It is a more advanced primitive for complex state management. Tasks can acquire the condition (which is a lock), and then await condition.wait(). 
This releases the lock and pauses the task until another task acquires the lock and calls condition.notify().<\/span><\/p>\n<p><span style=\"font-weight: 400;\">This is useful for complex scenarios where a consumer needs to wait for a <\/span><i><span style=\"font-weight: 400;\">specific condition<\/span><\/i><span style=\"font-weight: 400;\"> to be true. A producer task can acquire the lock, modify the shared state, and then call condition.notify() to wake up one or more waiting consumers, who will then re-check the state. This is more flexible than a Queue as it allows tasks to wait on arbitrary logical conditions, not just &#8220;is the queue non-empty?&#8221;<\/span><\/p>\n<h2><b>The Challenges of Asynchronous Code<\/b><\/h2>\n<p><span style=\"font-weight: 400;\">While asyncio provides incredible performance benefits, it is not free. Asynchronous code introduces new and unique challenges in debugging and testing. Because the execution of a function can be paused and interleaved with other functions, the &#8220;stack trace&#8221; of an error is often not a simple, linear path. Bugs related to timing (race conditions) or resource management (a task that is never awaited) can be subtle and difficult to find.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">Traditional debugging techniques, like setting a simple breakpoint, may not work as expected because the program&#8217;s state is constantly shifting between different tasks. This final part of the series will cover the tools and techniques for debugging, testing, and safely integrating asyncio code into your projects.<\/span><\/p>\n<h2><b>Asyncio Debug Mode<\/b><\/h2>\n<p><span style=\"font-weight: 400;\">asyncio comes with a built-in &#8220;debug mode&#8221; that can be a huge help. It provides more verbose logging and can detect common problems. 
The two most important things it does are:<\/span><\/p>\n<ol>\n<li style=\"font-weight: 400;\" aria-level=\"1\"><span style=\"font-weight: 400;\">Log &#8220;slow&#8221; coroutines: It will log a warning if a coroutine takes too long to execute <\/span><i><span style=\"font-weight: 400;\">between<\/span><\/i><span style=\"font-weight: 400;\"> await points, which is a sign that you have a blocking, synchronous call in your code.<\/span><\/li>\n<li style=\"font-weight: 400;\" aria-level=\"1\"><span style=\"font-weight: 400;\">Detect &#8220;never-awaited&#8221; coroutines: If you call an async def function but forget to await it or create_task(), the coroutine will never run. Debug mode will detect this and log a warning.<\/span><\/li>\n<\/ol>\n<p><span style=\"font-weight: 400;\">You can enable debug mode in two ways. The simplest is to set an environment variable before running your script: PYTHONASYNCIODEBUG=1. Alternatively, you can pass debug=True to the asyncio.run() function: asyncio.run(main(), debug=True). 
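As a minimal, illustrative sketch of the never-awaited detection (the function names here are invented for the demo, not from the article), the following script calls a coroutine function without awaiting it, runs under debug mode, and captures the resulting warning programmatically:

```python
import asyncio
import warnings

async def forgotten():
    # Never executes: the caller below creates the coroutine
    # object but never awaits it or wraps it in a Task.
    print("unreachable")

async def main():
    forgotten()  # BUG: coroutine object created, then discarded

# Capture the RuntimeWarning that Python emits when the discarded
# coroutine object is garbage-collected without being awaited.
with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    asyncio.run(main(), debug=True)

print(any("never awaited" in str(w.message) for w in caught))  # True
```

Without the capturing context, the same run simply prints a RuntimeWarning such as "coroutine 'forgotten' was never awaited" to stderr; with debug=True, coroutine origin tracking is enabled, so the warning also includes a traceback showing where the coroutine was created.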
Running in debug mode during development is a highly recommended best practice.<\/span><\/p>\n<h2><b>Common Pitfall: Calling a Coroutine Without Awaiting<\/b><\/h2>\n<p><span style=\"font-weight: 400;\">The single most common mistake for beginners is calling an async def function and forgetting to await it.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">Python<\/span><\/p>\n<p><span style=\"font-weight: 400;\">import asyncio<\/span><\/p>\n<p>&nbsp;<\/p>\n<p><span style=\"font-weight: 400;\">async def my_coro():<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0print(&#8220;This will never print!&#8221;)<\/span><\/p>\n<p>&nbsp;<\/p>\n<p><span style=\"font-weight: 400;\">async def main():<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0# WRONG<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0my_coro()<\/span><\/p>\n<p>&nbsp;<\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0# RIGHT<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0await my_coro()<\/span><\/p>\n<p>&nbsp;<\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0# ALSO RIGHT<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0asyncio.create_task(my_coro())<\/span><\/p>\n<p>&nbsp;<\/p>\n<p><span style=\"font-weight: 400;\">In the &#8220;WRONG&#8221; example, my_coro() is called. This does not run the code inside. It just creates a coroutine object, which is then immediately discarded. The program will not print anything and will not raise an error; at most, Python emits a &#8220;never awaited&#8221; RuntimeWarning, with extra detail when debug mode is on. This is a silent failure that can be very confusing. 
You must <\/span><i><span style=\"font-weight: 400;\">always<\/span><\/i><span style=\"font-weight: 400;\"> await a coroutine or schedule it as a Task.<\/span><\/p>\n<h2><b>Common Pitfall: Calling Synchronous Blocking Code<\/b><\/h2>\n<p><span style=\"font-weight: 400;\">The second most common mistake is calling a blocking I\/O function in a coroutine.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">Python<\/span><\/p>\n<p><span style=\"font-weight: 400;\">import asyncio<\/span><\/p>\n<p><span style=\"font-weight: 400;\">import time<\/span><\/p>\n<p>&nbsp;<\/p>\n<p><span style=\"font-weight: 400;\">async def my_bad_coro():<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0print(&#8220;Blocking the event loop&#8230;&#8221;)<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0# WRONG: This freezes the entire program<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0time.sleep(5)<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0print(&#8220;&#8230;unblocked&#8221;)<\/span><\/p>\n<p>&nbsp;<\/p>\n<p><span style=\"font-weight: 400;\">async def other_coro():<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0print(&#8220;I want to run!&#8221;)<\/span><\/p>\n<p>&nbsp;<\/p>\n<p><span style=\"font-weight: 400;\">async def main():<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0await asyncio.gather(<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0my_bad_coro(),<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0other_coro()<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0)<\/span><\/p>\n<p>&nbsp;<\/p>\n<p><span style=\"font-weight: 400;\">asyncio.run(main())<\/span><\/p>\n<p>&nbsp;<\/p>\n<p><span style=\"font-weight: 400;\">In this example, my_bad_coro will call time.sleep(5). This is a synchronous, blocking call. It freezes the one and only thread. 
The event loop is stuck and cannot switch to other_coro. The output will be &#8220;Blocking&#8230;&#8221;, then a 5-second freeze, then &#8220;&#8230;unblocked&#8221;, and only <\/span><i><span style=\"font-weight: 400;\">then<\/span><\/i><span style=\"font-weight: 400;\"> &#8220;I want to run!&#8221;. You must use await asyncio.sleep(5) or await asyncio.to_thread(time.sleep, 5).<\/span><\/p>\n<h2><b>Testing Asynchronous Code with pytest-asyncio<\/b><\/h2>\n<p><span style=\"font-weight: 400;\">Testing async def functions is not as simple as testing normal functions, because you need a running event loop to await them. The standard Python unittest module has some support, but the community standard is a plugin called pytest-asyncio.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">After installing pytest and pytest-asyncio, you can write your tests as async def functions and mark them with @pytest.mark.asyncio. The plugin will automatically manage the event loop for you.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">Python<\/span><\/p>\n<p><span style=\"font-weight: 400;\"># In a file named test_my_app.py<\/span><\/p>\n<p><span style=\"font-weight: 400;\">import asyncio<\/span><\/p>\n<p><span style=\"font-weight: 400;\">import pytest<\/span><\/p>\n<p>&nbsp;<\/p>\n<p><span style=\"font-weight: 400;\"># The function we want to test<\/span><\/p>\n<p><span style=\"font-weight: 400;\">async def add_async(a, b):<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0await asyncio.sleep(0) # Simulate async work<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0return a + b<\/span><\/p>\n<p>&nbsp;<\/p>\n<p><span style=\"font-weight: 400;\">@pytest.mark.asyncio<\/span><\/p>\n<p><span style=\"font-weight: 400;\">async def test_add_async():<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0result = await add_async(2, 3)<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0assert result == 5<\/span><\/p>\n<p>&nbsp;<\/p>\n<p><span 
style=\"font-weight: 400;\">You can then run this test from your terminal with the simple pytest command. This makes testing async code just as easy as testing synchronous code.<\/span><\/p>\n<h2><b>Interfacing with Synchronous Code<\/b><\/h2>\n<p><span style=\"font-weight: 400;\">We have already seen asyncio.to_thread(), which allows an async function to safely call a sync function. But what about the other way around? What if you have a large, existing synchronous application (like a Flask web app) and you want to call an async function from it?<\/span><\/p>\n<p><span style=\"font-weight: 400;\">You cannot just await from synchronous code. The solution is to get access to an event loop. If you are in the main thread and no loop is running, you can simply use asyncio.run(). This will create a new loop, run your function, and shut the loop down.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">Python<\/span><\/p>\n<p><span style=\"font-weight: 400;\">import asyncio<\/span><\/p>\n<p>&nbsp;<\/p>\n<p><span style=\"font-weight: 400;\">async def my_async_utility():<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0await asyncio.sleep(1)<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0return &#8220;Async result&#8221;<\/span><\/p>\n<p>&nbsp;<\/p>\n<p><span style=\"font-weight: 400;\">def my_sync_function():<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0print(&#8220;In sync code, calling async&#8230;&#8221;)<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0# Create a new loop just for this one call<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0result = asyncio.run(my_async_utility())<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0print(f&#8221;Got result: {result}&#8221;)<\/span><\/p>\n<p>&nbsp;<\/p>\n<p><span style=\"font-weight: 400;\">my_sync_function()<\/span><\/p>\n<p>&nbsp;<\/p>\n<p><span 
style=\"font-weight: 400;\">If a loop is <\/span><i><span style=\"font-weight: 400;\">already running<\/span><\/i><span style=\"font-weight: 400;\"> in another thread, asyncio.run() cannot be used. In that advanced case, you must use asyncio.run_coroutine_threadsafe() to safely submit the coroutine to the other thread&#8217;s loop.<\/span><\/p>\n<h2><b>The Broader Async Ecosystem: Web Frameworks<\/b><\/h2>\n<p><span style=\"font-weight: 400;\">The biggest beneficiary of asyncio has been the world of Python web frameworks. asyncio is perfectly suited for web servers, which must handle thousands of concurrent network connections (HTTP requests) that are all I\/O-bound.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">This has led to a new generation of high-performance web frameworks. The most popular is FastAPI, which is built from the ground up on asyncio. It allows you to define your API endpoints as async def functions, giving you massive performance with very little code. Other popular frameworks in this space include Starlette (which FastAPI is based on), Sanic, and Quart (an async version of Flask).<\/span><\/p>\n<h2><b>Conclusion:<\/b><\/h2>\n<p><span style=\"font-weight: 400;\">asyncio is more than just a library; it is a new paradigm for writing concurrent programs in Python. It provides a powerful and efficient solution to the problem of I\/O-bound workloads, which are dominant in modern web-connected applications. By using cooperative multitasking with coroutines and an event loop, it avoids the overhead and complexity of traditional threading.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">The journey begins with async and await, but it expands into a rich ecosystem of asynchronous libraries and advanced primitives for managing tasks, timeouts, and shared state. 
While the learning curve can be steep, mastering asyncio unlocks the ability to write truly high-performance, scalable, and modern Python applications.<\/span><\/p>\n","protected":false},"excerpt":{"rendered":"<p>Concurrency is a fundamental concept in computing. It is the process where a system can manage multiple tasks in an overlapping manner. This does not mean the tasks are all running at the exact same instant. Instead, it means a task can be started, it can run for a while, and then it can be [&hellip;]<\/p>\n","protected":false},"author":1,"featured_media":0,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[2],"tags":[],"class_list":["post-3397","post","type-post","status-publish","format-standard","hentry","category-posts"],"_links":{"self":[{"href":"https:\/\/www.certkiller.com\/blog\/wp-json\/wp\/v2\/posts\/3397"}],"collection":[{"href":"https:\/\/www.certkiller.com\/blog\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.certkiller.com\/blog\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.certkiller.com\/blog\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/www.certkiller.com\/blog\/wp-json\/wp\/v2\/comments?post=3397"}],"version-history":[{"count":1,"href":"https:\/\/www.certkiller.com\/blog\/wp-json\/wp\/v2\/posts\/3397\/revisions"}],"predecessor-version":[{"id":3398,"href":"https:\/\/www.certkiller.com\/blog\/wp-json\/wp\/v2\/posts\/3397\/revisions\/3398"}],"wp:attachment":[{"href":"https:\/\/www.certkiller.com\/blog\/wp-json\/wp\/v2\/media?parent=3397"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.certkiller.com\/blog\/wp-json\/wp\/v2\/categories?post=3397"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.certkiller.com\/blog\/wp-json\/wp\/v2\/tags?post=3397"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}