If you spend a lot of time waiting around for your code to run,
and you've tried all the easier things, you can sometimes get
a big speedup by parallelizing it: breaking it into
chunks and running each chunk separately. Laptops
typically have at least two cores, and many have four.
Desktop boxes can have as many as 32. But by default a Python
process will only run on one core at a time.
The multiprocessing package lets us use as
many cores as we want.
Here's a minimal example that you can copy and paste to get started with.
(Here's the snippet on GitLab.)
from multiprocessing import Pool
import os

import numpy as np


def f(n):
    return np.var(np.random.sample((n, n)))


result_objs = []
n = 1000
with Pool(processes=os.cpu_count() - 1) as pool:
    for _ in range(n):
        result = pool.apply_async(f, (n,))
        result_objs.append(result)

    results = [result.get() for result in result_objs]
    print(len(results), np.mean(results), np.var(results))
f(n) is a function with a burdensome calculation.
Pool is a collection of worker processes.
When initializing a Pool, the
processes keyword argument chooses how many workers to create. It doesn't make sense to create more workers than you have processors, and
os.cpu_count() tells you exactly how many that is. If you leave one processor free you can still run Firefox and listen to Spotify while your code runs.
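As a quick check, you can print what your machine reports (the numbers will differ from machine to machine):

```python
import os

# Number of logical CPUs the operating system reports.
print(os.cpu_count())

# A common choice: use all but one, keeping the machine responsive.
# max() guards against the edge case of a single-core machine.
n_workers = max(1, os.cpu_count() - 1)
print(n_workers)
```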
Pool.apply_async() assigns a job to your worker pool, but doesn't wait around for the result. Instead it returns a placeholder. We can use the placeholder to get the actual result by calling its get() method.
Pool.apply_async() takes a function as its first argument and a tuple of arguments for that function as its second. Because we want each worker to run
f(n), we pass f and (n,).
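The same pattern works for functions with more than one argument: each position in the tuple fills one parameter. Here's a minimal sketch, with an illustrative function standing in for a real calculation:

```python
from multiprocessing import Pool


def weighted_sum(x, weight):
    # Illustrative stand-in for a heavier calculation.
    return x * weight


if __name__ == "__main__":
    with Pool(processes=2) as pool:
        # The tuple supplies both arguments: weighted_sum(10, 0.5)
        placeholder = pool.apply_async(weighted_sum, (10, 0.5))
        print(placeholder.get())  # 5.0
```

The if __name__ == "__main__" guard matters on platforms that start workers by re-importing the script (macOS and Windows); without it, each worker would try to create its own pool.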
If you were to immediately call
get() on the result placeholder from
apply_async(), it would hold up the for loop while it waited for the result. In parallelization terminology, it blocks, which keeps the other workers from getting new jobs. You would still have several workers, but they would each take turns doing one job while the others stood around and waited. Better to collect all the result placeholders and gather up the results after the workers have done their jobs.
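To see the difference, here's a sketch that times both patterns, using a sleep as a stand-in for real work (the function and timings here are illustrative):

```python
import time
from multiprocessing import Pool


def slow_square(x):
    time.sleep(0.2)  # stand-in for a burdensome calculation
    return x * x


if __name__ == "__main__":
    with Pool(processes=4) as pool:
        # Blocking: calling get() right away serializes the jobs.
        start = time.time()
        serial = [pool.apply_async(slow_square, (i,)).get() for i in range(4)]
        print(f"get() in the loop: {time.time() - start:.1f}s")  # roughly 0.8s

        # Non-blocking: submit everything first, then gather the results.
        start = time.time()
        placeholders = [pool.apply_async(slow_square, (i,)) for i in range(4)]
        parallel = [p.get() for p in placeholders]
        print(f"get() afterward:   {time.time() - start:.1f}s")  # roughly 0.2s
```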
The multiprocessing package is incredibly powerful,
and this is only the tiniest part of its capabilities. But hopefully
it's a part that you'll find particularly useful.