multiprocessing is a package that supports spawning processes using an API similar to that of the threading module. The package supports both local and remote concurrency. Using this module, the programmer can make use of multiple processors on a given machine. It runs on both Windows and UNIX operating systems.
Synchronization primitives equivalent to those in the threading module are also present in this package.
from multiprocessing import Process, Lock

def my_function(x, y):
    x.acquire()
    print('hello world', y)
    x.release()

if __name__ == '__main__':
    lock = Lock()
    for num in range(10):
        Process(target=my_function, args=(lock, num)).start()
Here a single Lock instance is shared among the processes to ensure that only one process writes to standard output at a time.
For pooling, we use the Pool class. It creates a pool of worker processes which carry out the tasks submitted to it.
class multiprocessing.Pool([processes[, initializer[, initargs[, maxtasksperchild]]]])
A Pool object controls a pool of worker processes to which jobs can be submitted. It supports asynchronous results with timeouts and callbacks, and it has a parallel map implementation.
If processes is None, the number returned by cpu_count() is used. If initializer is not None, each worker process calls initializer(*initargs) when it starts.
apply(func[, args[, kwds]])
This is the equivalent of the apply() built-in function. It blocks until the result is ready; if you want the call to execute in parallel with other work, the apply_async() method is better suited.
apply_async(func[, args[, kwds[, callback]]])
Returns an AsyncResult object. If callback is supplied, it is called with the result as soon as it becomes ready.
map(func, iterable [, chunksize])
This is a parallel equivalent of the map() built-in function; it supports only one iterable argument. It blocks until the result is ready.
This method breaks the iterable into a number of small chunks which are submitted to the process pool as separate tasks.
map_async(func, iterable[, chunksize[, callback]])
Returns an AsyncResult object.
imap(func, iterable[, chunksize])
It is a lazier version of map(), equivalent to itertools.imap() in Python 2.
The chunksize argument is the same as the one used in map().
imap_unordered(func, iterable[, chunksize])
This is the same as imap() except that the ordering of the results from the returned iterator is arbitrary.
close()
Prevents any more tasks from being submitted to the pool. Once all outstanding tasks have been completed, the worker processes exit.
terminate()
Stops the worker processes immediately without completing outstanding work.
join()
Waits for the worker processes to exit. close() or terminate() must be called before using join().
Returned by Pool.apply_async() and Pool.map_async().
get([timeout])
Returns the result when it arrives. If timeout is not None and the result does not arrive within timeout seconds, multiprocessing.TimeoutError is raised.
wait([timeout])
Waits until the result is available or until timeout seconds pass.
ready()
Returns whether the call has completed.
successful()
Returns whether the call completed without raising an exception.
# -*- coding: utf-8 -*-
"""
Created on Sun Sep 30 12:17:58 2018
@author: Tutorials Point
"""
from multiprocessing import Pool
import time

def myfunction(m):
    return m * m

if __name__ == '__main__':
    my_pool = Pool(processes=4)   # start 4 worker processes

    # evaluate "myfunction(10)" asynchronously in a single process
    result = my_pool.apply_async(myfunction, (10,))
    print(result.get(timeout=1))               # prints "100"

    print(my_pool.map(myfunction, range(10)))  # prints "[0, 1, 4,..., 81]"

    my_it = my_pool.imap(myfunction, range(10))
    print(next(my_it))            # prints "0"
    print(next(my_it))            # prints "1"
    print(my_it.next(timeout=1))  # prints "4" unless your computer is *very* slow

    result = my_pool.apply_async(time.sleep, (10,))
    print(result.get(timeout=1))  # raises multiprocessing.TimeoutError