file io - Asynchronous running of Python functions


I have a function that (based on parameters) moves/renames image files from a mapped card-reader drive to a Windows server drive. After a file is moved, it is renamed using a template, appending an incrementing number in a loop.

There are 3 cards of photos to be sent to the destination folder. Each card is processed one after the other, and because of the file sizes and the travel over the network, this can take quite some time.

Is there a way to have a function receive the list of mapped card drives (no more than 3) and run the rename function simultaneously for each card?

My poor attempt to illustrate what I'm trying to do follows:

def collectcards(cards):
    for card in cards:
        # my goal: run each instance of the following function asynchronously
        drv = renameimages(card)

def renameimages(carddrive):
    # perform the renaming
    return count_of_renamed_images
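For context, a minimal sketch of what such a rename function could look like; the template string, the destination path, and the os/shutil calls are assumptions for illustration, not the asker's actual code:

import os
import shutil

def renameimages(carddrive, template="IMG_{:05d}.jpg", dest=r"\\server\photos"):
    # Walk the card drive, moving each image to the destination
    # under a templated name with an incrementing number.
    count = 0
    for name in os.listdir(carddrive):
        if name.lower().endswith((".jpg", ".jpeg", ".png")):
            count += 1
            new_name = template.format(count)
            shutil.move(os.path.join(carddrive, name),
                        os.path.join(dest, new_name))
    return count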

You could try using multiprocessing with either processes (Pool) or threads (pool.ThreadPool). In both cases the only difference is the import - the API stays the same:

from multiprocessing import Pool

my_pool = Pool(3)
# `cards` is the list of cards. Send the rename requests to the workers.
my_pool.map(renameimages, cards)
# Close the pool and wait for the worker processes to finish.
my_pool.close()
my_pool.join()
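Note that map also returns the workers' results in order, so the counts that renameimages returns can be captured directly:

counts = my_pool.map(renameimages, cards)  # one count per card, e.g. [120, 98, 143]
total_renamed = sum(counts)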

The number in Pool(3) indicates the number of worker processes - a higher number means more concurrent renameimages calls running. Bear in mind that multiprocessing requires the card objects to be picklable. If renameimages is not heavy on memory, try using ThreadPool instead - the card objects are then shared between the threads.
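A minimal sketch of the thread-based variant, assuming the same renameimages function as above; as noted, only the import changes:

from multiprocessing.pool import ThreadPool

# Threads share memory, so the card objects do not need to be picklable.
my_pool = ThreadPool(3)
counts = my_pool.map(renameimages, cards)
my_pool.close()
my_pool.join()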

