python 3.x - execute multiple functions in parallel


I have a large number of functions, and each of them should load a URL from a list and perform some operations. I need all of these tasks to run in parallel, if that is possible. I have some code here, but I'm not sure it does what I want; can it be simplified? Thanks

    import multiprocessing


    class Multi(object):
        # ------------------------------------------------------------------
        def __init__(self, urls_func1, urls_func2):
            """Initialise the class with the lists of URLs."""
            self.urls_func1 = urls_func1
            self.urls_func2 = urls_func2

        # ------------------------------------------------------------------
        def func1(self, url, que):
            """Do something with url, then report the result."""
            result_func1 = url  # placeholder for the real work
            que.put(result_func1)

        # ------------------------------------------------------------------
        def func2(self, url, que):
            """Do something else with url, then report the result."""
            result_func2 = url  # placeholder for the real work
            que.put(result_func2)

        # ------------------------------------------------------------------
        def run(self):
            # func1: one process and one queue per URL
            jobs = []
            queues1 = []
            for url in self.urls_func1:
                queues1.append(multiprocessing.Queue())
                process = multiprocessing.Process(
                    target=self.func1, args=(url, queues1[len(jobs)]))
                jobs.append(process)
                process.start()
            for job in jobs:
                job.join()
            results_func1 = [que.get() for que in queues1]

            # func2: same pattern for the second list of URLs
            jobs = []
            queues2 = []
            for url in self.urls_func2:
                queues2.append(multiprocessing.Queue())
                process = multiprocessing.Process(
                    target=self.func2, args=(url, queues2[len(jobs)]))
                jobs.append(process)
                process.start()
            for job in jobs:
                job.join()
            results_func2 = [que.get() for que in queues2]

            return results_func1, results_func2


    if __name__ == "__main__":
        urls_for_func1 = ['www.google.com', 'www.google.com',
                          'www.google.com', 'www.google.com']
        urls_for_func2 = ['www.google.com', 'www.google.com', 'www.google.com']
        a, b = Multi(urls_for_func1, urls_for_func2).run()


You can download multiple URLs simultaneously while limiting the total number of parallel jobs with a thread pool:

    from multiprocessing.dummy import Pool  # use threads

    def func1(url):
        # do something with url
        return url, "result", None

    def func2(url):
        # do something with url
        return url, None, "error description"  # on failure

    pool = Pool(20)  # no more than 20 concurrent connections at a time
    results_for_func1 = pool.map(func1, urls_func1)  # blocks until done
    results_for_func2 = pool.map(func2, urls_func2)
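Note that pool.map blocks until every job in its batch has finished, so the func2 batch only starts after all of the func1 calls have returned. If the two batches should be in flight at the same time, pool.map_async can be used instead. Here is a minimal sketch of that variant; the func1/func2 bodies and the URL lists are stand-ins borrowed from the question, not part of the original answer:

    from multiprocessing.dummy import Pool  # thread pool, as above

    def func1(url):
        return url, "result", None  # stand-in for the real work

    def func2(url):
        return url, None, "error description"  # stand-in for the real work

    urls_func1 = ['www.google.com'] * 4  # sample data from the question
    urls_func2 = ['www.google.com'] * 3

    pool = Pool(20)  # at most 20 worker threads in total
    async1 = pool.map_async(func1, urls_func1)  # returns immediately
    async2 = pool.map_async(func2, urls_func2)  # both batches now run together
    results_for_func1 = async1.get()  # block only when the results are needed
    results_for_func2 = async2.get()
    pool.close()
    pool.join()

Since multiprocessing.dummy backs the pool with threads, this suits I/O-bound work such as downloading; for CPU-bound work, the process-based multiprocessing.Pool offers the same interface.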
