Why did Python multiprocessing slow down my function execution?
I am writing code for the min-cut graph problem (Karger's random contraction algorithm) in Python on Ubuntu. After proving the correctness of the single-threaded implementation, I tried to speed things up using multithreading/multiprocessing, but the code slowed down instead. I noticed strange behavior: when I use 2 threads, the execution time of the algorithm function roughly doubles compared to the single-threaded implementation, canceling out the gain of having 2 workers and resulting in an overall slowdown once the multiprocessing overhead is added. I can't figure out why. One possible explanation would be a single-core processor, but that is not the case: my laptop has an Intel® Core™ i5-6200U CPU @ 2.30GHz × 4.
Numbers
Single thread: iteration time (mostly the run time of the randomContraction function) is on average 200 ms.
    time allocation, rand sel time 0.00 collapse time 0.12 self loops time 0.08
    iteration # 6882 of 211931 took time 0.20
    graph creation time 0.00 alg run time 0.20
    elasped time 1439.17 total time 42797.71 progress 3.36
    cut = 9
Using 2 threads (from threading import Thread):
The 2 parallel iterations take on average 470 ms each; the function slowed down from 200 ms to 470 ms, and since the code waits on join(), each step takes the worst case of the two.
    time allocation, rand sel time 0.00 collapse time 0.22 self loops time 0.17
    time allocation, rand sel time 0.00 collapse time 0.25 self loops time 0.16
    iteration # 26 of 211931 took time 0.47
    graph creation time 0.05 alg run time 0.42
    elasped time 6.62 total time 50250.92 progress 0.01
    cut = None
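For reference, the two-thread variant of the driver loop presumably differs from the Process version posted further down only in the worker class; a minimal sketch, assuming the same randomContraction signature, per-worker graph copies, and cut list:

    from threading import Thread

    # Hypothetical sketch of the 2-thread driver step; randomContraction, the
    # Graph copies and the cut list are assumed to match the Process code below.
    while i < iterations:
        for j in range(len(G)):
            G[j] = copy.deepcopy(orig)      # fresh copy of the original graph per worker
        for j in range(len(threads)):
            threads[j] = Thread(target=randomContraction, args=(G[j], cut, j + i))
            threads[j].start()
        for j in range(len(threads)):
            threads[j].join()               # each step waits for the slower of the two
        i += num_threads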
Using 2 processes (from multiprocessing import Process):
The 2 parallel iterations take on average 460 ms each; the function slowed down from 200 ms to 460 ms, and waiting on join() again means each step takes the worst case of the two.
    time allocation, rand sel time 0.00 collapse time 0.24 self loops time 0.16
    time allocation, rand sel time 0.00 collapse time 0.24 self loops time 0.16
    iteration # 48 of 211931 took time 0.46
    graph creation time 0.05 alg run time 0.41
    elasped time 10.96 total time 48907.02 progress 0.02
    cut = None
My expectation was that 2 iterations would take 200 ms (or slightly more), roughly doubling the speed of the code. Instead, the run time of the randomContraction function itself seems to have doubled, negating the effect of parallelism.
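A quick back-of-the-envelope check using only the figures from the logs above (nothing re-measured) shows the same thing:

    iterations = 211931

    # single thread: ~0.20 s per iteration, one at a time
    single_total = iterations * 0.20        # ~42386 s, close to the logged 42797.71

    # two workers: ~0.47 s per pair of iterations (join waits for the slower one)
    parallel_total = iterations / 2 * 0.47  # ~49804 s, close to the logged 50250.92

    print(single_total, parallel_total)     # the "parallel" run is projected to be slower overall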
Code (I removed the lines that calculate and log the time, to reduce clutter):
    import copy
    import math
    from multiprocessing import Process
    # Graph comes from my own module (not shown here)

    def randomContraction(G, cut, i):
        # contract random edges until only two nodes remain
        while G.numNodes() > 2:
            edgeIndex = G.randEdgeIndex()
            G.collapse(edgeIndex)
            G.removeSelfLoops()
        cut[i] = G.cut()
        return

    def KragerMinCut(list):
        num_threads = 2
        elapsed = 0
        N = len(list)
        orig = Graph(list)
        iterations = int(N * N * math.log(N))
        threads = [None] * num_threads
        G = [None] * num_threads
        cut = [None] * iterations
        i = 0
        while i < iterations:
            # give each worker its own copy of the original graph
            for j in range(len(G)):
                G[j] = copy.deepcopy(orig)
            # launch one process per worker
            for j in range(len(threads)):
                threads[j] = Process(target=randomContraction, args=(G[j], cut, j + i))
                threads[j].start()
            # wait for both workers before starting the next pair
            for j in range(len(threads)):
                threads[j].join()
            i += num_threads
        print(cut)
        return min(cut)
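For completeness, the removed per-iteration timing can be approximated with time.perf_counter around the launch/join block; this is a hypothetical reconstruction matching the log format above, not my original logging code:

    import time

    # Hypothetical per-step timing around the launch/join block of the driver loop
    step_start = time.perf_counter()
    for j in range(len(threads)):
        threads[j] = Process(target=randomContraction, args=(G[j], cut, j + i))
        threads[j].start()
    for j in range(len(threads)):
        threads[j].join()
    step_time = time.perf_counter() - step_start
    print("iteration # %d of %d took time %.2f" % (i, iterations, step_time))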