Questions tagged [multiprocessing]

1 vote · 1 answer · 1k views

Detect when multiprocessing queue is empty and closed

Let's say I have two processes: a reader and a writer. How does the reader detect when the writer has finished writing values? The multiprocessing module has a queue with a close method that seems custom-built for this purpose. But how do you detect when the queue has been closed? This doesn't seem...
BlackSheep
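A common pattern for "empty and closed" is to have the writer enqueue a sentinel value that the reader treats as end-of-stream; a minimal sketch (the SENTINEL marker and the counts are illustrative, not the asker's code):

```python
import multiprocessing

SENTINEL = None  # illustrative end-of-stream marker

def writer(q):
    for i in range(5):
        q.put(i)
    q.put(SENTINEL)  # signal that no more values will arrive

def read_all(q):
    # consume until the sentinel appears; Queue.empty() alone is unreliable
    out = []
    while True:
        item = q.get()
        if item is SENTINEL:
            break
        out.append(item)
    return out

if __name__ == '__main__':
    q = multiprocessing.Queue()
    p = multiprocessing.Process(target=writer, args=(q,))
    p.start()
    print(read_all(q))  # [0, 1, 2, 3, 4]
    p.join()
```

Checking `queue.empty()` races with the writer, which is why the explicit sentinel is usually preferred.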
1 vote · 2 answers · 166 views

multiprocessing.Value doesn't store float correctly

I try to assign a float to the multiprocessing.Value shared ctype as follows: import multiprocessing import random test_float = multiprocessing.Value('f', 0) i = random.randint(1,10000)/(random.randint(1,10000)) test_float.value = i print('i: type = {}, value = {}'.format(type(i), i)) print('test_flo...
elpunkt
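The 'f' typecode denotes a single-precision (32-bit) C float, so storing a Python float (a C double) and reading it back yields the nearest 32-bit value; the 'd' typecode stores the full double. A small demonstration:

```python
import multiprocessing

# 'f' truncates a Python float to 32 bits; 'd' keeps full double precision
val_f = multiprocessing.Value('f', 0)
val_d = multiprocessing.Value('d', 0)

x = 1 / 3
val_f.value = x
val_d.value = x
print(val_f.value == x)  # False: value came back as the nearest 32-bit float
print(val_d.value == x)  # True
```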
1 vote · 1 answer · 92 views

cProfile causes pickling error when running multiprocessing Python code

I have a Python script that runs well when I run it normally: $ python script.py I am attempting to profile the code using the cProfile module: $ python -m cProfile -o script.prof script.py When I launch the above command I get an error regarding being unable to pickle a function: Traceback (most...
James Adams
0 votes · 0 answers · 6 views

Name getting corrupted in multiprocessor bound buffer

Currently I'm trying to make a program that will use multiple processes to take in several files with website URLs and then output each website URL along with its IP address to another file. This uses a bounded buffer that the requester processes use to store website URLs before passing these along to the r...
Tarface
1 vote · 1 answer · 30 views

Multiprocessing on a list being passed to a function

I have a function that processes one url at a time: def sanity(url): try: if 'media' in url[:10]: url = 'http://dummy.s3.amazonaws.com' + url req = urllib.request.Request(url, headers={'User-Agent' : 'Magic Browser'}) ret = urllib.request.urlopen(req) allurls.append(url) return 1 except (urllib.requ...
Eswar
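To parallelize a per-url function, the usual approach is to have a Pool return results to the parent rather than append to a global list, since each worker process only sees its own copy of `allurls`. A sketch with a hypothetical `sanity` stand-in in place of the asker's urllib version:

```python
import multiprocessing

def sanity(url):
    # hypothetical stand-in for the urllib check: accept http urls, reject others
    return url if url.startswith('http') else None

if __name__ == '__main__':
    urls = ['http://a.example', 'ftp://b.example', 'http://c.example']
    with multiprocessing.Pool(2) as pool:
        results = pool.map(sanity, urls)   # one url per task, run in parallel
    allurls = [u for u in results if u is not None]
    print(allurls)  # ['http://a.example', 'http://c.example']
```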
1 vote · 2 answers · 59 views

How come using multiprocessing isn't any faster? Am I doing it wrong?

I made a python script to solve a complex math problem and write the result to a text file. But it takes a long time, so I wanted to make it utilize more of my i7-7700K, because it only uses 18%. So I tried using multiprocessing, but it's not faster. Am I doing it wrong? (Side note: when I run it with...
Kodi4444
1 vote · 2 answers · 49 views

Multiprocessing on chunks of an image

I have a function that has to loop through individual pixels of an image and calculate some geometry. This function takes a very long time to run (~5 hours on a 24 Megapixel image) but seems like it should be easy to run in parallel on multiple cores. However, I can't for the life of me find a well...
Will.Evo
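One workable pattern for per-pixel work is to split the image's rows into contiguous chunks, hand each chunk to a pool worker, and concatenate the results. A sketch using a plain list-of-rows in place of real pixel data (the doubling step is a placeholder for the geometry calculation):

```python
import multiprocessing

def process_rows(rows):
    # placeholder per-pixel computation: double each pixel value
    return [[px * 2 for px in row] for row in rows]

def chunked(seq, n):
    # split seq into n nearly equal contiguous chunks
    k, m = divmod(len(seq), n)
    out, start = [], 0
    for i in range(n):
        end = start + k + (1 if i < m else 0)
        out.append(seq[start:end])
        start = end
    return out

if __name__ == '__main__':
    image = [[r * 10 + c for c in range(4)] for r in range(8)]  # toy 8x4 "image"
    with multiprocessing.Pool(4) as pool:
        parts = pool.map(process_rows, chunked(image, 4))
    result = [row for part in parts for row in part]  # reassemble in order
    print(result == [[px * 2 for px in row] for row in image])  # True
```

With a real numpy image, `numpy.array_split(img, n)` plays the role of `chunked` and the chunks can be stacked back with `numpy.vstack`.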
0 votes · 1 answer · 17 views

Multiprocessing “Pool” hangs on running program

I have written a program to use 'Pool' from the multiprocessing python package, but it keeps hanging the kernel. (I am using Jupyter with Anaconda, by the way.) I tried using a function from an imported package like 'math', which works perfectly, but the moment I use a function I created in the same pyth...
Chandra Sangala
1 vote · 1 answer · 37 views

Why multiprocessing.Pool cannot change global variable?

I want to use multiprocessing.Pool to load a large dataset, here is the code I'm using: import os from os import listdir import pickle from os.path import join import multiprocessing as mp db_path = db_path the_files = listdir(db_path) fp_dict = {} def loader(the_hash): global fp_dict the_file = joi...
meTchaikovsky
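A Pool worker runs in a separate process, so assignments to a global like `fp_dict` only change the worker's copy and never reach the parent; the usual fix is to return (key, value) pairs and build the dict in the parent. A minimal sketch with a hypothetical `loader` in place of the pickle-loading version:

```python
import multiprocessing

def loader(the_hash):
    # hypothetical loader: return a (key, value) pair instead of mutating
    # a global dict, which each worker process would only see a copy of
    return the_hash, len(the_hash)

if __name__ == '__main__':
    the_files = ['aa', 'bbb', 'c']
    with multiprocessing.Pool(2) as pool:
        fp_dict = dict(pool.map(loader, the_files))
    print(fp_dict)  # {'aa': 2, 'bbb': 3, 'c': 1}
```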
1 vote · 1 answer · 93 views

Share RabbitMQ channel between multiple python processes

I want to share a BlockingChannel across multiple python processes, in order to send basic_ack from another python process. How can the BlockingChannel be shared across multiple python processes? Following is the code: self.__connection__ = pika.BlockingConnection(pika.ConnectionParameters(host='localhost...
Shan Khan
1 vote · 0 answers · 349 views

Python 3.5.x multiprocessing throwing “OSError: [Errno 12] Cannot allocate memory” on a 74% free RAM system

I wrote a Python 3.5 application that spawns several processes using multiprocessing library. It runs on Ubuntu 16.04.3 LTS on a dedicated server machine that ships with 2 x Intel Xeon E2690 v1 (8 cores each) and 96GB of RAM. This system runs a PostgreSQL instance which is configured to use a maximu...
pietrop
0 votes · 0 answers · 2 views

Maximum number of CUDA blocks?

I want to implement an algorithm in CUDA that takes an input of size N and uses N^2 threads to execute it (this is the way the particular algorithm works). I've been asked to make a program that can handle up to N = 2^10. I think for my system a given thread block can have up to 512 threads, but for...
gkeenley
-1 votes · 1 answer · 24 views

Getting started with multiprocessing in Python

I'm trying to set up multiprocessing in my script: I want to loop through an array and run a function for every item in the array, but I want this function to be called simultaneously. This is the original set up: def my_function(my_variable): #DO stuff return my_variable_updated def main(): #initiali...
1 vote · 0 answers · 172 views

Python multiprocessing fails to terminate on KeyboardInterrupt, freezing Windows

I have a simple function, getHashes(), that calculates a checksum when supplied with a list/dict of data. I am using multiprocessing to calculate these hashes in parallel, using multiple cores. However, if I CTRL + C to end the process, my entire machine locks up. I am using Python 3.6 on Windows 10...
ensnare
1 vote · 0 answers · 58 views

Multiprocessing code bug

I want to read/write data from Serial port and record video simultaneously. For this task I am using multiprocessing module in Python. I wrote the following code as a test which runs fine when I run it in a separate script i.e. it records a video and writes to a file. video_length = 10 time_stamp =...
Nischal
1 vote · 0 answers · 13 views

Creating a serial object causing problems with multiprocessing module

I am trying to use the multiprocessing module along with reading data from a serial port (please also refer to my earlier post, Multiprocessing code bug). I verified this by systematically commenting out the code. The code runs fine, but as soon as a serial object is initialized, the multiprocessing doesn't p...
Nischal
1 vote · 0 answers · 50 views

Killing a Sub Process while using Multiprocessing

I need to shorten multiple urls using the Google Shortener API. Since the shortening tasks are independent of one another, I decided to use the multiprocessing library in Python. raw_data is a data frame which contains my long urls. Api_Key is a list which contains API keys from multiple google accounts (as i...
anarchy
1 vote · 0 answers · 24 views

Fastest way to call two functions multiple times without using a list of input parameters

I have written a small program that uses a pipe. The parent takes care of the camera connection while the child is processing the images. The child process calls two functions, FunctionA and FunctionB. Both times the image from the parent is processed. I want to run the two functions as fast as possib...
Max Krappmann
1 vote · 0 answers · 78 views

What is the proper way to update a VAO/VBO between threads?

Here is the problem I came across: I read 3d model data in a thread and render it in the main thread. What is the proper way to create or update vao/vbo data? Currently I set a bool value to indicate whether the main thread needs to update the vbo data, like this: void RenderModel(){ if(isNeedUpdateVBO){ up...
Damons
1 vote · 0 answers · 179 views

Multithreading/Multiprocessing in Python 3.4 web app for Azure

I'm working on writing a Python web app with Flask using Azure to host. I need to do some background work. When I test the app locally, everything works great. However, as soon as I push the update to Azure, it stops functioning. Right now, I have a multithreading.Process set up, and based on the lo...
Sean Dempsey
1 vote · 2 answers · 459 views

How to combine multiprocessing and eventlet

I have a task that needs to start 2 processes, and within each process I need to start 2 threads to do the real work. Below is the source code I used to simulate my use case. import multiprocessing import eventlet def subworker(num1, num2): print 'Start subworker %d,%d\n' % (num1, num2) eventlet.sleep(10) prin...
Leehom
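The two-processes-times-two-threads layout can be sketched with the standard threading module standing in for eventlet (`subworker` here is a placeholder computation, not the asker's function):

```python
import multiprocessing
import threading

def subworker(num1, num2):
    return num1 * 10 + num2  # placeholder for the real work

def worker(num, out_q):
    # each process starts 2 threads, mirroring the question's layout
    results = []
    def run(i):
        results.append(subworker(num, i))
    threads = [threading.Thread(target=run, args=(i,)) for i in range(2)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    out_q.put(sorted(results))

if __name__ == '__main__':
    q = multiprocessing.Queue()
    procs = [multiprocessing.Process(target=worker, args=(n, q)) for n in range(2)]
    for p in procs:
        p.start()
    results = sorted(q.get() for _ in procs)  # drain the queue before joining
    for p in procs:
        p.join()
    print(results)  # [[0, 1], [10, 11]]
```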
1 vote · 0 answers · 79 views

How to use multiprocessing.Process when output is large for multiprocessing.Queue?

What I am trying to do: a Python script that would read a large file (1-8G), match on a field called record (which can be 'r' or 'f') and which can have the same ID. If the ID is the same, combine the 'f' and 'r' lines and save them to a dictionary with the ID as the key. In order to speed up the process, I split the fil...
Praniti Gupta
1 vote · 2 answers · 63 views

Multiprocess with shared variable in bash

I'm trying to achieve a dynamic progress bar in bash script, the kind we see when installing new packages. In order to do this, a randomtask would call a progressbar script as a background task and feed it with some integer values. The first script uses a pipe to feed the second. #!/bin/bash # rando...
Gabriel Forien
1 vote · 0 answers · 82 views

How does multi-processing in python increase the execution time?

I have tried a simple piece of python code containing two for-loops using a multiprocessing pool. First I set pool=1; the total execution time was 22211 ms, while for pool=40, the total execution time for each of the 40 processes was: Time taken: 24045, Time taken: 24047, Time taken: 24072, Time taken: 2409...
MUKUND KUMAR
1 vote · 0 answers · 35 views

Terminating all sub-processes after an event

I have a Python multiprocessing scenario which I have simplified for my question here. There are x number of jobs to be processed in 2 parts. In my code, the 2 job parts are actually HTTP requests where part 2 is dependent on the results of part 1. Finally, there is a 3rd part that simply reports on...
Jason
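A `multiprocessing.Event` is one way to let any sub-process signal its siblings to stop. A sketch under the assumption that each worker polls the event inside its loop (the counting loop is a placeholder for the HTTP request parts):

```python
import multiprocessing

def worker(stop_event, out_q, limit):
    # placeholder job: loop until finished or until a sibling signals stop
    count = 0
    while not stop_event.is_set():
        count += 1
        if count >= limit:
            stop_event.set()  # first process to finish stops everyone else
    out_q.put(count)

if __name__ == '__main__':
    stop = multiprocessing.Event()
    q = multiprocessing.Queue()
    procs = [multiprocessing.Process(target=worker, args=(stop, q, 1000 * (i + 1)))
             for i in range(3)]
    for p in procs:
        p.start()
    counts = [q.get() for _ in procs]  # each worker reports how far it got
    for p in procs:
        p.join()
    print(counts)
```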
1 vote · 0 answers · 111 views

Concurrent Futures and the correct number of threads to use.

I'm developing a webscraping tool and I am using concurrent processes. I want to know if there is a general rule for the number of threads you need to use. Currently, I have it set for 10, but I've noticed that I get a lot more missing data values when I push the number past the amount of threads....
CENTURION
1 vote · 1 answer · 44 views

Run function in same module and file in different process

If I have the following Julia code snippet, is there any way I can run the for loop with multiple processes without putting complicated into an extra file and doing something like @everywhere include('complicated.jl')? Otherwise, the processes don't seem to be able to find the function. function com...
elya5
1 vote · 0 answers · 48 views

Return a value and skip a line after x seconds

Say I have a code like this: def bar(a): def foo(): b = 1+a c = b + 10 #line responsible for en z = b+c return z if __name__ == '__main__': # Start bar as a process p = multiprocessing.Process(target=foo) p.start() # Wait for 10 seconds or until process finishes p.join(10) # If thread is still activ...
Andy G
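`Process.join` accepts a timeout, after which the parent can check `is_alive()` and `terminate()` the child; a minimal sketch of the pattern (the `slow` function is a stand-in for the line that hangs):

```python
import multiprocessing
import time

def slow():
    time.sleep(60)  # stands in for a step that may hang

if __name__ == '__main__':
    p = multiprocessing.Process(target=slow)
    p.start()
    p.join(0.5)           # wait at most 0.5 seconds
    if p.is_alive():      # timed out: give up and kill the child
        p.terminate()
        p.join()
    print(p.exitcode)     # non-zero here, since the process was killed
```

Note that `terminate()` stops the whole child process; it cannot skip a single line and continue, so any "return a value after x seconds" logic belongs in the parent.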
1 vote · 1 answer · 1.2k views

Running same function in parallel Python

I have created a function to gather cryptocurrency prices. What I want to do is to run the function to gather order book prices for different cryptos in parallel. The function is basically the same, the only thing changing is the crypto. example: def gather_prices(pair): get_prices = order_book(pair...
Mariano
1 vote · 0 answers · 68 views

Errors in csv output file; data collected from Arduino

I'm creating a program needed for my master's thesis. The program reads data from an Arduino and writes it into a csv file. This is only part of the program, so I use the multiprocessing module. But the program makes a few bugs in the csv file, especially at the beginning. It should write a 3-digit number in...
Jarshah
1 vote · 0 answers · 96 views

How to make Queue work with Pool function in multiprocessing

I have seen this example here on the website and I am wondering how to make the code work in accordance with the pool function. I have been through a lot of documentation but the closest to this code is the question asked by Olson: Can I use a multiprocessing Queue in a function called by Pool.imap...
Joemoreneau
1 vote · 1 answer · 335 views

Overhead in Python multiprocessing module

I am using the multiprocessing module in Python and expect some overhead launching the process, creating a queue and putting and getting value to/from the queue. However, if the sub-process has enough work to do, I would expect that the overhead would eventually be washed out. Running a simple exa...
Donna
1 vote · 2 answers · 582 views

Python running multiple processes

I have a script (a.py) that needs to call another script (b.py) multiple times, each execution of b.py takes 1 minute. If I run in a for loop, this takes a lot of time. I want to know how I can optimize this, to reduce the time. Any help or suggestions would be very helpful. Source code: # a.py i...
R George
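One way to run b.py several times concurrently is to combine `subprocess` with a process pool; in this sketch an inline `-c` script stands in for b.py (which stays elided in the question), and each invocation doubles its argument:

```python
import concurrent.futures
import subprocess
import sys

def run_b(arg):
    # hypothetical stand-in: an inline script plays the role of b.py
    proc = subprocess.run(
        [sys.executable, '-c', 'print({} * 2)'.format(int(arg))],
        capture_output=True, text=True)
    return proc.stdout.strip()

if __name__ == '__main__':
    # run several child-script invocations in parallel instead of a serial loop
    with concurrent.futures.ProcessPoolExecutor(max_workers=4) as ex:
        outputs = list(ex.map(run_b, [1, 2, 3]))
    print(outputs)  # ['2', '4', '6']
```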
1 vote · 0 answers · 117 views

Updating batch image array in-place when using joblib

This is a follow-up question for my solution to the question below: How to apply a function in parallel to multiple images in a numpy array? My suggested solution works fine if the function process_image() has to return the result and then we can cache that to some list for later processing. Since I...
kmario23
1 vote · 0 answers · 46 views

Passing arguments vs. using pipes to communicate with child processes

In the following code, I time how long it takes to pass a large array (8 MB) to a child process using the args keyword when forking the process versus passing it using a pipe. Does anyone have any insight into why it is so much faster to pass data using an argument than using a pipe? Below, each co...
Donna
1 vote · 0 answers · 70 views

Does numpy linalg.eig() block multiprocessing?

I am trying to run a function in parallel with the multiprocessing module. However there seems to be a deadlock. It seems to be due to numpy's np.linalg.eig. If I comment out this function my code runs as expected. Does this occur due to some low level BLAS/LAPACK implementation? And is there a w...
ritchie46
1 vote · 0 answers · 103 views

Embedding python in C++ for multiprocessing

I have a Python program, however it seems I cannot really scale it because of the lack of multiprocessing. We have added threading, but since it still runs on one core we cannot scale enough. I saw here that it is possible to embed python in C++ programs. So I thought to do multiprocessing in C++ an...
user3605780
1 vote · 0 answers · 33 views

python - How to interrupt a multiprocessing program and store specific data from every core to load it next time it will run?

I have the following python3 code: from multiprocessing import Pool, cpu_count import hashlib import signal def checkHash(argsList): key = argsList[0] pattern = argsList[1] cpusNum = argsList[2] coreId = argsList[3] while True: coreId += cpusNum newHash = hashlib.sha256() newFeed = key + '_' + str(c...
saavedra29
1 vote · 0 answers · 338 views

Python: What is the biggest difference between `Celery` lib and `Multiprocessing` lib in respect of parallel programming?

I think that all tasks that can be done using Celery can also be done via the multiprocessing library. Despite this, I wonder why one would use Celery instead of multiprocessing in a Python program or a web framework such as Django, Flask, etc.
user3595632
1 vote · 0 answers · 28 views

Operating on multiple concurrent.futures in Python

Hi, I'm coming from the Java development world, where multiple FutureTasks can have dependency relationships with nice interfaces. I'm starting to use ThreadPoolExecutor in python, and I'm looking for something like this: def getSomething(): return 'Hello' def combine(r1, r2): return r1 + r2 executor = ThreadPool...
i3wangyi
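One way to express the dependency with concurrent.futures is to submit a follow-up task that blocks on the results of the earlier futures; a minimal sketch reusing the question's names (renamed to snake_case):

```python
from concurrent.futures import ThreadPoolExecutor

def get_something():
    return 'Hello'

def combine(r1, r2):
    return r1 + r2

with ThreadPoolExecutor(max_workers=2) as executor:
    f1 = executor.submit(get_something)
    f2 = executor.submit(get_something)
    # express the dependency by submitting a task that waits on both results;
    # f1 and f2 were submitted first, so this cannot starve them of workers
    f3 = executor.submit(lambda: combine(f1.result(), f2.result()))
    print(f3.result())  # HelloHello
```

`Future.add_done_callback` is the non-blocking alternative when the follow-up work should run without tying up a pool slot.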
