Questions tagged [multiprocessing]

1 vote · 0 answers · 349 views

Python 3.5.x multiprocessing throwing “OSError: [Errno 12] Cannot allocate memory” on a 74% free RAM system

I wrote a Python 3.5 application that spawns several processes using multiprocessing library. It runs on Ubuntu 16.04.3 LTS on a dedicated server machine that ships with 2 x Intel Xeon E2690 v1 (8 cores each) and 96GB of RAM. This system runs a PostgreSQL instance which is configured to use a maximu...
pietrop
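
Not the asker's code, but a commonly suggested mitigation for fork-time ENOMEM despite plenty of free RAM (strict overcommit plus a large parent image): switch the start method so children are not forked from the full parent address space. A minimal sketch, assuming a Unix host and a placeholder worker:

    import multiprocessing as mp

    def worker(n):
        # placeholder for the real per-process work
        return n * n

    if __name__ == '__main__':
        # 'forkserver' (or 'spawn') avoids fork()-ing the whole parent address space,
        # which is what typically triggers ENOMEM under strict vm.overcommit settings.
        # Must be called once, before any Pool or Process is created.
        mp.set_start_method('forkserver')
        with mp.Pool(processes=4) as pool:
            print(pool.map(worker, range(8)))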

0 votes · 0 answers · 2 views

Maximum number of CUDA blocks?

I want to implement an algorithm in CUDA that takes an input of size N and uses N^2 threads to execute it (this is the way the particular algorithm works). I've been asked to make a program that can handle up to N = 2^10. I think for my system a given thread block can have up to 512 threads, but for...
gkeenley

-1 votes · 1 answer · 24 views

Getting started with multiprocessing in Python

I'm trying to set up multiprocessing in my script: I want to loop through an array and run a function for every item in the array, but I want this function to be called simultaneously. This is the original set up: def my_function(my_variable): #DO stuff return my_variable_updated def main(): #initiali...
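
The usual pattern for "run a function over every item of a list in parallel" is multiprocessing.Pool.map. A minimal sketch reusing the names from the excerpt; the body of my_function is a placeholder:

    from multiprocessing import Pool

    def my_function(my_variable):
        # do stuff (placeholder for the real update)
        return my_variable * 2

    def main():
        my_array = [1, 2, 3, 4]
        # Pool.map distributes the items across worker processes and
        # returns the results in the original order.
        with Pool(processes=4) as pool:
            results = pool.map(my_function, my_array)
        print(results)

    if __name__ == '__main__':
        main()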

1 vote · 0 answers · 172 views

Python multiprocessing fails to terminate on KeyboardInterrupt, freezing Windows

I have a simple function, getHashes(), that calculates a checksum when supplied with a list/dict of data. I am using multiprocessing to calculate these hashes in parallel, using multiple cores. However, if I press Ctrl+C to end the process, my entire machine locks up. I am using Python 3.6 on Windows 10...
ensnare
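
A sketch of the workaround usually given for this situation (not the asker's code): let the workers ignore SIGINT so only the parent sees the KeyboardInterrupt, and use map_async().get(timeout) so the main process stays interruptible; getHashes is replaced by a placeholder:

    import signal
    from multiprocessing import Pool

    def init_worker():
        # Workers ignore Ctrl+C; only the parent handles KeyboardInterrupt.
        signal.signal(signal.SIGINT, signal.SIG_IGN)

    def get_hashes(chunk):
        return hash(tuple(chunk))   # stand-in for the real getHashes()

    if __name__ == '__main__':
        pool = Pool(processes=4, initializer=init_worker)
        try:
            # get(timeout=...) keeps the parent interruptible, unlike a bare map()
            results = pool.map_async(get_hashes, [[1, 2], [3, 4]]).get(timeout=600)
            print(results)
        except KeyboardInterrupt:
            pool.terminate()        # kill workers instead of leaving them hanging
        else:
            pool.close()
        pool.join()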

1 vote · 0 answers · 58 views

Multiprocessing code bug

I want to read/write data from a serial port and record video simultaneously. For this task I am using the multiprocessing module in Python. I wrote the following code as a test, which runs fine when I run it in a separate script, i.e. it records a video and writes to a file. video_length = 10 time_stamp =...
Nischal

1 vote · 0 answers · 13 views

Creating a serial object causing problems with multiprocessing module

I am trying to use the multiprocessing module along with reading data from a serial port (please also refer to my earlier post Multiprocessing code bug). I verified it by systematically commenting out the code. The code runs fine, but as soon as a serial object is initialized, the multiprocessing doesn't p...
Nischal
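
The fix usually suggested for this class of problem is to open the serial port inside the child process instead of creating the Serial object in the parent, so the handle is never inherited or pickled. A minimal sketch, assuming pyserial is installed; the port name and line count are hypothetical:

    from multiprocessing import Process, Queue
    import serial   # pyserial, assumed available

    def serial_reader(port_name, out_q):
        # The port is opened here, inside the child process.
        ser = serial.Serial(port_name, 9600, timeout=1)
        try:
            for _ in range(10):
                line = ser.readline()
                if line:
                    out_q.put(line)
        finally:
            ser.close()

    if __name__ == '__main__':
        q = Queue()
        p = Process(target=serial_reader, args=('/dev/ttyUSB0', q))  # hypothetical port
        p.start()
        p.join()
        while not q.empty():
            print(q.get())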

1 vote · 0 answers · 50 views

Killing a Sub Process while using Multiprocessing

I need to shorten multiple URLs using the Google Shortener API. Since each shortening process is not interdependent, I decided to use the multiprocessing library in Python. raw_data is a data frame which contains my long url. Api_Key is a list which contains API keys from multiple Google accounts (as i...
anarchy

1 vote · 0 answers · 24 views

Fastest way to call two functions multiple times without using a list of input parameters

I have written a small program that uses a pipe. The parent takes care of the camera connection while the child is processing the images. The child process calls two functions, FunctionA and FunctionB. In both cases the image from the parent is processed. I want to run the two functions as fast as possib...
Max Krappmann

1 vote · 0 answers · 78 views

What is the proper way to update VAO/VBO data between threads?

Here is the problem I came across: I read 3D model data in a thread and render it in the main thread. What is the proper way to create or update VAO/VBO data? Currently I set a bool value to indicate whether the main thread needs to update the VBO data, like this: void RenderModel(){ if(isNeedUpdateVBO){ up...
Damons

1 vote · 0 answers · 179 views

Multithreading/Multiprocessing in Python 3.4 web app for Azure

I'm working on writing a Python web app with Flask using Azure to host. I need to do some background work. When I test the app locally, everything works great. However, as soon as I push the update to Azure, it stops functioning. Right now, I have a multithreading.Process set up, and based on the lo...
Sean Dempsey

1 vote · 2 answers · 459 views

How to combine multiprocessing and eventlet

I have a task that needs to start 2 processes, and within each process I need to start 2 threads to do the real work. Below is the source code I used to simulate my use case. import multiprocessing import eventlet def subworker(num1, num2): print 'Start subworker %d,%d\n' % (num1, num2) eventlet.sleep(10) prin...
Leehom

1 vote · 0 answers · 79 views

How to use multiprocessing.Process when output is large for multiprocessing.Queue?

What am I trying to do? A Python script that would read a large file (1-8G) and match on a field called record (which can be 'r' or 'f'), where records can have the same ID. If the ID is the same, combine the 'f' and 'r' lines and save them to a dictionary with the ID as the key. In order to speed up the process, I split the fil...
Praniti Gupta
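
The usual culprit with large results is that a child blocks inside Queue.put() until the parent consumes the data, so calling join() before draining the queue deadlocks. A minimal sketch of the drain-then-join order; the record merging is a placeholder:

    from multiprocessing import Process, Queue

    def worker(chunk, out_q):
        # Build a potentially large result and ship it back.
        result = {i: ('f-line', 'r-line') for i in chunk}   # placeholder merge
        out_q.put(result)

    if __name__ == '__main__':
        chunks = [range(0, 50000), range(50000, 100000)]
        q = Queue()
        procs = [Process(target=worker, args=(c, q)) for c in chunks]
        for p in procs:
            p.start()

        # Drain BEFORE joining: a child does not exit until its queued data
        # has been consumed, so joining first hangs when the output is large.
        merged = {}
        for _ in procs:
            merged.update(q.get())

        for p in procs:
            p.join()
        print(len(merged))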

1 vote · 2 answers · 63 views

Multiprocess with shared variable in bash

I'm trying to achieve a dynamic progress bar in bash script, the kind we see when installing new packages. In order to do this, a randomtask would call a progressbar script as a background task and feed it with some integer values. The first script uses a pipe to feed the second. #!/bin/bash # rando...
Gabriel Forien

1 vote · 0 answers · 82 views

How does multi-processing in Python increase the execution time?

I have tried a simple piece of Python code containing two for-loops using a multiprocessing pool. First I set pool=1; the total execution time was 22211 ms, while for pool=40 the total execution time for each of the 40 processes was: Time taken: 24045, Time taken: 24047, Time taken: 24072, Time taken: 2409...
MUKUND KUMAR

1 vote · 0 answers · 35 views

Terminating all sub-processes after an event

I have a Python multiprocessing scenario which I have simplified for my question here. There are x number of jobs to be processed in 2 parts. In my code, the 2 job parts are actually HTTP requests where part 2 is dependent on the results of part 1. Finally, there is a 3rd part that simply reports on...
Jason
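
One common shape for "stop everything once some job fails" is a shared multiprocessing.Event that each worker checks between its two parts. A minimal sketch with placeholder work standing in for the two HTTP requests:

    import time
    from multiprocessing import Process, Event

    def job(job_id, stop_event):
        if stop_event.is_set():
            return                  # another job already signalled a stop
        time.sleep(0.1)             # part 1 (placeholder for the first request)
        if job_id == 3:             # pretend this job hits the fatal condition
            stop_event.set()
            return
        if stop_event.is_set():
            return
        time.sleep(0.1)             # part 2, dependent on part 1 (placeholder)

    if __name__ == '__main__':
        stop = Event()
        procs = [Process(target=job, args=(i, stop)) for i in range(8)]
        for p in procs:
            p.start()
        for p in procs:
            p.join()
        print('stopped early:', stop.is_set())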

1 vote · 0 answers · 111 views

Concurrent Futures and the correct number of threads to use.

I'm developing a webscraping tool and I am using concurrent processes. I want to know if there is a general rule for the number of threads you need to use. Currently, I have it set for 10, but I've noticed that I get a lot more missing data values when I push the number past the amount of threads....
CENTURION
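
There is no universal rule; for I/O-bound scraping the thread count is usually bounded by what the target site tolerates rather than by CPU cores, so it helps to keep the pool size an explicit knob and handle per-request failures. A sketch with concurrent.futures; the URLs and fetch body are placeholders:

    import concurrent.futures
    import urllib.request

    URLS = ['https://example.com/page/%d' % i for i in range(20)]   # placeholder URLs

    def fetch(url):
        with urllib.request.urlopen(url, timeout=10) as resp:
            return url, len(resp.read())

    if __name__ == '__main__':
        # max_workers is the knob to tune: too high and the site starts refusing
        # or throttling requests, which shows up as missing data.
        with concurrent.futures.ThreadPoolExecutor(max_workers=10) as executor:
            futures = {executor.submit(fetch, u): u for u in URLS}
            for fut in concurrent.futures.as_completed(futures):
                try:
                    url, size = fut.result()
                    print(url, size)
                except Exception as exc:
                    print(futures[fut], 'failed:', exc)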

1 vote · 1 answer · 44 views

Run function in same module and file in different process

If I have the following Julia code snippet, is there any way I can run the for loop with multiple processes without putting complicated into an extra file and doing something like @everywhere include('complicated.jl')? Otherwise, the processes don't seem to be able to find the function. function com...
elya5

1 vote · 0 answers · 48 views

Return a value and skip a line after x seconds

Say I have a code like this: def bar(a): def foo(): b = 1+a c = b + 10 #line responsible for en z = b+c return z if __name__ == '__main__': # Start bar as a process p = multiprocessing.Process(target=foo) p.start() # Wait for 10 seconds or until process finishes p.join(10) # If thread is still activ...
Andy G
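
Since a Process cannot return a value directly, the usual shape is to push the result through a Queue and combine join(timeout) with terminate(). A minimal sketch reusing the names from the excerpt; the computation is a placeholder:

    from multiprocessing import Process, Queue

    def foo(a, out_q):
        b = 1 + a
        c = b + 10          # pretend this line can hang for a long time
        z = b + c
        out_q.put(z)

    if __name__ == '__main__':
        q = Queue()
        p = Process(target=foo, args=(5, q))
        p.start()
        p.join(10)           # wait at most 10 seconds
        if p.is_alive():
            p.terminate()    # give up on the slow line
            p.join()
            result = None    # skip / fall back
        else:
            result = q.get()
        print(result)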

1 vote · 1 answer · 1.2k views

Running same function in parallel Python

I have created a function to gather cryptocurrency prices. What I want to do is to run the function to gather order book prices for different cryptos in parallel. The function is basically the same, the only thing changing is the crypto. example: def gather_prices(pair): get_prices = order_book(pair...
Mariano

1 vote · 0 answers · 68 views

Errors in CSV output file; data collected from Arduino

I'm creating a program needed for my master's thesis. The program reads data from an Arduino and writes it into a CSV file. This is only part of the program, so I use the multiprocessing module. But the program produces a few errors in the CSV file, especially at the beginning. It should write a 3-digit number in...
Jarshah

1 vote · 0 answers · 96 views

How to make Queue work with Pool function in multiprocessing

I have seen this example here on the website and I am wondering how to make the code work in accordance with the pool function. I have been through a lot of documentation but the closest to this code is the question asked by Olson: Can I use a multiprocessing Queue in a function called by Pool.imap...
Joemoreneau
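
A plain multiprocessing.Queue() cannot be handed to Pool workers as a task argument, but a Manager().Queue() proxy can. A minimal sketch of that combination (not the linked example's code):

    from functools import partial
    from multiprocessing import Manager, Pool

    def worker(q, item):
        q.put(item * item)   # report progress/results through the managed queue
        return item

    if __name__ == '__main__':
        with Manager() as manager:
            q = manager.Queue()
            with Pool(processes=4) as pool:
                pool.map(partial(worker, q), range(10))
            while not q.empty():
                print(q.get())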

1 vote · 1 answer · 335 views

Overhead in Python multiprocessing module

I am using the multiprocessing module in Python and expect some overhead launching the process, creating a queue and putting and getting value to/from the queue. However, if the sub-process has enough work to do, I would expect that the overhead would eventually be washed out. Running a simple exa...
Donna

1 vote · 2 answers · 582 views

Python running multiple processes

I have a script (a.py) that needs to call another script (b.py) multiple times, each execution of b.py takes 1 minute. If I run in a for loop, this takes a lot of time. I want to know how I can optimize this, to reduce the time. Any help or suggestions would be very helpful. Source code: # a.py i...
R George
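
If b.py has to stay a separate script, one option is to launch several copies of it concurrently from a.py; threads are enough for the dispatch because the real work runs in the spawned b.py processes. A sketch, assuming b.py needs no command-line arguments:

    # a.py
    import subprocess
    import sys
    from concurrent.futures import ThreadPoolExecutor

    def run_b(i):
        # Each call runs one independent copy of b.py (~1 minute each).
        return i, subprocess.run([sys.executable, 'b.py']).returncode

    if __name__ == '__main__':
        with ThreadPoolExecutor(max_workers=4) as pool:
            for i, rc in pool.map(run_b, range(10)):
                print('run', i, 'exited with', rc)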

1 vote · 0 answers · 117 views

Updating batch image array in-place when using joblib

This is a follow-up question for my solution to the question below: How to apply a function in parallel to multiple images in a numpy array? My suggested solution works fine if the function process_image() has to return the result and then we can cache that to some list for later processing. Since I...
kmario23

1 vote · 0 answers · 46 views

Passing arguments vs. using pipes to communicate with child processes

In the following code, I time how long it takes to pass a large array (8 MB) to a child process using the args keyword when forking the process versus passing it using a pipe. Does anyone have any insight into why it is so much faster to pass data using an argument than using a pipe? Below, each co...
Donna

1 vote · 0 answers · 70 views

Does numpy linalg.eig() block multiprocessing?

I am trying to run a function in parallel with the multiprocessing module. However there seems to be a deadlock. It seems to be due to numpy's np.linalg.eig: if I comment out this function, my code runs as expected. Does this occur due to some low-level BLAS/LAPACK implementation? And is there a w...
ritchie46
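
A frequently suggested mitigation (not a guaranteed fix) is to pin the BLAS/OpenMP thread pools to a single thread before numpy is imported, so the multithreaded eig implementation does not interact badly with fork(). A minimal sketch:

    import os
    # Must be set before numpy is imported anywhere in the program.
    os.environ['OMP_NUM_THREADS'] = '1'
    os.environ['OPENBLAS_NUM_THREADS'] = '1'
    os.environ['MKL_NUM_THREADS'] = '1'

    import numpy as np
    from multiprocessing import Pool

    def eig_worker(seed):
        rng = np.random.RandomState(seed)
        return np.linalg.eig(rng.rand(50, 50))[0].real.sum()

    if __name__ == '__main__':
        with Pool(processes=4) as pool:
            print(pool.map(eig_worker, range(8)))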

1 vote · 0 answers · 103 views

Embedding python in C++ for multiprocessing

I have a Python program, however it seems I cannot really scale it because of the lack of multiprocessing. We have added threading, but since it still runs on one core we cannot scale enough. I saw here that it is possible to embed Python in C++ programs. So I thought to do multiprocessing in C++ an...
user3605780

1 vote · 0 answers · 33 views

python - How to interrupt a multiprocessing program and store specific data from every core to load it next time it will run?

I have the following python3 code: from multiprocessing import Pool, cpu_count import hashlib import signal def checkHash(argsList): key = argsList[0] pattern = argsList[1] cpusNum = argsList[2] coreId = argsList[3] while True: coreId += cpusNum newHash = hashlib.sha256() newFeed = key + '_' + str(c...
saavedra29

1 vote · 0 answers · 338 views

Python: What is the biggest difference between the `Celery` lib and the `Multiprocessing` lib with respect to parallel programming?

I think that all tasks that can be done using Celery can also be done via the multiprocessing library. Despite this, I wonder why one would use Celery instead of multiprocessing in a Python program or in a web framework such as Django, Flask, etc.
user3595632

1 vote · 0 answers · 28 views

Python: operating on multiple concurrent.futures

Hi, I'm coming from the Java development world, where multiple FutureTasks can have dependency relationships with nice interfaces. I have started to use ThreadPoolExecutor in Python, and I'm looking for something like this: def getSomething(): return 'Hello' def combine(r1, r2): return r1 + r2 executor = ThreadPool...
i3wangyi
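
concurrent.futures has no built-in combinator like Java's, but the same effect falls out of submitting a task that blocks on the earlier futures' result() calls. A minimal sketch using the functions from the excerpt (note the pool needs more workers than pending dependent tasks to avoid starving itself):

    from concurrent.futures import ThreadPoolExecutor

    def getSomething():
        return 'Hello'

    def combine(r1, r2):
        return r1 + r2

    if __name__ == '__main__':
        with ThreadPoolExecutor(max_workers=3) as executor:
            f1 = executor.submit(getSomething)
            f2 = executor.submit(getSomething)
            # The combining task simply waits on its inputs.
            f3 = executor.submit(lambda: combine(f1.result(), f2.result()))
            print(f3.result())   # -> 'HelloHello'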

1 vote · 0 answers · 37 views

How can I transform my code to use multiprocessing (Pool)?

for key in copy_reference_dataset: model_line_number = 0 model_dictionary_elements_number = len(copy_reference_dataset[key]) while model_line_number < model_dictionary_elements_number: compt = compt + 1 if key in plus_min_dict: plus_min_dict[key].append(copy_reference_dataset[key][model_line_number]...
Talar

1 vote · 0 answers · 72 views

Python Multiprocessing writing to shared file using lock

I have built a very simple program to understand how parallelization may help (save time). This program simply writes a list of integers to a file, and I split the work by ranges of integers. After reading a relevant question, I added a Lock, but there was no improvement. The program runs, but instead of having in m...
sph
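
A minimal sketch of the usual Lock pattern for several processes appending to one file (not the asker's code). Note that if the lock has to cover the entire write, the writes are serialized and there is no speed-up; parallelism only pays off when the expensive work happens outside the locked section:

    from multiprocessing import Lock, Process

    def write_range(lock, start, stop, path):
        # Hold the lock around the whole write so ranges are not interleaved.
        with lock:
            with open(path, 'a') as f:
                for i in range(start, stop):
                    f.write('%d\n' % i)

    if __name__ == '__main__':
        lock = Lock()
        path = 'numbers.txt'   # hypothetical output file
        ranges = [(0, 1000), (1000, 2000), (2000, 3000)]
        procs = [Process(target=write_range, args=(lock, a, b, path)) for a, b in ranges]
        for p in procs:
            p.start()
        for p in procs:
            p.join()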

1 vote · 1 answer · 154 views

Python multiprocess dict of list

I need to do some work in multiple processes with Python 3.6. Namely, I have to update a dict by adding lists of objects. Since these objects are unpicklable, I need to use dill instead of pickle and multiprocess from pathos instead of multiprocessing, but this should not be the problem. Adding a list to the...
fortea

1 vote · 0 answers · 328 views

How to make CherryPy server spawn more processes (use multiple cpu cores)?

I can't find any documentation on spawning multiple processes for the CherryPy web server. Currently I only use multiple threads: ... server = cherrypy._cpserver.Server() ... server.thread_pool = 30 But as the threading performance gain in Python is limited by the GIL, I want to spawn more processes so the serve...
Bob

1 vote · 0 answers · 305 views

How to pass a shared object to a multiprocessing.pool (without pickling or using globals)?

The multiprocessing docs say it is better to inherit (from the ancestor process) than to pickle/unpickle a shared resource (and hence to avoid using pipes/queues). However, it also says it is better to explicitly pass resources, rather than to share as global resources. So, how do you pass a shared...
benjimin
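
The usual answer is to hand the resource to each worker once, through the Pool initializer, so it lives in a per-worker global instead of being pickled with every task (with the fork start method the workers effectively inherit it). A minimal sketch with a placeholder resource:

    from multiprocessing import Pool

    _shared = None   # per-worker slot, filled by the initializer

    def _init(shared):
        global _shared
        _shared = shared   # runs once in each worker, not once per task

    def work(i):
        return _shared['offset'] + i

    if __name__ == '__main__':
        big_resource = {'offset': 1000}   # stand-in for the real shared object
        with Pool(processes=4, initializer=_init, initargs=(big_resource,)) as pool:
            print(pool.map(work, range(8)))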

1 vote · 1 answer · 44 views

Print 3 Java programs' output on separate consecutive lines, sharing the same console, using \r in a multiprocessing scenario

I have 3 Java programs with individual main functions which are called by a parent program on 3 separate threads. All three programs print the value of a counter to a single console using the code below. For program #1: System.out.print(ANSI_PURPLE + ' \r EEG Sensor count =' + Sensor1_tuple_count); Syst...
Amarjit Singh

1 vote · 0 answers · 377 views

Python alternative to multiprocessing.Queue() to share data across processes

I am writing a program that needs to communicate and share audio stream data between multiple processes. This can be done by using multiprocessing.Queue(). However, the machine that I'm running this program on returned this error whenever I tried to use multiprocessing.Queue(): ImportError: This pla...
nyamuk91
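
When Queue is unusable because the platform lacks sem_open, multiprocessing.Pipe is the usual fallback, since a Connection pair does not need a named semaphore. A minimal producer/consumer sketch with placeholder chunks standing in for the audio data:

    from multiprocessing import Pipe, Process

    def producer(conn):
        for i in range(5):
            conn.send(b'audio-chunk-%d' % i)   # stand-in for real stream data
        conn.send(None)                         # sentinel: no more data
        conn.close()

    if __name__ == '__main__':
        parent_conn, child_conn = Pipe()
        p = Process(target=producer, args=(child_conn,))
        p.start()
        while True:
            chunk = parent_conn.recv()
            if chunk is None:
                break
            print(chunk)
        p.join()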

1 vote · 0 answers · 53 views

python-multiprocessing Pool gets stuck

I have the following code (pseudo-code since original is too long) from multiprocessing import Pool def _my_fun(params): out = # do someting return out class myClass(): def __init__(self,init_params): sef.init_params = init_params def process(self,some_list): data = [] try: poll = Pool(processess=4)...
Lorenzo Quirós

1 vote · 0 answers · 97 views

cx_freeze tkinter application not working with multiprocessing in Windows

I created a basic Tkinter GUI that has one button that when you click on it, it is supposed to use multiprocessing in order to open up 5 Chrome instances of 'https://www.google.com.' When I compile the script using cx_freeze, I click on the new exe and the button does nothing. Here is the code: main...
Julian Rachman
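
The piece that is usually missing in a frozen Windows executable is multiprocessing.freeze_support(), called first under __main__; without it each child process re-runs the GUI instead of the target function. A minimal sketch (the Tkinter layout is a placeholder, and webbrowser stands in for launching Chrome):

    import multiprocessing
    import tkinter as tk
    import webbrowser

    def open_page():
        webbrowser.open('https://www.google.com')

    def launch():
        for _ in range(5):
            multiprocessing.Process(target=open_page).start()

    if __name__ == '__main__':
        multiprocessing.freeze_support()   # required when frozen with cx_freeze on Windows
        root = tk.Tk()
        tk.Button(root, text='Open 5 pages', command=launch).pack()
        root.mainloop()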

1 vote · 1 answer · 926 views

Using Apache Spark to parallelize processing of a Pandas dataframe

I have a general question regarding the appropriateness of using Spark for a type of problem I frequently encounter in Python: performing the same task on the same set of data with different parameter settings, using the multiprocessing package. Consider the following toy example (note this is just...
xbot
