asome2
Student Ambassador

JupyterHub crashes when running multiprocessing (6 cores) script

Hi, I'm running a very simple script; however, the entire system crashes after 2-3 minutes of running.

import os
import json
import pandas
import sys
import multiprocessing as mp

def jsons2df(path):
    """Load every JSON file under `path` into a DataFrame with a `domain` column."""
    json_list = []
    file_list = os.scandir(path)
    counter = 0
    error_counter = 0
    for j_file in file_list:
        counter += 1
        if counter % 10000 == 0:
            print(counter)
        # with open(os.path.join(path, j_file), "r") as j:
        with open(j_file.path, "r") as j:
            try:
                data = json.load(j)
            except Exception:  # malformed JSON or undecodable bytes
                print("ERROR: ", j_file)
                error_counter += 1
                if error_counter > 1000:
                    break
                else:
                    continue
        json_list.append(data)
    try:
        json_df = pandas.DataFrame(json_list)
        b = json_df["url"].apply(lambda x: x.split("/")[2])
        json_df['domain'] = b
        return json_df
    except Exception:  # fall back to the raw list if the "url" column is missing
        return json_list

def get_and_extract(number):
    print("starting: ", number)
    if not os.path.isfile("%s.tar" % (number)):
        if not os.path.isdir("%s" % (number)):
            if not os.path.isfile("./pickles/wiki_%s.pickle" % (number)):
                print("downloading")
                url = "http://data.dws.informatik.uni-mannheim.de/webtables/2015-07/englishCorpus/compressed/%s.tar.gz" % (number)
                os.system("wget %s" % (url))
                print("extracting gz # 1")
                os.system("tar -xzf %s.tar.gz >/dev/null" % (number))
                os.system("rm -rf %s.tar.gz" % (number))
    if os.path.isfile("%s.tar" % (number)):
        print("extracting tar # 2")
        os.system("tar --skip-old-files -xf %s.tar >/dev/null" % (number))
        os.system("rm -rf %s.tar" % (number))
    if not os.path.isfile("%s.tar" % (number)):
        if number.startswith("0"):
            number = number[1:]
        path = "./%s" % (number)
        if not os.path.isfile("./pickles/wiki_%s.pickle" % (number)):
            df = jsons2df(path)
            print("pickling:")
            df.to_pickle("./pickles/df_%s.pickle" % (number))
            b = df[df.domain.str.contains("en.wikipedia.org")]
            b = b[~b.url.str.contains('Special:Book')]
            b.to_pickle("./pickles/wiki_%s.pickle" % (number))
        if os.path.isdir("%s" % (number)):
            print("Deleting:")
            os.system("rsync -r --delete emptydir/ %s/" % (number))
            os.system("rmdir %s" % (number))
    print("Done: ", number)

# get_and_extract(number)

def main(number1, number2):
    # pool = mp.Pool(CORES)
    pool = mp.Pool(6)
    print(number1, number2)
    print("Looping:")
    for i in range(int(number1), int(number2) + 1):
        print(i)
        x = str(i)
        a = pool.apply_async(get_and_extract, (x,))
        # print(a)
    pool.close()
    pool.join()  # wait for the workers instead of returning immediately

"""
if __name__ == '__main__':
    number = sys.argv[1]
    number2 = sys.argv[2]
    main(number, number2)
"""

7 Replies
idata
Community Manager

Hi Amit,

Thanks for reaching out to us.

We tried out the script from our end and the results are attached.

I would suggest you try the following:

1. Are you using a virtual environment to run the file? If not, please give it a try.

2. Is the notebook crashing because of large numbers being passed to your main function, i.e. main(number1, number2)? Kindly verify that.

Please confirm the above options and, if the issue still persists, kindly send a screenshot of the error message (if any).

Thanks & Regards,

Sandhiya
asome2
Student Ambassador

Thanks,

1. Yes, I am using a virtual environment and it still crashes.

2. Small numbers are passed and the crash is immediate, i.e. main(32, 35).

Sadly, the issue is not resolved.

asome2
Student Ambassador

No error, just that my Jupyter instance restarts.

idata
Community Manager

Hi Amit,

Unavailability of free space can cause a Jupyter notebook to crash without showing an error. Kindly check your disk usage and let us know if your issue is resolved.

Thanks & Regards,

Sandhiya
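The free-space suggestion above can be checked from inside the notebook with just the standard library; a small sketch (the "." path is an assumption for the directory the downloads land in):

```python
import shutil

# total/used/free bytes for the filesystem holding the given path
usage = shutil.disk_usage(".")
print("free: %.1f GiB of %.1f GiB" % (usage.free / 2**30, usage.total / 2**30))
```

Each 2015 Web Tables shard is downloaded, gunzipped, and untarred side by side before cleanup, so peak disk use per worker is several times the compressed archive size.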
asome2
Student Ambassador

I checked and the issue is not resolved.

idata
Community Manager

Hi Amit,

Hope the cleaned account is working fine. Do get back in case you are still facing the issue.

Thanks & Regards,

Sandhiya
idata
Community Manager

Hi Amit,

We did not hear back from you, and hence we are closing this case. Do open a new thread in case you face any issues.

Thanks & Regards,

Sandhiya