Intel® DevCloud
Help for those needing help starting or connecting to the Intel® DevCloud

Getting disconnected through SSH or Jupyter

Mohammed_Elkomy
Beginner

Hello everyone!
I'm having an issue: every time I run a long-running Linux command, such as

  1. dd if=/dev/urandom of=bigfile.txt bs=1048576 count=500
  2. compressing or decompressing large files using tar
  3. the split command (I want to split a large compressed file and upload it as chunks to Drive; the workflow is sketched below, and my script works, but I get disconnected)

I get disconnected from SSH or Jupyter, and the session hangs for so long that I end up forcibly closing the job with qdel (504 server error).
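
For context, the split workflow looks roughly like this (the file names and chunk size are only illustrative):

  $ tar -czf backup.tar.gz my_project/           # compress the folder into one archive
  $ split -b 500M backup.tar.gz backup_part_     # split the archive into 500 MB chunks
  # each backup_part_* chunk is then uploaded to Drive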

Thanks,

[attached screenshot: Mohammed_Elkomy_0-1617712406337.png]

JananiC_Intel
Moderator

Hi,


Thanks for posting in Intel forums.


Could you give us a reproducer, i.e., the steps and commands you followed?


Regards,

Janani Chandran


Mohammed_Elkomy
Beginner

Thanks for replying,

A very simple way to reproduce this:

Run this command to generate 5 GB of random data, either over SSH or in a Jupyter cell (or any long-running Linux command, such as file compression or splitting):

dd if=/dev/urandom of=bigfile.txt bs=1048576 count=5000

Jupyter disconnects within a few seconds when I run this command, and the SSH session hangs as well.

Refreshing the Jupyter tab doesn't help either; it hangs for so long that I have to qdel the Jupyter job from SSH.

It's worth mentioning that these commands were working last month; maybe something changed?

Regards,

JananiC_Intel
Moderator

Hi,


We will look into this internally and keep you updated.


Regards,

Janani Chandran


JananiC_Intel
Moderator

Hi,


Sorry for the delay.


We have yet to receive an update from the admin team; we will let you know as soon as we do.


Regards,

Janani Chandran


JananiC_Intel
Moderator

Hi,


Sorry for the delay.


To answer your questions:

2) Compressing or decompressing large files using tar: this may be due to limitations of the DevCloud, such as the disk quota. It can also happen if you run the command on the login node; the login node is not meant for any kind of work (see the sketch below for moving heavy work onto a compute node), and we can't guarantee that a particular scenario won't fail.

3) This is possibly a combination of disk space and/or ulimits on download/upload speeds and times. To prevent a degradation of performance, we limit the upload/download bandwidth and duration.

We suggest you use rsync like this:

$ rsync -aP source_folder/* destination_folder/

rsync will let you resume a transfer from where it failed.
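
For example, a minimal sketch assuming the standard PBS commands available on the DevCloud (the walltime value and file names are only illustrative):

$ qsub -I -l walltime=01:00:00                 # request an interactive session on a compute node
$ tar -czf archive.tar.gz large_folder/        # run heavy compression on the compute node, not the login node
$ rsync -aP source_folder/* destination_folder/
# if the transfer is interrupted, re-running the same rsync command
# resumes from where it stopped (-P implies --partial --progress)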


Regarding your first question, we will check and let you know.


Regards,

Janani Chandran




JananiC_Intel
Moderator

Hi,


Sorry for the delay.


Regarding your first question, about running dd if=/dev/urandom of=bigfile.txt bs=1048576 count=500 over SSH or in Jupyter: this command cannot be run from Jupyter, so as a workaround, please use SSH.
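
For instance, a minimal sketch of the SSH route (the script name and walltime are only illustrative); submitting the command as a batch job with qsub keeps it running even if the SSH connection drops:

$ echo 'dd if=/dev/urandom of=bigfile.txt bs=1048576 count=500' > genfile.sh
$ qsub -l walltime=00:30:00 genfile.sh         # runs on a compute node and survives SSH disconnects
$ qstat                                         # check the job's status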


Try this and let us know.


Regards,

Janani Chandran


JananiC_Intel
Moderator

Hi,


Is your issue resolved? Do you have any update?


Regards,

Janani Chandran


Mohammed_Elkomy
Beginner

Thanks a lot for the great effort.

JananiC_Intel
Moderator

Hi,


Thanks for accepting our solution. If you need any additional information, please submit a new question, as this thread will no longer be monitored.


Regards,

Janani Chandran

