Intel® DevCloud
Help for those needing help starting or connecting to the Intel® DevCloud

Getting disconnected through ssh or jupyter

Mohammed_Elkomy
Beginner

Hello everyone!
I'm having an issue: every time I run a long-running Linux command, such as

  1. dd if=/dev/urandom of=bigfile.txt bs=1048576 count=500
  2. compressing or decompressing large files using tar
  3. the split command (I want to split a large compressed file and upload it to Drive as chunks; my script works, but I get disconnected)

I get disconnected from SSH or Jupyter, and the session hangs for so long that I have to forcibly close the job using qdel (504 server error).
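For concreteness, the compress-and-split workflow in items 2 and 3 might look something like the following sketch (the original script is not shown in the thread; the input data, chunk size, and file names here are made up):

```shell
# Hypothetical sketch of the compress-and-split workflow; the actual
# script, chunk size, and file names are not shown in the thread.
mkdir -p data && head -c 1048576 /dev/urandom > data/sample.bin  # stand-in input
tar -czf bigfile.tar.gz data/          # compress the directory into one archive
split -b 512M bigfile.tar.gz chunk_    # split into 512 MB pieces: chunk_aa, chunk_ab, ...
ls chunk_*                             # each piece is then uploaded to Drive separately
```

The pieces can later be reassembled with `cat chunk_* > bigfile.tar.gz`.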

Thanks,

[Screenshot attached: Mohammed_Elkomy_0-1617712406337.png]


9 Replies
JananiC_Intel
Moderator

Hi,


Thanks for posting in Intel forums.


Could you give us a reproducer, i.e. the steps and commands you followed?


Regards,

Janani Chandran


Mohammed_Elkomy
Beginner

Thanks for replying.

A very simple example that triggers this:

Run this command in an SSH session or a Jupyter cell to generate 5 GB of random data (or any long-running Linux command, such as file compression or splitting):

dd if=/dev/urandom of=bigfile.txt bs=1048576 count=5000

Jupyter disconnects within a few seconds of running this command, and SSH hangs as well.

Refreshing the Jupyter tab doesn't help either; it hangs for so long that I have to qdel the Jupyter job from SSH.

It's worth mentioning that these commands were working last month; maybe something changed?

Regards,

JananiC_Intel
Moderator

Hi,


We will look into this internally and let you know the updates.


Regards,

Janani Chandran


JananiC_Intel
Moderator

Hi,


Sorry for the delay.


We have yet to receive an update from the admin team. We will let you know once we get an update.


Regards,

Janani Chandran


JananiC_Intel
Moderator

Hi,


Sorry for the delay.


To answer your questions:

2) Compressing or decompressing large files using tar: this may be due to some limitations of the DevCloud, such as the disk quota. It can also happen if you run the command on the login node. The login node is not meant for any kind of work, and we can't guarantee that a particular scenario won't fail there.

3) This is possibly a combination of disk space and/or ulimits on download/upload speeds and durations. To prevent a degradation of performance, we limit upload/download bandwidth and duration.

We suggest using rsync, like this:

$ rsync -aP source_folder/* destination_folder/

rsync will let you resume a transfer from where it failed.


Regarding your 1st question we will check and let you know.


Regards,

Janani Chandran




JananiC_Intel
Moderator

Hi,


Sorry for the delay.


Regarding your first question (running dd if=/dev/urandom of=bigfile.txt bs=1048576 count=500 over SSH or Jupyter): you will not be able to use this command in Jupyter, so as a workaround you can use SSH.


Try this and let us know.


Regards,

Janani Chandran


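One additional workaround, not suggested by the moderators in this thread: detach the long-running command from the terminal with nohup, so that a dropped SSH or Jupyter connection does not kill it:

```shell
# Start the long-running command detached from the terminal, so a dropped
# SSH connection does not terminate it; any output goes to dd.log.
nohup dd if=/dev/urandom of=bigfile.txt bs=1048576 count=500 > dd.log 2>&1 &
echo "dd started in the background with PID $!"
```

After reconnecting, `ls -l bigfile.txt` shows how far the write has progressed.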

JananiC_Intel
Moderator

Hi,


Is your issue resolved? Do you have any update?


Regards,

Janani Chandran


Mohammed_Elkomy
Beginner

Thanks a lot for the great effort.

JananiC_Intel
Moderator

Hi,


Thanks for accepting our solution. If you need any additional information, please submit a new question as this thread will no longer be monitored.


Regards,

Janani Chandran

