cherian__Aaron
Beginner
143 Views

Out of memory when installing packages on Devcloud

Hello,

I was using Intel DevCloud to test an application on Intel hardware, but I'm running into problems while installing it on DevCloud. The failure happens during some Spack installations: several package builds exit with the error

cc1plus: out of memory allocating 28464168 bytes after a total of 325640192 bytes

When I ran "ulimit -a" to check the memory limits, I got this result:

u38270@login-2:~$ ulimit -a
core file size          (blocks, -c) 0
data seg size           (kbytes, -d) 1000000
scheduling priority             (-e) 0
file size               (blocks, -f) unlimited
pending signals                 (-i) 62461
max locked memory       (kbytes, -l) 16384
max memory size         (kbytes, -m) unlimited
open files                      (-n) 1024
pipe size            (512 bytes, -p) 8
POSIX message queues     (bytes, -q) 819200
real-time priority              (-r) 0
stack size              (kbytes, -s) 8192
cpu time               (seconds, -t) 120
max user processes              (-u) 200
virtual memory          (kbytes, -v) 1000000
file locks                      (-x) unlimited

I suspect the memory issue arises because the VM provides only about 1 GB of virtual memory ("ulimit -v" reports 1000000 kB). Could someone please explain why this is happening and suggest a way to work around it?
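For reference, "ulimit -a" reports sizes in kilobytes, so a quick conversion (a minimal sketch; the 1000000 value is taken from the output above) confirms the cap is just under 1 GiB:

```shell
# Convert the `ulimit -v` value above from kB to MiB.
# 1000000 kB works out to ~976 MiB, i.e. just under 1 GiB,
# which a single cc1plus invocation can easily exhaust.
vmem_kb=1000000
vmem_mib=$((vmem_kb / 1024))
echo "virtual memory cap: ${vmem_mib} MiB"   # prints: virtual memory cap: 976 MiB
```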

Please find attached the Spack logs for one of the failing packages.

Adweidh_Intel
Moderator

Hi,

In most cases, this memory error is caused by running compute-intensive tasks on the login node, which has tight resource limits. In such cases, log in to a compute node using qsub -I and execute your commands there.
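An interactive session might look like the sketch below (the walltime value is an illustrative assumption, not a DevCloud default, and the guard merely avoids an error on machines without qsub):

```shell
# Request an interactive shell on a compute node, then rebuild there.
# The walltime value below is an assumption for illustration.
if command -v qsub >/dev/null 2>&1; then
    qsub -I -l walltime=02:00:00
    # Once the interactive job starts you are on a compute node;
    # re-run the failing command there, e.g.:
    #   spack install <package>
else
    echo "qsub not found: run this on the DevCloud login node"
fi
```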

Please let us know if you have any further issues or queries.

cherian__Aaron
Beginner

Thank you, that was the issue. I was running the installations from the login node; when I ran the same commands on a compute node, the installation completed smoothly.

Adweidh_Intel
Moderator

Hi,

Thanks for the confirmation; we are closing this thread. If you run into any further issues, please feel free to open a new thread.
