Why a 352GB NumPy ndarray can be used on a 8GB memory macOS computer?


import numpy as np

array = np.zeros((210000, 210000))  # dtype defaults to numpy.float64
array.nbytes


When I run the code above on my MacBook (8 GB of memory, running macOS), no error occurs. But when I run the same code on a 16 GB Windows 10 PC, a 12 GB Ubuntu laptop, or even a Linux supercomputer node with 128 GB of memory, the Python interpreter raises a MemoryError. All test environments have 64-bit Python 3.6 or 3.7 installed.
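For reference, the requested allocation can be sized up front from the shape and dtype alone, without touching NumPy at all (a back-of-the-envelope sketch, not part of the original question):

```python
# Compute the memory a (210000, 210000) float64 array would need,
# without allocating it.
n = 210000
itemsize = 8                   # numpy.float64 is 8 bytes
size_bytes = n * n * itemsize
print(size_bytes / 10 ** 9)    # ~352.8 decimal GB, matching array.nbytes
```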









  • MacOS extends memory with virtual memory on your disk. Check your process details with Activity Monitor and you'll find a Virtual Memory: 332.71 GB entry. But it's all zeros, so it compresses really, really well.

    – Martijn Pieters
    5 hours ago











  • @MartijnPieters But Windows 10 and Linux have similar mechanisms: Windows 10 has virtual memory and Linux has swap. My Activity Monitor doesn't show a 332.71 GB VM entry. I used sysctl vm.swapusage to see the real VM usage and got 1200 M.

    – Blaise Wang
    5 hours ago











  • But they don't compress.

    – Martijn Pieters
    5 hours ago











  • @MartijnPieters The problem is that Windows 10 added support for RAM compression since build 10525, but it still cannot run the above code.

    – Blaise Wang
    2 hours ago
















python python-3.x macos numpy memory





edited 3 hours ago

asked 5 hours ago

Blaise Wang












1 Answer


















You are most likely using Mac OS X Mavericks or newer, so 10.9 or up. From that version onwards, MacOS uses virtual memory compression, where memory requirements that exceed your physical memory are not only redirected to memory pages on disk, but those pages are compressed to save space.



For your ndarray, you may have requested ~332GB of memory, but it's all a contiguous sequence of NUL bytes at the moment, and that compresses really, really well:



[Screenshot: memory stats from Activity Monitor, showing a Virtual Memory size of 332.71 GB against a Real Memory Size of just 9.3 MB]



That's a screenshot from the Activity Monitor tool, with the process details of my Python process where I replicated your test (use the (I) icon on the toolbar to open it); this is from the Memory tab, where you can see that the Real Memory Size column is only 9.3 MB used, against a Virtual Memory Size of 332.71GB.
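The "all zeros compresses well" point is easy to demonstrate in isolation. As a rough sketch, using zlib as a stand-in for the kernel's compressor (the real one is a WKdm/LZ4-style algorithm; zlib here is purely for illustration):

```python
import zlib

# One 4 KiB page of NUL bytes, like the untouched np.zeros backing store.
page = bytes(4096)
compressed = zlib.compress(page)
ratio = len(page) / len(compressed)
print(len(compressed), ratio)  # a handful of bytes; ratio in the hundreds
```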



Once you start setting other values for those indices, you'll quickly see the memory stats increase to gigabytes instead of megabytes:



while True:
    index = tuple(np.random.randint(array.shape[0], size=2))
    array[index] = np.random.uniform(-10 ** -307, 10 ** 307)


or you can push the limit further by assigning to every index (in batches, so you can watch the memory grow):



array = array.reshape((-1,))
for i in range(0, array.shape[0], 10**5):
    array[i:i + 10**5] = np.random.uniform(-10 ** -307, 10 ** 307, 10**5)


The process is eventually terminated; my MacBook Pro doesn't have enough swap space to store hard-to-compress gigabytes of random data:



>>> array = array.reshape((-1,))
>>> for i in range(0, array.shape[0], 10**5):
...     array[i:i + 10**5] = np.random.uniform(-10 ** -307, 10 ** 307, 10**5)
...
Killed: 9


You could argue that MacOS is being too trusting, letting programs request that much memory without bounds, but with memory compression, memory limits are much more fluid. Your np.zeros() array does fit your system, after all. Even though you probably don't actually have the swap space to store the uncompressed data, compressed it all fits fine so MacOS allows it and terminates processes that then take advantage of the generosity.



If you don't want this to happen, use resource.setrlimit() to set a limit on RLIMIT_STACK to, say, 2 ** 14, at which point the OS will segfault Python when it exceeds that limit.
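As a variation on the setrlimit() suggestion above, here is a sketch of the general pattern, assuming a POSIX system (the resource module does not exist on Windows) and using RLIMIT_DATA, which is the limit that typically makes oversized heap allocations fail on Linux; which limits macOS actually enforces varies by version, so treat this as illustrative:

```python
import resource

# Cap heap growth so a huge allocation fails fast instead of relying
# on the OS to over-commit. Soft limit is lowered; hard limit is kept.
soft, hard = resource.getrlimit(resource.RLIMIT_DATA)
cap = 2 ** 30                                # 1 GiB
if hard != resource.RLIM_INFINITY:
    cap = min(cap, hard)

resource.setrlimit(resource.RLIMIT_DATA, (cap, hard))
new_soft, _ = resource.getrlimit(resource.RLIMIT_DATA)

resource.setrlimit(resource.RLIMIT_DATA, (soft, hard))  # restore old limit
print(new_soft)
```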
































  • Memory compression should only matter after allocation has already succeeded. The problem here is probably either memory limits (ulimits on Linux, for example) or, more likely, that the allocator doesn't find a 300 GB sized chunk. If you split it up into 100 pieces of 3 GB it would probably work on Windows or Linux (with big enough swap) as well.

    – inf
    4 hours ago
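The chunking idea in that comment can be sketched, scaled down and with plain bytearrays standing in for NumPy arrays, purely as an illustration:

```python
# Instead of one huge contiguous block, allocate many smaller chunks.
# Scaled down: 100 chunks of 3 MB rather than 100 chunks of 3 GB.
chunk_size = 3 * 1024 * 1024
chunks = [bytearray(chunk_size) for _ in range(100)]
total = sum(len(c) for c in chunks)
print(total)  # 314572800 bytes, i.e. 300 MiB in total
```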











  • @inf: I don't have 300GB free on my SSD. I do run out of memory when I start filling the array, randomly.

    – Martijn Pieters
    4 hours ago











  • Define "run out of memory", do you get a MemoryError or just start filling RAM, swapping and get OOMed?

    – inf
    4 hours ago











  • @inf: I'm a little reluctant to actually let it run. As the memory has been allocated by the OS (tracemalloc confirms Python has been given the memory allocation), there won't be a MemoryError, so it'll start swapping and eventually get OOMed. But before that point this laptop will be hard to use for a while, as everything else is swapped out first.

    – Martijn Pieters
    4 hours ago











  • I understand :) But that's what I mean. The allocation doesn't even succeed on Ubuntu and Linux, and hence the MemoryError.

    – inf
    3 hours ago











edited 1 hour ago

answered 5 hours ago

Martijn Pieters












