
Python: memory limit on np.fromfile

This article looks at a memory limit hit when reading a large file with np.fromfile(); it may be a useful reference for anyone running into the same problem.

https://stackoverflow.com/questions/54545228/is-there-a-memory-limit-on-np-fromfile-method

The problem was caused by a 32-bit Python installation.

Solution: install 64-bit Python.

Is there a memory limit on np.fromfile() method?


I am trying to read a big file into an array with np.fromfile(), but after a certain number of bytes it raises a MemoryError.

with open(filename, 'rb') as file:  # binary mode; text mode 'r' can mangle binary reads on Windows
    data = np.fromfile(file, dtype=np.uint16, count=2048*2048*63)
    data = data.reshape(63, 2048, 2048)

It works fine with 2048*2048*63 but fails with 2048*2048*64. How can I debug this? What is the bottleneck here?

Edit: I am running on Windows 10, RAM 256GB, it is a standalone script, 64bit Python.

Edit2: I followed the advice in the comments; now I get the error at 128*2048*2048, while 127*2048*2048 works fine.

Tags: python, numpy, memory

asked Feb 6 '19 at 0:50 by CanCode, edited Feb 6 '19 at 1:07
  •   How much RAM do you have and what is the bittage of your system? – Mad Physicist Feb 6 '19 at 0:52
  • 1 How much RAM do you have? How large is your swap/pagefile? Is this standalone code, or part of a larger program with other large allocations? Are you running a 32 or 64 bit version of Python? We need a lot more details to provide useful answers. – ShadowRanger Feb 6 '19 at 0:54
  •   256GB RAM, but python cannot handle that? – CanCode Feb 6 '19 at 0:54
  • 2 2048 * 2048 * 2 * 64 = 0.5 GiB. Shouldn't be an issue. Unless you do it a few thousand times – Mad Physicist Feb 6 '19 at 0:55 
  • 1 For the record, on 64 bit CPython 3.7.1 running on Ubuntu bash on Windows, I can't reproduce (and I have far less RAM than you, only 12 GB). It loads and reshapes just fine (if I change it to data.reshape from np.reshape). While it's unlikely to matter, it would be useful to know what OS and specific Python version you're running. – ShadowRanger Feb 6 '19 at 0:59 
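Mad Physicist's arithmetic in the comments checks out; as a quick sketch, the requested array is nowhere near the machine's 256 GB of RAM:

```python
# Size of the requested array: 64 frames of 2048x2048 uint16 pixels.
# A uint16 element occupies 2 bytes.
n_bytes = 2048 * 2048 * 64 * 2
print(n_bytes)          # 536870912 bytes
print(n_bytes / 2**30)  # 0.5 GiB
```

Half a GiB should be trivial on this machine, which is what points the suspicion at the interpreter's address space rather than physical memory.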

1 Answer


Despite what you believe, you've installed a 32-bit version of Python on your 64-bit operating system, which means the virtual address space is limited to only 2 GB in user mode, and attempts to allocate contiguous blocks of a GB or more can easily fail due to address-space fragmentation.

The giveaway is your sys.maxsize, which is just the largest value representable by a C ssize_t in your build of Python. 2147483647 corresponds to 2**31 - 1, which is the expected value on 32 bit Python. A 64 bit build would report 9223372036854775807 (2**63 - 1).
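The check described above can be done directly in the interpreter; the two constants are the expected sys.maxsize values for 32-bit and 64-bit builds:

```python
import sys

# On a 32-bit CPython build, sys.maxsize is 2**31 - 1 = 2147483647;
# on a 64-bit build it is 2**63 - 1 = 9223372036854775807.
if sys.maxsize == 2**31 - 1:
    print("32-bit Python build")
else:
    print("64-bit Python build")
```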

Uninstall the 32-bit version of Python and download/install a 64-bit version (the link is to the 3.7.2 download page; look for the installer labelled x86-64, not x86, and a file name that includes amd64). Annoyingly, the main Python download page defaults to offering the 32-bit version for Windows, so you have to scroll down to the links for specific version download pages, click on the latest, then scroll down to the complete list by OS and bitness and choose appropriately.
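If reinstalling is not immediately possible, one common workaround (not from the original answer, just a sketch) is np.memmap, which pages data in from disk lazily instead of allocating the whole array up front; the file name and small shape here are hypothetical, chosen only to keep the example self-contained:

```python
import numpy as np

# Write a tiny binary file so the example runs on its own;
# in practice this would be the existing large data file.
np.arange(4 * 4, dtype=np.uint16).tofile("frames.bin")

# np.memmap maps the file into memory and reads pages on demand,
# so no single large contiguous in-memory buffer is needed up front.
mm = np.memmap("frames.bin", dtype=np.uint16, mode="r", shape=(4, 4))
print(mm.shape)  # (4, 4)
```

Note that on a 32-bit build the mapping itself still consumes virtual address space, so this only postpones the problem; the real fix remains switching to 64-bit Python.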

That concludes this article on the np.fromfile memory limit in Python; we hope it helps, and thank you for your continued support.