  • Operating system refuses to provide memory error

    Hi Everyone,

    A coauthor and I are trying to append several very large datasets together. When I run the code to append them, I get the following error:

        op. sys. refuses to provide memory
        Stata's data-storage memory manager has already allocated
        319238963224 bytes and it just attempted to allocate another 32m
        bytes. The operating system said no. Perhaps you are running
        another memory-consuming task and the command will work later
        when the task completes. Perhaps you are on a multiuser system
        that is especially busy and the command will work later when
        activity quiets down. Perhaps a system administrator has put a
        limit on what you can allocate; see help memory. Or perhaps
        that's all the memory your computer can allocate to Stata.

    I have re-run the code three times, at different times of the day and with nothing else running on the computer, and gotten the same error each time.

    Does anyone have suggestions on what to do from here?

    Thank you for the help!

    Kait




  • #2
    You need one or more of:

    • smaller datasets (perhaps -compress- will help; a sketch follows below)

    • more memory, on the same or a different computer.
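
    A minimal sketch of the -compress- route, assuming hypothetical file names big1.dta, big2.dta, and big3.dta:

    Code:
    foreach f in big1 big2 big3 {
        use `f', clear
        describe, short    // memory footprint before
        compress           // demote each variable to the smallest storage type that holds its values
        describe, short    // memory footprint after
        save `f', replace  // safe to overwrite: -compress- never changes values, only storage types
    }

    use big1, clear
    append using big2 big3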





    • #3
      Nick,

      Thank you for the help. We will try the compress command next!



      • #4
        Just idle curiosity here: Stata's data-storage memory manager has already allocated 319 GB? Am I interpreting that correctly?



        • #5
          Nils,

          That is correct, 319 GB.
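
          For anyone who wants to check a figure like this on their own machine, Stata will report its current allocation directly; a minimal illustration:

          Code:
          memory          // detailed report of how much memory Stata has allocated, and to what
          query memory    // current memory settings: max_memory, segmentsize, and related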



          • #6
            Nils asked the question I wondered about. 319 GB is, for me, a substantial amount of disk space and an unheard-of amount of physical memory.

            So can you tell us: what sort of system are you running on? How much physical memory and free disk storage does it have? And just how "very large" are these datasets you are trying to append? Is it possible that you have exceeded not only your system's physical memory but also the free disk available for use as swap space in support of virtual memory?

            I'll admit I doubt there's anything I can contribute to resolving your problem, other than the usual "can you drop some variables or observations from these datasets individually before the append" advice (a sketch follows below). But I am interested in understanding where the limits are for "big data" on systems more substantial than mine.

            Thanks for any information you can share with those of us not favored with such a substantial computing environment.
            Last edited by William Lisowski; 16 Jul 2015, 14:59.
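
            For what it's worth, a sketch of that "trim before you append" advice, with hypothetical file names big1.dta and big2.dta and assuming only the variables id, year, and income are actually needed:

            Code:
            * read only the needed variables from disk, then append just those
            use id year income using big1, clear
            append using big2, keep(id year income)
            save combined, replace

            An if qualifier on the use line (e.g., use id year income if year >= 2000 using big1) restricts observations the same way.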



            • #7
              Another strategy is to determine whether any variables are strings that can be encoded, that is, strings that take only a limited number of distinct values, such as country names or survey questions. This can produce dramatic savings; just be sure to apply the same encoding to each dataset (see the sketch below).
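
              A hedged sketch of that idea, with a hypothetical string variable country and made-up file names; the point is to reload the same label definition before encoding each dataset so the numeric codes agree:

              Code:
              * define the full mapping once and save it as a do-file
              label define cty 1 "Canada" 2 "Mexico" 3 "United States"
              label save cty using ctylbl.do, replace

              foreach f in big1 big2 {
                  use `f', clear
                  do ctylbl.do    // -use- cleared the in-memory label; reload it
                  encode country, generate(ctycode) label(cty)
                  drop country    // keep only the compact numeric version
                  save `f', replace
              }

              One caveat: -encode- silently adds any unlisted value to the label, so the predefined mapping should cover every value that appears, or the codes can diverge across files.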


              I, too, was surprised by the 319 GB number.



              • #8
                Jeph makes a good point. I will add that -recast-ing strs as strLs sometimes yields dramatic savings.
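
                A small illustration, assuming a hypothetical long string variable named notes:

                Code:
                use big1, clear
                recast strL notes    // strLs store each distinct value only once, so heavily
                                     // duplicated long strings can shrink dramatically
                save big1, replace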

