  • What is the best configuration for a computer to run Stata?

    Hello forum,

    I need to know what configurations you have on your computers. I will be working with an actuarial model of 1,500,000 individuals and 250 variables. Which computer would you recommend buying?

    Excuse my bad English.

    Thanks a lot.




  • #2
    Sergio,

    The most important thing is to have enough memory. Do you know how much memory this data set will take? It's hard to know from what you have described. At a minimum it will take about 375 MB (if all 250 variables are small numbers that can be stored as byte) and could be quite a bit larger than that. If you don't already have the data set in Stata format, see help datatypes for how many bytes each data type consumes. Strings, of course, consume as many bytes per observation as the longest string stored.
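
    A quick back-of-envelope sketch of that arithmetic in Stata (the observation and variable counts come from the question; the storage-type scenarios are illustrative):

    ```stata
    * dataset footprint = observations x variables x bytes per variable
    display 1500000 * 250 * 1 / 1024^2    // all byte   (1 byte):  ~357.6 MB
    display 1500000 * 250 * 4 / 1024^2    // all float  (4 bytes): ~1430.5 MB
    display 1500000 * 250 * 8 / 1024^2    // all double (8 bytes): ~2861.0 MB

    * with the data already loaded, Stata reports the actual size:
    describe, short
    memory
    ```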

    In any case, you should probably get at least twice as much memory as you need, to allow for the memory needs of the operating system and other programs, as well as for future expansion.

    Given enough memory, everything else should be fine if you get a computer with a modern multi-core processor (such as an Intel i5 or i7).

    Regards,
    Joe



    • #3
      For Stata, it is better to buy a 64-bit CPU and around 16 GB of RAM; that should solve your problem.
      Regards
      --------------------------------------------------
      Attaullah Shah, PhD.
      Professor of Finance, Institute of Management Sciences Peshawar, Pakistan
      FinTechProfessor.com
      https://asdocx.com
      Check out my asdoc program, which sends outputs to MS Word.
      For more flexibility, consider using asdocx which can send Stata outputs to MS Word, Excel, LaTeX, or HTML.
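
      A minimal sketch of asdoc in action, using the auto dataset that ships with Stata (the output file name is just an example):

      ```stata
      ssc install asdoc              // one-time install from SSC
      sysuse auto, clear             // example dataset bundled with Stata
      asdoc summarize price mpg weight, save(summary_stats.doc) replace
      ```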



      • #4
        I would also caution against buying a computer based on the needs of one specific problem. Unless you will be doing nothing else for a period of several years, you should plan to buy something that can meet a variety of foreseeable demands. Also, even if your current data set is "only" 375 MB, analyses with it may require creating large matrices, or merging with or appending other data sets, so you need plenty of room to spare. If budgetary constraints bite, then as a general heuristic, more memory and a lesser CPU is, for most people, a better trade-off, unless you expect to do very computation-intensive things (multi-level non-linear models, large-scale microsimulations, etc.).



        • #5
          as a general heuristic, more memory and a lesser CPU is, for most people, a better trade-off unless you expect to do very computation intensive things
          I second that notion. Most decent (but reasonably priced) motherboards max out at 32 GB of RAM, though you might find something that can hold more. But unless you need to spend all your money immediately, 16 GB should probably be enough. If at all possible, get it configured so that you have empty slots (if it has four slots, get two 8 GB modules), which leaves two slots open for later upgrading if needed. It's generally cheaper to add RAM after the fact than to max it out from the manufacturer, and you don't know yet whether you need to max it out.
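
          On the Stata side, the relevant memory settings can be inspected and, if needed, capped; the 16g value below is purely illustrative:

          ```stata
          query memory                    // show current memory settings
          memory                          // show how much the loaded data uses

          * optionally cap how much RAM Stata may claim (Stata 12+)
          set max_memory 16g, permanently
          ```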

          As something that nobody has mentioned yet: if you can afford it, get a decent-sized SSD (say 128 or 256 GB), plus a larger 7,200 RPM SATA storage drive (1 or 2 TB). When reading and writing large data, SSDs are dramatically faster, but they're also too small for main storage in the long run with big data, so you keep stuff you're not immediately working on on the storage drive. It *is* possible to do weird and expensive things like 15,000 RPM SAS drives, but if you're halfway reasonable about file management, the SSD/SATA combo is by far the cheapest and fastest solution. Memory actually matters more to Stata than drive access time, but it's not too expensive nowadays to do the SSD/SATA combo, and when loading really big data (or just booting up or opening a word processor), you'll be glad you have an SSD.
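
          Two housekeeping commands worth knowing alongside the hardware advice: compress shrinks a dataset by recasting variables to the smallest storage type that preserves their values, and timer lets you compare load times between drives (mydata.dta here is a hypothetical file name):

          ```stata
          use mydata, clear     // hypothetical dataset
          compress              // recast variables to smallest safe types
          save mydata, replace

          * crude benchmark of load time (repeat per drive/path to compare)
          timer clear 1
          timer on 1
          use mydata, clear
          timer off 1
          timer list 1
          ```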

          Finally, depending on your environment, plan on a way of backing stuff up. Throwing $100 or so out for an external hard drive is a solid investment if you don't have central IT back stuff up for you (and, depending on the size of your files, they might not be willing to back everything up for you anyway).



          • #6
            Also, something I have no idea about, but maybe somebody will bite if I throw it out there: does -spmap- (or do other graphics commands) rely on the GPU or the CPU? I have an ancient computer but an upgraded video card, and -spmap- can take a while. Most Stata processes depend mostly on CPU and RAM (especially RAM), but for some tasks, upgrading the video card might help. Or it might not. If they all rely on the CPU anyway, then go with integrated video instead of spending money on a dedicated video card. But if some of the things you might do rely on the video card, then it can be a worthwhile upgrade.



            • #7
              @clyde Indeed, I will be building a dynamic microsimulation model.

              @Joe , @Attaullah @Ben

              Thank you very much for the contributions

              Regards

