• Kausta@lemm.ee
    3 months ago

    You haven't seen anything until you need to put a 4.2 GB gzipped CSV into a pandas dataframe, which works without any issues, I should note.
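
    For anyone curious, pandas handles this directly: `read_csv` infers gzip compression from the `.gz` extension, so no manual decompression is needed. A tiny sketch (the file name and contents here are just stand-ins for the 4.2 GB file):

    ```python
    import gzip
    import pandas as pd

    # Write a small gzipped CSV as a stand-in for the real 4.2 GB file.
    with gzip.open("data.csv.gz", "wt") as f:
        f.write("a,b\n1,2\n3,4\n")

    # pandas infers gzip compression from the .gz extension and
    # decompresses on the fly while parsing.
    df = pd.read_csv("data.csv.gz")
    print(df.shape)  # (2, 2)
    ```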

    • thisfro@slrpnk.net
      3 months ago

      I raise you thousands of gzipped files (total > 20 GB) combined into one dataframe. Frankly, my work laptop did not like it all that much, but most basic operations still worked fine.
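
      The usual pattern for that is reading each file and concatenating once at the end; a minimal sketch with a few stand-in files instead of thousands (file names here are made up):

      ```python
      import glob
      import gzip
      import pandas as pd

      # Create a few small gzipped CSVs as stand-ins for the thousands of files.
      for i in range(3):
          with gzip.open(f"part{i}.csv.gz", "wt") as f:
              f.write(f"a,b\n{i},{i * 2}\n")

      # Read every file and concatenate into one dataframe in a single pass;
      # a generator plus one pd.concat avoids repeated reallocation.
      df = pd.concat(
          (pd.read_csv(p) for p in sorted(glob.glob("part*.csv.gz"))),
          ignore_index=True,
      )
      print(df.shape)  # (3, 2)
      ```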

      • Kausta@lemm.ee
        3 months ago

        Yeah, it was just a simple example. Although using plain pandas (without something like Dask) to load terabytes of data into a single dataframe at once may not be the best idea, even with enough memory.
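
        Short of switching to Dask, pandas itself can stream a file with `chunksize`, so you aggregate per chunk instead of materialising one giant dataframe. A rough sketch (file name and sizes are made up for illustration):

        ```python
        import gzip
        import pandas as pd

        # Stand-in for a file too large to load in one go.
        with gzip.open("big.csv.gz", "wt") as f:
            f.write("a\n" + "\n".join(str(i) for i in range(1000)))

        # chunksize makes read_csv return an iterator of dataframes,
        # so memory use stays bounded by the chunk size.
        total = 0
        for chunk in pd.read_csv("big.csv.gz", chunksize=250):
            total += chunk["a"].sum()
        print(total)  # 499500, i.e. sum of 0..999
        ```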