
My "main" org file is 21k lines; it's no problem at all. My laptop is from 2017 or so.

I do sometimes work on files that are 300k lines (don't ask), and while it's mostly fine, once in a while I'll try some less common operation that's not well optimized and have to abort it (e.g. don't try to magit-blame that file). But things like searching, scrolling, editing, and syntax highlighting are all fast.

If I have to open files larger than 100 MB, I sometimes open them in `fundamental-mode`, which turns off syntax highlighting.
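A minimal sketch of how that can be automated (the 100 MB threshold and the hook function are my choices, not something the comment prescribes; `find-file-literally` is the built-in command for opening a file with no conversion at all):

```elisp
;; One-off: visit a file with no encoding/mode processing at all.
;;   M-x find-file-literally

;; Or automatically fall back to `fundamental-mode' for big files.
(defun my/big-file-setup ()
  "Disable expensive features in very large buffers."
  (when (> (buffer-size) (* 100 1024 1024))  ; > 100 MB
    (fundamental-mode)        ; plain major mode, no language smarts
    (font-lock-mode -1)))     ; make sure highlighting stays off

(add-hook 'find-file-hook #'my/big-file-setup)
```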

For truly large files (gigabytes), there is the `vlf` (Very Large File) package, which doesn't load the whole file into memory at once, but still lets you search, scroll, and even M-x occur (sort of like grep, but more integrated and editable).
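A rough setup sketch for `vlf` (it's on MELPA; `vlf-setup` and `vlf-application` are part of the package, though the exact defaults may differ by version):

```elisp
(require 'vlf-setup)           ; integrates vlf with find-file
;; Offer to open a file in chunks when it exceeds the
;; large-file warning threshold; 'dont-ask skips the prompt.
(setq vlf-application 'ask)

;; Usage: M-x vlf RET /path/to/huge.log RET
;; The package also provides `vlf-occur' for occur-style
;; searching across the whole file, chunk by chunk.
```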

Note that this is on Emacs 31 (there have been some major perf improvements in the last three or so releases, and more is coming with the new garbage collector).

In earlier days there were issues with very long lines. These have been partly mitigated in later releases: on the one hand by faster internals, but also by a new built-in package, `so-long`, which notices very long lines (default 10k bytes) where Emacs should probably give up syntax highlighting etc. to stay snappy.
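Enabling it is a one-liner (`so-long` ships with Emacs 27+; the threshold value below just echoes the default the comment mentions):

```elisp
;; Detect very long lines and swap in a cheaper mode automatically.
(global-so-long-mode 1)
;; Line length (in characters) above which so-long kicks in.
(setq so-long-threshold 10000)
```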



I finally made the switch to vim when I was working on a really large frontend template that consisted of the same massive repeated block where a small portion of each was different based on a condition.

There was a lot of search and replace, and Emacs started bogging down really hard at around the 10th condition block.



