On Debian... I have between 3 and 10 Calc files open, each a few hundred kiB, except one of 1.5 MiB that I don't modify (read only). soffice.bin just slowly but steadily inflates. After a couple of days, it takes up around 700 MiB of my swap partition and everything is laggy. I set a Java parameter to use up to 2048 MiB of RAM. The memory leak is still the same, but it gets laggy later.
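To put numbers on the inflation, here is a minimal monitoring sketch that logs soffice.bin's resident set size over time (it assumes a single soffice.bin process; ps reports RSS in KiB):

    # Log soffice.bin's resident set size once a minute.
    while sleep 60; do
        printf '%s %s KiB\n' "$(date +%F_%T)" "$(ps -o rss= -C soffice.bin)"
    done

Also, as far as I know, a -Xmx Java parameter only caps the heap of the JVM that LibreOffice embeds for Java-based features; it does not bound soffice.bin's native allocations, which would explain why it does not change the growth itself.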
The open files are a couple of xlsx files plus some CSV files.
Thank you for reporting the bug. Please attach a sample document, as this makes it easier for us to verify the bug. I have set the bug's status to 'NEEDINFO'. Please change it back to 'UNCONFIRMED' once the requested document is provided. (Please note that the attachment will be public, remove any sensitive information before attaching it. See https://wiki.documentfoundation.org/QA/FAQ#How_can_I_eliminate_confidential_data_from_a_sample_document.3F for help on how to do so.)
Hi, thanks for your answer. Unfortunately, the files I am working on consist only of sensitive data. I can, however, provide some more information:

* Opening an xlsx file of 45 kiB (6 sheets) with no formulas, no external links, basically only hand-filled data, no named ranges, no filtering, nothing other than plain text EXCEPT conditional formatting involving 16 conditions/rules, makes the used RAM increase by 25 MiB. Closing this file frees 1 MiB. Re-opening it increases the used RAM by 6 MiB. Re-closing it frees 2 MiB. Re-re-opening it uses 5 MiB...
* Opening a 2.2 MiB xlsx file (7 sheets) with only hand-filled data, no formulas, 5000 fields linked to an external file (broken links, since the other file no longer exists, so I answered "No" to "refreshing linked data"), auto-filtering, and no formatting except manually set background colours for headers, increases the used RAM by 100 MiB. Closing this file (without saving) frees 6 MiB. Re-opening it increases used RAM by 20 MiB. Re-closing it frees 20 MiB. Re-re-opening it uses 30 MiB. Re-re-closing it frees 4 MiB...
* Opening a 14 kiB ods file (7 sheets of pure text, no formatting except a few manually set background cell colours, no formulas, no links, no named ranges) that is already open uses 6.5 more MiB. Closing it frees 5 MiB...
* Opening a CSV file of 3.1 kiB increases the RAM usage by 6 MiB. Closing it frees 5 MiB. Re-opening it costs 5 MiB, re-closing it frees 5 MiB.

I cannot see a predictable pattern, but it is clear that running through iterations of opening and closing the very same xlsx or ods file, without modifying it, makes the RAM usage inflate. While I started with a RAM usage of around 86 MiB, working a few hours on my 8-10 files (4-5 open simultaneously, entering text, a few added/removed lines) plus running the little tests described above led me to a RAM usage of 260 MiB. I am aware that this is weak information, but I don't think I can do much better for now. Sorry, I hope it helps a bit though. Cheers
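For what it's worth, here is a minimal sketch of how these open/close deltas could be measured semi-automatically (assumptions: one soffice.bin instance is already running, and test.csv is a hypothetical stand-in for the files described above):

    #!/bin/sh
    # Measure RAM deltas across manual open/close cycles of one document.
    rss() { ps -o rss= -C soffice.bin; }   # resident set size, KiB

    for i in 1 2 3; do
        before=$(rss)
        soffice test.csv                   # hands the file to the running instance
        sleep 5
        echo "cycle $i, after open:  $(( $(rss) - before )) KiB"
        printf 'close the document by hand, then press Enter: '
        read _
        echo "cycle $i, after close: $(( $(rss) - before )) KiB"
    done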
In my case, my OS in general gets really laggy after hibernating. Have you hibernated your OS as well?
I do hibernate my computer a lot. However, it never makes my OS laggy in any way. (I use Debian and uswsusp with hibernation image compression enabled, on a swap partition of a good Samsung SSD.) Well, I realise I wrote "everything is laggy" in my OP, but I did not mean it; I should have written "every action in LibreOffice Calc".

My main concern is actually that while LibreOffice's RAM usage grows out of control, it swaps out a lot (even with vm.swappiness = 1) and there is no longer sufficient space on my swap partition to perform a hibernation. At that point, probably because too much LibreOffice data is swapped out, only LibreOffice is very laggy, especially when scrolling, even with small files of 300 lines and 25 columns of pure text (no formulas, no links, a little bit of formatting...).

I've tried increasing vm.vfs_cache_pressure to 400 (from the default value of 100), as shown below. After only one day of work, it seems that soffice.bin's RAM usage is kind of under control, remaining at around 180 MiB. But I need more days of work, with various files being opened, closed and edited, to confirm the good news. Anyway, thanks for your support.
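For reference, the knobs mentioned above are plain sysctl settings; this is a sketch of how they were queried and set here (the values are the ones tried in this thread, not recommendations):

    sysctl vm.swappiness vm.vfs_cache_pressure   # show current values
    sudo sysctl -w vm.swappiness=1
    sudo sysctl -w vm.vfs_cache_pressure=400
    # To persist across reboots, put the same key=value pairs in
    # /etc/sysctl.conf or a file under /etc/sysctl.d/.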
I realise I should clarify a point:

* I have 8 GiB of RAM, and the total RAM usage (including all processes) never exceeds 3 GiB.
* For some reason, however, even with "vm.swappiness = 1", my swap usage is normally between 1 GiB and 1.5 GiB, and can get much higher when soffice.bin becomes greedy.
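In case it helps others measure the same thing, the per-process swapped-out size is visible in /proc; a sketch assuming a single soffice.bin process:

    # VmSwap = amount of soffice.bin's memory currently swapped out
    grep VmSwap "/proc/$(pgrep -x soffice.bin)/status"
    free -h                                      # overall RAM/swap totals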
Thanks to a system crash, here is a little test I could run:

Turn on the computer after a system crash. Open LibreOffice and accept the file recovery, consisting of:
* 2 .xlsx files (48 kiB and 54 kiB), each with 16 conditional formatting rules and 7 sheets of pure text
* 1 .ods file (14.5 kiB) with 7 sheets, each with only 1 line of pure text (between 20 and 120 columns)
* 1 CSV file (2 MiB) of pure text (approx. 5,500 lines of 100 columns)

At that point, soffice.bin's RAM usage is 160 MiB.

Then I applied, to 12 CSV files (each consisting of only one line of 20 to 120 columns and weighing between 281 B and 1.2 kiB), the exact same sequence: [open CSV, remove one column, save (still as CSV) and close].

At that point, soffice.bin's RAM usage is 205 MiB.
(In reply to fredgib from comment #8)

Hello,

Considering you're working with 12 files, 205 MiB is quite acceptable. As a comparison, Firefox is consuming 1.3 GiB with 17 tabs open for me right now.

Closing as RESOLVED WONTFIX
Even considering that there was never more than 1 file open at a time? My point was that when I close a file, LibreOffice does not release as much memory as it consumed when opening it. So even if I only have a few files open at any given time, the fact that I have opened and closed a lot of files will lead to a large amount of memory being used.
There's a known memory problem with 32-bit LO, but here it should be 64-bit, right? If so, could this be considered a duplicate of Bug 92482?
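A quick way to verify the build's bitness is to inspect the binary itself (the path below is the usual Debian location; adjust if LibreOffice is installed elsewhere):

    # "ELF 64-bit" in the output confirms a 64-bit soffice.bin
    file /usr/lib/libreoffice/program/soffice.bin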