Created attachment 162478 [details]
file with 1048577 rows

When the 'Very large spreadsheets' experimental feature is turned on, importing more than 1048576 rows crashes Calc (1048576 rows was the previous limit).

Steps to reproduce:
1. Turn experimental features on
2. Turn 'Very large spreadsheets' on
3. Import the attached .csv file with 1048577 rows

This results in a crash with the following message:

multi_type_vector::set#483: block position not found! (logical pos=1048576, block size=1, logical size=1048576)

The same file with 1048576 rows imports perfectly (tail -n +2 lines1048577.csv).
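For anyone without the attachment, an equivalent test file can be generated locally; this is a sketch, the cell contents are arbitrary and only the row count (one past the old limit) matters. The filename matches the attachment name used above.

```shell
# Generate a one-column CSV with 1048577 rows, i.e. one row past the
# previous 1048576-row limit; cell values are arbitrary.
seq 1 1048577 > lines1048577.csv

# Sanity-check the row count before importing into Calc.
wc -l < lines1048577.csv
```

Importing this file with 'Very large spreadsheets' enabled should reproduce the crash, while `tail -n +2 lines1048577.csv > lines1048576.csv` gives the 1048576-row variant that imports fine.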
Created attachment 162484 [details]
bt with debug symbols

On pc Debian x86-64 with master sources updated today, I could reproduce this.
Probably a duplicate
No crash in:

Version: 7.1.0.0.alpha0+
Build ID: 8c18cd6823ddf4ef5ba67801a84cee26c9b5a9a6
CPU threads: 4; OS: Mac OS X 10.15.6; UI render: default; VCL: osx
Locale: ru-RU (ru_RU.UTF-8); UI: en-US
Calc: threaded

WFM