Bug 113611 - The data could not be loaded completely because the maximum number of rows per sheet was exceeded
Status: NEW
Alias: None
Product: LibreOffice
Classification: Unclassified
Component: Calc
Version: 5.4.1.2 release (earliest affected)
Hardware: All
OS: All
Importance: low enhancement
Assignee: Not Assigned
URL:
Whiteboard:
Keywords: needsDevEval
Depends on:
Blocks: Calc-1048576plus-rows
Reported: 2017-11-02 19:47 UTC by Boris Bahes
Modified: 2023-05-07 15:46 UTC
CC: 4 users

See Also:
Crash report or crash signature:


Attachments
Excel Query Editor #1 (119.08 KB, image/png)
2017-11-05 10:12 UTC, Boris Bahes
Excel Query Editor #2 (63.61 KB, image/png)
2017-11-05 10:13 UTC, Boris Bahes

Description Boris Bahes 2017-11-02 19:47:06 UTC
Description:
Loading a large .csv file displays the message "The data could not be loaded completely because the maximum number of rows per sheet was exceeded." at the end of loading. Would it be possible to have an interface like Excel's in this situation, giving the user the ability to filter this large .csv and load the filtered version?


Actual Results:
"The data could not be loaded completely because the maximum number of rows per sheet was exceeded." message dialog in Calc.


Expected Results:
New UI that would allow loading filtered version.


Reproducible: Always


User Profile Reset: Yes


OpenGL enabled: Yes

Additional Info:


User-Agent: Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/61.0.3163.100 Safari/537.36
Comment 1 V Stuart Foote 2017-11-02 20:03:59 UTC
Max row count is 1,048,576 (2^20) rows.

Do you really have a CSV that large? Otherwise the CSV is misformatted.

=-ref-=
https://wiki.documentfoundation.org/Faq/Calc/022
Comment 2 Boris Bahes 2017-11-02 20:46:54 UTC
250 MB. Just trying to load the output of a utility that logged a week's traffic on a server with a single client connected.
Comment 3 V Stuart Foote 2017-11-02 21:32:36 UTC
Sure, some LO testing prior to import-filter parsing (e.g., testing with wc -l) might be feasible. But that really is rather an abusive data set to be pushing into a spreadsheet. I would not want to try to work with a sheet that large ;-)

Would think loading to Base on detection would be a smarter route. Filtering records would be trivial at that point.

@Eike, Kohei?
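[The pre-parse check suggested above, testing the row count with wc -l before import, can be sketched in a few lines of POSIX shell. The filename and split prefix below are illustrative only; seq stands in for the real server log:]

```shell
# Sketch: check a CSV's line count against Calc's 1,048,576-row
# (2^20) per-sheet limit before importing, and split it into
# sheet-sized chunks if it is too large.
MAX_ROWS=1048576                      # 2^20, Calc's per-sheet row limit

seq 2000000 > server.log.csv          # stand-in for the real 2M-row log

rows=$(wc -l < server.log.csv)
if [ "$rows" -gt "$MAX_ROWS" ]; then
    echo "too large for one sheet ($rows rows); splitting"
    # produces chunk_aa, chunk_ab, ... each at most MAX_ROWS lines
    split -l "$MAX_ROWS" server.log.csv chunk_
fi
```

[Each resulting chunk file can then be opened as a separate sheet, though this does not replace the in-app filtering UI requested here.]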
Comment 4 Boris Bahes 2017-11-03 05:33:03 UTC
I've seen this suggestion on different forums, but using Calc seems more intuitive for end users.
Comment 5 Eike Rathke 2017-11-03 22:48:22 UTC
Filter how? What does Excel do there?
Comment 6 Boris Bahes 2017-11-05 10:12:58 UTC
Created attachment 137534 [details]
Excel Query Editor  #1
Comment 7 Boris Bahes 2017-11-05 10:13:13 UTC
Created attachment 137535 [details]
Excel Query Editor  #2
Comment 8 Boris Bahes 2017-11-05 10:20:35 UTC
Excel uses Query Editor.
Comment 9 Anton 2019-01-08 12:04:34 UTC
I would say that, rather than a filter, the maximum-row restriction should be dropped. I just tried to open a file that wasn't even 70 MB and couldn't open it. We are in 2019. My computer has 20 GB of RAM. And I cannot open a 70 MB file because of some arbitrary row restriction?! Really!? (I have read the bug on the maximum-column restriction and understand that there might be some underlying technical-debt issues that make this difficult to fix, but from a user's perspective this really is ridiculous behaviour. I hope it will get fixed.)