When connecting to a data source via "Link to External Data", the Spreadsheet doesn't set an Accept header, so there is no way for a web server to know which response format is best for the user agent. In this case we send plain text formatted for human consumption, which is the worst possible format for this feature. If you have a good HTML parser that scans for a TABLE element, please add "Accept: text/html" to the request header.

It also seems the import doesn't look at the MIME type of the received data. We send back text/csv or text/tsv to your tool, together with a marker saying the content is UTF-8 encoded, but the link tool ignores this information, guesses incorrectly about the structure of the text file, and prompts the user. If a server sets text/csv or text/tsv, it is probably following the RFC.

Furthermore, when we return an HTML file with a _single_ table, the tool prompts the user to pick one of 3 tables. This dialog doesn't make sense, since the file itself has exactly one table; I don't know how it is producing that list.

Finally, this tool sends several PROPFIND and GET requests for a single open. I'm not sure why PROPFIND is needed at all, but one GET should be sufficient. Instead, each open amounts to ~8 PROPFIND, ~3 HEAD, and ~3 GET requests.

Being able to pull in web resources with this tool would be absolutely great. If you would like to test, feel free to use:

http://demo.htsql.org/school
http://demo.htsql.org/school/:html
http://demo.htsql.org/school/:tsv
http://demo.htsql.org/school/:csv

Thank you kindly!
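P.S. To make the expected behaviour concrete, here is a minimal sketch in Python (purely illustrative, not the spreadsheet's actual import code) of the content negotiation I would hope the link tool performs; the Accept header value and the dispatch logic are my own suggestions, and the URL is one of the demo endpoints above:

import urllib.request

# One GET, advertising which formats the client can actually parse.
req = urllib.request.Request(
    "http://demo.htsql.org/school",
    headers={"Accept": "text/html, text/csv;q=0.9, text/tsv;q=0.9"},
)

with urllib.request.urlopen(req) as resp:
    content_type = resp.headers.get_content_type()       # e.g. "text/csv"
    charset = resp.headers.get_content_charset("utf-8")  # honour the declared encoding
    body = resp.read().decode(charset)

# Dispatch on the declared MIME type instead of guessing from the payload.
if content_type == "text/html":
    pass  # hand off to the HTML parser and extract the TABLE element(s)
elif content_type in ("text/csv", "text/tsv"):
    pass  # parse as CSV/TSV per the RFC; no need to prompt the user about structure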
Never confirmed by a QA member (or even a 2nd person). Moving back to UNCONFIRMED. In the future, please do not set your own bugs to NEW. Thanks.
Sorry, but I am closing this as INVALID. I know it's been 2 years and I apologize for that, but this is a list of several issues in one bug report, and we need one bug/enhancement per report; otherwise a developer who takes the bug has to decide which of the issues to fix (or skips it entirely because they don't want to tackle five things at once). Please open a new bug report for each of the enhancement requests/bugs you point out, with clear instructions on how to reproduce, what you expect to see, and what you actually see. Again, I recognize that it's been 2+ years. The QA team is growing and we're trying to at least confirm requests faster. Thanks so much for your understanding.