I’m looking at data acquisition hardware that leads to either .CSV files or their own proprietary binary format.
I understand .CSV is a format Excel popularized, and I’ve heard that Excel can only handle around 64,000 observations (I think they would call them “rows”). But I’m not an Excel user, so I don’t know.
.CSV is simply a comma-delimited text file. Excel will only(!) go to 65,536 rows, but a .CSV file can be imported into a database program (like Access) for a virtually unlimited number of rows.
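Since it’s just delimited text, you don’t even need Access to get past the row ceiling. Here’s a rough Python sketch of loading a .CSV into SQLite, which has no fixed row limit; the file path, table name, and column layout are all placeholders for illustration, not anything from a specific acquisition system:

```python
import csv
import sqlite3

def load_csv_to_sqlite(csv_path, db_path, table="readings"):
    """Stream a .CSV into a SQLite table, one row at a time."""
    conn = sqlite3.connect(db_path)
    with open(csv_path, newline="") as f:
        reader = csv.reader(f)
        header = next(reader)  # first row is assumed to hold column names
        cols = ", ".join('"%s"' % c for c in header)
        placeholders = ", ".join("?" for _ in header)
        conn.execute('CREATE TABLE IF NOT EXISTS "%s" (%s)' % (table, cols))
        # executemany consumes the reader lazily, so the whole file
        # never has to fit in memory at once
        conn.executemany(
            'INSERT INTO "%s" VALUES (%s)' % (table, placeholders), reader
        )
    conn.commit()
    return conn
```

Once the rows are in SQLite you can query them with ordinary SQL, no matter how far past 65,536 the file runs.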
Thanks, folks. Yoyodyne, it turns out typical hardware that uses these files doesn’t stop writing just because it hits the limit; it’s just that Excel would stop reading. Like you suggest, other things can use these files with longer datasets.
I write my own code to read files, and in other work often read millions of lines, and sometimes as many as 32,000 columns. Looks like that’ll work here too.
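For anyone else reading along: the reason your own code scales past Excel’s limit is that a .CSV can be read as a stream, one line at a time. A minimal sketch, assuming a plain comma-delimited file with no quoting surprises:

```python
import csv

def count_rows(path):
    """Count data rows in a .CSV without loading it all into memory."""
    with open(path, newline="") as f:
        # csv.reader yields one row at a time, so millions of rows
        # (or tens of thousands of columns) cost almost no memory
        return sum(1 for _ in csv.reader(f))
```

The same streaming pattern works for processing each row as you go, not just counting.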
Zoid, I don’t really need to know, you just got me curious: do other versions handle more rows, or fewer? I can’t check because I don’t actually use Excel; I don’t even know how.
We’re working on report design and dumping the results of queries into Excel. The conventional wisdom we’re going with is that Excel stops at 64K, so it’s off to Access we go.
Specs for Excel 2003: “Worksheet size 65,536 rows by 256 columns” [as mentioned above] “Arrays: Limited by available memory. Also, arrays cannot refer to entire columns. For example, an array cannot refer to the entire column C:C or to the range C1:C65536. However, an array can refer to the range C1:D65535 because the range is one row short of the maximum worksheet size and does not include the entire C or D column.”
Excel only handles the ~64K rows but if you’re adept at using pivot tables you can use Excel to manipulate Access (or even Oracle) tables that have many, many more rows than that.