I was preparing a table of production data to play with during Chuck's SQL training session and stumbled on something.

Must be all the DataFlex-enhanced CO2 in the conference room, but I spotted a wrinkle in DBImport that has bitten me before and is in fact why I hadn't used it since 17.1.

If the CSV file being imported is BIG, DBImport will fail silently.

472,389 lines of CSV, loaded into multiple copies of a DAT table. The import apparently completes every time, but copy 1 ended up with 188,857 records, copy 2 with 195,486, and so on. Each DAT was created from a newly loaded DEF. I couldn't see anything in DBImport that would omit records, and couldn't see any problem with the input.

The issue is that a newly created DAT expects 10,000 records. DBImport just keeps inserting records without complaint all the way until all 472,389 have been processed. No errors, no crashes.
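The only way I caught it was by comparing counts by hand. A cheap safeguard is to count the CSV's data rows up front and check that against the record count after the import. A minimal sketch in Python (the filename and the header assumption are mine, not anything DBImport does):

```python
import csv

def count_csv_rows(path: str, has_header: bool = True) -> int:
    """Count data rows in a CSV, letting csv.reader handle
    quoted fields that span multiple physical lines."""
    with open(path, newline="") as f:
        rows = sum(1 for _ in csv.reader(f))
    return rows - 1 if has_header else rows

# After the import, compare against the table's record count:
# expected = count_csv_rows("production.csv")
# if record_count != expected:
#     raise RuntimeError(f"silent truncation: {record_count} of {expected} rows")
```

Had DBImport (or I) done this check, the 188,857-of-472,389 result would have failed loudly instead of looking like a clean finish.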

Since DBImport already estimates the line count of the CSV file to prepare a progress bar, couldn't it notice that the 10,000-record limit will be breached and change the target file structure to accommodate?
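Something like the guard below is what I have in mind: use the estimated row count to pick a big-enough limit before any rows load, instead of importing silently past the ceiling. This is a Python sketch of the idea only; the 10,000 default and the resize behaviour are my assumptions about how DBImport could work, not its actual internals:

```python
def preflight_max_records(csv_rows: int, max_records: int = 10_000) -> int:
    """Return a max-records setting large enough for the import.
    If the current limit suffices, keep it; otherwise propose a
    larger one (with headroom) so the user sees the problem
    before a single row is inserted."""
    if csv_rows <= max_records:
        return max_records
    proposed = int(csv_rows * 1.1)  # 10% headroom over the estimate
    print(f"CSV has ~{csv_rows:,} rows but the table allows "
          f"{max_records:,}; raising the limit to {proposed:,}.")
    return proposed
```

Even if automatically resizing the file is too invasive, simply refusing to start (or warning) when `csv_rows > max_records` would have saved me the afternoon.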