
CSV import - some rows working, most not

Hi everyone - I’m migrating some tables from Bubble into Backendless via CSV upload.

I’ve altered them to use the Backendless field header syntax, but I’m getting errors when I try to upload.
It seems to accept some rows, but not the majority.

Is there anything you can see or think of that could be causing this, or anything I can try in order to debug it?
It’s a file with a LOT of columns (a hangover from the previous Bubble implementation), which I’ll be optimising once inside Backendless.

In my case I’m adding completely new tables - here’s the error message I’m getting:

13:39:37  IMPORT_DATA  Started import subtask (1) for CompaniesTest table.
13:39:37  IMPORT_DATA  Cannot parse '0.4' as a long number.
13:39:37  IMPORT_DATA  Ignores rows with values [, , , , , , , 3, 1, , , , , , , , 3, 1, 531397, -103718, , 7200000... etc.etc.etc.

And here’s the test file I’m using, with two rows: one which works, the other (the second) which does not.

Any help would be massively appreciated!


CompaniesTest.csv (29.0 KB)

Hi Rob,

The CSV defines 321 columns. From the database design perspective, it is (for lack of a better word) a disaster…

Do you really need all these columns in a single table?

As for the error reported in the log… Some of the column types are not defined correctly. For example, here’s a column definition:

[screenshot: a column declared with the INT type]

and here’s the data it contains:

[screenshot: a value of 0.4]

That is not an integer. It should be declared as DOUBLE.
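If it helps anyone debugging a similar import, here is a minimal sketch of a pre-flight checker. It assumes the Backendless-style header convention of declaring the type in parentheses after the column name (e.g. `price(DOUBLE)`), and only validates the numeric types involved here; `find_type_mismatches` is a hypothetical helper, not part of any Backendless SDK.

```python
import csv
import re

# Headers are assumed to look like "columnName(TYPE)", e.g. "score(DOUBLE)".
HEADER_RE = re.compile(r"^(?P<name>[^(]+)\((?P<type>[A-Z_]+)\)$")

def parses_as(value, col_type):
    """Return True if the cell value is compatible with the declared type."""
    if value == "":
        return True  # empty cells are treated as NULL, not a type error
    if col_type == "INT":
        try:
            int(value)  # "0.4" fails here, matching the import log error
            return True
        except ValueError:
            return False
    if col_type == "DOUBLE":
        try:
            float(value)
            return True
        except ValueError:
            return False
    return True  # non-numeric types are not validated in this sketch

def find_type_mismatches(path):
    """Yield (row_number, column_name, value, declared_type) for bad cells."""
    with open(path, newline="") as f:
        reader = csv.reader(f)
        headers = [HEADER_RE.match(h.strip()) for h in next(reader)]
        for row_num, row in enumerate(reader, start=2):
            for header, value in zip(headers, row):
                if header is None:
                    continue  # header did not match the name(TYPE) pattern
                name, col_type = header.group("name"), header.group("type")
                if not parses_as(value.strip(), col_type):
                    yield (row_num, name, value, col_type)
```

Running this over the CSV before uploading should point at exactly the rows the import would ignore, e.g. an INT column containing 0.4.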


Great, thanks Mark - I’d not come across that before. Will try now.

As for the hideous table - absolutely. It’s 10 tables mashed into one, which was the fastest way to load the data in Bubble, although it was still woefully slow. Hence the move to Backendless. We’re still in the testing phase right now - apples to apples, how much load time do we save? Then, if successful, we’ll need to rebuild the data-gathering and upload process to store things more efficiently.

Hello @Rob_Simpson

Could you please explain what you mean when you say:

how much load time do we save

Regards, Dima.