Row size too large on JSON import (importing from Parse)

When importing a Parse JSON export, I get this error message:

IMPORT_DATA Importing failed. com.mysql.jdbc.exceptions.jdbc4.MySQLSyntaxErrorException: Row size too large. The maximum row size for the used table type, not counting BLOBs, is 65535. This includes storage overhead, check the manual. You have to change some columns to TEXT or BLOBs

Attached is the schema. By my math, I have 86 fields. Even if they were all strings, I would still be at 43k, so I'm not sure why it is failing. Since I have a lot of ints/doubles/booleans, I would think I am closer to 35k.

Thanks

BackendlessFields.pdf (101.87kB)

Hi Tim,

Every string column is stored with a multi-byte character set (MySQL's utf8), which counts 3 bytes for every character toward the row size. So for 86 properties in a single table, a single record would require 124500 bytes.
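As a rough illustration of the arithmetic (assuming, purely for the sake of the math, that each of the 86 columns is a 500-character string; the actual column lengths may differ, which is why this comes out slightly above the 124500 figure):

    # Back-of-the-envelope row-size estimate. The 500-character column
    # length is an assumed default used only for illustration.
    columns = 86
    chars_per_column = 500
    bytes_per_char = 3  # MySQL utf8 reserves up to 3 bytes per character

    estimated_row_size = columns * chars_per_column * bytes_per_char
    print(estimated_row_size)          # 129000
    print(estimated_row_size > 65535)  # True: well over the row limit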

Regards,
Mark

Thanks for the quick response. Looks like I need to work on my math. I just watched your import video, which says you are working to resolve all import issues from Parse. Is this issue on your list?

We have a lot of junk in our database, so we could probably fit within the current limit that Backendless has. However, we do not have an easy way to delete columns, because Parse has shut down their import capability. Normally I would export our production DB, import it into a new app on Parse, delete a bunch of columns, and then try the Backendless import again, but I can't do that. Any thoughts? It might be a nice feature to support ignoring certain fields/columns when importing a DB.

Thanks
Tim

Hi Tim,

We have always had a limit of 65535 bytes per record. This is more an issue of a denormalized schema than a problem with the import algorithm. The algorithm can handle the import as long as the incoming data fits within our constraints.

I would suggest creating a program that reads in the JSON data and writes it back out, removing any unnecessary elements along the way.
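A minimal sketch of such a program in Python, assuming the Parse export is a JSON file with the records in a top-level "results" array (the file name and the column names to drop are placeholders):

    import json

    # Placeholder names: replace with the columns you actually want to drop.
    UNWANTED_COLUMNS = {"legacyField1", "legacyField2"}

    with open("ParseExport.json") as infile:
        data = json.load(infile)

    # Strip the unwanted columns from every record.
    for record in data.get("results", []):
        for column in UNWANTED_COLUMNS:
            record.pop(column, None)  # skip columns a record doesn't have

    with open("ParseExport.trimmed.json", "w") as outfile:
        json.dump(data, outfile)

The trimmed file could then be fed to the import instead of the original export.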

Regards,
Mark

Thanks. Most of that went over my head, but I guess the key question is this: it sounds like full Parse DB import support is not on your list of things to do? If so, I'll need to line up some developer resources to write code that converts our data into something Backendless can handle.

Thanks
Tim

Tim, I apologize for being too technical in my response. Yes, the data needs to be “shrunk” by eliminating anything that is not needed before it is imported.

When you look for technical resources, please consider this as an option:
https://backendless.com/professional-services/

Regards,
Mark