InvalidEntitySizeException

06:43:58 IMPORT_DATA Preparing DailyMurliURLs table: initializing columns.
06:43:58 IMPORT_DATA Import of tables failed: Property value for "spanishURL" exceeded the maximum allowed size (500 symbols)
06:43:58 IMPORT_DATA Importing failed. Exception:com.backendless.exceptions.persistence.InvalidEntitySizeException: Property value for "spanishURL" exceeded the maximum allowed size (500 symbols)

The "spanishURL" column's data type is TEXT, so the limit should be more than 500 characters. Why is Backendless throwing this error?

App name: BrahmakumarisBestFriend
Account: photographerapps@gmail.com

Hello!

I tried to reproduce your problem in my test app, but everything works fine.
Can you send a data sample on which the problem occurs?

Regards, Andriy

Dear Andriy,

Please find the sample data attached herewith.

(Attachment DailyMurliURLs.json is missing)

Dear Andriy,

Please find the sample data attached herewith. I have converted the JSON file to TXT. Please take a look.

DailyMurliURLs.txt (304 KB)

Hello!

The JSON format does not preserve information about column types. You encounter this problem when importing data into an app without a corresponding table: in that case Backendless tries to guess the column types, and for all text columns the "STRING" type is suggested.

To avoid this problem, you can import your data via CSV files. A CSV file's header can contain all the required column metadata, including types. Please let me know if this works around the problem for you.
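
For illustration, here is a minimal sketch of such a header, assuming the columnName(TYPE) header syntax that Backendless uses in its CSV exports (the URL values below are placeholders, not your real data; compare with a CSV exported from your own app to confirm the exact format):

spanishURL(TEXT),hindiMp3URL(TEXT)
"https://example.com/murli/2021-05-01-es.mp3","https://example.com/murli/2021-05-01-hi.mp3"

With TEXT declared explicitly in the header, the importer should not fall back to the default STRING type with its 500-character limit.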

I have also created internal ticket BKNDLSS-20514 to improve type suggestion during "Parse app" import; that will fix your problem. I will notify you when it is released.

Regards, Andriy

Hi Andriy,

I uploaded a CSV with the proper data types (TEXT), but the issue still persists. Until last month it was working as expected. Has anything changed during the last two months? I have attached the .CSV for reference.

11:29:42 IMPORT_DATA Importing Started.
11:29:42 IMPORT_DATA Preparing files to import
11:29:42 IMPORT_DATA Copying of DailyMurliURLs.csv, size: 0.215 MiB.
11:29:42 IMPORT_DATA Creating temporary tables
11:29:42 IMPORT_DATA Creating user's tables
11:29:42 IMPORT_DATA Preparing DailyMurliURLs table: initializing columns.
11:29:43 IMPORT_DATA Import of tables failed: Property value for "hindiMp3URL" exceeded the maximum allowed size (500 symbols)
11:29:43 IMPORT_DATA Importing failed. Exception:com.backendless.exceptions.persistence.InvalidEntitySizeException: Property value for "hindiMp3URL" exceeded the maximum allowed size (500 symbols)
com.backendless.management.impex.manager.imports.data.CommonDatabaseUtils.ensureDataSize(CommonDatabaseUtils.java:48)
com.backendless.management.impex.manager.imports.data.ImportDataManager.initColumns(ImportDataManager.java:679)
com.backendless.management.impex.manager.imports.data.ImportDataManager.initTables(ImportDataManager.java:627)
com.backendless.tasks.impex.ImportDataTask.doExecute(ImportDataTask.java:170)
com.backendless.tasks.impex.ImportDataTask.executeImpl(ImportDataTask.java:125)
com.backendless.tasks.impex.AbstractImportTask.execute(AbstractImportTask.java:89)
com.backendless.taskman.Task.run(Task.java:136)
com.backendless.taskman.TimeoutFixedThreadPoolExecutor.executeTaskWithTracker(TimeoutFixedThreadPoolExecutor.java:91)
com.backendless.taskman.TimeoutFixedThreadPoolExecutor.lambda$invoke$0(TimeoutFixedThreadPoolExecutor.java:39)
java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
java.util.concurrent.FutureTask.run(FutureTask.java:266)
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
java.lang.Thread.run(Thread.java:748)
11:29:43 IMPORT_DATA Importing Finished.

DailyMurliURLs.csv (220 KB)

Hello!

I checked your app and saw that a table with the same name already exists. Can you rename your file and try to import it again? Backendless will create a separate table for it. Let me know the results of this import.

Regards, Andriy

Thank you Andriy,

I did as you said and the data got imported as expected. But for the last two years I have been importing data into the same table by uploading a JSON file only. Why did it throw an error this time? The next time I want to insert data into the existing table, I am not sure it will give the expected results. But anyway, thank you very much for your immediate responses all the time.

Regards,

Suma

Hi, Suma!

The errors you encountered were caused by the previously failed JSON import. The system probably cached the guessed column types for your table internally during the failed import and reused them for all subsequent imports.
You can recreate the table by deleting it and then importing the data in a file with the original name, or wait until tomorrow for the cache to be invalidated.

I recommend using CSV import with files whose headers contain column metadata. That way you can explicitly specify the exact type for each column.
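
If it is useful, here is a small local pre-check you could run before importing. It is plain Java, not part of the Backendless SDK, and its CSV parsing is deliberately naive (it assumes no quoted commas inside fields); it simply flags values longer than the 500-character STRING limit so you know which columns must be declared TEXT:

import java.io.BufferedReader;
import java.nio.file.Files;
import java.nio.file.Paths;

// Hypothetical helper, not a Backendless API: scans a CSV and reports
// fields longer than 500 characters, i.e. values that need a TEXT column.
public class CsvSizeCheck
{
  private static final int STRING_LIMIT = 500; // default STRING column size

  public static void main( String[] args ) throws Exception
  {
    try( BufferedReader reader = Files.newBufferedReader( Paths.get( args[ 0 ] ) ) )
    {
      String line;
      int row = 0;

      while( ( line = reader.readLine() ) != null )
      {
        row++;
        String[] fields = line.split( "," ); // naive split; assumes no quoted commas

        for( int col = 0; col < fields.length; col++ )
          if( fields[ col ].length() > STRING_LIMIT )
            System.out.printf( "row %d, column %d: %d chars - needs TEXT%n",
                               row, col + 1, fields[ col ].length() );
      }
    }
  }
}

You would run it as: java CsvSizeCheck DailyMurliURLs.csv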

As I wrote before, we will look at this issue in the JSON file import and fix it in a future release.

Regards, Andriy