File upload: 13MB zip file with one compressed file

Application ID: 45F4AD7D-0C9A-4A6A-FF23-2666FDC93000
Error UID: E5BA4DF7-DEE7-69CB-FF12-19D89A61A800

Backendless encountered an error while handling the request. An internal trouble ticket with ID E5BA4DF7-DEE7-69CB-FF12-19D89A61A800 has been created and we will be investigating the issue.
String index out of range: 0
java.lang.StringIndexOutOfBoundsException: String index out of range: 0
	at java.base/java.lang.StringLatin1.charAt(StringLatin1.java:47)
	at java.base/java.lang.String.charAt(String.java:693)
	at com.backendless.management.impex.AutoReplacingService.replace(AutoReplacingService.java:20)
	at com.backendless.management.impex.csv.data.ColumnSimpleProcessor.fromHeader(ColumnSimpleProcessor.java:66)
	at com.backendless.management.impex.csv.data.ColumnsInfoParser.parseEntry(ColumnsInfoParser.java:51)
	at com.backendless.management.impex.csv.data.ColumnsInfoParser.parseEntry(ColumnsInfoParser.java:16)
	at com.googlecode.jcsv.reader.internal.CSVReaderImpl.readNext(CSVReaderImpl.java:68)
	at com.backendless.management.impex.ImpexSystem$1.loadColumns(ImpexSystem.java:38)
	at com.backendless.management.impex.manager.ImpexManager.parseFile(ImpexManager.java:292)
	at com.backendless.management.impex.service.ImportService.importZip(ImportService.java:340)
	at controllers.console.migration.Import.lambda$importZip$4(Import.java:175)
	at com.backendless.async.ExecutorService.lambda$submit$0(ExecutorService.java:64)
	at java.base/java.util.concurrent.CompletableFuture$AsyncSupply.run(CompletableFuture.java:1700)
	at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
	at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
	at java.base/java.lang.Thread.run(Thread.java:834)

Problem description

Attempting to upload a 500,000-row file.
I have successfully uploaded a 20-row file with the same header descriptions.
The system failed with an 'invalid file' error when I attempted the upload as a single 50MB file. I converted it to a 13MB zip file, which then produced the internal error ticket above.
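
For reference, the exception above is thrown by String.charAt(0) inside AutoReplacingService.replace while the CSV column headers are being parsed (ColumnSimpleProcessor.fromHeader), which looks like an empty header token, for example a blank column name or a trailing comma in the header row of the CSV. The sketch below only reproduces that guess; the fromHeader method here is hypothetical and not the actual Backendless code.

public class EmptyHeaderRepro
{
  // Hypothetical stand-in for what the header processing might do:
  // uppercase the first character of the column name. An empty token
  // (e.g. from a trailing comma in the header row) makes charAt( 0 )
  // throw "String index out of range: 0", matching the trace above.
  static String fromHeader( String headerToken )
  {
    return Character.toUpperCase( headerToken.charAt( 0 ) ) + headerToken.substring( 1 );
  }

  // Defensive variant: leave empty tokens alone instead of indexing into them.
  static String fromHeaderSafe( String headerToken )
  {
    if( headerToken == null || headerToken.isEmpty() )
      return headerToken;
    return fromHeader( headerToken );
  }

  public static void main( String[] args )
  {
    // A header row with a blank column name between "age" and "city".
    String[] headers = "name,age,,city".split( ",", -1 );

    for( String h : headers )
      System.out.println( "[" + fromHeaderSafe( h ) + "]" );   // [Name] [Age] [] [City]

    fromHeader( "" );   // throws StringIndexOutOfBoundsException: String index out of range: 0
  }
}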

Hi @Paul_HIllen,

Thanks for reporting this issue!
I have created an internal ticket to investigate it.
You can reference it by its internal ID BKNDLSS-24312.

Regards,
Stanislaw

Hi Stanislaw,
Just spotted the issue: my Cloud9 plan only supports 100,000 data table items, and Cloud99 supports 400,000.
Bad timing for the subscription band price increases!
Regards,
Paul