Extract table data into a CSV file

Hello,
I have a table with 7k records in the database. I want to run a job every 10 minutes that takes this data, converts it to a CSV file, and places it on the file system so the mobile app can retrieve it.

My thought process is to create a server-code script that does the fetching and creates the file. Two questions:

  1. Is my approach correct, or do you think there is a better or easier way? (Waiting for the upcoming business suite automation might not be feasible, as this is needed quickly.)

  2. Is paging also needed when the code is executed on the server (i.e., the 100-record limit)?

I'm open to a better or faster approach that I didn't think of.

Thank you

Hello @snakeeyes,

Thank you for your questions. I’ll address them in order to provide clear responses:

  1. Regarding the CSV file generation: Your current method of creating a CSV file every 10 minutes seems to be functional, but it might indeed be costly and result in excessive file storage usage. To offer a more efficient alternative, I would need a bit more detail about your use case:

    • What data are you exporting to CSV files?
    • What triggers the need for these frequent exports?
    • How are these CSV files being used after creation?

    Understanding these aspects will help in recommending a more tailored and possibly more cost-effective approach.

  2. About paging in server-side code execution: yes, the limit in Cloud Code is the same: 100 records per request.

Feel free to provide additional details regarding your CSV export process, and we can explore alternative solutions that might be faster and more budget-friendly.
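For illustration, working past the 100-records-per-request limit typically means fetching the table page by page until a short page comes back. Here is a rough sketch; `fetchPage` is a hypothetical stand-in for whatever find-with-offset call your data API exposes, stubbed below so the loop can be seen end to end:

```javascript
// Sketch: collect all records despite a 100-per-request page limit.
// `fetchPage(offset, size)` is a placeholder for the real data API call.
const PAGE_SIZE = 100;

async function fetchAllRecords(fetchPage) {
  const all = [];
  let offset = 0;
  while (true) {
    // Each call returns at most PAGE_SIZE records starting at `offset`.
    const page = await fetchPage(offset, PAGE_SIZE);
    all.push(...page);
    if (page.length < PAGE_SIZE) break; // last (possibly partial) page
    offset += PAGE_SIZE;
  }
  return all;
}

// Stub fetcher simulating a table of 250 records.
const demoTable = Array.from({ length: 250 }, (_, i) => ({ objectId: i }));
const demoFetch = async (offset, size) => demoTable.slice(offset, offset + size);

fetchAllRecords(demoFetch).then(records => {
  console.log(records.length); // 250
});
```

With a real backend you would replace `demoFetch` with the actual paged query, keeping the loop unchanged.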

Thank you @sergey.kuk for your comment.
Currently the mobile app fetches a CSV file that has 7k products and parses them locally for app usage. These products are in a database table, and I usually generate the file from it manually once a week.

Now the use case we have is that the list of these products in the database will be updated through a portal. Therefore, the updated list of products needs to be loaded in the app as “live” as possible. This is why I proposed generating a file every 10 minutes (or 20 minutes).

Given this context, please feel free to suggest another way. Also, if the approach I am proposing is correct, is writing the server code manually and uploading it the only way? Or is there some tool that can help with that? I'm just trying to get this done quickly.

Thank you

Just an update. Please correct me if my conclusion is wrong. It seems the 5-second server execution time limit will bite me.

If I read the table of 7k records with paging, it will easily take way more than 5 seconds. Even if I split the reading across multiple timers (using Hive to save intermediate state), I still need all the results to generate one complete CSV file.

It seems I just have to run a cron job on my own server that reads from Backendless and saves the file back there.
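For what it's worth, the CSV-building step of such a cron job can be a small pure function. Here is a rough sketch; it assumes all records share the same fields (headers are taken from the first record), and the script name in the cron comment is only an example:

```javascript
// Sketch of the CSV-building step for a cron-driven export.
// Assumption: every record has the same fields; headers come from the first record.
function toCsv(rows) {
  if (rows.length === 0) return '';
  const headers = Object.keys(rows[0]);
  const escape = value => {
    const s = String(value ?? '');
    // Quote fields containing commas, quotes, or newlines (RFC 4180 style).
    return /[",\n]/.test(s) ? `"${s.replace(/"/g, '""')}"` : s;
  };
  const lines = [headers.join(',')];
  for (const row of rows) {
    lines.push(headers.map(h => escape(row[h])).join(','));
  }
  return lines.join('\n');
}

// A cron entry such as `*/10 * * * * node export-products.js` (hypothetical
// script name) would fetch all records, then build and upload the file:
console.log(toCsv([
  { name: 'Tea, green', price: 3 },
  { name: 'Coffee', price: 5 },
]));
// → name,price
//   "Tea, green",3
//   Coffee,5
```

Running it on your own server sidesteps the 5-second limit, since the fetch-all-pages loop and file write happen outside the cloud-code sandbox.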

Hello @snakeeyes,

Apologies for the delay in responding. Your scenario does present some complex elements.

Regarding your use case where the product list in the database will be updated through a portal:

  1. If you are only adding new data: You might consider using an after create event handler. This handler can be set up to automatically append new entries to your CSV file using Backendless’s append operation. More information can be found in the Backendless documentation on file append operations. It’s important to note that these event handlers won’t trigger if modifications are made directly from the Backendless console.
  2. For other types of updates: If your updates include modifications to existing data or if you’re unsure about the types of updates being performed, implementing a cron job on your server, as you mentioned, would be a good approach. This setup would allow for regular synchronization of your CSV file with the database changes made through the portal.
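As a sketch of the first option, the per-record formatting can be isolated in a small helper. The handler registration and file-append call are shown only as comments, since their exact shape should be checked against the Backendless Cloud Code and Files documentation; the column names here are examples, not your actual schema:

```javascript
// Formats one newly created record as a CSV line, escaping quotes/commas.
function recordToCsvRow(record, columns) {
  const escape = value => {
    const s = String(value ?? '');
    return /[",\n]/.test(s) ? `"${s.replace(/"/g, '""')}"` : s;
  };
  return columns.map(c => escape(record[c])).join(',');
}

// Inside an afterCreate event handler (sketch only — verify the registration
// and append APIs against the Backendless docs), this helper would be used as:
//
//   // afterCreate on the 'Products' table:
//   //   const line = recordToCsvRow(req.item, ['objectId', 'name', 'price']);
//   //   // append `line + '\n'` to products.csv via the Files append operation
//
console.log(recordToCsvRow(
  { objectId: '1', name: 'Tea, green', price: 3 },
  ['objectId', 'name', 'price'],
));
// → 1,"Tea, green",3
```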

Please let me know if this addresses your concerns, or if there’s anything specific you’d like further clarification on!

Best regards,
Sergii