I have a table with a non-unique reference ID. There can be 1 to N rows with the same ID.
I need to -
A) If there is 1 row, send an email with the details in that row.
B) If there are N rows, send an email with the details in the N rows.
I can get all the rows whose ID appears in only one row (problem A above) by grouping by the ID and using HAVING COUNT(ID) = 1.
I am stumped on how to get the rows efficiently for problem B.
Idea 1) Group by the ID and use HAVING COUNT(ID) > 1, then loop over the results, calling the DB once per ID to get the detail rows and sending the email.
Idea 2) Group by the ID, COUNT the IDs, and use HAVING COUNT(ID) > 1. This gets every ID and a count of how many rows share it. Loop over those results, putting batches of IDs into a second query's IN clause, keeping each batch small enough that the total row count stays under the 100-row page size limit. Finally, loop over the second query's results, group the rows that share an ID into one email, and send it (a rough sketch of this is below).
I don't really like either option. Idea 1 is clean but makes a lot of DB trips; Idea 2 is messy but makes fewer.
Other ideas are appreciated.
I need to do this for 1000-1500 rows each morning.
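For illustration, here is a rough sketch of Idea 2's two queries using the Backendless JS SDK (the Codeless query blocks expose the same Data service options). The table name Shipments and the column referenceId are placeholder names, and it assumes DataQueryBuilder supports setGroupBy/setHavingClause and aggregate properties as in recent SDK versions:

```typescript
import Backendless from 'backendless';

const TABLE = 'Shipments'; // placeholder table name; assumes Backendless.initApp(...) was already called

// First query: each reference ID that appears in more than one row,
// plus how many rows share it. Use "= 1" in the having clause for problem A.
async function findMultiRowReferenceIds() {
  const query = Backendless.DataQueryBuilder.create()
    .setProperties(['referenceId', 'Count(objectId) as rowCount'])
    .setGroupBy('referenceId')
    .setHavingClause('Count(objectId) > 1')
    .setPageSize(100); // 100 is the Data service page-size cap

  return Backendless.Data.of(TABLE).find(query);
}

// Second query: the detail rows for one batch of reference IDs in a single call.
async function loadRowsForReferences(referenceIds: string[]) {
  const inList = referenceIds.map(id => `'${id}'`).join(', ');
  const query = Backendless.DataQueryBuilder.create()
    .setWhereClause(`referenceId IN (${inList})`)
    .setPageSize(100);

  return Backendless.Data.of(TABLE).find(query);
}
```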
Is there a restriction on the maximum number of rows with the same reference ID?
[Tim] - No, but it isn’t likely to be over 10.
So you need to iterate over the unique reference IDs and send emails to the related customers about the entries with that reference ID, correct?
[Tim] - Correct. The data is shipping information, and each row is the tracking number, carrier (UPS/FedEx/USPS), etc. Each order can have 1 to N (realistically 10 max) shipments.
Which type of Backendless SDK do you use (REST, JS, Java, etc.)?
[Tim] - I am using Codeless.
If you have a table “order” and a relation “shipments”, then I recommend loading the orders together with their related shipments using the “Load Table Objects” block. With this approach you can load 100 orders and up to 100 objects in each relation in a single request (100 is the max page size in the Data service).
With this approach, processing 1000 orders takes 10 requests to fetch the order IDs and 10 more to fetch the orders with their related shipments (20 API calls in total). If you can fetch the required orders directly, you will need only 10 API calls in total.
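In JS SDK terms (the “Load Table Objects” Codeless block maps onto the same query options), this might look roughly like the sketch below; the relation column shipments and the setRelated/setRelationsPageSize calls are assumptions based on the schema described above:

```typescript
import Backendless from 'backendless';

// Load one page of "order" objects with their related "shipments" attached
// by the server in the same request (assumes Backendless.initApp was called).
async function loadOrdersWithShipments(offset: number) {
  const query = Backendless.DataQueryBuilder.create()
    .setRelated(['shipments'])     // one-to-many relation column on "order"
    .setRelationsPageSize(100)     // up to 100 related objects per order
    .setPageSize(100)              // max page size in the Data service
    .setOffset(offset);

  return Backendless.Data.of('order').find(query);
}

// 1000 orders => 10 pages of 100, each page already carrying its shipments.
async function loadAllOrders() {
  const all: object[] = [];
  for (let offset = 0; ; offset += 100) {
    const page = await loadOrdersWithShipments(offset);
    all.push(...page);
    if (page.length < 100) break;  // last page reached
  }
  return all;
}
```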
Otherwise, the approach is slightly more complex:
Fetch the list of unique references. If you have more than 100 references to iterate over, you will need to fetch them in batches (pages), since the system allows fetching only 100 entries per page.
To minimize API calls, you can fetch the tracking numbers for several reference IDs at once. Since the page size limit in the Data service is 100 and each reference realistically has at most ~10 shipments, you can fetch the tracking numbers for up to 10 references per request.
So for each batch of references, split it into smaller batches (up to 10 items) and fetch the tracking numbers for each of these smaller batches. After fetching, group the records by reference ID and then run your email-sending logic for each bucket.
For a set of 1000 references, that is 10 requests to fetch the unique references and 100 requests to fetch all of your data (110 API calls in total).
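A minimal sketch of this batching approach, again in JS SDK terms with placeholder names (Shipments, referenceId) and assuming setGroupBy can be used to page through the distinct references:

```typescript
import Backendless from 'backendless';

const TABLE = 'Shipments';      // placeholder table name
const PAGE_SIZE = 100;          // Data service page-size cap
const REFS_PER_REQUEST = 10;    // ~10 shipments max per reference, so 10 refs fit in one page

// One page of distinct reference IDs.
async function fetchReferencePage(offset: number): Promise<string[]> {
  const query = Backendless.DataQueryBuilder.create()
    .setProperties(['referenceId'])
    .setGroupBy('referenceId')
    .setPageSize(PAGE_SIZE)
    .setOffset(offset);
  const rows = await Backendless.Data.of(TABLE).find(query);
  return rows.map((r: any) => r.referenceId);
}

// Tracking rows for up to REFS_PER_REQUEST references in one call, grouped per reference.
async function fetchBuckets(referenceIds: string[]): Promise<Map<string, any[]>> {
  const inList = referenceIds.map(id => `'${id}'`).join(', ');
  const query = Backendless.DataQueryBuilder.create()
    .setWhereClause(`referenceId IN (${inList})`)
    .setPageSize(PAGE_SIZE);
  const rows: any[] = await Backendless.Data.of(TABLE).find(query);

  const buckets = new Map<string, any[]>();
  for (const row of rows) {
    buckets.set(row.referenceId, [...(buckets.get(row.referenceId) ?? []), row]);
  }
  return buckets;
}

async function run() {
  for (let offset = 0; ; offset += PAGE_SIZE) {
    const refs = await fetchReferencePage(offset);
    for (let i = 0; i < refs.length; i += REFS_PER_REQUEST) {
      const buckets = await fetchBuckets(refs.slice(i, i + REFS_PER_REQUEST));
      for (const [referenceId, shipments] of buckets) {
        // your email-sending logic goes here
        console.log(`would email ${shipments.length} shipment(s) for reference ${referenceId}`);
      }
    }
    if (refs.length < PAGE_SIZE) break;  // last page of references
  }
}
```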
We don’t track the orders in Backendless; those live in our order management system.
I could split the current table into two tables:
Table 1
ID
Table 2
Relationship to ID
Tracking Number
Carrier
Etc.
Using Deep save or a transaction to insert a row into Table 1 and Table 2 when the shipment is created wouldn't increase the API calls on insert, I think.
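A Deep save call along those lines might look like this in the JS SDK (Codeless has an equivalent Deep Save block); the table names Orders/Shipments and the column names are placeholders:

```typescript
import Backendless from 'backendless';

// One call creates/updates the parent order (Table 1) and saves the related
// shipment (Table 2) through the "shipments" relation column.
async function saveOrderWithShipment(referenceId: string, trackingNumber: string, carrier: string) {
  return Backendless.Data.of('Orders').deepSave({
    referenceId,
    shipments: [
      { trackingNumber, carrier },
    ],
  });
}
```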
What happens on Deep save if a record already exists in table 1, but not in table 2?
It will update the record in table 1 and create a record in table 2.
There is another problem with “Deep save” in your case: when you specify an “objectId” for a record, the “Deep save” logic will try to update the object instead of creating one. Because of this, I would suggest using the Transactions API instead.
In the Transactions API you have more control over each action. For Table 1, for example, you can use the “upsert” action to achieve your goal.
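A sketch of such a transaction, assuming the JS SDK's UnitOfWork with upsert, create, and addToRelation operations, and the same placeholder table/column names as above:

```typescript
import Backendless from 'backendless';

// Placeholder tables: "Orders" (Table 1) and "Shipments" (Table 2), linked
// through a "shipments" relation column on Orders.
async function addShipmentTransactionally(
  order: { objectId?: string; referenceId: string },
  shipment: { trackingNumber: string; carrier: string }
) {
  const uow = new Backendless.UnitOfWork();

  // Upsert either updates the existing Orders row or creates it.
  const orderResult = uow.upsert('Orders', order);

  // The shipment row is always new.
  const shipmentResult = uow.create('Shipments', shipment);

  // Attach the new shipment to the order without touching existing relations.
  uow.addToRelation(orderResult, 'shipments', shipmentResult);

  // All three operations succeed or fail together.
  return uow.execute();
}
```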