Hi Team,
I have a process that updates values in a batch of records via the REST API, and I am looking to reduce the number of API calls required.
My understanding of bulk update is that it writes the SAME value to multiple records in one call. Example:
If my question is “Is the item red?” and the answer varies per record, then using bulk update I can only set all records to true or all to false.
Correct me if I am wrong here…
And while I could make two calls here, one for true and one for false, the permutations quickly become unmanageable if, say, I am updating 3-10+ fields per record with unique values per record.
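For reference, here is roughly what I am doing today with bulk update (a Python sketch; the app ID, API key, table and field names are placeholders, and the endpoint shape is my reading of the REST docs, so double-check it):

```python
import requests

APP_ID = "YOUR-APP-ID"          # placeholder
API_KEY = "YOUR-REST-API-KEY"   # placeholder
BULK_URL = f"https://api.backendless.com/{APP_ID}/{API_KEY}/data/bulk/Items"

# One bulk call per distinct value: e.g. set isRed = true on the "true"
# group, identified by objectId...
true_ids = "'AAAA-1111','BBBB-2222'"  # placeholder objectIds
requests.put(
    BULK_URL,
    params={"where": f"objectId IN ({true_ids})"},
    json={"isRed": True},  # the SAME body is applied to every match
).raise_for_status()

# ...and a second call for the "false" group. With 3-10+ fields of
# per-record values, the number of value groups explodes.
false_ids = "'CCCC-3333'"
requests.put(
    BULK_URL,
    params={"where": f"objectId IN ({false_ids})"},
    json={"isRed": False},
).raise_for_status()
```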
If I could submit a JSON array of record objects (for example), each object referencing its objectId, then this could all be done in one API call.
Is that (or an alternative) possible and I have missed it in the docs somehow?
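To make it concrete, I am imagining something like this (purely hypothetical; as far as I can tell no such endpoint exists today):

```python
# Hypothetical payload: the shape I wish I could send in ONE call,
# an array of per-record updates, each keyed by objectId.
wished_payload = [
    {"objectId": "AAAA-1111", "isRed": True,  "size": "L", "qty": 3},
    {"objectId": "BBBB-2222", "isRed": False, "size": "S", "qty": 7},
    {"objectId": "CCCC-3333", "isRed": True,  "size": "M", "qty": 1},
]
```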
Right now, one batch update of 250 records is 250 calls, and staying under the rate limits means that iterating through 30k records takes a couple of hours.
I don’t need to update 30k records in one call, but even 1 call for 250 records would reduce the overhead considerably, and I could run the process more frequently.
Thanks!
Hello @Will_Sargent,
I can suggest you check out our Transactions API; it also allows you to update multiple objects at once. It’s the only way to update several objects using one request.
Regards,
Olha
Hi @olhadanylova,
Thank you for the suggestion of this API.
It looks like, to achieve what I am after, I need to chain multiple units of work together.
So I am thinking that if I have 250 unique records to update, each with a unique set of new values, then I need to chain 250 single-record update operations together into a single transaction.
Does this sound correct?
The one downside is that if one update fails (possible in some scenarios, though unlikely in this one, I think, since I just retrieved the records being updated), the entire batch rolls back and I have to handle that exception.
Thanks
Hi @Will_Sargent ,
> It looks like, to achieve what I am after, I need to chain multiple units of work together.
Yes, you are correct. You will need to execute a bunch of “units of work”.
> So I am thinking that if I have 250 unique records to update, each with a unique set of new values, then I need to chain 250 single-record update operations together into a single transaction.
In general, you are correct about the idea. But your batches will be smaller, since the Transactions API puts a limit on the number of operations per transaction. For example, Cloud99 allows 20 operations per transaction.
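A rough sketch of the chunking (I am writing the request shape from memory, so please verify field names such as operationType, opResultId and isolationLevelEnum against the Transactions API documentation):

```python
import requests

APP_ID = "YOUR-APP-ID"          # placeholders
API_KEY = "YOUR-REST-API-KEY"
TX_URL = f"https://api.backendless.com/{APP_ID}/{API_KEY}/transaction"
OPS_PER_TX = 20  # plan-dependent limit, e.g. Cloud99

def update_in_transactions(table, records):
    """records: dicts that each contain objectId plus the new values."""
    for start in range(0, len(records), OPS_PER_TX):
        chunk = records[start:start + OPS_PER_TX]
        operations = [
            {
                "operationType": "UPDATE",
                "table": table,
                "opResultId": f"update_{start + n}",
                "payload": rec,  # includes objectId + changed fields
            }
            for n, rec in enumerate(chunk)
        ]
        # one API call updates up to OPS_PER_TX records atomically
        resp = requests.post(TX_URL, json={
            "isolationLevelEnum": "REPEATABLE_READ",
            "operations": operations,
        })
        resp.raise_for_status()
```

With 250 records at 20 operations per transaction, that is 13 API calls instead of 250.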
> The one downside is that if one update fails the entire batch rolls back, and I have to handle that exception.
Yes, you are correct. You will need to handle errors in your client code.
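For example, something like this (again only a sketch; since a failed transaction rolls back completely, the whole chunk can be retried as a unit):

```python
import requests

def post_transaction(tx_url, tx_body, attempts=3):
    # A failed transaction rolls back atomically, so nothing in the
    # chunk was written and the whole chunk can safely be retried.
    for attempt in range(1, attempts + 1):
        resp = requests.post(tx_url, json=tx_body)
        if resp.ok:
            return resp.json()
        print(f"attempt {attempt} failed: {resp.status_code} {resp.text}")
    raise RuntimeError("transaction chunk failed after retries")
```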
But unfortunately, this is currently the only way to reduce the number of API calls.
Regards, Andriy
OK, since I am on Cloud 9, that puts me at 10 operations per transaction (which is probably a better batch size than 250 anyway). That reduces a 250-record batch to 25 API calls, which is certainly better than 250.
I’ll take a look at this, thanks for reminding me of the transaction size limits.
Best,