# spicedb
f
Hello again! When trying to integrate a project we have that uses Authzed into our existing backend, we have encountered the following error:
error={"message":"rpc error: code = InvalidArgument desc = update count of 510 is greater than maximum allowed of 500"}
We have to process a query that attempts to update more than 500 items. The question is how to deal with this limit: should I handle it by hand, by making sure that I do not process more than 500 items at a time, or is there another API for doing this in batches automatically?
v
Large transactions can degrade the service, hence the limit. Are you using Authzed Serverless, or your own instance? If the latter, there is a flag to change the maximum value. I assume it's the former, because SpiceDB's default is 1K. You could look into the BulkImport API, which is meant for ingesting large amounts of relationships.
It's worth noting that BulkImport ingests transactionally when you close the stream. It's recommended to ingest a maximum of 10K elements - if you cross that barrier, you should batch the bulk import calls
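Roughly, batched bulk import with authzed-go could look something like this - untested sketch, assuming `client` is an authzed-go `*authzed.Client` with the experimental service available, and `rels` holds the relationships to ingest (the helper name is just illustrative):
```go
import (
	"context"
	"log"

	v1 "github.com/authzed/authzed-go/proto/authzed/api/v1"
	authzed "github.com/authzed/authzed-go/v1"
)

// bulkImportInBatches streams relationships into SpiceDB via the experimental
// BulkImportRelationships API, sending at most batchSize relationships per stream.
func bulkImportInBatches(ctx context.Context, client *authzed.Client, rels []*v1.Relationship) error {
	const batchSize = 10_000 // stay at or under the recommended 10K per call

	for start := 0; start < len(rels); start += batchSize {
		end := start + batchSize
		if end > len(rels) {
			end = len(rels)
		}

		// Each stream ingests transactionally when it is closed.
		stream, err := client.BulkImportRelationships(ctx)
		if err != nil {
			return err
		}
		if err := stream.Send(&v1.BulkImportRelationshipsRequest{
			Relationships: rels[start:end],
		}); err != nil {
			return err
		}
		resp, err := stream.CloseAndRecv()
		if err != nil {
			return err
		}
		log.Printf("loaded %d relationships", resp.NumLoaded)
	}
	return nil
}
```
Each `CloseAndRecv` commits that stream's batch, so a failure partway through leaves the earlier batches already written.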
Now that I think about it - I believe BulkImport is not supported in Authzed Serverless 😬
f
Umm, I am not sure what you mean by Serverless. We have a corporate account and our instance of Authzed will be running there. We will use gRPC to connect to it, so I imagine this is serverless, right?
Also, just to confirm with you: there is no API that you can invoke to get the maximum number of items that `WriteRelationships` and `BulkImportRelationshipsRequest` can deal with, right? It's just 500 for the first one and 10K for the second one, right?
v
Authzed offers several managed products. If you are using the service provided by https://app.authzed.com/, then that's "Authzed Serverless". No, there is no API that tells you what the limits are; it's an instance setting. For Authzed Serverless it's 500. BulkImport is not supported in Authzed Serverless.
f
Thanks again. So, because I am using "Authzed Serverless" I cannot change the limit nor use BulkImport; thus, the only solution would be to use `WriteRelationships` in batches, right? Something like:
```go
// ... (assuming you have a slice of relationship updates called `updates`)
const maxUpdatesPerBatch = 500

for i := 0; i < len(updates); i += maxUpdatesPerBatch {
	batchEnd := i + maxUpdatesPerBatch
	if batchEnd > len(updates) {
		batchEnd = len(updates)
	}
	batch := updates[i:batchEnd]

	// Make the SpiceDB write call with the current batch
	_, err := authzedClient.WriteRelationships(ctx, &v1.WriteRelationshipsRequest{
		Updates: batch,
	})
	if err != nil {
		// Handle the error appropriately
	}
}
```
We reckon we won't be dealing with a lot of large transactions, but occasionally we will have to process a few.
y
@Felix Medina correct - using serverless means you're subject to this limit. If you're running dedicated or self-hosting, there are ways to change the setting around this limit, and the default (iirc) is 1000 updates.
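For the self-hosted case, the limit is a serve-time flag; from memory it looks something like the following (flag name recalled from memory - double-check `spicedb serve --help` for the exact name):
```sh
# Raise the WriteRelationships update cap on a self-hosted SpiceDB instance
spicedb serve \
  --grpc-preshared-key "somerandomkeyhere" \
  --write-relationships-max-updates-per-call 1000
```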