# talk-keto
Hello, I'm doing some benchmarking on keto, and for that I need to create larger and larger databases. I was using `keto relation-tuple create` to ingest the benchmarking data, but I'm starting to hit a size limit when ingesting data (at around 4MB I believe). I was going to write a script to fragment my data, but maybe I missed a simpler solution?
are you piping in JSON?
it might be the case that the deserialization part is not really optimized for such big data sets at once
the api should have less of a problem
Yes, I was giving a single JSON file as an argument. The error seemed to say that the problem came from the gRPC message size:
```
Error doing the request: rpc error: code = ResourceExhausted desc = grpc: received message larger than max (20135416 vs. 4194304)
```
I'll try with the API
right, then there is that limit on the api
so split it into separate files 😉
OK, will do then