Glyph.7805:

I have a large database of users for which there will be multiple ?access_token requests, and I am trying to avoid sending and receiving a lot of HTTP requests to and from our servers.

Is there a way to retrieve the JSON data for each user per access token in a single bulk request, e.g. via a header or GET parameter (?access_token=“xx1”&“xx2”&“xx3”&“xx4”)?

My idea is to create a cron job that sends requests for 200 users at a time every x minutes, so that the data is fetched in chunks. Rough sketch of what I mean below.
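Something like this (Python; the endpoint URL and the token source are just placeholders, not our real ones):

```python
# Sketch of the chunked cron-job idea: each run fetches data for up to 200
# access tokens, one request per token. API_URL and load_pending_tokens()
# are placeholders/stubs, not the real endpoint or database.
import requests

API_URL = "https://api.example.com/v2/characters"  # hypothetical endpoint
CHUNK_SIZE = 200

def load_pending_tokens():
    """Return the access tokens that still need refreshing (stub)."""
    return ["xx1", "xx2", "xx3", "xx4"]

def fetch_chunk(tokens):
    """Fetch user data for one batch of tokens, one request per token."""
    session = requests.Session()  # reuse the connection across requests
    results = {}
    for token in tokens:
        resp = session.get(API_URL, params={"access_token": token}, timeout=10)
        resp.raise_for_status()
        results[token] = resp.json()
    return results

def run_once():
    """One cron invocation: process up to CHUNK_SIZE tokens."""
    tokens = load_pending_tokens()[:CHUNK_SIZE]
    data = fetch_chunk(tokens)
    # store_results(data)  # persist however the backend expects
    return data

if __name__ == "__main__":
    run_once()
```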

What is the recommended method here?

Lawton Campbell.8517:

There isn’t a way to request multiple character blobs, and there probably won’t be in the future (the backend that the API talks to for character data also follows a request/response protocol and only supports one blob per request).

I’m not sure if our HTTP implementation supports pipelining, but pipelining the HTTP requests across a handful of persistent connections would be the fastest way to get the data out. Not sure if your HTTP library supports it, but I know cURL does, at least.
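As a rough sketch of the “handful of persistent connections” idea (not true HTTP pipelining, just a few long-lived connections kept busy in parallel): each worker thread owns one requests.Session, so each keeps its own persistent SSL connection, and they pull tokens off a shared queue. The endpoint URL and token list are placeholders.

```python
# Spread requests across a handful of persistent connections: one
# requests.Session (and thus one reused SSL connection) per worker thread.
import queue
import threading
import requests

API_URL = "https://api.example.com/v2/characters"  # hypothetical endpoint
NUM_CONNECTIONS = 4  # "a handful" of persistent connections

def worker(token_queue, results, lock):
    session = requests.Session()  # one persistent connection per worker
    while True:
        try:
            token = token_queue.get_nowait()
        except queue.Empty:
            return
        resp = session.get(API_URL, params={"access_token": token}, timeout=10)
        with lock:
            results[token] = resp.json() if resp.ok else None
        token_queue.task_done()

def fetch_all(tokens):
    token_queue = queue.Queue()
    for token in tokens:
        token_queue.put(token)
    results, lock = {}, threading.Lock()
    threads = [threading.Thread(target=worker, args=(token_queue, results, lock))
               for _ in range(NUM_CONNECTIONS)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return results
```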

The main latency overhead from sending multiple requests is going to be the SSL handshake, though, so even without pipelining, re-using connections will make a significant difference compared to opening a new connection per request.
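To illustrate just the connection-reuse point, a minimal standard-library sketch: a single HTTPSConnection pays the SSL handshake once and then serves every request over the same socket (host and path here are placeholders).

```python
# One HTTPSConnection, many requests: the TLS handshake happens once on the
# first request, and subsequent requests reuse the same socket.
import http.client
import json
from urllib.parse import urlencode

def fetch_with_one_connection(tokens, host="api.example.com",
                              path="/v2/characters"):
    conn = http.client.HTTPSConnection(host)  # handshake on first request
    results = {}
    for token in tokens:
        conn.request("GET", f"{path}?{urlencode({'access_token': token})}")
        resp = conn.getresponse()
        body = resp.read()  # drain the body before reusing the connection
        results[token] = json.loads(body) if resp.status == 200 else None
    conn.close()
    return results
```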