Connector request limit

Hi community!

Is there a size limit for connector responses? I'm having issues calling a connector that talks to ChatGPT and getting no response (timeout).

Hi there!

Are you using an HTTP connector with the POST method to connect with ChatGPT? I am wondering if there could be a limit on the ChatGPT side as well, but I will check with our team too :slight_smile:

Yes, I'm using POST. By ChatGPT I meant our internal one. I always get a response when I test this using Postman, but rarely from Tulip…

Hey @tokotu -

The default HTTP timeout limit for connectors is 20 seconds (primarily to protect the user experience in apps). Are you seeing latency higher than that in Postman?

Importantly, Postman may be reaching out to a streaming endpoint on the OpenAI side, and the time to first token is far lower than the time for a complete message.

The max response size is 20 MB, but you will be nowhere near that with a text response.
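If it helps, here is a rough way to compare the two outside of Tulip or Postman. The URL, API key, and payload below are placeholders for whatever endpoint your connector actually calls; the point is just to see whether the full (non-streaming) response takes longer than the 20-second connector timeout, even if a streaming call starts returning tokens almost immediately:

```python
import time
import requests

# Placeholder endpoint, key, and payload - swap in your internal ChatGPT-style service.
URL = "https://example.internal/v1/chat/completions"
HEADERS = {"Authorization": "Bearer YOUR_KEY"}
PAYLOAD = {"model": "gpt-4", "messages": [{"role": "user", "content": "Hello"}]}

# Non-streaming: measure how long the complete JSON body takes.
start = time.monotonic()
resp = requests.post(URL, headers=HEADERS, json=PAYLOAD, timeout=60)
print(f"complete response: {time.monotonic() - start:.1f}s, status {resp.status_code}")

# Streaming: measure the time to the first chunk (roughly time-to-first-token).
start = time.monotonic()
with requests.post(URL, headers=HEADERS, json={**PAYLOAD, "stream": True},
                   stream=True, timeout=60) as stream_resp:
    next(stream_resp.iter_content(chunk_size=None))
    print(f"first streamed chunk: {time.monotonic() - start:.1f}s")
```

If the first number is regularly above 20 seconds while the second is small, the connector is simply timing out waiting for the complete message.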

Pete

I thought the 20-second limit only applies when I test the connectors, and that the limit is set higher when running in the Player…

I don't use streaming for the response; I get a single JSON object.

I can check what size the response from Postman has.

It would be nice if we had more control over the connector timeout.

So… I evaluated this using Developer Tools, and I see the correct response I expect with a 200 status. BUT the variable which should contain the response doesn't get the text, and I get an error message.

Content-Length: 1858
Content-Type: application/json; charset=utf-8

Is there an internal issue with JSON parsing?
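To rule that out on my side, this is the kind of quick check I can run outside Tulip (the URL and payload are just stand-ins for our internal endpoint):

```python
import json
import requests

# Stand-in URL and payload for our internal endpoint.
URL = "https://example.internal/v1/chat/completions"
PAYLOAD = {"model": "gpt-4", "messages": [{"role": "user", "content": "Hello"}]}

resp = requests.post(URL, json=PAYLOAD, timeout=30)
print(resp.status_code, resp.headers.get("Content-Length"), resp.headers.get("Content-Type"))

# If json.loads() succeeds here, the body itself is valid JSON, and the problem
# is more likely in how the connector output (body vs. status code/message) is mapped.
body = json.loads(resp.text)
print(list(body))
```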

I think the issue is using the HTTP response code and response message… if I delete these from the output format, it works…

Hey @tokotu -

Let me connect you with the support team; we would not expect this call to hang on the status code response.

Pete