Some applications use
httpx or requests to talk to model providers
directly — for example calling a custom endpoint, a self-hosted model, or a
provider whose official Python SDK is not yet supported. The SDK’s HTTP
fallback patches these libraries so those calls can be governed too.
When to enable it
Enable the HTTP fallback when:

- You call an LLM endpoint via httpx.post/requests.post rather than via the official openai, anthropic, or google-generativeai SDKs.
- You want broad audit coverage across miscellaneous outbound model traffic.
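As an illustration of the first case, a raw JSON call to a self-hosted endpoint might be built like this. The URL and body are hypothetical, and the stdlib urllib.request is used here only for self-containment — in practice the same request would go through httpx.post or requests.post:

```python
import json
import urllib.request

# Hypothetical self-hosted chat endpoint; not a real service.
url = "http://localhost:8000/v1/chat/completions"
body = {"model": "local-llama", "messages": [{"role": "user", "content": "Hello"}]}

req = urllib.request.Request(
    url,
    data=json.dumps(body).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="POST",
)
# The request is only constructed here, not sent. Sent through httpx or
# requests with the fallback enabled, it would be intercepted and governed.
```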
Configuration
The HTTP fallback is on by default. Disable it by passing enable_http_fallback=False to egisai.init().
Use
No code changes are required — once egisai.init() runs with
enable_http_fallback=True (the default), httpx.Client.send and
requests.Session.send are wrapped.
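The wrapping itself is ordinary method patching applied at init time, which is why existing call sites keep working unchanged. A minimal sketch of the idea, using a stand-in Session class rather than the real requests.Session — the init function and audit_log here are illustrative, not the SDK's actual internals:

```python
import functools

class Session:
    """Stand-in for requests.Session (illustrative, not the real class)."""
    def send(self, url, body):
        return {"status": 200, "url": url}

audit_log = []

def init(enable_http_fallback=True):
    # Sketch of egisai.init()-style patching (assumed behavior, not the SDK).
    if not enable_http_fallback:
        return
    original = Session.send

    @functools.wraps(original)
    def governed_send(self, url, body):
        audit_log.append(url)              # record the outbound call
        return original(self, url, body)   # then let it proceed

    Session.send = governed_send

init()  # after this, existing call sites need no changes
resp = Session().send("https://api.example.test/v1/chat", b"{}")
```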
For each intercepted call, the wrapper:
- Inspects the URL to identify model-host traffic.
- Routes the call through the same evaluation pipeline used for the official SDKs.
- Audits the verdict.
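The three steps above can be sketched as follows. The host list, the evaluate function, and the verdict strings are assumptions for illustration — the real pipeline is the one shared with the official-SDK adapters:

```python
from urllib.parse import urlparse

MODEL_HOSTS = {"api.openai.com", "api.anthropic.com"}  # illustrative host list
audited = []

def evaluate(body: bytes) -> str:
    # Stand-in for the shared evaluation pipeline: allow unless flagged.
    return "block" if b"forbidden" in body else "allow"

def handle(url: str, body: bytes):
    # Step 1: inspect the URL to identify model-host traffic.
    if urlparse(url).hostname not in MODEL_HOSTS:
        return None  # not model traffic; pass through untouched
    # Step 2: route through the evaluation pipeline.
    verdict = evaluate(body)
    # Step 3: audit the verdict.
    audited.append((url, verdict))
    return verdict

v1 = handle("https://api.openai.com/v1/chat/completions", b'{"messages": []}')
v2 = handle("https://example.com/webhook", b"{}")
```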
When a call has already been gated by one of the official-SDK adapters in
the same call chain, the HTTP wrapper detects this and skips re-evaluation
to avoid double-counting.
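One common way to implement this kind of detection is a context-local flag that the outer adapter sets before descending into HTTP. The contextvars mechanism below is an assumption for the sketch; the SDK's actual detection may differ:

```python
import contextvars

already_gated = contextvars.ContextVar("already_gated", default=False)
evaluations = []

def evaluate(tag):
    # Stand-in for the shared evaluation pipeline.
    evaluations.append(tag)
    return "allow"

def http_send():
    # HTTP-fallback wrapper: skip if an adapter already gated this chain.
    if already_gated.get():
        return
    evaluate("http")

def sdk_adapter_call():
    # Official-SDK adapter: gate the call, then perform HTTP underneath.
    token = already_gated.set(True)
    try:
        evaluate("sdk")
        http_send()  # sees the flag and skips re-evaluation
    finally:
        already_gated.reset(token)

sdk_adapter_call()  # one evaluation, by the SDK adapter
http_send()         # raw HTTP path: one evaluation, by the fallback
```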
Notes and limitations
- The fallback only inspects bodies it can decode as JSON. Calls using other content types are recorded but not policy-checked.
- Streaming over HTTP (server-sent events / chunked transfer) is recorded as a single event when the connection closes.
- For best fidelity with provider-specific call shapes, prefer the official SDK integration when one is available.
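The first limitation above amounts to a content-type gate: only bodies that decode as JSON get policy-checked, everything else is recorded only. A sketch, where the record shape and field names are illustrative:

```python
import json

audit = []

def handle_body(content_type: str, body: bytes) -> dict:
    record = {"content_type": content_type, "policy_checked": False}
    if content_type == "application/json":
        try:
            payload = json.loads(body)
        except ValueError:
            payload = None
        if payload is not None:
            record["policy_checked"] = True  # only decodable JSON is checked
    audit.append(record)  # every call is recorded either way
    return record

handle_body("application/json", b'{"messages": []}')
handle_body("application/octet-stream", b"\x00\x01")
```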