commit 7cef56de15
23 changed files with 3136 additions and 0 deletions
.gitignore (vendored, new file, 5 lines)
@@ -0,0 +1,5 @@
.venv/
.pytest_cache/
.ruff_cache/
__pycache__/
/data/
.opencode/plans/1775725199925-glowing-orchid.md (new file, 163 lines)
@@ -0,0 +1,163 @@
# Plan: make local account state authoritative

## Problem

The current `accounts.json` state does not act as a real source of truth for `/token`:

- `active_account_id` is persisted but not actually used to drive selection
- `last_known_usage` is mostly treated as a sort hint, then immediately replaced by a live refresh
- `/token` live-refreshes every candidate it tries, so local limits are never truly trusted
- the persisted snapshot contains duplicate derived fields (`used_percent`, `remaining_percent`, `exhausted`) in addition to the canonical window and flag data
- as a result, the file is hard to reason about: some fields look authoritative but are only decorative or transient
## Desired behavior

1. `accounts.json` is the source of truth for `/token` selection.
2. If `active_account_id` exists and local state says the active account is usable, `/token` must try that account first.
3. Only if local state says the active account is blocked or unusable should `/token` fall back to choosing another account.
4. Fallback selection can keep the current ranking approach: most-used eligible account first.
5. `/usage` should continue to refresh all accounts live.
6. Persisted usage snapshots should store canonical data only, without duplicated derived fields.
## Recommended approach

### 1. Split `/token` selection into local selection and live validation

In `src/gibby/manager.py`:

- Add `_find_active_account(state)` to resolve `state.active_account_id` to an `AccountRecord`
- Add `_is_locally_blocked(account, current)` to decide, from local file state only, whether an account may be tried
- Add `_build_selection_order(state, current)` that:
  - returns the active account first if it exists and is not locally blocked
  - otherwise falls back to the remaining eligible accounts sorted by the current saved-usage ranking
  - never duplicates the active account in the fallback list

`_is_locally_blocked()` should use only persisted local state:

- blocked if `cooldown_until > now`
- blocked if `last_known_usage` exists and local usage indicates exhaustion
- otherwise not blocked

This gives exactly the behavior the user requested:

- the active account is the mandatory first choice when nothing local blocks it
- the local file decides whether the active account is allowed before any network call
- live refresh remains only a validation step for the chosen candidate
### 2. Keep live refresh only for accounts actually attempted

In `src/gibby/manager.py`:

- keep the current live refresh path (`refresh_account_usage`) once an account has been selected for an attempt
- if the active account passes local checks but fails live validation, persist the updated state and continue to the next candidate
- if the active account is locally blocked, skip live refresh for it during that `/token` call

`/usage` stays as-is and continues refreshing all accounts live.
### 3. Clean up the persisted usage snapshot schema

In `src/gibby/models.py` and `src/gibby/store.py`:

- stop persisting derived snapshot fields:
  - `used_percent`
  - `remaining_percent`
  - `exhausted`
- keep only canonical persisted snapshot data:
  - `checked_at`
  - `primary_window`
  - `secondary_window`
  - `limit_reached`
  - `allowed`

Implementation direction:

- keep `UsageSnapshot` as the in-memory model for now, but derive `used_percent`, `remaining_percent`, and `exhausted` from the canonical fields when loading/parsing
- update `store._snapshot_to_dict()` to write only canonical fields
- update `store._snapshot_from_dict()` to reconstruct the full in-memory `UsageSnapshot` from the canonical persisted fields

This keeps code churn smaller than a full model rewrite while making the file itself cleaner and more honest.
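A minimal sketch of that serialization split, using plain dicts in place of the real `UsageSnapshot` model. The field names come from this plan; the helper names mirror `_snapshot_to_dict`/`_snapshot_from_dict`, but the signatures and the exhaustion formula are illustrative assumptions.

```python
CANONICAL_FIELDS = (
    "checked_at",
    "primary_window",
    "secondary_window",
    "limit_reached",
    "allowed",
)


def snapshot_to_dict(snapshot: dict) -> dict:
    # Persist canonical fields only; derived values never hit disk.
    return {key: snapshot[key] for key in CANONICAL_FIELDS if key in snapshot}


def snapshot_from_dict(raw: dict, exhausted_threshold: int = 100) -> dict:
    # Rebuild the derived fields from canonical data on load.
    snap = dict(raw)
    primary = snap.get("primary_window") or {}
    used = int(primary.get("used_percent", 0))
    snap["used_percent"] = used
    snap["remaining_percent"] = max(0, 100 - used)
    snap["exhausted"] = (
        bool(snap.get("limit_reached"))
        or not snap.get("allowed", True)
        or used >= exhausted_threshold
    )
    return snap
```

A round trip through these two helpers drops the derived keys on write and recomputes them on read, so the file can never disagree with its own derived values.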
### 4. Keep cooldown as the persisted local block, but make local exhaustion matter too

Local selection should not depend on a fresh API round-trip.

For `/token`:

- `cooldown_until` remains the strongest persisted block
- if cooldown is clear but local `last_known_usage` still says exhausted, treat the account as locally blocked too
- only accounts that pass local checks are eligible to be attempted live

This changes current behavior in an important way:

- today, an account with an expired or missing cooldown can still be live-refreshed even if the local snapshot says exhausted
- after the change, local state truly gates the initial decision
### 5. Preserve the existing fallback ranking for non-active accounts

After the active account is rejected locally, keep the current fallback sort in `manager.py`:

- primary window used percent, descending
- secondary window used percent, descending (tie-breaker)

That avoids a larger policy change in this pass and isolates the refactor to "trust local state first".
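That two-level ordering can be expressed as a single sort key. The tuples below are illustrative stand-ins for real account records, not the actual `manager.py` data shapes:

```python
def fallback_sort_key(primary_used: int, secondary_used: int) -> tuple[int, int]:
    # Negate so ascending sorted() yields descending usage;
    # primary usage dominates, secondary breaks ties.
    return (-primary_used, -secondary_used)


# (id, primary_used_percent, secondary_used_percent) — illustrative values
candidates = [("a", 40, 90), ("b", 70, 10), ("c", 70, 30)]
ranked = sorted(candidates, key=lambda c: fallback_sort_key(c[1], c[2]))
# "b" and "c" tie on primary usage; "c" wins on higher secondary usage
```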
## Files to modify

- `/home/wzray/AI/gibby/src/gibby/manager.py`
  - respect `active_account_id`
  - add a local-only eligibility predicate
  - change selection order to active-first-when-locally-usable
- `/home/wzray/AI/gibby/src/gibby/models.py`
  - keep canonical usage derivation helpers centralized
  - support reconstructing derived values from canonical fields
- `/home/wzray/AI/gibby/src/gibby/store.py`
  - write the canonical snapshot shape only
  - read the canonical snapshot shape into the full in-memory model
- `/home/wzray/AI/gibby/src/gibby/account_ops.py`
  - keep the refresh path aligned with canonical snapshot handling
  - reuse a local exhaustion predicate if helpful instead of duplicating logic
- `/home/wzray/AI/gibby/tests/test_core.py`
  - add and update selection behavior tests
- `/home/wzray/AI/gibby/tests/test_account_ops.py`
  - update snapshot persistence assumptions if needed
- `/home/wzray/AI/gibby/tests/test_app.py`
  - adjust fixture shapes only if response expectations change
- `/home/wzray/AI/gibby/tests/test_refresh_limits.py`
  - ensure live refresh still rewrites canonical local state correctly
- `/home/wzray/AI/gibby/tests/test_oauth_helper.py`
  - ensure the oauth helper stores the canonical snapshot shape correctly
## Test plan

### Selection behavior

Add or update tests in `tests/test_core.py` for:

- active account is used first when locally allowed, even if another account has higher saved usage
- active account is skipped without live refresh when `cooldown_until` is still active
- active account is skipped without live refresh when the local snapshot says exhausted
- active account passes local checks but fails live refresh, then the fallback account is tried
- missing or stale `active_account_id` falls back cleanly to non-active selection logic
- fallback ordering still prefers higher saved primary usage and uses secondary as a tie-breaker

### Snapshot/file behavior

Add or update tests to verify:

- `accounts.json` no longer writes `used_percent`, `remaining_percent`, or `exhausted`
- loading canonical persisted snapshots still reconstructs the full in-memory `UsageSnapshot`
- `/usage`, `refresh_limits.py`, and `oauth_helper.py` still persist refreshed canonical state correctly
## Verification

- `uv run pytest -q tests/test_core.py tests/test_account_ops.py tests/test_app.py tests/test_refresh_limits.py tests/test_oauth_helper.py`
- inspect a real `accounts.json` after `/usage` or `refresh_limits.py` and confirm snapshot entries contain only canonical fields
- manually test `/token` with:
  - a locally usable active account
  - a locally blocked active account
  - a dangling `active_account_id`
- verify that the active account is not live-refreshed when local file state already blocks it
Dockerfile (new file, 22 lines)
@@ -0,0 +1,22 @@
FROM python:3.14-slim

ENV PYTHONUNBUFFERED=1 \
    PATH="/app/.venv/bin:$PATH" \
    DATA_DIR="/data"

WORKDIR /app

RUN pip install uv

COPY pyproject.toml uv.lock ./

RUN uv sync --frozen --no-dev

COPY src .

EXPOSE 80

HEALTHCHECK --start-period=10s --start-interval=1s \
    CMD ["python", "-c", "import urllib.request; urllib.request.urlopen('http://127.0.0.1:80/health', timeout=3)"]

CMD ["uvicorn", "gibby.app:app", "--host", "0.0.0.0", "--port", "80", "--no-access-log"]
README.md (new file, 40 lines)
@@ -0,0 +1,40 @@
# gibby

Minimal multi-account Codex token aggregator.

## Endpoints

- `GET /health`
- `GET /token`
- `GET /usage`

## OAuth helper

Add accounts with:

```bash
uv run python scripts/oauth_helper.py --mode local
uv run python scripts/oauth_helper.py --mode manual
```

By default the helper stores accounts in `data/accounts.json` and does not replace the current active account.

## Run

```bash
uv sync
uv run uvicorn gibby.app:app --host 0.0.0.0 --port 8080
```

## Environment

- `DATA_DIR` default: `data`
- `GIBBY_BIND_HOST` default: `0.0.0.0`
- `GIBBY_PORT` default: `8080`
- `GIBBY_CALLBACK_HOST` default: `127.0.0.1`
- `GIBBY_CALLBACK_PORT` default: `1455`
- `GIBBY_OAUTH_CLIENT_ID` default: `app_EMoamEEZ73f0CkXaXp7hrann`
- `GIBBY_ORIGINATOR` default: `opencode`

## State format

The service stores tokens and last known cooldown information in JSON. Accounts already known to be in cooldown are skipped without re-checking usage until their cooldown expires.
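For illustration only, a stored `accounts.json` might look roughly like the following. The field names are taken from this commit's models (`active_account_id`, `cooldown_until`, `last_known_usage`, usage windows), but the exact shape is an assumption:

```json
{
  "active_account_id": "user@example.com",
  "accounts": [
    {
      "id": "user@example.com",
      "access_token": "...",
      "refresh_token": "...",
      "expires_at": 1775725199,
      "cooldown_until": null,
      "last_known_usage": {
        "checked_at": 1775725199,
        "primary_window": {"used_percent": 42, "reset_at": 1775768399},
        "secondary_window": null,
        "limit_reached": false,
        "allowed": true
      }
    }
  ]
}
```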
pyproject.toml (new file, 16 lines)
@@ -0,0 +1,16 @@
[project]
name = "gibby"
version = "0.1.0"
requires-python = ">=3.14"
dependencies = [
    "fastapi>=0.116.1",
    "httpx>=0.28.1",
    "pygments>=2.20.0",
    "uvicorn>=0.35.0",
]

[dependency-groups]
dev = [
    "pytest>=8.4.1",
    "pytest-asyncio>=1.1.0",
]
scripts/oauth_helper.py (new file, 200 lines)
@@ -0,0 +1,200 @@
from __future__ import annotations

import argparse
import asyncio
import shutil
import subprocess
import sys
from pathlib import Path
from urllib.parse import parse_qs, urlparse

sys.path.insert(0, str(Path(__file__).resolve().parents[1] / "src"))

from gibby.account_ops import (
    PermanentAccountFailure,
    failed_identifier,
    refresh_account_usage,
    window_used_percent,
)
from gibby.client import OpenAIAPIError, OpenAIClient
from gibby.models import AccountRecord, format_reset_in
from gibby.oauth import (
    build_authorize_url,
    generate_pkce_pair,
    generate_state,
    make_account_id,
)
from gibby.settings import Settings
from gibby.store import JsonStateStore


def parse_redirect_url(url: str) -> tuple[str, str]:
    parsed = urlparse(url.strip())
    query = parse_qs(parsed.query)
    code = (query.get("code") or [None])[0]
    state = (query.get("state") or [None])[0]
    if not code or not state:
        raise ValueError("Redirect URL must contain code and state")
    return code, state


async def wait_for_callback(
    host: str, port: int, expected_state: str, timeout: float = 300.0
) -> tuple[str, str]:
    loop = asyncio.get_running_loop()
    result: asyncio.Future[tuple[str, str]] = loop.create_future()

    async def handle_client(
        reader: asyncio.StreamReader, writer: asyncio.StreamWriter
    ) -> None:
        try:
            request_line = await reader.readline()
            parts = request_line.decode("utf-8", errors="replace").split()
            if len(parts) < 2:
                raise ValueError("Invalid HTTP request line")
            code, state = parse_redirect_url(parts[1])
            if not result.done():
                result.set_result((code, state))

            while True:
                line = await reader.readline()
                if not line or line == b"\r\n":
                    break

            writer.write(
                b"HTTP/1.1 200 OK\r\n"
                b"Content-Type: text/plain; charset=utf-8\r\n"
                b"Connection: close\r\n\r\n"
                b"Authorization received. You can close this tab."
            )
            await writer.drain()
        finally:
            writer.close()
            await writer.wait_closed()

    server = await asyncio.start_server(handle_client, host, port)
    try:
        code, state = await asyncio.wait_for(result, timeout=timeout)
    finally:
        server.close()
        await server.wait_closed()

    if state != expected_state:
        raise ValueError("OAuth state mismatch")
    return code, state


async def exchange_and_store_account(
    store: JsonStateStore,
    client: OpenAIClient,
    code: str,
    verifier: str,
    set_active: bool,
) -> AccountRecord:
    access_token, refresh_token, expires_at = await client.exchange_code(code, verifier)
    account = AccountRecord(
        id=make_account_id(),
        access_token=access_token,
        refresh_token=refresh_token,
        expires_at=expires_at,
    )
    try:
        usage = await refresh_account_usage(
            account,
            client,
            client.settings.exhausted_usage_threshold,
        )
    except PermanentAccountFailure:
        store.append_failed_identifier(failed_identifier(account))
        raise
    except OpenAIAPIError as exc:
        account.last_error = str(exc)
        store.upsert_account(account, set_active=set_active)
        print("Usage fetch failed, stored account without usage snapshot.")
        return account

    store.upsert_account(account, set_active=set_active)
    print(
        f"token ready for {account.id}, "
        f"primary {window_used_percent(usage.primary_window)}% "
        f"reset in {format_reset_in(usage.primary_window.reset_at if usage.primary_window else None)}, "
        f"secondary {window_used_percent(usage.secondary_window)}% "
        f"reset in {format_reset_in(usage.secondary_window.reset_at if usage.secondary_window else None)}"
    )
    return account


async def run(
    mode: str,
    set_active: bool,
    data_dir: Path | None = None,
) -> None:
    settings = Settings()
    if data_dir is not None:
        settings.data_dir = data_dir
    verifier, challenge = generate_pkce_pair()
    state = generate_state()
    url = build_authorize_url(settings, challenge, state)

    print("Open this URL and complete authorization:\n")
    print(url)
    print()

    opener = shutil.which("xdg-open")
    if opener:
        print("Opening browser...")
        print()
        try:
            subprocess.Popen([opener, url])
        except OSError:
            print("Open the URL manually.")
            print()
    else:
        print("Open the URL manually.")
        print()

    if mode == "local":
        code, _ = await wait_for_callback(
            settings.callback_host,
            settings.callback_port,
            state,
        )
    else:
        redirect_url = input("Paste the final redirect URL: ").strip()
        code, returned_state = parse_redirect_url(redirect_url)
        if returned_state != state:
            raise ValueError("OAuth state mismatch")

    store = JsonStateStore(settings.accounts_file, settings.failed_file)
    client = OpenAIClient(settings)
    try:
        account = await exchange_and_store_account(
            store, client, code, verifier, set_active
        )
    finally:
        await client.aclose()

    print(f"Stored account: {account.id}")
    print(f"Access token expires at: {account.expires_at}")


def main() -> None:
    parser = argparse.ArgumentParser()
    parser.add_argument("-m", "--mode", choices=["local", "manual"], default="local")
    parser.add_argument("-a", "--set-active", action="store_true")
    parser.add_argument("-d", "--data-dir", type=Path)
    args = parser.parse_args()
    try:
        asyncio.run(run(args.mode, args.set_active, args.data_dir))
    except KeyboardInterrupt:
        print("\nCancelled.")
    except TimeoutError:
        print("Timed out waiting for OAuth callback.")
    except ValueError as exc:
        print(str(exc))
    except PermanentAccountFailure as exc:
        print(str(exc))


if __name__ == "__main__":
    main()
scripts/refresh_limits.py (new file, 75 lines)
@@ -0,0 +1,75 @@
from __future__ import annotations

import argparse
import asyncio
import sys
from pathlib import Path

sys.path.insert(0, str(Path(__file__).resolve().parents[1] / "src"))

from gibby.account_ops import (
    PermanentAccountFailure,
    handle_failed_account,
    refresh_account_usage,
    window_used_percent,
)
from gibby.client import OpenAIClient
from gibby.models import format_reset_in
from gibby.settings import Settings
from gibby.store import JsonStateStore


async def run(data_dir: Path | None = None) -> None:
    settings = Settings()
    if data_dir is not None:
        settings.data_dir = data_dir

    store = JsonStateStore(settings.accounts_file, settings.failed_file)
    state = store.load()
    if not state.accounts:
        print("No accounts configured.")
        return

    client = OpenAIClient(settings)
    try:
        for account in list(state.accounts):
            previous_id = account.id
            try:
                usage = await refresh_account_usage(
                    account,
                    client,
                    settings.exhausted_usage_threshold,
                )
                store.update_active_account_id(state, previous_id, account.id)
                print(
                    f"token ready for {account.id}, "
                    f"primary {window_used_percent(usage.primary_window)}% "
                    f"reset in {format_reset_in(usage.primary_window.reset_at if usage.primary_window else None)}, "
                    f"secondary {window_used_percent(usage.secondary_window)}% "
                    f"reset in {format_reset_in(usage.secondary_window.reset_at if usage.secondary_window else None)}"
                )
            except PermanentAccountFailure as exc:
                account.last_error = str(exc)
                handle_failed_account(store, account)
                store.remove_account(state, account.id)
                print(f"{account.id}: removed={exc}")
            except Exception as exc:
                account.last_error = str(exc)
                print(f"{account.id}: error={exc}")
        store.save(state)
    finally:
        await client.aclose()


def main() -> None:
    parser = argparse.ArgumentParser()
    parser.add_argument("-d", "--data-dir", type=Path)
    args = parser.parse_args()
    try:
        asyncio.run(run(args.data_dir))
    except KeyboardInterrupt:
        print("\nCancelled.")


if __name__ == "__main__":
    main()
src/gibby/__init__.py (new file, 0 lines)
src/gibby/account_ops.py (new file, 99 lines)
@@ -0,0 +1,99 @@
from __future__ import annotations

from gibby.client import OpenAIAPIError, OpenAIClient
from gibby.models import (
    AccountRecord,
    UsageSnapshot,
    UsageWindow,
    compute_cooldown_until,
    now_ts,
    parse_usage_payload,
)
from gibby.store import JsonStateStore


class PermanentAccountFailure(RuntimeError):
    def __init__(self, account: AccountRecord, reason: str):
        super().__init__(reason)
        self.account = account


def failed_identifier(account: AccountRecord) -> str:
    return account.email or account.account_id or account.id


def sync_account_identity(account: AccountRecord) -> None:
    if account.email:
        account.id = account.email


def handle_failed_account(store: JsonStateStore, account: AccountRecord) -> None:
    store.append_failed_identifier(failed_identifier(account))


def window_used_percent(window: UsageWindow | None) -> int:
    if window is None:
        return 0
    return window.used_percent


def snapshot_is_exhausted(
    snapshot: UsageSnapshot, exhausted_threshold: int = 100
) -> bool:
    if snapshot.exhausted:
        return True
    if snapshot.limit_reached or not snapshot.allowed:
        return True
    pri = snapshot.primary_window
    sec = snapshot.secondary_window
    if pri is not None and pri.used_percent >= exhausted_threshold:
        return True
    if sec is not None and sec.used_percent >= exhausted_threshold:
        return True
    return False


async def ensure_fresh_access_token(
    account: AccountRecord, client: OpenAIClient
) -> None:
    if account.expires_at > now_ts() + 30:
        return
    if not account.refresh_token:
        raise RuntimeError("Missing refresh token")
    try:
        access_token, refresh_token, expires_at = await client.refresh_access_token(
            account.refresh_token
        )
    except OpenAIAPIError as exc:
        if exc.permanent:
            raise PermanentAccountFailure(account, str(exc)) from exc
        raise
    account.access_token = access_token
    account.refresh_token = refresh_token
    account.expires_at = expires_at


async def refresh_account_usage(
    account: AccountRecord,
    client: OpenAIClient,
    exhausted_threshold: int = 100,
) -> UsageSnapshot:
    await ensure_fresh_access_token(account, client)
    try:
        payload = await client.fetch_usage_payload(account.access_token)
    except OpenAIAPIError as exc:
        if exc.permanent:
            raise PermanentAccountFailure(account, str(exc)) from exc
        raise
    account.email = payload.get("email") or account.email
    account.account_id = payload.get("account_id") or account.account_id
    sync_account_identity(account)
    usage = parse_usage_payload(payload)
    account.last_known_usage = usage
    account.last_error = None
    account.cooldown_until = (
        compute_cooldown_until(usage, exhausted_threshold)
        if snapshot_is_exhausted(usage, exhausted_threshold)
        else None
    )
    return usage
src/gibby/app.py (new file, 59 lines)
@@ -0,0 +1,59 @@
from __future__ import annotations

import logging
from contextlib import asynccontextmanager
from typing import Any

from fastapi import FastAPI
from fastapi.responses import JSONResponse, PlainTextResponse

from gibby.client import OpenAIClient
from gibby.manager import AccountManager, NoUsableAccountError
from gibby.settings import Settings
from gibby.store import JsonStateStore

logging.basicConfig(level=logging.INFO, format="%(levelname)s %(name)s: %(message)s")
logging.getLogger("httpx").setLevel(logging.WARNING)
logging.getLogger("httpcore").setLevel(logging.WARNING)


def create_app(manager: AccountManager | None = None) -> FastAPI:
    @asynccontextmanager
    async def lifespan(app: FastAPI):
        owned_client: OpenAIClient | None = None
        if manager is None:
            settings = Settings()
            store = JsonStateStore(settings.accounts_file, settings.failed_file)
            owned_client = OpenAIClient(settings)
            app.state.manager = AccountManager(store, owned_client, settings)
        else:
            app.state.manager = manager
        try:
            yield
        finally:
            if owned_client is not None:
                await owned_client.aclose()

    app = FastAPI(lifespan=lifespan)
    if manager is not None:
        app.state.manager = manager

    @app.get("/health", response_class=PlainTextResponse)
    async def health() -> str:
        return "ok"

    @app.get("/token", response_model=None)
    async def token() -> Any:
        try:
            return await app.state.manager.issue_token_response()
        except NoUsableAccountError as exc:
            return JSONResponse(status_code=503, content={"error": str(exc)})

    @app.get("/usage", response_model=None)
    async def usage() -> Any:
        return await app.state.manager.get_usage_report()

    return app


app = create_app()
108
src/gibby/client.py
Normal file
108
src/gibby/client.py
Normal file
|
|
@ -0,0 +1,108 @@
|
||||||
|
from __future__ import annotations
|
||||||
|
|
||||||
|
import json
|
||||||
|
|
||||||
|
import httpx
|
||||||
|
|
||||||
|
from gibby.models import UsageSnapshot, now_ts, parse_usage_payload
|
||||||
|
from gibby.oauth import TOKEN_URL, USAGE_URL
|
||||||
|
from gibby.settings import Settings
|
||||||
|
|
||||||
|
|
||||||
|
class OpenAIAPIError(RuntimeError):
|
||||||
|
def __init__(
|
||||||
|
self, message: str, *, permanent: bool, status_code: int | None = None
|
||||||
|
):
|
||||||
|
super().__init__(message)
|
||||||
|
self.permanent = permanent
|
||||||
|
self.status_code = status_code
|
||||||
|
|
||||||
|
|
||||||
|
def _extract_error_text(response: httpx.Response) -> str:
|
||||||
|
try:
|
||||||
|
payload = response.json()
|
||||||
|
if isinstance(payload, dict):
|
||||||
|
for key in ("error_description", "error", "message", "detail"):
|
||||||
|
value = payload.get(key)
|
||||||
|
if isinstance(value, str) and value:
|
||||||
|
return value
|
||||||
|
return json.dumps(payload, ensure_ascii=True)
|
||||||
|
except Exception:
|
        pass
    return response.text[:500]


def _is_permanent_auth_error(response: httpx.Response, body: str) -> bool:
    lowered = body.lower()
    return response.status_code in {401, 403} or "invalid_grant" in lowered


def _raise_for_openai_error(response: httpx.Response) -> None:
    if response.is_success:
        return
    body = _extract_error_text(response)
    raise OpenAIAPIError(
        f"OpenAI request failed: status={response.status_code} body={body}",
        permanent=_is_permanent_auth_error(response, body),
        status_code=response.status_code,
    )


class OpenAIClient:
    def __init__(
        self, settings: Settings, http_client: httpx.AsyncClient | None = None
    ):
        self.settings = settings
        self.http_client = http_client or httpx.AsyncClient(
            timeout=settings.request_timeout
        )
        self._owns_client = http_client is None

    async def aclose(self) -> None:
        if self._owns_client:
            await self.http_client.aclose()

    async def exchange_code(self, code: str, verifier: str) -> tuple[str, str, int]:
        payload = {
            "grant_type": "authorization_code",
            "client_id": self.settings.oauth_client_id,
            "code": code,
            "code_verifier": verifier,
            "redirect_uri": self.settings.redirect_uri,
        }
        response = await self.http_client.post(TOKEN_URL, data=payload)
        _raise_for_openai_error(response)
        body = response.json()
        expires_at = now_ts() + int(body["expires_in"])
        return body["access_token"], body["refresh_token"], expires_at

    async def refresh_access_token(self, refresh_token: str) -> tuple[str, str, int]:
        payload = {
            "grant_type": "refresh_token",
            "client_id": self.settings.oauth_client_id,
            "refresh_token": refresh_token,
        }
        response = await self.http_client.post(TOKEN_URL, data=payload)
        _raise_for_openai_error(response)
        body = response.json()
        expires_at = now_ts() + int(body["expires_in"])
        next_refresh = str(body.get("refresh_token") or refresh_token)
        return body["access_token"], next_refresh, expires_at

    async def fetch_usage_payload(self, access_token: str) -> dict:
        headers = {
            "Authorization": f"Bearer {access_token}",
            "User-Agent": "CodexProxy",
            "Accept": "application/json",
        }
        response = await self.http_client.get(USAGE_URL, headers=headers)
        _raise_for_openai_error(response)
        payload = response.json()
        if not isinstance(payload, dict):
            raise OpenAIAPIError(
                "OpenAI usage response is not a JSON object", permanent=False
            )
        return payload

    async def fetch_usage(self, access_token: str) -> UsageSnapshot:
        return parse_usage_payload(await self.fetch_usage_payload(access_token))
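The permanence check above only depends on the status code and the error body, so its behavior can be sketched without httpx. A minimal stand-alone version for illustration (`FakeResponse` and `is_permanent_auth_error` are hypothetical stand-ins, not part of the codebase):

```python
from dataclasses import dataclass


@dataclass
class FakeResponse:
    status_code: int


def is_permanent_auth_error(response: FakeResponse, body: str) -> bool:
    # Mirrors _is_permanent_auth_error: a 401/403 status, or an OAuth
    # "invalid_grant" anywhere in the error body, means the credentials
    # are unrecoverable and the account should be dropped.
    lowered = body.lower()
    return response.status_code in {401, 403} or "invalid_grant" in lowered


print(is_permanent_auth_error(FakeResponse(401), ""))                # True
print(is_permanent_auth_error(FakeResponse(400), "Invalid_Grant"))   # True
print(is_permanent_auth_error(FakeResponse(429), "rate limited"))    # False
```

Anything else (timeouts, 429s, 5xx) is treated as transient and handled by the retry/cooldown path instead.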
271
src/gibby/manager.py
Normal file
@@ -0,0 +1,271 @@
from __future__ import annotations

import asyncio
import logging
from dataclasses import asdict
from typing import Any

from gibby.account_ops import (
    PermanentAccountFailure,
    failed_identifier,
    handle_failed_account,
    refresh_account_usage,
    snapshot_is_exhausted,
    window_used_percent,
)
from gibby.client import OpenAIClient
from gibby.models import (
    AccountRecord,
    StateFile,
    build_limit,
    format_reset_in,
    now_ts,
)
from gibby.settings import Settings
from gibby.store import JsonStateStore

logger = logging.getLogger(__name__)


class NoUsableAccountError(RuntimeError):
    pass


class AccountManager:
    def __init__(self, store: JsonStateStore, client: OpenAIClient, settings: Settings):
        self.store = store
        self.client = client
        self.settings = settings
        self._lock = asyncio.Lock()

    async def issue_token_response(self) -> dict[str, Any]:
        async with self._lock:
            state = self.store.load()
            if not state.accounts:
                logger.error("No accounts configured")
                raise NoUsableAccountError("No accounts configured")

            current = now_ts()
            for account in self._build_selection_order(state, current):
                payload = await self._try_issue_token(state, account)
                if payload is not None:
                    return payload

            self.store.save(state)
            logger.error(
                "No usable account available: %s",
                [
                    {
                        "id": account.id,
                        "cooldown_until": account.cooldown_until,
                        "last_error": account.last_error,
                    }
                    for account in state.accounts
                ],
            )
            raise NoUsableAccountError("No usable account available")

    async def _try_issue_token(
        self, state: StateFile, account: AccountRecord
    ) -> dict[str, Any] | None:
        previous_id = account.id
        try:
            usage = await refresh_account_usage(
                account,
                self.client,
                self.settings.exhausted_usage_threshold,
            )
        except PermanentAccountFailure as exc:
            account.last_error = str(exc)
            handle_failed_account(self.store, account)
            self.store.remove_account(state, account.id)
            self.store.save(state)
            logger.error(
                "Removed failed account %s (%s): %s",
                account.id,
                failed_identifier(account),
                exc,
            )
            return None
        except Exception as exc:
            account.last_error = str(exc)
            logger.exception(
                "Account %s failed during refresh or usage check",
                account.id,
            )
            self.store.update_active_account_id(state, previous_id, account.id)
            self.store.save(state)
            return None

        self.store.update_active_account_id(state, previous_id, account.id)
        account.last_known_usage = usage
        account.last_error = None
        if snapshot_is_exhausted(usage, self.settings.exhausted_usage_threshold):
            logger.warning(
                "Account %s exhausted: primary=%s%% secondary=%s%% cooldown_until=%s",
                account.id,
                window_used_percent(usage.primary_window),
                window_used_percent(usage.secondary_window),
                account.cooldown_until,
            )
            self.store.save(state)
            return None

        account.cooldown_until = None
        state.active_account_id = account.id
        self.store.save(state)
        logger.info(
            "token issued for %s, primary %s%% reset in %s, secondary %s%% reset in %s",
            account.id,
            window_used_percent(usage.primary_window),
            format_reset_in(
                usage.primary_window.reset_at if usage.primary_window else None
            ),
            window_used_percent(usage.secondary_window),
            format_reset_in(
                usage.secondary_window.reset_at if usage.secondary_window else None
            ),
        )
        return {
            "token": account.access_token,
            "limit": build_limit(usage),
            "usage": {
                "primary_window": asdict(usage.primary_window)
                if usage.primary_window
                else None,
                "secondary_window": asdict(usage.secondary_window)
                if usage.secondary_window
                else None,
            },
        }

    async def get_usage_report(self) -> dict[str, Any]:
        async with self._lock:
            state = self.store.load()
            accounts_report: list[dict[str, Any]] = []

            for account in list(state.accounts):
                previous_id = account.id
                try:
                    usage = await refresh_account_usage(
                        account,
                        self.client,
                    )
                except PermanentAccountFailure as exc:
                    account.last_error = str(exc)
                    identifier = failed_identifier(account)
                    handle_failed_account(self.store, account)
                    self.store.remove_account(state, account.id)
                    logger.error(
                        "Removed failed account %s (%s): %s",
                        account.id,
                        identifier,
                        exc,
                    )
                    accounts_report.append(
                        {
                            "id": account.id,
                            "email": account.email,
                            "status": "removed",
                            "error": str(exc),
                        }
                    )
                    continue
                except Exception as exc:
                    account.last_error = str(exc)
                    logger.exception(
                        "Account %s failed during usage refresh", account.id
                    )
                    accounts_report.append(
                        {
                            "id": account.id,
                            "email": account.email,
                            "status": "error",
                            "error": str(exc),
                            "cooldown_until": account.cooldown_until,
                        }
                    )
                    continue

                self.store.update_active_account_id(state, previous_id, account.id)
                status = (
                    "depleted"
                    if snapshot_is_exhausted(
                        usage, self.settings.exhausted_usage_threshold
                    )
                    else "ok"
                )
                accounts_report.append(
                    {
                        "id": account.id,
                        "email": account.email,
                        "status": status,
                        "used_percent": usage.used_percent,
                        "remaining_percent": usage.remaining_percent,
                        "cooldown_until": account.cooldown_until,
                        "primary_window": asdict(usage.primary_window)
                        if usage.primary_window
                        else None,
                        "secondary_window": asdict(usage.secondary_window)
                        if usage.secondary_window
                        else None,
                    }
                )

            self.store.save(state)
            return {
                "accounts": accounts_report,
                "active_account_id": state.active_account_id,
                "count": len(accounts_report),
            }

    def _find_active_account(self, state: StateFile) -> AccountRecord | None:
        if not state.active_account_id:
            return None
        for account in state.accounts:
            if account.id == state.active_account_id:
                return account
        return None

    def _is_locally_blocked(self, account: AccountRecord, current: int) -> bool:
        if account.cooldown_until and account.cooldown_until > current:
            return True
        if account.last_known_usage and snapshot_is_exhausted(
            account.last_known_usage, self.settings.exhausted_usage_threshold
        ):
            return True
        return False

    def _build_selection_order(
        self, state: StateFile, current: int
    ) -> list[AccountRecord]:
        active = self._find_active_account(state)
        excluded_ids = {active.id} if active is not None else set()
        fallback = self._iter_candidates(state, current, excluded_ids)
        if active is None:
            return fallback
        if self._is_locally_blocked(active, current):
            logger.info("active account %s is blocked by local state", active.id)
            return fallback
        return [active, *fallback]

    def _iter_candidates(
        self, state: StateFile, current: int, excluded_ids: set[str] | None = None
    ) -> list[AccountRecord]:
        excluded_ids = excluded_ids or set()
        available = [
            account
            for account in state.accounts
            if account.id not in excluded_ids
            and not self._is_locally_blocked(account, current)
        ]
        return sorted(available, key=self._candidate_sort_key, reverse=True)

    def _candidate_sort_key(self, account: AccountRecord) -> tuple[int, int]:
        snapshot = account.last_known_usage
        if snapshot is None:
            return (0, 0)
        return (
            window_used_percent(snapshot.primary_window),
            window_used_percent(snapshot.secondary_window),
        )
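The selection policy in `AccountManager` can be reduced to a few lines of pure logic: active account first when local state says it is usable, otherwise the remaining unblocked accounts ranked most-used first. A self-contained sketch of that ordering (`Candidate` and `selection_order` are illustrative stand-ins for `AccountRecord` and `_build_selection_order`):

```python
from dataclasses import dataclass


@dataclass
class Candidate:
    id: str
    primary_used: int
    secondary_used: int
    blocked: bool = False


def selection_order(candidates: list[Candidate], active_id: str) -> list[Candidate]:
    # Active account leads when usable; the fallback list is every other
    # unblocked account, sorted by (primary, secondary) usage descending.
    active = next((c for c in candidates if c.id == active_id), None)
    fallback = sorted(
        (c for c in candidates if c is not active and not c.blocked),
        key=lambda c: (c.primary_used, c.secondary_used),
        reverse=True,
    )
    if active is None or active.blocked:
        return fallback
    return [active, *fallback]


accounts = [
    Candidate("a", 10, 5),
    Candidate("b", 80, 40),
    Candidate("c", 50, 90, blocked=True),
]
print([c.id for c in selection_order(accounts, "a")])  # ['a', 'b']
```

Note that a blocked active account is excluded entirely rather than demoted, matching the plan's rule that `/token` only falls back when local state says the active account is unusable.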
211
src/gibby/models.py
Normal file
@@ -0,0 +1,211 @@
from __future__ import annotations

import time
from dataclasses import dataclass, field
from typing import Any

UNKNOWN_EXHAUSTED_BACKOFF_SECONDS = 300


@dataclass(slots=True)
class UsageWindow:
    used_percent: int
    limit_window_seconds: int
    reset_after_seconds: int
    reset_at: int


@dataclass(slots=True)
class UsageSnapshot:
    checked_at: int
    used_percent: int
    remaining_percent: int
    exhausted: bool
    primary_window: UsageWindow | None
    secondary_window: UsageWindow | None
    limit_reached: bool
    allowed: bool


@dataclass(slots=True)
class AccountRecord:
    id: str
    email: str | None = None
    account_id: str | None = None
    access_token: str = ""
    refresh_token: str = ""
    expires_at: int = 0
    cooldown_until: int | None = None
    last_known_usage: UsageSnapshot | None = None
    last_error: str | None = None


@dataclass(slots=True)
class StateFile:
    version: int = 1
    active_account_id: str | None = None
    accounts: list[AccountRecord] = field(default_factory=list)


def now_ts() -> int:
    return int(time.time())


def format_reset_in(reset_at: int | None, current: int | None = None) -> str:
    if not reset_at:
        return "n/a"
    remaining = reset_at - (current if current is not None else now_ts())
    if remaining <= 0:
        return "now"

    days, remainder = divmod(remaining, 86400)
    hours, remainder = divmod(remainder, 3600)
    minutes, seconds = divmod(remainder, 60)
    parts: list[str] = []
    if days:
        parts.append(f"{days}d")
    if hours:
        parts.append(f"{hours}h")
    if minutes:
        parts.append(f"{minutes}m")
    if seconds or not parts:
        parts.append(f"{seconds}s")
    return " ".join(parts)


def clamp_percent(value: Any) -> int:
    try:
        num = float(value)
    except (TypeError, ValueError):
        return 0
    if num < 0:
        return 0
    if num > 100:
        return 100
    return int(round(num))


def parse_usage_window(window: dict[str, Any] | None) -> UsageWindow | None:
    if not isinstance(window, dict):
        return None
    return UsageWindow(
        used_percent=clamp_percent(window.get("used_percent") or 0),
        limit_window_seconds=int(window.get("limit_window_seconds") or 0),
        reset_after_seconds=int(window.get("reset_after_seconds") or 0),
        reset_at=int(window.get("reset_at") or 0),
    )


def _effective_used_percent(
    primary: UsageWindow | None, secondary: UsageWindow | None
) -> int:
    return max(
        (window.used_percent for window in (primary, secondary) if window), default=0
    )


def _snapshot_is_exhausted(
    primary: UsageWindow | None,
    secondary: UsageWindow | None,
    limit_reached: bool,
    allowed: bool,
) -> bool:
    return (
        _effective_used_percent(primary, secondary) >= 100
        or limit_reached
        or not allowed
    )


def build_usage_snapshot(
    *,
    checked_at: int,
    primary_window: UsageWindow | None,
    secondary_window: UsageWindow | None,
    limit_reached: bool,
    allowed: bool,
) -> UsageSnapshot:
    used_percent = _effective_used_percent(primary_window, secondary_window)
    return UsageSnapshot(
        checked_at=checked_at,
        used_percent=used_percent,
        remaining_percent=max(0, 100 - used_percent),
        exhausted=_snapshot_is_exhausted(
            primary_window, secondary_window, limit_reached, allowed
        ),
        primary_window=primary_window,
        secondary_window=secondary_window,
        limit_reached=limit_reached,
        allowed=allowed,
    )


def parse_usage_payload(
    payload: dict[str, Any], checked_at: int | None = None
) -> UsageSnapshot:
    checked_at = checked_at or now_ts()
    rate_limit = payload.get("rate_limit") or {}
    primary = parse_usage_window(rate_limit.get("primary_window"))
    secondary = parse_usage_window(rate_limit.get("secondary_window"))
    limit_reached = bool(rate_limit.get("limit_reached"))
    allowed = bool(rate_limit.get("allowed", True))
    return build_usage_snapshot(
        checked_at=checked_at,
        primary_window=primary,
        secondary_window=secondary,
        limit_reached=limit_reached,
        allowed=allowed,
    )


def build_limit(snapshot: UsageSnapshot) -> dict[str, int | bool]:
    return {
        "used_percent": snapshot.used_percent,
        "remaining_percent": snapshot.remaining_percent,
        "exhausted": snapshot.exhausted,
        "needs_prepare": False,
    }


def _window_reset_deadline(window: UsageWindow, checked_at: int) -> int | None:
    if window.reset_at > checked_at:
        return window.reset_at
    if window.reset_after_seconds > 0:
        return checked_at + window.reset_after_seconds
    if window.limit_window_seconds > 0:
        return checked_at + window.limit_window_seconds
    return None


def compute_cooldown_until(
    snapshot: UsageSnapshot, exhausted_threshold: int = 100
) -> int | None:
    effective_used = _effective_used_percent(
        snapshot.primary_window, snapshot.secondary_window
    )
    if (
        effective_used < exhausted_threshold
        and not snapshot.limit_reached
        and snapshot.allowed
    ):
        return None
    blocking_windows = [
        window
        for window in (snapshot.primary_window, snapshot.secondary_window)
        if window and window.used_percent >= exhausted_threshold
    ]
    if not blocking_windows and (snapshot.limit_reached or not snapshot.allowed):
        blocking_windows = [
            window
            for window in (snapshot.primary_window, snapshot.secondary_window)
            if window
        ]

    deadlines = [
        deadline
        for window in blocking_windows
        if (deadline := _window_reset_deadline(window, snapshot.checked_at)) is not None
    ]
    if deadlines:
        return max(deadlines)
    return snapshot.checked_at + UNKNOWN_EXHAUSTED_BACKOFF_SECONDS
43
src/gibby/oauth.py
Normal file
@@ -0,0 +1,43 @@
from __future__ import annotations

import base64
import hashlib
import secrets
from urllib.parse import urlencode

from gibby.settings import Settings

AUTHORIZE_URL = "https://auth.openai.com/oauth/authorize"
TOKEN_URL = "https://auth.openai.com/oauth/token"
USAGE_URL = "https://chatgpt.com/backend-api/wham/usage"


def generate_pkce_pair() -> tuple[str, str]:
    verifier = secrets.token_urlsafe(64)
    digest = hashlib.sha256(verifier.encode("utf-8")).digest()
    challenge = base64.urlsafe_b64encode(digest).decode("utf-8").rstrip("=")
    return verifier, challenge


def generate_state() -> str:
    return secrets.token_urlsafe(32)


def build_authorize_url(settings: Settings, challenge: str, state: str) -> str:
    params = {
        "response_type": "code",
        "client_id": settings.oauth_client_id,
        "redirect_uri": settings.redirect_uri,
        "scope": settings.oauth_scope,
        "code_challenge": challenge,
        "code_challenge_method": "S256",
        "id_token_add_organizations": "true",
        "codex_cli_simplified_flow": "true",
        "state": state,
        "originator": settings.originator,
    }
    return f"{AUTHORIZE_URL}?{urlencode(params)}"


def make_account_id() -> str:
    return f"acct_{secrets.token_hex(6)}"
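The PKCE pair above follows the S256 method: the challenge is the unpadded urlsafe base64 encoding of the SHA-256 digest of the verifier, which can be verified round-trip with stdlib only. A minimal sketch duplicating `generate_pkce_pair` for demonstration:

```python
import base64
import hashlib
import secrets


def generate_pkce_pair() -> tuple[str, str]:
    # Same construction as gibby.oauth.generate_pkce_pair: random verifier,
    # S256 challenge = urlsafe-b64(sha256(verifier)) with '=' padding removed.
    verifier = secrets.token_urlsafe(64)
    digest = hashlib.sha256(verifier.encode("utf-8")).digest()
    challenge = base64.urlsafe_b64encode(digest).decode("utf-8").rstrip("=")
    return verifier, challenge


verifier, challenge = generate_pkce_pair()

# The authorization server recomputes this from the verifier at token
# exchange time and compares it to the stored challenge.
recomputed = (
    base64.urlsafe_b64encode(hashlib.sha256(verifier.encode("utf-8")).digest())
    .decode("utf-8")
    .rstrip("=")
)
print(recomputed == challenge)  # True
```

A SHA-256 digest is 32 bytes, so the stripped challenge is always 43 base64 characters.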
36
src/gibby/settings.py
Normal file
@@ -0,0 +1,36 @@
from __future__ import annotations

import os
from dataclasses import dataclass
from pathlib import Path

DEFAULT_CLIENT_ID = "app_EMoamEEZ73f0CkXaXp7hrann"
DEFAULT_SCOPE = "openid profile email offline_access"


@dataclass(slots=True)
class Settings:
    data_dir: Path = Path(os.getenv("DATA_DIR", "data"))
    bind_host: str = os.getenv("GIBBY_BIND_HOST", "0.0.0.0")
    port: int = int(os.getenv("GIBBY_PORT", "8080"))
    exhausted_usage_threshold: int = int(
        os.getenv("GIBBY_EXHAUSTED_USAGE_THRESHOLD", "95")
    )
    callback_host: str = os.getenv("GIBBY_CALLBACK_HOST", "localhost")
    callback_port: int = int(os.getenv("GIBBY_CALLBACK_PORT", "1455"))
    oauth_client_id: str = os.getenv("GIBBY_OAUTH_CLIENT_ID", DEFAULT_CLIENT_ID)
    oauth_scope: str = os.getenv("GIBBY_OAUTH_SCOPE", DEFAULT_SCOPE)
    originator: str = os.getenv("GIBBY_ORIGINATOR", "opencode")
    request_timeout: float = float(os.getenv("GIBBY_REQUEST_TIMEOUT", "40"))

    @property
    def accounts_file(self) -> Path:
        return self.data_dir / "accounts.json"

    @property
    def failed_file(self) -> Path:
        return self.data_dir / "failed.txt"

    @property
    def redirect_uri(self) -> str:
        return f"http://{self.callback_host}:{self.callback_port}/auth/callback"
171
src/gibby/store.py
Normal file
@@ -0,0 +1,171 @@
from __future__ import annotations

import json
import os
from dataclasses import asdict
from pathlib import Path
from tempfile import NamedTemporaryFile
from typing import Any

from gibby.models import (
    AccountRecord,
    StateFile,
    UsageSnapshot,
    UsageWindow,
    build_usage_snapshot,
    parse_usage_window,
)


class JsonStateStore:
    def __init__(self, path: Path, failed_path: Path | None = None):
        self.path = path
        self.failed_path = failed_path or path.with_name("failed.txt")

    def load(self) -> StateFile:
        try:
            if not self.path.exists():
                return StateFile()
            raw = json.loads(self.path.read_text())
            if not isinstance(raw, dict):
                raise RuntimeError("accounts.json must contain a JSON object")
            accounts = [
                self._account_from_dict(item) for item in raw.get("accounts", [])
            ]
            return StateFile(
                version=int(raw.get("version", 1)),
                active_account_id=raw.get("active_account_id"),
                accounts=accounts,
            )
        except (OSError, json.JSONDecodeError, TypeError, ValueError, KeyError) as exc:
            raise RuntimeError(f"Invalid accounts state file: {exc}") from exc

    def save(self, state: StateFile) -> None:
        self.path.parent.mkdir(parents=True, exist_ok=True)
        payload = {
            "version": state.version,
            "active_account_id": state.active_account_id,
            "accounts": [self._account_to_dict(account) for account in state.accounts],
        }
        with NamedTemporaryFile(
            "w", delete=False, dir=self.path.parent, encoding="utf-8"
        ) as tmp:
            json.dump(payload, tmp, ensure_ascii=True, indent=2)
            tmp.write("\n")
            temp_name = tmp.name
        os.replace(temp_name, self.path)

    def upsert_account(
        self, account: AccountRecord, set_active: bool = False
    ) -> AccountRecord:
        state = self.load()
        for index, existing in enumerate(state.accounts):
            if existing.email and account.email and existing.email == account.email:
                if state.active_account_id == existing.id:
                    state.active_account_id = account.id
                state.accounts[index] = account
                break
            if existing.id == account.id:
                state.accounts[index] = account
                break
        else:
            state.accounts.append(account)
        if set_active or not state.active_account_id:
            state.active_account_id = account.id
        self.save(state)
        return account

    def remove_account(self, state: StateFile, account_id: str) -> None:
        state.accounts = [
            account for account in state.accounts if account.id != account_id
        ]
        if state.active_account_id == account_id:
            state.active_account_id = state.accounts[0].id if state.accounts else None

    @staticmethod
    def update_active_account_id(
        state: StateFile, previous_id: str, current_id: str
    ) -> None:
        if state.active_account_id == previous_id:
            state.active_account_id = current_id

    def append_failed_identifier(self, identifier: str) -> None:
        identifier = identifier.strip()
        if not identifier:
            return
        self.failed_path.parent.mkdir(parents=True, exist_ok=True)
        existing = set()
        if self.failed_path.exists():
            existing = {
                line.strip()
                for line in self.failed_path.read_text().splitlines()
                if line.strip()
            }
        if identifier in existing:
            return
        with self.failed_path.open("a", encoding="utf-8") as file:
            file.write(f"{identifier}\n")

    @staticmethod
    def _window_to_dict(window: UsageWindow | None) -> dict[str, Any] | None:
        return asdict(window) if window else None

    @classmethod
    def _snapshot_to_dict(cls, snapshot: UsageSnapshot | None) -> dict[str, Any] | None:
        if snapshot is None:
            return None
        return {
            "checked_at": snapshot.checked_at,
            "primary_window": cls._window_to_dict(snapshot.primary_window),
            "secondary_window": cls._window_to_dict(snapshot.secondary_window),
            "limit_reached": snapshot.limit_reached,
            "allowed": snapshot.allowed,
        }

    @staticmethod
    def _window_from_dict(window: dict[str, Any] | None) -> UsageWindow | None:
        return parse_usage_window(window)

    @classmethod
    def _snapshot_from_dict(
        cls, snapshot: dict[str, Any] | None
    ) -> UsageSnapshot | None:
        if not isinstance(snapshot, dict):
            return None
        return build_usage_snapshot(
            checked_at=int(snapshot.get("checked_at") or 0),
            primary_window=cls._window_from_dict(snapshot.get("primary_window")),
            secondary_window=cls._window_from_dict(snapshot.get("secondary_window")),
            limit_reached=bool(snapshot.get("limit_reached")),
            allowed=bool(snapshot.get("allowed", True)),
        )

    @classmethod
    def _account_from_dict(cls, payload: dict[str, Any]) -> AccountRecord:
        return AccountRecord(
            id=str(payload["id"]),
            email=payload.get("email"),
            account_id=payload.get("account_id"),
            access_token=str(payload.get("access_token") or ""),
            refresh_token=str(payload.get("refresh_token") or ""),
            expires_at=int(payload.get("expires_at") or 0),
            cooldown_until=int(payload["cooldown_until"])
            if payload.get("cooldown_until")
            else None,
            last_known_usage=cls._snapshot_from_dict(payload.get("last_known_usage")),
            last_error=payload.get("last_error"),
        )

    @classmethod
    def _account_to_dict(cls, account: AccountRecord) -> dict[str, Any]:
        return {
            "id": account.id,
            "email": account.email,
            "account_id": account.account_id,
            "access_token": account.access_token,
            "refresh_token": account.refresh_token,
            "expires_at": account.expires_at,
            "cooldown_until": account.cooldown_until,
            "last_known_usage": cls._snapshot_to_dict(account.last_known_usage),
            "last_error": account.last_error,
        }
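`JsonStateStore.save` uses the standard atomic-write pattern: write the full payload to a sibling temp file, then `os.replace` it over the target, so a crashed or concurrent reader never sees a half-written `accounts.json`. A self-contained sketch of just that pattern (`atomic_write_json` is an illustrative extraction, not a function in the codebase):

```python
import json
import os
from pathlib import Path
from tempfile import NamedTemporaryFile, TemporaryDirectory


def atomic_write_json(path: Path, payload: dict) -> None:
    # Same pattern as JsonStateStore.save: the temp file lives in the
    # target's directory so os.replace is a same-filesystem atomic rename.
    path.parent.mkdir(parents=True, exist_ok=True)
    with NamedTemporaryFile(
        "w", delete=False, dir=path.parent, encoding="utf-8"
    ) as tmp:
        json.dump(payload, tmp, ensure_ascii=True, indent=2)
        tmp.write("\n")
        temp_name = tmp.name
    os.replace(temp_name, path)


with TemporaryDirectory() as tmpdir:
    target = Path(tmpdir) / "accounts.json"
    atomic_write_json(target, {"version": 1, "accounts": []})
    loaded = json.loads(target.read_text())

print(loaded)  # {'version': 1, 'accounts': []}
```

Placing the temp file in `path.parent` rather than the system temp dir matters: `os.replace` is only atomic within one filesystem.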
233
tests/test_account_ops.py
Normal file
@@ -0,0 +1,233 @@
from __future__ import annotations

import time
from typing import Any, cast

import pytest

from gibby.account_ops import (
    PermanentAccountFailure,
    refresh_account_usage,
    snapshot_is_exhausted,
    window_used_percent,
)
from gibby.client import OpenAIAPIError
from gibby.models import (
    UNKNOWN_EXHAUSTED_BACKOFF_SECONDS,
    AccountRecord,
    UsageSnapshot,
    UsageWindow,
    compute_cooldown_until,
)


class FakeClient:
    def __init__(self, usage: UsageSnapshot, *, permanent: bool = False):
        self.usage = usage
        self.permanent = permanent
        self.refresh_calls: list[str] = []

    async def refresh_access_token(self, refresh_token: str):
        self.refresh_calls.append(refresh_token)
        return ("new-token", "new-refresh", int(time.time()) + 600)

    async def fetch_usage_payload(self, access_token: str) -> dict:
        if self.permanent:
            raise OpenAIAPIError("invalid_grant", permanent=True, status_code=401)
        primary_window = self.usage.primary_window
        assert primary_window is not None
        secondary_window = (
            {
                "used_percent": self.usage.secondary_window.used_percent,
                "limit_window_seconds": self.usage.secondary_window.limit_window_seconds,
                "reset_after_seconds": self.usage.secondary_window.reset_after_seconds,
                "reset_at": self.usage.secondary_window.reset_at,
            }
            if self.usage.secondary_window is not None
            else None
        )
        return {
            "email": "acc@example.com",
            "account_id": "acc-1",
            "rate_limit": {
                "allowed": self.usage.allowed,
                "limit_reached": self.usage.limit_reached,
                "primary_window": {
                    "used_percent": primary_window.used_percent,
                    "limit_window_seconds": primary_window.limit_window_seconds,
                    "reset_after_seconds": primary_window.reset_after_seconds,
                    "reset_at": primary_window.reset_at,
                },
                "secondary_window": secondary_window,
            },
        }


def make_usage(
    primary: int,
    secondary: int | None = None,
    *,
    limit_reached: bool = False,
    allowed: bool | None = None,
    reset_after: int = 10,
    checked_at: int | None = None,
    primary_reset_at: int | None = None,
    secondary_reset_at: int | None = None,
    primary_limit_window_seconds: int = 18000,
    secondary_limit_window_seconds: int = 604800,
) -> UsageSnapshot:
    checked_at = checked_at or int(time.time())
    exhausted = (
        primary >= 100 or (secondary is not None and secondary >= 100) or limit_reached
    )
    if allowed is None:
        allowed = not exhausted
    return UsageSnapshot(
        checked_at=checked_at,
        used_percent=max(primary, secondary or 0),
        remaining_percent=max(0, 100 - max(primary, secondary or 0)),
        exhausted=exhausted or not allowed,
        primary_window=UsageWindow(
            primary,
            primary_limit_window_seconds,
            reset_after,
            primary_reset_at
            if primary_reset_at is not None
            else checked_at + reset_after,
        ),
        secondary_window=UsageWindow(
            secondary,
            secondary_limit_window_seconds,
            reset_after,
            secondary_reset_at
            if secondary_reset_at is not None
            else checked_at + reset_after,
        )
        if secondary is not None
        else None,
        limit_reached=limit_reached,
        allowed=allowed,
    )


def test_window_used_percent_defaults_to_zero() -> None:
    assert window_used_percent(None) == 0


def test_snapshot_is_exhausted_checks_various_conditions() -> None:
    assert snapshot_is_exhausted(make_usage(94, 95), 96) is False
    assert snapshot_is_exhausted(make_usage(94), 95) is False
    assert snapshot_is_exhausted(make_usage(95), 95) is True
    assert snapshot_is_exhausted(make_usage(50, 95), 95) is True
    assert snapshot_is_exhausted(make_usage(50, limit_reached=True), 95) is True


def test_compute_cooldown_until_uses_primary_blocking_window() -> None:
    usage = make_usage(
        95, 20, checked_at=1000, primary_reset_at=1100, secondary_reset_at=1300
    )

    assert compute_cooldown_until(usage, 95) == 1100
|
def test_compute_cooldown_until_uses_secondary_blocking_window() -> None:
|
||||||
|
usage = make_usage(
|
||||||
|
20, 95, checked_at=1000, primary_reset_at=1100, secondary_reset_at=1300
|
||||||
|
)
|
||||||
|
|
||||||
|
assert compute_cooldown_until(usage, 95) == 1300
|
||||||
|
|
||||||
|
|
||||||
|
def test_compute_cooldown_until_waits_for_all_blocking_windows() -> None:
|
||||||
|
usage = make_usage(
|
||||||
|
95, 95, checked_at=1000, primary_reset_at=1100, secondary_reset_at=1300
|
||||||
|
)
|
||||||
|
|
||||||
|
assert compute_cooldown_until(usage, 95) == 1300
|
||||||
|
|
||||||
|
|
||||||
|
def test_compute_cooldown_until_uses_latest_window_when_blocker_is_ambiguous() -> None:
|
||||||
|
usage = make_usage(
|
||||||
|
80,
|
||||||
|
40,
|
||||||
|
limit_reached=True,
|
||||||
|
allowed=False,
|
||||||
|
checked_at=1000,
|
||||||
|
primary_reset_at=1100,
|
||||||
|
secondary_reset_at=1300,
|
||||||
|
)
|
||||||
|
|
||||||
|
assert compute_cooldown_until(usage, 95) == 1300
|
||||||
|
|
||||||
|
|
||||||
|
def test_compute_cooldown_until_falls_back_to_limit_window_seconds() -> None:
|
||||||
|
usage = make_usage(
|
||||||
|
95,
|
||||||
|
checked_at=1000,
|
||||||
|
reset_after=0,
|
||||||
|
primary_reset_at=0,
|
||||||
|
primary_limit_window_seconds=600,
|
||||||
|
)
|
||||||
|
|
||||||
|
assert compute_cooldown_until(usage, 95) == 1600
|
||||||
|
|
||||||
|
|
||||||
|
def test_compute_cooldown_until_uses_backoff_when_no_reset_metadata_exists() -> None:
|
||||||
|
usage = make_usage(
|
||||||
|
95,
|
||||||
|
checked_at=1000,
|
||||||
|
reset_after=0,
|
||||||
|
primary_reset_at=0,
|
||||||
|
primary_limit_window_seconds=0,
|
||||||
|
)
|
||||||
|
|
||||||
|
assert compute_cooldown_until(usage, 95) == 1000 + UNKNOWN_EXHAUSTED_BACKOFF_SECONDS
|
||||||
|
|
||||||
|
|
||||||
|
@pytest.mark.asyncio
|
||||||
|
async def test_refresh_account_usage_populates_snapshot_and_cooldown() -> None:
|
||||||
|
current = int(time.time())
|
||||||
|
account = AccountRecord(
|
||||||
|
id="a1",
|
||||||
|
access_token="tok",
|
||||||
|
refresh_token="ref",
|
||||||
|
expires_at=int(time.time()) + 600,
|
||||||
|
)
|
||||||
|
usage = make_usage(
|
||||||
|
96,
|
||||||
|
1,
|
||||||
|
limit_reached=True,
|
||||||
|
allowed=False,
|
||||||
|
checked_at=current,
|
||||||
|
primary_reset_at=current + 100,
|
||||||
|
secondary_reset_at=current + 300,
|
||||||
|
)
|
||||||
|
client = FakeClient(usage)
|
||||||
|
|
||||||
|
result = await refresh_account_usage(account, cast(Any, client), 95)
|
||||||
|
|
||||||
|
assert result.used_percent == usage.used_percent
|
||||||
|
assert result.limit_reached is usage.limit_reached
|
||||||
|
assert result.allowed is usage.allowed
|
||||||
|
assert account.last_known_usage is not None
|
||||||
|
assert account.last_known_usage.used_percent == usage.used_percent
|
||||||
|
assert account.cooldown_until == current + 100
|
||||||
|
assert account.email == "acc@example.com"
|
||||||
|
assert account.id == "acc@example.com"
|
||||||
|
|
||||||
|
|
||||||
|
@pytest.mark.asyncio
|
||||||
|
async def test_refresh_account_usage_raises_permanent_failure() -> None:
|
||||||
|
account = AccountRecord(
|
||||||
|
id="a1",
|
||||||
|
access_token="tok",
|
||||||
|
refresh_token="ref",
|
||||||
|
expires_at=int(time.time()) + 600,
|
||||||
|
)
|
||||||
|
|
||||||
|
with pytest.raises(PermanentAccountFailure):
|
||||||
|
await refresh_account_usage(
|
||||||
|
account,
|
||||||
|
cast(Any, FakeClient(make_usage(1), permanent=True)),
|
||||||
|
95,
|
||||||
|
)
|
||||||
89
tests/test_app.py
Normal file

@@ -0,0 +1,89 @@
from __future__ import annotations

from typing import Any

from fastapi.testclient import TestClient

from gibby.app import create_app
from gibby.manager import AccountManager, NoUsableAccountError


class StubManager(AccountManager):
    def __init__(
        self,
        response: dict[str, Any] | None = None,
        usage_response: dict[str, Any] | None = None,
        error: Exception | None = None,
    ):
        self.response = response
        self.usage_response = usage_response
        self.error = error

    async def issue_token_response(self):
        if self.error is not None:
            raise self.error
        if self.response is not None:
            return self.response
        return {}

    async def get_usage_report(self):
        if self.usage_response is not None:
            return self.usage_response
        return {"accounts": [], "active_account_id": None, "count": 0}


def test_health_ok() -> None:
    client = TestClient(create_app(StubManager(response={})))
    response = client.get("/health")
    assert response.status_code == 200
    assert response.text == "ok"


def test_token_success_shape() -> None:
    payload = {
        "token": "abc",
        "limit": {
            "used_percent": 10,
            "remaining_percent": 90,
            "exhausted": False,
            "needs_prepare": False,
        },
        "usage": {"primary_window": None, "secondary_window": None},
    }
    client = TestClient(create_app(StubManager(response=payload)))
    response = client.get("/token")
    assert response.status_code == 200
    assert response.json() == payload


def test_token_error_shape() -> None:
    client = TestClient(
        create_app(
            StubManager(error=NoUsableAccountError("No usable account available"))
        )
    )
    response = client.get("/token")
    assert response.status_code == 503
    assert response.json() == {"error": "No usable account available"}


def test_usage_success_shape() -> None:
    payload = {
        "accounts": [
            {
                "id": "a1",
                "email": "a1@example.com",
                "status": "ok",
                "used_percent": 12,
                "remaining_percent": 88,
                "cooldown_until": None,
                "primary_window": {"used_percent": 12},
                "secondary_window": {"used_percent": 1},
            }
        ],
        "active_account_id": "a1",
        "count": 1,
    }
    client = TestClient(create_app(StubManager(usage_response=payload)))
    response = client.get("/usage")
    assert response.status_code == 200
    assert response.json() == payload
570
tests/test_core.py
Normal file

@@ -0,0 +1,570 @@
from __future__ import annotations

import time
from pathlib import Path

import pytest

from gibby.client import OpenAIAPIError, OpenAIClient
from gibby.manager import AccountManager, NoUsableAccountError
from gibby.models import AccountRecord, StateFile, UsageSnapshot, UsageWindow
from gibby.settings import Settings
from gibby.store import JsonStateStore


class FakeClient(OpenAIClient):
    def __init__(
        self,
        usage_by_token=None,
        refresh_map=None,
        failing_tokens=None,
        permanent_refresh_tokens=None,
    ):
        self.usage_by_token = usage_by_token or {}
        self.refresh_map = refresh_map or {}
        self.failing_tokens = set(failing_tokens or [])
        self.permanent_refresh_tokens = set(permanent_refresh_tokens or [])
        self.fetched_tokens: list[str] = []
        self.refresh_calls: list[str] = []

    async def refresh_access_token(self, refresh_token: str):
        self.refresh_calls.append(refresh_token)
        if refresh_token in self.permanent_refresh_tokens:
            raise OpenAIAPIError("invalid_grant", permanent=True, status_code=401)
        return self.refresh_map[refresh_token]

    async def fetch_usage_payload(self, access_token: str):
        self.fetched_tokens.append(access_token)
        if access_token in self.failing_tokens:
            raise RuntimeError("usage failed")
        usage = self.usage_by_token[access_token]
        return {
            "email": f"{access_token}@example.com",
            "account_id": f"acct-{access_token}",
            "rate_limit": {
                "allowed": usage.allowed,
                "limit_reached": usage.limit_reached,
                "primary_window": {
                    "used_percent": usage.primary_window.used_percent,
                    "limit_window_seconds": usage.primary_window.limit_window_seconds,
                    "reset_after_seconds": usage.primary_window.reset_after_seconds,
                    "reset_at": usage.primary_window.reset_at,
                }
                if usage.primary_window
                else None,
                "secondary_window": {
                    "used_percent": usage.secondary_window.used_percent,
                    "limit_window_seconds": usage.secondary_window.limit_window_seconds,
                    "reset_after_seconds": usage.secondary_window.reset_after_seconds,
                    "reset_at": usage.secondary_window.reset_at,
                }
                if usage.secondary_window
                else None,
            },
        }


def make_usage(
    *,
    used: int,
    secondary_used: int | None = None,
    limit_reached: bool = False,
    reset_after: int = 0,
) -> UsageSnapshot:
    exhausted = (
        used >= 100
        or (secondary_used is not None and secondary_used >= 100)
        or limit_reached
    )
    return UsageSnapshot(
        checked_at=int(time.time()),
        used_percent=used,
        remaining_percent=max(0, 100 - used),
        exhausted=exhausted,
        primary_window=UsageWindow(
            used_percent=used,
            limit_window_seconds=604800,
            reset_after_seconds=reset_after,
            reset_at=int(time.time()) + reset_after if reset_after else 0,
        ),
        secondary_window=UsageWindow(
            used_percent=secondary_used,
            limit_window_seconds=604800,
            reset_after_seconds=reset_after,
            reset_at=int(time.time()) + reset_after if reset_after else 0,
        )
        if secondary_used is not None
        else None,
        limit_reached=limit_reached,
        allowed=not exhausted,
    )


def make_manager(
    store: JsonStateStore,
    client: FakeClient,
    *,
    threshold: int = 95,
) -> AccountManager:
    return AccountManager(
        store,
        client,
        Settings(data_dir=store.path.parent, exhausted_usage_threshold=threshold),
    )


def make_store(tmp_path: Path, state: StateFile) -> JsonStateStore:
    store = JsonStateStore(tmp_path / "accounts.json")
    store.save(state)
    return store


@pytest.mark.asyncio
async def test_prefers_active_account_when_locally_usable(tmp_path: Path) -> None:
    active = AccountRecord(
        id="a1",
        access_token="tok-a1",
        refresh_token="ref-a1",
        expires_at=int(time.time()) + 600,
        last_known_usage=make_usage(used=20),
    )
    second = AccountRecord(
        id="a2",
        access_token="tok-a2",
        refresh_token="ref-a2",
        expires_at=int(time.time()) + 600,
        last_known_usage=make_usage(used=70),
    )
    store = make_store(
        tmp_path, StateFile(active_account_id="a1", accounts=[active, second])
    )
    client = FakeClient(
        usage_by_token={
            "tok-a1": make_usage(used=21),
            "tok-a2": make_usage(used=72),
        }
    )

    payload = await make_manager(store, client).issue_token_response()

    assert payload["token"] == "tok-a1"
    assert client.fetched_tokens == ["tok-a1"]
    assert store.load().active_account_id == "tok-a1@example.com"


@pytest.mark.asyncio
async def test_prefers_higher_primary_usage_from_saved_snapshot(tmp_path: Path) -> None:
    first = AccountRecord(
        id="a1",
        access_token="tok-a1",
        refresh_token="ref-a1",
        expires_at=int(time.time()) + 600,
        last_known_usage=make_usage(used=20),
    )
    second = AccountRecord(
        id="a2",
        access_token="tok-a2",
        refresh_token="ref-a2",
        expires_at=int(time.time()) + 600,
        last_known_usage=make_usage(used=70),
    )
    store = make_store(tmp_path, StateFile(accounts=[first, second]))
    client = FakeClient(usage_by_token={"tok-a2": make_usage(used=72)})

    payload = await make_manager(store, client).issue_token_response()

    assert payload["token"] == "tok-a2"
    assert client.fetched_tokens == ["tok-a2"]
    saved = store.load()
    assert saved.active_account_id == "tok-a2@example.com"


@pytest.mark.asyncio
async def test_breaks_ties_with_secondary_usage(tmp_path: Path) -> None:
    first = AccountRecord(
        id="a1",
        access_token="tok-a1",
        refresh_token="ref-a1",
        expires_at=int(time.time()) + 600,
        last_known_usage=make_usage(used=60, secondary_used=10),
    )
    second = AccountRecord(
        id="a2",
        access_token="tok-a2",
        refresh_token="ref-a2",
        expires_at=int(time.time()) + 600,
        last_known_usage=make_usage(used=60, secondary_used=40),
    )
    store = make_store(tmp_path, StateFile(accounts=[first, second]))
    client = FakeClient(
        usage_by_token={"tok-a2": make_usage(used=61, secondary_used=41)}
    )

    payload = await make_manager(store, client).issue_token_response()

    assert payload["token"] == "tok-a2"
    assert client.fetched_tokens == ["tok-a2"]


@pytest.mark.asyncio
async def test_treats_missing_secondary_as_zero(tmp_path: Path) -> None:
    first = AccountRecord(
        id="a1",
        access_token="tok-a1",
        refresh_token="ref-a1",
        expires_at=int(time.time()) + 600,
        last_known_usage=make_usage(used=60),
    )
    second = AccountRecord(
        id="a2",
        access_token="tok-a2",
        refresh_token="ref-a2",
        expires_at=int(time.time()) + 600,
        last_known_usage=make_usage(used=60, secondary_used=1),
    )
    store = make_store(tmp_path, StateFile(accounts=[first, second]))
    client = FakeClient(
        usage_by_token={"tok-a2": make_usage(used=61, secondary_used=1)}
    )

    payload = await make_manager(store, client).issue_token_response()

    assert payload["token"] == "tok-a2"
    assert client.fetched_tokens == ["tok-a2"]


@pytest.mark.asyncio
async def test_skips_account_still_in_cooldown(tmp_path: Path) -> None:
    active = AccountRecord(
        id="a1",
        access_token="tok-a1",
        refresh_token="ref-a1",
        expires_at=int(time.time()) + 600,
        cooldown_until=int(time.time()) + 300,
        last_known_usage=make_usage(used=80),
    )
    second = AccountRecord(
        id="a2",
        access_token="tok-a2",
        refresh_token="ref-a2",
        expires_at=int(time.time()) + 600,
        last_known_usage=make_usage(used=20),
    )
    store = make_store(
        tmp_path, StateFile(active_account_id="a1", accounts=[active, second])
    )
    client = FakeClient(usage_by_token={"tok-a2": make_usage(used=25)})

    payload = await make_manager(store, client).issue_token_response()

    assert payload["token"] == "tok-a2"
    assert client.fetched_tokens == ["tok-a2"]
    assert store.load().active_account_id == "tok-a2@example.com"


@pytest.mark.asyncio
async def test_skips_active_account_blocked_by_local_exhausted_snapshot(
    tmp_path: Path,
) -> None:
    active = AccountRecord(
        id="a1",
        access_token="tok-a1",
        refresh_token="ref-a1",
        expires_at=int(time.time()) + 600,
        last_known_usage=make_usage(used=96),
    )
    second = AccountRecord(
        id="a2",
        access_token="tok-a2",
        refresh_token="ref-a2",
        expires_at=int(time.time()) + 600,
        last_known_usage=make_usage(used=20),
    )
    store = make_store(
        tmp_path, StateFile(active_account_id="a1", accounts=[active, second])
    )
    client = FakeClient(usage_by_token={"tok-a2": make_usage(used=25)})

    payload = await make_manager(store, client).issue_token_response()

    assert payload["token"] == "tok-a2"
    assert client.fetched_tokens == ["tok-a2"]
    assert store.load().active_account_id == "tok-a2@example.com"


@pytest.mark.asyncio
async def test_live_checks_depleted_and_moves_to_next(tmp_path: Path) -> None:
    high = AccountRecord(
        id="a1",
        access_token="tok-a1",
        refresh_token="ref-a1",
        expires_at=int(time.time()) + 600,
        last_known_usage=make_usage(used=94),
    )
    second = AccountRecord(
        id="a2",
        access_token="tok-a2",
        refresh_token="ref-a2",
        expires_at=int(time.time()) + 600,
        last_known_usage=make_usage(used=50),
    )
    store = make_store(tmp_path, StateFile(accounts=[high, second]))
    client = FakeClient(
        usage_by_token={
            "tok-a1": make_usage(used=96, reset_after=120),
            "tok-a2": make_usage(used=52),
        }
    )

    payload = await make_manager(store, client).issue_token_response()

    assert payload["token"] == "tok-a2"
    assert client.fetched_tokens == ["tok-a1", "tok-a2"]


@pytest.mark.asyncio
async def test_live_checks_secondary_depleted_and_moves_to_next(tmp_path: Path) -> None:
    high = AccountRecord(
        id="a1",
        access_token="tok-a1",
        refresh_token="ref-a1",
        expires_at=int(time.time()) + 600,
        last_known_usage=make_usage(used=30, secondary_used=94),
    )
    second = AccountRecord(
        id="a2",
        access_token="tok-a2",
        refresh_token="ref-a2",
        expires_at=int(time.time()) + 600,
        last_known_usage=make_usage(used=20, secondary_used=10),
    )
    store = make_store(tmp_path, StateFile(accounts=[high, second]))
    client = FakeClient(
        usage_by_token={
            "tok-a1": make_usage(used=30, secondary_used=100, reset_after=120),
            "tok-a2": make_usage(used=22, secondary_used=10),
        }
    )

    payload = await make_manager(store, client).issue_token_response()

    assert payload["token"] == "tok-a2"
    assert client.fetched_tokens == ["tok-a1", "tok-a2"]


@pytest.mark.asyncio
async def test_falls_through_when_live_usage_is_depleted(tmp_path: Path) -> None:
    first = AccountRecord(
        id="a1",
        access_token="tok-a1",
        refresh_token="ref-a1",
        expires_at=int(time.time()) + 600,
        last_known_usage=make_usage(used=80),
    )
    second = AccountRecord(
        id="a2",
        access_token="tok-a2",
        refresh_token="ref-a2",
        expires_at=int(time.time()) + 600,
        last_known_usage=make_usage(used=70),
    )
    store = make_store(tmp_path, StateFile(accounts=[first, second]))
    client = FakeClient(
        usage_by_token={
            "tok-a1": make_usage(used=95, reset_after=120),
            "tok-a2": make_usage(used=71),
        }
    )

    payload = await make_manager(store, client).issue_token_response()
    saved = store.load()

    assert payload["token"] == "tok-a2"
    assert client.fetched_tokens == ["tok-a1", "tok-a2"]
    depleted = next(
        account for account in saved.accounts if account.id == "tok-a1@example.com"
    )
    assert depleted.cooldown_until is not None
    assert depleted.last_known_usage is not None
    assert depleted.last_known_usage.used_percent == 95


@pytest.mark.asyncio
async def test_keeps_account_out_until_blocking_window_resets(tmp_path: Path) -> None:
    current = int(time.time())
    blocked = AccountRecord(
        id="a1",
        access_token="tok-a1",
        refresh_token="ref-a1",
        expires_at=int(time.time()) + 600,
        last_known_usage=make_usage(used=80, secondary_used=40),
    )
    second = AccountRecord(
        id="a2",
        access_token="tok-a2",
        refresh_token="ref-a2",
        expires_at=int(time.time()) + 600,
        last_known_usage=make_usage(used=70),
    )
    store = make_store(tmp_path, StateFile(accounts=[blocked, second]))
    blocked_usage = UsageSnapshot(
        checked_at=current,
        used_percent=80,
        remaining_percent=20,
        exhausted=True,
        primary_window=UsageWindow(
            used_percent=80,
            limit_window_seconds=604800,
            reset_after_seconds=60,
            reset_at=current + 60,
        ),
        secondary_window=UsageWindow(
            used_percent=40,
            limit_window_seconds=604800,
            reset_after_seconds=240,
            reset_at=current + 240,
        ),
        limit_reached=True,
        allowed=False,
    )
    client = FakeClient(
        usage_by_token={
            "tok-a1": blocked_usage,
            "tok-a2": make_usage(used=71),
        }
    )

    payload = await make_manager(store, client).issue_token_response()
    saved = store.load()
    blocked_saved = next(
        account for account in saved.accounts if account.id == "tok-a1@example.com"
    )

    assert payload["token"] == "tok-a2"
    assert blocked_saved.cooldown_until == current + 240
    assert client.fetched_tokens == ["tok-a1", "tok-a2"]


@pytest.mark.asyncio
async def test_refreshes_expired_token_before_usage(tmp_path: Path) -> None:
    active = AccountRecord(
        id="a1",
        access_token="old-token",
        refresh_token="ref-a1",
        expires_at=int(time.time()) - 1,
        last_known_usage=make_usage(used=20),
    )
    store = make_store(tmp_path, StateFile(active_account_id="a1", accounts=[active]))
    client = FakeClient(
        usage_by_token={"new-token": make_usage(used=20)},
        refresh_map={"ref-a1": ("new-token", "new-refresh", int(time.time()) + 600)},
    )

    payload = await make_manager(store, client).issue_token_response()
    saved = store.load()

    assert payload["token"] == "new-token"
    assert client.refresh_calls == ["ref-a1"]
    assert client.fetched_tokens == ["new-token"]
    assert saved.accounts[0].id == "new-token@example.com"
    assert saved.accounts[0].access_token == "new-token"
    assert saved.accounts[0].refresh_token == "new-refresh"


@pytest.mark.asyncio
async def test_raises_when_all_accounts_unusable(tmp_path: Path) -> None:
    active = AccountRecord(
        id="a1",
        access_token="tok-a1",
        refresh_token="ref-a1",
        expires_at=int(time.time()) + 600,
        last_known_usage=make_usage(used=80),
    )
    second = AccountRecord(
        id="a2",
        access_token="tok-a2",
        refresh_token="ref-a2",
        expires_at=int(time.time()) + 600,
        last_known_usage=make_usage(used=70),
    )
    store = make_store(
        tmp_path, StateFile(active_account_id="a1", accounts=[active, second])
    )
    client = FakeClient(
        usage_by_token={
            "tok-a1": make_usage(used=96, reset_after=120),
            "tok-a2": make_usage(used=97, reset_after=120),
        }
    )

    with pytest.raises(NoUsableAccountError):
        await make_manager(store, client).issue_token_response()

    saved = store.load()
    assert all(account.cooldown_until is not None for account in saved.accounts)


@pytest.mark.asyncio
async def test_threshold_can_be_overridden_for_selection(tmp_path: Path) -> None:
    active = AccountRecord(
        id="a1",
        access_token="tok-a1",
        refresh_token="ref-a1",
        expires_at=int(time.time()) + 600,
        last_known_usage=make_usage(used=96),
    )
    second = AccountRecord(
        id="a2",
        access_token="tok-a2",
        refresh_token="ref-a2",
        expires_at=int(time.time()) + 600,
        last_known_usage=make_usage(used=20),
    )
    store = make_store(
        tmp_path, StateFile(active_account_id="a1", accounts=[active, second])
    )
    client = FakeClient(
        usage_by_token={
            "tok-a1": make_usage(used=96),
            "tok-a2": make_usage(used=25),
        }
    )

    payload = await make_manager(store, client, threshold=97).issue_token_response()

    assert payload["token"] == "tok-a1"
    assert client.fetched_tokens == ["tok-a1"]


@pytest.mark.asyncio
async def test_removes_account_and_records_failed_email_on_permanent_refresh_failure(
    tmp_path: Path,
) -> None:
    dead = AccountRecord(
        id="a1",
        email="dead@example.com",
        access_token="tok-a1",
        refresh_token="ref-a1",
        expires_at=int(time.time()) - 1,
        last_known_usage=make_usage(used=80),
    )
    alive = AccountRecord(
        id="a2",
        email="alive@example.com",
        access_token="tok-a2",
        refresh_token="ref-a2",
        expires_at=int(time.time()) + 600,
        last_known_usage=make_usage(used=70),
    )
    store = make_store(tmp_path, StateFile(accounts=[dead, alive]))
    client = FakeClient(
        usage_by_token={"tok-a2": make_usage(used=71)},
        permanent_refresh_tokens={"ref-a1"},
    )

    payload = await make_manager(store, client).issue_token_response()

    assert payload["token"] == "tok-a2"
    saved = store.load()
    assert [account.id for account in saved.accounts] == ["tok-a2@example.com"]
    assert (tmp_path / "failed.txt").read_text().splitlines() == ["dead@example.com"]
177
tests/test_oauth_helper.py
Normal file

@@ -0,0 +1,177 @@
from __future__ import annotations

import asyncio
import sys
import urllib.parse
from pathlib import Path
from typing import Any, cast

import httpx
import pytest

sys.path.insert(0, str(Path(__file__).resolve().parents[1] / "scripts"))

from gibby.client import OpenAIAPIError
from gibby.models import UsageSnapshot, UsageWindow
from gibby.oauth import build_authorize_url, generate_pkce_pair
from gibby.settings import Settings
from oauth_helper import (  # type: ignore[import-not-found]
    exchange_and_store_account,
    parse_redirect_url,
    wait_for_callback,
)
from gibby.store import JsonStateStore


class FakeClient:
    def __init__(
        self,
        settings: Settings,
        usage: UsageSnapshot,
        *,
        transient_usage_failure: bool = False,
    ):
        self.settings = settings
        self.usage = usage
        self.transient_usage_failure = transient_usage_failure

    async def exchange_code(self, code: str, verifier: str) -> tuple[str, str, int]:
        return ("access-token", "refresh-token", 1776000000)

    async def refresh_access_token(self, refresh_token: str) -> tuple[str, str, int]:
        return ("access-token", refresh_token, 1776000000)

    async def fetch_usage_payload(self, access_token: str) -> dict:
        if self.transient_usage_failure:
            raise OpenAIAPIError("usage timeout", permanent=False)
        primary_window = self.usage.primary_window
        assert primary_window is not None
        secondary_window = (
            {
                "used_percent": self.usage.secondary_window.used_percent,
                "limit_window_seconds": self.usage.secondary_window.limit_window_seconds,
                "reset_after_seconds": self.usage.secondary_window.reset_after_seconds,
                "reset_at": self.usage.secondary_window.reset_at,
            }
            if self.usage.secondary_window is not None
            else None
        )
        return {
            "email": "oauth@example.com",
            "account_id": "oauth-1",
            "rate_limit": {
                "allowed": self.usage.allowed,
                "limit_reached": self.usage.limit_reached,
                "primary_window": {
                    "used_percent": primary_window.used_percent,
                    "limit_window_seconds": primary_window.limit_window_seconds,
                    "reset_after_seconds": primary_window.reset_after_seconds,
                    "reset_at": primary_window.reset_at,
                },
                "secondary_window": secondary_window,
            },
        }


def make_usage(primary: int, secondary: int | None = None) -> UsageSnapshot:
    return UsageSnapshot(
        checked_at=1775000000,
        used_percent=max(primary, secondary or 0),
        remaining_percent=max(0, 100 - max(primary, secondary or 0)),
        exhausted=False,
        primary_window=UsageWindow(primary, 18000, 100, 1775000100),
        secondary_window=UsageWindow(secondary, 604800, 100, 1775000100)
        if secondary is not None
        else None,
        limit_reached=False,
        allowed=True,
    )


def test_generate_pkce_pair_shapes() -> None:
    verifier, challenge = generate_pkce_pair()
    assert len(verifier) > 40
    assert len(challenge) > 40
    assert "=" not in challenge


def test_build_authorize_url_contains_redirect_and_state() -> None:
    settings = Settings(callback_host="localhost", callback_port=1455)
    url = build_authorize_url(settings, "challenge", "state-123")
    parsed = urllib.parse.urlparse(url)
    query = urllib.parse.parse_qs(parsed.query)
    assert parsed.scheme == "https"
    assert query["redirect_uri"] == ["http://localhost:1455/auth/callback"]
    assert query["state"] == ["state-123"]
    assert query["code_challenge"] == ["challenge"]


def test_parse_redirect_url_extracts_code_and_state() -> None:
    code, state = parse_redirect_url(
        "http://127.0.0.1:1455/auth/callback?code=abc&state=xyz"
    )
    assert code == "abc"
    assert state == "xyz"


@pytest.mark.asyncio
async def test_wait_for_callback_receives_code_and_state() -> None:
    task = asyncio.create_task(
        wait_for_callback("127.0.0.1", 18555, "state-1", timeout=5)
    )
    await asyncio.sleep(0.05)
    async with httpx.AsyncClient() as client:
        response = await client.get(
            "http://127.0.0.1:18555/auth/callback?code=abc123&state=state-1",
            timeout=5,
        )
    result = await task

    assert response.status_code == 200
    assert result == ("abc123", "state-1")


@pytest.mark.asyncio
async def test_exchange_and_store_account_populates_usage_snapshot(
    tmp_path: Path,
) -> None:
    store = JsonStateStore(tmp_path / "accounts.json")
    settings = Settings(data_dir=tmp_path)
    client = FakeClient(settings, make_usage(12, 3))

    account = await exchange_and_store_account(
        store,
        cast(Any, client),
        "code",
        "verifier",
        False,
    )

    assert account.last_known_usage is not None
    assert account.id == "oauth@example.com"
    assert account.last_known_usage.primary_window is not None
    assert account.last_known_usage.primary_window.used_percent == 12
    assert account.last_known_usage.secondary_window is not None
    assert account.last_known_usage.secondary_window.used_percent == 3


@pytest.mark.asyncio
async def test_exchange_and_store_account_keeps_tokens_on_transient_usage_failure(
    tmp_path: Path,
) -> None:
    store = JsonStateStore(tmp_path / "accounts.json")
    settings = Settings(data_dir=tmp_path)
    client = FakeClient(settings, make_usage(12, 3), transient_usage_failure=True)

    account = await exchange_and_store_account(
        store,
        cast(Any, client),
        "code",
        "verifier",
        False,
    )

    saved = store.load()
    assert account.last_known_usage is None
    assert saved.accounts[0].access_token == "access-token"
    assert saved.accounts[0].last_error is not None
132
tests/test_refresh_limits.py
Normal file
@ -0,0 +1,132 @@
from __future__ import annotations

import time
import sys
from pathlib import Path

import pytest

sys.path.insert(0, str(Path(__file__).resolve().parents[1] / "scripts"))

from gibby.client import OpenAIAPIError
from gibby.models import AccountRecord, StateFile, UsageSnapshot, UsageWindow
from gibby.store import JsonStateStore
import refresh_limits  # type: ignore[import-not-found]


def make_usage(primary: int, secondary: int | None = None) -> UsageSnapshot:
    return UsageSnapshot(
        checked_at=int(time.time()),
        used_percent=max(primary, secondary or 0),
        remaining_percent=max(0, 100 - max(primary, secondary or 0)),
        exhausted=False,
        primary_window=UsageWindow(primary, 18000, 10, int(time.time()) + 10),
        secondary_window=UsageWindow(secondary, 604800, 10, int(time.time()) + 10)
        if secondary is not None
        else None,
        limit_reached=False,
        allowed=True,
    )


class FakeClient:
    def __init__(self, settings, *, permanent: bool = False):
        self.settings = settings
        self.permanent = permanent

    async def aclose(self) -> None:
        return

    async def refresh_access_token(self, refresh_token: str):
        return ("new-token", "new-refresh", int(time.time()) + 600)

    async def fetch_usage_payload(self, access_token: str) -> dict:
        if self.permanent:
            raise OpenAIAPIError("invalid_grant", permanent=True, status_code=401)
        usage = make_usage(12, 4)
        primary_window = usage.primary_window
        assert primary_window is not None
        secondary_window = (
            {
                "used_percent": usage.secondary_window.used_percent,
                "limit_window_seconds": usage.secondary_window.limit_window_seconds,
                "reset_after_seconds": usage.secondary_window.reset_after_seconds,
                "reset_at": usage.secondary_window.reset_at,
            }
            if usage.secondary_window is not None
            else None
        )
        return {
            "email": "acc@example.com",
            "account_id": "acc-1",
            "rate_limit": {
                "allowed": usage.allowed,
                "limit_reached": usage.limit_reached,
                "primary_window": {
                    "used_percent": primary_window.used_percent,
                    "limit_window_seconds": primary_window.limit_window_seconds,
                    "reset_after_seconds": primary_window.reset_after_seconds,
                    "reset_at": primary_window.reset_at,
                },
                "secondary_window": secondary_window,
            },
        }


@pytest.mark.asyncio
async def test_refresh_limits_updates_all_accounts(
    tmp_path: Path, monkeypatch: pytest.MonkeyPatch
) -> None:
    store = JsonStateStore(tmp_path / "accounts.json")
    store.save(
        StateFile(
            accounts=[
                AccountRecord(
                    id="acc@example.com",
                    email="acc@example.com",
                    access_token="tok-a1",
                    refresh_token="ref-a1",
                    expires_at=int(time.time()) + 600,
                )
            ]
        )
    )
    monkeypatch.setattr(refresh_limits, "OpenAIClient", FakeClient)

    await refresh_limits.run(tmp_path)

    state = store.load()
    assert state.accounts[0].last_known_usage is not None
    assert state.accounts[0].last_known_usage.primary_window is not None
    assert state.accounts[0].last_known_usage.primary_window.used_percent == 12


@pytest.mark.asyncio
async def test_refresh_limits_removes_permanently_failed_account(
    tmp_path: Path, monkeypatch: pytest.MonkeyPatch
) -> None:
    store = JsonStateStore(tmp_path / "accounts.json")
    store.save(
        StateFile(
            accounts=[
                AccountRecord(
                    id="dead@example.com",
                    email="dead@example.com",
                    access_token="tok-a1",
                    refresh_token="ref-a1",
                    expires_at=int(time.time()) + 600,
                )
            ]
        )
    )

    def permanent_client(settings):
        return FakeClient(settings, permanent=True)

    monkeypatch.setattr(refresh_limits, "OpenAIClient", permanent_client)

    await refresh_limits.run(tmp_path)

    state = store.load()
    assert state.accounts == []
    assert (tmp_path / "failed.txt").read_text().splitlines() == ["dead@example.com"]
96
tests/test_store.py
Normal file
@ -0,0 +1,96 @@
from __future__ import annotations

import json

from gibby.models import AccountRecord, StateFile, UsageSnapshot, UsageWindow
from gibby.store import JsonStateStore


def test_store_writes_canonical_usage_snapshot_shape(tmp_path) -> None:
    store = JsonStateStore(tmp_path / "accounts.json")
    snapshot = UsageSnapshot(
        checked_at=1000,
        used_percent=75,
        remaining_percent=25,
        exhausted=False,
        primary_window=UsageWindow(75, 18000, 300, 1300),
        secondary_window=UsageWindow(10, 604800, 3600, 4600),
        limit_reached=False,
        allowed=True,
    )
    store.save(
        StateFile(
            accounts=[
                AccountRecord(
                    id="acc@example.com",
                    email="acc@example.com",
                    access_token="tok",
                    refresh_token="ref",
                    expires_at=2000,
                    last_known_usage=snapshot,
                )
            ]
        )
    )

    payload = json.loads((tmp_path / "accounts.json").read_text())
    saved_snapshot = payload["accounts"][0]["last_known_usage"]

    assert set(saved_snapshot) == {
        "checked_at",
        "primary_window",
        "secondary_window",
        "limit_reached",
        "allowed",
    }


def test_store_load_reconstructs_derived_usage_fields(tmp_path) -> None:
    path = tmp_path / "accounts.json"
    path.write_text(
        json.dumps(
            {
                "version": 1,
                "active_account_id": "acc@example.com",
                "accounts": [
                    {
                        "id": "acc@example.com",
                        "email": "acc@example.com",
                        "account_id": "acc-1",
                        "access_token": "tok",
                        "refresh_token": "ref",
                        "expires_at": 2000,
                        "cooldown_until": None,
                        "last_known_usage": {
                            "checked_at": 1000,
                            "primary_window": {
                                "used_percent": 80,
                                "limit_window_seconds": 18000,
                                "reset_after_seconds": 300,
                                "reset_at": 1300,
                            },
                            "secondary_window": {
                                "used_percent": 100,
                                "limit_window_seconds": 604800,
                                "reset_after_seconds": 3600,
                                "reset_at": 4600,
                            },
                            "limit_reached": False,
                            "allowed": True,
                        },
                        "last_error": None,
                    }
                ],
            }
        )
    )

    state = JsonStateStore(path).load()
    snapshot = state.accounts[0].last_known_usage

    assert snapshot is not None
    assert snapshot.used_percent == 100
    assert snapshot.remaining_percent == 0
    assert snapshot.exhausted is True
    assert snapshot.limit_reached is False
    assert snapshot.allowed is True
320
uv.lock
generated
Normal file
@ -0,0 +1,320 @@
version = 1
revision = 3
requires-python = ">=3.14"

[[package]]
name = "annotated-doc"
version = "0.0.4"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/57/ba/046ceea27344560984e26a590f90bc7f4a75b06701f653222458922b558c/annotated_doc-0.0.4.tar.gz", hash = "sha256:fbcda96e87e9c92ad167c2e53839e57503ecfda18804ea28102353485033faa4", size = 7288, upload-time = "2025-11-10T22:07:42.062Z" }
wheels = [
    { url = "https://files.pythonhosted.org/packages/1e/d3/26bf1008eb3d2daa8ef4cacc7f3bfdc11818d111f7e2d0201bc6e3b49d45/annotated_doc-0.0.4-py3-none-any.whl", hash = "sha256:571ac1dc6991c450b25a9c2d84a3705e2ae7a53467b5d111c24fa8baabbed320", size = 5303, upload-time = "2025-11-10T22:07:40.673Z" },
]

[[package]]
name = "annotated-types"
version = "0.7.0"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/ee/67/531ea369ba64dcff5ec9c3402f9f51bf748cec26dde048a2f973a4eea7f5/annotated_types-0.7.0.tar.gz", hash = "sha256:aff07c09a53a08bc8cfccb9c85b05f1aa9a2a6f23728d790723543408344ce89", size = 16081, upload-time = "2024-05-20T21:33:25.928Z" }
wheels = [
    { url = "https://files.pythonhosted.org/packages/78/b6/6307fbef88d9b5ee7421e68d78a9f162e0da4900bc5f5793f6d3d0e34fb8/annotated_types-0.7.0-py3-none-any.whl", hash = "sha256:1f02e8b43a8fbbc3f3e0d4f0f4bfc8131bcb4eebe8849b8e5c773f3a1c582a53", size = 13643, upload-time = "2024-05-20T21:33:24.1Z" },
]

[[package]]
name = "anyio"
version = "4.13.0"
source = { registry = "https://pypi.org/simple" }
dependencies = [
    { name = "idna" },
]
sdist = { url = "https://files.pythonhosted.org/packages/19/14/2c5dd9f512b66549ae92767a9c7b330ae88e1932ca57876909410251fe13/anyio-4.13.0.tar.gz", hash = "sha256:334b70e641fd2221c1505b3890c69882fe4a2df910cba14d97019b90b24439dc", size = 231622, upload-time = "2026-03-24T12:59:09.671Z" }
wheels = [
    { url = "https://files.pythonhosted.org/packages/da/42/e921fccf5015463e32a3cf6ee7f980a6ed0f395ceeaa45060b61d86486c2/anyio-4.13.0-py3-none-any.whl", hash = "sha256:08b310f9e24a9594186fd75b4f73f4a4152069e3853f1ed8bfbf58369f4ad708", size = 114353, upload-time = "2026-03-24T12:59:08.246Z" },
]

[[package]]
name = "certifi"
version = "2026.2.25"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/af/2d/7bf41579a8986e348fa033a31cdd0e4121114f6bce2457e8876010b092dd/certifi-2026.2.25.tar.gz", hash = "sha256:e887ab5cee78ea814d3472169153c2d12cd43b14bd03329a39a9c6e2e80bfba7", size = 155029, upload-time = "2026-02-25T02:54:17.342Z" }
wheels = [
    { url = "https://files.pythonhosted.org/packages/9a/3c/c17fb3ca2d9c3acff52e30b309f538586f9f5b9c9cf454f3845fc9af4881/certifi-2026.2.25-py3-none-any.whl", hash = "sha256:027692e4402ad994f1c42e52a4997a9763c646b73e4096e4d5d6db8af1d6f0fa", size = 153684, upload-time = "2026-02-25T02:54:15.766Z" },
]

[[package]]
name = "click"
version = "8.3.2"
source = { registry = "https://pypi.org/simple" }
dependencies = [
    { name = "colorama", marker = "sys_platform == 'win32'" },
]
sdist = { url = "https://files.pythonhosted.org/packages/57/75/31212c6bf2503fdf920d87fee5d7a86a2e3bcf444984126f13d8e4016804/click-8.3.2.tar.gz", hash = "sha256:14162b8b3b3550a7d479eafa77dfd3c38d9dc8951f6f69c78913a8f9a7540fd5", size = 302856, upload-time = "2026-04-03T19:14:45.118Z" }
wheels = [
    { url = "https://files.pythonhosted.org/packages/e4/20/71885d8b97d4f3dde17b1fdb92dbd4908b00541c5a3379787137285f602e/click-8.3.2-py3-none-any.whl", hash = "sha256:1924d2c27c5653561cd2cae4548d1406039cb79b858b747cfea24924bbc1616d", size = 108379, upload-time = "2026-04-03T19:14:43.505Z" },
]

[[package]]
name = "colorama"
version = "0.4.6"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/d8/53/6f443c9a4a8358a93a6792e2acffb9d9d5cb0a5cfd8802644b7b1c9a02e4/colorama-0.4.6.tar.gz", hash = "sha256:08695f5cb7ed6e0531a20572697297273c47b8cae5a63ffc6d6ed5c201be6e44", size = 27697, upload-time = "2022-10-25T02:36:22.414Z" }
wheels = [
    { url = "https://files.pythonhosted.org/packages/d1/d6/3965ed04c63042e047cb6a3e6ed1a63a35087b6a609aa3a15ed8ac56c221/colorama-0.4.6-py2.py3-none-any.whl", hash = "sha256:4f1d9991f5acc0ca119f9d443620b77f9d6b33703e51011c16baf57afb285fc6", size = 25335, upload-time = "2022-10-25T02:36:20.889Z" },
]

[[package]]
name = "fastapi"
version = "0.135.3"
source = { registry = "https://pypi.org/simple" }
dependencies = [
    { name = "annotated-doc" },
    { name = "pydantic" },
    { name = "starlette" },
    { name = "typing-extensions" },
    { name = "typing-inspection" },
]
sdist = { url = "https://files.pythonhosted.org/packages/f7/e6/7adb4c5fa231e82c35b8f5741a9f2d055f520c29af5546fd70d3e8e1cd2e/fastapi-0.135.3.tar.gz", hash = "sha256:bd6d7caf1a2bdd8d676843cdcd2287729572a1ef524fc4d65c17ae002a1be654", size = 396524, upload-time = "2026-04-01T16:23:58.188Z" }
wheels = [
    { url = "https://files.pythonhosted.org/packages/84/a4/5caa2de7f917a04ada20018eccf60d6cc6145b0199d55ca3711b0fc08312/fastapi-0.135.3-py3-none-any.whl", hash = "sha256:9b0f590c813acd13d0ab43dd8494138eb58e484bfac405db1f3187cfc5810d98", size = 117734, upload-time = "2026-04-01T16:23:59.328Z" },
]

[[package]]
name = "gibby"
version = "0.1.0"
source = { virtual = "." }
dependencies = [
    { name = "fastapi" },
    { name = "httpx" },
    { name = "pygments" },
    { name = "uvicorn" },
]

[package.dev-dependencies]
dev = [
    { name = "pytest" },
    { name = "pytest-asyncio" },
]

[package.metadata]
requires-dist = [
    { name = "fastapi", specifier = ">=0.116.1" },
    { name = "httpx", specifier = ">=0.28.1" },
    { name = "pygments", specifier = ">=2.20.0" },
    { name = "uvicorn", specifier = ">=0.35.0" },
]

[package.metadata.requires-dev]
dev = [
    { name = "pytest", specifier = ">=8.4.1" },
    { name = "pytest-asyncio", specifier = ">=1.1.0" },
]

[[package]]
name = "h11"
version = "0.16.0"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/01/ee/02a2c011bdab74c6fb3c75474d40b3052059d95df7e73351460c8588d963/h11-0.16.0.tar.gz", hash = "sha256:4e35b956cf45792e4caa5885e69fba00bdbc6ffafbfa020300e549b208ee5ff1", size = 101250, upload-time = "2025-04-24T03:35:25.427Z" }
wheels = [
    { url = "https://files.pythonhosted.org/packages/04/4b/29cac41a4d98d144bf5f6d33995617b185d14b22401f75ca86f384e87ff1/h11-0.16.0-py3-none-any.whl", hash = "sha256:63cf8bbe7522de3bf65932fda1d9c2772064ffb3dae62d55932da54b31cb6c86", size = 37515, upload-time = "2025-04-24T03:35:24.344Z" },
]

[[package]]
name = "httpcore"
version = "1.0.9"
source = { registry = "https://pypi.org/simple" }
dependencies = [
    { name = "certifi" },
    { name = "h11" },
]
sdist = { url = "https://files.pythonhosted.org/packages/06/94/82699a10bca87a5556c9c59b5963f2d039dbd239f25bc2a63907a05a14cb/httpcore-1.0.9.tar.gz", hash = "sha256:6e34463af53fd2ab5d807f399a9b45ea31c3dfa2276f15a2c3f00afff6e176e8", size = 85484, upload-time = "2025-04-24T22:06:22.219Z" }
wheels = [
    { url = "https://files.pythonhosted.org/packages/7e/f5/f66802a942d491edb555dd61e3a9961140fd64c90bce1eafd741609d334d/httpcore-1.0.9-py3-none-any.whl", hash = "sha256:2d400746a40668fc9dec9810239072b40b4484b640a8c38fd654a024c7a1bf55", size = 78784, upload-time = "2025-04-24T22:06:20.566Z" },
]

[[package]]
name = "httpx"
version = "0.28.1"
source = { registry = "https://pypi.org/simple" }
dependencies = [
    { name = "anyio" },
    { name = "certifi" },
    { name = "httpcore" },
    { name = "idna" },
]
sdist = { url = "https://files.pythonhosted.org/packages/b1/df/48c586a5fe32a0f01324ee087459e112ebb7224f646c0b5023f5e79e9956/httpx-0.28.1.tar.gz", hash = "sha256:75e98c5f16b0f35b567856f597f06ff2270a374470a5c2392242528e3e3e42fc", size = 141406, upload-time = "2024-12-06T15:37:23.222Z" }
wheels = [
    { url = "https://files.pythonhosted.org/packages/2a/39/e50c7c3a983047577ee07d2a9e53faf5a69493943ec3f6a384bdc792deb2/httpx-0.28.1-py3-none-any.whl", hash = "sha256:d909fcccc110f8c7faf814ca82a9a4d816bc5a6dbfea25d6591d6985b8ba59ad", size = 73517, upload-time = "2024-12-06T15:37:21.509Z" },
]

[[package]]
name = "idna"
version = "3.11"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/6f/6d/0703ccc57f3a7233505399edb88de3cbd678da106337b9fcde432b65ed60/idna-3.11.tar.gz", hash = "sha256:795dafcc9c04ed0c1fb032c2aa73654d8e8c5023a7df64a53f39190ada629902", size = 194582, upload-time = "2025-10-12T14:55:20.501Z" }
wheels = [
    { url = "https://files.pythonhosted.org/packages/0e/61/66938bbb5fc52dbdf84594873d5b51fb1f7c7794e9c0f5bd885f30bc507b/idna-3.11-py3-none-any.whl", hash = "sha256:771a87f49d9defaf64091e6e6fe9c18d4833f140bd19464795bc32d966ca37ea", size = 71008, upload-time = "2025-10-12T14:55:18.883Z" },
]

[[package]]
name = "iniconfig"
version = "2.3.0"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/72/34/14ca021ce8e5dfedc35312d08ba8bf51fdd999c576889fc2c24cb97f4f10/iniconfig-2.3.0.tar.gz", hash = "sha256:c76315c77db068650d49c5b56314774a7804df16fee4402c1f19d6d15d8c4730", size = 20503, upload-time = "2025-10-18T21:55:43.219Z" }
wheels = [
    { url = "https://files.pythonhosted.org/packages/cb/b1/3846dd7f199d53cb17f49cba7e651e9ce294d8497c8c150530ed11865bb8/iniconfig-2.3.0-py3-none-any.whl", hash = "sha256:f631c04d2c48c52b84d0d0549c99ff3859c98df65b3101406327ecc7d53fbf12", size = 7484, upload-time = "2025-10-18T21:55:41.639Z" },
]

[[package]]
name = "packaging"
version = "26.0"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/65/ee/299d360cdc32edc7d2cf530f3accf79c4fca01e96ffc950d8a52213bd8e4/packaging-26.0.tar.gz", hash = "sha256:00243ae351a257117b6a241061796684b084ed1c516a08c48a3f7e147a9d80b4", size = 143416, upload-time = "2026-01-21T20:50:39.064Z" }
wheels = [
    { url = "https://files.pythonhosted.org/packages/b7/b9/c538f279a4e237a006a2c98387d081e9eb060d203d8ed34467cc0f0b9b53/packaging-26.0-py3-none-any.whl", hash = "sha256:b36f1fef9334a5588b4166f8bcd26a14e521f2b55e6b9de3aaa80d3ff7a37529", size = 74366, upload-time = "2026-01-21T20:50:37.788Z" },
]

[[package]]
name = "pluggy"
version = "1.6.0"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/f9/e2/3e91f31a7d2b083fe6ef3fa267035b518369d9511ffab804f839851d2779/pluggy-1.6.0.tar.gz", hash = "sha256:7dcc130b76258d33b90f61b658791dede3486c3e6bfb003ee5c9bfb396dd22f3", size = 69412, upload-time = "2025-05-15T12:30:07.975Z" }
wheels = [
    { url = "https://files.pythonhosted.org/packages/54/20/4d324d65cc6d9205fabedc306948156824eb9f0ee1633355a8f7ec5c66bf/pluggy-1.6.0-py3-none-any.whl", hash = "sha256:e920276dd6813095e9377c0bc5566d94c932c33b27a3e3945d8389c374dd4746", size = 20538, upload-time = "2025-05-15T12:30:06.134Z" },
]

[[package]]
name = "pydantic"
version = "2.12.5"
source = { registry = "https://pypi.org/simple" }
dependencies = [
    { name = "annotated-types" },
    { name = "pydantic-core" },
    { name = "typing-extensions" },
    { name = "typing-inspection" },
]
sdist = { url = "https://files.pythonhosted.org/packages/69/44/36f1a6e523abc58ae5f928898e4aca2e0ea509b5aa6f6f392a5d882be928/pydantic-2.12.5.tar.gz", hash = "sha256:4d351024c75c0f085a9febbb665ce8c0c6ec5d30e903bdb6394b7ede26aebb49", size = 821591, upload-time = "2025-11-26T15:11:46.471Z" }
wheels = [
    { url = "https://files.pythonhosted.org/packages/5a/87/b70ad306ebb6f9b585f114d0ac2137d792b48be34d732d60e597c2f8465a/pydantic-2.12.5-py3-none-any.whl", hash = "sha256:e561593fccf61e8a20fc46dfc2dfe075b8be7d0188df33f221ad1f0139180f9d", size = 463580, upload-time = "2025-11-26T15:11:44.605Z" },
]

[[package]]
name = "pydantic-core"
version = "2.41.5"
source = { registry = "https://pypi.org/simple" }
dependencies = [
    { name = "typing-extensions" },
]
sdist = { url = "https://files.pythonhosted.org/packages/71/70/23b021c950c2addd24ec408e9ab05d59b035b39d97cdc1130e1bce647bb6/pydantic_core-2.41.5.tar.gz", hash = "sha256:08daa51ea16ad373ffd5e7606252cc32f07bc72b28284b6bc9c6df804816476e", size = 460952, upload-time = "2025-11-04T13:43:49.098Z" }
wheels = [
    { url = "https://files.pythonhosted.org/packages/ea/28/46b7c5c9635ae96ea0fbb779e271a38129df2550f763937659ee6c5dbc65/pydantic_core-2.41.5-cp314-cp314-macosx_10_12_x86_64.whl", hash = "sha256:3f37a19d7ebcdd20b96485056ba9e8b304e27d9904d233d7b1015db320e51f0a", size = 2119622, upload-time = "2025-11-04T13:40:56.68Z" },
    { url = "https://files.pythonhosted.org/packages/74/1a/145646e5687e8d9a1e8d09acb278c8535ebe9e972e1f162ed338a622f193/pydantic_core-2.41.5-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:1d1d9764366c73f996edd17abb6d9d7649a7eb690006ab6adbda117717099b14", size = 1891725, upload-time = "2025-11-04T13:40:58.807Z" },
    { url = "https://files.pythonhosted.org/packages/23/04/e89c29e267b8060b40dca97bfc64a19b2a3cf99018167ea1677d96368273/pydantic_core-2.41.5-cp314-cp314-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:25e1c2af0fce638d5f1988b686f3b3ea8cd7de5f244ca147c777769e798a9cd1", size = 1915040, upload-time = "2025-11-04T13:41:00.853Z" },
    { url = "https://files.pythonhosted.org/packages/84/a3/15a82ac7bd97992a82257f777b3583d3e84bdb06ba6858f745daa2ec8a85/pydantic_core-2.41.5-cp314-cp314-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:506d766a8727beef16b7adaeb8ee6217c64fc813646b424d0804d67c16eddb66", size = 2063691, upload-time = "2025-11-04T13:41:03.504Z" },
    { url = "https://files.pythonhosted.org/packages/74/9b/0046701313c6ef08c0c1cf0e028c67c770a4e1275ca73131563c5f2a310a/pydantic_core-2.41.5-cp314-cp314-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:4819fa52133c9aa3c387b3328f25c1facc356491e6135b459f1de698ff64d869", size = 2213897, upload-time = "2025-11-04T13:41:05.804Z" },
    { url = "https://files.pythonhosted.org/packages/8a/cd/6bac76ecd1b27e75a95ca3a9a559c643b3afcd2dd62086d4b7a32a18b169/pydantic_core-2.41.5-cp314-cp314-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:2b761d210c9ea91feda40d25b4efe82a1707da2ef62901466a42492c028553a2", size = 2333302, upload-time = "2025-11-04T13:41:07.809Z" },
    { url = "https://files.pythonhosted.org/packages/4c/d2/ef2074dc020dd6e109611a8be4449b98cd25e1b9b8a303c2f0fca2f2bcf7/pydantic_core-2.41.5-cp314-cp314-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:22f0fb8c1c583a3b6f24df2470833b40207e907b90c928cc8d3594b76f874375", size = 2064877, upload-time = "2025-11-04T13:41:09.827Z" },
    { url = "https://files.pythonhosted.org/packages/18/66/e9db17a9a763d72f03de903883c057b2592c09509ccfe468187f2a2eef29/pydantic_core-2.41.5-cp314-cp314-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:2782c870e99878c634505236d81e5443092fba820f0373997ff75f90f68cd553", size = 2180680, upload-time = "2025-11-04T13:41:12.379Z" },
    { url = "https://files.pythonhosted.org/packages/d3/9e/3ce66cebb929f3ced22be85d4c2399b8e85b622db77dad36b73c5387f8f8/pydantic_core-2.41.5-cp314-cp314-musllinux_1_1_aarch64.whl", hash = "sha256:0177272f88ab8312479336e1d777f6b124537d47f2123f89cb37e0accea97f90", size = 2138960, upload-time = "2025-11-04T13:41:14.627Z" },
    { url = "https://files.pythonhosted.org/packages/a6/62/205a998f4327d2079326b01abee48e502ea739d174f0a89295c481a2272e/pydantic_core-2.41.5-cp314-cp314-musllinux_1_1_armv7l.whl", hash = "sha256:63510af5e38f8955b8ee5687740d6ebf7c2a0886d15a6d65c32814613681bc07", size = 2339102, upload-time = "2025-11-04T13:41:16.868Z" },
    { url = "https://files.pythonhosted.org/packages/3c/0d/f05e79471e889d74d3d88f5bd20d0ed189ad94c2423d81ff8d0000aab4ff/pydantic_core-2.41.5-cp314-cp314-musllinux_1_1_x86_64.whl", hash = "sha256:e56ba91f47764cc14f1daacd723e3e82d1a89d783f0f5afe9c364b8bb491ccdb", size = 2326039, upload-time = "2025-11-04T13:41:18.934Z" },
    { url = "https://files.pythonhosted.org/packages/ec/e1/e08a6208bb100da7e0c4b288eed624a703f4d129bde2da475721a80cab32/pydantic_core-2.41.5-cp314-cp314-win32.whl", hash = "sha256:aec5cf2fd867b4ff45b9959f8b20ea3993fc93e63c7363fe6851424c8a7e7c23", size = 1995126, upload-time = "2025-11-04T13:41:21.418Z" },
    { url = "https://files.pythonhosted.org/packages/48/5d/56ba7b24e9557f99c9237e29f5c09913c81eeb2f3217e40e922353668092/pydantic_core-2.41.5-cp314-cp314-win_amd64.whl", hash = "sha256:8e7c86f27c585ef37c35e56a96363ab8de4e549a95512445b85c96d3e2f7c1bf", size = 2015489, upload-time = "2025-11-04T13:41:24.076Z" },
    { url = "https://files.pythonhosted.org/packages/4e/bb/f7a190991ec9e3e0ba22e4993d8755bbc4a32925c0b5b42775c03e8148f9/pydantic_core-2.41.5-cp314-cp314-win_arm64.whl", hash = "sha256:e672ba74fbc2dc8eea59fb6d4aed6845e6905fc2a8afe93175d94a83ba2a01a0", size = 1977288, upload-time = "2025-11-04T13:41:26.33Z" },
    { url = "https://files.pythonhosted.org/packages/92/ed/77542d0c51538e32e15afe7899d79efce4b81eee631d99850edc2f5e9349/pydantic_core-2.41.5-cp314-cp314t-macosx_10_12_x86_64.whl", hash = "sha256:8566def80554c3faa0e65ac30ab0932b9e3a5cd7f8323764303d468e5c37595a", size = 2120255, upload-time = "2025-11-04T13:41:28.569Z" },
    { url = "https://files.pythonhosted.org/packages/bb/3d/6913dde84d5be21e284439676168b28d8bbba5600d838b9dca99de0fad71/pydantic_core-2.41.5-cp314-cp314t-macosx_11_0_arm64.whl", hash = "sha256:b80aa5095cd3109962a298ce14110ae16b8c1aece8b72f9dafe81cf597ad80b3", size = 1863760, upload-time = "2025-11-04T13:41:31.055Z" },
    { url = "https://files.pythonhosted.org/packages/5a/f0/e5e6b99d4191da102f2b0eb9687aaa7f5bea5d9964071a84effc3e40f997/pydantic_core-2.41.5-cp314-cp314t-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:3006c3dd9ba34b0c094c544c6006cc79e87d8612999f1a5d43b769b89181f23c", size = 1878092, upload-time = "2025-11-04T13:41:33.21Z" },
    { url = "https://files.pythonhosted.org/packages/71/48/36fb760642d568925953bcc8116455513d6e34c4beaa37544118c36aba6d/pydantic_core-2.41.5-cp314-cp314t-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:72f6c8b11857a856bcfa48c86f5368439f74453563f951e473514579d44aa612", size = 2053385, upload-time = "2025-11-04T13:41:35.508Z" },
|
{ url = "https://files.pythonhosted.org/packages/20/25/92dc684dd8eb75a234bc1c764b4210cf2646479d54b47bf46061657292a8/pydantic_core-2.41.5-cp314-cp314t-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:5cb1b2f9742240e4bb26b652a5aeb840aa4b417c7748b6f8387927bc6e45e40d", size = 2218832, upload-time = "2025-11-04T13:41:37.732Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/e2/09/f53e0b05023d3e30357d82eb35835d0f6340ca344720a4599cd663dca599/pydantic_core-2.41.5-cp314-cp314t-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:bd3d54f38609ff308209bd43acea66061494157703364ae40c951f83ba99a1a9", size = 2327585, upload-time = "2025-11-04T13:41:40Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/aa/4e/2ae1aa85d6af35a39b236b1b1641de73f5a6ac4d5a7509f77b814885760c/pydantic_core-2.41.5-cp314-cp314t-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:2ff4321e56e879ee8d2a879501c8e469414d948f4aba74a2d4593184eb326660", size = 2041078, upload-time = "2025-11-04T13:41:42.323Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/cd/13/2e215f17f0ef326fc72afe94776edb77525142c693767fc347ed6288728d/pydantic_core-2.41.5-cp314-cp314t-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:d0d2568a8c11bf8225044aa94409e21da0cb09dcdafe9ecd10250b2baad531a9", size = 2173914, upload-time = "2025-11-04T13:41:45.221Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/02/7a/f999a6dcbcd0e5660bc348a3991c8915ce6599f4f2c6ac22f01d7a10816c/pydantic_core-2.41.5-cp314-cp314t-musllinux_1_1_aarch64.whl", hash = "sha256:a39455728aabd58ceabb03c90e12f71fd30fa69615760a075b9fec596456ccc3", size = 2129560, upload-time = "2025-11-04T13:41:47.474Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/3a/b1/6c990ac65e3b4c079a4fb9f5b05f5b013afa0f4ed6780a3dd236d2cbdc64/pydantic_core-2.41.5-cp314-cp314t-musllinux_1_1_armv7l.whl", hash = "sha256:239edca560d05757817c13dc17c50766136d21f7cd0fac50295499ae24f90fdf", size = 2329244, upload-time = "2025-11-04T13:41:49.992Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/d9/02/3c562f3a51afd4d88fff8dffb1771b30cfdfd79befd9883ee094f5b6c0d8/pydantic_core-2.41.5-cp314-cp314t-musllinux_1_1_x86_64.whl", hash = "sha256:2a5e06546e19f24c6a96a129142a75cee553cc018ffee48a460059b1185f4470", size = 2331955, upload-time = "2025-11-04T13:41:54.079Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/5c/96/5fb7d8c3c17bc8c62fdb031c47d77a1af698f1d7a406b0f79aaa1338f9ad/pydantic_core-2.41.5-cp314-cp314t-win32.whl", hash = "sha256:b4ececa40ac28afa90871c2cc2b9ffd2ff0bf749380fbdf57d165fd23da353aa", size = 1988906, upload-time = "2025-11-04T13:41:56.606Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/22/ed/182129d83032702912c2e2d8bbe33c036f342cc735737064668585dac28f/pydantic_core-2.41.5-cp314-cp314t-win_amd64.whl", hash = "sha256:80aa89cad80b32a912a65332f64a4450ed00966111b6615ca6816153d3585a8c", size = 1981607, upload-time = "2025-11-04T13:41:58.889Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/9f/ed/068e41660b832bb0b1aa5b58011dea2a3fe0ba7861ff38c4d4904c1c1a99/pydantic_core-2.41.5-cp314-cp314t-win_arm64.whl", hash = "sha256:35b44f37a3199f771c3eaa53051bc8a70cd7b54f333531c59e29fd4db5d15008", size = 1974769, upload-time = "2025-11-04T13:42:01.186Z" },
|
||||||
|
]
|
||||||
|
|
||||||
|
[[package]]
|
||||||
|
name = "pygments"
|
||||||
|
version = "2.20.0"
|
||||||
|
source = { registry = "https://pypi.org/simple" }
|
||||||
|
sdist = { url = "https://files.pythonhosted.org/packages/c3/b2/bc9c9196916376152d655522fdcebac55e66de6603a76a02bca1b6414f6c/pygments-2.20.0.tar.gz", hash = "sha256:6757cd03768053ff99f3039c1a36d6c0aa0b263438fcab17520b30a303a82b5f", size = 4955991, upload-time = "2026-03-29T13:29:33.898Z" }
|
||||||
|
wheels = [
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/f4/7e/a72dd26f3b0f4f2bf1dd8923c85f7ceb43172af56d63c7383eb62b332364/pygments-2.20.0-py3-none-any.whl", hash = "sha256:81a9e26dd42fd28a23a2d169d86d7ac03b46e2f8b59ed4698fb4785f946d0176", size = 1231151, upload-time = "2026-03-29T13:29:30.038Z" },
|
||||||
|
]
|
||||||
|
|
||||||
|
[[package]]
|
||||||
|
name = "pytest"
|
||||||
|
version = "9.0.3"
|
||||||
|
source = { registry = "https://pypi.org/simple" }
|
||||||
|
dependencies = [
|
||||||
|
{ name = "colorama", marker = "sys_platform == 'win32'" },
|
||||||
|
{ name = "iniconfig" },
|
||||||
|
{ name = "packaging" },
|
||||||
|
{ name = "pluggy" },
|
||||||
|
{ name = "pygments" },
|
||||||
|
]
|
||||||
|
sdist = { url = "https://files.pythonhosted.org/packages/7d/0d/549bd94f1a0a402dc8cf64563a117c0f3765662e2e668477624baeec44d5/pytest-9.0.3.tar.gz", hash = "sha256:b86ada508af81d19edeb213c681b1d48246c1a91d304c6c81a427674c17eb91c", size = 1572165, upload-time = "2026-04-07T17:16:18.027Z" }
|
||||||
|
wheels = [
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/d4/24/a372aaf5c9b7208e7112038812994107bc65a84cd00e0354a88c2c77a617/pytest-9.0.3-py3-none-any.whl", hash = "sha256:2c5efc453d45394fdd706ade797c0a81091eccd1d6e4bccfcd476e2b8e0ab5d9", size = 375249, upload-time = "2026-04-07T17:16:16.13Z" },
|
||||||
|
]
|
||||||
|
|
||||||
|
[[package]]
|
||||||
|
name = "pytest-asyncio"
|
||||||
|
version = "1.3.0"
|
||||||
|
source = { registry = "https://pypi.org/simple" }
|
||||||
|
dependencies = [
|
||||||
|
{ name = "pytest" },
|
||||||
|
]
|
||||||
|
sdist = { url = "https://files.pythonhosted.org/packages/90/2c/8af215c0f776415f3590cac4f9086ccefd6fd463befeae41cd4d3f193e5a/pytest_asyncio-1.3.0.tar.gz", hash = "sha256:d7f52f36d231b80ee124cd216ffb19369aa168fc10095013c6b014a34d3ee9e5", size = 50087, upload-time = "2025-11-10T16:07:47.256Z" }
|
||||||
|
wheels = [
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/e5/35/f8b19922b6a25bc0880171a2f1a003eaeb93657475193ab516fd87cac9da/pytest_asyncio-1.3.0-py3-none-any.whl", hash = "sha256:611e26147c7f77640e6d0a92a38ed17c3e9848063698d5c93d5aa7aa11cebff5", size = 15075, upload-time = "2025-11-10T16:07:45.537Z" },
|
||||||
|
]
|
||||||
|
|
||||||
|
[[package]]
|
||||||
|
name = "starlette"
|
||||||
|
version = "1.0.0"
|
||||||
|
source = { registry = "https://pypi.org/simple" }
|
||||||
|
dependencies = [
|
||||||
|
{ name = "anyio" },
|
||||||
|
]
|
||||||
|
sdist = { url = "https://files.pythonhosted.org/packages/81/69/17425771797c36cded50b7fe44e850315d039f28b15901ab44839e70b593/starlette-1.0.0.tar.gz", hash = "sha256:6a4beaf1f81bb472fd19ea9b918b50dc3a77a6f2e190a12954b25e6ed5eea149", size = 2655289, upload-time = "2026-03-22T18:29:46.779Z" }
|
||||||
|
wheels = [
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/0b/c9/584bc9651441b4ba60cc4d557d8a547b5aff901af35bda3a4ee30c819b82/starlette-1.0.0-py3-none-any.whl", hash = "sha256:d3ec55e0bb321692d275455ddfd3df75fff145d009685eb40dc91fc66b03d38b", size = 72651, upload-time = "2026-03-22T18:29:45.111Z" },
|
||||||
|
]
|
||||||
|
|
||||||
|
[[package]]
|
||||||
|
name = "typing-extensions"
|
||||||
|
version = "4.15.0"
|
||||||
|
source = { registry = "https://pypi.org/simple" }
|
||||||
|
sdist = { url = "https://files.pythonhosted.org/packages/72/94/1a15dd82efb362ac84269196e94cf00f187f7ed21c242792a923cdb1c61f/typing_extensions-4.15.0.tar.gz", hash = "sha256:0cea48d173cc12fa28ecabc3b837ea3cf6f38c6d1136f85cbaaf598984861466", size = 109391, upload-time = "2025-08-25T13:49:26.313Z" }
|
||||||
|
wheels = [
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/18/67/36e9267722cc04a6b9f15c7f3441c2363321a3ea07da7ae0c0707beb2a9c/typing_extensions-4.15.0-py3-none-any.whl", hash = "sha256:f0fa19c6845758ab08074a0cfa8b7aecb71c999ca73d62883bc25cc018c4e548", size = 44614, upload-time = "2025-08-25T13:49:24.86Z" },
|
||||||
|
]
|
||||||
|
|
||||||
|
[[package]]
|
||||||
|
name = "typing-inspection"
|
||||||
|
version = "0.4.2"
|
||||||
|
source = { registry = "https://pypi.org/simple" }
|
||||||
|
dependencies = [
|
||||||
|
{ name = "typing-extensions" },
|
||||||
|
]
|
||||||
|
sdist = { url = "https://files.pythonhosted.org/packages/55/e3/70399cb7dd41c10ac53367ae42139cf4b1ca5f36bb3dc6c9d33acdb43655/typing_inspection-0.4.2.tar.gz", hash = "sha256:ba561c48a67c5958007083d386c3295464928b01faa735ab8547c5692e87f464", size = 75949, upload-time = "2025-10-01T02:14:41.687Z" }
|
||||||
|
wheels = [
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/dc/9b/47798a6c91d8bdb567fe2698fe81e0c6b7cb7ef4d13da4114b41d239f65d/typing_inspection-0.4.2-py3-none-any.whl", hash = "sha256:4ed1cacbdc298c220f1bd249ed5287caa16f34d44ef4e9c3d0cbad5b521545e7", size = 14611, upload-time = "2025-10-01T02:14:40.154Z" },
|
||||||
|
]
|
||||||
|
|
||||||
|
[[package]]
|
||||||
|
name = "uvicorn"
|
||||||
|
version = "0.44.0"
|
||||||
|
source = { registry = "https://pypi.org/simple" }
|
||||||
|
dependencies = [
|
||||||
|
{ name = "click" },
|
||||||
|
{ name = "h11" },
|
||||||
|
]
|
||||||
|
sdist = { url = "https://files.pythonhosted.org/packages/5e/da/6eee1ff8b6cbeed47eeb5229749168e81eb4b7b999a1a15a7176e51410c9/uvicorn-0.44.0.tar.gz", hash = "sha256:6c942071b68f07e178264b9152f1f16dfac5da85880c4ce06366a96d70d4f31e", size = 86947, upload-time = "2026-04-06T09:23:22.826Z" }
|
||||||
|
wheels = [
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/b7/23/a5bbd9600dd607411fa644c06ff4951bec3a4d82c4b852374024359c19c0/uvicorn-0.44.0-py3-none-any.whl", hash = "sha256:ce937c99a2cc70279556967274414c087888e8cec9f9c94644dfca11bd3ced89", size = 69425, upload-time = "2026-04-06T09:23:21.524Z" },
|
||||||
|
]
|
||||||