Using Online Services in Your Code

Online services expose functionality over HTTP APIs — weather, translation, search, payments, storage. Calling them from Python is mostly a matter of authenticating, hitting the right endpoint with the right parameters, and parsing the JSON response. The hard parts are credential handling, rate limits, retries, and the perpetually outdated documentation of whichever service you are integrating.

Most services authenticate with either a static API key (passed as a header or query parameter) or OAuth2. For a static key, store it in an environment variable and load it at startup (API_KEY = os.environ["SERVICE_API_KEY"]); never commit it to source control. For OAuth2, use a library (authlib, requests-oauthlib) instead of reinventing the handshake.
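A sketch of the static-key case using only the standard library (the X-API-Key header name and the SERVICE_API_KEY variable are placeholders; real services document their own):

```python
import os
import urllib.request

# Load the key once at startup; real code would use os.environ["SERVICE_API_KEY"]
# and crash early if it is missing, rather than fall back to a default.
API_KEY = os.environ.get("SERVICE_API_KEY", "DEV-ONLY-KEY")


def authed_request(url: str) -> urllib.request.Request:
    # Static-key auth: the key travels in a header, never in the code.
    return urllib.request.Request(url, headers={"X-API-Key": API_KEY})


req = authed_request("https://api.example.com/v1/ping")
# urllib stores header names capitalized internally:
print(req.get_header("X-api-key"))
```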

Wrap each service in a small client class. The class owns the session, default headers, timeouts, and error translation. Methods on it map to the service's endpoints (client.get_user(id), client.create_order(order)). That way the rest of your code never sees HTTP details, and the day the API changes you have one file to update.
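The error-translation part of such a wrapper can look like this minimal sketch (ServiceError, UserClient, and the /v1/users path are made up for illustration):

```python
import json
import urllib.error
import urllib.request


class ServiceError(Exception):
    """One domain exception wrapping every HTTP failure from the service."""

    def __init__(self, status: int, message: str):
        super().__init__(f"{status}: {message}")
        self.status = status


class UserClient:
    def __init__(self, base: str):
        self.base = base.rstrip("/")

    def _get(self, path: str) -> dict:
        try:
            with urllib.request.urlopen(self.base + path, timeout=10) as resp:
                return json.load(resp)
        except urllib.error.HTTPError as err:
            # Translate transport-level errors into the domain exception,
            # so callers never have to import urllib themselves.
            raise ServiceError(err.code, err.reason) from err

    def get_user(self, user_id: int) -> dict:
        return self._get(f"/v1/users/{user_id}")
```

Callers catch ServiceError; if the API (or the HTTP library) changes, only this one file changes with it.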

Rate limits, pagination, and transient errors are the three practical gotchas. Obey the Retry-After header on 429 responses. Iterate through paginated lists with a generator that follows the next cursor. Retry idempotent requests on 5xx with exponential backoff; never retry non-idempotent POSTs without an idempotency key.
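For the idempotency-key case, a common convention (the Idempotency-Key header name is widely used but service-specific; check your provider's docs) is to generate one key per logical operation and reuse it on every retry:

```python
import uuid


def make_idempotent_headers() -> dict:
    # One key per *operation*, generated before the first attempt and
    # reused unchanged on every retry, so the server can deduplicate
    # a POST that was sent twice.
    return {"Idempotency-Key": str(uuid.uuid4())}


headers = make_idempotent_headers()
# Retrying with the SAME headers dict is safe: the key does not change.
print(headers["Idempotency-Key"])
```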

Client class and auth

A minimal client owns one session with the auth header set up front:

class WeatherClient:
    def __init__(self, key):
        self.s = requests.Session()
        self.s.headers.update({"X-API-Key": key})

Every method goes through self.s, so the headers are set once. Note that requests.Session has no default timeout; pass timeout= on each call.

Keep secrets out of the code. python-dotenv loads a local .env into os.environ for development; in production, rely on the platform's secret manager.
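python-dotenv does the real work in practice; the core idea fits in a few standard-library lines (a simplified sketch that ignores quoting, interpolation, and export keywords):

```python
import os


def load_env_file(path: str) -> None:
    # Minimal .env reader: one KEY=VALUE per line, '#' starts a comment.
    # setdefault means real environment variables win over the file.
    with open(path) as fh:
        for line in fh:
            line = line.strip()
            if not line or line.startswith("#") or "=" not in line:
                continue
            key, _, value = line.partition("=")
            os.environ.setdefault(key.strip(), value.strip())
```

In development this fills os.environ from a local .env; in production the function is never called and the platform's secret manager provides the variables.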

Pagination, rate limits, retries

Pagination: yield pages lazily.

while url:
    r = get(url)
    yield from r["items"]
    url = r.get("next")

That keeps memory flat for huge lists.
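A runnable version of that loop, with a fake get() standing in for the network call:

```python
# Fake paged API: three pages linked by a "next" cursor.
PAGES = {
    "/items?page=1": {"items": [1, 2], "next": "/items?page=2"},
    "/items?page=2": {"items": [3, 4], "next": "/items?page=3"},
    "/items?page=3": {"items": [5]},  # no "next" key: last page
}


def get(url: str) -> dict:
    return PAGES[url]  # stand-in for session.get(url).json()


def iter_items(url: str):
    # Generator: one page in memory at a time, however long the list.
    while url:
        page = get(url)
        yield from page["items"]
        url = page.get("next")


print(list(iter_items("/items?page=1")))  # prints [1, 2, 3, 4, 5]
```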

Rate limits: 429 means “slow down”. Read Retry-After, sleep, retry. urllib3.util.Retry and tenacity implement this for you.
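Hand-rolled for illustration (tenacity or urllib3.util.Retry are better in production), honoring Retry-After on 429:

```python
import time
import urllib.error


def fetch_with_retry(fetch, url, attempts=3):
    """Call fetch(url); on 429, honor Retry-After, sleep, and retry."""
    for i in range(attempts):
        try:
            return fetch(url)
        except urllib.error.HTTPError as err:
            if err.code != 429 or i == attempts - 1:
                raise  # not a rate limit, or out of attempts
            # The server says how long to back off; default to 1s.
            delay = float(err.headers.get("Retry-After", 1)) if err.headers else 1.0
            time.sleep(delay)


# Demo: a fake endpoint that rate-limits the first two calls.
calls = []


def flaky(url):
    calls.append(url)
    if len(calls) < 3:
        raise urllib.error.HTTPError(url, 429, "Too Many Requests",
                                     {"Retry-After": "0"}, None)
    return "ok"


print(fetch_with_retry(flaky, "https://api.example.com/x"))  # prints ok
```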

Online-service integration tools.

Tool                     Kind      Purpose
requests.Session         class     Reuse connections, share headers.
python-dotenv            library   Load .env into os.environ.
requests-oauthlib        library   OAuth2 flow for requests.
authlib                  library   OAuth1/2 + OpenID Connect.
tenacity                 library   Configurable retry policies.
urllib3.Retry            class     Plug-in retry for requests.
429 Too Many Requests    spec      Rate-limit response code.
ETag / If-None-Match     spec      Cache validation headers.

Full code example

The script builds a small client class for an imaginary weather service: auth via a query-string key, URL construction, retries with backoff, and lazy pagination. It uses only the standard library; the equivalent requests pattern is printed at the end.

# Lesson: Using Online Services in Your Code
import os
import time
import urllib.error
import urllib.parse
import urllib.request


class WeatherClient:
    """Tiny wrapper around a pretend weather API."""

    def __init__(self, base: str, api_key: str):
        self.base = base.rstrip("/")
        self.api_key = api_key

    def _url(self, path: str, **params) -> str:
        qs = urllib.parse.urlencode({**params, "key": self.api_key})
        return f"{self.base}{path}?{qs}"

    def _fetch(self, url: str, *, attempts: int = 3) -> dict:
        for i in range(1, attempts + 1):
            try:
                req = urllib.request.Request(url, headers={"User-Agent": "demo/1.0"})
                with urllib.request.urlopen(req, timeout=5) as resp:
                    return {"status": resp.status, "url": url}
            except urllib.error.URLError:
                if i == attempts:
                    raise  # out of attempts: let the caller see the error
                time.sleep(0.1 * (2 ** (i - 1)))  # backoff: 0.1s, 0.2s, 0.4s, ...
        raise AssertionError("unreachable")

    def current(self, city: str) -> dict:
        return self._fetch(self._url("/v1/current", city=city))

    def list_alerts(self, region: str):
        url = self._url("/v1/alerts", region=region)
        while url:
            page = self._fetch(url)
            yield from page.get("items", [])
            url = page.get("next")


# We don't actually call the network; we show the constructed URL.
client = WeatherClient("https://api.weather.example", api_key="KEY123")
print("current URL:", client._url("/v1/current", city="oslo"))
print("alerts URL :", client._url("/v1/alerts", region="EU"))

# Load a key from env safely
api_key = os.environ.get("WEATHER_API_KEY", "DEV-KEY")
print("using key  :", "***" + api_key[-3:])

# The full pattern with `requests` (for reference):
REQUESTS_PATTERN = '''
import os, requests
session = requests.Session()
session.headers.update({"X-API-Key": os.environ["API_KEY"]})
def get(path, **params):
    r = session.get(f"https://api.example.com{path}", params=params, timeout=10)
    r.raise_for_status()
    return r.json()
'''
print(REQUESTS_PATTERN.strip())

Mapping each block to the lesson:

1) `_url` builds signed URLs with a consistent query format.
2) `_fetch` retries transient network errors with backoff.
3) `list_alerts` paginates lazily — callers iterate over results.
4) Secrets live in `os.environ`, never in the code.

Wrap a public JSON API with a tiny client.

import requests
class GitHubClient:
    def __init__(self, token=None):
        self.s = requests.Session()
        self.s.headers.update({"Accept": "application/vnd.github+json"})
        if token:
            self.s.headers["Authorization"] = f"Bearer {token}"
    def user(self, login):
        r = self.s.get(f"https://api.github.com/users/{login}", timeout=10)
        r.raise_for_status()
        return r.json()
# gh = GitHubClient(); print(gh.user("python")["login"])

URL construction is deterministic.

from urllib.parse import parse_qs, urlsplit
u = "https://api.example/v1/current?city=oslo&key=KEY"
q = parse_qs(urlsplit(u).query)
assert q == {"city": ["oslo"], "key": ["KEY"]}

Running prints:

current URL: https://api.weather.example/v1/current?city=oslo&key=KEY123
alerts URL : https://api.weather.example/v1/alerts?region=EU&key=KEY123
using key  : ***KEY
import os, requests
session = requests.Session()
session.headers.update({"X-API-Key": os.environ["API_KEY"]})
def get(path, **params):
    r = session.get(f"https://api.example.com{path}", params=params, timeout=10)
    r.raise_for_status()
    return r.json()