Hands-on Tutorial

Build Python tests with pytest and unittest.mock

Learn a simple step-by-step approach for testing AWS S3, Azure Blob Storage, REST APIs, boto3 clients, PostgreSQL code, and Temporal Python apps. This page focuses on practical structure, small examples, setup tips, and easy scripts so a new developer can get working quickly.

1) Testing roadmap

The easiest way to test cloud and integration-heavy Python apps is to keep business logic separate from service SDK calls. That lets you mock the SDK layer and test your own logic first, then add a small number of integration tests later.

Step 1

Build small wrappers

Wrap boto3, Azure SDK, requests, psycopg, and Temporal client code in small classes or functions.

Step 2

Mock the boundary

Use unittest.mock.patch or pytest monkeypatch at the place your code imports the dependency.

Step 3

Assert behavior

Check returned values, raised exceptions, retry logic, SQL parameters, headers, and method calls.

Step 4

Add a few integration tests

Use real services only for key end-to-end paths, not every single unit test.
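The four steps above can be sketched end to end with the standard library alone. The FileStore wrapper and its upload method are hypothetical names for this sketch, not part of any real SDK:

```python
from unittest.mock import Mock

# Step 1: a thin wrapper around a (hypothetical) SDK client.
class FileStore:
    def __init__(self, sdk_client):
        self.sdk_client = sdk_client  # the boundary we will mock

    def save(self, name: str, data: bytes) -> str:
        if not data:
            raise ValueError('empty payload')
        self.sdk_client.upload(name=name, body=data)
        return f'saved:{name}'

# Step 2: mock the boundary by injecting a fake client.
fake_sdk = Mock()
store = FileStore(fake_sdk)

# Step 3: assert behavior - the return value and how the SDK was called.
assert store.save('a.txt', b'hi') == 'saved:a.txt'
fake_sdk.upload.assert_called_once_with(name='a.txt', body=b'hi')
```

Step 4 would then swap fake_sdk for a real client inside a separate, clearly marked integration test.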

2) Simple project layout

A clean structure makes testing much easier.

myapp/
├── src/
│   ├── config.py
│   ├── s3_service.py
│   ├── azure_blob_service.py
│   ├── rest_client.py
│   ├── postgres_repo.py
│   ├── temporal_client_app.py
│   └── temporal_activities.py
├── tests/
│   ├── test_s3_service.py
│   ├── test_azure_blob_service.py
│   ├── test_rest_client.py
│   ├── test_postgres_repo.py
│   ├── test_temporal_client_app.py
│   ├── test_temporal_activities.py
│   └── conftest.py
├── requirements.txt
├── pytest.ini
├── run-tests.sh
└── run-tests.bat

3) Setup and configuration

requirements.txt

pytest
pytest-cov
pytest-asyncio
requests
boto3
botocore
azure-storage-blob
psycopg[binary]
temporalio

Note: pytest-asyncio is needed for the @pytest.mark.asyncio tests later in this tutorial.

pytest.ini

[pytest]
testpaths = tests
python_files = test_*.py
addopts = -q --tb=short

4) Easy run scripts

run-tests.sh

#!/usr/bin/env bash
set -e
python -m venv .venv
source .venv/bin/activate
python -m pip install --upgrade pip
pip install -r requirements.txt
pytest -v
pytest --cov=src --cov-report=term-missing

run-tests.bat

@echo off
python -m venv .venv
call .venv\Scripts\activate
python -m pip install --upgrade pip
pip install -r requirements.txt
pytest -v
pytest --cov=src --cov-report=term-missing

5) Core testing pattern

Write code so the external dependency is easy to replace in tests.

Example service

# src/rest_client.py
import requests


class UserApiClient:
    def __init__(self, base_url: str):
        self.base_url = base_url.rstrip('/')

    def get_user(self, user_id: str) -> dict:
        response = requests.get(f"{self.base_url}/users/{user_id}", timeout=10)
        response.raise_for_status()
        return response.json()

Example test

# tests/test_rest_client.py
from unittest.mock import Mock, patch

from src.rest_client import UserApiClient


@patch('src.rest_client.requests.get')
def test_get_user_returns_json(mock_get):
    fake_response = Mock()
    fake_response.json.return_value = {'id': '123', 'name': 'Ada'}
    fake_response.raise_for_status.return_value = None
    mock_get.return_value = fake_response

    client = UserApiClient('https://api.example.com')
    result = client.get_user('123')

    assert result['name'] == 'Ada'
    mock_get.assert_called_once()
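Patching is not the only option. A variant of the client can accept the HTTP session in its constructor, so tests pass a Mock with no patching at all. InjectedUserApiClient below is a hypothetical sketch of that style, not the tutorial's actual class:

```python
from unittest.mock import Mock

class InjectedUserApiClient:
    """Hypothetical variant that takes the HTTP session as a dependency."""
    def __init__(self, base_url: str, session):
        self.base_url = base_url.rstrip('/')
        self.session = session  # would be requests.Session() in production

    def get_user(self, user_id: str) -> dict:
        response = self.session.get(f"{self.base_url}/users/{user_id}", timeout=10)
        response.raise_for_status()
        return response.json()

# No patch needed: just hand the client a Mock session.
fake_session = Mock()
fake_session.get.return_value.json.return_value = {'id': '123', 'name': 'Ada'}
fake_session.get.return_value.raise_for_status.return_value = None

client = InjectedUserApiClient('https://api.example.com', fake_session)
assert client.get_user('123')['name'] == 'Ada'
fake_session.get.assert_called_once_with('https://api.example.com/users/123', timeout=10)
```

Constructor injection keeps the patch target out of the test entirely, at the cost of one extra parameter in production code.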

6) Test AWS S3 and boto3

Mock the boto3 client or resource instead of calling real AWS in unit tests.

# src/s3_service.py
import boto3


class S3DocumentStore:
    def __init__(self, bucket_name: str):
        self.bucket_name = bucket_name
        self.client = boto3.client('s3')

    def upload_text(self, key: str, text: str) -> None:
        self.client.put_object(
            Bucket=self.bucket_name,
            Key=key,
            Body=text.encode('utf-8'),
            ContentType='text/plain'
        )

# tests/test_s3_service.py
from unittest.mock import patch

from src.s3_service import S3DocumentStore


@patch('src.s3_service.boto3.client')
def test_upload_text_calls_put_object(mock_boto_client):
    fake_client = mock_boto_client.return_value

    store = S3DocumentStore('demo-bucket')
    store.upload_text('docs/a.txt', 'hello')

    fake_client.put_object.assert_called_once()
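assert_called_once only checks that the call happened; you can also assert the exact arguments. A standalone sketch using a plain Mock as a stand-in for the mocked boto3 client above:

```python
from unittest.mock import Mock

# fake_client stands in for the mocked boto3 client in the test above.
fake_client = Mock()
fake_client.put_object(
    Bucket='demo-bucket',
    Key='docs/a.txt',
    Body=b'hello',
    ContentType='text/plain',
)

# Exact check: every keyword argument must match.
fake_client.put_object.assert_called_once_with(
    Bucket='demo-bucket',
    Key='docs/a.txt',
    Body=b'hello',
    ContentType='text/plain',
)

# Partial check: inspect the recorded call for one argument only.
assert fake_client.put_object.call_args.kwargs['Key'] == 'docs/a.txt'
```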

7) Test Azure Blob Storage

Create a small wrapper and patch BlobServiceClient where your module imports it.

# src/azure_blob_service.py
from azure.storage.blob import BlobServiceClient


class AzureBlobStore:
    def __init__(self, conn_str: str, container_name: str):
        self.client = BlobServiceClient.from_connection_string(conn_str)
        self.container_name = container_name

    def upload_text(self, blob_name: str, text: str):
        blob_client = self.client.get_blob_client(
            container=self.container_name, blob=blob_name
        )
        blob_client.upload_blob(text, overwrite=True)

# tests/test_azure_blob_service.py
from unittest.mock import patch

from src.azure_blob_service import AzureBlobStore


@patch('src.azure_blob_service.BlobServiceClient')
def test_upload_text_to_blob(mock_blob_service_client):
    fake_service = mock_blob_service_client.from_connection_string.return_value
    fake_blob_client = fake_service.get_blob_client.return_value

    store = AzureBlobStore('UseDevelopmentStorage=true', 'docs')
    store.upload_text('sample.txt', 'hello azure')

    fake_blob_client.upload_blob.assert_called_once_with('hello azure', overwrite=True)

8) Test REST APIs

Mock requests.get, requests.post, or a Session object. Check URL, headers, payload, timeout, and error handling.

from unittest.mock import Mock, patch

import pytest

from src.rest_client import UserApiClient


@patch('src.rest_client.requests.get')
def test_get_user_raises_for_bad_status(mock_get):
    fake_response = Mock()
    fake_response.raise_for_status.side_effect = Exception('boom')
    mock_get.return_value = fake_response

    client = UserApiClient('https://api.example.com')
    with pytest.raises(Exception):
        client.get_user('999')

9) Test Postgres code

Mock the psycopg connection and cursor for unit tests. Keep SQL parameters separate and assert execute calls.

# src/postgres_repo.py
import psycopg


class UserRepository:
    def __init__(self, dsn: str):
        self.dsn = dsn

    def get_user_name(self, user_id: int) -> str | None:
        with psycopg.connect(self.dsn) as conn:
            with conn.cursor() as cur:
                cur.execute('SELECT name FROM users WHERE id = %s', (user_id,))
                row = cur.fetchone()
                return row[0] if row else None

# tests/test_postgres_repo.py
from unittest.mock import MagicMock, patch

from src.postgres_repo import UserRepository


@patch('src.postgres_repo.psycopg.connect')
def test_get_user_name(mock_connect):
    mock_conn = MagicMock()
    mock_cur = MagicMock()
    mock_cur.fetchone.return_value = ('Ada',)
    mock_connect.return_value.__enter__.return_value = mock_conn
    mock_conn.cursor.return_value.__enter__.return_value = mock_cur

    repo = UserRepository('postgresql://demo')
    name = repo.get_user_name(1)

    assert name == 'Ada'
    mock_cur.execute.assert_called_once_with(
        'SELECT name FROM users WHERE id = %s', (1,)
    )
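The two __enter__ assignments in the test mirror the two with-blocks in the repository. A standalone sketch of that wiring, using only unittest.mock and no database, also shows the "no row found" path:

```python
from unittest.mock import MagicMock

# connect(...) returns a context manager whose __enter__ yields the connection;
# conn.cursor() returns another context manager that yields the cursor.
mock_connect = MagicMock()
mock_conn = mock_connect.return_value.__enter__.return_value
mock_cur = mock_conn.cursor.return_value.__enter__.return_value
mock_cur.fetchone.return_value = None  # simulate no matching row

with mock_connect('postgresql://demo') as conn:
    with conn.cursor() as cur:
        cur.execute('SELECT name FROM users WHERE id = %s', (99,))
        row = cur.fetchone()

assert row is None  # the repository would return None in this case
mock_cur.execute.assert_called_once()
```

Setting fetchone to return None exercises the `row[0] if row else None` branch that the main test does not cover.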

10) Test Temporal apps

Separate workflow logic, activities, and client startup. Mock activity dependencies for unit tests and use Temporal's test tools for deeper tests.

# src/temporal_activities.py
async def normalize_filename(name: str) -> str:
    return name.strip().lower().replace(' ', '_')

# tests/test_temporal_activities.py
import pytest

from src.temporal_activities import normalize_filename


@pytest.mark.asyncio
async def test_normalize_filename():
    result = await normalize_filename(' My File.PDF ')
    assert result == 'my_file.pdf'
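The async test above relies on the pytest-asyncio plugin to run. If you prefer to avoid the plugin, a plain synchronous test can drive the coroutine with asyncio.run; the function is duplicated here so the sketch is self-contained:

```python
import asyncio

async def normalize_filename(name: str) -> str:
    return name.strip().lower().replace(' ', '_')

def test_normalize_filename_sync():
    # asyncio.run starts an event loop just for this one coroutine
    result = asyncio.run(normalize_filename(' My File.PDF '))
    assert result == 'my_file.pdf'

test_normalize_filename_sync()
```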
Tip: For full workflow and worker tests, keep most logic in activities or helper modules so you can unit test it cheaply. Then add a smaller number of Temporal integration tests. Note that @pytest.mark.asyncio requires the pytest-asyncio plugin to be installed.

11) Use fixtures to reduce duplication

# tests/conftest.py
import pytest

from src.rest_client import UserApiClient


@pytest.fixture
def api_client():
    return UserApiClient('https://api.example.com')

Fixtures help centralize setup like config, fake environment variables, reusable mock data, and sample payloads.
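One common piece of setup is faking environment variables. Besides pytest's monkeypatch fixture, the standard library offers unittest.mock.patch.dict, which restores the environment when the block exits. API_BASE_URL here is a hypothetical variable name for this sketch:

```python
import os
from unittest.mock import patch

had_key = 'API_BASE_URL' in os.environ

# patch.dict temporarily adds the variable and undoes the change on exit.
with patch.dict(os.environ, {'API_BASE_URL': 'https://api.example.com'}):
    base_url = os.environ['API_BASE_URL']

assert base_url == 'https://api.example.com'
if not had_key:
    # the original environment is restored after the block
    assert 'API_BASE_URL' not in os.environ
```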

12) Suggested step-by-step learning path

Week 1

Learn pytest discovery, asserts, fixtures, and running tests from shell scripts.

Week 2

Practice unittest.mock patch, Mock, MagicMock, and AsyncMock on REST and SDK wrappers.
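AsyncMock, mentioned above, deserves a tiny sketch: its call results are awaitable, so it can stand in for async SDK or Temporal client calls. fetch_user is a hypothetical async dependency invented for this example:

```python
import asyncio
from unittest.mock import AsyncMock

# An AsyncMock call returns an awaitable, unlike a plain Mock.
fetch_user = AsyncMock(return_value={'id': '123', 'name': 'Ada'})

async def load_user_name(user_id: str) -> str:
    user = await fetch_user(user_id)
    return user['name']

assert asyncio.run(load_user_name('123')) == 'Ada'
fetch_user.assert_awaited_once_with('123')
```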

Week 3

Add Postgres and cloud storage tests. Assert SQL statements, request payloads, and error handling.

Week 4

Add Temporal tests. Start with activity logic, then worker and workflow integration tests.

13) Final advice

Test your code first, not the cloud vendor's SDK internals.
Mock at the module import location used by your code.
Keep real integration tests small, targeted, and separate from unit tests.
Use simple scripts so anyone on the team can run tests quickly on Windows or a Unix shell.