
Delivering reliable web applications at speed demands a solid automation backbone. A Selenium-Pytest framework combines the power of browser automation with Python’s most popular testing library, enabling teams to write clear, maintainable UI tests that integrate seamlessly with CI/CD pipelines. This guide walks through building a robust Selenium-Pytest framework, from project setup to advanced features such as fixtures, page objects, parallel execution, and reporting.
Many teams struggle with flaky tests, brittle locators, and tangled fixtures that slow down releases. Without a clear structure, test code and application code drift apart, leading to maintenance overhead and eroding confidence in release quality. A well-architected Selenium-Pytest framework solves these problems, providing scalable, reliable tests that evolve alongside your application.
Our Selenium-Pytest framework consists of:
• Project Layout: Clear directory structure separating tests, page objects, data, and utilities.
• Page Object Model: Encapsulate page interactions in classes for reuse and readability.
• Pytest Fixtures: Manage WebDriver lifecycle, test data, and configuration.
• Parallel Execution: Leverage pytest-xdist for faster feedback across multiple browsers.
• Reporting & Logging: Generate HTML reports and capture screenshots on failures.
Start with a virtual environment and install the core dependencies:
pip install selenium pytest pytest-xdist pytest-html
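For example, the initial setup might look like this (the .venv directory name is just a convention; on Windows, activate with .venv\Scripts\activate instead):
python -m venv .venv
source .venv/bin/activate
pip install selenium pytest pytest-xdist pytest-html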
Create this directory structure:
project_root/
│
├── tests/
│   ├── test_login.py
│   └── test_shopping_cart.py
│
├── pages/
│   ├── base_page.py
│   ├── login_page.py
│   └── cart_page.py
│
├── utils/
│   ├── config.py
│   └── logger.py
│
├── conftest.py
└── pytest.ini
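The utils package is listed above but not shown later in the walkthrough. A minimal sketch of the two modules might look like the following; the BASE_URL default, environment-variable names, and log format are illustrative assumptions rather than part of the original setup:

# utils/config.py -- hypothetical sketch: central, environment-driven settings
import os

BASE_URL = os.getenv("BASE_URL", "https://example.com")    # assumed default, matching the login URL used below
BROWSER = os.getenv("BROWSER", "chrome")                    # could drive the WebDriver fixture instead of hard-coding
DEFAULT_TIMEOUT = int(os.getenv("DEFAULT_TIMEOUT", "10"))   # seconds

# utils/logger.py -- hypothetical sketch: named loggers that cooperate with pytest's log_cli output
import logging

def get_logger(name: str) -> logging.Logger:
    logger = logging.getLogger(name)
    if not logger.handlers:  # avoid duplicate handlers when imported from several modules
        handler = logging.StreamHandler()
        handler.setFormatter(logging.Formatter("%(asctime)s [%(levelname)s] %(name)s: %(message)s"))
        logger.addHandler(handler)
        logger.setLevel(logging.INFO)
    return logger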
In pytest.ini, define markers, logging, and report options:
[pytest]
markers =
    smoke: quick smoke tests
    regression: full regression suite
addopts =
    --capture=tee-sys
    --html=reports/report.html
    -n auto
log_cli = true
log_cli_level = INFO
Define a session-scoped fixture in conftest.py to initialize and quit WebDriver:
import pytest
from selenium import webdriver


# Session scope keeps one browser per worker; params runs the whole suite once per browser.
@pytest.fixture(scope="session", params=["chrome", "firefox"])
def driver(request):
    browser = request.param
    if browser == "chrome":
        driver = webdriver.Chrome()
    else:
        driver = webdriver.Firefox()
    driver.maximize_window()
    yield driver      # hand the browser to the tests
    driver.quit()     # teardown once the session finishes
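If you would rather choose the browser from the command line than always run against both, one possible variant (an assumption, not part of the original walkthrough) uses pytest_addoption in conftest.py:

# conftest.py variant -- hypothetical sketch: select the browser via a CLI flag
import pytest
from selenium import webdriver


def pytest_addoption(parser):
    parser.addoption("--browser", action="store", default="chrome",
                     help="Browser for UI tests: chrome or firefox")


@pytest.fixture(scope="session")
def driver(request):
    browser = request.config.getoption("--browser")
    drv = webdriver.Chrome() if browser == "chrome" else webdriver.Firefox()
    drv.maximize_window()
    yield drv
    drv.quit()

Tests would then be run with, for example, pytest --browser firefox -m smoke.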
Encapsulate page elements and actions. In login_page.py:
from selenium.webdriver.common.by import By


class LoginPage:
    URL = "https://example.com/login"
    USER_INPUT = (By.ID, "username")
    PASS_INPUT = (By.ID, "password")
    LOGIN_BTN = (By.CSS_SELECTOR, "button[type='submit']")

    def __init__(self, driver):
        self.driver = driver

    def load(self):
        self.driver.get(self.URL)

    def login(self, user, pwd):
        self.driver.find_element(*self.USER_INPUT).send_keys(user)
        self.driver.find_element(*self.PASS_INPUT).send_keys(pwd)
        self.driver.find_element(*self.LOGIN_BTN).click()
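The layout above also lists pages/base_page.py, which is not shown in the original walkthrough. A minimal sketch, assuming a 10-second explicit-wait default and helper names of my choosing, could centralize waits so page objects avoid raw find_element calls:

# pages/base_page.py -- hypothetical sketch: shared explicit-wait helpers for page objects
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC


class BasePage:
    def __init__(self, driver, timeout=10):
        self.driver = driver
        self.timeout = timeout  # assumed default wait in seconds

    def find(self, locator):
        # Wait for a (By, value) locator to become visible, then return the element
        return WebDriverWait(self.driver, self.timeout).until(
            EC.visibility_of_element_located(locator)
        )

    def click(self, locator):
        self.find(locator).click()

    def type(self, locator, text):
        element = self.find(locator)
        element.clear()
        element.send_keys(text)

LoginPage and CartPage could then inherit from BasePage and call self.type(...) and self.click(...) instead of raw driver.find_element(...) lookups.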
Use the page objects and fixtures in tests:
import pytest
from pages.login_page import LoginPage


@pytest.mark.smoke
def test_login_success(driver):
    login = LoginPage(driver)
    login.load()
    login.login("user1", "pass123")
    assert "Dashboard" in driver.title
Run tests in parallel with -n auto (pytest-xdist), which distributes tests, including the browser-parametrized ones, across CPU workers. Combine it with markers to run a subset:
pytest -m smoke -n 4 --html=reports/smoke.html
Hook into pytest’s pytest_runtest_makereport to capture screenshots on failure:
import os

import pytest


@pytest.hookimpl(hookwrapper=True)
def pytest_runtest_makereport(item, call):
    # Grab the pytest-html plugin so extras can be attached to the report (None if the plugin is absent)
    pytest_html = item.config.pluginmanager.getplugin("html")
    outcome = yield
    rep = outcome.get_result()
    if rep.when == "call" and rep.failed:
        driver = item.funcargs.get("driver")
        if driver is not None and pytest_html is not None:
            os.makedirs("reports", exist_ok=True)
            screenshot = os.path.join("reports", f"{item.name}.png")
            driver.save_screenshot(screenshot)
            # Older pytest-html versions read report.extra; newer releases use report.extras
            rep.extra = getattr(rep, "extra", []) + [
                pytest_html.extras.png(screenshot)
            ]
A few best practices keep the framework maintainable:
• Keep page objects thin: actions only, no assertions.
• Use data-driven testing with @pytest.mark.parametrize (see the sketch after this list).
• Group tests by feature and tag with markers.
• Clean up test data via fixtures to avoid state leaks.
• Integrate into CI (GitHub Actions, Jenkins) for every pull request.
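As a sketch of the data-driven approach mentioned above, assuming illustrative credentials and page titles:

# Hypothetical data-driven variant of the login test
import pytest
from pages.login_page import LoginPage


@pytest.mark.parametrize(
    "user, pwd, expected_title",
    [
        ("user1", "pass123", "Dashboard"),   # happy path from the earlier example
        ("user1", "wrong-pass", "Login"),    # assumed failure case: user stays on the login page
    ],
)
def test_login_title(driver, user, pwd, expected_title):
    login = LoginPage(driver)
    login.load()
    login.login(user, pwd)
    assert expected_title in driver.title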
A well-designed Selenium-Pytest framework empowers teams to deliver reliable UI automation with minimal maintenance. By combining clear project structure, the Page Object Model, pytest fixtures, parallel execution, and rich reporting, you’ll achieve fast, robust tests that scale with your application and CI/CD pipeline.
Ready to elevate your UI automation? Contact Speqto’s QA experts to build a custom Selenium-Pytest framework that accelerates your release cycles and ensures quality at scale.