Ady Stokes
Freelance Consultant
He / Him
I am Open to Write, Teach, Speak, Meet at MoTaCon 2026, Podcasting, Review Conference Proposals

STEC and SQEC Certified. MoT Ambassador, writer, speaker, accessibility advocate. Consulting, Leeds Chapter Lead. MoT Certs curator. Testing wisdom, friendly, songs and poems. Great minds think differently

Chapter Lead
Ambassador

Achievements

Career Champion
Club Explorer
Bio Builder
Avid Reader
TestBash Trailblazer
Article Maven
Testing Scholar
MoT Community Certificate
MoT Software Testing Essentials Certificate
Scholarship Hero
Insights Spotter Bronze
TestBash Speaker
99 Second Speaker
The Testing Planet Contributor
Chapter Lead
MoT Streak
Unlimited Member
In the Loop
MoT Ambassador 2025
MoT Inked
404 Talk (Not) Found
Bug Finder
Collection Curator
Glossary Contributor
Meme Maker
Photo Historian
TestBash Brighton 2025 Attendee
TestBash Brighton 2024 Attendee
TestBash Teacher
Cert Shaper
Course creator
Author Debut
A tester's role in continuous quality
Prompting for testers
Improving your testing through operability
Cognitive biases in software testing
A software tester's guide to Chrome DevTools
Introduction to software development and testing
Introduction to modern testing
Introduction to accessibility testing
Bug reporting 101
Coding for non-coders
The building blocks of the internet
Introduction to JavaScript
Advanced prompting for testers
99 and Counting
TWiQ Host
Chapter Event Speaker
Pride Supporter
Meme Machine
Inclusive Companion
Social Connector
Open to Opportunities
Found at 404
Picture Perfect
Story Sharer
Neurodiversity Matters
Everyday security testing: A practical guide to getting started
Quality coaching essentials
Kind Click
Supportive Clicker
Encouragement Giver
Encouragement Champion
Goal Setter
Insights Taster
MoT Ambassador 2026
Chapter Discovery
Call for Insights
Moment Maker
Moment Sharer
Moment Documenter
Chapter Event Host
AMA Initiate
AMA Trailblazer

Certificates

MoT Software Quality Engineering Certificate
Awarded for: Passing the exam with a score of 100%
MoT Software Testing Essentials Certificate
Awarded for: Passing the exam with a score of 100%

Activity

Ady Stokes earned: Q&A time at A11y North
Ady Stokes earned: A11y North time again
Ady Stokes earned: Q&A time at A11y North
Ady Stokes contributed: Q&A time at A11y North
Adam answered questions at A11y North held at Hippo in Leeds
Ady Stokes earned: Member joined MoT Leeds chapter

Contributions

Q&A time at A11y North
Adam answered questions at A11y North held at Hippo in Leeds
A11y North time again
Testing the Inaccessible: where to start with a challenging website. The main problems were no owner, a lack of focus, and many people involved with no guidelines. Adam Clarkson too...
Leeds Chapter has a new event
Go check out the latest event on the Leeds Chapter page. https://www.ministryoftesting.com/chapters/mot-leeds/events/mot-leeds-280526
Testing Mindset
The Testing Mindset (in my humble opinion) is an umbrella term that covers a number of different mindsets, or ways of thinking about a subject. As I said in my September 2025 article (see reference), "From the beginning of my software testing career in 2003, I’d repeatedly heard about the ‘tester's mindset.’ There were very few actual definitions and none that I felt fit well. I did some research to see if someone specific coined the term, but it is more likely that it evolved over time." In that article, I suggested that, as testers or quality professionals, we should have more than one mindset. I gave 11 examples of software testing mindsets, and since then, I've added another. I asked in the article for people to suggest their own or expand on them. Maybe you can add to this glossary term with your own ideas and mindsets. Or, you may see it differently, which is great. Debate pushes the craft forward. Here is the latest list as of April 2026. They are now grouped, and I hope to share them as a model in the future.

Visionary and open: Blue-sky (innovator), Creative (visionary), Exploratory (investigator), Inclusive (ally)
Analytical and grounded: Scientific (realist), Sceptical (analyst), Critical (evaluator), Risk-based (strategist)
Philosophical and connected: Ethical (moralist), Holistic (connector), Collaborative (partner)
Dark and aggressive: Malevolent (saboteur)
We can probably do better on social media - Hashtags
Probably. If you’re using social media, you’re probably using hashtags. If you’re using hashtags, you’re probably using multiple words. If you’re using multiple words, you’re probably putting th...
I was asked, “What has the #MoTaverse ever done for us?”
I was asked, “What has the #MoTaverse ever done for us?” So I replied… “You mean apart from publishing my articles, supporting my learning, giving me on‑demand courses and certificates to learn ...
STEC and SQEC certified baby!!!! woooooow
Very happy to be both STEC (Software Testing Essentials Certificate) and SQEC (Software Quality Engineering Certificate) certified ;-)
100 terms added to the Glossary
Grey box testing sounds rather innocuous, but it has a special place. It's the 100th term I've added to the glossary. Help create the largest testing terminology repository in the world for the MoT...
Grey box testing
Grey box testing is a method where the tester has partial knowledge of the application's internal structure. It is the middle ground between black box and white box testing. You might have access to the database schema or the API documentation while you test the user interface. This allows you to write better test cases because you understand the underlying logic. It is particularly useful for integration testing, where you want to see how data flows between different components. During refinement, you might use your knowledge of the system architecture to identify specific risks. By looking at the acceptance criteria and the technical design, you can ensure that the tests cover both the user journey and data integrity. It helps to find bugs that a pure black box test would miss, such as a record not being updated correctly in the background or an API returning more data than it should. It is a smart way to test because it combines the user perspective with technical insight. You aren't just clicking buttons. You are verifying that the entire system is behaving as it should.
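The "API returning more data than it should" bug mentioned above can be sketched as a small grey box check. This is a minimal, hypothetical example (the function names and the field list are invented for illustration): the test exercises the public response like a black box test would, but uses knowledge of the documented schema to spot over-exposed fields.

```python
# Hypothetical grey box check: exercise the public response (black box view)
# while using the documented schema (white box insight) to assert the payload
# carries no more data than it should.

DOCUMENTED_FIELDS = {"id", "name", "email"}  # assumed from API documentation


def fetch_user_profile():
    """Stand-in for a real API call; returns a simulated response payload."""
    return {
        "id": 42,
        "name": "Ada",
        "email": "ada@example.com",
        "password_hash": "x9f3",  # internal field leaking into the response
    }


def find_overexposed_fields(response: dict) -> set:
    """Return any fields present in the response but absent from the docs."""
    return set(response) - DOCUMENTED_FIELDS


leaks = find_overexposed_fields(fetch_user_profile())
print(leaks)  # {'password_hash'} -- the API returns more data than it should
```

A pure black box test that only checks the name and email appear on screen would pass here; the schema knowledge is what makes the leak visible.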
Sad-path testing
Sad-path testing is a very general term to cover testing the unexpected. It involves verifying how an application behaves when it receives invalid data or encounters an error. It is the direct opposite of happy path testing, which only follows the intended user journey. When you perform sad-path testing, you are checking that the system handles exceptions as required. This often means looking at acceptance criteria to see how the system should respond to incorrect logins, timed-out sessions, or empty fields. It is a critical part of making a product robust and reliable for real users. You are essentially trying to find where the logic breaks down when a user does something unexpected. By identifying these scenarios during refinement, you can ensure the developers build in proper error messages and recovery steps. It helps to move beyond basic functionality and ensures the software can handle the messiness of the real world. As a general term, it can cover many areas, but it is a simple way to explain that testing is about more than confirming software does what it is supposed to do.
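The incorrect-login and empty-field examples above can be sketched as a tiny sad-path suite. This is a toy illustration (the `login` function and its credentials are invented): rather than only checking that a valid login succeeds, we feed the function the sad paths and verify each one fails safely with a clear error instead of crashing or silently passing.

```python
# Toy sad-path sketch: verify invalid and empty inputs are rejected cleanly.

def login(username: str, password: str) -> str:
    """Toy login: returns 'ok' on success, raises ValueError on bad input."""
    if not username or not password:
        raise ValueError("username and password are required")
    if (username, password) != ("ada", "s3cret"):
        raise ValueError("invalid credentials")
    return "ok"


# Each tuple is a sad path: empty username, empty password, wrong password.
sad_paths = [("", "s3cret"), ("ada", ""), ("ada", "wrong")]

for user, pwd in sad_paths:
    try:
        login(user, pwd)
        print(f"{user!r}/{pwd!r}: unexpectedly succeeded")
    except ValueError as err:
        print(f"{user!r}/{pwd!r}: rejected as expected ({err})")
```

The happy path (`login("ada", "s3cret")`) is still worth one check, but the three sad paths are where the robustness claims in the acceptance criteria actually get tested.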
Backlog refinement
Backlog refinement is when the team gets together to review the work waiting in the queue. It is the time when a user story is reviewed to make sure the requirements are actually understood by everyone. You can spend this time adding acceptance criteria, so there is no confusion about what 'done' looks like. 3 Amigos sessions are similar but a more focused deep dive, with a smaller group which can include product owners or people not directly involved with the team. It can also be when you break down larger requirements into smaller, more manageable user stories or tasks. You are effectively checking that the user story or requirement is solid and good to go. This prevents the team from picking up a ticket in a sprint and then realising they do not know how to start, or how they will know when it is done.