User stories and acceptance criteria work together as a pair. The user story sets the direction — it answers "what does the user need to accomplish?" The acceptance criteria answer "how do we know we built it correctly?"
A user story written without acceptance criteria leaves the definition of done ambiguous. Engineers complete their implementation, QA tests it, and the PM says "that's not quite what I meant" — because the unwritten criteria were different in each person's head. A user story written with specific acceptance criteria eliminates this ambiguity at the source.
User stories are intentionally technology-agnostic. They describe user goals and outcomes, not implementation details. This gives engineers design freedom — they can choose the technical approach that best serves the user need rather than implementing a narrowly specified solution that may not be the best fit for the architecture.
Acceptance criteria are intentionally specific. They translate the open-ended user story into a set of testable conditions. "Feature works correctly" is not an acceptance criterion. "Given a workspace admin with Slack connected, when a PRD status changes to In Review, then the admin receives a Slack notification within 2 seconds" is an acceptance criterion — specific enough for a QA engineer to write a test against it without asking the PM for clarification.
User stories vs tasks vs acceptance criteria
User story: Describes the desired outcome from a user's perspective. Answers: who wants what, and why? Format: "As a [user], I want [action], so that [outcome]." Written at sprint planning or during PRD drafting. Example: "As a workspace admin, I want to be notified when a PRD changes status, so that I can respond without manually checking the tool."
Engineering task/ticket: A discrete unit of implementation work derived from a user story. More specific than the user story — it names the technical work to be done. May be frontend, backend, QA, or infrastructure. Example: "Build Slack webhook delivery service for PRD status change events."
Acceptance criteria: The conditions that must be true for a user story or task to be complete. Written in verifiable "Given / When / Then" format. Applies to both user stories (high-level) and individual tickets (implementation-level). Example: "Given a Slack-connected workspace, when PRD status changes to In Review, then admin receives Slack message within 2 seconds containing PRD title, status, and link."
How to Use User Story vs Acceptance Criteria in Product Management
Write user stories before acceptance criteria — in that order. The user story establishes the intent; the acceptance criteria detail the verification. If you write acceptance criteria first, you risk over-constraining the implementation before you have clarified the user goal.
For each user story, write 3–5 acceptance criteria using the Gherkin format: Given [initial context], When [triggering action], Then [expected outcome]. Each criterion should be testable by a QA engineer without asking the PM for clarification. If a criterion requires explanation, rewrite it.
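The "testable without asking the PM" bar can be made concrete: a Gherkin criterion should translate mechanically into an automated test. A minimal sketch, assuming a hypothetical NotificationService (this is illustrative stand-in code, not Scriptonia's real API):

```python
# Sketch: encoding a Given/When/Then criterion as an automated test.
# NotificationService is a hypothetical stand-in for illustration.

class NotificationService:
    """Records notifications sent to admins on PRD status changes."""
    def __init__(self, slack_connected: bool):
        self.slack_connected = slack_connected
        self.sent = []

    def on_status_change(self, prd_title: str, new_status: str) -> None:
        channel = "slack" if self.slack_connected else "email"
        self.sent.append({"channel": channel, "prd": prd_title, "status": new_status})

def test_slack_notification_on_in_review():
    # Given: a workspace admin with Slack connected
    service = NotificationService(slack_connected=True)
    # When: a PRD status changes to In Review
    service.on_status_change("Checkout redesign", "In Review")
    # Then: the admin receives a Slack notification
    assert service.sent == [
        {"channel": "slack", "prd": "Checkout redesign", "status": "In Review"}
    ]

test_slack_notification_on_in_review()
```

Note that each Gherkin clause maps to one comment block in the test body; if a criterion cannot be mapped this cleanly, it is usually a sign the criterion needs rewriting.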
The most common mistake is writing acceptance criteria that describe the UI rather than the outcome. "The notification bell icon turns red" is a UI description. "The user sees an unread notification count in the nav bar" is an outcome. The UI criterion is too narrow — it breaks if the designer changes the icon. The outcome criterion is durable — it survives design iteration.
In a PRD, user stories live in the user stories section; acceptance criteria live in both the user stories section (high-level) and the engineering tickets section (implementation-level). Each ticket should have its own 3–5 acceptance criteria derived from the parent user story.
User Story vs Acceptance Criteria Examples
1. User story with acceptance criteria: notification feature
User story: "As a workspace admin, I want to be notified immediately when a PRD I am assigned to review changes status, so that I can respond within the same working day without manually checking Scriptonia." Acceptance criteria: (1) Given a Slack-connected workspace, when PRD moves to In Review, then admin receives Slack message within 2 seconds. (2) Given Slack not connected, when PRD moves to In Review, then admin receives in-app notification and email. (3) Given admin has disabled email notifications, when PRD changes status, then only in-app notification is sent. (4) Given Slack connection is revoked, when PRD changes status, then system falls back to email + in-app and surfaces reconnection prompt.
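The four criteria above jointly specify a fallback routing policy. A minimal sketch of that policy, with illustrative function and flag names (not Scriptonia's actual implementation):

```python
# Hedged sketch of the fallback routing the four criteria describe.

def notification_channels(slack_connected: bool,
                          slack_revoked: bool,
                          email_enabled: bool) -> list[str]:
    """Return the channels to notify when a PRD changes status."""
    if slack_connected and not slack_revoked:
        return ["slack"]                      # criterion 1: Slack is primary
    channels = ["in_app"]                     # in-app is the floor (criterion 3)
    if email_enabled:
        channels.append("email")              # criteria 2 and 4: email fallback
    if slack_revoked:
        channels.append("reconnect_prompt")   # criterion 4: surface the prompt
    return channels

print(notification_channels(False, False, True))   # -> ['in_app', 'email']
print(notification_channels(False, False, False))  # -> ['in_app']
```

Writing the criteria first and then sketching the routing like this is a quick consistency check: if two criteria demand conflicting channels for the same state, the conflict surfaces immediately.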
2. Weak vs strong acceptance criteria: same user story
User story: "As a PM, I want to generate a PRD from a feature brief, so that I can share a complete spec with my engineering team quickly." Weak criterion: "PRD is generated successfully." Strong criterion: "Given a valid feature brief (feature name, target user, constraints), when the PM clicks Generate, then a complete PRD is returned within 30 seconds containing all 10 sections: problem statement, target users, success metrics, user stories, scope, constraints, architecture, tickets, edge cases, and acceptance criteria." The strong criterion gives QA a checklist; the weak criterion requires PM judgment to verify.
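The strong criterion is mechanically checkable, which is exactly what makes it strong. A sketch of a hypothetical validator for the 10-section requirement:

```python
# Sketch: verifying a generated PRD contains all 10 required sections.
# The dict-based PRD shape is an assumption for illustration.

REQUIRED_SECTIONS = [
    "problem statement", "target users", "success metrics", "user stories",
    "scope", "constraints", "architecture", "tickets", "edge cases",
    "acceptance criteria",
]

def missing_sections(prd: dict) -> list[str]:
    """Return the required sections that are absent or empty."""
    return [s for s in REQUIRED_SECTIONS if not prd.get(s)]

# A PRD missing two sections fails the criterion with a specific reason:
draft = {s: "..." for s in REQUIRED_SECTIONS
         if s not in ("architecture", "edge cases")}
print(missing_sections(draft))  # -> ['architecture', 'edge cases']
```

The weak criterion ("generated successfully") offers no equivalent check: there is nothing to enumerate, so a failure report cannot say what is missing.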
3. Acceptance criteria for an error state
User story: "As a user, I want to know if my PRD generation failed, so that I can retry without losing my input." Acceptance criteria for error state: (1) Given a timeout during generation (>30 seconds), when the request fails, then the user sees an error message with a "Try again" button and all input fields retain their values. (2) Given a network error during generation, when the request fails, then the error is logged and the user is shown a human-readable error message (not a stack trace). (3) Given a retry after a failed generation, when the user clicks "Try again", then the form is pre-filled with the previous input.
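Criteria 1 and 3 pin down a subtle implementation requirement: input must be captured before the request is attempted, so failure cannot clear it. A minimal sketch with a hypothetical FormState (illustrative names, not real product code):

```python
# Sketch of criteria 1 and 3: a failed generation keeps the field values,
# so a retry can pre-fill the form. FormState is a hypothetical stand-in.

class FormState:
    def __init__(self):
        self.fields = {}
        self.error = None

    def submit(self, fields: dict, generator) -> None:
        self.fields = dict(fields)   # retain input BEFORE attempting generation
        try:
            generator(fields)
            self.error = None
        except TimeoutError:
            # human-readable message, never a stack trace (criterion 2)
            self.error = "Generation timed out. Try again."

def failing_generator(fields):
    raise TimeoutError

form = FormState()
form.submit({"feature_name": "Slack alerts", "target_user": "admin"},
            failing_generator)
# After the failure, the error is set and the input survives for retry:
print(form.error)   # -> Generation timed out. Try again.
print(form.fields)  # -> {'feature_name': 'Slack alerts', 'target_user': 'admin'}
```

A QA engineer can test all three criteria against this surface: force a timeout, check the message text, and check the retained fields.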
How Scriptonia Automates This
Scriptonia automatically generates acceptance criteria for every engineering ticket it creates — in Gherkin format (Given / When / Then), specific enough to test without PM clarification. For a typical feature, Scriptonia generates 3–5 acceptance criteria per ticket across 8–15 tickets: 24–75 acceptance criteria per PRD that would take hours to write manually.
The acceptance criteria are pushed to Linear, Jira, or GitHub Issues as checklist items on the corresponding ticket — engineers check them off during implementation, and QA verifies them at review. This creates an unbroken chain from user story intent to verifiable ticket completion.
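The checklist-item idea can be illustrated with a small sketch: converting a list of acceptance criteria into GitHub-flavored task-list markdown, which renders as tickable checkboxes on an issue. (This is a generic illustration of the format, not Scriptonia's actual integration code.)

```python
# Sketch: acceptance criteria as a GitHub task list. GitHub renders
# "- [ ]" lines as checkboxes that engineers tick off during implementation.

def criteria_to_checklist(criteria: list[str]) -> str:
    return "\n".join(f"- [ ] {c}" for c in criteria)

body = criteria_to_checklist([
    "Given a Slack-connected workspace, when PRD moves to In Review, "
    "then admin receives a Slack message within 2 seconds.",
    "Given Slack not connected, when PRD moves to In Review, "
    "then admin receives in-app notification and email.",
])
print(body)
```

Because each criterion becomes one checkbox, ticket completion is the sum of verified criteria rather than a single unchecked judgment call.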