PR Answer Packs
PR Answer Packs attach scoped repository context to GitHub pull requests. Snipara listens to the PR event, builds a source-grounded pack for the changed surface, and publishes it back to GitHub as a Check Run and optional sticky comment.
Positioning
Think of this as AI Answer Packs for Pull Requests, not a vector search feature. The user sees an artifact: impacted symbols, architecture continuity, implementation history, conventions, and code impact context.
The flow
PR opened
    |
    v
Snipara
    deterministic indexing
    impacted symbols
    architecture retrieval
    implementation history
    conventions retrieval
    answer pack generation
    |
    v
Claude Code / GitHub Copilot / Codex / Cursor
    scoped repository cognition
    relevant architectural continuity
    high-quality contextual grounding
The pack is generated for the pull request, not for a generic repository search. That is the difference: the active AI client gets context aligned to the current branch, changed paths, indexed symbols, and relevant project history.
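The scoping step above can be sketched as a pure function. The `SymbolIndex` shape below is purely illustrative, not Snipara's actual data model; it only shows the idea of deriving impacted symbols from the PR's changed paths.

```typescript
// Hypothetical shape: maps each indexed file to the symbols it defines.
// Snipara's real index is deterministic but more involved than this.
type SymbolIndex = Record<string, string[]>;

interface PackScope {
  changed_files: string[];
  impacted_symbols: string[];
}

// Derive a pack scope for one PR from its changed files and the repo index.
function scopePack(changedFiles: string[], index: SymbolIndex): PackScope {
  const impacted = new Set<string>();
  for (const file of changedFiles) {
    for (const sym of index[file] ?? []) {
      impacted.add(sym);
    }
  }
  return {
    changed_files: changedFiles,
    impacted_symbols: Array.from(impacted).sort(),
  };
}
```

The point of the sketch: scope is a function of the pull request, so two PRs against the same repository get different packs.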
GitHub-native surfaces
The first useful surface is GitHub itself. Snipara publishes the pack where reviewers and agents already look, while the dashboard keeps status and regeneration controls available for maintainers.
Check Run
Publishes the pack as a neutral GitHub Check Run named Snipara Answer Pack.
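A minimal sketch of the payload such a publication could send to the GitHub Checks API (`POST /repos/{owner}/{repo}/check-runs`). The field names follow GitHub's REST API; how Snipara actually wires the summary is an assumption here.

```typescript
// Payload shape per the GitHub Checks REST API.
interface CheckRunPayload {
  name: string;
  head_sha: string;
  status: "completed";
  conclusion: "neutral";
  output: { title: string; summary: string };
}

function buildAnswerPackCheck(headSha: string, packMarkdown: string): CheckRunPayload {
  return {
    name: "Snipara Answer Pack",
    head_sha: headSha,
    status: "completed",
    conclusion: "neutral", // informational: the check never blocks the PR
    output: { title: "PR Answer Pack", summary: packMarkdown },
  };
}
```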
Sticky comment
Optionally mirrors the pack into a single updatable PR comment for teams that want visible handoff notes.
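The "single updatable comment" pattern hinges on an HTML marker embedded in the comment body. A sketch of the upsert decision, using the marker from the pack schema; the comment shape is simplified from GitHub's issue-comment API:

```typescript
const STICKY_MARKER = "<!-- snipara-pr-answer-pack -->";

interface PrComment {
  id: number;
  body: string;
}

// Return the id of the existing sticky comment to update (PATCH),
// or null when a fresh comment should be created (POST).
function findStickyComment(comments: PrComment[]): number | null {
  const existing = comments.find((c) => c.body.includes(STICKY_MARKER));
  return existing ? existing.id : null;
}

// Prepend the invisible marker so later runs can locate this comment.
function renderSticky(packMarkdown: string): string {
  return `${STICKY_MARKER}\n${packMarkdown}`;
}
```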
Dashboard viewer
Shows recent packs, pack status, freshness, and manual regeneration controls inside Snipara.
Marketplace lifecycle
Records GitHub Marketplace purchase, plan-change, cancellation, and free-trial events, keeping each team's billing state isolated.
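GitHub delivers these as `marketplace_purchase` webhook events whose `action` field carries the lifecycle stage. The action strings and the `on_free_trial` flag come from GitHub's webhook documentation; the classification labels below are illustrative, not Snipara's internal names.

```typescript
// Action strings as defined by GitHub Marketplace webhooks.
type MarketplaceAction =
  | "purchased"
  | "changed"
  | "cancelled"
  | "pending_change"
  | "pending_change_cancelled";

// Hypothetical classifier: map a webhook delivery to a billing record kind.
function classifyMarketplaceEvent(action: MarketplaceAction, onFreeTrial: boolean): string {
  switch (action) {
    case "purchased":
      return onFreeTrial ? "free-trial-started" : "purchase-recorded";
    case "changed":
      return "plan-changed";
    case "cancelled":
      return "cancellation-recorded";
    default:
      return "pending-change";
  }
}
```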
What a pack contains
- PR scope
- Changed files
- Impacted symbols
- Repository context
- Architecture continuity
- Implementation history
- Repository conventions
- Agent handoff
- Caveats
Answer Pack schema v1
The current GitHub surface is the Check Run plus sticky PR comment. Treat this schema as the product contract for how an Answer Pack should be read by agents and humans.
{
  "version": "pr-answer-pack-v1",
  "repository": "owner/repo",
  "pull_request": {
    "number": 31,
    "head_sha": "eeaa70d",
    "base_branch": "main",
    "title": "Refactor authentication middleware"
  },
  "scope": {
    "changed_files": ["apps/web/src/lib/github/app.ts"],
    "impacted_symbols": ["getInstallationToken", "loadPrivateKey"],
    "context_documents": ["docs/development/CODING_STANDARDS.md"]
  },
  "sections": [
    "PR Scope",
    "Changed Files",
    "Impacted Symbols",
    "Repository Context",
    "Related Conventions And Architecture",
    "Implementation History",
    "Agent Handoff",
    "Caveats"
  ],
  "publication": {
    "github_check_name": "Snipara Answer Pack",
    "sticky_comment_marker": "<!-- snipara-pr-answer-pack -->"
  }
}
What agents should do with it
Claude Code, GitHub Copilot, Codex, Cursor, and other AI coding clients should treat the pack as the first context briefing for the PR. It does not replace reading files or running tests. It reduces rescanning and gives the agent a better starting map.
- scoped repository cognition
- relevant architectural continuity
- high-quality contextual grounding
- source-aware implementation history
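A harness can enforce the v1 contract before handing the pack to a model. A minimal TypeScript sketch, assuming the pack arrives as the JSON payload shown in the schema above; the briefing-header format itself is illustrative:

```typescript
// Subset of the pr-answer-pack-v1 fields an agent harness needs here.
interface AnswerPack {
  version: string;
  repository: string;
  sections: string[];
}

// Reject anything that is not the v1 contract before a model consumes it.
function readAnswerPack(raw: string): AnswerPack {
  const pack = JSON.parse(raw) as AnswerPack;
  if (pack.version !== "pr-answer-pack-v1") {
    throw new Error(`unsupported pack version: ${pack.version}`);
  }
  return pack;
}

// One illustrative way to fold the pack into the agent's first prompt:
// list its sections as a briefing header read before any edits.
function briefingHeader(pack: AnswerPack): string {
  return [
    `Context briefing for ${pack.repository} (read before editing):`,
    ...pack.sections.map((s) => `- ${s}`),
  ].join("\n");
}
```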
Setup
PR Answer Packs are generated by the hosted Snipara GitHub App. For repository onboarding, run npx create-snipara --github from the repository root, approve the GitHub App in the browser, and connect the detected repository. Once connected, pull request events queue Answer Packs and push events keep the repository index fresh.
npx create-snipara --github
Keep the boundaries clear
Snipara prepares the repository context. Your chosen AI client reasons and edits. The project memory stays independent from any one model, session, or user.