---
title: "Team Collaboration"
description: "Shared rules, AI pair programming patterns, PR workflows, and onboarding automation for teams."
section: "Full Guide"
readTime: "10 min"
---

# Team Collaboration
Cursor's impact compounds when the team shares rules, patterns, and workflows — not just individual developers acting independently.
## Shared Config in Git
Commit your AI config alongside your code. New devs inherit the full setup automatically.
What to commit:

```
.cursor/
  rules/
    01-code-style.md
    02-testing.md
    03-pr-review-checklist.md
  mcp.json
instructions.md
.cursorignore
```
Add this to your onboarding docs:

```
After cloning:
1. Open with Cursor (not VS Code)
2. Rules auto-load from .cursor/rules/
3. Run: npm install
4. Cursor is already configured for this repo
```
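A committed `mcp.json` gives everyone the same MCP servers. A minimal sketch of what it might contain, assuming the standard `mcpServers` config shape; the `postgres` entry, its package, and the connection string are placeholders for whatever your team actually runs:

```json
{
  "mcpServers": {
    "postgres": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-postgres", "postgresql://localhost/dev"]
    }
  }
}
```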
## Team Notepads
Use Cursor Notepads (Cmd+Shift+N) for team knowledge that doesn't belong in files but needs to be referenced during sessions.
Notepad: Architecture Decision Records (ADRs)

```markdown
# ADR-001: Event Sourcing for Orders
Status: Accepted | Date: 2024-03

## Decision
Use event sourcing for order state management.

## Why
- Full audit trail required by compliance
- Undo/replay capability for support workflows
- Read replicas without complex sync

## Tradeoffs
- Higher initial complexity
- All devs need to understand event patterns

## When building order features, always:
- Reference @src/events/ for existing event types
- Emit events via EventBus, not direct DB writes
- Test with event replay scenarios
```

Notepad: Onboarding Checklist
```markdown
# Dev Onboarding

Day 1:
- [ ] Clone repo, open in Cursor
- [ ] Create .env from .env.example
- [ ] Run: docker-compose up -d (local DB)
- [ ] Run: npm run db:seed
- [ ] Open Cursor Composer → "I'm new to this repo. Walk me through @instructions.md step by step."

Week 1:
- [ ] Read ADR-001 (event sourcing), ADR-003 (auth)
- [ ] Complete first task: pick any "good-first-issue" ticket
- [ ] Pair with AI on first PR using .cursor/rules/03-pr-review-checklist.md
```

## AI Pair Programming
Two patterns that work well for teams.
### Driver-Navigator
Human plans the approach, AI implements:
```
Navigator (human):
1. We need pagination for the search results
2. Use cursor-based pagination (not offset) — see ADR-005
3. Max 50 items per page
4. Return: results[], nextCursor, hasMore

Driver (Cursor Composer):
@src/api/search.ts @src/types/pagination.ts @instructions.md
Implement cursor-based pagination per above spec.
Follow existing pattern in @src/api/orders.ts (already has pagination).
```
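To make the navigator's spec concrete, here is a minimal TypeScript sketch of cursor-based pagination. The `SearchResult` type and the in-memory array are stand-ins for whatever `src/api/search.ts` actually queries; only the page shape (`results[]`, `nextCursor`, `hasMore`) and the 50-item cap come from the spec above.

```typescript
type SearchResult = { id: string; title: string };

interface Page {
  results: SearchResult[];
  nextCursor: string | null;
  hasMore: boolean;
}

const MAX_PAGE_SIZE = 50; // per the navigator's spec

// Cursor-based pagination: the cursor is the id of the last item on the
// previous page, so concurrent inserts/deletes don't shift page
// boundaries the way offset pagination does.
function paginate(
  all: SearchResult[],
  cursor: string | null,
  limit = MAX_PAGE_SIZE
): Page {
  const size = Math.min(limit, MAX_PAGE_SIZE);
  // Unknown or null cursor falls back to the start of the list.
  const idx = cursor === null ? -1 : all.findIndex((r) => r.id === cursor);
  const start = idx + 1;
  const results = all.slice(start, start + size);
  const hasMore = start + size < all.length;
  return {
    results,
    nextCursor: hasMore ? results[results.length - 1].id : null,
    hasMore,
  };
}
```

In a real route the slice would be a `WHERE id > cursor LIMIT n` query rather than an in-memory scan, but the returned shape stays the same.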
### Parallel Exploration
Two Composer windows, same problem, different approaches — compare results:
```
# Window A
Approach: Redis cache for search results.
30-min TTL, cache key = hash(query + filters).
@src/api/search.ts @src/lib/redis.ts
```

```
# Window B
Approach: React Query with stale-while-revalidate.
No server cache, aggressive client-side cache.
@src/api/search.ts @src/hooks/
```

Compare both implementations, then pick (or merge best parts).
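Window A's key scheme can be sketched independently of Redis. This assumes a `search:<sha256>` key format and sorted filter serialization, so equivalent filter objects hit the same cache entry; the actual client call (e.g. a 1800-second `setEx`) would live in `src/lib/redis.ts`.

```typescript
import { createHash } from "node:crypto";

const TTL_SECONDS = 30 * 60; // the 30-min TTL from the Window A spec

// Deterministic cache key: hash(query + filters). Filter keys are
// sorted so { size, color } and { color, size } produce the same key.
function cacheKey(query: string, filters: Record<string, string>): string {
  const sorted = Object.keys(filters)
    .sort()
    .map((k) => `${k}=${filters[k]}`)
    .join("&");
  return "search:" + createHash("sha256").update(`${query}?${sorted}`).digest("hex");
}
```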
## PR Review Workflow
Before opening a PR — self-review:
```
@git-diff (staged changes)
@.cursor/rules/03-pr-review-checklist.md
@instructions.md

Self-review this PR before I submit it.
Check each item in the PR checklist.
Flag anything that needs fixing.
```
PR checklist rule (`.cursor/rules/03-pr-review-checklist.md`):

```markdown
---
type: rule
alwaysApply: false
---

# PR Review Checklist

## Automated
- [ ] All tests pass
- [ ] No TypeScript errors
- [ ] No lint errors

## Code Quality
- [ ] No `any` types
- [ ] No magic numbers (use named constants or design tokens)
- [ ] Error states handled
- [ ] Loading states handled

## Security
- [ ] All API inputs validated with Zod
- [ ] No secrets or credentials in code
- [ ] Authorization checks on all protected routes
- [ ] No direct SQL queries (use ORM methods)

## Testing
- [ ] Unit tests for new utilities
- [ ] Integration test for API changes
- [ ] E2E test for user-facing features

## Performance
- [ ] No N+1 queries (use batch/include in Prisma)
- [ ] Images use Next.js Image component
- [ ] Client bundles — no unnecessary large imports
```

During review (as reviewer):
```
@git-diff
@.cursor/rules/

Code review this PR.
Check for:
1. Logic errors or edge cases
2. Security issues (OWASP Top 10)
3. Missing tests for the changed behavior
4. Violations of our conventions (@instructions.md)

For each finding: file path + line + explanation + suggested fix.
```
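The checklist's N+1 item is the one reviewers catch most often, so it is worth seeing the difference in code. A self-contained sketch with a counting stub standing in for a real ORM client such as Prisma; the `Order`/`User` types and `findUsersByIds` are illustrative, not from the repo:

```typescript
type Order = { id: number; userId: number };
type User = { id: number; name: string };

let queryCount = 0; // counts round trips to the fake data layer

const users: User[] = [
  { id: 1, name: "Ada" },
  { id: 2, name: "Lin" },
];

// Stand-in for one database query (e.g. a WHERE id IN (...) select).
function findUsersByIds(ids: number[]): User[] {
  queryCount++;
  return users.filter((u) => ids.includes(u.id));
}

// The N+1 shape would call findUsersByIds once per order. The batched
// shape below collects the distinct ids first and issues ONE query.
function loadUsersForOrders(orders: Order[]): Map<number, User> {
  const ids = [...new Set(orders.map((o) => o.userId))];
  const byId = new Map<number, User>();
  for (const u of findUsersByIds(ids)) byId.set(u.id, u);
  return byId;
}
```

With Prisma the same effect usually comes for free via `include` on the parent query; the point the checklist enforces is one batched query per relation, not one per row.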
## Onboarding Automation
New devs can onboard themselves with AI guidance:
```
# Day 1 prompt for new dev
"I'm new to this codebase. Help me set up.
@instructions.md
@.env.example
@package.json

1. Walk me through the project structure
2. What's the recommended first setup command?
3. What are the 3 most important conventions to know?
4. What's a good first task I could tackle?"
```
After completing the first feature:

```
"I'm about to open my first PR.
@git-diff
@.cursor/rules/03-pr-review-checklist.md

Walk me through the checklist. Check each item against my changes."
```
## Team Velocity Metrics
Reference numbers from teams using shared Cursor rules:
| Metric | Before | After | Change |
|---|---|---|---|
| PRs merged per week | 23 | 32 | +39% |
| Days to merge | 3.2 | 2.1 | -34% |
| Bug escape rate | baseline | 27% below baseline | -27% |
| Onboarding time (to first PR) | 1 week | 2 days | -71% |
Key driver: shared `.cursor/rules/` means every dev (senior and junior) gets the same pattern guidance automatically.
## Making It Stick
Common blockers and fixes:
| Problem | Fix |
|---|---|
| Rules ignored | Check the `alwaysApply: true` frontmatter in `.cursor/rules/` |
| Devs not using Cursor | Make rules help (not block) — rules that guide, not just warn |
| instructions.md diverges | Treat it like code — PR required to change it |
| Context drift in long sessions | Enforce the 20-exchange reset (add it to your team process doc) |
| Inconsistent patterns across team | Run a @codebase audit monthly, update rules with findings |