Get structured feedback on your requirements specification
Upload your PDF and receive standards-based feedback within minutes — specific findings per requirement, actionable improvement suggestions, and a quality score across four categories.
Tips for a high score
Before uploading, check that your requirements section includes:
- A clear chapter heading containing "Requirements" or "Eisen" — this is how the system finds your requirements section
- A table with one requirement per row
- Unique IDs for each requirement (e.g. FR-01, NFR-03)
- MoSCoW priorities — not everything is Must
- Measurable thresholds with numbers and units
- A verification method per requirement (test, inspection, demo)
Example: what a well-structured requirements table looks like
| ID | Priority | Requirement | Threshold | Verification |
|---|---|---|---|---|
| FR-01 | Must | System responds to user input | < 200 ms | Performance test |
| FR-02 | Should | Sensor detects objects at distance | ≥ 5 m | Field test |
| NFR-01 | Must | System available during operating hours | ≥ 99.5% | Uptime monitoring |
| NFR-02 | Could | Supports wireless firmware update | OTA success | Integration test |
How it works
- Upload your requirements specification as a PDF.
- Wait for the AI to review your document. This usually takes a few minutes, but may take longer if the server is busy.
- Receive detailed feedback in your inbox — per-requirement findings, top 3 improvements, and quality scores.
What you get
- Quality scores across completeness, testability, structure, and consistency
- Specific findings with improvement suggestions linked to individual requirements
- Top 3 actionable improvements to focus on first
- Structure checklist — are IDs, MoSCoW priorities, and measurable thresholds present?
- Problem statement review with clarity and scope feedback
See a sample review
Top 3 improvements:
- Add measurable thresholds to requirements that currently lack numbers or units (e.g. FR-03, NFR-02)
- Include a verification method column (test, inspection, demo) so each requirement traces to a V-model activity
- Replace vague terms like "fast" and "reliable" with specific acceptance criteria
Per-requirement findings:

| Req | Severity | Issue | Suggestion |
|---|---|---|---|
| FR-03 | major | "The system shall respond quickly" — no measurable threshold | Try: "The system shall respond within 200 ms" |
| NFR-02 | minor | Compound requirement: covers both reliability and availability | Split into two separate requirements with distinct acceptance criteria |
This is a representative example. Your actual review will be tailored to your document.
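To illustrate the kind of checks shown in the sample review, here is a minimal sketch in Python. It is not the actual vmodel.eu analysis; the vague-term list and unit pattern are illustrative assumptions. It flags a requirement when the text uses a vague adjective or contains no numeric threshold with a unit.

```python
import re

# Illustrative only: a minimal sketch of two checks from the sample review,
# not the real vmodel.eu pipeline.
VAGUE_TERMS = {"fast", "quickly", "reliable", "user-friendly", "easy"}
# A number followed by a unit, e.g. "200 ms", "5 m", "99.5%" (assumed unit list)
THRESHOLD = re.compile(r"\d+(\.\d+)?\s*((ms|s|m|km|Hz|MB|GB)\b|%)")

def review(req_id: str, text: str) -> list[str]:
    """Return a list of findings for a single requirement."""
    findings = []
    words = {w.strip(".,").lower() for w in text.split()}
    vague = words & VAGUE_TERMS
    if vague:
        findings.append(f"{req_id}: vague term(s) {sorted(vague)} - replace with acceptance criteria")
    if not THRESHOLD.search(text):
        findings.append(f"{req_id}: no measurable threshold (number + unit)")
    return findings

print(review("FR-03", "The system shall respond quickly."))
print(review("FR-01", "The system shall respond within 200 ms."))
```

Under these assumptions, FR-03 is flagged twice (vague term, no threshold) while FR-01 passes both checks.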
Why vmodel.eu?
- Domain-specific: Trained on real engineering specifications, not generic text. Understands MoSCoW, traceability, and testability.
- Structured output: Not a wall of text — you get a scored checklist, per-requirement findings, and prioritised improvements.
- Privacy-first: Your document is processed on EU-based infrastructure. An anonymized copy may be retained to improve the tool; you can opt out during upload or request deletion at any time.
- Free for education: Available to engineers, students, and staff at universities and schools. No account, no subscription.
Privacy
Your document never leaves the EU. Upload, processing, and email delivery all run on European servers. By default, your PDF and the generated feedback are retained to improve this tool — no personal information (name, email) is stored with the retained data. You can opt out during upload or request deletion at any time. Data is never used for model training or profiling.
About
vmodel.eu is built and maintained at HAN University of Applied Sciences, Faculty of Engineering, Arnhem, the Netherlands. It was created to help engineers write better requirements specifications, grounded in IEEE 29148, INCOSE guidance, and the V-model.
Questions or feedback? Get in touch.