Avoiding Over- and Under-Decomposition
Too many levels in a DFD can bury the user in detail. Too few, and critical flows go unseen. The challenge isn’t just about how far to go—it’s about knowing when to stop.
When I first worked on a government benefits system, I decomposed a single process into ten child diagrams. The team couldn’t follow it. We’d fallen into over-decomposition—chasing detail instead of clarity.
The real issue? Not understanding that the goal isn’t to list every step, but to preserve data integrity across levels. The right level of DFD detail balances expressiveness with maintainability. This chapter delivers the field-tested approach I’ve used in 200+ analysis projects.
You’ll learn how to spot over-decomposition and under-decomposition through real-world signals, use decision rules to validate depth, and apply a simple framework to target the correct level of detail—without guesswork.
Recognizing the Two Ends of the Spectrum
Over-Decomposition: When Detail Becomes Noise
Over-decomposition occurs when a process is broken into so many sub-processes that the original intent is lost.
I’ve seen diagrams with 15+ levels. They’re not models—they’re flowcharts masquerading as analysis.
Look for these signs:
- Processes with compound labels, like “Verify identity and validate user credentials”
- Sub-processes that only perform one action each
- Diagrams with no clear grouping or hierarchy
- A tangle of trivial flows crisscrossing between child processes
When a step like “Check if the user is over 18” appears in a child diagram, it’s a red flag. That’s a decision point, not a process. It should be modeled as a decision node, not a decomposition.
Under-Decomposition: The Risk of Oversimplification
Under-decomposition hides critical data movement. One client assumed “Process Payment” was sufficient. But no one could trace how funds moved or where the transaction status was stored.
Under-decomposition often appears when:
- Processes contain more than 3–4 inputs or outputs
- Processes like “Handle Customer Request” have no breakdown
- Key data stores are invisible or unlinked
- External entities receive flows with no clear source
These are not just design flaws. They’re compliance risks. If you can’t trace how data moves, you can’t audit it.
Establishing the Right Level of DFD Detail
There’s no universal rule for how many levels to use. But there are principles that guide when a process is decomposed far enough.
Ask yourself: “Would a stakeholder need to know this detail to understand the system?” If not, it’s likely over-decomposition.
Use this three-part framework to assess your decomposition:
- Isolate the atomic function—the smallest action that changes data state.
- Check for data consistency—all inputs and outputs must appear in the parent diagram.
- Evaluate for complexity—if a sub-process has more than 3–4 flows, it may need further breakdown.
When you reach a process that has only one input, one output, and no new data stores, you’ve likely reached the atomic level. That’s your stopping point.
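The stopping rule above is mechanical enough to sketch in code. Here is a minimal illustration in Python, using a hypothetical `Process` record (the field names are my own assumptions, not a standard DFD schema):

```python
from dataclasses import dataclass, field

@dataclass
class Process:
    """One DFD process bubble: named input/output flows and attached data stores."""
    name: str
    inputs: list[str] = field(default_factory=list)
    outputs: list[str] = field(default_factory=list)
    stores: list[str] = field(default_factory=list)

def is_atomic(p: Process) -> bool:
    """Stopping rule: one input, one output, and no new data stores."""
    return len(p.inputs) == 1 and len(p.outputs) == 1 and not p.stores

def needs_breakdown(p: Process, max_flows: int = 4) -> bool:
    """Complexity check: more than 3-4 total flows suggests further decomposition."""
    return len(p.inputs) + len(p.outputs) > max_flows
```

A process that passes `is_atomic` and fails `needs_breakdown` is a candidate stopping point; the judgment calls (distinct function, business meaning) still belong to the analyst.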
Example: Customer Order Processing
Consider a Level 1 DFD process: Process Order.
At level 2, it decomposes into:
- Verify Order Details
- Check Inventory Availability
- Calculate Total with Tax
- Generate Invoice
- Assign Shipment Date
These are the right level. Each has a clear input and output, no internal data stores, and performs a distinct function.
Decomposing “Calculate Total with Tax” into “Multiply Subtotal by Tax Rate” and “Add Tax to Subtotal” would be over-decomposition. The arithmetic is implied in the process. The data flow is complete.
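To make the example concrete, the Level 2 breakdown can be written out as a table of sub-processes, each with exactly one input and one output (the flow names here are illustrative, not taken from a real model):

```python
# Level 2 decomposition of "Process Order": sub-process -> (input flow, output flow).
# Flow names are hypothetical; each sub-process is atomic by the stopping rule.
level2 = {
    "Verify Order Details":         ("customer order",  "validated order"),
    "Check Inventory Availability": ("validated order", "stock status"),
    "Calculate Total with Tax":     ("priced order",    "order total"),
    "Generate Invoice":             ("order total",     "invoice"),
    "Assign Shipment Date":         ("invoice",         "shipment schedule"),
}

for name, (inp, out) in level2.items():
    # One input, one output each: no sub-process needs further breakdown.
    assert inp and out, f"{name} is missing a flow"
```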
Decision-Making Framework for DFD Decomposition
Use this checklist when deciding whether to decompose a process:
| Criterion | If yes | If no |
|---|---|---|
| More than 3–4 inputs/outputs? | Decompose | Keep as is |
| Contains multiple distinct functions? | Decompose | Keep as is |
| Has a data store that’s not linked? | Decompose | Keep as is |
| Functionally complex (e.g., involves decisions, loops)? | Decompose | Keep as is |
When two or more criteria apply, decompose. When only one applies, consider whether the process is already atomic.
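The checklist reduces to a simple scoring rule. A sketch in Python, where the four flags correspond to the four rows of the table (the function and parameter names are my own); it returns True only when the “two or more criteria” rule fires, leaving the single-criterion case to the analyst’s judgment:

```python
def should_decompose(flow_count: int, multiple_functions: bool,
                     has_unlinked_store: bool, complex_logic: bool) -> bool:
    """Apply the decomposition checklist: decompose when 2+ criteria apply."""
    criteria = [
        flow_count > 4,       # more than 3-4 inputs/outputs
        multiple_functions,   # several distinct functions in one bubble
        has_unlinked_store,   # a data store with no connecting flow
        complex_logic,        # decisions or loops inside the process
    ]
    return sum(criteria) >= 2
```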
Remember: Decomposition isn’t about complexity—it’s about clarity. A process should be decomposed only if doing so improves understanding of data movement.
When to Stop Decomposing
Not all processes need to be broken down. Stop when:
- The process performs a single, well-defined action.
- It has a direct correspondence to a business rule or transaction.
- Its input and output flows are traceable from the parent diagram.
- It cannot be meaningfully separated without losing context.
I once modeled a healthcare scheduling system. “Reschedule Appointment” was decomposed into “Update Date,” “Notify Patient,” and “Reassign Staff.” That was the right level. Going further to “Send Email” or “Format Date String” would have crossed into coding territory.
Decomposition stops when the process is not just atomic—but meaningful.
When you find yourself asking “should this be split?”—ask instead: “does splitting it improve understanding of how data changes?” If the answer is no, stop.
Practical Tips to Avoid Pitfalls
Here are five habits I’ve built into my own modeling workflow to maintain DFD decomposition balance:
- Label processes with verbs—but only when they represent a transformation. “Verify,” “Calculate,” “Generate” are acceptable. “Perform task” or “Handle” are too vague.
- Use data dictionaries early—before decomposing, define all data flows. This ensures consistency.
- Apply the “5-second rule”—a stakeholder should understand the process in under 5 seconds. If not, it needs simplification.
- Review against the parent—every input and output in a child diagram must be traceable to the parent.
- Limit child processes to 5–7 per parent—more than that, and the diagram becomes unwieldy.
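The “review against the parent” habit is the classic DFD balancing check, and it can be partly automated. A rough Python sketch, assuming flows are matched by name; flows that are both produced and consumed within the child level are internal to the decomposition and exempt:

```python
def check_balance(parent_in: set[str], parent_out: set[str],
                  child_in: set[str], child_out: set[str]) -> list[str]:
    """Return flows in a child diagram that cannot be traced to the parent.

    Flows both produced and consumed inside the child level are internal
    and do not need to appear on the parent diagram.
    """
    internal = child_in & child_out
    problems = []
    for flow in sorted((child_in - internal) - parent_in):
        problems.append(f"child input '{flow}' has no source on the parent")
    for flow in sorted((child_out - internal) - parent_out):
        problems.append(f"child output '{flow}' does not appear on the parent")
    return problems
```

An empty result means the levels balance; anything else points to a flow that was invented, renamed, or dropped during decomposition.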
These aren’t rules. They’re guardrails. Use them to test your own work.
Summary: The Balance Is in the Flow
DFD decomposition balance is not a fixed number of levels. It’s a state of clarity.
An over-decomposed DFD creates noise. An under-decomposed one hides complexity. The right level of detail exposes the flow without overwhelming the reader.
Use the framework above to test your work. Ask: “Would this help someone understand where data goes and how it changes?” If yes, you’re on track.
Remember: the goal isn’t to decompose until you’re done. It’s to decompose until you’re clear.
Frequently Asked Questions
What is the ideal number of DFD levels?
There’s no ideal number. Most systems use 2–3 levels: Level 0 (the context diagram), Level 1 (high-level processes), and Level 2 (detailed sub-processes). More than three levels usually indicates over-decomposition.
How do I know if I’ve over-decomposed a DFD?
Check for child processes with only one input and one output, labels like “check,” “verify,” or “send,” or child diagrams with no data stores. If a sub-process performs less than one complete data transformation, it’s likely over-decomposed.
Can under-decomposition cause compliance issues?
Absolutely. Under-decomposition hides data movement. If you can’t trace how data flows between systems, you can’t meet GDPR, SOX, or internal audit requirements.
Is there a tool to auto-validate DFD decomposition balance?
Yes—Visual Paradigm and similar tools flag inconsistencies in flows and missing data stores. But they can’t replace human judgment. Use them for validation, not decision-making.
What should I do if a process has too many inputs/outputs?
Break it down using the decision rules above. If it has more than 4 flows, decompose into smaller, focused processes. Always ensure the parent diagram reflects the child’s inputs and outputs.
How do I decide when to stop decomposing?
Stop when the process is atomic, has only one transformation, and no internal data store. If decomposing further adds no insight—stop. The right level of DFD detail is when the flow speaks for itself.