AI tools usually work by taking a few details you enter—injury type, treatment timeline, and the rough scale of losses—and turning that into a “likely range.” That can be helpful for orientation.
But many Minot residents discover that the estimate doesn’t match reality, because real cases hinge on issues AI forms typically don’t capture, such as:
- Whether the chart supports the timeline (when symptoms began, when they were acted on, and what was documented)
- Whether causation is medically defensible (that the provider’s actions, not another condition, caused the harm)
- Whether the harm is measurable and ongoing (functional limits, worsening prognosis, and future care needs)
In other words: AI can sketch categories; it can’t replace the evidence required to prove negligence in a real North Dakota claim.
