Most AI tools work by taking a handful of limited inputs—age, injury type, relationship to the deceased, and a few expense categories—and then producing a “range.” That can be a helpful starting point for thinking through losses, but it often breaks down when real-world facts don’t match the model’s assumptions.
In Marshall, many cases hinge on details like:
- How the collision occurred (speed, lane position, visibility, weather, and what witnesses actually observed)
- Whether reports are consistent (police narratives, EMS documentation, and later medical conclusions)
- Which party controls the risk (drivers, employers, property owners, maintenance contractors)
- Whether comparative fault is disputed
If the defense can argue that someone else was more responsible—or that the death was caused by something other than the alleged wrongful act—an AI tool’s “typical outcome” may bear little resemblance to how insurers and Missouri courts ultimately value the claim.