Four astronauts came home on April 1, 2026. The Orion capsule hit the Pacific, the recovery teams moved in, and within hours NASA confirmed the crew was safe. Then the quieter work started: engineers photographing every square inch of the heat shield, cataloguing ablation patterns, comparing thermal data against pre-flight models. Early assessments, reported by NASA, Space.com, and The Register, suggest the shield performed well, a relief given the unexpected cracking and charring discovered after the uncrewed Artemis I reentry in 2022.

But relief is a feeling, and feelings have a poor track record inside NASA. On February 1, 2003, the space shuttle Columbia broke apart over Texas during reentry, killing all seven crew members. A chunk of insulating foam had struck the orbiter's left wing during launch sixteen days earlier. Engineers flagged the strike. Managers concluded the risk was acceptable. Michael Cabbage and William Harwood's Comm Check... reconstructs, in documentary detail, how that conclusion was reached and what it cost.

Most coverage of the Artemis II heat shield assessments focuses on measurable quantities: ablation rates, thermal margins, material behavior under reentry heating. Engineering updates belong in that register. But Columbia's thermal protection failed in a way that was technically identifiable before the orbiter ever began its descent; the lethal breakdown was organizational. NASA's management hierarchy in 2003 allowed risk assessments to be softened, reframed, and diluted as they traveled upward, so that a serious engineering concern could reach a senior manager's desk sounding merely procedural. Cabbage and Harwood trace that distortion meeting by meeting, email by email. Their reporting poses a question the current post-flight reviews cannot answer with thermal data alone: have the internal systems that process and escalate risk at NASA changed enough to prevent the same kind of filtering?

Cabbage and Harwood were veteran space journalists before Columbia, with a combined three decades on the shuttle beat. That experience bought them access. Comm Check... draws on dozens of exclusive interviews, internal recordings, and documents that had not previously been made public. The account they assemble is granular where official investigation reports, by institutional necessity, tend to generalize. The book's sharpest work reconstructs the sixteen days between Columbia's launch on January 16, 2003, and its destruction.

During ascent, a briefcase-sized piece of foam separated from the external tank and hit the leading edge of the orbiter's left wing. Engineers on the Debris Assessment Team requested satellite imagery to evaluate the damage. That request was routed through management and denied. Cabbage and Harwood show, through transcripts and internal correspondence, that the denial was the output of a system in which schedule pressure, budget constraints, and hierarchical communication habits made accepting risk easier than investigating it.

The authors then map the feedback loops that let safety culture weaken without anyone making a conscious decision to lower standards. Foam strikes had occurred on earlier missions without catastrophic results, and each uneventful flight made the next strike seem more tolerable. Sociologist Diane Vaughan calls this the normalization of deviance. Cabbage and Harwood give it operational texture: specific budget numbers, specific scheduling conflicts, specific conversations where an engineer's conditional warning about possible tile damage became, by the time it reached senior leadership, a reassurance that foam impacts were a known and accepted phenomenon. You can trace the distortion sentence by sentence through their sourcing. It is a portrait of failure that operates at the level of word choice, meeting minutes, and forwarded emails.

One weakness: the seven crew members appear mostly in the opening and closing chapters, and their presence is handled with respectful brevity that sometimes thins into absence. For an account so focused on the human cost of institutional malfunction, the astronauts at the center of the loss remain more emblematic than individual. Cabbage and Harwood are system-oriented journalists, and that orientation is their strength, but the narrative pays a price for it.

The strongest passages put you inside conference rooms where program managers weighed fragmentary data against competing pressures. A recurring, almost sickening detail is how the vocabulary of risk shifted depending on who was talking to whom. The same thermal concern could sound urgent in one meeting and routine in the next, depending on the audience and the speaker's place in the hierarchy. The book documents these shifts with enough precision that the distortion becomes visible as a structural property of the organization, something built into the reporting chain rather than traceable to any single villain.

Comm Check... is a specific, sourced, and sometimes uncomfortable account of how an agency's internal processes produced a catastrophe that its own engineers saw coming. In 2026, with NASA working through Artemis II data and planning Artemis III, the book serves as something more practical than a memorial. It is a field guide to the exact organizational failures that turn manageable risks into fatal ones. It will not tell you whether Orion's heat shield is safe. It will show you, in close and documented detail, what it looked like when an institution decided something was safe without finishing the work to confirm it.