I don't think that is true. MKBHD just put out a video demonstrating a video wall that updates in sync with the camera frame rate, allowing two cameras to see different backgrounds, e.g. two different colors or even two different images; the demo showed a parallax effect.
Actually, checking the paper, they tested that: reducing the shutter speed, realigning nearby frames, and adding motion blur back in.
As a vfx compositor, while I'd be OK with this workflow, it doesn't come without its own issues and artifacts.
Thanks for looking at the paper. It forced me to go read it:
"We address this by increasing the repeating rate of the two lighting conditions at 72HZ, so that the lighting changes from one color to the next every 144th of a second. The lighting then appears nearly constant, with a remaining effect being that rapidly moving objects leave a trail of magenta/green outlines when seen against the screen, as in Figure 6 (bottom)."
"The remaining drawback of time-multiplexing in this manner is that the shorter shutter angle reduces the amount of motion blur, which is considered desirable for cinema."
Interesting that the complaint is that it reduces motion blur.
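The numbers make the motion-blur complaint concrete. A rough sketch, assuming a 24 fps camera (my assumption; the quote only gives the 144 Hz condition rate): to expose a single lighting condition, the shutter can stay open at most 1/144 s, which caps the shutter angle at 60 degrees, a third of the 180-degree cinema convention.

```python
# Back-of-the-envelope check of the shutter-angle limit implied by the
# paper's time-multiplexing scheme. The 144 Hz condition rate is from the
# quote above; the 24 fps frame rate is my assumption (standard cinema).

FPS = 24.0             # assumed cinema frame rate
CONDITION_RATE = 144.0 # lighting changes every 1/144 s (from the quote)

# To capture only one lighting condition, the exposure can last at most
# one condition interval.
max_exposure = 1.0 / CONDITION_RATE  # seconds, ~6.94 ms

# Shutter angle = fraction of the frame period spent exposing, times 360.
shutter_angle = 360.0 * max_exposure * FPS  # 360 * 24/144

print(f"max exposure: {max_exposure * 1000:.2f} ms")
print(f"max shutter angle: {shutter_angle:.0f} degrees")
```

With only a third of the usual exposure window, fast motion smears far less per frame, which is why the workflow then has to realign frames and synthesize blur back in.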