The app gives you control over things like the color space, codecs, lens correction, LUTs, etc., as well as better monitoring and manual adjustments. For that reason I don't think a straight-out-of-camera comparison is all that useful, since the results depend heavily on how you use those options and on whether you intend to color grade afterwards. Without adjusting the defaults, you'll get a 4K H.265 Rec. 709 video, whereas the stock camera app will give you an HDR video, which might look better straight out of the camera, provided the exposure and camera work are equally good.
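For a sense of where that Rec. 709 vs. HDR difference comes from on the capture side, here's a rough sketch of how an app might opt the camera into HLG HDR with AVFoundation. It assumes `session` and `camera` are an already-configured `AVCaptureSession` and its backing `AVCaptureDevice`, and it's not a claim about how any particular camera app actually does this.

```swift
import AVFoundation

// Hypothetical sketch: opting a capture device into HLG HDR instead of the
// SDR/Rec. 709 default. `session` and `camera` are assumed to already exist.
func enableHLG(on camera: AVCaptureDevice, in session: AVCaptureSession) throws {
    // Stop the session from silently picking a color space on our behalf.
    session.automaticallyConfiguresCaptureDeviceForWideColor = false

    try camera.lockForConfiguration()
    defer { camera.unlockForConfiguration() }

    // Only switch if the active format actually advertises HLG BT.2020.
    if camera.activeFormat.supportedColorSpaces.contains(.HLG_BT2020) {
        camera.activeColorSpace = .HLG_BT2020
    }
}
```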
The iPhone's AVAsset*/AVFoundation APIs are... intense. Even for "simple" stuff nowadays, the amount of work you have to do just to get bootstrapped is a lot. But they also seem insanely powerful for all kinds of things, and they'd make it possible to do a whole heck of a lot without having to resort to hardware hacking.
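As a rough illustration of that bootstrapping, here's a minimal, simplified Swift sketch of the capture side of AVFoundation: device discovery, input/output wiring, codec selection, and a recording delegate, all before you've written any "app" at all. Error handling, permission prompts, and the preview layer are omitted, and names like `MinimalRecorder` and `RecorderError` are just placeholders.

```swift
import AVFoundation

// Placeholder error type for the sketch below.
enum RecorderError: Error { case noCamera }

final class MinimalRecorder: NSObject, AVCaptureFileOutputRecordingDelegate {
    let session = AVCaptureSession()
    let movieOutput = AVCaptureMovieFileOutput()

    func configure() throws {
        session.beginConfiguration()
        session.sessionPreset = .hd4K3840x2160

        // Even the "simple" path needs device discovery, input wrapping,
        // and explicit wiring into the session.
        guard let camera = AVCaptureDevice.default(.builtInWideAngleCamera,
                                                   for: .video,
                                                   position: .back) else {
            throw RecorderError.noCamera
        }
        let input = try AVCaptureDeviceInput(device: camera)
        if session.canAddInput(input) { session.addInput(input) }
        if session.canAddOutput(movieOutput) { session.addOutput(movieOutput) }
        session.commitConfiguration()

        // The codec is chosen per connection, and only after the output
        // has been attached to the session.
        if let connection = movieOutput.connection(with: .video),
           movieOutput.availableVideoCodecTypes.contains(.hevc) {
            movieOutput.setOutputSettings([AVVideoCodecKey: AVVideoCodecType.hevc],
                                          for: connection)
        }
    }

    func record(to url: URL) {
        // startRunning() blocks, so real code would call this off the main
        // thread (and only after camera permission has been granted).
        session.startRunning()
        movieOutput.startRecording(to: url, recordingDelegate: self)
    }

    // Required delegate callback; fires once the movie file is finalized.
    func fileOutput(_ output: AVCaptureFileOutput,
                    didFinishRecordingTo outputFileURL: URL,
                    from connections: [AVCaptureConnection],
                    error: Error?) {
        print("finished recording:", outputFileURL, error ?? "ok")
    }
}
```

And that's just recording to a file; the AVAsset* side (compositions, readers/writers, exports) adds another layer on top if you want to process what you captured.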