Hacker News

>It's that they're making choices for us, and these are not the choices we would make for ourselves.

Of course, you can always take off the training wheels and shoot raw.

Apple even worked with Adobe to create ProRaw, which lets you selectively turn parts of their image processing pipeline on or off after you capture the image.

https://lux.camera/understanding-proraw/



ProRaw is demosaiced and includes the computational photography stuff as well.

The big picture is that cameras profoundly influence how we see and share ourselves, others, and the world. Fairly recently, cameras (and their developers) started making opinionated decisions for us about things like the color of sky, the texture of skin, and much more. Sure, any one of us can opt out and do things RAW, but the rest of the world will go on taking pictures with these decisions baked in. I don't think the answer is RAW, because the problem is not pixel data -- it is who controls image making in the first place.


What’s your proposed way of solving the issue? HDR can be considered closer to how our own eyes work; if we're being pedantic, the “old-school” model is much more foreign than the NN-enhanced one. Sure, one can overdo it, e.g. replacing a photograph of the moon with a pre-existing image of it, but I don’t think that NN-based color balance, HDR, and the like are worsening the problem.
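For context, the "old-school" HDR step of compressing a wide luminance range into display range can be sketched with the Reinhard global tone-mapping operator. This is a deliberately simplified sketch, not any particular phone's pipeline:

```python
import numpy as np

def reinhard_tonemap(luminance):
    """Reinhard global operator: maps luminance in [0, inf) into [0, 1).

    Bright values are compressed heavily, dark values nearly pass through,
    which is loosely analogous to how our eyes adapt across a scene.
    """
    luminance = np.asarray(luminance, dtype=float)
    return luminance / (1.0 + luminance)

# A scene spanning four orders of magnitude of luminance
hdr = np.array([0.01, 0.1, 1.0, 10.0, 100.0])
ldr = reinhard_tonemap(hdr)
```

The output stays monotonic (brighter inputs remain brighter) while fitting everything into displayable range, which is the basic contract any tone mapper, NN-based or not, has to honor.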


I don't see how this is different from the film stock you shot on, the lens you used, or the format you shot in.


> Fairly recently, cameras (and their developers) started making opinionated decisions for us about things like the color of sky

That's called "auto white balance", and we've been stuck with it since the advent of digital cameras. Phones don't do much more than that; if they detect the sky, it's to adjust denoising so it doesn't cause banding artifacts, etc.
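At its simplest, auto white balance can be sketched with the gray-world heuristic: assume the scene averages to neutral gray and rescale each channel toward that average. Real camera pipelines are far more sophisticated, but this shows the basic idea of the camera "deciding" color for you:

```python
import numpy as np

def gray_world_awb(img):
    """Gray-world auto white balance for an HxWx3 float image in [0, 1].

    Scales each color channel so its mean matches the overall mean,
    removing a global color cast under the gray-world assumption.
    """
    means = img.reshape(-1, 3).mean(axis=0)   # per-channel means (R, G, B)
    gains = means.mean() / means              # push each channel toward gray
    return np.clip(img * gains, 0.0, 1.0)

# Example: a flat image with a warm (reddish) cast
img = np.dstack([
    np.full((4, 4), 0.8),  # R boosted
    np.full((4, 4), 0.5),  # G
    np.full((4, 4), 0.4),  # B suppressed
])
balanced = gray_world_awb(img)
```

After correction the three channel means are equal, i.e. the cast is neutralized; the point being that even this "dumb" classic version is already an opinionated decision about what the sky (or anything else) should look like.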



