David Mellor, a philosopher and artist who currently researches AI and ethics, shares his take on Samsung phones using AI tools for night photography, specifically their 'fake' moon photos, and why the public outrage around them may be an overreaction.
When is the moon not the moon? I don’t mean like in the 2022 film Moonfall where it turns out to be an alien mega-structure (don’t worry, that’s given away in the trailer). No, I’m referring, of course, to all the recent fuss about Samsung’s S23 Ultra and whether it can really take crisp shots of the moon.
It turns out that, according to a press release from Samsung, the phone invents details that weren't in the source image captured by users. Samsung uses what it calls a Scene Optimizer to add moon-like details to nocturnal snaps of our only natural satellite. This not only eliminates noise from the low-resolution source, it also enhances the image using a 'detail enhancement engine'.
Simply put, this is a machine learning model that uses learning data to supplement an image with sharper details than were captured by the camera sensor: in this case, the dark seas, significant craters, and mountain ranges.
There seems to be some intriguing psychology involved in all the criticism (and public outrage) about Samsung's tech and what some perceive as the sleight of hand involved in the moon shots.
I wonder if this has something to do with the recent yet very deep connections between our identities and our phones, coupled with a primordial affinity with the moon. In an unstable and uncertain world, it’s a reassuring, calm constant. That we could turn our most personal of devices upwards and engage in the timeless human wonder of contemplating our companion in the firmament… well, it seems there’s something rather special in that.
Consequently, it does seem a bit clumsy of Samsung to make the apparent ability to take clear moon photos a key advertising message. The point is that this type of camera tech is hardly new. No one has been pretending that in-camera machine learning processes that make crisper, clearer details don’t already exist. But a nerve has most certainly been struck.
And once again people find themselves asking fundamental questions about what a photograph really is anyway.
So what's going on here? Some ancient philosophy might hold the answer. The Greeks didn't have the concept of technology that we're familiar with today. Instead they used the term technē, which can mean two things: technics or art.
Technics is roughly concerned with how we add things to our lives that don't exist in nature, things that have obvious utility, through a 'know-how' that nature seems to have neglected. Art is also a practical activity, but unlike technics it concerns those things about being human that can never be reduced to, or directly supplement, our nature.
Photography is a form of technē. It’s both technics and art. It supplements our ocular capacities and externalises our memories. It also creates breath-taking forms of beauty that supersede anything found in nature. It does both of these by drawing on and then exceeding the raw materials provided by nature itself. You’ll doubtless be familiar with the similar yet modern argument that photography is both a science and an art.
What's all this got to do with the moon shots? I'd say the balance in technē is off. There's nothing inherently wrong with a camera enhancing a source image with information imported from a learning data set. In this case, existing high-definition moon images.
Samsung don’t deserve to be vilified for this (although that’s not pre-empting any Advertising Standards Authority investigation!). But there is, I think, a wider potential issue with the rapid growth of automated in-camera processes (and other ‘simple-fix’ enhancement tools). And this has to do with when the logics of utility overtake the practical activities of art.
Ask yourself this: how much of a picture are you willing to let your camera take for you? Responses will vary and can be neither quantified nor judged correct. But think about who decides how much of the process is autonomous and who decides what the value of this is.
I’m concerned there is a hollow centre to this bag of tricks. I’m worried that what we tend to call ‘technology’ nowadays is the logics of technics working without the spirit of art. This isn’t an anti-AI position. It’s an argument for an AI that doesn’t make every means an end where ultimate value resides in the products of automated processes.
There’s an old story about the ‘man in the moon’. I think that, at least in part, people are reacting to the Samsung ‘controversy’ because they see themselves as being the ‘man’ in the moon. They want to be the person who created a true moon image. They certainly want to use the technics of a camera to do this. But there’s a desire for an incompleteness too.
It’s not that people want to take rubbish moon shots. Rather they want the shot to be theirs and not an assemblage of ‘perfected’ additions drawn from external data.
There isn’t a definitive photo of the moon: a Moon, if you like. There won’t ever be one. In that sense, the moon of the Samsung S23 Ultra is a Moon that’s not the moon. When our technologies act in such a way as to push all moon photos towards something that’s mimicking the definitive, well, where’s the human spirit of art in that?
About David Mellor
David Mellor is a philosopher and artist with a DPhil from the University of Oxford. He is currently researching AI and ethics at Coventry University.
The views expressed in this column are not necessarily those of Amateur Photographer magazine or Kelsey Media Limited. If you have an opinion you’d like to share on this topic, or any other photography related subject, email: email@example.com.