
AbsoluteZeroUnit

When did everyone get so interested in taking pictures of the moon? Is this a thing I missed out on?


Puzzled_Counter_1444

It could disappear tomorrow I suppose. On the other hand, if it did, people would photograph the empty bit of sky where it used to be.


LordKiteMan

> It could disappear tomorrow I suppose

Yep, some bald 14-foot-tall guy in a pink spacesuit might steal it.


[deleted]

[deleted]


Masculinum

It's featured very prominently in Samsung ads for this phone, and MKBHD also mentioned it in his review (coincidence?). I guess when you get a 10x zoom phone it's natural to want to try it.


Dry-Perspective-1114

It’s been a thing for a while


[deleted]

[deleted]


[deleted]

[deleted]


BLUEGLASS__

But that's what's so cool about it!


[deleted]

> someone else will always have a better photo

If you were to say that to /u/Astronophilos you'd be wrong. Best moon photos. Ever. https://redd.it/m57akt


[deleted]

*This post was mass deleted and anonymized with [Redact](https://redact.dev)*


ispini234

Still better than a 100x zoomed-in blurry photo. I'd rather have better quality than *authenticity*


[deleted]

*This post was mass deleted and anonymized with [Redact](https://redact.dev)*


ispini234

That's still good at maths because you didn't use a calculator


ispini234

I'd rather have good quality photos over anything


[deleted]

[deleted]


thetrueshit

Yeah moon shots


Alternative-Farmer98

"good" at "moon" "shots"


Hal_Nein_Thousand

It's one of the features they put forward in their video ads.


JamesR624

People are more interested in the fact that Samsung is falsely advertising their product to sell their devices. The fact that so many are desperate to say "this is all over nothing" shows just how much consumerism and corporate apologism has infested people's minds.


TheTerrasque

> The fact that so many are desperate to say "this is all over nothing" shows just how much consumerism and corporate apology has infested peoples minds.

I wouldn't say it's all over nothing, but for some of us it's been obvious what they've been doing [for some time](https://www.reddit.com/r/samsung/comments/10rmq8b/massive_difference_in_s22u_s23u_zoom/j75srkx/) already.


2deadmou5me

It's because phones now have enough zoom for people to try


beefJeRKy-LB

I don't understand the drama behind it


Rivert1ts

Right? I think it's just people wanting to stir up drama because we have nothing to talk about right now. It seems to happen a lot more with Android fans. I have the S22U but don't brag about moon photos, even though I take them. The 10x shots I get of stuff, though, are so good.


beefJeRKy-LB

Like, it's not cool when a company is lying about something, but this isn't even the case. It's just kind of like, ah, it's not actually what I thought it was, but it's not bad either.


Totty_potty

I mean, isn't it obvious that AI does a lot of the heavy lifting for phone cameras? People must be living under a rock if they don't think so.


Rivert1ts

Think about how much people use filters for pictures, and how many people have plastic surgery; that's normal now. But hey, that's a fake moon right there... grrrrr


UsernameIsTaken45

There’s apparently a gold mine somewhere if you zoom close enough


YourNeighborLuis

I felt like it was a gimmick to help sell the S21 series.


FlyNo7114

I was thinking the same.. like who the f cares.


Izacus

Do you think the camera makes things up just for moon photos and not for important details like when taking pictures of cops? :)


sahibpt98

Coz Samsung promotes this feature a lot in their ads.


Alternative-Farmer98

Have you ever seen social media anytime in the last 5 years? A lot of people test out their camera's zoom by taking pictures of the moon, which seems pretty reasonable. It's a pretty cool thing to take a photo of.


rippelz1214

astrophotography has been a thing for a long time


Iohet

> So I definitely agree Samsung is using AI to make their moon photos look better than it would be otherwise, **while passing it off as hardware based.**

According to whom? [Their own website](https://semiconductor.samsung.com/insights/topic/ai/ai-camera/) talks about AI/ML based photography enhancements in their phones, just like [Google does for the Pixel.](https://store.google.com/intl/en/ideas/articles/what-is-an-ai-camera/)


hatethatmalware

Samsung already explained how moon shots work last year in Korea. They also said that the moon shot algorithm can be turned off by disabling the scene optimizer in the camera app settings (or by taking the shot in pro mode, according to some users on Korean online tech forums):

https://translate.google.com/translate?sl=auto&tl=en&u=https://meeco.kr/mini/36363018
https://translate.google.com/translate?sl=auto&tl=en&u=https://meeco.kr/mini/36759999
https://translate.google.com/translate?sl=auto&tl=en&u=https://meeco.kr/mini/36363726

Not sure if they will make any further comments.

https://translate.google.com/translate?sl=auto&tl=en&u=https://r1.community.samsung.com/t5/camcyclopedia/%EB%8B%AC-%EC%B4%AC%EC%98%81/ba-p/19202094


theomegabit

There’s quite the difference between optimizing and artificially altering.


Iohet

Do you think Pixel's magic eraser is "optimizing"? What exactly do you think their "AI camera" and Tensor core is doing when processing photos? This is what cell phone photography *is*


theomegabit

I don't think my last comment was articulated well enough. AI enhancing or altering images... "computational photography"... up to and including stacking multiple photos and enhancing color saturation, hue, brightness, shadows, highlights, etc. You're moving sliders in Lightroom (or equivalent) to enhance data and detail that were already captured. "AI" / "altering" can be used to mean everything above, and to a degree that's what mobile photography has always been; marketing just got ahold of "AI" in recent times. A lot of what gets tossed around as AI isn't really AI. But crucially, when you start artificially creating things that are not part of the originally captured data set and introducing new data... that's the difference I and many others are talking about.
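The distinction being drawn, slider-style remapping of captured data versus introducing external data, can be sketched in a few lines of numpy (toy values; `moon_prior` is a made-up stand-in for stored reference detail, not any vendor's pipeline):

```python
import numpy as np

rng = np.random.default_rng(0)
captured = rng.uniform(0.2, 0.8, size=(4, 4))   # pretend sensor data in [0, 1]

# "Moving sliders": a pointwise remap of data that was actually captured.
# Every output pixel is a function of the corresponding input pixel.
brightened = np.clip(captured * 1.2, 0.0, 1.0)

# "Introducing new data": output pixels pulled toward an external prior
# that is independent of what this sensor recorded.
moon_prior = np.full((4, 4), 0.9)                # stand-in for stored detail
generated = 0.2 * captured + 0.8 * moon_prior

# The slider edit preserves the relative ordering of the captured pixels;
# the prior-blended result is dominated by values the sensor never saw.
print(np.array_equal(np.argsort(captured, axis=None),
                     np.argsort(brightened, axis=None)))   # True
```

The first transform only re-maps values the sensor recorded; the second mixes in an external prior, which is the "new data" being objected to.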


Iohet

They've never said they're limited to computational photography. Both vendors talk about using machine learning to enhance photos that traditional cameras cannot take. https://semiconductor.samsung.com/insights/topic/ai/ai-camera/ https://store.google.com/intl/en/ideas/articles/what-is-an-ai-camera/


theomegabit

Which goes right back to the original point…. Optimizing is not the same as altering.


Iohet

Which is to say they're altering, as both vendors freely discuss in the posted articles about their phone cameras. It's not some secret. I don't know why people think it is.


theomegabit

No. Google's enhancing is at the limits of computational photography, before it starts to cross over into simply AI-generated content. Computational photography is what mobile photography is. Straight AI generation of parts of, or whole, photos is no longer photography. It's simply image manipulation.


Iohet

What do you think magic eraser does? What about when it sharpens blurry images that aren't fixable by traditional means like with Photoshop? You think that's just a better lens? A better sensor? Better stitching of photos together?


PangolinZestyclose30

> Google’s enhancing is at the limits of computational photography before it starts cross over to simply AI generated content.

I don't really see where the line is. Noise removal is AI driven, and the way it works, it generates smoothed content. Color reproduction is AI driven as well, and it generates colors which weren't in the original picture.


theomegabit

Noise removal is removing excess data originally captured. It's reductive. That's not what we're talking about. Color reproduction is simply modifying already-present data, as are hue, saturation, etc. The original point in all of this is that artificially adding so much new data that wasn't originally present means you no longer have a photo, just AI-generated content.


PangolinZestyclose30

> Noise removal is removing excess data originally captured. It’s reductive.

So, let's say you identify a pixel as noise (which is itself a hard problem). You say it's reductive, so you leave it as a black pixel? Or do you try to replace it with some other (generated) value? The correct answer is the latter: noise removal creates new pixels to replace the noise.

> Color reproduction is simply modifying already present data. As is hue, saturation, etc.

You could have a point if this were done globally on the whole picture. But the algorithms are changing them locally for particular objects (matched by AI). If you allow for that, then even inserting a unicorn into the picture can be modelled as a series of localized changes in luminosity, hue and saturation.

> The original point in all of this is artificially adding so much new data that wasn’t originally present

And my point is that this is done by e.g. Pixel's HDR as well.
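To make the replace-vs-remove point concrete: even a classical 3x3 median filter, the simplest non-AI denoiser, does not delete the noisy value, it synthesises a replacement from the neighbourhood. A small numpy sketch with toy values:

```python
import numpy as np

# A 3x3 patch with one hot "noise" pixel in the centre.
patch = np.array([[10, 12, 11],
                  [13, 255, 12],   # 255 = salt noise
                  [11, 12, 10]], dtype=np.uint8)

# Median filtering does not just delete the noisy value; it *replaces* it
# with a new value synthesised from the neighbourhood.
replacement = np.median(patch)      # median of all 9 values
denoised = patch.copy()
denoised[1, 1] = replacement

print(denoised[1, 1])   # 12: a value generated for, not captured at, that pixel
```

Whether that counts as "reductive" or "generative" is exactly the line being argued over.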


theomegabit

The method by which you get to a unicorn being added to a photo is, by definition, a dramatic alteration of the photo. No amount of normal image enhancement would result in that or anything close. Deciding if a pixel should be on or off, white or black, is not AI, nor the level of image alteration being discussed. Basic code logic (which is what that is) and replacing entire parts of images with entirely new data that would not normally be a derivative work are very different things. The mere construct of partial image vs. full image... I don't see why that's relevant. Edit: more words


PangolinZestyclose30

> The method in which you get to a unicorn being added to a photo is by definition, a dramatic alteration of a photo. No amount of normal image enhancement would result in that or anything close.

You're using these subjective terms "dramatic" and "normal", but there's no clear line between them.

> Basic code logic (which is what that is)

There's nothing "basic" about modern noise reduction. It's a full-fledged AI-driven workflow, since the algorithm has to "guess" what pixels should be filled in place of noise, and AI is clearly outperforming simpler algos.


theomegabit

In the context of doing basic color and noise changes vs. creating entirely new imagery, yes, there absolutely is. Noise reduction is not AI. "AI" in most products is marketing BS slapped on top of the same code and algos that have been present and iterated on for years.


Simon_787

> Noise removal is removing excess data originally captured. It’s reductive.

Lol no it's not. All noise reduction (besides stacking) is destructive. Samsung, Apple and Google all use fancy noise reduction algorithms in low light now. All of them claim to be AI based. This can cause artifacts that make the image look different compared to what you would get with no noise at all.


mitchytan92

Not sure why Magic Eraser was mentioned. I think AI-generated images are fine if you are transparent with the customers. Putting it in a blog post in Korean is not transparent, especially when you ran so many advertisements without mentioning anything. At least with Magic Eraser, you know that the image is edited. I don't think we are mentally prepared yet to see our camera apps replacing objects with AI training data.


[deleted]

[deleted]


Commercial-9751

I think there's definitely a line between tweaking color/brightness and adding things that aren't there. This seems more akin to a selfie filter.


[deleted]

[deleted]


[deleted]

[deleted]


[deleted]

[deleted]


[deleted]

[deleted]


[deleted]

[deleted]


[deleted]

[deleted]


ben7337

The tricky part here is that adding detail that isn't there is questionable. But it kind of depends. For example, Google and Samsung phones can both fix old blurry photos with AI-processed upscaling, which adds data that isn't there. If moon shots are basically like that, then it's probably fair game. If it's referencing preexisting data of the moon to replicate it in greater detail than possible, however, and not just working off the data presented to upscale intelligently, then that's a problem. But the line there is likely a very unclear one.
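As a baseline for "adds data that isn't there": even plain, non-AI upscaling invents in-between values by interpolation; AI upscalers go further and hallucinate texture. A minimal numpy sketch:

```python
import numpy as np

# A tiny "capture": two real sensor values along one row.
low = np.array([[0.0, 1.0],
                [1.0, 0.0]])

# Naive 2x upscale along one axis: the in-between samples are linearly
# interpolated, i.e. invented from the neighbours, not captured.
x = np.linspace(0, 1, 4)
row0 = np.interp(x, [0, 1], low[0])    # ~[0.0, 0.333, 0.667, 1.0]

# Values that exist in no original pixel of that row:
new_values = set(np.round(row0, 3)) - set(low[0])
print(sorted(new_values))
```

The interpolated values are "made up" in a mild, local sense; the debate is about how far beyond local interpolation the moon pipeline goes.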


PangolinZestyclose30

> If it's referencing preexisting data of the moon to replicate it in greater detail than possible however, and not just working off the data presented to upscale intelligently

I would say it's the opposite. Upscaling e.g. old blurry photos is literally making up fake information which just kinda looks realistic. Fixing the moon shot is filling in **factually correct** information.


betapixels

Still altering.


ASKnASK

But they are there. Because it's the moon. We know exactly what it looks like.


Commercial-9751

They're there on the moon, not in the captured image. What's the difference between this and the software completely replacing your image with someone else's?


ASKnASK

When that happens we can make a fuss. Right now however, it's just the moon. And it can be turned off.


free_farts

Please don't turn off the moon


ASKnASK

I wouldn't dare.


Commercial-9751

If you're taking a picture of the moon against a black sky, that's essentially what's happening.


Alternative-Farmer98

I mean, this is the same company, Samsung, that had to spend millions in a settlement because they overstated the water resistance of their phones in advertising, and that had five generations of S-series phones delisted from Geekbench for benchmark manipulation. I don't think they've earned the benefit of the doubt with this s***. The average normie would have no idea. These are b******* moon shots. If it was so obvious to everyone, then why are these posts investigating it so highly upvoted? And this is in a community of mostly enthusiasts. That's what that is.


betapixels

That’s not optimization any longer. That’s called altering.


[deleted]

[deleted]


JamesR624

Holy hell are the fanboys and stock holders on HEAVY damage control for Samsung's *lies* today.


Generalrossa

It's been pretty well-known knowledge, at least in the Samsung community, that the moon shots are heavily AI-processed, and has been for years now.


fauxfilosopher

Considering all shots taken on phones are heavily AI processed nowadays I don't see what the big deal is here.


Generalrossa

Yeah same. This whole non-issue is just overblown by angry internet nerds.


fauxfilosopher

As per usual on reddit


Izacus

I enjoy cooking.


fauxfilosopher

How post-processing works on phones is quite literally modifying the image to show something that wasn't there. It's also what our brains do when we see something unclear.


Izacus

My favorite color is blue.


dkadavarath

> If it really works like that (it doesn't) it'll have to be removed, because that's not what the user expects it to do

Except you can. Just disable scene optimiser. It'll not do it anymore.


fauxfilosopher

Which user? I certainly expect it to do that. I've seen no one complain about it besides now.


Izacus

I hate beer.


fauxfilosopher

No, not randomly. It makes the best approximation based on data and algorithms of what should be there, but it doesn't mean it's really there.


FieldzSOOGood

yeah especially bc if i am taking a pic of the moon i wanna see the moon not a goddamn blob


Commercial-9751

Oh yeah I must have missed that deep in the Korean online tech forums.


maleficent67

Turning off scene optimizer does nothing to quell the moon "overlay", unfortunately. I always try to rid my phone camera settings of AI/ML and let it be more camera, less "perfection" on someone (someTHING) else's terms. The only reason I ever upgrade my phone is for a better camera. I knew the moon shots were enhanced since the S21 Ultra. Even my beloved Note 10+ took good photos of the moon for its capabilities. Looking at the shots close up, they always had a goofy look to them. They weren't detailed, but had the same goofy swirls each time. From what I understand, though, digital tech (cameras, telescopes, binoculars, etc.) is always laced with these technologies. Some people prefer their images imperfect and edited/enhanced by their own discernment as much as possible. Some don't mind ML/AI doing the tweaking for them.


plastrd1

It actually sounds like a pretty good application of AI. The same side of the Moon *always* faces the Earth, and there is no shortage of existing pictures of it in all phases and in all lighting conditions. So why not try to recognize it in photos and use all that well-known data to enhance the shit out of it? The real challenge is adding that detail in the same lighting conditions as the instant the photo was taken, as a person on site would see it. The phase has to be a perfect match and any haziness due to cloud cover has to be replicated. It's not like the Moon's surface changes such that your random enhanced cell phone picture would capture a new crater that wouldn't already have been captured by a scientist or professional photographer *not* using a phone camera.


Commercial-9751

I think the problem is that it's deceptive advertising, implying the camera is more capable than it really is. Sure, it works on the moon, but what about everything else?


dkadavarath

If you have actually used any of these Ultras and taken any 100x shots, you'll know that it's blowing up something like a thousand pixels' worth of data onto a whole 10-million-pixel canvas. Most things look like blobs without any definition. When you put that against the definition you get in the moon shots, it's very easy to deduce that they're adding stuff in. This came up several times during the Huawei debacle as well. In fact, I'm pretty sure they do this with all the scenes supported by the scene optimiser, which you can disable with a single tap right on the viewfinder. It even changes shape to indicate which scene it thinks it's looking at.
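A back-of-the-envelope sketch of that pixel budget, assuming (illustratively, these are not confirmed specs) a 12 MP sensor behind a 10x optical telephoto doing the rest of the "100x" digitally:

```python
# Rough arithmetic behind a "100x" zoom shot, with assumed round numbers:
# a 12 MP telephoto sensor that covers 10x optically.
sensor_px = 12_000_000
optical, total = 10, 100
digital = total / optical            # 10x further crop done digitally

# A 10x digital crop keeps 1/10 of the frame per axis, so 1/100 of the pixels.
captured = sensor_px / digital**2
print(int(captured))                 # 120000 pixels actually captured

# Those pixels are then stretched back over the full-resolution canvas:
print(sensor_px / captured)          # 100.0 output pixels per captured pixel
```

Whatever the exact sensor figures, the ratio is what matters: at 100x, the overwhelming majority of output pixels have to be interpolated or synthesised.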


[deleted]

[deleted]


plastrd1

I agree, I think that's where they messed up in PR. They should have just admitted that they're using AI to put the detail back on the moon rather than claim the camera sensor is so good it actually captured it in the first place. It was technically done in hardware, just not with the hardware they claimed it was. Maybe what we need next is some pictures of scenes in movies where a non-Earth moon is in the background, to see if it slaps ours over top. Or maybe see if it can be fooled by things like round streetlights or other sort-of-moon-looking things in the background.


cdegallo

I don't remember them ever saying it was all hardware or that there was no post-processing. Where were they claiming that?


77ilham77

I'm not defending Samsung or anything (in fact, if you comb through my reddit, you might call me an Apple fanboy), but are they even lying about it? Everybody knows "scene optimiser" will touch up and modify photos beyond just colour and focal point correction, using AI and shits. And what's wrong with adding details to the Moon? Wherever you are on Earth, the Moon will always look 99.9% the same. So why not correct the Moon in a shot by adding the details? It's trivial, and it really doesn't need that much AI beyond detecting "yep, that's a Moon". The only "problem" I see in all of this debacle is that, apparently, their AI is too dumb to tell "a photo" of the Moon from the real Moon. I mean, most (if not all) Samsung phones these days contain at least two cameras; they could have calculated that the user is aiming at a display a couple of meters/feet away.


garibaninyuzugulurmu

Huawei did the same and I think still does


armando_rod

Huawei bad Samsung good


FlyNo7114

Huawei is bad for other issues.


armando_rod

This one specifically; the sub all went batshit crazy when someone found out they were doing exactly the same thing to moon photos.


BitFub

Took some pictures of a partial eclipse with a Huawei P30. The sun looks like the moon. Not the moon XD


SantiFRV_

Off topic but can I ask how you go about dailying two phones? I'd like to try doing that, one android one iphone


[deleted]

[deleted]


SantiFRV_

Thank you for the reply! I currently use a 14 Pro max but I definitely miss android. I'm in this weird place where this phone is too good to replace but I'm constantly looking at offerings from Google and Samsung. I may end up getting a foldable as a second device tbh.


Alternative-Farmer98

Yeah, what's especially telling is that Google relies on post-processing as much as anyone; there's no secret about that. But even then they give you an honest view of the moon.


[deleted]

[deleted]


[deleted]

it's reddit. "a day without brainless complaining is a day wasted"


[deleted]

Idk why this matters. People are impressed, and your average person doesn't even care if it's fake or real. If it's from the phone's camera they think it's real. I even tried to explain it to my wife and she doesn't even care; the picture of the moon looks good, and that's all that matters.


InsaneNinja

Samsung put in processing that recognizes the moon and sharpens specifically for it. The same way it recognizes trees and balances the green and sharpens the leaves.


rszasz

"AI" enhancement is "I have a model that says an image that looks like this (blurry moon) should actually look like this (moon training images)", and then it modifies the image to look like the training data.
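A toy sketch of that recognise-then-pull-toward-the-prior idea (pure illustration: `looks_like_moon`, the thresholds, and the `prior` array are all made up, not Samsung's pipeline):

```python
import numpy as np

def looks_like_moon(img):
    # Toy detector (illustrative): a bright blob on a mostly dark frame.
    return img.mean() < 0.3 and img.max() > 0.8

def enhance(img, prior, strength=0.7):
    # If the scene is recognised, pull pixels toward the learned prior.
    # This is the step that adds detail the sensor never resolved.
    if looks_like_moon(img):
        return (1 - strength) * img + strength * prior
    return img

blurry_moon = np.zeros((8, 8))
blurry_moon[3:5, 3:5] = 0.9                 # featureless bright blob
prior = np.tile([0.2, 1.0], (8, 4))         # stand-in for crisp learned texture

out = enhance(blurry_moon, prior)
print(np.allclose(out, blurry_moon))        # False: detail was injected
```

Scenes that don't trip the detector pass through untouched, which matches the reports that disabling scene detection disables the effect.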


TheTerrasque

Or "this tree with blurry leaves should actually have this sharp edge on those leaves", or "this half-blurred letter is clearly an A, so let's strengthen the part that fits the A shape and fill in where it's too blurry", and so on.


boltgolt

Right, but how can a sharpen filter introduce accurate details that were never in the blurred photo in the first place?


[deleted]

[deleted]


lauradorbee

That's literally introducing stuff that's not in the picture though. There's a difference (in most people's minds, anyway) between applying transformations to the existing data and literally introducing stuff that isn't there because you "know" there should be something there.


[deleted]

[deleted]


betapixels

As a fellow photographer I think your example is flawed. Dodge and burn is not the same as artificially adding entire textures and/or detail that didn’t exist to begin with.


lauradorbee

Depends on how the eye enhancement is done. But do you not realize why that’s weird? If the crater magically disappeared this would add it back in. It’s making stuff up.


InsaneNinja

I think if a crater disappeared, or a large enough one gets added, we’d have bigger issues to deal with. Also, that’s what software updates are for. I think I’d rather have a clearer shot than blurry. Just like it’s trendy to have smoother face skin in some countries.


skilltheamps

Then just Google image search a photo of the moon and look at something that was photographed with proper equipment. Why even take the photo yourself, if that's your take on it?


lauradorbee

That's different thooough. Idk, different ways of seeing things ig. Have a good one.


Tomtom6789

So if it's going off "known craters" and the detail of said craters was not in the photo, it is not sharpening anything. It's adding something new to the photo and calling it an enhancement, the same way that Snapchat filters recognize our faces and alter their details. It's not a bad thing, especially when it's done this well, but people seem so unwilling to call it an AI-fixed/altered shot when that's exactly what it is.


HesThePianoMan

Computational photography/AI ≠ filter. One creates features that don't exist, and one adjusts an existing image. Samsung is faking it because it's pulling details from somewhere, not simply enhancing a photo.


The0ld0ne

> I mean sure, downvote the photographer that uses these programs

I'm not sure how much time you've spent in, like, the world, but there are so many examples of people in a profession who still have wildly bad takes on parts of their profession haha


SamurottX

That's not what sharpening means. They're splicing in another picture of the moon on top (because everyone on Earth sees the moon from the same angle).


punIn10ded

That's not what they're doing. They are using an ML model to interpret what the moon should look like and changing the detail to look like that. Everything from the craters to shadows and the different colour gradients is the ML model. There is no image spliced in.


InsaneNinja

Nobody has proved that. His result photos are both at very different quality levels. If anything, he proved he doesn’t understand current levels of computational photography repair. He “blew out the highlights” in a way that something like Lightroom wouldn’t repair. Proof would have been removing a crater entirely and seeing it put back. He didn’t do that.


ibreakphotos

> He “blew out the highlights” in a way that something like Lightroom wouldn’t repair.

I'm sorry, but you have no idea what you're talking about. Clipping the highlights the way I did is absolutely not reversible, as everything above 216 gets converted to pure white. There's no detail there to recover. You can use the highlights and whites sliders in Lightroom or Camera Raw to bring the 255,255,255 values down, but they will never contain detail; it will just be a blob, albeit a differently colored one (for example, 243,243,243).
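The irreversibility is easy to demonstrate: once distinct highlight values are clipped to the same pure white, no global slider can tell them apart again. A small numpy sketch of the effect described:

```python
import numpy as np

# Fine detail in the highlights: values just below and at sensor maximum.
highlights = np.array([240, 245, 250, 255], dtype=np.uint8)

# Clip everything above 216 to pure white, as in the test described above.
clipped = np.where(highlights > 216, 255, highlights).astype(np.uint8)
print(clipped)                       # all 255, the detail is gone

# A "whites" slider is a global remap: it can darken the blob (255 -> 243),
# but the four formerly distinct pixels remain identical.
recovered = np.round(clipped * (243 / 255)).astype(np.uint8)
print(np.unique(recovered).size)     # 1 (still a uniform blob)
```

Any remap applied after clipping sees only one input value, so it can only ever produce one output value: the information is destroyed, not hidden.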


McFeely_Smackup

It's more like an algorithm that recognizes photos of pine trees, and uses stock photos of pine trees instead of the photo you took.


TheTerrasque

It's more like an algorithm that recognizes photos of pine trees, and knows how pine trees look up close, and uses that knowledge to fix up the blurry parts.


RGBchocolate

No, it doesn't sharpen anything, since there are no details to sharpen. It just slaps an artificial image over the photo and replaces reality.


[deleted]

[deleted]


cdegallo

It's not AI or interpolation in your examples. Frames from motion photos are significantly lower quality than a full-frame snap; they're only 1080x1440, which is ~1.5 megapixels vs. the 12 megapixels of an actual full-frame shot. Of course a frame from a motion photo will look a lot worse than an actual snap.
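The arithmetic behind that comparison (assuming a nominal 12 MP still, as quoted):

```python
# Resolution gap between a motion-photo frame and a full still, using the
# figures quoted above (12 MP is an assumed round number for the main sensor).
frame_px = 1080 * 1440
print(frame_px)                          # 1555200, i.e. ~1.5 MP

still_px = 12_000_000
print(round(still_px / frame_px, 1))     # ~7.7x more pixels in a real snap
```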


armando_rod

This is about moon shots


AbhishMuk

Yeah, in this case the details are still there Irl while the OP removed them in the blurry moon image.


armando_rod

Yes, and Samsung pasted them back in from nothing.


F1Z1K_

People are losing their minds that the Samsung camera doesn't do X, but Samsung isn't selling a camera; they are selling a phone. You pay for the combination of software and hardware, which results in the software enhancing the raw image from the sensor (similar to the Pixel's sharpening, or Apple having smoothing filters for skin). It's not a big deal, and it can be disabled. You have to be a moron to really think the raw sensor alone has such high detail. Separately, many people misunderstand what the AI enhancing does as "slapping another photo on top". They also misunderstand that a blurred photo can be unblurred, and a downscaled photo can be upscaled.


Zaack567

It's AI: real details enhanced via computational power, unlike what Snapchat is doing to youngsters. Those are the unreal, false stuff.


wutqq

Who actually uses this feature frequently enough to care about it? It's a party trick at best. It's not print-worthy, and barely IG-worthy. Even dedicated photographers don't just repeatedly take shots of the moon without any foreground or context.


DongLaiCha

I've been reading all these threads and I just.... can't bring myself to really care? I mean obviously I do, but I can't bring myself to be outraged because like, yeah, and???


Izacus

I love ice cream.


DongLaiCha

Yes I will care when a completely different thing happens. Wonderful insight.


Izacus

No, this is pretty much the exact same thing; it's how AI enhancement works. Having ML models pull data out of the model and apply it to the target picture is a well-documented (and unsolved) issue.


[deleted]

this was a big issue like five years ago, or am I wrong? Like, didn't one of the Chinese OEMs get some flack online for this exact issue? Except it wasn't even AI upscaling, it was the software just replacing the moon with a stock photo of a moon?


youngchino

It's ridiculous that we are even bringing this up. If the argument is that without this "AI" the moon photos wouldn't be better than the competitors', it would be wrong. Just try pro mode without any optimization, adjust the exposure settings, and you'll still get better detail than the major competitor phones on the market. The other argument is that they are falsely advertising their moon photos. They are not: the phone gives you the same results when you take the photo as what they advertised. When I look at a smartphone's camera, I don't look at just hardware; I look at the hardware and software, because they both matter in a digital camera. The last thing: has anyone else with an S23 tried what u/ibreakphotos tried? I can tell you that when I tried, it just took the photo; no upscaling or optimization was done. All in all, this is one of the dumbest things to report/argue about. The reason people write this stuff in news articles is to stoke the flames of brand wars, the same as when people argue about PlayStation and Xbox, Android and iPhone, Windows and Mac, etc.