
Night Sight on the Google Pixel shows how phone cameras have become faketastic

The small camera on this phone has a superpower: it can see things that our eyes cannot.

On nights in recent weeks, I’ve tromped around dark places taking photos with a new feature on Google’s $800 Pixel 3 called Night Sight. Friends by candlelight look as if they brought along a lighting crew. Dim streets are flush with red and green. A midnight cityscape lights up as if it were late afternoon. It goes so far beyond an Instagram filter that you have to see it to believe it.

Night Sight is a great leap forward for smartphone photography – and an example of how our photos are getting, well, super fake.

It’s true: You don’t look like your photos. Photography has never been about capturing reality, but the latest phones are taking pictures into uncharted territory.

At the moment, Night Sight is just a mode that activates for dark shots on Google’s Pixel phones. But it’s hardly alone: All kinds of phone makers now boast about how amazing their photos look, never mind how real they are. The iPhone’s Portrait Mode applies artificial blur to backgrounds and identifies facial features to reduce red-eye. Beauty features on phones popular in Asia automatically slim heads, brighten eyes and smooth skin. And the latest phones use a technique called HDR that merges multiple exposures to produce a hyper-real look.

When I recently took the same sunset photo with a 2014 iPhone 6 and this year’s iPhone XR, I was gobsmacked at the difference – the newer iPhone’s shot looked as if it had been painted with watercolors.

What’s going on? Smartphones democratized photography for 2.5 billion people – a good photo used to require specialized hardware and an instruction manual.

Now, artificial intelligence and other software advances are democratizing the creation of beauty. Yes, beauty. Editing photos no longer requires Photoshop skills. When presented with a scenic vista or a smiling face, phone cameras consult algorithms trained on what people like to look at, and churn out pleasing images.

Your phone is, in effect, wearing high-tech glasses. Think of your camera less as a reflection of reality, and more as an AI trying to make you happy. It’s faketastic.

Capturing a photo on a phone has become about so much more than passing light through a lens onto a sensor. Of course, the hardware still matters, and it has improved over the past decade.

But increasingly it is software – not hardware – that makes our images better. “It’s hyperbole, but true,” says Marc Levoy, a retired Stanford computer science professor who once taught Google founders Larry Page and Sergey Brin and now works for them on camera projects, including Night Sight.

Levoy’s work is rooted in the intrinsic size limitations of a smartphone. Phones can’t fit large lenses (and the sensors beneath them) the way traditional cameras can, so their makers have to find creative ways to compensate. Enter technologies that replace optics with software, such as digitally combining multiple shots into one.

New phones from Apple, Samsung and Huawei use similar techniques, but “we invested in the software and AI side,” says Levoy. That freed Google to explore creating images in new ways.

“Google has got an edge on software,” said Nicolas Touchard, vice president of marketing at DxOMark Image Labs, which produces independent benchmarks for cameras. (Whether any of this is enough to help the Pixel win converts from Apple and Samsung is a separate question.)

With Night Sight, Google’s software goes to extremes, taking up to 15 low-light images and blending them together to brighten faces, sharpen details and saturate colors in a way that draws in the eye. No flash goes off – it simply amplifies the light that’s already there.

Anyone who has tried a low-light shot on a traditional camera knows how hard it is to avoid blurry photos. With Night Sight, before you even press the button, the phone measures the shake of your hand and the motion in the scene to determine how many frames to capture and how long to leave the shutter open. When you press the shutter, it warns “Hold still” and shoots for up to 6 seconds.
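The trade-off described above – more motion means shorter individual exposures, and therefore more frames – can be sketched as a simple heuristic. This is a toy illustration, not Google’s actual algorithm; the function name, units and thresholds here are all invented for the sake of the example.

```python
def plan_burst(hand_shake, scene_motion, max_total_s=6.0, max_frames=15):
    """Toy heuristic: pick a per-frame exposure short enough that the
    faster of the two motion estimates causes little blur, then fit as
    many frames as possible into the total capture budget.

    hand_shake and scene_motion are hypothetical motion rates
    (arbitrary units per second); higher means more movement.
    """
    motion = max(hand_shake, scene_motion)
    # Cap each exposure so expected blur (motion * time) stays near 1 unit,
    # and never expose a single frame longer than 1 second.
    per_frame_s = min(1.0, 1.0 / max(motion, 1e-6))
    # Fill the ~6-second budget with frames, up to the burst limit.
    n_frames = min(max_frames, max(1, int(max_total_s / per_frame_s)))
    return n_frames, per_frame_s

# A steady tripod-like shot gets few, long exposures...
print(plan_burst(0.0, 0.0))   # (6, 1.0)
# ...while a shaky handheld shot gets many short ones.
print(plan_burst(20.0, 0.0))  # (15, 0.05)
```

The point of the sketch is only the direction of the trade-off: steadier hands allow longer exposures per frame, while shaky hands force the camera to gather the same total light across many brief frames.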

For the next second or two, Night Sight divides all of its images into a grid of small tiles, aligning and merging the best pieces to form a complete picture. Finally, AI and other software analyze the image to choose colors and tones.
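The core idea behind merging a burst – that averaging many noisy frames suppresses sensor noise, so the result can then be brightened cleanly – can be shown with a minimal sketch. Google’s real pipeline aligns and merges per tile with far more sophistication; this toy version just averages whole, pre-aligned frames, and every name in it is made up for illustration.

```python
import numpy as np

def merge_burst(frames, gain=4.0):
    """Toy burst merge: average N aligned noisy frames (random noise
    shrinks roughly as 1/sqrt(N)), then apply digital gain to brighten."""
    stack = np.stack([f.astype(np.float64) for f in frames])
    merged = stack.mean(axis=0)
    return np.clip(merged * gain, 0, 255).astype(np.uint8)

# Simulate a dim scene captured 15 times with heavy sensor noise.
rng = np.random.default_rng(0)
scene = rng.integers(0, 40, size=(32, 32)).astype(np.float64)
frames = [scene + rng.normal(0, 10, scene.shape) for _ in range(15)]

# Brightening one noisy frame amplifies the noise along with the signal...
single = np.clip(frames[0] * 4.0, 0, 255)
# ...while averaging 15 frames first suppresses it before the gain.
merged = merge_burst(frames)
```

Comparing `single` and `merged` against the brightened true scene shows the merged result lands much closer, which is why the phone bothers to capture and combine so many frames instead of simply turning up the gain on one.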

Night Sight had difficulty focusing in scenes with almost no light, and you – and your subject – really must hold still. But in most of my test shots, the results were fantastic. Portraits smoothed skin while keeping eyes sharp. Night scenes lit up hidden details and colors like Willy Wonka’s chocolate factory.

Which raises a question: How should a computer choose the tones and colors of things we experience in the dark? Should it make starlight look like twilight?

“If we can’t see it, we don’t really know what it looks like,” said Levoy. “There are a lot of aesthetic decisions. We made them one way, but you could make them differently.” Perhaps eventually these phones will need a “what I saw” versus “what’s really there” button.

So if our phones make up colors and lighting to please us, does it still count as photography? Or is it computer-generated artwork?

Some purists argue for the latter. “This is always what happens with disruptive technology,” says Levoy.

And what does “fake” even mean, he asks. Pro photographers have long made adjustments in Photoshop or a darkroom. Before that, film makers tweaked colors for a certain look. It might be an academic question if we weren’t talking about the hobby – not to mention the memories – of a third of humanity.

How far will phones take our images from reality? What might software train us to think looks normal? What parts of our pictures will we allow computers to edit? In a photo I took of the White House (without Night Sight), I noticed the Pixel 3’s algorithms, trained to smooth out flaws, had actually removed architectural details that were still visible in a shot from the iPhone XS.

At DxOMark, the camera-testing firm, the question is how to even judge images once software interprets them with features such as facial beautification.

“Sometimes manufacturers push it too far. Usually, we say it’s okay if they haven’t destroyed information. If you want to be objective, you have to consider the camera as a device that captures information,” said Touchard.

For another perspective, I called Kenan Aktulun, founder of the annual iPhone Photography Awards. Over the last decade, he has reviewed more than a million photos taken with iPhones; participants are discouraged from heavy editing.

The line between digital art and photography “gets fuzzier all the time,” said Aktulun. Yet he ultimately welcomes technical improvements that make the processes and tools invisible. The magic of smartphone photography is that it’s accessible – one button and you’re there. AI is an evolution of that.

“As the technical quality of the images has improved, what we’re looking for is the emotional connection,” said Aktulun. “The ones that get the most attention are not technically perfect. They are pictures that provide insight into human life or experience.”
