Google Night Sight opens up creative possibilities.
Sarah Tew/CNET
Over the last three years, Google’s Pixel phones have earned a well-deserved reputation for photographic strength. With the Pixel 4 and 4 XL, the company is flexing new camera hardware and software muscles.
The new flagship Android smartphone, which the search giant unveiled Tuesday, gets a second 12-megapixel camera, a key component in an overhauled portrait mode that focuses your attention on the subject by artificially blurring the background. The new portrait mode works more accurately and now handles more subjects and more compositional styles — and Google is taking some new styling cues from Italian Renaissance painter Titian, too.
The additional camera, a feature Google itself leaked, is just one of the photography advances in the Pixel 4. Many of the others stem from the company’s prowess in computational photography technology, including better zooming, live-view HDR+ for fine-tuning your shots and the extension of Night Sight to astrophotography.
The new features are the surest way Google can stand out in the ruthless, crowded smartphone market. Google knows a lot is riding on the phones. They’re a blip in the marketplace compared with models from smartphone superpowers Samsung and Apple. In June, Google improved its prospects with the low-priced Pixel 3A. But to succeed, Google also needs better alliances with carriers and other retail partners that can steer customers to a Pixel over a Samsung Galaxy. It also needs buyers to connect to new features like a radar chip that helps make secure face unlock faster.
Improving photography is something Google can do on its own, and photography is important. We’re taking more and more photos as we record our lives and share moments with friends. No wonder Google employs a handful of full-time professional photographers to evaluate its products. So I sat down with the Pixel 4’s camera leaders — Google distinguished engineer Marc Levoy and Pixel camera product manager Isaac Reynolds — to learn how the phone takes advantage of all the new technology.
Levoy himself revealed the computational photography features at the Pixel 4 launch event, even sharing some of the math behind the technology. “It’s not mad science, it’s just simple physics,” he said in a bit of a jab at Apple’s description of its own iPhone 11 computational photography tricks.
The Pixel 4’s main camera has a 12-megapixel sensor with an f1.7 aperture lens, while the telephoto camera has a 16-megapixel sensor with an f2.4 aperture lens. The telephoto camera only produces 12-megapixel photos taken from the central portion of the sensor, though. Using a tighter crop from only the central pixels makes for a bit more zoom reach and sidesteps the greater processing burden required to handle 16 megapixels. Google is using Sony-manufactured sensors, Levoy said.
Two ways to see three dimensions
The Pixel 4, like its predecessors, can artificially blur photo backgrounds to concentrate attention on the photo subject.
Google
To distinguish a close subject from a distant background, the Pixel 4’s portrait mode sees in 3D using an approach that borrows from our own stereoscopic vision. Humans reconstruct spatial information by comparing the different views from our two eyes.
The Pixel 4 makes two such comparisons: a short 1mm gap from one side of its tiny lens to the other, and a longer gap about 10 times that between the two cameras. These dual gaps of different lengths, an industry first, let the camera judge depth for both close and distant subjects.
“You get to use the best of each. When one is weak, the other one kicks in,” Reynolds said.
Those two gaps are oriented perpendicularly, too, which means one method can judge up-down differences while the other judges left-right differences. That should improve 3D accuracy, especially with things like fences with lots of vertical lines.
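The underlying geometry is ordinary stereo triangulation. The sketch below plugs in illustrative numbers (they are stand-ins, not Pixel 4 specifications) to show why a roughly 1mm baseline runs out of steam at a distance while a roughly 10mm one keeps working.

```python
# Textbook stereo triangulation: disparity = focal_length_px * baseline / depth.
# All numbers below are illustrative stand-ins, not Pixel 4 specifications.

FOCAL_PX = 3000.0          # assumed focal length, in pixels
NARROW_BASELINE = 0.001    # ~1 mm gap across the main camera's dual-pixel sensor
WIDE_BASELINE = 0.010      # ~10 mm gap between the two rear cameras

def disparity_px(depth_m, baseline_m, focal_px=FOCAL_PX):
    """Pixel shift that a subject at depth_m produces across the given baseline."""
    return focal_px * baseline_m / depth_m

# A distant subject barely shifts across the 1 mm gap but still shifts by a few
# pixels across the 10 mm gap, which is why the wider baseline "kicks in" when
# the narrow one's signal shrinks toward nothing.
for depth in (0.5, 2.0, 10.0):  # subject distance in meters
    print(f"{depth:4.1f} m   narrow: {disparity_px(depth, NARROW_BASELINE):5.2f} px"
          f"   wide: {disparity_px(depth, WIDE_BASELINE):5.2f} px")
```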
Levoy, sitting at Google’s Mountain View, California, headquarters, flipped through photos on his MacBook Pro to show results. In one shot, a motorcycle in its full mechanical glory spans the full width of a shot. In another, a man stands far enough from the camera that you can see him head to toe. The smoothly blurred background in both shots would have been impossible with the Pixel 3 portrait mode.
Continuous zoom
Google wants you to think of the Pixel 4’s dual cameras as a single unit with a traditional camera’s continuous zoom flexibility. The telephoto camera’s focal length is 1.85X that of the main camera, but the Pixel 4 will digitally zoom up to 3X with the same quality as optical zoom.
That’s because of Google’s technology called Super Res Zoom that cleverly transforms shaky hands from a problem into an asset. Small wobbles let the camera collect more detailed scene data so the phone can magnify the photo better.
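To see why wobble helps, consider the deliberately tiny one-dimensional sketch below; it illustrates generic multi-frame super-resolution rather than Google’s Super Res Zoom, and every number in it is made up.

```python
# Toy 1D sketch of the multi-frame idea behind super-resolution zoom.
# Not Google's Super Res Zoom -- just why tiny shifts between frames add detail.
import numpy as np

rng = np.random.default_rng(0)

# A "true" scene sampled on a grid 4x finer than the sensor.
fine = np.zeros(400)
fine[100:110] = 1.0                      # a thin bright feature

def capture(shift):
    """One handheld frame: the scene lands on the sensor shifted by a few
    fine-grid steps, and each sensor pixel averages 4 fine samples."""
    moved = np.roll(fine, shift)
    return moved.reshape(100, 4).mean(axis=1) + rng.normal(0, 0.01, 100)

# Idealized hand wobble: over 16 frames the shift covers every sub-pixel offset.
shifts = np.tile([0, 1, 2, 3], 4)
frames = [capture(int(s)) for s in shifts]

# Register each frame back onto the fine grid at the offset it was captured
# with, then average samples that land in the same place.
total = np.zeros(400)
count = np.zeros(400)
for s, frame in zip(shifts, frames):
    positions = np.arange(100) * 4 - s   # where this frame's pixels sit on the fine grid
    ok = (positions >= 0) & (positions < 400)
    total[positions[ok]] += frame[ok]
    count[positions[ok]] += 1
fused = total / np.maximum(count, 1)

print("samples in one frame:", frames[0].size)                        # 100
print("distinct positions after fusing:", int((count > 0).sum()))     # ~400
print("fused grid density vs one frame:", fused.size // frames[0].size, "x")
```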
“I regularly use it up to 4X, 5X or 6X and don’t even think about it,” Levoy said.
The iPhone 11 has an ultrawide camera that the Pixel 4 lacks. But Levoy said he’d rather zoom in than zoom out. “Wide angle can be fun, but we think telephoto is more important,” he said at the Pixel 4 launch.
The Pixel 4’s Super Res Zoom uses processing tricks to zoom beyond its camera’s optical abilities.
Google
HDR+ view as you compose photos
HDR+ is Google’s high dynamic range technology to capture details in both bright and dark areas. It works by blending up to nine heavily underexposed shots taken in rapid succession into a single photo — a computationally intense process that until now took place only after the photo was taken. The Pixel 4, however, applies HDR+ to the scene you see as you’re composing a photo.
That gives you a better idea of what you’ll get so you don’t need to worry about tapping on the screen to set exposure, Levoy said.
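A stripped-down sketch of that merge-then-tone-map idea follows; the synthetic scene, noise level and tone curve are all invented for illustration, and this is not Google’s HDR+ pipeline.

```python
# Minimal sketch of merging underexposed frames, then tone-mapping the result.
# The scene, noise level and tone curve are invented; this is not Google's HDR+.
import numpy as np

rng = np.random.default_rng(1)

# A synthetic linear-light scene: deep shadow on the left, bright sky on the right.
scene = np.concatenate([np.full(500, 0.02), np.full(500, 0.90)])

def short_exposure(scene, exposure=0.25, read_noise=0.01):
    """One deliberately underexposed frame: scale down, add noise, clip to sensor range."""
    return np.clip(scene * exposure + rng.normal(0, read_noise, scene.shape), 0, 1)

frames = [short_exposure(scene) for _ in range(9)]   # "up to nine" rapid shots
merged = np.mean(frames, axis=0)                     # averaging cuts random noise ~3x

# Simple global tone map: undo the underexposure, then apply a display gamma,
# which lifts the shadows while the unclipped sky stays intact.
tonemapped = np.clip(merged / 0.25, 0, 1) ** (1 / 2.2)

shadow, sky = slice(0, 500), slice(500, 1000)
print("shadow noise, one frame vs merged: %.4f vs %.4f"
      % (frames[0][shadow].std(), merged[shadow].std()))
print("sky unclipped after merging:", bool(merged[sky].max() < 1.0))
print("shadow brightness after tone map: %.2f" % tonemapped[shadow].mean())
```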
More Titian, less Caravaggio
Google makes aesthetic choices about HDR+ styling, taking inspiration from Italian painters who rose to fame hundreds of years ago.
“We can learn from art,” Levoy said in a Google video about the Pixel 4 launch. “I’ve always been a fan of Caravaggio,” whose paintings have strong dark-light contrast and deep shadows. “That has always been the signature look of HDR+.”
With the Pixel 4, though, Google has steered closer to the lighter shadows of Titian. “We’ve moved a little bit toward that this year, not crushing the shadows quite as much,” Levoy said.
Separate camera controls for bright and dark
Live HDR+ lets Google offer better camera controls. Instead of just a single exposure slider to brighten or darken the photo, the Pixel 4 offers separate sliders for bright and dark regions.
That means you can show a shadowed face in the foreground without worrying you’ll wash out the sky behind. Or you can show details both on a white wedding dress and a dark tuxedo.
The dual-control approach is unique, and not just among smartphones, Levoy says. “There’s no camera that’s got live control over two variables of exposure like that,” he said.
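To give a rough sense of what two independent controls could mean mathematically, here’s a hypothetical sketch; the dual_exposure function, its weighting formula and the sample values are invented for illustration and are not how Google implements the Pixel 4’s sliders.

```python
# Hypothetical sketch of independent highlight and shadow controls; the
# dual_exposure function and its weighting are invented for illustration and
# are not the Pixel 4's actual tone-mapping.
import numpy as np

def dual_exposure(image, brightness=0.0, shadows=0.0):
    """Apply one gain to bright regions and another to dark regions.

    image: linear values in [0, 1]; brightness/shadows are in stops (EV),
    positive to brighten, negative to darken.
    """
    highlight_gain = 2.0 ** brightness
    shadow_gain = 2.0 ** shadows
    weight = np.clip(image, 0.0, 1.0)            # ~1 where bright, ~0 where dark
    gain = weight * highlight_gain + (1.0 - weight) * shadow_gain
    return np.clip(image * gain, 0.0, 1.0)

scene = np.array([0.05, 0.10, 0.60, 0.95])       # shadowed face ... bright sky
# Lift the dark subject by one stop while barely touching the sky.
print(dual_exposure(scene, brightness=0.0, shadows=+1.0))
# -> roughly [0.10, 0.19, 0.84, 1.00]
```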
Shoot the stars with astrophotography
In 2018, Google extended HDR+ with Night Sight, a path-breaking ability to shoot in dim restaurants and on urban streets by night. On a clear night, the Pixel 4 can go a step further with a special astrophotography mode for stars.
The phone takes 16 quarter-minute shots for a 4-minute total exposure time, reduces sensor noise, then marries the images together into one shot.
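The exposure arithmetic, and the reason stacking helps, can be sketched in a few lines; the frame count and duration come from Google’s description, while the noise model below is a generic illustration rather than the phone’s actual processing.

```python
# The stacking arithmetic described above, with a generic noise model attached.
# Frame count and duration match the article; everything else is illustrative.
import numpy as np

FRAMES = 16
SECONDS_PER_FRAME = 15                        # "quarter-minute shots"
print("total exposure:", FRAMES * SECONDS_PER_FRAME / 60, "minutes")   # 4.0

rng = np.random.default_rng(2)
star_signal = 0.05                            # faint star, linear units
frames = star_signal + rng.normal(0, 0.02, size=(FRAMES, 10_000))
stacked = frames.mean(axis=0)                 # align-and-average stand-in

# Averaging N frames cuts random sensor noise by roughly sqrt(N), here ~4x.
print("single-frame noise: %.4f" % frames[0].std())
print("stacked noise:      %.4f" % stacked.std())
```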
The Pixel 4’s Night Sight mode can photograph the Milky Way and individual stars — if the sky is clear enough.
Google
AI color correction
Digital cameras try to compensate for color casts like blue shade, yellow streetlights and orange candlelight that can mar photos. The Pixel 4 now makes this adjustment, called white balance, based in part on AI software trained on countless real-world photos.
Levoy showed me an example where it makes a difference, a photo of a woman whose face had natural skin tones even though she stood in a richly blue ice cave.
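For contrast with that learned approach, here’s the classical “gray world” heuristic that such AI models improve upon; the code is a generic textbook baseline, not anything from the Pixel 4. Notice that it would neutralize the ice cave’s blue along with the skin tones, which is exactly the kind of mistake a model trained on real photos can avoid.

```python
# Classical "gray world" white balance, shown as the baseline such AI models
# improve on; this is a textbook heuristic, not the Pixel 4's learned approach.
import numpy as np

def gray_world_balance(rgb):
    """Scale each channel so the image's average color becomes neutral gray."""
    rgb = np.asarray(rgb, dtype=float)
    channel_means = rgb.reshape(-1, 3).mean(axis=0)
    gains = channel_means.mean() / channel_means
    return np.clip(rgb * gains, 0, 255)

# A tiny two-pixel "photo" with a strong blue cast, as inside an ice cave.
photo = [[[80, 100, 180], [60, 80, 160]]]
print(gray_world_balance(photo))
```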
Better bokeh
The character of out-of-focus regions is called bokeh in photography circles, and with the Pixel 4 it’s improved to be more like what an SLR would produce. That’s because more of the portrait mode processing happens on raw image data, which allows more accurate calculations. Point sources of light now produce white discs in the bokeh, not gray, for example.
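The white-versus-gray difference largely comes down to where in the pipeline the blur happens. Below is a toy one-dimensional illustration of the general principle of blurring in linear light before gamma encoding rather than after; it is not Google’s portrait code, and the brightness values are arbitrary.

```python
# Toy illustration of blurring in linear light versus blurring display-encoded
# pixels; a generic demonstration of the principle, not Google's portrait code.
import numpy as np

linear = np.zeros(21)
linear[10] = 8.0                             # one very bright point light

kernel = np.ones(9) / 9                      # crude 9-pixel "lens blur"

# Blur the linear data first, then gamma-encode for display: a bright disc.
linear_first = np.clip(np.convolve(linear, kernel, mode="same"), 0, 1) ** (1 / 2.2)

# Gamma-encode (and clip) first, as an ordinary 8-bit image would be, then blur:
# the point light turns into a dim gray smudge.
encoded = np.clip(linear, 0, 1) ** (1 / 2.2)
encoded_first = np.convolve(encoded, kernel, mode="same")

print("disc brightness, blur on linear data :", round(linear_first.max(), 2))  # ~0.95
print("disc brightness, blur on encoded data:", round(encoded_first.max(), 2)) # ~0.11
```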
Depth data for better editing
The Pixel 4 adds the ability to record the 3D scene information called a depth map in every photo. That opens powerful editing abilities for tools like Adobe Lightroom, which can handle depth maps in iPhone photos.
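As an entirely hypothetical sketch of what an editor can do once a depth map travels with the photo, the example below blurs everything judged to be behind the subject; the background_blur function, its half-meter threshold and the tiny arrays are invented for illustration and reflect neither Adobe’s nor Google’s tools.

```python
# Entirely hypothetical sketch of using a per-photo depth map in an editor to
# blur the background; names, threshold and data are invented for illustration.
import numpy as np

def background_blur(image, depth, subject_depth, size=5):
    """Blur pixels judged to be behind the subject; keep the subject sharp.

    image: HxW grayscale, depth: HxW in meters (larger means farther away).
    """
    pad = size // 2
    padded = np.pad(image, pad, mode="edge")
    blurred = np.zeros_like(image)
    h, w = image.shape
    for y in range(h):                        # simple box blur, fine for a sketch
        for x in range(w):
            blurred[y, x] = padded[y:y + size, x:x + size].mean()
    background = depth > subject_depth + 0.5  # anything half a meter behind
    return np.where(background, blurred, image)

# Tiny example: a 4x4 "photo" with the subject in the two left columns.
image = np.arange(16, dtype=float).reshape(4, 4)
depth = np.where(np.arange(4) < 2, 1.5, 6.0) * np.ones((4, 1))
print(background_blur(image, depth, subject_depth=1.5))
```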
All these features represent a massive investment in computational photography — one Apple is mirroring with its own Night Mode, Smart HDR and Deep Fusion. Google has to “run faster and breathe deeper in order to stay ahead,” Levoy acknowledged.
But Apple also brings more attention to Google’s work. “If Apple follows us, that’s a form of flattery.”
Originally published Oct. 15.
Updates, Oct. 15 and 16: Adds detail about Google’s new computational photography abilities, Pixel 4 camera details and further comments from Levoy.