Mobile Photography Workflow: Pushing the Envelope With Lightroom and Pixel
It's no secret most of the world's photos are now shot with, and viewed on, a smartphone. For casual photography, the impressive on-board processing of modern phones is usually enough: simply shoot and share. But if you're a little more fussy about your images, or are photographing difficult subjects or in tricky lighting, then you're likely to want to do some image editing. Even with clever apps like Lightroom CC and Snapseed, that can be painful on a phone's small screen. Fortunately, there are now some ways to stay light but have a better platform for editing.
Perhaps taking pity on me for carrying twenty pounds of photo gear around every tech conference I cover, Google challenged me to see what I could do with a Pixelbook, a Pixel 2, and Lightroom. So I've been relying on that combination whenever possible for the last few weeks to see how complete a mobile photography solution I can get. I've supplemented it with either my Canon G9 X point-and-shoot or my Nikon D7500 for capturing images beyond what the phone can do on its own. Here's how it's worked out, along with some options for tuning your own mobile photography workflow.
Making the Most out of Your Smartphone Camera
First, for the photos I care about, I shoot in RAW. In the case of the Pixel 2 or my personal OnePlus 5, that is typically DNG, although with Lightroom I can now also take advantage of Adobe's Raw HDR workflow. The latter produces a high-dynamic-range, floating point DNG image that has already been given a default tone mapping, but still has much more range to work with than a simple JPEG or traditional DNG. Other than the requirement for post-processing, the only other reason not to shoot in RAW is that some of the super-clever computational imaging done by high-end phones is only accessible when you shoot JPEG. For example, Google's HDR+ technology that combines multiple frames to make a single, superior, image only works for JPEGs shot with the Camera app.
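To see why that extra headroom matters, consider what clipping does to a bright highlight. The toy example below is my own illustration (not Adobe's actual DNG encoding): a JPEG-style image clips at white, while a floating-point image keeps the over-bright values so an exposure pull in Lightroom can recover them:

```python
import numpy as np

# Hypothetical scene radiance, with a highlight 4x brighter than "white"
scene = np.array([0.25, 0.5, 1.0, 4.0])

# JPEG-style encoding clips at 1.0: the highlight becomes
# indistinguishable from plain white, and no edit brings it back
clipped = np.clip(scene, 0.0, 1.0)

# A floating-point image keeps values above 1.0, so pulling
# exposure down two stops (x 0.25) recovers the highlight detail
recovered = scene * 0.25

print(clipped[2] == clipped[3])     # True: highlight detail destroyed
print(recovered[2] < recovered[3])  # True: highlight detail preserved
```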
Even though I have a bunch of camera apps on my phones, for this article I've been using the camera capability built into Lightroom Mobile and the default Google Camera app for the Pixel 2. Lightroom's camera syncs my images to the Adobe cloud and to my other Lightroom devices automatically, and also supports RAW image capture, which the default Google Camera application for the Pixel 2 does not. Google's Camera app gives me the full benefit of Google's impressive computational imaging HDR+ technology, and syncs with Google Photos automatically.
The images shot with the Lightroom camera are also stored in Google Photos, which is pretty cool, since all images shot with a Pixel 2 between now and 2022 will be stored in original resolution for free by Google. As I've written about previously though, the proliferation of various backend photo clouds is adding some confusion, as each vendor's apps currently work best with their own cloud.
As long as both devices are online, images shot on the Pixel 2 sync through Lightroom automatically to the Pixelbook (and to the Lightroom desktop machine in my studio, and my primary laptop). To get images off my Nikon D7500 or Canon G9 X, I either need to use Wi-Fi or an SD card. Unfortunately, while the Pixel 2 paired nicely with the D7500, the Pixelbook didn't. Ultimately I'm not sure how big a deal that is, as the cameras' Wi-Fi is too slow to transfer large numbers of images or RAW files. So a USB SD card reader is the small and simple answer. For the Pixelbook, you'll either need a USB-C version or an inexpensive adapter. The good news is that everything including the D7500, flash, chargers, and cables fits in a convenient bag like the pictured MindShift PhotoCross.
Not surprisingly, the workflow starting with the smartphone camera is a whole lot simpler. But how does it measure up for challenging photography situations? High-dynamic-range and low-light scenes have been some of the toughest to capture with smartphones. A variety of computational imaging technologies under the loose heading "HDR" are now available to address those shortcomings. We'll take a look at them and how they perform.
HDR on Smartphones is Improving by Leaps and Bounds
When HDR first appeared on smartphones, it was clever, but fairly clunky. It mimicked the process of bracketing on a standalone camera by (relatively slowly) capturing 2-3 images and then tone mapping them in a straightforward fashion. Now, the Pixel 2, for example, captures up to 10 images in a fraction of a second, then aligns and assembles them using the full power of the phone's GPU and image processing hardware. Finally, it does noise reduction and an AI-based tone mapping that takes into account local contrast and the overall scene. Even the initial exposure is calculated based on a machine learning engine that has been trained on thousands of sample scenes. Apple, Samsung, and other high-end phone makers have similar systems, although they vary in how many images they capture, whether the images all have the same exposure, and in the quality of the post-processing and artifact suppression.
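The core idea behind these burst pipelines (capture many short exposures, align them, merge them, then tone map) can be sketched in a few lines. This is a deliberately minimal Python/NumPy illustration, not Google's actual HDR+ pipeline: it assumes the frames are already aligned and substitutes a simple global gamma curve for the AI-based tone mapping:

```python
import numpy as np

def merge_burst(frames):
    """Average a stack of aligned, equal-exposure frames.
    Random sensor noise falls roughly as 1/sqrt(N), which is what
    makes the merged frame usable in shadows a single frame isn't."""
    return np.mean(np.stack(frames), axis=0)

def tone_map(hdr, gamma=2.2):
    """Toy global tone map: normalize, then gamma-compress.
    Real pipelines use local, scene-aware operators instead."""
    return (hdr / hdr.max()) ** (1.0 / gamma)

# Simulate a 10-frame burst of the same scene with sensor noise
rng = np.random.default_rng(0)
scene = np.linspace(0.0, 1.0, 100)  # ground-truth radiance ramp
burst = [scene + rng.normal(0.0, 0.1, scene.shape) for _ in range(10)]

merged = merge_burst(burst)
final = tone_map(np.clip(merged, 0.0, None))

print(np.std(burst[0] - scene))  # noise in one frame, about 0.1
print(np.std(merged - scene))    # noise after merging, about 0.03
```

The averaging step is why burst HDR doubles as noise reduction: the merged frame is quieter than any single exposure, before tone mapping even starts.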
The result of Google's HDR+ (and similar features in other phones) is an effective extension of the phone camera's dynamic range well beyond what the 10-bit image sensor can provide natively. Google, as well as Apple, Samsung, and a couple other phone makers, have also done an excellent job reducing or eliminating the artifacts that come along with doing all that image fusion. You can still fool them with enough motion in the scene, but it is getting harder. For anyone who wants an instantly usable image, this in-camera HDR produces a standard JPEG you can share right away. But if you want the ultimate in HDR, Adobe has pushed things even further.
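You can put a rough number on that extension. In the shot-noise-limited case, averaging N equal-exposure frames cuts noise by a factor of sqrt(N), which deepens the usable shadow range by log2(sqrt(N)) stops. The helper below is my own back-of-the-envelope estimate, not a figure Google publishes:

```python
import math

def stops_gained(n_frames):
    """Extra usable dynamic range, in stops, from averaging
    n_frames equal-exposure frames (shot-noise-limited case)."""
    return math.log2(math.sqrt(n_frames))

print(round(stops_gained(10), 2))  # 1.66 stops over a single frame
```

So a 10-frame burst buys roughly a stop and two-thirds of shadow detail, on top of whatever the sensor delivers natively.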
With the newest version of Lightroom Mobile, if you have one of the supported smartphones, Lightroom Mobile's camera feature can painlessly capture enough individual images to record both the shadow and highlight areas of a scene. It then automatically merges the individual RAW images into a high-fidelity floating point RAW version for follow-on processing. The results are very impressive, at least for static scenes. The process is slower than the built-in HDR+ feature, so it doesn't work as well when there is motion in the scene. Also, because this is a twist on the RAW format that is unique to Adobe, images in this format aren't widely supported, at least not yet. For example, the Adobe HDR images I shot with the Pixel 2 aren't viewable in Google Photos. However, they fit right into Lightroom, which brings us to the next piece of the puzzle: image processing.
Lightroom Now Spans Just About Every Device from the Largest to the Smallest
Once only available on full computers, thanks in part to a complex interface that begged for a keyboard, mouse, and large display, Lightroom is now easily accessible on phones, tablets, computers, and even the latest Chromebooks that have Android support. While the available feature set varies between devices, as you'd expect, even the Mobile version has become quite powerful. When used on a Pixelbook or large tablet, you can do a large amount of professional-grade image editing with it. If that isn't enough, all of your images can be automatically synced to your computers for further editing, still in full fidelity.
The Pixelbook Isn't Your Father's Chromebook
When I tried to use the original Google Pixel as my traveling computer, it drove me nuts. There weren't any great image editors for Chrome OS, and I had access to neither my familiar Windows apps nor their Android equivalents. The addition of Android support, availability of Lightroom Mobile, and the option for an active stylus help make the new Pixelbook an entirely different experience. I can now do almost the same editing on the Pixelbook that I'd do on the road with my Windows laptop. And with Lightroom Mobile, my edits won't be wasted if I decide to do more work on an image later on my Windows desktop in Lightroom or Photoshop.
Overall, Google has put together an effective one-two punch for photographers who want to travel light, but still have a high-end workflow. That said, if you don't need the keyboard on the Pixelbook, then an iPad or an Android tablet with an active stylus would be a less expensive, and lighter-weight, alternative to the Pixelbook. Similarly, if you want one of the best smartphone cameras on the market, the Pixel 2 is ideal. But if you're on a budget, you can find less-expensive models that still support some form of automatic HDR and Adobe's RAW HDR capability. For example, my cheaper OnePlus 5 fit just as nicely into this workflow, although it doesn't produce the same image quality as the Pixel 2.
[Images by David Cardinal]
Source: https://www.extremetech.com/mobile/263960-mobile-photography-workflow-pushing-envelope-lightroom-pixel