As I posted a while ago, I have been experimenting with stitching shots from my phone. Since I am shooting in raw on the phone, I have some latitude to play with the shots in post that wasn’t there before. This time, though, I thought about it a bit more and put the camera into manual mode to fix the exposure. This should make the stitching and blending easier than when the exposure changed between shots (although, to give the Lightroom team credit, it did a pretty good job anyway). I allowed plenty of overlap and the merge seemed to go pretty well. Since it outputs a DNG file, you still have the chance to edit more aggressively than would be possible with a JPEG. Meanwhile, you get a higher resolution shot than with the internal pano mode. This may be my go-to method from now on.
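To see why a fixed exposure helps, consider what the stitcher has to do when exposure varies: estimate a brightness gain for each frame from the region it shares with its neighbour, then apply that gain before blending. This is a minimal sketch of that gain estimate (the function name and the flat example values are illustrative, not anything Lightroom exposes):

```python
def exposure_gain(overlap_a, overlap_b):
    """Estimate the gain that maps frame B's copy of the shared overlap
    strip onto frame A's, by comparing mean brightness."""
    mean_a = sum(overlap_a) / len(overlap_a)
    mean_b = sum(overlap_b) / len(overlap_b)
    return mean_a / mean_b

# Hypothetical overlap strips: frame B metered one stop darker than frame A.
strip_a = [0.50] * 400   # pixel values frame A recorded for the shared strip
strip_b = [0.25] * 400   # same scene content, underexposed in frame B
print(exposure_gain(strip_a, strip_b))  # 2.0 — brighten frame B 2x before blending
```

With the exposure locked in manual mode, that gain is 1.0 for every frame and the blend only has to hide the seams, which is presumably why the merge goes more smoothly.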
One end of the railroad museum in Sacramento is a roundhouse. It is still accessible from the line outside and I was there for a modern locomotive that was being unveiled. Access comes via a turntable which sits right next to the path along the river. I figured I would put together a panorama of the scene. However, I only had my phone (albeit able to shoot raw). I had never tried shooting a pano sequence with it before, having only used its internal pano function.
I wasn’t controlling the exposure (although there is a manual function in the app I use) but I had noticed that the Lightroom pano function seemed quite adept at dealing with small exposure variation. I took the sequence and there was not a big difference across the shots. When I got home, I added them to Lightroom and had a go at the stitching function. It worked better than I had expected. Some small distortions were there but it actually was rather good. I had not been happy with the reduced resolution of the phone’s pano function so this has provided a better option to use in the future.
The update to iOS 10 brought with it the possibility to shoot in RAW on the iPhone. For some reason Apple didn’t bother to incorporate this feature in the base phone app but they did make it available to other camera app developers. Camera+ is one that I use a bit so I figured I would start shooting in RAW via that. Obviously RAW means larger files but, since I download my files to the desktop frequently and tend to clear out the phone, this wasn’t a concern.
First thing I found out was that other apps could see the shots. I had taken a few shots and wanted to upload to Facebook and it turned out there wasn’t a problem doing so. However, the main benefit was anticipated to be in post processing back on the desktop. With the SLR shots (is there any point to saying DSLR these days?), it is possible to recover a lot from the highlights and shadows. Would the same be possible with the phone? Sort of. You can get a bit more in these areas than would be the case with the JPEG, where things are quickly lost. However, the sensor data is still not anywhere close to being as adaptable as it is for an SLR. You get more flexibility to pull the sky back but it is still pretty limited.
Is it worth using? Definitely. While it might not be the post processing experience you will be used to with SLR files, it is certainly better than the JPEGs provide. The increase in file size is hardly an issue these days so I will be using it from now on. The camera app doesn’t have the pano and time lapse stuff so easily to hand so the phone’s base app will still get used but, aside from that, it will be my choice. My main gripe now is that they have a random file naming protocol that is a little difficult to get used to. Small problems, eh?
While I have experimented with video a fair bit over time, one thing I haven’t done is put together a video with a presenter in it. My mum was recently staying and she had an idea for something she wanted to do that involved her doing a presentation on video that could be shared at a later date. My own experience and some information I had seen online made me think that the key to getting a good result was not going to be the video but was instead the sound. The microphone on the camera is of okay quality but it picks up the sound of everything around it. The voice isn’t isolated, and any video online that does not take a careful approach to audio is very obvious and sounds decidedly amateurish.
The ideal solution would be to have lav mikes, the small mike you see attached to the clothing of TV presenters. These are actually pretty accessible and cheap but I didn’t have the time to sort something out. However, a surprisingly good alternative was readily to hand. I have an app on my phone for sound recording which I use when interviewing people for articles. Instead of using the plugin microphone, I used the headphone/microphone cable. By running it inside the clothing and just leaving the microphone up near my mum’s throat, we were able to make a very good sound recording. The closeness of the mike to her mouth meant the sound was very localized and clear so the background noise was lost. The room we used did not have bad echoes either so the audio ended up being pretty clear.
Then it was just a case of having a conspicuous clap on the audio track and the video file to allow me to sync the sound and video together and we were off to the races. I shot everything with two cameras – one head-on and one from the side – with the idea of cutting between them. However, when I did the first edit, the side camera didn’t seem to fit with the style of presenting to camera. I imagine it works better for an interview style piece. I reverted to the head-on shot with some images cut in periodically to illustrate the piece. Overall, it worked pretty well. We did a number of takes and mum got progressively more relaxed in each one. I had thought I might cut the best bits together but the final take was really good so I didn’t need to do so. I hope her audience likes the result.
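I lined the tracks up by eye and ear in the editor, but the clap trick is simple enough to sketch in code too. Assuming both recordings are available as sample lists at a known sample rate (the function name and the synthetic spike data below are illustrative), the clap is just the loudest sample on each track, and the offset falls out of the difference:

```python
def clap_offset(audio_samples, camera_samples, sample_rate=48000):
    """Seconds the separate audio recording must be shifted to line up
    with the camera's own track, anchored on the loudest sample (the clap)."""
    clap_audio = max(range(len(audio_samples)), key=lambda i: abs(audio_samples[i]))
    clap_camera = max(range(len(camera_samples)), key=lambda i: abs(camera_samples[i]))
    return (clap_camera - clap_audio) / sample_rate

# Synthetic example: quiet room tone with one loud clap on each track.
audio = [0.01] * 200000
audio[48000] = 0.9          # clap lands 1.0 s into the phone recording
camera = [0.02] * 200000
camera[96000] = 0.8         # same clap lands 2.0 s into the camera track
print(clap_offset(audio, camera))  # 1.0 — slide the phone audio one second later
```

A real editor would cross-correlate the two waveforms rather than trust a single peak, but with one deliberately conspicuous clap the peak approach is exactly what your eye does on the timeline.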
The iPhone has an HDR function available in the camera’s software. However, it hasn’t impressed me in the past. My friend Hayman introduced me to an app called HDR Pro and I have used that as my default iPhone HDR app since. Recently, they introduced an updated version of the app called HDR Pro X. I decided to give it a go. I wanted to see what the images it produced were like, how the new controls worked and also to make a comparison with the output from HDR Pro.
The top shot is from the new app. The second one is the previous version. The added control certainly seems to be beneficial and the blowing out of the highlights is far better controlled. I am generally happy with the new version. The controls could be more user friendly. When you use Lightroom/Camera Raw all the time, anything less seems clunky! See what you think of the results.
A meeting in the heart of San Francisco meant a bunch of our team were meeting downtown. A few of us got there a little ahead of the meeting and, with a couple of minutes available, I wanted to check out the City Hall building since it was only a couple of blocks away. As an old city, San Francisco has some classic architecture and this is no exception. For some reason, despite the numerous times I have been to the city, I have never been to City Hall before.
A group of school kids were playing some orchestral music in the main hall and plenty of family members were there. I wandered around taking a look. Since I didn’t have my normal cameras, the phone had to serve duty. Fortunately, that also allowed me to try another one of the 360 panoramas. I suspect I shall be carrying another camera with me when I am next in the area.
A long time ago, I became intrigued by the idea of having location tag information appended to my images. It wasn’t something that I considered to be vital but it did seem to be potentially useful. However, as I am a Canon shooter, I quickly discovered that this was not going to be something that I could easily achieve. There were a number of GPS devices that you could attach to the hot shoe of a camera and plug in to the USB port, but they worked with Nikons while the Canons did not interface with them.
I was a touch disappointed but not so much that it changed my life. I gave up on geotagging for a while until Lightroom 4 came along. It has a useful map module that allows you to see where images were taken if they have coordinates associated with them and so search for shots in a given area. More importantly for me, it allowed you to drag and drop images onto the map to embed the data if they didn’t originally have it. This became part of my workflow.
It did get me more interested in the idea of having real time data with the images rather than trying to decide which shots were taken where. The easiest option appeared to be getting a GPS tracking app for my phone to try. I took a look at the apps available. There are many of them, most of which seemed to be focused on keeping track of members of your family! All rather creepy. All I wanted was something that would keep track of my location without draining the battery too much.
I settled on an app called GPX Master. It seemed to do what I wanted and had some good reviews. There is a free version with ads embedded or a paid version without ads. Since I was planning on it running in the background, having ads I would rarely see seemed useful. Moreover, it would automatically sync any track files I create with my Dropbox account. I have now taken it out for its first run. The results are very positive.
I was out shooting in a location but I did move a few times. When I got home, I imported my images to Lightroom and turned to the map module. I had checked that the clocks on the cameras were accurate since the time code of the GPS is what makes the process work. The GPX format file was already on my computer courtesy of Dropbox so I was ready to go. At the bottom of the screen was the drop down to select the track file. This I did and a blue track was immediately overlaid on the map showing where I had been. I then went to the map options drop down at the top of the page and it offered to automatically tag all files based on this track. Bingo! It was done.
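Under the hood, this kind of auto-tagging is just matching each photo's capture time to the nearest fix in the track, which is why the camera clocks being accurate matters so much. A minimal sketch of that matching, with an illustrative hand-built track rather than a parsed GPX file (Lightroom's actual implementation is internal):

```python
from bisect import bisect_left
from datetime import datetime, timezone

def tag_photo(track, photo_time):
    """Return the (lat, lon) of the track fix nearest the photo's capture time.

    track: time-sorted list of (timestamp, lat, lon) fixes, as would be
    read from the <trkpt> elements of a GPX file.
    """
    times = [fix[0] for fix in track]
    i = bisect_left(times, photo_time)
    # Compare the fixes on either side and keep whichever is closer in time.
    candidates = [track[j] for j in (i - 1, i) if 0 <= j < len(track)]
    nearest = min(candidates, key=lambda fix: abs(fix[0] - photo_time))
    return nearest[1], nearest[2]

t = lambda s: datetime.fromisoformat(s).replace(tzinfo=timezone.utc)
track = [  # illustrative fixes, not a real outing
    (t("2017-05-01T10:00:00"), 38.5850, -121.5040),
    (t("2017-05-01T10:05:00"), 38.5861, -121.5052),
    (t("2017-05-01T10:15:00"), 38.5875, -121.5080),
]
print(tag_photo(track, t("2017-05-01T10:06:10")))
# → (38.5861, -121.5052): the 10:05 fix is the nearest in time
```

If the camera clock had drifted by a few minutes, every photo would snap to the wrong fix, which is why checking the clocks before relying on the time code is worth the ten seconds it takes.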
It was so very easy. The images were grouped by their location. I zoomed in and saw that each location was actually a series of locations as I wandered around in each place. Very cool. I don’t know whether the tracks are absolutely accurate. They look pretty good to me. However, it is a lot more accurate than me guessing after the event. I am really impressed. The battery on the phone did not take much of a hit either so it looks like a promising approach. Now to remember to switch on the track whenever I am out shooting. I wonder how well I will do with that?
There is an app I have had on my phone for a while called 360 which is for taking panoramic photographs. I have mentioned it before here, and they have progressively introduced new features over that time. While the new operating system has a pano function built into the camera (if your phone isn’t too old), it is rather basic and nowhere near as good as this one.
Taking images requires a little planning since you are able to take a full spherical image. Doing this without having the whole thing look strange up close requires you to keep the camera’s position unchanged as you turn around it. This is harder than you think. The software can compensate a bit but you need to try and get it right in camera as much as possible.
It shows you a grid of the total shot and so you can see which bits you have shot and what is needed to fill it all in. It gives you a live preview as you shoot including looking straight up and down when required. Once the image is complete (or as much of it as you want), it processes it and then you can upload it to a website to view later. The links here are from that site. It is a great app and fun in some situations and valuable in others when showing off a wide view is hard to do any other way.
When checking these examples out, don’t miss out on a cool feature. At the top of the viewer are three buttons. It starts on the middle setting which allows you to pan around. If you click on the left button, it creates a view from the ground up. The right button creates a view looking straight down. (This only works properly if I have shot a full 360 image.) This looks like the work by Gerry Holtz that I blogged about here, although his is far superior.
While the title might be a bit inaccurate, you are probably not going to struggle to guess what the topic really is. I have always enjoyed playing with panos since my days of film, when it was a lot less complex and consisted of sticking together a large number of 6×4 prints to make a larger collage. The effect was rough and ready but there was something rather cool about the way they came out that the highly accurate results possible today don’t have.
While stitching together shots is straightforward given any number of software tools (although Photomerge in Photoshop has got sufficiently good that I rarely need to use anything else), achieving a good pano with the camera in my phone was a different challenge. Without control of the exposure and the white balance, it was hard to make the shots merge cleanly. I then got hold of an app called 360 which would shoot a pano as you gradually moved the camera around. It overlays a grid so you can see the coverage still needed and lets you pan up and down as well as laterally.
Originally the results were less than ideal. It would have discontinuities on some of the straight lines in the scene and was particularly unhappy in low light or when panning vertically. However, the latest version of the app is now out and it is a dramatic improvement. After playing with it at home briefly, I tried it while on the road the other day. We were heading up past Green Bay and so stopped off to see Lambeau Field. I’m sure the Bears fans I know will be cursing me for such treachery but I wanted to see such a famous stadium. Anyway, it wasn’t a photo trip so I only had my phone in my pocket at the time. I decided to give the pano a shot using the latest version of the app and I think it did a pretty good job. It was fun to watch new areas that had a different exposure suddenly get corrected to merge with the existing image and straight lines that were originally out of position jump to the right place as the app worked out what was wanted. It isn’t perfect yet. Some lines still don’t line up and the resolution is surprisingly low compared to a normal shot by the phone.
Nice work by Occipital, who are the developers. I like what you have done so far and look forward to what you come up with next.