Sometimes you just forget what you have tucked in the garage. I have been trying to get shots of the hummingbirds in our back yard and more recently shot a little video with the DSLR. Then it occurred to me that they might be wary of people but not of inanimate objects. Why not stick a camera on a post right next to the feeder?
While it hasn’t had much use recently, I have a GoPro (or two). I have an adaptor that would sit on top of a lighting stand which is plenty tall enough to get up to the height of the feeder. Moreover, I can control it all remotely using a phone/tablet including a live video feed. I sat indoors with the iPad on watching for movement on screen while doing other things. As soon as one appeared, a press of the button and they were being recorded. The initial attempts failed until I remembered to switch off the beeps and the LEDs that flash during recording. After that it was easy. The results were rather pleasing.
The update to iOS 10 brought with it the possibility to shoot in RAW on the iPhone. For some reason Apple didn’t bother to incorporate this feature in the base phone app but they did make it available to other camera app developers. Camera+ is one that I use a bit so I figured I would start shooting in RAW via that. Obviously RAW means larger files but, since I download my files to the desktop frequently and tend to clear out the phone, this wasn’t a concern.
First thing I found out was that other apps could see the shots. I had taken a few shots and wanted to upload them to Facebook and it turned out there wasn’t a problem doing so. However, the main anticipated benefit was in post processing back on the desktop. With the SLR shots (is there any point to saying DSLR these days?), it is possible to recover a lot from the highlights and shadows. Would the same be possible with the phone? Sort of. You can get a bit more in these areas than with the JPEG, where detail is quickly lost. However, the sensor data is still not anywhere close to being as adaptable as it is from an SLR. You get more flexibility to pull the sky back but it is still pretty limited.
Is it worth using? Definitely. While it might not be the post processing experience you will be used to with SLR files, it is certainly better than the JPEGs provide. The increase in file size is hardly an issue these days so I will be using it from now on. The camera app doesn’t have the pano and time lapse stuff so easily to hand so the phone’s base app will still get used but, aside from that, it will be my choice. My main gripe now is that they have a random file naming protocol that is a little difficult to get used to. Small problems, eh?
My regular trips across the country often result in seeing aircraft out of the window as they cross our path or head in the opposite direction. I have seen plenty of jets in relatively close proximity but, with the speed differential being high, they are usually gone without a chance to grab a camera. Keeping it to hand for the entire flight is a little inconvenient. However, I have been working on an alternative plan.
Southwest provides in flight wifi and I get it for free as a result of the amount of flying I do with them. I log on to the wifi and open up Flightradar24 on my iPad. I can see where we are and I can see where other planes are too. The number of ADS-B equipped planes has gone up substantially in the last couple of years so accurate tracks are now common. Flightradar is usually pretty close to real time. I have learned, though, that the speed of the net connection on the plane is a bit lethargic and the locations on the app while airborne have a bit of lag compared to the real situation.
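Mentally, the compensation amounts to simple dead reckoning: if the feed is some number of seconds behind, the other jet has moved on by its speed times that lag. A minimal sketch of the idea (the function name and the numbers are hypothetical, and it assumes the feed lag can be estimated and the other aircraft holds its speed and track):

```python
import math

def project_position(lat, lon, ground_speed_kts, track_deg, lag_s):
    """Dead-reckon an aircraft's current position from a lagged fix.

    lat/lon: last reported position (degrees)
    ground_speed_kts: reported ground speed in knots
    track_deg: reported track, degrees clockwise from north
    lag_s: estimated feed lag in seconds
    """
    distance_nm = ground_speed_kts * lag_s / 3600.0  # knots x hours = nm
    # One minute of latitude is one nautical mile; flat-earth
    # approximation is fine over a few miles.
    dlat = distance_nm / 60.0 * math.cos(math.radians(track_deg))
    dlon = (distance_nm / 60.0 * math.sin(math.radians(track_deg))
            / math.cos(math.radians(lat)))
    return lat + dlat, lon + dlon

# A jet doing 480 kts due north with a 30 second lag has moved 4 nm,
# i.e. 4 minutes of latitude further north than the app shows.
new_lat, new_lon = project_position(40.0, -95.0, 480, 0, 30)
```

With a closing speed of around 900 kts for opposite-direction traffic, even a 30 second lag is several miles of error, which is why the jets appear sooner than the app suggests.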
By compensating for this, I have been able to predict a few encounters with other jets. Of course, they are never as close when you plan for it as when you are caught out but they were still pretty close. Shooting through an airliner window is not ideal but I managed to get a few shots all the same. Does this count as my first airliner a2a sortie?
The iPhone has an HDR function available in the camera’s software. However, it hasn’t impressed me in the past. My friend Hayman introduced me to an app called HDR Pro and I have used that as my default iPhone HDR app since. Recently, they introduced an updated version of the app called HDR Pro X. I decided to give it a go. I wanted to see what the images it produced were like, how the new controls worked and also to make a comparison with the output from HDR Pro.
The top shot is from the new app. The second one is the previous version. The added control certainly seems to be beneficial and the blowing out of the highlights is far better controlled. I am generally happy with the new version. The controls could be more user friendly. When you use Lightroom/Camera Raw all the time, anything less seems clunky! See what you think of the results.
A meeting in the heart of San Francisco meant a bunch of our team were meeting downtown. A few of us got there a little ahead of the meeting and, with a couple of minutes available, I wanted to check out the City Hall building since it was only a couple of blocks away. As an old city, San Francisco has some classic architecture and this is no exception. For some reason, despite the numerous times I have been to the city, I have never been to City Hall before.
A group of school kids were playing some orchestral music in the main hall and plenty of family members were there. I wandered around taking a look. Since I didn’t have my normal cameras, the phone had to serve duty. Fortunately, that also allowed me to try another one of the 360 panoramas. I suspect I shall be carrying another camera with me when I am next in the area.
A long time ago, I became intrigued by the idea of having location tag information appended to my images. It wasn’t something that I considered to be vital but it did seem to be potentially useful. However, as I am a Canon shooter, I quickly discovered that this was not going to be something that I could easily achieve. There were a number of GPS devices that you could attach to the hot shoe of a camera and plug into the USB port, but they worked with Nikons while Canons did not interface with them.
I was a touch disappointed but not so much that it changed my life. I gave up on geotagging for a while until Lightroom 4 came along. It has a useful map module that allows you to see where images were taken if they have coordinates associated with them and so search for shots in a given area. More importantly for me, it allowed you to drag and drop images onto the map to embed the data if they didn’t originally have it. This became part of my workflow.
It did get me more interested in the idea of having real time data with the images rather than trying to decide which shots were taken where. The easiest option appeared to be getting a GPS tracking app for my phone to try. I took a look at the apps available. There are many of them, most of which seemed to be focused on keeping track of members of your family! All rather creepy. All I wanted was something that would keep a track of my location without draining the battery too much.
I settled on an app called GPX Master. It seemed to do what I wanted and had some good reviews. There is a free version with ads embedded or a paid version without ads. Since I was planning on it running in the background, ads I would rarely see were not a concern, so the free version sufficed. Moreover, it would automatically sync any track files I create with my Dropbox account. I have now taken it out for its first run. The results are very positive.
I was out shooting in a location but I did move a few times. When I got home, I imported my images to Lightroom and turned to the map module. I had checked that the clocks on the cameras were accurate since the time code of the GPS is what makes the process work. The GPX format file was already on my computer courtesy of Dropbox so I was ready to go. At the bottom of the screen was the drop down to select the track file. This I did and a blue track was immediately overlaid on the map showing where I had been. I then went to the map options drop down at the top of the page and it offered to automatically tag all files based on this track. Bingo! It was done.
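Under the hood, the matching the time code makes possible is simple: for each photo, find the two track points that bracket its timestamp and interpolate between them. A minimal sketch of that idea (the helper name is hypothetical, and it assumes the GPX points have already been parsed into tuples and share a timezone with the camera clock):

```python
from bisect import bisect
from datetime import datetime, timezone

def locate(photo_time, track):
    """Interpolate a photo's position from a time-sorted GPS track.

    track: list of (datetime, lat, lon) tuples, sorted by time.
    Returns (lat, lon) for photo_time, clamping to the track ends.
    """
    times = [t for t, _, _ in track]
    i = bisect(times, photo_time)
    if i == 0:                       # photo taken before the track started
        return track[0][1], track[0][2]
    if i == len(track):              # photo taken after the track ended
        return track[-1][1], track[-1][2]
    (t0, la0, lo0), (t1, la1, lo1) = track[i - 1], track[i]
    f = (photo_time - t0).total_seconds() / (t1 - t0).total_seconds()
    return la0 + f * (la1 - la0), lo0 + f * (lo1 - lo0)

# Two fixes a minute apart; a photo taken 30 s in lands halfway between.
track = [
    (datetime(2016, 10, 1, 12, 0, 0, tzinfo=timezone.utc), 41.00, -87.60),
    (datetime(2016, 10, 1, 12, 1, 0, tzinfo=timezone.utc), 41.01, -87.61),
]
lat, lon = locate(datetime(2016, 10, 1, 12, 0, 30, tzinfo=timezone.utc), track)
# lat ≈ 41.005, lon ≈ -87.605
```

This is also why an accurate camera clock matters: a clock a minute off slides every photo one minute along the track.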
It was so very easy. The images were grouped by their location. I zoomed in and saw that each location was actually a series of locations as I wandered around in each place. Very cool. I don’t know whether the tracks are absolutely accurate. They look pretty good to me. However, it is a lot more accurate than me guessing after the event. I am really impressed. The battery on the phone did not take much of a hit either so it looks like a promising approach. Now to remember to switch on the tracking whenever I am out shooting. I wonder how well I will do with that?
There is an app I have had on my phone for a while called 360 which is for taking panoramic photographs. I have mentioned it before here, and they have progressively introduced new features over that time. While the new operating system has a pano function built into the camera (if your phone isn’t too old), it is rather basic and nowhere near as good as this one.
Taking images requires a little planning since you are able to take a full spherical image. Doing this without having the whole thing look strange up close requires you to rotate the camera about a fixed point as you turn around rather than moving it with you. This is harder than you think. The software can compensate a bit but you need to try and get it right in camera as much as possible.
It shows you a grid of the total shot and so you can see which bits you have shot and what is needed to fill it all in. It gives you a live preview as you shoot including looking straight up and down when required. Once the image is complete (or as much of it as you want), it processes it and then you can upload it to a website to view later. The links here are from that site. It is a great app and fun in some situations and valuable in others when showing off a wide view is hard to do any other way.
When checking these examples out, don’t miss out on a cool feature. At the top of the viewer are three buttons. It starts on the middle setting which allows you to pan around. If you click on the left button, it creates a view from the ground up. The right button creates a view looking straight down. (This only works properly if I have shot a full 360 image.) This looks like the work Gerry Holtz has done, which I blogged about here, although his is far superior.
While the title might be a bit inaccurate, you are probably not going to struggle to guess what the topic really is. I have always enjoyed playing with panos since my days of film, when the process was a lot less sophisticated and consisted of sticking together a large number of 6×4 prints to make a larger collage. The effect was rough and ready but there was something rather cool about the way they came out that the highly accurate results possible today don’t have.
While stitching together shots is straightforward given any number of software tools (although Photomerge in Photoshop has got sufficiently good that I rarely need to use anything else), achieving a good pano with the camera in my phone was a different challenge. Without control of the exposure and the white balance, it was hard to make the shots merge cleanly. I then got hold of an app called 360 which would shoot a pano as you gradually moved the camera around. It overlays a grid so you can see the coverage still available and you could pan up and down as well as laterally.
Originally the results were less than ideal. It would have discontinuities on some of the straight lines in the scene and was particularly unhappy in low light or when panning vertically. However, the latest version of the app is now out and it is a dramatic improvement. After playing with it at home briefly, I tried it while on the road the other day. We were heading up past Green Bay and so stopped off to see Lambeau Field. I’m sure the Bears fans I know will be cursing me for such treachery but I wanted to see such a famous stadium. Anyway, it wasn’t a photo trip so I only had my phone in my pocket at the time. I decided to give the pano a shot using the latest version of the app and I think it did a pretty good job. It was fun to watch new areas that had a different exposure suddenly get corrected to merge with the existing image and straight lines that were originally out of position jump into the right place as the app worked out what was wanted. It isn’t perfect yet. Some lines still don’t line up and the resolution is surprisingly low compared to a normal shot by the phone.
Nice work by Occipital, the developers. I like what you have done so far and look forward to what you come up with next.