E D U B O T S
SHAPING THE FUTURE OF ROBOTICS
Hyperfocus: Follow Focus For The Masses, by Miles Crabill
The rise of consumer digital cameras has sparked interest in inexpensive, high-quality video production. From DSLRs to GoPros, everyone wants excellent pictures, video capture, and more. Moving shots have always required a steady hand or a tripod, and some serious effort on the part of the photographer. The best-quality video capture, with automatic focus, has historically been reserved for the film industry and its budgets.
With Hyperfocus, our team is trying to disrupt follow focus. Commercial follow focus hardware costs anywhere from $500 to $5,000 and many of the lower end models are not automatic. Recently, a $50 manual follow focus was successfully Kickstarted at ten times its goal, proving the existence of the hobbyist market. We want to do the same thing for automatic follow focus, which is comically expensive and currently impractical for anyone other than big budget movie producers.
Automatic follow focus rigs typically use a laser rangefinder to track the distance between the subject of the shot and the lens, then adjust the focus based on that distance. This raises an interesting problem: the point on the focus ring where objects are in focus changes from lens to lens. Each lens has an equation describing the relationship between focus-ring position and the distance at which objects are in perfect focus. We also found that different distance ranges are modeled by separate equations, even for the same lens. We use hyperfocal distance as a guide to calibrate lenses, hence the name: Hyperfocus.
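To illustrate the calibration idea (the structure, names, and numbers below are a hypothetical sketch, not the project's actual code), the per-lens, per-range behavior can be approximated as piecewise linear interpolation between measured calibration points:

```cpp
#include <cstddef>
#include <utility>
#include <vector>

// Hypothetical calibration table: pairs of (subject distance in meters,
// focus-ring position in motor steps), measured per lens and sorted by
// distance. Because different distance ranges follow different equations,
// interpolation is done separately within each segment.
struct LensCalibration {
    std::vector<std::pair<double, double>> points;  // (distance, ring steps)

    // Linearly interpolate the focus-ring position for a given distance,
    // clamping to the calibrated range at either end.
    double ringPosition(double distance) const {
        if (distance <= points.front().first) return points.front().second;
        if (distance >= points.back().first)  return points.back().second;
        for (std::size_t i = 1; i < points.size(); ++i) {
            if (distance <= points[i].first) {
                const auto& [d0, r0] = points[i - 1];
                const auto& [d1, r1] = points[i];
                double t = (distance - d0) / (d1 - d0);
                return r0 + t * (r1 - r0);
            }
        }
        return points.back().second;  // unreachable with sorted points
    }
};
```

With hypothetical calibration points of 0.5 m at step 0, 1.0 m at step 200, 3.0 m at step 350, and 10.0 m at step 400, a subject at 2.0 m falls halfway through the 1.0–3.0 m segment and maps to step 275.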
We started the project in the 2015 Spring Semester. Our first prototype was powered by an Arduino with an ultrasonic sensor and a stepper motor. We built a custom mount for the camera body, attached a belt from the stepper motor to a camera lens, and manually calibrated our test program for the lens. Our prototype sent and received ultrasonic pings, using the amount of time between “call” and “response” pings to calculate the rough distance between the lens and the object. Of course, this only worked for objects that were able to properly reflect the pings, so angular objects were tricky.
The coolest thing is holding the camera mount, looking through the lens, and just panning around—the focus adjusts automatically depending on what you point the camera at. Of course, our first prototype wasn’t perfect. The stepping wasn’t fully smooth, the distance ratios were a little off, and, most importantly, it wasn’t easily adaptable to different camera configurations.
We’re at work on our second prototype, which has moved to LIDAR for greater accuracy at a slightly higher cost. The second prototype is a tripod with a mounted panel that includes the LIDAR module, our Arduino, and the stepper motor used to step the lens. The goal is to create a calibration program that takes the user through the process of adapting Hyperfocus to the camera and lens. This entails focusing all the way in and all the way out and focusing on objects at specific distances, and is still quite involved.
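One way to keep the stepping smooth while the rig chases a moving distance reading is to cap how many steps the motor is commanded per control tick; a minimal sketch (the function name and limit are hypothetical, not the project's actual firmware):

```cpp
#include <algorithm>

// Hypothetical per-tick stepper command: move toward the target ring
// position, but never by more than maxStepsPerTick at once. Capping the
// rate keeps the focus pull smooth instead of snapping between positions
// when the measured distance jumps.
int stepsThisTick(int currentSteps, int targetSteps, int maxStepsPerTick) {
    int delta = targetSteps - currentSteps;
    return std::clamp(delta, -maxStepsPerTick, maxStepsPerTick);
}
```

Each tick, the loop would read a distance from the LIDAR module, look up the calibrated ring position for that distance, and issue the clamped step count to the motor.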
Automatic follow focus is a really difficult problem. Professional automatic rigs tend to be used for car chase scenes in movies and are rented by the hour because they come bundled with the car they are mounted on. We’re trying to solve this problem and take it to the consumer market in one fell swoop. Any working robot that records video or sends back visual reports will benefit from automatic follow focus. Hyperfocus is just one of the ongoing projects that began in Dr. Kellar Autumn’s Technologies of the Future course, through Lewis & Clark’s Center for Entrepreneurship ( https://college.lclark.edu/programs/entrepreneurship/). Be on the lookout for reports on other projects from the class and beyond in the next edition of EduBots.
Lewis & Clark College, www.lclark.edu