Guide

What's the best Focal Length to take a NeRF?

Michael Rubloff

Jul 10, 2023

NeRF Focal Length

As one of the website's readers kindly pointed out, I promised to dive into focal lengths in my post about NeRF camera settings earlier this year. Obviously, camera settings make a massive difference for getting high-quality NeRFs, but I believe lens choice is right up there.

There is a longstanding photography maxim that you should marry your lens and trade out camera bodies over time. This holds true with NeRFs: switching lenses has seriously upped my NeRF game.

So with that in mind, what's worked best for me?

What's the ideal focal length for NeRFs?

In my opinion, the sweet spot seems to be 14mm. Yes, this gets into fisheye territory, but the added field of view means you're giving your NeRF method more data with each input image. The benefits extend further once we look at some photography principles.
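To put the field-of-view gain in rough numbers, here's a minimal sketch of the standard angle-of-view formula (assuming a rectilinear lens and a full-frame sensor 36mm wide; a true fisheye projection and crop sensors will differ):

```python
import math

def horizontal_fov(focal_length_mm: float, sensor_width_mm: float = 36.0) -> float:
    """Horizontal angle of view in degrees for a rectilinear lens."""
    return math.degrees(2 * math.atan(sensor_width_mm / (2 * focal_length_mm)))

# A 14mm lens covers far more of the scene per image than a 50mm lens:
print(round(horizontal_fov(14)))  # ~104 degrees
print(round(horizontal_fov(50)))  # ~40 degrees
```

More degrees of coverage per frame means more scene overlap between neighboring shots, which is exactly what structure-from-motion and NeRF training want.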

When I was first told about using a fisheye lens, my head kept thinking about this gif that compares focal lengths.

My sole thought was that I was taught never to shoot portraits below 20mm because of the distortion. It took me a few weeks to wrestle with the idea, until a friend finally lent me a 14mm lens and I very quickly realized I was wrong. The NeRFs were immediately sharper, with significantly fewer artifacts.

Additionally, I realized there are more benefits to shooting wide.

Wide angle lenses are less susceptible to shaky hands when hitting the shutter button, though don't treat this as a cure-all. Still try to keep to standard photography practice, like creating a makeshift tripod with your elbows.

But this doesn't stop at taking NeRFs of people or objects. Imagine how much easier shooting NeRFs indoors becomes when you can suddenly fit significantly more into each image. A wide angle also gets into places where you might not be able to shoot traditionally, and helps fill in the data gaps.

The final benefit has to do with another photography principle: hyperfocal distance. You see this term pop up a lot in landscape photography and cinematography, where it's critical to have as much of the view in front of the lens in focus. B&H Photo defines hyperfocal distance as:

The distance when the lens is focused at infinity, at which objects from half of this distance to infinity will be in focus (or “acceptable sharpness”) for a particular lens. Alternatively, hyperfocal distance may refer to the closest distance that a lens can be focused for a given aperture while objects at a distance (infinity) will remain acceptably sharp.

B&H Photo

As you might guess, the major factors in calculating your hyperfocal distance are your focal length and your aperture. For those who really want to go down the rabbit hole, here's a hyperfocal distance calculator.
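If you'd rather compute it yourself, the standard approximation is H ≈ f²/(N·c) + f, where f is the focal length, N the f-number, and c the circle of confusion (commonly taken as ~0.03mm for full frame). A quick sketch:

```python
def hyperfocal_distance_m(focal_length_mm: float,
                          f_number: float,
                          coc_mm: float = 0.03) -> float:
    """Hyperfocal distance in meters: H = f^2 / (N * c) + f."""
    h_mm = focal_length_mm**2 / (f_number * coc_mm) + focal_length_mm
    return h_mm / 1000.0

# A 14mm lens at f/4 on full frame: focus at ~1.6m and everything
# from roughly half that distance (~0.8m) to infinity stays acceptably sharp.
print(round(hyperfocal_distance_m(14, 4), 2))   # ~1.65
print(round(hyperfocal_distance_m(50, 4), 2))   # ~21.05
```

Notice how short the hyperfocal distance is at 14mm compared to 50mm: with the wide lens, nearly the entire scene lands in the acceptably sharp zone, which is exactly what you want for clean NeRF inputs.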

Because you're effectively focused to infinity, you get that deep depth of field and sharp backgrounds. Pair that with Nerfacto or Nerfacto-huge and you have a potential landscape scene ready to go in Unreal Engine.

What's a bad focal length for NeRFs?

This is a tough one that gives the dreaded answer: it depends. Once you get past 50mm, you might want to consider a wider lens. That said, if you're using a rig or dolly as part of your setup, the recommendation changes. While you'll be able to cut out almost all camera shake, you'll need to supplement your dataset with more images to compensate for the narrower views.

Also keep in mind that the more zoomed in you are, the wider a circle you'll have to make around your subject to NeRF it. Bokeh, often the marketing buzz for new lenses, will be your enemy in NeRFs. If you'd like to control your aperture in post, both Instant-NGP and Luma offer controls to do so.

If purchasing a wide angle lens isn't justifiable for you, there are still plenty of great options. A lot of kit lenses start in the 24mm range, and you might already have one; these can be a great choice for someone looking to use an existing lens. For crop-sensor shooters, Canon also makes a great 24mm pancake lens for roughly $130, but make sure it's compatible with your camera body.

Should I use a Prime Lens for NeRFs?

The short answer is yes. Prime lenses have better build quality and offer glass precisely calibrated to a single focal length. Zoom lenses sound better in theory but, in practice, do not work as well.

Prime lenses will also often be lighter than their zoom counterparts, and depending on how long it takes you to shoot a NeRF, you might find your forearms happier at the end of the day with a lighter lens.

A lot of prime lenses can open very wide, but this isn't necessary for NeRFs. As I mentioned in the camera settings article, I would much rather crank up my ISO and shoot at a narrower aperture. If you're going to be using this lens specifically for NeRFs, you don't need one that opens to f/2.8 or wider. I try not to go below f/4; it's just not necessary, and the shallow depth of field begins to detract from the NeRF.

This doesn't mean you need to go out and buy a crazy expensive lens; there are great options at most budgets. Personally, I've been using KEH.com to find used lenses at a discount. This isn't sponsored whatsoever, but it's what helped me pull the trigger on a lens. I also scoped out eBay for a while, but opted for KEH because of their warranty and return policy (I was a little skeptical at first about how much of a difference a wide angle lens would make).

Taking NeRFs on a Phone?

A decent number of people reading this are probably thinking: but I'm just using my iPhone/smartphone, what about me? Well, you're also in luck. Smartphones are amazing for taking NeRFs to begin with: they offer sharp images, deep depth of field, and they fit in your pocket. But that's not why you're reading this. If your phone offers a wide angle mode, use it. You want the widest field of view possible.

For the iPhone, that means using the 0.5x setting. When I'm shooting NeRFs on my phone, it's always set to this.

Keep in mind, nothing has been set in stone yet for NeRF best practices, but for me this has made a massive difference. I don't go anywhere without my lens and have been really happy with the results I've gotten so far. The smallest dataset I've tried that came out really clean was 24 photos, but I'm looking to see how low I can go.

If something has been working better for you, let me know! I'm really curious to see what works for you!
