Motion Blur Tutorial

17/01/2013 (originally posted on 21/04/2011)

What is motion blur?

Motion pictures are made up of a series of still images displayed in quick succession. These images are captured by briefly opening a shutter to expose a piece of film or an electronic sensor to light (via a lens system), then closing the shutter and advancing the film/saving the data. Motion blur occurs when an object in the scene (or the camera itself) moves while the shutter is open during the exposure, causing the resulting image to streak along the direction of motion. It is an artifact which the image-viewing populace has grown so used to that its absence is conspicuous; adding it to a simulated image enhances the realism to a large degree.

Later we'll look at a screen space technique for simulating motion blur caused only by movement of the camera. Approaches to object motion blur are a little more complicated and worth a separate tutorial. First, though, let's examine a 'perfect' (full camera and object motion blur) solution which is very simple but not really efficient enough for realtime use.

Perfect solution

This is a naive approach which has the benefit of producing completely realistic full motion blur, incorporating both the camera movement and movement of the objects in the scene relative to the camera. The technique works like this: for each frame, render the scene multiple times at different temporal offsets, then blend the results together.

This technique is actually described in the red book (chapter 10). Unfortunately it requires rendering the scene at samples * framerate, which is either impossible or impractical for most realtime applications. And don't be tempted to just reuse the previous samples frames - this will give you trippy trails (and nausea) but definitely not motion blur. So how do we go about doing it quick n' cheap?
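
To make the cost concrete, here's a minimal CPU-side sketch of the brute-force approach. All of the renderer hooks (renderSceneAt and the accumulation helpers) are hypothetical placeholders standing in for whatever your engine provides:

   // Hypothetical renderer hooks - names are placeholders, not from this post:
   void clearAccumulationBuffer();              // zero a floating-point accumulation target
   void renderSceneAt(float timeSeconds);       // render one sub-frame sample of the scene
   void accumulateWeighted(float weight);       // add the last render, scaled by weight
   void resolveAccumulationBuffer();            // present the accumulated (averaged) result

   const int   kSamples  = 16;                  // sub-frame samples per displayed frame
   const float kExposure = 1.0f / 60.0f;        // simulated shutter time, in seconds

   void renderBlurredFrame(float frameTime)
   {
      clearAccumulationBuffer();
      for (int i = 0; i < kSamples; ++i) {
      // distribute the samples across the exposure interval:
         float t = frameTime + kExposure * (float(i) / float(kSamples));
         renderSceneAt(t);
         accumulateWeighted(1.0f / float(kSamples));
      }
      resolveAccumulationBuffer();
   }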

Screen space to the rescue!

The idea is simple: each rendered pixel represents a point in the scene at the current frame. If we know where it was in the previous frame, we can apply a blur along a vector between the two points in screen space. This vector represents the size and direction of the motion of that point between the previous frame and the current one, hence we can use it to approximate the motion of a point during the intervening time, directly analogous to a single exposure in the real world.

The crux of this method is calculating a previous screen space position for each pixel. Since we're only going to implement motion blur caused by motion of the camera, this is very simple: each frame, store the camera's model-view-projection matrix so that in the next frame we'll have access to it. Since this is all done on the CPU the details will vary; I'll just assume that you can supply the following to the fragment shader: the previous model-view-projection matrix and the inverse of the current model-view matrix.
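
For example, the per-frame bookkeeping on the CPU might look something like the sketch below. I'm assuming GLM for the matrix maths and a hypothetical setUniform helper; only the two uniform names correspond to the fragment shader that follows:

   #include <glm/glm.hpp>

   // Hypothetical helper: upload a mat4 uniform to the currently bound program.
   void setUniform(const char* name, const glm::mat4& value);

   // Model-view-projection from the previous frame (identity on the first frame).
   glm::mat4 gPrevModelViewProj(1.0f);

   void updateMotionBlurUniforms(const glm::mat4& modelView, const glm::mat4& projection)
   {
      glm::mat4 modelViewProj    = projection * modelView;
      glm::mat4 inverseModelView = glm::inverse(modelView);

   // uniforms consumed by the blur shader:
      setUniform("uInverseModelViewMat", inverseModelView);
      setUniform("uPrevModelViewProj",   gPrevModelViewProj);

   // store this frame's matrix so that next frame it becomes the 'previous' one:
      gPrevModelViewProj = modelViewProj;
   }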

Computing the blur vector

In order to compute the blur vector we take the following steps within our fragment shader:
  1. Get the pixel's current view space position. There are a number of equally good methods for extracting this from an existing depth buffer, see Matt Pettineo's blog for a good overview. In the example shader I use a per-pixel ray to the far plane, multiplied by a per-pixel linear depth.
  2. From this, compute the pixel's current world space position using the inverse of the current model-view matrix.
  3. From this, compute the pixel's previous normalized device coordinates using the previous model-view-projection matrix and a perspective divide.
  4. Scale and bias the result to get texture coordinates.
  5. Our blur vector is then the current pixel's texture coordinates minus the coordinates we just calculated.
The eagle-eyed reader may have already spotted that this can be optimized, but for now we'll do it long-hand for the purposes of clarity. Here's the fragment program:
   uniform sampler2D uTexLinearDepth;

   uniform mat4 uInverseModelViewMat; // inverse model->view
   uniform mat4 uPrevModelViewProj; // previous model->view->projection

   noperspective in vec2 vTexcoord;
   noperspective in vec3 vViewRay; // for extracting current world space position
 
   void main() {
   // get current world space position:
      vec3 current = vViewRay * texture(uTexLinearDepth, vTexcoord).r;
      current = (uInverseModelViewMat * vec4(current, 1.0)).xyz;
 
   // get previous screen space position:
      vec4 previous = uPrevModelViewProj * vec4(current, 1.0);
      previous.xyz /= previous.w;
      previous.xy = previous.xy * 0.5 + 0.5;

      vec2 blurVec = previous.xy - vTexcoord;
   }

Using the blur vector

So what do we do with this blur vector? We might try stepping n samples along the vector, starting at previous.xy and ending at vTexcoord. However this produces ugly discontinuities in the effect at velocity boundaries.

To fix this we can center the blur vector on vTexcoord, thereby blurring across these velocity boundaries.

Here's the rest of the fragment program (uTexInput is the texture we're blurring):
// perform blur:
   const int nSamples = 8; // sample count (could also be supplied as a uniform)
   vec4 result = texture(uTexInput, vTexcoord);
   for (int i = 1; i < nSamples; ++i) {
   // get offset along blurVec, in roughly [-0.5, 0.5] around vTexcoord:
      vec2 offset = blurVec * (float(i) / float(nSamples - 1) - 0.5);

   // sample & add to result:
      result += texture(uTexInput, vTexcoord + offset);
   }

   result /= float(nSamples);

A sly problem

There is a potential issue around framerate: if it is very high our blur will be barely visible as the amount of motion between frames will be small, hence blurVec will be short. If the framerate is very low our blur will be exaggerated, as the amount of motion between frames will be high, hence blurVec will be long.

While this is physically realistic (higher fps = shorter exposure, lower fps = longer exposure) it might not be aesthetically desirable. This is especially true for variable-framerate games which need to maintain playability as the framerate drops without the entire image becoming a smear. At the other end of the scale, for displays with high refresh rates (or with vsync disabled) the blur lengths end up being so short that the result is pretty much unnoticeable. What we want in these situations is for each frame to look as though it was rendered at a particular framerate (which we'll call the 'target framerate') regardless of the actual framerate.

The solution is to scale blurVec according to the current actual fps; if the framerate goes up we increase the blur length, if it goes down we decrease the blur length. When I say "goes up" or "goes down" I mean "changes relative to the target framerate." This scale factor is easily calculated:

   mblurScale = currentFps / targetFps

So if our target fps is 60 but the actual fps is 30, we halve our blur length. Remember that this is not physically realistic - we're fiddling the result in order to compensate for a variable framerate.
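
As a sketch, the CPU-side calculation might look like this (the clamp is my own addition to guard against frame-time spikes; the result is uploaded as a uniform and multiplied into blurVec in the fragment shader):

   #include <algorithm> // std::clamp (C++17)

   // Returns the factor to multiply blurVec by in the shader.
   float computeBlurScale(float frameDeltaSeconds, float targetFps = 60.0f)
   {
      float currentFps = 1.0f / frameDeltaSeconds;

   // < 1 shortens an over-long blur at low fps, > 1 lengthens it at high fps;
   // the clamp (an assumption, not from the original) limits extreme scales:
      return std::clamp(currentFps / targetFps, 0.25f, 4.0f);
   }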

Optimization

The simplest way to improve the performance of this method is to reduce the number of blur samples. I've found it looks okay down to about 8 samples, below which 'banding' artifacts start to become apparent.

As I hinted before, computing the blur vector can be streamlined. Notice that, in the first part of the fragment shader, we did two matrix multiplications:
// get current world space position:
   vec3 current = vViewRay * texture(uTexLinearDepth, vTexcoord).r;
   current = (uInverseModelViewMat * vec4(current, 1.0)).xyz;
 
// get previous screen space position:
   vec4 previous = uPrevModelViewProj * vec4(current, 1.0);
   previous.xyz /= previous.w;
   previous.xy = previous.xy * 0.5 + 0.5;
These can be combined into a single transformation by constructing a current-to-previous matrix:

mat4 currentToPrevious = uPrevModelViewProj * uInverseModelViewMat

If we do this on the CPU we only have to do a single matrix multiplication per fragment in the shader. Also, this reduces the amount of data we upload to the GPU (always a good thing). The relevant part of the fragment program now looks like this:
   vec3 current = vViewRay * texture(uTexLinearDepth, vTexcoord).r;
   vec4 previous = uCurrentToPreviousMat * vec4(current, 1.0);
   previous.xyz /= previous.w;
   previous.xy = previous.xy * 0.5 + 0.5;
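
For completeness, here's a sketch of the corresponding CPU-side code, again assuming GLM and the hypothetical setUniform helper from the earlier sketch:

   #include <glm/glm.hpp>

   void setUniform(const char* name, const glm::mat4& value); // hypothetical helper, as before

   void uploadCurrentToPrevious(const glm::mat4& currentModelView,
                                const glm::mat4& prevModelViewProj)
   {
   // one matrix-matrix multiply per frame on the CPU, instead of two
   // matrix-vector multiplies per fragment on the GPU:
      glm::mat4 currentToPrevious = prevModelViewProj * glm::inverse(currentModelView);
      setUniform("uCurrentToPreviousMat", currentToPrevious);
   }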

Conclusion

Even this limited form of motion blur makes a big improvement to the appearance of a rendered scene; moving around looks generally smoother and more realistic. At lower framerates (~30fps) the effect produces a filmic appearance, hiding some of the temporal aliasing that makes rendering (and stop-motion animation) 'look fake'.

If that wasn't enough, head over to the object motion blur tutorial (http://john-chapman-graphics.blogspot.com/2013/01/per-object-motion-blur.html), otherwise have some links:

"Stupid OpenGL Shader Tricks" Simon Green, NVIDIA

"Motion Blur as a Post Processing Effect" Gilberto Rosado, GPU Gems 3

Dinoooossaaaaaaurs!
