Hello and welcome back to our Image and Video Processing class. We are going to continue with image restoration. So far we have dealt a lot with the noise, so let us now work on the filtering component.

Let us remember the complete degradation model we are talking about: the image is filtered and then noise is added. So far we have considered that there was no filtering, just noise. In general, as we have seen before, the observed image is g(x,y) = f(x,y) * h(x,y) + n(x,y), where * denotes convolution and the noise n is, as we discussed before, a random variable that is added to the image. We have seen how to try to estimate the noise; now let us discuss how to try to estimate this blurring function, as h is generally called. This is much harder than estimating the noise.

But before we discuss that, why is this important? Let me illustrate it with a very, very simple formula, and for that let us assume for a second that there is no noise. We take Fourier transforms of both sides, and in Fourier we get G(u,v) = F(u,v) H(u,v). For those who are familiar with the Fourier transform this is very clear; for those who are not, just bear with me for a moment: in Fourier, the convolution becomes a product. So we have transformed our image into a different domain where the convolution became a product. Now look at what we have. We want to estimate the original image, so we divide: F(u,v) = G(u,v) / H(u,v), the observed, degraded image divided by the filter. So if we manage to estimate the filter, everything looks very easy. We take the Fourier transform of the observed image, we take the Fourier transform of the estimated filter, we divide one by the other, and we get the original image, or rather its Fourier transform; we invert it and we get the image back. This is what is called inverse filtering. It has some fundamental problems, in particular because there is no guarantee that H(u,v) is always different from zero. Actually, for realistic scenarios, it has some zeros, or is very close to zero, for some values of u and v. But the general idea is very, very good: if we have knowledge about this degradation function, then we can try to invert it, and that is very cool. That is why we want to try to estimate it. As I said, estimating it is very, very hard, much harder than estimating the noise.
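To make the inverse-filtering idea concrete, here is a minimal sketch in Python with NumPy. The image, the Gaussian-shaped blur, the noise level, and the small epsilon guard are all illustrative assumptions, not values from the lecture; the point is only that dividing G by H recovers the image when there is no noise, and breaks down as soon as noise is present where |H(u,v)| is small.

```python
# Minimal sketch of inverse filtering (assumes NumPy; all values are illustrative).
import numpy as np

rng = np.random.default_rng(0)
f = rng.random((64, 64))                # hypothetical "original" image

# Hypothetical Gaussian-like blur kernel on the same grid as the image.
x = np.arange(64) - 32
X, Y = np.meshgrid(x, x)
h = np.exp(-(X**2 + Y**2) / (2 * 1.0**2))
h /= h.sum()
h = np.fft.ifftshift(h)                 # center the kernel at the origin for FFT convolution

# Degradation without noise: convolution becomes a product in Fourier, G = F * H.
F = np.fft.fft2(f)
H = np.fft.fft2(h)
G = F * H

# Naive inverse filtering: divide the observation by the filter.
eps = 1e-12                             # guard against division by (near) zero
f_hat = np.real(np.fft.ifft2(G / (H + eps)))
print("max reconstruction error (no noise):", np.abs(f_hat - f).max())

# With even a little noise, the division amplifies it wherever |H| is small.
g_noisy = np.real(np.fft.ifft2(G)) + 0.001 * rng.standard_normal(f.shape)
f_hat_noisy = np.real(np.fft.ifft2(np.fft.fft2(g_noisy) / (H + eps)))
print("max reconstruction error (tiny noise):", np.abs(f_hat_noisy - f).max())
```

The first reconstruction is essentially exact; the second is badly corrupted, because at the frequencies where H is nearly zero the division blows the noise up. That is exactly the fundamental problem of inverse filtering mentioned above.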
So let us see some examples where we can actually estimate it, and also what types of h functions we encounter in reality. One thing that happens, as I mentioned earlier this week, is what is called blurring. Blurring means that I observe an image that is the result of my original image convolved, let's say, with a Gaussian function with zero mean and a certain variance, or standard deviation. We already saw the general form of a Gaussian, an exponential with a quadratic exponent, when we were discussing noise. Basically, it blurs my image.

Now suppose you know that a Gaussian is affecting your observation, so the observation g is the result of f convolved with a Gaussian filter, and you also know that f was actually a dot, a delta function, just a point. In that very trivial case you can actually estimate the blurring, and it is not very hard, because a simple property of convolution says that if f is a delta function, then the delta convolved with h gives back h. So the observation g is the Gaussian itself. This is a very simple case where we can actually estimate our h function, for the particular case of Gaussian filtering.

Now, this might sound a bit ridiculous at first: how do I know that there was a dot, a delta function, in my image? But it is not so ridiculous, because we can use this to calibrate a system. If we know that our camera is blurring, then I can artificially present an image that contains a delta function and estimate my blurring from the observation. So this is very useful for calibration, for estimating how the camera is behaving. Very important.

Now I am going to tell you something else that further illustrates why this is a hard function to estimate. Convolution is commutative: f convolved with h equals h convolved with f. So you might wonder, which one is my original image? Was my original image a delta, blurred by a Gaussian? Or was my original image a Gaussian, blurred by a delta, which means no blurring at all? There is a basic ambiguity here: we do not know which one is which, and that is yet another challenge when we try to estimate the filtering function. Normally what you do is assume models for the image and models for the filtering, but this is just to illustrate how hard this problem is.

So this is an example where we can estimate h. The Gaussian, as I said, is very important for blurring; it is also a model that is very frequently used for turbulence. Here is an image, courtesy of NASA, which appears in the book we are using so far, showing the same scene under different levels of turbulence, from more turbulence to less turbulence. You can observe the blurring effect here, and very often turbulence is modelled with a Gaussian filter. We discussed that Gaussian noise distributions are very important mathematically although they do not appear very often in real physical scenarios; Gaussian smoothing, Gaussian filtering, on the other hand, is a very realistic model for a lot of things that happen in physical cameras.

The other type of blurring that appears a lot is what is called motion blurring. The basic idea is that pictures are taken by integrating the light that arrives at the sensor, so if the object is moving during the integration time, you get a blurry picture; that is illustrated here, before I write down the formulas. In this case it is basically a simulation where, instead of moving the object, we move the camera. Going back to our g and f functions, the observation g(x,y) is the integral from 0 to T, where T is the integration time of the camera, of f(x - x0(t), y - y0(t)) dt; the functions x0(t) and y0(t) describe how the point is moving in the x and y directions, and I am integrating over t. So what I am doing here is not taking one image; it is as if I had taken multiple images and added them, and what I see is the result.
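Before going on, here is a small sketch, in the same NumPy style as above, of this "sum of shifted images" idea. The image, the number of time steps T, and the purely horizontal, circular shift (np.roll) are simplifying assumptions for illustration only; the sketch also checks the claim made next, that this summing of shifted copies is equivalent to applying a filter.

```python
# Minimal sketch of motion blur as a sum of shifted copies (assumes NumPy;
# image, T, and the motion direction are illustrative choices).
import numpy as np

rng = np.random.default_rng(0)
f = rng.random((64, 64))            # hypothetical sharp image

T = 8                               # number of "time samples" during the exposure
g = np.zeros_like(f)
for t in range(T):
    # At time t the scene has shifted by t pixels horizontally.
    # np.roll gives a circular shift, used here only for simplicity.
    g += np.roll(f, shift=t, axis=1)
g /= T                              # normalize the integration

# The same degradation written as a filter: h is a horizontal line of length T.
h = np.zeros_like(f)
h[0, :T] = 1.0 / T
same = np.allclose(np.fft.fft2(g), np.fft.fft2(f) * np.fft.fft2(h))
print("sum of shifts equals convolution with a line filter:", same)
```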
This was actually simulated by taking the image, shifting it to the right just a tiny bit, shifting it again, shifting it again, and then adding. That is what this operation is doing: adding shifted copies of the image, and that is how we get the observation. Now, if we do the math, take the Fourier transform on both sides, and use properties of the Fourier transform, we can also write this in the form we have seen before, G = H F. So this translation is actually a filter as well. I am not going to derive it here, but you are more than welcome to do it as an exercise; it is a very simple exercise if you are familiar with basic properties of the Fourier transform, in particular translation (a worked sketch of this exercise appears at the end of this transcript). The resulting filter will, of course, depend on the velocities: you can imagine that if the object is not moving at all there is no blurring, and if the object is moving a lot there is much more blurring.

And once again, just to show how hard this is, the question will be: how do I estimate from this both my image and my blurring? Certainly not a trivial thing to do, even if I knew that the observation is the result of motion blurring. The concept is that, from the sum of multiple images, you have to estimate one, and that is certainly not a trivial thing. If I give you the number five and tell you it is the sum of two numbers, there is no way for you to know which two numbers I added unless I give you more information. We are going to discuss this more when we get, in a few weeks, to more advanced topics in image restoration; when we come back to this topic you will see different tools, like sparse modeling, for attacking it. This is just to illustrate that it is a hard problem, and an extremely important one. Simple models like Gaussian blurring and motion blur are very, very important, and deblurring, the operation of inverting them, is what we need in order to go back to our original, very sharp image.

What we are going to do next is see one way of doing that, called Wiener filtering. It is one way of inverting the filter while dealing with the noise, and we are going to see it in the next video. See you soon. Thank you.
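For completeness, here is the exercise mentioned above, sketched under the common textbook assumption of uniform linear motion, x0(t) = a t / T and y0(t) = b t / T; the symbols a and b (the total displacements during the exposure) are introduced here for illustration, not taken from the lecture. Taking the Fourier transform of the motion-blur integral and using the translation property,

$$
G(u,v) = \int_0^T \left[ \iint f(x - x_0(t),\, y - y_0(t))\, e^{-j2\pi(ux+vy)}\, dx\, dy \right] dt
       = F(u,v) \int_0^T e^{-j2\pi\left(u\, x_0(t) + v\, y_0(t)\right)}\, dt ,
$$

so the motion acts as the filter

$$
H(u,v) = \int_0^T e^{-j2\pi\left(u\, x_0(t) + v\, y_0(t)\right)}\, dt .
$$

For uniform motion this evaluates to

$$
H(u,v) = \int_0^T e^{-j2\pi(ua + vb)\,t/T}\, dt
       = \frac{T}{\pi(ua + vb)}\, \sin\!\big(\pi(ua + vb)\big)\, e^{-j\pi(ua + vb)} ,
$$

which indeed depends on the velocities a/T and b/T. Note that this H(u,v) is zero whenever ua + vb is a nonzero integer, exactly the kind of zeros that make naive inverse filtering problematic, as discussed at the beginning of this lecture.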