Image Processing 4

Transcript
Page 1: Image Processing 4

Histogram Processing

IT 472: Digital Image Processing, Lecture 7

Page 2: Image Processing 4

Histogram

The k-th histogram entry for an M × N digital image f is

$$h(r_k) = \frac{1}{MN}\sum_{i=1}^{M}\sum_{j=1}^{N} \chi_{r_k}(f[i,j]), \qquad 0 \le r_k \le L-1,$$

where

$$\chi_{r_k}(f[i,j]) = \begin{cases} 1 & \text{if } f[i,j] = r_k,\\ 0 & \text{otherwise.} \end{cases}$$

Estimates the probability distribution of gray values in an image.

Gives a good idea of contrast in an image.
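A minimal NumPy sketch of this definition (`normalized_histogram` is an illustrative name, and `L = 256` assumes 8-bit images):

```python
import numpy as np

def normalized_histogram(f: np.ndarray, L: int = 256) -> np.ndarray:
    """h[k] = (1/MN) * #{(i, j) : f[i, j] = r_k}."""
    M, N = f.shape
    counts = np.bincount(f.ravel(), minlength=L)  # occurrences of each gray value
    return counts / (M * N)                       # entries sum to 1

# Tiny 2x2 example with gray values {0, 0, 1, 3} and L = 4 levels
img = np.array([[0, 0], [1, 3]], dtype=np.uint8)
h = normalized_histogram(img, L=4)  # h == [0.5, 0.25, 0.0, 0.25]
```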

IT472: Lecture 7 2/17

Page 5: Image Processing 4

Example of histograms

Page 6: Image Processing 4

Histogram Processing

Conclusion: A high-contrast image will tend to have a wide range of gray values with a uniform distribution.

Given an input image with gray values treated as a random variable r with pdf p_R(r), can we design a transformation T such that the random variable

s = T(r)

is distributed uniformly, i.e. p_S(s) = a, ∀s?

The answer is (almost) always YES, and the algorithm to do so is called Histogram Equalization.

Page 9: Image Processing 4

Histogram equalization

Assume continuous random variables 0 ≤ r, s ≤ 1. Let's search for a good T:

- bijective,
- monotonically increasing.

Crucial observation: Given that s_0 = T(r_0), how are F_S(s_0) = P(s ≤ s_0) and F_R(r_0) = P(r ≤ r_0) related?

For all s = T(r), F_S(s) = F_R(r), since T is bijective and monotonically increasing:

$$F_S(s_0 = T(r_0)) = \int_0^{s_0} p_S(s)\, ds = F_R(r_0) = \int_0^{r_0} p_R(r)\, dr$$

What should p_S(s) be?

s is uniformly distributed ⇒ p_S(s) = 1, ∀s.

Page 17: Image Processing 4

Histogram Equalization

$$\therefore\ F_S(s_0) = \int_0^{s_0} 1\, ds = s_0 = F_R(r_0) = \int_0^{r_0} p_R(r)\, dr$$

Histogram Equalization transformation

$$s_0 = F_R(r_0) = \int_0^{r_0} p_R(r)\, dr$$

In general, if s = T(r), where T is bijective, monotonically increasing, and differentiable, then p_S(s) = p_R(r) · dr/ds.

$$\frac{ds}{dr} = \frac{dT(r)}{dr} = \frac{d}{dr}\int_0^{r} p_R(\bar r)\, d\bar r = p_R(r) \quad \text{(Leibniz rule)}$$

This gives p_S(s) = p_R(r) · 1/p_R(r) = 1 ⇒ s is uniformly distributed.
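A quick numerical check of this result, using the illustrative density p_R(r) = 2r on [0, 1], so that F_R(r) = r² and s = T(r) = r²:

```python
import numpy as np

rng = np.random.default_rng(0)
u = rng.random(100_000)
r = np.sqrt(u)        # inverse-CDF sampling: r has density p_R(r) = 2r on [0, 1]
s = r ** 2            # s = T(r) = F_R(r)

# The density of s should be flat: each bin of a density histogram is close to 1
hist, _ = np.histogram(s, bins=10, range=(0.0, 1.0), density=True)
```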

Page 22: Image Processing 4

Histogram equalization

For discrete gray values, p_R(r_k) = n_{r_k}/MN, 0 ≤ r_k ≤ L − 1 (255 for 8-bit images).

Histogram equalization:

$$s_k = T(r_k) = (L-1)\sum_{j=0}^{k} p_R(r_j)$$
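A minimal NumPy sketch of this discrete mapping (`equalize` is an illustrative name; assumes an integer-valued grayscale array):

```python
import numpy as np

def equalize(f: np.ndarray, L: int = 256) -> np.ndarray:
    """Map gray value r_k to s_k = round((L-1) * sum_{j<=k} p_R(r_j))."""
    p = np.bincount(f.ravel(), minlength=L) / f.size       # p_R(r_k)
    T = np.round((L - 1) * np.cumsum(p)).astype(f.dtype)   # s_k for each r_k
    return T[f]                                            # remap every pixel

img = np.array([[0, 0], [1, 3]], dtype=np.uint8)
out = equalize(img, L=4)  # out == [[2, 2], [2, 3]]
```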

Page 24: Image Processing 4

Examples

Page 25: Image Processing 4

Examples

Page 26: Image Processing 4

Histogram Equalization

What happens if we apply histogram equalization twice to an image?

Histogram equalization is idempotent!
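This can be checked numerically: re-equalizing an already-equalized image changes nothing, because the cumulative histogram of the output reproduces the same mapping. A sketch using the discrete transformation s_k = (L−1) Σ_{j≤k} p_R(r_j) from the previous slide (`equalize` is an illustrative name):

```python
import numpy as np

def equalize(f: np.ndarray, L: int = 256) -> np.ndarray:
    p = np.bincount(f.ravel(), minlength=L) / f.size       # p_R(r_k)
    T = np.round((L - 1) * np.cumsum(p)).astype(f.dtype)   # equalization map
    return T[f]

rng = np.random.default_rng(1)
img = rng.integers(0, 64, size=(32, 32), dtype=np.uint8)   # low-contrast image
once = equalize(img)
twice = equalize(once)
same = np.array_equal(once, twice)  # True: the second pass is a no-op
```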

Page 28: Image Processing 4

Histogram Equalization issues

Page 29: Image Processing 4

Histogram Equalization issues

Page 30: Image Processing 4

Histogram Specification

Page 31: Image Processing 4

Histogram Specification

Can we build a transformation s = T(r), where r follows the density function p_R(r), such that s follows a particular specified density function p_S(s)?

Idea: Assume continuous random variables between 0 and 1.

- Using histogram equalization, we can compute z̄ = T_1(r) such that p_Z̄(z̄) = 1.
- Similarly, we can compute z̃ = T_2(s) such that p_Z̃(z̃) = 1.
- What is the density function of the random variable z = T_2^{−1}(T_1(r))?
- p_Z(z) = p_S(s)

Histogram Specification

T = T_2^{−1} ∘ T_1 achieves the specified density function.

Page 38: Image Processing 4

Histogram Specification: Digital Images

From histogram equalization:

$$s_k = T_1(r_k) = (L-1)\sum_{j=0}^{k} p_R(r_j)$$

Histogram equalization on the specified histogram:

$$T_2(z_p) = (L-1)\sum_{i=0}^{p} p_Z(z_i)$$

You may need to round off non-integer values.

Computing z_p = T_2^{−1}(T_1(r_k)) for all r_k may not be feasible when working with digital images, since the discrete T_2 is generally not invertible.

Instead, for every s_k, find the z_p for which |T_2(z_p) − s_k| is minimized.

There may not be a unique minimizer; in that case, use the smallest such z_p.
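A minimal NumPy sketch of this matching procedure (`match_histogram` and `p_Z` are illustrative names; `argmin` returns the first, i.e. smallest, z_p on ties):

```python
import numpy as np

def match_histogram(f: np.ndarray, p_Z: np.ndarray, L: int = 256) -> np.ndarray:
    """Remap f so its histogram approximates the specified p_Z."""
    p_R = np.bincount(f.ravel(), minlength=L) / f.size
    s = np.round((L - 1) * np.cumsum(p_R))    # s_k = T_1(r_k)
    T2 = np.round((L - 1) * np.cumsum(p_Z))   # T_2(z_p) on the specified histogram
    # For each s_k, pick the z_p minimizing |T_2(z_p) - s_k|;
    # argmin takes the first (smallest) z_p when there are ties
    z = np.abs(T2[None, :] - s[:, None]).argmin(axis=1)
    return z.astype(f.dtype)[f]

img = np.array([[0, 0], [1, 3]], dtype=np.uint8)
flat = np.full(4, 0.25)                  # specify a flat histogram, L = 4
out = match_histogram(img, flat, L=4)    # out == [[1, 1], [1, 3]]
```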

Page 44: Image Processing 4

Histogram Specification example

Page 45: Image Processing 4

Histogram Specification example
