Histogram Processing
IT 472: Digital Image Processing, Lecture 7
Histogram

The i-th histogram entry for an M x N digital image f is

    h(r_i) = \frac{1}{MN} \sum_{m=1}^{M} \sum_{n=1}^{N} \chi_{r_i}(f[m, n]), \quad 0 \le r_i \le L - 1,

where

    \chi_{r_i}(f[m, n]) = 1 if f[m, n] = r_i, and 0 otherwise.

Estimates the probability distribution function of gray values in an image.

Gives a good idea of contrast in an image.

IT472: Lecture 7 2/17
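As a concrete illustration (not part of the slides), the normalized histogram above is a few lines of NumPy; `histogram` is a helper name chosen here:

```python
import numpy as np

def histogram(img, L=256):
    """Normalized histogram h(r_i) = n_{r_i} / (M*N) of an integer image."""
    counts = np.bincount(img.ravel(), minlength=L)  # n_{r_i} for each gray level
    return counts / img.size                        # divide by M*N

# A 2x2 image with gray levels {0, 1, 1, 3} and L = 4:
img = np.array([[0, 1], [1, 3]], dtype=np.uint8)
print(histogram(img, L=4).tolist())  # [0.25, 0.5, 0.0, 0.25]
```

The entries sum to 1, which is what lets the histogram be read as an estimate of the gray-value distribution.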
Example of histograms
Histogram Processing

Conclusion: A high-contrast image will tend to have a wide range of gray values with a roughly uniform distribution.

Given an input image with gray values treated as a random variable r with pdf p_R(r), can we design a transformation T such that the random variable

    s = T(r)

is distributed uniformly, i.e., p_S(s) = a for all s?

The answer is (almost) always YES, and the algorithm to do so is called Histogram Equalization.
Histogram equalization

Assume continuous random variables 0 \le r, s \le 1. Let's search for a good T that is:

- bijective,
- monotonically increasing.

Crucial observation: Given that s_0 = T(r_0), how are F_S(s_0) = P(s \le s_0) and F_R(r_0) = P(r \le r_0) related?

For all s = T(r), F_S(s) = F_R(r), since T is bijective and monotonically increasing. In particular,

    F_S(s_0 = T(r_0)) = \int_0^{s_0} p_S(s)\, ds = F_R(r_0) = \int_0^{r_0} p_R(r)\, dr.

What should p_S(s) be? For s to be uniformly distributed on [0, 1], we need p_S(s) = 1 for all s.
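A short worked example (added for illustration, not from the slides): take p_R(r) = 2r on [0, 1] and verify that the CDF transformation equalizes it.

```latex
p_R(r) = 2r, \quad 0 \le r \le 1,
\qquad
s = T(r) = \int_0^{r} 2\bar{r}\, d\bar{r} = r^2,
\qquad
\frac{ds}{dr} = 2r,
\qquad
p_S(s) = p_R(r)\,\frac{dr}{ds} = \frac{2r}{2r} = 1.
```

T(r) = r^2 is bijective and increasing on [0, 1], and the resulting s is uniform, as required.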
Histogram Equalization

Therefore,

    F_S(s_0) = \int_0^{s_0} 1 \, ds = s_0 = F_R(r_0) = \int_0^{r_0} p_R(r)\, dr.

Histogram Equalization transformation:

    s_0 = T(r_0) = F_R(r_0) = \int_0^{r_0} p_R(r)\, dr.

In general, if s = T(r), where T is bijective, monotonically increasing, and differentiable, then p_S(s) = p_R(r) \frac{dr}{ds}. Here,

    \frac{ds}{dr} = \frac{dT(r)}{dr} = \frac{d}{dr} \int_0^{r} p_R(\bar{r})\, d\bar{r} = p_R(r)   (by the Leibniz rule).

This gives p_S(s) = p_R(r) \cdot \frac{1}{p_R(r)} = 1, so s is uniformly distributed.
Histogram equalization

For discrete gray values, p_R(r_k) = \frac{n_{r_k}}{MN}, \; 0 \le r_k \le L - 1, where n_{r_k} is the number of pixels with gray value r_k (L = 256 for 8-bit images).

Histogram equalization: s_k = T(r_k) = (L - 1) \sum_{j=0}^{k} p_R(r_j).
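A minimal NumPy sketch of this discrete mapping (illustrative; `equalize` is a name chosen here):

```python
import numpy as np

def equalize(img, L=256):
    """Histogram equalization: s_k = (L-1) * sum_{j<=k} p_R(r_j), rounded."""
    # Normalized histogram p_R(r_k) = n_{r_k} / (M*N).
    hist = np.bincount(img.ravel(), minlength=L) / img.size
    # The cumulative sum is the discrete CDF; scale and round to gray levels.
    T = np.round((L - 1) * np.cumsum(hist)).astype(img.dtype)
    return T[img]  # apply the lookup table pixel-wise

# A low-contrast ramp confined to [100, 155] spreads out over [0, 255]:
img = np.tile(np.arange(100, 156, dtype=np.uint8), (64, 1))
out = equalize(img)
print(img.min(), img.max(), "->", out.min(), out.max())
```

The whole transformation is a 256-entry lookup table, so it costs one pass over the histogram plus one indexing operation per pixel.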
Examples
Histogram Equalization

What happens if we apply histogram equalization twice to an image?

Histogram equalization is idempotent! The first pass already equalizes the cumulative histogram, so the second pass maps every gray level to itself.
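This can be checked numerically; the sketch below (with the discrete equalization map redefined so the snippet stands alone) applies it twice and compares:

```python
import numpy as np

def equalize(img, L=256):
    # Discrete histogram equalization: s_k = round((L-1) * CDF(r_k)).
    hist = np.bincount(img.ravel(), minlength=L) / img.size
    T = np.round((L - 1) * np.cumsum(hist)).astype(img.dtype)
    return T[img]

rng = np.random.default_rng(0)
img = rng.integers(80, 140, size=(64, 64), dtype=np.uint8)  # low-contrast input
once = equalize(img)
twice = equalize(once)
print(np.array_equal(once, twice))  # True: equalizing again changes nothing
```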
Histogram Equalization issues
Histogram Specification

Can we build a transformation s = T(r), where r follows the density function p_R(r), such that s follows a particular specified density function p_S(s)?

Idea: Assume continuous random variables between 0 and 1.

- Using histogram equalization, we can compute \bar{z} = T_1(r) such that p_{\bar{Z}}(\bar{z}) = 1.
- Similarly, we can compute \tilde{z} = T_2(s) such that p_{\tilde{Z}}(\tilde{z}) = 1.
- What is the density function of the random variable z = T_2^{-1}(T_1(r))?
- p_Z(z) = p_S(s), since T_1(r) is uniform and T_2^{-1} carries the uniform distribution to the specified one.

Histogram Specification: T = T_2^{-1} \circ T_1 achieves the specified density function.
Histogram Specification: Digital Images

From histogram equalization: s_k = T_1(r_k) = (L - 1) \sum_{j=0}^{k} p_R(r_j).

Histogram equalization on the specified histogram: T_2(z_p) = (L - 1) \sum_{i=0}^{p} p_Z(z_i).

You may need to round off non-integer values.

Computing z_p = T_2^{-1}(T_1(r_k)) for all r_k may not be feasible when working with digital images, since the rounded T_2 is generally not invertible.

Instead, for every s_k, find z_p such that |T_2(z_p) - s_k| is minimized.

There may not be a unique minimizer. In that case, use the smallest such z_p.
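The whole discrete procedure can be sketched in NumPy (an illustrative implementation; `match_histogram` is a name chosen here):

```python
import numpy as np

def match_histogram(img, target_hist, L=256):
    """Map img so its histogram approximates target_hist (a length-L pmf)."""
    # T1: equalization map of the input, s_k = round((L-1) * CDF_R(r_k)).
    src_hist = np.bincount(img.ravel(), minlength=L) / img.size
    s = np.round((L - 1) * np.cumsum(src_hist))
    # T2: equalization map of the specified histogram.
    T2 = np.round((L - 1) * np.cumsum(target_hist))
    # For each s_k, pick the z_p minimizing |T2(z_p) - s_k|;
    # argmin returns the first index on ties, i.e. the smallest z_p.
    mapping = np.array([int(np.abs(T2 - sk).argmin()) for sk in s],
                       dtype=img.dtype)
    return mapping[img]

# Push a full gray-level ramp toward a two-level target histogram
# (all probability mass at levels 64 and 192):
img = np.arange(256, dtype=np.uint8).reshape(16, 16)
target = np.zeros(256)
target[64], target[192] = 0.5, 0.5
out = match_histogram(img, target)
print(np.unique(out))  # the only output levels are 0, 64 and 192
```

Level 0 survives because the darkest equalized values are closest to T_2(z_p) = 0, and ties are resolved toward the smallest z_p, exactly as described above.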
Histogram Specification example