
Int. J. Sensor Networks, Vol. x, No. x, xxxx

Minimum (k, ω)-angle barrier coverage in wireless camera sensor networks

Biaofei Xu
School of Information, Renmin University of China, Beijing 100872, China
Email: [email protected]

Yuqing Zhu
Department of Computer Science, California State University, Los Angeles, CA 90032, USA
Email: [email protected]

Deying Li*
School of Information, Renmin University of China, Beijing 100872, China
Email: [email protected]
*Corresponding author

Donghyun Kim
Department of Mathematics and Physics, North Carolina Central University, Durham, NC 27707, USA
Email: [email protected]

Weili Wu
Department of Computer Science, University of Texas at Dallas, Richardson, TX 75080, USA
Email: [email protected]

Abstract: Barrier coverage is an important issue in wireless sensor networks; it guarantees the detection of any intruder attempting to cross a barrier or penetrate a protected region monitored by sensors. However, the barrier coverage problem in wireless camera sensor networks (WCSNs) is different from that in scalar sensor networks. In this paper, based on (k, ω)-angle coverage, we study the minimum (k, ω)-angle barrier coverage problem in WCSNs. We first present a technique to deploy the minimum number of camera sensors to form a (k, ω)-angle barrier. We then propose a novel method to select the minimum number of camera sensors from an arbitrary deployment to form a (k, ω)-angle barrier. Through our simulations, we confirm that our algorithms reduce the number of sensors required compared to the state-of-the-art algorithm.

Keywords: camera sensors; data collection; barrier construction; (k, ω)-angle coverage.

Reference to this paper should be made as follows: Xu, B., Zhu, Y., Li, D., Kim, D. and Wu, W. (xxxx) 'Minimum (k, ω)-angle barrier coverage in wireless camera sensor networks', Int. J. Sensor Networks, Vol. x, No. x, pp.xxx–xxx.

Biographical notes: Biaofei Xu received his Master degree in Computer Science from the School of Information, Renmin University of China, in 2014. He received his BS from the Department of Computer Science and Technology, Northeastern University, China, in 2011. His research interests include wireless networks, mobile computing, and algorithm design and analysis.

Copyright © 20xx Inderscience Enterprises Ltd.


Yuqing Zhu is an Assistant Professor at California State University, Los Angeles. He received his PhD in Computer Science from the University of Texas at Dallas in 2014. He received his Master's degree from the Institute of Computing Technology, Chinese Academy of Sciences, and his Bachelor's degree from Renmin University of China, both in Computer Science. His research interests include optimisation in big data processing, social networks, distributed systems, and data management in wireless networks.

Deying Li received her MS in Mathematics from Huazhong Normal University in 1988 and her PhD in Computer Science from the City University of Hong Kong in 2004. She is currently a Professor in the Department of Computer Science, Renmin University of China. Her research includes wireless networks, mobile computing, and algorithm design and analysis.

Donghyun Kim received his BS in Electronic and Computer Engineering from Hanyang University, Ansan, Korea (2003), and his MS in Computer Science and Engineering from Hanyang University, Korea (2005). He received his PhD in Computer Science from the University of Texas at Dallas, Richardson, USA (2010). Currently, he is an Assistant Professor in the Department of Mathematics and Physics at North Carolina Central University, Durham, USA. His research interests include wireless networks, mobile computing, and approximation algorithm design and analysis.

Weili Wu received her PhD in Computer Science from the University of Minnesota and is now a Full Professor at the University of Texas at Dallas. Her focus is the design and analysis of algorithms for optimisation problems that occur in wireless networking environments and various database systems.

1 Introduction

Camera sensors are deployed to continuously monitor some targets or areas; usually each sensor monitors some objects independently (Wang et al., 2011; Tseng et al., 2012; Zhang et al., 2009; Chang et al., 2012; Wang and Cao, 2011b; Ma et al., 2012b). In order to obtain more information when monitoring an object, observing it from multiple angles so as to clearly capture its behaviour is a good method. Hence, the k-angle object coverage problem was studied in Tseng et al. (2012), in which camera sensors are deployed to monitor objects cooperatively and each object is observed from multiple angles by k camera sensors. Figure 1 shows two ways to monitor an object from three angles. Scenario (a) is more favourable because we can extract more complete features of the object from different directions. In contrast, in (b) the sensors are more likely to provide duplicated observations.

Intrusion detection and border surveillance (e.g., country border protection, battlefield surveillance, critical resource protection, airport cordons) are some of the major applications of sensor monitoring. In these scenarios, monitoring the full public area with camera sensors is wasteful and unnecessary; constructing a sensor barrier is an efficient alternative. In wireless camera sensor networks (WCSNs), a barrier is composed of a chain of sensors across the deployed region with the sensing areas of adjacent sensors overlapping each other. Any intruder can be detected by at least one camera sensor in the barrier when the intruder crosses it. Recently, the barrier coverage problems for WCSNs have been intensively investigated (Zhang et al., 2009; Chang et al., 2012; Wang and Cao, 2011b; Ma et al., 2012b). In Zhang et al. (2009), any intruder who crosses the barrier can be observed, but only by one camera sensor. Chang et al. (2012) developed a decentralised algorithm to cope with the k-barrier coverage problem in WCSNs: an intruder is identified by k different camera sensors when he or she crosses the barrier, but it is not guaranteed that the intruder is covered by the k camera sensors from multiple angles, so only a small part of the information about the intruder can be obtained. In order to obtain more information about intruders, especially their face identification, the full-view coverage model was introduced in Wang and Cao (2011b). An object is full-view covered if, no matter which direction the object faces, there is always at least one camera sensor whose viewing direction is sufficiently close to the object's facing direction to cover it. Obviously, we can get more information about the intruder in the full-view model. Ma et al. (2012b) studied the minimum camera barrier coverage problem in WCSNs when the camera sensors are deployed randomly in a target field. To some extent their algorithm reduces the number of camera sensors, but a good many camera sensors are still needed to construct a full-view barrier. In this paper we want to obtain the maximum monitoring performance with the fewest visual sensors. We seek a solution that reduces the number of camera sensors while maintaining the quality of the collected information, and this goal prompts us to come up with a new kind of barrier coverage named the (k, ω)-angle barrier (the rigorous definition is given in Section 3), which balances the number of camera sensors used against the information retrieved by them.

In this paper we assume that each sensor can only monitor a limited angle, based on observations from real life. Hence, to clearly monitor an object, we enforce that the object must be simultaneously monitored by at least k sensors and, in order to make sure that different sensors monitor different parts of the object, the angle between two different sensors' directions (e.g., ∠s1ojs2 in Figure 1(a)) must be at least ω. If there exists a sensor deployment satisfying these requirements, we say the object is (k, ω)-angle-covered. An effective (k, ω)-angle barrier is a connected zone across the monitored field such that every point in the zone is (k, ω)-angle-covered.


Figure 1 Two ways to monitor an object from three angles (see online version for colours)

Our contributions are as follows.

• We study the minimum (k, ω)-angle barrier coverage (MkABC) problem and, to the best of our knowledge, we are the first to study it. Our model fits real-life applications better and requires fewer sensors to monitor the field with little information loss. We first study MkABC under deterministic deployment; since in practice sensors can also be dropped from aircraft into an inaccessible zone, we then study MkABC under random deployment.

• In the deterministic scenario, we use a geometric method to solve MkABC. Given some camera sensors, we use the properties of regular polygons to deploy a (k, ω)-angle barrier in which each point is (k, π/k)-angle-covered. In the random deployment scenario, the MkABC problem is more difficult, since each camera sensor's viewing direction depends on the geometric relationships and the camera sensors must monitor objects cooperatively. We come up with a solution that uses geometric analysis of each sub-region to judge which part of a sub-region is (k, ω)-angle-covered and which part is not. After that, we seek to use the minimum number of sensors to construct a (k, ω)-angle barrier under this scenario.

• Through our experiments, we not only examine the impacts of different parameters on the probability of successfully constructing a barrier, but also confirm that our algorithms use fewer camera sensors compared to the state-of-the-art algorithms.

The rest of this paper is organised as follows. Related work will be presented in Section 2. Section 3 will provide the network model and the formal definition of MkABC. Section 4 will introduce the techniques to deploy camera sensors to achieve (k, ω)-angle barrier coverage. A novel method will be proposed to select camera sensors from an arbitrary deployment to form a (k, ω)-angle barrier in Section 5. We will present the experimental performance evaluation in Section 6. Finally, we will conclude the paper in Section 7.

2 Related works

Barrier coverage was first proposed in the context of robotic sensors (Gage, 1992). The goal of barrier coverage is to detect intruders with a barrier consisting of sensors, ensuring that any intrusion across the monitored region is detected. Chen et al. (2008) investigated the quality of barrier coverage; their work can identify when the barrier performance drops below a predefined value and where a repair is needed. Based on a probabilistic sensing model, Li et al. (2012) studied the problem of scheduling sensors energy-efficiently while guaranteeing the detection probability of any intrusion across the region. Chen et al. (2007) proposed a localised barrier coverage protocol to detect all intruders whose movements are confined to a slice of the original strip region. In most application scenarios, targets are required to be covered by k sensors; Ma et al. (2012a) studied energy-efficient k-barrier coverage in limited mobile wireless sensor networks.

There has also been much effort on WCSNs. Liu et al. (2009) proposed a dynamic node collaboration scheme for mobile target tracking in WCSNs, which can estimate the belief state of the target location efficiently. In order to obtain metric calibration in camera networks, Devarajan et al. (2006) modelled the set of uncalibrated cameras as nodes in a communication network and proposed a distributed algorithm whose calibration accuracy is comparable to centralised bundle adjustment.

For coverage detection in WCSNs, Johnson and Bar-Noy (2011) proposed an optimal dynamic programming algorithm for a geometrically constrained setting of the pan-and-scan problem, in which cameras are configured to observe multiple target locations; an efficient 2-approximation algorithm was also presented. To deal with the requirement of identifying an intruder's face, Wang and Cao (2011a) first defined the full-view model and proposed a full-view coverage verification method. The authors also presented an estimate of the deployment density needed to achieve full-view coverage of the whole monitored area.

Barrier coverage in WCSNs was first studied in Shih et al. (2010), where a distributed protocol called CoBRA was proposed. Based on the full-view coverage model proposed in Wang and Cao (2011a), the authors further studied the problem of full-view barrier coverage (Wang and Cao, 2011b): they proposed a method to select camera sensors from an arbitrary deployment to form a camera barrier and presented a deployment scheme such that each point of a given line is full-view covered. Ma et al. (2012b) improved the method by minimising the number of camera sensors needed. Full-view coverage requires a large number of camera sensors; the cost of a camera sensor is fairly high, and in many intrusion detection scenarios full-view coverage is not needed, yet we still hope to obtain as much information about the intruders as possible while using as few camera sensors as possible. Thus a new solution is needed. We come up with a new k-angle barrier coverage problem based on the (k, ω)-angle coverage model (Tseng et al., 2012), which is different from the full-view coverage model.

3 Notations and model

In this paper, we regard the monitoring region of the WCSN as a rectangular region R. The size of R is W × L, where W and L are the width and length of the region, respectively. A set of n camera sensors, S = {s1, s2, ..., sn}, is deployed in R. The location of si (i = 1, 2, ..., n) is denoted by (xi, yi). The sensing area of each sensor is a sector centred at the sensor with sensing radius r and sensing angle 2θ. Each si has a direction αi ∈ [0, 2π), which is the angle between the centre line of the camera sensor and the x-axis; si covers the sector area with radius ri between angle αi − θ and angle αi + θ. We denote the distance between a sensor si and an object oj by dis(si, oj), the vector from the location of si to the location of oj by →sioj, and the direction of →sioj by dir(→sioj). Given si's current direction αi, we say that oj is covered by si if dis(si, oj) ≤ ri and −θ ≤ dir(→sioj) − αi ≤ θ. For example, in Figure 2(a), oj is covered by si, but ok is not.
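To make the sensing model concrete, here is a minimal sketch in Python of the coverage test dis(si, oj) ≤ r and |dir(→sioj) − αi| ≤ θ, with the angular difference wrapped into (−π, π]; the Sensor container and the function names are ours, not the paper's.

    import math
    from dataclasses import dataclass

    @dataclass
    class Sensor:
        x: float      # location x_i
        y: float      # location y_i
        r: float      # sensing radius
        theta: float  # half of the sensing angle (the sensing angle is 2*theta)
        alpha: float  # viewing direction alpha_i in [0, 2*pi)

    def angle_diff(a, b):
        """Smallest signed difference a - b, wrapped into (-pi, pi]."""
        d = (a - b) % (2 * math.pi)
        return d - 2 * math.pi if d > math.pi else d

    def is_covered(s, ox, oy):
        """True iff the object at (ox, oy) lies inside s's sensing sector."""
        dx, dy = ox - s.x, oy - s.y
        if math.hypot(dx, dy) > s.r:
            return False
        return abs(angle_diff(math.atan2(dy, dx), s.alpha)) <= s.theta

    # Example: a sensor at the origin facing along the positive x-axis.
    s1 = Sensor(0.0, 0.0, r=40.0, theta=math.pi / 4, alpha=0.0)
    print(is_covered(s1, 20.0, 5.0))   # True: inside the sector
    print(is_covered(s1, 0.0, 30.0))   # False: outside the angular range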

Each camera sensor node si has a unique ID, and it is aware of its own location and the boundary coordinates of R. Furthermore, each si collects the IDs and location information of its neighbouring nodes within one hop through beacon frame exchange.

Figure 2 (a) Camera sensing model and (b) (3, π/2)-angle-covered example

We give some definitions as follows:

Definition 3.1 (barrier coverage): A sensor network is barrier covered if any crossing path is covered.

Definition 3.2 ((k, ω)-angle coverage): Given an integer k and an angle ω, we say an object oj is (k, ω)-angle-covered if there is a sequence of k sensors sx1, sx2, ..., sxk, ordered in the counter-clockwise direction, such that oj is located in the sensing area of every sensor sxi, and ∠sxp oj sxp+1 ≥ ω for p = 1, ..., k (where sxk+1 = sx1). A region is (k, ω)-angle-covered if and only if every point in the region is (k, ω)-angle-covered.

Figure 2(b) shows an example, where oj is (3, π/2)-angle-covered by sensors s1, s2, and s3.
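A minimal sketch of the test in Definition 3.2 for a given set of k sensors, reusing the Sensor objects and is_covered from above and interpreting ∠sxp oj sxp+1 as the counter-clockwise angular gap between consecutive sensors as seen from oj; the function name is ours.

    import math

    def is_k_omega_angle_covered(sensors, ox, oy, omega):
        """Definition 3.2 check for the object at (ox, oy) and the given k sensors."""
        # Every sensor in the sequence must cover the object.
        if not all(is_covered(s, ox, oy) for s in sensors):
            return False
        # Bearings from the object to each sensor, in counter-clockwise order.
        bearings = sorted(math.atan2(s.y - oy, s.x - ox) % (2 * math.pi)
                          for s in sensors)
        k = len(bearings)
        # Consecutive angular gaps, including the wrap-around gap s_{x_k} -> s_{x_1}.
        gaps = [(bearings[(i + 1) % k] - bearings[i]) % (2 * math.pi)
                for i in range(k)]
        return all(g >= omega for g in gaps)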

Definition 3.3 ((k, ω)-angle barrier): For a given bounded area R, with one side being the entrance and the opposite side being the destination, and a given constant angle ω, if there is a connected region B in R which is (k, ω)-angle-covered by a subset T of the camera sensors, such that every crossing path from the entrance side to the destination side must intersect B, then we call B a (k, ω)-angle barrier, and T is called a (k, ω)-angle barrier sensor set.

Definition 3.4 (minimum (k, ω)-angle barrier coverage (MkABC) problem): Given a camera sensor network over the objective region R, the MkABC problem is to find a (k, ω)-angle barrier B which is (k, ω)-angle-covered by a subset T of the camera sensors such that |T| is minimised.

4 (k, ω)-angle barrier deployment

In this section, we use the properties of regular polygons to deploy a (k, π/k)-angle barrier.

Theorem 4.1: Given a regular polygon P with k vertices v1, v2, ..., vk, for any two vertices vi and vj and any point p inside P, ∠vipvj ≥ π/k.

Proof: A regular polygon P with k vertices v1, v2, ..., vk has a circumscribed circle that passes through all k vertices. Any two adjacent vertices vi and vi+1 (vk+1 = v1) separate the circumscribed circle into two arcs, arc vivi+1 and arc vivi+1′. Without loss of generality, assume arc vivi+1 < arc vivi+1′; then the inscribed angle subtending arc vivi+1 is π/k. Hence, for each point p within the circumscribed circle, ∠vipvi+1 ≥ π/k (Figure 3(a)). Since every point p within the regular polygon P is also within its circumscribed circle, we have ∠vipvi+1 ≥ π/k. Therefore, for any two vertices vi and vj, ∠vipvj ≥ π/k.

Based on Theorem 4.1, we design a method to deploy camera sensors to form a (k, π/k)-angle barrier.

First, we discuss how to obtain a regular shape that is (k, π/k)-angle-covered. Given a camera sensor si with sensing range r and sensing angle 2θ (whose admissible value is related to k, as discussed below), we take the inscribed circle I of the sector of si and denote its centre by oi. Rotating the vector →oisi through 2π, we obtain another circle O centred at oi with radius |oisi|. Find k points v1, ..., vk on O that divide this circle into k equal arcs, i.e., arc vivi+1 = arc vjvj+1 for all 1 ≤ i ≠ j ≤ k. We place one camera sensor at each of the k points, i.e., k camera sensors in all. The k deployed camera sensors form a regular k-sided polygon. According to Theorem 4.1, the inscribed circle I is (k, π/k)-angle-covered, as can be seen from Figure 3(b).
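A sketch of this construction (function and variable names are ours): the k sensors are placed evenly on the circle O around the chosen centre oi, each facing the centre. For a sector of radius r and half-angle θ, the apex-to-centre distance |oisi| equals r/(1 + sin θ) and the radius of the inscribed circle I equals r·sin θ/(1 + sin θ).

    import math

    def deploy_regular_polygon(cx, cy, r, theta, k):
        """Place k sensors evenly on the circle O around (cx, cy), each facing the
        centre. Returns a list of (x, y, alpha) tuples and the radius rho of the
        (k, pi/k)-angle-covered circle I."""
        d = r / (1.0 + math.sin(theta))                       # |o_i s_i|
        rho = r * math.sin(theta) / (1.0 + math.sin(theta))   # radius of circle I
        sensors = []
        for j in range(k):
            phi = 2.0 * math.pi * j / k                       # k equal arcs on O
            x = cx + d * math.cos(phi)
            y = cy + d * math.sin(phi)
            alpha = (phi + math.pi) % (2.0 * math.pi)         # face the centre o_i
            sensors.append((x, y, alpha))
        return sensors, rho

    # Example: k = 4, r = 10 m, theta = pi/4 (admissible: theta <= (k-2)*pi/(2k) = pi/4).
    positions, rho = deploy_regular_polygon(0.0, 0.0, r=10.0, theta=math.pi / 4, k=4)
    print(len(positions), round(rho, 3))   # 4 sensors, covered-circle radius ~4.142 m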

Figure 3 (a) An example of a regular triangle, where π/3 = ∠ACB = ∠ADB < ∠AEB = ∠AFB and (b) the shaded region is (4, π/4)-covered (see online version for colours)

Secondly, according to the region's length, from left to right, we deploy camera sensors one by one to obtain regular shapes that are (k, ω)-angle-covered, and we ensure that any two adjacent regular shapes are connected. After that, we obtain a seamless (k, ω)-angle barrier. We have proved that the shaded circle in Figure 3(b) is (4, π/4)-covered according to Theorem 4.1. In Figure 4, we plot a (4, π/4)-angle barrier we constructed, which is composed of many identical shaded circles. Note that in realistic scenarios the barrier is not always a straight line; a barrier of any shape can be deployed just as easily.

Figure 4 An example of a (4, π/4)-angle barrier (see online version for colours)

Next, we explain the relationship between θ and k. The above method constructs a regular shape (a circle) that is (k, π/k)-angle-covered, since such a regular shape helps us easily construct a (k, π/k)-angle barrier. In practice, the range of θ is always [0, π/2]. But in our scenario, in order to obtain a regular (k, π/k)-angle-covered area, we need to limit the maximum value of θ according to the value of k. When k = 2, the maximum value of θ is π/2. This is because two points can only determine a straight line; we can deploy two camera sensors and obtain a regular area (the inscribed circle) that is (2, π/2)-angle-covered. However, when k > 2 and θ > (k−2)π/2k, the inscribed circle of a sensor si's sensing sector will not lie inside the regular k-sided polygon P, which means the overlapping region of the two shaded regions (see Figure 5(a)), i.e., the area that is (k, π/k)-angle-covered, will not be a circle. The reason is that (k−2)π/k is the degree of each interior angle of a regular k-sided polygon. So if we want the inscribed circle of a regular k-sided polygon to be (k, π/k)-angle-covered, we have to ensure that θ ≤ (k−2)π/2k. For example, in Figure 5(a), if the value of θ is no more than π/6, the inscribed circle of each camera sensor is inside the regular triangle. The (3, π/3)-angle-covered area is the inscribed circle of a camera sensor with sensing angle 2θ (θ ≤ (k−2)π/2k), and it is a regular shape.

In the following, we consider (k, π/k)-angle barrier deployment for a given region while each camera sensor may adjust its angle.

Obviously, for (k, π/k)-angle barrier deployment of a given region, the bigger k is, the more camera sensors are required. If k stays the same, the larger the (k, π/k)-covered area constructed by the sensors, the fewer camera sensors are required. Hence it is important to study how to obtain the largest (k, π/k)-covered area for k given camera sensors.

For a given camera sensor, the area of the inscribed circle can be calculated by the following formula f(r, θ):

    f(r, θ) = π (r·sin θ / (1 + sin θ))²                    (1)

In practice, the range of θ is always [0, π/2], and it is easy to verify that f(r, θ) is monotonically increasing in θ when r is fixed. When θ = π/2, the area of the inscribed circle is largest, equal to half of the area covered by a camera sensor; however, this only happens when k = 2. When k > 2 and r is a given constant, f(r, θ) achieves its maximum at θ = (k−2)π/2k. Therefore, for a given region, k, and r, when θ = (k−2)π/2k, we can use the minimum number of sensors to construct a (k, π/k)-angle barrier for the region.
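A small numeric sketch of equation (1), evaluating the inscribed-circle area at the largest admissible half-angle for each k (function and variable names are ours).

    import math

    def inscribed_circle_area(r, theta):
        """Equation (1): area of the inscribed circle of a sector with radius r
        and half-angle theta."""
        rho = r * math.sin(theta) / (1.0 + math.sin(theta))
        return math.pi * rho ** 2

    def best_theta(k):
        """Largest admissible half-angle: pi/2 when k = 2, else (k-2)*pi/(2k)."""
        return math.pi / 2 if k == 2 else (k - 2) * math.pi / (2 * k)

    # Since f(r, theta) is increasing in theta, the maximum over admissible theta
    # is attained at best_theta(k).
    for k in range(2, 8):
        t = best_theta(k)
        print(k, round(t, 3), round(inscribed_circle_area(10.0, t), 2))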

Figure 5 (a) The overlapping area of the two shaded regions is (3, π/3)-angle-covered but it is not a circle and (b) a special case, k = 3 and θ = π/6 (see online version for colours)

Table 1 illustrates the above result. It shows how many camera sensors are needed to build a (k, π/k)-angle barrier for an area of length 100 m, when k ranges from 3 to 7, r = 10 m or 20 m, and θ = π/6 or π/4.

Table 1 Camera sensors needed for constructing a 100 m barrier

(r, θ)    (10, π/6)    (10, π/4)    (20, π/6)    (20, π/4)
k = 3         45           37           21           19
k = 4         60           49           27           25
k = 5         75           61           34           31
k = 6         90           74           41           37
k = 7        105           86           47           43

5 (k, ω)-angle barrier construction in random deployment

In practice, camera sensors are usually randomly deployed in a target area R which is a W × L rectangle. Hence we cannot precisely control the position of each camera sensor. In this section, we propose a method to select camera sensors from an arbitrary deployment to form a (k, ω)-angle barrier.

As mentioned in the introduction, simply selecting camera sensors across the field with connected sensing ranges does not ensure that each point of the barrier is (k, ω)-angle-covered; this is the key challenge here. Our method includes three parts. Firstly, we partition the original field into grids. Secondly, we judge whether these grids are (k, ω)-angle-covered. Finally, we transform the problem into a minimum weight path problem in a graph, in which each node represents a small grid area that is (k, ω)-angle-covered, and two nodes are connected if and only if the corresponding grids are adjacent in the original field. Note that although our method deals with a rectangular target area, it can easily be extended to more general situations.

5.1 Initialisation phase

Initially, the region R is partitioned into L × W unit grids, as shown in Figure 6(a). Each grid is assigned coordinates (i, j). The rules for assigning coordinates are as follows: the upper-left grid is assigned coordinates (1, 1) and, as shown in Figure 6(a), the x-coordinate and y-coordinate increase by one when the location of a grid shifts one position toward the right and downward, respectively. gi,j represents the grid whose coordinates are (i, j). In this phase, each camera sensor node first identifies the coordinates of the grid it is located in. Since its sensing range may cover more than one grid, a grid might be commonly covered by several camera sensor nodes. Another important task of each camera sensor node in the initialisation phase is to evaluate the coverage degree of each grid it covers. Let d denote the coverage degree of a grid.

Figure 6 (a) Grid-based region and (b) fully covered grids of si and sj

Definition 5.1 (Ga: the fully covered set of sa): A grid gi,j is called a fully covered grid of a sensor sa if gi,j is fully covered by sa. The fully covered set of sa, denoted by Ga, consists of all grids that are fully covered by sa.

In Figure 6(b), the symbols marked in each grid represent the IDs of the camera sensors whose sensing ranges fully cover that grid. As shown in Figure 6(b), grids g2,2, g2,3, g3,2 are fully covered by si, while grids g2,3, g3,2, g3,3 are fully covered by sj. Therefore, we have Gi = {g2,2, g2,3, g3,2} and Gj = {g2,3, g3,2, g3,3}. Since the field-of-view (FoV) angle and the orientation vector of the camera sensor's lens are known, each sensor sx can evaluate the coverage degree of each grid gi,j ∈ Gx based on the neighbouring information of sx. Partitioning the target region into a number of unit grids simplifies the (k, ω)-angle barrier coverage problem.
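A sketch of computing the fully covered set Ga with the is_covered test from Section 3, under the assumption (ours, for illustration) that grid gi,j occupies the unit square [i−1, i] × [j−1, j] with the origin at the upper-left corner of R and y growing downward. Since a sector with angle 2θ ≤ π is convex, a grid is fully covered exactly when all four of its corners are covered.

    def fully_covered_grids(sensor, n_cols, n_rows):
        """Return the set G_a of grid coordinates (i, j) fully covered by sensor."""
        covered = set()
        for i in range(1, n_cols + 1):        # x-coordinate grows to the right
            for j in range(1, n_rows + 1):    # y-coordinate grows downward
                corners = [(i - 1, j - 1), (i, j - 1), (i - 1, j), (i, j)]
                # convex sector: four covered corners imply the whole grid is covered
                if all(is_covered(sensor, cx, cy) for cx, cy in corners):
                    covered.add((i, j))
        return covered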

5.2 (k, ω)-angle-covered judging phase

Now we judge whether a subregion is (k, ω)-angle-covered. Given two camera sensors si and sj, we first find two points p and p′ on the perpendicular bisector of segment sisj that lie on different sides of sisj, such that ∠sipsj = ∠sip′sj = ω. Without loss of generality, suppose p is on the left side and p′ is on the right side, as shown in Figure 7. Let arc sipsj be the arc on the circumscribed circle of triangle sipsj, and arc sip′sj be the arc on the circumscribed circle of triangle sip′sj. The area bounded by arc sipsj and arc sip′sj is called the safe region. In fact, for any circle and a fixed chord (sisj in Figure 7), any two inscribed angles on the chord's two endpoints are either equal or supplementary to each other; specifically, they are equal if their third points, i.e., the angle vertices, are on the same side of the chord. Furthermore, for a given point pi, if pi is inside the circle, then ∠sipisj > ∠sipsj (p1 in Figure 7); if pi is outside the circle, then ∠sipisj < ∠sipsj (p2 in Figure 7).

Figure 7 The whole shaded region is the safe region of (si, sj); here ∠sip1sj > ∠sipsj = ∠sip′sj > ∠sip2sj
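Because the safe region of a pair (si, sj) is exactly the set of points p with ∠sipsj ≥ ω, membership can be tested directly from the angle at p. A minimal sketch reusing the Sensor objects from Section 3 (the circle-radius comparison mentioned later in this section is an equivalent test; the function name is ours).

    import math

    def in_safe_region(si, sj, px, py, omega):
        """True iff the angle at (px, py) subtended by sensors si and sj is at
        least omega, i.e. the point lies in the safe region of the pair."""
        a1 = math.atan2(si.y - py, si.x - px)
        a2 = math.atan2(sj.y - py, sj.x - px)
        ang = abs(a1 - a2) % (2 * math.pi)
        if ang > math.pi:            # the geometric angle between two rays is <= pi
            ang = 2 * math.pi - ang
        return ang >= omega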

Obviously, if a grid is (k, ω)-angle-covered, it must be covered by at least k camera sensors. So for a grid gi,j whose coverage degree is equal to or greater than k, we need to determine whether it is (k, ω)-angle-covered. Let Sc = {sx1, ..., sxk} denote the clockwise order of a sensor set {s1, ..., sk}. We have the following theorem.

Theorem 5.1: A region Ω is (k, ω)-angle-covered by Sc if and only if Ω is covered by every sensor in Sc, Ω is inside the polygon bounded by the segments sxisxi+1, 1 ≤ i ≤ k, and, for all 1 ≤ i ≤ k, Ω is inside the safe region generated by sxi and sxi+1 (where xk+1 = x1).

Proof: Given a line segment li, our method yields a safe region Ωi, meaning that every angle whose vertex is a point inside Ωi and whose two rays pass through the two endpoints of li is at least ω. So for each line segment sxisxi+1, i = 1, 2, ..., k, we obtain a region Ωi. The intersection of all regions Ωi, together with the coverage and polygon conditions, forms a subregion Ω′ that is (k, ω)-angle-covered by Sc = {sx1, ..., sxk} according to Definition 3.2.

For a grid gi,j whose coverage degree equals k, i.e., there are k camera sensors each of which fully covers gi,j, we can judge whether gi,j is (k, ω)-angle-covered by Theorem 5.1. If d > k, we need to examine each subset of k sensors chosen from the d sensors and judge whether the grid is (k, ω)-angle-covered by that subset. The example in Figure 8 illustrates our idea. In this example, there are three camera sensors covering a region R′ which is partitioned into 2 × 2 equal-sized unit grids. We draw the boundaries of the safe regions for the three pairs of neighbouring sensors (indicated by dotted arcs) and check whether they intersect R′. Note that in computation this can be done by comparing distances to the circle centres with the circles' radii. As plotted in Figure 8, grids 1 and 3 are 3-covered by the camera sensors, but grid 3 is not fully covered by the safe regions of these camera sensors. So only grid 1 is (3, π/2)-angle-covered. Let Iv be the subset of all sensors which (k, ω)-angle cover the grid corresponding to node v; grid 1 is covered by I1 = {v1, v2, v3}.

Figure 8 An example of (3, π/2)-angle coverage; here grids 1 and 3 are fully covered by all camera sensors, but only grid 1 is (3, π/2)-angle-covered
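A brute-force sketch of this judging step for one grid, combining itertools.combinations with the is_k_omega_angle_covered test from Section 3; checking only the grid's corner points is our simplification of the exact geometric test of Theorem 5.1, and the function name is ours.

    import math
    from itertools import combinations

    def judge_grid(covering_sensors, corners, k, omega):
        """Return a witness k-subset I_v of covering_sensors under which every
        corner of the grid is (k, omega)-angle-covered, or None if none exists."""
        for subset in combinations(covering_sensors, k):
            if all(is_k_omega_angle_covered(subset, cx, cy, omega)
                   for cx, cy in corners):
                return subset
        return None

    # Usage sketch for grid g_{i,j} under the indexing of Section 5.1:
    # corners = [(i - 1, j - 1), (i, j - 1), (i - 1, j), (i, j)]
    # witness = judge_grid(list(sensors_fully_covering_grid), corners, 3, math.pi / 2)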

After we have judged whether each grid is (k, ω)-angle-covered, we construct a weighted graph G = (V, E, w). The details are as follows:

Let Vi denote the set of all grids in the left-most column which are (k, ω)-angle-covered, and let Vo denote the set of all grids in the right-most column which are (k, ω)-angle-covered. Let Vr be the remaining grids which are (k, ω)-angle-covered but are not in Vi or Vo, and let V = Vi ∪ Vo ∪ Vr. For any two nodes u and v in Vr, if their corresponding grids are connected, add (u, v) to E. For every u ∈ Vi, if there is a vertex v ∈ Vr whose corresponding grid is connected with that of u, add (u, v) to E; for every u ∈ Vo, if there is a vertex v ∈ Vr whose corresponding grid is connected with that of u, add (v, u) to E. For every v ∈ V, w(v) = Iv, where Iv is the subset of all sensors which (k, ω)-angle cover the grid corresponding to v. Finally, add a virtual source node s to V and, for every u ∈ Vi, add an edge (s, u) to E; add a virtual sink node t to V and, for every u ∈ Vo, add an edge (u, t) to E. Set w(s) = w(t) = ∅.
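A sketch of this graph construction under stated assumptions (covered maps grid coordinates (i, j) to their witness sets Iv, grids sharing at least a point are 8-neighbours, and columns 1 and n_cols are the left-most and right-most columns; all names are ours).

    def build_barrier_graph(covered, n_cols):
        """covered: dict {(i, j): I_v} of grids judged (k, w)-angle-covered.
        Returns adjacency lists and node weights w(v) = I_v, with a virtual
        source 's' attached to the left-most column and a virtual sink 't'
        attached to the right-most column."""
        adj = {v: set() for v in covered}
        adj['s'], adj['t'] = set(), set()
        weight = {v: frozenset(iv) for v, iv in covered.items()}
        weight['s'] = weight['t'] = frozenset()
        for (i, j) in covered:
            for di in (-1, 0, 1):             # grids sharing at least a point
                for dj in (-1, 0, 1):
                    u = (i + di, j + dj)
                    if u != (i, j) and u in covered:
                        adj[(i, j)].add(u)
                        adj[u].add((i, j))
            if i == 1:                        # left-most column: reachable from s
                adj['s'].add((i, j))
            if i == n_cols:                   # right-most column: leads to t
                adj[(i, j)].add('t')
        return adj, weight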

5.3 (k, ω)-angle barrier construction phase

When G = (V, E, w) is constructed, if G is not connected we can simplify the graph by removing all connected components except those containing s and t, since no barrier for R exists in the other components. After the simplification, if there exists a path from s to t, then there is a series of connected grids in the original area all of which are (k, ω)-angle-covered. Our objective is to find a camera sensor barrier B which requires the minimum number of active camera sensors. The MkABC problem can thus be converted into finding a path from s to t in G containing the minimum number of camera sensors.

Let P(v) be a path from s to v, e.g., P(v) = (s, vi1, ..., vim, v). The following lemma holds.

Lemma 5.2: For any path P(t) from s to t in G, ∪v∈P(t) Iv is a (k, ω)-angle barrier.

Proof: Every node on any path from s to t denotes a grid which is (k, ω)-angle-covered, and two grids are connected by an edge only when they share at least one point. So any path from s to t corresponds to a connected subregion from the left bound to the right bound of region R, and this subregion is (k, ω)-angle-covered. Therefore, ∪v∈P(t) Iv is a (k, ω)-angle barrier and a feasible solution for the MkABC problem.

Based on Lemma 5.2, we design an algorithm for the MkABC problem. We first define the weight of a path in G.

Definition 5.2 (weight of a path): For any path P(v) from s to v in G = (V, E, w), let W(P(v)) denote the weight of P(v), where W(P(v)) = |∪vi∈P(v) Ivi|.

To obtain the solution for the MkABC problem, we use an optimal algorithm for the minimum weight s–t path problem (MWstP) (Ma et al., 2012b), which finds a path P(t) from s to t such that W(P(t)) is minimised.

The algorithm for the MkABC problem proceeds as follows: partition R into unit grids and compute the coverage degree of each grid (Section 5.1); judge which grids are (k, ω)-angle-covered and record their sensor sets Iv (Section 5.2); construct the weighted graph G = (V, E, w) with the virtual nodes s and t; run the MWstP algorithm to find a path P(t) from s to t with minimum weight W(P(t)); and output ∪v∈P(t) Iv as the (k, ω)-angle barrier sensor set.

There are |V| grids which are (k, ω)-angle-covered, and the total number of vertices in the auxiliary graph is O(|V|). The time complexity of Algorithm 1 is O(|V| log |V|), so the total time complexity of solving the MkABC problem is O(|V|) + O(|V| log |V|) = O(|V| log |V|).
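The exact optimisation step relies on the MWstP algorithm of Ma et al. (2012b), whose path weight W(P(t)) = |∪ Iv| is not additive. As a simple stand-in, and not the paper's Algorithm 1, the sketch below runs Dijkstra on the graph built above using additive node weights |Iv|; by Lemma 5.2 the union of Iv along the returned path is still a feasible (k, ω)-angle barrier sensor set, though not necessarily a minimum one.

    import heapq
    from itertools import count

    def approx_min_barrier(adj, weight):
        """Dijkstra from 's' to 't' with additive node weights |I_v|. Returns the
        union of I_v along the found path (a feasible barrier sensor set), or
        None if no s-t path exists."""
        tie = count()                    # tie-breaker so the heap never compares nodes
        dist = {'s': 0}
        prev = {}
        heap = [(0, next(tie), 's')]
        while heap:
            d, _, v = heapq.heappop(heap)
            if d > dist.get(v, float('inf')):
                continue
            if v == 't':
                break
            for u in adj[v]:
                nd = d + len(weight[u])
                if nd < dist.get(u, float('inf')):
                    dist[u], prev[u] = nd, v
                    heapq.heappush(heap, (nd, next(tie), u))
        if 't' not in dist:
            return None
        barrier, v = set(), 't'
        while v != 's':
            barrier |= weight[v]
            v = prev[v]
        return barrier

    # Usage sketch: barrier = approx_min_barrier(*build_barrier_graph(covered, n_cols))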


6 Performance evaluation

In this section, we show our algorithm's efficiency through simulations. We first show how the probabilities of constructing the general barrier (Zhang et al., 2009) and the (k, ω)-angle barrier vary as the number of deployed camera sensors varies, to verify that our algorithm requires fewer sensors. Second, we examine all parameters to check their impacts on (k, ω)-angle barrier construction.

6.1 The comparison of full-view barrier coverage and (k, ω)-angle barrier coverage

In WCSNs, a (k, ω)-angle barrier means finding a connected belt region in which every point is (k, ω)-angle-covered, while full-view barrier coverage needs to find a connected belt region in which every point is full-view covered (Wang and Cao, 2011b). Because of these different definitions, the information acquired by the two differs: full-view barrier coverage is more likely to capture the intruder's face, while (k, ω)-angle barrier coverage gathers more comprehensive information, although the facial details it observes may not be as good as those observed under full-view barrier coverage. We compare the probability of constructing a full-view barrier with that of constructing a (k, ω)-angle barrier. The simulation settings are as follows. The target field R is a 50 × 200 m rectangular region. For each camera sensor, r = 40 m and θ = π/4. Let k = 3 and ω = π/3; furthermore, let the effective angle ϕ used to construct a full-view barrier (Wang and Cao, 2011b) be 2π/3. Camera sensors are deployed randomly and uniformly in the field. We vary the number of deployed camera sensors from 500 to 2500 and evaluate the probability of constructing a full-view barrier and the probability of constructing a (k, ω)-angle barrier. To obtain the probabilities, we run each experiment 100 times, verify whether there exists a (k, ω)-angle camera barrier or a full-view barrier, and take the average value as the result. As shown in Figure 9, the probability of the existence of a (3, π/3)-angle barrier is almost 1 when the number of deployed cameras is beyond 2000, approximately 500 fewer camera sensors than required to construct a full-view barrier. As ϕ decreases, the gap grows larger, since a smaller ϕ means that a full-view covered subregion needs more camera sensors. This result is consistent with our expectation.

6.2 The effects of different parameters

In this subsection, we study the effects of the unit grid size, k, and ω on the probability of constructing a (k, ω)-angle barrier. In this experiment, the target field is 50 × 200 m. The camera sensors are deployed randomly in the target field. The sensing range is r = 40 m and the angle is θ = π/4. We run each experiment 100 times and take the average value as the probability of constructing a (k, ω)-angle barrier. As Figure 10 illustrates, the probability of constructing a (3, π/3)-angle barrier increases as the number of deployed camera sensors increases. We also find that when the unit grid becomes smaller, more grids that are (3, π/3)-angle-covered are obtained. Consequently, the probability of constructing a (3, π/3)-angle barrier becomes higher as the number of camera sensors increases and the unit grid size decreases.

Figure 9 Full-view barrier coverage and (3, π/3)-angle barrier coverage probability vs. number of deployed camera sensors (see online version for colours)

Figure 10 (3, π/3)-angle barrier coverage probability on different sizes of grids (see online version for colours)

Figure 11 shows how the probability of constructing a (k, π/3)-angle barrier varies when k = 2, 3, 4. When all other conditions are the same, a smaller k gives a higher probability of constructing a (k, π/3)-angle barrier. Note that in our model, the cooperation of camera sensors is necessary for constructing the (k, ω)-angle barrier. When ω is fixed, our results show that constructing a (k, ω)-angle barrier becomes more and more difficult as k increases. This means that the influence of each camera sensor is related to k; in other words, the cooperation between camera sensors becomes harder when k becomes larger. Figure 12 shows the probability of constructing a (3, ω)-angle barrier when ω = π/4, π/3, π/2, respectively. When ω increases, the probability of constructing a (3, ω)-angle barrier decreases, and this phenomenon becomes more obvious when the value of ω gets larger.

Figure 11 (k, π/3)-angle barrier coverage probability on different k (see online version for colours)

Figure 12 (3, ω)-angle barrier coverage probability on different ω

Finally, we examine the influence of the camera sensor's features, i.e., its sensing radius r and sensing angle 2θ, on constructing a (k, ω)-angle barrier. From Figure 13 we see that the two curves for k = 2 are closer to each other than the two curves for k = 3 are. When we fix k, the probability of constructing a (k, ω)-angle barrier increases with the radius r. Figure 14 shows that the probability of constructing a (3, π/3)-angle barrier increases when θ increases. If every camera sensor covers a larger area, constructing a (k, ω)-angle barrier becomes easier.

Figure 13 (3, π/3)-angle barrier coverage probability on different radiuses of camera sensors (see online version for colours)

Figure 14 (3, ω)-angle barrier coverage probability on different angles of camera sensors (see online version for colours)

7 Conclusion

In this paper we study the minimum (k, ω)-angle barrier coverage problem based on the (k, ω)-angle coverage model. We present techniques to deploy camera sensors to achieve (k, ω)-angle barrier coverage. We also propose a novel method to select camera sensors from an arbitrary existing deployment to form a (k, ω)-angle barrier. Finally, through our simulations, we confirm that our techniques reduce the number of sensors required compared to the state-of-the-art algorithms, and we also analyse how the number of sensors, the size of the unit grid, the required coverage degree k, the sensor's sensing radius r, and the sensing angle 2θ affect the probability of constructing a barrier.


Acknowledgements

This research was jointly supported in part by the National Natural Science Foundation of China under grant 91124001, the Fundamental Research Funds for the Central Universities, and the Research Funds of Renmin University of China under grant 10XNJ032. This research was also supported in part by US National Science Foundation (NSF) CREST No. HRD-1345219.

References

Cardei, M., Thai, M.T., Li, Y. and Wu, W. (2005) 'Energy-efficient target coverage in wireless sensor networks', Proceedings of IEEE INFOCOM'05, 2005, pp.1976–1984.

Chang, C., Hsiao, C. and Chang, C. (2012) 'The k-barrier coverage mechanism in wireless visual sensor networks', Proceedings of IEEE WCNC, 2012, pp.2318–2322.

Chen, A., Kumar, S. and Lai, T.H. (2007) 'Designing localized algorithms for barrier coverage', Proc. of the ACM MobiCom, 2007, pp.63–74.

Chen, A., Lai, T. and Xuan, D. (2008) 'Measuring and guaranteeing quality of barrier-coverage in wireless sensor networks', Proceedings of the ACM International Symposium on Mobile Ad Hoc Networking and Computing (MobiHoc), 2008, pp.421–430.

Devarajan, D., Radke, R.J. and Chung, H. (2006) 'Distributed metric calibration of ad-hoc camera networks', ACM Transactions on Sensor Networks (TOSN), Vol. 2, No. 3, pp.380–403.

Gage, D. (1992) 'Command control for many-robot systems', Proc. of the Nineteenth Annual AUVS Technical Symposium (AUVS-92), 1992, pp.22–24.

Johnson, M.P. and Bar-Noy, A. (2011) 'Pan and scan: configuring cameras for coverage', Proc. of IEEE INFOCOM, 2011, pp.1071–1079.

Li, J., Chen, J. and Lai, T.H. (2012) 'Energy-efficient intrusion detection with a barrier of probabilistic sensors', Proceedings of IEEE Conference on Computer Communications (INFOCOM), 2012, pp.118–126.

Liu, L., Zhang, X. and Ma, H. (2009) 'Dynamic node collaboration for mobile target tracking in wireless camera sensor networks', Proc. of IEEE INFOCOM, 2009, pp.1188–1196.

Ma, H., Li, D., Chen, W., Zhu, Q. and Yang, H. (2012a) 'Energy efficient k-barrier coverage in limited mobile wireless sensor networks', Computer Communications, Vol. 35, No. 14, pp.1749–1758.

Ma, H., Yang, M., Li, D., Hong, Y. and Chen, W. (2012b) 'Minimum camera barrier coverage in wireless camera sensor networks', Proc. of IEEE INFOCOM, 2012, pp.217–225.

Shih, K.P., Chou, C.M., Liu, I.H. and Li, C.C. (2010) 'On barrier coverage in wireless camera sensor networks', Proc. of IEEE AINA, 2010, pp.873–879.

Tseng, Y., Chen, P. and Chen, W. (2012) 'k-Angle object coverage problem in a wireless sensor network', IEEE Sensors Journal, Vol. 12, No. 12, December, pp.3408–3416.

Wang, Y. and Cao, G. (2011a) 'On full-view coverage in camera sensor networks', Proc. of IEEE INFOCOM, 2011, pp.1781–1789.

Wang, Y. and Cao, G. (2011b) 'Barrier coverage in camera sensor networks', Proc. of the ACM MobiHoc, 16–19 May, 2011, Paris, France.

Wang, Y., Chen, Y. and Tseng, Y. (2011) 'Using rotatable and directional (R&D) sensors to achieve temporal coverage of objects and its surveillance application', IEEE Transactions on Mobile Computing, Vol. 11, No. 8, pp.1358–1371.

Zhang, L., Tang, J. and Zhang, W. (2009) 'Strong barrier coverage with directional sensors', IEEE GLOBECOM, 2009, pp.1–6.

Zhou, Z., Das, S. and Gupta, H. (2004) 'Connected k-coverage problem in sensor networks', Proceedings of the 13th Annual IEEE International Conference on Computer Communications and Networks (ICCCN), 2004, pp.373–378.