CEPANS / FOVCoverage

By: Jacob Adriaens


The continuing dramatic drop in the cost of low-power CMOS imagers has recently made them extremely attractive as potential large-scale sensor networking components. In addition to this enabling push, the many uses of video surveillance for security, habitat monitoring, road traffic, public venues, and other scientific, consumer, industrial, and military applications provide the necessary pull for more rigorous study of such networks. Although video surveillance networks are in many senses similar to the "sensor networks" that have been traditionally envisioned, modeled, and now in some cases deployed, at least two attributes make them unique and deserving of more specialized attention. First, streaming video typically requires much higher bandwidth than the low data-rate sensor networks that periodically sample temperature, humidity, light, pressure, velocity, seismic activity, and so on. Second, with typical video and image sensors, the directionality of the sensed region becomes a significant system design and use issue. This directionality, the limited field of view of such sensors, and their impact on overall system coverage are our main topics of interest here. More specifically, our strategic long-term goal in this work is the formal study of sensor network coverage, the associated deployment and scheduling problems, and the statistical and theoretical properties of large-scale sensor networks comprised of field-of-view sensors (such as video cameras).


Breach Algorithm

We have developed and implemented a provably optimal algorithm for computing breach in FOV sensing networks: the maximal distance from the sensors that a target traveling through the field can maintain, even when the target optimally tries to avoid the sensors. The algorithm has O(n^2) worst-case complexity and O(n log n) average-case complexity.
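At its core, the algorithm reduces breach to a maximum-bottleneck (widest-path) search over a weighted graph built from the Voronoi edges of the field, where each edge weight reflects the minimum observed distance along that edge. The sketch below shows only that final search step in C++; the graph construction and edge weighting are omitted, and `demoBreach` with its toy four-node graph is purely illustrative, not taken from the actual simulator.

```cpp
#include <algorithm>
#include <limits>
#include <queue>
#include <utility>
#include <vector>

// Hypothetical graph: nodes are Voronoi vertices plus virtual source/sink
// nodes on the entry and exit borders; each edge weight is the minimum
// observed distance to any sensor along that Voronoi edge.
struct Edge { int to; double w; };

// Widest-path (maximum-bottleneck) variant of Dijkstra: the breach value is
// the largest b such that some source->sink path uses only edges of weight
// >= b, i.e. the best worst-case clearance an avoiding target can achieve.
double maxBreach(const std::vector<std::vector<Edge>>& g, int src, int dst) {
    std::vector<double> best(g.size(), 0.0);
    best[src] = std::numeric_limits<double>::infinity();
    std::priority_queue<std::pair<double, int>> pq;   // max-heap on bottleneck
    pq.push({best[src], src});
    while (!pq.empty()) {
        auto [b, u] = pq.top(); pq.pop();
        if (b < best[u]) continue;                    // stale queue entry
        for (const Edge& e : g[u]) {
            double nb = std::min(b, e.w);             // bottleneck via this edge
            if (nb > best[e.to]) { best[e.to] = nb; pq.push({nb, e.to}); }
        }
    }
    return best[dst];
}

// Small worked example: node 0 = virtual source, node 3 = virtual sink.
double demoBreach() {
    std::vector<std::vector<Edge>> g(4);
    auto add = [&](int a, int b, double w) {
        g[a].push_back({b, w});
        g[b].push_back({a, w});
    };
    add(0, 1, 3.0); add(1, 3, 2.0);   // path with bottleneck clearance 2
    add(0, 2, 1.0); add(2, 3, 5.0);   // path with bottleneck clearance 1
    return maxBreach(g, 0, 3);        // best achievable clearance: 2.0
}
```

The only change from the standard shortest-path Dijkstra is the relaxation rule: path cost is the minimum edge weight along the path rather than the sum, and we maximize it instead of minimizing.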


We have created a stand-alone simulator, written in C++, that implements our algorithm for calculating maximal breach in field-of-view networks. The simulator uses the OpenGL/GLUT libraries for visualization but may also be compiled without graphical support. The code was originally developed on Linux, but compiles and runs on Windows and Solaris as well. The simulator provides the following features:

  • Runs maximal breach algorithm assuming sensor FOV is an isosceles triangle
  • Provides visualization of:
    • Sensor locations
    • FOV locations
    • Voronoi edges
    • Edge weights
    • Maximal breach path
  • Animates random-waypoint mobility of sensors
  • Animates random rotation of sensors
  • Simulates a grid approximation of Voronoi edges for algorithm verification
  • Simulates without restricting paths to Voronoi edges, to demonstrate the merits of the Voronoi-based approach
  • Source code: Breach v2.4 (zip)
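The isosceles-triangle FOV assumption above can be made concrete with a small membership and distance model. The parameterization below (apex at the sensor position, a heading, a half-angle, and a range measured along the bisector) is one plausible choice for illustration, not necessarily the one used in the simulator's source.

```cpp
#include <cmath>
#include <limits>

// Hypothetical FOV model: the sensor sits at the apex of an isosceles
// triangle, looking along `heading` (radians), with apex half-angle
// `halfAngle` (assumed < pi/2) and range `range` along the bisector.
struct FovSensor {
    double x, y;        // apex (sensor position)
    double heading;     // direction of the FOV bisector, radians
    double halfAngle;   // half of the apex angle
    double range;       // triangle height measured along the bisector
};

// True if point (px, py) lies inside the sensor's triangular FOV.
bool inFov(const FovSensor& s, double px, double py) {
    double dx = px - s.x, dy = py - s.y;
    // Project the offset onto the heading direction and its perpendicular.
    double along  =  dx * std::cos(s.heading) + dy * std::sin(s.heading);
    if (along <= 0.0 || along > s.range) return false;   // behind or too far
    double across = -dx * std::sin(s.heading) + dy * std::cos(s.heading);
    // Inside the wedge iff |across| / along <= tan(halfAngle).
    return std::fabs(across) <= along * std::tan(s.halfAngle);
}

// Observed distance used by the breach metric: Euclidean distance to the
// sensor if the point is in view, otherwise "unobserved" (infinity).
double observedDistance(const FovSensor& s, double px, double py) {
    if (!inFov(s, px, py)) return std::numeric_limits<double>::infinity();
    return std::hypot(px - s.x, py - s.y);
}
```

With this model, the breach of a point set is simply the minimum of `observedDistance` over all sensors; a point outside every FOV is entirely unobserved.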

Empirical Results

We have run a number of simulations and are able to show the following:

  • There is a distinct critical point beyond which the probability of a full breach (a completely unobserved path) is nearly zero when increasing any of the following parameters: number of sensors, sensor observation angle, and sensor range.
  • Average breach decreases logarithmically as the number of sensors is increased.
  • Once the probability of a full-breach becomes sufficiently low, average breach is unaffected by further increases in sensor range.
  • Sensor position becomes less critical as the number of sensors increases and as angle of sensor observation increases.
  • Sensor orientation becomes less important as the number of sensors increases; however, the relative importance of orientation is unaffected by changes in the sensor observation angle.
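The grid approximation used to verify these results can be sketched as follows: discretize the field into cells, precompute each cell's minimum observed distance over all sensors, and run a widest-path search from the left edge to the right edge of the grid. The code below illustrates only that search step, assuming the per-cell clearance field has already been computed; it is a sketch for explanation, not the simulator's actual implementation.

```cpp
#include <algorithm>
#include <queue>
#include <tuple>
#include <vector>

// Grid approximation of breach: clearance[r][c] holds the minimum observed
// distance from cell (r, c) to any sensor. We compute, over all 4-connected
// left-to-right traversals, the maximum of the minimum clearance en route.
double gridBreach(const std::vector<std::vector<double>>& clearance) {
    int rows = static_cast<int>(clearance.size());
    int cols = static_cast<int>(clearance[0].size());
    // best[r][c] = largest bottleneck clearance achievable when reaching (r, c)
    std::vector<std::vector<double>> best(rows, std::vector<double>(cols, -1.0));
    std::priority_queue<std::tuple<double, int, int>> pq;
    for (int r = 0; r < rows; ++r) {           // enter anywhere on the left edge
        best[r][0] = clearance[r][0];
        pq.push({best[r][0], r, 0});
    }
    const int dr[4] = {1, -1, 0, 0}, dc[4] = {0, 0, 1, -1};
    while (!pq.empty()) {
        auto [b, r, c] = pq.top(); pq.pop();
        if (b < best[r][c]) continue;          // stale queue entry
        for (int k = 0; k < 4; ++k) {
            int nr = r + dr[k], nc = c + dc[k];
            if (nr < 0 || nr >= rows || nc < 0 || nc >= cols) continue;
            double nb = std::min(b, clearance[nr][nc]);
            if (nb > best[nr][nc]) { best[nr][nc] = nb; pq.push({nb, nr, nc}); }
        }
    }
    double breach = 0.0;                       // exit anywhere on the right edge
    for (int r = 0; r < rows; ++r) breach = std::max(breach, best[r][cols - 1]);
    return breach;
}
```

As the grid resolution increases, this estimate converges toward the exact Voronoi-based value, which is what makes it a useful independent check on the implementation.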


Publication

  • J. Adriaens, S. Megerian, M. Potkonjak, "Optimal Worst-Case Coverage of Directional Field-of-View Sensor Networks," Third Annual IEEE Communications Society Conference on Sensor, Mesh and Ad Hoc Communications and Networks (SECON 2006), Vol. 1, pp. 336-346, September 2006.
    Abstract: Sensor coverage is a fundamental sensor networking design and use issue that in general tries to answer the questions about the quality of sensing (surveillance) that a particular sensor network provides. Although isotropic sensor models and coverage formulations have been studied and analyzed in great depth recently, the obtained results do not easily extend to, and address the coverage of directional and field-of-view sensors such as imagers and video cameras. In this paper, we present an optimal polynomial time algorithm for computing the worst-case breach coverage in sensor networks that are comprised of directional "field-of-view" (FOV) sensors. Given a region covered by video cameras, a direct application of the presented algorithm is to compute "breach", which is defined as the maximal distance that any hostile target can maintain from the sensors while traversing through the region. Breach translates to "worst-case coverage" by assuming that in general, targets are more likely to be detected and observed when they are closer to the sensors (while in the field of view). The approach is amenable to the inclusion of any sensor detection model that is either independent of, or inversely proportional to distance from the targets. Although for the sake of discussion we mainly focus on square fields and model the sensor FOV as an isosceles triangle, we also discuss how the algorithm can trivially be extended to deal with arbitrary polygonal field boundaries and sensor FOVs, even in the presence of rigid obstacles. We also present several simulation-based studies of the scaling issues in such coverage problems and analyze the statistical properties of breach and its sensitivity to node density, locations, and orientations. A simple grid-based approximation approach is also analyzed for comparison and validation of the implementation.

Future Work

In the near future we plan to examine the following:

  • How a given layout of sensors can be changed to provide better coverage.
  • Different models of coverage, such as the number of sensors required to observe an object, or exposure time versus exposure distance.
  • Extending the FOV model into 3D space.
  • Characterization of FOVs for better modeling.
  • Distributed algorithms for calculating coverage.