The scientific community is going through a remarkable period of transition and growth, reflecting large strides in data-driven methodologies. An interesting aspect of this data revolution is that imaging sensors and modalities are becoming a major source of data. From the mega scale (e.g., satellite and space imaging) to the human scale (e.g., face recognition and vehicular data) to the nanoscale (e.g., sub-cellular structures and electron microscopy imaging), tremendous amounts of data are being generated using conventional and novel imaging technologies. This phenomenon represents massive growth in all aspects of data, including resolution, volume, information content, and spatiotemporal frequency. These newer data sources are creating new challenges and inspiring new solutions! A common feature of these imaging mechanisms is that they give rise to **structured data**. That is, the data (images) contain **objects of interest**, and the major goal is to **understand and analyze the roles of these objects in larger systems**.

We treat the problem of understanding objects in complex data as one of statistical inference and pose and address some fundamental questions: How can we mathematically represent the shapes of objects of interest? What methods can be developed to quantify, compare, and analyze the shapes of objects? Addressing these questions requires analyzing, in a proper mathematical and statistical fashion, the appearances or occurrences of these objects in the data. While the need for shape analysis is clear, the task is not easily accomplished. Shape is broadly defined as the characteristic that remains after certain undesired (nuisance) transformations, such as rigid motions, global scaling, and re-parameterizations, have been removed. This makes shape analysis both challenging and interesting.
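The idea of removing nuisance transformations before comparing objects can be made concrete with a minimal sketch. The example below uses ordinary Procrustes alignment of landmark configurations, one standard way to quotient out translation, scale, and rotation; it is an illustration of the general principle, not the specific representations developed in this research.

```python
import numpy as np

def procrustes_distance(X, Y):
    """Compare two landmark configurations after removing the nuisance
    transformations of translation, global scale, and rotation.
    X, Y: (n, d) arrays of corresponding landmark points."""
    # Remove translation: center each configuration at the origin.
    Xc = X - X.mean(axis=0)
    Yc = Y - Y.mean(axis=0)
    # Remove global scale: normalize each configuration to unit size.
    Xc = Xc / np.linalg.norm(Xc)
    Yc = Yc / np.linalg.norm(Yc)
    # Remove rotation: optimal orthogonal alignment via SVD
    # (the classical orthogonal Procrustes solution).
    U, _, Vt = np.linalg.svd(Xc.T @ Yc)
    R = U @ Vt
    # The residual after alignment measures the shape difference.
    return np.linalg.norm(Xc @ R - Yc)
```

With this definition, a square and a rotated, rescaled, translated copy of it have distance (numerically) zero, because they share the same shape; only genuinely different shapes yield a positive distance.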

Our research follows closely **Grenander’s school of pattern theory** that advocates the following three steps:

1. Create representations in terms of algebraic systems with probabilistic superstructures.

2. Analyze structures from the perspective of statistical inferences.

3. Develop efficient algorithms to facilitate applications.
