Lab05: Active Vision

There are two kinds of imaging: passive imaging, in which we simply sense the world, and active vision, in which we sense how the world responds to a stimulus we send out. In active vision, for example, we may transmit energy such as radio waves (radar), sound waves (sonar), or light (lidar), and measure the response.

In this lab you will explore the fundamentals of active vision.

Let us begin with Part A of the lab by understanding a simple example of active vision, namely marine radar, an early example of WaterHCI used for navigating vessels. Manfred Clynes coined the term "cyborg" for a combination of human and machine that is natural and intuitive; his favorite example was someone riding a bicycle. Equivalently, we might have someone piloting a vessel. The invention of the boat predates even the invention of the wheel, and the invention of clothing. The world's first cyborgs were likely boaters, more than a million years ago.

Consider the IPIX radar data from Simon Haykin's research team: two Doppler radar recordings, one with a growler (a small fragment of floating glacial ice) present and one without.

Take a look at the data in time-frequency (a spectrogram, i.e., a sliding-window Fourier transform) and see if you can understand the growler's motion. For your convenience, I have computed the spectrogram of the data and provided it as a .jpg. The data record is 32 seconds long, and the frequency range is -500 CPS (cycles per second) to +500 CPS, as a square image array, which I have cropped to half and quarter scale (i.e., ±250 CPS and ±125 CPS in the two images).

Part 1: Reproduce these results for both the growler and non-growler data. Comment on your findings, provide a detailed analysis, and explain the patterns you see in each spectrogram.
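Here is a minimal Python sketch of the sliding-window Fourier transform to get you started. The filename, file format, and the 1000 Hz complex sample rate (inferred from the ±500 CPS display range) are my assumptions; adapt them to the actual data files.

    import numpy as np
    import matplotlib.pyplot as plt
    from scipy.signal import spectrogram

    fs = 1000.0                       # assumed complex sample rate (Hz)
    iq = np.load("ipix_growler.npy")  # hypothetical filename/format

    # Sliding-window Fourier transform; complex input gives a two-sided
    # spectrum spanning -fs/2 .. +fs/2 (here -500 to +500 CPS).
    f, t, S = spectrogram(iq, fs=fs, window="hann", nperseg=256,
                          noverlap=192, return_onesided=False)
    f = np.fft.fftshift(f)
    S = np.fft.fftshift(S, axes=0)

    plt.pcolormesh(t, f, 10 * np.log10(S + 1e-12), shading="auto")
    plt.ylim(-250, 250)               # crop to ±250 CPS as in the .jpg
    plt.xlabel("time (s)")
    plt.ylabel("Doppler frequency (CPS)")
    plt.show()

Try different nperseg values: longer windows sharpen frequency resolution but blur the time axis.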

Part 2: HDR radar. Previously we created the world's first HDR (High Dynamic Range) radar. More recently we have built another test implementation, and provided its data in HDR_radar_data.

Construct spectrogram results from this data and explain them. Some examples are falling objects, which should give a linear frequency modulation (linear FM) effect in time-frequency, i.e., a q-chirp (quadratic-phase chirp): a falling object accelerates uniformly, so its Doppler frequency increases linearly with time.
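To see why, note that the Doppler shift of a target at radial speed v is f = 2v/lambda, and for a falling object v = g*t, so f(t) = 2*g*t/lambda. Here is a synthetic sketch of that effect; the ~2.4 GHz wavelength and the sample rate are illustrative assumptions, not the parameters of our HDR radar:

    import numpy as np
    import matplotlib.pyplot as plt
    from scipy.signal import spectrogram

    fs = 1000.0                    # sample rate (Hz); assumed
    lam = 0.125                    # wavelength (m) at ~2.4 GHz; assumed
    g = 9.81                       # gravitational acceleration (m/s^2)
    t = np.arange(0, 2.0, 1 / fs)

    # f(t) = 2*g*t/lam; the phase is its integral: phi(t) = 2*pi*(g/lam)*t^2
    phi = 2 * np.pi * (g / lam) * t**2
    sig = np.exp(1j * phi)

    f, tt, S = spectrogram(sig, fs=fs, nperseg=128, noverlap=96,
                           return_onesided=False)
    plt.pcolormesh(tt, np.fft.fftshift(f),
                   np.fft.fftshift(10 * np.log10(S + 1e-12), axes=0),
                   shading="auto")
    plt.xlabel("time (s)")
    plt.ylabel("Doppler frequency (Hz)")
    plt.show()

The spectrogram ridge is a straight line whose slope is the chirp rate 2*g/lam.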

Other examples are better modeled as warblets, whose frequency oscillates up and down over time.
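A warblet can be synthesized the same way, with a sinusoidally modulated instantaneous frequency; the parameters below are illustrative assumptions:

    import numpy as np

    fs = 1000.0
    t = np.arange(0, 4.0, 1 / fs)
    fc = 50.0     # centre Doppler frequency (Hz); assumed
    fm = 1.5      # warble (modulation) rate (Hz); assumed
    beta = 80.0   # peak frequency deviation (Hz); assumed

    # f(t) = fc + beta*cos(2*pi*fm*t); integrating gives the phase:
    phi = 2 * np.pi * (fc * t
                       + beta / (2 * np.pi * fm) * np.sin(2 * np.pi * fm * t))
    warblet = np.exp(1j * phi)
    # Feed "warblet" to the spectrogram code above: the ridge traces a
    # sinusoid in time-frequency rather than a straight line.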

The first four columns are the real part and the next four are the imaginary part; see ABOUT_DATA.txt.
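Assuming the files are plain whitespace-delimited text (check ABOUT_DATA.txt for the actual format), loading them into complex arrays might look like this:

    import numpy as np

    raw = np.loadtxt("HDR_radar_data/SPIN_WARBLET.txt")  # shape (N, 8)
    cplx = raw[:, :4] + 1j * raw[:, 4:]                  # shape (N, 4), complex
    # Each of the four complex columns can then be fed to the
    # spectrogram code from Part 1.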

Here's an explanation of some of the data.

KYLE_HEART_BASELINE.txt: Kyle's heart at rest.
KYLE_HEART_ACTIVE.txt: Kyle's heart during exercise.
STEVE_HEART_EX1.txt: Steve's heart, example 1.
STEVE_HEART_EX2.txt: Steve's heart, example 2.

SPIN_WARBLET.txt: a warblet due to spinning.
RULER_8_IN.txt: a ruler pluck, which also generates a warblet; 8 inches, higher frequency.
RULER_16_IN.txt: a ruler pluck; 16 inches, lower frequency.

Bonus marks: compute chirplet transforms and provide insight.
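For the bonus, a brute-force (non-adaptive) chirplet transform can be sketched as inner products with Gaussian-windowed linear-FM atoms over a grid of chirp rates and centre frequencies; this is a teaching sketch, not an optimized implementation:

    import numpy as np

    def chirplet_transform(sig, fs, rates, freqs, t0, sigma):
        # Inner products <atom, sig> with chirplet atoms centred at time t0.
        # rates: chirp rates (Hz/s); freqs: centre frequencies (Hz);
        # sigma: Gaussian window width (s).
        t = np.arange(len(sig)) / fs - t0
        win = np.exp(-0.5 * (t / sigma) ** 2)
        out = np.empty((len(rates), len(freqs)), dtype=complex)
        for i, c in enumerate(rates):
            for k, f0 in enumerate(freqs):
                atom = win * np.exp(2j * np.pi * (f0 * t + 0.5 * c * t**2))
                out[i, k] = np.vdot(atom, sig)  # conjugates the atom
        return out

The magnitude of the result peaks at the (rate, frequency) pair that best matches the signal near t0: a q-chirp peaks at its sweep rate, while a warblet's best-fit slope changes sign as t0 moves through the warble cycle.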

Example application: with radar we can look out into a large crowd of people and see into each person's heart. Using HRV (Heart Rate Variability), we can try to estimate health, e.g., COVID-19 or other health parameters, and try to achieve COVID-detection-at-a-distance.
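Once you have extracted heartbeat times from a radar recording (the extraction method is up to you), two standard HRV measures are easy to compute; the beat times below are made-up example values:

    import numpy as np

    beat_times = np.array([0.0, 0.84, 1.71, 2.52, 3.40, 4.22])  # s; example
    rr = np.diff(beat_times) * 1000.0           # RR intervals (ms)

    sdnn = np.std(rr, ddof=1)                   # overall variability (ms)
    rmssd = np.sqrt(np.mean(np.diff(rr) ** 2))  # beat-to-beat variability (ms)
    print(f"SDNN = {sdnn:.1f} ms, RMSSD = {rmssd:.1f} ms")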

With sonar we can see into the hearts of swimmers at a distance. Could we use this as a new form of WaterHCI?

Feel free also to create something to present at WaterHCI on Wednesday, March 30th. Join us online at WaterHCI.com.

Save the date!