Personal Technologies Journal
Special Issue on Wearable Computing and Personal Imaging
Introduction to the Special Issue
Steve Mann, Chair (Guest Editor of the Special Issue)
On the Bandwagon or Beyond Wearable Computing?
Why are ``wearables'' suddenly all the rage? What is it that makes
wearable technology so compelling? The recent fashion shows depicting
fanciful yet nonfunctional units suggest a purely aesthetic motive
inspired by earlier generations of functional units.
Indeed, at a San Francisco bar/restaurant, after a lecture series I
once gave on the West Coast, someone almost reached out and touched
me and the early live and open 6000 volt cathode-ray-tube-based
system I was wearing. You see, it's become a West-Coast phenomenon
to wear non-functioning electronic devices as a fashion statement,
and of course these become focal points for curious hands,
just like slash-and-burn scarification, multiple body piercings,
and various other freakshow fashions. This is not just to say that people
are losing respect for live electric circuits (not that unwanted touching
has ever been acceptable), but moreover to ask the question:
``Are wearables exciting because they look strange?''.
I hope not, for otherwise we've become a Spectacular Research Society.
In an era when we could rigorously explore some of the early visions
of wearable computing, let us not instead build a field of fluff
in the society of the spectacle.
Hence a reasonable question to ask in the postmodern era of absent objective
scientific reality (e.g. predicting the future and inventing science
through vociferous assertion) and the pastmodern era of wearable computing
fashion shows, is: ``What are the real scientific, and theoretical issues?'',
or ``Where is the real research?''.
And this ``real research'' can certainly include the work of hobbyists,
tinkerers, and other passionate individuals, who, though without the
budget or means of major research labs, are nevertheless the ones that
invented the field; that work was seminal and will hopefully continue
to inspire scientific thought by others. Such efforts, although outside
the traditional academic research labs, if taken seriously, have and
will continue to be of great value. It is these very efforts that should
be built upon using the vast funding and infrastructure of major academic
research labs, while leaving the hype to be taken care of by the
spectacular commodity society outside the academic environment.
Hype tends to feed upon itself and certainly doesn't need funding
or philanthropy to flourish.
Moreover, early work in wearable computing was characterized by a
healthy absence of any hegemonic ``alpha male'' phenomenon --- it was
driven by passion for truth, discovery, and exploration, free of
vociferous assertion for the purpose of building an empire. Thus
the 1970s and 1980s perhaps constitute the ``golden era'' of wearable
computing --- the era in which the work was driven by passion rather
than desire for disciples.
It is my hope that we will see a return to some of these values
in the coming years, and that thoughts and ideas will follow a stream of
meritocracy (advancement based on ability or past achievement) rather
than any vociferous leader. It is truly a field in which individuals
can make up their own minds rather than blindly looking to others
for facts, concepts, or information they can verify or gather themselves.
The true spirit of science (including the true spirit of the high-school
hobbyist tinkering in the basement at home) is one of verification of
basic ``facts'', and of personal empowerment through reliance on self.
And not to say, either, that quality and depth of research need be
limited to science and engineering. There are also many great
and worthy avenues of pursuit for wearable computing as it pertains to
the fine arts, to philosophy, to scholarly cultural criticism, etc.
Indeed, these remain as areas characterized by healthy dissidence
and healthy individual free-spirited thinking.
It just seems that within the domain of science and engineering,
there has recently been a confused sense of panic as various
entities suddenly desire to climb onto the bandwagon,
without a clear sense of where it is going, so long as it is
going somewhere at a sufficiently frenetic pace.
A colleague once told me about how Digital Equipment Corporation
(DEC) named their computers ``PDP'' (those who recall entering
machine instructions into a PDP8, PDP11, or the like will
no doubt reminisce about these wonderful machines), and how this
acronym stood for Programmable Data Processor because
the marketing folks felt that the term ``computer'' had earned
a bad name. You see, the term ``computer'' had been all the
rage, and the resulting hype had created inflated expectations.
Whenever the popular press and mainstream media inflate the expectations
of a research field, beyond what can actually be delivered,
the tendency is for a backlash against the field, owing
to the fact that it most likely cannot meet the inflated
expectations. Therein lies the danger of press releases, especially
attempting to do science with press releases.
A similar thing happened to the Artificial Intelligence
(AI) field. Great things were promised by researchers in this field,
there was tremendous hype, and finally when the research failed to
meet the inflated expectations, there was a backlash against AI.
Another problem in the ``wearables'' field is the tendency for
results to be published before the research is done. As the hype
increases, there is pressure to publish, and the result tends to be
speculative rather than substantive papers. Indeed, as it takes some time
to actually design and build systems, what happens is that
results are presented based on systems that are still being
built. Papers then become ``place-holders'' for future work-in-progress,
as people attempt to obtain recognition for work they plan on doing,
but have not yet done.
This tendency continues to escalate the inflated expectations
of the field, and could also contribute to the coming backlash against it.
It is my hope that wearable computing
will not suffer this same fate, though it seems inevitable it
will, unless we can go beyond wearable computing itself, and
look to some of the real research issues.
SMART CLOTHING VERSUS SMART UNIFORMS:
The other direction of this research that I find most disturbing
is the tendency toward smart uniforms, and toward means
of increased subordination of individuals by large organizations.
Indeed, wearable technology has recently been proposed as a mechanism
for imprisonment in the form of remote-control obedience cuffs
worn around the ankle for tracking, and sometimes pain-giving purposes.
More recently waist-worn obedience belts have been proposed and used
in the United States as well as other areas. Amnesty International
is currently calling for a ban on such telematic torture devices.
Such ``teletorture'' devices raise the spectre of
a totalitarian society, or at least threaten to undermine individual freedom.
Obedience devices have two attributes: the ability to observe (e.g.
through some means of surveillance
or tracking) and the ability to control (e.g. through teletorture)
the individual. This combination of observability and controllability
is what gives rise to a pre-programmed punishment algorithm that
enslaves the wearer in the feedback loop of an automated process.
Even technologies that contain only one of these
two channels (e.g. cellular phones that track the whereabouts of the
user at all times) suggest a disturbing trend.
Part of my goal in creating machines to function as true extensions
of the mind and body was based on the existential principle of
self-determination and mastery over our own destiny. Whether these
machines imprison us in spaces of ever smaller confinement, or
whether instead they truly
make the world a better place, will be determined, hopefully, by
responsible engineers and scientists, with an awareness of many of
the underlying issues.
In many ways, the interrogative artists, philosophers, and
``cultural engineers'', often with their heightened sense of
awareness of the human condition, may well provide valuable input
toward the design of
a more humanistic personal intelligence apparatus of the future.
Unlike many other fields of research, wearable technology
is very personal. This insertion of technology
into the prosthetic territory
of the individual can be either a violation of personal space,
or a true enhancement of personal capability that captures a
wonderful aspect of the essence of the human-environment boundary.
POSSIBLE NEED FOR A CLEAR DEFINITION OF WEARABLE COMPUTING:
Clothing is wearable technology, yet is taken for granted.
Is a wristwatch a wearable computer? It ``computes'' time, and
displays this ``computed'' result to the wearer.
On the other extreme is the notion of having everything we need
and use be wearable. We could carry with us our supply of
food, electricity, etc.; clearly, short of a wearable
nuclear reactor, there are limits on what we can wear.
On the hideous end of the spectrum, it was once said
that a diaper is just a wearable ``restroom'' facility.
That being said, there must be a compromise between what we
wear and what we rely on the environment to provide. And there
will be certain tasks or capabilities
better served by wearable technology than environmental technology,
and vice-versa. I like to think that personally relevant
``light current'' (information,
etc.) will be more in the wearable spirit, while
``heavy current'' (electricity for lighting the environment,
charging batteries, etc., as well as general non-person-specific
information) will be more in the environmental
spirit. This suggests ``smart'' materials for personal information
on the body, and ``dumb'' materials (infrastructure in its most raw form)
in the environment. At the very least, we might ask: since
we have intelligent highways,
smart floors, smart furniture, smart lightbulbs, smart toilets,
and smart elevators, why not have ``smart people'' --- people
equipped with information-processing hardware?
This ``smart people/dumb environment'' paradigm suggests
an alternative to ``smart rooms'' and other environmental intelligence
gathering infrastructure. Moreover, the ``smart people/dumb environments''
framework solves the privacy issues,
as well as the customization and user-preference issues, by allowing
each individual to ``own'' his or her own ``bits'', as well as to set
forth and customize the protocol for interacting with the world.
A simple example is that of heating and cooling controls and other
such matters that have recently become the domain of ``smart buildings''
and intelligent occupancy sensors. Instead, a ``dumb'' building
which merely ``obeyed'' commands wirelessly sent from the occupant
(e.g. ``I'd like light turned on now'', or ``I'm too cold'')
would be far less invasive of the privacy of the occupant, and
would also not need to second-guess the occupant's true internal
physical or preferential state, or even know the occupant's
identity. Therein lies the true capability of ``wearables''
to improve the human condition.
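As a hedged illustration of this ``smart people/dumb environment'' architecture (the class and message names below are hypothetical, not part of any existing protocol or system), a dumb building might simply obey explicit commands sent from the occupant's wearable, learning nothing about the occupant's identity or internal state:

```python
from dataclasses import dataclass

# Hypothetical command message: the building learns only what to do,
# never who the occupant is or why the command was issued.
@dataclass
class Command:
    device: str   # e.g. "light" or "thermostat"
    action: str   # e.g. "on", "warmer"

class DumbBuilding:
    """Obeys explicit commands; stores no occupancy or identity data."""
    def __init__(self):
        self.state = {}

    def obey(self, cmd: Command) -> None:
        self.state[cmd.device] = cmd.action

class Wearable:
    """The occupant's wearable holds the preferences and issues commands."""
    def __init__(self, building: DumbBuilding):
        self.building = building

    def request(self, device: str, action: str) -> None:
        self.building.obey(Command(device, action))

building = DumbBuilding()
wearable = Wearable(building)
wearable.request("light", "on")           # ``I'd like light turned on now''
wearable.request("thermostat", "warmer")  # ``I'm too cold''
print(building.state)  # {'light': 'on', 'thermostat': 'warmer'}
```

The design choice is that all personal preference and identity live on the wearable; the building is pure infrastructure, so no occupancy sensing or second-guessing is ever needed.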
DEFINITION OF WEARABLE COMPUTING:
Wearable computing is a term that might at first seem self-explanatory,
yet it may nevertheless help to set out a definition.
Indeed, words like ``wearables'' are often used
in much of the recent
literature to imply some vague sense of computational capability
but without ever being defined. Moreover, the term ``wearables''
has previously had a broader definition
that includes ordinary
cloth T-shirts, or just about any other article of clothing that has
no relation to electrical circuits of any kind.
In this issue, what is meant by ``wearable computer'' is
a data processing system attached to the body,
with one or more output devices
(often comprising a visual display, presented to one or both eyes,
capable of both text and graphics), where the output is constantly
perceptible regardless of the particular task or body position,
and with input means (typically comprising
pushbutton switches operable with one hand)
that allow the functionality of the data
processing system (e.g. its instruction set) to be modified.
An attempt to make this concept precise is to define its attributes:
CONSTANT, UNRESTRICTIVE, UNMONOPOLIZING, OBSERVABLE, CONTROLLABLE,
ATTENTIVE, COMMUNICATIVE, and PERSONAL.
(See Figure 1.)
CONSTANT: Always ready. May have ``sleep modes'' but
never ``dead''. Unlike a laptop computer which must be opened up,
switched on, and booted up before use. Always on, always running.
UNRESTRICTIVE to the user: ambulatory, mobile, roving,
``you can do other things while using it'',
e.g. you can dial a phone number or
respond to email while jogging, etc.
UNMONOPOLIZING of the user's attention:
it does not cut you off from the outside world like a
virtual reality game or the like, so that you can
attend to other matters while using the apparatus.
In fact, ideally, it will provide enhanced sensory
capabilities. It may, however, mediate (augment, alter,
or diminish-with-purpose) the sensory capabilities.
OBSERVABLE by the user:
It can get your attention continuously if you want it to
(e.g. it can notify you of an incoming call or message).
CONTROLLABLE by the user: Responsive.
You can grab control of it at any time you wish.
Even in automated processes you can manually override to
break open the control loop and become part of the loop at
any time you want to (example: a big ``Halt'' button to deal
with the situation arising when an application opens all 50
documents that were highlighted when you accidentally pressed
the wrong key).
ATTENTIVE to the environment:
Environmentally aware, multimodal, multisensory.
(As a result this ultimately gives the user increased
situational awareness.)
COMMUNICATIVE to others:
Can be used as a medium of expression or as a
communications medium any time you want.
PERSONAL: Human and computer are inextricably intertwined.
There is, in fact, something new and interesting that emerges from
such an apparatus, for, after time, it becomes a true extension
of the mind and body. For example, the personal imaging (WearCam)
invention is a camera that attempts to move us from the
``point and click'' metaphor, thought to be the simplest photography
had to offer, to an even simpler metaphor: ``look and think''.
This gives rise to a new genre of documentary video, as well as to
the ability to generate environment maps merely by looking around
(``painting with looks''), wherein a new result in algebraic projective
geometry is embodied in a wearable camera/processing system.
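The eight attributes above can be read as a checklist for deciding whether a given device is a wearable computer in the sense defined here. As a minimal illustrative sketch (the class and function names below are hypothetical, not part of any standard), one might evaluate a candidate device against all eight:

```python
from dataclasses import dataclass, fields

@dataclass
class DeviceProfile:
    # The eight attributes from the definition above, as yes/no questions.
    constant: bool        # always ready, never ``dead''
    unrestrictive: bool   # usable while doing other things
    unmonopolizing: bool  # does not cut the user off from the world
    observable: bool      # can get the user's attention continuously
    controllable: bool    # user can grab control at any time
    attentive: bool       # environmentally aware, multisensory
    communicative: bool   # usable as a medium of expression
    personal: bool        # inextricably intertwined with the wearer

def is_wearable_computer(profile: DeviceProfile) -> bool:
    """A device qualifies only if it satisfies every attribute."""
    return all(getattr(profile, f.name) for f in fields(profile))

# A laptop fails CONSTANT (it must be opened, switched on, and booted),
# among other attributes:
laptop = DeviceProfile(False, False, True, False, True, False, True, False)
print(is_wearable_computer(laptop))  # False
```

Under this reading, the definition is conjunctive: a wristwatch, a laptop, or a virtual reality headset each fails at least one attribute, while the apparatus described in this issue is meant to satisfy all eight.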
It is my hope that wearable computing will become not a means
unto itself, but, rather, an enabling technology to make other devices
that function as true extensions of the mind and body.
A HOPEFUL LOOK TO THE FUTURE:
When I approached the IEEE in 1996 to propose an October 1997 conference
on ``Wearable Computing'', the Executive Director of the IEEE
Computer Society sent me back a favorable reply, and by
August 1996, there was a tremendous amount of interest in this work
at the Society level.
Thus, amid the hype, it appeared that there was a hope
of a possible scientific avenue of pursuit in the scholarly tradition
of a peer-reviewed conference. Not that the academic tradition
is totally infallible, but, as publications chair, it was my
sincere hope to see a body of substantive work
in this area --- work that will go beyond wearable computing,
and evolve into something more. It is also my hope that
this Special Issue might show a hint of that spirit, and might
take an important first step, however small,
toward looking beyond wearable computing for its own sake.
The ultimate goal, as I see it, is that wearable computing becomes
useful technology that truly empowers the individual.
I wish to thank the rest of the ISWC97 committee, and those
on the Personal Technologies editorial board, as well as Kris Popat
of Xerox PARC and Jonathan Rose at the University of Toronto for
many helpful suggestions in organizing this introduction.
Chip McGuire, Vaughan Pratt, and Bruce Macdonald made valuable
suggestions toward arriving at a clear definition of wearable computing.
``The Bandwagon'', Claude Shannon, IRE Transactions on
Information Theory, Vol. IT-2, March 1956, p. 3.
``Smart Clothing'', Steve Mann, Communications of the ACM,
Vol. 39, No. 8, August 1996.
``Keeping Tabs on Criminals'', Joseph Hoshen, Jim Sennott, and Max Winkler,
IEEE Spectrum, February 1995, pp. 26-32.
``Wearable Computing: A First Step Toward Personal Imaging'', Steve Mann,
IEEE Computer, Vol. 30, No. 2, February 1997, http://wearcam.org/ieeecomputer.htm
``Video Orbits of the Projective Group: A Simple Featureless Approach
to Estimating Parameters'', S. Mann and R. W. Picard, IEEE Transactions on
Image Processing, Vol. 6, No. 9, September 1997. See also U.S. Patent 5,706,416.
``Transgressing the Boundaries'', http://www.physics.nyu.edu/faculty/sokal
Steve Mann, inventor of the wearable computer/personal imaging system,
is currently a faculty member at the University of Toronto, Department
of Electrical and Computer Engineering.
Steve has been developing WearComp-based personal imaging as
a hobby, since his high school days in the 1970s and early 1980s.
In 1991 he brought his vision to the Massachusetts Institute of
Technology, and continued this new direction of research there,
defining a new field called ``Personal Imaging'', in which he received
his PhD degree from MIT in 1997. His previous degree is Master of
Electrical Engineering (MEng) from McMaster University.
Steve also holds undergraduate degrees in physics (BSc) and electrical
engineering (BEng) from McMaster University.
He was one of four organizers of the ACM's first international
workshop on wearable computing, and publications chair for the IEEE
international symposium on wearable computing (ISWC-97).
His present research interests include homometric imaging,
lightspace rendering, and wearable, tetherless computer-mediated reality.
He is currently setting up a new ``Humanistic Intelligence'' lab
at the University of Toronto, where the first major project will
be to ``invent the camera of the future''.
Steve is also interested in the visual arts and has exhibited his
WearComp-based PhotoRenderings (the means and apparatus for which
he was recently awarded US Patent 5,706,416) in numerous art
galleries and museums from 1985 to present. See