WHY YOU’LL WEAR A BODY CAMERA

InfoTrends says people will take 1.2 trillion digital photos this year. That’s 100 billion more than last year and nearly double the number taken as recently as 2013.

The rate at which photo taking grows is currently clocked at a whopping 100 billion per year – that means each year humanity takes 100 billion more photos than it did the year before.

I think that rate is about to accelerate. And the reason is wearable cameras.

As the cost goes down, quality goes up and ease of use improves (through miniaturization, better software and better batteries), wearable cameras will become more compelling.

These will arrive in the form of clip-on cameras, smartwatch cameras and cameras attached permanently or temporarily to glasses, including smart glasses.

A few years ago, a first generation of so-called “lifelogging cameras” came and went. These failed in the market because they were too early on two fronts: The technology wasn’t ready. And the public wasn’t ready.

Now, increasingly, both are ready. As better technology improves the quality of images, enabling even 4K video and real-time live streaming, society increasingly acclimates to people taking pictures all the time. Those 1.2 trillion photos aren’t all being taken in private spaces.

Just a few years ago, nobody could have predicted or imagined what’s now acceptable public behavior with a smartphone camera. People shamelessly pose and posture in public for selfies without embarrassment. They take pictures of their food and drinks in restaurants. They take selfies in the bathroom mirror.

The shift in acceptance of wearable cameras is changing even faster. When Google launched its Google Glass Explorer Program four and a half years ago, the technology was widely slammed by the public and the press for the privacy-invading insult of a camera worn on the face.

But when Snap last year started distributing its Spectacles product, complete with a camera worn on the face, criticism was muted.

What happened in the intervening three and a half years that drove face cameras from socially unacceptable to mostly OK?

Cameras became ubiquitous. The camera-drone revolution happened. Police everywhere started wearing body cams. Doorbells got cameras. The quality of smartphone cameras edged ever closer to that of low-end DSLR cameras. A few million more “iGen” digital-camera natives became smartphone-carrying teenagers.

Mostly, according to InfoTrends, the human race took another 3.5 trillion digital pictures between 2013 and 2016.

And another thing happened: A whole bunch of great wearable cameras came on the market.

The new generation of wearable cameras

The new generation of clip-on cameras looks like the previous one, which centered on the Narrative Clip. But the cameras, radios, software and batteries are in some cases much better.

These include the 61N, Compass, FrontRow, SnapCam, MeCam, meMINI, Perfect Memory Camera and Streamcam.

There’s even a wearable camera for kids called Benjamin Button.

There’s a lot of innovation taking place for wrist-worn cameras, too. The Beoncam offers an HD 360 camera on a wristwatch. The CMRA puts two cameras on a replacement wrist band for an Apple Watch – one camera for shooting pictures away from the wearer, and the other pointing up at the user for selfies and video calls.

And, of course, glasses are a factor. Google Glass Enterprise Edition is now available. Products like FaceShot and PogoCam offer face-mounted cameras.

It turns out that the location of a wearable camera makes all the difference for how it’s used.

Badge-style clip-on cameras are acceptable for “lifelogging” applications – jogging your personal memory about places you go and people you meet. But they’re horrible for “photography.” Because the physical cameras move around, sit at odd angles and aren’t directly controlled by the user (they tend to shoot photos at intervals, or take video), the pictures are almost universally bad, save for the odd lucky shot.

Wrist-worn cameras are best used as expedient replacements for smartphone cameras – group shots, vacation snapshots and selfies.

As Google Glass wearers learned, eyeglasses-based cameras can take amazing photos. They point the camera where the user is looking, and show a first-person, this-is-what-I-saw picture, which can be photographically compelling.

But it’s going to be many years before wearable camera photos are anywhere near as good as smartphone pics.

Which is fine. Because pictures aren’t what will drive the body-cam revolution.

Why you’ll wear a body camera

In technology, one thing always leads to another.

The early PC revolution was driven by the desire to learn programming, play games and do spreadsheets. But the ubiquity of PCs enabled the web, which changed how humans share and access information and communicate with each other. Nobody thought their desktop Gateway PC was also a gateway to Facebook, YouTube and Amazon.com.

The early smartphone revolution was driven by applications like email. But the ubiquity of smartphones ushered in the mobile app revolution, which changed how people use “computers.” Nobody thought their Palm Treo would pave the way for Pandora, Snapchat and Google Maps.

Likewise, over the next year, the ubiquity of smartphones will enable the augmented-, mixed- and virtual-reality revolution on untethered mobile devices. And this will drive demand for the ideal platform for blended realities, which is smartglasses.

Today, few appreciate how profound and central smartglasses-based AR will become. The electronics will be miniaturized, so within a few years, smartglasses will be almost identical to ordinary glasses and sunglasses. We’ll be able to choose smart frames at the optometrist’s office.

Sure, high-end, special-purpose, bulky and conspicuous smartglasses will exist. Magic Leap’s patent for AR glasses went online this week; it’s got four big cameras on it and far too much hardware for everyday use. (Also: Magic Leap told the press that patent drawings are conceptual and don’t reflect the coming product.)

Smartglasses will use camera electronics and lenses as much for data gathering as photography. Images and video will be processed for object and face recognition and this data will be fed back into the AR application. Looking at a table with a goldfish bowl on it, an AR app will know that a virtual kitten can stand on the table but not the bowl, and a virtual shark can swim in the bowl but not the table. In AR, cameras aren’t for photography.
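As a rough illustration of that idea, here’s a minimal sketch in Python of how an AR app might turn scene recognition into placement rules for virtual objects. Everything in it – the region labels, the “affordance” concept and the can_place helper – is a hypothetical simplification for illustration, not any vendor’s actual API.

```python
from dataclasses import dataclass

# Hypothetical output of a scene-understanding pass over the camera feed:
# each recognized region gets a label and a simple affordance.
@dataclass
class SceneRegion:
    label: str        # e.g. "table", "fishbowl"
    affordance: str   # "solid" supports standing; "liquid" supports swimming

# Simple placement rules: which affordances each virtual behavior requires.
PLACEMENT_RULES = {
    "stand": {"solid"},
    "swim": {"liquid"},
}

def can_place(virtual_behavior: str, region: SceneRegion) -> bool:
    """Return True if a virtual object with this behavior can occupy the region."""
    return region.affordance in PLACEMENT_RULES.get(virtual_behavior, set())

# The table-and-goldfish-bowl scene from the text:
table = SceneRegion(label="table", affordance="solid")
bowl = SceneRegion(label="fishbowl", affordance="liquid")

print(can_place("stand", table))  # True  -> the virtual kitten can stand on the table
print(can_place("stand", bowl))   # False -> but not on the bowl
print(can_place("swim", bowl))    # True  -> the virtual shark can swim in the bowl
print(can_place("swim", table))   # False -> but not in the table
```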

Other applications will capture photos or video all day, and process it through artificial intelligence systems to provide extremely good data on activity, behavior and environment.

Best of all, photography can be retroactive, either as photography or as data.

For example, instead of taking pictures of their food while they’re eating it, consumers can just tell their virtual assistant at the end of the day: “Post a picture of that pie I ate.” A.I. will reach into the recorded video, grab the best still shot of the pie and post it online. From a data perspective, we’ll ask that same assistant: “How many slices of pie did I eat last year?”
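To make the retroactive-photography idea concrete, here’s a minimal sketch, assuming a day of footage has already been recorded to disk. Only the frame decoding and sharpness scoring use real OpenCV calls; the looks_like_pie detector is a placeholder for the object-recognition model such an assistant would actually need, and the file names are made up.

```python
import cv2  # OpenCV, used here only for video decoding and a crude sharpness score

def frame_sharpness(frame) -> float:
    """Variance of the Laplacian: a common, rough proxy for image sharpness."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    return cv2.Laplacian(gray, cv2.CV_64F).var()

def looks_like_pie(frame) -> bool:
    """Placeholder detector. A real assistant would run a trained
    object-recognition model here; this stub accepts every frame."""
    return True

def best_still(video_path: str, every_n_frames: int = 30):
    """Scan recorded video and return the sharpest frame that matches the query."""
    cap = cv2.VideoCapture(video_path)
    best_frame, best_score = None, -1.0
    index = 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        if index % every_n_frames == 0 and looks_like_pie(frame):
            score = frame_sharpness(frame)
            if score > best_score:
                best_frame, best_score = frame, score
        index += 1
    cap.release()
    return best_frame

# Hypothetical usage: pull the best "pie" still from the day's recording.
still = best_still("lifelog_recording.mp4")
if still is not None:
    cv2.imwrite("pie_to_post.jpg", still)
```

The data-side query (“How many slices of pie did I eat last year?”) would work the same way, only aggregating detections over time instead of returning a single frame.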

Cameras worn elsewhere on the body will be less useful for AR, but very useful for A.I. and personal assistant applications. Many people will wear both.

Because body cameras will be more about AR and data and less about photography, enterprises will be heavy users.

The Shonin Streamcam exemplifies the newest generation of wearable cameras. It’s roughly the same size and shape as the first-generation devices, but with much better electronics, lenses and software.

The co-founder and CEO of Shonin, Sameer Hasan, told me wearable cameras will be initially focused on quality control and documentation, medical applications and security. They’ll be immediately usable for “instruction and demonstration, live entertainment and news reporting.”

Wearable cameras will enable AR to “process video information in real time and instantly provide the wearer with analysis and recommendations based on what the camera is seeing,” according to Hasan.

A report from the Financial Times says Apple’s rumored smartglasses may feature 3D cameras, but no display. A Bloomberg report says the glasses will have both the cameras and the display. Either way, the glasses will have cameras, according to these reports.

Microsoft’s HoloLens chief Alex Kipman says camera-centric smartglasses will eventually even replace smartphones altogether.

To say in 2017 that you’ll never use a wearable camera is like saying in 1987 that you’ll never carry a mobile phone.

So you’ll wear a body camera. You might wear one for taking pictures. But you’ll definitely wear one for A.I. data. And you’ll wear one for AR, too.

Before long, they’ll be as ubiquitous as the smartphones that enable us to take more than a trillion pictures a year.
