For the first time since 1992, the number of households with a television declined. The downturn was slight, but it may signal much larger changes in the country’s entertainment and marketing industries and, by extension, the future of Americans’ social interactions.
Technological innovations have always had social implications. When Thomas Edison developed motion picture film in the 1890s, for example, he envisioned movie viewing as an individual experience. Edison resisted the movie projector; instead, he promoted the kinetoscope, a movie machine that required individual customers to press their eyes against a viewing lens and watch images flicker past.
Other inventors, however, surmised that projecting motion pictures allowed for mass audiences—and larger profits. Unlike today, the early days of visual electronics trended toward shared community experiences.
These shared community experiences grew through the 1940s but, by the 1950s, television began to cut into Hollywood’s profits. Within a few years, TV took hold of the family living room, with family sitcoms such as I Love Lucy ruling the airwaves.
Instead of a theater audience, the immediate viewing experience was largely limited to the family unit. Nonetheless, the medium’s popularity and restricted viewing options created, as author Victor Brooks has noted, a “shared community…in which family members, friends, and schoolmates often watched the same program so that discussion of a particular show might carry over from the living room to the schoolyard the next day.”
During the 1970s and 1980s, however, even this “shared community” began to fragment. As television prices dropped and cable television led to more viewing options, TV viewing became more individualized.
By the turn of the millennium, mass audiences were becoming obsolete, and “narrowcasting” was commonplace. Today, broadcast programming is largely driven by “target marketing” and, intriguingly, is now less likely to involve television.
Younger people, in particular, are relying increasingly on personal computers, laptops, and mobile devices to fill the time they devote to electronic leisure. As they become a larger portion of the consumer market, the electronics market will change radically.
Even now, internet providers are considering ways to respond to an increased demand for online programming. Providers will most likely implement a “net-usage” fee while simultaneously increasing the cost of internet advertising.
It’s not clear how consumers will react to price increases, but advertisers are likely to pay the higher premiums as long as net usage grows. Unlike more “primitive” technologies, internet advertising allows for microtargeting; that is, advertising to individuals based on demographic characteristics and preferences—with the latter determined largely by users’ recent internet activity.
Consumers who have recently used search terms such as “depression” or “loneliness,” for example, may soon find their next online program interrupted with ads for Prozac or Zoloft.
Portable devices may prove even more attractive to advertisers. Phones with GPS, for example, can alert advertisers to users’ locations, which can quickly be matched with consumer preferences to produce almost instantaneous text-message coupons for nearby businesses.
While such matching techniques might lead to greater economic efficiency, their effect on social capital, “the capacity to work together for the common good,” may be more troubling. High levels of social capital help people work together productively, resulting in communities with high levels of trust, lower crime rates, and increased economic productivity.
Unfortunately, according to Harvard political scientist Robert Putnam, social capital has declined precipitously over the past six decades. The reason, he hints, is that “deep-seated technological trends are radically ‘privatizing’ or ‘individualizing’ our use of leisure time and thus disrupting many opportunities for social-capital formation.”
This individuation may become complete in the next twenty years, when scientists are expected to perfect wireless contact lenses that can provide images and text to users who wish to “augment reality.”
Book lovers will forgo books, and even Nooks, in favor of reading directly from their contact lenses; emails will be sent with the blink of an eye; and movies will be seen as holographic images floating a few feet in front of an audience of one.
Social recall will be just a memory, obviated by technology. Can’t remember where Walgreens is? Your contact lens will provide visual arrows that overlay your vision and point the way.
Can’t remember the name of the person approaching you in the supermarket? The lenses will employ facial recognition software to provide you with the person’s name, and then conduct a search of the web for additional helpful information, such as age, occupation, marital status, and children’s names.
In search of a mate? Your wireless lenses will be able to identify the blonde at the bar, run a background check, and produce a “match score”—all in less time than it would take to ask, “So, come here often?”
In less than two decades, our integration with visual technology will be virtually complete.
Writing in the 1990s, Putnam argued that the logical end of technological development is “to be entertained in total isolation.” But that isolation comes at a cost to social capital, which in turn impairs our capacity for collective endeavors.
After more than 100 years of technological evolution, we are coming full circle. Edison’s vision of individualized electronic media is again ascendant, and increasingly isolated customers will soon be employing modern-day kinetoscopes to watch artificial images dancing before their eyes.