The medium of television has influenced society in many ways since its inception, and the belief that this impact has been dramatic has gone largely unchallenged in media theory. However, there is much dispute as to what those effects are, how serious the ramifications are, and whether these effects are more or less evolutionary with human communication.
Current research is discovering that individuals suffering from social isolation can employ television to create what is termed a parasocial, or faux, relationship with characters from their favorite television shows and movies as a way of deflecting feelings of loneliness and social deprivation. Just as an individual would spend time with a real person sharing opinions and thoughts, socially isolated viewers form pseudo-relationships with TV characters, becoming personally invested in their lives as if they were close friends, in order to satiate the human desire to form meaningful relationships and establish themselves in society. Jaye Derrick and Shira Gabriel of the University at Buffalo and Kurt Hugenberg of Miami University found that when individuals are not able to participate in interactions with real people, they are less likely to report feelings of loneliness while watching their favorite TV show.
They refer to this finding as the social surrogacy hypothesis. Furthermore, when an event such as a fight or argument disrupts a personal relationship, watching a favorite TV show was able to create a cushion and prevent the individual from experiencing reduced self-esteem and feelings of inadequacy that can often accompany the perceived threat. By providing a temporary substitute for acceptance and belonging that is experienced through social relationships, TV helps to relieve feelings of depression and loneliness when those relationships are not available. This benefit is considered a positive consequence of watching television, as it can counteract the psychological damage that is caused by isolation from social relationships.
Several studies have found that educational television has many advantages. The Media Awareness Network explains in its article "The Good Things about Television" that television can be a very powerful and effective learning tool for children if used wisely. The article states that television can help young people discover where they fit into society, develop closer relationships with peers and family, and teach them to understand complex social aspects of communication. Dimitri Christakis cites studies in which those who watched Sesame Street and other educational programs as preschoolers had higher grades, were reading more books, placed more value on achievement and were more creative. Similarly, while those exposed to negative role models suffered, those exposed to positive models behaved better.
Writing in Parent Circle, Priscilla J. S. Selvaraj points out several benefits of watching TV on both an educational and an emotional level. She explains that it can, "... be used... both at home as well as in classrooms. With the range of channels on offer, there is no dearth [lack] of educational content." In addition to these benefits, watching television raises viewers' awareness of their society and can help people become bilingual. Because children are learning things outside the classroom, schoolwork inside it becomes easier for them. Such learning can also foster happiness and energy, and being energetic and happy encourages physical activity, which in turn supports health.
Emotionally, watching television can help strengthen family bonds; spending time with family or loved ones causes the body to release endorphins, which can improve mood as well.
The rich array of pejoratives for television (for example, "boob tube" and "chewing gum for the mind") indicates the disdain many people hold for this medium. In his 1961 speech, Newton N. Minow spoke of the "vast wasteland" that was the television programming of the day.
Complaints about the social influence of television have been heard from the U.S. justice system as investigators and prosecutors decry what they refer to as "the CSI syndrome". They complain that, because of the popularity and considerable viewership of CSI and its spin-offs, juries today expect to be "dazzled", and will acquit criminals of charges unless presented with impressive physical evidence, even when motive, testimony, and lack of alibi are presented by the prosecution.
Television has also been credited with changing the norms of social propriety, although the direction and value of this change are disputed. Milton Shulman, writing about television in the 1960s, wrote that "TV cartoons showed cows without udders and not even a pause was pregnant," and noted that on-air vulgarity was highly frowned upon. Shulman suggested that, even by the 1970s, television was shaping the ideas of propriety and appropriateness in the countries the medium blanketed. He asserted that, as a particularly "pervasive and ubiquitous" medium, television could create a comfortable familiarity with and acceptance of language and behavior once deemed socially unacceptable. Television, as well as influencing its viewers, evoked an imitative response from competing media as they struggled to keep pace and retain viewership or readership.
According to a study published in 2008, conducted by John Robinson and Steven Martin from the University of Maryland, people who are not satisfied with their lives spend 30% more time watching TV than satisfied people do. The research was conducted with 30,000 people during the period between 1975 and 2006. This contrasted with a previous study, which indicated that watching TV was the happiest time of the day for some people. Based on his study, Robinson commented that the pleasurable effects of television may be likened to an addictive activity, producing "momentary pleasure but long-term misery and regret."
In 1989 and 1994, social psychologists Douglas T. Kenrick and Steven Neuberg, with co-authors, demonstrated experimentally that following exposure to photographs or stories about desirable potential mates, human subjects decrease their ratings of commitment to their current partners. Citing the Kenrick and Neuberg studies, in 1994 evolutionary biologist George C. Williams and psychiatrist Randolph M. Nesse observed that television (and other mass communications such as film) was arousing envy by broadcasting the lives of the most successful members of society (e.g. Lifestyles of the Rich and Famous), and that the television industry's hiring of physically attractive actors and actresses was lowering feelings of commitment to spouses. In 1955, a majority of U.S. households had at least one television set, and by 1992, 60 percent of all U.S. households had cable television subscriptions. From 1960 to 2011, the percentage of U.S. adults who were married declined from 72 percent to a record low of 51 percent. The percentage of U.S. adults over the age of 25 who had never married rose to a record high of one-fifth by 2014, and the percentage of U.S. adults living without spouses or partners rose to 42 percent by 2017.
One theory says that when a person plays video games or watches TV, the basal ganglia portion of the brain becomes very active and dopamine is released. Some scientists believe that the release of high amounts of dopamine reduces the amount of the neurotransmitter available for the control of movement, the perception of pain and pleasure, and the formation of feelings. A study conducted by Herbert Krugman found that in television viewers the right side of the brain is twice as active as the left side, a pattern he associated with a hypnosis-like state.
Research shows that watching television starting at a young age can profoundly affect children's development. These effects include obesity, language delays, and learning disabilities. Physical inactivity while viewing TV reduces necessary exercise and encourages over-eating. Language delays occur when a child doesn't interact enough with others; children learn language best from live interaction with parents or other individuals. Learning problems linked to over-watching TV include ADHD, concentration problems, and even reduced IQ. Children who watch too much television can thus have difficulty starting school because they aren't interested in their teachers. Experts recommend that children watch no more than two hours of television daily, if any.
Many scientific studies have been published on the embedding of subliminal messages in songs, video, and digital TV, intended to manipulate viewers' choices and public opinion. These concerns have led some countries to pass laws with the purpose of protecting citizens and their children.
In his book Bowling Alone, Robert D. Putnam noted a decline of public engagement in local social and civic groups from the 1960s to the 1990s. He suggested that television and other technology that individualizes leisure time accounted for 25% of this change.
Studies in both children and adults have found an association between the number of hours of television watched and obesity. A study found that watching television decreases the metabolic rate in children to below that found in children at rest. Author John Steinbeck also wrote critically of habitual television watchers.
The American Academy of Pediatrics (AAP) recommends that children under two years of age should not watch any television and children two and older should watch one to two hours at most. Children who watch more than four hours of television a day are more likely to become overweight.
Legislators, scientists and parents are debating the effects of television violence on viewers, particularly youth. Fifty years of research on the impact of television on children's emotional and social development have not ended this debate.
Some scholars have claimed that the evidence clearly supports a causal relationship between media violence and societal violence. However, other authors note significant methodological problems with the literature and mismatch between increasing media violence and decreasing crime rates in the United States.
A 2002 article in Scientific American suggested that compulsive television watching, or television addiction, was no different from any other addiction, a finding backed up by reports of withdrawal symptoms among families forced by circumstance to cease watching. However, this view has not yet received widespread acceptance among scholars, and "television addiction" is not a diagnosable condition according to the Diagnostic and Statistical Manual of Mental Disorders (DSM-IV-TR).
A longitudinal study in New Zealand involving 1000 people (from childhood to 26 years of age) demonstrated that "television viewing in childhood and adolescence is associated with poor educational achievement by 26 years of age". The same paper noted that there was a significant negative association between time spent watching television per day as a child and educational attainment by age 26: the more time a child spent watching television at ages 5 to 15, the less likely they were to have a university degree by age 26. However, subsequent research (Schmidt et al., 2009) has indicated that, once other factors are controlled for, television viewing appears to have little to no impact on cognitive performance, contrary to previous thought. That study was, however, limited to cognitive performance in childhood. Numerous studies have also examined the relationship between TV viewing and school grades.
A study published in Sexuality Research and Social Policy concluded that parental television involvement was associated with greater body satisfaction among adolescent girls and less sexual experience among both male and female adolescents, and that parental television involvement may influence self-esteem and body image, in part by increasing parent-child closeness. However, a more recent article by Christopher Ferguson, Benjamin Winegard, and Bo Winegard cautioned that the literature on media and body dissatisfaction is weaker and less consistent than often claimed and that media effects have been overemphasized. Similarly, recent work by Laurence Steinberg and Kathryn Monahan has found, using propensity score matching to control for other variables, that television viewing of sexual media had no impact on teen sexual behavior in a longitudinal analysis.
Many studies have found little or no effect of television viewing on viewers (see Freedman, 2002). For example, a recent long-term outcome study of youth found no long-term relationship between watching violent television and youth violence or bullying.
On July 26, 2000 the American Academy of Pediatrics, the American Medical Association, the American Psychological Association, the American Academy of Family Physicians, and the American Academy of Child and Adolescent Psychiatry stated that "prolonged viewing of media violence can lead to emotional desensitization toward violence in real life." However, scholars have since analyzed several statements in this release, both about the number of studies conducted, and a comparison with medical effects, and found many errors.
Television is used to promote commercial, social and political agendas. Public service announcements (including those paid for by governing bodies or politicians), news and current affairs, television advertisements, advertorials and talk shows are used to influence public opinion. The Cultivation Hypothesis suggests that some viewers may begin to repeat questionable or even blatantly fictitious information gleaned from the media as if it were factual. Considerable debate remains over whether the Cultivation Hypothesis is well supported by the scientific literature; nevertheless, the effectiveness of television for propaganda (including commercial advertising) is unsurpassed. The US military and State Department often turn to media to broadcast into hostile territories or nations.
While the effects of television programs depend on what is actually consumed, Neil Postman argues that the dominance of entertaining, rather than informative, programming creates a politically ignorant society, undermining democracy: "Americans are the best entertained and quite likely the least-informed people in the Western world." In a four-part documentary series released by FRONTLINE in 2007, former Nightline anchor Ted Koppel stated, "To the extent that we're now judging journalism by the same standards that we apply to entertainment - in other words, give the public what it wants, not necessarily what it ought to hear, what it ought to see, what it needs, but what it wants - that may prove to be one of the greatest tragedies in the history of American journalism." Koppel also suggested that the decline in American journalism was exacerbated by the revocation of the FCC fairness doctrine provisions during the Reagan Administration, while in an interview with Reason, Larry King argued that the revocation of the Zapple doctrine's equal-time provisions in particular led to a decline in the public discourse and the quality of candidates running in U.S. elections.
Following the famous first presidential debate between John F. Kennedy and Richard Nixon during the 1960 U.S. presidential election (for which the equal-time rule was suspended), most television viewers thought Kennedy had won the debate, while most radio listeners believed that Nixon had won. After the first debate, Gallup polls showed Kennedy moving from a slight deficit to a slight lead over Nixon, while other polls revealed that more than half of all voters had been influenced by the debates and 6 percent claimed that the debates alone had decided their choice. Although the actual influence of television in these debates has been argued over time, studies by political scientist James N. Druckman determined that the visual medium of television may have allowed viewers to evaluate the candidates more on their image (including perceived personality traits) than radio, which transmitted voice alone. Termed "viewer-listener disagreement", this phenomenon may still affect the political scene of today.
In his Treatise on Human Nature (1739), philosopher David Hume observed that "reason is, and ought only to be the slave of the passions, and can never pretend to any other office than to serve and obey them." Social psychologist Jonathan Haidt argues that his research with anthropologist Richard Shweder on moral dumbfounding vindicates Hume's observation, and Haidt cites verse 326 of the Dhammapada where Siddhārtha Gautama compares the dual process nature of human moral reasoning metaphorically to a wild elephant and a trainer as a preferable descriptive analogy in comparison to a metaphor introduced by Plato in Phaedrus of a charioteer and a pair of horses.
Along with differential psychologist Dan P. McAdams, Haidt also argues that the Big Five personality traits constitute the lowest tier in a three-tiered model of personality, with the highest level being a personal narrative identity constituted of events from episodic memory with moral developmental salience. As an example, Haidt cites how Rolling Stones guitarist Keith Richards recollects in his autobiography that his experience as a choirboy in secondary school was formative in the development of his political views along what Haidt refers to as the "authority/respect" moral foundation. Along with political scientist Sam Abrams, Haidt argues that political elites in the United States became more polarized beginning in the 1990s as the Greatest Generation and the Silent Generation (fundamentally shaped by their living memories of World War I and World War II) were gradually replaced by Baby boomers and Generation Jones (fundamentally shaped by their living memories of the U.S. culture war of the 1960s).
Haidt argues that because of the difference in their life experience relevant to moral foundations, Baby boomers and Generation Jones may be more prone to what he calls "Manichean thinking," and along with Abrams and FIRE President Greg Lukianoff, Haidt argues that changes made by Newt Gingrich to the parliamentary procedure of the U.S. House of Representatives beginning in 1995 made the chamber more partisan. In 1931, a majority of U.S. households owned at least one radio receiver, while by 1955, a majority of U.S. households owned at least one television set. Because of this, many Baby boomers (and members of Generation Jones in particular) have never known a world without television, and unlike during World War II (1939-1945) and the Korean War (1950-1953) when most U.S. households owned radios but did not have television, during the Vietnam War (1955-1975) most U.S. households did own at least one television set.
Also, unlike the first half of the 20th century, protests of the 1960s civil rights movement (such as the Selma to Montgomery marches in 1965) were televised, along with police brutality and urban race rioting during the latter half of the decade. In 1992, 60 percent of U.S. households held cable television subscriptions, and Haidt, Abrams, and Lukianoff argue that the expansion of cable television since the 1990s, and Fox News in particular since 2015 in its coverage of student activism over political correctness at colleges and universities in the United States, is one of the principal factors amplifying political polarization in the United States. In September and December 2006 respectively, Luxembourg and the Netherlands became the first countries to completely transition from analog to digital television, while the United States commenced its transition in 2008.
Haidt and journalists Bill Bishop and Harry Enten have noted the growing percentage of the U.S. presidential electorate living in "landslide counties", counties where the popular vote margin between the Democratic and Republican candidate is 20 percentage points or greater. In 1976, only 27 percent of U.S. voters lived in landslide counties, which increased to 39 percent by 1992. Nearly half of U.S. voters resided in counties that voted for George W. Bush or John Kerry by 20 percentage points or more in 2004. In 2008, 48 percent of U.S. voters lived in such counties, which increased to 50 percent in 2012 and increased further to 61 percent in 2016. In 2020, 58 percent of U.S. voters lived in landslide counties. At the same time, the 2020 U.S. presidential election marked the ninth consecutive presidential election where the victorious major party nominee did not win a popular vote majority by a double-digit margin over the losing major party nominee(s), continuing the longest sequence of such presidential elections in U.S. history that began in 1988 and in 2016 eclipsed the previous longest sequences from 1836 through 1860 and from 1876 through 1900.[note 1] In contrast, in 14 of the 17 U.S. presidential elections from 1920 through 1984 (or approximately 82 percent) the victorious candidate received more than 50 percent of the vote (with 1948, 1960, and 1968 excepted) while in 10 of the 17 elections (or approximately 59 percent) the victorious candidate received a majority of the popular vote by a double-digit margin (1920, 1924, 1928, 1932, 1936, 1952, 1956, 1964, 1972, and 1984).
While women, who were "traditionally more isolated than men," were given equal opportunity to consume shows about more "manly" endeavors, men's "feminine" sides are tapped by the emotional nature of many television programs.
Television played a significant role in the feminist movement. Although most of the women portrayed on television conformed to stereotypes, television also showed the lives of men as well as news and current affairs. These "other lives" portrayed on television left many women unsatisfied with their current socialization.
The representation of males and females on the television screen has been a subject of much discussion since television became commercially available in the late 1930s. In 1964, Betty Friedan claimed that "television has represented the American Woman as a 'stupid, unattractive, insecure little household drudge who spends her martyred mindless, boring days dreaming of love--and plotting nasty revenge against her husband.'" As women started to revolt and protest to become equals in society in the 1960s and 1970s, their portrayal on television was an issue they addressed. Journalist Susan Faludi suggested, "The practices and programming of network television in the 1980s were an attempt to get back to those earlier stereotypes of women." Through television, even the most homebound women can experience parts of our culture once considered primarily male, such as sports, war, business, medicine, law, and politics. Since at least the 1990s, there has been a trend of showing males as insufferable and possibly spineless fools (e.g. Homer Simpson, Ray Barone).
The inherent intimacy of television makes it one of the few public arenas in our society where men routinely wear makeup and are judged as much on their personal appearance and their "style" as on their "accomplishments."
From the 1930s to today, daytime television has changed little; soap operas and talk shows still dominate the daytime slot. Primetime television since the 1950s has been aimed at and catered to males. In 1952, 68% of characters in primetime dramas were male; in 1973, 74% of characters in these shows were male. In 1970 the National Organization for Women (NOW) took action, forming a task force to study and change the "derogatory stereotypes of women on television." In 1972 they challenged the licenses of two network-owned stations on the basis of their sexist programming. In the 1960s, the shows I Dream of Jeannie and Bewitched had insinuated that the only way a woman could escape her duties was to use magic. Industry analyst Shari Anne Brill of Carat USA states, "For years, when men were behind the camera, women were really ditsy. Now you have female leads playing superheroes, or super business women." Current network broadcasting features a range of female portrayals. This is evident in a 2014 study showing that "42% of all major characters on television are female".
In August 2007, it was reported that television was helping to empower women in India. Surveys conducted from 2001 to 2003 found that Indian women did not have much control over their own lives; more than half needed permission from their husbands to go shopping. Indian women were expected to be traditional housewives who cooked, cleaned, and bore many children. Around that time, cable television arrived in Indian villages. Among the most popular shows were serials whose "emancipated female characters are well-educated, work outside the home, control their own money, and have fewer children than rural women." The attitudes of women who had access to television changed profoundly. For example, "After a village got cable, women's preference for male children fell by 12 percentage points. The average number of situations in which women said that wife beating is acceptable fell by about 10 percent. And the authors' composite autonomy index jumped substantially, by an amount equivalent to the attitude difference associated with 5.5 years of additional education." Access to cable television opened Indian women's eyes to what their lives could be like, and some have suggested that the television set should be called the "Empowerment Box" for the awareness it brought to the country.
Some communications researchers argue that television serves as a developmental tool that teaches viewers about members of the upper, middle, working, and lower-poor classes. Research conducted by Kathleen Ryan and Deborah Macey supports this theory with evidence collected from ethnographic surveys of television viewers, along with critical observational analysis of the characters and structure of America's most popular television shows. The findings of such studies, though limited in scope, demonstrate a shared public understanding of social class difference, learned through the dialogue and behavior of viewers' favorite on-screen characters.
Research has been conducted to determine how television informs self-identity while reinforcing stereotypes about culture. Some communication researchers have argued that television viewers have become reliant on prime-time reality shows and sitcoms to understand difference as well as the relationship between television and culture. According to a 2013 study on matriarchal figures on the shows The Sopranos and Six Feet Under, researchers stated that the characters of Carmela Soprano and Ruth Fisher were written as stereotypical non-feminists who rely upon their husbands to provide an upscale lifestyle. They posited that these portrayals served as evidence that the media influences stereotype ideologies about class and stressed the importance of obtaining oral histories from "actual mothers, caretakers, and domestic laborers" who have never been accurately portrayed.
Pop culture researchers have studied the social impacts of popular television shows, arguing that televised competition shows such as The Apprentice send out messages about identity that may cause viewers to feel inadequate. According to Justin Kidd, television media perpetuates narrow stereotypes about social classes while also teaching viewers to see themselves as inferior and insufficient due to personal aspects such as "race or ethnicity, gender or gender identity, social class, disability or body type, sexuality, age, faith or lack thereof, nationality, values, education, or any other aspect of our identities."
Television affects society's behavior and beliefs by publicizing stereotypes, especially regarding race. According to research on the misrepresentation of race in local news conducted by Dixon in 2015, Blacks in particular were accurately depicted as perpetrators, victims, and officers. Latinos, although accurately depicted as perpetrators, continued to be underrepresented as victims and officers. Conversely, Whites remained significantly overrepresented as victims and officers.
In 2018, Deadline Hollywood observed that portrayals of diversity, and intersectionality on television had risen, citing a poll about favorite characters and a number of new shows featuring diverse characters.
In its infancy, television was a time-dependent, fleeting medium; it acted on the schedule of the institutions that broadcast the television signal or operated the cable. Fans of regular shows planned their schedules so that they could be available to watch their shows at their time of broadcast. The term appointment television was coined by marketers to describe this kind of attachment.
The viewership's dependence on schedule lessened with the invention of programmable video recorders, such as the videocassette recorder and the digital video recorder. Consumers could watch programs on their own schedule once they were broadcast and recorded. More recently, television service providers also offer video on demand, a set of programs that can be watched at any time.
Both mobile phone networks and the Internet can deliver video streams, and video sharing websites have become popular. In addition, jumps in the processing power of smartphones and tablets have facilitated uptake of "hybridised" TV viewing, in which viewers simultaneously watch programs on TV sets and interact with online social networks via their mobile devices. A 2012 study by Australian media company Yahoo!7 found that 36% of Australians would call or text family and friends, and 41% would post on Facebook, while watching TV. Yahoo!7 has experienced significant early uptake of its Fango mobile app, which encourages social sharing and discussion of TV programs on Australian free-to-air networks.
The Japanese manufacturer Scalar has developed a very small TV system attached to eyeglasses, called "Teleglass T3-F".
If interpreted in the Princeton group's framework of activity as experienced being the sine qua non of measurement, that would mean that TV represents a highly enjoyable activity that would improve the quality of people's lives, given that more of Americans' free time is being devoted to it. Clearly, the data analyzed here point in the opposite direction. As noted at the outset, whether that means happiness leads to lower viewing, or that more viewing leads to unhappiness, cannot be determined from these data, and thus will require a panel design along with some careful observational study.