Tokenism

Tokenism refers to a policy or practice of limited inclusion of members of a minority group, usually creating a false appearance of inclusiveness, whether intentional or not. A typical example, in both real life and fiction, is the deliberate inclusion of a single member of a minority race in a group (such as a black character in a mainly white cast, or vice versa). Classically, token characters have some reduced capacity compared to the other characters, and may be given bland or inoffensive personalities so that the work cannot be accused of stereotyping negative traits. Alternatively, their difference may be overemphasized or made "exotic" and glamorous.

Tokenism in fiction

A token character is a character in a story, myth, or legend who exists only to achieve minimum compliance with the assumed normality of the story's setting. For example, a token wife is a wife with no depth of character or identity of her own; she exists only because the character she is married to is expected to have a wife.

A token character can also be used by writers to pay lip service to rules or standards they otherwise have no intention of following, such as satisfying an anti-racism policy by including a token black character who, despite appearing often, does nothing, serves no function in the plot, and is frequently stereotyped anyway.

In fiction, token characters represent groups that vary from the norm (usually white, heterosexual, physically attractive, and frequently male) and are otherwise excluded from the story. They may be defined by ethnicity (black, Hispanic, Asian, or Jewish), by being overweight or otherwise conventionally unattractive, by being non-heterosexual, or by being a (usually good-looking) female character in a male-dominated cast. Token characters are usually relegated to the background and generally refrain from exhibiting stereotypical behavior, so as to remain inoffensive to readers or viewers. Such a character may also be disposed of relatively early in the story (by being killed, or voted off in a reality TV show) in order to heighten the drama while "conserving" the normal characters.

On the show South Park, one of the characters is named Token Black, a parody of the stereotyped portrayal of blacks and other minorities on television. On the show, Token is possibly the wealthiest resident of South Park.


The Wikipedia article included on this page is licensed under the GFDL.
Images may be subject to relevant owners' copyright.