Wikipedia talk:Content labeling proposal

From Wikipedia, the free encyclopedia

Censoring

NOTE: This discussion was started in Wikipedia:Village pump (technical) and was moved here for ease of editing and because it had expanded beyond the scope of that page.

Suggestion:

Would it be possible in user preferences to have certain categories of articles available for censorship, eg.
  • Sexual themed articles
  • Torture-related articles
  • Images
Please reply here, --J.B. 16:51, 27 Apr 2005 (UTC)
Articles? Image suppression has already been proposed. See m:End-user image suppression - Omegatron 17:39, Apr 27, 2005 (UTC)
It is possible, but highly unlikely that it will ever be implemented in software, UNLESS it is shown that having a wiki-provided self-filtered PC aka "kid friendly" version of Wikipedia is good for the project. For example, grade schools may want to use Wikipedia in the classroom, but object to the adult content.
I think the prevailing attitude is "everyone should be able to see everything", but that is unrealistic, as it does not address the concerns of all the possible users of Wikipedia. Many parents will want to restrict adult content available to their children. I personally feel that most so-called "adult content" is not harmful to children, but I respect parents' rights to oversee their children's viewing choices. If a parent has the right to say their 8 year old can't see R-rated movies, then that parent also has a right to say they can't view R-rated content online. Wikipedia should make every reasonable effort to address their concerns.
So, what does Wikipedia need to do to make it easy for parents to allow their young children to use the encyclopedia without being exposed to adult content? The answer: not much. Why? Because, if someone wants to filter internet content, there are plenty of programs that do that already. A parent concerned about their child viewing R-rated material online will get a dynamic-filtering program that will enforce their content restrictions, not just for Wikipedia, but for all online sites. So a wiki-provided PC version is unnecessary and redundant -- better to let people make their own filtering decisions if they desire to.
That being said, I think Wikipedia should make every effort to assist people in identifying and warning for content of an adult nature. Perhaps a simple "Rated R"-style content notice could be put on top of pages containing content of an adult nature. Ex:
This page contains adult content, including: nudity, violence, profanity, etc. Viewer discretion is advised.
This does not censor the article. It merely informs readers of the article's content ahead of time so that they can make informed decisions about viewing the rest of the page. This notice will also assist filtering software. Better to warn readers ahead of time and let them decide for themselves than to let them accidentally view shocking images and text and become angry at Wikipedia as a result.
- Pioneer-12 17:57, 27 Apr 2005 (UTC)
I think the prevailing attitude is "everyone should be able to see everything", but that is unrealistic
I personally feel that most so called "adult content" is not harmful to children, but I respect parent's rights to oversee their children's viewing choices.
I agree completely. In fact, we're kind of self-censoring in that regard, since we are purposefully including content that will get us censored by parents, China, etc. and will therefore not be available to people who are under the censors' control.
Because, if someone wants to filter internet content, there are plenty of programs that do that already.
I think those programs or services would just block the entire wikipedia. They don't censor individual images or pages within a site, and if they did, they would need to have a huge team of people working nonstop on all our dynamically changing content. I think we need some kind of tagging system with us as the editors if we want that kind of protection.
Perhaps a simple "Rated R"-style content notice could be put on top of pages containing content of an adult nature.
I think a similar tag would be a good idea. But other people have a problem with this, for various reasons. ("Possibly objectionable content" is a better phrase, by the way.) - Omegatron 18:14, Apr 27, 2005 (UTC)
I was thinking "adult content" would be the style of notice that the most people could agree on, because "objectionable content" is rather vaguely defined--it would be difficult or impossible to reach a consensus on what is "objectionable", but I think most people have a similar idea of what is considered "adult". As for filtering programs blocking the whole Wikipedia--yes, old school filtering programs would do that, but newer dynamic-filtering programs exist which scan pages as they receive them, and either filter out objectionable words or block the page individually if the content is deemed objectionable. These will be able to handle even the mercurial nature of Wikipedia. For example, if some vandal adds "F*** off, Pikachu!" to a Pokemon page, these programs will catch that and filter it.
- Pioneer-12 18:33, 27 Apr 2005 (UTC)
Discussions over various censorship proposals here have proven to me that "adult content" is no less vague than "objectionable content". Some people consider the KateWinslettTitanic image to be "adult" but others do not. Thryduulf 19:05, 27 Apr 2005 (UTC)
But why were they deleted? Most likely because the proposal was flawed or poorly understood. Who voted against them? It is probable they were rejected because Wikipedia has a high percentage of college-age and younger users, who often simply don't understand the concerns of parents.
I happen to be of college-age myself (grad school).... I'm not even a parent, but I have friends who are.
I will read the arguments against "content notice" templates and see if there is any validity to them.... Abu Ghraib torture and prisoner abuse already has a notice on it, and that one's not going away, nor should it. If Abu Ghraib gets a notice (and it badly needed one), then it is logical that other articles may need notices too, and that a reasonable general policy should exist on this subject.
- Pioneer-12 18:48, 27 Apr 2005 (UTC)
Whoa whoa. We aren't deciding what is "objectionable" and what is not. We're just categorizing things and letting the end user decide which types of things they don't want displayed. They might not block Category:Nudes while blocking Category:Sexual images, for instance.
Just because similar proposals have been rejected in the past doesn't mean that any related idea should not even be considered. I imagine we're cutting off our highly useful information from a lot of kids who are behind filtering software (or overzealous parents). Remember that one of the principles of wikipedia is that it should be available to "every single person on the planet"...
I remember reading about political censorship concerns on the Chinese wikipedia. How was this handled? - Omegatron 20:48, Apr 27, 2005 (UTC)
Right, we are just proposing sticking in descriptive adjectives so that people can make informed decisions. We are not declaring things "objectionable". Users make their own decisions on what is objectionable. (Something they are going to do whether we put a notice on a page or not.) Each individual user can choose which adjectives to object to.
By the way, it is a common fallacy to think "Something kinda, sorta similar didn't work in the past, so thus NOTHING can work." I might point out a pretty obvious response to that: Content advisory systems have been implemented in numerous types of media, from movies to comic books to video games. Trust me, there is a system that can work here, if we choose to find it. - Pioneer-12 07:52, 28 Apr 2005 (UTC)
The problem with descriptive adjectives is that to avoid POV concerns and cultural differences you've got to be so precise that any selection would be unworkable. E.g. you'd have to get to levels like "white female. 20s. shown from waist up. one breast mostly exposed, nipple not visible. one breast almost completely exposed but nipple obscured by her fingers. Midriff completely exposed. One shoulder completely exposed, other shoulder covered. One arm completely exposed, other exposed from wrist to mid-upper arm. neck, and face completely exposed. Hair obscured by hat." Would this be classified as "nudity" or not? Thryduulf 09:56, 28 Apr 2005 (UTC)
QED. We define what nudity, etc. is and add adjectives based on that. While provocative, there is no true nudity in that picture as you described it. (With nudity being strictly defined as showing genitals, bare ass, or nipples.) We don't have to hit every single person's possible objections. I think a good rule of thumb would be: if a page would pass as "PG", then there should be no tag on it.
What about artistic nudes? I think they are approximately "PG-13". The canonical example is Michelangelo's David. That easily qualifies as a work of art, one I would personally not object to children seeing. Yet when I go to the page I am still a bit shocked to see his bare penis staring at me. For pages like that, a note like "this page contains artistic nudity" seems appropriate, and I myself would find the note helpful, even though I feel no desire to turn the pictures off on that page.
We don't have to hit all possible objections. But I think society has a general consensus on what level of content is considered "PG". Of course personal opinions differ on this, but there is a huge body of culture to draw on which shows what topics are commonly deemed non-PG by the vast majority of the population. There will always be borderline cases... and, on those few borderline cases, we can go by consensus. The rest of Wikipedia, well over 99% of it, will either have no content notice or unambiguously fall into a category requiring a certain adjective, just as David unambiguously falls into "artistic nudes".
By adding polite content notices (and well over 90% of all pages won't need any content notice), we show to society, to parents, and to people at work, that we understand their concerns and are working to address them. Thus we make Wikipedia more respectable and more acceptable in the eyes of the general public, we end most of the current and repeating controversy over including pictures about vaginas and fellatio, and we do nothing to interfere with the desires of people who want to see everything unfiltered. It's a "win-win" situation.
- Pioneer-12 13:05, 28 Apr 2005 (UTC)
Artistic nudity isn't unambiguous. Who defines what is artistic? Over the years David has been variously censored and not censored. Even in the past two years there have been American senators and the like who have demanded statues with breasts be covered up for the sake of "decency".
Nudity isn't as simple as "as showing genitals, bare ass, or nipples" in all cultures. What about a man's chest? In the UK and USA most people wouldn't consider that nudity, but other cultures around the world would (see also Topfree equality).
Other adjectives are even less well defined. What is "sexual content" for example (Oral sex? Penis? Breast? Breast cancer? Vagina? Fetish? Indecent exposure? Abortion? Pregnancy?). What constitutes "violence"?
PG is also troublesome. Frequently the same film will be considered PG in America and not in the UK, or vice versa, and this says nothing about what happens in other countries (France, I understand, is much more liberal about nudity than the UK is). Also, in the UK, the rating system is U -> PG -> 12A* -> 15 -> 18 (*for cinema releases, 12 for video). If you are meaning "PG" to mean "PG-13" then you see how different it can be.
Consensus in these matters is hard to gain - I would personally classify images like the KateWinslettTitanic as U, whereas at least one contributor to that debate would probably have classified it as 15 or 18 on the UK scale. As someone is bound to ask, imho the Autofellatio pic would be classified 15 on the UK scale (the one with which I am familiar), but I would be more restrictive about images of graphic violence than many people would. There are people who would not distinguish between nudity and sex.
An NPOV tag system would be good in the situation you describe, but I genuinely do not believe that such a system is possible in reality. Thryduulf 13:34, 28 Apr 2005 (UTC)


We're talking about a real Pandora's box here. There is hardly an international consensus about what might be considered inappropriate for children or adolescents. Many American parents flip out if their kids see nudity, but in Europe, no one cares. In Saudi Arabia, they censor Lassie to edit out Timmy kissing his mother. What standard would we use?

The only possibility would be to warn about specific types of content -- nudity, violence, profanity, whatever. And then you'd have to decide what counts as nudity -- A man's rear end? A shirtless woman in Africa? A naked 5-year-old child?

Entertainment industries in America have been willing to overgeneralize in order to head off government regulation. What is our need here?

From my perspective, there could be a reason to alert users of sensitive content. For example, one time I read on the homepage that it was the anniversary of someone's conviction for "buggery." Not knowing what that word meant, I clicked on it, only to be directed to the anal sex article. Had my boss been looking over my shoulder at the time, I might have gotten in trouble.

Nevertheless, I would warn against attempting to adopt a universal standard for what is or is not acceptable for people of certain ages. Mwalcoff 09:10, 28 Apr 2005 (UTC)


An NPOV content labeling system IS possible, and here's how to do it

Re: Thryduulf--It's true that nudity isn't as simple as "showing genitals, bare ass, or nipples", but those are three things that people generally agree are nudity, even if they choose to add other body parts to that list. Still, "nudity" may remain too variable a term to nail down to one universal definition. As Mwalcoff suggested, it is probably unwise to try for an over-generalization--a universal, one-size-fits-all standard for "nudity". No problem: then we do what the Recreational Software Advisory Council does... classify different levels of nudity:

  • Level 0: No Nudity
  • Level 1: Revealing attire
  • Level 2: Partial nudity
  • Level 3: Frontal nudity
  • Level 4: Provocative frontal nudity
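Taken literally, a level scale like the one above reduces to an ordered threshold check. A minimal sketch in Python (the level names come from the list above; the function name and the idea of a per-user threshold are illustrative assumptions on my part, not anything defined by RSACi):

```python
# Illustrative sketch of an RSACi-style nudity scale (levels 0-4).
# The level names come from the proposal above; nothing here is a real API.
NUDITY_LEVELS = {
    0: "No nudity",
    1: "Revealing attire",
    2: "Partial nudity",
    3: "Frontal nudity",
    4: "Provocative frontal nudity",
}

def is_visible(page_level: int, user_max_level: int) -> bool:
    """A page is shown only if its rated level does not exceed the
    maximum level the user has chosen to tolerate."""
    return page_level <= user_max_level
```

Under this sketch, a page rated level 3 would be hidden from a user whose chosen threshold is 1 and shown to a user whose threshold is 3 or above.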

Oh, this is nice; this is really nice. Now John Ashcroft and the guy who can't bear to look at Kate Winslet's cleavage can use "Nudity Level 1 - revealing attire" as their guide, while Michelangelo's David gets Level 3 nudity (for artistic purposes).

The RSACi system also has 0 to 4 rating scales for violence, sex, and language. Most impressive! Full details are here: http://wp.netscape.com/comprod/products/communicator/netwatch/b_rsaci.html

Yes! A simple and effective solution. And one that's compatible with an "international, non-profit organization of internet leaders working to make the internet safer for children, while respecting the rights of content providers". (How's that for good PR?) The RSAC is now merged into the Internet Content Rating Association, and the ICRA's motto is "Choice not censorship". I couldn't have said it better myself.

So it looks like we didn't need to solve the rating problem after all; the problem's been solved for us. All we have to do is implement it.

- Pioneer-12 17:54, 28 Apr 2005 (UTC)

So there are two basic kinds of objections to this idea:

1. It can't be done

It's not going to be foolproof, but that's fine. It will still open up the content to a lot of people who previously avoided it.
Of course we can't decide what is objectionable and what is not. We would just use tags indicating the general type of content (with a policy of conservative tagging for gray areas), and allow any type of tagged content to be "blocked" (where "blocked" = behind a warning link? or are we going to prevent it from being downloaded at all for schools and stuff? the latter is less realistic, but still feasible.) The current category tags could probably fill this role just fine.
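The tag-plus-preferences idea described above could be sketched roughly as follows (the tag names and the "warn"/"show" render modes are hypothetical; the point is only that the user's preferences, not the editors, decide what gets hidden):

```python
# Hypothetical sketch: each page carries content tags (perhaps reusing the
# existing category system), and each user lists the tags they want
# suppressed. Nothing is deleted; tagged content is merely placed behind
# a warning link for that user.

def filter_page(page_tags, blocked_tags):
    """Decide how to render a page for a given user's preferences."""
    if page_tags & blocked_tags:
        return "warn"   # show a warning link instead of the content
    return "show"       # render normally

# A user who blocks "Sexual images" but not "Nudes":
prefs = {"Sexual images"}
assert filter_page({"Nudes"}, prefs) == "show"
assert filter_page({"Nudes", "Sexual images"}, prefs) == "warn"
```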

2. It shouldn't be done

I don't like censorship and wouldn't wish it on anyone. But there are plenty of people who don't have a choice. (And, incidentally, the only way to change their censors' minds is with information that is now being blocked because it is part of the same project as "objectionable" unrelated information.)
I think it's better to get partial information to the subjugated masses than to embrace silly idealism (or defeatism) and thereby prevent them from accessing any of the information.

Also I don't want to accidentally click on something with nudity in it at work. (Like I once did with pubic hair. Good thing no one was standing behind me...)

Mmm nudity... - Omegatron 18:06, Apr 28, 2005 (UTC)

  1. Please define "provocative".
  2. Please define "revealing attire" (e.g. is Image:National geographic 1910 11 laundress and street baby.jpg revealing? What about Image:Minirock (Lack) Model Dani 2.jpg, Image:Femalesoccerun01.jpg and Image:USpatent334359 1886.gif?) See also Image:Muslim Dress Billboard.jpg
  3. What are these images classed as:
    1. Image:Smilingguy.JPG
    2. Image:Barefootguy.JPG
    3. Image:National geographic 1910 11 peasants.jpg
    4. Image:Breastfeeding infant.jpg
    5. Image:Freiklettern347x500b.jpg
    6. Image:Female nipple profile.jpg
    7. Image:Weston nude.jpg
    8. Image:Mooning.jpg
    9. Image:TrangBang.jpg
  4. What ratings do you propose for sex and violence?
  5. How do you propose to objectively define what is "violent"?
  6. How do you propose to objectively define what is "sex"?
Thryduulf 21:54, 28 Apr 2005 (UTC)
Also, what do you propose for other things that some people might want censored somewhere - e.g. Drugs (including alcohol and tobacco), Nazism, communism, evolution, creationism, religion, female sports, democracy, space flight, capitalism, Tibet, etc.
Don't forget that American/whatever opinions on what is acceptable are not NPOV. Thryduulf 22:04, 28 Apr 2005 (UTC)
Personally, I'm quite keen on some sort of content censorship, but have yet to see a really inspiring method (and am also quite keen on not implementing a "half-inspired" method). I do think that this should be given more of a chance though. I think the labelling system should be one of two methods:
  1. Consensus - wikipedia is quite proud of using consensus everywhere else, so why can't it work for labeling images?
  2. Voting - where wikipedia can't reach a consensus, voting is sometimes adopted.
Wikipedia uses both quite efficiently elsewhere. Problems such as with the Clitoris main article will always exist, but the vast majority will be absolutely fine.
As for imposing the content levels (such as on schools etc) in theory, wikipedia could use different domain names (such as http://contentlevel2.en.wikipedia.org) and it would be down to them to implement the firewall/proxy server, which is quite reasonable. With a content system already implemented this is pretty easy to do. The normal en.wikipedia would of course still work. -- Tomhab 00:02, 29 Apr 2005 (UTC)
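The subdomain idea above amounts to encoding a content cap in the hostname and letting the school's own firewall whitelist that hostname. A rough sketch, assuming a hostname convention like `contentlevelN.en.wikipedia.org` (the convention and function names are hypothetical):

```python
# Hypothetical sketch of the "content level subdomain" idea: the server
# decides whether to serve a rated page based on which hostname the
# request arrived at. The plain site (en.wikipedia.org) has no cap.
import re

def max_level_for_host(hostname, default=4):
    """Extract the content cap from a hostname; no cap on the plain site."""
    m = re.match(r"contentlevel(\d+)\.", hostname)
    return int(m.group(1)) if m else default

def should_serve(page_level, hostname):
    return page_level <= max_level_for_host(hostname)

assert should_serve(2, "contentlevel2.en.wikipedia.org")
assert not should_serve(3, "contentlevel2.en.wikipedia.org")
assert should_serve(4, "en.wikipedia.org")
```

A school's proxy would then only need a single hostname rule, rather than per-page filtering.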
Thryduulf, do you have a point with all of these trivial objections? Are you just trying to be a devil's advocate? This system--a multi-tier and multi-faceted descriptive content advisory system--provides a flexibility that encompasses all extremes of viewpoints, from the puritan to the libertine. Of course what will be considered nudity type 1, type 2, etc., will be defined, and sharply defined at that. More importantly, everyone will be able to find a level that they are comfortable with. It's like being able to find a shirt that fits. Some people may disagree over where to draw the boundaries (some people will fuss and disagree no matter what you do), but that is a whole different scale of objection. Disagreeing over that detail is like arguing over which color to paint a house, instead of deciding what kind of house to build and where to build it.
For the grey areas, as Omegatron suggested, we can be a little conservative in tagging things (better safe than sorry). After all, it is just an info tag.
As for things that "someone might want censored somewhere", we don't have to accommodate every single possible call for offense, nor should we. Only things that are considered reasonably objectionable by the majority of the population--only items where there is a consensus that it is reasonable for some people to object to these types of content--should be tagged. There is a clear consensus among the general population that some people choose to object to violence, nudity, sex, and/or language (you don't need a poll to figure that out), and that they have their reasons for their objections, even if you or I do not share their concerns.
A perfect system is impossible, but a GOOD system, one that respects diversity and pluralism, and one that will please the vast majority of the population, is possible. As Abraham Lincoln said, "You can please some of the people all of the time, and you can please all of the people some of the time, but you can't please all of the people all of the time."
This is a system that will please all of the people that can be pleased. This is a system that can gain the approval of the vast majority of editors and readers, and it will be a hundred times better than the mess that we have now. If you think there is the possibility for a system that is even better than this one, then by all means research it and propose it.
But, before you look for more objections over what color to paint the house, realize that we are currently standing in the rain on a cold, dark night. A house, a good house is needed.
This can be that house.
- Pioneer-12 01:48, 29 Apr 2005 (UTC)
  • Firstly I do not see that my objections are trivial at all. Before I can support any attempt to censor Wikipedia in any way for whatever reason, I need to be certain what that censorship will entail.
  • Of course what will be considered nudity type 1, type 2, etc., will be defined, and sharply defined at that - What I'm asking you to do is define it now. It's no good saying things like "this will be defined in future" - would you vote for a government that said "We will introduce a law that protects our children, but we won't tell you what our definition of children is or what we are going to protect them from until the law is passed"? I certainly wouldn't, as how do you know whether it will mean that children under 5 cannot see people having sex, or everyone who is a child of someone else (i.e. everyone ever, or since Adam and Eve depending on your view on that matter) cannot see pornography or vote (protect them from the horrors of democracy)? I asked you above about images that are on wikimedia servers (some on wikipedia, some on commons) and you have not even attempted to answer the questions.
  • Disagreeing over that detail is like arguing over which color to paint a house, instead of deciding what kind of house to build and where to build it. I don't think we need a house at all, but if we have to have one we need to know what having such a house would entail. My objections are designed to point out the crucial detail that's missing - your proposal is "we will have a house with some walls" which leaves out such basic things as how many walls we have, where they will be, what type of walls they will be, etc.
  • we don't have to accommodate every single possible call for offense, nor should we. Only things that are considered reasonably objectionable by the majority of the population. The population of where? The population of the USA would probably not object to a picture showing a woman in a short-sleeved dress drinking a glass of wine. The population of Iran would probably be up in arms about it. The population of France would probably not object to a picture of a naked man and woman on a nudist beach; a sizeable proportion of the USA population would have apoplexy at the thought of their 14 year old seeing such a picture. The population of China (the largest on earth), or at least the communists there, would probably strongly object to images portraying Taiwan as an independent state. The population of the UK would more likely object to images of children holding guns than the population of the USA would. What makes one person's objections more valid than another's?
  • only items where there is a consensus that it is reasonable for some people to object to these types of content. Please define "reasonable" in this context. I think it is very reasonable for some people to have objections over images of alcohol, to object to images of religion, etc. and yet you seem to be saying that there is no need to consider them? What gives Wikipedians the right to say that some people's objections are more valid than others? The idea is to be NPOV, but what you seem to be proposing endorses some objections but not others.
  • This is a system that can gain the approval of the vast majority of editors and readers. Will it? Every attempt at defining things to censor in the past has had either no consensus with a leaning against it, or an outright rejection. Several other external systems have been proposed before, maybe even this one, but no consensus to adopt them has ever been reached. I agree this does not mean this time it won't, but you can't be sure about it.
  • and it will be a hundred times better then the mess that we have now. I disagree completely with that statement. What we have now is not, imho, a mess. Again imho, I do not see censorship of any sort as better than no censorship.
  • If you think there is the possibility for a system that is even better then this one, then by all means research it and propose it. The only system I can envisage that would be better than this is a prominent button at the top of the page saying "turn images off" or words to that effect. This has been proposed several times in several places recently.
  • realize that we are currently standing in the rain on a cold, dark night. No we are not. Please cite specific examples of instances where Wikipedia has been censored. For each of these instances please explain how your proposal would have prevented us being censored, had we had the system in place then. The only example of censorship I know of was by China because we have articles on Taiwan and Tiananmen Square that don't conform to the Chinese government's POV.
  • A house, a good house is needed. No it isn't. Wikipedia is not censored. This disclaimer is linked to from every article - even articles like 29 April and Dormouse. Thryduulf 08:16, 29 Apr 2005 (UTC)


Your opposition to censorship is admirable, but your concerns are misplaced.

It is really quite simple: THIS IS NOT CENSORSHIP. Nor is it "pseudo-censorship", "borderline censorship", or anything of that nature.

  • Censorship is "the systematic use of group power to broadly control freedom of speech and expression, largely in regard to secretive matters".
  • To censor is "to review in order to remove objectionable content".

No one is trying to control freedom of expression. No one is trying to remove objectionable content. What we're talking about is an information tag, which informs you of the page's contents. No one is saying "you can't do that on Wikipedia!" On the contrary, by clearly stating that a page contains nudity, sex, violence, or profanity, pages which have potentially objectionable content become less objectionable, because viewers are given fair notice. Based on the notice given by the information tag, viewers decide for themselves if they want to view the accompanying page.

Let me make something clear: I hate censorship as much as, if not more than, you do. If this was censorship--if this restricted the content of Wikipedia in any way--I would not be proposing it.

The case for why we need this has already been made above. The main details have already been given. A proposal with full details will be created in due course. Any suggestions for tweaks, enhancements, etc, can be made at the proposal.

The system suggested, RSACi, is a well known one. It is based on the work of Dr. Donald F. Roberts, chairman of the Communications Department at Stanford University, who has studied the media for nearly 20 years. It is so well known that even Microsoft has copied it. (I don't like Microsoft, but they do have a tendency to buy, copy, or steal good ideas.) If you are using a Microsoft OS, look under Internet_Options->Content->Content Advisor->Enable. There you will see an implementation of the RSACi system. There are free software implementations of the system as well.
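RSACi ratings were distributed as PICS labels embedded in web pages, and tools like Content Advisor compare each rating axis against the user's chosen thresholds. As a rough illustration (this is a simplified toy parser, not a full PICS implementation; the axis letters n/s/v/l for nudity, sex, violence, and language follow the RSACi vocabulary):

```python
# Toy sketch: pull RSACi-style axis scores out of a PICS-like rating
# string and compare them against per-axis user thresholds.
import re

def parse_rsaci(label):
    """Extract the axis scores (n=nudity, s=sex, v=violence, l=language)."""
    return {k: int(v) for k, v in re.findall(r"\b([nsvl])\s+(\d+)", label)}

def allowed(label, thresholds):
    """True if every rated axis is at or below the user's threshold."""
    scores = parse_rsaci(label)
    return all(scores.get(axis, 0) <= limit for axis, limit in thresholds.items())

label = '(PICS-1.1 "http://www.rsac.org/ratingsv01.html" r (n 2 s 0 v 0 l 1))'
assert parse_rsaci(label) == {"n": 2, "s": 0, "v": 0, "l": 1}
assert allowed(label, {"n": 2, "s": 0, "v": 1, "l": 1})
assert not allowed(label, {"n": 1, "s": 0, "v": 1, "l": 1})
```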

The RSACi is flexible, easy to understand, and easy to use. It's a very good system. The current "argue over everything endlessly" setup we have now is absurd, inconsistent, illogical, and a total waste of time. Remember, this isn't doing anything we aren't doing already. We already use warning labels for pages, but in a totally erratic manner. This is just adding the labels in a consistent, respectable way. Yes, that is a hundred times better than the current state of cluelessness. It will get the job done. If you have an idea for improving on this system, then suggest it. If an even better system exists, then research it and propose it.

- Pioneer-12 13:43, 30 Apr 2005 (UTC)

Just because you keep saying it isn't censorship doesn't mean that it isn't censorship. You keep trying to force your prudity down on everybody else, and when people counter you, you call their objections trivial. The system you want to use is POV, no matter how much you try to claim otherwise. And even if your proposal were used, how would it be implemented? It would require coding changes so that people could indicate in their Preferences which images to see, and the software would then have to verify it each time it wants to load an article. This is not a trivial change. Have you discussed with a developer how extensive this change would be? RickK 22:29, Apr 30, 2005 (UTC)

And just cuz people say it is censorship doesn't mean it is. As for the coding - that's semi-relevant. Objections so far have been entirely philosophical. IMHO it's not even worth discussing with a developer if people can in no way be convinced that image content labeling (and I mean ANY content labeling) is compatible with the wiki ethos. I think people should be allowed to say how prudish they are. This isn't restrictive, this gives people more options. -- Tomhab 23:21, 30 Apr 2005 (UTC)
This is not censorship, it's an idea for getting more information to people who would otherwise have that content censored against their will. Whether that's actually a problem or not, I don't know, but we aren't going to find out from the people who can't access this site... - Omegatron 23:55, Apr 30, 2005 (UTC)
No coding is necessary. This will be implemented using templates. "You keep trying to force your prudity down on everybody else". My prudity? I wouldn't be offended if Wikipedia had naked photos on every page. I wouldn't be offended by a shot of Jimbo in a speedo drinking a 40-oz and smoking crack-cocaine while shooting a rabbit with a BB gun, grabbing the boobs of a half-naked hooker, spray painting obscenities on a picture of Gandhi, and burning the flags of all nations.... but there are plenty of people who are. We should take reasonable means to respect their beliefs. - Pioneer-12 02:50, 1 May 2005 (UTC)
Can you arrange for him to fellate himself at the same time? We might be able to use the picture in several articles, then. - Nunh-huh 03:06, 1 May 2005 (UTC)
That would, indeed, be a terribly useful image. - Omegatron 03:51, May 1, 2005 (UTC)
We should take reasonable means to respect their beliefs.
We don't even need to respect their beliefs. I also am not offended by much (although I dislike gore), but am in favor of such a proposal because it would theoretically allow those behind censorship to see the "neutral" information while the "objectionable" content is censored. (Instead of blanket censorship of the whole site, which I know has happened for political content.) We can secretly hope that the "neutral" information will help open their eyes and they won't grow up to be censors themselves. Also it is just plain useful for situations like viewing at work, and will prevent the kind of ridiculous fighting going on over autofellatio (if you don't like it, set your preferences to block it).
Is a tagging and selective blocking system the best way to handle this? I'm not sure, but I like the idea so far. The only objections I have seen so far are dumb ones. - Omegatron 03:51, May 1, 2005 (UTC)
The problem is how the tags are applied, and who gets to pick them. Unless you agree to abide by, say, the People's Republic of China's political views, their ban will not be lifted. As far as I know, most schools do NOT currently block wikipedia. What problem is this the solution for? Do schools not provide copies of Britannica because it might contain a nude picture in an article on Michelangelo? Or because a Britannica article on clitoris or penis might contain a picture. BTW - this page now would probably require a level 4 rating since it has text mentioning seriously aberrant sexual behavior. It's not an accident that there is no xxx top level internet domain. If wikipedia adheres to its goal of being encyclopedic, IMO no rating system is necessary. Prurient images will be voted out. Wholly inappropriate content will be revised. Yes, you might occasionally run into a picture or text that would offend Aunt Millie in Podunk, but what about WP:What_Wikipedia_is_not#Wikipedia_is_not_censored don't you understand? It's an unexpurgated encyclopedia, containing articles on every topic you might imagine (and many you might not). If you can't handle this, do this (warning: don't follow this link at work or in the presence of anyone who might be easily offended). -- Rick Block 04:35, 1 May 2005 (UTC)[reply]
As demonstrated, high dudgeon fails as a substitute for argumentation, and manages to drive out charm completely. - Nunh-huh 06:29, 1 May 2005 (UTC)[reply]
Sorry, I thought it would be funny. My points are:
  1. Participating in a system that removes objectionable content is censorship (please refer to earlier definition of censor).
  2. Every article would have to be monitored for objectionable content and the appropriate label applied, probably only by users with some sort of special privileges. In other contexts such people are sometimes called the rating board - I'd suggest we should call them censors.
  3. Is there any evidence that this is an actual problem that needs to be solved? What percentage of nanny filtering software packages include wikipedia in their blacklists? How many schools have blocked access to wikipedia?
  4. Uncensored printed encyclopedias already exist, and are already available in most schools. How is wikipedia fundamentally any different?
  5. Page ratings will not be constant. Any page might at any time be changed to include something that would require a more restrictive rating. If it's inappropriate it will be reverted, but it will still exist in the history. Any rating system for highly dynamic content will not in general work.
  6. Wikipedia is an encyclopedia. Content that might be "considered reasonably objectionable by the majority of the population" either shouldn't be here or is here as an integral part of an article on some topic. Our content is not here for prurient entertainment. Applying an entertainment based rating system is inappropriate.
Better? -- Rick Block 16:08, 1 May 2005 (UTC)[reply]
Yes, means the points can be easily discussed :)
  1. Even if it counts as censorship, does it matter? It just gives people an extra option
  2. This is a problem - inconsistency across articles etc. Even if this is resolved, it'll take a huge amount of work to implement. Content rating by consensus may be an option.
  3. Valid point and needs further discussion
  4. That's an assumption. I challenge you to find a picture of autofellatio in any school library.
  5. As 2
  6. In Wikipedia's current form, it's a reasonable point. But it leaves the option of including more detail that might be slightly more offensive to others. An example might be photographs of people who have died from certain diseases. Useful in medicine, but some might not want to see them (I guess I'd include myself in that - unnecessarily over the top).
As said before, my heart is not set on this, but I certainly feel that throwing it out as irrelevant is unfair -- Tomhab 17:07, 1 May 2005 (UTC)[reply]


Interesting objections! Let's take a look at each, and give a rational response:
  • 1. You still don't understand the meaning of censorship. The definition has already been listed on this page. Labeling is not censoring.... I'm not sure how the concept can be explained more clearly than what has already been said. Do you object to content labels on movies, too? There are a number of people (myself included) who think that the movie rating system can be improved upon, but I don't know of anyone who thinks that having no movie ratings at all is better than having some sort of rating system.
  • 2. No one has proposed a ratings board. Everything on Wikipedia is user driven. This will be user driven too. In any case, a ratings board is not the same thing as censors. A ratings board doesn't say "That is too offensive! Burn it! BURN! BURN!" A ratings board, in common usage, says "In our judgement, this item is not appropriate for people below the age of x"... or, in our proposed usage, makes no "age decree" whatsoever, and merely says "This item contains x, y, and z". The people who want to censor things do so based on their own devices anyway. Heck, some people have wanted to censor Harry Potter, and that's rated PG!
  • 3. There is plenty of evidence. We can dump a truckload of evidence on you if you wish.
  • 4. Print encyclopedias don't include pictures of auto fellatio, among other things. They generally stay away from highly sensitive content. If they didn't, there'd be community groups looking to rip pages out of the Britannica. Wikipedia boldly dares to cover any topic, no matter how raunchy or controversial. Thus, we need a labeling system to have any hopes of even being considered with a fraction of the respect that Britannica has.
  • 5. It's true that content will be dynamic. Thus the labeling will inevitably be imperfect. However, the content level of most articles is dictated by the topic, so the fluctuations are generally going to be small. Thus the labels won't be perfect, but they'll be a reasonable guide that'll warn you 95% of the time. 95% is not perfect, but it's 95% better than 0%. These labels are not meant to be perfect, they are meant to provide information to assist viewers. There's no guarantee of offensive content being labeled, but there isn't a guarantee in any public space. You could be taking your nephew to the zoo when some guy runs by screaming "Motherf***ing monkey! Stole my f****** ice cream! F***ing buttf***** a****** j****** d******! I'm gonna show that monkey b****'s a** who's higher on the evolutionary scale!" If you say that in the zoo, ANY ZOO, they'll kick you out. If you say that in a Wikipedia article, it's not going to be kicked out (assuming it's sourced and relevant to the topic)... but it would be a good idea to put a warning label above it, don't you think? Think of the children. Think of yourself at work. Think of yourself at home and your Grandma walks into the room.
  • 6. It's not an "entertainment based rating system". It's a content based labeling system.
Much better. - Pioneer-12 17:25, 1 May 2005 (UTC)[reply]

This will be implemented using templates. Please explain how this will be implemented. How will I, Joe NewUser, know that a template will keep me from seeing pubic hair, which is so horrid to see. If I click on a link to pubic hair, or even if I get it from a Random search, what will the template do automatically that will not require coding? Would Joe NewUser automatically get told, "this has this list of things that you might object to", on the page? Will he not be able to see the page at all? Will he have to set his Preferences to see or not see whatever the templates have flagged? How can any of these things be accomplished without any coding? RickK 00:16, May 3, 2005 (UTC)
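For concreteness, the label-and-preferences mechanism being asked about could work roughly as follows. This is only an illustrative sketch in Python, not the actual proposal or MediaWiki code; every name in it (`is_suppressed`, `render`, the label strings) is invented for illustration:

```python
# Hypothetical sketch of label-based filtering: a page carries content
# labels (which editors might attach via templates), a user's preferences
# list the labels they want suppressed, and the software compares the two.

def is_suppressed(page_labels, blocked_labels):
    """Return True if the page carries any label the user blocks."""
    return bool(set(page_labels) & set(blocked_labels))

def render(page_labels, blocked_labels):
    """Return what Joe NewUser would see for a given page."""
    if is_suppressed(page_labels, blocked_labels):
        return "Content hidden by your preferences: " + ", ".join(sorted(page_labels))
    return "(full page shown)"

# Joe NewUser blocks nudity but not violence:
print(render({"nudity", "sex"}, {"nudity"}))  # page hidden, labels listed
print(render({"violence"}, {"nudity"}))       # page shown
```

Under such a scheme the template would only attach the labels; comparing them against each user's preferences would still need software support, which is presumably the coding being asked about here.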

Sorry for the huge bunch of replies here, but I've been away for a couple of days.

  • 1. Censorship is "the systematic use of group power to broadly control freedom of speech and expression, largely in regard to secretive matters".
    • This proposal is clearly a "systematic use of group power" to enable "freedom of speech and expression" to be "broadly controlled". How is that not censorship?
  • 2. To censor is "to review in order to remove objectionable content".
    • And by reviewing articles in Wikipedia to allow objectionable content to be removed, how exactly would we not be censoring it?
  • 3. No one is trying to control freedom of expression. No one is trying to remove objectionable content.
    • Yes they are, the "soccer moms" you keep going on about (unless of course they don't exist, in which case your arguments invoking them are invalid).
  • 4. by clearly stating that a page contains nudity, sex, violence, or profanity, pages which have potentially objectionable content become less objectionable, because viewers are given fair notice.
    • Who defines what "nudity", "sex", "violence" and "profanity" mean? How can we ensure such definitions are NPOV?
    • What about pages that are potentially objectionable for other reasons (see the myriad reasons given several times before and repeatedly ignored)?
    • Why does sticking a note at the top saying "This page might be objectionable" make it any less so?
    • Viewers are already given fair notice through the general disclaimer, including Wikipedia is not censored, linked from every page.
  • 5a. A proposal with full details will be created in due course.
  • 5b. The system suggested, RSACi, is a well known one. It is based on the work of Dr. Donald F. Roberts, chairman of the Communications Department at Stanford University, who has studied the media for nearly 20 years
    • Aside from these two statements being contradictory, what does it matter that some bigwig in some university made this? That just tells me it's based on his POV. Media like Wikipedia hasn't been around for 20 years, and wikis aren't traditional media, so how does this help anybody?
  • 6. The RSACi is flexible, easy to understand, and easy to use. It's a very good system
    • The above is your POV. Mine is that any ratings system must have rigid boundaries to be consistent. This is incompatible with the flexibility that you claim.
    • It might be easy to understand, but that doesn't make it appropriate. The ease of use issue is irrelevant at this point.
  • 7. Remember, this isn't doing anything we aren't doing already. We already use warning labels for pages, but in a totally erratic manner.
    • Erm, yes it is doing something we are not doing already - we don't have a censorship and/or content warning system so setting one up is different by definition.
    • The only warning labels that are agreed and accepted by the whole community are:
    • So any others are different to what we are already doing.
  • 8. The current "argue over everything endlessly" setup we have now is absurd, inconsistent, illogical, and a total waste of time
    • The current setup is NPOV. This is an "absolute and non-negotiable" pillar of Wikipedia. Until anything proposed to replace it is NPOV it is not a total waste of time.
  • 9. it would theoretically allow those behind censorship to see the "neutral" information while the "objectionable" content is censored. (Instead of blanket censorship of the whole site, which I know has happened for political content.)
    • All well and good in theory, but this proposal would do absolutely nothing about political content. It was even said explicitly that we do not need to worry about all possible objections - so what is the point?
  • 10.Even if it counts as censorship, does it matter? It just gives people [an] extra option
  • 11. Labeling is not censoring....
    • Please see the several instances on this page where people have explained to you exactly how your proposals are censorship.
  • 12. Do you object to content labels on movies, too? There are a number of people (myself included) who think that the movie rating system can be improved upon, but I don't know of anyone who thinks that having no movie ratings at all is better then having some sort of rating system.
    • I do not object to ratings on movies (I object to the way some are applied, but that is a completely off topic argument). There is a fundamental difference - Wikipedia is an encyclopaedia. It is not a movie. This argument is a straw man.
  • 13. Everything on Wikipedia is user driven. This will be user driven too.
    • You say that it will be user driven, what we have currently is user driven - and you have derided this as a waste of time. Your
  • 14. No one has proposed a ratings board.... In any case, a ratings board is not the same thing as censors.
    • So how will the objectionable content be labelled if not by trusted users? If you let everybody have a say - i.e. what we have at the moment - then you will end up with the same "inconsistent waste of time" you believe we already have.
    • Please explain how a board of people deciding who can and cannot see something are not censors.
  • 15. A ratings board doesn't say "That is too offensive! Burn it! BURN! BURN!" A ratings board, in common usage, says "In our judgement, this item is not appropriate for people below the age of x"...
    • In other words, they censor it.
  • 16. or, in our proposed usage ... merely says "This item contains x, y, and z".
    • Just saying this item contains x, y and z isn't censorship, but you are missing the point. What you are proposing is to label content for the purposes of censorship. The point that I have been repeatedly making, and you have been repeatedly dismissing as trivial, is that any labels must be defined and NPOV. Things are not as simple as black and white - what exactly counts as nudity? what counts as profanity? etc.
  • 17. The people who want to censor things [d]o so based on their own devices anyway. Heck, some people have wanted to censor Harry Potter, and that's rated PG!
    • You have just explained why censorship is inherently POV and given a wonderful example of just that.
  • 18. There is plenty of evidence. We can dump a truckload of evidence on you if you wish.
    • Please do so.
  • 19. Print encyclopedias don't include pictures of auto fellatio, among other things.
    • So? Print encyclopaedias don't generally include pictures of Pokémon, or a million other non-controversial things we have pictures of.
  • 20. They generally stay away from highly sensitive content.
    • So? We are not a traditional encyclopaedia.
  • 21. If they didn't [stay away from highly sensitive content], there'd be community groups looking to rip pages out of the Britannica.
    • In various parts of the world there are. Britannica covers evolution, capitalism, Islam, Judaism, abortion and communism among other things. There are groups who feel these topics do not belong in an encyclopaedia.
  • 22a. Wikipedia boldly dares to cover any topic, no matter how raunchy or controversial.
  • 22b. Thus, we need a labeling system to have any hopes of even being considered with a fraction of the respect that Britannica has.
    • The second part of this statement does not follow (this is a non sequitur). We do, and IMHO we are right to, cover any topic, but this does not mean we need a content labeling system. The fact that we are not censored gives us a huge amount more respect than Britannica among certain groups, just as the opposite is true of other groups. The level of respect we have among the general population relates to this issue so minutely as to be irrelevant. The biggest things needed for respect of Wikipedia are accuracy and NPOV.
  • 23. There's no guarantee of offensive content being labeled
    • Please explain how this is different to what we have now?
    • If we have a system that Little Johnny's mother and father can use to make sure he doesn't see content they don't want him to (which is a goal of your proposal), we will get bitter complaints when Little Johnny finds the 5% of content you quote as escaping the filter.
    • The front page currently says the English Wikipedia has 548,560 articles. Let's say that 95% of these are either not objectionable or are labeled (521,132 articles - a mammoth task to achieve), which leaves 5% that aren't (27,428 articles). Of those 27,428 articles let's say that only 5% are objectionable in some way - 1,371 articles. It will only be a matter of time before Little Johnny or his parents find one of those articles, as they will be looked for.
    • Once one person finds one such article, then you can bet your bottom dollar they will tell their mates. As more and more people know about it, more and more people will try and find it or others. Also, pretty quickly the press will get hold of it - what was this respect you were talking about earlier?
  • 24. It's not an "entertainment based rating system". It's a content based labeling system.
    • ...Designed for the entertainment industry, not an encylcopaedia.

Thryduulf 09:37, 3 May 2005 (UTC)[reply]
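The back-of-envelope estimate in point 23 above can be checked mechanically. A few lines of Python (using the 548,560 article count quoted there, and integer arithmetic to avoid rounding surprises) reproduce the quoted figures:

```python
# Reproduce the back-of-envelope estimate from point 23 above.
total = 548560                        # article count quoted from the front page
unlabeled = total * 5 // 100          # 5% assumed to escape the labeling effort
objectionable = unlabeled * 5 // 100  # 5% of those assumed objectionable

print(total - unlabeled)   # articles handled by the scheme
print(unlabeled)           # articles escaping it
print(objectionable)       # objectionable articles escaping it
```

This confirms the 521,132 / 27,428 / 1,371 figures; the 5% assumptions themselves are of course the debatable part.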

Your reply indicates that you have no understanding of the topic, the proposal, the facts, or of what currently goes on Wikipedia. You have managed to systematically misunderstand every single statement you quoted. You simply have no concept of what censoring is. I'm sorry, but your objections are grounded in an illusion very different from reality. Come back when you understand the definition of censorship.
I'm happy to see any objections or concerns based on reality. But objections based on a total and fundamental misunderstanding of everything are just a waste of time.
- Pioneer-12 11:22, 9 May 2005 (UTC)[reply]
Please don't descend into personal attacks. While I don't consider your comments to be personal attacks, they could be interpreted as such.
I have explained several times how and why I consider these proposals to be censorship. Some of them are directly censorship and others appear designed to facilitate censorship by others. If you disagree with these, please explain why individually.
It might help understanding if I make it clear that some of my objections are based on my opposing censorship, and others are based on what I see as violating the principles of the Neutral Point of View. If you are trying to understand the latter in terms of censorship, then it is quite likely that they wouldn't make sense.
I have explained several times here, for each point, what my views on it are and why I hold those views. If I have misunderstood any of the proposals, please explain individually how I have misunderstood them, what you actually meant, and consider rephrasing them so that the potential for misunderstanding is reduced - something that benefits everybody.
My interpretations, objections and concerns are based on the facts as I know them, reality as I see it, the proposals as I understand them, and your comments as they appear to me. If you disagree with any of my interpretations, please explain why you feel they are wrong individually so that I can understand what your POV is.
A single broad statement saying "You are wrong, I am right, come back when you are prepared to agree with me" is not going to help your cause at all, because it comes across that you are not prepared to discuss things because you are insisting that your POV is right and that all others are wrong, regardless.
As this is at risk of degenerating into a pointless shouting match, I am going to put up an RfC about this proposal to try and get some fresh perspectives on the issue. Thryduulf 12:10, 9 May 2005 (UTC)[reply]
An RfC is a good idea. Thanks for encouraging more people to join the discussion.
As for your earlier statements, the problem is that almost everything you are saying is based on a flawed understanding/interpretation/extrapolation/whatever of censorship. For most of your statements, there's not much to say in response except repeating the definition of censorship, as written in dictionaries and encyclopedias, over and over and over.
As for neutral point of view, you raise a good point there; it can be argued that giving something an age label--for example "may be inappropriate for viewers under the age of 14"--is "a point of view". I think that's a reasonable objection for a system of that nature, but I also think it's a rather insignificant one--I don't look at the age labels on movies and scream "This label is a point of view! It is a violation of my beliefs to read this label. Oh, I must wash my eyes now for seeing such POV filth." On the contrary, I see age labels as a helpful descriptor of the movie's contents. I know age labels are given based on a well-defined system, and are accurate within the confines of the system.
However, since it is reasonable to object to age labels (I don't, but I can understand why others do), and since it is possible to create a system better than age labels, then I say we use a system better than age labels. Thus, content descriptors: nudity, sex, violence, and foul language. Now there is no POV objection, because the label is not making an assertion of the "age appropriateness" of something.... it is instead objectively describing the content, and viewers make their own viewing decisions based on the content descriptions.
- Pioneer-12 18:45, 9 May 2005 (UTC)[reply]

Why is this needed?[edit]

(new header added for editing convenience)

I've been persuaded that my comments were not useful, so I've removed them from this very large page. I shall allow this discussion to proceed uninterrupted. — Xiongtalk* 06:31, 2005 May 7 (UTC)

Let's start with point #3. Please cite some of this truckload of evidence. Note that I'm specifically asking about instances where wikipedia is currently blocked due to the inappropriateness of its content, NOT examples of inappropriate content. -- Rick Block 18:37, 1 May 2005 (UTC)[reply]

So if someone placed it on the Evolution, Communism, Islam and Arab-Israeli conflict articles because they didn't want kids to see NPOV articles about those topics, would that be fine? Thryduulf 09:37, 3 May 2005 (UTC)[reply]
Errrm I guess you can get 10 out of 10 for simplicity? Could well work too... -- Tomhab 23:15, 1 May 2005 (UTC)[reply]

What is the evidence of the problem?[edit]

I repeat, please cite the evidence that there is an actual problem. Has wikipedia been threatened with any legal action? Is wikipedia.org blocked by anyone? Let's pretend for a minute this is an article that's advancing the claim that some portion of wikipedia is so offensive to someone somewhere that legal action has been taken (or threatened) or that the site has been blocked. Such claims would need references. Please cite your sources. If there is no problem, no solution of any sort is necessary. If there is a problem, without knowing the specifics I see no way to decide whether any proposed solution is adequate. -- Rick Block 01:14, 2 May 2005 (UTC)[reply]

Evidence of the problem: Wikipedia is an incredibly powerful educational tool, but use of this tool is reduced if school teachers and parents don't want to advocate its use to kids. School teachers and parents don't want to advocate its use to kids if they think kids might be violated by using it. Does this happen? I think this talk page is evidence that it does. Additional evidence can be found here: http://larrysanger.org/2012/05/what-should-we-do-about-wikipedias-porn-problem/. — Preceding unsigned comment added by 24.181.15.176 (talk) 08:39, 31 August 2015 (UTC)[reply]
Yikes, you don't give up, do you, straw man? Just because there isn't a lawsuit against Wikipedia, or it being blocked, doesn't mean there is no problem. This could potentially solve every image problem I've seen, as they have all been related to "appropriateness": Autofellatio, Clitoris etc. I'm sure people wouldn't insist on them being removed if they were 'clapping' and 'fingernails'. -- Tomhab 02:07, 2 May 2005 (UTC)[reply]
Well, no, I don't give up. Being clear about the problem we're solving seems like the obvious first step in solving it. If the problem isn't actual or threatened law suits and isn't that wikipedia is being blocked, what is it exactly? -- Rick Block 03:30, 2 May 2005 (UTC)[reply]
Erring on the safe side if nothing else? I'm more interested in this being a way of resolving various image disputes going around. Let the prudes be prudish and the hardier Wikipedians be tougher-skinned. I see nothing lost by some sort of content labeling system, but potentially an aid to disputes. -- Tomhab 04:02, 2 May 2005 (UTC)[reply]
Some sort, but what sort and why? Disputes about specific images within wikipedia sound to me like an issue that should be handled by wikipedia policy. Is the autofellatio jpg appropriate for inclusion in wikipedia? If not, let's get rid of it in accordance with some established policy. If it is appropriate for inclusion in wikipedia according to wikipedia's policies and there's some consequent problem (like we end up being subject to law suits or content filter blocking), exactly what is the issue? If we don't know what the problem actually is, it's my belief we can't know how to solve it. -- Rick Block 04:48, 2 May 2005 (UTC)[reply]
Let me guess.... you've been married at least once? :) -- Tomhab 13:12, 2 May 2005 (UTC)[reply]
Yes, let's get rid of the images of penises in mouths on wikipedia. Particularly when a child doing their homework, using wikipedia as a resource, with a parent nearby, remembers that they heard the word "blowjob" at school today and, not knowing the meaning, type the word into the useful search box at the top of the window. Imagine not being able to move your hand to the mouse quickly enough to block your 8yo from seeing that.

What exactly is the goal?[edit]

In this long, drawn-out discussion, really three types of proposals have been discussed, each based on one of three separate ideas of what the goal really is.

1) We need a system to help parents who don't want their kids to see whatever. My vote: no. I go back to the argument about how we decide what some parent can consider objectionable. Maybe I'm misreading what he said, but to me it looks like Xiong says that anything that some parents could consider objectionable should be marked as "NO KIDS." So again, where do we draw the line? What you think is objectionable and what I think is objectionable may not be the same as what someone else thinks is objectionable. There's a lot of people out there who don't want their kids reading about Islam. Should we block that?

If you're in the entertainment industry and only care about making money, it's acceptable to overgeneralize and restrict too much. But Wikipedia is not a business, and from a fundraising point of view, I don't think Middle American "soccer moms," as Xiong called them, are going to be the project's main contributors.

Xiong thinks that a simple "NO KIDS" logo would be perfect because it would satisfy the "peasants with pitchforks" but not stop the kids from looking at the page. I disagree. Perhaps I am of a minority opinion, but I think a "NO KIDS" logo is offensive, especially if by "kids" you mean adolescents too.

As far as the legal liability, well, I'm no lawyer, but I was under the impression that the courts have ruled several times that websites cannot get into trouble for putting naughty stuff on the net. Libel and copyright violations are far, far bigger risks from a legal point of view.

2) We need a system to prevent Wikipedia from being filtered. My vote: no. This reminds me of the situation with the article on the Adscam blog disclosure (now redirected to sponsorship scandal), when some guy erased information that was subject to a Canadian publication ban. It is not Wikipedia's job to abide by the expression-restricting laws of every country on the planet, even those of the United States. Perhaps one day, the content filters mandated for schools and libraries will block out Wikipedia. That would be a shame. But I think that if that were to happen, it would make the censorware programs and policies look like the bad guys, not Wikipedia.

I am a researcher for Google Answers, which has a policy of avoiding sexual matters, lest it get blocked from schools and workplaces. That's OK; they're a business. But in Wikipedia's case, participating in such schemes runs against what I would think would be the encyclopedia's mission.

3) We need a system to prevent people from accidentally accessing content that could get them in trouble. My vote: maybe. If we can do it right, this would be helpful. Perhaps people should be encouraged to put "Warning: contains nudity" next to links that go to such sites. But, of course, making that a policy carries logistical problems -- must I watch every page I link to in case someone puts up a nude picture? And none of the proposed fixes anyone's mentioned would prevent people from accessing the auto-fellatio picture if some vandal has posted it on VfD or somewhere. Mwalcoff 09:25, 2 May 2005 (UTC)[reply]

You keep saying, "we need, we need, we need", but you have yet to prove that we need any such thing. RickK 00:17, May 3, 2005 (UTC)

No, I was just trying to summarize what other people have said. I think this discussion is kind of pointless until we decide what we are trying to do, exactly. Mwalcoff 11:24, 3 May 2005 (UTC)[reply]
I expect a lot of people who care about the aesthetics of wikipedia articles are going to object to Image:nokids.png, particularly above the fold. The problem with tagging content is that doing so is quintessentially POV, but I do agree that some kind of realpolitik solution is probably necessary (particularly if it can be made to provide additional functionality).
Such a solution might be to create filters, which would consist of lists of articles, and add functionality to treat articles in those filters as special cases. (The page might not resolve, or a specific view of the page might be displayed, with each such view a variant version — requiring more non-trivial infrastructure change, unfortunately — of the default/consensus/official article. If views are clever enough, they could be used to implement image-tagging, ratings displays, black marker-style redaction or even "technical", "kiddie safe" or "quality assured" variants.) Filters could be stored as articles (perhaps in User trees) and employ a class-like system of inheritance, and the implementation of views might be coupled with the ability to host non-canonical articles offsite to save wikipedia storage space and perhaps even bandwidth.
(Disclosure: I'd also like to use some of this functionality to implement collaborative fiction writing/world-building — which has an analogous need for both an anti-spoiler system for readers and a plot forking system to avoid discouraging creativity — over at wikifiction.) — JEREMY 09:24, 3 May 2005 (UTC)[reply]
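The filter-with-inheritance idea sketched above is underspecified; one possible reading, with all names invented for illustration, is that a filter holds a set of article titles plus parent filters and resolves membership recursively:

```python
# Illustrative sketch of "filters as lists of articles with class-like
# inheritance". All names here are invented, not part of the proposal.

class Filter:
    def __init__(self, name, articles=(), parents=()):
        self.name = name
        self.articles = set(articles)   # titles this filter lists directly
        self.parents = list(parents)    # filters it inherits from

    def covers(self, title):
        """True if this filter, or any filter it inherits from, lists the article."""
        return title in self.articles or any(p.covers(title) for p in self.parents)

nudity = Filter("nudity", articles={"Autofellatio"})
gore = Filter("gore", articles={"Necrosis"})
# A hypothetical "kid safe" filter inherits both lists without copying them:
kid_safe = Filter("kid_safe", parents=[nudity, gore])

print(kid_safe.covers("Autofellatio"))  # True
print(kid_safe.covers("Zoo"))           # False
```

Class-like inheritance here just means a child filter covers everything its parents cover, so a composite filter could be assembled from smaller topical filters without duplicating their lists.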

The never-ever-remove-the-image-on-pain-of-death rule will just lead to the image being placed on every article. You might want it on Oral sex, someone else might want it on Islam, a third person might want it on Evolution, and a fourth person might want it on Billion. You might want to revert the latter ones as you consider them vandalism. Now either you don't revert the addition, and the template eventually appears on every article (just think how easy it would be for someone to write a bot to place it on every page it wasn't on already) - in which case what is the point of it? You might as well give a single rating to the entire site, as proposed below, avoiding a large load on the image servers while achieving the same effect (which, incidentally, is exactly what we have at the moment when we say Wikipedia is not censored). Or, if you choose to revert, who decides which articles it is considered vandalism on and which it isn't - and how do you make such a decision NPOV? Thryduulf 18:28, 3 May 2005 (UTC)[reply]

Discussion about this exists in the wikipedia mailing list archives

This issue has been discussed on the wikipedia mailing list; see for example various threads from February 2005. Reading these threads, I believe the wikimedia board considers this to be an issue that should be resolved by the en.wikipedia.org community, i.e. the board will not impose a policy. I take this to mean that there is no significant danger of any legal action relating to this issue. Taking the potential for lawsuits out of the picture, I think we're left with issues around appropriateness for certain audiences and the potential for being blocked by nanny filters. I'm not sure either of these is terribly worth worrying about. In particular, rather than bother with Xiong's png-per-page idea, I'd rather simply label the whole site with an R-rating. -- Rick Block 14:28, 3 May 2005 (UTC)[reply]

What might be interesting would be to get a feeling for the wider wikicommunity's view on this by some sort of vote. Is that possible without making the judgment binding? I'd be curious whether (for example) 80% of wikipedians feel they would rather risk being nanny-blocked than accept some sort of content labeling and partial censorship. That would solve the debate pretty quickly. -- Tomhab 14:42, 3 May 2005 (UTC)[reply]
The wording of the question would need to be NPOV - compare "Do you want Wikipedia to implement a content-labeling system? This would prevent it being blocked by net nanny software" and "Should Wikipedia content be subjected to a POV content labeling system designed to allow censorship of the encyclopedia? This would be an attempt, that may or may not work, to stop legal attacks, that may or may not be brought".
The following facts should also be presented:
  • No content labelling system has yet been proposed that is agreed to be NPOV
  • The only evidence so far presented of Wikipedia being blocked or censored was by the People's Republic of China, over political issues.
  • No evidence has so far been presented of Wikipedia being blocked by Net-nanny software.
  • Part of the Wikipedia:General disclaimer linked at the bottom of every page is that Wikipedia is not censored.
Thryduulf 19:04, 3 May 2005 (UTC)[reply]
  • The reason the templates are "suppressed" is because the consensus of the community is that they are not wanted.
  • I agree completely that the film ratings are very POV, but my point is that all ratings systems are POV.
  • Precedent has shown that when large numbers of wikipedia community members become involved in a debate of this nature there is either no consensus or a consensus not to censor. Your "peasants-with-pitchforks" argument seems not to take this into account.
  • As long as people comply with the GFDL, the content here can be used any way anybody wants (heck, I even dual-license all my contributions into the public domain!). I don't understand the relevance of that to this debate, though.
  • Having the nokids image on every page is effectively the same as having a disclaimer on every page saying Wikipedia is not censored, which we already do. So the image is redundant.
  • Neutrally worded questions are possible, e.g. "Should the United Kingdom join the Single European Currency?" (contrast "Do you agree that the United Kingdom should benefit its international competitiveness by joining the Single European Currency?" and "Should the United Kingdom hand over control of its monetary policy to foreigners by joining the Single European Currency?").
  • Your proposal is not "as neutral as possible", because it is based on the POV that one person's view that any particular article is unsuitable for children is more important than another person's view that it is suitable for children. Promoting one POV over another is incompatible with the "absolute and non-negotiable" NPOV backbone of Wikipedia. (That the NPOV should be absolute and non-negotiable is itself a POV. But as it is the POV of Jimbo, that is WP policy. If you disagree with this you are free to fork the content yourself, or join any of the existing WP forks that do not have an NPOV policy, provided you don't break the GFDL.)
  • A lawsuit might be a risk, but it is the job of the board to determine that risk, and as I understand it their assessment is that it basically isn't one. The Wikimedia Foundation doesn't actually have a lot of money, hence the periodic fundraising drives. It's also a registered non-profit charitable organisation (if my understanding of its 501(c)(3) status is correct), which probably restricts what it can use the money it does have for. In short, there is probably not much point going after us with all guns blazing, particularly as they are not certain to win; we have explicit disclaimers linked from every page for precisely this reason. If, on the other hand, we had warnings on only some articles, this could leave us open to attack if there wasn't one on a particular page.
  • The image might not take up a lot of resources, but it is completely redundant (unlike the other things mentioned). If we kept every single thing that was redundant it would soon add up to something significant.
  • Why should we work with you to develop something that we don't want? While precedent suggests the community won't want it, we don't actually know in this case because they haven't been asked. Either you go with what those commenting are saying and what precedent shows, or you ask people. You can't complain about precedent not necessarily reflecting the current situation and refuse to find out what the current situation is. Thryduulf 15:01, 6 May 2005 (UTC)[reply]


Outreach or Alienation?

The question on a simple method of end-user filtering is a simple one for me and I support the proposal that Demi made on Meta. I've seen a whole ton of straw man arguments used against those who are supporting this so let me put it out there again because I've heard all of us say the same thing.

We don't agree with censorship, and putting a simple label on something does not qualify as such any more than the title of this page does. Using a template-driven system is discreet, is free of intrusive disclaimers or other labels, and should be simple for the end-user to use (a checkbox in their preferences pane). No additional work is required other than the normal image tag cleanup that is already done. I hate censorship; I remember joining in with friends saying "fuck Tipper Gore, fuck the PMRC", and I feel that way to this day. This isn't about censorship.

There is such a thing as too much of a good thing, and that even goes for freedom. I will quote the humorist P.J. O'Rourke, who said:

There is only one basic human right, the right to do as you damn well please. And with it comes the only basic human duty, the duty to take the consequences.

Now, it's time for you to ask yourself what Wikipedia should be about. I thought it was to be the sum total of human knowledge online and for free, I love that idea. However, I don't want to send my kid to Wikipedia to do a school project on "auto repair" or the like and get a search hit for a picture of some guy blowing himself.

This is a simple issue because trust me, I would gore each and every one of you to protect my kid if it came to that so that's the way I look at my job. If I can't find a way to prevent him from seeing stuff he's not ready for then I'll block the site altogether. Personally, I think that would be a shame because some people are far too interested in proving an overzealous point than in taking the consequences - denying a kid an otherwise great resource.

All we are asking for is a discreet and relatively easy way for the end-user to filter out content he/she does not wish to see. Think of it as a "skin" if you like; in this case, though, maybe an "un-skin" would be more appropriate. So please, lay the straw man arguments down, the borderline photograph postings, the pages of treatises on the evils of censorship. That's not what this is about. --Wgfinley 05:31, 7 May 2005 (UTC)[reply]

A policy allowing the insertion of anything as subjective as ratings into articles would set a self-destructive precedent (even at its most minimalist: by overloading the category system). If you want to filter wikipedia's content you should keep track of any articles (as a list external to those articles) you feel the need not to display, you should build the filtering schema, and you should negotiate its implementation. (Note that I'm using the second-person plural "you" here.) Just don't go vandalising "the sum total of human knowledge" with your POV nannycruft, or demanding others do it for you. — JEREMY 07:57, 7 May 2005 (UTC)[reply]
Obviously you didn't read her proposal; these aren't ratings, just three simple categories that aren't going to overload anything. Nannycruft? Demanding someone do something? Straw man argument #347, thank you. Some folks are making a request for a technical solution to a problem; I haven't seen anyone demand it. --Wgfinley 14:35, 7 May 2005 (UTC)[reply]
I like Demi's proposal as well. My main concern with it is that people have different thresholds for the levels of sex, nudity and violence they find acceptable. That's why I like an RSACi-style system: very flexible while being extremely simple... and also less vulnerable to straw man attacks. What do you think of RSACi? RSACi description from Netscape. Cool. I do think Demi's proposal would be much better than the current state of things, so I support her proposal as well.
- Pioneer-12 12:01, 9 May 2005 (UTC)[reply]

Well intended, but bad idea

I came here because it was on RfC... I think the entire idea of 'content labeling' (which in itself is a somewhat confusing euphemism) is well-intended but not a good idea. From the discussion above, it seems that many people oppose it on principle, and those that do not have yet to show a feasible way of implementing it. What I would suggest is that proponents of the system make a concrete proposal. Radiant_* 08:17, May 10, 2005 (UTC)


Case-by-case warnings

I agree that an RSACi-like system is much better than an age-based system. However, even instituting an RSACi-style system implies categorizing things into levels of offensiveness or inappropriateness or whatever.

If the goal of "content labeling" is to prevent people from seeing something they don't want to see, my thought is that the easiest way to do this would be through case-by-case warning messages on the top of articles, such as the following:

"Note: The following article contains an image of _________."

The blank space could be "a topless woman," "a napalm victim in graphic detail," "Art Modell lifting the Lombardi Trophy," or anything.

I wouldn't even use the word "warning," since that implies that there is something wrong with the image. Perhaps the notice could also include a hyperlink to a text-only version of the article. Obviously, the image would have to be put "below the fold" for this to do any good.

Of course, then we get into the question of when such a notice should be put on the top of an article. I suppose that we could just say that if you think someone might want to be alerted to the content, you could go ahead and put the note on the top of the page.

Such notes could even be used for potentially disturbing text. I used to work for a newspaper that would preface stories about child rape and other disgusting stuff with an "editor's note" that the following article may be disturbing.

I agree that this is not perfect and still does not solve the problem of impartiality when it comes to what needs to be pointed out. But if some people are going to fight until they get some kind of labeling system, I believe this is the least-intrusive one possible. Mwalcoff 20:14, 10 May 2005 (UTC)[reply]

I completely disagree with any system that tries to rate offensiveness. This is inherently POV and will never work. You just need to tag the images with what they are, and allow the user to block whatever they want. I, for instance, might block Category:Images of surgery, or block Category:Images containing nudity at work, but not at home.
Crap. I am going along saying it would be a per-user setting when what I really want is a per-browser setting. Is there an HTML tag for this kind of thing? Then the software would add the HTML content="nudity" to every article in the nudity category, and then individual browsers could be set to block the content with no other mediawiki software changes.
Crap. That sounds like it wouldn't work well, either, since you need a browser that supports it. So there is also the per-user or per-browser aspect to consider... - Omegatron 20:34, May 10, 2005 (UTC)
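For what it's worth, the per-browser idea resembles the W3C's PICS content-labelling system, which some browsers of that era partially supported. Here is a minimal sketch of the client side, assuming a hypothetical meta tag (the `content-labels` name and its comma-separated format are invented for illustration; the server side would presumably emit the tag automatically from the article's categories):

```python
from html.parser import HTMLParser

class LabelScanner(HTMLParser):
    """Collects labels from hypothetical <meta name="content-labels"> tags."""
    def __init__(self):
        super().__init__()
        self.labels = set()

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and a.get("name") == "content-labels":
            self.labels.update(a.get("content", "").split(","))

def should_block(html, blocked_labels):
    # Block the page if any of its labels is on the user's block list.
    scanner = LabelScanner()
    scanner.feed(html)
    return bool(scanner.labels & set(blocked_labels))

page = '<html><head><meta name="content-labels" content="nudity,surgery"></head></html>'
```

With this scheme the block list lives entirely in the reader's browser or proxy, so a user could block "surgery" at work and nothing at home without any change to the wiki software beyond emitting the tag.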

One possible alternative approach

I found my way to this discussion after visiting Wikipedia for the first time recently and happening -- through one interesting link after another -- to find myself on the capital punishment page. Expecting to find a sober presentation of capital punishment statistics and views on both sides (which I did in part), I also found there a list of methods that have been used to execute people over the course of history and -- quite unexpectedly -- this list disturbed me a lot psychologically.

Afterwards, I posted a comment in the discussion section of the capital punishment page about the possible need to warn readers -- including but not exclusively children -- about its content, and I received one emphatic response to the effect that anyone visiting the page should "deal with it."

The person who wrote that response probably imagined that they and I have less in common with each other than we really do. In particular, I oppose censoring the content of Wikipedia in any way other than within the general guidelines and policies that are established by its creators, and, more specifically, I am opposed to limiting the content of Wikipedia, however we might decide to limit the forms in which we present that content.

Having said that, I do support in principle the idea of embedding warning tools for parents and others with reason to restrict the viewing of their children or themselves. But that is not what I am proposing here. Many other folks on both sides already have discussed that topic.

My suggestion here is to handle certain types of "difficult" content through an alternative that doesn't involve either censoring or labeling. For lack of a better name, I call the alternative "helping."

Helping would recognize two ideas, one of which (I think) is mostly uncontroversial, and one of which may be more controversial to some people at first. The two ideas are that:

(1) Every viewer approaches "content" from a different perspective, and with different abilities to understand (i.e., "deal" with) it. This is no less true of Wikipedia content about, say, atmospheric physics and Shakespearean sonnets (both of which some people find tough!) than it is of articles about sex or torture. It's the reason why so many of Wikipedia's best articles do a great job of introducing difficult concepts with a generous amount of background and patience.

(2) The ability of people to understand or deal with information is not restricted to their intellectual ability. It also involves their emotional or psychological ability. Simply stated, some people are emotionally ready to contemplate certain disturbing facts and issues, and others aren't. Obviously, many (not all) children are -- on average -- less able to deal emotionally with difficult subject matter than adults, but there are also (many) adults who have trouble where other adults would not.

Some of the issues addressed in Wikipedia can emotionally or psychologically jar and even harm certain unprepared visitors -- and, in a sense, this is appropriate. Wikipedia is an encyclopedia that aspires to treat every issue in the world, and many real world issues are psychologically jarring. We shouldn't shirk them. But that doesn't mean we shouldn't think about how we present them.

Many parents present issues to their kids differently depending on their kids' levels of development. It doesn't matter whether the issues at hand are "controversial" (e.g., sex or violence) or mundane (like "why do I have to go to bed now"). Nearly every parent does it.

I for one happen to think that ALL topics should be available to discuss with ALL kids: sex, war, divorce, bedtimes. But I also think that it can be damaging to discuss these topics with an 11-year-old, say, in the same way that I discuss them with another adult, so I don't.

My proposal therefore is that Wikipedia consider implementing a mechanism -- at least on Wikipedia pages that are likely to be emotionally difficult for some readers -- directing "emotionally unprepared" readers to resources that can help them "deal with" the content. To me this is no different than helping "intellectually unprepared" readers by pointing them to background information and assistance.

Taking the example of the Wikipedia page that raised this issue in my mind, the one on capital punishment, I would suggest having more than one version of this page, for people who are on different levels of understanding, along with links to resources where readers can get help emotionally dealing with the difficult information they are learning about.

It would be a great exercise for many a junior high school Social Studies class to do a project on capital punishment, which could involve looking at Wikipedia. But I think a lot of junior high school kids could be psychologically hurt by what they find there now, at least if they aren't offered some decent way to deal with the emotions that that page engenders. Wikipedia should welcome those kids just as much as it welcomes the most emotionally together adults, but currently it doesn't.

Even many adults would shun the capital punishment page as I viewed it, for fear of happening upon something "gross" or upsetting. And I think that's a shame. All of us have a responsibility as citizens to know as much as possible about the topic of capital punishment in forming our own opinions and ultimately policy about the subject. The challenge is to HELP people get to the information in a way that is as appropriate as possible to their intellectual and emotional abilities. Pspeck 23:32, 14 May 2005 (UTC)[reply]

Hear, hear. The issue is indeed one of adding value to the content, not reducing it. The "more than one version of this page" idea is definitely a step in the right direction — even if VfD politics meant the alternate page(s) had to be hosted offsite (at a dedicated wikicity, for example). However, I believe it's the strong resistance to the idea of adding POV tags to the (raw, original) content itself that lies at the core of the problem with moving ahead on a schema to value-add labelling, versioning and filtering options.
It seems to me that the onus is on those of us who see an advantage in this course of action to develop a technical implementation which does not require that the raw, original wikipedia articles are tagged or categorised "in-line", and which provides this added functionality to wikipedia readers on an opt-in, customisable basis. Lack of full integration (ie. anything involving developer time) needn't be an impediment initially, and might only come about if the utility of the schema can be demonstrated. — JEREMY 05:45, 17 May 2005 (UTC)[reply]
I absolutely disagree. Since when could junior high school students be hurt by a list of methods of capital punishment? You are being massively naive and nannyish. Middle school students (and children in general) are far smarter and more considerate than you give them credit for being. This is an encyclopedia, not a self-help book. --FCYTravis 04:16, 29 May 2005 (UTC)[reply]
I beg to differ. I was sixteen when I saw A Clockwork Orange at school and I can't begin to elaborate on how disturbed I was by the end of it. Not all people are of the mental stability that you believe them to be. You Can't See Me! 02:29, 21 May 2007 (UTC)[reply]
I object to any content labelling system on the grounds that implementing it properly is a huge amount of work and a distraction from Wikipedia's real objective. Rating systems for content and actual content producing systems are different things, and Wikipedia is of the latter kind. Making a half-hearted attempt at the former is not in my opinion useful, but would be a distraction from creating an encyclopedia and produce unsatisfactory results anyway. Starting a separate project concentrating solely on rating things is a much better idea. (I'm not particularly fond of the existing category system either for similar reasons, and yes, I think the categories are mostly useless and there could be a much better system if implemented separately.)
I also object to such a system affecting Wikipedia itself in any visible way. I would consider visible ratings on the pages to be advertising of censorship ideology and thus very POV. Any debate of such ratings on the talk pages would clutter them up with endless controversial discussions unrelated to the actual content of the articles, and we have enough of that as it is.
Given the visibility constraints, the only possible option is putting the ratings in invisible HTML tags (e.g. comments), which could then be used by filtering software. The ratings could be automatically submitted by an external, trusted authority and automatically accepted by Wikipedia software. The "authority" could be e.g. WikiRatings Project or something, which could just be ignored by those Wikipedia users who are not interested in it. Instead of making any further proposals like the one in this article, successfully starting such a project would be much more productive. (I'm noting my opinion here largely because this article is still often referred to in discussions and the discussion above misses some of the relevant objections.) Coffee2theorems 03:27, 6 September 2006 (UTC)[reply]
This entire discussion would make more sense if instead of thinking in terms of labeling the content of articles, we think in terms of recording the various emotional reactions of people who view articles. Nobody can define exactly what is "obscene" or "offensive" because those are not objective attributes. Instead they are words various observers attach to things which cause them to experience unpleasant emotions. Those emotions are objective neurological phenomena (how something makes you feel is how it makes you feel, regardless of whether someone else views your emotional reaction as culturally appropriate, and in most cases people have very little conscious control over their emotions). Emotional reactions can in principle be measured (via polygraph, PET scan, etc.). In practice, emotional reactions can be reliably self-reported, since it is easy for most people to indicate whether they find something disturbing, and to what degree.
Imagine a system whereby every viewer of every article has the option to rate the article according to his or her perception of its offensiveness, suitability for children, etc. The system keeps track of the ratings for each article, and all the ratings by each individual. Over time, the system would accumulate a vast number of article rankings, and it would be straightforward to identify users whose rankings are similar. Suppose two users, A and B, have each rated the same 100 articles very similarly, and A has rated another 100 articles that B has not yet viewed. Because their ratings of articles they have both viewed are similar, they probably have similar emotional responses to things. Thus A's rankings of the 100 articles that B has not yet viewed might predict how B would react to those articles upon viewing them.
Generalizing to a large number of users, it should be possible to design a system that effectively groups people by similarity of taste, and generates advisories perhaps like the following: "90% of people who agree with 90% of your common article rankings consider this article offensive. Do you wish to view this article? (Yes) (No)" A user could then set in his or her preferences whether such advisories should appear before displaying potentially offensive articles.
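The grouping-by-taste scheme described in the two paragraphs above is essentially what is now called collaborative filtering. A toy sketch, with ratings encoded as 1 for "offensive" and 0 for "fine" (the user names, the dict-of-ratings representation, and the 90% threshold are all illustrative choices, not part of the original proposal):

```python
def similarity(a, b):
    """Fraction of commonly rated articles on which two users agree."""
    common = set(a) & set(b)
    if not common:
        return 0.0
    return sum(a[x] == b[x] for x in common) / len(common)

def advisory(target, article, all_users, threshold=0.9):
    """Share of sufficiently similar users who rated `article` offensive.

    Returns None when no similar user has rated the article yet.
    """
    peers = [u for u in all_users
             if u is not target
             and article in u
             and similarity(target, u) >= threshold]
    if not peers:
        return None
    return sum(u[article] for u in peers) / len(peers)

# Users are just {article: rating} dicts; A and B agree on everything
# they have both rated, while C disagrees with A across the board.
user_a = {"Capital punishment": 1, "Billion": 0}
user_b = {"Capital punishment": 1, "Billion": 0, "Autopsy": 1}
user_c = {"Capital punishment": 0, "Billion": 1, "Autopsy": 0}
```

Here `advisory(user_a, "Autopsy", [user_a, user_b, user_c])` comes out as 1.0: every user similar to A who has rated that article found it offensive, so A would be shown the "Do you wish to view this article?" prompt before the page is displayed.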
The system would have to account for constantly changing articles, perhaps by tracking the date of each article ranking, and applying an aging factor to them based on some measure of how much an article has changed.
Note the similarity with democracy. In a democratic political system, people with some similarity in opinion form political parties. When a new candidate earns his or her party's endorsement, party members can be fairly certain that candidate shares their interests, or at least is more likely to share their interests than an opposing party's candidate. A party member who doesn't have time to study the candidate in detail trusts the collective judgement of fellow party members. Classifying candidates according to subjective labels such as liberal or conservative is difficult and debatable; however, election results have the potential to be fully objective (in terms of merely counting the votes).
Also note that the article rating system I am describing could be built outside of Wikipedia, and the rating system could encompass all Web pages. It would only be necessary for Web browsers to provide users the option of hooking into it, so they could rate any pages they view, and pool their ratings with other users. This is a generalization of similar systems that allow multiple users to share their classifications of E-mail spam. --Teratornis 15:46, 6 June 2007 (UTC)[reply]