The Evolutionary Perspective
Posted: October 23, 2016 at 4:23 am
Eugenics (from Greek eugenes, “well-born”, from eu, “good, well”, and genos, “race, stock, kin”) is a set of beliefs and practices that aims at improving the genetic quality of the human population. It is a social philosophy advocating the improvement of human genetic traits through the promotion of higher rates of sexual reproduction for people with desired traits (positive eugenics), or reduced rates of sexual reproduction and sterilization of people with less-desired or undesired traits (negative eugenics), or both. Alternatively, gene selection rather than “people selection” has recently been made possible through advances in genome editing (e.g. CRISPR). The exact definition of eugenics has been a matter of debate since the term was coined. The definition of it as a “social philosophy” (that is, a philosophy with implications for social order) is not universally accepted, and was taken from Frederick Osborn’s 1937 journal article “Development of a Eugenic Philosophy”.
While eugenic principles have been practiced as far back as ancient Greece, the modern history of eugenics began in the early 20th century, when a popular eugenics movement emerged in the United Kingdom and spread to many countries, including the United States, Canada, and most European countries. In this period, eugenic ideas were espoused across the political spectrum. Consequently, many countries adopted eugenic policies meant to improve the genetic stock of their countries. Such programs often included both “positive” measures, such as encouraging individuals deemed particularly “fit” to reproduce, and “negative” measures, such as marriage prohibitions and forced sterilization of people deemed unfit for reproduction. People deemed unfit to reproduce often included people with mental or physical disabilities, people who scored in the low ranges of different IQ tests, criminals and deviants, and members of disfavored minority groups. The eugenics movement became negatively associated with Nazi Germany and the Holocaust when many of the defendants at the Nuremberg trials attempted to justify their human rights abuses by claiming there was little difference between the Nazi eugenics programs and the US eugenics programs. In the decades following World War II, with the institution of human rights, many countries gradually abandoned eugenics policies, although some Western countries, among them the United States, continued to carry out forced sterilizations.
Since the 1980s and 1990s, when new assisted reproductive technology procedures became available, such as gestational surrogacy (available since 1985), preimplantation genetic diagnosis (available since 1989), and cytoplasmic transfer (first performed in 1996), fears about a possible future revival of eugenics and a widening of the gap between the rich and the poor have emerged.
A major criticism of eugenics policies is that, regardless of whether “negative” or “positive” policies are used, they are vulnerable to abuse because the criteria of selection are determined by whichever group is in political power. Furthermore, negative eugenics in particular is considered by many to be a violation of basic human rights, which include the right to reproduction. Another criticism is that eugenic policies eventually lead to a loss of genetic diversity, resulting in inbreeding depression due to low genetic variation.
The idea of positive eugenics to produce better human beings has existed at least since Plato suggested selective mating to produce a guardian class. The idea of negative eugenics to decrease the birth of inferior human beings has existed at least since William Goodell (1829-1894) advocated the castration and spaying of the insane.
However, the term “eugenics” to describe a modern project of improving the human population through breeding was originally developed by Francis Galton. Galton had read his half-cousin Charles Darwin’s theory of evolution, which sought to explain the development of plant and animal species, and desired to apply it to humans. Based on his biographical studies, Galton believed that desirable human qualities were hereditary traits, though Darwin strongly disagreed with this elaboration of his theory. In 1883, one year after Darwin’s death, Galton gave his research a name: eugenics. Throughout its recent history, eugenics has remained controversial.
Eugenics became an academic discipline at many colleges and universities, and received funding from many sources. Organisations formed to win public support and sway opinion towards responsible eugenic values in parenthood, including the British Eugenics Education Society of 1907, and the American Eugenics Society of 1921. Both sought support from leading clergymen, and modified their message to meet religious ideals. In 1909 the Anglican clergymen William Inge and James Peile both wrote for the British Eugenics Education Society. Inge was an invited speaker at the 1921 International Eugenics Conference, which was also endorsed by the Roman Catholic Archbishop of New York Patrick Joseph Hayes.
Three International Eugenics Conferences presented a global venue for eugenists, with meetings in 1912 in London and in 1921 and 1932 in New York City. Eugenic policies were first implemented in the early 1900s in the United States. They also took root in France, Germany, and Great Britain. Later, in the 1920s and 1930s, the eugenic policy of sterilizing certain mental patients was implemented in other countries, including Belgium, Brazil, Canada, Japan, and Sweden.
In addition to being practiced in a number of countries, eugenics was internationally organized through the International Federation of Eugenics Organizations. Its scientific aspects were carried on through research bodies such as the Kaiser Wilhelm Institute of Anthropology, Human Heredity, and Eugenics, the Cold Spring Harbor Carnegie Institution for Experimental Evolution, and the Eugenics Record Office. Politically, the movement advocated measures such as sterilization laws. In its moral dimension, eugenics rejected the doctrine that all human beings are born equal, and redefined moral worth purely in terms of genetic fitness. Its racist elements included pursuit of a pure “Nordic race” or “Aryan” genetic pool and the eventual elimination of “less fit” races.
Early critics of the philosophy of eugenics included the American sociologist Lester Frank Ward, the English writer G. K. Chesterton, the German-American anthropologist Franz Boas, and Scottish tuberculosis pioneer and author Halliday Sutherland. Ward’s 1913 article “Eugenics, Euthenics, and Eudemics”, Chesterton’s 1917 book Eugenics and Other Evils, and Boas’ 1916 article “Eugenics” (published in The Scientific Monthly) were all harshly critical of the rapidly growing movement. Sutherland identified eugenists as a major obstacle to the eradication and cure of tuberculosis in his 1917 address “Consumption: Its Cause and Cure”, and criticism of eugenists and Neo-Malthusians in his 1921 book Birth Control led to a writ for libel from the eugenist Marie Stopes. Several biologists were also antagonistic to the eugenics movement, including Lancelot Hogben. Other biologists such as J. B. S. Haldane and R. A. Fisher expressed skepticism that sterilization of “defectives” would lead to the disappearance of undesirable genetic traits.
Among institutions, the Catholic Church was an opponent of state-enforced sterilizations. Attempts by the Eugenics Education Society to persuade the British government to legalise voluntary sterilisation were opposed by Catholics and by the Labour Party. The American Eugenics Society initially gained some Catholic supporters, but Catholic support declined following the 1930 papal encyclical Casti connubii. In this, Pope Pius XI explicitly condemned sterilization laws: “Public magistrates have no direct power over the bodies of their subjects; therefore, where no crime has taken place and there is no cause present for grave punishment, they can never directly harm, or tamper with the integrity of the body, either for the reasons of eugenics or for any other reason.”
As a social movement, eugenics reached its greatest popularity in the early decades of the 20th century, when it was practiced around the world and promoted by governments, institutions, and influential individuals. Many countries enacted various eugenics policies, including: genetic screening, birth control, promoting differential birth rates, marriage restrictions, segregation (both racial segregation and sequestering the mentally ill), compulsory sterilization, forced abortions or forced pregnancies, culminating in genocide.
The scientific reputation of eugenics started to decline in the 1930s, a time when Ernst Rüdin used eugenics as a justification for the racial policies of Nazi Germany. Adolf Hitler had praised and incorporated eugenic ideas in Mein Kampf in 1925 and emulated eugenic legislation for the sterilization of “defectives” that had been pioneered in the United States once he took power. Some common early 20th century eugenics methods involved identifying and classifying individuals and their families, including the poor, mentally ill, blind, deaf, developmentally disabled, promiscuous women, homosexuals, and racial groups (such as the Roma and Jews in Nazi Germany) as “degenerate” or “unfit”, leading to their segregation or institutionalization, sterilization, euthanasia, and even their mass murder. The Nazi practice of euthanasia was carried out on hospital patients in the Aktion T4 centers such as Hartheim Castle.
By the end of World War II, many discriminatory eugenics laws were abandoned, having become associated with Nazi Germany. H. G. Wells, who had called for “the sterilization of failures” in 1904, stated in his 1940 book The Rights of Man: Or What are we fighting for? that among the human rights he believed should be available to all people was “a prohibition on mutilation, sterilization, torture, and any bodily punishment”. After World War II, the practice of “imposing measures intended to prevent births within [a population] group” fell within the definition of the new international crime of genocide, set out in the Convention on the Prevention and Punishment of the Crime of Genocide. The Charter of Fundamental Rights of the European Union also proclaims “the prohibition of eugenic practices, in particular those aiming at selection of persons”. In spite of the decline in discriminatory eugenics laws, some government-mandated sterilization continued into the 21st century. During the ten years President Alberto Fujimori led Peru from 1990 to 2000, allegedly 2,000 persons were involuntarily sterilized. China maintained its coercive one-child policy until 2015 as well as a suite of other eugenics-based legislation to reduce population size and manage fertility rates of different populations. In 2007 the United Nations reported coercive sterilisations and hysterectomies in Uzbekistan. During the years 2005–06 to 2012–13, nearly one-third of the 144 California prison inmates who were sterilized did not give lawful consent to the operation.
Developments in genetic, genomic, and reproductive technologies at the end of the 20th century are raising numerous questions regarding the ethical status of eugenics, effectively creating a resurgence of interest in the subject. Some, such as UC Berkeley sociologist Troy Duster, claim that modern genetics is a back door to eugenics. This view is shared by White House Assistant Director for Forensic Sciences, Tania Simoncelli, who stated in a 2003 publication by the Population and Development Program at Hampshire College that advances in pre-implantation genetic diagnosis (PGD) are moving society to a “new era of eugenics”, and that, unlike the Nazi eugenics, modern eugenics is consumer driven and market based, “where children are increasingly regarded as made-to-order consumer products”. In a 2006 newspaper article, Richard Dawkins said that discussion regarding eugenics was inhibited by the shadow of Nazi misuse, to the extent that some scientists would not admit that breeding humans for certain abilities is at all possible. He believes that it is not physically different from breeding domestic animals for traits such as speed or herding skill. Dawkins felt that enough time had elapsed to at least ask just what the ethical differences were between breeding for ability versus training athletes or forcing children to take music lessons, though he could think of persuasive reasons to draw the distinction.
In October 2015, the United Nations’ International Bioethics Committee wrote that the ethical problems of human genetic engineering should not be confused with the ethical problems of the 20th century eugenics movements; however, it is still problematic because it challenges the idea of human equality and opens up new forms of discrimination and stigmatization for those who do not want or cannot afford the enhancements.
Transhumanism is often associated with eugenics, although most transhumanists holding similar views nonetheless distance themselves from the term “eugenics” (preferring “germinal choice” or “reprogenetics”) to avoid having their position confused with the discredited theories and practices of early-20th-century eugenic movements.
The term eugenics and its modern field of study were first formulated by Francis Galton in 1883, drawing on the recent work of his half-cousin Charles Darwin. Galton published his observations and conclusions in his book Inquiries into Human Faculty and Its Development.
The origins of the concept began with certain interpretations of Mendelian inheritance and the theories of August Weismann. The word eugenics is derived from the Greek word eu (“good” or “well”) and the suffix -genēs (“born”), and was coined by Galton in 1883 to replace the word “stirpiculture”, which he had used previously but which had come to be mocked due to its perceived sexual overtones. Galton defined eugenics as “the study of all agencies under human control which can improve or impair the racial quality of future generations”. Galton did not understand the mechanism of inheritance.
Historically, the term has referred to everything from prenatal care for mothers to forced sterilization and euthanasia. To population geneticists, the term has included the avoidance of inbreeding without altering allele frequencies; for example, J. B. S. Haldane wrote that “the motor bus, by breaking up inbred village communities, was a powerful eugenic agent.” Debate as to what exactly counts as eugenics has continued to the present day.
Edwin Black, journalist and author of War Against the Weak, claims eugenics is often deemed a pseudoscience because what is defined as a genetic improvement of a desired trait is often deemed a cultural choice rather than a matter that can be determined through objective scientific inquiry. The most disputed aspect of eugenics has been the definition of “improvement” of the human gene pool, such as what is a beneficial characteristic and what is a defect. This aspect of eugenics has historically been tainted with scientific racism.
Early eugenists were mostly concerned with perceived intelligence factors that often correlated strongly with social class. Some of these early eugenists include Karl Pearson and Walter Weldon, who worked on this at University College London.
Eugenics also had a place in medicine. In his lecture “Darwinism, Medical Progress and Eugenics”, Karl Pearson said that everything concerning eugenics fell into the field of medicine; he essentially treated the two as equivalent. He was supported in part by the fact that Francis Galton, the father of eugenics, also had medical training.
Eugenic policies have been conceptually divided into two categories. Positive eugenics is aimed at encouraging reproduction among the genetically advantaged; for example, the reproduction of the intelligent, the healthy, and the successful. Possible approaches include financial and political stimuli, targeted demographic analyses, in vitro fertilization, egg transplants, and cloning. The movie Gattaca provides a fictional example of positive eugenics done voluntarily. Negative eugenics is aimed at eliminating, through sterilization or segregation, those deemed physically, mentally, or morally “undesirable”. This includes abortions, sterilization, and other methods of family planning. Both positive and negative eugenics can be coercive; abortion for “fit” women, for example, was illegal in Nazi Germany.
Jon Entine claims that eugenics simply means “good genes” and using it as synonym for genocide is an “all-too-common distortion of the social history of genetics policy in the United States.” According to Entine, eugenics developed out of the Progressive Era and not “Hitler’s twisted Final Solution”.
According to Richard Lynn, eugenics may be divided into two main categories based on the ways in which the methods of eugenics can be applied.
The first major challenge to conventional eugenics based on genetic inheritance came in 1915 from Thomas Hunt Morgan, who demonstrated that genetic mutations occur independently of inheritance through his discovery of a white-eyed fruit fly (Drosophila melanogaster) hatched from a red-eyed line. Morgan claimed that this showed that major genetic changes occurred outside of inheritance and that the concept of eugenics based upon genetic inheritance was not completely scientifically accurate. Additionally, Morgan criticized the view that subjective traits, such as intelligence and criminality, were caused by heredity, because he believed that the definitions of these traits varied and that accurate work in genetics could only be done when the traits being studied were accurately defined. In spite of Morgan’s public rejection of eugenics, much of his genetic research was absorbed by eugenics.
The heterozygote test is used for the early detection of recessive hereditary diseases, allowing for couples to determine if they are at risk of passing genetic defects to a future child. The goal of the test is to estimate the likelihood of passing the hereditary disease to future descendants.
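The arithmetic underlying such risk estimates is plain Mendelian probability. As a minimal sketch (a hypothetical helper, not part of any real screening tool), assuming an autosomal recessive disease and unaffected parents of known carrier status:

```python
def offspring_risk(parent1_carrier: bool, parent2_carrier: bool) -> float:
    """Probability that a child is affected by an autosomal recessive
    disease, assuming both parents are unaffected. Each carrier parent
    transmits the recessive allele with probability 1/2; an affected
    child must inherit two copies."""
    p1 = 0.5 if parent1_carrier else 0.0
    p2 = 0.5 if parent2_carrier else 0.0
    return p1 * p2

print(offspring_risk(True, True))   # both carriers -> 0.25
print(offspring_risk(True, False))  # one carrier   -> 0.0
```

This is why such testing targets couples: only when both partners carry the allele does the risk to a child become substantial (1 in 4).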
Recessive traits can be severely reduced, but never eliminated unless the complete genetic makeup of all members of the pool were known. As only very few undesirable traits, such as Huntington’s disease, are dominant, it could be argued from certain perspectives that the practicality of “eliminating” traits is quite low.
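The low practicality of eliminating a recessive trait has a standard quantitative expression: if no affected homozygote reproduces, the allele frequency follows q' = q / (1 + q), so its reciprocal grows by exactly one per generation. A minimal sketch of that recursion (the starting frequency is illustrative, not data from any real population):

```python
def next_q(q: float) -> float:
    # Complete selection against affected homozygotes (aa):
    # standard recursion q' = q / (1 + q), equivalently 1/q' = 1/q + 1.
    return q / (1.0 + q)

q = 0.01  # recessive allele at 1% frequency
for _ in range(100):
    q = next_q(q)
print(q)  # after 100 generations the frequency has merely halved, ~0.005
```

Halving a 1% allele takes 100 generations, on the order of millennia, which is the quantitative content of the point above.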
There are examples of eugenic acts that managed to lower the prevalence of recessive diseases, although not influencing the prevalence of heterozygote carriers of those diseases. The elevated prevalence of certain genetically transmitted diseases among the Ashkenazi Jewish population (Tay–Sachs, cystic fibrosis, Canavan’s disease, and Gaucher’s disease) has been decreased in current populations by the application of genetic screening.
Pleiotropy occurs when one gene influences multiple, seemingly unrelated phenotypic traits; an example is phenylketonuria, a human disease that affects multiple systems but is caused by one gene defect. Andrzej Pękalski, from the University of Wrocław, argues that eugenics can cause harmful loss of genetic diversity if a eugenics program selects for a pleiotropic gene that is also associated with a positive trait. Pękalski uses the example of a coercive government eugenics program that prohibits people with myopia from breeding but has the unintended consequence of also selecting against high intelligence, since the two go together.
Eugenic policies could also lead to loss of genetic diversity, in which case a culturally accepted “improvement” of the gene pool could very likely (as evidenced in numerous instances in isolated island populations, e.g., the dodo, Raphus cucullatus, of Mauritius) result in extinction due to increased vulnerability to disease, reduced ability to adapt to environmental change, and other factors both known and unknown. A long-term, species-wide eugenics plan might lead to a similar scenario because the elimination of traits deemed undesirable would reduce genetic diversity by definition.
Edward M. Miller claims that, in any one generation, any realistic program should make only minor changes in a fraction of the gene pool, giving plenty of time to reverse direction if unintended consequences emerge, reducing the likelihood of the elimination of desirable genes. Miller also argues that any appreciable reduction in diversity is so far in the future that little concern is needed for now.
While the science of genetics has increasingly provided means by which certain characteristics and conditions can be identified and understood, given the complexity of human genetics, culture, and psychology, there is at this point no agreed objective means of determining which traits might be ultimately desirable or undesirable. Some diseases, such as sickle-cell disease and cystic fibrosis, confer resistance to malaria and cholera, respectively, when a single copy of the recessive allele is present in an individual’s genotype. Reducing the incidence of sickle-cell genes in Africa, where malaria is a common and deadly disease, could therefore have extremely negative net consequences.
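The sickle-cell case is the textbook example of heterozygote advantage, in which selection holds a harmful recessive allele at a stable equilibrium frequency of s/(s+t), where s and t are the selection disadvantages of the two homozygotes. The sketch below iterates the standard one-locus selection model; the coefficients are illustrative assumptions, not measured values for any real population:

```python
def equilibrium_q(s: float, t: float) -> float:
    """Equilibrium frequency of allele a under heterozygote advantage,
    with relative fitnesses AA = 1-s, Aa = 1, aa = 1-t."""
    return s / (s + t)

def next_gen(q: float, s: float, t: float) -> float:
    """One generation of selection: allele frequency of a after weighting
    genotypes by fitness (random mating, infinite population)."""
    p = 1.0 - q
    w_bar = p * p * (1 - s) + 2 * p * q + q * q * (1 - t)  # mean fitness
    return (p * q + q * q * (1 - t)) / w_bar

s, t = 0.1, 0.8  # assumed selection against AA (malaria) and aa (disease)
q = 0.5
for _ in range(2000):
    q = next_gen(q, s, t)
print(round(q, 3))  # settles at s/(s+t) = 1/9, i.e. about 0.111
```

Because the heterozygote is fittest, selection never drives the allele to zero, and removing it would raise malaria mortality among the non-carrier majority, which is the “negative net consequences” point above.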
However, some genetic diseases such as haemochromatosis can increase susceptibility to illness, cause physical deformities, and other dysfunctions, which provides some incentive for people to re-consider some elements of eugenics.
Autistic people have advocated a shift in perception of autism spectrum disorders as complex syndromes rather than diseases that must be cured. Proponents of this view reject the notion that there is an “ideal” brain configuration and that any deviation from the norm is pathological; they promote tolerance for what they call neurodiversity. Simon Baron-Cohen argues that the genes for Asperger’s combination of abilities have operated throughout recent human evolution and have made remarkable contributions to human history. The possible reduction of autism rates through selection against the genetic predisposition to autism is a significant political issue in the autism rights movement, which claims that autism is a part of neurodiversity.
Many culturally Deaf people oppose attempts to cure deafness, believing instead that deafness should be considered a defining cultural characteristic, not a disease. Some people have started advocating the idea that deafness brings about certain advantages, often termed “Deaf Gain.”
Societal and political consequences of eugenics warrant a place in the discussion of the ethics behind the eugenics movement. Many of the ethical concerns regarding eugenics arise from its controversial past, prompting a discussion on what place, if any, it should have in the future. Advances in science have changed eugenics. In the past, eugenics had more to do with sterilization and enforced reproduction laws. Now, in the age of a progressively mapped genome, embryos can be tested for susceptibility to disease, sex, and genetic defects, and alternative methods of reproduction such as in vitro fertilization are becoming more common. Therefore, eugenics is no longer ex post facto regulation of the living but instead preemptive action on the unborn.
With this change, however, there are ethical concerns which lack adequate attention, and which must be addressed before eugenic policies can be properly implemented in the future. Sterilized individuals, for example, could volunteer for the procedure, albeit under incentive or duress, or at least voice their opinion. The fetus on which these new eugenic procedures are performed cannot do so, as it lacks the capacity to consent or to express an opinion. Philosophers disagree about the proper framework for reasoning about such actions, which change the very identity and existence of future persons.
A common criticism of eugenics is that “it inevitably leads to measures that are unethical”. Some fear future “eugenics wars” as the worst-case scenario: the return of coercive state-sponsored genetic discrimination and human rights violations such as compulsory sterilization of persons with genetic defects, the killing of the institutionalized and, specifically, segregation and genocide of races perceived as inferior. Health law professor George Annas and technology law professor Lori Andrews are prominent advocates of the position that the use of these technologies could lead to such human-posthuman caste warfare.
In his 2003 book Enough: Staying Human in an Engineered Age, environmental ethicist Bill McKibben argued at length against germinal choice technology and other advanced biotechnological strategies for human enhancement. He claims that it would be morally wrong for humans to tamper with fundamental aspects of themselves (or their children) in an attempt to overcome universal human limitations, such as vulnerability to aging, maximum life span and biological constraints on physical and cognitive ability. Attempts to “improve” themselves through such manipulation would remove limitations that provide a necessary context for the experience of meaningful human choice. He claims that human lives would no longer seem meaningful in a world where such limitations could be overcome technologically. Even the goal of using germinal choice technology for clearly therapeutic purposes should be relinquished, since it would inevitably produce temptations to tamper with such things as cognitive capacities. He argues that it is possible for societies to benefit from renouncing particular technologies, using as examples Ming China, Tokugawa Japan and the contemporary Amish.
Some, such as Nathaniel C. Comfort from Johns Hopkins University, claim that the change from state-led reproductive-genetic decision-making to individual choice has moderated the worst abuses of eugenics by transferring the decision-making from the state to the patient and their family. Comfort suggests that “the eugenic impulse drives us to eliminate disease, live longer and healthier, with greater intelligence, and a better adjustment to the conditions of society; and the health benefits, the intellectual thrill and the profits of genetic bio-medicine are too great for us to do otherwise.” Others, such as bioethicist Stephen Wilkinson of Keele University and Honorary Research Fellow Eve Garrard at the University of Manchester, claim that some aspects of modern genetics can be classified as eugenics, but that this classification does not inherently make modern genetics immoral. In a co-authored publication by Keele University, they stated that “[e]ugenics doesn’t seem always to be immoral, and so the fact that PGD, and other forms of selective reproduction, might sometimes technically be eugenic, isn’t sufficient to show that they’re wrong.”
In their 2000 book From Chance to Choice: Genetics and Justice, bioethicists Allen Buchanan, Dan Brock, Norman Daniels and Daniel Wikler argued that liberal societies have an obligation to encourage as wide an adoption of eugenic enhancement technologies as possible (so long as such policies do not infringe on individuals’ reproductive rights or exert undue pressures on prospective parents to use these technologies) in order to maximize public health and minimize the inequalities that may result from both natural genetic endowments and unequal access to genetic enhancements.
The original position, a hypothetical scenario developed by American philosopher John Rawls, has been used as an argument for negative eugenics.
Posted: October 4, 2016 at 1:21 pm
Any technology that offers benefits will usually come with risks as well. In order to make wise decisions about using a technology, we must understand its potential impacts well enough to decide whether the risks are acceptably low.
What are the risks posed by the use of genetic engineering (GE) in agriculture? The answers fall mostly into two categories: risks to human health, and environmental impacts.
Photo: Roy Kaltschmidt, Lawrence Berkeley National Laboratories
Health risks of genetic engineering have sometimes been described in exaggerated, alarmist terms, implying that foods made from GE crops are inherently unsafe. There is no evidence, for instance, that refined products derived from GE crops, such as starch, sugar and oils, are different than those derived from conventionally bred crops.
It is also an exaggeration, however, to state that there are no health risks associated with GE. For one thing, not enough is known: research on the effects of specific genes has been limited, and tightly controlled by the industry.
But we do know of ways in which genetically engineered crops could cause health problems. For instance, genes from an allergenic plant could transfer this unwanted trait to the target plant. This phenomenon was documented in 1996, as soybeans with a Brazil nut gene (added to improve their value as animal feed) produced an allergic response in test subjects with Brazil nut allergies.
Unintended consequences like these underscore the need for effective regulation of GE products. In the absence of a rigorous approval process, there is nothing to ensure that GE crops that cause health problems will always be identified and kept off the market.
Genetically engineered crops can potentially cause environmental problems that result directly from the engineered traits. For instance, an engineered gene may cause a GE crop (or a wild relative of that crop) to become invasive or toxic to wildlife.
But the most damaging impact of GE in agriculture so far is the phenomenon of pesticide resistance. Millions of acres of U.S. farmland are now infested by weeds that have become resistant to the herbicide glyphosate. Overuse of Monsanto’s “Roundup Ready” trait, which is engineered to tolerate the herbicide, has promoted the accelerated development of resistance in several weed species.
Looking for ways to fight back against these "superweeds," farmers are now turning to older, more toxic herbicides such as 2,4-D and dicamba. As if on cue, agribusiness companies have begun to develop new GE crops engineered to tolerate these older herbicides, with no guarantee that the Roundup Ready story will not repeat itself, producing a new wave of resistant weeds.
And this issue is not confined to herbicides: recent reports suggest a growing problem of corn rootworms resistant to the insecticide Bt, which some corn varieties have been engineered to produce.
As the superweed crisis illustrates, current applications of genetic engineering have become a key component of an unsustainable approach to food production: industrial agriculture, with its dependence on monoculture, supported by costly chemical inputs, at the expense of the long-term health and productivity of the farm.
A different approach to farming is available: what UCS calls "healthy farms." This approach is not only more sustainable than industrial agriculture, but often more cost-effective. Yet as long as the marketplace of agricultural products and policies is dominated by the industrial model, prioritizing expensive products over knowledge-based agroecological approaches, healthy farm solutions face an uphill battle.
In the case of GE, better solutions include crop breeding (often assisted by molecular biology techniques) and agroecological practices such as crop rotation, cover crops, and integrated crop/livestock management.
Such healthy farm practices are the future of U.S. agriculture, and policymakers can help speed the transition by supporting research and education on them. In the meantime, stronger regulation of the biotechnology industry is needed to minimize health and environmental risks from GE products.
Posted: October 1, 2016 at 1:45 am
Hippocrates, the Father of Medicine Focused on Energetics of Food
Benefits of Whole Food Supplements
Whole food nutritional supplements are foods that have been compressed into tablet form, poured into capsules or powdered.
The word "whole" indicates that the end product, a supplement, does not contain parts of foods or synthetic or isolated vitamins.
Ideally, the foods comprising these supplements have not been subjected to irradiation and contain no pesticide or herbicide residues.
When it comes to providing the best food supplement for our family and friends, one that is composed of 17 different fruits and vegetables, there is only one choice. Click here to learn more.
Why? Because of the research that has proved they work.
The clinical studies have PROVEN that they:
The research has been published in scientific and medical journals, including:
Whole food nutritional supplements are one step away from fresh foods. Medical Science reminds us every day that good nutrition and good health go hand in hand especially when it comes to the health benefits of eating fresh, raw fruits and vegetables.
Researchers continue to find elements in fruits and vegetables that strengthen our immune systems, impede the development of degenerative diseases like cancer and heart disease, and contribute to good health in many other ways.
Unfortunately, most people don't eat nearly enough fruits and vegetables, especially not every day. Those we do eat tend to be overprocessed, overcooked, or too far removed from the field, and thus lack much of the nutrition provided by fresh, raw fruits and vegetables.
Now people can increase their intake of raw fruits and vegetables without changing their eating habits, without the hassle of shopping and trying to find foods that may not be in season, without having to taste unfamiliar or unpleasant food, and, best of all, at an affordable price!
Health food supplements are the next best thing to eating fresh, raw fruits and vegetables. Certainly everyone should be encouraged to eat more raw fruits and vegetables, but we know that most people simply won't do it.
Whole food nutritional supplements are much more than a vitamin or mineral supplement. Regular vitamins and minerals are isolated nutrients, and they are not always derived from natural sources.
Whole food nutritional supplements are whole food based nutrition, providing not only a wide variety of naturally occurring vitamins, antioxidants and minerals, but also many of the other nutrients phytochemicals, enzymes, even the fiber found in fresh, raw fruits and vegetables themselves.
In nature, vitamins and minerals are never isolated. They are always provided in whole foods in combination with all the other nutrients found there, working together in ways science is only beginning to understand.
In her book Biochemistry of Foods and Supplements, Judith DeCava expresses this perfectly: "To isolate or separate a vitamin, mineral, amino acid or other component and call it a nutrient is just as impractical as isolating a steering wheel, battery, or carburetor and calling it an automobile. It won't work without the other parts."
There are thousands of phytonutrients in every food. Each one we study is proving to play an important role in human health and vitality. Without them, we lay the foundation for a weak immune system and degenerative disease. A traditional vitamin and mineral supplement cannot begin to scratch the surface of this vast array of nutrition.
For example, research concerning tomatoes indicates that even a few servings per week can reduce the risk of prostate cancer. It appears that lycopene and other components in this fruit/vegetable can actually decrease tumor size and kill cancer cells. But if you take lycopene by itself, it's not going to have nearly the positive effect of eating whole tomatoes or taking whole food nutritional supplements made from dried organic tomatoes.
Vitamin and mineral supplements are necessary for specific people with specific needs. Whole food supplements are for everyone. Whole food nutritional supplements are the key to good nutrition, and are simply a way to get the healthful dose of the daily nutrition you need from fresh fruits and vegetables in a convenient form.
Whole Food Nutritional Supplements
Medical evidence is mounting that whole food based nutrition, like that found in whole food nutritional supplements, is the key to better health, especially when it comes to helping prevent degenerative diseases like heart disease, stroke and cancer. Despite this growing evidence of the value of good, whole food nutrition, people, including children, are eating more poorly than ever.
Like most breakthrough products, the idea behind whole food supplements is simple. Whole food nutritional supplements contain natural fruit and vegetable juice powders in capsule form. The powders are concentrated from fruit and vegetable juices using a proprietary, low temperature process that leaves as much of the nutrition as possible intact.
Whole food based nutrition is the answer to better health, and whole food supplements, with a wide variety of nutrients, including vitamins and antioxidants, phytochemicals and enzymes, minerals and fiber, are leading the way.
Our recommendation is JP+. Click Here to learn more.
Things you should consider before you buy a nutritional supplement.
Disease is easier to PREVENT than it is to cure.
Eat 7-13 servings of fresh fruits and vegetables every day.
Almost no one does.
Whole FOOD SUPPLEMENTS help fill the nutritional gaps.
Posted: September 18, 2016 at 8:09 am
Author: Hon A/Prof Amanda Oakley, Hamilton, New Zealand. Revised and updated, August 2014.
Psoriasis is a chronic inflammatory skin condition characterised by clearly defined, red and scaly plaques (thickened skin). It is classified into several subtypes.
Psoriasis affects 2-4% of males and females. It can start at any age including childhood, with peaks of onset at 15-25 years and 50-60 years. It tends to persist lifelong, fluctuating in extent and severity. It is particularly common in Caucasians, but may affect people of any race. About one third of patients with psoriasis have family members with psoriasis.
Psoriasis is multifactorial. It is classified as an immune-mediated inflammatory disease (IMID).
Genetic factors are important. An individual’s genetic profile influences their type of psoriasis and its response to treatment.
Genome-wide association studies report that HLA-Cw6 is associated with early onset psoriasis and guttate psoriasis. This major histocompatibility complex is not associated with arthritis, nail dystrophy or late onset psoriasis.
Theories about the causes of psoriasis need to explain why the skin is red, inflamed and thickened. It is clear that immune factors and inflammatory cytokines (messenger proteins) such as IL-1 and TNF are responsible for the clinical features of psoriasis. Current theories are exploring the TH17 pathway and release of the cytokine IL-17A.
Psoriasis usually presents with symmetrically distributed, red, scaly plaques with well-defined edges. The scale is typically silvery white, except in skin folds where the plaques often appear shiny and they may have a moist peeling surface. The most common sites are scalp, elbows and knees, but any part of the skin can be involved. The plaques are usually very persistent without treatment.
Itch is mostly mild but may be severe in some patients, leading to scratching and lichenification (thickened leathery skin with increased skin markings). Painful skin cracks or fissures may occur.
When psoriatic plaques clear up, they may leave brown or pale marks that can be expected to fade over several months.
Certain features of psoriasis can be categorised to help determine appropriate investigations and treatment pathways. Overlap may occur.
Generalised pustulosis and localised palmoplantar pustulosis are no longer classified within the psoriasis spectrum.
Patients with psoriasis are more likely than other people to have other health conditions listed here.
Psoriasis is diagnosed by its clinical features. If necessary, diagnosis is supported by typical skin biopsy findings.
Medical assessment entails a careful history, examination, questioning about effect of psoriasis on daily life, and evaluation of comorbid factors.
Validated tools used to evaluate psoriasis include:
The severity of psoriasis is classified as mild in 60% of patients, moderate in 30% and severe in 10%.
Evaluation of comorbidities may include:
Patients with psoriasis should ensure they are well informed about their skin condition and its treatment. There are benefits from not smoking, avoiding excessive alcohol and maintaining optimal weight.
Mild psoriasis is generally treated with topical agents alone. Which treatment is selected may depend on body site, extent and severity of the psoriasis.
Most psoriasis centres offer phototherapy with ultraviolet (UV) radiation, often in combination with topical or systemic agents. Types of phototherapy include:
Moderate to severe psoriasis warrants treatment with a systemic agent and/or phototherapy. The most common treatments are:
Other medicines occasionally used for psoriasis include:
Systemic corticosteroids are best avoided due to risk of severe withdrawal flare of psoriasis and adverse effects.
Biologics or targeted therapies are reserved for conventional treatment-resistant severe psoriasis, mainly because of expense, as side effects compare favourably with other systemic agents. These include:
The rest is here:
Psoriasis | DermNet New Zealand
Posted: at 8:09 am
What Is Psoriasis?
Psoriasis is a genetically programmed inflammatory disease that primarily affects the skin in about 3% of individuals in the United States. Psoriasis is characterized by skin cells that multiply up to 10 times faster than normal. When these cells reach the surface and die, raised, red plaques covered with white scales form. Psoriasis begins as a small scaling papule. When multiple papules coalesce, they form scaling plaques. These plaques tend to occur in the scalp, elbows, and knees.
Although psoriatic plaques can be limited to only a few small areas, the condition can involve widespread areas of skin anywhere on the body. Psoriasis symptoms vary depending on the type of psoriasis you have. Common psoriasis symptoms can include the following:
Plaque psoriasis is the most common type of psoriasis, and it gets its name from the plaques that build up on the skin. There tend to be well-defined patches of red raised skin that can appear on any area of the skin, but the knees, elbows, scalp, trunk, and nails are the most common locations. There is also a flaky, white buildup on top of the plaques, called scales. Possible plaque psoriasis symptoms include skin pain, itching, and cracking.
There are plenty of over-the-counter products that are effective in the treatment of plaque psoriasis. 1% hydrocortisone cream is a topical steroid that can suppress mild disease, and preparations containing tar are effective in treating plaque psoriasis.
Scalp psoriasis is a common skin disorder that makes raised, reddish, often scaly patches. Scalp psoriasis can affect your whole scalp, or just pop up as one patch. This type of psoriasis can even spread to the forehead, the back of the neck, or behind the ears. Scalp psoriasis symptoms may include only slight, fine scaling. Moderate to severe scalp psoriasis symptoms may include dandruff-like flaking, dry scalp, and hair loss. Scalp psoriasis does not directly cause hair loss, but stress and excess scratching or picking of the scalp may result in hair loss.
Scalp psoriasis can be treated with medicated shampoos, creams, gels, oils, ointments, and soaps. Salicylic acid and coal tar are two medications in over-the-counter products that help treat scalp psoriasis. Steroid injections and phototherapy may help treat mild scalp psoriasis. Biologics are the latest class of medications that can also help treat severe scalp psoriasis.
Guttate psoriasis looks like small, pink dots or drops on the skin. The word guttate is from the Latin word gutta, meaning drop. The scale in guttate psoriasis tends to be finer than the scale in plaque psoriasis. Guttate psoriasis is typically triggered by streptococcal infection (strep throat), and the outbreak will usually occur two to three weeks after having strep throat.
Guttate psoriasis tends to go away after a few weeks without treatment. Moisturizers can be used to soften the skin. If there is a history of psoriasis, a doctor may take a throat culture to determine if strep throat is present. If the throat culture shows that streptococci are present, a doctor may prescribe antibiotics.
Many patients with psoriasis have abnormal nails. Psoriatic nails often have a horizontal white or yellow margin at the tip of the nail, called distal onycholysis because the nail is lifted away from the skin. There can often be small pits in the nail plate, and the nail is often yellow and crumbly.
The same treatment for skin psoriasis is beneficial for nail psoriasis. However, since nails grow slowly, it may take a while for improvements to be evident. Nail psoriasis can be treated with phototherapy, systemic therapy (medications that spread throughout the body), and steroids (cream or injection). If medications do not improve the condition of nail psoriasis, a doctor may surgically remove the nail.
Psoriasis can be associated with a destructive arthritis called psoriatic arthritis. Damage can be serious enough to permanently damage the affected joints. Prevention of joint damage in such cases is very important.
Psoriatic arthritis is a chronic disease characterized by a form of inflammation of the skin and joints. About 15%-25% of patients with psoriasis also develop an inflammation of their joints. Psoriatic arthritis is a systemic rheumatic disease that can cause inflammation not only of the skin, but of the eyes, heart, kidneys, and lungs as well. Currently, the cause of psoriatic arthritis is unknown, but a combination of genetic, immune, and environmental factors is likely involved.
Typically, a patient will have psoriasis months or years before they develop psoriatic arthritis. Psoriatic arthritis usually involves the knees, ankles, and joints in the feet. There may also be a loss of range of motion of the involved joints as well as joint stiffness. Psoriatic arthritis can also cause inflammation of the spine and the sacrum, which causes pain and stiffness in the low back, buttocks, neck, and upper back.
Treatment for psoriatic arthritis generally involves anti-inflammatory medications and exercise. It is important to stretch or take a hot shower before exercise in order to relax the muscles. Ice application after exercise can help minimize soreness and inflammation. Nonsteroidal anti-inflammatory drugs may also reduce joint inflammation, pain, and stiffness.
It is now clear that there is a genetic basis for psoriasis. This hereditary predisposition is necessary before the disease can be triggered by environmental factors. White blood cells called T-cells mediate the development of the psoriatic plaques that are present in the skin. When someone has psoriasis, their body is unable to offer protection from invaders. Instead, inflammation is promoted and skin cells are on overdrive. When cell growth is increased, old skin cells pile up instead of flaking off, causing psoriasis to occur. Currently, most experts conclude that environmental, genetic and immunologic factors interact to cause the disease.
If you have the genetic basis of psoriasis, a trigger can cause psoriasis to flare up. The following are triggers that may set off one's psoriasis:
No, psoriasis is not contagious. People used to believe that psoriasis was the same as leprosy, but that is not the case. You cannot get psoriasis by touching, kissing, or having sex with someone who has psoriasis. People get psoriasis because of their genes, not their hygiene, diet, lifestyle, or any other habits.
Psoriasis is often diagnosed or at least suspected on the basis of its appearance and distribution. However, psoriasis may resemble eczema or other skin diseases and further tests may be required. It may be necessary to remove a small piece of skin (a biopsy) and have it examined by a pathologist to confirm the diagnosis. If there are joint symptoms, X-rays and other laboratory tests may be in order. Psoriasis cannot be cured, but like many other medical conditions, it is controllable with treatment. Your doctor may have you seen by a consultant such as a dermatologist, rheumatologist or immunologist to help diagnose and treat your form of psoriasis.
Since psoriasis mainly affects the skin, topical treatments are very useful because they are relatively safe, fairly effective, and can be applied directly to the affected skin. They take the form of lotions, foams, creams, ointments, gels, and shampoos. They include topical steroids, tar preparations, and calcium-modulating drugs. The precise drug used and the form in which it is delivered depends on the areas involved. In widespread disease, in patients with more than 10% of the body surface involved, it may not be practical to use topical medication alone.
For more extensive psoriasis, a useful option is ultraviolet (UV) light exposure. UV light can treat large areas of skin with few side effects, if performed in the physician's office. It should be kept in mind that all UV light causes mutational events, which can lead to skin cancer. At this time, the most popular type of UV light for psoriasis is called narrow-band UVB. Only a small portion of the UV light spectrum is used, which seems to be particularly beneficial for psoriasis and may be less carcinogenic. This UVB is quite different from UVA, the wavelength available in tanning salons, which is not effective in psoriasis. Phototherapy can be used alone or with medications when treating psoriasis.
Excimer lasers or pulsed dye lasers are used in laser therapy. A pulsed dye laser will create a concentrated beam of yellow light. When this light hits the skin, it converts to heat. The heat then destroys the extra blood vessels in the skin that contribute to psoriasis. Excimer lasers will deliver ultraviolet light to localized areas of the skin that help treat psoriasis. These lasers produce UV light in wavelengths similar to narrow-band UVB. Laser therapy uses intense doses of laser light to help control areas of mild to moderate psoriasis without damaging surrounding healthy skin. These can be quite effective for small plaques of psoriasis, but since only small areas of skin can be treated at once, they are not practical for extensive disease.
There are a variety of drugs administered systemically that are useful in controlling psoriasis. As a generalization, most oral medications act by targeting portions of the immune system. The only exception currently is a drug called acitretin (Soriatane), which is structurally similar to vitamin A. Since the immune system is necessary in order to survive, systemic treatments do have a downside. Drugs like methotrexate and cyclosporine are administered orally and can affect the liver, kidney, and bone marrow. A new oral medication recently approved for treatment of psoriasis is called Otezla (apremilast). Otezla selectively targets molecules inside immune cells and adjusts (reduces) the processes of inflammation within the cell, which in turn helps treat psoriasis. This drug appears to be considerably safer than most of its predecessors but is also quite expensive.
A new class of drugs has recently been developed called biologics; they're called biologics because living cells synthesize them. Since these drugs are proteins, they cannot be administered orally and must be given by injection through the skin or by an intravenous infusion. This treatment is recommended in patients with moderate to severe psoriasis. These drugs target the immune response that leads to the rapid skin cell growth of psoriasis. This seems to have increased their safety profile as well as their effectiveness when compared to older drugs. On the other hand, they are quite expensive, costing up to $30,000 a year.
There are many home remedies that can be used in the treatment of psoriasis. It is very important to keep the skin moist to avoid dryness. Petroleum jelly, shortening, or olive oil can be used as a moisturizer. Take fewer showers and baths to avoid stripping the skin of its natural oils. Adding salts, oil, or finely ground oatmeal to the bath can soothe the skin. Heliotherapy (medicinal sunbathing) can be effective in controlling psoriasis. There is also evidence that increased body mass is associated with psoriasis and that heavier individuals are more difficult to treat.
At the edge of Israel's Dead Sea, there is a group of resorts that cater to psoriasis patients by offering a combination of graded solar exposure and the application of crude coal tar along with a spa-like experience. The Dead Sea is the lowest point on earth, more than 400 meters below sea level. Once the sun's rays pass through the haze, the harmful ultraviolet rays are filtered out and the remaining rays are highly effective in treating psoriasis. For those with the time and the money, this is a reasonable alternative to standard medical treatment.
Although there is no doubt that psoriasis is a potent inducer of stress, the evidence that stress causes psoriasis is sparse. However, stress can make psoriasis worse, and psoriasis can make one stressed. Dealing with stress with or without psoriasis is a challenge for most people living in the 21st century. The following are tips to reduce stress:
Not only are the physical effects of psoriasis frustrating, but the emotional effects of psoriasis can be much worse. Psoriasis may cause your relationships to change and people may treat you differently. Unfortunately, this may lead to stress, which then leads to worsening psoriasis. A doctor may prescribe antidepressant medications if psoriasis is diminishing your quality of life. Support groups can also help you cope with psoriasis by talking to other people who are suffering from the same disease.
Fall and winter may bring shorter days, colder temperatures, and dry air. These can all lead to worsening psoriasis symptoms. The sun's ultraviolet light hinders the rapid growth of skin cells that is characteristic of psoriasis. Therefore, spending less time in the sun may cause psoriasis symptoms to flare. The dry weather may remove moisture from your skin, so it is important to use moisturizer and/or a humidifier at home.
There are many different remedies that may ease psoriasis symptoms. The following is a partial list of alternative medicine to help treat psoriasis:
Consult your doctor before trying new medications.
There is plenty of evidence that extensive psoriasis can have a very significant negative effect on a patient's self-image and emotions. This is especially true in social situations, although all aspects of life can be disturbed. Inverse psoriasis, which affects the genital skin, and scalp psoriasis can be particularly troubling. Psoriasis affecting the hands may make it impossible to interact normally with others. It is important to remember that there are ways to manage and treat psoriasis flares. It may seem as if one's quality of life has diminished, but there are many organizations that offer support to psoriasis patients. The National Psoriasis Foundation is an excellent source of accurate information as well as emotional support for afflicted patients.
Read the rest here:
Slideshow Pictures: Psoriasis — Symptoms, Causes and …
Posted: September 16, 2016 at 5:26 am
No one should ever work.
Work is the source of nearly all the misery in the world. Almost any evil you’d care to name comes from working or from living in a world designed for work. In order to stop suffering, we have to stop working.
That doesn’t mean we have to stop doing things. It does mean creating a new way of life based on play; in other words, a ludic conviviality, commensality, and maybe even art. There is more to play than child’s play, as worthy as that is. I call for a collective adventure in generalized joy and freely interdependent exuberance. Play isn’t passive. Doubtless we all need a lot more time for sheer sloth and slack than we ever enjoy now, regardless of income or occupation, but once recovered from employment-induced exhaustion nearly all of us want to act. Oblomovism and Stakhanovism are two sides of the same debased coin.
The ludic life is totally incompatible with existing reality. So much the worse for “reality,” the gravity hole that sucks the vitality from the little in life that still distinguishes it from mere survival. Curiously — or maybe not — all the old ideologies are conservative because they believe in work. Some of them, like Marxism and most brands of anarchism, believe in work all the more fiercely because they believe in so little else.
Liberals say we should end employment discrimination. I say we should end employment. Conservatives support right-to-work laws. Following Karl Marx’s wayward son-in-law Paul Lafargue I support the right to be lazy. Leftists favor full employment. Like the surrealists — except that I’m not kidding — I favor full unemployment. Trotskyists agitate for permanent revolution. I agitate for permanent revelry. But if all the ideologues (as they do) advocate work — and not only because they plan to make other people do theirs — they are strangely reluctant to say so. They will carry on endlessly about wages, hours, working conditions, exploitation, productivity, profitability. They’ll gladly talk about anything but work itself. These experts who offer to do our thinking for us rarely share their conclusions about work, for all its saliency in the lives of all of us. Among themselves they quibble over the details. Unions and management agree that we ought to sell the time of our lives in exchange for survival, although they haggle over the price. Marxists think we should be bossed by bureaucrats. Libertarians think we should be bossed by businessmen. Feminists don’t care which form bossing takes so long as the bosses are women. Clearly these ideology-mongers have serious differences over how to divvy up the spoils of power. Just as clearly, none of them have any objection to power as such and all of them want to keep us working.
You may be wondering if I’m joking or serious. I’m joking and serious. To be ludic is not to be ludicrous. Play doesn’t have to be frivolous, although frivolity isn’t triviality: very often we ought to take frivolity seriously. I’d like life to be a game — but a game with high stakes. I want to play for keeps.
The alternative to work isn't just idleness. To be ludic is not to be quaaludic. As much as I treasure the pleasure of torpor, it's never more rewarding than when it punctuates other pleasures and pastimes. Nor am I promoting the managed time-disciplined safety-valve called "leisure"; far from it. Leisure is nonwork for the sake of work. Leisure is the time spent recovering from work and in the frenzied but hopeless attempt to forget about work. Many people return from vacation so beat that they look forward to returning to work so they can rest up. The main difference between work and leisure is that at work at least you get paid for your alienation and enervation.
I am not playing definitional games with anybody. When I say I want to abolish work, I mean just what I say, but I want to say what I mean by defining my terms in non-idiosyncratic ways. My minimum definition of work is forced labor, that is, compulsory production. Both elements are essential. Work is production enforced by economic or political means, by the carrot or the stick. (The carrot is just the stick by other means.) But not all creation is work. Work is never done for its own sake, it's done on account of some product or output that the worker (or, more often, somebody else) gets out of it. This is what work necessarily is. To define it is to despise it. But work is usually even worse than its definition decrees. The dynamic of domination intrinsic to work tends over time toward elaboration. In advanced work-riddled societies, including all industrial societies whether capitalist or "Communist," work invariably acquires other attributes which accentuate its obnoxiousness.
Usually — and this is even more true in “Communist” than capitalist countries, where the state is almost the only employer and everyone is an employee — work is employment, i.e., wage-labor, which means selling yourself on the installment plan. Thus 95% of Americans who work, work for somebody (or something) else. In the USSR or Cuba or Yugoslavia or any other alternative model which might be adduced, the corresponding figure approaches 100%. Only the embattled Third World peasant bastions — Mexico, India, Brazil, Turkey — temporarily shelter significant concentrations of agriculturists who perpetuate the traditional arrangement of most laborers in the last several millennia, the payment of taxes (= ransom) to the state or rent to parasitic landlords in return for being otherwise left alone. Even this raw deal is beginning to look good. All industrial (and office) workers are employees and under the sort of surveillance which ensures servility.
But modern work has worse implications. People don’t just work, they have “jobs.” One person does one productive task all the time on an or-else basis. Even if the task has a quantum of intrinsic interest (as increasingly many jobs don’t) the monotony of its obligatory exclusivity drains its ludic potential. A “job” that might engage the energies of some people, for a reasonably limited time, for the fun of it, is just a burden on those who have to do it for forty hours a week with no say in how it should be done, for the profit of owners who contribute nothing to the project, and with no opportunity for sharing tasks or spreading the work among those who actually have to do it. This is the real world of work: a world of bureaucratic blundering, of sexual harassment and discrimination, of bonehead bosses exploiting and scapegoating their subordinates who — by any rational-technical criteria — should be calling the shots. But capitalism in the real world subordinates the rational maximization of productivity and profit to the exigencies of organizational control.
The degradation which most workers experience on the job is the sum of assorted indignities which can be denominated as “discipline.” Foucault has complexified this phenomenon but it is simple enough. Discipline consists of the totality of totalitarian controls at the workplace — surveillance, rotework, imposed work tempos, production quotas, punching in and out, etc. Discipline is what the factory and the office and the store share with the prison and the school and the mental hospital. It is something historically original and horrible. It was beyond the capacities of such demonic dictators of yore as Nero and Genghis Khan and Ivan the Terrible. For all their bad intentions they just didn’t have the machinery to control their subjects as thoroughly as modern despots do. Discipline is the distinctively diabolical modern mode of control, it is an innovative intrusion which must be interdicted at the earliest opportunity.
Such is “work.” Play is just the opposite. Play is always voluntary. What might otherwise be play is work if it’s forced. This is axiomatic. Bernie de Koven has defined play as the “suspension of consequences.” This is unacceptable if it implies that play is inconsequential. The point is not that play is without consequences. This is to demean play. The point is that the consequences, if any, are gratuitous. Playing and giving are closely related, they are the behavioral and transactional facets of the same impulse, the play-instinct. They share an aristocratic disdain for results. The player gets something out of playing; that’s why he plays. But the core reward is the experience of the activity itself (whatever it is). Some otherwise attentive students of play, like Johan Huizinga (Homo Ludens), define it as game-playing or following rules. I respect Huizinga’s erudition but emphatically reject his constraints. There are many good games (chess, baseball, Monopoly, bridge) which are rule-governed but there is much more to play than game-playing. Conversation, sex, dancing, travel — these practices aren’t rule-governed but they are surely play if anything is. And rules can be played with at least as readily as anything else.
Work makes a mockery of freedom. The official line is that we all have rights and live in a democracy. Other unfortunates who aren’t free like we are have to live in police states. These victims obey orders or-else, no matter how arbitrary. The authorities keep them under regular surveillance. State bureaucrats control even the smaller details of everyday life. The officials who push them around are answerable only to higher-ups, public or private. Either way, dissent and disobedience are punished. Informers report regularly to the authorities. All this is supposed to be a very bad thing.
And so it is, although it is nothing but a description of the modern workplace. The liberals and conservatives and libertarians who lament totalitarianism are phonies and hypocrites. There is more freedom in any moderately deStalinized dictatorship than there is in the ordinary American workplace. You find the same sort of hierarchy and discipline in an office or factory as you do in a prison or monastery. In fact, as Foucault and others have shown, prisons and factories came in at about the same time, and their operators consciously borrowed from each other’s control techniques. A worker is a part-time slave. The boss says when to show up, when to leave, and what to do in the meantime. He tells you how much work to do and how fast. He is free to carry his control to humiliating extremes, regulating, if he feels like it, the clothes you wear or how often you go to the bathroom. With a few exceptions he can fire you for any reason, or no reason. He has you spied on by snitches and supervisors, he amasses a dossier on every employee. Talking back is called “insubordination,” just as if a worker is a naughty child, and it not only gets you fired, it disqualifies you for unemployment compensation. Without necessarily endorsing it for them either, it is noteworthy that children at home and in school receive much the same treatment, justified in their case by their supposed immaturity. What does this say about their parents and teachers who work?
The demeaning system of domination I’ve described rules over half the waking hours of a majority of women and the vast majority of men for decades, for most of their lifespans. For certain purposes it’s not too misleading to call our system democracy or capitalism or — better still — industrialism, but its real names are factory fascism and office oligarchy. Anybody who says these people are “free” is lying or stupid. You are what you do. If you do boring, stupid, monotonous work, chances are you’ll end up boring, stupid and monotonous. Work is a much better explanation for the creeping cretinization all around us than even such significant moronizing mechanisms as television and education. People who are regimented all their lives, handed off to work from school and bracketed by the family in the beginning and the nursing home at the end, are habituated to hierarchy and psychologically enslaved. Their aptitude for autonomy is so atrophied that their fear of freedom is among their few rationally grounded phobias. Their obedience training at work carries over into the families they start, thus reproducing the system in more ways than one, and into politics, culture and everything else. Once you drain the vitality from people at work, they’ll likely submit to hierarchy and expertise in everything. They’re used to it.
We are so close to the world of work that we can’t see what it does to us. We have to rely on outside observers from other times or other cultures to appreciate the extremity and the pathology of our present position. There was a time in our own past when the “work ethic” would have been incomprehensible, and perhaps Weber was on to something when he tied its appearance to a religion, Calvinism, which if it emerged today instead of four centuries ago would immediately and appropriately be labeled a cult. Be that as it may, we have only to draw upon the wisdom of antiquity to put work in perspective. The ancients saw work for what it is, and their view prevailed, the Calvinist cranks notwithstanding, until overthrown by industrialism — but not before receiving the endorsement of its prophets.
Let’s pretend for a moment that work doesn’t turn people into stultified submissives. Let’s pretend, in defiance of any plausible psychology and the ideology of its boosters, that it has no effect on the formation of character. And let’s pretend that work isn’t as boring and tiring and humiliating as we all know it really is. Even then, work would still make a mockery of all humanistic and democratic aspirations, just because it usurps so much of our time. Socrates said that manual laborers make bad friends and bad citizens because they have no time to fulfill the responsibilities of friendship and citizenship. He was right. Because of work, no matter what we do we keep looking at our watches. The only thing “free” about so-called free time is that it doesn’t cost the boss anything. Free time is mostly devoted to getting ready for work, going to work, returning from work, and recovering from work. Free time is a euphemism for the peculiar way labor as a factor of production not only transports itself at its own expense to and from the workplace but assumes primary responsibility for its own maintenance and repair. Coal and steel don’t do that. Lathes and typewriters don’t do that. But workers do. No wonder Edward G. Robinson in one of his gangster movies exclaimed, “Work is for saps!”
Both Plato and Xenophon attribute to Socrates and obviously share with him an awareness of the destructive effects of work on the worker as a citizen and a human being. Herodotus identified contempt for work as an attribute of the classical Greeks at the zenith of their culture. To take only one Roman example, Cicero said that “whoever gives his labor for money sells himself and puts himself in the rank of slaves.” His candor is now rare, but contemporary primitive societies which we are wont to look down upon have provided spokesmen who have enlightened Western anthropologists. The Kapauku of West Irian, according to Pospisil, have a conception of balance in life and accordingly work only every other day, the day of rest designed “to regain the lost power and health.” Our ancestors, even as late as the eighteenth century when they were far along the path to our present predicament, at least were aware of what we have forgotten, the underside of industrialization. Their religious devotion to “St. Monday” — thus establishing a de facto five-day week 150-200 years before its legal consecration — was the despair of the earliest factory owners. They took a long time in submitting to the tyranny of the bell, predecessor of the time clock. In fact it was necessary for a generation or two to replace adult males with women accustomed to obedience and children who could be molded to fit industrial needs. Even the exploited peasants of the ancien regime wrested substantial time back from their landlords’ work. According to Lafargue, a fourth of the French peasants’ calendar was devoted to Sundays and holidays, and Chayanov’s figures from villages in Czarist Russia — hardly a progressive society — likewise show a fourth or fifth of peasants’ days devoted to repose. Controlling for productivity, we are obviously far behind these backward societies. The exploited muzhiks would wonder why any of us are working at all. So should we.
To grasp the full enormity of our deterioration, however, consider the earliest condition of humanity, without government or property, when we wandered as hunter-gatherers. Hobbes surmised that life was then nasty, brutish and short. Others assume that life was a desperate unremitting struggle for subsistence, a war waged against a harsh Nature with death and disaster awaiting the unlucky or anyone who was unequal to the challenge of the struggle for existence. Actually, that was all a projection of fears for the collapse of government authority over communities unaccustomed to doing without it, like the England of Hobbes during the Civil War. Hobbes’ compatriots had already encountered alternative forms of society which illustrated other ways of life — in North America, particularly — but already these were too remote from their experience to be understandable. (The lower orders, closer to the condition of the Indians, understood it better and often found it attractive. Throughout the seventeenth century, English settlers defected to Indian tribes or, captured in war, refused to return. But the Indians no more defected to white settlements than Germans climb the Berlin Wall from the west.) The “survival of the fittest” version — the Thomas Huxley version — of Darwinism was a better account of economic conditions in Victorian England than it was of natural selection, as the anarchist Kropotkin showed in his book Mutual Aid, A Factor of Evolution. (Kropotkin was a scientist — a geographer — who’d had ample involuntary opportunity for fieldwork whilst exiled in Siberia: he knew what he was talking about.) Like most social and political theory, the story Hobbes and his successors told was really unacknowledged autobiography.
The anthropologist Marshall Sahlins, surveying the data on contemporary hunter-gatherers, exploded the Hobbesian myth in an article entitled “The Original Affluent Society.” They work a lot less than we do, and their work is hard to distinguish from what we regard as play. Sahlins concluded that “hunters and gatherers work less than we do; and rather than a continuous travail, the food quest is intermittent, leisure abundant, and there is a greater amount of sleep in the daytime per capita per year than in any other condition of society.” They worked an average of four hours a day, assuming they were “working” at all. Their “labor,” as it appears to us, was skilled labor which exercised their physical and intellectual capacities; unskilled labor on any large scale, as Sahlins says, is impossible except under industrialism. Thus it satisfied Friedrich Schiller’s definition of play, the only occasion on which man realizes his complete humanity by giving full “play” to both sides of his twofold nature, thinking and feeling. As he put it: “The animal works when deprivation is the mainspring of its activity, and it plays when the fullness of its strength is this mainspring, when superabundant life is its own stimulus to activity.” (A modern version — dubiously developmental — is Abraham Maslow’s counterposition of “deficiency” and “growth” motivation.) Play and freedom are, as regards production, coextensive. Even Marx, who belongs (for all his good intentions) in the productivist pantheon, observed that “the realm of freedom does not commence until the point is passed where labor under the compulsion of necessity and external utility is required.” He never could quite bring himself to identify this happy circumstance as what it is, the abolition of work — it’s rather anomalous, after all, to be pro-worker and anti-work — but we can.
The aspiration to go backwards or forwards to a life without work is evident in every serious social or cultural history of pre-industrial Europe, among them M. Dorothy George’s England In Transition and Peter Burke’s Popular Culture in Early Modern Europe. Also pertinent is Daniel Bell’s essay, “Work and its Discontents,” the first text, I believe, to refer to the “revolt against work” in so many words and, had it been understood, an important correction to the complacency ordinarily associated with the volume in which it was collected, The End of Ideology. Neither critics nor celebrants have noticed that Bell’s end-of-ideology thesis signaled not the end of social unrest but the beginning of a new, uncharted phase unconstrained and uninformed by ideology. It was Seymour Lipset (in Political Man), not Bell, who announced at the same time that “the fundamental problems of the Industrial Revolution have been solved,” only a few years before the post- or meta-industrial discontents of college students drove Lipset from UC Berkeley to the relative (and temporary) tranquility of Harvard.
As Bell notes, Adam Smith in The Wealth of Nations, for all his enthusiasm for the market and the division of labor, was more alert to (and more honest about) the seamy side of work than Ayn Rand or the Chicago economists or any of Smith’s modern epigones. As Smith observed: “The understandings of the greater part of men are necessarily formed by their ordinary employments. The man whose life is spent in performing a few simple operations… has no occasion to exert his understanding… He generally becomes as stupid and ignorant as it is possible for a human creature to become.” Here, in a few blunt words, is my critique of work. Bell, writing in 1956, the Golden Age of Eisenhower imbecility and American self-satisfaction, identified the unorganized, unorganizable malaise of the 1970’s and since, the one no political tendency is able to harness, the one identified in HEW’s report Work in America, the one which cannot be exploited and so is ignored. That problem is the revolt against work. It does not figure in any text by any laissez-faire economist — Milton Friedman, Murray Rothbard, Richard Posner — because, in their terms, as they used to say on Star Trek, “it does not compute.”
If these objections, informed by the love of liberty, fail to persuade humanists of a utilitarian or even paternalist turn, there are others which they cannot disregard. Work is hazardous to your health, to borrow a book title. In fact, work is mass murder or genocide. Directly or indirectly, work will kill most of the people who read these words. Between 14,000 and 25,000 workers are killed annually in this country on the job. Over two million are disabled. Twenty to twenty-five million are injured every year. And these figures are based on a very conservative estimation of what constitutes a work-related injury. Thus they don’t count the half million cases of occupational disease every year. I looked at one medical textbook on occupational diseases which was 1,200 pages long. Even this barely scratches the surface. The available statistics count the obvious cases like the 100,000 miners who have black lung disease, of whom 4,000 die every year, a much higher fatality rate than for AIDS, for instance, which gets so much media attention. This reflects the unvoiced assumption that AIDS afflicts perverts who could control their depravity whereas coal-mining is a sacrosanct activity beyond question. What the statistics don’t show is that tens of millions of people have their lifespans shortened by work — which is all that homicide means, after all. Consider the doctors who work themselves to death in their 50’s. Consider all the other workaholics.
Even if you aren’t killed or crippled while actually working, you very well might be while going to work, coming from work, looking for work, or trying to forget about work. The vast majority of victims of the automobile are either doing one of these work-obligatory activities or else fall afoul of those who do them. To this augmented body-count must be added the victims of auto-industrial pollution and work-induced alcoholism and drug addiction. Both cancer and heart disease are modern afflictions normally traceable, directly, or indirectly, to work.
Work, then, institutionalizes homicide as a way of life. People think the Cambodians were crazy for exterminating themselves, but are we any different? The Pol Pot regime at least had a vision, however blurred, of an egalitarian society. We kill people in the six-figure range (at least) in order to sell Big Macs and Cadillacs to the survivors. Our forty or fifty thousand annual highway fatalities are victims, not martyrs. They died for nothing — or rather, they died for work. But work is nothing to die for.
Bad news for liberals: regulatory tinkering is useless in this life-and-death context. The federal Occupational Safety and Health Administration was designed to police the core part of the problem, workplace safety. Even before Reagan and the Supreme Court stifled it, OSHA was a farce. At previous and (by current standards) generous Carter-era funding levels, a workplace could expect a random visit from an OSHA inspector once every 46 years.
State control of the economy is no solution. Work is, if anything, more dangerous in the state-socialist countries than it is here. Thousands of Russian workers were killed or injured building the Moscow subway. Stories reverberate about covered-up Soviet nuclear disasters which make Times Beach and Three Mile Island look like elementary-school air-raid drills. On the other hand, deregulation, currently fashionable, won’t help and will probably hurt. From a health and safety standpoint, among others, work was at its worst in the days when the economy most closely approximated laissez-faire.
Historians like Eugene Genovese have argued persuasively that — as antebellum slavery apologists insisted — factory wage-workers in the Northern American states and in Europe were worse off than Southern plantation slaves. No rearrangement of relations among bureaucrats and businessmen seems to make much difference at the point of production. Serious enforcement of even the rather vague standards enforceable in theory by OSHA would probably bring the economy to a standstill. The enforcers apparently appreciate this, since they don’t even try to crack down on most malefactors.
What I’ve said so far ought not to be controversial. Many workers are fed up with work. There are high and rising rates of absenteeism, turnover, employee theft and sabotage, wildcat strikes, and overall goldbricking on the job. There may be some movement toward a conscious and not just visceral rejection of work. And yet the prevalent feeling, universal among bosses and their agents and also widespread among workers themselves, is that work itself is inevitable and necessary.
I disagree. It is now possible to abolish work and replace it, insofar as it serves useful purposes, with a multitude of new kinds of free activities. To abolish work requires going at it from two directions, quantitative and qualitative. On the one hand, on the quantitative side, we have to cut down massively on the amount of work being done. At present most work is useless or worse and we should simply get rid of it. On the other hand — and I think this is the crux of the matter and the revolutionary new departure — we have to take what useful work remains and transform it into a pleasing variety of game-like and craft-like pastimes, indistinguishable from other pleasurable pastimes, except that they happen to yield useful end-products. Surely that shouldn’t make them less enticing to do. Then all the artificial barriers of power and property could come down. Creation could become recreation. And we could all stop being afraid of each other.
I don’t suggest that most work is salvageable in this way. But then most work isn’t worth trying to save. Only a small and diminishing fraction of work serves any useful purpose independent of the defense and reproduction of the work-system and its political and legal appendages. Twenty years ago, Paul and Percival Goodman estimated that just five percent of the work then being done — presumably the figure, if accurate, is lower now — would satisfy our minimal needs for food, clothing, and shelter. Theirs was only an educated guess but the main point is quite clear: directly or indirectly, most work serves the unproductive purposes of commerce or social control. Right off the bat we can liberate tens of millions of salesmen, soldiers, managers, cops, stockbrokers, clergymen, bankers, lawyers, teachers, landlords, security guards, ad-men and everyone who works for them. There is a snowball effect since every time you idle some bigshot you liberate his flunkeys and underlings also. Thus the economy implodes.
Forty percent of the workforce are white-collar workers, most of whom have some of the most tedious and idiotic jobs ever concocted. Entire industries, insurance and banking and real estate for instance, consist of nothing but useless paper-shuffling. It is no accident that the “tertiary sector,” the service sector, is growing while the “secondary sector” (industry) stagnates and the “primary sector” (agriculture) nearly disappears. Because work is unnecessary except to those whose power it secures, workers are shifted from relatively useful to relatively useless occupations as a measure to assure public order. Anything is better than nothing. That’s why you can’t go home just because you finish early. They want your time, enough of it to make you theirs, even if they have no use for most of it. Otherwise why hasn’t the average work week gone down by more than a few minutes in the past fifty years?
Next we can take a meat-cleaver to production work itself. No more war production, nuclear power, junk food, feminine hygiene deodorant — and above all, no more auto industry to speak of. An occasional Stanley Steamer or Model-T might be all right, but the auto-eroticism on which such pestholes as Detroit and Los Angeles depend is out of the question. Already, without even trying, we’ve virtually solved the energy crisis, the environmental crisis and assorted other insoluble social problems.
Finally, we must do away with far and away the largest occupation, the one with the longest hours, the lowest pay and some of the most tedious tasks around. I refer to housewives doing housework and child-rearing. By abolishing wage-labor and achieving full unemployment we undermine the sexual division of labor. The nuclear family as we know it is an inevitable adaptation to the division of labor imposed by modern wage-work. Like it or not, as things have been for the last century or two it is economically rational for the man to bring home the bacon, for the woman to do the shitwork to provide him with a haven in a heartless world, and for the children to be marched off to youth concentration camps called “schools,” primarily to keep them out of Mom’s hair but still under control, but incidentally to acquire the habits of obedience and punctuality so necessary for workers. If you would be rid of patriarchy, get rid of the nuclear family whose unpaid “shadow work,” as Ivan Illich says, makes possible the work-system that makes it necessary. Bound up with this no-nukes strategy is the abolition of childhood and the closing of the schools. There are more full-time students than full-time workers in this country. We need children as teachers, not students. They have a lot to contribute to the ludic revolution because they’re better at playing than grown-ups are. Adults and children are not identical but they will become equal through interdependence. Only play can bridge the generation gap.
I haven’t as yet even mentioned the possibility of cutting way down on the little work that remains by automating and cybernizing it. All the scientists and engineers and technicians freed from bothering with war research and planned obsolescence would have a good time devising means to eliminate fatigue and tedium and danger from activities like mining. Undoubtedly they’ll find other projects to amuse themselves with. Perhaps they’ll set up world-wide all-inclusive multi-media communications systems or found space colonies. Perhaps. I myself am no gadget freak. I wouldn’t care to live in a pushbutton paradise. I don’t want robot slaves to do everything; I want to do things myself. There is, I think, a place for labor-saving technology, but a modest place. The historical and pre-historical record is not encouraging. When productive technology went from hunting-gathering to agriculture and on to industry, work increased while skills and self-determination diminished. The further evolution of industrialism has accentuated what Harry Braverman called the degradation of work. Intelligent observers have always been aware of this. John Stuart Mill wrote that all the labor-saving inventions ever devised haven’t saved a moment’s labor. Karl Marx wrote that “it would be possible to write a history of the inventions, made since 1830, for the sole purpose of supplying capital with weapons against the revolts of the working class.” The enthusiastic technophiles — Saint-Simon, Comte, Lenin, B. F. Skinner — have always been unabashed authoritarians also; which is to say, technocrats. We should be more than sceptical about the promises of the computer mystics. They work like dogs; chances are, if they have their way, so will the rest of us. But if they have any particularized contributions more readily subordinated to human purposes than the run of high tech, let’s give them a hearing.
What I really want to see is work turned into play. A first step is to discard the notions of a “job” and an “occupation.” Even activities that already have some ludic content lose most of it by being reduced to jobs which certain people, and only those people, are forced to do to the exclusion of all else. Is it not odd that farm workers toil painfully in the fields while their air-conditioned masters go home every weekend and putter about in their gardens? Under a system of permanent revelry, we will witness the Golden Age of the dilettante which will put the Renaissance to shame. There won’t be any more jobs, just things to do and people to do them.
The secret of turning work into play, as Charles Fourier demonstrated, is to arrange useful activities to take advantage of whatever it is that various people at various times in fact enjoy doing. To make it possible for some people to do the things they could enjoy it will be enough just to eradicate the irrationalities and distortions which afflict these activities when they are reduced to work. I, for instance, would enjoy doing some (not too much) teaching, but I don’t want coerced students and I don’t care to suck up to pathetic pedants for tenure.
Second, there are some things that people like to do from time to time, but not for too long, and certainly not all the time. You might enjoy baby-sitting for a few hours in order to share the company of kids, but not as much as their parents do. The parents meanwhile, profoundly appreciate the time to themselves that you free up for them, although they’d get fretful if parted from their progeny for too long. These differences among individuals are what make a life of free play possible. The same principle applies to many other areas of activity, especially the primal ones. Thus many people enjoy cooking when they can practice it seriously at their leisure, but not when they’re just fueling up human bodies for work.
Third — other things being equal — some things that are unsatisfying if done by yourself or in unpleasant surroundings or at the orders of an overlord are enjoyable, at least for a while, if these circumstances are changed. This is probably true, to some extent, of all work. People deploy their otherwise wasted ingenuity to make a game of the least inviting drudge-jobs as best they can. Activities that appeal to some people don’t always appeal to all others, but everyone at least potentially has a variety of interests and an interest in variety. As the saying goes, “anything once.” Fourier was the master at speculating how aberrant and perverse penchants could be put to use in post-civilized society, what he called Harmony. He thought the Emperor Nero would have turned out all right if as a child he could have indulged his taste for bloodshed by working in a slaughterhouse. Small children who notoriously relish wallowing in filth could be organized in “Little Hordes” to clean toilets and empty the garbage, with medals awarded to the outstanding. I am not arguing for these precise examples but for the underlying principle, which I think makes perfect sense as one dimension of an overall revolutionary transformation. Bear in mind that we don’t have to take today’s work just as we find it and match it up with the proper people, some of whom would have to be perverse indeed. If technology has a role in all this it is less to automate work out of existence than to open up new realms for re/creation. To some extent we may want to return to handicrafts, which William Morris considered a probable and desirable upshot of communist revolution. Art would be taken back from the snobs and collectors, abolished as a specialized department catering to an elite audience, and its qualities of beauty and creation restored to integral life from which they were stolen by work. 
It's a sobering thought that the Grecian urns we write odes about and showcase in museums were used in their own time to store olive oil. I doubt our everyday artifacts will fare as well in the future, if there is one. The point is that there's no such thing as progress in the world of work; if anything, it's just the opposite. We shouldn't hesitate to pilfer the past for what it has to offer; the ancients lose nothing, yet we are enriched.
The reinvention of daily life means marching off the edge of our maps. There is, it is true, more suggestive speculation than most people suspect. Besides Fourier and Morris — and even a hint, here and there, in Marx — there are the writings of Kropotkin, the syndicalists Pataud and Pouget, and anarcho-communists old (Berkman) and new (Bookchin). The Goodman brothers' Communitas is exemplary for illustrating what forms follow from given functions (purposes), and there is something to be gleaned from the often hazy heralds of alternative/appropriate/intermediate/convivial technology, like Schumacher and especially Illich, once you disconnect their fog machines. The situationists — as represented by Vaneigem's Revolution of Daily Life and in the Situationist International Anthology — are so ruthlessly lucid as to be exhilarating, even if they never did quite square the endorsement of the rule of workers' councils with the abolition of work. Better their incongruity, though, than any extant version of leftism, whose devotees look to be the last champions of work; for if there were no work there would be no workers, and without workers, who would the left have to organize?
So the abolitionists would be largely on their own. No one can say what would result from unleashing the creative power stultified by work. Anything can happen. The tiresome debater’s problem of freedom vs. necessity, with its theological overtones, resolves itself practically once the production of use-values is coextensive with the consumption of delightful play-activity.
Life will become a game, or rather many games, but not — as it is now — a zero-sum game. An optimal sexual encounter is the paradigm of productive play: the participants potentiate each other's pleasures, nobody keeps score, and everybody wins. The more you give, the more you get. In the ludic life, the best of sex will diffuse into the better part of daily life. Generalized play leads to the libidinization of life. Sex, in turn, can become less urgent and desperate, more playful. If we play our cards right, we can all get more out of life than we put into it; but only if we play for keeps.
No one should ever work. Workers of the world… relax!
Loving & Understanding an Empath | Elephant Journal
On Healing Other People's Negativity | Go DEEP With Ven
Janov's Reflections on the Human Condition: More on Feelings
Unethical Mental Health Practices: What Do They Look Like? | Anchored-In-Knowledge Counseling
How to Find the Work You Were Meant to Do
The Drama of Deception | Psychology Today
Forgiving Yourself After Abuse: the Reconciliation of Heart and Mind | Narcissist, Sociopath, And Psychopath Abuse Recover
Soulmates in Hell: 15 Distinctive Phases of a Relationship With an N/S/P
What It Means to Be Addicted to a Narcissist and How to Break Free From It | Narcissism Recovery and Relationships Blog
Why Do We Repeat the Past in Our Relationships? | Psychology Today
Patience, The Art of Intelligent Waiting – By Sara Childre
The Art of Self-Forgiveness | Wildmind Buddhist Meditation
Needy Narcissism – Article by Dr. Lynne Namka
Filling the Hole in Your Heart: Recovering From Childhood | Psychology Today
8 Types of Toxic Patterns in Mother-Daughter Relationships | Psychology Today
54 Principles of Emotional Healing | Go DEEP With Ven
20 Characteristics of a Con Man Sociopath | True Love Scam
10 Things That Cloak Bad Therapists | Anchored-In-Knowledge Counseling
9 Signs Your Coworker is a Psychopath | Business | News | The Independent
The Last Reflex Deja (Back Up) Past Lives, Cosmic Adventures of the Being
4 Reasons Why Intelligent People Have the Hardest Time Finding Love – DavidWolfe.com
4 Facts About Trauma-bonding in Abusive Relationships | Avalanche of the Soul
3 Ways to Maintain Your Self-Preservation When Dealing With a Narcissist | Life, Health, Career Coaching
3 Major Signs You're in a Trauma Bond | Avalanche of the Soul
1 in 25 People Have No Conscience – What You Should Know About Psychopaths – Simona Rich
Posted: September 8, 2016 at 6:39 am
LEGAL DISCLAIMER: The VRTO Conference and Expo is for all ages over 13, though some demos may not accommodate minors. Tickets are non-refundable. VRTO attendance does not guarantee access to any particular demo, but rather the event as a whole.
By attending Virtual Reality Toronto Conference and Expo 2016 (aka VRTO Virtual & Augmented Reality World Conference and Expo, VRTO Con, or VRTOC) at 50 Carlton St., Toronto, Ontario, Canada and the Holiday Inn Toronto Downtown Centre (the "Location") on June 25th–27th, 2016, you release VRTO Conference and Expo, the Location, and all of the event's organizers, volunteers, employees, exhibitors, vendors and contractors from any and all liability, financial or otherwise, that may arise from your attendance at the above-mentioned event. You are hereby aware that some people experience nausea, disorientation, motion sickness, general discomfort, headaches or other health issues when experiencing virtual reality.
Your entry signals your agreement to take full responsibility for these or any other consequences that may arise from attendance at VRTO Conference and Expo. Be aware that some content being exhibited is in an unfinished, prototype state, and may heighten the above sensations. Demo experiences are explicitly not intended for users under 13 years of age, and your entry signals your agreement to take full responsibility for any use of virtual reality or any other technologies exhibited at VRTO Conference and Expo by any minors in your care. By remaining on the premises you agree that you will not pursue legal action against VRTO Conference and Expo, the Location, or any of the event's organizers, volunteers, employees, exhibitors, vendors, or contractors for any damages, real or perceived, arising from attendance at VRTO Conference and Expo. If you do not wish to be subject to the foregoing, please do not attend VRTO Conference and Expo.
Your attendance signals your irrevocable consent to, and authorization without compensation for VRTO, its successors, assigns, contractors and other film crews permitted by the VRTO to film at VRTO Conference and Expo, to use your likeness, voice, and to make video and audio recordings of your attendance at VRTO Conference and Expo. You are hereby aware of such recording, and relinquish your rights to any compensation for any release of these recordings in any media now extant or to be devised in the future. If you do not wish to be subject to the foregoing, please do not attend VRTO Conference and Expo.