Author Archives: jos

Human – Wikipedia

Posted: October 25, 2016 at 7:34 am

Human.[1] Temporal range: 0.195–0 Ma (Middle Pleistocene – Recent). An adult human male (left) and female (right) in Northern Thailand. Scientific classification: Kingdom: Animalia; Phylum: Chordata; Clade: Synapsida; Class: Mammalia; Order: Primates; Suborder: Haplorhini; Family: Hominidae; Genus: Homo; Species: H. sapiens. Binomial name: Homo sapiens Linnaeus, 1758. Subspecies: Homo sapiens idaltu White et al., 2003; Homo sapiens sapiens

Modern humans (Homo sapiens, primarily ssp. Homo sapiens sapiens) are the only extant members of the Hominina clade (or human clade), a branch of the taxonomical tribe Hominini belonging to the family of great apes. They are characterized by erect posture and bipedal locomotion; manual dexterity and increased tool use, compared to other animals; and a general trend toward larger, more complex brains and societies.[3][4]

Early hominins, particularly the australopithecines, whose brains and anatomy are in many ways more similar to ancestral non-human apes, are less often referred to as “human” than hominins of the genus Homo.[5] Several of these hominins used fire, occupied much of Eurasia, and gave rise to anatomically modern Homo sapiens in Africa about 200,000 years ago.[6][7] They began to exhibit evidence of behavioral modernity around 50,000 years ago. In several waves of migration, anatomically modern humans ventured out of Africa and populated most of the world.[8]

The spread of humans and their large and increasing population has had a profound impact on large areas of the environment and millions of native species worldwide. Advantages that explain this evolutionary success include a relatively larger brain with a particularly well-developed neocortex, prefrontal cortex and temporal lobes, which enable high levels of abstract reasoning, language, problem solving, sociality, and culture through social learning. Humans use tools to a much higher degree than any other animal, are the only extant species known to build fires and cook their food, and are the only extant species to clothe themselves and create and use numerous other technologies and arts.

Humans are uniquely adept at utilizing systems of symbolic communication (such as language and art) for self-expression and the exchange of ideas, and for organizing themselves into purposeful groups. Humans create complex social structures composed of many cooperating and competing groups, from families and kinship networks to political states. Social interactions between humans have established an extremely wide variety of values,[9] social norms, and rituals, which together form the basis of human society. Curiosity and the human desire to understand and influence the environment and to explain and manipulate phenomena (or events) have provided the foundation for developing science, philosophy, mythology, religion, anthropology, and numerous other fields of knowledge.

Though most of human existence has been sustained by hunting and gathering in band societies,[10] increasing numbers of human societies began to practice sedentary agriculture approximately 10,000 years ago,[11] domesticating plants and animals, thus allowing for the growth of civilization. These human societies subsequently expanded in size, establishing various forms of government, religion, and culture around the world, unifying people within regions to form states and empires. The rapid advancement of scientific and medical understanding in the 19th and 20th centuries led to the development of fuel-driven technologies and increased lifespans, causing the human population to rise exponentially. By February 2016, the global human population had exceeded 7.3 billion.[12]

In common usage, the word “human” generally refers to the only extant species of the genus Homo: anatomically and behaviorally modern Homo sapiens.

In scientific terms, the meanings of “hominid” and “hominin” have changed during recent decades with advances in the discovery and study of the fossil ancestors of modern humans. The previously clear boundary between humans and apes has blurred: the hominids are now acknowledged to encompass multiple species, while only Homo and its close relatives since the split from chimpanzees are counted as hominins. There is also a distinction between anatomically modern humans and archaic Homo sapiens, the earliest fossil members of the species.

The English adjective human is a Middle English loanword from Old French humain, ultimately from Latin hūmānus, the adjective form of homō “man.” The word’s use as a noun (with a plural: humans) dates to the 16th century.[13] The native English term man can refer to the species generally (a synonym for humanity), and could formerly refer to specific individuals of either sex, though this latter use is now obsolete.[14]

The species binomial Homo sapiens was coined by Carl Linnaeus in his 18th-century work Systema Naturae.[15] The generic name Homo is a learned 18th-century derivation from Latin homō “man,” ultimately “earthly being” (Old Latin hemō, a cognate of Old English guma “man,” from PIE *dʰǵʰemon-, meaning “earth” or “ground”).[16] The species-name sapiens means “wise” or “sapient.” Note that the Latin word homo refers to humans of either gender, and that sapiens is the singular form (while there is no such word as sapien).[17]

The genus Homo evolved and diverged from other hominins in Africa, after the human clade split from the chimpanzee lineage of the hominids (great apes) branch of the primates. Modern humans, defined as the species Homo sapiens or, more narrowly, as the single extant subspecies Homo sapiens sapiens, proceeded to colonize all the continents and larger islands, arriving in Eurasia 125,000–60,000 years ago,[18][19] Australia around 40,000 years ago, the Americas around 15,000 years ago, and remote islands such as Hawaii, Easter Island, Madagascar, and New Zealand between the years 300 and 1280.[20][21]

The closest living relatives of humans are chimpanzees (genus Pan) and gorillas (genus Gorilla).[22] With the sequencing of both the human and chimpanzee genomes, current estimates of similarity between human and chimpanzee DNA sequences range between 95% and 99%.[22][23][24] By using the technique called the molecular clock, which estimates the time required for a given number of divergent mutations to accumulate between two lineages, the approximate date of the split between lineages can be calculated. The gibbons (Hylobatidae) and orangutans (genus Pongo) were the first groups to split from the line leading to humans, then gorillas (genus Gorilla), followed by the chimpanzees (genus Pan). The splitting date between the human and chimpanzee lineages is placed around 4–8 million years ago, during the late Miocene epoch.[25][26] During this split, chromosome 2 was formed from the fusion of two other chromosomes, leaving humans with only 23 pairs of chromosomes, compared to 24 for the other apes.[27][28]
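The molecular-clock estimate mentioned above reduces to simple arithmetic: if neutral mutations accumulate at a roughly constant rate along each lineage, the time since two lineages split is the observed sequence divergence divided by twice the per-lineage substitution rate. The sketch below, in Python, illustrates the calculation; the rate and divergence figures are illustrative assumptions, not measured values from the cited studies.

    # Minimal molecular-clock sketch.
    # Divergence time ~= sequence divergence / (2 * per-lineage substitution rate);
    # the factor of 2 appears because both lineages accumulate mutations independently.
    def divergence_time_years(seq_divergence, subs_per_site_per_year):
        """seq_divergence: fraction of sites differing between the two lineages.
        subs_per_site_per_year: assumed neutral substitution rate along one lineage."""
        return seq_divergence / (2.0 * subs_per_site_per_year)

    # Illustrative numbers only: ~1.2% neutral divergence at ~1e-9 substitutions/site/year
    # gives a split on the order of 6 million years ago.
    print(divergence_time_years(0.012, 1e-9))  # -> 6000000.0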

There is little fossil evidence for the divergence of the gorilla, chimpanzee and hominin lineages.[29][30] The earliest fossils that have been proposed as members of the hominin lineage are Sahelanthropus tchadensis, dating from 7 million years ago, Orrorin tugenensis, dating from 5.7 million years ago, and Ardipithecus kadabba, dating to 5.6 million years ago. Each of these species has been argued to be a bipedal ancestor of later hominins, but all such claims are contested. It is also possible that any one of the three is an ancestor of another branch of African apes, or is an ancestor shared between hominins and other African Hominoidea (apes). The question of the relation between these early fossil species and the hominin lineage is still to be resolved. From these early species, the australopithecines arose around 4 million years ago and diverged into robust (also called Paranthropus) and gracile branches, possibly one of which (such as A. garhi, dating to 2.5 million years ago) is a direct ancestor of the genus Homo.[citation needed]

The earliest member of the genus Homo is Homo habilis, which evolved around 2.8 million years ago.[31] Homo habilis has been considered the first species for which there is clear evidence of the use of stone tools; in 2015, however, stone tools perhaps predating Homo habilis were discovered in northwestern Kenya and dated to 3.3 million years old.[32] Nonetheless, the brains of Homo habilis were about the same size as those of a chimpanzee, and their main adaptation was bipedalism, an adaptation to terrestrial living. During the next million years a process of encephalization began, and by the arrival of Homo erectus in the fossil record, cranial capacity had doubled. Homo erectus was the first of the hominina to leave Africa, and this species spread through Africa, Asia, and Europe between 1.3 and 1.8 million years ago. One population of H. erectus, also sometimes classified as a separate species Homo ergaster, stayed in Africa and evolved into Homo sapiens. It is believed that these species were the first to use fire and complex tools. The earliest transitional fossils between H. ergaster/erectus and archaic humans are from Africa, such as Homo rhodesiensis, but seemingly transitional forms have also been found at Dmanisi, Georgia. These descendants of African H. erectus spread through Eurasia from ca. 500,000 years ago, evolving into H. antecessor, H. heidelbergensis and H. neanderthalensis. The earliest fossils of anatomically modern humans are from the Middle Paleolithic, about 200,000 years ago, such as the Omo remains of Ethiopia and the fossils from Herto, sometimes classified as Homo sapiens idaltu.[33] Later fossils of archaic Homo sapiens from Skhul in Israel and Southern Europe begin around 90,000 years ago.[34]

Human evolution is characterized by a number of morphological, developmental, physiological, and behavioral changes that have taken place since the split between the last common ancestor of humans and chimpanzees. The most significant of these adaptations are bipedalism, increased brain size, lengthened ontogeny (gestation and infancy), and decreased sexual dimorphism (neoteny). The relationship between all these changes is the subject of ongoing debate.[35] Other significant morphological changes included the evolution of a power and precision grip, a change first occurring in H. erectus.[36]

Bipedalism is the basic adaptation of the hominin line, and it is considered the main cause behind a suite of skeletal changes shared by all bipedal hominins. The earliest bipedal hominin is considered to be either Sahelanthropus[37] or Orrorin, with Ardipithecus, a full biped, coming somewhat later.[citation needed] The knuckle walkers, the gorilla and chimpanzee, diverged around the same time, and either Sahelanthropus or Orrorin may be humans’ last shared ancestor with those animals.[citation needed] The early bipedals eventually evolved into the australopithecines and later the genus Homo.[citation needed] There are several theories of the adaptational value of bipedalism. It is possible that bipedalism was favored because it freed up the hands for reaching and carrying food, because it saved energy during locomotion, because it enabled long-distance running and hunting, or as a strategy for avoiding hyperthermia by reducing the surface exposed to direct sun.[citation needed]

The human species developed a much larger brain than that of other primates: typically 1,330 cm3 in modern humans, over twice the size of that of a chimpanzee or gorilla.[38] The pattern of encephalization started with Homo habilis, which at approximately 600 cm3 had a brain slightly larger than that of chimpanzees, continued with Homo erectus (800–1,100 cm3), and reached a maximum in Neanderthals, with an average size of 1,200–1,900 cm3, larger even than that of Homo sapiens (but less encephalized).[39] The pattern of human postnatal brain growth differs from that of other apes (heterochrony), and allows for extended periods of social learning and language acquisition in juvenile humans. However, the differences between the structure of human brains and those of other apes may be even more significant than differences in size.[40][41][42][43] The increase in volume over time has affected different areas within the brain unequally: the temporal lobes, which contain centers for language processing, have increased disproportionately, as has the prefrontal cortex, which has been related to complex decision making and moderating social behavior.[38] Encephalization has been tied to an increasing emphasis on meat in the diet,[44][45] or to the development of cooking,[46] and it has been proposed[47] that intelligence increased as a response to an increased necessity for solving social problems as human society became more complex.

The reduced degree of sexual dimorphism is primarily visible in the reduction of the male canine tooth relative to other ape species (except gibbons). Another important physiological change related to sexuality in humans was the evolution of hidden estrus. Humans are the only ape in which the female is fertile year round, and in which no special signals of fertility are produced by the body (such as genital swelling during estrus). Nonetheless humans retain a degree of sexual dimorphism in the distribution of body hair and subcutaneous fat, and in the overall size, males being around 25% larger than females. These changes taken together have been interpreted as a result of an increased emphasis on pair bonding as a possible solution to the requirement for increased parental investment due to the prolonged infancy of offspring.[citation needed]

By the beginning of the Upper Paleolithic period (50,000 BP), full behavioral modernity, including language, music and other cultural universals had developed.[48][49] As modern humans spread out from Africa they encountered other hominids such as Homo neanderthalensis and the so-called Denisovans. The nature of interaction between early humans and these sister species has been a long-standing source of controversy, the question being whether humans replaced these earlier species or whether they were in fact similar enough to interbreed, in which case these earlier populations may have contributed genetic material to modern humans.[50] Recent studies of the human and Neanderthal genomes suggest gene flow between archaic Homo sapiens and Neanderthals and Denisovans.[51][52][53] In March 2016, studies were published that suggest that modern humans bred with hominins, including Denisovans and Neanderthals, on multiple occasions.[54]

This dispersal out of Africa is estimated to have begun about 70,000 years BP from Northeast Africa. Current evidence suggests that there was only one such dispersal and that it only involved a few hundred individuals. The vast majority of humans stayed in Africa and adapted to a diverse array of environments.[55] Modern humans subsequently spread globally, replacing earlier hominins (either through competition or hybridization). They inhabited Eurasia and Oceania by 40,000 years BP, and the Americas at least 14,500 years BP.[56][57]

Until about 10,000 years ago, humans lived as hunter-gatherers. They gradually gained domination over much of the natural environment. They generally lived in small nomadic groups known as band societies, often in caves. The advent of agriculture prompted the Neolithic Revolution, when access to food surplus led to the formation of permanent human settlements, the domestication of animals and the use of metal tools for the first time in history. Agriculture encouraged trade and cooperation, and led to complex society.[citation needed]

The early civilizations of Mesopotamia, Egypt, India, China, the Maya, Greece and Rome were some of the cradles of civilization.[58][59][60] The Late Middle Ages and the Early Modern Period saw the rise of revolutionary ideas and technologies. Over the next 500 years, exploration and European colonialism brought great parts of the world under European control, leading to later struggles for independence. The concept of the modern world as distinct from an ancient world rests on the rapid progress achieved in many areas within a brief period of time.[citation needed] Advances in all areas of human activity prompted new theories such as evolution and psychoanalysis, which changed humanity’s views of itself.[citation needed] The Scientific Revolution, the Technological Revolution and the Industrial Revolution up until the 19th century resulted in discoveries and innovations such as imaging technology, major advances in transport such as the airplane and automobile, and energy development such as coal and electricity.[61] This correlates with population growth (especially in America)[62] and higher life expectancy; the world population increased rapidly in the 19th and 20th centuries, and nearly 10% of the roughly 100 billion people who have ever lived were alive during the past century.[63]

With the advent of the Information Age at the end of the 20th century, modern humans live in a world that has become increasingly globalized and interconnected. As of 2010, almost 2 billion humans were able to communicate with each other via the Internet,[64] and 3.3 billion had mobile phone subscriptions.[65] Although interconnection between humans has encouraged the growth of science, art, discussion, and technology, it has also led to culture clashes and the development and use of weapons of mass destruction.[citation needed] Human civilization has led to environmental destruction and pollution, significantly contributing to the ongoing mass extinction of other forms of life called the Holocene extinction event,[66] which may be further accelerated by global warming in the future.[67]

Early human settlements were dependent on proximity to water and, depending on the lifestyle, other natural resources used for subsistence, such as populations of animal prey for hunting and arable land for growing crops and grazing livestock. But humans have a great capacity for altering their habitats by means of technology, through irrigation, urban planning, construction, transport, manufacturing goods, deforestation and desertification. Deliberate habitat alteration is often done with the goals of increasing material wealth, increasing thermal comfort, improving the amount of food available, improving aesthetics, or improving ease of access to resources or other human settlements. With the advent of large-scale trade and transport infrastructure, proximity to these resources has become unnecessary, and in many places, these factors are no longer a driving force behind the growth and decline of a population. Nonetheless, the manner in which a habitat is altered is often a major determinant in population change.[citation needed]

Technology has allowed humans to colonize all of the continents and adapt to virtually all climates. Within the last century, humans have explored Antarctica, the ocean depths, and outer space, although large-scale colonization of these environments is not yet feasible. With a population of over seven billion, humans are among the most numerous of the large mammals. Most humans (61%) live in Asia. The remainder live in the Americas (14%), Africa (14%), Europe (11%), and Oceania (0.5%).[68]

Human habitation within closed ecological systems in hostile environments, such as Antarctica and outer space, is expensive, typically limited in duration, and restricted to scientific, military, or industrial expeditions. Life in space has been very sporadic, with no more than thirteen humans in space at any given time.[69] Between 1969 and 1972, two humans at a time spent brief intervals on the Moon. As of October 2016, no other celestial body has been visited by humans, although there has been a continuous human presence in space since the launch of the initial crew to inhabit the International Space Station on October 31, 2000.[70] However, other celestial bodies have been visited by human-made objects.[71][72][73]

Since 1800, the human population has increased from one billion[74] to over seven billion.[75] In 2004, some 2.5 billion out of 6.3 billion people (39.7%) lived in urban areas. In February 2008, the U.N. estimated that half the world’s population would live in urban areas by the end of the year.[76] Problems for humans living in cities include various forms of pollution and crime,[77] especially in inner city and suburban slums. Both overall population numbers and the proportion residing in cities are expected to increase significantly in the coming decades.[78]

Humans have had a dramatic effect on the environment. Humans are apex predators, being rarely preyed upon by other species.[79] Currently, through land development, combustion of fossil fuels, and pollution, humans are thought to be the main contributor to global climate change.[80] If this continues at its current rate it is predicted that climate change will wipe out half of all plant and animal species over the next century.[81][82]

Most aspects of human physiology are closely homologous to corresponding aspects of animal physiology. The human body consists of the legs, the torso, the arms, the neck, and the head. An adult human body consists of about 100 trillion (10¹⁴) cells. The most commonly defined body systems in humans are the nervous, the cardiovascular, the circulatory, the digestive, the endocrine, the immune, the integumentary, the lymphatic, the musculoskeletal, the reproductive, the respiratory, and the urinary system.[83][84]

Humans, like most of the other apes, lack external tails, have several blood type systems, have opposable thumbs, and are sexually dimorphic. The comparatively minor anatomical differences between humans and chimpanzees are a result of human bipedalism. One difference is that humans have a far faster and more accurate throw than other animals. Humans are also among the best long-distance runners in the animal kingdom, but slower over short distances.[85][86] Humans’ thinner body hair and more productive sweat glands help avoid heat exhaustion while running for long distances.[87]

As a consequence of bipedalism, human females have narrower birth canals. The construction of the human pelvis differs from other primates, as do the toes. A trade-off for these advantages of the modern human pelvis is that childbirth is more difficult and dangerous than in most mammals, especially given the larger head size of human babies compared to other primates. This means that human babies must turn around as they pass through the birth canal, which other primates do not do, and it makes humans the only species in which females require help from their conspecifics[clarification needed] to reduce the risks of birthing. As a partial evolutionary solution, human fetuses are born less developed and more vulnerable. Chimpanzee babies are cognitively more developed than human babies until the age of six months, when the rapid development of human brains surpasses that of chimpanzees. Another difference between women and chimpanzee females is that women go through menopause and become infertile decades before the end of their lives. All species of non-human apes are capable of giving birth until death. Menopause probably developed because it provided an evolutionary advantage (more caring time) to young relatives.[86]

Apart from bipedalism, humans differ from chimpanzees mostly in smelling, hearing, digesting proteins, brain size, and the capacity for language. Human brains are about three times bigger than those of chimpanzees. More importantly, the brain-to-body ratio is much higher in humans than in chimpanzees, and humans have a significantly more developed cerebral cortex, with a larger number of neurons. The mental abilities of humans are remarkable compared to those of other apes. The human capacity for speech is unique among primates. Humans are able to create new and complex ideas, and to develop technology, which is unprecedented among other organisms on Earth.[86]

It is estimated that the worldwide average height for an adult human male is about 172 cm (5 ft 7½ in),[citation needed] while the worldwide average height for adult human females is about 158 cm (5 ft 2 in).[citation needed] Shrinkage of stature may begin in middle age in some individuals, but tends to be typical in the extremely aged.[88] Throughout history, human populations have universally become taller, probably as a consequence of better nutrition, healthcare, and living conditions.[89] The average mass of an adult human is 54–64 kg (120–140 lb) for females and 76–83 kg (168–183 lb) for males.[90] Like many other conditions, body weight and body type are influenced by both genetic susceptibility and environment, and vary greatly among individuals (see obesity).[91][92]

Although humans appear hairless compared to other primates, with notable hair growth occurring chiefly on the top of the head, underarms and pubic area, the average human has more hair follicles on his or her body than the average chimpanzee. The main distinction is that human hairs are shorter, finer, and less heavily pigmented than the average chimpanzee’s, thus making them harder to see.[93] Humans have about 2 million sweat glands spread over their entire bodies, many more than chimpanzees, whose sweat glands are scarce and are mainly located on the palm of the hand and on the soles of the feet.[94]

The dental formula of humans is 2.1.2.3/2.1.2.3 (incisors, canines, premolars and molars in each quadrant of the upper and lower jaw). Humans have proportionately shorter palates and much smaller teeth than other primates. They are the only primates to have short, relatively flush canine teeth. Humans have characteristically crowded teeth, with gaps from lost teeth usually closing up quickly in young individuals. Humans are gradually losing their wisdom teeth, with some individuals having them congenitally absent.[95]
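Read per quadrant, that formula implies the familiar adult tooth count; the short Python sketch below just spells out the arithmetic, using the standard counts given in the formula above.

    # Dental-formula arithmetic: 2.1.2.3 counts incisors, canines, premolars and molars
    # in one quadrant (one side of one jaw); four quadrants make up the full adult dentition.
    quadrant = {"incisors": 2, "canines": 1, "premolars": 2, "molars": 3}
    total_adult_teeth = 4 * sum(quadrant.values())
    print(total_adult_teeth)  # -> 32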

Like all mammals, humans are a diploid eukaryotic species. Each somatic cell has two sets of 23 chromosomes, each set received from one parent; gametes have only one set of chromosomes, which is a mixture of the two parental sets. Among the 23 pairs of chromosomes there are 22 pairs of autosomes and one pair of sex chromosomes. Like other mammals, humans have an XY sex-determination system, so that females have the sex chromosomes XX and males have XY.[96]

One human genome was sequenced in full in 2003, and currently efforts are being made to achieve a sample of the genetic diversity of the species (see International HapMap Project). By present estimates, humans have approximately 22,000 genes.[97] The variation in human DNA is very small compared to other species, possibly suggesting a population bottleneck during the Late Pleistocene (around 100,000 years ago), in which the human population was reduced to a small number of breeding pairs.[98][99] Nucleotide diversity is based on single mutations called single nucleotide polymorphisms (SNPs). The nucleotide diversity between humans is about 0.1%, i.e. 1 difference per 1,000 base pairs. A difference of 1 in 1,000 nucleotides between two humans chosen at random amounts to about 3 million nucleotide differences, since the human genome has about 3 billion nucleotides. Most of these single nucleotide polymorphisms (SNPs) are neutral but some (about 3 to 5%) are functional and influence phenotypic differences between humans through alleles.[citation needed]
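The 3-million figure quoted above is simply the diversity rate multiplied by the genome length; a one-line check in Python, using the approximate round numbers from the text, makes the arithmetic explicit.

    # Back-of-the-envelope check: ~0.1% nucleotide diversity over a ~3-billion-base genome
    # implies roughly 3 million single-nucleotide differences between two random people.
    genome_length_bp = 3_000_000_000   # approximate haploid genome length in base pairs
    nucleotide_diversity = 0.001       # ~0.1%, i.e. about 1 difference per 1,000 bp
    expected_differences = genome_length_bp * nucleotide_diversity
    print(f"{expected_differences:,.0f}")  # -> 3,000,000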

By comparing the parts of the genome that are not under natural selection and which therefore accumulate mutations at a fairly steady rate, it is possible to reconstruct a genetic tree incorporating the entire human species since the last shared ancestor. Each time a certain mutation (SNP) appears in an individual and is passed on to his or her descendants, a haplogroup is formed comprising all of the descendants of that individual, who will also carry the mutation. By comparing mitochondrial DNA, which is inherited only from the mother, geneticists have concluded that the last female common ancestor whose genetic marker is found in all modern humans, the so-called mitochondrial Eve, must have lived around 90,000 to 200,000 years ago.[102][103][104]
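The haplogroup logic described above, in which everyone carrying a defining mutation descends from the single person in whom it arose, can be sketched as a small tree traversal. The pedigree and names in this Python sketch are purely hypothetical illustration data, not real lineages.

    # Toy haplogroup sketch: a defining SNP arises once in a founder and is then carried by
    # all of that founder's descendants. The pedigree below is hypothetical.
    children = {
        "root": ["A", "B"],
        "A": ["C", "D"],
        "B": ["E"],
        "C": [], "D": [], "E": [],
    }

    def haplogroup(founder):
        # Return the founder plus every descendant, i.e. all carriers of the founder's SNP.
        members, stack = set(), [founder]
        while stack:
            person = stack.pop()
            members.add(person)
            stack.extend(children.get(person, []))
        return members

    print(sorted(haplogroup("A")))  # -> ['A', 'C', 'D']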

Human accelerated regions, first described in August 2006,[105][106] are a set of 49 segments of the human genome that are conserved throughout vertebrate evolution but are strikingly different in humans. They are named according to their degree of difference between humans and their nearest animal relative (chimpanzees) (HAR1 showing the largest degree of human-chimpanzee differences). Found by scanning through genomic databases of multiple species, some of these highly mutated areas may contribute to human-specific traits.[citation needed]

The forces of natural selection have continued to operate on human populations, with evidence that certain regions of the genome display directional selection in the past 15,000 years.[107]

As with other mammals, human reproduction takes place as internal fertilization by sexual intercourse. During this process, the male inserts his erect penis into the female’s vagina and ejaculates semen, which contains sperm. The sperm travels through the vagina and cervix into the uterus or Fallopian tubes for fertilization of the ovum. Upon fertilization and implantation, gestation then occurs within the female’s uterus.

The zygote divides inside the female’s uterus to become an embryo, which over a period of 38 weeks (9 months) of gestation becomes a fetus. After this span of time, the fully grown fetus is birthed from the woman’s body and breathes independently as an infant for the first time. At this point, most modern cultures recognize the baby as a person entitled to the full protection of the law, though some jurisdictions extend various levels of personhood earlier to human fetuses while they remain in the uterus.

Compared with other species, human childbirth is dangerous. Painful labors lasting 24 hours or more are not uncommon and sometimes lead to the death of the mother, the child or both.[108] This is because of both the relatively large fetal head circumference and the mother’s relatively narrow pelvis.[109][110] The chances of a successful labor increased significantly during the 20th century in wealthier countries with the advent of new medical technologies. In contrast, pregnancy and natural childbirth remain hazardous ordeals in developing regions of the world, with maternal death rates approximately 100 times greater than in developed countries.[111]

In developed countries, infants are typically 3–4 kg (6–9 pounds) in weight and 50–60 cm (20–24 inches) in height at birth.[112][not in citation given] However, low birth weight is common in developing countries, and contributes to the high levels of infant mortality in these regions.[113] Helpless at birth, humans continue to grow for some years, typically reaching sexual maturity at 12 to 15 years of age. Females continue to develop physically until around the age of 18, whereas male development continues until around age 21. The human life span can be split into a number of stages: infancy, childhood, adolescence, young adulthood, adulthood and old age. The lengths of these stages, however, have varied across cultures and time periods. Compared to other primates, humans experience an unusually rapid growth spurt during adolescence, during which the body grows 25% in size. Chimpanzees, for example, grow only 14%, with no pronounced spurt.[114] The presence of the growth spurt is probably necessary to keep children physically small until they are psychologically mature. Humans are one of the few species in which females undergo menopause. It has been proposed that menopause increases a woman’s overall reproductive success by allowing her to invest more time and resources in her existing offspring, and in turn their children (the grandmother hypothesis), rather than by continuing to bear children into old age.[115][116]

For various reasons, including biological/genetic causes,[117] women live on average about four years longer than men; as of 2013, the global average life expectancy at birth of a girl is estimated at 70.2 years, compared to 66.1 for a boy.[118] There are significant geographical variations in human life expectancy, mostly correlated with economic development; for example, life expectancy at birth in Hong Kong is 84.8 years for girls and 78.9 for boys, while in Swaziland, primarily because of AIDS, it is 31.3 years for both sexes.[119] The developed world is generally aging, with the median age around 40 years. In the developing world the median age is between 15 and 20 years. While one in five Europeans is 60 years of age or older, only one in twenty Africans is 60 years of age or older.[120] The number of centenarians (humans of age 100 years or older) in the world was estimated by the United Nations at 210,000 in 2002.[121] At least one person, Jeanne Calment, is known to have reached the age of 122 years;[122] higher ages have been claimed but they are not well substantiated.

Humans are omnivorous, capable of consuming a wide variety of plant and animal material.[123][124] Varying with available food sources in regions of habitation, and also varying with cultural and religious norms, human groups have adopted a range of diets, from purely vegetarian to primarily carnivorous. In some cases, dietary restrictions in humans can lead to deficiency diseases; however, stable human groups have adapted to many dietary patterns through both genetic specialization and cultural conventions to use nutritionally balanced food sources.[125] The human diet is prominently reflected in human culture, and has led to the development of food science.

Until the development of agriculture approximately 10,000 years ago, Homo sapiens employed a hunter-gatherer method as their sole means of food collection. This involved combining stationary food sources (such as fruits, grains, tubers, and mushrooms, insect larvae and aquatic mollusks) with wild game, which must be hunted and killed in order to be consumed.[126] It has been proposed that humans have used fire to prepare and cook food since the time of Homo erectus.[127] Around ten thousand years ago, humans developed agriculture,[128] which substantially altered their diet. This change in diet may also have altered human biology: the spread of dairy farming provided a new and rich source of food, leading to the evolution of the ability to digest lactose in some adults.[129][130] Agriculture led to increased populations, the development of cities, and, because of increased population density, the wider spread of infectious diseases. The types of food consumed, and the way in which they are prepared, have varied widely by time, location, and culture.

In general, humans can survive for two to eight weeks without food, depending on stored body fat. Survival without water is usually limited to three or four days. About 36 million humans die every year from causes directly or indirectly related to starvation.[131] Childhood malnutrition is also common and contributes to the global burden of disease.[132] However global food distribution is not even, and obesity among some human populations has increased rapidly, leading to health complications and increased mortality in some developed, and a few developing countries. Worldwide over one billion people are obese,[133] while in the United States 35% of people are obese, leading to this being described as an “obesity epidemic.”[134] Obesity is caused by consuming more calories than are expended, so excessive weight gain is usually caused by an energy-dense diet.[133]

No two humans, not even monozygotic twins, are genetically identical. Genes and environment influence human biological variation in everything from visible characteristics to physiology, disease susceptibility, and mental abilities. The exact influence of genes and environment on certain traits is not well understood.[135][136]

Most current genetic and archaeological evidence supports a recent single origin of modern humans in East Africa,[137] with first migrations placed at 60,000 years ago. Compared to the great apes, human gene sequences, even among African populations, are remarkably homogeneous.[138] On average, genetic similarity between any two humans is 99.9%.[139][140] There is about 2–3 times more genetic diversity within the wild chimpanzee population than in the entire human gene pool.[141][142][143]

The human body’s ability to adapt to different environmental stresses is remarkable, allowing humans to acclimatize to a wide variety of temperatures, humidity, and altitudes. As a result, humans are a cosmopolitan species found in almost all regions of the world, including tropical rainforests, arid desert, extremely cold arctic regions, and heavily polluted cities. Most other species are confined to a few geographical areas by their limited adaptability.[144]

There is biological variation in the human species, with traits such as blood type, cranial features, eye color, hair color and type, height and build, and skin color varying across the globe. Human body types vary substantially. The typical height of an adult human is between 1.4 m and 1.9 m (4 ft 7 in and 6 ft 3 in), although this varies significantly depending, among other things, on sex and ethnic origin.[145][146] Body size is partly determined by genes and is also significantly influenced by environmental factors such as diet, exercise, and sleep patterns, especially as an influence in childhood. Adult height for each sex in a particular ethnic group approximately follows a normal distribution. Those aspects of genetic variation that give clues to human evolutionary history, or are relevant to medical research, have received particular attention. For example, the genes that allow adult humans to digest lactose are present in high frequencies in populations that have long histories of cattle domestication, suggesting that natural selection has favored that gene in populations that depend on cow milk. Some hereditary diseases such as sickle cell anemia are frequent in populations where malaria has been endemic throughout history; it is believed that the same gene gives increased resistance to malaria among those who are unaffected carriers of the gene. Similarly, populations that have for a long time inhabited specific climates, such as arctic or tropical regions or high altitudes, tend to have developed specific phenotypes that are beneficial for conserving energy in those environments: short stature and stocky build in cold regions, tall and lanky builds in hot regions, and high lung capacities at high altitudes. Similarly, skin color varies clinally, with darker skin around the equator, where the added protection from the sun’s ultraviolet radiation is thought to give an evolutionary advantage, and lighter skin tones closer to the poles.[147][148][149][150]

The hue of human skin and hair is determined by the presence of pigments called melanins. Human skin color can range from darkest brown to lightest peach, or even nearly white or colorless in cases of albinism.[143] Human hair ranges in color from white to red to blond to brown to black, which is most frequent.[151] Hair color depends on the amount of melanin (an effective sun-blocking pigment) in the skin and hair, with melanin concentrations in hair fading with increased age, leading to grey or even white hair. Most researchers believe that skin darkening is an adaptation that evolved as protection against ultraviolet solar radiation; it also helps preserve folate, which is destroyed by ultraviolet radiation. Light skin pigmentation, in turn, protects against depletion of vitamin D, which requires sunlight to synthesize.[152] Skin pigmentation of contemporary humans is clinally distributed across the planet, and in general correlates with the level of ultraviolet radiation in a particular geographic area. Human skin also has a capacity to darken (tan) in response to exposure to ultraviolet radiation.[153][154][155]

Within the human species, the greatest degree of genetic variation exists between males and females. While the nucleotide genetic variation of individuals of the same sex across global populations is no greater than 0.1%, the genetic difference between males and females is between 1% and 2%. Although different in nature[clarification needed], this approaches the genetic differentiation between men and male chimpanzees or women and female chimpanzees. The genetic difference between sexes contributes to anatomical, hormonal, neural, and physiological differences between men and women, although the exact degree and nature of social and environmental influences on sexes are not completely understood. Males on average are 15% heavier and 15 cm taller than females. There is a difference between body types, body organs and systems, hormonal levels, sensory systems, and muscle mass between sexes. On average, there is a difference of about 40–50% in upper body strength and 20–30% in lower body strength between men and women. Women generally have a higher body fat percentage than men. Women have lighter skin than men of the same population; this has been explained by a higher need for vitamin D (which is synthesized by sunlight) in females during pregnancy and lactation. As there are chromosomal differences between females and males, some X and Y chromosome related conditions and disorders only affect either men or women. Other conditional differences between males and females are not related to sex chromosomes. Even after allowing for body weight and volume, the male voice is usually an octave deeper than the female voice. Women have a longer life span in almost every population around the world.[157][158][159][160][161][162][163][164][165]

Males typically have larger tracheae and branching bronchi, with about 30% greater lung volume per unit body mass. They have larger hearts, 10% higher red blood cell count, and higher hemoglobin, hence greater oxygen-carrying capacity. They also have higher circulating clotting factors (vitamin K, prothrombin and platelets). These differences lead to faster healing of wounds and higher peripheral pain tolerance.[166] Females typically have more white blood cells (stored and circulating), more granulocytes and B and T lymphocytes. Additionally, they produce more antibodies at a faster rate than males. Hence they develop fewer infectious diseases and these continue for shorter periods.[166] Ethologists argue that females, interacting with other females and multiple offspring in social groups, have experienced such traits as a selective advantage.[167][168][169][170][171] According to Daly and Wilson, “The sexes differ more in human beings than in monogamous mammals, but much less than in extremely polygamous mammals.”[172] But given that sexual dimorphism in the closest relatives of humans is much greater than among humans, the human clade must be considered to be characterized by decreasing sexual dimorphism, probably due to less competitive mating patterns. One proposed explanation is that human sexuality has developed more in common with its close relative the bonobo, which exhibits similar sexual dimorphism, is polygynandrous and uses recreational sex to reinforce social bonds and reduce aggression.[173]

Humans of the same sex are 99.9% genetically identical. There is extremely little variation between human geographical populations, and most of the variation that does occur is at the personal level within local areas, and not between populations.[143][174][175] Of the 0.1% of human genetic differentiation, 85% exists within any randomly chosen local population, be they Italians, Koreans, or Kurds. Two randomly chosen Koreans may be genetically as different as a Korean and an Italian. Any ethnic group contains 85% of the human genetic diversity of the world. Genetic data shows that no matter how population groups are defined, two people from the same population group are about as different from each other as two people from any two different population groups.[143][176][177][178]
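To make the partitioning concrete: if the total expected difference between two random humans is about 0.1% of nucleotide sites and roughly 85% of that variation lies within any local population, then two members of the same population are expected to differ at almost as many sites as two people drawn from different continents. The Python sketch below just multiplies out the approximate percentages quoted in the text; it is an illustration of the arithmetic, not a population-genetics analysis.

    # Rough illustration of the within- vs between-population partition of human variation.
    total_pairwise_difference = 0.001    # ~0.1% of nucleotide sites differ between two random humans
    within_population_share = 0.85       # ~85% of that variation occurs within any local population
    within_population_difference = total_pairwise_difference * within_population_share
    print(f"{within_population_difference:.5f}")  # -> 0.00085, i.e. ~0.085% within a single population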

Current genetic research has demonstrated that humans on the African continent are the most genetically diverse.[179] There is more human genetic diversity in Africa than anywhere else on Earth. The genetic structure of Africans was traced to 14 ancestral population clusters. Human genetic diversity decreases in native populations with migratory distance from Africa and this is thought to be the result of bottlenecks during human migration.[180][181] Humans have lived in Africa for the longest time, which has allowed accumulation of a higher diversity of genetic mutations in these populations. Only part of Africa’s population migrated out of the continent, bringing just part of the original African genetic variety with them. African populations harbor genetic alleles that are not found in other places of the world. All the common alleles found in populations outside of Africa are found on the African continent.[143]

The geographical distribution of human variation is complex and constantly shifts through time, reflecting a complicated human evolutionary history. Most human biological variation is clinally distributed and blends gradually from one area to the next. Groups of people around the world have different frequencies of polymorphic genes. Furthermore, different traits are non-concordant and each has its own clinal distribution. Adaptability varies both from person to person and from population to population. The most efficient adaptive responses are found in geographical populations where the environmental stimuli are the strongest (e.g. Tibetans are highly adapted to high altitudes). The clinal geographic genetic variation is further complicated by the migration and mixing between human populations, which has been occurring since prehistoric times.[143][182][183][184][185][186]


Supercourse: Epidemiology, the Internet, and Global Health

Posted: at 7:34 am


Academic research council

Achievements public health

Acne therapeutic strategies

Acute coronary symptoms

Acute coronary syndromes

Adenoviridae and iridoviridae

Adherence hypertension treatment

Administration management medical organizations

Adolescent health risk behavior

Adolescents reproductive health

Adverse drug reactions

Advocacy strategy planning

African sleeping sickness

AIDS/HIV current scenario

Airborne contaminants

Air pollution Armenia

American heart association

Aminoglycoside–arginine conjugates

Analytic epidemiology

Anaplasmosis taxonomic

Anemia family practice

Anger regulation interventions

Antimicrobial resistance

Antimicrobial peptides

Antiretroviral agents

Assessing disease frequency

Assessment bioterrorism threat

Assessment nutritional

Assistive technology devices

Attack preparedness events

Avian influenza: zoonosis

Bacterial membrane vesicles

Bacterial vaginosis pregnancy

Bases of biostatistics

Behaviour medical sciences

Betaserk treatment stroke

Bias confounding chance

Bimaristans (hospitals) Islamic

Binomial distribution

Biochemical system medicine

Biological challenges

Biological epidemiologic studies


Biostatistics public health

Blood donors non-donors

Blood glucose normalization

BMJ triages manuscripts

Body fluid volume regulation

Bologna declaration education

Bone marrow transplantation

Breast self examination

Bronchial asthma treatment

Building vulnerability

Burden infectious diseases

Burnout in physicians

Cancer in Mexico

Cancer survivorship research

Canine monocytic ehrlichiosis

Capability development

Capture-recapture techniques

Cardiology practice Grenada

Cardiometabolic syndrome

Cardiopulmonary resuscitation

Cardio-respiratory illness

Cardiovascular disease

Cardiovascular disease black

Cardiovascular disease prevention

Cardiovascular diseases

Cardiovascular system

Carpal tunnel syndrome

Caseous lymphadenitis

Cause epidemiological approach

Central nervous system

Cervical cancer screening

Changing interpretations

Chemical weapon bioterrorism

Chemiosmotic paradigm

Chickenpox children pregnancy

Child health Kazakhstan

Childhood asthma bedding

Childhood asthma prevalence

Childhood diabetes mellitus

Childhood hearing impairment

Children september 11th attacks


Chinese herbal medicines

CHNS hypertension control

Cholera global health

Cholesterol education program

Chronic disease management

Chronic fatigue syndrome

Chronic liver disease

Chronic lung diseases

Chronic noncommunicable diseases

Chronic obstructive pulmonary disease

Chronic pulmonary heart


Eugenics – Wikipedia

Posted: October 23, 2016 at 4:23 am

Eugenics (from Greek eugenes “well-born,” from eu “good, well” and genos “race, stock, kin”)[2][3] is a set of beliefs and practices that aims at improving the genetic quality of the human population.[4][5] It is a social philosophy advocating the improvement of human genetic traits through the promotion of higher rates of sexual reproduction for people with desired traits (positive eugenics), or reduced rates of sexual reproduction and sterilization of people with less-desired or undesired traits (negative eugenics), or both.[6] Alternatively, gene selection rather than “people selection” has recently been made possible through advances in genome editing (e.g. CRISPR).[7] The exact definition of eugenics has been a matter of debate since the term was coined. The definition of it as a “social philosophy”, that is, a philosophy with implications for social order, is not universally accepted, and was taken from Frederick Osborn’s 1937 journal article “Development of a Eugenic Philosophy”.[6]

While eugenic principles have been practiced as far back in world history as Ancient Greece, the modern history of eugenics began in the early 20th century when a popular eugenics movement emerged in the United Kingdom[8] and spread to many countries, including the United States, Canada[9] and most European countries. In this period, eugenic ideas were espoused across the political spectrum. Consequently, many countries adopted eugenic policies meant to improve the genetic stock of their countries. Such programs often included both “positive” measures, such as encouraging individuals deemed particularly “fit” to reproduce, and “negative” measures such as marriage prohibitions and forced sterilization of people deemed unfit for reproduction. People deemed unfit to reproduce often included people with mental or physical disabilities, people who scored in the low ranges of different IQ tests, criminals and deviants, and members of disfavored minority groups. The eugenics movement became negatively associated with Nazi Germany and the Holocaust when many of the defendants at the Nuremberg trials attempted to justify their human rights abuses by claiming there was little difference between the Nazi eugenics programs and the US eugenics programs.[10] In the decades following World War II, with the institution of human rights, many countries gradually abandoned eugenics policies, although some Western countries, among them the United States, continued to carry out forced sterilizations.

Since the 1980s and 1990s when new assisted reproductive technology procedures became available, such as gestational surrogacy (available since 1985), preimplantation genetic diagnosis (available since 1989) and cytoplasmic transfer (first performed in 1996), fear about a possible future revival of eugenics and a widening of the gap between the rich and the poor has emerged.

A major criticism of eugenics policies is that, regardless of whether “negative” or “positive” policies are used, they are vulnerable to abuse because the criteria of selection are determined by whichever group is in political power. Furthermore, negative eugenics in particular is considered by many to be a violation of basic human rights, which include the right to reproduction. Another criticism is that eugenic policies eventually lead to a loss of genetic diversity, which can result in inbreeding depression due to low genetic variation.

The idea of positive eugenics to produce better human beings has existed at least since Plato suggested selective mating to produce a guardian class.[12] The idea of negative eugenics to decrease the birth of inferior human beings has existed at least since William Goodell (1829-1894) advocated the castration and spaying of the insane.[13][14]

However, the term “eugenics” to describe a modern project of improving the human population through breeding was originally developed by Francis Galton. Galton had read his half-cousin Charles Darwin’s theory of evolution, which sought to explain the development of plant and animal species, and desired to apply it to humans. Based on his biographical studies, Galton believed that desirable human qualities were hereditary traits, though Darwin strongly disagreed with this elaboration of his theory.[15] In 1883, one year after Darwin’s death, Galton gave his research a name: eugenics.[16] Throughout its recent history, eugenics has remained controversial.

Eugenics became an academic discipline at many colleges and universities, and received funding from many sources.[18] Organisations formed to win public support and sway opinion towards responsible eugenic values in parenthood, including the British Eugenics Education Society of 1907, and the American Eugenics Society of 1921. Both sought support from leading clergymen, and modified their message to meet religious ideals.[19] In 1909 the Anglican clergymen William Inge and James Peile both wrote for the British Eugenics Education Society. Inge was an invited speaker at the 1921 International Eugenics Conference, which was also endorsed by the Roman Catholic Archbishop of New York Patrick Joseph Hayes.[19]

Three International Eugenics Conferences presented a global venue for eugenists with meetings in 1912 in London, and in 1921 and 1932 in New York City. Eugenic policies were first implemented in the early 1900s in the United States.[20] It also took root in France, Germany, and Great Britain.[21] Later, in the 1920s and 30s, the eugenic policy of sterilizing certain mental patients was implemented in other countries, including Belgium,[22] Brazil,[23] Canada,[24] Japan and Sweden.

In addition to being practiced in a number of countries, eugenics was internationally organized through the International Federation of Eugenics Organizations. Its scientific aspects were carried on through research bodies such as the Kaiser Wilhelm Institute of Anthropology, Human Heredity, and Eugenics, the Cold Spring Harbor Carnegie Institution for Experimental Evolution, and the Eugenics Record Office. Politically, the movement advocated measures such as sterilization laws. In its moral dimension, eugenics rejected the doctrine that all human beings are born equal, and redefined moral worth purely in terms of genetic fitness. Its racist elements included pursuit of a pure “Nordic race” or “Aryan” genetic pool and the eventual elimination of “less fit” races.

Early critics of the philosophy of eugenics included the American sociologist Lester Frank Ward,[33] the English writer G. K. Chesterton, the German-American anthropologist Franz Boas,[34] and Scottish tuberculosis pioneer and author Halliday Sutherland. Ward’s 1913 article “Eugenics, Euthenics, and Eudemics”, Chesterton’s 1917 book Eugenics and Other Evils, and Boas’ 1916 article “Eugenics” (published in The Scientific Monthly) were all harshly critical of the rapidly growing movement. Sutherland identified eugenists as a major obstacle to the eradication and cure of tuberculosis in his 1917 address “Consumption: Its Cause and Cure”,[35] and criticism of eugenists and Neo-Malthusians in his 1921 book Birth Control led to a writ for libel from the eugenist Marie Stopes. Several biologists were also antagonistic to the eugenics movement, including Lancelot Hogben.[36] Other biologists such as J. B. S. Haldane and R. A. Fisher expressed skepticism that sterilization of “defectives” would lead to the disappearance of undesirable genetic traits.[37]

Among institutions, the Catholic Church was an opponent of state-enforced sterilizations.[38] Attempts by the Eugenics Education Society to persuade the British government to legalise voluntary sterilisation were opposed by Catholics and by the Labour Party.[page needed] The American Eugenics Society initially gained some Catholic supporters, but Catholic support declined following the 1930 papal encyclical Casti connubii.[19] In this, Pope Pius XI explicitly condemned sterilization laws: “Public magistrates have no direct power over the bodies of their subjects; therefore, where no crime has taken place and there is no cause present for grave punishment, they can never directly harm, or tamper with the integrity of the body, either for the reasons of eugenics or for any other reason.”[39]

As a social movement, eugenics reached its greatest popularity in the early decades of the 20th century, when it was practiced around the world and promoted by governments, institutions, and influential individuals. Many countries enacted[40] various eugenics policies, including genetic screening, birth control, promoting differential birth rates, marriage restrictions, segregation (both racial segregation and the sequestering of the mentally ill), compulsory sterilization, forced abortions or forced pregnancies, and, ultimately, genocide.

The scientific reputation of eugenics started to decline in the 1930s, a time when Ernst Rüdin used eugenics as a justification for the racial policies of Nazi Germany. Adolf Hitler had praised and incorporated eugenic ideas in Mein Kampf in 1925 and, once he took power, emulated the eugenic legislation for the sterilization of “defectives” that had been pioneered in the United States. Some common early 20th-century eugenics methods involved identifying and classifying individuals and their families, including the poor, mentally ill, blind, deaf, developmentally disabled, promiscuous women, homosexuals, and racial groups (such as the Roma and Jews in Nazi Germany) as “degenerate” or “unfit”, leading to their segregation or institutionalization, sterilization, euthanasia, and even their mass murder. The Nazi practice of euthanasia was carried out on hospital patients in Aktion T4 centers such as Hartheim Castle.

By the end of World War II, many discriminatory eugenics laws had been abandoned, having become associated with Nazi Germany.[43] H. G. Wells, who had called for “the sterilization of failures” in 1904,[44] stated in his 1940 book The Rights of Man: Or What Are We Fighting For? that among the human rights he believed should be available to all people was “a prohibition on mutilation, sterilization, torture, and any bodily punishment”.[45] After World War II, the practice of “imposing measures intended to prevent births within [a population] group” fell within the definition of the new international crime of genocide, set out in the Convention on the Prevention and Punishment of the Crime of Genocide.[46] The Charter of Fundamental Rights of the European Union also proclaims “the prohibition of eugenic practices, in particular those aiming at selection of persons”.[47] In spite of the decline in discriminatory eugenics laws, some government-mandated sterilization continued into the 21st century. During the ten years President Alberto Fujimori led Peru (1990–2000), some 2,000 persons were allegedly involuntarily sterilized.[48] China maintained its coercive one-child policy until 2015, as well as a suite of other eugenics-based legislation to reduce population size and manage the fertility rates of different populations.[49][50][51] In 2007 the United Nations reported coercive sterilisations and hysterectomies in Uzbekistan.[52] During the years 2005–06 to 2012–13, nearly one-third of the 144 California prison inmates who were sterilized did not give lawful consent to the operation.[53]

Developments in genetic, genomic, and reproductive technologies at the end of the 20th century have raised numerous questions regarding the ethical status of eugenics, effectively creating a resurgence of interest in the subject. Some, such as UC Berkeley sociologist Troy Duster, claim that modern genetics is a back door to eugenics.[54] This view is shared by White House Assistant Director for Forensic Sciences Tania Simoncelli, who stated in a 2003 publication by the Population and Development Program at Hampshire College that advances in pre-implantation genetic diagnosis (PGD) are moving society toward a “new era of eugenics”, and that, unlike Nazi eugenics, modern eugenics is consumer-driven and market-based, “where children are increasingly regarded as made-to-order consumer products”.[55] In a 2006 newspaper article, Richard Dawkins said that discussion of eugenics was inhibited by the shadow of Nazi misuse, to the extent that some scientists would not admit that breeding humans for certain abilities is at all possible. He believes that it is not physically different from breeding domestic animals for traits such as speed or herding skill. Dawkins felt that enough time had elapsed to at least ask what the ethical differences were between breeding for ability and training athletes or forcing children to take music lessons, though he could think of persuasive reasons to draw the distinction.[56]

In October 2015, the United Nations’ International Bioethics Committee wrote that the ethical problems of human genetic engineering should not be confused with the ethical problems of the 20th century eugenics movements; however, it is still problematic because it challenges the idea of human equality and opens up new forms of discrimination and stigmatization for those who do not want or cannot afford the enhancements.[57]

Transhumanism is often associated with eugenics, although most transhumanists holding similar views nonetheless distance themselves from the term “eugenics” (preferring “germinal choice” or “reprogenetics”)[58] to avoid having their position confused with the discredited theories and practices of early-20th-century eugenic movements.

The term eugenics and its modern field of study were first formulated by Francis Galton in 1883,[59] drawing on the recent work of his half-cousin Charles Darwin.[60][61] Galton published his observations and conclusions in his book Inquiries into Human Faculty and Its Development.

The origins of the concept began with certain interpretations of Mendelian inheritance and the theories of August Weismann. The word eugenics is derived from the Greek eu (“good” or “well”) and the suffix -genēs (“born”); it was coined by Galton in 1883 to replace the word “stirpiculture”, which he had used previously but which had come to be mocked due to its perceived sexual overtones.[63] Galton defined eugenics as “the study of all agencies under human control which can improve or impair the racial quality of future generations”.[64] Galton did not understand the mechanism of inheritance.[65]

Historically, the term has referred to everything from prenatal care for mothers to forced sterilization and euthanasia.[citation needed] To population geneticists, the term has included the avoidance of inbreeding without altering allele frequencies; for example, J. B. S. Haldane wrote that “the motor bus, by breaking up inbred village communities, was a powerful eugenic agent.”[66] Debate as to what exactly counts as eugenics has continued to the present day.[67]

Edwin Black, journalist and author of War Against the Weak, claims eugenics is often deemed a pseudoscience because what is defined as a genetic improvement of a desired trait is often regarded as a cultural choice rather than a matter that can be determined through objective scientific inquiry.[68] The most disputed aspect of eugenics has been the definition of “improvement” of the human gene pool, such as what counts as a beneficial characteristic and what counts as a defect. This aspect of eugenics has historically been tainted with scientific racism.

Early eugenists were mostly concerned with perceived intelligence factors that often correlated strongly with social class. Some of these early eugenists included Karl Pearson and Walter Weldon, who worked on this at University College London.[15]

Eugenics also had a place in medicine. In his lecture “Darwinism, Medical Progress and Eugenics”, Karl Pearson said that everything concerning eugenics fell into the field of medicine; he essentially treated the two as equivalent. He was supported in part by the fact that Francis Galton, the father of eugenics, also had medical training.[69]

Eugenic policies have been conceptually divided into two categories. Positive eugenics aims to encourage reproduction among the genetically advantaged, for example, the intelligent, the healthy, and the successful. Possible approaches include financial and political stimuli, targeted demographic analyses, in vitro fertilization, egg transplants, and cloning.[70] The movie Gattaca provides a fictional example of positive eugenics done voluntarily. Negative eugenics aims to eliminate, through sterilization or segregation, those deemed physically, mentally, or morally “undesirable”; this includes abortions, sterilization, and other methods of family planning.[70] Both positive and negative eugenics can be coercive; abortion for “fit” women, for example, was illegal in Nazi Germany.[71]

Jon Entine claims that eugenics simply means “good genes” and that using it as a synonym for genocide is an “all-too-common distortion of the social history of genetics policy in the United States.” According to Entine, eugenics developed out of the Progressive Era and not “Hitler’s twisted Final Solution”.[72]

According to Richard Lynn, eugenics may be divided into two main categories based on the ways in which the methods of eugenics can be applied.[73]

The first major challenge to conventional eugenics based on genetic inheritance was made in 1915 by Thomas Hunt Morgan, who demonstrated that genetic mutations could arise independently of inheritance after observing a white-eyed fruit fly (Drosophila melanogaster) hatch from a red-eyed line. Morgan claimed that this showed that major genetic changes occurred outside of inheritance and that the concept of eugenics based on genetic inheritance was not completely scientifically accurate. Additionally, Morgan criticized the view that subjective traits, such as intelligence and criminality, were caused by heredity, because he believed that the definitions of these traits varied and that accurate work in genetics could only be done when the traits being studied were accurately defined.[109] In spite of Morgan’s public rejection of eugenics, much of his genetic research was absorbed by eugenics.[110][111]

The heterozygote test is used for the early detection of recessive hereditary diseases, allowing couples to determine whether they are at risk of passing genetic defects to a future child.[112] The goal of the test is to estimate the likelihood of passing the hereditary disease to future descendants.[112]
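As a rough sketch of the arithmetic behind such an estimate, the short Python example below computes the risk to a future child from the parents' carrier status. It is a minimal illustration, not drawn from the cited sources: it assumes a simple autosomal recessive condition, Mendelian inheritance, unaffected parents, and no new mutations, and the function name is purely hypothetical.

# Illustrative sketch: risk of an affected child for an autosomal recessive
# condition, given each parent's carrier status. Assumes Mendelian inheritance,
# unaffected parents, and ignores new mutations.
def affected_child_risk(parent1_is_carrier: bool, parent2_is_carrier: bool) -> float:
    # A carrier transmits the recessive allele with probability 1/2;
    # a non-carrier (two normal alleles) transmits it with probability 0.
    p1 = 0.5 if parent1_is_carrier else 0.0
    p2 = 0.5 if parent2_is_carrier else 0.0
    # The child is affected only if it receives the recessive allele from both parents.
    return p1 * p2

print(affected_child_risk(True, True))   # 0.25 - both parents are carriers
print(affected_child_risk(True, False))  # 0.0  - only one parent is a carrier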

Recessive traits can be severely reduced, but never eliminated, unless the complete genetic makeup of all members of the pool were known, as mentioned above. As only very few undesirable traits, such as Huntington’s disease, are dominant, it could be argued[by whom?] from certain perspectives that the practicality of “eliminating” traits is quite low.[citation needed]
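The slow pace of that reduction follows from textbook population genetics: even complete selection against affected (homozygous recessive) individuals leaves the allele sheltered in unaffected carriers, so under random mating its frequency q falls only to q/(1+q) each generation. The short Python simulation below is a hedged illustration of that standard result with an arbitrary starting frequency; it is not taken from the article's sources.

# Illustrative sketch: decline of a recessive allele when no affected (aa)
# individual reproduces, assuming random mating. Standard result: q' = q / (1 + q).
q = 0.02  # arbitrary illustrative starting allele frequency (2%)
for generation in range(101):
    if generation % 20 == 0:
        carriers = 2 * q * (1 - q)  # heterozygote (carrier) frequency
        affected = q * q            # frequency of affected births
        print(f"gen {generation:3d}: q = {q:.4f}, carriers = {carriers:.4f}, affected = {affected:.6f}")
    q = q / (1.0 + q)  # allele frequency in the next generation
# Even after 100 generations q is only about 0.0067: the allele persists in carriers.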

There are examples of eugenic acts that managed to lower the prevalence of recessive diseases, although without influencing the prevalence of heterozygote carriers of those diseases. The elevated prevalence of certain genetically transmitted diseases among the Ashkenazi Jewish population (Tay–Sachs, cystic fibrosis, Canavan’s disease, and Gaucher’s disease) has been decreased in current populations by the application of genetic screening.[113]

Pleiotropy occurs when one gene influences multiple, seemingly unrelated phenotypic traits; an example is phenylketonuria, a human disease that affects multiple systems but is caused by a single gene defect.[114] Andrzej Pękalski, of the University of Wrocław, argues that eugenics can cause harmful loss of genetic diversity if a eugenics program selects for a pleiotropic gene that is also associated with a positive trait. Pękalski uses the example of a coercive government eugenics program that prohibits people with myopia from breeding but has the unintended consequence of also selecting against high intelligence, since the two traits tend to go together.[115]

Eugenic policies could also lead to loss of genetic diversity, in which case a culturally accepted “improvement” of the gene pool could very likely result in extinction, as evidenced in numerous instances in isolated island populations (e.g., the dodo, Raphus cucullatus, of Mauritius), due to increased vulnerability to disease, reduced ability to adapt to environmental change, and other factors both known and unknown. A long-term, species-wide eugenics plan might lead to a similar scenario, because the elimination of traits deemed undesirable would, by definition, reduce genetic diversity.[116]

Edward M. Miller claims that, in any one generation, any realistic program should make only minor changes in a fraction of the gene pool, giving plenty of time to reverse direction if unintended consequences emerge, reducing the likelihood of the elimination of desirable genes.[117] Miller also argues that any appreciable reduction in diversity is so far in the future that little concern is needed for now.[117]

While the science of genetics has increasingly provided means by which certain characteristics and conditions can be identified and understood, given the complexity of human genetics, culture, and psychology, there is at this point no agreed objective means of determining which traits might be ultimately desirable or undesirable. Some diseases, such as sickle-cell disease and cystic fibrosis, respectively confer resistance to malaria and to cholera when a single copy of the recessive allele is present in an individual’s genotype. Reducing the incidence of sickle-cell disease genes in Africa, where malaria is a common and deadly disease, could indeed have extremely negative net consequences.

However, some genetic diseases, such as haemochromatosis, can increase susceptibility to illness and cause physical deformities and other dysfunctions, which provides some incentive for people to reconsider some elements of eugenics.

Autistic people have advocated a shift in perception of autism spectrum disorders as complex syndromes rather than diseases that must be cured. Proponents of this view reject the notion that there is an “ideal” brain configuration and that any deviation from the norm is pathological; they promote tolerance for what they call neurodiversity.[118] Baron-Cohen argues that the genes for Asperger’s combination of abilities have operated throughout recent human evolution and have made remarkable contributions to human history.[119] The possible reduction of autism rates through selection against the genetic predisposition to autism is a significant political issue in the autism rights movement, which claims that autism is a part of neurodiversity.

Many culturally Deaf people oppose attempts to cure deafness, believing instead that deafness should be considered a defining cultural characteristic, not a disease.[120][121][122] Some people have begun advocating the idea that deafness brings certain advantages, often termed “Deaf Gain”.[123][124]

The societal and political consequences of eugenics call for a place in the discussion of the ethics behind the eugenics movement.[125] Many of the ethical concerns regarding eugenics arise from its controversial past, prompting a discussion of what place, if any, it should have in the future. Advances in science have changed eugenics: in the past, it had more to do with sterilization and enforced reproduction laws.[126] Now, in the age of a progressively mapped genome, embryos can be tested for susceptibility to disease, gender, and genetic defects, and alternative methods of reproduction such as in vitro fertilization are becoming more common.[127] Eugenics is therefore no longer ex post facto regulation of the living but instead preemptive action on the unborn.[128]

With this change, however, there are ethical concerns which lack adequate attention, and which must be addressed before eugenic policies can be properly implemented in the future. Sterilized individuals, for example, could volunteer for the procedure, albeit under incentive or duress, or at least voice their opinion. The unborn fetus on which these new eugenic procedures are performed cannot speak out, as the fetus lacks the voice to consent or to express his or her opinion.[129] Philosophers disagree about the proper framework for reasoning about such actions, which change the very identity and existence of future persons.[130]

A common criticism of eugenics is that “it inevitably leads to measures that are unethical”.[131] Some fear future “eugenics wars” as the worst-case scenario: the return of coercive state-sponsored genetic discrimination and human rights violations such as compulsory sterilization of persons with genetic defects, the killing of the institutionalized and, specifically, segregation and genocide of races perceived as inferior.[132] Health law professor George Annas and technology law professor Lori Andrews are prominent advocates of the position that the use of these technologies could lead to such human-posthuman caste warfare.[133][134]

In his 2003 book Enough: Staying Human in an Engineered Age, environmental ethicist Bill McKibben argued at length against germinal choice technology and other advanced biotechnological strategies for human enhancement. He claims that it would be morally wrong for humans to tamper with fundamental aspects of themselves (or their children) in an attempt to overcome universal human limitations, such as vulnerability to aging, maximum life span and biological constraints on physical and cognitive ability. Attempts to “improve” themselves through such manipulation would remove limitations that provide a necessary context for the experience of meaningful human choice. He claims that human lives would no longer seem meaningful in a world where such limitations could be overcome technologically. Even the goal of using germinal choice technology for clearly therapeutic purposes should be relinquished, since it would inevitably produce temptations to tamper with such things as cognitive capacities. He argues that it is possible for societies to benefit from renouncing particular technologies, using as examples Ming China, Tokugawa Japan and the contemporary Amish.[135]

Some, such as Nathaniel C. Comfort from Johns Hopkins University, claim that the change from state-led reproductive-genetic decision-making to individual choice has moderated the worst abuses of eugenics by transferring the decision-making from the state to the patient and their family.[136] Comfort suggests that “the eugenic impulse drives us to eliminate disease, live longer and healthier, with greater intelligence, and a better adjustment to the conditions of society; and the health benefits, the intellectual thrill and the profits of genetic bio-medicine are too great for us to do otherwise.”[137] Others, such as bioethicist Stephen Wilkinson of Keele University and Honorary Research Fellow Eve Garrard at the University of Manchester, claim that some aspects of modern genetics can be classified as eugenics, but that this classification does not inherently make modern genetics immoral. In a co-authored publication by Keele University, they stated that “[e]ugenics doesn’t seem always to be immoral, and so the fact that PGD, and other forms of selective reproduction, might sometimes technically be eugenic, isn’t sufficient to show that they’re wrong.”[138]

In their 2000 book From Chance to Choice: Genetics and Justice, bioethicists Allen Buchanan, Dan Brock, Norman Daniels and Daniel Wikler argued that liberal societies have an obligation to encourage as wide an adoption of eugenic enhancement technologies as possible (so long as such policies do not infringe on individuals’ reproductive rights or exert undue pressures on prospective parents to use these technologies) in order to maximize public health and minimize the inequalities that may result from both natural genetic endowments and unequal access to genetic enhancements.[139]

The original position, a hypothetical situation developed by the American philosopher John Rawls, has been used as an argument for negative eugenics.[140][141]

See the rest here:

Eugenics – Wikipedia

Posted in Eugenics | Comments Off on Eugenics – Wikipedia

NEW TOWN UTOPIA by Christopher Ian Smith Kickstarter

Posted: October 20, 2016 at 11:38 pm

New Town Utopia is a documentary feature film that explores the original utopian dreams of a post-war British New Town – Basildon, Essex – and compares them to the modern concrete reality. We're close to finishing production and, after four years of serious hard work, have hundreds of hours of footage ready to be crafted into a poetic, challenging film.

It is a meditation on British social history that asks the question: do people make the place or does a place make the people?

An audiovisual journey through art, architecture and memory, the story is brought to life through the thoughts, performances and work of artists from the town – an inspirational group of characters persevering in the face of austerity, adversity and personal battles.

These are individuals driven by an unwavering desire to help their community through poetry, music, sculpture and puppets.

After WW2, New Towns were designed as social ‘Utopias’ in the model of Thomas More’s vision – to create a new type of citizen, a healthy, self-respecting, dignified person with a sense of beauty, culture and civic pride.

Basildon, the largest of the first wave of New Towns, was invested with these hopes and aspirations. However, 60 years on, art and culture are almost a distant memory. The town plans, public art and architecture, once thought so progressive, are vilified in the face of a struggling local economy and fragmented communities.

New Town Utopia questions how this dream has faded over time. In doing so, it explores the influence of environment and architecture on our psyche, and the impact of austerity on our towns and communities. In an environment where support for art and culture is at an all-time low, this film contemplates and celebrates the unceasing power of creative spirit.

The team behind New Town Utopia includes Essex-raised Producer-Director Christopher Ian Smith and Executive Producer Margaret Matheson (Sleep Furiously, Scum, Sid and Nancy).

So far, New Town Utopia has only been made possible through the kindness, time and talent of a dedicated crew with belief in the project.

Now we need to raise the money to finish the film – this requires a wide range of specialist skills, technologies, facilities and time to make it happen, including:

These campaign funds will also contribute to the distribution and marketing of the film. We're already in a great place, with significant followers on Facebook, Twitter (2.5k) and Instagram (6.4k) without anyone having seen the film. There is a large community of people out there with a strong interest in documentary, social history, art and architecture.

What's more, any money we raise above the target will contribute to the distribution of the film and outreach activities, including screenings for communities around the UK – using the film to build awareness of issues that face our towns and their people.

Your gratefully received contributions will be exchanged for a range of unique rewards relating to the film. This includes perks such as:


2016 marks the 500th anniversary of Thomas More's Utopia and the 70th anniversary of the New Towns Act. New Town Utopia hopes to shine a light on some of the current challenges for Basildon, New Towns and other towns in the UK facing economic, social and cultural changes. We hope the film will have a positive impact on Basildon, and the film's distribution strategy will incorporate community action initiatives and projects involving Basildon and other New Towns.

The New Towns movement did not end in the 50s… New Towns continue to be built across the world with similar hopes, dreams and challenges, and are often cited as the cure for housing crises around the globe. If we did it again, how could we make it work?

Whether you make a contribution or spread the word, every little counts! You can also:


There are several key risks and challenges that the film faces ahead. Producing a feature film is a considerable undertaking with many moving parts. It will take a lot of time, thought, dedication and talent to see it through.

1. New Town Utopia has a small crew of dedicated people – it's a labour of love – so a key risk is that those involved have to move on to other projects in order to pay their bills. The funds raised by this campaign will ensure that this dedicated team can focus on getting this film over the line.

2. Time is also a challenge. We want to finish the film in 2016, as after four years of production it's time to move on to the next stage, which is finding an audience and making an impact. Most importantly, the content of the film is incredibly timely as we look at how we deal with globalisation, Brexit and the housing crisis. New Town Utopia is a window into a real community and how a 'top-down' approach to planning, management and politics has an impact on this place and its people.

The funds from this campaign will ensure a focus on delivering the film for festivals and distribution in 2017.

3. Even when the film is completed, it's tough to get it seen. As the way we watch and consume films is changing, there are many ways an audience can find a film. It's crucial that New Town Utopia gets exposure through key film festivals. This will then hopefully lead to screenings on TV, Netflix or similar. We will also look to build buzz and awareness of the film through community screenings and online activity.

Read the original here:

NEW TOWN UTOPIA by Christopher Ian Smith Kickstarter

Posted in New Utopia | Comments Off on NEW TOWN UTOPIA by Christopher Ian Smith Kickstarter

Annual Reviews – Home

Posted: at 11:32 pm

This site uses cookies to improve performance. If your browser does not accept cookies, you cannot view this site.

There are many reasons why a cookie could not be set correctly. Below are the most common reasons:

This site uses cookies to improve performance by remembering that you are logged in when you go from page to page. To provide access without cookies would require the site to create a new session for every page you visit, which slows the system down to an unacceptable level.

This site stores nothing other than an automatically generated session ID in the cookie; no other information is captured.

In general, only the information that you provide, or the choices you make while visiting a web site, can be stored in a cookie. For example, the site cannot determine your email name unless you choose to type it. Allowing a website to create a cookie does not give that or any other site access to the rest of your computer, and only the site that created the cookie can read it.

See the rest here:
Annual Reviews – Home

Posted in Human Genetics | Comments Off on Annual Reviews – Home

How to increase serotonin in the human brain without drugs

Posted: October 17, 2016 at 1:20 am

For the last 4 decades, the question of how to manipulate the serotonergic system with drugs has been an important area of research in biological psychiatry, and this research has led to advances in the treatment of depression. Research on the association between various polymorphisms and depression supports the idea that serotonin plays a role, not only in the treatment of depression but also in susceptibility to depression and suicide. The research focus here has been on polymorphisms of the serotonin transporter, but other serotonin-related genes may also be involved.15 In the future, genetic research will make it possible to predict with increasing accuracy who is susceptible to depression. Much less attention has been given to how this information will be used for the benefit of individuals with a serotonin-related susceptibility to depression, and little evidence exists concerning strategies to prevent depression in those with such a susceptibility. Various studies have looked at early intervention in those with prodromal symptoms as well as at population strategies for preventing depression.6–11 Obviously, prevention is preferable to early intervention; moreover, although population strategies are important, they are ideally supplemented with preventive interventions that can be used over long periods of time in targeted individuals who do not yet exhibit even nonclinical symptoms. Clearly, pharmacologic approaches are not appropriate, and given the evidence for serotonin’s role in the etiology and treatment of depression, nonpharmacologic methods of increasing serotonin are potential candidates to test for their ability to prevent depression.

Another reason for pursuing nonpharmacologic methods of increasing serotonin arises from the increasing recognition that happiness and well-being are important, both as factors protecting against mental and physical disorders and in their own right.12–14 Conversely, negative moods are associated with negative outcomes. For example, the negative mood hostility is a risk factor for many disorders. For the sake of brevity, hostility is discussed here mainly in relation to one of the biggest sources of mortality, coronary heart disease (CHD). A meta-analysis of 45 studies demonstrated that hostility is a risk factor for CHD and for all-cause mortality.15 More recent research confirms this. Hostility is associated not only with the development of CHD but also with poorer survival in coronary artery disease (CAD) patients.16 Hostility may lead to decreased social support and social isolation,17 and low perceived social support is associated with greater mortality in those with CAD.18 Effects are not just limited to CHD. For example, the opposite of hostility, agreeableness, was a significant protective factor against mortality in a sample of older, frail participants.19

The constitution of the WHO states that “Health is a state of complete physical, mental and social well-being and not merely the absence of disease or infirmity.”20 This may sound exaggerated, but positive mood within the normal range is an important predictor of health and longevity. In a classic study, those in the lowest quartile for positive emotions, rated from autobiographies written at a mean age of 22 years, died on average 10 years earlier than those in the highest quartile.21 Even taking into account possible confounders, other studies found the same solid link between feeling good and living longer.12 In a series of recent studies, negative emotions were associated with increased disability due to mental and physical disorders,22 increased incidence of depression,23 increased suicide24 and increased mortality25 up to 2 decades later. Positive emotions protected against these outcomes. A recent review including meta-analyses assessed cross-sectional, longitudinal and experimental studies and concluded that happiness is associated with and precedes numerous successful outcomes.26 Mood may influence social behaviour, and social support is one of the most studied psychosocial factors in relation to health and disease.27 Low social support is associated with higher levels of stress, depression, dysthymia and posttraumatic stress disorder and with increased morbidity and mortality from a host of medical illnesses.27

Research confirms what might be intuitively expected, that positive emotions and agreeableness foster congenial relationships with others.28,29 This in turn will create the conditions for an increase in social support.

Several studies found an association between measures related to serotonin and mood in the normal range. Lower platelet serotonin-2 receptor function was associated with lower mood in one study,30 whereas better mood was associated with higher blood serotonin levels in another.31 Two studies found that greater prolactin release in response to fenfluramine was associated with more positive mood.32,33 The idea that these associations indicate a causal association between serotonin function and mood within the normal range is consistent with a study demonstrating that, in healthy people with high trait irritability, tryptophan, relative to placebo, decreased quarrelsome behaviours, increased agreeable behaviours and improved mood.34 Serotonin may be associated with physical health as well as mood. In otherwise healthy individuals, a low prolactin response to the serotonin-releasing drug fenfluramine was associated with the metabolic syndrome, a risk factor for heart disease,35 suggesting that low serotonin may predispose healthy individuals to suboptimal physical as well as mental functioning.

Nonpharmacologic methods of raising brain serotonin may not only improve mood and social functioning of healthy people (a worthwhile objective even without additional considerations) but would also make it possible to test the idea that increases in brain serotonin may help protect against the onset of various mental and physical disorders. Four strategies that are worth further investigation are discussed below.

The article by Perreau-Linck and colleagues36 (page 430 of this issue) provides an initial lead about one possible strategy for raising brain serotonin. Using positron emission tomography, they obtained a measure of serotonin synthesis in the brains of healthy participants who underwent positive, negative and neutral mood inductions. Reported levels of happiness were positively correlated and reported levels of sadness were negatively correlated with serotonin synthesis in the right anterior cingulate cortex. The idea that alterations in thought, either self-induced or due to psychotherapy, can alter brain metabolism is not new. Numerous studies have demonstrated changes in blood flow in such circumstances. However, reports related to specific transmitters are much less common. In one recent study, meditation was reported to increase release of dopamine.37 The study by Perreau-Linck and colleagues36 is the first to report that self-induced changes in mood can influence serotonin synthesis. This raises the possibility that the interaction between serotonin synthesis and mood may be 2-way, with serotonin influencing mood and mood influencing serotonin. Obviously, more work is needed to answer questions in this area. For example, is the improvement in mood associated with psychotherapy accompanied by increases in serotonin synthesis? If more precise information is obtained about the mental states that increase serotonin synthesis, will this help to enhance therapy techniques?

Exposure to bright light is a second possible approach to increasing serotonin without drugs. Bright light is, of course, a standard treatment for seasonal depression, but a few studies also suggest that it is an effective treatment for nonseasonal depression38 and also reduces depressed mood in women with premenstrual dysphoric disorder39 and in pregnant women suffering from depression.40 The evidence relating these effects to serotonin is indirect. In human postmortem brain, serotonin levels are higher in those who died in summer than in those who died in winter.41 A similar conclusion came from a study on healthy volunteers, in which serotonin synthesis was assessed by measurements of the serotonin metabolite 5-hydroxyindoleacetic acid (5-HIAA) in the venous outflow from the brain.42 There was also a positive correlation between serotonin synthesis and the hours of sunlight on the day the measurements were made, independent of season. In rats, serotonin is highest during the light part of the light–dark cycle, and this state is driven by the photic cycle rather than the circadian rhythm.43,44 The existence of a retino-raphe tract may help explain why, in experimental animals, neuronal firing rates, c-fos expression and the serotonin content in the raphe nuclei are responsive to retinal light exposure.44–48 In humans, there is certainly an interaction between bright light and the serotonin system. The mood-lowering effect of acute tryptophan depletion in healthy women is completely blocked by carrying out the study in bright light (3000 lux) instead of dim light.49

Relatively few generations ago, most of the world population was involved in agriculture and was outdoors for much of the day. This would have resulted in high levels of bright light exposure even in winter. Even on a cloudy day, the light outside can be greater than 1000 lux, a level never normally achieved indoors. In a recent study carried out at around latitude 45° N, daily exposure to light greater than 1000 lux averaged about 30 minutes in winter and only about 90 minutes in summer50 among people working at least 30 hours weekly; weekends were included. In this group, summer bright light exposure was probably considerably less than the winter exposure of our agricultural ancestors. We may be living in a bright light–deprived society. A large literature that is beyond the scope of this editorial exists on the beneficial effect of bright light exposure in healthy individuals. Lamps designed for the treatment of seasonal affective disorder, which provide more lux than is ever achieved by normal indoor lighting, are readily available, although incorporating their use into a daily routine may be a challenge for some. However, other strategies, both personal and institutional, exist. Light cafés, pioneered in Scandinavia, have come to the United Kingdom,51 and an Austrian village that receives no sunshine in the winter because of its surrounding mountains is building a series of giant mirrors to reflect sunlight into the valley.52 Better use of daylight in buildings is an issue that architects are increasingly aware of. Working indoors does not have to be associated with suboptimal exposure to bright light.

A third strategy that may raise brain serotonin is exercise. A comprehensive review of the relation between exercise and mood concluded that antidepressant and anxiolytic effects have been clearly demonstrated.53 In the United Kingdom, the National Institute for Health and Clinical Excellence, which works on behalf of the National Health Service and makes recommendations on treatments according to the best available evidence, has published a guide on the treatment of depression.54 The guide recommends treating mild clinical depression with various strategies, including exercise rather than antidepressants, because the risk–benefit ratio is poor for antidepressant use in patients with mild depression. Exercise improves mood in subclinical populations as well as in patients. The most consistent effect is seen when regular exercisers undertake aerobic exercise at a level with which they are familiar.53 However, some skepticism remains about the antidepressant effect of exercise, and the National Institute of Mental Health in the United States is currently funding a clinical trial of the antidepressant effect of exercise that is designed to overcome sources of potential bias and threats to internal and external validity that have limited previous research.55

Several lines of research suggest that exercise increases brain serotonin function in the human brain. Post and colleagues56 measured biogenic amine metabolites in cerebrospinal fluid (CSF) of patients with depression before and after they increased their physical activity to simulate mania. Physical activity increased 5-HIAA, but it is not clear that this was due to increased serotonin turnover or to mixing of CSF from higher regions, which contain higher levels of 5-HIAA, with lumbar CSF (or to a combination of both mechanisms). Nonetheless, this finding stimulated many animal studies on the effects of exercise. For example, Chaouloff and colleagues57 showed that exercise increased tryptophan and 5-HIAA in rat ventricles. More recent studies using intracerebral dialysis have shown that exercise increases extracellular serotonin and 5-HIAA in various brain areas, including the hippocampus and cortex (for example, see 58–60). Two different mechanisms may be involved in this effect. As reviewed by Jacobs and Fornal,61 motor activity increases the firing rates of serotonin neurons, and this results in increased release and synthesis of serotonin.62 In addition, there is an increase in the brain of the serotonin precursor tryptophan that persists after exercise.63

The largest body of work in humans looking at the effect of exercise on tryptophan availability to the brain is concerned with the hypothesis that fatigue during exercise is associated with elevated brain tryptophan and serotonin synthesis. A large body of evidence supports the idea that exercise, including exercise to fatigue, is associated with an increase in plasma tryptophan and a decrease in the plasma level of the branched chain amino acids (BCAAs) leucine, isoleucine and valine (see64,65 for reviews). The BCAAs inhibit tryptophan transport into the brain.66 Because of the increase in plasma tryptophan and decrease in BCAA, there is a substantial increase in tryptophan availability to the brain. Tryptophan is an effective mild hypnotic,67 a fact that stimulated the hypothesis that it may be involved in fatigue. A full discussion of this topic is not within the scope of this editorial; however, it is notable that several clinical trials of BCAA investigated whether it was possible to counter fatigue by lowering brain tryptophan, with results that provided little support for the hypothesis. Further, exercise results in an increase in the plasma ratio of tryptophan to the BCAAs before the onset of fatigue.64,65 The conclusion of these studies is that, in humans, a rise in precursor availability should increase serotonin synthesis during and after exercise and that this is not related to fatigue, although it may be related to improved mood. Whether motor activity increases the firing rate of serotonin neurons in humans, as in animals, is not known. However, it is clear that aerobic exercise can improve mood.

As with exposure to bright light, there has been a large change in the level of vigorous physical exercise experienced since humans were hunter-gatherers or engaged primarily in agriculture.68 Lambert68 argued that the decline in vigorous physical exercise and, in particular, in effort-based rewards may contribute to the high level of depression in today’s society. The effect of exercise on serotonin suggests that the exercise itself, not the rewards that stem from exercise, may be important. If trials of exercise to prevent depression are successful, then prevention of depression can be added to the numerous other benefits of exercise.

The fourth factor that could play a role in raising brain serotonin is diet. According to some evidence, tryptophan, which increases brain serotonin in humans as in experimental animals,69 is an effective antidepressant in mild-to-moderate depression.67,70 Further, in healthy people with high trait irritability, it increases agreeableness, decreases quarrelsomeness and improves mood.34 However, whether tryptophan should be considered primarily as a drug or a dietary component is a matter of some dispute. In the United States, it is classified as a dietary component, but Canada and some European countries classify it as a drug. Treating tryptophan as a drug is reasonable because, first, there is normally no situation in which purified tryptophan is needed for dietary reasons, and second, purified tryptophan and foods containing tryptophan have different effects on brain serotonin. Although purified tryptophan increases brain serotonin, foods containing tryptophan do not.71 This is because tryptophan is transported into the brain by a transport system that is active toward all the large neutral amino acids, and tryptophan is the least abundant amino acid in protein. There is competition between the various amino acids for the transport system, so after the ingestion of a meal containing protein, the rise in the plasma level of the other large neutral amino acids will prevent the rise in plasma tryptophan from increasing brain tryptophan. The idea, common in popular culture, that a high-protein food such as turkey will raise brain tryptophan and serotonin is, unfortunately, false. Another popular myth that is widespread on the Internet is that bananas improve mood because of their serotonin content. Although it is true that bananas contain serotonin, it does not cross the blood–brain barrier.
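The competition described above is often summarized as the ratio of plasma tryptophan to the sum of the other large neutral amino acids (LNAAs) sharing its transporter; it is this ratio, rather than the absolute tryptophan level, that tracks brain availability. The short Python sketch below only illustrates that qualitative point: all of the plasma values and the three scenarios are hypothetical, chosen to show how a protein meal can lower the ratio even while raising tryptophan itself.

# Illustrative only: brain tryptophan availability tracks the ratio of plasma
# tryptophan to the other large neutral amino acids (LNAAs) that compete for
# the same transporter. All numbers below are hypothetical.
def trp_to_lnaa_ratio(trp: float, other_lnaa: float) -> float:
    return trp / other_lnaa

fasting      = trp_to_lnaa_ratio(trp=50.0, other_lnaa=500.0)  # baseline
protein_meal = trp_to_lnaa_ratio(trp=55.0, other_lnaa=700.0)  # Trp rises a little, competitors rise more
purified_trp = trp_to_lnaa_ratio(trp=90.0, other_lnaa=500.0)  # purified tryptophan, competitors unchanged

print(f"fasting:      {fasting:.3f}")
print(f"protein meal: {protein_meal:.3f}  (ratio falls despite higher tryptophan)")
print(f"purified Trp: {purified_trp:.3f}  (ratio rises, consistent with increased brain serotonin)")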

α-Lactalbumin, a minor constituent of milk, is one protein that contains relatively more tryptophan than most proteins. Acute ingestion of α-lactalbumin by humans can improve mood and cognition in some circumstances, presumably owing to increased serotonin.72,73 Enhancing the tryptophan content of the diet chronically with α-lactalbumin is probably not practical. However, increasing the tryptophan content of the diet relative to that of the other amino acids is something that possibly occurred in the past and could occur again in the future. Kerem and colleagues74 studied the tryptophan content of both wild chickpeas and the domesticated chickpeas that were bred from them in the Near East in neolithic times. The mean protein content (per mg dry seed) was similar for 73 cultivars and 15 wild varieties. In the cultivated group, however, the tryptophan content was almost twice that of the wild seeds. Interestingly, the greater part of the increase was due to an increase in the free tryptophan content (i.e., not part of the protein). In cultivated chickpeas, almost two-thirds of the tryptophan was in the free form. Kerem and colleagues74 argue that there was probably selection for seeds with a higher tryptophan content. This is plausible, given another example of an early strategy to increase the available tryptophan content of an important food source. Pellagra is a disorder caused by niacin deficiency, usually owing to poverty and a diet relying heavily on corn (maize), which has a low level of niacin and its precursor tryptophan. Cultures in the Americas that relied greatly on corn used alkali during its processing (e.g., boiling the corn in lime when making tortillas). This enhanced the nutritional quality of the corn by increasing the bioavailability of both niacin and tryptophan, a practice that prevented pellagra.75 The Europeans transported corn around the world but did not transport the traditional alkali-processing methods, thereby causing epidemics of pellagra in past centuries. Breeding corn with a higher tryptophan content was shown in the 1980s to prevent pellagra76; presumably, it also raised brain serotonin. In a recent issue of Nature Biotechnology, Morris and Sands77 argue that plant breeders should be focusing more on nutrition than on yield. They ask, “Could consumption of tryptophan-rich foods play a role in reducing the prevalence of depression and aggression in society?” Cross-national studies have reported a positive association between corn consumption and homicide rates78 and a negative association between dietary tryptophan and suicide rates.79 Although the idea behind such studies is interesting, any causal attribution must remain speculative, given the possible confounds. Nonetheless, the possibility that the mental health of a population could be improved by increasing the dietary intake of tryptophan relative to the dietary intake of other amino acids remains an interesting idea that should be explored.

The primary purpose of this editorial is to point out that pharmacologic strategies are not the only ones worthy of study when devising strategies to increase brain serotonin function. The effect of nonpharmacologic interventions on brain serotonin and the implications of increased serotonin for mood and behaviour need to be studied more. The amount of money and effort put into research on drugs that alter serotonin is very much greater than that put into non-pharmacologic methods. The magnitude of the discrepancy is probably neither in tune with the wishes of the public nor optimal for progress in the prevention and treatment of mental disorders.

See the original post:
How to increase serotonin in the human brain without drugs

Posted in Human Longevity | Comments Off on How to increase serotonin in the human brain without drugs

Posthumanism – Wikipedia

Posted: at 1:19 am

This article is about a critique of humanism. For the futurist ideology and movement, see transhumanism.

Posthumanism or post-humanism (meaning “after humanism” or “beyond humanism”) is a term with at least seven definitions according to philosopher Francesca Ferrando:[1]

Philosopher Ted Schatzki suggests there are two varieties of posthumanism of the philosophical kind:[12]

One, which he calls ‘objectivism’, tries to counter the overemphasis of the subjective or intersubjective that pervades humanism, and emphasises the role of the nonhuman agents, whether they be animals and plants, or computers or other things.[12]

A second prioritizes practices, especially social practices, over individuals (or individual subjects) which, they say, constitute the individual.[12]

There may be a third kind of posthumanism, propounded by the philosopher Herman Dooyeweerd. Though he did not label it as ‘posthumanism’, he made an extensive and penetrating immanent critique of Humanism, and then constructed a philosophy that presupposed neither Humanist, nor Scholastic, nor Greek thought but started with a different religious ground motive.[13] Dooyeweerd prioritized law and meaningfulness as that which enables humanity and all else to exist, behave, live, occur, etc. “Meaning is the being of all that has been created,” Dooyeweerd wrote, “and the nature even of our selfhood.”[14] Both human and nonhuman alike function subject to a common ‘law-side’, which is diverse, composed of a number of distinct law-spheres or aspects.[15] The temporal being of both human and non-human is multi-aspectual; for example, both plants and humans are bodies, functioning in the biotic aspect, and both computers and humans function in the formative and lingual aspect, but humans function in the aesthetic, juridical, ethical and faith aspects too. The Dooyeweerdian version is able to incorporate and integrate both the objectivist version and the practices version, because it allows nonhuman agents their own subject-functioning in various aspects and places emphasis on aspectual functioning.[16]

Ihab Hassan, theorist in the academic study of literature, once stated:

Humanism may be coming to an end as humanism transforms itself into something one must helplessly call posthumanism.[17]

This view predates most currents of posthumanism which have developed over the late 20th century in somewhat diverse, but complementary, domains of thought and practice. For example, Hassan is a known scholar whose theoretical writings expressly address postmodernity in society.[citation needed] Beyond postmodernist studies, posthumanism has been developed and deployed by various cultural theorists, often in reaction to problematic inherent assumptions within humanistic and enlightenment thought.[4]

Theorists who both complement and contrast Hassan include Michel Foucault, Judith Butler, cyberneticists such as Gregory Bateson, Warren McCulloch, Norbert Wiener, Bruno Latour, Cary Wolfe, Elaine Graham, N. Katherine Hayles, Donna Haraway, Peter Sloterdijk, Stefan Lorenz Sorgner, Evan Thompson, Francisco Varela, Humberto Maturana and Douglas Kellner. Among the theorists are philosophers, such as Robert Pepperell, who have written about a “posthuman condition”, which is often substituted for the term “posthumanism”.[5][6]

Posthumanism differs from classical humanism by relegating humanity back to one of many natural species, thereby rejecting any claims founded on anthropocentric dominance.[18] According to this claim, humans have no inherent rights to destroy nature or set themselves above it in ethical considerations a priori. Human knowledge is also reduced to a less controlling position, previously seen as the defining aspect of the world. The limitations and fallibility of human intelligence are confessed, even though it does not imply abandoning the rational tradition of humanism.[citation needed]

Proponents of a posthuman discourse suggest that innovative advancements and emerging technologies have transcended the traditional model of the human, as proposed by Descartes among others associated with the philosophy of the Enlightenment period.[19] In contrast to humanism, the discourse of posthumanism seeks to redefine the boundaries surrounding the modern philosophical understanding of the human. Posthumanism represents an evolution of thought beyond contemporary social boundaries and is predicated on the seeking of truth within a postmodern context. In so doing, it rejects previous attempts to establish ‘anthropological universals’ that are imbued with anthropocentric assumptions.[18]

The philosopher Michel Foucault placed posthumanism within a context that differentiated humanism from Enlightenment thought. According to Foucault, the two existed in a state of tension: humanism sought to establish norms, while Enlightenment thought attempted to transcend all that is material, including the boundaries constructed by humanistic thought.[18] Drawing on the Enlightenment’s challenges to the boundaries of humanism, posthumanism rejects the various assumptions of human dogmas (anthropological, political, scientific) and takes the next step by attempting to change the nature of thought about what it means to be human. This requires not only decentering the human in multiple discourses (evolutionary, ecological, technological) but also examining those discourses to uncover inherent humanistic, anthropocentric, normative notions of humanness and the concept of the human.[4]

Posthumanistic discourse aims to open up spaces to examine what it means to be human and critically question the concept of “the human” in light of current cultural and historical contexts.[4] In her book How We Became Posthuman, N. Katherine Hayles writes about the struggle between different versions of the posthuman as it continually co-evolves alongside intelligent machines.[20] Such co-evolution, according to some strands of posthuman discourse, allows one to extend subjective understandings of real experiences beyond the boundaries of embodied existence. According to Hayles’s view of the posthuman, often referred to as technological posthumanism, visual perception and digital representations thus paradoxically become ever more salient. Even as one seeks to extend knowledge by deconstructing perceived boundaries, it is these same boundaries that make knowledge acquisition possible. The use of technology in contemporary society is thought to complicate this relationship.

Hayles discusses the translation of human bodies into information (as suggested by Hans Moravec) in order to illuminate how the boundaries of our embodied reality have been compromised in the current age and how narrow definitions of humanness no longer apply. Because of this, according to Hayles, posthumanism is characterized by a loss of subjectivity based on bodily boundaries.[4] This strand of posthumanism, including the changing notion of subjectivity and the disruption of ideas concerning what it means to be human, is often associated with Donna Haraway’s concept of the cyborg.[4] However, Haraway has distanced herself from posthumanistic discourse due to other theorists’ use of the term to promote utopian views of technological innovation to extend human biological capacity[21] (even though these notions would more correctly fall into the realm of transhumanism[4]).

While posthumanism is a broad and complex ideology, it has relevant implications today and for the future. It attempts to redefine social structures without inherently human or even biological origins, but rather in terms of social and psychological systems where consciousness and communication could potentially exist as unique disembodied entities. Questions subsequently emerge with respect to the current use and the future of technology in shaping human existence,[18] as do new concerns with regard to language, symbolism, subjectivity, phenomenology, ethics, justice and creativity.[22]

Posthumanism is sometimes used as a synonym for an ideology of technology known as “transhumanism” because it affirms the possibility and desirability of achieving a “posthuman future”, albeit in purely evolutionary terms.

James Hughes comments that there is considerable confusion between the two terms.[23][24]

Some critics have argued that all forms of posthumanism have more in common than their respective proponents realize.[25]

However, posthumanists in the humanities and the arts are critical of transhumanism, in part, because they argue that it incorporates and extends many of the values of Enlightenment humanism and classical liberalism, namely scientism, according to performance philosopher Shannon Bell:[26]

Altruism, mutualism, humanism are the soft and slimy virtues that underpin liberal capitalism. Humanism has always been integrated into discourses of exploitation: colonialism, imperialism, neoimperialism, democracy, and of course, American democratization. One of the serious flaws in transhumanism is the importation of liberal-human values to the biotechno enhancement of the human. Posthumanism has a much stronger critical edge attempting to develop through enactment new understandings of the self and others, essence, consciousness, intelligence, reason, agency, intimacy, life, embodiment, identity and the body.[26]

While many modern thought leaders accept the nature of the ideologies described by posthumanism, some are more skeptical of the term. Donna Haraway, the author of A Cyborg Manifesto, has outspokenly rejected the term, though she acknowledges a philosophical alignment with posthumanism. Haraway opts instead for the term companion species, referring to nonhuman entities with which humans coexist.[21]

Questions of race, some argue, are suspiciously elided within the “turn” to posthumanism. Noting that the terms “post” and “human” are already loaded with racial meaning, critical theorist Zakiyyah Iman Jackson argues that the impulse to move “beyond” the human within posthumanism too often ignores praxes of humanity and critiques produced by black thinkers, from Frantz Fanon and Aimé Césaire to Hortense Spillers and Fred Moten. Interrogating the conceptual grounds on which such a mode of “beyond” is rendered legible and viable, Jackson argues that it is important to observe that blackness conditions and constitutes the very nonhuman disruption that posthumanists invite. In other words, given that race in general, and blackness in particular, constitutes the very terms through which human/nonhuman distinctions are made, for example in the enduring legacies of scientific racism, a gesture toward a “beyond” actually returns us to a Eurocentric transcendentalism that has long been challenged.

Visit link:
Posthumanism – Wikipedia

Posted in Post Human | Comments Off on Posthumanism – Wikipedia

Articles about Freedom Of Speech – latimes

Posted: October 15, 2016 at 5:23 am


June 17, 1990 | RANDY LEWIS

Without freedom of thought there can be no such thing as wisdom, and no such thing as public liberty without freedom of speech; which is the right of every man as far as by it he does not hurt and control the right of another: and this is the only check it ought to suffer, and the only bounds it ought to know. –Benjamin Franklin, 1722 (at age 16) Warning: This column contains words and ideas that may be offensive to some readers.



Stanford University, joining a national trend, has adopted new rules against racial and sexual harassment by students, officials announced Friday. However, as at other campuses, opponents contend that the regulations violate freedom of speech.


February 16, 1994 | ESTHER IVEREM, NEWSDAY

Comedian Martin Lawrence has titled his first film “You So Crazy,” after the name of his national stand-up comedy tour. But the Motion Picture Assn. of America ratings board, which has slapped the film with an NC-17 rating, thinks it’s more like “You So Nasty.” Lawrence held a press conference Tuesday at Manhattan’s Omni Berkshire Hotel to announce his appeal of the rating. The appeal is scheduled to be heard Feb. 23, nine days before the film opens in New York and Los Angeles.


October 5, 1989 | STEVE HOCHMAN

First Amendment activists and a member of Congress said this week that the FBI may have stepped out of line with a letter accusing a Compton rap group of encouraging “violence against and disrespect” for law enforcement officers. “The FBI should stay out of the business of censorship,” said Rep. Don Edwards (D-San Jose), chairman of the House Judiciary Committee’s subcommittee on civil and constitutional rights, when informed of an Aug.




Originally posted here:
Articles about Freedom Of Speech – latimes

Posted in Freedom of Speech | Comments Off on Articles about Freedom Of Speech – latimes

Casino Gambling Web | Best Online Gambling News and Casinos …

Posted: October 13, 2016 at 5:36 am

The Top Online Casino Gambling News Reporting Site Since 2002! Latest News From the Casino Gambling Industry

Cheers and Jeers Abound for New UK Online Gambling Law (May 19, 2014): The new UK betting law is expected to be finalized by July 1st and go into effect by September 1st. However, many are concerned the law could create another wild-west situation in the UK…

Speculation on Casino Gambling Legalization in Japan Continues (May 13, 2014): LVS owner Sheldon Adelson continues to create gambling news across the world, this time in Japan as he salivates at the possibility of legalization before the 2020 Olympics…

LVS Owner Adelson Pulling the Strings of Politicians in the US (May 8, 2014): Las Vegas Sands is playing the political system, and its owner, Sheldon Adelson, is the puppet master behind the curtain pulling the strings, according to new reports…

New Jersey Bets Big on Sports Gambling, Loses – So Far… (May 5, 2014): Governor Chris Christie may need a win in the Supreme Court to justify his defense for his initiative to legalize sports betting in the state…

Tribal And Private Gaming Owners Square Off In Massachusetts (April 28, 2014): Steve Wynn and the Mohegan Sun are squaring off in a battle for a casino license in Massachusetts, and the two have vastly different views of how regulations are being constructed…

Below is a quick guide to the best gambling sites online. One is for USA players, the other is for players in the rest of the world. Good luck!

As laws change in 2012, the internet poker craze is set to boom once again in North America. Bovada, formerly known as Bodog, is one of the few sites that weathered the storm, and it is now the best place to play online. More players gamble here than anywhere else.

The goal of Casino Gambling Web is to provide each of our visitors with an insider’s view of every aspect of the gambling world. We have over 30 feeds releasing news to more than 30 specific gaming related categories in order to achieve our important goal of keeping you well updated and informed.

The main sections of our site are broken up into 5 broad areas of gambling news. The first covers issues concerning brick-and-mortar casinos like those found in Atlantic City, Las Vegas, the Gulf Coast region, and, well, now the rest of the USA. The second concerns the Internet casino community. We also have reporters who cover the international poker community and the world of sports gambling. And finally, we cover news about the law when it affects any part of the gambling community; such legal news could include updates to the UIGEA, issues surrounding petitions to repeal that law, or stories related to the new poker laws constantly being debated in state legislatures.

We go well beyond simply reporting the news. We get involved with the news, and sometimes we even become the news. We pride ourselves on providing follow-up coverage to individual news stories. We had reporters in Washington D.C. on the infamous night when the internet gambling ban was passed by a Congress led by former Senator Bill Frist, since proven to be corrupt, and we have staff constantly digging to get important details to American citizens. We had reporters at the World Series of Poker in Las Vegas when Jamie Gold won his bracelet and changed the online gambling world, and we have representatives playing in the tournament each and every year.

It is our pleasure and proud duty to serve as a reliable source of gambling news and quality online casino reviews for the entire international gaming community. Please take a few moments to look around our site and discover why we, and most other insiders of the industry, have considered CGW the #1 Top Casino Gambling News Reporting Organization since 2002.

The United States changed internet gambling when it passed the Unlawful Internet Gambling Enforcement Act (UIGEA), so when searching for top online casinos you must now focus your energies on finding post-UIGEA information rather than pre-UIGEA information. Before the law passed you could find reliable info on most gambling portals across the internet. Most of those portals simply advertised casinos and gambling sites that were tested and approved by eCogra, and in general you would be hard pressed to find an online casino with a bad reputation. However, now that these gambling sites have been forced out of the US, they may be changing how they run their business. That is why it is important to get your information from reliable sources who have been following the industry and keeping up with which companies have remained honorable. So good luck and happy hunting!

The Unlawful Internet Gambling Enforcement Act (UIGEA), in short, states that anything that may be illegal on a state level is now also illegal on a federal level. However, the day after Christmas in 2011, President Barack Obama’s administration delivered what the online gaming industry will forever view as a great big beautifully wrapped present. The government released a statement declaring that the 1961 Federal Wire Act covers only sports betting. What this means for the industry on an international level is still unknown, but what it means in the USA is that states can begin running online poker sites and selling lottery tickets to their citizens within their borders. The EU and WTO will surely have some analysis, and we will keep you updated as this situation unfolds. Be sure to check your state’s laws before you start to gamble online.

The UK was the first high-power territory to legalize and regulate online gambling, with a law passed in 2007. It allows all forms of betting but places strict requirements on advertisers. The UK first attracted offshore companies to come on land, which gave the gambling companies that complied the appearance of legitimacy. However, high taxes forced many who originally came to land back out to sea, and that battle forever rages on; on the whole, though, the regulations have proven greatly successful and have since served as a model for other gaming-enlightened countries around the world.

Since then, many European countries have regulated the industry, breaking up long term monopolies, sometimes even breaking up government backed empires, finally allowing competition – and the industry across the globe (outside of the USA) is thriving with rave reviews, even from those who are most interested in protecting the innocent and vulnerable members of society.

We strive to provide our visitors with the most valuable information about problem gambling and addiction in society. We have an entire section of our site dedicated to news about the subject. When a state or territory implements new technology to safeguard itself against allowing problem gamblers to proliferate, we will report it to you. If a new story reveals some positive or negative information about gambling as it relates to addiction, we will report it to you. And if you think you may have a problem with gambling right now, please visit Gamblers Anonymous.

In order to get all the information you need about this industry, it is important to visit Wiki’s Online Gambling page. It provides an unbiased view of the current state of the Internet gambling industry. If you are interested in learning about other issues, you may also enjoy visiting the National Council on Problem Gambling, an organization whose sole purpose is to help protect and support problem gamblers. They have a lot of great resources for anyone interested in learning more.

Read the original post:

Casino Gambling Web | Best Online Gambling News and Casinos …

Posted in Gambling | Comments Off on Casino Gambling Web | Best Online Gambling News and Casinos …

How America Lost the War on Drugs – News | Rolling Stone:

Posted: October 6, 2016 at 3:02 pm

Taibbi on Six Million Adults Who Won’t Influence This Presidential Race

One in 40 Americans can’t vote because of a criminal conviction. But the rules aren’t exactly fair

Lee Daniels Rejoins Jay Z-Produced Richard Pryor Film

Harvey Weinstein offers no details about director’s return to Mike Epps-starring biopic he left in May

Furry Community Shocked After Gory Triple Murder

Members of misunderstood subculture mourn loss, prepare for potential backlash

Jay Z Talks Kalief Browder Doc, Inhumanity of Solitary Confinement

Six-part ‘Time: The Kalief Browder Story’ will premiere early next year

Hear Pusha T, Rivers Cuomo on Soft Rock Zeds Dead Jam ‘Too Young’

Canadian electronic duo will release debut LP this month

Keith Urban to Headline Nashville New Year’s Eve Party

A Thousand Horses and Charlie Worsham also represent country on the all-genre bill

Watch ‘The Warriors’ Reunite to Discuss Cult Film’s Legacy, Fandom

Vermin, Cochise, Swan and more look back on Walter Hill’s 1979 gangland New York classic in exclusive new video

Watch U2 Blast Donald Trump During San Francisco Show

Bono blasts Republican nominee for trying to “run off with the American dream”

See Dolly Parton Play ‘Dollywood Squares’

Icon visits ‘The Talk’ for a humorous game, during which she reminisces about working with Burt Reynolds and writing “Jolene”

Watch Lera Lynn Perform Rumbling ‘What You Done’

Slow-burning accusation is the latest in singer-songwriter’s Live at Resistor Studio series

‘Insecure’ Creator Issa Rae Talks Drake, Maintaining ‘Awkward’-Ness

Comedian on translating her sensibility to small screen: “I didn’t set out to be like ‘I want to tell the black female millennial story'”

Watch Nick Cave and the Bad Seeds’ Mournful ‘Girl in Amber’ Video

Band performs ballad in new clip made from ‘One More Time With Feeling’ footage

Go Behind the Scenes of Metallica’s Raging ‘Moth Into Flame’ Video

Director Tom Kirk talks blending vintage, contemporary styles while James Hetfield plays with warehouse equipment

Flashback: David Bowie Plays a Haunting ‘Heroes’ in 1978

London performance was professionally recorded for a concert movie that has yet to see an official release

D.R.A.M. Announces Debut Album ‘Big Baby D.R.A.M.’

“Cha Cha” and “Broccoli” rapper will make solo late-night television debut on ‘Conan’

Watch Norah Jones Unleash Fiery ‘Flipside’ on ‘Fallon’

Musician delivers politically charged cut off new LP ‘Day Breaks’

Dustin Lynch’s Next Album Will Be ‘Very Sexy’

Country singer reports much of the upcoming project is about falling in and out of love

Watch Pixies Prowl for Love in ‘Um Chagga Lagga’ Video

Black Francis makes directorial debut with peculiar new clip

Hear Love and Theft’s Sexy New Song ‘Candyland’

“Angel Eyes” duo releases acoustic performance of lead single from forthcoming album

Hear Tift Merritt’s Blues-Fueled ‘Dusty Old Man’

Singer-songwriter looked to Bonnie Raitt’s first album as inspiration for her bluesy, guitar-driven new tune

Read more:

How America Lost the War on Drugs – News | Rolling Stone:

Posted in War On Drugs | Comments Off on How America Lost the War on Drugs – News | Rolling Stone: