ScienceWeek August 11, 2007


Dan Agin

Aug 12, 2007, 6:37:25 PM
to sci...@googlegroups.com
SCIENCEWEEK

August 11, 2007

Vol. 11 - Number 31

--------------------------------

The way to do research is to attack the facts at
the point of greatest astonishment.

-- Celia Green

--------------------------------

Contents (full text below):

1. Neuroscience: Shining Light on Depression

2. Evolution: An Embarrassment of Switches

3. Philosophy Of Science: The Cha-Cha-Cha Theory of Scientific Discovery

4. Books: Fame, Philosophy, and Physics

5. Systems Neuroscience: Timing is Everything

6. Chemical Biology: Ions Illuminated

7. Electrostatics: Colour Discrimination

8. Plagiarism: Academic Accused of Living on Borrowed Lines

=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=

1.

Science 10 August 2007: Vol. 317, No. 5839, pp. 757-758. DOI:
10.1126/science.1147565

Neuroscience: Shining Light on Depression

Thomas R. Insel

Just as research during the Decade of the Brain (1990-2000)
forged the bridge between the mind and the brain, research in the
current decade is helping us to understand mental illnesses as
brain disorders. As a result, the distinction between disorders
of neurology (e.g., Parkinson's and Alzheimer's diseases) and
disorders of psychiatry (e.g., schizophrenia and depression) may
turn out to be increasingly subtle. That is, the former may
result from focal lesions in the brain, whereas the latter arise
from abnormal activity in specific brain circuits in the absence
of a detectable lesion. As we become more adept at detecting
lesions that lead to abnormal function, it is even possible that
the distinction between neurological and psychiatric disorders
will vanish, leading to a combined discipline of clinical
neuroscience (1).

But before we can understand depression as a brain disorder, we
need information on the specific neuronal circuits that
contribute to the hopeless despair that forms the core of this
illness. Neuroimaging studies of people with depression might be
helpful for identifying brain regions of interest, but the
temporal and spatial resolution of current functional magnetic
resonance imaging and positron emission tomography may not
capture the real-time dynamics of brain function that are most
relevant to mood and cognition. In a new approach, Airan et al.
report on page 819 of this issue the use of optical imaging to
capture cellular activity at millisecond resolution in brain
slices (2). Their study, which uses rodents with some of the
behavioral features of depression, does not define the
neurobiology of depression in humans, but it demonstrates how
optical imaging--in this case, using voltage-sensitive dyes--can
identify changes in brain activity, enabling correlations between
real-time cellular activity and changing affective state.

The findings of Airan et al. are consistent with other results
that implicate the hippocampus in rodent studies of depression.
Chronic or intense stressors, such as social defeat, result in
behaviors that resemble human depression, and these stressors
have been reported to reduce hippocampal neurogenesis (3). They
also down-regulate the hippocampal expression of brain-derived
neurotrophic factor (4), a molecule that promotes neuron
survival, proliferation, and differentiation. Clinically
effective antidepressants increase hippocampal neurogenesis (5),
and blocking neurogenesis during treatment prevents the
antidepressant effect in rodents (6).

What about the hippocampus and human depression? Major depressive
disorder is associated with cognitive deficits and dysregulation
of the hypothalamic-pituitary-adrenal (HPA) axis, part of the
neuroendocrine system that controls the stress response. Because
the hippocampus is involved in both forming new memories and
regulating the HPA axis, one might expect a link between
depression and the hippocampus. Indeed, some human neuroimaging
studies have reported a subtle reduction in the size of the
hippocampus in patients with depression (7), and postmortem
studies have reported alterations in hippocampal gene expression
(8). But the evidence thus far is unconvincing. Humans with
hippocampal lesions have memory deficits but not mood disorders
(9). And none of the imaging or postmortem findings have been
shown to be specific to the hippocampus or to major depressive
disorder. Although the absence of evidence is hardly evidence of
absence, most recent clinical studies of the neurobiology of
depression have been following a different lead.

=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=

2.

Science 10 August 2007: Vol. 317, No. 5839, pp. 758-759. DOI:
10.1126/science.1146921

Evolution: An Embarrassment of Switches

Leonid Kruglyak and David L. Stern

What makes a human different from a chimpanzee or a mouse? Of
course, we know the answer in broad outline. Mutations in the
genome, sifted by natural selection, cause changes in appearance,
physiology, and behavior--what geneticists call the phenotype.
But we have only a vague picture of a more detailed answer.
Precisely which mutations generate phenotypic evolution? It's not
that we can't find the mutations. Today's DNA sequencing
technology readily identifies all differences between two
genomes. There are simply too many differences--tens of millions
between human and chimp, for example (1). An unknown fraction of
these mutations alter the phenotype. Nonetheless, the molecular
effects of mutations provide a rough guide to their phenotypic
effects. Some mutations change the amino acid sequence of
proteins, thereby altering their functions, and some change so-
called cis-regulatory regions, altering when and where proteins
are produced. We know a lot about the first class, but much less
about the second. Several recent papers, including one by
Borneman et al. on page 815 of this issue (2), demonstrate a
surprising abundance of cis-regulatory changes between closely
related species.

It is easy to identify mutations that alter proteins, because of
the simplicity of the genetic code. Linear strings of DNA
nucleotide triplets encode proteins, and each triplet always
specifies a particular amino acid. Thus, mutations that alter a
protein can be immediately read off from the DNA sequence. By
contrast, we are only beginning to understand how the cis-
regulatory code works (3). Cis-regulatory regions contain short
strings of nucleotides, from 6 to 20 nucleotides in length,
scattered irregularly in the vicinity of the protein-coding DNA.
Proteins called transcription factors bind to these short DNA
strings--transcription factor binding sites--to regulate the
production of messenger RNA and thus the synthesis of proteins.
In 1975, King and Wilson found that only about 1% of amino acids
differed between a set of human and chimpanzee proteins (4). They
thus proposed that changes in cis-regulatory regions--
evolutionary switching of transcription factor binding sites--
might cause the majority of phenotypic differences between
species. This hypothesis has gained support from studies over the
past decade (5).
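The fixed triplet-to-amino-acid mapping described above is why protein-altering mutations can be "read off" directly from DNA. A minimal sketch in Python (a toy lookup covering only a few codons, not a full genetic-code table) illustrates the point, including how a point mutation can be silent:

```python
# Toy fragment of the genetic code: each DNA triplet (codon) always
# specifies the same amino acid. Only a handful of codons are included.
CODON_TABLE = {
    "ATG": "Met", "TTT": "Phe", "TTC": "Phe",
    "GAA": "Glu", "GAG": "Glu", "TAA": "Stop",
}

def translate(dna):
    """Translate a coding DNA string into amino acids, triplet by triplet."""
    return [CODON_TABLE.get(dna[i:i + 3], "?")
            for i in range(0, len(dna) - 2, 3)]

# A point mutation TTT -> TTC leaves the protein unchanged (both encode
# Phe), so not every DNA difference between species alters a protein.
print(translate("ATGTTTGAA"))  # ['Met', 'Phe', 'Glu']
print(translate("ATGTTCGAA"))  # ['Met', 'Phe', 'Glu']
```

Deciphering the cis-regulatory code admits no such simple lookup, which is exactly the contrast the article draws.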

Recent computational studies across species illustrate that many
transcription factor binding sites have evolved quickly. That is,
binding sites present in one species are often absent in another
(6-8). New findings provide experimental evidence for this
conclusion. These studies use a technique called chromatin
immunoprecipitation to capture from cells a particular
transcription factor along with its bound short DNA strings.
These DNA strings are then identified by hybridization to DNA
microarrays (9, 10).

Using this approach, Borneman et al. examined binding of two
transcription factors in three yeast species. In only about 20%
of cases did a transcription factor bind to the same site
(meaning, approximately the same position with respect to the
target gene) in all three species. In some cases, the absence of
binding corresponded to a loss of the appropriate binding site.
Surprisingly, in other cases, the absence of binding in one
species occurred despite conservation of the DNA sequence. In a
similar study that compared transcription factor binding between
human and mouse genomes, Odom et al. (11) found that 41 to 89% of
cis-regulatory regions bound in one species were not bound in the
other. Even when the same gene region was bound by a particular
transcription factor in both species, the precise position of the
bound site with respect to the target gene often differed between
species.

Do all of these evolutionary switches in transcription factor
binding sites cause phenotypic differences? For two reasons, it
seems likely that many do not. First, change of a single site may
not alter gene expression. Transcription factors often bind to
multiple sites within the same cis-regulatory region and act
synergistically to regulate gene expression (3) (see the figure).
Thus, individual binding sites may be gained and lost during
evolution while the phenotype remains the same (12, 13). Second,
the phenotype is robust to some changes in gene expression (14).
For example, changes in enzyme concentration often have little
effect on the output of a metabolic pathway (15).

=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=

3.

Science 10 August 2007: Vol. 317, No. 5839, pp. 761-762. DOI:
10.1126/science.1147166

Philosophy Of Science: The Cha-Cha-Cha Theory of Scientific
Discovery

Daniel E. Koshland Jr.

Scientific discoveries are the steps--some small, some big--on
the staircase called progress, which has led to a better life for
the citizens of the world. Each scientific discovery is made
possible by the arrangement of neurons in the brain of one
individual and as such is idiosyncratic. In looking back on
centuries of scientific discoveries, however, a pattern emerges
which suggests that they fall into three categories--Charge,
Challenge, and Chance--that combine into a "Cha-Cha-Cha" Theory
of Scientific Discovery. (Nonscientific discoveries can be
categorized similarly.)

"Charge" discoveries solve problems that are quite obvious--cure
heart disease, understand the movement of stars in the sky--but
in which the way to solve the problem is not so clear. In these,
the scientist is called on, as Nobel laureate Albert Szent-
Györgyi put it, "to see what everyone else has seen and think
what no one else has thought before." Thus, the movement of stars
in the sky and the fall of an apple from a tree were apparent to
everyone, but Isaac Newton came up with the concept of gravity to
explain it all in one great theory.

"Challenge" discoveries are a response to an accumulation of
facts or concepts that are unexplained by or incongruous with
scientific theories of the time. The discoverer perceives that a
new concept or a new theory is required to pull all the phenomena
into one coherent whole. Sometimes the discoverer sees the
anomalies and also provides the solution. Sometimes many people
perceive the anomalies, but they wait for the discoverer to
provide a new concept. Those individuals, whom we might call
"uncoverers," contribute greatly to science, but it is the
individual who proposes the idea explaining all of the anomalies
who deserves to be called a discoverer.

"Chance" discoveries are those that are often called
serendipitous and which Louis Pasteur felt favored "the prepared
mind." In this category are the instances of a chance event that
the ready mind recognizes as important and then explains to other
scientists. This category not only would include Pasteur's
discovery of optical activity (D and L isomers), but also W. C.
Roentgen's x-rays and Roy Plunkett's Teflon. These scientists saw
what no one else had seen or reported and were able to realize
its importance.

There are well-known examples in each one of the Cha-Cha-Cha
categories (see the figure). Two conclusions are immediately
apparent. The first is that the original contribution of the
discoverer can be applied at different points in the solution of
a problem. In the Charge category, originality lies in the
devising of a solution, not in the perception of the problem. In
the Challenge category, the originality is in perceiving the
anomalies and their importance and devising a new concept that
explains them. In the Chance category, the original contribution
is the perception of the importance of the accident and
articulating the phenomenon on which it throws light.

=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=

4.

Science 10 August 2007: Vol. 317, No. 5839, pp. 752-753. DOI:
10.1126/science.1145110

Books: Fame, Philosophy, and Physics

Jeroen van Dongen (Reviewer)

Einstein: A Biography

by Jürgen Neffe

Translated from the German (1) by Shelley Frisch. Farrar, Straus
and Giroux, New York, 2007. 487 pp. $30.

Einstein: His Life and Universe

by Walter Isaacson

Simon and Schuster, New York, 2007. 718 pp., illus. $32.

In 1921 when the earliest Einstein biography, by the Berlin
publicist Alexander Moszkowski (2), was about to appear, Einstein
tried to halt its publication, because seeking the limelight was
frowned upon in the German academic milieu of his day. His name
had been widely publicized following the 1919 British eclipse
expedition that had confirmed central predictions of the theory
of relativity. In its aftermath, a group of rightist physicists
and agitators had started to publicly protest the clamor about
relativity and its Jewish, liberal, and pacifist creator.

Despite Einstein's initial resistance, his fame is far from
diminished. This year, a great many biographies later, two new
books try to capture again his science, politics, and private
life: Walter Isaacson's Einstein: His Life and Universe and
Jürgen Neffe's Einstein: A Biography. Isaacson and Neffe, both
successful journalists, shared a privilege that their
predecessors lacked: access to Einstein's most private
correspondence that had remained closed in the Einstein Archives
at the Hebrew University in Jerusalem until the summer of 2006.
New perspectives on Einstein's personal life might therefore be
expected from their books.

Indeed, Neffe discusses at length Einstein's divorce from his
first wife, Mileva Marić, and the troubled relationship with his
two sons. Einstein could at times be harsh and selfish toward his
family, as when he presented Marić (who desperately wanted to
remain married) with chilling terms under which he might agree to
endure living together with her; she would practically have been
reduced to his maid. Although bad endings to bad marriages happen
to good people, others too have observed a lack of empathy on
Einstein's part [e.g., Thomas Levenson (3)]. Neffe, however,
seems to be short of sympathy for his subject and consistently
portrays Einstein in the darkest light imaginable. He even
mentions an unnamed diary that is supposed to state that Einstein
was beating Mileva. Neffe does not shy away from sensationalism
or simplistic explanations: He offers as a matter of course the
presumption that Einstein's talent had to be accompanied by some
form of autism. And when Einstein's second wife (and cousin),
Elsa, passed away after close to 20 years of marriage, Neffe
claims that her "ensnared husband" exhibited barely any emotion
and simply started to work harder. Isaacson's account is better
informed: Einstein wept when Elsa died. He did delve into his
work, but "ashen with grief," as his collaborator Banesh Hoffmann
recalled.

=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=

5.

Nature 448, 652-654 (9 August 2007) | doi:10.1038/448652a;
Published online 8 August 2007

Systems Neuroscience: Timing is Everything

Phillip Larimer & Ben W. Strowbridge

Interactions among neurons in brain circuits underlie sensory
perception and information storage. Work in locusts shows how the
timing of different neuronal signals is synchronized to ensure
effective communication.

Most biological systems can adapt to different conditions and
environments. The nervous system has elaborated on this ability
and developed mechanisms that use prior experience to predict
future events. Many of these mechanisms could potentially support
behavioural prediction. However, little is known about which
specific mechanisms are used during common tasks, such as
learning how to hit a baseball or remembering to avoid poison
ivy. In a seminal study, Cassenaer and Laurent1 (page 709 of this
issue) demonstrate a specific predictive mechanism that operates
during olfactory learning in locusts.

In both mammals and insects, olfactory stimuli trigger diffuse,
but reproducible, patterns of neural activity in many
interconnected brain regions2. At the initial processing stage,
odorants in the environment evoke all-or-none electrical
discharges, which are recorded in neurons as spikes (action
potentials). As the cells involved in the odorant-to-spiking
conversion have only broad selectivity3, the activity of any one
neuron is a poor predictor of odorant identity. Instead, odorant
identity seems to be encoded by populations of neurons whose
activity becomes transiently synchronized in response to sensory
stimulation. Individual neurons often respond to several odorants
and probably participate in many transient 'cell assemblies'2.
The insect brain affords excellent accessibility for electrical
recordings from several neurons, making it useful for determining
how odorant-evoked activity patterns develop.

Network oscillations also have an important role in the
processing of olfactory information by linking together the
neurons that collectively represent a specific odorant. The
presence or absence of a single spike on a specific oscillation
cycle defines cell assemblies that are activated by an odorant.
In honeybees4 the disruption of network oscillations impairs
olfactory discrimination, highlighting the oscillations'
relevance to information processing.

Olfactory information is processed sequentially by different
brain regions that are linked by network oscillations. In
insects, simple olfactory stimuli activate large subsets of
projection neurons in the antennal lobe - a region analogous to
the olfactory bulb in mammals. The neural representation of
sensory information becomes significantly sparser in the second5-
and third1-order stages of olfactory processing (Fig. 1a). Sparse
coding is advantageous because it facilitates the recall of
memories from partial cues and allows for denser, more reliable
storage of biological information6.

=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=

6.

Nature 448, 654-655 (9 August 2007) | doi:10.1038/448654a;
Published online 8 August 2007

Chemical Biology: Ions Illuminated

Christopher J. Chang

Calcium ions act as signals between cells, but their exact
locations - at the nanometre scale - have been difficult to
pinpoint. The latest biosensor promises to reveal these details
in dynamic living systems.

Cell signalling is all about location. This concept is best
illustrated with calcium signals - cells funnel bursts of calcium
ions to specific locations, where the ions selectively activate a
wide variety of physiological functions. Calcium signals ebb and
flow to cellular hotspots that are confined to regions ranging in
size from micrometres down to tens of nanometres. But despite the
importance of localization for controlling the effects of calcium
signals on cells, it has been a daunting task to study calcium
and other transient cellular signals at the nanometre scale.

Reporting in Nature Chemical Biology, Tour et al.1 describe a
promising approach to this long-standing problem. They have
developed a calcium sensor that allows rapid, selective and
sensitive tracking of localized calcium signals with high
temporal and spatial resolution.

Fluorescence microscopy is a powerful technique for imaging, in
real time, many aspects of communication within and between
cells. The difficulty with this method for determining the
movement of dynamic cell signals such as calcium is detecting the
non-uniform variation in signal concentrations within highly
localized regions. Being able to detect these signal fluctuations
is essential, as they may lead to drastically different
biological outcomes. Synthetic, small-molecule (that is, non-
protein) fluorescent indicators - such as those in the Fura, Fluo
and Calcium Green families of compounds - show very rapid and
selective responses to calcium. But these indicators are
distributed diffusely in cells and so are unable to provide
subcellular resolution using conventional light microscopy.
Alternatively,
protein-based biosensors can be introduced at specific
subcellular locations using genetic engineering. This approach
provides an easy way to place calcium probes into cells, but such
sensors are limited by their slow responses, and their large
sizes can perturb the system of interest.

The strategy now presented by Tour et al.1 combines the
tunability and small size of synthetic chemical indicators with
the spatial resolution and control of genetically targeted
proteins. They have developed a prototype small-molecule sensor,
known as Calcium Green FlAsH (CaGF; Fig. 1a). This molecule
comprises a receptor that binds selectively to calcium, a
fluorescent reporter that responds to calcium binding, and two
arsenic groups that label proteins only at specially incorporated
peptide sequences that consist of four cysteine amino acids. This
study builds upon previous work from the same group2 that showed
that small arsenic-containing dyes target tetracysteine peptide
motifs. The addition of a calcium-reporting group to the dyes
introduces an extra dimension that allows calcium's function in
cellular systems to be studied using molecular imaging.

=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=

7.

Nature 448, 656 (9 August 2007) | doi:10.1038/448656a; Published
online 8 August 2007

Electrostatics: Colour Discrimination

Richard Webb

Like charges repel, unlike charges attract. The simplest way to
show this is to charge up different pieces of insulating plastic
by rubbing them on your shirt and watching what they do when
brought close to one another. Amit Mehrotra and colleagues use
a similar idea to separate a mixture of red and blue sand grains
falling into a hollow acrylic cylinder, purely through the
different amount of charge each is carrying (A. Mehrotra et al.
Phys. Rev. Lett. 99, 058001; 2007).

The red and blue grains were all of the same size and positively
charged, with the charge density of the blue grains being about
six times that of the red. The authors also made the cylinder
positively charged by rubbing it lightly with nitrile gloves. The
grains were mixed up on a vibratory feeder, and then discharged
into the cylinder from a metal chute.

On entering the cylinder, the charged grains separated
spontaneously into red and blue components (pictured). Oddly,
however, it was the more positively charged blue grains that
moved towards the positively charged cylinder walls - rather than
being more strongly repelled, as basic electrostatics would seem
to demand.

The authors show through simulations that the sand particles are
not, in fact, going against the grain. The effect is caused by
negative charges induced on the underside of the metal chute,
whose concentrated attraction causes a 'beard' of falling sand
grains to grow on the lip of the chute. This beard is
sufficiently repulsive that the more highly charged blue grains
levitate more strongly off the end of the chute, resulting in two
falling streams separated according to colour.
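The separation works because the repulsive electrostatic force on a grain scales linearly with the charge it carries (F = qE), so grains with roughly six times the charge are pushed about six times harder by the same field near the chute lip. A toy calculation with entirely hypothetical numbers (not values from the paper) makes the scaling explicit:

```python
# Illustrative only: the force a fixed field exerts on a grain is F = q*E,
# linear in the grain's charge q. Blue grains carry ~6x the charge density
# of red grains of the same size, so they are deflected ~6x as strongly.
E_FIELD = 2.0e4        # hypothetical repulsive field near the chute lip (V/m)
Q_RED = 1.0e-12        # hypothetical charge on a red grain (C)
Q_BLUE = 6.0 * Q_RED   # blue grains: about six times the charge

force_red = Q_RED * E_FIELD
force_blue = Q_BLUE * E_FIELD
print(force_blue / force_red)  # 6.0
```

The sixfold difference in deflection is what splits one falling mixture into two colour-sorted streams.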

Pretty as it is, the experiment also has a practical aspect. The
ability to separate grains by how much charge they carry, rather
than by charge sign, could have applications in technologies that
exploit electrostatic charging - aerosol drug delivery,
xerography and filtration, for example.

=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=

8.

Nature 448, 632-633 (9 August 2007) | doi:10.1038/448632b;
Published online 8 August 2007; Corrected 8 August 2007

Plagiarism: Academic Accused of Living on Borrowed Lines

Allegations prompt fears over prevalence of plagiarism.

A shockwave could be about to hit the normally tranquil waters of
social science. A German economist, specializing in environmental
science and technology, has allegedly committed serial plagiarism
and invented academic affiliations going back decades. The case
should act as a warning sign to editors about how widespread
plagiarism and deception may be, experts say.

Events may only now be catching up with Hans Werner Gottinger,
63, who is drifting into retirement in the town of Ingolstadt,
Germany. This week the journal Research Policy is set to retract
a 1993 paper by Gottinger, which analysed demand for spin-off
technologies from Ronald Reagan's Strategic Defense Initiative.
An accompanying editorial explains that two referees concluded
that the article substantially plagiarized a paper published in
1980 in the Journal of Business by Frank Bass, then an economist
at Purdue University in West Lafayette, Indiana. The editorial
also profiles other cases of plagiarism.

Gottinger claims that he has "only scant recollection" of events
so long in the past, but insists that he did not intend to
plagiarize. He adds that he has "sincerely apologized" for any
misunderstandings.

Problems with his paper came to the journal editors' attention in
June, when a student noted that whole paragraphs and strings of
complex mathematical equations in the Bass paper - which analysed
demand for consumer-durable technologies - had been repeated
almost exactly in Gottinger's paper. Gottinger did not
acknowledge the Bass paper in his work.

Further investigations by one of the journal editors, Ben Martin,
an expert in science policy at the University of Sussex in
Brighton, UK, revealed that this was not the first such case. In
1999, the editors of the economics journal Kyklos had withdrawn a
1996 paper by Gottinger after finding that it had plagiarized a
1992 paper in the journal Economics of Innovation and New
Technology. And by googling various strings of text from half a
dozen other Gottinger papers, Martin identified yet another case
- a 2002 paper by Gottinger about an economic model of global
warming in the International Journal of Global Energy Issues, in
which whole sections were remarkably similar to a 1997 article in
the Journal of Environmental Economics and Management, by
economist Zhiqi Chen of Carleton University in Ottawa, Canada.
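Searching for repeated strings of text, as Martin did with a web search engine, can be approximated programmatically with a word n-gram overlap score. The function below is a hypothetical sketch of that idea, not the method the journal editors actually used:

```python
def ngram_overlap(suspect, source, n=5):
    """Fraction of word n-grams in `suspect` that also occur in `source`.

    A score near 1.0 flags near-verbatim copying; distinct texts
    score near 0.0. Purely illustrative, not a forensic tool.
    """
    def ngrams(text):
        words = text.lower().split()
        return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

    grams_s, grams_o = ngrams(suspect), ngrams(source)
    return len(grams_s & grams_o) / len(grams_s) if grams_s else 0.0

source = "whole paragraphs and strings of complex mathematical equations"
copy = "whole paragraphs and strings of complex mathematical equations"
print(ngram_overlap(copy, source))  # 1.0 for a verbatim copy
```

Real plagiarism-detection systems refine this basic idea with fingerprinting and large reference corpora, but the underlying signal is the same: long shared word sequences are vanishingly unlikely by chance.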

=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=
