July 13, 2015
Melbourne researchers have identified a protein responsible for preserving the antibody-producing cells that lead to long-term immunity after infection or vaccination.
Dr Kim Good-Jacobson, Professor David Tarlinton and colleagues from the Walter and Eliza Hall Institute discovered that a protein called Myb was essential for antibody-producing plasma cells to migrate into the bone marrow, preserving them for many years or even decades. Their findings were published in the Journal of Experimental Medicine.
Dr Good-Jacobson said plasma cells were created when the immune system was exposed to pathogens such as viruses or bacteria. “When our immune system encounters a new pathogen, it can create plasma cells that secrete antibodies to specifically prevent future infections, generating immunity,” she said.
“Our bone marrow is like a long-term storage facility for plasma cells, allowing them to continue producing antibodies to protect against future infections. Until now, it was not known why some plasma cells moved into the bone marrow, while others remained in the blood stream and perished after a few days.”
The research team discovered that when the gene that produces the protein Myb was removed, plasma cells were no longer able to move into the bone marrow to provide long-term immunity. “Myb is a type of protein called a transcription factor, which binds to DNA and, in effect, switches genes ‘on’ or ‘off’,” Dr Good-Jacobson said.
“We found that if a plasma cell produced Myb at some stage during an immune response, then those plasma cells had the ability to migrate into the bone marrow. If we can understand how to flip the molecular switch in plasma cells and activate Myb production, we might be able to encourage the immune system to create long-term immunity for a range of infections.”
Plasma cells are created during an immune response in temporary structures called germinal centres, Dr Good-Jacobson said. “Germinal centres act as a rapid prototyping facility, improving the design of antibodies to better recognise invading pathogens in the future,” she said. “The Myb protein marks the plasma cells that produce high-quality antibodies for preservation.”
Professor Tarlinton said the discovery would mean researchers could now search for the trigger of Myb production and find out what genes Myb controls. “Now that we know Myb is critical in creating long-term immunity, we can begin dissecting the pathways it uses to mark plasma cells for storage and the genes involved in migrating to the bone marrow,” he said.
“Some pathogens, such as malaria, typically trigger the creation of short-lived plasma cells. If we don’t create long-lived plasma cells, we don’t develop lasting immunity to the disease. If we can trigger the expression of Myb in plasma cells responding to pathogens – either by infection or by immunisation – we might be able to convince the immune system to store these plasma cells in the bone marrow to offer protection against future infections.”
June 24, 2015
Moving closer to the possibility of “materials that compute” and wearing your computer on your sleeve, researchers at the University of Pittsburgh Swanson School of Engineering have designed a responsive hybrid material that is fueled by an oscillatory chemical reaction and can perform computations based on changes in the environment or movement, and potentially even respond to human vital signs. The material system is sufficiently small and flexible that it could ultimately be integrated into a fabric or introduced as an inset into a shoe.
Anna C. Balazs, Ph.D., distinguished professor of chemical and petroleum engineering, and Steven P. Levitan, Ph.D., John A. Jurenko professor of electrical and computer engineering, integrated models for self-oscillating polymer gels and piezoelectric micro-electro-mechanical systems to devise a new reactive material system capable of performing computations without external energy inputs, amplification or computer mediation.
Their research, “Achieving synchronization with active hybrid materials: Coupling self-oscillating gels and piezoelectric (PZ) films,” appeared June 24th in the journal Scientific Reports, published by Nature (DOI: 10.1038/srep11577). The studies combine Balazs’ research in Belousov-Zhabotinsky (BZ) gels, a substance that oscillates in the absence of external stimuli, and Levitan’s expertise in computational modeling and oscillator-based computing systems. By working with Dr. Victor V. Yashin, research assistant professor of chemical and petroleum engineering and lead author on the paper, the researchers developed design rules for creating a hybrid “BZ-PZ” material.
“The BZ reaction drives the periodic oxidation and reduction of a metal catalyst that is anchored to the gel; this, in turn, makes the gel swell and shrink. We put a thin piezoelectric (PZ) cantilever over the gel so that when the PZ is bent by the oscillating gel, it generates an electric potential (voltage). Conversely, an electric potential applied to the PZ cantilever causes it to bend,” said Balazs. “So, when a single BZ-PZ unit is wired to another such unit, the expansion of the oscillating BZ gel in the first unit deflects the piezoelectric cantilever, which produces an electrical voltage. The generated voltage in turn causes a deflection of the cantilever in the second unit; this deflection imposes a force on the underlying BZ gel that modifies its oscillations. The resulting ‘see-saw-like’ oscillation permits communication and an exchange of information between the units.”
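The exchange Balazs describes is, at its core, two oscillators nudging each other through a weak mutual coupling. A minimal sketch of that behaviour (a generic Kuramoto-style phase model, not the authors’ actual gel-and-cantilever equations; the frequencies and coupling strength are illustrative assumptions) shows how two units with slightly different natural frequencies lock into a shared rhythm once coupled:

```python
import math

def simulate(coupling, omega1=1.0, omega2=1.1, dt=0.01, steps=10_000):
    """Euler-integrate two phase oscillators coupled Kuramoto-style.

    Each unit nudges the other toward its own phase, a crude stand-in
    for the voltage-mediated coupling between BZ-PZ units.
    """
    theta1, theta2 = 0.0, math.pi / 2  # start out of phase
    for _ in range(steps):
        d1 = omega1 + coupling * math.sin(theta2 - theta1)
        d2 = omega2 + coupling * math.sin(theta1 - theta2)
        theta1 += d1 * dt
        theta2 += d2 * dt
    return (theta2 - theta1) % (2 * math.pi)  # final phase difference

locked = simulate(coupling=0.5)    # coupled: phase difference settles near asin(0.1)
drifting = simulate(coupling=0.0)  # uncoupled: the phases drift apart indefinitely
```

With coupling present, the phase difference settles at a small constant offset (the two units share one rhythm and can therefore exchange information); without it, the faster unit simply laps the slower one.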
Multiple BZ-PZ units can be connected in serial or parallel, allowing more complicated patterns of oscillation to be generated and stored in the system. In effect, these different oscillatory patterns form a type of “memory,” allowing the material to be used for computation. Levitan adds, however, that the computations would not be general purpose, but rather specific to pattern-matching and recognition, or other non-Boolean operations.
“Imagine a group of organ pipes, and each is a different chord. When you introduce a new chord, one resonates with that particular pattern,” Levitan said. “Similarly, let’s say you have an array of oscillators and they each have an oscillating pattern. Each set of oscillators would reflect a particular pattern. Then you introduce a new external input pattern, say from a touch or a heartbeat. The materials themselves recognize the pattern and respond accordingly, thereby performing the actual computing.”
Developing so-called “materials that compute” addresses limitations inherent to the systems currently used by researchers to perform either chemical computing or oscillator-based computing. Chemical computing systems are limited by both the lack of an internal power system and the rate of diffusion as the chemical waves spread throughout the system, enabling only local coupling. Further, oscillator-based computing has not been translated into a potentially wearable material. The hybrid BZ-PZ model, which has never been proposed previously, solves these problems and points to the potential of designing synthetic material systems that are self-powered.
Balazs and Levitan note that the current BZ-PZ gel model oscillates in periods of tens of seconds, which would allow for simple non-Boolean operations or recognition of patterns such as human movement. The next step for Drs. Balazs and Levitan is to add an input layer for the pattern recognition, something that has been accomplished in other technologies but will be applied to self-oscillating gels and piezoelectric films for the first time.
June 15, 2015
Humans are unlikely to be the only animal capable of self-awareness, a new study has shown.
Conducted by University of Warwick researchers, the study found that humans and other animals capable of mentally simulating environments require at least a primitive sense of self. The finding suggests that any animal that can simulate environments must have a form of self-awareness.
Self-awareness is often viewed as one of man’s defining characteristics, but the study strongly suggests it is not unique to mankind and is instead likely to be common among animals.
The researchers, from the University of Warwick’s Departments of Psychology and Philosophy, used thought experiments to discover which capabilities animals must have in order to mentally simulate their environment.
Commenting on the research Professor Thomas Hills, study co-author from Warwick’s Department of Psychology, said: “The study’s key insight is that those animals capable of simulating their future actions must be able to distinguish between their imagined actions and those that are actually experienced.”
The researchers were inspired by work conducted in the 1950s on maze navigation in rats. It was observed that rats, at points in the maze that required them to make decisions on what they would do next, often stopped and appeared to deliberate over their future actions.
Recent neuroscience research found that at these ‘choice points’ rats and other vertebrates activate regions of their hippocampus that appear to simulate choices and their potential outcomes.
Professor Hills and Professor Stephen Butterfill, from Warwick’s Department of Philosophy, created different descriptive models to explain the process behind the rat’s deliberation at the ‘choice points’.
One model, the Naive Model, assumed that animals inhibit action during simulation. However, this model created false memories because the animal would be unable to tell the difference between real and imagined actions.
A second, the Self-actuating Model, was able to solve this problem by ‘tagging’ real versus imagined experience. Hills and Butterfill called this tagging the ‘primal self’.
Commenting on the finding, Professor Hills said: “The study answers a very old question: do animals have a sense of self? Our first aim was to understand the recent neural evidence that animals can project themselves into the future. What we wound up understanding is that, in order to do so, they must have a primal sense of self.”
“As such, humans must not be the only animal capable of self-awareness. Indeed, the answer we are led to is that anything, even robots, that can adaptively imagine themselves doing what they have not yet done, must be able to separate the knower from the known.”
The study, From foraging to autonoetic consciousness: The primal self as a consequence of embodied prospective foraging, is published by Current Zoology.
June 3, 2015
An international team of scientists led by Cardiff University researchers has provided the strongest evidence yet of what causes schizophrenia – a condition that affects around 1% of the global population.
Published in the journal Neuron, their work presents strong evidence that disruption of a delicate chemical balance in the brain is heavily implicated in the disorder.
In the largest ever study of its kind, the team found that disease-linked mutations disrupt specific sets of genes contributing to excitatory and inhibitory signalling, the balance of which plays a crucial role in healthy brain development and function.
The breakthrough builds on two landmark studies led by members of the Cardiff University team, published last year in the journal Nature.
“We’re finally starting to understand what goes wrong in schizophrenia,” says lead author Dr Andrew Pocklington from Cardiff University’s MRC Centre for Neuropsychiatric Genetics and Genomics.
“Our study marks a significant step towards understanding the biology underpinning schizophrenia, which is an incredibly complex condition and has up until very recently kept scientists largely mystified as to its origins.
“We now have what we hope is a pretty sizeable piece of the jigsaw puzzle that will help us develop a coherent model of the disease, while helping us to rule out some of the alternatives.
“A reliable model of disease is urgently needed to direct future efforts in developing new treatments, which haven’t really improved a great deal since the 1970s.”
Professor Hugh Perry, who chairs the Medical Research Council Neuroscience and Mental Health Board said: “This work builds on our understanding of the genetic causes of schizophrenia – unravelling how a combination of genetic faults can disrupt the chemical balance of the brain.
“Scientists in the UK, as part of an international consortium, are uncovering the genetic causes of a range of mental health issues, such as schizophrenia.
“In the future, this work could lead to new ways of predicting an individual’s risk of developing schizophrenia and form the basis of new targeted treatments that are based on an individual’s genetic makeup.”
A healthy brain is able to function properly thanks to a precise balance between chemical signals that excite and inhibit nerve cell activity. Researchers studying psychiatric disorders have previously suspected that disruption of this balance contributes to schizophrenia.
The first evidence that schizophrenia mutations interfere with excitatory signalling was uncovered in 2011 by the same team, based at Cardiff University’s MRC Centre for Neuropsychiatric Genetics and Genomics.
This paper not only confirms their previous findings, but also provides the first strong genetic evidence that disruption of inhibitory signalling contributes to the disorder.
To reach their conclusions, the scientists compared the genetic data of 11,355 patients with schizophrenia against a control group of 16,416 people without the condition.
They looked for types of mutation known as copy number variants (CNVs), mutations in which large stretches of DNA are either deleted or duplicated.
Comparing the CNVs found in people with schizophrenia to those found in unaffected people, the team was able to show that the mutations in individuals with the disorder tended to disrupt genes involved in specific aspects of brain function.
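A case-control comparison of this kind ultimately asks whether disruptive CNVs in a gene set are significantly over-represented among patients. A minimal sketch using a one-sided Fisher’s exact test, with the cohort sizes from the study but entirely invented carrier counts (the real analysis is far more involved):

```python
from math import comb

def fisher_one_sided(a, b, c, d):
    """One-sided Fisher's exact test for a 2x2 table:

                    carrier  non-carrier
        cases          a         b
        controls       c         d

    Returns P(>= a carriers among cases) under the null hypothesis
    that carrier status is independent of case/control status.
    """
    n = a + b + c + d
    carriers = a + c
    cases = a + b
    denom = comb(n, carriers)
    p = 0.0
    for k in range(a, min(carriers, cases) + 1):
        # hypergeometric probability of exactly k carriers among cases
        p += comb(cases, k) * comb(n - cases, carriers - k) / denom
    return p

# Cohort sizes are from the article; the carrier counts below are
# invented purely to illustrate the calculation.
cases_with, cases_without = 120, 11_355 - 120
ctrls_with, ctrls_without = 60, 16_416 - 60
p = fisher_one_sided(cases_with, cases_without, ctrls_with, ctrls_without)
```

Under the null, roughly 74 of the 180 hypothetical carriers would be expected among the cases; observing 120 gives a vanishingly small p-value, i.e. strong evidence of enrichment.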
The disease-causing effects of CNVs are also suspected to be involved in other neurodevelopmental disorders such as intellectual disability, Autism Spectrum Disorder and ADHD.
Around 635,000 people in the UK will at some stage in their lives be affected by schizophrenia.
The estimated cost of schizophrenia and psychosis to society is around £11.8 billion a year.
The symptoms of schizophrenia can be extremely disruptive, and have a large impact on a person’s ability to carry out everyday tasks, such as going to work, maintaining relationships and caring for themselves or others.
April 22, 2015
How is lightning initiated in thunderclouds? This is difficult to answer – how do you measure electric fields inside large, dangerously charged clouds? It was discovered, more or less by coincidence, that cosmic rays provide suitable probes to measure electric fields within thunderclouds. This surprising finding is published in Physical Review Letters on April 24th. The measurements were performed with the LOFAR radio telescope located in the Netherlands.
‘We used to throw away LOFAR measurements taken during thunderstorms. They were too messy,’ says astronomer Pim Schellart. ‘Well, we didn’t actually throw them away of course, we just didn’t analyze them.’ Schellart, who completed his PhD in March this year at Radboud University in Nijmegen and is supervised by Prof. Heino Falcke, is interested in cosmic rays. These high-energy particles, originating from exploding stars and other astrophysical sources, continuously bombard Earth from space.
High in the atmosphere these particles strike atmospheric molecules and create ‘showers’ of elementary particles. These showers can also be measured from the radio emission that is generated when their constituent particles are deflected by the magnetic field of the Earth. The radio emission also gives information about the original particles. These measurements are routinely conducted with LOFAR at ASTRON in Dwingeloo, but not during thunderstorms.
That changed when the data were examined in a collaborative effort with astrophysicist Gia Trinh, Prof. Olaf Scholten from the University of Groningen and lightning expert Ute Ebert from the Centrum Wiskunde & Informatica in Amsterdam.
‘We modeled how the electric field in thunderstorms can explain the different measurements. This worked very well. How the radio emission changes gives us a lot of information about the electric fields in thunderstorms. We could even determine the strength of the electric field at a certain height in the cloud,’ says Schellart.
This field can be as strong as 50 kV/m. This translates into a voltage of hundreds of millions of volts over a distance of multiple kilometers: a thundercloud contains enormous amounts of energy.
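The arithmetic behind that claim is direct: potential difference is field strength times distance, so a 50 kV/m field sustained across a few kilometres of cloud (4 km is assumed here for illustration) yields hundreds of megavolts:

```python
# Back-of-envelope check of the numbers quoted above.
field_strength = 50e3   # volts per metre (50 kV/m)
cloud_extent = 4e3      # metres; "multiple kilometres", 4 km assumed
voltage = field_strength * cloud_extent
# 50e3 V/m * 4e3 m = 2e8 V, i.e. 200 million volts
```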
Lightning is a highly unpredictable natural phenomenon that inflicts damage to infrastructure and claims victims around the world. This new method to measure electric fields in thunderclouds will contribute to a better understanding and ultimately better predictions of lightning activity. Current measurement methods from planes, balloons or little rockets are dangerous and too localized. Most importantly, the presence of the measurement equipment influences the measurements. Cosmic rays probe the thunderclouds from top to bottom. Moving at almost the speed of light, they provide a near instantaneous ‘picture’ of the electric fields in the cloud. Moreover, they are created by nature and are freely available.
‘This research is an exemplary form of interdisciplinary collaboration between astronomers, particle physicists and geophysicists’, says Heino Falcke. ‘We hope to develop the model further to ultimately answer the question: how is lightning initiated within thunderclouds?’
February 14, 2015
We’re so used to murder mysteries that we don’t even notice how mystery authors play with time. Typically the murder occurs well before the midpoint of the book, but there is an information blackout at that point and the reader learns what happened then only on the last page.
If the last page were ripped out of the book, physicist Kater Murch, PhD, said, would the reader be better off guessing what happened by reading only up to the fatal incident or by reading the entire book?
The answer, so obvious in the case of the murder mystery, is less so in the world of quantum mechanics, where indeterminacy is fundamental rather than contrived for our reading pleasure.
Even if you know everything quantum mechanics can tell you about a quantum particle, said Murch, an assistant professor of physics in Arts & Sciences at Washington University in St. Louis, you cannot predict with certainty the outcome of a simple experiment to measure its state. All quantum mechanics can offer are statistical probabilities for the possible results.
The orthodox view is that this indeterminacy is not a defect of the theory, but rather a fact of nature. The particle’s state is not merely unknown, but truly undefined before it is measured. The act of measurement itself forces the particle to collapse to a definite state.
In the Feb. 13 issue of Physical Review Letters, Kater Murch describes a way to narrow the odds. By combining information about a quantum system’s evolution after a target time with information about its evolution up to that time, his lab was able to narrow the odds of correctly guessing the state of the two-state system from 50-50 to 90-10.
It’s as if what we did today changed what we did yesterday. And as this analogy suggests, the experimental results have spooky implications for time and causality – at least in the microscopic world to which quantum mechanics applies.
Measuring a phantom
Until recently physicists could explore the quantum mechanical properties of single particles only through thought experiments, because any attempt to observe them directly caused them to shed their mysterious quantum properties.
But in the 1980s and 1990s physicists invented devices that allowed them to measure these fragile quantum systems so gently that they don’t immediately collapse to a definite state.
The device Murch uses to explore quantum space is a simple superconducting circuit that enters quantum space when it is cooled to near absolute zero. Murch’s team uses the bottom two energy levels of this qubit, the ground state and an excited state, as their model quantum system. Between these two states, there are an infinite number of quantum states that are superpositions, or combinations, of the ground and excited states.
The quantum state of the circuit is detected by putting it inside a microwave box. A few microwave photons are sent into the box, where their quantum fields interact with the superconducting circuit. So when the photons exit the box they bear information about the quantum system.
Crucially, these “weak,” off-resonance measurements do not disturb the qubit, unlike “strong” measurements with photons that are resonant with the energy difference between the two states, which knock the circuit into one or the other state.
A quantum guessing game
In Physical Review Letters, Murch describes a quantum guessing game played with the qubit.
“We start each run by putting the qubit in a superposition of the two states,” he said. “Then we do a strong measurement but hide the result, continuing to follow the system with weak measurements.”
They then try to guess the hidden result, which is their version of the missing page of the murder mystery.
“Calculating forward, using the Born equation that expresses the probability of finding the system in a particular state, your odds of guessing right are only 50-50,” Murch said. “But you can also calculate backward using something called an effect matrix. Just take all the equations and flip them around. They still work and you can just run the trajectory backward.
“So there’s a backward-going trajectory and a forward-going trajectory and if we look at them both together and weight the information in both equally, we get something we call a hindsight prediction, or ‘retrodiction’.”
The shattering thing about the retrodiction is that it is 90 percent accurate. When the physicists check it against the stored measurement of the system’s earlier state it is right nine times out of 10.
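A purely classical analogy conveys why the odds improve (this is not the lab’s quantum formalism of Born-rule forward probabilities and effect matrices; the noise level and trial counts below are arbitrary choices): a hidden binary outcome followed by many noisy “weak” readings can be guessed far better from the subsequent record than from the 50-50 prior alone.

```python
import random

random.seed(0)

def run_trials(n_trials=2000, n_weak=20, noise=1.5):
    """Classical stand-in for the guessing game: a hidden binary outcome
    (the concealed strong measurement) is followed by many noisy 'weak'
    readings. Guessing from the 50-50 prior alone is chance; averaging
    the subsequent weak record recovers the hidden result far more often.
    """
    prior_correct = weak_correct = 0
    for _ in range(n_trials):
        hidden = random.choice([0, 1])          # concealed measurement result
        readings = [hidden + random.gauss(0, noise) for _ in range(n_weak)]
        prior_guess = random.choice([0, 1])     # no information: coin flip
        weak_guess = 1 if sum(readings) / n_weak > 0.5 else 0
        prior_correct += (prior_guess == hidden)
        weak_correct += (weak_guess == hidden)
    return prior_correct / n_trials, weak_correct / n_trials

prior_acc, weak_acc = run_trials()  # roughly 0.5 versus above 0.9
```

The point of the analogy is only that information gathered after the hidden event still constrains it; the quantum result is stranger, because there the state itself is undefined rather than merely unknown.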
Down the rabbit hole
The quantum guessing game suggests ways to make both quantum computing and the quantum control of open systems, such as chemical reactions, more robust. But it also has implications for much deeper problems in physics.
For one thing, it suggests that in the quantum world time runs both backward and forward whereas in the classical world it only runs forward.
“I always thought the measurement would resolve the time symmetry in quantum mechanics,” Murch said. “If we measure a particle in a superposition of states and it collapses into one of two states, well, that sounds like a process that goes forward in time.”
But in the quantum guessing experiment, time symmetry has returned. The improved odds imply the measured quantum state somehow incorporates information from the future as well as the past. And that implies that time, notoriously an arrow in the classical world, is a double-headed arrow in the quantum world.
“It’s not clear why in the real world, the world made up of many particles, time only goes forward and entropy always increases,” Murch said. “But many people are working on that problem and I expect it will be solved in a few years,” he said.
In a world where time is symmetric, however, is there such a thing as cause and effect? To find out, Murch proposes to run a qubit experiment that would set up feedback loops (which are chains of cause and effect) and try to run them both forward and backward.
“It takes 20 or 30 minutes to run one of these experiments,” Murch said, “several weeks to process it, and a year to scratch our heads to see if we’re crazy or not.”
“At the end of the day,” he said, “I take solace in the fact that we have a real experiment and real data that we plot on real curves.”
January 23, 2015
Recent developments in science are beginning to suggest that the universe naturally produces complexity. The emergence of life in general and perhaps even rational life, with its associated technological culture, may be extremely common, argues Clemson researcher Kelly Smith in a recently published paper in the journal Space Policy.
What’s more, he suggests, this universal tendency has distinctly religious overtones and may even establish a truly universal basis for morality.
Smith, a philosopher and evolutionary biologist, applies recent theoretical developments in biology and complex systems theory to attempt new answers to the kind of enduring questions about human purpose and obligation that have long been considered the sole province of the humanities.
He points out that scientists are increasingly beginning to discuss how the basic structure of the universe seems to favor the creation of complexity. The large scale history of the universe strongly suggests a trend of increasing complexity: disordered energy states produce atoms and molecules, which combine to form suns and associated planets, on which life evolves. Life then seems to exhibit its own pattern of increasing complexity, with simple organisms getting more complex over evolutionary time until they eventually develop rationality and complex culture.
And recent theoretical developments in biology and complex systems theory suggest this trend may be real, arising from the basic structure of the universe in a predictable fashion.
“If this is right,” says Smith, “you can look at the universe as a kind of ‘complexity machine’, which raises all sorts of questions about what this means in a broader sense. For example, does believing the universe is structured to produce complexity in general, and rational creatures in particular, constitute a religious belief? It need not imply that the universe was created by a God, but on the other hand, it does suggest that the kind of rationality we hold dear is not an accident.”
And Smith feels another similarity to religion lies in the potential moral implications of this idea. If evolution tends to favor the development of sociality, reason, and culture as a kind of “package deal”, then it’s a good bet that any smart extraterrestrials we encounter will have similar evolved attitudes about their basic moral commitments.
In particular, they will likely agree with us that there is something morally special about rational, social creatures. And such universal agreement, argues Smith, could be the foundation for a truly universal system of ethics.
Smith will soon take sabbatical to lay the groundwork for a book exploring these issues in more detail.
January 8, 2015
Researchers from the University of Cambridge and the University of Plymouth have shown that follow-through – such as when swinging a golf club or tennis racket – can help us to learn two different skills at once, or to learn a single skill faster. The research provides new insight into the way tasks are learned, and could have implications for rehabilitation, such as re-learning motor skills following a stroke.
The researchers found that the particular motor memory which is active and modifiable in the brain at any given time depends on both lead-in and follow-through movement, and that skills which may otherwise interfere can be learned at the same time if their follow-through motions are unique. The research is published today (8 January) in the journal Current Biology.
While follow-through in sports such as tennis or golf cannot affect the movement of the ball after it has been hit, it does serve two important purposes: it both helps maximise velocity or force at the point of impact, and helps prevent injuries by allowing a gradual slowdown of a movement.
Now, researchers have found a third important role for follow-through: it allows distinct motor memories to be learned. In other words, by practising the same action with different follow-throughs, different motor memories can be learned for a single movement.
If a new task, whether that is serving a tennis ball or learning a passage on a musical instrument, is repeated enough times, a motor memory of that task is developed. The brain is able to store, protect and reactivate this memory, quickly instructing the muscles to perform the task so that it can be performed seemingly without thinking.
The problem with learning similar but distinct tasks is that they can ‘interfere’ with each other in the brain. For example, tennis and racquetball are both racket sports. However, the strokes for the two sports are slightly different, as topspin is great for a tennis player, but not for a racquetball player. Despite this, in theory it should be possible to learn both sports independently. However, many people find it difficult to perform at a high level in both sports, due to interference between the two strokes.
In order to determine whether we learn a separate motor memory for each task, or a single motor memory for both, the researchers examined either the presence or absence of interference by having participants learn a ‘reaching’ task in the presence of two opposite force-fields.
Participants grasped the handle of a robotic interface and made a reaching movement through an opposing force-field to a central target, followed immediately by a second unopposed follow-through movement to one of two possible final targets. The direction of the force-field was changed, representing different tasks, and the researchers were able to examine whether the tasks are learned separately, in which case there would be no interference, or whether we learn the mean of the two opposing force-fields, in which case there would be complete interference.
The researchers found that the specific motor memory which is active at any given moment depends on the movement that will be made in the near future. When a follow-through movement was made that anticipated the force-field direction, there was a substantial reduction in interference. This suggests that different follow-throughs may activate distinct motor memories, allowing us to learn two different skills without them interfering, even when the rest of the movement is identical. However, while practising a variable follow-through can activate multiple motor memories, practising a consistent follow-through allowed for tasks to be learned much faster.
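The interference logic can be illustrated with a toy error-driven learning model (a deliberate simplification, not the state-space models actually used in motor-control research; the learning rate and field strengths are arbitrary): a single shared memory averaging two opposing force-fields learns neither, while one memory per follow-through context learns both.

```python
def learn(fields, n_memories, rate=0.2, trials=400):
    """Toy error-driven learning of opposing force-fields.

    fields: per-context field strength (e.g. +1 and -1).
    With one shared memory the opposing updates cancel (interference);
    with one memory per follow-through context, each field is learned.
    Returns the mean absolute error over the last 100 trials.
    """
    memories = [0.0] * n_memories
    errors = []
    for t in range(trials):
        ctx = t % len(fields)            # contexts alternate trial by trial
        slot = ctx % n_memories          # which motor memory is active
        error = fields[ctx] - memories[slot]
        memories[slot] += rate * error   # simple delta-rule update
        errors.append(abs(error))
    return sum(errors[-100:]) / 100

shared = learn([+1.0, -1.0], n_memories=1)    # complete interference
separate = learn([+1.0, -1.0], n_memories=2)  # distinct follow-throughs
```

The shared memory is dragged back and forth and its error never shrinks, whereas the context-gated memories each converge on their own field, mirroring the reduction in interference seen when follow-throughs were distinct.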
“There is always noise in our movements, which arrives in the sensory information we receive, the planning we undertake, and the output of our motor system,” said Dr David Franklin of Cambridge’s Department of Engineering, a senior author on the research. “Because of this, every movement we make is slightly different from the last one even if we try really hard to make it exactly the same – there will always be variability within our movements and therefore within our follow-through as well.”
When practising a new skill such as a tennis stroke, we may think that we do not need to care as much about controlling the variability after we hit the ball, as it can’t actually affect the movement of the ball itself. “However, this research suggests that this variability has another very important consequence – it reduces the speed of learning of the skill that is being practised,” said Franklin.
The research may also have implications for rehabilitation, such as re-learning skills after a stroke. When trying to re-learn skills after a stroke, many patients actually exhibit a great deal of variability in their movements. “Since we have shown that learning occurs faster with consistent movements, it may therefore be important to consider methods to reduce this variability in order to improve the speed of rehabilitation,” said Dr Ian Howard of the University of Plymouth, the paper’s lead author.
The work was supported by the Wellcome Trust, Human Frontier Science Program, Plymouth University and the Royal Society.
November 5, 2014
The physics community has spent three decades searching for and finding no evidence that dark matter is made of tiny exotic particles. Case Western Reserve University theoretical physicists suggest researchers consider looking for candidates more in the ordinary realm and, well, more massive.
Dark matter is unseen matter that, combined with normal matter, could create the gravity that, among other things, prevents spinning galaxies from flying apart. Physicists calculate that dark matter comprises 27 percent of the universe; normal matter 5 percent.
Instead of WIMPs (weakly interacting massive particles) or axions (weakly interacting low-mass particles), dark matter may be made of macroscopic objects, anywhere from a few ounces to the mass of a good-sized asteroid, and probably as dense as a neutron star or the nucleus of an atom, the researchers suggest.
Physics professor Glenn Starkman and David Jacobs, who received his PhD in Physics from CWRU in May and is now a fellow at the University of Cape Town, say published observations provide guidance, limiting where to look. They lay out the possibilities in a paper at http://arxiv.org/pdf/1410.2236.pdf.
The Macros, as Starkman and Jacobs call them, would not only dwarf WIMPs and axions, but differ in an important way: they could potentially be assembled out of particles in the Standard Model of particle physics, instead of requiring new physics to explain their existence.
“We’ve been looking for WIMPs for a long time and haven’t seen them,” Starkman said. “We expected to make WIMPs in the Large Hadron Collider, and we haven’t.”
WIMPs and axions remain possible candidates for dark matter, but there’s reason to search elsewhere, the theorists argue.
“The community had kind of turned away from the idea that dark matter could be made of normal-ish stuff in the late ’80s,” Starkman said. “We ask, was that completely correct and how do we know dark matter isn’t more ordinary stuff— stuff that could be made from quarks and electrons?”
After eliminating most ordinary matter as possible candidates – including failed Jupiters, white dwarfs, neutron stars, stellar black holes, the black holes in the centers of galaxies, and neutrinos with a lot of mass – physicists turned their focus to the exotics.
Matter that was somewhere in between ordinary and exotic—relatives of neutron stars or large nuclei—was left on the table, Starkman said. “We say relatives because they probably have a considerable admixture of strange quarks, which are made in accelerators and ordinarily have extremely short lives,” he said.
Although strange quarks are highly unstable, Starkman points out that free neutrons are also unstable. Bound with stable protons in a helium nucleus, however, neutrons remain stable.
“That opens the possibility that stable strange nuclear matter was made in the early universe and dark matter is nothing more than chunks of strange nuclear matter or other bound states of quarks, or of baryons, which are themselves made of quarks,” he said. Such dark matter would fit the Standard Model.
The Macros would have to be assembled from ordinary and strange quarks or baryons before the strange quarks or baryons decay, and at a temperature above 3.5 trillion degrees Celsius, comparable to the temperature in the center of a massive supernova, Starkman and Jacobs calculated. The quarks would have to be assembled with 90 percent efficiency, leaving just 10 percent to form the protons and neutrons found in the universe today.
The limits of the possible dark matter are as follows:
- A minimum of 55 grams. If dark matter were smaller, it would have been seen in detectors in Skylab or in tracks found in sheets of mica.
- A maximum of 10²⁴ (a million billion billion) grams. Above this, the Macros would be so massive they would bend starlight, which has not been seen.
- The range of 10¹⁷ to 10²⁰ grams per square centimeter should also be eliminated from the search, the theorists say. Dark matter in that range would be massive enough for its gravitational lensing to affect individual photons from gamma-ray bursts in ways that have not been seen.
If dark matter is within this allowed range, there are reasons it hasn’t been seen.
- At a mass of 10¹⁸ grams, dark matter Macros would hit the Earth about once every billion years.
- At lower masses, they would strike the Earth more frequently but might not leave a recognizable record or observable mark.
- In the range of 10⁹ to 10¹⁸ grams, dark matter would collide with the Earth about once a year, leaving no signal in the underground dark matter detectors now in place.
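As a rough illustration (my own sketch, not from the paper), the quoted mass limits can be encoded as a simple window check, with the 55-gram floor and 10²⁴-gram ceiling taken directly from the constraints above; the helper names are hypothetical:

```python
# Numeric limits are the ones quoted in the article; names are illustrative.
MIN_MASS_G = 55.0    # below this, Skylab detectors or tracks in mica would have recorded Macros
MAX_MASS_G = 1e24    # above this, Macros would visibly bend starlight, which has not been seen

def mass_allowed(mass_g: float) -> bool:
    """True if a candidate Macro mass (in grams) lies inside the window
    that, per the article, observations have not yet ruled out."""
    return MIN_MASS_G <= mass_g <= MAX_MASS_G

for m in (10.0, 55.0, 1e18, 2e24):
    print(f"{m:.3g} g -> {'allowed' if mass_allowed(m) else 'excluded'}")
```

Note that the excluded 10¹⁷-to-10²⁰ grams-per-square-centimeter band is a separate surface-density constraint, not captured by this mass-only check.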
October 27, 2014
Making mistakes while learning can benefit memory and lead to the correct answer, but only if the guesses are close-but-no-cigar, according to new research findings from Baycrest Health Sciences.
“Making random guesses does not appear to benefit later memory for the right answer, but near-miss guesses act as stepping stones for retrieval of the correct information – and this benefit is seen in younger and older adults,” says lead investigator Andrée-Ann Cyr, a graduate student with Baycrest’s Rotman Research Institute and the Department of Psychology at the University of Toronto.
Cyr’s paper is posted online today in the Journal of Experimental Psychology: Learning, Memory, and Cognition (ahead of print publication). The study expands upon a previous paper she published in Psychology and Aging in 2012 that found that learning information the hard way by making mistakes (as opposed to just being told the correct answer) may be the best boot camp for older brains.
That paper raised eyebrows since the scientific literature has traditionally recommended that older adults avoid making mistakes – unlike their younger peers who actually benefit from them. But recent evidence from Cyr and other researchers is challenging this perspective and prompting professional educators and cognitive rehabilitation clinicians to take note.
Cyr’s latest research provides evidence that trial-and-error learning can benefit memory in both young and old when errors are meaningfully related to the right answer, and can actually harm memory when they are not.
In their latest study, 65 healthy younger adults (average age 22) and 64 healthy older adults (average age 72) learned target words (e.g., rose) based either on the semantic category a word belongs to (e.g., a flower) or on its word stem (e.g., a word that begins with the letters ‘ro’). For half of the words, participants were given the answer right away (e.g., “the answer is rose”) and for the other half, they were asked to guess at it before seeing the answer (e.g., a flower: “Is it tulip?” or ro___ : “is it rope?”).
On a later memory test, participants were shown the categories or word stems and had to come up with the right answer. The researchers wanted to know if participants would be better at remembering rose if they had made wrong guesses prior to studying it rather than seeing it right away. They found that this was only true if participants learned based on the categories (e.g., a flower). Guessing actually made memory worse when words were learned based on word stems (e.g., ro___). This was the case for both younger and older adults.
Cyr and her colleagues suggest this is because our memory organizes information based on how it is conceptually rather than lexically related to other information. For example, when you think of the word pear, your mind is more likely to jump to another fruit, such as apple, than to a word that looks similar, such as peer. Wrong guesses only add value when they have something meaningful in common with right answers. The guess tulip may be wrong, but it is still conceptually close to the right answer rose (both are flowers).
By guessing first as opposed to just reading the answer, one is thinking harder about the information and making useful connections that can help memory. Indeed, younger and older participants were more likely to remember the answer if they also remembered their wrong guesses, suggesting that these acted as stepping stones. By contrast, when guesses only have letters in common with answers, they clutter memory because one cannot link them meaningfully. The word rope is nowhere close to rose in our memory. In these situations, where your guesses are likely to be out in left field, it is best to bypass mistakes altogether.
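The conceptual-versus-lexical distinction can be caricatured in a few lines of code (a toy sketch of my own, not the study’s materials): a wrong guess “helps” only when it shares a semantic category with the target, regardless of how many letters the two words share.

```python
# Toy semantic lookup built from the article's own examples; category labels are mine.
SEMANTIC_CATEGORY = {
    "rose": "flower", "tulip": "flower",
    "pear": "fruit", "apple": "fruit",
    "rope": "object", "peer": "person",
}

def guess_helps(guess: str, target: str) -> bool:
    """A near-miss guess (same category as the target) acts as a stepping
    stone to recall; a look-alike guess (shared letters only) does not."""
    return (guess in SEMANTIC_CATEGORY
            and SEMANTIC_CATEGORY[guess] == SEMANTIC_CATEGORY.get(target))

print(guess_helps("tulip", "rose"))  # same category (flower) -> True
print(guess_helps("rope", "rose"))   # only letters in common -> False
```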
“The fact that this pattern was found for older adults as well shows that aging does not influence how we learn from mistakes,” says Cyr.
“These results have profound clinical and practical implications. They turn traditional views of best practices in memory rehabilitation for healthy seniors on their head by demonstrating that making the right kind of errors can be beneficial. They also provide great hope for lifelong learning and guidance for how seniors should study,” says Dr. Nicole Anderson, senior scientist with Baycrest’s Rotman Research Institute and senior author on the study.
The study was funded by the Canadian Institutes of Health Research.