July 3, 2015
Mass killings and school shootings in the U.S. appear to be contagious, according to a team of scientists from Arizona State University and Northeastern Illinois University.
Study author Sherry Towers, research professor in the ASU Simon A. Levin Mathematical, Computational and Modeling Sciences Center, explained, “The hallmark of contagion is observing patterns of many events that are bunched in time, rather than occurring randomly in time.”
Her team examined databases on past high-profile mass killings and school shootings in the U.S. and fit a contagion model to the data to determine if these tragedies inspired similar events in the near future.
They determined that mass killings – events with four or more deaths – and school shootings create a period of contagion that lasts an average of 13 days. Roughly 20 to 30 percent of such tragedies appear to arise from contagion.
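Towers' model is a self-excitation ("contagion") point process, in which each event temporarily raises the rate of further events. A minimal sketch of such a process is below; the parameter values only loosely echo the figures above (a baseline of roughly one event per two weeks, a 13-day excitation timescale, about a quarter of events contagious) and are illustrative assumptions, not the authors' fitted values.

```python
import math
import random

random.seed(1)

def simulate_hawkes(mu, alpha, tau, t_max):
    """Self-exciting point process, simulated by Ogata thinning: a baseline
    rate mu (events/day), plus an extra rate alpha/tau left by each event
    that decays over timescale tau days.  The extra rate integrates to
    alpha, so a fraction ~alpha of all events are triggered by earlier ones."""
    events, t, excitation = [], 0.0, 0.0
    while True:
        lam_bar = mu + excitation             # intensity only decays between
        step = random.expovariate(lam_bar)    # events, so this bounds the rate
        t += step
        if t >= t_max:
            return events
        excitation *= math.exp(-step / tau)   # decay since the last candidate
        if random.random() < (mu + excitation) / lam_bar:
            events.append(t)
            excitation += alpha / tau         # the jump left by a new event

ev = simulate_hawkes(mu=1 / 14, alpha=0.25, tau=13.0, t_max=10000)
gaps = [b - a for a, b in zip(ev, ev[1:])]
mean = sum(gaps) / len(gaps)
cv = (sum((g - mean) ** 2 for g in gaps) / len(gaps)) ** 0.5 / mean
# a memoryless (Poisson) process has an inter-event-time CV near 1;
# contagion bunches events in time, pushing the CV above 1
print(f"{len(ev)} events, inter-event-time CV = {cv:.2f}")
```

The "bunched in time" signature Towers describes is exactly this: the simulated event times cluster, rather than arriving at memoryless random intervals.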
Their paper, “Contagion in Mass Killings and School Shootings,” appears in the July 2 edition of PLOS ONE.
The analysis was inspired by actual events in Towers’ life.
“In January of 2014 I was due to have a meeting with a group of researchers at Purdue University,” she said. “That morning there was a tragic campus shooting and stabbing incident that left one student dead. I realized that there had been three other school shootings in the news in the week prior, and I wondered if it was just a statistical fluke, or if somehow through news media those events were sometimes planting unconscious ideation in vulnerable people for a short time after each event.”
The researchers noted that previous studies have shown that suicide in youths can be contagious, where one suicide in a school appears to spark the idea in other vulnerable youths to do the same.
“It occurred to us that mass killings and school shootings that attract attention in the national news media can potentially do the same thing, but at a larger scale,” Towers said. “While we can never determine which particular shootings were inspired by unconscious ideation, this analysis helps us understand aspects of the complex dynamics that can underlie these events.”
On average, mass killings involving firearms occur approximately every two weeks in the U.S., and school shootings occur on average monthly. The team found that the incidence of these tragedies is significantly higher in states with a high prevalence of firearm ownership.
July 3, 2015
Detailed tabletop experiments are helping researchers understand how Earth’s landscapes erode to form networks of hills and valleys. The findings, which highlight a balance between processes that send sediments down hills and those that wash them out of valleys, might also help researchers predict how climate change could transform landscapes in the future.

Kristin Sweeney and colleagues developed a laboratory device that mimicked the processes that smooth or disturb soil to make hillslopes, and those that cut it away to make valleys. To achieve the former effect, they bombarded their artificial landscape with large, energetic water droplets; to achieve the latter, they misted the landscape.

Results reveal that hillslope processes involving the transport of sand down a hill, such as when dirt is knocked loose by an animal, play a key role in the transformation of landscapes. In a series of experiments, the researchers show that larger drops of simulated rainfall are associated with smoother, wider valleys, and that mist tends to form denser networks with hills located closer together.

Taken together, their findings lend support to a popular theory of landscape evolution, suggesting that the scale of erosion depends on the balance between tumbling sediments and the runoff processes that carve out rivers and valleys. Since this balance is altered by changes in climate and land use, the researchers’ methods might offer a new way to study landscape evolution in the face of such changes. A Perspective article by Scott McCoy discusses the experiments in greater detail.
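The balance the study probes is often written as a standard landscape-evolution equation: uplift plus hillslope diffusion (splash and creep, strengthened by large drops) minus stream-power incision (runoff from mist). The 1-D sketch below is a generic textbook version of that balance with assumed parameter values, not a model of the authors' apparatus; it shows the expected trend that stronger diffusion yields lower, smoother relief.

```python
def evolve(dx=1.0, dt=0.2, steps=10000, n=50, D=0.5, K=0.05, uplift=0.01):
    """1-D profile from a divide (x=0) to fixed base level (x=n*dx).
    Uplift raises the surface, diffusion D smooths it (soil creep, drop
    splash), and stream-power incision K*sqrt(area)*slope cuts it (runoff)."""
    z = [0.0] * n
    for _ in range(steps):
        zn = z[:]
        for i in range(1, n - 1):
            curv = (z[i - 1] - 2 * z[i] + z[i + 1]) / dx ** 2
            slope = abs(z[i + 1] - z[i - 1]) / (2 * dx)
            area = i * dx                       # crude drainage-area proxy
            zn[i] = z[i] + dt * (uplift + D * curv - K * area ** 0.5 * slope)
        zn[0] = zn[1]        # no-flux boundary at the divide
        zn[-1] = 0.0         # fixed base level
        z = zn
    return z

smooth = evolve(D=2.0)   # strong diffusion: the 'large energetic drops' regime
rough = evolve(D=0.2)    # weak diffusion: incision by runoff dominates
print(f"relief with strong diffusion: {max(smooth):.2f}, "
      f"with weak diffusion: {max(rough):.2f}")
```

In this toy balance, raising the diffusivity flattens the divide region while the incised channel slopes stay set by uplift and runoff, mirroring the smoother, wider valleys seen with larger simulated raindrops.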
June 24, 2015
Moving closer to the possibility of “materials that compute” and wearing your computer on your sleeve, researchers at the University of Pittsburgh Swanson School of Engineering have designed a responsive hybrid material that is fueled by an oscillatory chemical reaction and can perform computations based on changes in the environment or movement, and potentially even respond to human vital signs. The material system is sufficiently small and flexible that it could ultimately be integrated into a fabric or introduced as an inset into a shoe.
Anna C. Balazs, Ph.D., distinguished professor of chemical and petroleum engineering, and Steven P. Levitan, Ph.D., John A. Jurenko professor of electrical and computer engineering, integrated models for self-oscillating polymer gels and piezoelectric micro-electro-mechanical systems to devise a new reactive material system capable of performing computations without external energy inputs, amplification or computer mediation.
Their research, “Achieving synchronization with active hybrid materials: Coupling self-oscillating gels and piezoelectric (PZ) films,” appeared June 24th in the journal Scientific Reports, published by Nature (DOI: 10.1038/srep11577). The studies combine Balazs’ research in Belousov-Zhabotinsky (BZ) gels, a substance that oscillates in the absence of external stimuli, and Levitan’s expertise in computational modeling and oscillator-based computing systems. By working with Dr. Victor V. Yashin, research assistant professor of chemical and petroleum engineering and lead author on the paper, the researchers developed design rules for creating a hybrid “BZ-PZ” material.
“The BZ reaction drives the periodic oxidation and reduction of a metal catalyst that is anchored to the gel; this, in turn, makes the gel swell and shrink. We put a thin piezoelectric (PZ) cantilever over the gel so that when the PZ is bent by the oscillating gel, it generates an electric potential (voltage). Conversely, an electric potential applied to the PZ cantilever causes it to bend,” said Balazs. “So, when a single BZ-PZ unit is wired to another such unit, the expansion of the oscillating BZ gel in the first unit deflects the piezoelectric cantilever, which produces an electrical voltage. The generated voltage in turn causes a deflection of the cantilever in the second unit; this deflection imposes a force on the underlying BZ gel that modifies its oscillations. The resulting ‘see-saw-like’ oscillation permits communication and an exchange of information between the units.”
Multiple BZ-PZ units can be connected in series or in parallel, allowing more complicated patterns of oscillation to be generated and stored in the system. In effect, these different oscillatory patterns form a type of “memory,” allowing the material to be used for computation. Levitan adds, however, that the computations would not be general purpose, but rather specific to pattern matching and recognition, or other non-Boolean operations.
“Imagine a group of organ pipes, and each is a different chord. When you introduce a new chord, one resonates with that particular pattern,” Levitan said. “Similarly, let’s say you have an array of oscillators and they each have an oscillating pattern. Each set of oscillators would reflect a particular pattern. Then you introduce a new external input pattern, say from a touch or a heartbeat. The materials themselves recognize the pattern and respond accordingly, thereby performing the actual computing.”
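Levitan's organ-pipe picture can be caricatured with coupled phase oscillators. The sketch below is a Kuramoto-style toy, an analogy only and not the BZ-PZ dynamics: when an input's frequencies match the stored pattern, the array locks into synchrony (it "resonates"); a mismatched input leaves it incoherent.

```python
import math

def order_parameter(phases):
    """Kuramoto order parameter r in [0, 1]: r near 1 means the array
    has locked onto a common rhythm."""
    n = len(phases)
    re = sum(math.cos(p) for p in phases) / n
    im = sum(math.sin(p) for p in phases) / n
    return math.hypot(re, im)

def drive(freqs, coupling=1.0, dt=0.01, steps=5000):
    """Mean-field coupled oscillators:
    dtheta_i/dt = w_i + K * mean_j sin(theta_j - theta_i)."""
    phases = [0.1 * i for i in range(len(freqs))]
    n = len(freqs)
    for _ in range(steps):
        re = sum(math.cos(p) for p in phases) / n
        im = sum(math.sin(p) for p in phases) / n
        phases = [p + dt * (w + coupling * (im * math.cos(p) - re * math.sin(p)))
                  for w, p in zip(freqs, phases)]
    return order_parameter(phases)

r_match = drive([1.0] * 8)                        # input matching the pattern
r_mismatch = drive([float(i) for i in range(8)])  # mismatched frequencies
print(f"matched r = {r_match:.2f}, mismatched r = {r_mismatch:.2f}")
```

The degree of synchrony is the "recognition" signal: the matched input drives the order parameter toward 1, while the mismatched one cannot.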
Developing so-called “materials that compute” addresses limitations inherent to the systems currently used by researchers to perform either chemical computing or oscillator-based computing. Chemical computing systems are limited by both the lack of an internal power system and the rate of diffusion as the chemical waves spread throughout the system, enabling only local coupling. Further, oscillator-based computing has not been translated into a potentially wearable material. The hybrid BZ-PZ model, which has never been proposed previously, solves these problems and points to the potential of designing synthetic material systems that are self-powered.
Balazs and Levitan note that the current BZ-PZ gel model oscillates with periods of tens of seconds, which would allow for simple non-Boolean operations or recognition of patterns such as human movement. The next step for Drs. Balazs and Levitan is to add an input layer for the pattern recognition, something that has been accomplished in other technologies but will be applied to self-oscillating gels and piezoelectric films for the first time.
June 15, 2015
Humans are unlikely to be the only animal capable of self-awareness, a new study has shown.
Conducted by University of Warwick researchers, the study found that humans and other animals capable of mentally simulating environments require at least a primitive sense of self. The finding suggests that any animal that can simulate environments must have a form of self-awareness.
Self-awareness is often viewed as one of man’s defining characteristics, but the study strongly suggests it is not unique to mankind and is instead likely to be common among animals.
The researchers, from the University of Warwick’s Departments of Psychology and Philosophy, used thought experiments to discover which capabilities animals must have in order to mentally simulate their environment.
Commenting on the research Professor Thomas Hills, study co-author from Warwick’s Department of Psychology, said: “The study’s key insight is that those animals capable of simulating their future actions must be able to distinguish between their imagined actions and those that are actually experienced.”
The researchers were inspired by work conducted in the 1950s on maze navigation in rats. It was observed that rats, at points in the maze that required them to make decisions on what they would do next, often stopped and appeared to deliberate over their future actions.
Recent neuroscience research found that at these ‘choice points’ rats and other vertebrates activate regions of their hippocampus that appear to simulate choices and their potential outcomes.
Professor Hills and Professor Stephen Butterfill, from Warwick’s Department of Philosophy, created different descriptive models to explain the process behind the rat’s deliberation at the ‘choice points’.
One model, the Naive Model, assumed that animals inhibit action during simulation. However, this model created false memories because the animal would be unable to tell the difference between real and imagined actions.
A second, the Self-actuating Model, was able to solve this problem by ‘tagging’ real versus imagined experience. Hills and Butterfill called this tagging the ‘primal self’.
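The difference between the two models can be made concrete with a toy memory store. This is an illustration of the tagging idea only, not the authors' formal models: without a tag, a simulated episode is indistinguishable from a real one and becomes a false memory.

```python
from dataclasses import dataclass

@dataclass
class Experience:
    action: str
    outcome: str
    simulated: bool   # the 'tag' of the Self-actuating Model

class Agent:
    def __init__(self, tagging):
        self.tagging = tagging
        self.memory = []

    def simulate(self, action, predicted_outcome):
        # Mental simulation at a choice point.  In the Naive Model
        # (tagging=False) the episode is stored untagged, so it later
        # reads as something that really happened: a false memory.
        self.memory.append(Experience(action, predicted_outcome,
                                      simulated=self.tagging))

    def act(self, action, outcome):
        self.memory.append(Experience(action, outcome, simulated=False))

    def real_history(self):
        return [e for e in self.memory if not e.simulated]

naive = Agent(tagging=False)
naive.simulate("turn left", "food")   # imagined only
naive.act("turn right", "no food")    # what actually happened
tagged = Agent(tagging=True)
tagged.simulate("turn left", "food")
tagged.act("turn right", "no food")
print(len(naive.real_history()), len(tagged.real_history()))
```

The naive agent's "real" history contains two episodes, one of them imagined; the tagging agent, which maintains the primal self's distinction between knower and known, correctly recalls only one.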
Commenting on the finding, Professor Hills said: “The study answers a very old question: do animals have a sense of self? Our first aim was to understand the recent neural evidence that animals can project themselves into the future. What we wound up understanding is that, in order to do so, they must have a primal sense of self.”
“As such, humans must not be the only animal capable of self-awareness. Indeed, the answer we are led to is that anything, even robots, that can adaptively imagine themselves doing what they have not yet done, must be able to separate the knower from the known.”
The study, “From foraging to autonoetic consciousness: The primal self as a consequence of embodied prospective foraging,” is published in Current Zoology.
June 3, 2015
An international team of scientists led by Cardiff University researchers has provided the strongest evidence yet of what causes schizophrenia – a condition that affects around 1% of the global population.
Published in the journal Neuron, their work presents strong evidence that disruption of a delicate chemical balance in the brain is heavily implicated in the disorder.
In the largest ever study of its kind, the team found that disease-linked mutations disrupt specific sets of genes contributing to excitatory and inhibitory signalling, the balance of which plays a crucial role in healthy brain development and function.
The breakthrough builds on two landmark studies led by members of the Cardiff University team, published last year in the journal Nature.
“We’re finally starting to understand what goes wrong in schizophrenia,” says lead author Dr Andrew Pocklington from Cardiff University’s MRC Centre for Neuropsychiatric Genetics and Genomics.
“Our study marks a significant step towards understanding the biology underpinning schizophrenia, which is an incredibly complex condition and has up until very recently kept scientists largely mystified as to its origins.
“We now have what we hope is a pretty sizeable piece of the jigsaw puzzle that will help us develop a coherent model of the disease, while helping us to rule out some of the alternatives.
“A reliable model of disease is urgently needed to direct future efforts in developing new treatments, which haven’t really improved a great deal since the 1970s.”
Professor Hugh Perry, who chairs the Medical Research Council Neuroscience and Mental Health Board said: “This work builds on our understanding of the genetic causes of schizophrenia – unravelling how a combination of genetic faults can disrupt the chemical balance of the brain.
“Scientists in the UK, as part of an international consortium, are uncovering the genetic causes of a range of mental health issues, such as schizophrenia.
“In the future, this work could lead to new ways of predicting an individual’s risk of developing schizophrenia and form the basis of new targeted treatments that are based on an individual’s genetic makeup.”
A healthy brain is able to function properly thanks to a precise balance between chemical signals that excite and inhibit nerve cell activity. Researchers studying psychiatric disorders have previously suspected that disruption of this balance contributes to schizophrenia.
The first evidence that schizophrenia mutations interfere with excitatory signalling was uncovered in 2011 by the same team, based at Cardiff University’s MRC Centre for Neuropsychiatric Genetics and Genomics.
This paper not only confirms their previous findings, but also provides the first strong genetic evidence that disruption of inhibitory signalling contributes to the disorder.
To reach their conclusions scientists compared the genetic data of 11,355 patients with schizophrenia against a control group of 16,416 people without the condition.
They looked for types of mutation known as copy number variants (CNVs), mutations in which large stretches of DNA are either deleted or duplicated.
Comparing the CNVs found in people with schizophrenia to those found in unaffected people, the team was able to show that the mutations in individuals with the disorder tended to disrupt genes involved in specific aspects of brain function.
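Case-control comparisons of this kind often reduce to a 2×2 burden test: carriers of a CNV disrupting the gene set versus non-carriers, in cases versus controls. The sketch below uses made-up counts for illustration (the study's actual cohorts were 11,355 cases and 16,416 controls) and a one-sided Fisher exact test built from the hypergeometric distribution.

```python
from math import comb

def fisher_exact_greater(a, b, c, d):
    """One-sided Fisher exact test for the 2x2 table [[a, b], [c, d]]:
    probability of observing >= a carriers among cases if carrier status
    were independent of disease status (hypergeometric null)."""
    n = a + b + c + d
    cases, carriers = a + b, a + c
    denom = comb(n, carriers)
    return sum(comb(cases, x) * comb(n - cases, carriers - x)
               for x in range(a, min(cases, carriers) + 1)) / denom

# hypothetical counts: 30 of 500 cases vs 15 of 700 controls carry a CNV
# disrupting the gene set (illustrative numbers, not the study's data)
p_enriched = fisher_exact_greater(30, 470, 15, 685)
# a matched-rate table for comparison: ~2% carriers in both groups
p_null = fisher_exact_greater(10, 490, 14, 686)
print(f"enriched: p = {p_enriched:.2e}, matched rates: p = {p_null:.2f}")
```

An excess of gene-set hits among cases gives a small p-value; when carrier rates match, the test finds nothing, which is the sense in which the mutations "tended to disrupt" specific gene sets only in the affected group.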
The disease-causing effects of CNVs are also suspected to be involved in other neurodevelopmental disorders such as intellectual disability, Autism Spectrum Disorder and ADHD.
Around 635,000 people in the UK will at some stage in their lives be affected by schizophrenia.
The estimated cost of schizophrenia and psychosis to society is around £11.8 billion a year.
The symptoms of schizophrenia can be extremely disruptive, and have a large impact on a person’s ability to carry out everyday tasks, such as going to work, maintaining relationships and caring for themselves or others.
April 22, 2015
How is lightning initiated in thunderclouds? This is difficult to answer – how do you measure electric fields inside large, dangerously charged clouds? It was discovered, more or less by coincidence, that cosmic rays provide suitable probes to measure electric fields within thunderclouds. This surprising finding is published in Physical Review Letters on April 24th. The measurements were performed with the LOFAR radio telescope located in the Netherlands.
‘We used to throw away LOFAR measurements taken during thunderstorms. They were too messy.’ says astronomer Pim Schellart. ‘Well, we didn’t actually throw them away of course, we just didn’t analyze them.’ Schellart, who completed his PhD in March this year at Radboud University in Nijmegen and is supervised by Prof. Heino Falcke, is interested in cosmic rays. These high-energy particles, originating from exploding stars and other astrophysical sources, continuously bombard Earth from space.
High in the atmosphere these particles strike atmospheric molecules and create ‘showers’ of elementary particles. These showers can also be measured from the radio emission that is generated when their constituent particles are deflected by the magnetic field of the Earth. The radio emission also gives information about the original particles. These measurements are routinely conducted with LOFAR at ASTRON in Dwingeloo, but not during thunderstorms.
That changed when the data were examined in a collaborative effort with astrophysicist Gia Trinh, Prof. Olaf Scholten from the University of Groningen and lightning expert Ute Ebert from the Centrum Wiskunde & Informatica in Amsterdam.
‘We modeled how the electric field in thunderstorms can explain the different measurements. This worked very well. How the radio emission changes gives us a lot of information about the electric fields in thunderstorms. We could even determine the strength of the electric field at a certain height in the cloud.’ says Schellart.
This field can be as strong as 50 kV/m. This translates into a voltage of hundreds of millions of volts over a distance of multiple kilometers: a thundercloud contains enormous amounts of energy.
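The translation from field strength to voltage is simple arithmetic: field times distance. The vertical extent used below is an assumed, illustrative figure; the paper reports the field strength, not this thickness.

```python
field = 50e3        # V/m: peak electric field inferred from the cosmic-ray data
thickness = 4e3     # m: assumed vertical extent of the charged region (illustrative)
voltage = field * thickness
print(f"{voltage:.0e} V")   # 2e+08 V, i.e. two hundred million volts
```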
Lightning is a highly unpredictable natural phenomenon that inflicts damage on infrastructure and claims victims around the world. This new method of measuring electric fields in thunderclouds will contribute to a better understanding and ultimately better predictions of lightning activity. Current measurement methods using planes, balloons or small rockets are dangerous and too localized. Most importantly, the presence of the measurement equipment influences the measurements. Cosmic rays probe thunderclouds from top to bottom. Moving at almost the speed of light, they provide a near-instantaneous ‘picture’ of the electric fields in the cloud. Moreover, they are created by nature and are freely available.
‘This research is an exemplary form of interdisciplinary collaboration between astronomers, particle physicists and geophysicists’, says Heino Falcke. ‘We hope to develop the model further to ultimately answer the question: how is lightning initiated within thunderclouds?’
February 14, 2015
We’re so used to murder mysteries that we don’t even notice how mystery authors play with time. Typically the murder occurs well before the midpoint of the book, but there is an information blackout at that point, and the reader learns what happened then only on the last page.
If the last page were ripped out of the book, asked physicist Kater Murch, PhD, would the reader be better off guessing what happened by reading only up to the fatal incident, or by reading the entire book?
The answer, so obvious in the case of the murder mystery, is less so in the world of quantum mechanics, where indeterminacy is fundamental rather than contrived for our reading pleasure.
Even if you know everything quantum mechanics can tell you about a quantum particle, said Murch, an assistant professor of physics in Arts & Sciences at Washington University in St. Louis, you cannot predict with certainty the outcome of a simple experiment to measure its state. All quantum mechanics can offer are statistical probabilities for the possible results.
The orthodox view is that this indeterminacy is not a defect of the theory, but rather a fact of nature. The particle’s state is not merely unknown, but truly undefined before it is measured. The act of measurement itself forces the particle to collapse to a definite state.
In the Feb. 13 issue of Physical Review Letters, Kater Murch describes a way to narrow the odds. By combining information about a quantum system’s evolution after a target time with information about its evolution up to that time, his lab was able to narrow the odds of correctly guessing the state of the two-state system from 50-50 to 90-10.
It’s as if what we did today changed what we did yesterday. And as this analogy suggests, the experimental results have spooky implications for time and causality, at least in the microscopic world to which quantum mechanics applies.
Measuring a phantom
Until recently physicists could explore the quantum mechanical properties of single particles only through thought experiments, because any attempt to observe them directly caused them to shed their mysterious quantum properties.
But in the 1980s and 1990s physicists invented devices that allowed them to measure these fragile quantum systems so gently that they don’t immediately collapse to a definite state.
The device Murch uses to explore quantum space is a simple superconducting circuit that enters quantum space when it is cooled to near absolute zero. Murch’s team uses the bottom two energy levels of this qubit, the ground state and an excited state, as their model quantum system. Between these two states, there are an infinite number of quantum states that are superpositions, or combinations, of the ground and excited states.
The quantum state of the circuit is detected by putting it inside a microwave box. A few microwave photons are sent into the box, where their quantum fields interact with the superconducting circuit. So when the photons exit the box they bear information about the quantum system.
Crucially, these “weak,” off-resonance measurements do not disturb the qubit, unlike “strong” measurements with photons that are resonant with the energy difference between the two states, which knock the circuit into one or the other state.
A quantum guessing game
In Physical Review Letters, Murch describes a quantum guessing game played with the qubit.
“We start each run by putting the qubit in a superposition of the two states,” he said. “Then we do a strong measurement but hide the result, continuing to follow the system with weak measurements.”
They then try to guess the hidden result, which is their version of the missing page of the murder mystery.
“Calculating forward, using the Born equation that expresses the probability of finding the system in a particular state, your odds of guessing right are only 50-50,” Murch said. “But you can also calculate backward using something called an effect matrix. Just take all the equations and flip them around. They still work and you can just run the trajectory backward.
“So there’s a backward-going trajectory and a forward-going trajectory, and if we look at them both together and weight the information in both equally, we get something we call a hindsight prediction, or ‘retrodiction’.”
The shattering thing about the retrodiction is that it is 90 percent accurate. When the physicists check it against the stored measurement of the system’s earlier state it is right nine times out of 10.
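The gain from combining forward- and backward-going information can be felt in a purely classical caricature; this is an analogy to forward-backward smoothing, not the quantum effect-matrix calculation. We guess a hidden bit (the "missing page") from noisy weak readings, first using only readings taken before it, then using readings taken both before and after.

```python
import random

random.seed(0)

def weak_readout(state, p_correct=0.6):
    """A 'weak' look at the hidden state: right only 60% of the time."""
    return state if random.random() < p_correct else 1 - state

def majority(readings):
    return int(sum(readings) * 2 > len(readings))

def trial(n_before=3, n_after=4):
    state = random.randint(0, 1)            # the hidden result
    before = [weak_readout(state) for _ in range(n_before)]
    after = [weak_readout(state) for _ in range(n_after)]
    fwd = majority(before)                  # forward-only prediction
    both = majority(before + after)         # 'retrodiction': past + future data
    return fwd == state, both == state

results = [trial() for _ in range(20000)]
acc_fwd = sum(f for f, _ in results) / len(results)
acc_both = sum(b for _, b in results) / len(results)
print(f"forward only: {acc_fwd:.3f}, forward+backward: {acc_both:.3f}")
```

The numbers here (roughly 65% versus 71%) are far from the experiment's 50-50 versus 90-10, but the structure is the same: data gathered after the hidden event sharpens the guess about it.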
Down the rabbit hole
The quantum guessing game suggests ways to make both quantum computing and the quantum control of open systems, such as chemical reactions, more robust. But it also has implications for much deeper problems in physics.
For one thing, it suggests that in the quantum world time runs both backward and forward whereas in the classical world it only runs forward.
“I always thought the measurement would resolve the time symmetry in quantum mechanics,” Murch said. “If we measure a particle in a superposition of states and it collapses into one of two states, well, that sounds like a process that goes forward in time.”
But in the quantum guessing experiment, time symmetry has returned. The improved odds imply the measured quantum state somehow incorporates information from the future as well as the past. And that implies that time, notoriously an arrow in the classical world, is a double-headed arrow in the quantum world.
“It’s not clear why in the real world, the world made up of many particles, time only goes forward and entropy always increases,” Murch said. “But many people are working on that problem and I expect it will be solved in a few years,” he said.
In a world where time is symmetric, however, is there such a thing as cause and effect? To find out, Murch proposes to run a qubit experiment that would set up feedback loops (which are chains of cause and effect) and try to run them both forward and backward.
“It takes 20 or 30 minutes to run one of these experiments,” Murch said, “several weeks to process it, and a year to scratch our heads to see if we’re crazy or not.”
“At the end of the day,” he said, “I take solace in the fact that we have a real experiment and real data that we plot on real curves.”
January 23, 2015
Recent developments in science are beginning to suggest that the universe naturally produces complexity. The emergence of life in general and perhaps even rational life, with its associated technological culture, may be extremely common, argues Clemson researcher Kelly Smith in a recently published paper in the journal Space Policy.
What’s more, he suggests, this universal tendency has distinctly religious overtones and may even establish a truly universal basis for morality.
Smith, a philosopher and evolutionary biologist, applies recent theoretical developments in biology and complex systems theory to attempt new answers to the kinds of enduring questions about human purpose and obligation that have long been considered the sole province of the humanities.
He points out that scientists are increasingly beginning to discuss how the basic structure of the universe seems to favor the creation of complexity. The large scale history of the universe strongly suggests a trend of increasing complexity: disordered energy states produce atoms and molecules, which combine to form suns and associated planets, on which life evolves. Life then seems to exhibit its own pattern of increasing complexity, with simple organisms getting more complex over evolutionary time until they eventually develop rationality and complex culture.
And recent theoretical developments in biology and complex systems theory suggest this trend may be real, arising from the basic structure of the universe in a predictable fashion.
“If this is right,” says Smith, “you can look at the universe as a kind of ‘complexity machine’, which raises all sorts of questions about what this means in a broader sense. For example, does believing the universe is structured to produce complexity in general, and rational creatures in particular, constitute a religious belief? It need not imply that the universe was created by a God, but on the other hand, it does suggest that the kind of rationality we hold dear is not an accident.”
Smith sees another similarity to religion in the idea’s potential moral implications. If evolution tends to favor the development of sociality, reason, and culture as a kind of “package deal”, then it’s a good bet that any smart extraterrestrials we encounter will have similar evolved attitudes about their basic moral commitments.
In particular, they will likely agree with us that there is something morally special about rational, social creatures. And such universal agreement, argues Smith, could be the foundation for a truly universal system of ethics.
Smith will soon take sabbatical to lay the groundwork for a book exploring these issues in more detail.
January 8, 2015
Researchers from the University of Cambridge and the University of Plymouth have shown that follow-through – such as when swinging a golf club or tennis racket – can help us to learn two different skills at once, or to learn a single skill faster. The research provides new insight into the way tasks are learned, and could have implications for rehabilitation, such as re-learning motor skills following a stroke.
The researchers found that the particular motor memory which is active and modifiable in the brain at any given time depends on both lead-in and follow-through movement, and that skills which may otherwise interfere can be learned at the same time if their follow-through motions are unique. The research is published today (8 January) in the journal Current Biology.
While follow-through in sports such as tennis or golf cannot affect the movement of the ball after it has been hit, it does serve two important purposes: it both helps maximise velocity or force at the point of impact, and helps prevent injuries by allowing a gradual slowdown of a movement.
Now, researchers have found a third important role for follow-through: it allows distinct motor memories to be learned. In other words, by practising the same action with different follow-throughs, different motor memories can be learned for a single movement.
If a new task, whether that is serving a tennis ball or learning a passage on a musical instrument, is repeated enough times, a motor memory of that task is developed. The brain is able to store, protect and reactivate this memory, quickly instructing the muscles to perform the task so that it can be performed seemingly without thinking.
The problem with learning similar but distinct tasks is that they can ‘interfere’ with each other in the brain. For example, tennis and racquetball are both racket sports. However, the strokes for the two sports are slightly different, as topspin is great for a tennis player, but not for a racquetball player. Despite this, in theory it should be possible to learn both sports independently. However, many people find it difficult to perform at a high level in both sports, due to interference between the two strokes.
In order to determine whether we learn a separate motor memory for each task, or a single motor memory for both, the researchers examined either the presence or absence of interference by having participants learn a ‘reaching’ task in the presence of two opposite force-fields.
Participants grasped the handle of a robotic interface and made a reaching movement through an opposing force-field to a central target, followed immediately by a second unopposed follow-through movement to one of two possible final targets. The direction of the force-field was changed, representing different tasks, and the researchers were able to examine whether the tasks are learned separately, in which case there would be no interference, or whether we learn the mean of the two opposing force-fields, in which case there would be complete interference.
The researchers found that the specific motor memory which is active at any given moment depends on the movement that will be made in the near future. When a follow-through movement was made that anticipated the force-field direction, there was a substantial reduction in interference. This suggests that different follow-throughs may activate distinct motor memories, allowing us to learn two different skills without them interfering, even when the rest of the movement is identical. However, while practising a variable follow-through can activate multiple motor memories, practising a consistent follow-through allowed for tasks to be learned much faster.
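The logic of that result can be caricatured with a toy error-driven learning rule; this is an illustration, not the authors' analysis. A single memory forced to fit two opposing force-fields averages them and keeps failing (complete interference), while memories keyed to a distinct follow-through context each converge to their own field.

```python
def train(fields, contexts=None, eta=0.2):
    """Error-driven adaptation to a sequence of force-fields (+1 or -1).
    With contexts=None one memory must fit every field; otherwise each
    distinct context (a distinct follow-through) keys its own memory."""
    memory, errors = {}, []
    for i, field in enumerate(fields):
        key = contexts[i] if contexts else "shared"
        m = memory.get(key, 0.0)
        error = field - m               # the force the learner failed to predict
        errors.append(abs(error))
        memory[key] = m + eta * error   # delta-rule update
    return memory, errors

fields = [+1.0, -1.0] * 100                         # alternating opposing fields
shared, err_shared = train(fields)                  # same follow-through for both
keyed, err_keyed = train(fields, ["A", "B"] * 100)  # distinct follow-throughs
print(f"shared-memory final error {err_shared[-1]:.2f}, "
      f"keyed final error {err_keyed[-1]:.4f}")
```

The shared memory settles into an oscillation around the mean of the two fields and never stops making large errors, while the context-keyed memories learn both fields essentially perfectly, echoing the reduction in interference seen with anticipatory follow-throughs.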
“There is always noise in our movements, which arrives in the sensory information we receive, the planning we undertake, and the output of our motor system,” said Dr David Franklin of Cambridge’s Department of Engineering, a senior author on the research. “Because of this, every movement we make is slightly different from the last one even if we try really hard to make it exactly the same – there will always be variability within our movements and therefore within our follow-through as well.”
When practicing a new skill such as a tennis stroke, we may think that we do not need to care as much about controlling the variability after we hit the ball as it can’t actually affect the movement of the ball itself. “However this research suggests that this variability has another very important point – that it reduces the speed of learning of the skill that is being practiced,” said Franklin.
The research may also have implications for rehabilitation, such as re-learning skills after a stroke. When trying to re-learn skills after a stroke, many patients actually exhibit a great deal of variability in their movements. “Since we have shown that learning occurs faster with consistent movements, it may therefore be important to consider methods to reduce this variability in order to improve the speed of rehabilitation,” said Dr Ian Howard of the University of Plymouth, the paper’s lead author.
The work was supported by the Wellcome Trust, Human Frontier Science Program, Plymouth University and the Royal Society.
November 5, 2014
The physics community has spent three decades searching for and finding no evidence that dark matter is made of tiny exotic particles. Case Western Reserve University theoretical physicists suggest researchers consider looking for candidates more in the ordinary realm and, well, more massive.
Dark matter is unseen matter that, combined with normal matter, could create the gravity that, among other things, prevents spinning galaxies from flying apart. Physicists calculate that dark matter comprises 27 percent of the universe, and normal matter just 5 percent.
Instead of WIMPs (weakly interacting massive particles) or axions (weakly interacting low-mass particles), dark matter may be made of macroscopic objects, anywhere from a few ounces to the size of a good asteroid, and probably as dense as a neutron star or the nucleus of an atom, the researchers suggest.
Physics professor Glenn Starkman and David Jacobs, who received his PhD in Physics from CWRU in May and is now a fellow at the University of Cape Town, say published observations provide guidance, limiting where to look. They lay out the possibilities in a paper at http://arxiv.org/pdf/1410.2236.pdf.
The Macros, as Starkman and Jacobs call them, would not only dwarf WIMPs and axions, but differ in an important way. They could potentially be assembled out of particles in the Standard Model of particle physics instead of requiring new physics to explain their existence.
“We’ve been looking for WIMPs for a long time and haven’t seen them,” Starkman said. “We expected to make WIMPs in the Large Hadron Collider, and we haven’t.”
WIMPs and axions remain possible candidates for dark matter, but there’s reason to search elsewhere, the theorists argue.
“The community had kind of turned away from the idea that dark matter could be made of normal-ish stuff in the late ’80s,” Starkman said. “We ask, was that completely correct and how do we know dark matter isn’t more ordinary stuff— stuff that could be made from quarks and electrons?”
After eliminating most ordinary matter as possible candidates, including failed Jupiters, white dwarfs, neutron stars, stellar black holes, the black holes at the centers of galaxies, and massive neutrinos, physicists turned their focus to the exotics.
Matter that was somewhere in between ordinary and exotic—relatives of neutron stars or large nuclei—was left on the table, Starkman said. “We say relatives because they probably have a considerable admixture of strange quarks, which are made in accelerators and ordinarily have extremely short lives,” he said.
Although strange quarks are highly unstable, Starkman points out that neutrons are also highly unstable. But in helium, bound with stable protons, neutrons remain stable.
“That opens the possibility that stable strange nuclear matter was made in the early universe and dark matter is nothing more than chunks of strange nuclear matter or other bound states of quarks, or of baryons, which are themselves made of quarks,” he said. Such dark matter would fit the Standard Model.
The Macros would have to be assembled from ordinary and strange quarks or baryons before the strange quarks or baryons decay, and at a temperature above 3.5 trillion degrees Celsius, comparable to the temperature in the center of a massive supernova, Starkman and Jacobs calculated. The quarks would have to be assembled with 90 percent efficiency, leaving just 10 percent to form the protons and neutrons found in the universe today.
The limits of the possible dark matter are as follows:
- A minimum of 55 grams. If dark matter were any less massive, it would have been seen in detectors on Skylab or in tracks found in sheets of mica.
- A maximum of 10^24 (a million billion billion) grams. Above this, the Macros would be so massive they would bend starlight, which has not been seen.
- The range of 10^17 to 10^20 grams per square centimeter should also be eliminated from the search, the theorists say. Dark matter in that range would be massive enough for its gravitational lensing to affect individual photons from gamma-ray bursts in ways that have not been seen.
If dark matter is within this allowed range, there are reasons it hasn’t been seen.
- At the mass of 10^18 grams, dark matter Macros would hit the Earth about once every billion years.
- At lower masses, they would strike the Earth more frequently but might not leave a recognizable record or observable mark.
- In the range of 10^9 to 10^18 grams, dark matter would collide with the Earth at most once a year, leaving nothing for the underground dark matter detectors now in place to find.
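The inverse relationship between Macro mass and impact frequency implied by the figures above can be sketched numerically. This is a hypothetical illustration, not a calculation from the paper: it assumes a fixed local dark-matter mass density, so the number flux of Macros, and hence the Earth-impact rate, scales as 1/M, anchored to the article's benchmark of roughly one impact per billion years at 10^18 grams.

```python
# Assumed benchmark, taken from the article's figures above:
# ~one Earth impact per billion years for a 10^18-gram Macro.
BENCHMARK_MASS_G = 1e18          # grams
BENCHMARK_INTERVAL_YR = 1e9      # years between impacts at that mass

def impact_interval_years(mass_g):
    """Expected years between Earth impacts for a Macro of the given mass,
    assuming a fixed dark-matter mass density (number flux scales as 1/M)."""
    return BENCHMARK_INTERVAL_YR * (mass_g / BENCHMARK_MASS_G)

for m in (1e9, 1e12, 1e15, 1e18):
    print(f"{m:.0e} g -> one impact every {impact_interval_years(m):.0e} years")
```

Under this scaling, a 10^9-gram Macro would arrive about once a year, which is why the low end of the allowed range still produces almost no chance of a detector encounter.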