February 14, 2015
We’re so used to murder mysteries that we scarcely notice how mystery authors play with time. Typically the murder occurs well before the midpoint of the book, but an information blackout follows, and the reader learns what actually happened only on the last page.
If the last page were ripped out of the book, physicist Kater Murch asks, would the reader be better off guessing what happened by reading only up to the fatal incident, or by reading the entire book?
The answer, so obvious in the case of the murder mystery, is less so in the world of quantum mechanics, where indeterminacy is fundamental rather than contrived for our reading pleasure.
Even if you know everything quantum mechanics can tell you about a quantum particle, said Murch, an assistant professor of physics in Arts & Sciences at Washington University in St. Louis, you cannot predict with certainty the outcome of a simple experiment to measure its state. All quantum mechanics can offer are statistical probabilities for the possible results.
The orthodox view is that this indeterminacy is not a defect of the theory, but rather a fact of nature. The particle’s state is not merely unknown, but truly undefined before it is measured. The act of measurement itself forces the particle to collapse to a definite state.
In the Feb. 13 issue of Physical Review Letters, Kater Murch describes a way to narrow the odds. By combining information about a quantum system’s evolution after a target time with information about its evolution up to that time, his lab was able to narrow the odds of correctly guessing the state of the two-state system from 50-50 to 90-10.
It’s as if what we did today changed what we did yesterday. And as this analogy suggests, the experimental results have spooky implications for time and causality – at least in the microscopic world to which quantum mechanics applies.
Measuring a phantom
Until recently physicists could explore the quantum mechanical properties of single particles only through thought experiments, because any attempt to observe them directly caused them to shed their mysterious quantum properties.
But in the 1980s and 1990s physicists invented devices that allowed them to measure these fragile quantum systems so gently that they don’t immediately collapse to a definite state.
The device Murch uses to explore quantum space is a simple superconducting circuit that enters quantum space when it is cooled to near absolute zero. Murch’s team uses the bottom two energy levels of this qubit, the ground state and an excited state, as their model quantum system. Between these two states, there are an infinite number of quantum states that are superpositions, or combinations, of the ground and excited states.
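For readers who want to see the arithmetic, the probabilities quantum mechanics assigns to a measurement follow the Born rule: the chance of each outcome is the squared magnitude of the state’s amplitude along that outcome. A minimal sketch (the equal-superposition state here is illustrative, not taken from Murch’s device):

```python
import numpy as np

# Two basis states of a qubit: the ground state |0> and excited state |1>.
ground = np.array([1, 0], dtype=complex)
excited = np.array([0, 1], dtype=complex)

# An illustrative superposition: equal parts ground and excited.
psi = (ground + excited) / np.sqrt(2)

# Born rule: probability of each outcome is |amplitude|^2.
p_ground = abs(np.vdot(ground, psi)) ** 2
p_excited = abs(np.vdot(excited, psi)) ** 2

print(p_ground, p_excited)  # both ~0.5: the 50-50 odds of a blind guess
```

For the equal superposition the rule gives 50-50 odds for a strong measurement, which is why guessing from the initial state alone can do no better than a coin flip.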
The quantum state of the circuit is detected by putting it inside a microwave box. A few microwave photons are sent into the box, where their quantum fields interact with the superconducting circuit. So when the photons exit the box they bear information about the quantum system.
Crucially, these “weak,” off-resonance measurements do not appreciably disturb the qubit, unlike “strong” measurements with photons resonant with the energy difference between the two states, which knock the circuit into one state or the other.
A quantum guessing game
In Physical Review Letters, Murch describes a quantum guessing game played with the qubit.
“We start each run by putting the qubit in a superposition of the two states,” he said. “Then we do a strong measurement but hide the result, continuing to follow the system with weak measurements.”
They then try to guess the hidden result, which is their version of the missing page of the murder mystery.
“Calculating forward, using the Born equation that expresses the probability of finding the system in a particular state, your odds of guessing right are only 50-50,” Murch said. “But you can also calculate backward using something called an effect matrix. Just take all the equations and flip them around. They still work and you can just run the trajectory backward.
“So there’s a backward-going trajectory and a forward-going trajectory, and if we look at them both together and weight the information in both equally, we get something we call a hindsight prediction, or ‘retrodiction.’”
The shattering thing about the retrodiction is that it is 90 percent accurate. When the physicists check it against the stored measurement of the system’s earlier state it is right nine times out of 10.
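The quantum effect-matrix formalism is beyond a short example, but the statistical flavor of retrodiction can be sketched with a loose classical analogy: combining an independent noisy record from before a hidden event with one from after it beats relying on a single record. Everything below (the noise model, sample counts, and function names) is an illustrative assumption, not the analysis in the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

def weak_record(state, n, noise=1.0):
    # Each "weak measurement" is the hidden state (0 or 1) plus Gaussian noise.
    return state + noise * rng.normal(size=n)

def log_likelihood(record, state, noise=1.0):
    # Gaussian log-likelihood of a record given a candidate state.
    return -np.sum((record - state) ** 2) / (2 * noise**2)

def guess(records, states=(0, 1)):
    # Pick the state that best explains all supplied records.
    scores = [sum(log_likelihood(r, s) for r in records) for s in states]
    return states[int(np.argmax(scores))]

trials = 2000
correct_fwd = correct_both = 0
for _ in range(trials):
    hidden = rng.integers(0, 2)       # the hidden strong-measurement result
    before = weak_record(hidden, 5)   # weak record leading up to it
    after = weak_record(hidden, 5)    # weak record following it
    correct_fwd += guess([before]) == hidden
    correct_both += guess([before, after]) == hidden

print(correct_fwd / trials, correct_both / trials)
```

With these made-up parameters, the combined guess is right noticeably more often than the one using only the earlier record, echoing the 50-50 versus 90-10 improvement the article describes (the exact numbers depend entirely on the assumed noise).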
Down the rabbit hole
The quantum guessing game suggests ways to make both quantum computing and the quantum control of open systems, such as chemical reactions, more robust. But it also has implications for much deeper problems in physics.
For one thing, it suggests that in the quantum world time runs both backward and forward whereas in the classical world it only runs forward.
“I always thought the measurement would resolve the time symmetry in quantum mechanics,” Murch said. “If we measure a particle in a superposition of states and it collapses into one of two states, well, that sounds like a process that goes forward in time.”
But in the quantum guessing experiment, time symmetry has returned. The improved odds imply the measured quantum state somehow incorporates information from the future as well as the past. And that implies that time, notoriously an arrow in the classical world, is a double-headed arrow in the quantum world.
“It’s not clear why in the real world, the world made up of many particles, time only goes forward and entropy always increases,” Murch said. “But many people are working on that problem, and I expect it will be solved in a few years.”
In a world where time is symmetric, however, is there such a thing as cause and effect? To find out, Murch proposes to run a qubit experiment that would set up feedback loops (which are chains of cause and effect) and try to run them both forward and backward.
“It takes 20 or 30 minutes to run one of these experiments,” Murch said, “several weeks to process it, and a year to scratch our heads to see if we’re crazy or not.”
“At the end of the day,” he said, “I take solace in the fact that we have a real experiment and real data that we plot on real curves.”
January 23, 2015
Recent developments in science are beginning to suggest that the universe naturally produces complexity. The emergence of life in general and perhaps even rational life, with its associated technological culture, may be extremely common, argues Clemson researcher Kelly Smith in a recently published paper in the journal Space Policy.
What’s more, he suggests, this universal tendency has distinctly religious overtones and may even establish a truly universal basis for morality.
Smith, a philosopher and evolutionary biologist, applies recent theoretical developments in biology and complex systems theory to attempt new answers to the kind of enduring questions about human purpose and obligation that have long been considered the sole province of the humanities.
He points out that scientists are increasingly beginning to discuss how the basic structure of the universe seems to favor the creation of complexity. The large scale history of the universe strongly suggests a trend of increasing complexity: disordered energy states produce atoms and molecules, which combine to form suns and associated planets, on which life evolves. Life then seems to exhibit its own pattern of increasing complexity, with simple organisms getting more complex over evolutionary time until they eventually develop rationality and complex culture.
And recent theoretical developments in biology and complex systems theory suggest this trend may be real, arising from the basic structure of the universe in a predictable fashion.
“If this is right,” says Smith, “you can look at the universe as a kind of ‘complexity machine’, which raises all sorts of questions about what this means in a broader sense. For example, does believing the universe is structured to produce complexity in general, and rational creatures in particular, constitute a religious belief? It need not imply that the universe was created by a God, but on the other hand, it does suggest that the kind of rationality we hold dear is not an accident.”
And Smith sees another similarity to religion in the potential moral implications of this idea. If evolution tends to favor the development of sociality, reason, and culture as a kind of “package deal”, then it’s a good bet that any smart extraterrestrials we encounter will have similar evolved attitudes about their basic moral commitments.
In particular, they will likely agree with us that there is something morally special about rational, social creatures. And such universal agreement, argues Smith, could be the foundation for a truly universal system of ethics.
Smith will soon take sabbatical to lay the groundwork for a book exploring these issues in more detail.
January 8, 2015
Researchers from the University of Cambridge and the University of Plymouth have shown that follow-through – such as when swinging a golf club or tennis racket – can help us to learn two different skills at once, or to learn a single skill faster. The research provides new insight into the way tasks are learned, and could have implications for rehabilitation, such as re-learning motor skills following a stroke.
The researchers found that the particular motor memory which is active and modifiable in the brain at any given time depends on both lead-in and follow-through movement, and that skills which may otherwise interfere can be learned at the same time if their follow-through motions are unique. The research is published today (8 January) in the journal Current Biology.
While follow-through in sports such as tennis or golf cannot affect the movement of the ball after it has been hit, it does serve two important purposes: it both helps maximise velocity or force at the point of impact, and helps prevent injuries by allowing a gradual slowdown of a movement.
Now, researchers have found a third important role for follow-through: it allows distinct motor memories to be learned. In other words, by practising the same action with different follow-throughs, different motor memories can be learned for a single movement.
If a new task, whether that is serving a tennis ball or learning a passage on a musical instrument, is repeated enough times, a motor memory of that task is developed. The brain is able to store, protect and reactivate this memory, quickly instructing the muscles to perform the task so that it can be performed seemingly without thinking.
The problem with learning similar but distinct tasks is that they can ‘interfere’ with each other in the brain. For example, tennis and racquetball are both racket sports. However, the strokes for the two sports are slightly different, as topspin is great for a tennis player, but not for a racquetball player. Despite this, in theory it should be possible to learn both sports independently. However, many people find it difficult to perform at a high level in both sports, due to interference between the two strokes.
In order to determine whether we learn a separate motor memory for each task, or a single motor memory for both, the researchers examined either the presence or absence of interference by having participants learn a ‘reaching’ task in the presence of two opposite force-fields.
Participants grasped the handle of a robotic interface and made a reaching movement through an opposing force-field to a central target, followed immediately by a second unopposed follow-through movement to one of two possible final targets. The direction of the force-field was changed, representing different tasks, and the researchers were able to examine whether the tasks are learned separately, in which case there would be no interference, or whether we learn the mean of the two opposing force-fields, in which case there would be complete interference.
The researchers found that the specific motor memory which is active at any given moment depends on the movement that will be made in the near future. When a follow-through movement was made that anticipated the force-field direction, there was a substantial reduction in interference. This suggests that different follow-throughs may activate distinct motor memories, allowing us to learn two different skills without them interfering, even when the rest of the movement is identical. However, while practising a variable follow-through can activate multiple motor memories, practising a consistent follow-through allowed for tasks to be learned much faster.
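One way to see why distinct follow-throughs reduce interference is a toy single-rate model of motor adaptation in which each follow-through indexes its own memory slot. This is a deliberately simplified sketch with made-up parameters, not the model fitted in the Current Biology paper:

```python
import numpy as np

LEARN_RATE = 0.2  # hypothetical adaptation rate

def train(fields, contexts, n_memories, trials=200):
    # Alternate between the force-fields; each trial updates the memory
    # slot selected by that trial's context (its follow-through).
    memory = np.zeros(n_memories)
    errors = []
    for t in range(trials):
        i = t % len(fields)
        c = contexts[i]
        error = fields[i] - memory[c]   # force-field minus compensation
        memory[c] += LEARN_RATE * error
        errors.append(abs(error))
    return np.mean(errors[-20:])        # average late-learning error

opposing = [+1.0, -1.0]                 # two opposite force-fields

# Same follow-through: one shared memory, so the fields interfere.
shared = train(opposing, contexts=[0, 0], n_memories=1)
# Distinct follow-throughs: separate memories, so both fields are learned.
distinct = train(opposing, contexts=[0, 1], n_memories=2)

print(shared, distinct)
```

With a shared memory the two opposing fields keep cancelling each other and the late error stays large; with separate context-indexed memories both fields are learned almost completely, mirroring the reduction in interference the researchers observed.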
“There is always noise in our movements, which arises in the sensory information we receive, the planning we undertake, and the output of our motor system,” said Dr David Franklin of Cambridge’s Department of Engineering, a senior author on the research. “Because of this, every movement we make is slightly different from the last one even if we try really hard to make it exactly the same – there will always be variability within our movements and therefore within our follow-through as well.”
When practicing a new skill such as a tennis stroke, we may think that we do not need to care as much about controlling the variability after we hit the ball, since it cannot affect the movement of the ball itself. “However, this research suggests that this variability has another very important consequence: it reduces the speed of learning of the skill that is being practiced,” said Franklin.
The research may also have implications for rehabilitation, such as re-learning skills after a stroke. Many patients recovering from a stroke exhibit a great deal of variability in their movements. “Since we have shown that learning occurs faster with consistent movements, it may therefore be important to consider methods to reduce this variability in order to improve the speed of rehabilitation,” said Dr Ian Howard of the University of Plymouth, the paper’s lead author.
The work was supported by the Wellcome Trust, Human Frontier Science Program, Plymouth University and the Royal Society.
November 5, 2014
The physics community has spent three decades searching for and finding no evidence that dark matter is made of tiny exotic particles. Case Western Reserve University theoretical physicists suggest researchers consider looking for candidates more in the ordinary realm and, well, more massive.
Dark matter is unseen matter that, combined with normal matter, could create the gravity that, among other things, prevents spinning galaxies from flying apart. Physicists calculate that dark matter comprises 27 percent of the universe, normal matter just 5 percent.
Instead of WIMPs (weakly interacting massive particles) or axions (weakly interacting low-mass particles), dark matter may be made of macroscopic objects, anywhere from a few ounces to the size of a good-sized asteroid, and probably as dense as a neutron star or the nucleus of an atom, the researchers suggest.
Physics professor Glenn Starkman and David Jacobs, who received his PhD in Physics from CWRU in May and is now a fellow at the University of Cape Town, say published observations provide guidance, limiting where to look. They lay out the possibilities in a paper at http://arxiv.org/pdf/1410.2236.pdf.
The Macros, as Starkman and Jacobs call them, would not only dwarf WIMPs and axions but differ in an important way: they could potentially be assembled out of particles in the Standard Model of particle physics, instead of requiring new physics to explain their existence.
“We’ve been looking for WIMPs for a long time and haven’t seen them,” Starkman said. “We expected to make WIMPs in the Large Hadron Collider, and we haven’t.”
WIMPs and axions remain possible candidates for dark matter, but there’s reason to search elsewhere, the theorists argue.
“The community had kind of turned away from the idea that dark matter could be made of normal-ish stuff in the late ’80s,” Starkman said. “We ask, was that completely correct and how do we know dark matter isn’t more ordinary stuff— stuff that could be made from quarks and electrons?”
After eliminating most ordinary matter as possible candidates, including failed Jupiters, white dwarfs, neutron stars, stellar black holes, the black holes at the centers of galaxies and massive neutrinos, physicists turned their focus to the exotics.
Matter that was somewhere in between ordinary and exotic—relatives of neutron stars or large nuclei—was left on the table, Starkman said. “We say relatives because they probably have a considerable admixture of strange quarks, which are made in accelerators and ordinarily have extremely short lives,” he said.
Although strange quarks are highly unstable, Starkman points out that neutrons are also highly unstable. But in helium, bound with stable protons, neutrons remain stable.
“That opens the possibility that stable strange nuclear matter was made in the early universe and dark matter is nothing more than chunks of strange nuclear matter or other bound states of quarks, or of baryons, which are themselves made of quarks,” he said. Such dark matter would fit the Standard Model.
The Macros would have to be assembled from ordinary and strange quarks or baryons before the strange quarks or baryons decay, and at a temperature above 3.5 trillion degrees Celsius, comparable to the temperature in the center of a massive supernova, Starkman and Jacobs calculated. The quarks would have to be assembled with 90 percent efficiency, leaving just 10 percent to form the protons and neutrons found in the universe today.
The limits on the possible dark matter Macros are as follows:
- A minimum of 55 grams. If dark matter were smaller, it would have been seen in detectors in Skylab or in tracks found in sheets of mica.
- A maximum of 10²⁴ (a million billion billion) grams. Above this, the Macros would be so massive they would bend starlight, which has not been seen.
- The range of 10¹⁷ to 10²⁰ grams per centimeter squared should also be eliminated from the search, the theorists say. Dark matter in that range would be massive enough for gravitational lensing to affect individual photons from gamma-ray bursts in ways that have not been seen.
If dark matter is within this allowed range, there are reasons it hasn’t been seen.
- At a mass of 10¹⁸ grams, dark matter Macros would hit the Earth only about once every billion years.
- At lower masses, they would strike the Earth more frequently but might not leave a recognizable record or observable mark.
- In the range of 10⁹ to 10¹⁸ grams, dark matter would collide with the Earth about once annually, registering nothing in the underground dark matter detectors in place.
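The quoted encounter rates can be checked with a back-of-envelope flux estimate: the number density of Macros is the local dark matter density divided by the Macro mass, and the hit rate is that density times a typical galactic speed times Earth’s cross-sectional area. All the input values below are rough standard estimates, not numbers from the paper:

```python
import math

RHO_DM = 5e-25            # local dark matter density, g/cm^3 (~0.3 GeV/cm^3)
SPEED = 2.5e7             # typical galactic speed, cm/s (~250 km/s)
R_EARTH = 6.37e8          # Earth radius, cm
SECONDS_PER_YEAR = 3.15e7

def hits_per_year(macro_mass_g):
    number_density = RHO_DM / macro_mass_g   # Macros per cm^3
    area = math.pi * R_EARTH ** 2            # Earth's cross-sectional area
    return number_density * SPEED * area * SECONDS_PER_YEAR

# At 1e18 grams the rate comes out around 5e-10 per year, the same
# order of magnitude as the quoted once-per-billion-years figure.
print(hits_per_year(1e18))
```

The rate scales inversely with Macro mass, so lighter Macros strike far more often, consistent with the roughly annual collisions quoted at the low end of the allowed range.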
October 27, 2014
Making mistakes while learning can benefit memory and lead to the correct answer, but only if the guesses are close-but-no-cigar, according to new research findings from Baycrest Health Sciences.
“Making random guesses does not appear to benefit later memory for the right answer, but near-miss guesses act as stepping stones for retrieval of the correct information – and this benefit is seen in younger and older adults,” says lead investigator Andrée-Ann Cyr, a graduate student with Baycrest’s Rotman Research Institute and the Department of Psychology at the University of Toronto.
Cyr’s paper is posted online today in the Journal of Experimental Psychology: Learning, Memory, and Cognition (ahead of print publication). The study expands upon a previous paper she published in Psychology and Aging in 2012 that found that learning information the hard way by making mistakes (as opposed to just being told the correct answer) may be the best boot camp for older brains.
That paper raised eyebrows since the scientific literature has traditionally recommended that older adults avoid making mistakes – unlike their younger peers who actually benefit from them. But recent evidence from Cyr and other researchers is challenging this perspective and prompting professional educators and cognitive rehabilitation clinicians to take note.
Cyr’s latest research provides evidence that trial-and-error learning can benefit memory in both young and old when errors are meaningfully related to the right answer, and can actually harm memory when they are not.
In their latest study, 65 healthy younger adults (average age 22) and 64 healthy older adults (average age 72) learned target words (e.g., rose) based either on the semantic category it belongs to (e.g., a flower) or its word stem (e.g., a word that begins with the letters ‘ro’). For half of the words, participants were given the answer right away (e.g., “the answer is rose”) and for the other half, they were asked to guess at it before seeing the answer (e.g., a flower: “Is it tulip?” or ro___ : “is it rope?”).
On a later memory test, participants were shown the categories or word stems and had to come up with the right answer. The researchers wanted to know whether participants would be better at remembering rose if they had made wrong guesses prior to studying it rather than seeing it right away. They found that this was only true if participants learned based on the categories (e.g., a flower). Guessing actually made memory worse when words were learned based on word stems (e.g., ro___). This was the case for both younger and older adults.
Cyr and her colleagues suggest this is because our memory organizes information based on how it is conceptually, rather than lexically, related to other information. For example, when you think of the word pear, your mind is more likely to jump to another fruit, such as apple, than to a word that looks similar, such as peer. Wrong guesses only add value when they have something meaningful in common with the right answers. The guess tulip may be wrong, but it is still conceptually close to the right answer rose (both are flowers).
By guessing first as opposed to just reading the answer, one is thinking harder about the information and making useful connections that can help memory. Indeed, younger and older participants were more likely to remember the answer if they also remembered their wrong guesses, suggesting that these acted as stepping stones. By contrast, when guesses only have letters in common with answers, they clutter memory because one cannot link them meaningfully. The word rope is nowhere close to rose in our memory. In these situations, where your guesses are likely to be out in left field, it is best to bypass mistakes altogether.
“The fact that this pattern was found for older adults as well shows that aging does not influence how we learn from mistakes,” says Cyr.
“These results have profound clinical and practical implications. They turn traditional views of best practices in memory rehabilitation for healthy seniors on their head by demonstrating that making the right kind of errors can be beneficial. They also provide great hope for lifelong learning and guidance for how seniors should study,” says Dr. Nicole Anderson, senior scientist with Baycrest’s Rotman Research Institute and senior author on the study.
The study was funded by the Canadian Institutes of Health Research.
October 9, 2014
The discovery of a new particle will “transform our understanding” of the fundamental force of nature that binds the nuclei of atoms, researchers argue.
Led by scientists from the University of Warwick, the discovery of the new particle will help provide greater understanding of the strong interaction, the fundamental force of nature found within the protons of an atom’s nucleus.
Named Ds3*(2860)⁻, the particle, a new type of meson, was discovered by analysing data collected with the LHCb detector at CERN’s Large Hadron Collider (LHC).
The new particle is bound together in a similar way to protons. Due to this similarity, the Warwick researchers argue that scientists will now be able to study the particle to further understand strong interactions.
Along with gravity, the electromagnetic interaction and the weak nuclear force, the strong interaction is one of the four fundamental forces. Lead scientist Professor Tim Gershon, from the University of Warwick’s Department of Physics, explains:
“Gravity describes the universe on a large scale from galaxies to Newton’s falling apple, whilst the electromagnetic interaction is responsible for binding molecules together and also for holding electrons in orbit around an atom’s nucleus.
“The strong interaction is the force that binds quarks, the subatomic particles that form protons within atoms, together. It is so strong that the binding energy of the proton gives a much larger contribution to the mass, through Einstein’s equation E = mc², than the quarks themselves.”
Due in part to the relative simplicity of those forces, scientists have previously been able to solve the equations behind gravity and electromagnetic interactions, but the strength of the strong interaction makes it impossible to solve its equations in the same way.
“Calculations of strong interactions are done with a computationally intensive technique called Lattice QCD,” says Professor Gershon. “In order to validate these calculations it is essential to be able to compare predictions to experiments. The new particle is ideal for this purpose because it is the first known particle that both contains a charm quark and has spin 3.”
There are six quarks known to physicists: up, down, strange, charm, beauty and top. Protons and neutrons are composed of up and down quarks, but particles produced in accelerators such as the LHC can contain the unstable heavier quarks. In addition, some of these particles have higher spin values than the naturally occurring stable particles.
“Because the Ds3*(2860)⁻ particle contains a heavy charm quark it is easier for theorists to calculate its properties. And because it has spin 3, there can be no ambiguity about what the particle is,” adds Professor Gershon. “Therefore it provides a benchmark for future theoretical calculations. Improvements in these calculations will transform our understanding of how nuclei are bound together.”
Spin is one of the labels used by physicists to distinguish between particles. It is a concept that arises in quantum mechanics that can be thought of as being similar to angular momentum: in this sense higher spin corresponds to the quarks orbiting each other faster than those with a lower spin.
Warwick Ph.D. student Daniel Craik, who worked on the study, adds: “Perhaps the most exciting part of this new result is that it could be the first of many similar discoveries with LHC data. Whether we can use the same technique, as employed in our research into Ds3*(2860)⁻, to also improve our understanding of the weak interaction is a key question raised by this discovery. If so, this could help to answer one of the biggest mysteries in physics: why there is more matter than antimatter in the Universe.”
The results are detailed in two papers that will be published in the next editions of the journals Physical Review Letters and Physical Review D. Both papers have been given the accolade of being selected as Editors’ Suggestions.
Contact: Tom Frew, International Press Officer.
P: +44 (0)2476575910
Notes for Editors:
The results are detailed in papers titled:
- “Observation of overlapping spin-1 and spin-3 D0K- resonances at mass 2.86 GeV/c²”, to be published in Physical Review Letters
- “Dalitz plot analysis of Bs0→D0K-π+ decays”, to be published in Physical Review D
- The Ds3*(2860)⁻ particle is a meson that contains a charm anti-quark and a strange quark. The subscript 3 denotes that it has spin 3, while the number 2860 in parentheses is the mass of the particle in the units of MeV/c² that are favoured by particle physicists. The value of 2860 MeV/c² corresponds to approximately 3 times the mass of the proton.
- The particle was discovered in the decay chain Bs0→D0K-π+, where the Bs0, D0, K- and π+ mesons contain respectively a bottom anti-quark and a strange quark, a charm anti-quark and an up quark, an up anti-quark and a strange quark, and a down anti-quark and an up quark. The Ds3*(2860)⁻ particle is observed as a peak in the mass of combinations of the D0 and K- mesons. The distributions of the angles between the D0, K- and π+ particles allow the spin of the Ds3*(2860)⁻ meson to be unambiguously determined.
- Quarks are bound by the strong interaction into one of two types of particles: baryons, such as the proton, are composed of three quarks; mesons are composed of one quark and one anti-quark, where an anti-quark is the antimatter version of a quark.
- CERN, the European Organization for Nuclear Research, is the world’s leading laboratory for particle physics. It has its headquarters in Geneva. At present, its Member States are Austria, Belgium, Bulgaria, the Czech Republic, Denmark, Finland, France, Germany, Greece, Hungary, Israel, Italy, the Netherlands, Norway, Poland, Portugal, Slovakia, Spain, Sweden, Switzerland and the United Kingdom. Romania is a Candidate for Accession. Serbia is an Associate Member in the pre-stage to Membership. India, Japan, the Russian Federation, the United States of America, Turkey, the European Commission and UNESCO have Observer Status.
- The LHCb experiment is one of the four main experiments at the CERN Large Hadron Collider, and is set up to explore what happened after the Big Bang that allowed matter to survive and build the Universe we inhabit today. The LHCb collaboration comprises about 700 physicists from 67 institutes in 17 countries.
August 26, 2014
A unique experiment at the U.S. Department of Energy’s Fermi National Accelerator Laboratory called the Holometer has started collecting data that will answer some mind-bending questions about our universe – including whether we live in a hologram.
Much like characters on a television show would not know that their seemingly 3-D world exists only on a 2-D screen, we could be clueless that our 3-D space is just an illusion. The information about everything in our universe could actually be encoded in tiny packets in two dimensions.
Get close enough to your TV screen and you’ll see pixels, small points of data that make a seamless image if you stand back. Scientists think that the universe’s information may be contained in the same way, and that the natural “pixel size” of space is roughly 10 trillion trillion times smaller than an atom, a distance that physicists refer to as the Planck scale.
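The “10 trillion trillion” figure is easy to check: divide a typical atomic size by the Planck length. The atomic size below is an order-of-magnitude assumption:

```python
PLANCK_LENGTH = 1.6e-35   # metres
ATOM_SIZE = 1e-10         # typical atomic diameter, metres (order of magnitude)

# Ratio of atomic size to the Planck scale.
ratio = ATOM_SIZE / PLANCK_LENGTH
print(f"{ratio:.1e}")     # ~6e24, i.e. on the order of 10 trillion trillion
```

Any reasonable choice of atomic size gives a ratio in the 10²⁴ to 10²⁵ range, matching the article’s figure.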
“We want to find out whether spacetime is a quantum system just like matter is,” said Craig Hogan, director of Fermilab’s Center for Particle Astrophysics and the developer of the holographic noise theory. “If we see something, it will completely change ideas about space we’ve used for thousands of years.”
Quantum theory suggests that it is impossible to know both the exact location and the exact speed of subatomic particles. If space comes in 2-D bits with limited information about the precise location of objects, then space itself would fall under the same theory of uncertainty. The same way that matter continues to jiggle (as quantum waves) even when cooled to absolute zero, this digitized space should have built-in vibrations even in its lowest energy state.
Essentially, the experiment probes the limits of the universe’s ability to store information. If there are a set number of bits that tell you where something is, it eventually becomes impossible to find more specific information about the location – even in principle. The instrument testing these limits is Fermilab’s Holometer, or holographic interferometer, the most sensitive device ever created to measure the quantum jitter of space itself.
Now operating at full power, the Holometer uses a pair of interferometers placed close to one another. Each one sends a one-kilowatt laser beam (the equivalent of 200,000 laser pointers) at a beam splitter and down two perpendicular 40-meter arms. The light is then reflected back to the beam splitter where the two beams recombine, creating fluctuations in brightness if there is motion. Researchers analyze these fluctuations in the returning light to see if the beam splitter is moving in a certain way – being carried along on a jitter of space itself.
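How a tiny length jitter becomes a brightness fluctuation can be sketched with the textbook two-arm interferometer response; the wavelength below is an assumed typical infrared laser line, not necessarily the Holometer’s:

```python
import math

WAVELENGTH = 1.064e-6   # metres; an assumed infrared laser line

def output_intensity(delta_L, I0=1.0):
    # Path-length difference delta_L between the two arms (metres).
    # Light travels down each arm and back, doubling the path difference.
    phase = 4 * math.pi * delta_L / WAVELENGTH
    return I0 * math.cos(phase / 2) ** 2

# Matched arms give full brightness; a sub-wavelength jitter in arm
# length shows up directly as a change in the recombined brightness.
print(output_intensity(0.0))              # 1.0: arms matched
print(output_intensity(WAVELENGTH / 8))   # half brightness at a λ/8 offset
```

This idealized response is why a motion of the beam splitter far smaller than the wavelength is still detectable as a fluctuation in the returning light.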
“Holographic noise” is expected to be present at all frequencies, but the scientists’ challenge is not to be fooled by other sources of vibrations. The Holometer is testing a frequency so high – millions of cycles per second – that motions of normal matter are not likely to cause problems. Rather, the dominant background noise is more often due to radio waves emitted by nearby electronics. The Holometer experiment is designed to identify and eliminate noise from such conventional sources.
“If we find a noise we can’t get rid of, we might be detecting something fundamental about nature – a noise that is intrinsic to spacetime,” said Fermilab physicist Aaron Chou, lead scientist and project manager for the Holometer. “It’s an exciting moment for physics. A positive result will open a whole new avenue of questioning about how space works.”
The Holometer experiment, funded by the U.S. Department of Energy Office of Science and other sources, is expected to gather data over the coming year.
The Holometer team comprises 21 scientists and students from Fermilab, Massachusetts Institute of Technology, University of Chicago, and University of Michigan. For more information about the experiment, visit http://holometer.fnal.gov/.
Fermilab is America’s premier national laboratory for particle physics and accelerator research. A U.S. Department of Energy Office of Science laboratory, Fermilab is located near Chicago, Illinois, and operated under contract by the Fermi Research Alliance, LLC. Visit Fermilab’s website at http://www.fnal.gov and follow us on Twitter at @FermilabToday.
The DOE Office of Science is the single largest supporter of basic research in the physical sciences in the United States and is working to address some of the most pressing challenges of our time. For more information, please visit science.energy.gov .
July 14, 2014
The discovery 30 years ago of soccer-ball-shaped carbon molecules called buckyballs helped to spur an explosion of nanotechnology research. Now, there appears to be a new ball on the pitch.
Researchers from Brown University, Shanxi University and Tsinghua University in China have shown that a cluster of 40 boron atoms forms a hollow molecular cage similar to a carbon buckyball. It’s the first experimental evidence that a boron cage structure – previously only a matter of speculation – does indeed exist.
“This is the first time that a boron cage has been observed experimentally,” said Lai-Sheng Wang, a professor of chemistry at Brown who led the team that made the discovery. “As a chemist, finding new molecules and structures is always exciting. The fact that boron has the capacity to form this kind of structure is very interesting.”
Wang and his colleagues describe the molecule, which they’ve dubbed borospherene, in the journal Nature Chemistry.
Carbon buckyballs are made of 60 carbon atoms arranged in pentagons and hexagons to form a sphere – like a soccer ball. Their discovery in 1985 was soon followed by discoveries of other hollow carbon structures including carbon nanotubes. Another famous carbon nanomaterial – a one-atom-thick sheet called graphene – followed shortly after.
After buckyballs, scientists wondered if other elements might form these odd hollow structures. One candidate was boron, carbon’s neighbor on the periodic table. But because boron has one less electron than carbon, it can’t form the same 60-atom structure found in the buckyball. The missing electrons would cause the cluster to collapse on itself. If a boron cage existed, it would have to have a different number of atoms.
Wang and his research group have been studying boron chemistry for years. In a paper published earlier this year, Wang and his colleagues showed that clusters of 36 boron atoms form one-atom-thick disks, which might be stitched together to form an analog to graphene, dubbed borophene. Wang’s preliminary work suggested that there was also something special about boron clusters with 40 atoms. They seemed to be abnormally stable compared to other boron clusters. Figuring out what that 40-atom cluster actually looks like required a combination of experimental work and modeling using high-powered supercomputers.
On the computer, Wang’s colleagues modeled over 10,000 possible arrangements of 40 boron atoms bonded to each other. The simulations estimate not only the shape of each structure but also its electron binding energy – a measure of how tightly a molecule holds its electrons. The spectrum of binding energies serves as a unique fingerprint of each potential structure.
The next step is to test the actual binding energies of boron clusters in the lab to see if they match any of the theoretical structures generated by the computer. To do that, Wang and his colleagues used a technique called photoelectron spectroscopy.
Chunks of bulk boron are zapped with a laser to create a vapor of boron atoms. A jet of helium then freezes the vapor into tiny clusters of atoms. The clusters of 40 atoms are isolated by weight and then zapped with a second laser, which knocks an electron out of the cluster. The ejected electron flies down a long tube Wang calls his “electron racetrack.” The speed at which the electrons fly down the racetrack is used to determine the cluster’s electron binding energy spectrum – its structural fingerprint.
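The arithmetic behind the racetrack is simple energy bookkeeping: the detachment photon’s energy, minus the kinetic energy inferred from the electron’s flight speed, gives the binding energy. A minimal sketch, with a hypothetical photon energy, tube length, and flight time (none of these figures appear in the article):

```python
M_E = 9.109e-31  # electron mass, kg
EV = 1.602e-19   # joules per electron volt

def binding_energy_ev(photon_energy_ev, flight_length_m, flight_time_s):
    """Photoelectron spectroscopy bookkeeping: photon energy minus the
    ejected electron's kinetic energy (computed from its speed down the
    flight tube) gives the electron binding energy, in eV."""
    speed = flight_length_m / flight_time_s
    kinetic_ev = 0.5 * M_E * speed ** 2 / EV
    return photon_energy_ev - kinetic_ev

# Hypothetical illustration: a 6.424 eV photon detaches an electron
# that covers a 1-meter tube in 900 nanoseconds.
be = binding_energy_ev(6.424, 1.0, 900e-9)
print(be)
```

Faster electrons were held less tightly; scanning many electrons builds up the spectrum of binding energies that serves as the fingerprint.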
The experiments showed that 40-atom clusters form two structures with distinct binding spectra. Those spectra turned out to be a dead-on match with the spectra for two structures generated by the computer models. One was a semi-flat molecule and the other was the buckyball-like spherical cage.
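The matching step can be pictured as comparing peak positions between measured and simulated spectra and keeping the candidate with the smallest deviation. A toy sketch, with invented peak values standing in for the real data:

```python
# Illustrative numbers only -- not the peak positions from the paper.
candidates = {
    "quasi-planar": [2.3, 2.9, 3.4, 4.1],  # eV, simulated binding-energy peaks
    "cage (B40)":   [2.6, 3.1, 3.8, 4.4],
}
measured = [2.61, 3.08, 3.79, 4.42]         # eV, hypothetical experimental peaks

def rms_mismatch(simulated, observed):
    """Root-mean-square deviation between peak positions -- a simple
    stand-in for the spectral 'fingerprint' comparison described above."""
    return (sum((s - o) ** 2 for s, o in zip(simulated, observed))
            / len(observed)) ** 0.5

best = min(candidates, key=lambda name: rms_mismatch(candidates[name], measured))
print(best)
```

In this toy example the cage’s peaks sit within a few hundredths of an eV of the "measured" ones, so it wins; the real comparison involves full spectra and thousands of candidate structures.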
“The experimental sighting of a binding spectrum that matched our models was of paramount importance,” Wang said. “The experiment gives us these very specific signatures, and those signatures fit our models.”
The borospherene molecule isn’t quite as spherical as its carbon cousin. Rather than a series of five- and six-membered rings formed by carbon, borospherene consists of 48 triangles, four seven-sided rings and two six-membered rings. Several atoms stick out a bit from the others, making the surface of borospherene somewhat less smooth than a buckyball.
As for possible uses for borospherene, it’s a little too early to tell, Wang says. One possibility, he points out, could be hydrogen storage. Because of the electron deficiency of boron, borospherene would likely bond well with hydrogen. So tiny boron cages could serve as safe houses for hydrogen molecules.
But for now, Wang is enjoying the discovery.
“For us, just to be the first to have observed this, that’s a pretty big deal,” Wang said. “Of course if it turns out to be useful that would be great, but we don’t know yet. Hopefully this initial finding will stimulate further interest in boron clusters and new ideas to synthesize them in bulk quantities.”
The theoretical modeling was done with a group led by Prof. Si-Dian Li from Shanxi University and a group led by Prof. Jun Li from Tsinghua University. The work was supported by the U.S. National Science Foundation (CHE-1263745) and the National Natural Science Foundation of China.
July 11, 2014
Why does a relentless stream of subjective experiences normally fill your mind? Maybe that’s just one of those mysteries that will always elude us.
Yet, research from Northwestern University suggests that consciousness lies well within the realm of scientific inquiry — as impossible as that may currently seem. Although scientists have yet to agree on an objective measure to index consciousness, progress has been made with this agenda in several labs around the world.
“The debate about the neural basis of consciousness rages because there is no widely accepted theory about what happens in the brain to make consciousness possible,” said Ken Paller, professor of psychology in the Weinberg College of Arts and Sciences and director of the Cognitive Neuroscience Program at Northwestern.
“Scientists and others acknowledge that damage to the brain can lead to systematic changes in consciousness. Yet, we don’t know exactly what differentiates brain activity associated with conscious experience from brain activity that is instead associated with mental activity that remains unconscious,” he said.
In a new article, Paller and Satoru Suzuki, also professor of psychology at Northwestern, point out flawed assumptions about consciousness to suggest that a wide range of scientific perspectives can offer useful clues about consciousness.
“It’s normal to think that if you attentively inspect something you must be aware of it and that analyzing it to a high level would necessitate consciousness,” Suzuki noted. “Results from experiments on perception belie these assumptions.
“Likewise, it feels like we can freely decide at a precise moment, when actually the process of deciding begins earlier, via neurocognitive processing that does not enter awareness,” he said.
The authors write that unconscious processing can influence our conscious decisions in ways we never suspect.
If these and other similar assumptions are incorrect, the researchers state in their article, then mistaken reasoning might be behind arguments for taking the science of consciousness off the table.
“Neuroscientists sometimes argue that we must focus on understanding other aspects of brain function, because consciousness is never going to be understood,” Paller said. “On the other hand, many neuroscientists are actively engaged in probing the neural basis of consciousness, and, in many ways, this is less of a taboo area of research than it used to be.”
Experimental evidence has supported some theories about consciousness that appeal to specific types of neural communication, which can be described in neural terms or more abstractly in computational terms. Further theoretical advances can be expected if specific measures of neural activity can be brought to bear on these ideas.
Paller and Suzuki both conduct research that touches on consciousness. Suzuki studies perception, and Paller studies memory. They said it was important for them to write the article to counter the view that it is hopeless to ever make progress through scientific research on this topic.
They outlined recent advances that provide reason to be optimistic about future scientific inquiries into consciousness and about the benefits that this knowledge could bring for society.
“For example, continuing research on the brain basis of consciousness could inform our concerns about human rights, help us explain and treat diseases that impinge on consciousness, and help us perpetuate environments and technologies that optimally contribute to the well-being of individuals and of our society,” the authors wrote.
They conclude that research on human consciousness belongs within the purview of science, despite philosophical or religious arguments to the contrary.
Their paper, “The Source of Consciousness,” has been published online in the journal Trends in Cognitive Sciences.
July 11, 2014
Researchers from the Salk Institute for Biological Studies, BGI, and other institutes have for the first time evaluated the safety and reliability of existing targeted gene-correction technologies, and have developed a new method, TALEN-HDAdV, that significantly increases gene-correction efficiency in human induced pluripotent stem cells (hiPSCs). The study, published online in Cell Stem Cell, provides an important theoretical foundation for stem cell-based gene therapy.
The combination of stem cells and targeted genome editing provides a powerful tool to model human diseases and develop potential cell replacement therapies. Although the utility of genome editing has been extensively documented, the impact of these technologies on mutational load at the whole-genome level remains unclear.
In the study, researchers performed whole-genome sequencing to evaluate the mutational load at single-base resolution in individual gene-corrected hiPSC clones in three different disease models, including Hutchinson-Gilford progeria syndrome (HGPS), sickle cell disease (SCD), and Parkinson’s disease (PD).
They evaluated the efficiency of gene targeting and gene correction at the hemoglobin gene HBB locus with TALEN, HDAdV, and CRISPR/Cas9 nucleases, and found that the three mediated gene correction with similar efficiency at that locus. In addition, deep whole-genome sequencing indicated that TALEN and HDAdV best preserved the integrity of the patient’s genome, supporting the safety and reliability of these methods.
By combining the advantages of TALEN- and HDAdV-mediated genome editing, the researchers developed a new TALEN-HDAdV hybrid vector (talHDAdV), which significantly increases gene-correction efficiency in hiPSCs. Almost all of the genetic mutations at the HBB locus can be targeted by talHDAdV, allowing this newly developed technology to be applied to the repair of various hemoglobin disorders, such as SCD and thalassemia.