February 17, 2014
A new study finds that football helmets currently used on the field may do little to protect against hits to the side of the head, or rotational force, an often dangerous source of brain injury and encephalopathy. The study released today will be presented at the American Academy of Neurology’s 66th Annual Meeting in Philadelphia, April 26 to May 3, 2014.
“Protection against concussion and complications of brain injury is especially important for young players, including elementary and middle school, high school and college athletes, whose still-developing brains are more susceptible to the lasting effects of trauma,” said study co-author Frank Conidi, MD, DO, MS, director of the Florida Center for Headache and Sports Neurology and Assistant Clinical Professor of Neurology at Florida State University College of Medicine in Port Saint Lucie, Fla. Conidi is also the vice chair of the American Academy of Neurology’s Sports Neurology Section.
For the study, researchers modified the standard drop test system, approved by the National Operating Committee on Standards for Athletic Equipment, that tests impacts and helmet safety. The researchers used a crash test dummy head and neck to simulate impact. Sensors were also placed in the dummy’s head to measure linear and rotational responses to repeated 12 mile-per-hour impacts. The scientists conducted 330 tests to measure how well 10 popular football helmet designs protected against traumatic brain injury, including: Adams a2000, Rawlings Quantum, Riddell 360, Riddell Revolution, Riddell Revolution Speed, Riddell VSR4, Schutt Air Advantage, Schutt DNA Pro+, Xenith X1 and Xenith X2.
The study found that football helmets on average reduced the risk of traumatic brain injury by only 20 percent compared to not wearing a helmet. Of the 10 helmet brands tested, the Adams a2000 provided the best protection against concussion and the Schutt Air Advantage the worst. Overall, the Riddell 360 provided the most protection against closed head injury and the Adams a2000 the least, despite the a2000 being rated best against concussion.
“Alarmingly, those that offered the least protection are among the most popular on the field,” said Conidi. “Biomechanics researchers have long understood that rotational forces, not linear forces, are responsible for serious brain damage including concussion, brain injury complications and brain bleeds. Yet generations of football and other sports participants have been under the assumption that their brains are protected by their investment in headwear protection.”
The study found that football helmets provided protection from linear impacts, or those leading to bruising and skull fracture. Compared to tests using dummies with no helmets, leading football helmets reduced the risk of skull fracture by 60 to 70 percent and reduced the risk of focal brain tissue bruising by 70 to 80 percent.
The study was supported by BRAINS, Inc., a research and development company based in San Antonio, Fla., focused on biomechanics of traumatic brain injury.
February 12, 2014
Timing of dart release or hand position may improve dart throwing accuracy, according to a study published in PLOS ONE on February 12, 2014 by Daiki Nasu from Osaka University, Japan and colleagues.
Two major strategies are thought to underlie accurate throwing: precisely timing the object’s release, and using hand position at release to compensate for variable release timing. To better understand these strategies, researchers investigated whether expert dart players utilize hand movements that can compensate for the variability in their release timing. The study compared the timing variability and hand movement of 8 expert players with those of 8 novices as they threw a dart 60 times, aiming at the bull’s eye. The movements of the dart and index finger were captured using seven cameras and analyzed.
The results revealed two strategies in the expert group. The timing variability of some experts was similar to that of novices, but these experts had a longer window of time in which to release an accurately thrown dart. These subjects selected hand movements that could compensate for the timing variability. Other experts did not use these hand movements, but rather reduced the variability in timing of the dart’s release. The authors indicate that both strategies can equally achieve consistent throwing.
Full article here: http://dx.plos.org/10.1371/journal.pone.0088536
February 4, 2014
The pomp. The pageantry. The exciting wins and devastating losses. Unbelievable feats of athleticism and sheer determination. That’s right – it’s time for the Winter Olympics in Sochi, Russia. Everyone has their picks for who will take gold medals and we’re likely to see some unexpected upsets.
But there are certain athletes that may have a leg up on everyone else: the Russians.
In a new article, psychological scientists Mark S. Allen of London South Bank University and Marc V. Jones of Staffordshire University review the existing research on sports and athletic competition and find that there is scientific support for the idea of a “home field advantage.”
Their review is published in the February 2014 issue of Current Directions in Psychological Science, a journal of the Association for Psychological Science.
Allen and Jones investigate two different models that have been proposed to account for the apparent advantage of playing on home turf: the standard model and the territoriality model.
The standard model includes several factors that can influence the psychological states of competitors, coaches, and officials, ultimately impacting their behavior in ways that tend to favor home athletes.
Research shows, for example, that larger home crowds that show encouraging behavior, like cheering, are linked with home-team success. Crowd noise may even impact the kinds of decisions that officials make: When the home crowd is noisy, officials are more likely to make discretionary decisions (such as awarding extra time) that favor the home team and dole out harsher punishments (such as warnings) for the away team.
And findings suggest that the home advantage remains even when there is no audience. This may be due, at least in part, to travel fatigue suffered by the away team – one study indicates that the home advantage increases by as much as 20% with every time zone the away team must cross.
The territoriality model, on the other hand, specifically frames the home advantage as a reflection of players’ natural tendency to defend their home turf.
One study, for example, found that soccer players showed significantly higher testosterone levels before home games than before away games and neutral training sessions. And additional research suggests that increased testosterone may benefit athletic performance through physical aggression and motivation to compete, though the relationship between testosterone and performance needs to be further investigated in the context of competitive sport.
But, as Allen and Jones point out, playing at home may come with certain disadvantages, as well.
Research indicates that cortisol, a stress hormone, is higher when performing at home, adding to self-report data that athletes feel increased pressure to succeed in front of their own fans. Studies show that in high-pressure, high-importance situations, athletes may shift their attention in an effort to control typically automatic movements. This conscious control often leads to worse performance, a phenomenon commonly referred to as “choking.”
Each of the home-field advantage models has evidence to support its main premises, but it’s still unclear how, or whether, they fit together. And more work is needed to understand the specific psychological mechanisms that drive behavior, attention, and stress responses.
Such work “would elucidate under what circumstances, and how, competing at home can enhance (and occasionally harm) athlete and team performance,” Allen and Jones conclude.
So, will the Russians feed off the energy of their home crowd and rack up the medals? Or will they suffer from the pressure of having to live up to the expectations of their countrymen? We’ll have to watch to find out!
July 9, 2013
Max Scherzer leads Major League Baseball in wins. As a pitcher for the Detroit Tigers, he hasn’t lost a game this season.
His 6-foot, 3-inch frame is a telling example of constructal-law theory, said Duke University engineer Adrian Bejan. The theory predicts that elite pitchers will continue to be taller and thus throw faster and seems also to apply to athletes who compete in golf, hockey and boxing.
Studying athletes, since most sports keep meticulous statistics, provides insight into the biological evolution of human design in nature, a pattern Bejan explains with his constructal-law theory.
Bejan has already demonstrated that runners and swimmers have gotten bigger and taller over the past century. Now he’s applying his theories to other sports, including team sports. In those cases, forward momentum was a major factor in the athletes’ successes.
What unites golf, baseball and hockey is the “falling forward” motion involved, whether it is a pitcher’s arm or golfer’s swing. Basically, the larger and taller the athlete, the more force he or she can bring to bear as his or her mass falls forward, Bejan said.
The results of his analyses were published online in the International Journal of Design & Nature and Ecodynamics.
“Our analysis shows that the constructal-law theory of sports evolution predicts and unites not only speed running and speed swimming, but also the sports where speed is needed for throwing a mass, ball or fist,” Bejan said. “The sports of baseball, golf, hockey and boxing bring both the team and the individual sports under the predictive reach of the constructal theory of sports evolution.”
The falling forward idea states that the larger and taller the individual, the more force can be applied as the ball is hurled forward. For example, former major leaguer Randy Johnson, a 6-foot, 10-inch pitcher, was a terror to batters during his career, notching two no-hitters, five Cy Young awards for best pitcher and the record for strikeouts by a lefthander.
“According to the constructal law predictions, the larger and taller machine, like medieval trebuchets, is capable of hurling a large mass farther and faster,” Bejan said. “The other players on the baseball field do not have to throw a ball as fast, so they tend to be shorter than pitchers, but they too evolve toward more height over time. For pitchers, in particular, height means speed.”
In golf, despite the advances in ball and club design, taller competitors have been driving the ball farther than shorter golfers. In 2010, Bejan found the average golfer in the top 10 of driving distance was on average 2.5 inches taller than the average golfer in the bottom 10 of driving distance.
“This shows that height plays a definite role in the success of an athlete in golf,” Bejan said. “The increase in driving distance with body mass is due to the fact that larger moving bodies are capable of exerting greater forces. Also, the increased size of clubheads has had a distinct effect on the game. The average driving distance on the Professional Golfers Association (PGA) tour has risen 30 yards in the past 30 years.”
The same reasoning also applies to sports equipment, such as golf clubs and hockey sticks. Just as golf clubs have become lighter and more flexible to increase speed of swing, and thus distance, so have hockey sticks, Bejan said.
In terms of boxing, Bejan notes similar trends, even though boxers are classified and compete in specific weight classes. While height and arm reach help boxers, they cannot be too tall, because then they lose core strength, which lessens the falling forward force that powers the punches.
“We looked at the 25 greatest fighters in the lightweight and welterweight classes and found that these boxers have been able to maximize punching power by gaining size without going over weight limits,” Bejan said. “They have done this by adding muscle and cutting water weight before a fight, and these techniques over time provide an explanation for the improvement in boxers’ size and knockout rates.”
The work of Bejan’s group was performed during the course “Constructal Theory and Design,” developed at Duke with the support of the National Science Foundation. Other members of the team were Duke’s Sylvie Lorente, James Royce, Dave Faurie, Tripp Parran, Michael Black and Brian Ash.
June 20, 2013
Differences between Martian meteorites and rocks examined by a NASA rover can be explained if Mars had an oxygen-rich atmosphere 4000 million years ago – well before the rise of atmospheric oxygen on Earth 2500 million years ago.
Scientists from Oxford University investigated the compositions of Martian meteorites found on Earth and data from NASA’s ‘Spirit’ rover that examined surface rocks in the Gusev crater on Mars. The fact that the surface rocks are five times richer in nickel than the meteorites was puzzling and had cast doubt on whether the meteorites are typical volcanic products of the red planet.
‘What we have shown is that both meteorites and surface volcanic rocks are consistent with similar origins in the deep interior of Mars but that the surface rocks come from a more oxygen-rich environment, probably caused by recycling of oxygen-rich materials into the interior,’ said Professor Bernard Wood, of Oxford University’s Department of Earth Sciences, who led the research reported in this week’s Nature.
‘This result is surprising because while the meteorites are geologically ‘young’, around 180 million to 1400 million years old, the Spirit rover was analysing a very old part of Mars, more than 3700 million years old.’
Whilst it is possible that the geological composition of Mars varies immensely from region to region, the researchers believe that it is more likely that the differences arise through a process known as subduction – in which material is recycled into the interior. They suggest that the Martian surface was oxidised very early in the history of the planet and that, through subduction, this oxygen-rich material was drawn into the shallow interior and recycled back to the surface during eruptions 4000 million years ago. The meteorites, by contrast, are much younger volcanic rocks that emerged from deeper within the planet and so were less influenced by this process.
Professor Wood said: ‘The implication is that Mars had an oxygen-rich atmosphere at a time, about 4000 million years ago, well before the rise of atmospheric oxygen on earth around 2500 million years ago. As oxidation is what gives Mars its distinctive colour it is likely that the ‘red planet’ was wet, warm and rusty billions of years before Earth’s atmosphere became oxygen rich.’
The research was supported by the Science and Technology Facilities Council and the European Research Council.
June 15, 2013
If you are healthy and plan to start running for the first time, it is perfectly all right to put on a pair of completely ordinary ‘neutral’ running shoes without any special support – even if your feet overpronate (roll inwards) when you run.
For this group of healthy beginners, there appears to be no increased risk that overpronation or underpronation will lead to running injuries when using neutral shoes.
This is the result of a study conducted at Aarhus University which has just been published in the British Journal of Sports Medicine under the title “Foot pronation is not associated with increased injury risk in novice runners wearing a neutral shoe”.
Healthy runners monitored for 12 months
Researchers have followed 927 healthy novice runners with different pronation types for a full year. All study participants received the same model of neutral running shoe, regardless of whether they had neutral foot pronation or not. During the study period, 252 people suffered an injury, and the runners ran a total of 163,401 km.
“We have now compared runners with neutral foot pronation with the runners who pronate to varying degrees, and our findings suggest that overpronating runners do not have a higher risk of injury than anyone else,” says physiotherapist and PhD student Rasmus Ø. Nielsen from Aarhus University, who has conducted the study together with a team of researchers from Aarhus University, Aarhus University Hospital, Aalborg University Hospital and the Netherlands.
“This is a controversial finding as it has been assumed for many years that it is injurious to run in shoes without the necessary support if you over/underpronate,” he says. Rasmus Ø. Nielsen emphasises that the study has not looked at what happens when you run in a pair of non-neutral shoes, and what runners should consider with respect to pronation and choice of shoe once they have already suffered a running injury.
Focus on other risk factors
The researchers are now predicting that in future we will stop regarding foot pronation as a major risk factor in connection with running injuries among healthy novice runners.
Instead, they suggest that beginners should consider other factors such as overweight, training volume and old injuries to avoid running injuries.
“However, we still need to research the extent to which feet with extreme pronation are subject to a greater risk of running injury than feet with normal pronation,” says Rasmus Ø. Nielsen.
Three key results
In the British Journal of Sports Medicine, the researchers point to three key results:
- The study contradicts the current assumption that over/underpronation in the foot leads to an increased risk of running injury if you run in a neutral pair of running shoes.
- The study shows that the risk of injury was the same for runners after the first 250 km, irrespective of their pronation type.
- The study shows that the number of injuries per 1,000 km of running was significantly lower among runners who over/underpronate than among those with neutral foot pronation.
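The injuries-per-1,000-km measure used in the third result is a simple incidence rate. The per-pronation-type breakdown is not reported here, but the study's pooled totals (252 injuries over 163,401 km of running) are, so a minimal sketch of the calculation, using only those published figures, looks like this:

```python
def injuries_per_1000_km(injuries: int, total_km: float) -> float:
    """Injury incidence rate: injuries per 1,000 km of running."""
    return injuries / total_km * 1000

# Totals reported in the Aarhus study: 927 novice runners,
# 252 injuries, 163,401 km run over the 12-month follow-up.
rate = injuries_per_1000_km(252, 163_401)
print(f"{rate:.2f} injuries per 1,000 km")  # ≈ 1.54 across all runners
```

Comparing this rate between pronation groups (rather than the pooled value computed here) is what underpins the study's finding that over/underpronators had significantly fewer injuries per 1,000 km than neutral-footed runners.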
The project has been conducted as a collaboration between PhD student Rasmus Nielsen, Associate Professor Henrik Sørensen, Associate Professor Ellen Aagaard Nøhr and Professor Erik Parner from the Department of Public Health at Aarhus University, Professor Martin Lind from the sports clinic at Aarhus University Hospital, Director of Research and Associate Professor of Orthopaedic Surgery at Aalborg University Hospital Sten Rasmussen, and researchers from the Netherlands.
The project is financed by Aarhus University, the Orthopaedic Research Unit at Aalborg University Hospital and the Danish Rheumatism Association.
May 22, 2013
Most elite athletes consider that doping substances “are effective” in improving performance, while recognising that they constitute cheating, can endanger health and entail an obvious risk of sanction. At the same time, the reasons athletes start to take doping substances include achieving athletic success, improving performance, financial gain, improving recovery and preventing nutritional deficiencies, as well as the fact that “other athletes also use them”.
These are some of the conclusions of a study conducted by researchers from the Department of Physical and Sports Education at the University of Granada. Their research has also shown a widespread belief among elite athletes that the fight against doping is inefficient and biased, and that the sanctions imposed “are not severe enough”.
In an article in the journal “Sports Medicine”, the most important publication in the field of Sport Sciences, researchers Mikel Zabala and Jaime Morente-Sánchez have analysed the attitudes, beliefs and knowledge about doping of elite athletes from all over the world. To this end, they conducted a literature review of 33 studies on the subject published between 2000 and 2011, in order to analyse the current situation and, on that basis, to develop specific, efficient anti-doping strategies.
Fewer controls in team-based sports
The results of the University of Granada study reveal that athletes participating in team-based sports appear to be less susceptible to using doping substances. However, the authors stress that in team sports anti-doping controls are clearly both quantitatively and qualitatively less exhaustive.
The study indicates that coaches seem to be the principal influence and source of information for athletes when it comes to starting or not starting to take banned substances, while doctors and other specialists are less involved. Athletes are becoming increasingly familiar with anti-doping rules, but there is still a lack of knowledge about the problems entailed in using banned substances and methods, which the researchers believe should be remedied through appropriate educational programmes.
Moreover, they also conclude that a substantial lack of information exists among elite athletes about dietary supplements and the secondary effects of performance-enhancing substances.
In the light of their results, the University of Granada researchers consider it necessary to plan and conduct information and prevention campaigns to influence athletes’ attitudes towards doping and the culture surrounding this banned practice. “We should not just dedicate money almost exclusively to performing anti-doping tests, as we currently do. To improve the situation, it would be enough to designate at least a small part of this budget to educational and prevention programmes that encourage athletes to reject the use of banned substances and methods”, Mikel Zabala and Jaime Morente-Sánchez conclude. In this context, one pioneering example in their opinion is the Spanish Cycling Federation’s “Preventing to win” project.
Doping in Sport: A Review of Elite Athletes’ Attitudes, Beliefs, and Knowledge.
Morente-Sánchez J, Zabala M.
Sports Medicine. 2013 Mar 27.
December 14, 2012
Olympic medallists live longer than the general population, regardless of country of origin, medal won, or type of sport played, finds a study published on bmj.com today in the BMJ’s Christmas issue.
A second study comparing athletes who trained at different physical intensities, found that those from high or moderate intensity sports have no added survival benefit over athletes from low intensity sports. But those who engage in disciplines with high levels of physical contact, such as boxing, rugby and ice hockey, are at an increased risk of death in later life, the data show.
An accompanying editorial adds that everyone could enjoy the “survival advantage” of elite athletes by just meeting physical activity guidelines.
In the first study, researchers compared life expectancy among 15,174 Olympic athletes who won medals between 1896 and 2010 with general population groups matched by country, sex, and age.
All medallists lived an average of 2.8 years longer – a significant survival advantage over the general population in eight out of the nine country groups studied.
Gold, silver and bronze medallists enjoyed roughly the same survival advantage, as did medallists in both endurance and mixed sports. Medallists in power sports had a smaller, but still significant, advantage over the general population.
The authors say that, although their study was not designed to determine why Olympic athletes live longer, “possible explanations include genetic factors, physical activity, healthy lifestyle, and the wealth and status that come from international sporting glory.”
In the second study, researchers measured the effect of high intensity exercise on mortality later in life among former Olympic athletes.
They tracked 9,889 athletes with a known age at death, who took part in at least one Olympic Games between 1896 and 1936. Together they represented 43 disciplines requiring different levels of exercise intensity and physical contact.
After adjusting for sex, year of birth and nationality, they found that athletes from sports with high cardiovascular intensity (such as cycling and rowing) or moderate cardiovascular intensity (such as gymnastics and tennis) had similar mortality rates compared with athletes from low cardiovascular intensity sports, such as golf or cricket.
However, the researchers did find an 11% increased risk of mortality among athletes from disciplines with a high risk of body collision and with high levels of physical contact, such as boxing, rugby and ice hockey, compared with other athletes. They suggest this reflects the impact of repeated collisions and injuries over time.
In an accompanying editorial, two public health experts point out that people who do at least 150 minutes a week of moderate to vigorous intensity physical activity also have a survival advantage compared with the inactive general population. Estimates range from just under a year to several years.
But they argue that, compared with the successes that have been achieved in tobacco control, “our inability to improve physical activity is a public health failure, and it is not yet taken seriously enough by many in government and in the medical establishment.”
“Although the evidence points to a small survival effect of being an Olympian, careful reflection suggests that similar health benefits and longevity could be achieved by all of us through regular physical activity. We could and should all award ourselves that personal gold medal,” they conclude.
August 25, 2012
College football exploits players in an “invisible labor market,” and the only plausible way for student-athletes to address their interests is the credible threat of unionization, according to research from a University of Illinois expert in labor relations and collective bargaining in athletics.
Since traditional collective bargaining is impractical for student-athletes, an “invisible union,” derived from what labor scholars call the “union substitution effect,” could be a viable way to circumnavigate the amateur-professional boundary that has become increasingly blurry in the multi-billion-dollar sport, says Michael LeRoy, a professor of law and of labor and employment relations at Illinois.
“College football players participate in an invisible labor market, meaning that the NCAA monopolizes their services by strictly limiting and allocating the labor force needed to play competitive games,” he said. “So without a credible threat of unionization by student-athletes, the NCAA has no reason to confront the fact that it is professionalizing college football.”
Although the NCAA’s contractual relationship with student-athletes provides grant-in-aid scholarships, it’s also a model premised on the belief that players are amateurs – a view that’s hard to square with the heavy commercialization of NCAA football, including a new championship series that will generate an immense new revenue stream, LeRoy says.
“While schools reap billions of dollars from TV and licensing agreements, championship tournaments, bowl games and ticket sales, players rarely receive enough aid to pay in full the cost of attending school,” he said. “And when TV deals coordinate NCAA and NFL schedules from August through January to minimize competition and maximize revenues, it is also hard not to conclude that Division I college football players are in the same product market as their professional counterparts.”
While the NCAA recently attempted to adopt reforms that would address some of the problems identified in LeRoy’s study, its board of trustees ultimately quashed any efforts to implement change.
“What that means is that players have no voice in their welfare,” LeRoy said. “These Saturday heroes are solely dependent on a monopoly to enact regulations for their welfare. This is the impetus for proposing collective bargaining for college football players.”
According to the study, college football players have a right to collective bargaining because they function like employees.
“Student-athletes generate great wealth for institutions but share in very little of it,” LeRoy said. “Additionally, they are subject to non-negotiable, one-sided agreements imposed by a monopoly. They receive less than a four-year scholarship; pay out-of-pocket or borrow money for scholarship shortfalls; and are penalized for transferring to other schools. They also happen to play in a violent sport that causes serious injury – on rare occasions, death – but are usually disqualified from worker’s compensation and are uninsured for long-term medical disabilities. And, of course, many student-athletes exhaust their eligibility without earning a degree.”
LeRoy’s study proposes a unique and limited form of collective bargaining customized for college football, one that does not involve wage negotiations or strikes, but draws from existing labor laws for public safety employees that prohibit strikes but allow final offer arbitration on a limited range of bargaining subjects.
“My concept stems from the fact that NCAA football differs significantly from the NFL model because college football players are bona fide student-athletes who must adhere to the tenets of amateurism,” he said.
According to the study, the mere threat of college football players unionizing would produce a “union substitution effect,” whereby employers respond to credible threats of collective action by providing participants with more of a say in their interests and better financial treatment.
“The status quo is ripe for change because the NCAA is an immense monopoly that generates new revenue quickly but reforms itself slowly,” he said. “An invisible union is a plausible middle-ground approach to address the interests of student-athletes. Without a credible threat of unionization, schools have little incentive to concede that they are essentially professionalizing college football.”
February 4, 2012
The U.S. may have its first black president and the Fortune 500 its first black female chief executive, but African American CEOs account for a mere one percent of the chiefs of those 500 largest companies.

Andrew Carton, assistant professor of management and organization at Penn State Smeal College of Business, and Ashleigh Shelby Rosette of Duke University, suggest in the current issue of the Academy of Management Journal that what steers people’s perceptions of African Americans are stereotypes about blacks’ leadership failings, biases that may not even be conscious.

The researchers found evidence of this phenomenon in a source seemingly remote from the corporate world — newspaper stories about college football quarterbacks. Buried in those press reports is a consistent pattern of associating losses with failed leadership when quarterbacks are black but not when they are white, and associating victories with quarterbacks’ native athletic ability when they are black but not when they are white.

“Evaluators adjust the way they use stereotypes according to performance outcomes,” the researchers report. “Specifically, negative leader-based stereotypes will be applied after [a black quarterback's] performance failure and non-leader compensatory stereotypes (i.e., black leaders succeed because of marginal qualities that ‘compensate’ for negative qualities) will be applied after performance success.” This stereotyping, Carton and Rosette observe, “may provide an important missing link in our understanding of bias against black leaders and may serve as an important contributor to barriers that impede the advancement of black leaders in organizations.”

The study owed its genesis in part to Carton’s own experience as a member of his college’s varsity football team. “I became aware of certain racial biases, and when I later enrolled as a graduate student at Duke, I mentioned my experience to Professor Rosette, whose research included bias in the workplace.”
Quarterbacks are a good focus for any research on leadership, because they have an executive role on the field that is unique in sports. The researchers analyzed newspaper reports over the course of a season for 119 teams in the Football Championship Subdivision, the highest level of competition in college football. They randomly sampled one story a week from the leading newspaper of each school’s locale, and coders unaware of the nature of the study were assigned to extract words or phrases that evaluated the quarterback and his performance — for example, where reporters cited a quarterback for “intelligence” or for being “fleet-footed.”

Evaluative text was identified for 113 quarterbacks, 82 white and 31 black. Analysis focused particularly on text that conveyed competence or incompetence and athleticism or its lack, the former two intimately related to leadership. Of special interest was how writers accounted for teams’ success in view of this presumption of black incompetence and whether they accounted for success or failure differently depending on quarterbacks’ race.

“Black quarterbacks were perceived to be significantly more incompetent than whites when their respective teams lost, but this difference was not found when their respective teams won,” the researchers said. For example, black quarterbacks of defeated teams were more likely than defeated white quarterbacks to be taken to task by reporters for making bad decisions under pressure.

To help rule out explanations other than bias for the difference in reporters’ perceptions of incompetence, the researchers looked for intellectual or scholastic factors. Neither the academic ratings of the colleges quarterbacks attended nor their grade point averages from high school were significantly associated with these perceptions.
Carton and Rosette say that one way to combat corporate CEO biases is for companies to institute “perception-based reform.” This might involve fostering one-on-one or small-group interactions that can serve to enhance people’s awareness of each other as individuals and not stereotypes. The researchers also suggest that black leaders themselves can make their colleagues and subordinates more aware of their qualifications and experience, and of biases caused by stereotyping.