Tech

Women more likely to take Bible literally, but that may be tied to intimacy, not gender

Women are more likely than men to believe the Bible is literally true, but a recent Baylor University study finds this may have more to do with how people relate to God than with gender. Both men and women who report high levels of closeness to God take the Bible more literally -- and this confidence grows stronger as they seek intimacy with God through prayer and Bible study.

The study -- "To Know and Be Known: An Intimacy-Based Explanation for the Gender Gap in Biblical Literalism" -- is published in the Journal for the Scientific Study of Religion.

"While previous research has shown that U.S. women are more likely to report biblical literalism than men, our study provides an explanation as to why those gender differences may exist," said study co-author Christopher M. Pieper, Ph.D., senior lecturer and undergraduate program director of sociology in Baylor's College of Arts & Sciences.

For the study, researchers analyzed data from 1,394 respondents in the national Baylor Religion Survey's third wave. The Baylor Religion Survey is the most extensive study of American religious attitudes, behaviors and beliefs ever conducted, carried out with initial funding from the John Templeton Foundation and in partnership with the Gallup Organization.

Respondents were asked, "Which statement comes closest to your personal beliefs about the Bible?" and chose one of these responses:

"The Bible means exactly it says. It should be taken literally, word-for-word, on all subjects."

"The Bible is perfectly true, but it should not be taken literally, word-for-word. We must interpret its meaning."

"The Bible contains some human error."

"The Bible is an ancient book of history and legends."

"I don't know."

Respondents also were asked to respond to each of these items about their attachment to God:

"I have a warm relationship with God."

"God knows when I need support."

"I feel that God is generally responsive to me."

"God feels impersonal to me."

"God seems to have little or no interest in my personal problems."

"God seems to have little or no interest in my personal affairs."

Several questions -- likened to the behavior of children who seek to build attachments with their primary caregiver -- asked about respondents' religious proximity-seeking behaviors:

"How often do you attend religious services at a place of worship?"

"Outside of attending religious services, about how often do you spend time alone reading the Bible?"

"About how often do you spend time alone praying outside of religious services?"

Pieper and co-author Blake Victor Kent, Ph.D., now a research fellow at Harvard Medical School and a former Baylor sociologist, noted that men are not inherently less capable of intimacy. Rather, men may seek lower intimacy in their relationships because of how boys are socialized through parents, peers and other cultural mechanisms. Boys, for example, are often taught to mask feelings of vulnerability, a key component of intimacy. At the same time, religious historians note that in the last century and a half, large swaths of U.S. Christianity have moved in a "sentimental" direction, making it difficult for some men to fully participate in emotion-laden religious worship and practices.

"Although this trend is changing, many men are still brought up to be wary of emotions and feeling vulnerable," Pieper said. "And some religious communities, particularly those with literalist cultures, put a high value on this feeling of intimacy, not only with God but with each other." However, he said, "Women are more often socialized to experience deep emotional energy when engaging intimately with God, so taking the Bible more literally makes sense because that way God is more like a person, someone you can talk to and who also talks back."

The authors noted one important exception in their data, in which a small sub-group reported increased intimacy with God despite a less literal view of the Bible. As for the reason, "we can only speculate, but a plausible account is that these respondents may have held a literalist view, later rejecting it after experiencing conflict with a fundamentalist or conservative religious background," Kent said. "Changing views on the moral acceptability of homosexuality, for example, might create discrepancies . . . These believers may come to find that interpreting the Bible -- rather than taking it literally -- is more compatible with pursuing an intimate relationship with God."

This group may be similar to what have been termed "exiles" -- those who grew up in the church and are now disconnected from it physically, but nevertheless remain energized in their personal beliefs, the researchers said.

Further study could be valuable in exploring the tie between literalism and attachment to God, the researchers said.

"While this study highlights an important link between emotional attachments and literalism, it is possible some still wrestle with the Bible intellectually prior to deepening their faith," Pieper said. "Famous examples of this can be seen, such as (British writer) C.S. Lewis or (physician geneticist) Francis Collins. This path is probably not the norm, but some people work to trust the claims of the faith intellectually prior to engaging in an intimate relationship with God."

Credit: 
Baylor University

Want to learn about dinosaurs? Pick up some Louisiana roadkill

image: The researchers salvaging tissue from a roadkill armadillo.

Image: 
Photo courtesy of Tom Cullen

Fossil-hunting can be grueling, but it's usually not gross. Paleontologists typically work with things that have been dead for millions of years, mineralized into rock and no longer smelly. At the end of a day in the field, the researchers just have to dust themselves off and wash muddy boots and sweaty clothes. But for a new study delving into the ecosystems that dinosaurs lived in, a team of paleontologists found themselves scraping swamp rabbits and armadillos off the Louisiana highway.

"We want to know what the dinosaurs ate and what their habitats were like, but before we can do that, we need to understand what their environment was like. We need to look at similar environments today," says Thomas Cullen, the lead author of the study in Royal Society Open Science and a postdoctoral researcher at Chicago's Field Museum. More specifically, Cullen and his team needed to look at the remains of animals living in those similar environments today. Even more specifically, they needed to look at the chemical makeup of those animals' remains.

"You are what you eat, more or less," explains Cullen, who completed his doctorate at the University of Toronto and the Royal Ontario Museum, under the supervision of co-author David Evans. "There are naturally occurring variants of elements, called stable isotopes--versions of elements' atoms that are lighter or heavier depending on how many neutrons they contain. Different plants contain different relative amounts of heavy or light stable isotopes for particular elements, such as carbon and oxygen, and when animals eat those plants, or eat other animals that eat those plants, they incorporate those isotopes into their tissues."

And those isotopes don't necessarily go away when an animal dies, or even when it's fossilized. Stable isotopes preserved in the bones or teeth of animals can stay intact for tens of millions of years, meaning that scientists could analyze the isotopes present in a fossil and get some sense of what that animal ate and where it fell in the food chain. Isotopes could ostensibly be a window into dinosaurs' lives.

There's a potential flaw in this premise, though--there's a lot that scientists don't know about the way dinosaurs and other extinct animals divided up habitat and resources in their ecosystems, and it's not clear how well this kind of isotopic analysis, often used for mammals living in drier forest or plains settings, would be able to pinpoint the diets of a mix of different animal groups living in a swampy, fragmented landscape. Cullen and his team's solution was to check the analytical method against a modern-day ecosystem similar to the ones that dinosaurs lived in, to see if the isotopic analysis could correctly predict the diets of modern animals.

"We wanted to check if the stable isotope methods that we typically have available really work for dinosaur paleontology, and particularly for characterizing ancient wetland ecosystems. So, we checked those methods in a modern coastal floodplain forest ecosystem where we already roughly know the right answers," says Cullen. "It was a due diligence exercise, and it involved playing with a lot of roadkill."

The team headed to the Atchafalaya River Basin, the nation's largest swamp. It's smack dab in the middle of Louisiana, and roads cutting through the swamps and bayous make for a steady supply of roadkill. "Originally, we had planned a trip down to the area to meet up with people from the Louisiana Department of Wildlife and Fisheries, researchers and collections staff at local universities and museums, and a local naturalist. But the Fish and Wildlife people were kind enough to grant us a special use permit to collect samples, so we spent a few days driving around and collecting roadkill in between meetings," says Cullen. "We didn't really expect it to be too successful, but it actually ended up being super effective. We collected something like 40-50 samples representing about 15 species in the first couple days of driving around." Dozens more followed, along with specimens provided by universities and museums, donated from local naturalists, and even including gar and alligator samples given by the Landry family, who star in the reality show Swamp People.

Collecting a hundred-odd roadkill specimens was a new challenge for the team, even though Cullen notes that "biologists have a reputation for constantly picking up dead things." "The first few were really gross," he recalls. "We found a raccoon that looked fine at first, but its fur was crawling with maggots, so we went to its mouth to pull out a tooth instead. The whole jaw slid out of the mouth. It was really revolting, but we got desensitized."

Once the tissue samples were back in the lab, the team analyzed them in a mass spectrometer, a machine that analyzes the atomic masses of elements present in a sample and provides ratios of heavy to light stable isotopes of the elements of interest. The results that came back were a little fuzzy--if the researchers hadn't gone in knowing what raccoons and gators eat, the analysis based on the isotopes wouldn't have been able to tell them precisely. "We found out that the system is not very cut and dry. Ideally, you'd be able to distinguish each animal by what it was eating and where in the habitat it was living. The problem is, out in that fragmented swampy land, animals have very diverse diets, so they don't segregate in nice little boxes," says Cullen. "With fossils, we can't go out and watch what an animal's eating or exactly what habitat it prefers, so we should be conservative with our interpretations from this sort of data."

With the resources currently available, conventional stable isotopes like carbon and oxygen might not be the perfect tool for piecing together the fine-scale diets and preferences of dinosaurs. However, Cullen notes, future research tracking changes in stable isotope patterns over longer time periods, and using isotopes that better track food web structure, may yield more promising results. "The conventional way of studying fossil stable isotope ecology, using carbon and oxygen, may not be as suitable for dinosaurs, or any animal, in coastal floodplain forests as we'd hoped," says Cullen. "Actually solving this problem may require the use of more exotic stable isotope systems that act the same way nitrogen does in soft tissues, by concentrating or depleting up a food chain, but that are able to preserve in hard tissues, and as a result fossilize."
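That nitrogen-style behavior is what makes food-web tracers attractive: tissue delta-15N typically climbs about 3.4 per mil per trophic step, so a consumer's trophic level can be estimated from measured values. A toy calculation using the textbook relation (all numbers below are invented, not from this study):

```python
# Trophic-level estimate from nitrogen isotopes, the kind of food-web
# signal Cullen hopes other isotope systems could preserve in fossils.
# Textbook relation; all values below are invented for illustration.

TROPHIC_ENRICHMENT = 3.4  # typical per mil rise in delta-15N per trophic level

def trophic_level(d15n_consumer: float, d15n_base: float, base_level: float = 1.0) -> float:
    return base_level + (d15n_consumer - d15n_base) / TROPHIC_ENRICHMENT

d15n_plant = 2.0      # hypothetical primary producer
d15n_predator = 12.2  # hypothetical top predator

print(trophic_level(d15n_predator, d15n_plant))  # -> 4.0, a top predator
```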

When asked if the study's ultimate goal, of finding better ways to learn what dinosaurs ate, was worth pulling the teeth out of rotting raccoons in the Louisiana sun, Cullen said, "The geological record gives us a chance to understand how ecology evolves and changes over time and responds to things like climate change. By understanding ecology on as fine a scale as possible, in both the past and the present, we can make predictions about what will happen in the future." Besides, he notes, "It's always cool to think about what dinosaurs were doing."

Credit: 
Field Museum

Indecision under pressure

Chestnut Hill, Mass. (2/19/2019) - In the latest wrinkle to be discovered in cubic boron arsenide, the unusual material contradicts the traditional rules that govern heat conduction, according to a new report by Boston College researchers in today's edition of the journal Nature Communications.

Usually, when a material is compressed, it becomes a better conductor of heat. That was first found in studies about a century ago. In boron arsenide, the research team found that when the material is compressed the conductivity first improves and then deteriorates.

The explanation is based on an unusual competition between different processes that provide heat resistance, according to co-authors David Broido, a professor, and Navaneetha K. Ravichandran, a postdoctoral fellow, in the Department of Physics at Boston College. This type of behavior has never been predicted or observed before.

The findings are consistent with the unconventional high thermal conductivity that Broido, a theoretical physicist, and colleagues have previously identified in cubic boron arsenide.

Ravichandran's calculations showed that upon compression, the material first conducts heat better, similar to most materials. But as compression increases, the ability of boron arsenide to conduct heat deteriorates, the co-authors write in the article, titled "Non-monotonic pressure dependence of the thermal conductivity of boron arsenide."

Such odd behavior stems from the unusual way in which heat is transported in boron arsenide, an electrically insulating crystal in which heat is carried by phonons -- vibrations of the atoms making up the crystal, Broido said. "Resistance to the flow of heat in materials like boron arsenide is caused by collisions occurring among phonons," he added.

Quantum physics shows that these collisions occur between at least three phonons at a time, he said. For decades, it had been assumed that only collisions between three phonons were important, especially for good heat conductors.

Cubic boron arsenide is unusual in that most of the heat is transported by phonons that rarely collide in triplets, a feature predicted several years ago by Broido and collaborators, including Lucas Lindsay at Oak Ridge National Laboratory and Tom Reinecke of the Naval Research Lab.

In fact, collisions between three phonons are so infrequent in boron arsenide that those between four phonons, which had been expected to be negligible, compete to limit the transport of heat, as shown by other theorists, and by Broido and Ravichandran in earlier publications.

As a result of such rare collision processes among phonon triplets, cubic boron arsenide has turned out to be an excellent thermal conductor, as confirmed by recent measurements.

Drawing on these latest insights, Ravichandran and Broido have shown that by applying hydrostatic pressure, the competition between three-phonon and four-phonon collisions can, in fact, be modulated in the material.

"When boron arsenide is compressed, surprisingly, three-phonon collisions become more frequent, while four-phonon interactions become less frequent, causing the thermal conductivity to first increase and then decrease," Ravichandran said. "Such competing responses of three-phonon and four-phonon collisions to applied pressure has never been predicted or observed in any other material,".

The work of the theorists, supported by a Multi-University Research Initiative grant from the Office of Naval Research, is expected to be taken up by experimentalists to prove the concept, Broido said.

"This scientific prediction awaits confirmation from measurement, but the theoretical and computational approaches used have been demonstrated to be accurate from comparisons to measurements on many other materials, so we're confident that experiments will measure behavior similar to what we found." said Broido.

"More broadly, the theoretical approach we developed may also be useful for studies of the earth's lower mantle where very high temperatures and pressures can occur," said Ravichandran. "Since obtaining experimental data deep in the Earth is challenging, our predictive computational model can help give new insights into the nature of heat flow at the extreme temperature and pressure conditions that exist there."

Credit: 
Boston College

Altered brain activity patterns of Parkinson's captured in mice

video: OIST scientists have gained new insight into the abnormal brain activity underlying Parkinson's disease. In a mouse model of the disease, the scientists observed that one group of neurons in a particular brain region -- the striatum -- fire in sync and dominate the overall activity of that region. When striatal neurons are stimulated continuously, they exhibit this abnormal behavior. By stimulating the neurons in precise pulses, the researchers observed that the neurons returned to their normal activity pattern, with groups of cells firing in turn.

Image: 
OIST

The tell-tale tremors of Parkinson's disease emerge from abnormal activity in a brain region crucial for voluntary movement. Using a mouse model of the disease, researchers at the Okinawa Institute of Science and Technology Graduate University (OIST) identified unusual patterns of brain activity that appear to underlie its signature symptoms.

Parkinson's disrupts the basal ganglia, a set of nuclei that relays information from the wrinkled cortex to brain areas important for movement control. A nucleus known as the striatum acts as the primary input hub for the entire structure. Marked by a steep decline in the chemical messenger dopamine and cells that make it, Parkinson's robs the basal ganglia of the tools it needs to function properly and pushes the striatum into pathological hyperactivity.

"When you take dopamine away, the cells reorganize, and that reorganization leads to most of the symptoms of Parkinson's," said Prof. Gordon Arbuthnott, senior author of the study and principal investigator of the OIST Brain Mechanism for Behaviour Unit. The research, published online on January 30, 2019 by the European Journal of Neuroscience, suggests that the striatal neurons' normal pattern of activity warps when the cells are starved of dopamine. The pattern becomes dominated by one particular subset of cells, often firing in sync. Mice with this pattern of brain activity turn in repetitive circles, which is typical in mouse models of Parkinson's.

The scientists went one step further: By fitting neurons with light-sensitive proteins, they were able to dial up the striatal activity of normal mice by exposing them to light. Remarkably, the neurons reacted differently to different patterns of light. Continuous light recreated the abnormal, synchronized pattern and caused mice to turn in one direction. But pulsed light excited fewer neurons, triggered more typical striatal activity, and actually caused mice to turn in the other direction.

"The fact that, when I changed the stimulation, the animal turned to the opposite side--that was shocking," said Dr. Omar Jaidar, first author of the study and a postdoctoral scholar in the Arbuthnott Unit at the time of the research. (Jaidar is now a postdoc with Prof. Jun Ding at Stanford University.) The observation suggests that the Parkinson's-like symptoms emerge due to the strong activation of many neurons in sync, unchecked by modulatory dopamine signals.

"You need a certain sequence of muscles to contract to execute any movement, and it's the same with [striatal] neurons," said Jaidar. Groups of striatal neurons tend to fire in sequence, splitting their work fairly evenly. If this pattern of activity isn't maintained, he said, the striatum cannot function normally. "People have to dig deeper into the sequence of neurons firing; it could be important to future therapies."

Challenging Old Models of Parkinson's

These results contradict existing models of Parkinson's disease, which focus on how the condition affects different types of neurons. Two types of cells, known as striatal output neurons, receive dopamine in the striatum and react to its signal. The first type, called D1 neurons, is thought to help initiate movements, while D2 neurons suppress them. Normally, the cells work together in harmony, modulating movement in real time like the gas pedal and brake in a car. Many models suggest that D2 neurons are overactive in Parkinson's and inhibit movement to the point of causing stiffness, tremors and even freezing.

But reality may not be so simple.

"Cells don't just have 'plus' and 'minus'--they have analog signals," said Arbuthnott. "And if you have a large group of them, the signal is even more complex." The new study demonstrated that both D1 and D2 neurons contribute equally to the abnormal brain activity seen in Parkinson's models. It's not a matter of D2 overpowering D1, but rather, the entire system warps in the absence of dopamine.

By understanding how patterns of activity change on a circuit level, scientists may be able to develop better interventions for Parkinson's. For instance, this line of research might demystify how deep brain stimulation helps quell parkinsonian symptoms. Is there a better way to electrically stimulate the brain and improve the quality of life for patients?

"The whole circuit is still there, intact," said Jaidar. "The only thing that's missing are those modulatory processes, and that's where therapies come in."

Credit: 
Okinawa Institute of Science and Technology (OIST) Graduate University

Invasive species are more likely to spread into communities not adapted to climate change

Climate change increases environmental fluctuations and extreme weather phenomena, which generate many kinds of problems in nature. A study conducted at the University of Jyvaskyla has shown that varying environmental conditions increase the potential for invasive species to spread.

Invasive species are species that spread to new areas as a result of human activity; they can displace local species and harm not only nature but also forestry and agriculture.

A laboratory experiment indicated that alien species are most likely to invade under varying environmental conditions. The risk of invasion increases further if the local species community has not adapted to that variation.

"It is very difficult to study and test species adaptation in natural populations, because adapting to different environments takes time. This is why we decided to use bacteria as a test species. Conditions during invasion are easily manipulated and we can also create strains of bacteria that have or have not adapted to environmental conditions. This versality of the system gives us possiblity to study different sorts of theoretical scenarios of invasions very efficiently. In the wild such phenomena are nearly impossible to study in detail" says academy research fellow Tarmo Ketola from the University of Jyvaskyla.

Although laboratory experiments are simplifications of nature, this one shows that increased environmental variation can lead to greater problems with invasive species. Communities that are poorly adapted to the changes are more susceptible to the spread of invaders, especially if their habitat changes strongly at the same time. Earlier research emphasized how an invasive species' own adaptation dictates its spread to new areas; this study is the first to show that how well the resident community is adapted to prevailing environmental conditions also matters.

Credit: 
University of Jyväskylä - Jyväskylän yliopisto

Terahertz wireless makes big strides in paving the way to technological singularity

image: Medical AI and doctors at earth stations could remotely conduct a zero-gravity operation aboard a space plane connected via terahertz wireless links.

Image: 
©HIROSHIMA UNIVERSITY, NICT, PANASONIC, AND 123RF.COM

Hiroshima, Japan, February 19, 2019--Hiroshima University, National Institute of Information and Communications Technology, and Panasonic Corporation announced the successful development of a terahertz (THz) transceiver that can transmit or receive digital data at 80 gigabits per second (Gbit/s). The transceiver was implemented using silicon CMOS integrated circuit technology, which would have a great advantage for volume production. Details of the technology will be presented at the International Solid-State Circuits Conference (ISSCC) 2019 to be held from February 17 to February 21 in San Francisco, California [1].

The THz band is a new and vast frequency resource expected to be used for future ultrahigh-speed wireless communications. IEEE Standard 802.15.3d, published in October 2017, defines the use of the lower THz frequency range between 252 gigahertz (GHz) and 325 GHz (the "300-GHz band") as high-speed wireless communication channels. The research group has developed a single-chip transceiver that achieves a communication speed of 80 Gbit/s using channel 66 as defined by the standard. In the past few years, the group developed a 300-GHz-band transmitter chip capable of 105 Gbit/s [2] and a receiver chip capable of 32 Gbit/s [3]; it has now integrated a transmitter and a receiver into a single transceiver chip.

"We presented a CMOS transmitter that could do 105 Gbit/s in 2017, but the performance of receivers we developed, or anybody else did for that matter, were way behind [3] for a reason. We can use a technique called 'power combining' in transmitters for performance boosting, but the same technique cannot be applied to receivers. An ultrafast transmitter is useless unless an equally fast receiver is available. We have finally managed to bring the CMOS receiver performance close to 100 Gbit/s," said Prof. Minoru Fujishima, Graduate School of Advanced Sciences of Matter, Hiroshima University.

"People talk a lot about technological singularity these days. The main point of interest seems to be whether artificial superintelligence will appear. But a more meaningful question to ask myself as an engineer is how we can keep the ever-accelerating technological advancement going. That's a prerequisite. Advances in not only computational power but also in communication speed and capacity within and between computers are vitally important. You wouldn't want to have a zero-grav operation on board a space plane without real-time connection with earth stations staffed by medical super-AI and doctors. After all, singularity is a self-fulfilling prophecy. It's not something some genius out there will make happen all of a sudden. It will be a distant outcome of what we develop today and tomorrow," said Prof. Fujishima.

"Of course, there still is a long way to go, but I hope we are steadily paving the way to such a day. And don't you worry you might use up your ten-gigabyte monthly quota within hours, because your monthly quota then will be in terabytes," he added.

Credit: 
Hiroshima University

How our plants have turned into thieves to survive

Scientists have discovered that grasses are able to shortcut evolution by taking genes from their neighbours.

The findings suggest wild grasses are naturally genetically modifying themselves to gain a competitive advantage.

Understanding how this is happening may also help scientists reduce the risk of genes escaping from GM crops and creating so-called "super-weeds" -- which can happen when genes from GM crops transfer into local wild plants, making them herbicide resistant.

Since Darwin, much of the theory of evolution has been based on common descent, where natural selection acts on the genes passed from parent to offspring. However, researchers from the Department of Animal and Plant Sciences at the University of Sheffield have found that grasses are breaking these rules. Lateral gene transfer allows organisms to bypass evolution and skip to the front of the queue by using genes that they acquire from distantly related species.

"Grasses are simply stealing genes and taking an evolutionary shortcut," said Dr Luke Dunning.

"They are acting as a sponge, absorbing useful genetic information from their neighbours to out compete their relatives and survive in hostile habitats without putting in the millions of years it usually takes to evolve these adaptations."

Scientists looked at grasses -- some of the most economically and ecologically important plants on Earth, including many of the most widely cultivated crops such as wheat, maize, rice, barley, sorghum and sugar cane.

The paper, published in the journal Proceedings of the National Academy of Sciences, explains how scientists sequenced and assembled the genome of the grass Alloteropsis semialata.

Studying the genome of the grass Alloteropsis semialata -- which is found across Africa, Asia and Australia -- researchers were able to compare it with those of approximately 150 other grasses (including rice, maize, millets, barley and bamboo). They identified genes in Alloteropsis semialata that were laterally acquired by comparing the similarity of the DNA sequences that make up the genes.
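The comparison logic can be sketched with a toy example: a gene that is far more similar to a distantly related grass than to the species' close relatives is a candidate lateral transfer. The sequences below are invented, and the actual study works with genome-scale alignments and phylogenetic tests rather than raw percent identity:

```python
# Toy illustration of spotting a laterally acquired gene by sequence
# similarity (invented sequences; real analyses use phylogenetics).

def percent_identity(seq_a: str, seq_b: str) -> float:
    """Percent of matching positions between two aligned, equal-length sequences."""
    matches = sum(a == b for a, b in zip(seq_a, seq_b))
    return 100.0 * matches / len(seq_a)

gene_in_alloteropsis   = "ATGGCTATCGGATCA"
version_close_relative = "ATGATTTACGGTACA"  # close relative: low identity
version_distant_grass  = "ATGGCTATCGGAACA"  # distant grass: high identity

close = percent_identity(gene_in_alloteropsis, version_close_relative)
distant = percent_identity(gene_in_alloteropsis, version_distant_grass)

if distant > close:
    print(f"candidate lateral transfer: {distant:.0f}% vs {close:.0f}% identity")
```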

"We also collected samples of Alloteropsis semialata from tropical and subtropical places in Asia, Africa and Australia so that we could track down when and where the transfers happened," said Dr Dunning.

"Counterfeiting genes is giving the grasses huge advantages and helping them to adapt to their surrounding environment and survive - and this research also shows that it is not just restricted to Alloteropsis semialata as we detected it in a wide range of other grass species"

"This research may make us as a society reconsider how we view GM technology as grasses have naturally exploited a similar process.

"Eventually, this research may also help us to understand how genes can escape from GM crops to wild species or other non-GM crops, and provide solutions to reduce the likelihood of this happening."

"The next step is to understand the biological mechanism behind this phenomenon and we will carry out further studies to answer this."

Credit: 
University of Sheffield

Hot great white sharks could motor but prefer to swim slow

Yuuki Watanabe has always been fascinated by speed and power. As a child, he recalls being transfixed by the raw strength of great white sharks (Carcharodon carcharias). 'They look cool,' says Watanabe, from the National Institute of Polar Research, Japan. However, he now has another reason for being in awe of these charismatic predatory sharks: 'they are an endothermic fish', he says. In other words, they maintain a warmer body temperature than the surrounding water - in contrast to most fish, which simply go with the thermal flow. This relatively warm-bodied lifestyle should allow them to swim at much higher speeds than their cold-blooded contemporaries. Yet, no one had successfully recorded the great white's behaviour to find out how their relatively warm lifestyle influences their activity. Would their warm muscles allow them to live up to their high-speed reputation? It turns out that although the fish could swim fast, they opt for lower speeds when hunting for fat seal snacks and the team publish this discovery in Journal of Experimental Biology at http://jeb.biologists.org.

Heading to the Neptune Islands Group (Ron and Valerie Taylor) Marine Park off South Australia - popular with tourists keen to dive with the great whites, which congregate there to dine on seals - Watanabe and his colleagues, Nicholas Payne, Jayson Semmens, Andrew Fox and Charlie Huveneers, prepared tags equipped with sensors to detect the sharks' movements, swim speeds and depth. However, Watanabe admits that tagging the colossal fish could be frustrating. 'They needed to swim nice and slow, close to the boat with their dorsal fins breaking the surface, but we rarely had such ideal situations', he smiles. And when the team wanted to get in alongside the sharks to determine their size and sex, Huveneers explains that they had to climb into a cage for protection. 'When seen from the boat, they look very aggressive, but I was surprised how elegantly they swim when seen underwater', says Watanabe.

After the tags detached automatically a couple of days later, the team successfully retrieved nine from the water, with the 10th eventually washing ashore several weeks later. Although one shark did manage to hit the higher speeds (2 m/s) that the team anticipated - presumably when commuting between the islands - Watanabe was surprised that the sharks that lingered near the seal colony were swimming quite sluggishly (0.8 to 1.35 m/s). He adds that this was unexpected, because swimming becomes costly and inefficient at very low speeds. However, Watanabe suggests that the animals may benefit from their profligacy when moseying along by increasing their chance of encountering a fat seal to dine on.

'This strategy is as close to a 'sit-and-wait' strategy as is possible for perpetual swimmers, such as white sharks', says Watanabe, who also analysed the sharks' manoeuvres when they dived. Noticing that the animals were clearly gliding as they descended, he compared the amount of energy consumed by the sharks during dives with their exertions when swimming at the surface, and it was clear that diving is a far more economical mode of transport than battling through waves at the surface. However, Watanabe suspects that instead of diving to conserve energy, sharks descend primarily in pursuit of seals.

So, even though great white sharks are capable of swimming fast thanks to their warm muscles, living fast may not always be a benefit when waiting for dinner, and Watanabe is now keen to find out how often the super predators successfully snatch a seal for a fat snack.

Credit: 
The Company of Biologists

Illinois researchers first to show hinge morphology of click beetle's latch mechanism

video: University of Illinois Mechanical Science and Engineering professor Aimy Wissa and her colleagues at Illinois were the first to study the click beetle's unique ability to jump without using its legs, unlike most animals and insects. The beetle's body is segmented by a hinge, which allows it to invert and then flex into a near-vertical jump, a way to reposition itself after becoming inverted. The researchers are creating a self-righting mechanism for autonomous robots inspired by the click beetle, and have led the way for several years in this novel concept.

Image: 
University of Illinois at Urbana-Champaign Department of Mechanical Science and Engineering

Aimy Wissa, assistant professor of mechanical science and engineering (MechSE) at Illinois, leads an interdisciplinary research team to study click beetles to inspire more agile robots. The team, which includes MechSE Assistant Professor Alison Dunn and Dr. Marianne Alleyne, a research scientist in the Department of Entomology, recently presented their ongoing and novel work on the quick release mechanism of click beetles at the 2019 Society for Integrative and Comparative Biology (SICB) Annual Meeting.

Ophelia Bolmin, a graduate student in Wissa's Bio-inspired Adaptive Morphology (BAM) Lab, presented novel synchrotron X-ray footage that showed the internal latch mechanism of the click beetle, and demonstrated for the first time to the scientific community how the hinge morphology and mechanics enable this unique clicking mechanism. The presentation, "The click beetle latch mechanism: An in-vivo study using synchrotron X-rays," was part of an invited symposium on mechanisms of energy flow in organismal movement.

This work builds on research that the Illinois team initiated nearly two years ago, detailing the click beetles' legless self-righting jumping mechanism. The team has already built prototypes of a hinge-like, spring-loaded device that are being incorporated into a robot.

Little research had previously been performed on the click beetle's click mechanism, and the Illinois team is the first to explore the insect within the field of bio-inspiration -- using inspiration from nature for innovative engineered designs. They continue to be at the forefront of this research, and further studies are scheduled to be published in coming months.

Credit: 
University of Illinois Grainger College of Engineering

A hidden source of air pollution? Your daily household tasks

Cooking, cleaning and other routine household activities generate significant levels of volatile and particulate chemicals inside the average home, leading to indoor air quality levels on par with a polluted major city, University of Colorado Boulder researchers say.

What's more, airborne chemicals that originate inside a house don't stay there: Volatile organic compounds (VOCs) from products such as shampoo, perfume and cleaning solutions eventually escape outside and contribute to ozone and fine particle formation, making up an even greater source of global atmospheric air pollution than cars and trucks do.

The previously underexplored relationship between households and air quality drew focus today at the 2019 AAAS Annual Meeting in Washington, D.C., where researchers from CU Boulder's Cooperative Institute for Research in Environmental Sciences (CIRES) and the university's Department of Mechanical Engineering presented their recent findings during a panel discussion.

"Homes have never been considered an important source of outdoor air pollution and the moment is right to start exploring that," said Marina Vance, an assistant professor of mechanical engineering at CU Boulder. "We wanted to know: How do basic activities like cooking and cleaning change the chemistry of a house?"

In 2018, Vance co-led the collaborative HOMEChem field campaign, which used advanced sensors and cameras to monitor the indoor air quality of a 1,200-square-foot manufactured home on the University of Texas at Austin campus. Over the course of a month, Vance and her colleagues conducted a variety of daily household activities, including cooking a full Thanksgiving dinner in the middle of the Texas summer.

While the HOMEChem experiment's results are still pending, Vance said that it's apparent that homes need to be well ventilated while cooking and cleaning, because even basic tasks like boiling water over a stovetop flame can contribute to high levels of gaseous air pollutants and suspended particulates, with negative health impacts.

To her team's surprise, the measured indoor concentrations were high enough that their sensitive instruments needed to be recalibrated almost immediately.

"Even the simple act of making toast raised particle levels far higher than expected," Vance said. "We had to go adjust many of the instruments."

Indoor and outdoor experts are collaborating to paint a more complete picture of air quality, said Joost de Gouw, a CIRES Visiting Professor. Last year, de Gouw and his colleagues published results in the journal Science showing that regulations on automobiles had pushed transportation-derived emissions down in recent decades while the relative importance of household chemical pollutants had only gone up.

"Many traditional sources like fossil fuel-burning vehicles have become much cleaner than they used to be," said de Gouw. "Ozone and fine particulates are monitored by the EPA, but data for airborne toxins like formaldehyde and benzene and compounds like alcohols and ketones that originate from the home are very sparse."

While de Gouw says that it is too early in the research to make recommendations on policy or consumer behavior, he said that it's encouraging that the scientific community is now thinking about the "esosphere," derived from the Greek word 'eso,' which translates to 'inner.'

"There was originally skepticism about whether or not these products actually contributed to air pollution in a meaningful way, but no longer," de Gouw said. "Moving forward, we need to re-focus research efforts on these sources and give them the same attention we have given to fossil fuels. The picture that we have in our heads about the atmosphere should now include a house."

Credit: 
University of Colorado at Boulder

Virus-infected bacteria could provide help in the fight against climate change

video: Alison Buchan summarizes her research about viruses and microbes ahead of a presentation for the AAAS 2019 Conference.

Image: 
University of Tennessee, Knoxville

Viruses don't always kill their microbial hosts. In many cases, they develop a mutually beneficial relationship: the virus establishes itself inside the microbe and, in return, grants its host immunity against attack by similar viruses.

Understanding this relationship is beneficial not only for medical research and practical applications but also in marine biology, says Alison Buchan, Carolyn W. Fite Professor of Microbiology at the University of Tennessee, Knoxville.

"Marine microbes are uniquely responsible for carrying out processes that are essential for all of earth's biogeochemical cycles, including many that play a role in climate change," she said.

Buchan will explain some of these interactions on Sunday, February 17, during the annual meeting of the American Association for the Advancement of Science in Washington, D.C.

Her talk, "It's Only Mostly Dead: Deciphering Mechanisms Underlying Virus-Microbe Interactions," will be part of the scientific session titled Viruses, Microbes and Their Entangled Fates.

The function of a microbial community is in large part dictated by its composition: what microbes are present and how many of each.

Within the community, bacteria compete with one another for resources. In the course of this fight, some bacteria produce antibiotics and use them against other types of bacteria. This kind of interaction has been known for some time.

But there is another fight strategy that scientists like Buchan are just now considering: bacteria might use the viruses that infect them as weapons against other types of microbes.

"We have recently discovered that while they are in the process of dying, microbes can produce new viruses that then go to attack their original invader. This is a form of resistance we had not observed before," said Buchan.

This type of competitive interaction, Buchan said, is important for stabilizing the size of microbial populations in marine systems. This balance may be crucial for biogeochemical processes, including many related to climate change.

During Sunday's presentation, Buchan will be sharing the stage with Joshua Weitz, professor of theoretical ecology and quantitative biology at the Georgia Institute of Technology, and Matthew Sullivan, associate professor of microbiology and civil, environmental, and geodetic engineering at the Ohio State University.

Credit: 
University of Tennessee at Knoxville

Indigenous hunters have positive impacts on food webs in desert Australia

image: This is a drawing of banded hare wallabies from John Gould's Mammals of Australia, 1845-63.

Image: 
Public Domain

Australia has the highest rate of mammal extinction in the world. Resettlement of indigenous communities resulted in the spread of invasive species, the absence of human-set fires, and a general cascade in the interconnected food web that led to the largest mammalian extinction event ever recorded. In this case, the absence of direct human activity on the landscape may be the cause of the extinctions, according to a Penn State anthropologist.

"I was motivated by the mystery that has occurred in the last 50 years in Australia," said Rebecca Bliege Bird, professor of anthropology, Penn State. "The extinction of small-bodied mammals does not follow the same pattern we usually see with people changing the landscape and animals disappearing."

Australia's Western Desert, where Bird and her team work, is the homeland of the Martu, the traditional owners of a large region of the Little and Great Sandy Desert. During the mid-20th century, many Martu groups were first contacted in the process of establishing a missile testing range and resettled in missions and pastoral stations beyond their desert home. During their hiatus from the land, many native animals went extinct.

In the 1980s, many families returned to the desert to reestablish their land rights. They returned to livelihoods centered around hunting and gathering. Today, in a hybrid economy of commercial and customary resources, many Martu continue their traditional subsistence and burning practices in support of cultural commitments to their country.

Twenty-eight Australian endemic land mammal species have become extinct since European settlement. Local extinctions of mammals include the burrowing bettong and the banded hare wallaby, both of which were ubiquitous in the desert before the indigenous exodus, Bird told attendees at the 2019 annual meeting of the American Association for the Advancement of Science today (Feb. 17) in Washington, D.C.

"During the pre-1950, pre-contact period, Martu had more generalized diets than any animal species in the region," said Bird. "When people returned, they were still the most generalized, but many plant and animal species were dropped from the diet."

She also notes that prior to European settlement, the dingo, a native Australian dog, was part of Martu life. The patchy landscape created by Martu hunting fires may have been important for dingo survival. Without people, the dingo did not flourish and could not exclude populations of smaller invasive predators -- cats and foxes -- that threatened to consume all the native wildlife.

Bird and her team looked at the food webs -- interactions of who eats what and who feeds whom, including humans -- for the pre-contact and for the post-evacuation years. Comparisons of these webs show that the absence of indigenous hunters in the web makes it easier for invasive species to infiltrate the area and for some native animals to become endangered or extinct. This is most likely linked to the importance of traditional landscape burning practices, said Bird.

Indigenous Australians in the arid center of the continent often use fire to facilitate their hunting success. Much of Australia's arid center is dominated by a hummock grass called spinifex.

In areas where Martu hunt more actively, hunting fires increase the patchiness of vegetation at different stages of regrowth and buffer the spread of wildfires. Spinifex grasslands where Martu do not often hunt exhibit a fire regime with much larger fires. Under an indigenous fire regime, the patchiness of the landscape boosts populations of native species such as the dingo, monitor lizard and kangaroo, even after accounting for mortality due to hunting.

"The absence of humans creates big holes in the network," said Bird. "Invading becomes easier for invasive species and it becomes easier for them to cause extinctions."
The National Science Foundation and the Max Planck Institute for Evolutionary Anthropology supported this work.

Credit: 
Penn State

How do we conserve and restore computer-based art in a changing technological environment?

Software- and computer-based works of art are fragile--not unlike their canvas counterparts--as their underlying technologies such as operating systems and programming languages change rapidly, placing these works at risk.

These include Shu Lea Cheang's Brandon (1998-99), Mark Napier's net.flag (2002), and John F. Simon Jr.'s Unfolding Object (2002), three online works recently conserved at the Solomon R. Guggenheim Museum, through a collaboration with New York University's Courant Institute of Mathematical Sciences.

Fortunately, just as conservators have developed methods to protect traditional artworks, computer scientists, in collaboration with time-based media conservators, have created means to safeguard computer- or time-based art by following the same preservation principles.

"The principles of art conservation for traditional works of art can be applied to decision-making in conservation of software- and computer-based works of art with respect to programming language selection, programming techniques, documentation, and other aspects of software remediation during restoration," explains Deena Engel, a professor of computer science at New York University's Courant Institute of Mathematical Sciences.

Since 2014, she has been working with the Guggenheim Museum's Conservation Department to analyze, document, and preserve computer-based artworks from the museum's permanent collection. In 2016, the Guggenheim took more formal steps to ensure the stature of these works by establishing Conserving Computer-Based Art (CCBA), a research and treatment initiative aimed at preserving software and computer-based artworks held by the museum.

"As part of conserving contemporary art, conservators are faced with new challenges as artists use current technology as media for their artworks," says Engel. "If you think of a word processing document that you wrote 10 years ago, can you still open it and read or print it? Software-based art can be very complex. Museums are tasked with conserving and exhibiting works of art in perpetuity. It is important that museums and collectors learn to care for these vulnerable and important works in contemporary art so that future generations can enjoy them."

Under this initiative, a team led by Engel and Joanna Phillips, former senior conservator of time-based media at the Guggenheim Museum, and including conservation fellow Jonathan Farbowitz and Lena Stringari, deputy director and chief conservator at the Guggenheim Museum, explores and implements both technical and theoretical approaches to the treatment and restoration of software-based art.

In doing so, they not only strive to maintain the functionality and appeal of the original works, but also follow the ethical principles that guide conservation of traditional artwork, such as sculptures and paintings. Specifically, Engel and Phillips adhere to the American Institute for Conservation of Historic and Artistic Works' Code of Ethics, Guidelines for Practice, and Commentaries, applying these standards to artistic creations that rely on software as a medium.

"For example, if we migrate a work of software-based art from an obsolete programming environment to a current one, our selection and programming decisions in the new programming language and environment are informed in part by evaluating the artistic goals of the medium first used," explains Engel. "We strive to maintain respect for the artist's coding style and approach in our restoration."

So far, Phillips and Engel have completed two restorations of on-line artworks at the museum: Cheang's Brandon (restored in 2016-2017) and Simon's Unfolding Object (restored in 2018).

Commissioned by the Guggenheim in 1998, Brandon was the first of three web artworks acquired by the museum. Many features of the work had begun to fail within the fast-evolving technological landscape of the Internet: specific pages were no longer accessible, text and image animations no longer displayed properly, and internal and external links were broken. Through changes implemented by CCBA, Brandon fully resumes its programmed, functional, and aesthetic behaviors. The newly restored artwork can again be accessed at http://brandon.guggenheim.org.

Unfolding Object enables visitors from across the globe to create their own individual artwork online by unfolding the pages of a virtual "object"--a two-dimensional rectangular form--click by click, creating a new, multifaceted shape. Users may also see traces left by others who have previously unfolded the same facets, represented by lines or hash marks. The colors of the object and the background change depending on the time of day, so that two simultaneous users in different time zones are looking at different colors. But because the Java technology used to develop this early Internet artwork is now obsolete, the work was no longer supported by contemporary web browsers and is not easily accessible online.

The CCBA team, in dialogue with the artist, analyzed and documented the artwork's original source code and aesthetic and functional behaviors before identifying a treatment strategy. The team determined that a migration from the obsolete Java applet code to the contemporary programming language JavaScript was necessary. In place of a complete rewriting of the code, a treatment that art conservators would deem invasive, the CCBA team developed a new migration strategy more in line with contemporary conservation ethics, "code resituation," which preserves as much of the original source code as possible.

Credit: 
New York University

Engineered metasurfaces reflect waves in unusual directions

image: Photo of the actual metasurface.

Image: 
Aalto University

In our daily lives, we can find many examples of manipulation of reflected waves, such as mirrors to see our reflections or reflective surfaces for sound that improve auditorium acoustics. When a wave impinges on a reflective surface with a certain angle of incidence and the energy is sent back, the angle of reflection is equal to the angle of incidence. This classical reflection law is valid for any homogeneous surface. Researchers at Aalto University have developed new metasurfaces for the arbitrary manipulation of reflected waves, essentially breaking the law to engineer the reflection of a surface at will.

Metasurfaces are artificial structures composed of periodically arranged meta-atoms at a subwavelength scale. Meta-atoms are made of traditional materials, but when they are placed in a periodic manner, the surface can show many unusual effects that cannot be realized by materials found in nature. In their article published 15 February 2019 in Science Advances, the researchers use power-flow conformal metasurfaces to engineer the direction of reflected waves.
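For background, the simplest metasurface route to "breaking" the reflection law is to impose a phase gradient along the surface, which tilts the reflected wave according to the textbook generalized law of reflection. The sketch below illustrates that first-order picture for sound; the Aalto team's power-flow conformal designs go beyond it to achieve high efficiency, so treat this only as context, and note the phase-gradient value is invented:

```python
import numpy as np

# Generalized law of reflection for a gradient metasurface (textbook
# first-order relation, not the power-flow conformal design itself):
# sin(theta_r) = sin(theta_i) + (dPhi/dx) / k

def reflected_angle_deg(theta_i_deg: float, wavelength: float, dphi_dx: float) -> float:
    k = 2 * np.pi / wavelength                    # wavenumber in the medium
    s = np.sin(np.radians(theta_i_deg)) + dphi_dx / k
    if abs(s) > 1:
        raise ValueError("no propagating reflected wave in this direction")
    return np.degrees(np.arcsin(s))

# 3 kHz sound in air (speed ~343 m/s, wavelength ~0.114 m), normal incidence,
# with an assumed phase gradient of 27.5 rad/m along the surface:
wavelength = 343.0 / 3000.0
print(reflected_angle_deg(0.0, wavelength, dphi_dx=27.5))  # ~30 degrees
```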

'Existing solutions for controlling the reflection of waves have low efficiency or are difficult to implement,' says Ana Díaz-Rubio, postdoctoral researcher at Aalto University. 'We solved both of those problems. Not only did we figure out a way to design highly efficient metasurfaces, we can also adapt the design for different functionalities. These metasurfaces are a versatile platform for arbitrary control of reflection.'

'This is really an exciting result. We have figured out a way to design such a device, and we tested it for controlling sound waves. Moreover, this idea can be applied to electromagnetic fields,' Ana explains.

This work received funding from the Academy of Finland. The article was published in the online version of the journal on 15 February 2019.

Credit: 
Aalto University

Study: No race or gender bias seen in initial NIH grant reviews

MADISON, Wis. -- Examinations of National Institutes of Health grants in the last 15 years have shown that white scientists are more likely to be successful in securing funding from the agency than their black peers.

A new study from the University of Wisconsin-Madison shows that bias is unlikely to play out in the initial phase of the process NIH uses to review applications for the billions of federal grant dollars it apportions annually to biology and behavior research, even though the reviewers at that early stage in the process are aware of each grant applicant's identity.

"Absence of bias here does not mean there is no bias in the entire review process," says psychology professor Patricia Devine, who secured NIH funding to assess the agency's review process. "But we're confident that this is a strong and valid result showing no evidence of bias against female and black scientists in this first review of grant applications."

Their findings were published this week in the journal Nature Human Behaviour.

A team of UW-Madison psychologists selected 48 actual grant proposals sent to NIH -- half of which were awarded funding -- and stripped them of any identifying information. Each proposal was then reproduced four times with new fictitious names and information implying the applying scientist was a white man, white woman, black man or black woman.

They recruited more than 400 scientists with credentials qualifying them to serve as reviewers for grant applications submitted to NIH's four largest institutes -- most of whom had served as NIH reviewers, applied to NIH for funding, or both. Each volunteer reviewer received three of the experiment's grant applications: two ostensibly written by white men, and one with names reworked to appear as authored by a black woman, black man or white woman.

The reviewers read the applications and returned detailed critiques as they would in an actual, initial NIH review, including scores in several areas (the most consequential being an "impact" score).

There was no consequential difference in the impact scores, nor in the reviewers' use of descriptive language that can shape how grants are perceived -- how they apportioned words like "diligent," "fails," "limits," "convincing" or "remarkable" -- in their reports.

"That will be surprising to people. This is a place where people will assume there is bias," says Devine. "But the reviewers were focused on the actual grants that were in front of them, and not the social categories of the applicants."

The initial review phase may not be a likely step to favor a race or gender. The first assessment is an in-depth affair, with long, written justifications for judgments that don't lend themselves well to the usual trappings of bias.

"We know from other areas of research related to bias that when people have a lot of information, take more time to think, and are more accountable for how they act, bias is less likely to show up," says William T. L. Cox, a UW-Madison scientist. Cox and University of Arkansas professor Patrick S. Forscher were members of Devine's lab and are co-authors of the study with Devine and UW-Madison psychology professor Markus Brauer.

"We have captured the part of the review process where reviewers are taking the most time and paying the most attention," Cox says. "So, it may be the area where we would least expect bias to appear."

The researchers also found no difference in the treatment of high-scoring grants that NIH had actually funded and lower-scoring grants that missed out.

"The social science literature tells us that when things are ambiguous, men tend to get a bit of a bonus while women and black people are kind of downgraded," Devine says. "We didn't see that in the reviews supplied in this experimental study."

The painstaking research took five years to complete, and included confirming the findings by applying more than 4,500 possible differences in analytical emphasis to the data. More than 97 percent of the time, the results showed no significant difference in treatment of applicants based on race or gender.
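Sweeping thousands of "differences in analytical emphasis" is in the spirit of a multiverse or specification-curve analysis: rerun the analysis under every defensible combination of choices and see how often the conclusion changes. Below is a generic sketch of the shape of such a sweep; the choices, placeholder model and data are invented stand-ins, not the authors' actual specifications:

```python
import itertools
import random

random.seed(1)

# Hypothetical analytic choices such a sweep might vary (stand-ins only):
outcomes = ["impact_score", "overall_score"]
exclusions = ["none", "drop_incomplete_reviews", "drop_fastest_reviews"]
covariate_sets = [(), ("reviewer_experience",), ("reviewer_experience", "institute")]

specifications = list(itertools.product(outcomes, exclusions, covariate_sets))

significant = 0
for outcome, exclusion, covariates in specifications:
    # Placeholder for the real work: apply `exclusion`, regress `outcome`
    # on applicant race/gender plus `covariates`, keep the bias-term p-value.
    p_value = random.uniform(0, 1)
    if p_value < 0.05:
        significant += 1

share_null = 100 * (1 - significant / len(specifications))
print(f"{share_null:.0f}% of specifications show no significant difference")
```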

"But we don't want to be understood to suggest that we don't think there is any bias in the process," says Devine, who admires NIH for supporting the kind of examination that could have uncovered a bias among reviewers evaluating grant proposals. "If there is bias, I don't know yet where it is and how it manifests."

In later steps in the grant review process, which can involve shallower analysis, sometimes brief debate and less individual accountability for reviewers, Cox points out, it's more likely bias could creep in.

The bias seen in the apportionment of funding may not be built into the review process at all. Grant applications submitted by women and black scientists may be somehow different from those written by white men, the researchers suggested.

Letters of recommendation written on behalf of black scientists may undermine them in subtle ways relative to peers, for example. Editing help from overcompensating white or male colleagues may be less constructively critical. Women may be more cautious than men in their writing, including more qualifying statements and fewer bold claims.

"Those things would be important to know," Devine says. "If anybody isn't receiving careful training on how to engage effectively in this process and write grants in a way that increase their chances for funding, that's something you could train people on and a way to turn the tide."

Credit: 
University of Wisconsin-Madison