
Study examines potential link between partner bereavement and skin cancer

Psychological stress has been proposed as a risk factor for melanoma, but clinical evidence is limited. A recent British Journal of Dermatology study funded by the European Academy of Dermatology and Venereology looked for a potential link between the death of a partner, which is one of the most stressful life events, and melanoma. The research was carried out by the London School of Hygiene & Tropical Medicine and Aarhus University Hospital in Denmark.

In the study, which included information from the UK Clinical Practice Research Datalink and Danish nationwide registries, investigators found that partner bereavement was linked with a decreased risk of being diagnosed with melanoma, but with an increased risk of dying after being diagnosed.

"The study findings are interesting and may relate to bereaved people no longer having someone to help with skin examinations, leading to delays in diagnosis, although we cannot rule out stress being important in melanoma progression," said senior author Sinéad Langan, PhD, of the London School of Hygiene and Tropical Medicine.

Credit: Wiley

Multi-country study reveals shortcomings in treating obesity

To address obesity worldwide, changes are needed in both the availability of treatments and the attitudes of clinicians. That's the conclusion of a survey-based study of health professionals.

In the Clinical Obesity study, investigators surveyed 274 respondents from a total of 68 low-, middle-, and high-income countries. Respondents in most countries stated that there were professional guidelines for obesity treatment, but adequate services were lacking, especially in lower-income countries and in rural areas of most countries.

Lack of treatment was attributed to a broad range of issues including: no clear care pathways from primary care to specialty services; absent or limited specialty services in some regions; potentially high costs to patients; long waiting times for surgery; and stigma or blame experienced by patients.

Few countries were willing to define obesity as a disease.

"The lack of investment in clinical services shows a critical failure of government to respect and protect our right to good health," said senior author Tim Lobstein, PhD, of the World Obesity Federation. "Without substantial investment in the treatment of obesity, the demands on health services will increase dramatically--not only because of the rising numbers of people suffering obesity and its consequences, but because the duration of experiencing obesity greatly increases the risk of more disabling diseases requiring greater intensity of interventions."

Credit: Wiley

Does antibiotic use during pregnancy and infancy impact childhood obesity?

Use of antibiotics during pregnancy does not appear to affect children's weight in subsequent years, but use during infancy may increase their risk of becoming overweight or obese. The findings come from an analysis published in the journal Obesity.

When investigators examined all relevant published studies on antibiotic use during pregnancy or infancy and children's later weight--23 observational studies involving 1,253,035 participants--they did not find a link between prenatal antibiotic use and childhood overweight or obesity. However, an increased risk of overweight or obesity was seen in analyses limited to antibiotic use in the second trimester, as well as with antibiotic use during infancy.

"Antibiotics should be used more cautiously for children than pregnant women," said senior author Yong Xu, MD, PhD, of The Affiliated Hospital of Southwest Medical University, in China.

Credit: Wiley

How loneliness affects end-of-life experiences

In a study of Americans over age 50 who died between 2004 and 2014, individuals characterized as lonely based on survey results were burdened by more symptoms and received more intense end-of-life care than non-lonely people.

In the 2,896-participant study, which is published in the Journal of the American Geriatrics Society, one-third of adults were lonely. In addition to having an increased likelihood of experiencing burdensome symptoms at the end of life, lonely individuals were more likely to use life support in the last 2 years of life (35.5% versus 29.4%) and more likely to die in a nursing home (18.4% versus 14.2%) than non-lonely individuals.

"Loneliness is a pervasive psychosocial phenomenon with profound implications for the health and wellbeing of older adults throughout the life continuum, and particularly at the end of life," said lead author Nauzley Abedini, MD, MSc, of the University of Michigan. "We must do more--as healthcare providers, but also as a society--to screen for and intervene on loneliness not just during the dying process, but before the end of life period."

Credit: Wiley

Rapid DNA test quickly identifies victims of mass casualty event

image: To quickly identify victims of the 2018 Camp Fire, the deadliest wildfire in California's history, researchers used a technique called Rapid DNA Identification that can provide results within hours, compared with months to years required of conventional DNA analysis.

Image: Butte County Sheriff's Office


As reported in the Journal of Forensic Sciences, Rapid DNA Identification was used in a rented recreational vehicle outside the Sacramento morgue. Sixty-nine sets of remains were assessed, and of these, 62 (89.9%) generated DNA profiles that were screened against 255 family reference samples. In total, 58 victims were identified using Rapid DNA.
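The screening step--matching victim DNA profiles against family reference samples--rests on Mendelian inheritance: a parent and child share at least one allele at every STR locus. The toy sketch below is illustrative only (hypothetical loci and profiles; real kinship software, including ANDE's, uses likelihood-based statistics and models mutation):

```python
# Toy STR profiles: each locus maps to a pair of alleles (repeat counts).
# Locus names and allele numbers here are illustrative, not a real panel.
LOCI = ["D3S1358", "vWA", "FGA", "D8S1179", "D21S11"]

def consistent_with_parent_child(profile_a, profile_b):
    """A parent and child share at least one allele at every locus;
    a single locus with no shared allele excludes the relationship
    (ignoring mutation, which real kinship software accounts for)."""
    return all(set(profile_a[locus]) & set(profile_b[locus]) for locus in LOCI)

victim = {"D3S1358": (15, 17), "vWA": (16, 18), "FGA": (21, 24),
          "D8S1179": (12, 13), "D21S11": (29, 30)}

references = {
    "relative_1": {"D3S1358": (15, 16), "vWA": (14, 18), "FGA": (20, 21),
                   "D8S1179": (13, 14), "D21S11": (28, 29)},  # shares an allele at every locus
    "relative_2": {"D3S1358": (18, 19), "vWA": (16, 17), "FGA": (22, 23),
                   "D8S1179": (14, 15), "D21S11": (31, 32)},  # excluded at several loci
}

matches = [name for name, ref in references.items()
           if consistent_with_parent_child(victim, ref)]
print(matches)  # → ['relative_1']
```

Screening hundreds of reference samples this way is cheap; the expensive, careful part in practice is generating clean profiles from degraded remains and assigning statistical weight to each candidate match.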

This work represents the first use of Rapid DNA Identification in a mass casualty event. It has since been used in the 2019 Conception dive boat fire.

"Rapid DNA enabled Sacramento County and Butte County to support the loved ones of the victims of the devastating Camp Fire," said Richard F. Selden, MD, PhD, Founder of ANDE Corporation. "What we've learned from this tragedy is that Rapid DNA Identification can and should be used routinely in mass casualty victim identification. Furthermore, by bringing DNA analysis from the laboratories to the field, Rapid DNA Identification can help reduce sexual assault and human trafficking, fight terrorism, and exonerate the innocent."

Credit: Wiley

Safe tackling, padded helmets lower head injuries in youth football

Middle school football players greatly reduce the chance of head injuries if they wear padded helmets and use safe tackling and blocking techniques, according to Rutgers researchers.

The study was published in the journal Neurosurgery.

With youth players representing 70 percent of all amateur and professional football participants, attention is turning to the safety of the sport for younger players. Concussions and other head injuries in football and other contact sports among older athletes have been linked to cognitive problems, including dementia, depression and chronic traumatic encephalopathy, a degenerative brain disease.

"Parents are understandably concerned that their young football players are putting themselves at risk for the same long-term brain or nervous system injuries that are reported in older players," said Robert Heary, director of the Center for Spine Surgery and Mobility at Rutgers New Jersey Medical School. "Although concussions causing readily observable signs and symptoms are of great concern to physicians examining football players during and after games, the effects of other head trauma with fewer symptoms also can result in long-term damage."

The researchers monitored 20 members of a youth football team in New Jersey who wore helmets equipped with a system that tracked the number and severity of impacts that each player sustained during their 20-practice, seven-game season. A tackling coach taught players and coaches safe methods for blocking, defeating blocks and tackling that reduced head contact for both offensive and defensive players. During practices, the players wore Guardian Caps -- padded covers fitted over the helmets -- to lessen the number of high-energy head impacts.

There were 817 recorded impacts during the season -- an average of 41 impacts per player and about 20 minutes of full contact per practice -- but no concussions.

"The use of Guardian Caps, safe tackling techniques and the age of participants may have contributed to the very low number of impacts recorded and the complete lack of injuries," said Heary.

Credit: Rutgers University

Genome editing strategy could improve rice, other crops

Scientists at UC Davis have used CRISPR technology to genetically engineer rice with high levels of beta-carotene, the precursor of vitamin A. The technique they used provides a promising strategy for genetically improving rice and other crops. The study was published today in the journal Nature Communications.

Rice is a staple food crop for more than half the world's population. Golden Rice, a genetically engineered rice with high levels of beta-carotene, has been approved for consumption in more than five countries, including the Philippines, where vitamin A deficiency in children is widespread. Because of the social impact of Golden Rice, the researchers chose the high beta-carotene trait as an example.

Conventional plant genetic engineering uses a bacterium or a particle gun to transfer genes encoding desired traits into the plant genome. In this case, researchers would use a bacterium to transfer beta-carotene-producing genes into the rice genome. But those transgenes can integrate into random positions in the genome, which can result in reduced yields.

"Instead, we used CRISPR to precisely target those genes onto genomic safe harbors, or chromosomal regions that we know won't cause any adverse effects on the host organism," said first author Oliver Dong, a postdoctoral scholar in the UC Davis Department of Plant Pathology and Genome Center.

In addition, the researchers were able to precisely insert a very large fragment of DNA that does not contain marker genes. By contrast, conventional genetic engineering relies on the inclusion of marker genes in the inserted DNA fragment. These marker genes are retained when the plant is bred over generations, which can often trigger public concern and stringent regulations of the transgenic products before their entrance to the marketplace.

"Scientists have done targeted insertions before and without marker genes, but we haven't been able to do it with such big fragments of DNA," said Dong. "The larger the fragment of DNA, the more biological function or complex traits we can provide the plants."

Dong said this opens up the possibility that genes controlling multiple desirable traits, such as having high levels of beta-carotene as well as being disease-resistant or drought-tolerant, can be clustered at a single position within the genome. This can greatly reduce subsequent breeding efforts.

Credit: University of California - Davis

Hypertension in young adulthood associated with cognitive decline in middle age

High blood pressure, or hypertension, affects everything from your arteries to your kidneys, from eyesight to sexual function. Among older adults, high blood pressure is also associated with cognitive decline as a result of interrupted blood flow to the brain, as well as strokes, heart attacks and impaired mobility.

A new Northwestern University-Tel Aviv University study has revealed that subjects who experienced relatively high blood pressure during young adulthood also experienced significant declines in cognitive function and gait in midlife (approximately 56 years old). The study cohort included about 200 young adults with an average age of 24 at the beginning of the study.

The research was led by Prof. Farzaneh A. Sorond and Dr. Simin Mahinrad of Northwestern University's Department of Neurology and Prof. Jeffrey Hausdorff of TAU's Sackler Faculty of Medicine, TAU's Sagol School of Neuroscience and Tel Aviv Medical Center's Center for the Study of Movement, Cognition, and Mobility at the Neurological Institute. The study was published in the American Heart Association's journal Circulation in March.

"We find that the deleterious effects of elevated blood pressure on brain structure and function begin in early adulthood. This demonstrates the need for preventive measures of high blood pressure even at this early age," explains Prof. Hausdorff. "We know that poor gait and cognitive function among older adults are associated with and predict multiple adverse health outcomes like cognitive decline, dementia, falls and death. Our study shows that the time to treat high blood pressure and to minimize future changes in gait and cognition is much earlier -- decades earlier -- than previously thought."

In addition, the study suggests that gait impairment may be an earlier hallmark of hypertensive brain injury than cognitive deficits.

For the study, the researchers assessed the blood pressure, gait and cognition of 191 participants from the Coronary Artery Risk Development in Young Adults study, a community-based cohort of young individuals followed over 30 years. In the last year of follow-up, gait was assessed using an instrumented gait mat; cognitive function was evaluated using neuropsychological tests; and white matter hyperintensity in the brain, a marker of cerebrovascular disease, was measured using MRI. The impact of cumulative levels of high blood pressure was found to be independent of other vascular risk factors over the same 30-year period.

"Higher cumulative blood pressure was associated with slower walking speed, smaller step length and higher gait variability," Prof. Hausdorff says. "Higher cumulative blood pressure was also associated with lower cognitive performance in the executive, memory and global domains."

"Our takeaway is this: Even in young adults, blood pressure has significant implications, even at levels below the 'hypertension' threshold, and is important to assess and modify for future cognitive function and mobility," Prof. Hausdorff concludes.

Credit: American Friends of Tel Aviv University

Study shows low carb diet may prevent, reverse age-related effects within the brain

image: Lilianne R. Mujica-Parodi, PhD

Image: Stony Brook University

STONY BROOK, NY, March 4, 2020 - A study using neuroimaging led by Stony Brook University professor and lead author Lilianne R. Mujica-Parodi, PhD, and published in PNAS, reveals that neurobiological changes associated with aging can be seen at a much younger age than would be expected, in the late 40s. However, the study also suggests that this process may be prevented or reversed based on dietary changes that involve minimizing the consumption of simple carbohydrates.

To better understand how diet influences brain aging, the research team focused on the presymptomatic period during which prevention may be most effective. In the article titled "Diet modulates brain network stability, a biomarker for brain aging, in young adults," they showed, using large-scale life-span neuroimaging datasets, that functional communication between brain regions destabilizes with age, typically in the late 40s, and that destabilization correlates with poorer cognition and accelerates with insulin resistance. Targeted experiments then showed this biomarker for brain aging to be reliably modulated with consumption of different fuel sources: glucose decreases, and ketones increase, the stability of brain networks. This effect was replicated across both changes to total diet as well as after drinking a fuel-specific calorie-matched supplement.

"What we found with these experiments involves both bad and good news," said Mujica-Parodi, a Professor in the Department of Biomedical Engineering with joint appointments in the College of Engineering & Applied Sciences and Renaissance School of Medicine at Stony Brook University, and a faculty member in the Laufer Center for Physical and Quantitative Biology. "The bad news is that we see the first signs of brain aging much earlier than was previously thought. However, the good news is that we may be able to prevent or reverse these effects with diet, mitigating the impact of encroaching hypometabolism by exchanging glucose for ketones as fuel for neurons."

What the researchers discovered, using neuroimaging of the brain, is that quite early on there is breakdown of communication between brain regions ("network stability").

"We think that, as people get older, their brains start to lose the ability to metabolize glucose efficiently, causing neurons to slowly starve, and brain networks to destabilize," said Mujica-Parodi. "Thus, we tested whether giving the brain a more efficient fuel source, in the form of ketones, either by following a low-carb diet or drinking ketone supplements, could provide the brain with greater energy. Even in younger individuals, this added energy further stabilized brain networks."

To conduct their experiments, the researchers first established brain network stability as a biomarker for aging, using two large-scale brain neuroimaging (fMRI) datasets totaling nearly 1,000 individuals, ages 18 to 88. Destabilization of brain networks was associated with impaired cognition and was accelerated by Type 2 diabetes, an illness that blocks neurons' ability to effectively metabolize glucose. To identify the mechanism as being specific to energy availability, the researchers then held age constant and scanned an additional 42 adults under age 50 with fMRI. This allowed them to observe directly the impact of glucose and ketones on each individual's brain.
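One simple way to operationalize "network stability"--an illustrative sketch, not the paper's exact pipeline--is to compute region-by-region functional connectivity over successive windows of an fMRI time series and measure how similar consecutive connectivity matrices are:

```python
import numpy as np

def network_stability(timeseries, window=50):
    """Mean correlation between connectivity matrices computed over
    consecutive non-overlapping windows of a [timepoints, regions] signal.
    Higher values = more stable functional networks over time."""
    windows = [timeseries[start:start + window]
               for start in range(0, len(timeseries) - window + 1, window)]
    mats = [np.corrcoef(w.T) for w in windows]   # region-by-region correlations
    iu = np.triu_indices(mats[0].shape[0], k=1)  # compare upper triangles only
    sims = [np.corrcoef(a[iu], b[iu])[0, 1] for a, b in zip(mats, mats[1:])]
    return float(np.mean(sims))

rng = np.random.default_rng(0)
n_regions, n_timepoints = 10, 400

# "Stable" signal: fixed region-to-region coupling plus noise, so every
# window's connectivity estimates the same underlying pattern.
mixing = rng.normal(size=(n_regions, n_regions))
stable = rng.normal(size=(n_timepoints, n_regions)) @ mixing

# "Unstable" signal: independent noise, so windowed connectivity fluctuates.
unstable = rng.normal(size=(n_timepoints, n_regions))

print(network_stability(stable) > network_stability(unstable))  # → True
```

The actual study compares such stability measures across age, diet conditions, and fuel supplements; the sketch only shows why a fixed underlying coupling yields high window-to-window similarity while a fluctuating one does not.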

The brain's response to diet was tested in two ways. The first was holistic, comparing brain network stability after participants had spent one week on a standard (unrestricted) vs. low-carb (for example: meat or fish with salad, but no sugar, grains, rice, starchy vegetables) diet. In a standard diet, the primary fuel metabolized is glucose, whereas in a low-carb diet, the primary fuel metabolized is ketones. However, there might have been other differences between diets driving the observed effects. Therefore, to isolate glucose vs. ketones as the crucial difference between the diets, an independent set of participants was scanned before and after drinking a small dose of glucose on one day, and ketones on the other, where the two fuels were individually weight-dosed and calorically matched. The results replicated, showing that the differences between the diets could be attributed to the type of fuel they provide to the brain.

Additional findings from the study included the following: Effects of brain aging emerged at age 47, with most rapid degeneration occurring at age 60. Even in younger adults, under age 50, dietary ketosis (whether achieved after one week of dietary change or 30 minutes after drinking ketones) increased overall brain activity and stabilized functional networks. This is thought to be due to the fact that ketones provide greater energy to cells than glucose, even when the fuels are calorically matched. This benefit has previously been shown for the heart, but the current set of experiments provides the first evidence for equivalent effects in the brain.

"This effect matters because brain aging, and especially dementia, are associated with 'hypometabolism,' in which neurons gradually lose the ability to effectively use glucose as fuel. Therefore, if we can increase the amount of energy available to the brain by using a different fuel, the hope is that we can restore the brain to more youthful functioning. In collaboration with Dr. Eva Ratai at Massachusetts General Hospital, we're currently addressing this question, by now extending our studies to older populations," said Mujica-Parodi.

"Additional research with collaborators at Children's National, under the direction of Dr. Nathan Smith, focuses on discovering the precise mechanisms by which fuel impacts signaling between neurons. Finally, in collaboration with Dr. Ken Dill and Dr. Steven Skiena, at Stony Brook, we're working on building a comprehensive computational model that can incorporate our understanding of the biology, from individual neurons to whole brains to cognition, as it develops."

The research is currently funded under new $2.5 million National Science Foundation BRAIN Initiative "Frontiers" grants (NSF ECCS-1533257 and NSF NCS-FR 1926781) awarded to Stony Brook, as well as by the W. M. Keck Foundation, which originally funded the team in 2017 with a $1 million seed grant designed to jump-start "pioneering discoveries in science, engineering, and medical research."

Credit: Stony Brook University

Zombie scanning enables the study of peptide-receptor interactions on the cell surface

image: The tethered toxin (T-HmK) is drawn free and bound to a potassium channel (blue) expressed in the cell membrane. The reaction volume that a T-HmK can visit (two-headed arrow) is determined by the combined lengths of the flexible peptide linker, the GPI anchor, and the diameter of HmK toxin. The lower panel shows the expression construct design.

Image: UCI School of Medicine

Irvine, Calif. - March 4, 2020 - In the past, biologically active peptides - small proteins like neurotoxins and hormones that act on cell receptors to alter physiology - were purified from native sources like venoms, and then panels of variants were produced in bacteria, or synthesized, to study the structural basis for receptor interaction. A new technique called zombie scanning renders these older processes obsolete.

Peptides are used for medical therapy and to study biology, among other applications, but producing them is increasingly costly in both time and money.

"If a peptide has 30 residues, simply changing each site once requires the synthesis, purification and validation of the folded composition of all 30 variants, a process that requires months and many thousands of dollars," said Steven A.N. Goldstein, MD, PhD, vice chancellor for Health Affairs at the University of California, Irvine, and distinguished professor in the UCI School of Medicine Departments of Pediatrics and Physiology & Biophysics.

Published today in Science Advances, the new study co-led by Goldstein and Jordan H. Chill, PhD, a professor in the Department of Chemistry at Bar-Ilan University in Israel, reveals how researchers were able to hijack cell machinery to simplify the creation of peptides allowing for extensive, rapid studies of structure-function and mechanism to improve specificity and affinity of action, the important parameters for therapeutic efficacy.

"Since we hijack the cell machinery to synthesize and display the peptides on the cell surface with the receptor, we dub this zombie scanning," said first author Ruiming Zhao, PhD, an investigator in the Goldstein laboratory.

"Using this new technique, peptide changes are as simple as plasmid synthesis and require only days of work and pennies per construct. This enables us to study the roles of many sites with many changes in a much shorter period of time at a much lower cost."

The study, titled, "Tethered peptide neurotoxins display two blocking mechanisms in the K+ channel pore as do their untethered analogues," outlines how the encoded peptides are linked via a native pathway to the outside of the cell on a flexible tether. In this case, the receptor target was also expressed from a plasmid and could be modified. The method also allows study of low affinity interactions that would not otherwise be feasible to analyze.

Using zombie scanning, researchers made the unexpected discovery that a peptide in clinical trials as an immune suppressant acts differently than once thought, revealing that this family of neurotoxins has two possible modes of interacting with potassium channels rather than just one.

Chill and colleague Netanel Mendelman, PhD, enhanced the impact of these findings by elucidating the three-dimensional structure of a selected neurotoxin using nuclear magnetic resonance (NMR).

"By estimating hundreds of distances and angles between atoms in the peptide, we now know its structure, offering a molecular context for these exciting results," said Chill. "The two binding modes seem to involve a 'flipping' of the toxin or some rearrangement of its atomic structure."

Alternative binding modes as described for these peptides are a troublesome confounding factor in structure-based drug design, highlighting the importance of the findings of this report and future studies of the recognition process between channels and inhibitory peptides.

Credit: University of California - Irvine

Diversity semantics shift higher ed inclusivity away from students of color

Affirmative action in higher education was originally meant to rebalance the scales of mostly white, mostly male institutions. But a study from the University of Colorado Denver found that the legal semantics of two landmark Supreme Court cases have redefined the focus of affirmative action from access for students of color to educational benefits for white students. This repositioning of diversity priorities also shows up in higher education diversity initiatives, such as "Inclusive Excellence."

"The idea of 'inclusive excellence' seemed good at face value," said Naomi Nishi, PhD, department of Ethnic Studies at CU Denver. "We cannot, as an institution, be excellent unless we have a diversity of people participating. But in practice, students of color at elite institutions are being treated like their responsibility is to be the diversity that gives white students a well-rounded education."

The study was published in the journal Race Ethnicity and Education.

Defining diversity and critical mass

In 2003, a pair of affirmative action lawsuits found their way to the Supreme Court: Grutter v. Bollinger and Gratz v. Bollinger. The first found in favor of the University of Michigan, ruling that its Law School was using racial diversity appropriately in admissions, as it sought a critical mass of students of color through a nuanced admissions process. The second found for the plaintiff, arguing that the University of Michigan had used a quota system in its undergraduate admissions scoring that unfairly favored people of color.

During the two cases, two terms--diversity and critical mass--began to fluctuate in their definitions. In 1977, Rosabeth Moss Kanter coined the term "critical mass" to mean diversity in diversity--enough people of color in the classroom so that students of color would not be tokenized or feel isolated. In the Grutter case, the term matched Kanter's definition half of the time.

Diluting and revising the legal definitions

But a revised usage showed up, too: "critical mass" became the number of students of color needed for the student body to realize educational benefits. The definition was revised to focus less on the students of color and more on the benefit for current students (most of whom were white). It stuck. By the next Supreme Court cases, Fisher v. University of Texas (2013, 2016), it was the only definition used.

Nishi described how, in the discourse of these cases, the definitions of key words, like "critical mass" and "diversity" were being diluted and revised to be more palatable to white interests.

"Both sides were rapidly changing definitions to co-opt and concede what benefit for students of color remained in affirmative action," said Nishi.

Who benefits from diversity?

The semantic concessions had reverberations in future affirmative action cases. As affirmative action cases proliferated alongside statewide bans on using any racial consideration in the admissions process, institutions shifted their efforts to focus instead on diversity and inclusivity programming. It was what Nishi called a "Plan B in the face of legal dismantlement of racial considerations in admissions."

"Yet, these same diversity and inclusivity initiatives were quickly co-opted by people who were looking--even unintentionally--to further white elite interests," said Nishi.

For example, when a business school launched an Inclusive Excellence program, Nishi said leaders in the school described the program as a way to help prepare their students for future jobs where they would manage diverse groups.

"It was a predominantly white institution and they were essentially saying 'we've got to teach our white students how to manage people of color because one day they'll be their boss,'" said Nishi. "That's troubling."

The problem of racism and supremacy in higher ed

But affirmative action and subsequent diversity programming were Band-Aids from the beginning, said Nishi.

"We've never actually been interested in dealing with the larger problem of racism and white supremacy in higher education," said Nishi. "White women were the biggest beneficiaries of affirmative action, but now their numbers in higher ed outpace white men in some areas. We've seen that when these programs no longer benefit white people, interests diverge. There's a seizing back of any benefits or access from people of color--something we call 'Imperialistic Reclamation.'"

"The truth is, we won't see racial justice until we stop pretending that racism isn't at work in our institutions of higher education and end the futile attempts to convince white elites in power that racial equity is first and foremost for them. It's not."

Credit: University of Colorado Denver

Integrating electronics onto physical prototypes

image: CurveBoards are 3D breadboards -- which are commonly used to prototype circuits -- that can be designed by custom software, 3D printed, and directly integrated into the surface of physical objects, such as smart watches, bracelets, helmets, headphones, and even flexible electronics. CurveBoards can give designers an additional prototyping technique to better evaluate how circuits will look and feel on physical products that users interact with.

Image: Dishita Turakhia and Junyi Zhu

MIT researchers have invented a way to integrate "breadboards" -- flat platforms widely used for electronics prototyping -- directly onto physical products. The aim is to provide a faster, easier way to test circuit functions and user interactions with products such as smart devices and flexible electronics.

Breadboards are rectangular boards with arrays of pinholes drilled into the surface. Many of the holes have metal connections and contact points between them. Engineers can plug components of electronic systems -- from basic circuits to full computer processors -- into the pinholes where they want them to connect. Then, they can rapidly test, rearrange, and retest the components as needed.

But breadboards have remained the same shape for decades. For that reason, it's difficult to test how the electronics will look and feel on, say, wearables and various smart devices. Generally, people will first test circuits on traditional breadboards, then slap them onto a product prototype. If the circuit needs to be modified, it's back to the breadboard for testing, and so on.

In a paper being presented at CHI (Conference on Human Factors in Computing Systems), the researchers describe "CurveBoards," 3D-printed objects with the structure and function of a breadboard integrated onto their surfaces. Custom software automatically designs the objects, complete with distributed pinholes that can be filled with conductive silicone to test electronics. The end products are accurate representations of the real thing, but with breadboard surfaces.

CurveBoards "preserve an object's look and feel," the researchers write in their paper, while enabling designers to try out component configurations and test interactive scenarios during prototyping iterations. In their work, the researchers printed CurveBoards for smart bracelets and watches, Frisbees, helmets, headphones, a teapot, and a flexible, wearable e-reader.

"On breadboards, you prototype the function of a circuit. But you don't have context of its form -- how the electronics will be used in a real-world prototype environment," says first author Junyi Zhu, a graduate student in the Computer Science and Artificial Intelligence Laboratory (CSAIL). "Our idea is to fill this gap, and merge form and function testing in very early stage of prototyping an object. ... CurveBoards essentially add an additional axis to the existing [three-dimensional] XYZ axes of the object -- the 'function' axis."

Joining Zhu on the paper are CSAIL graduate students Lotta-Gili Blumberg, Martin Nisser, and Ethan Levi Carlson; Department of Electrical Engineering and Computer Science (EECS) undergraduate students Jessica Ayeley Quaye and Xin Wen; former EECS undergraduate students Yunyi Zhu and Kevin Shum; and Stefanie Mueller, the X-Window Consortium Career Development Assistant Professor in EECS.

Custom software and hardware

A core component of the CurveBoard is custom design-editing software. Users import a 3D model of an object. Then, they select the command "generate pinholes," and the software automatically maps all pinholes uniformly across the object. Users then choose automatic or manual layouts for connectivity channels. The automatic option lets users explore a different layout of connections across all pinholes with the click of a button. For manual layouts, interactive tools can be used to select groups of pinholes and indicate the type of connection between them. The final design is exported to a file for 3D printing.

When a 3D object is uploaded, the software essentially forces its shape into a "quadmesh" -- where the object is represented as a bunch of small squares, each with individual parameters. In doing so, it creates a fixed spacing between the squares. Pinholes -- which are cones, with the wide end on the surface and tapering down -- will be placed at each point where the corners of the squares touch. For channel layouts, some geometric techniques ensure the chosen channels will connect the desired electrical components without crossing over one another.
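The pinhole-placement step can be sketched for a simple case. The code below is an illustrative reconstruction, not the authors' software: it places evenly spaced pinhole centers on a cylindrical surface (such as a bracelet band), mimicking the fixed spacing the quadmesh conversion creates; the spacing and dimensions are assumed values.

```python
import math

# Illustrative sketch only (the CurveBoard software is not reproduced here):
# place pinhole centers at the vertices of a roughly uniform quad mesh
# wrapped around a cylinder, e.g. a bracelet band. All dimensions assumed.

def cylinder_pinholes(radius=30.0, height=20.0, spacing=2.54):
    """Return (x, y, z) pinhole centers on a cylindrical surface, in mm.

    The vertex spacing is kept close to `spacing`, mirroring the fixed
    spacing that the quadmesh conversion step creates.
    """
    circumference = 2 * math.pi * radius
    n_around = max(1, round(circumference / spacing))  # quads around the band
    n_rows = max(2, round(height / spacing) + 1)       # vertex rows up the band
    holes = []
    for row in range(n_rows):
        z = row * height / (n_rows - 1)
        for k in range(n_around):
            theta = 2 * math.pi * k / n_around
            holes.append((radius * math.cos(theta), radius * math.sin(theta), z))
    return holes

holes = cylinder_pinholes()
print(len(holes), "pinholes generated")
```

A real CurveBoard quadmeshes an arbitrary 3D model and then routes non-crossing channels between selected pinholes; the cylinder here only illustrates the spacing logic.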

In their work, the researchers 3D printed objects using a flexible, durable, nonconductive silicone. To provide connectivity channels, they created a custom conductive silicone that can be syringed into the pinholes and then flows through the channels after printing. The silicone is a mixture of silicone materials designed to have minimal electrical resistance, allowing various types of electronics to function.
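To see why minimal resistance matters, the resistance of a filled channel can be estimated with the standard formula R = ρL/A. The resistivity used below is an assumed, typical order of magnitude for a carbon-filled conductive silicone, not a value from the paper.

```python
import math

# Back-of-the-envelope estimate; the 1 ohm*cm resistivity is an assumed,
# typical figure for a carbon-filled conductive silicone, not from the paper.

def channel_resistance(rho_ohm_cm, length_cm, diameter_cm):
    """Resistance of a cylindrical channel: R = rho * L / A."""
    area = math.pi * (diameter_cm / 2) ** 2
    return rho_ohm_cm * length_cm / area

# A 5 cm channel, 1 mm in diameter:
r = channel_resistance(rho_ohm_cm=1.0, length_cm=5.0, diameter_cm=0.1)
print(f"{r:.0f} ohms")  # about 637 ohms: workable for signal lines and LEDs,
                        # but it shows why the mixture is tuned for low resistance
```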

To validate the CurveBoards, the researchers printed a variety of smart products. Headphones, for instance, came equipped with menu controls for speakers and music-streaming capabilities. An interactive bracelet included a digital display, LED, and photoresistor for heart-rate monitoring, and a step-counting sensor. A teapot included a small camera to track the tea's color, as well as colored lights on the handle to indicate hot and cold areas. They also printed a wearable e-book reader with a flexible display.

Better, faster prototyping

In a user study, the team investigated the benefits of CurveBoards prototyping. They split six participants with varying prototyping experience into two sections: One used traditional breadboards and a 3D-printed object, and the other used only a CurveBoard of the object. Both sections designed the same prototype but switched back and forth between sections after completing designated tasks. In the end, five of the six participants preferred prototyping with the CurveBoard. Feedback indicated the CurveBoards were overall faster and easier to work with.

But CurveBoards are not designed to replace breadboards, the researchers say. Instead, they'd work particularly well as a so-called "midfidelity" step in the prototyping timeline, meaning between initial breadboard testing and the final product. "People love breadboards, and there are cases where they're fine to use," Zhu says. "This is for when you have an idea of the final object and want to see, say, how people interact with the product. It's easier to have a CurveBoard instead of circuits stacked on top of a physical object."

Next, the researchers hope to design general templates of common objects, such as hats and bracelets. Right now, a new CurveBoard must be built for each new object. Ready-made templates, however, would let designers quickly experiment with basic circuits and user interaction, before designing their specific CurveBoard.

Additionally, the researchers want to move some early-stage prototyping steps entirely to the software side. The idea is that people can design and test circuits -- and possibly user interaction -- entirely on the 3D model generated by the software. After many iterations, they can 3D print a more finalized CurveBoard. "That way you'll know exactly how it'll work in the real world, enabling fast prototyping," Zhu says. "That would be a more 'high-fidelity' step for prototyping."

Credit: 
Massachusetts Institute of Technology

Even fake illness affects relationships among vampire bats

image: Vampire bats (Desmodus rotundus) are a highly social species. What happens to their social interactions when individuals get sick?

Image: 
Josh More

As Italy urges tourists not to cancel their plans in the face of the coronavirus outbreak and a National Basketball Association memo reportedly encourages teammates to avoid high-fives, a new study conducted at the Smithsonian Tropical Research Institute (STRI) in Panama takes a look at how social bonds change in response to illness in another highly social animal: the vampire bat. In these bats, just as in humans, strong family bonds were less affected by the appearance of disease than were weaker social relationships.

Vampire bats are an extremely social species. Their interactions range from grooming family members and unrelated individuals to saving another bat from starvation by sharing a regurgitated blood meal.

“By asking how different social connections change in response to sickness, we can better understand how social networks change as a pathogen spreads,” said Gerry Carter, research associate at STRI and assistant professor of biology at The Ohio State University.

The lead author of the paper, Sebastian Stockmaier, did this project at STRI as part of his doctoral research at the University of Texas at Austin, advised by Daniel Bolnick at the University of Connecticut. Stockmaier injected vampire bats in a captive colony with a bacterial extract that challenged their immune systems, making them feel sick without harming their health or any risk of transmission. He then observed how the appearance of being sick would affect the ill bats’ social relationships with other colony members.

Just as a sick person might stop shaking hands with strangers but would still need to buy groceries, sick vampire bats reduced the amount of time they spent grooming others (a less critical social interaction) but not the time they spent sharing food (a more critical one), according to the results published in the Journal of Animal Ecology.

Family relationships are very important: “A female vampire bat is less likely to groom an unrelated bat that is sick, but she won’t reduce the amount of time grooming her own sick offspring,” Stockmaier said.

“What we demonstrated in this study is that the type of social connection matters,” Carter said. “Just as in the recent COVID-19 outbreak, we would expect that a virus transmitted by contact would spread mainly within family groups, because these social connections will not be reduced by sickness behavior. In vampire bats, as well as in humans, the most important social behaviors and relationships don’t change as much when individuals are sick.”
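Carter's point can be illustrated with a toy contact-spread simulation. This is not the study's model, and the group sizes and probabilities are invented: it simply shows that when sick individuals keep family contacts but drop weaker between-family ties, new infections stay confined to the family group.

```python
import random

# Toy illustration (not the study's model): a simple contact-spread simulation
# in which sick individuals keep family contacts but drop non-family contacts,
# mirroring the sickness behavior described above. All numbers are invented.

random.seed(0)

def simulate(drop_weak_ties_when_sick, steps=20, p_transmit=0.5):
    # Four families of four bats; everyone also has weak ties to other families.
    families = [list(range(i, i + 4)) for i in range(0, 16, 4)]
    family_of = {b: f for f, members in enumerate(families) for b in members}
    infected = {0}  # bat 0 starts out sick
    within = between = 0
    for _ in range(steps):
        for sick in list(infected):
            # Family contacts are always maintained...
            contacts = [b for b in families[family_of[sick]] if b != sick]
            # ...while weak, between-family contacts are dropped when sick.
            if not drop_weak_ties_when_sick:
                contacts += [b for b in range(16) if family_of[b] != family_of[sick]]
            for other in contacts:
                if other not in infected and random.random() < p_transmit:
                    infected.add(other)
                    if family_of[other] == family_of[sick]:
                        within += 1
                    else:
                        between += 1
    return within, between

print("sickness behavior ON  (within, between):", simulate(True))
print("sickness behavior OFF (within, between):", simulate(False))
```

With sickness behavior on, between-family transmission is structurally impossible in this toy model, matching the expectation that contact-transmitted pathogens would spread mainly within family groups.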

“This study underscores the importance of basic research,” said Rachel Page, research biologist at STRI and a co-author of the paper. “Understanding how social interactions change in the face of illness is a key component in predicting the channels and speed at which a pathogen can spread across a population. Close observation of vampire bat behavior sheds light on how social animals interact, and how these interactions change—and importantly, when they do not change but persist—as individuals become sick.”

The Smithsonian Tropical Research Institute, headquartered in Panama City, Panama, is a unit of the Smithsonian Institution. The institute furthers the understanding of tropical biodiversity and its importance to human welfare, trains students to conduct research in the tropics and promotes conservation by increasing public awareness of the beauty and importance of tropical ecosystems.

Stockmaier, S., Bolnick, D.I., Page, R.A. and Carter, G.G. 2020. Sickness effects on social interactions depend on the type of behavior and relationship. Journal of Animal Ecology. https://doi.org/10.1111/1365-2656.13193

Journal

Journal of Animal Ecology

DOI

10.1111/1365-2656.13193

Credit: 
Smithsonian Tropical Research Institute

Kaon: New physics to explain decay of subatomic particle proposed

TALLAHASSEE, Fla. -- Florida State University physicists believe they have an answer to unusual incidents of rare decay of a subatomic particle called a Kaon that were reported last year by scientists in the KOTO experiment at the Japan Proton Accelerator Research Complex.

FSU Associate Professor of Physics Takemichi Okui and Assistant Professor of Physics Kohsaku Tobioka published a new paper in the journal Physical Review Letters that proposes that this decay is actually a new, short-lived particle that has avoided detection in similar experiments.

"This is such a rare disintegration," Okui said. "It's so rare, that they should not have seen any. But if this is correct, how do we explain it? We think this is one possibility."

Kaons are particles made of one quark and one antiquark. Researchers study how they function -- which includes their decay -- as a way to better understand how the world works. But last year, researchers in the KOTO experiment reported four instances of a particular rare decay that should have been too rare to be detected yet.

This observation appears to violate the Standard Model of particle physics, which describes the fundamental forces of the universe and classifies all known elementary particles.

According to Okui and Tobioka's calculations, there could be two possibilities for new particles. In one scenario, they suggest that the Kaon might decay into a pion -- a subatomic particle with a mass about 270 times that of an electron -- and some sort of invisible particle. Or, the researchers in the KOTO experiment could have witnessed the production and decay of something completely unknown to physicists.
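The "about 270 times" figure quoted above can be checked against the standard Particle Data Group masses for the pion and the electron:

```python
# Quick check of the mass figure quoted above, using standard Particle Data
# Group values in MeV/c^2.
M_PI_CHARGED = 139.570
M_PI_NEUTRAL = 134.977
M_ELECTRON = 0.511

for name, mass in [("charged pion", M_PI_CHARGED), ("neutral pion", M_PI_NEUTRAL)]:
    print(f"{name} / electron mass ratio: {mass / M_ELECTRON:.0f}")
```

Both ratios (273 for the charged pion, 264 for the neutral) land near the quoted figure of about 270.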

Researchers in Japan are conducting a special data run to confirm whether the previous observations were true detections of new particles or simply noise.

"If it's confirmed, it's very exciting because it's completely unexpected," Tobioka said. "It might be noise, but it might not be. In this case, expectation of noise is very low, so even one event or observation is very striking. And in this case there were four."

Credit: 
Florida State University

Why runner's addiction is adding to your injury woes

Each week, millions of runners around the world lace up their running shoes, spurred on by the psychological, health and social benefits that running delivers.

The birth of Parkrun in 2004 - now an international activity with more than 20 countries involved - is credited with a sharp rise in the popularity of running in the past decade, but with benefits come downsides.

A new research paper by University of South Australia Adjunct Professor Jan de Jonge and his team reveals the price that runners (and society) pay when the sport becomes an obsession.

Prof de Jonge, based in the Netherlands at Eindhoven University of Technology and Utrecht University, surveyed 246 recreational runners aged 19 to 77 years to investigate how a person's mental outlook (mental recovery and passion for running) affects their risk of running-related injuries.

Not surprisingly, the more "obsessively passionate" runners - where the sport fully controlled their life to the detriment of partners, friends and relatives - reported far more running-related injuries than those who were more "harmoniously passionate" and laid back in their approach to running.

The latter group, who are in full control of their running and integrate the sport into their life and other activities, reported faster mental recovery after a run and sustained fewer running-related injuries. They were more likely to heed the early warning signs of injuries and take both physical and mental breaks from running whenever necessary.

Obsessively passionate runners disregarded the need to recover after training and failed to mentally detach from the sport, even when running became harmful. Their approach to running delivered short-term gains such as faster times but resulted in more running-related injuries.

Age and gender played a part. The older runners were able to mentally detach and recover a lot faster after a run than those in the 20-34 age group - especially females - who were more prone to running-related injuries.

"Most running-related injuries are sustained as a result of overtraining and overuse or failing to adequately recover, merely due to an obsessive passion for running," Prof de Jonge says.

"The majority of research focuses on the physical aspects of overtraining and lack of recovery time, but the mental aspects of running-related injuries have been ignored to date.

"When running becomes obsessive, it leads to problems. It controls the person's life at the expense of other people and activities and leads to more running-related injuries. This behaviour has also been reported in other sports, including professional dancing and cycling."

In the Netherlands, where the study was undertaken, running-related injuries cost the economy approximately €10 million a year (A$16 million) in medical costs, work absences and reduced productivity. Next to soccer, running is the Dutch sport with the highest number of injuries.

While there are no comparative figures available for Australia, a study by Medibank Private lists running as the 4th most injury-prone sport in Australia after Aussie Rules, basketball and netball, with sporting injuries overall costing the economy more than $2 billion a year.

The paper, "Mental Recovery and Running-Related Injuries in Recreational Runners: the Moderating Role of Passion for Running", is published open access in the International Journal of Environmental Research and Public Health.
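The "moderating role" in the paper's title refers to a statistical interaction: the effect of mental recovery on injury risk depends on the type of passion. Below is a minimal sketch of such a model, with invented coefficients rather than the study's estimates.

```python
import math

# Conceptual sketch only: a moderation (interaction) model of the kind the
# paper's title refers to. All coefficients below are invented for
# illustration, not estimated from the study's data.

def injury_risk(mental_recovery, obsessive_passion,
                b0=-0.5, b_rec=-0.8, b_obs=0.9, b_int=0.6):
    """Logistic model of injury risk given mental recovery (0..1, higher
    means better detachment) and obsessive passion (0..1). The interaction
    term b_int is what makes passion a *moderator* of the recovery effect."""
    z = (b0 + b_rec * mental_recovery + b_obs * obsessive_passion
         + b_int * mental_recovery * obsessive_passion)
    return 1 / (1 + math.exp(-z))

# For a harmonious runner (obsessive_passion = 0), better recovery lowers risk...
low_h  = injury_risk(mental_recovery=0.2, obsessive_passion=0.0)
high_h = injury_risk(mental_recovery=0.9, obsessive_passion=0.0)
# ...but for an obsessive runner the protective effect is weakened (moderated),
# and baseline risk is higher, consistent with the findings described above.
low_o  = injury_risk(mental_recovery=0.2, obsessive_passion=1.0)
high_o = injury_risk(mental_recovery=0.9, obsessive_passion=1.0)
print("harmonious:", low_h, "->", high_h)
print("obsessive: ", low_o, "->", high_o)
```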

Notes to editors

The pilot study was undertaken by Prof Jan de Jonge, Professor Toon W. Taris from Utrecht University and Dr Yannick A. Balk from the University of Amsterdam.

Prof Jan de Jonge is based at Eindhoven University of Technology and Utrecht University, and is an Adjunct Professor in the Asia Pacific Centre for Work Health and Safety at the University of South Australia.

The study examined 246 recreational runners (54 per cent male and 46 per cent female) with a mean age of 47 years. The average running experience was 14 years. On average, participants engaged in running activities three times a week, and the average running distance was about 27 kilometres per week. Two-thirds of the runners ran in groups, and approximately half of the runners used an individualised training schedule for their training activities.

Of all participants, 51.2 per cent reported running-related injuries over the past 12 months, such as knee, Achilles tendon and foot injuries.

Credit: 
University of South Australia