Tech

Building a culture of high-quality data

The era of big data has inundated nearly all scientific fields with torrents of newly available data with the power to stimulate new research and enable inquiry at scales not previously possible. This is particularly true for ecology, where rapid growth in remote sensing, monitoring, and community science initiatives has contributed to a massive surge in the quantity and kinds of environmental data that are available to researchers.
Writing in BioScience, a team led by US Department of Agriculture ecologist Sarah McCord states that the volume of the data is only part of the story. Just as important, they say, is the quality of the data. According to the newly published article, "Big data has magnified both the burden and the complexity of ensuring quality data." And a failure to ensure quality data, say the authors, may cause significant problems for science: "Breakdowns in data quality management can have dire consequences for the rigor of inferences drawn from data analyses, our understanding of ecosystems, and the predictive power of models and their uncertainty," which in turn affect real-world management decisions.
To meet these challenges, the authors propose a comprehensive data quality framework with the aim of encouraging best practices among collectors, curators, and users of ecological data. They argue that their proposed approach constitutes an improvement on the widely used DataONE data life cycle and similar approaches.
Key to the proposed framework is that, rather than isolating quality assurance and quality control at a single stage, it would separate the two and "encourage all ecologists and land managers, who increasingly rely on found data and may not have a personal relationship with the study initiators or data collectors, to participate in ensuring data quality." In so doing, say the authors, the proposed framework would lead to higher-quality ecological data useful to a greater number of users, with fewer errors among data sets.
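As a concrete illustration of the kind of quality-control step such a framework envisions, the short Python sketch below flags out-of-range and missing values in a monitoring table before the data are shared. The column names, thresholds, and toy values are hypothetical and are not taken from the study.

```
import pandas as pd

# Hypothetical ecological monitoring records (column names and values are illustrative only)
records = pd.DataFrame({
    "plot_id": ["A1", "A2", "A3", "A4"],
    "canopy_cover_pct": [45.0, 102.0, None, 63.5],   # percent cover should lie in [0, 100]
    "soil_ph": [6.8, 7.1, 5.9, 14.2],                # pH should lie in [0, 14]
})

# Quality-control rules: flag records that fall outside plausible ranges or are missing
flags = pd.DataFrame({
    "cover_out_of_range": ~records["canopy_cover_pct"].between(0, 100),
    "cover_missing": records["canopy_cover_pct"].isna(),
    "ph_out_of_range": ~records["soil_ph"].between(0, 14),
})

# Keep a record of every flagged row so downstream data users can see what was questioned
print(records.join(flags)[flags.any(axis=1)])
```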
Although deployment of the proposed framework presents challenges, the authors argue that "the imperative to take these steps is global. The quality assurance and quality control framework can enhance existing ecological data and collaboration approaches, reduce errors, and increase efficiency of ecological analysis, thereby improving ecological research and management."

Credit: 
American Institute of Biological Sciences

Tilapias are not precocious, they are just resilient

Tilapias living in crowded aquaculture ponds or small freshwater reservoirs adapt so well to these stressful environments that they stop growing and reproduce at a smaller size than their stress-free counterparts.

A new study by researchers at the University of Kelaniya in Sri Lanka and the University of British Columbia explains that while most fishes die when stressed, tilapias survive in rough environments by stunting their growth and carrying on with their lives in dwarf form.

"Tilapia and other fish in the Cichlidae family do not spawn 'earlier' than other fishes, as it is commonly believed," Upali S. Amarasinghe, lead author of the study and professor at the University of Kelaniya, said. "Rather, they are uncommonly tolerant of stressful environmental conditions which, however, elevate their oxygen demand."

As with other fishes, when a tilapia's metabolism accelerates, it needs more oxygen to sustain its body functions. But the combination of an accelerated metabolism and a growing body means the gills eventually reach a point where they cannot supply enough oxygen to the larger body, so the fish either dies or simply stops growing.

"Gill surface area grows in two dimensions, that is, length and width, but they cannot keep up with bodies that grow in three dimensions - length, width and depth," said Daniel Pauly, co-author of the study and principal investigator of the Sea Around Us initiative at UBC's Institute for the Oceans and Fisheries. "As fish get bigger, their gills provide less oxygen per unit of body weight. Thus, to stay alive in stressful conditions, which increase their oxygen demand, fish have to remain smaller. This theme is further developed in what I called the Gill Oxygen Limitation Theory."

In the case of tilapias, the stress they experience under suboptimal conditions adds to the stress of their gill surface not keeping up with the increasing oxygen demand of their growing bodies. As a consequence, the hormonal cascade that leads to maturation and spawning is triggered at smaller sizes than under optimal conditions.

But the spawning doesn't occur at a 'younger age,' as the fish's growth process has already ended.

To reach this conclusion, the researchers analyzed the length at first maturity and the maximum length reached in 41 populations of nine species of tilapia and other cichlids found in lakes and aquaculture ponds across the world, from Brazil to Uganda and from Egypt to Hong Kong.

When looking at the ratio between the maximum lengths these fishes can reach and their lengths when they reproduce for the first time, they found it was the same ratio previously identified in other freshwater and marine fishes.

"This ratio tells us that tilapias in stressful conditions don't spawn 'earlier,' they just adjust their size downward, but their life cycle continues," Amarasinghe said.

"These findings will matter to fish farmers, notably in Asia, whose ponds are often full of wildly reproducing, small tilapia for which there is no market," Pauly said.

Credit: 
University of British Columbia

Development of a broadband mid-infrared source for remote sensing

image: Diode-pumped configuration enables a compact/low-cost construction.

Image: 
National Institute for Fusion Science

A research team from the National Institutes of Natural Sciences' National Institute for Fusion Science and Akita Prefectural University has successfully demonstrated a broadband mid-infrared (MIR) source with a simple configuration. The light source generates a highly stable broadband MIR beam across the 2.5-3.7 μm wavelength range while maintaining brightness, owing to its high beam quality. Such a broadband MIR source enables a simplified environmental monitoring system built around a MIR fiber-optic sensor, with potential industrial and medical applications.

In the MIR wavelength region, many molecules have strong absorption lines arising from changes in their rotational and vibrational states. MIR sources therefore offer promising opportunities for developing sensitive remote monitoring systems of practical use. In particular, a fiber-optic sensor based on MIR absorption spectroscopy has great potential as a next-generation gas-detection device, for example as an exhaust gas monitor at an industrial plant or for breath analysis for medical purposes. However, a suitable MIR source that combines a broadband spectrum with high beam quality has been lacking. In this work, the research team demonstrated an ultra-broadband amplified spontaneous emission (ASE) source in the MIR region that meets the requirements for developing such a fiber-optic sensor.
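Gas detection by absorption spectroscopy rests on the Beer-Lambert law: transmitted intensity falls exponentially with the product of the absorption cross-section, the gas concentration, and the optical path length. The short Python sketch below illustrates the principle; the cross-section and path length are placeholder assumptions, not parameters from this work.

```
import math

# Beer-Lambert law: transmitted intensity I = I0 * exp(-sigma * N * L)
sigma = 1.0e-18   # absorption cross-section per molecule, cm^2 (placeholder value)
L = 100.0         # optical path length through the gas, cm (placeholder value)

for N in [1e14, 1e15, 1e16]:   # gas number densities, molecules per cm^3
    transmission = math.exp(-sigma * N * L)
    absorbance = -math.log10(transmission)
    print(f"N = {N:.0e} cm^-3  transmission = {transmission:.3f}  absorbance = {absorbance:.3f}")
```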

To obtain the MIR emission, the research team developed an optical fiber made of fluoride glass co-doped with trivalent ions of Er (atomic number 68) and Dy (66). This fiber enables a simple, low-cost ASE light source with diode pumping (Fig. 1) by means of energy transfer from Er3+ to Dy3+. A broadband, moderate-power ASE light source covering 2.5-3.7 μm (Fig. 2) was demonstrated experimentally, and the fluoride fiber design was optimized in terms of ion concentration, fiber length, pumping configuration, and pumping power. In addition, the light source exhibits excellent beam quality, resulting in high coupling efficiency into an external optical fiber.

Assistant Professor Hiyori Uehara in the research team states that, "Our new light source can facilitate a simplified MIR fiber-optic sensor device for various practical applications. For example, an environmental monitoring system in the industrial plant, stand-off detection of hazardous objects, disease diagnosis by breath analysis, inspection of fiber-optic devices, and others. Ongoing detailed research demonstrating highly-sensitive multiple-gas detection using a MIR fiber sensor will be performed and reported in the near future."

Credit: 
National Institutes of Natural Sciences

Research shows how a sugary diet early in life could mean memory trouble later

New research shows how drinking sugary beverages early in life may lead to impaired memory in adulthood.

The study, published today in Translational Psychiatry, also is the first to show how a specific change to the gut microbiome -- the bacteria and other microorganisms growing in the stomach and intestines -- can alter the function of a particular region of the brain.

According to the Centers for Disease Control and Prevention, sugar-sweetened beverages are a leading source of added sugars in Americans' diets. Nearly two-thirds of young people in the United States consume at least one sugary drink each day.

Neuroscientist Scott Kanoski, associate professor of biological sciences at the USC Dornsife College of Letters, Arts and Sciences, has studied the link between diet and brain function for years. His research has shown that consumption of sugary beverages impairs memory function in rats and that those same drinks change the gut microbiome.

In the current study, Kanoski and researchers at UCLA and the University of Georgia, Athens, sought to find out if a direct link exists between changes to the microbiome and memory function.

The scientists gave adolescent rats free access to a sugar-sweetened beverage similar to those that humans drink.

When the rats grew to be adults after about a month, the researchers tested their memories using two different methods. One method tested memory associated with a region of the brain called the hippocampus. The other method tested memory function controlled by a region called the perirhinal cortex.

The researchers found that, compared to rats that drank just water, the rats that consumed high levels of sugary drink had more difficulty with memory that uses the hippocampus. Sugar consumption did not affect memories made by the perirhinal cortex.

"Early life sugar consumption seems to selectively impair their hippocampal learning and memory," said study lead author Emily Noble, assistant professor in the UGA College of Family and Consumer Sciences and a former postdoctoral fellow at USC Dornsife.

The scientists then checked the rats' gut microbiomes and found differences between those that drank the sweet beverage and those that drank water. The sugar drinkers had larger populations of two particular species of gut bacteria: Parabacteroides distasonis and Parabacteroides johnsonii.

The researchers then asked if the Parabacteroides bacteria could, without the help of sugar, affect the rats' memory function. They transplanted Parabacteroides bacteria that were grown in the lab into the guts of adolescent rats that drank just water. The rats receiving the bacteria showed memory impairment in the hippocampus when they grew to adulthood much the same as the sugar-drinking rats.

The scientists also found that, unlike the sugar-drinking rats, the transplanted rats also showed memory impairment in the perirhinal cortex. This difference provides further evidence that altered brain function associated with diet may actually be rooted in changes to the gut microbiome.

Previous studies have transplanted the entire gut microbiome from one group of animals to another, producing similar changes to brain function. However, this study is among the first to do so with just two specific species.

"It was surprising to us that we were able to essentially replicate the memory impairments associated with sugar consumption not by transferring the whole microbiome, but simply by enriching a single bacterial population in the gut," said Kanoski, who is a corresponding author on the study.

Finally, the scientists examined the activity of genes in the hippocampus, comparing rats that drank the sugary beverage to those that drank just water and comparing water drinkers to those transplanted with Parabacteroides.

Gene activity did, in fact, change in both the rats that consumed the sugar-sweetened beverages and the rats transplanted with Parabacteroides. The genes that were affected control how nerve cells transmit electrical signals to other nerve cells and how they send molecular signals internally.

The results of this study confirm a direct link, on a molecular level, between the gut microbiome and brain function.

In future studies, Kanoski and the researchers hope to determine if changing habits, such as eating a healthier diet or increasing exercise, can reverse the harm to memory caused by elevated sugar consumption earlier in life.

Credit: 
University of Southern California

SMART study finds ridesharing intensifies urban road congestion

image: Multiple pathways of TNCs' impact on urban mobility

Image: 
Amy Wanjin Diao

Transport Network Companies (TNCs) not only increased road congestion but also acted as a net substitute for public transit, reducing public transit ridership by almost 9%

The reduction in private vehicle ownership due to TNCs was insignificant

Research findings can provide valuable insights for transportation policy and regulation

Singapore, 31 March 2021 - Transport Network Companies (TNCs), or ridesharing companies, have gained widespread popularity across much of the world, with more and more cities adopting their services. While ridesharing has been credited with being more environmentally friendly than taxis and private vehicles, is that really the case today, or do these services instead contribute to urban congestion?

Researchers at the Future Urban Mobility (FM) Interdisciplinary Research Group (IRG) at the Singapore-MIT Alliance for Research and Technology (SMART), MIT's research enterprise in Singapore, together with colleagues at the Massachusetts Institute of Technology (MIT) and Tongji University, conducted a study to find out.

In a paper titled "Impacts of transportation network companies on urban mobility," recently published in the journal Nature Sustainability, the researchers present a first-of-its-kind study assessing three aspects of how ridesharing impacts urban mobility in the United States - road congestion, public transport ridership and private vehicle ownership - and how these impacts have evolved over time.

"While public transportation provides high-efficiency shared services, it can only accommodate a small portion of commuters as their coverage is limited in most places," says SMART FM Principal Investigator and Associate Professor at MIT Department of Urban Studies and Planning Jinhua Zhao. "While mathematical models in prior studies showed that the potential benefit of on-demand shared mobility could be tremendous, our study suggests that translating this potential into actual gains is much more complicated in the real world."

Using a panel dataset covering mobility trends, socio-demographic changes, and TNC entry at the Metropolitan Statistical Area (MSA) level to construct a set of fixed-effect panel models, the researchers found that the entry of TNCs led to increased road congestion in terms of both intensity and duration. They also found an 8.9% drop in public transport ridership and an insignificant decrease of only 1% in private vehicle ownership.
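For readers unfamiliar with fixed-effect panel models, the general structure can be sketched with standard statistical tooling: MSA and year fixed effects absorb location-specific and nationwide trends, and the coefficient on a TNC-entry indicator captures the estimated effect. The Python example below uses statsmodels with a tiny, made-up panel purely for illustration; it is not the study's dataset or specification.

```
import pandas as pd
import statsmodels.formula.api as smf

# Toy panel: congestion observed for several MSAs over several years (illustrative data only)
df = pd.DataFrame({
    "msa":        ["A", "A", "A", "B", "B", "B", "C", "C", "C"],
    "year":       [2010, 2012, 2014, 2010, 2012, 2014, 2010, 2012, 2014],
    "tnc_entry":  [0, 1, 1, 0, 0, 1, 0, 1, 1],      # 1 once a TNC operates in the MSA
    "congestion": [1.10, 1.22, 1.27, 1.05, 1.08, 1.19, 1.15, 1.26, 1.30],
})

# Two-way fixed effects via MSA and year dummies; the tnc_entry coefficient is the quantity of interest
model = smf.ols("congestion ~ tnc_entry + C(msa) + C(year)", data=df).fit()
print(model.params["tnc_entry"])
```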

While many previous studies have focused on Uber alone, the latest SMART and MIT study takes into account both Uber and Lyft - the two most popular ridesharing companies in the United States. Uber accounts for 69% of the market and Lyft for a significant 29%, so including both in the dataset gives a more holistic and unbiased estimate of the TNC effect.

The study also found that easy access to ridesharing discourages commuters from taking greener alternatives such as walking or public transportation. Survey data from various US cities showed that approximately half of TNC trips would otherwise have been made by walking, cycling or public transport, or would not have been made at all.

"We are still in the early stages of TNCs and we are likely to see many changes in how these ridesharing businesses operate," says Dr Hui Kong, SMART-FM alumna and Postdoctoral Associate at the MIT Urban Mobility Lab, and an author of the paper. "Our research shows that over time TNCs have intensified urban transport challenges and road congestion in the United States, mainly through the extended duration and slightly through the increased intensity. With this information, policies can then be introduced that could lead to positive changes."

The researchers think that the substantial deadheading miles (miles travelled without a passenger) by TNCs could contribute to the TNC's negative impact on road congestion. According to some other studies, approximately 40.8% of TNC miles are deadheading miles.

"Our findings can provide useful insights into the role that TNCs have played in urban transport systems," says Professor Mi Diao of Tongji University and SMART-FM alumnus, who is the lead author of the paper. "It can be very useful in supporting transportation planners and policymakers in their decisions and regulations with regard to TNCs."

Credit: 
Singapore-MIT Alliance for Research and Technology (SMART)

New study sows doubt about the composition of 70 percent of our universe

Until now, researchers have believed that dark energy accounted for nearly 70 percent of the ever-accelerating, expanding universe.

For many years, this mechanism has been associated with the so-called cosmological constant, introduced by Einstein in 1917, which refers to an unknown repellent cosmic force.

But because the cosmological constant--known as dark energy--cannot be measured directly, numerous researchers, including Einstein, have doubted its existence--without being able to suggest a viable alternative.

Until now. In a new study, researchers at the University of Copenhagen tested a model that replaces dark energy with a form of dark matter endowed with magnetic forces.

"If what we discovered is accurate, it would upend our belief that what we thought made up 70 percent of the universe does not actually exist. We have removed dark energy from the equation and added in a few more properties for dark matter. This appears to have the same effect upon the universe's expansion as dark energy," explains Steen Harle Hansen, an associate professor at the Niels Bohr Institute's DARK Cosmology Centre.

The universe expands no differently without dark energy

The usual understanding of how the universe's energy is distributed is that it consists of five percent normal matter, 25 percent dark matter and 70 percent dark energy.

In the UCPH researchers' new model, the 25 percent share of dark matter is accorded special qualities that make the 70 percent of dark energy redundant.

"We don't know much about dark matter other than that it is a heavy and slow particle. But then we wondered--what if dark matter had some quality that was analogous to magnetism in it? We know that as normal particles move around, they create magnetism. And, magnets attract or repel other magnets--so what if that's what's going on in the universe? That this constant expansion of dark matter is occurring thanks to some sort of magnetic force?" asks Steen Hansen.

Computer model tests dark matter with a type of magnetic energy

Hansen's question served as the foundation for the new computer model, where researchers included everything that they know about the universe--including gravity, the speed of the universe's expansion and X, the unknown force that expands the universe.

"We developed a model that worked from the assumption that dark matter particles have a type of magnetic force and investigated what effect this force would have on the universe. It turns out that it would have exactly the same effect on the speed of the university's expansion as we know from dark energy," explains Steen Hansen.

However, there remains much about this mechanism that has yet to be understood by the researchers.

And it all needs to be checked in better models that take more factors into consideration. As Hansen puts it:

"Honestly, our discovery may just be a coincidence. But if it isn't, it is truly incredible. It would change our understanding of the universe's composition and why it is expanding. As far as our current knowledge, our ideas about dark matter with a type of magnetic force and the idea about dark energy are equally wild. Only more detailed observations will determine which of these models is the more realistic. So, it will be incredibly exciting to retest our result.

Credit: 
University of Copenhagen - Faculty of Science

Advances in tropical cyclone observation may aid in disaster reduction and prevention

image: South China Sea Experiment 2020 of the "Petrel Project". The first three-dimensional observational data of a tropical cyclone system were obtained by multiple unmanned vehicles in an experiment conducted by 12 institutions under the umbrella of the Petrel Project. A high-altitude large unmanned aerial vehicle outfitted with measurement tools flew over Typhoon Sinlaku, a storm system that began at the end of July 2020 and dissipated on 3 August.

Image: 
Advances in Atmospheric Sciences

Tropical cyclones -- known as typhoons in the Pacific and as hurricanes in the Atlantic -- are fierce, complex storm systems that cause loss of human life and billions of dollars in damage every year. For decades, scientists have studied each storm, striving to understand the system yet unable to fully measure every intricate variable. Now, the convergence of new observational tools and the launch of an inclusive database may elucidate the inner workings of tropical cyclones in the western North Pacific and South China Sea.

Three papers were published in the latest issue of Advances in Atmospheric Sciences. One, led by the China Meteorological Administration (CMA), focuses on a new tropical cyclone database; the other two, led respectively by the CMA's Petrel Meteorological Observation Experiment Project (Petrel Project) and by the Institute of Atmospheric Physics at the Chinese Academy of Sciences, report an extensive marine observing experiment based on unmanned vehicles.

"The improvement of tropical cyclones prediction accuracy depends not only on the understanding of tropical cyclone dynamics but also on the accuracy of tropical cyclone location and intensity estimations, which is the basis of operational tropical cyclone prediction and research," said Xiaoqin Lu, Shanghai Typhoon Institute, CMA, and Key Laboratory of Numerical Modeling for Tropical Cyclone, CMA. Lu was first author on the CMA tropical cyclone database paper.

With eight separate datasets, including one compiled by the CMA with measurements dating to 1949, the database comprises historical or real-time locations, intensity, dynamic and thermal structures, wind strengths, precipitation amounts, frequency and more, according to Lu.

"The CMA tropical cyclone database is the only multi-source, multi-timescale, multi-spatial-scale comprehensive tropical cyclone database that covers the western North Pacific," Lu said. "This database can provide basic scientific support for researchers, forecasters and government departments performing tropical cyclone research, forecasts and disaster reduction planning."

Tropical cyclone datasets from multiple sources were previously compiled as resources for forecasters and researchers in 2017 and 2019.

"These resources have played an important role in revealing tropical cyclone climatic rules in the western north Pacific and the South China Sea, allowing researcher to search for impact factors with physical significance and finally establish an operational prediction scheme," Lu said. "We hope that this database will be used for the application of new technologies, such as artificial intelligence, in a wide range of disciplines."

Developing technology will soon contribute to the database. The first three-dimensional observational data of a tropical cyclone system was obtained by multiple unmanned vehicles in an experiment conducted by 12 institutions under the umbrella of the Petrel Project. A high-altitude large unmanned aerial vehicle (UAV) outfitted with measurement tools flew over Typhoon Sinlaku, a storm system that began at the end of July 2020 and dissipated on Aug. 3. At the same time, a solar-powered marine unmanned surface vehicle (USV) developed by the Institute of Atmospheric Physics actively navigated itself to the eye of the storm. The voyage of the USV is detailed in one of the papers. Two buoys also took observational data during the storm.

"The data contain measurements of 21 oceanic and meteorological parameters acquired by the five devices, along with video footage from the UAV," said Xuefan Zhang of the Meteorological Observation Center, CMA and first author of the Petrel Project paper. "The data proved very helpful in determining the actual location and intensity of Typhoon Sinlaku. The experiment demonstrates the feasibility of using a high-altitude, large UAV to fill in the gaps between operational meteorological observations of marine areas and typhoons near China, and it marks a milestone for the use of such data for analyzing the structure and impact of a typhoon in the South China Sea."

The improved ability to take detailed, real-time observational data will help researchers understand how tropical cyclones form and develop, according to Zhang.

Zhang plans to continue conducting comprehensive tropical cyclone observations and use the data to develop an operation system to better understand and predict the storms, and Lu plans to expand and enrich the multi-source database with more field experiments and new equipment to explore the finer details of tropical cyclone structure and operation. According to both researchers, the ultimate goal of their projects is to better understand tropical cyclones in order to help mitigate and reduce disasters.

Credit: 
Institute of Atmospheric Physics, Chinese Academy of Sciences

Flood risk uncertainties assessed at the global scale

image: Researchers from the Institute of Industrial Science, The University of Tokyo calculate sources of uncertainty in flood risks to improve global flood predictions

Image: 
Institute of Industrial Science, the University of Tokyo

Tokyo, Japan - A research team from the Institute of Industrial Science, The University of Tokyo has conducted a detailed analysis of the uncertainties associated with flood risk modeling at the global scale. They found that large uncertainties were mainly associated with runoff data. Flood magnitude is large in wet regions, but uncertainties in flood depth are larger in dry and mountainous regions affected by rare, extreme floods. The results of the study can be used to identify key areas for improvement in hydrological modeling and to improve future predictions of flood risk.

Assessment of the risk of rare and extreme floods is essential for disaster management and recovery planning at international, national, and regional levels. However, accurate assessment of flood risk is limited by the scarcity of observations and relies on hydrological models of limited performance. Theoretical flood hazard maps, which are relied on by governments, regional planners, insurance services, and other stakeholders, are an important part of understanding the potential extent of flood risk. However, they are subject to high levels of uncertainty, especially at large scales.

"The main problem is that the creation of accurate flood hazard maps relies on excellent flood frequency analysis," explains study co-author Dr. Wenchao Ma, "and the main limitation of that is its reliance on assumptions about the underlying distribution of flood events, which is hugely variable among different regions of the world. There is no single solution that can be applied everywhere."

The research team found that uncertainties in runoff inputs contributed more than 80% of the total uncertainty. Overall uncertainties were highest in Africa, but exposure risk was greatest in Asia. "We found that land susceptible to rare floods - called 1-in-100-year floods - accounts for 9.1% of the global area, excluding Antarctica," says study lead author Dr. Xudong Zhou. "In addition, the numbers show 13.4% of the population may be exposed to such a flood event, with a potential economic impact of up to 14.9 trillion US dollars to global GDP."
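A "1-in-100-year" flood has a 1 percent chance of being equaled or exceeded in any given year, but the cumulative chance over a longer horizon is much higher. The short Python calculation below illustrates this standard exceedance-probability arithmetic; it is a generic illustration, not a result from the study.

```
# Probability of at least one T-year flood over a horizon of n years: 1 - (1 - 1/T)^n
T = 100                      # return period in years (a "1-in-100-year" flood)
for n in [1, 30, 100]:       # horizons: one year, a typical mortgage, a century
    p = 1 - (1 - 1 / T) ** n
    print(f"chance of at least one {T}-year flood in {n} years: {p:.1%}")
```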

More adjustments are needed to flood hazard maps to ensure their accuracy. Further data are needed to take into account the effects of flood defenses and dam regulation and to improve inputs for areas with large uncertainties, such as Africa. Such improvements would reduce these uncertainties and make these resources more useful under future climatic conditions.

Credit: 
Institute of Industrial Science, The University of Tokyo

NTU Singapore scientists design 'smart' device to harvest daylight

video: Bright idea: Daylight can now be brought underground, thanks to NTU Singapore scientists who have built a 'smart' device to harvest daylight and light up underground spaces. Like a magnifying glass, the crystal-ball-shaped device focuses sunlight into one point, where it is collected by one end of a fibre cable and transported to the other end, deployed underground, where the light is emitted. This reduces the need to tap traditional energy sources for lighting.

Image: 
NTU Singapore

A team of Nanyang Technological University, Singapore (NTU Singapore) researchers has designed a 'smart' device to harvest daylight and relay it to underground spaces, reducing the need to draw on traditional energy sources for lighting.

In Singapore, authorities are looking at the feasibility of digging deeper underground to create new space for infrastructure, storage, and utilities. Demand for round-the-clock underground lighting is therefore expected to rise in the future.

To develop a daylight harvesting device that can sustainably meet this need, the NTU team drew inspiration from the magnifying glass, which can be used to focus sunlight into one point.

They used an off-the-shelf acrylic ball, a single plastic optical fibre - a type of cable that carries a beam of light from one end to another - and computer chip-assisted motors.

The device sits above ground and just like the lens of a magnifying glass, the acrylic ball acts as the solar concentrator, enabling parallel rays of sunlight to form a sharp focus at its opposite side. The focused sunlight is then collected into one end of a fibre cable and transported along it to the end that is deployed underground. Light is then emitted via the end of the fibre cable directly.

At the same time, small motors - assisted by computer chips - automatically adjust the position of the fibre's collecting end, to optimise the amount of sunlight that can be received and transported as the sun moves across the sky.

Developed by Assistant Professor Yoo Seongwoo from the School of Electrical and Electronics Engineering and Dr Charu Goel, Principal Research Fellow at NTU's The Photonics Institute, the innovation was reported in the peer-reviewed scientific journal Solar Energy early this month.

The device overcomes several limitations of current solar harvesting technology. In conventional solar concentrators, large, curved mirrors are moved by heavy-duty motors to align the mirror dish to the sun. The components in those systems are also exposed to environmental factors like moisture, increasing maintenance requirements.

The NTU device, however, is designed to use the round shape of the acrylic ball, ridding the system of heavy-duty motors to align with the sun, and making it compact.

The prototype designed by the researchers weighs 10 kg and has a total height of 50 cm. To protect the acrylic ball from environmental conditions (ultraviolet light, dust, etc.), the researchers also built a 3 mm-thick, transparent dome-shaped cover out of polycarbonate.

Device compact enough to be mounted as a lamp post

Asst Prof Yoo, lead author of the study said, "Our innovation comprises commercially available off-the-shelf materials, making it potentially very easy to fabricate at scale. Due to space constraints in densely populated cities, we have intentionally designed the daylight harvesting system to be lightweight and compact. This would make it convenient for our device to be incorporated into existing infrastructure in the urban environment."

The NTU team believes the device is ideally suited to be mounted as a conventional lamp post above ground. This would enable the innovation to be used in two ways: a device to harvest sunlight in the day to light up underground spaces, and a streetlamp to illuminate above ground at night using electricity.

The research by the NTU scientists is an example of NTU's Smart Campus vision that aims to develop technologically advanced solutions for a sustainable future.

'Smart' automatic positioning to harvest maximum sunlight

As the sun moves across the sky throughout the day, so will the position of the focused sunlight inside the acrylic ball. To ensure that maximum sunlight is being collected and transported down the fibre cable throughout the day, the system uses a computer chip-based mechanism to track the sun rays.

The Global Positioning System (GPS) coordinates of the device's location are pre-loaded into the system, allowing it to determine the spot where maximum sunlight should be focused at any given time. Two small motors are then used to automatically adjust the position of the fibre to catch and transport sunlight from the focused spot at one-minute intervals.
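Predicting where the focused spot will fall amounts to computing the sun's position from the site's coordinates and the time. The Python sketch below uses standard textbook approximations for solar declination and hour angle to estimate solar elevation; it is an illustration of the idea, not the device's actual control code.

```
import math

def solar_elevation(latitude_deg, day_of_year, solar_hour):
    """Approximate solar elevation angle (degrees) from latitude, date, and local solar time."""
    # Declination of the sun (Cooper's approximation), in degrees
    decl = 23.45 * math.sin(math.radians(360.0 * (284 + day_of_year) / 365.0))
    # Hour angle: 15 degrees per hour away from solar noon
    hour_angle = 15.0 * (solar_hour - 12.0)
    lat, dec, ha = map(math.radians, (latitude_deg, decl, hour_angle))
    elevation = math.asin(math.sin(lat) * math.sin(dec) +
                          math.cos(lat) * math.cos(dec) * math.cos(ha))
    return math.degrees(elevation)

# Example: Singapore (about 1.35 deg N) near the March equinox, through the morning
for hour in [8, 10, 12]:
    print(hour, round(solar_elevation(1.35, 80, hour), 1))
```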

To guarantee the device's automatic positioning capability, pairs of sensors that measure light brightness are also placed around the sunlight collecting end of the fibre cable. Whenever the sensors detect inconsistencies in the light measurements, the small motors automatically activate to adjust the cable's position until the values on the sensors are the same. This indicates that the fibre is catching the maximum amount of sunlight possible.

During rain or overcast skies when there is inadequate sunlight to be collected and transported underground, an LED bulb powered by electricity installed right next to the emitting end of the fibre cable, will automatically light up. This ensures that the device can illuminate underground spaces throughout the day without interruption.

Performs better than LED bulbs

In experiments in a pitch-black storeroom (to simulate an underground environment), the NTU researchers found the device's luminous efficacy - the measure of how well a light source produces visible light from 1 watt of electrical power - to be 230 lumens per watt.

This far exceeds that of commercially available LED bulbs, which typically produce around 90 lumens per watt. The quality of the light output of the NTU smart device is also comparable to current commercially available daylight harvesting systems, which are far more costly.

Dr Charu, who is the first author of the study, said, "The luminous efficacy of our low-cost device proves that it is well-suited for low-level lighting applications, like car parks, lifts, and underground walkways in dense cities. It is also easily scalable. Since the light capturing capacity of the ball lens is proportional to its size, we can customise the device to a desired output optical power by replacing it with a bigger or smaller ball."

Michael Chia, Managing Director at Technolite, a Singapore-based design-focused agency specialising in lighting and the industry collaborator on the research study, said, "It is our privilege and honour to take this innovation journey with NTU. While we have the commercial and application knowledge, NTU's in-depth know-how from a technical perspective has taken the execution of the project to the next level, beyond our expectations."

Moving forward, the lighting company is exploring ways to potentially incorporate the smart device or its related concepts into its industrial projects for improved efficiency and sustainability.

Credit: 
Nanyang Technological University

A successful phonon calculation within the Quantum Monte Carlo framework

image: Phonon dispersion of diamond calculated at the variational Monte Carlo level by TurboRVB

Image: 
Kousuke Nakano from JAIST

Ishikawa, Japan - The focus and ultimate goal of computational research in materials science and condensed matter physics is to solve the Schrödinger equation--the fundamental equation describing how electrons behave inside matter--exactly, without resorting to simplifying approximations. While experiments can certainly provide interesting insights into a material's properties, it is often computations that reveal the underlying physical mechanism. Such computations need not rely on experimental data and can, in fact, be performed independently, an approach known as "ab initio calculations". Density functional theory (DFT) is a popular example of such an approach.

For most materials scientists and condensed matter physicists, DFT calculations are the bread and butter of their profession. However, despite being a powerful technique, DFT has had limited success with "strongly correlated materials"--materials with unusual electronic and magnetic properties. These materials, while interesting in their own right, also possess technologically useful properties, a fact that strongly motivates an ab initio framework suited to describing them.

To that end, a framework known as "ab initio quantum Monte Carlo" (QMC) has shown considerable promise and is expected to be the next generation of electronic structure calculations due to its superiority over DFT. However, even QMC is largely restricted to calculations of energy and atomic forces, limiting its utility in computing useful material properties.

Now, in a breakthrough study published in Physical Review B (Editors' Suggestion), scientists have taken things to the next level with an approach that reduces the statistical error in atomic force evaluation by two orders of magnitude, thereby speeding up the computation by a factor of 10,000. "The drastic reduction in computational time will greatly expand the range of QMC calculations and enable highly accurate prediction of atomic properties of materials that have been difficult to handle," observes Assistant Professor Kousuke Nakano from the Japan Advanced Institute of Science and Technology (JAIST), who, along with his colleagues Prof. Ryo Maezono from JAIST, Prof. Sandro Sorella from the International School for Advanced Studies (SISSA), Italy, and Dr. Tommaso Morresi and Prof. Michele Casula from Sorbonne Université, France, led this work.
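The factor of 10,000 follows from how Monte Carlo statistical error scales with sample size: the error shrinks as the inverse square root of the number of samples, so a hundredfold reduction in error at fixed sampling effort is equivalent to needing 100^2 = 10,000 times fewer samples for the same target error. The short Python check below restates this standard scaling; it is not code from the study.

```
import math

# Monte Carlo error scales as 1/sqrt(N), so matching a 100x smaller error with the
# old estimator would require 100^2 = 10,000 times more samples.
error_reduction = 100
sample_ratio = error_reduction ** 2
assert math.isclose(1 / math.sqrt(sample_ratio), 1 / error_reduction)
print(f"equivalent speed-up at fixed target error: {sample_ratio}x")
```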

The team applied their developed method to calculate the atomic vibrations of diamond, a typical reference material, as a proof-of-concept and showed that the results were consistent with experimental values. To perform these calculations, they used a large computer, Cray-XC40, located at the Research Center for Advanced Computing Infrastructure at Japan Advanced Institute of Science and Technology (JAIST), Japan, along with another located at RIKEN, Japan. The team made use of a QMC software package called "TurboRVB", initially launched by Prof. Sorella and Prof. Casula and developed later by Prof. Nakano along with others, to perform phonon dispersion calculations for diamond that were previously inaccessible, greatly expanding its scope.

Prof. Nakano looks forward to the applications of QMC in materials informatics (MI), a field dedicated to the design and search for novel materials using techniques of information science and computational physics. "While MI is currently governed by DFT, the rapid developments in computer performance, such as the exascale supercomputer, will help QMC gain popularity. In that regard, our developed method is going to be very useful for designing novel materials with real-life applications," concludes an optimistic Dr. Nakano.

Credit: 
Japan Advanced Institute of Science and Technology

A new review on how to fight COVID-19 during the British wintertime

A new report is highlighting ways we can fight COVID-19 while indoors during cold weather periods.

At the beginning of the COVID-19 crisis, there was a lack of empirical evidence on the virus's airborne transmission. However, an increasing body of evidence - gathered particularly from poorly ventilated environments - has given the scientific community a better understanding of how the disease progresses. Information on the asymptomatic and pre-symptomatic transmission of the virus strongly supports the case for airborne transmission of COVID-19.

In a study published in the journal Proceedings of the Royal Society A, scientists from the University of Surrey, together with other members of the Royal Society's Rapid Assistance in Modelling the Pandemic (RAMP) initiative, conducted a literature review of how two types of indoor space - open-plan offices and school classrooms - are conditioned and ventilated.

Their primary recommendation was that assessments of ventilation provision - or, where practical, monitoring of CO2 levels as an indicator of ventilation - should be carried out to help manage the risk of COVID-19 transmission via the airborne route.
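The reason CO2 works as a ventilation indicator can be seen from a steady-state mass balance: the indoor CO2 excess over outdoor air equals the occupants' CO2 generation rate divided by the outdoor-air supply rate. The Python sketch below uses common textbook values for illustration; the numbers are assumptions, not figures from the review.

```
# Steady-state CO2 mass balance: C_indoor = C_outdoor + G / Q
# G = CO2 generation rate of the occupants, Q = outdoor-air ventilation rate
outdoor_ppm = 420                 # typical outdoor CO2 concentration
occupants = 25                    # e.g., a classroom (illustrative)
gen_per_person = 0.005            # L/s of CO2 per person, a common textbook value for light activity

for q_per_person in [4, 8, 12]:   # outdoor air supplied per person, L/s
    Q = occupants * q_per_person
    G = occupants * gen_per_person
    indoor_ppm = outdoor_ppm + (G / Q) * 1e6   # convert volume fraction to ppm
    print(f"{q_per_person} L/s per person -> about {indoor_ppm:.0f} ppm CO2 at steady state")
```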

The researchers also found that humidity can influence the spread of the virus. While higher humidity might help to reduce the spread, it could also lead to other health issues related to the growth of mould and other pathogens. The researchers recommend that in cold weather, humidity should be maintained at between 40 and 50 per cent.

The report also confirms that social distancing and the use of face masks continue to play an essential role in reducing the risk of transmitting the virus when combined with good ventilation of indoor spaces.

The researchers found no convincing evidence that desk and ceiling fans reduce the transmission of COVID-19. However, there may be some overall health benefit to using a fan to increase air circulation in certain enclosed spaces.

Dr Oleksiy Klymenko, co-author and lecturer in Chemical and Process Engineering at the University of Surrey, said: "This awful year living with COVID-19 has motivated the scientific community to understand all that we can about how this dangerous virus behaves in an indoor environment. We hope that our review will be a valuable tool for managing the virus in future."

Dr Michael Short, co-author and lecturer in Chemical and Process Engineering at the University of Surrey, said:

"While sustainability and temperature control have been important considerations during the construction process of a built environment, the pandemic has shown us the importance of moving air quality and ventilation processes up on that agenda. We hope this report will contribute to future debate about how to make sure our indoor environments are safer for all."

Credit: 
University of Surrey

An organic material for the next generation of HVAC technologies

image: Dehumidifiers with enhanced polyimide membranes (white disc) will be energy efficient with a smaller carbon footprint.

Image: 
Dharmesh Patel/Texas A&M Engineering

On sultry summer afternoons, heating, ventilation and air conditioning (HVAC) systems provide much-needed relief from the harsh heat and humidity. These systems, which often come with dehumidifiers, are currently not energy efficient, guzzling around 76% of the electricity in commercial and residential buildings.

In a new study, Texas A&M University researchers have described a class of organic materials, called polyimides, that uses less energy to dry air. Furthermore, the researchers say polyimide-based dehumidifiers could bring down the price of HVAC systems, which currently cost thousands of dollars.

"In this study, we took an existing and rather robust polymer and then improved its dehumidification efficiency," said Hae-Kwon Jeong, McFerrin Professor in the Artie McFerrin Department of Chemical Engineering. "These polymer-based membranes, we think, will help develop the next generation of HVAC and dehumidifier technologies that are not just more efficient than current systems but also have a smaller carbon footprint."

The results of the study are described in the Journal of Membrane Science.

Dehumidifiers remove moisture from the air to a comfortable level of dryness, thereby improving air quality and eliminating dust mites, among other useful functions. The most commonly available dehumidifiers use refrigerants. These chemicals dehumidify by cooling the air and reducing its ability to carry water. However, despite their popularity, refrigerants are a source of greenhouse gases, a major culprit for global warming.

As an alternative material for dehumidification, naturally occurring materials known as zeolites have been widely considered for their drying action. Unlike refrigerants, zeolites are desiccants that can absorb moisture within their water-attractive or hydrophilic pores. Although these inorganic materials are green and have excellent dehumidification properties, zeolite-based dehumidifiers pose challenges of their own.

"Scaling up is a big problem with zeolite membranes," Jeong said. "First, zeolites are expensive to synthesize. Another issue comes from the mechanical properties of zeolites. They are weak and need really good supporting structures, which are quite expensive, driving up the overall cost."

Jeong and his team turned to a cost-effective class of organic materials called polyimides, well known for their rigidity and tolerance of heat and chemicals. At the molecular level, the basic building blocks of these high-performance polymers are repeating, ring-shaped imide groups connected in long chains. Jeong said the attractive forces between the imides give the polymer its characteristic strength and thus an advantage over mechanically weak zeolites. But the dehumidification properties of the polyimide material needed enhancement.

The researchers first created a film by carefully applying polyimide molecules onto alumina platforms a few nanometers wide. Next, they put this film in a highly concentrated sodium hydroxide solution, triggering a chemical process called hydrolysis. The reaction caused some of the imide groups to break open and become hydrophilic. When they examined the film under a high-powered microscope, the researchers found that the hydrolysis reactions had formed water-attracting percolation channels, or highways, within the polyimide material.

When Jeong's team tested the enhanced material, they found that the polyimide membrane was highly permeable to water molecules. In other words, the membrane could extract excess moisture from the air by trapping it in the percolation channels. The researchers noted that these membranes could be operated continuously without the need for regeneration, since the trapped water molecules are drawn out from the other side by a vacuum pump installed within a standard dehumidifier.
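Membrane dehumidification performance is often summarized by a permeance, with the water-vapor flux roughly equal to the permeance times the vapor-pressure difference across the membrane. The Python sketch below is a generic illustration with placeholder values; it does not use measurements from this study.

```
# Water-vapor flux through a membrane: J = permeance * (p_feed - p_permeate)
permeance = 5e-6          # mol / (m^2 * s * Pa), placeholder value for illustration
p_feed = 2000.0           # water vapor partial pressure on the humid-air side, Pa (placeholder)
p_permeate = 200.0        # vapor pressure on the vacuum side maintained by the pump, Pa (placeholder)
area = 2.0                # membrane area, m^2 (placeholder)

flux = permeance * (p_feed - p_permeate)             # mol / (m^2 * s)
water_per_hour = flux * area * 3600 * 18.02 / 1000    # kg of water removed per hour
print(f"flux = {flux:.2e} mol/m^2/s, roughly {water_per_hour:.2f} kg of water per hour")
```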

Jeong said his team carefully designed the experiments to achieve partial hydrolysis, in which only a controlled fraction of the imide groups becomes hydrophilic.

"The strength of polyimides comes from their intermolecular forces between their chains," Jeong said. "If too many imides are hydrolyzed, then we are left with weak material. On the other hand, if the hydrolysis is too low, the material won't be effective at dehumidification."

Although polyimide membranes have shown great promise in their potential use in dehumidification, Jeong said their performance still lags behind zeolite membranes.

"This is a new approach to improve the property of a polymer for dehumidification and a lot more optimizations need to be done in order to further enhance the performance of this membrane," Jeong said. "But another key factor for engineering applications is it has to be cheap, especially if you want the technology to be reasonably affordable for homeowners. We are not there yet but are certainly taking strides in that direction."

Credit: 
Texas A&M University

COVID-19-associated seizures may be common, linked to higher risk of death

BOSTON - COVID-19 can have damaging effects on multiple organs in the body, including the brain. A new study led by investigators at Massachusetts General Hospital (MGH) and Beth Israel Deaconess Medical Center (BIDMC) indicates that some hospitalized patients with COVID-19 experience non-convulsive seizures, which may put them at a higher risk of dying. The findings are published in the Annals of Neurology.

"Seizures are a very common complication of severe critical illness. Most of these seizures are not obvious: Unlike seizures that make a person fall down and shake, or convulse, seizures in critically ill patients are usually nonconvulsive," explains co-senior author M. Brandon Westover, MD, PhD, an investigator in the Department of Neurology at MGH and director of Data Science at the MGH McCance Center for Brain Health. "There is increasing evidence that non-convulsive seizures can damage the brain and make outcomes worse, similar to convulsions."

Westover notes that there have been only a few small reports of seizures in patients with severe COVID-19 illness, and it was previously unclear whether such seizures primarily occur in patients who already have a seizure disorder or whether they can arise for the first time because of COVID-19. The effects of such seizures on patients' health were also unknown.

To provide insights, Westover and his colleagues analyzed medical information for 197 hospitalized patients with COVID-19 who underwent electroencephalogram (EEG) monitoring--tests that detect electrical activity of the brain using small metal discs attached to the scalp--for various reasons at nine institutions in North America and Europe.

The EEG tests detected nonconvulsive seizures in 9.6% of patients, some of whom had no prior neurological problems. Patients who had seizures needed to be hospitalized for a longer time, and they were four times more likely to die while in the hospital than patients without seizures--suggesting that neurological complications may be an important contributor to the morbidity and mortality associated with COVID-19.

"We found that seizures indeed can happen in patients with COVID-19 critical illness, even those without any prior neurologic history, and that they are associated with worse outcomes: higher rates of death and longer hospital stay, even after adjusting for other factors," says co-senior author Mouhsin Shafi, MD, PhD, an investigator in the Department of Neurology at BIDMC, medical director of the BIDMC EEG laboratory, and director of the Berenson-Allen Center for Noninvasive Brain Stimulation. "Our results suggest that patients with COVID-19 should be monitored closely for nonconvulsive seizures. Treatments are available and warranted in patients at high risk; however, further research is needed to clarify how aggressively to treat seizures in COVID-19."

Credit: 
Massachusetts General Hospital

Kumon or Montessori? It may depend on your politics, according to new study of 8,500 parents

HOUSTON - (March 30, 2021) - Whether parents prefer a conformance-oriented or independence-oriented supplemental education program for their children depends on political ideology, according to a study of more than 8,500 American parents by a research team from Rice University and the University of Texas at San Antonio.

"Conservative parents have a higher need for structure, which drives their preference for conformance-oriented programs," said study co-author Vikas Mittal, a professor of marketing at Rice's Jones Graduate School of Business. "Many parents are surprised to learn that their political identity can affect the educational choices they make for their children."

Supplemental education programs include private tutoring, test preparation support, and educational books and materials, as well as online educational support services. The global market for private tutoring services is forecast to reach $260.7 billion by 2024, and the U.S. tutoring market is reported to be worth more than $8.9 billion a year. According to the Bureau of Labor Statistics, there are more than 100,000 businesses in the private education services industry. Supplemental education program brands are among the top 500 franchises in Entrepreneur magazine's 2020 rankings, including popular providers such as Kumon (ranked No. 12), Mathnasium (No. 29) and Huntington Learning Center (No. 39).

For over five decades, education psychologists have utilized two pedagogical orientations -- conformance orientation and independence orientation. A conformance orientation is more standardized and guided, emphasizing lecture-based content delivery, knowledge and memorization, frequent use of homework assignments, standardized examinations with relative evaluation, and classroom attendance discipline and rules. In contrast, an independence orientation features discussion-based seminars and student-led presentations, an emphasis on ideas rather than facts, use of multimodal interaction instead of books, and highly variable and unstructured class routines. The two approaches do not differ in terms of topics covered in the curriculum or the specific qualities to be imparted to students.

The research team asked parents about their preferences for different programs framed as conformance- or independence-oriented. In five studies of more than 8,500 parents, conservative parents preferred education programs that were framed as conformance-oriented, while liberal parents preferred independence-oriented education programs. This differential preference emerged for different measures of parents' political identity: their party affiliation, self-reported political leaning and whether they watch Fox or CNN/MSNBC for news.

"By understanding the underlying motivations behind parents' preferences, educational programs' appeal to parents can be substantially enhanced," Mittal said. "Supplemental tutoring will be a major expenditure and investment for parents grappling with their child's academic performance in the post-pandemic era. Informal conversations show parents gearing up to supplement school-based education with tutoring. Despite this, very little research exists about the factors that affect parents' preference for and utilization of supplemental education."

Mittal cautioned that these results do not speak to ultimate student performance. "This study only speaks to parents' preferences but does not study ultimate student achievement," he said.

Credit: 
Rice University

Architecture of Eolian successions under icehouse and greenhouse conditions

Boulder, Colo., USA: Anthropogenic climate change is one of the foremost scientific and societal challenges. In part, our response to this global challenge requires an enhanced understanding of how the Earth's surface responds to episodes of climatic heating and cooling. As historical records extend back only a few hundred years, we must look back into the ancient rock record to see how the surface of the Earth has responded to shifts between icehouse (presence of ice at the Earth's poles) and greenhouse (no substantial ice at Earth's poles) climates in the past.

In their study published last week in GSA Bulletin, Grace Cosgrove, Luca Colombera, and Nigel Mountney use a novel relational database (the Database of Aeolian Sedimentary Architecture) to quantify the response of ancient eolian systems (i.e., wind-dominated environments, such as sand dune fields) to global climatic shifts between icehouse and greenhouse climates, as registered in the rock record. They analyzed data on thousands of geological features that preserved a record of eolian processes and landforms, from 34 different eolian systems spanning over two billion years of Earth's history.

Their results demonstrate statistically that preserved sedimentary architectures developed under icehouse and greenhouse conditions are fundamentally different. These differences can be tied to contrasting environmental conditions existing on Earth's surface. During icehouse climates, alternations between glacial and interglacial episodes (caused by changes in the Earth's orbit--the so-called Milankovitch cyclicity) resulted in cycles of glacial-episode accumulation and interglacial deflation.

Greenhouse conditions instead promoted the preservation of eolian elements in the geological record due to elevated water tables and the widespread action of biogenic and chemical stabilizing agents, which protected deposits from wind-driven deflation.

In the context of a rapidly changing climate, the results presented in this work can help predict the potential long-term impact of climate change on Earth surface processes.

Credit: 
Geological Society of America