Culture

Length of pregnancy alters the child's DNA

Researchers from Karolinska Institutet in Sweden, together with an international team, have mapped the relationship between length of pregnancy and chemical DNA changes in more than 6,000 newborn babies. For each additional week of pregnancy, DNA methylation changes in thousands of genes were detected in umbilical cord blood. The study is published in Genome Medicine.
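The kind of association reported here is typically tested one methylation site (CpG) at a time, by regressing cord-blood methylation levels against gestational age across all newborns. The sketch below illustrates that idea on invented data with a plain linear regression; it is not the consortium's actual pipeline, and the effect sizes and site counts are placeholders.

```python
# Toy epigenome-wide association sketch: regress methylation (beta values, 0-1)
# at each CpG site on gestational age in weeks. All data are simulated.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_babies, n_cpgs = 200, 1000
gest_age = rng.normal(39, 2, n_babies)            # gestational age (weeks)
betas = rng.beta(2, 5, size=(n_babies, n_cpgs))   # baseline methylation levels
betas[:, :50] += 0.01 * (gest_age[:, None] - 39)  # 50 CpGs truly associated (toy)

results = []
for cpg in range(n_cpgs):
    slope, intercept, r, p, se = stats.linregress(gest_age, betas[:, cpg])
    results.append((cpg, slope, p))

# Bonferroni-corrected hits, a simple stand-in for multiple-testing control
hits = [(c, s, p) for c, s, p in results if p < 0.05 / n_cpgs]
print(f"{len(hits)} CpG sites associated with gestational age")
```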

Premature birth, that is, birth before 37 completed weeks of pregnancy, is common. Between 5 and 10% of all children in the world are born prematurely. Most of these children develop and grow normally, but premature birth is also linked to respiratory and lung disease, eye problems and neurodevelopmental disorders. This is especially true for children who are born very or extremely prematurely. During the fetal period, epigenetic processes, i.e., chemical modifications of the DNA, are important for controlling development and growth. One such epigenetic factor is DNA methylation, which in turn affects the degree of gene activation and how much of a particular protein is formed.

"Our new findings indicate that these DNA changes may influence the development of fetal organs," says Simon Kebede Merid, first author of the study and PhD student at Karolinska Institutet, Department of Clinical Science and Education, Södersjukhuset.

Most of the DNA methylation changes observed at birth did not persist into childhood, but for 17% of them the levels were completely stable from birth to adolescence. For certain genes, the methylation levels a child is born with thus track with age.

"Now we need to investigate whether the DNA changes are linked to the health problems of those born prematurely," says Professor Erik Melén, at the Department of Clinical Science and Education, Södersjukhuset.

Epigenetics is a hot research topic that links genes, the environment and health. This work was done within the international Pregnancy and Childhood Epigenetics (PACE) consortium and represents contributions from 26 studies. Professor Melén's group also contributed to the first PACE paper, which showed that maternal smoking during pregnancy changes DNA in newborns, and led two PACE studies showing effects of air pollution. Links to diseases such as asthma, allergy, obesity and even aging have also been shown.

"We hope that our new findings will contribute valuable knowledge about fetal development, and in the long term new opportunities for better care of premature babies to avoid complications and adverse health effects," says Erik Melén.

Credit: 
Karolinska Institutet

Mapping childhood malnutrition

image: Prevalence of stunting in children under five in low- and middle-income countries (LMICs) (2000-2017).

Image: 
Local Burden of Disease Child Growth Failure Collaborators / Nature / CC BY 4.0

The scope of childhood malnutrition has decreased since 2000, although millions of children under five years of age are still undernourished and, as a result, have stunted growth. An international team of researchers analysed the scope of global childhood malnutrition in 2000 and 2017, and estimated the probability of achieving the World Health Organization Global Nutrition Targets by 2025.

According to a UN report, in 2018, one out of nine people in the world experienced hunger. The total number of hungry people exceeded 821 million globally, of which almost 514 million lived in Asia, over 256 million in Africa, and 42 million in Latin America and the Caribbean.

World Health Organization data as of 2018 show that almost half (45%) of mortality among children under the age of 5 is due to malnutrition. 3.1 million children die of hunger annually. Malnutrition leads to child growth failure (CGF), which is expressed as stunting, wasting, and underweight.
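Stunting, wasting and underweight are conventionally defined against WHO child growth standards as z-scores more than two standard deviations below the reference median (height-for-age, weight-for-height and weight-for-age, respectively). The sketch below shows that classification logic with invented reference values standing in for the real WHO tables.

```python
# Toy classification of child growth failure (CGF) indicators from z-scores.
# Reference medians/SDs are placeholders, not the real WHO growth standards,
# and the weight-for-height reference here ignores the child's actual height.
def z_score(value, ref_median, ref_sd):
    return (value - ref_median) / ref_sd

def classify_cgf(height_cm, weight_kg, ref):
    """Return which CGF indicators apply, using the usual -2 SD cut-offs."""
    haz = z_score(height_cm, ref["height_median"], ref["height_sd"])   # height-for-age
    whz = z_score(weight_kg, ref["wfh_median"], ref["wfh_sd"])         # weight-for-height
    waz = z_score(weight_kg, ref["weight_median"], ref["weight_sd"])   # weight-for-age
    return {"stunted": haz < -2, "wasted": whz < -2, "underweight": waz < -2}

# Placeholder reference values for a hypothetical 3-year-old
ref = {"height_median": 96.0, "height_sd": 3.8,
       "weight_median": 14.3, "weight_sd": 1.6,
       "wfh_median": 14.3, "wfh_sd": 1.3}
print(classify_cgf(height_cm=88.0, weight_kg=11.5, ref=ref))
```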

In addition to risks of literally dying of starvation, CGF causes cognitive and physical developmental impairments that can lead to later cardiovascular disease, reduced intellectual ability and school attainment, as well as reduced economic productivity in adulthood.

'Childhood malnutrition is an essential reason for children's vulnerability to infections and, accordingly, their high mortality,' said Vasily Vlassov, Professor in the HSE Department of Health Care Administration and Economics and one of the study's authors. 'This is not a temporary suffering in childhood, but a tragedy for the whole future life. Malnutrition decreases an individual's ability to learn.'

CGF is spread unevenly, with 99% of hungry children living in 105 low- and middle-income countries, most of which are located in Africa and Asia.

Russia, as well as many other middle- and high-income countries, was not included in the study since, according to Prof. Vlassov, serious childhood hunger there is a rare phenomenon and not a threat to public health.

Severe malnutrition leads to stunting. Even though estimated childhood stunting prevalence decreased from 36% to 26% over 17 years in the countries analysed in the report, in 2017 more than 176 million children were still shorter than growth standards indicate for their age. Half of them lived in India, Pakistan, Nigeria, and China.

In the 21st century, countries of Central America and the Caribbean, North Africa, and East Asia achieved the most progress in fighting childhood stunting. In these regions, estimated stunting prevalence of at least 50% in 2000 had reduced to 30% or less by 2017. In sub-Saharan regions, Central and South Asia, as well as Oceania, up to 40% of children under five were affected by stunting in 2017.

Wasting, or low weight relative to height, was diagnosed in 58.3 million children in the 105 countries in 2017, 2% less than in 2000. On average, about 6.5% of children in these countries suffered from wasting. Most of them lived in India, Pakistan, Bangladesh, and Indonesia. The highest shares of children with wasting (up to 20%) are in Africa, in a band of countries stretching from Mauritania to Sudan, as well as in South Sudan, Ethiopia, Kenya, and Somalia.

According to data for 2017, 13% of children are underweight for their age; in 2000, their share was almost 20%. Researchers observed the most significant improvements in this indicator in Central and South America, sub-Saharan Africa, North Africa and Southeast Asia. Central Asia and Central Africa remain troubled regions.

The World Health Organization aims to reduce childhood stunting by 40% by 2025. According to the researchers, this is quite achievable in Central America and the Caribbean, South America, North Africa, and East Asia, despite regions in some of these countries continuing to have high shares of children who suffer stunting and wasting.

Meanwhile, in many countries analysed in the study, the probability of achieving the WHO targets is low, especially as it relates to stunting and wasting. This primarily concerns sub-Saharan regions, South Asia, and Oceania.

The global community joins forces to fight malnutrition within the framework of international organizations. The World Food Programme (UN WFP), which distributes 12.6 billion meals in 80 countries every year, is considered one of the leaders. In addition to direct food aid, WFP carries out projects aimed at development and restoring living conditions in areas suffering from conflict and natural disasters.

The UN goals for 2030 include achieving a zero level of hunger. To do so, money is being invested in agricultural development and production. In particular, small farms capable of providing food for local markets are being created.

In addition, the UN is implementing technologies that allow crop yields to be increased by means of conserving soil and water resources, protecting plants from pests, and using new breeds of plants that are resistant to disease and are enriched with essential vitamins and minerals.

Credit: 
National Research University Higher School of Economics

How three genes rule plant symbioses

image: Mycorrhizal and plant symbiosis.

Image: 
Pierre-Marc Delaux

For billions of years, life on Earth was restricted to aquatic environments: the oceans, seas, rivers and lakes. Then, 450 million years ago, the first plants colonized land, evolving in the process multiple types of beneficial relationships with microbes in the soil.

These relationships, known as symbioses, allow plants to access additional nutrients. The most intimate among them are intracellular symbioses that result in the accommodation of microbes inside plant cells.

A study published in Nature Plants, led by scientists from the John Innes Centre in the UK and the University of Toulouse/CNRS in France, describes the discovery of a common genetic basis for all these symbioses.

It is hypothesised that the colonization of land by plants was made possible through a type of symbiosis that plants form with a group of fungi called mycorrhizal fungi. Even today 80% of plants we find on land can form this mycorrhizal symbiosis. Plants have also evolved the ability to engage in intracellular symbiosis with a large diversity of other microbes.

Over the past two decades, studies on mycorrhizal symbiosis and another type of symbiosis, formed by legumes such as peas and beans with soil bacteria, have allowed the identification of a dozen plant genes that are required for the recognition of beneficial microbes and their accommodation inside plant cells. By contrast, other types of intracellular symbioses have been poorly studied.

To address this, the team compared the genomes of nearly 400 plant species to understand what is unique to those that can form intracellular symbioses. Surprisingly, they discovered that three genes are shared exclusively by plants forming intracellular symbiosis and lost in plants unable to form this type of beneficial relationship.
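In essence, such a comparison is a presence/absence screen across genomes: find gene families that occur in every species able to form intracellular symbiosis and in none of the species that cannot. The sketch below shows that logic on a hypothetical orthogroup table; the species and gene identifiers are invented and this is not the study's real 400-genome dataset.

```python
# Toy phylogenomic screen: which gene families (orthogroups) are present in all
# symbiosis-forming species and absent from all non-symbiotic ones?
# Species names and orthogroups below are invented for illustration.
presence = {
    # orthogroup: set of species in which it is found
    "OG0001": {"Medicago", "Lotus", "Oryza", "Arabidopsis"},
    "OG0002": {"Medicago", "Lotus", "Oryza"},           # candidate symbiosis gene
    "OG0003": {"Medicago", "Lotus", "Arabidopsis"},
}
symbiotic = {"Medicago", "Lotus", "Oryza"}       # can host microbes intracellularly
non_symbiotic = {"Arabidopsis"}                  # cannot form intracellular symbiosis

candidates = [
    og for og, species in presence.items()
    if symbiotic <= species and not (non_symbiotic & species)
]
print("Genes co-occurring exclusively with intracellular symbiosis:", candidates)
```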

"Our study demonstrates that diverse types of intracellular symbioses that plants form with different symbiotic partners are built on top of a conserved genetic program." said Dr Guru Radhakrishnan, lead author of the study and a BBSRC Discovery Fellow at the John Innes Centre.

The research, led by Dr Radhakrishnan in the UK and Dr Pierre-Marc Delaux in France, was conducted as part of the Engineering Nitrogen Symbiosis for Africa (ENSA) project sponsored by the Bill & Melinda Gates foundation.

ENSA is an international collaboration aiming at transferring naturally occurring symbioses to cereal crops to limit the use of chemical fertilizers and to improve yield in small-holder farms of sub-Saharan Africa where access to these fertilizers is limited.

"By demonstrating that different plant symbioses share a common genetic basis, our ambitious goal has become more realistic," says Dr Radhakrishnan.

The paper, "An ancestral signaling pathway is conserved in plant lineages forming intracellular symbioses", is published in Nature Plants.

Credit: 
John Innes Centre

How quickly do flower strips in cities help the local bees?

image: The flower strip at Fockensteinstraße as an example of the urban context of the flower strips studied here.

Image: 
Susanne S. Renner

Insects rely on a mix of floral resources for survival. Populations of bees, butterflies, and flies are currently rapidly decreasing due to the loss of flower-rich meadows. In order to deal with the widespread loss of fauna, the European Union supports "greening" measures, for example, the creation of flower strips.

A group of scientists from the University of Munich, led by Prof. Susanne S. Renner, has conducted the first quantitative assessment of the speed and distance over which urban flower strips attract wild bees, and published the results of the study in the open-access Journal of Hymenoptera Research.

Flower strips are human-made patches of flowering plants that provide resources for flower-visiting insects and insect- and seed-feeding birds. Previous experiments have proved their conservation value for enhancing biodiversity in agricultural landscapes.

The success of flower strips in maintaining populations of solitary bees depends on their floristic composition, the distance from suitable nesting sites, and the distance from other habitats that maintain stable bee populations. To study the attractiveness of flower strips in urban landscapes, the scientists used an experimental set-up of nine recently established flower strips of 1,000 square metres each, planted in Munich by a local bird conservation agency.

"We identified and counted the bees visiting flowers on each strip and then related these numbers to the total diversity of Munich's bee fauna and to the diversity at different distances from the strips. Our expectation was that newly planted flower strips would attract a small subset of mostly generalist, non-threatened species and that oligolectic species (species using pollen from a taxonomically restricted set of plants) would be underrepresented compared to the city's overall species pool," shared Prof. Susanne S. Renner.

Bees need time to discover new habitats, but the analysis showed that the city's wild bees managed to do so in just one year: the one-year-old flower strips attracted one-third of the 232 species recorded in Munich between 1997 and 2017.

Surprisingly, the flower strips attracted a random subset of Munich's bee species in terms of pollen specialization. At the same time, as expected, the first-year flower-strip visitors mostly belonged to common, non-threatened species.
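Whether the strip visitors form a "random subset" with respect to pollen specialisation can be checked by comparing the share of oligolectic species among visitors with their share in the city's overall species pool, for example with a Fisher exact test. The sketch below uses invented counts, not the study's actual figures.

```python
# Toy test: are oligolectic (pollen-specialist) species under-represented among
# flower-strip visitors relative to the rest of the urban species pool?
# All counts below are invented for illustration.
from scipy.stats import fisher_exact

visitors_oligolectic, visitors_other = 12, 66      # species seen on the strips
pool_oligolectic, pool_other = 45, 187             # remaining species in the city pool

table = [[visitors_oligolectic, visitors_other],
         [pool_oligolectic, pool_other]]
odds_ratio, p_value = fisher_exact(table)
print(f"odds ratio = {odds_ratio:.2f}, p = {p_value:.3f}")
# A large p-value would be consistent with the strips attracting a random
# subset of the pool in terms of pollen specialisation.
```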

The results of the study indicate that flower strips planted in cities provide valuable extra support for pollinators and act as an effective conservation measure. The authors therefore strongly recommend including flower strip networks in the upcoming Common Agricultural Policy (CAP) reform in the European Union.

Credit: 
Pensoft Publishers

Directed species loss from species-rich forests strongly decreases productivity

image: The field trial BEF-China is carried out in Xingangshan in the province of Jiangxi in southeast China.

Image: 
Yuanyuan Huang

The forest biodiversity experiment BEF-China began in 2009 as a collaboration among institutions in China, Germany and Switzerland and is one of the world's biggest field experiments. In the subtropical forests of southeastern China, the international team planted over 500 plots of 670 square meters of land with 400 trees each, with each plot receiving between one and 16 tree species in various combinations. The researchers simulated both random and directed species extinction scenarios and analyzed the resulting data.

Directed loss of species reduces productivity

After eight years, directed species loss in species-rich forest ecosystems, in which evolutionarily distinct species had higher extinction risks, showed much stronger reductions in forest productivity than treatments subject to random species loss. "These findings have significant implications for biodiversity conservation and climate mitigation, because more productive forests also remove more carbon dioxide from the air," says Bernhard Schmid, professor at the Department of Geography of the University of Zurich (UZH) and last author of the study.
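The contrast between the two scenarios can be pictured with a toy simulation: remove species from a community either at random or in order of decreasing functional distinctiveness, and track how much productivity remains at each step. This is a schematic sketch with invented per-species contributions, not the BEF-China analysis itself.

```python
# Toy comparison of random vs directed species loss.
# Each species gets an invented productivity contribution; in the directed
# scenario the most functionally distinct (here: highest-contributing) species
# are lost first, mimicking extinction risk tied to distinctiveness.
import random

random.seed(1)
contributions = {f"sp{i}": random.uniform(0.5, 3.0) for i in range(16)}

def productivity(remaining):
    return sum(contributions[s] for s in remaining)

def simulate(order):
    remaining = list(order)
    series = []
    while remaining:
        series.append(productivity(remaining))
        remaining.pop(0)           # lose the next species in the given order
    return series

directed = simulate(sorted(contributions, key=contributions.get, reverse=True))
rand = simulate(random.sample(list(contributions), len(contributions)))
print("directed loss:", [round(x, 1) for x in directed[:5]])
print("random loss:  ", [round(x, 1) for x in rand[:5]])
```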

Diversity alone does not protect against losses

"Our results suggest that species loss can severely hamper ecosystem functioning already at high species richness, when many species still remain in the ecosystem. It challenges the decade-long assumption derived from studies based on random species loss," says Schmid. This assumption is that species loss from high-diversity communities would have only little impact on ecosystem functioning, because the remaining species could take over the functions of extinct species due to species redundancy.

Extinct species missing from the network

Why could directed species loss lead to such a strong reduction in productivity? "We think two processes associated with directed species loss might explain the results. When species loss is directed, we may lose the most functionally distinct species first, if that functional distinctiveness is also the cause of extinction," explains first author of the study Yuxin Chen, a former post-doc at UZH and now associate professor at Xiamen University in China. "Species do not live independently, but participate in complex networks of species interactions. Losing species can change these interaction networks. The loss of species interactions contributed significantly to the observed results."

China responds with new laws

The ongoing impacts of a severe coronavirus epidemic this winter have prompted China to speed up biosecurity legislation and elevate it to a national security issue. The biosecurity law would cover various areas including the conservation of biodiversity. "Our research is grounded in one of the diversity hotspots in China. The findings are timely for supporting the legislation of biosecurity," says Keping Ma, professor at the Chinese Academy of Sciences and co-founder of the BEF-China experiment. "Diversity loss from species-rich forests could also increase the risk of pest and disease outbreaks. Some other research teams are studying this issue."

Credit: 
University of Zurich

ITMO scientists develop new algorithm that can predict population's demographic history

image: Genetic algorithm for inferring demographic history of multiple populations from allele frequency spectrum data, ITMO University

Image: 
Dmitry Lisovskiy, ITMO.NEWS

Bioinformatics scientists from ITMO University have developed a programming tool that allows for quick and effective analysis of genomic data and uses it as a basis for building the most probable models of the demographic history of populations of plants, animals and people. Operating with complex computational schemes, the software can, with a very high degree of likelihood, predict what history a particular group of living organisms has gone through over the past thousands of years, what periods of mass extinction or mass population growth a population has experienced, and how long it has been in contact with other populations of the same species. The scientists' article describing this methodology has been published in GigaScience.

How can we find out when exactly the first ancestors of modern tigers appeared on Earth? When did two elephant populations split? Is there a difference between the Dama and the Moroccan gazelle? When did the division of African and Eurasian Homo sapiens occur? The answers to all these questions can be found in a population's demographic history - in other words, the scenario that shows what stages the population went through in the course of its history, and whether it underwent any mass extinctions, migrations, or sharp spikes in its numbers.

Apart from solving fundamental questions, this data can help us in the matters of applied research in the field of ecology and environmental protection. For instance, if some region only has some 800 walruses left, scientists have to understand whether it constitutes a critical decrease or it is a natural population size which has remained constant for several thousand years now, and answer the question of whether valuable resources have to be spent on protecting and saving this species from becoming extinct.

Reconstructing a population's demographic history on the basis of genetic information is a complicated task which requires population geneticists to possess not only knowledge in the field of biology but also programming skills. Such scientists have to gather data and write code for computing possible models of a population's evolution which could have led to the vast multitude of genetic variation we can observe in this population's representatives today. Up until recently, this was a long process, the end result of which relied very heavily on the researcher's initial hypothesis. If the hypothesis had any defects, or the researcher failed to take some aspect into consideration, the software couldn't correct this initial error and calculated the probability of particular demographic events only within the boundaries predefined by the researcher.

The software developed by a group of ITMO University scientists as part of the Project 5-100 grant programs and with support from JetBrains Research aims to solve this problem. The researchers proposed a programming product which independently and automatically predicts the most probable model of a population's demographic history. Moreover, it is significantly less dependent on the initial research hypothesis, doesn't require advanced programming skills and produces more accurate results. The software also has the advantage of flexibility: if the obtained result somehow diverges from archaeological or historical data, you can easily introduce additional limitations into the underlying algorithm to update its hypothesis.

"Using genetic data, our software automatically computes the model it considers optimal," shares Vladimir Ulyantsev. "It looks at the entire volume of the scenarios available. As a scientist, I'll consider the scenarios I deem the most likely, there can be three, five, maybe ten of those. The software, on the other hand, will test all of the models it estimates as probable, this is a much bigger amount. That's why the solutions it comes up with are better than those proposed by people working on the basis of the initial methods. The most beautiful thing here is the method - a genetic algorithm inspired by how evolution happens: species multiply, mutate, with those with the least ability to adapt dying out. In the place of the species we have demographic models and their parameters, and their adaptability is measured on the basis of their similarity with the studied data."

After obtaining this data, the scientists can present it on a map and compare the information indicating that a population underwent a migration during a particular period with archaeological findings and other evidence. These algorithms were used to check a large number of hypotheses and results from evolutionary geneticists. In many cases, the obtained result was much more accurate than that of the original works.

Credit: 
ITMO University

Cloud data speeds set to soar with aid of laser mini-magnets

image: Model of a single-molecule magnet

Image: 
Dr Olof Johansson

Tiny, laser-activated magnets could enable cloud computing systems to process data up to 100 times faster than current technologies, a study suggests.

Chemists have studied a new magnetic material that could boost the storage capacity and processing speed of hard drives used in cloud-based servers.

This could enable people using cloud data systems to load large files in seconds instead of minutes, researchers say.

A team led by scientists from the University of Edinburgh created the material - known as a single-molecule magnet - in the lab.

They discovered that a chemical bond that gives the compound its magnetic properties can be controlled by shining rapid laser pulses on it. The compound is composed mainly of the element manganese, whose name derives from the Latin word magnes, meaning magnet.

Their findings suggest that data could be stored and accessed on the magnets using laser pulses lasting one millionth of a billionth of a second. They estimate this could enable hard drives fitted with the magnets to process data up to 100 times faster than current technologies.

The development could also improve the energy efficiency of cloud computing systems, the team says, which collectively emit as much carbon as the aviation industry.

Existing hard drives store data using a magnetic field generated by passing an electric current through a wire, which generates a lot of heat, researchers say. Replacing this with a laser-activated mechanism would be more energy efficient as it does not produce heat.

The study, published in the journal Nature Chemistry, also involved researchers from Newcastle University. It was funded by the Royal Society of Edinburgh, the Carnegie Trust and the Engineering and Physical Sciences Research Council.

Dr Olof Johansson, of the University of Edinburgh's School of Chemistry, who led the study, said: "There is an ever-increasing need to develop new ways of improving data storage devices. Our findings could increase the capacity and energy efficiency of hard drives used in cloud-based storage servers, which require tremendous amounts of power to operate and keep cool. This work could help scientists develop the next generation of data storage devices."

Credit: 
University of Edinburgh

KITE code could power new quantum developments

A research collaboration led by the University of York's Department of Physics has created open-source software to assist in the creation of quantum materials which could in turn vastly increase the world's computing power.

Throughout the world, the increased use of data centres and cloud computing is consuming growing amounts of energy - quantum materials could help tackle this problem, say the researchers.

Quantum materials - materials which exploit unconventional quantum effects arising from the collective behaviour of electrons - could perform tasks previously thought impossible, such as harvesting energy from the complete solar spectrum or processing vast amounts of data with low heat dissipation.

The design of quantum materials capable of delivering intense computing power is guided by sophisticated computer programmes capable of predicting how materials behave when 'excited' with currents and light signals.

Computational modelling has now taken a 'quantum leap' forward with the announcement of the Quantum KITE initiative, a suite of open-source computer codes developed by researchers in Brazil, the EU and the University of York. KITE is capable of simulating realistic materials with unprecedented numbers of atoms, making it ideally suited to create and optimise quantum materials for a variety of energy and computing applications.

Dr Aires Ferreira, a Royal Society University Research Fellow and Associate Professor of Physics, who leads the research group at the University of York, said:

"Our approach uses a new class of quantum simulation algorithms to help predict and tailor materials' properties for a wide range of applications ranging from solar cells to low-power transistors.

"The first version of the free, open source KITE code already demonstrates very encouraging capabilities in electronic structure and device-level simulation of materials.

"KITE's capability to deal with multi-billions of atomic orbitals, which to our knowledge is unprecedented in any area of quantum science, has the potential to unlock new frontiers in condensed matter physics and computational modelling of materials."

One of the key aspects of KITE is its flexibility to simulate realistic materials, with different kinds of inhomogeneities and imperfections.
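Spectral ("kernel polynomial") methods are the class of linear-scaling quantum simulation algorithm such codes typically rely on: the density of states is expanded in Chebyshev polynomials of the Hamiltonian, so only repeated matrix-vector products are needed rather than a full diagonalisation. The sketch below applies that idea to a small disordered tight-binding chain; it illustrates the general technique only and is not KITE's own API. Real codes use sparse matrices and stochastic trace estimation to reach billions of orbitals.

```python
# Kernel polynomial (Chebyshev) estimate of the density of states for a
# 1D tight-binding chain with on-site disorder. Illustrative toy only.
import numpy as np

rng = np.random.default_rng(0)
N, W, n_moments = 400, 0.5, 200
H = np.zeros((N, N))
H += np.diag(rng.uniform(-W, W, N))                              # disordered on-site energies
H += np.diag(-np.ones(N - 1), 1) + np.diag(-np.ones(N - 1), -1)  # nearest-neighbour hopping

# Rescale H so its spectrum fits in [-1, 1], as the Chebyshev expansion requires
scale = np.abs(np.linalg.eigvalsh(H)).max() * 1.01
Ht = H / scale

# Chebyshev moments mu_n = <v|T_n(Ht)|v> estimated with one random vector
v0 = rng.normal(size=N); v0 /= np.linalg.norm(v0)
t_prev, t_curr = v0, Ht @ v0
moments = [v0 @ t_prev, v0 @ t_curr]
for _ in range(n_moments - 2):
    t_prev, t_curr = t_curr, 2 * Ht @ t_curr - t_prev
    moments.append(v0 @ t_curr)

# Reconstruct the density of states on an energy grid (Jackson kernel omitted,
# so mild Gibbs oscillations remain in this toy version)
E = np.linspace(-0.99, 0.99, 300)
dos = moments[0] * np.ones_like(E)
for n in range(1, n_moments):
    dos += 2 * moments[n] * np.cos(n * np.arccos(E))
dos /= np.pi * np.sqrt(1 - E**2)
print("peak DOS (arb. units):", round(dos.max(), 3))
```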

Dr Tatiana Rappoport from the Federal University of Rio de Janeiro in Brazil, said:

"This open-source software is our commitment to help removing barriers to realistic quantum simulations and to promote an open science culture. Our code has several innovations, including 'disorder cell' approach to simulate imperfections within periodic arrangements of atoms and an efficient scheme for dealing with RAM intensive calculations that can be useful to other scientific communities and industry."

Read the research paper in Royal Society Open Science.

Credit: 
University of York

Scientists find functioning amyloid in healthy brain

image: Protein FXR1, extracted from the brain of healthy rats, is colourised with an amyloid specific dye 'Congo Red' and shows an apple-green glow in polarised light, which is recognised as the 'gold standard' for amyloid identification.

Image: 
SPbU

Scientists from St Petersburg University worked with their colleagues from the St Petersburg branch of the Vavilov Institute of General Genetics. They conducted experiments on laboratory rats and showed that the FXR1 protein in the brains of young and healthy animals functions in an amyloid form. Previously published reports indicate that this protein controls long-term memory and emotions: mice that have the FXR1 gene switched off quickly remember even complex mazes, and animals that have too much of this protein do not suffer from depression even after severe stress. In addition, in humans, a failure in the gene encoding FXR1 is linked to autism and schizophrenia.

'Our findings clearly show that developing a universal remedy that will destroy all amyloids in the brain is totally futile. Instead, we need to look for a cure for each specific pathology. The healthy brain was previously known to store only a few protein hormones in amyloid form. They are stored in secretory granules in the hypophysis, but when the time comes, the secretory granules burst and the proteins function in a normal, monomeric form,' said Alexey Galkin, Professor of the Department of Genetics, Doctor of Biology. 'We are the first to prove that a protein can actually function in the brain in amyloid form, both as oligomers and as insoluble aggregates. Also, the amyloid form of FXR1 can bind RNA molecules and protect them from degradation.'

The research was conducted at the Research Park of St Petersburg University with equipment provided by the resource centres "Chromas Core Facility" and "The Centre for Molecular and Cell Technologies". The amyloid form of the FXR1 protein was discovered using the amyloid proteome screening method that the research team developed in 2016. Amyloids generally play an important role in many organisms: for example, one of these proteins is found in human pigment cells and affects skin tanning. However, today scientists are interested in amyloids primarily due to the need to find a cure for neurodegenerative diseases, in which these proteins play a key role.

Credit: 
St. Petersburg State University

Story Tips: Antidote chasing, traffic control and automatic modeling

image: A team of scientists may have discovered a new family of antidotes for certain poisons that can mitigate their effects more efficiently compared with existing remedies.

Image: 
Andrey Kovalevsky/Oak Ridge National Laboratory, US Dept. of Energy

Biochemistry - Chasing the antidote

In the most comprehensive, structure-based approach to date, a team of scientists may have discovered a new family of antidotes for certain poisons that can mitigate their effects more efficiently compared with existing remedies.

Poisons such as organophosphorus nerve agents and pesticides wreak havoc by blocking an enzyme essential for proper brain and nerve function. Fast-acting drugs, called reactivators, are required to reach the central nervous system and counteract damage that could lead to death.

"To enhance the antidote's effectiveness, we need to improve the reactivator's ability to cross the blood-brain barrier, bind loosely to the enzyme, chemically snatch the poison and then leave quickly," said ORNL's Andrey Kovalevsky, co-author of a study led by Zoran Radić of UC San Diego.

The team designed and tested reactivators on three different nerve agents and one pesticide with positive initial results. Their next step is to use neutron crystallography to better understand antidote designs.

Media Contact: Sara Shoemaker, 865.576.9219; shoemakerms@ornl.gov

Image: https://www.ornl.gov/sites/default/files/2020-02/01a%20-%20Biochemistry-Antidote1_1.jpg

Caption: A team of scientists may have discovered a new family of antidotes for certain poisons that can mitigate their effects more efficiently compared with existing remedies. Credit: Andrey Kovalevsky/Oak Ridge National Laboratory, U.S. Dept. of Energy

Vehicles - Fuel savings green light

Large trucks lumbering through congested cities could become more fuel efficient simply by not having to stop at so many traffic lights.

A proof-of-concept study by Oak Ridge National Laboratory shows promise of a potential new system to direct traffic lights to keep less-efficient vehicles moving and reduce fuel consumption.

In collaboration with traffic-management services company GRIDSMART, researchers used smart cameras to collect real-world data from images of vehicles as they move through select intersections.

The team used artificial intelligence and machine learning techniques to "teach" these cameras how to quickly identify each vehicle type and its estimated gas mileage, sending the information to the next intersection's traffic light.
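Conceptually, the pipeline described here classifies each vehicle at an intersection, looks up a typical fuel-economy figure for that class, and forwards the result downstream so the next signal can prioritise less-efficient vehicles. The schematic sketch below uses a stand-in classifier and made-up mileage values; it is not ORNL's or GRIDSMART's actual system.

```python
# Schematic pipeline: classify a vehicle, estimate its fuel economy, and pass a
# priority hint to the next intersection. The classifier and MPG table are
# placeholders standing in for the trained model and real-world data.
from dataclasses import dataclass

TYPICAL_MPG = {"semi_truck": 6.5, "delivery_truck": 9.0, "suv": 24.0, "sedan": 32.0}

@dataclass
class Detection:
    vehicle_type: str   # in the real system, predicted from camera frames
    speed_mph: float

def classify(frame) -> Detection:
    # Stand-in for the ML model that identifies vehicle type from an image.
    return Detection(vehicle_type="semi_truck", speed_mph=28.0)

def priority_message(det: Detection, intersection_id: str) -> dict:
    mpg = TYPICAL_MPG.get(det.vehicle_type, 25.0)
    return {
        "next_intersection": intersection_id,
        "vehicle_type": det.vehicle_type,
        "estimated_mpg": mpg,
        "hold_green": mpg < 12.0,    # favour keeping low-MPG vehicles moving
    }

det = classify(frame=None)
print(priority_message(det, intersection_id="5th_and_main"))
```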

ORNL's Thomas Karnowski said early results from the computer simulation could lead to more comprehensive research.

Media Contact: Sara Shoemaker, 865.576.9219; shoemakerms@ornl.gov

Image: https://www.ornl.gov/sites/default/files/2020-02/02%20-%20Truck-intersection_1.png

Caption: A preliminary study by ORNL and GRIDSMART shows promise of a new system to keep trucks moving through intersections and reduce fuel consumption. Credit: Thomas Karnowski/Oak Ridge National Laboratory, U.S. Dept. of Energy

Buildings - Automatic modeling

Oak Ridge National Laboratory researchers have developed a modeling tool that identifies cost-effective energy efficiency opportunities in existing buildings across the United States.

Using supercomputing, the energy modeling method assesses building types, systems, use patterns and prevailing weather conditions.

"Manually collecting and organizing data for energy modeling is a time-consuming process and is used in only a small percentage of retrofit performance projects," ORNL's Joshua New said.

The team's modeling approach applies automation to extract a building's floor area and orientation parameters from publicly available data sources such as satellite images. Researchers tested the tool on more than 175,000 buildings in the Chattanooga, Tennessee, area, demonstrating energy-saving opportunities.
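Extracting floor area and orientation from imagery largely reduces to geometry on a building footprint polygon: the shoelace formula gives the area, and the bearing of the longest wall gives a simple orientation estimate. The self-contained sketch below shows those two calculations on a hypothetical footprint; it is not the ORNL tool itself, which would first derive footprints from satellite imagery.

```python
# Toy extraction of building parameters from a footprint polygon (x, y in metres).
import math

footprint = [(0, 0), (20, 0), (20, 12), (0, 12)]   # hypothetical rectangular building

def polygon_area(pts):
    """Shoelace formula for the footprint (floor) area."""
    area = 0.0
    for (x1, y1), (x2, y2) in zip(pts, pts[1:] + pts[:1]):
        area += x1 * y2 - x2 * y1
    return abs(area) / 2

def orientation_deg(pts):
    """Bearing of the longest edge, a simple proxy for building orientation."""
    edges = list(zip(pts, pts[1:] + pts[:1]))
    (x1, y1), (x2, y2) = max(edges, key=lambda e: math.dist(e[0], e[1]))
    return math.degrees(math.atan2(y2 - y1, x2 - x1)) % 180

print(f"floor area: {polygon_area(footprint):.0f} m^2, "
      f"orientation: {orientation_deg(footprint):.0f} deg")
```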

"We can model a building in minutes from a desktop computer," New said. "This is the next level of intelligence for energy-saving technologies."

Future plans include making the tool openly available to help reduce energy demand, emissions and costs for America's homes and businesses.

Media Contact: Jennifer Burke, 865.576.3212; burkejj@ornl.gov

Image: https://www.ornl.gov/sites/default/files/2020-02/03%20-%20Building_energy_model_graphic_1.png

Caption: ORNL's modeling tool simulates the energy efficiency of buildings by automating data received from satellite images. The tool was tested on buildings in the Chattanooga area. Credit: Joshua New/Oak Ridge National Laboratory, U.S. Dept. of Energy

Credit: 
DOE/Oak Ridge National Laboratory

The GDP fudge: China edition

image: SMU Professor Cheng Qiang, Dean of the School of Accountancy, presenting his paper at the Review of Accounting Studies (RAST) Conference.

Image: 
Flora Teoh

SMU Office of Research & Tech Transfer - For all its shortcomings, the gross domestic product (GDP) of a country remains an important barometer of its economic health, strongly influencing both private and public spending. Though conceptually simple - the total dollar value of all goods and services produced within a specified time frame - GDP is tricky to calculate in practice, and the firm-level figures that feed into it can be manipulated through a strategy known as earnings management.

In particular, China's economic reporting has been called into question, with the provincial governments reporting in 2016 a collective GDP that was 2.76 trillion yuan higher than the national GDP calculated by the National Bureau of Statistics, or about 3.7% of the national GDP. The central government has acknowledged the issue: in 2017, the National Audit Office singled out ten provinces which had inflated their fiscal revenue to the tune of 1.5 billion yuan.

According to a new analysis presented at the 2019 Review of Accounting Studies (RAST) conference, held from 13 to 14 December at the Singapore Management University (SMU), there is reason to believe that Chinese firms engage in earnings management to prop up provincial GDP figures. Titled "GDP Growth Incentives and Earnings Management: Evidence from China," the study presented by the Dean of the School of Accountancy, Professor Cheng Qiang, also won the "Best Paper Award" by popular vote.

The pressure to grow GDP

"If the government is making decisions based on an inaccurate GDP number, then its decision quality will be lower," said Professor Cheng, explaining the implications of his study findings. Although discrepancies in GDP calculation can simply be a result of poor infrastructure for the collection of statistical data, differences in calculation methods or simple human error, not all such problems are unintentional, he said.

In the case of China, there is a strong incentive for provincial officials to present a rosy economic picture as this is intrinsically linked to opportunities for political advancement. This desire may lead officials to pressure firms into behaviours that negatively affect the accuracy of their financial statements, Professor Cheng suggested.

"The central government controls the personnel: who should be promoted into the central government, who should move to a bigger province. The political careers of provincial officials are decided by the central government," he said. Because a province's GDP is a significant factor in deciding which officials to promote, the system creates competition among them to present the best economic picture to the central government, Professor Cheng explained.

To test their hypothesis, Professor Cheng and his colleagues examined various measures of financial reporting between 2002 and 2016, representing over 21,000 firm-years. Specifically, they looked at three figures as proxies for earnings management: discretionary revenues, overproduction and abnormal asset impairment losses, all of which can be manipulated to directly influence GDP numbers.

These measures were then examined in tandem with potential incentives for inflating GDP growth. One way in which the study calculated such incentives was to compare provinces' GDP growth with that of adjacent provinces (which are more likely to have similar economic situations) as well as the national GDP growth. A province with a lower GDP growth compared to the national average or that of adjacent provinces would hypothetically be under greater pressure to engage in earnings management to inflate future GDP growth.
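In spirit, the incentive variable is a simple comparison of a province's GDP growth with the national figure and with neighbouring provinces, which is then related to firm-level earnings-management proxies. The stylised sketch below uses invented numbers and a crude comparison of means; it is not the authors' dataset or their exact specification.

```python
# Stylised construction of a GDP-growth "pressure" indicator and its link to a
# firm-level earnings-management proxy. All numbers are invented.
provinces = {
    # province: (own GDP growth %, average growth of adjacent provinces %)
    "A": (6.1, 7.4),
    "B": (8.2, 7.0),
    "C": (5.5, 6.8),
}
national_growth = 6.9

def under_pressure(own, neighbours):
    """Incentive proxy: growth below both the national and the neighbours' average."""
    return own < national_growth and own < neighbours

firms = [
    # (province, discretionary revenue proxy for the following year; invented)
    ("A", 0.042), ("A", 0.051), ("B", 0.018), ("B", 0.022), ("C", 0.047),
]

pressured = [em for prov, em in firms if under_pressure(*provinces[prov])]
relaxed = [em for prov, em in firms if not under_pressure(*provinces[prov])]
print("mean proxy, pressured provinces:", round(sum(pressured) / len(pressured), 3))
print("mean proxy, other provinces:    ", round(sum(relaxed) / len(relaxed), 3))
```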

Additionally, the study also examined the issue of incentives from other perspectives, such as the age of provincial officials. Hypothetically, younger officials are more likely to compete for advancement compared to older officials nearing the retirement age of 65, therefore giving younger officials a stronger incentive to inflate a province's GDP.

Short-term gain, long-term pain

Indeed, Professor Cheng and colleagues found that firms in provinces with GDP growth lower than national or adjacent provinces' average GDP growth were more likely to engage in earnings management in the future compared to firms in other provinces. Specifically, these firms were more likely to inflate revenues, overproduce and delay asset impairment losses.

Lending strength to the study's hypothesis was that these results were more pronounced for firms in provinces with younger officials (60 years old and below), as well as firms which were local state-owned enterprises (SOEs) - over which provincial officials have greater control - compared to central SOEs or non-SOEs.

Besides studying the factors behind GDP inflation, the study also examined the potential consequences of earnings management to the firms (and in turn the province), revealing that there is a heavy price to pay for constructing an artificial image of a flourishing economy.

"When the province reports a high growth, the tax collected as well as other economic expectations will also be higher," Professor Cheng explained. "When you cannot fulfil these expectations, at some point, the situation will just blow up." This was indeed what played out in many provinces that admitted to inflating their GDP between 2017 and 2018, he pointed out.

In short, engaging in earnings management is costly to firms in the long run, Professor Cheng cautioned. "We find that firms that engage in earnings management for the incentive of GDP growth have a high bad debt expense that comes from inflating revenue; high inventory write-off that comes from overproduction; and high asset impairment losses that come from the delaying of asset impairment losses. All these result in a lower return on assets in the future."

"This is the first study that examines how the incentives at the government level affect management at the firm level. The second contribution is that this paper provides evidence about one mechanism by which government officials use to inflate GDP growth," Professor Cheng said.

"The third contribution is in articulating the dynamics between macroeconomic numbers and microeconomic numbers, and how the macroeconomic situation can affect the integrity of a firm's financial reporting."

Credit: 
Singapore Management University

Atomic vacancy as quantum bit

image: Atomic thin layer of boron nitride with a spin center formed by the boron vacancy. With the help of high frequency excitation (red arrow) it is possible to initialize and manipulate the qubit.

Image: 
Mehran Kianinia, University of Technology Sydney

Although boron nitride looks very similar to graphene in structure, it has completely different optoelectronic properties. Its constituents, the elements boron and nitrogen, arrange themselves - like carbon atoms in graphene - in a honeycomb-like hexagonal structure, forming two-dimensional layers that are only one atom thick. The individual layers are only weakly coupled to each other by so-called van der Waals forces and can therefore be easily separated from each other.

Publication in Nature Materials

Physicists from Julius-Maximilians-Universität Würzburg (JMU) in Bavaria, Germany, in cooperation with the University of Technology Sydney in Australia, have now succeeded for the first time in experimentally demonstrating so-called spin centers in a boron nitride crystal. Professor Vladimir Dyakonov, holder of the Chair of Experimental Physics VI at the Institute of Physics, and his team were responsible for the JMU side and carried out the crucial experiments. The results have been published in the renowned scientific journal Nature Materials.

In the layered crystal lattice of boron nitride, the physicists found a special defect - a missing boron atom - which exhibits a magnetic dipole moment, also known as a spin. Furthermore, it can absorb and emit light and is therefore also called a color center. To study the magneto-optical properties of this quantum emitter in detail, the JMU scientists developed a special experimental technique that uses a combination of a static and a high-frequency magnetic field.

A little luck is needed

"If you vary the frequency of the alternating magnetic field, at some point you hit exactly the frequency of the spin, and the photoluminescence changes dramatically," explains Dyakonov. A bit of luck is necessary, however, since it is difficuilt to predict at which frequencies one has to search for unknown spin states. Dyakonov and his team had discovered these centers in the 2D crystalline system, which had previously only been predicted theoretically. Among other things, they were able to demonstrate spin polarization, i.e. the alignment of the magnetic moment of the defect under optical excitation - even at room temperature.

This makes the experiments interesting for technical applications as well: Scientists around the world are currently working on finding a solid-state system in which the spin state can be aligned, manipulated on demand and later read out optically or electrically. "The spin center we have identified in boron nitride meets these requirements," adds Dyakonov. Because it has a spin and additionally absorbs and emits light, it is a quantum bit that can be used in quantum sensing and quantum information. New navigation technologies could also build on this approach, which is why space agencies such as DLR and NASA are conducting intensive research on this topic, too.

Material design by the Lego brick principle

For basic scientists, 2D materials are also exciting from another point of view. Their very special layer structure, combined with the weak bonding of the layers to each other, offers the possibility of constructing different stacking sequences from different semiconductors. "If you then place a defect - we call it a spin probe - in one of these layers, it can help us understand the properties of the adjacent layers, but also change the physical properties of the entire stack," says Dyakonov.

As a next step, Dyakonov and his colleagues therefore want to produce, among other things, heterostructures made of multilayer semiconductors with a boron nitride layer as an intermediate layer. They are convinced: "If the atomically thin layers of boron nitride, which are 'decorated' with individual spin centers, can be produced and incorporated into a heterostructure, it will be possible to design artificial two-dimensional crystals based on Lego brick principles and investigate their properties."

Credit: 
University of Würzburg

Paper: Disposal of wastewater from hydraulic fracturing poses dangers to drivers

image: A new paper co-written by Yilan Xu, a professor of agricultural and consumer economics at Illinois, shows that the growing traffic burden in shale energy boomtowns from trucks hauling wastewater to disposal sites resulted in a surge of road fatalities and severe accidents.

Image: 
Photo by L. Brian Stauffer

CHAMPAIGN, Ill. -- Environmental concerns about hydraulic fracturing - aka "fracking," the process by which oil and gas are extracted from rock by injecting high-pressure mixtures of water and chemicals - are well documented, but according to a paper co-written by a University of Illinois at Urbana-Champaign environmental economics expert, the technique also poses a serious safety risk to local traffic.

New research from Yilan Xu ("E-Lan SHE"), a professor of agricultural and consumer economics at Illinois, shows that the growing traffic burden in fracking boomtowns from trucks hauling wastewater to disposal sites resulted in a surge of road fatalities and severe accidents.

"Fracking requires large amounts of water, and it subsequently generates a lot of wastewater," she said. "When trucks need to transport all that water within a narrow window of time to a disposal site, that poses a safety threat to other drivers on the road - especially since fracking occurs mostly in these boomtowns where the roadway infrastructure isn't built up enough to handle heavy truck traffic."

The study examined how fracking-related trucking affected the number of fatal crashes in the Bakken Formation in North Dakota from 2006-14, using the timing of fracking operations near certain road segments.

The researchers identified a causal link between fracking-related trucking and fatal traffic crashes, finding that an additional post-fracking well within six miles of the road segments led to 8% more fatal crashes and 7.1% higher per-capita costs in accidents.
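The causal estimate comes from relating crash counts on a road segment to the number of recently fracked wells nearby, while controlling for segment and year effects. The sketch below shows a toy Poisson regression in that spirit on simulated data; it is not the authors' specification or the Bakken crash records.

```python
# Toy panel of road segments x years: fatal crash counts vs nearby post-fracking
# wells, with segment and year fixed effects. Data are simulated for illustration.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
segments, years = range(50), range(2006, 2015)
rows = []
for s in segments:
    base = rng.uniform(0.2, 1.0)                       # segment-specific crash rate
    for y in years:
        wells = rng.poisson(1.5) if y >= 2010 else 0   # wells within ~6 miles (toy)
        lam = base * np.exp(0.08 * wells)              # ~8% more crashes per well (toy)
        rows.append({"segment": s, "year": y, "wells": wells,
                     "crashes": rng.poisson(lam)})
df = pd.DataFrame(rows)

model = smf.poisson("crashes ~ wells + C(segment) + C(year)", data=df).fit(disp=False)
print(f"estimated effect per additional well: "
      f"{(np.exp(model.params['wells']) - 1) * 100:.1f}% more fatal crashes")
```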

"Our back-of-the-envelope calculation suggests that an additional 17 fatal crashes took place per year across the sampled road segments, representing a 49% increase relative to the annual crash counts of the drilling counties in North Dakota in 2006," Xu said. "That's a significant number when you're talking about a sparsely populated area like North Dakota.

"And besides the fatality and injury costs in fatal crashes quantified in our study, other costs may occur as well, including injury costs in nonfatal crashes and indirect expenditures on emergency services, insurance administrative costs, and infrastructure maintenance and replacement."

To lessen the negative impact on traffic fatalities as well as the severity of traffic accidents, the study proposes a tax that can be charged per well to internalize the costs of fracking-related trucking activities, similar to the impact fees implemented in energy-rich towns in Pennsylvania that yield hundreds of millions of dollars per year for the state.

"The tax could serve as an economic instrument that affects operators' drilling and fracking decisions and thus alleviate the hazard of the associated truck traffic indirectly," Xu said. "Likewise, a toll fee by miles driven by trucks could be collected on highways to absorb the negative impacts of fracking-related trucking."

The study also sheds light on more practical measures that local governments can undertake to curb the traffic risks associated with fracking.

"Since many fracking-induced fatal crashes take place in the daytime rush hours, local governments could adopt policies such as making a high occupancy vehicle lane for trucks carrying wastewater. An active traffic alert and warning system with live well-operations updates could also help drivers monitor traffic and avoid exposure to road hazards," she said.

Moreover, the paper calls for the active involvement of the oil and gas industry to seek ways to improve their workplace safety and mitigate the traffic hazard of fracking to road users.

"Our findings suggest that oil and gas operators could redistribute the traffic loads over time to avoid concentrated water hauling during peak hours," Xu said. "In the long run, since a well may need to be fracked multiple times over its productive life, operators may improve the water supply system by constructing water wells serving multiple well pads via a piping system. They could also develop the onsite wastewater treatment and disposal facilities as opposed to trucking wastewater over long distances. Such measures would reduce the long-term transport costs and the associated traffic effects."

The findings should give local and federal policymakers information when conducting due diligence and evaluating the regional costs and benefits of shale energy development, Xu said.

"Our study provides an estimate based on the North Dakota experience where population density and traffic volume is relatively low, but our findings have implications for other regions planning future shale development."

Credit: 
University of Illinois at Urbana-Champaign, News Bureau

Engendering trust in an AI world

image: Above (left-right): SMU Professor David Llewelyn, Deputy Dean of SMU School of Law; SMU Associate Professor Warren Chik; Professor Ian Walden, Centre for Commercial Law Studies, Queen Mary University of London; SMU Associate Professor Alvin See (who presented SMU Associate Professor Yip Man's paper on her behalf); Mr KK Lim, Head of Cybersecurity, Privacy and Data Protection, Eversheds Harry Elias; and Mr Lanx Goh, Senior Legal Counsel (Privacy & Cybersecurity) & Global Data Protection Officer, Klook Travel Technology.

Image: 
Kareyst Lin

SMU Office of Research & Tech Transfer - Can you imagine a world without personalised Spotify playlists, curated social media feeds, or recommended cat videos on the sidebars of YouTube? These modern-day conveniences, which were made possible by artificial intelligence (AI), also present a scary proposition - that the machines could end up knowing more about us than we ourselves do.

According to Gartner's 2019 CIO Agenda survey, 37 percent of Chief Information Officers (CIOs) globally have already deployed AI technology in their organisations. The rapid adoption of AI solutions brings to focus the way data - which could consist of sensitive, confidential and personal information - are being managed and used by organisations.

Speaking at the conference panel on 'AI and Data Protection: New Regulatory Approaches', Singapore Management University (SMU) Associate Professor Warren Chik gave his perspective on how to conceptualise trust in a digital age. "When it comes to matters such as personal data, we don't treat AI as god. Therefore, we cannot rely on faith, which is what religion requires. We need something more substantial than that," he said.

In his talk titled 'Artificial Intelligence and Data Protection in Singapore: Consumers' Trust, Organisational Security and Government Regulation', Professor Chik explained that to engender trust in a digital solution, it is crucial that users are being engaged on the issues involved. "People tend to fear the unknown, and it is hard to have trust in something that you don't know."

Moderated by Professor David Llewelyn, Deputy Dean of the SMU School of Law, the roundtable featured speakers Professor Ian Walden, Centre for Commercial Law Studies, Queen Mary University of London; Associate Professor Yip Man (whose paper was presented by Associate Professor Alvin See on her behalf); as well as commentators Mr KK Lim, Head of Cybersecurity, Privacy and Data Protection, Eversheds Harry Elias; and Mr Lanx Goh, Senior Legal Counsel (Privacy & Cybersecurity) & Global Data Protection Officer, Klook Travel Technology.

AI as an influencer

The ability of an AI system to conduct personal profiling could fundamentally change a user's digital personality, said Professor Chik, highlighting a cause of worry for many.

"While an AI holds specific information such as your name and address, it also forms its own knowledge of your identity, and who you are as a person," Professor Chik said, citing algorithms used by social media feeds to collect data on one's identity, interests and surfing habits. From that data, the system then creates a profile of who they think you are.

"These algorithms - which may be right or wrong - feed you information, articles and links, and as a result brings about an effect on your thinking. In other words, AI can mold human behaviour, and this is a risk that makes a lot of people uncomfortable," Professor Chik said. The threat is very real, he emphasised, noting that regulators have clearly identified a need to regulate the use of data in AI.

In Singapore, for instance, the Protection from Online Falsehoods and Manipulation Act (POFMA) carries criminal provisions on the creation, use and alteration of bots to spread false information.

Data protection legislation: a balancing act

There are always two competing objectives when regulating the use, collection and processing of personal data. "The first objective is to protect the data subject, and the second is to promote innovation," said Professor See, who presented Professor Yip's paper on her behalf.

Of the different types of protection for data subjects that exist today, the most commonly available option is the use of contracts. Professor Yip's paper points out that "[t]he problem with trying to regulate data use through terms and conditions is that in most cases, people don't read [the legal fine print]". The consent given is therefore not genuine.

Professor Llewelyn, who moderated the roundtable, added that the meaning of consent is an issue that needs to be explored in greater depth. "If a consumer were to accept an online contract in full without reading it, can it be realistically said that he or she has agreed to all the terms and conditions, and given full consent?" he asked. "Perhaps there should be legal acknowledgement given to the automatic nature of the commitment made in such contracts."

A more critical limitation of the contract as protection for the data subject, is that the contract only governs the information that is shared between the two parties bound by the contract. For instance, if Facebook were to transfer a user's personal data to a third-party not bound by the contract, the third-party firm will not be obligated to protect the user's information.

Data protection by design

Singapore's Personal Data Protection Act (PDPA), which regulates personal data through legislation, is described as a light-touch regime that takes a carefully balanced approach between the need for privacy protection and the interests of business innovation.

Professor Yip's paper recognises that there is some level of tension between the two objectives mentioned above. The issue at hand, therefore, is how to strike a balance between individual rights and privacy, and the competing interest of economic growth and innovation, she noted.

At the end of the day, the focus is on preventing, rather than trying to remedy a breach of data privacy. "It is about recognising the rights of the individual and the privacy of their data, and at the same time, the need for organisations to collect, use and disclose personal data for legitimate and reasonable purposes," Professor Yip's paper added.

Another solution that Professor Yip explored in her paper was the use of technology, rather than law, to protect data subjects. In some cases, privacy can be built directly into the design and operation of IT systems, work processes, network infrastructure and even physical spaces. She nevertheless highlighted that this solution is not perfect, because building robust privacy safeguards into their systems and business models runs against the interests of businesses that leverage data to make profits.

Credit: 
Singapore Management University

Geologists determine early Earth was a 'water world' by studying exposed ocean crust

image: Benjamin Johnson of Iowa State University works at an outcrop in remote Western Australia where geologists are studying 3.2-billion-year-old ocean crust.

Image: 
Photo by Jana Meixnerova/provided by Benjamin Johnson

AMES, Iowa - The Earth of 3.2 billion years ago was a "water world" of submerged continents, geologists say after analyzing oxygen isotope data from ancient ocean crust that's now exposed on land in Australia.

And that could have major implications on the origin of life.

"An early Earth without emergent continents may have resembled a 'water world,' providing an important environmental constraint on the origin and evolution of life on Earth as well as its possible existence elsewhere," geologists Benjamin Johnson and Boswell Wing wrote in a paper just published online by the journal Nature Geoscience.

Johnson is an assistant professor of geological and atmospheric sciences at Iowa State University and a recent postdoctoral research associate at the University of Colorado Boulder. Wing is an associate professor of geological sciences at Colorado. Grants from the National Science Foundation supported their study and a Lewis and Clark Grant from the American Philosophical Society supported Johnson's fieldwork in Australia.

Johnson said his work on the project started when he talked with Wing at conferences and learned about the well-preserved, 3.2-billion-year-old ocean crust from the Archaean eon (4 billion to 2.5 billion years ago) in a remote part of the state of Western Australia. Thanks to previous studies, there was already a large library of geochemical data from the site.

Johnson joined Wing's research group and went to see ocean crust for himself - a 2018 trip involving a flight to Perth and a 17-hour drive north to the coastal region near Port Hedland.

After taking his own rock samples and digging into the library of existing data, Johnson created a cross-section grid of the oxygen isotope and temperature values found in the rock.

(Isotopes are atoms of a chemical element with the same number of protons within the nucleus, but differing numbers of neutrons. In this case, differences in oxygen isotopes preserved with the ancient rock provide clues about the interaction of rock and water billions of years ago.)

Once he had two-dimensional grids based on whole-rock data, Johnson created an inverse model to come up with estimates of the oxygen isotopes within the ancient oceans. The result: Ancient seawater was enriched with about 4 parts per thousand more of a heavy isotope of oxygen (oxygen with eight protons and 10 neutrons, written as 18O) than an ice-free ocean of today.
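The inverse step can be pictured as a misfit search: assume a temperature-dependent fractionation between rock and water, predict what the altered crust's oxygen isotope values would look like for a given seawater composition, and find the seawater value that best matches the measured grid. The deliberately simplified sketch below illustrates that idea with invented numbers and a made-up fractionation relation; it is not the published model.

```python
# Deliberately simplified inverse search for seawater d18O (per mil, VSMOW).
# Forward model: altered-rock d18O = seawater d18O + fractionation(T),
# with an invented linear fractionation-temperature relation and toy data.
import numpy as np

def fractionation(temp_c):
    # Placeholder: rock-water fractionation shrinks as alteration temperature rises.
    return 8.0 - 0.02 * temp_c

# Hypothetical measurements: (alteration temperature in C, measured rock d18O)
samples = [(150, 8.9), (250, 7.1), (350, 5.2), (300, 6.0)]

candidate_seawater = np.linspace(-2.0, 6.0, 801)
misfit = [
    sum((sw + fractionation(t) - obs) ** 2 for t, obs in samples)
    for sw in candidate_seawater
]
best = candidate_seawater[int(np.argmin(misfit))]
print(f"best-fit seawater d18O: {best:+.1f} per mil (toy example)")
```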

How to explain that decrease in heavy isotopes over time?

Johnson and Wing suggest two possible ways: Water cycling through the ancient ocean crust was different than today's seawater with a lot more high-temperature interactions that could have enriched the ocean with the heavy isotopes of oxygen. Or, water cycling from continental rock could have reduced the percentage of heavy isotopes in ocean water.

"Our preferred hypothesis - and in some ways the simplest - is that continental weathering from land began sometime after 3.2 billion years ago and began to draw down the amount of heavy isotopes in the ocean," Johnson said.

The idea that water cycled through the ancient ocean crust in a way distinct from how it happens today, causing the difference in isotope composition, "is not supported by the rocks," Johnson said. "The 3.2-billion-year-old section of ocean crust we studied looks exactly like much, much younger ocean crust."

Johnson said the study demonstrates that geologists can build models and find new, quantitative ways to solve a problem - even when that problem involves seawater from 3.2 billion years ago that they'll never see or sample.

And, Johnson said these models inform us about the environment where life originated and evolved: "Without continents and land above sea level, the only place for the very first ecosystems to evolve would have been in the ocean."

Credit: 
Iowa State University