Tech

A novel recipe for efficiently removing intrinsic defects from hard crystals

image: (a) Normal structure of α-tetragonal boron. While icosahedra of boron (gray) have an ordered atomic arrangement, interstitial boron atoms (red) are randomly arranged.
(b) Interstitial atoms are arranged in an ordered form in which the linear chains of interstitial atoms with different heights (red and blue) are alternately arranged.

Image: 
Osaka University

A team of researchers from Osaka University, the Institute for High Pressure Physics and the Institute for Nuclear Research of Russian Academy of Sciences (Russia), and TU Dresden (Germany), discovered an effective method for removing lattice defects from crystals. Their research results were published in Journal of Physics: Materials.

Boron, a semiconductor, has a variety of crystal structures, but all of them contain large numbers of lattice defects that spoil the crystalline order. In this study, the team obtained an ordered phase of boron by adding hydrogen (hydrogenation) at high temperatures and then removing it through dehydrogenation by low-temperature annealing. The new method is the result of theoretical work by the research groups in Japan and Germany that explains a phenomenon the Russian groups had discovered in experiments.

Lattice defects, present in all materials, influence many of their electrical properties. The proper use of crystalline defects is central to the electronic applications of semiconductors. The electrical conductivity of semiconductors can be enhanced by doping to produce n (negative)-type or p (positive)-type semiconductors. This control of lattice defects is called "valence electron control" and is achieved by placing dopants (impurities) on the host atomic sites. However, impurity atoms that occupy interstitial sites are not useful for controlling valence electrons.

In a boron crystal, many atoms are randomly arranged in the interstitial sites (Figure 1 (a)). In addition, the crystal structure is too hard for the interstitial atoms to migrate to the desired sites. To make boron crystals excellent semiconductor materials, the randomly distributed boron atoms must be rearranged into an ordered structure.

Thus, the team grew α-tetragonal (α-T) boron crystals at high temperature and pressure, with a large amount of hydrogen as a dopant. The as-grown samples contained many defects. As shown in Figure 1 (a), while the B12 icosahedral boron clusters (gray) are ordered, the boron atoms (red) and hydrogen atoms at the interstitial sites are randomly arranged (hydrogen atoms are omitted in the figure). When the samples were then recovered to ambient conditions and annealed at moderate temperatures, the removal of hydrogen atoms and the ordering of the interstitial boron atoms occurred simultaneously (Figure 1 (b)): the random arrangement of interstitial atoms became an ordered structure. This is the first time an ordered boron crystal with a large unit cell (one containing more than 50 boron atoms) has been obtained.

Generally, a crystal takes a more ordered structure at low temperatures. In practice, however, crystallization occurs at high temperatures, which induce many defects, and these defects are frozen in as the crystal cools. But when volatile hydrogen atoms are incorporated beforehand, they are easily released by annealing. The release of hydrogen induces the migration of atoms, achieving the ordering of the boron atoms. This transition is a kind of "cooperative phenomenon" between two different changes: the diffusion of hydrogen and the ordering of the host atoms.

Associate Prof. Shirai from Osaka University says, "In addition to boron, our method of removing defects can also be applied to carbon-based materials, such as fullerene, which are very hard and have a high melting point. Because removing defects from these hard materials is difficult, our recipe will be an efficient method of removing defects for other semiconductor materials as well."

Credit: 
Osaka University

Charge fluctuations, a new property in superconductors

Superconductivity makes it possible to transport energy from power plants to our homes without loss. To do this, however, the lines must be cooled to temperatures so low that large-scale use of superconductors is currently uneconomic. Therefore, in laboratories across the world, researchers are looking for new superconducting materials that function at less prohibitive temperatures.

Great hope rests on the so-called cuprates, copper- and oxygen-based compounds also called high-temperature superconductors, on which the scientific community is focusing its efforts. An experiment conducted at the ESRF (European Synchrotron Radiation Facility) in Grenoble, the European source of synchrotron light, coordinated by the Department of Physics at the Politecnico di Milano with researchers from the SPIN Institute of the National Research Council, Sapienza Università di Roma and the Chalmers University of Technology in Gothenburg, has revealed a new property of these materials: the presence of a variety of charge density waves called dynamical charge density fluctuations.

The study has been published in Science. These fluctuations do not appear to interfere with superconductivity; instead, they influence electrical resistance in the so-called 'normal' state, i.e. at temperatures above the superconducting critical temperature. Knowing about the presence of these charge fluctuations does not solve the key mystery, that of superconductivity itself; however, it does explain another strange behaviour of cuprates - the fact that their electrical resistance differs from that of conventional metals. Furthermore, this new "ingredient" could prove decisive in explaining superconductivity, though that hypothesis remains to be tested.

In 2012 it was discovered that, in many cases, the superconductivity of cuprates is countered by charge density waves, which partly impede the resistance-free flow of electrons in the cuprates without stopping it completely. Increasing our knowledge of these special materials is essential to producing superconductors that function at or near ambient temperature, which is now a critical technological and scientific challenge.

The experiment that made this observation possible was carried out at the ESRF using the RIXS (resonant inelastic X-ray scattering) technique, which analyses the preferred X-ray scattering directions of the material under study.

Credit: 
Politecnico di Milano

And yet it exists: Unexpected new material has been quenched to ambient pressure

image: Alena Ponomareva and Igor Abrikosov, authors from NUST MISIS, discuss the graphic results of the theoretical modelling.

Image: 
© NUST MISIS

Scientists from the National University of Science and Technology MISIS, together with colleagues from Germany and Sweden, achieved a result that seemed impossible: they managed to create at ultra-high pressures a new material that preserves its structure and properties even under normal atmospheric pressure. Moreover, it turned out that the material can be recreated under more "ordinary" laboratory conditions via complex chemical reactions. The results of the experiment, alongside their theoretical explanation, are presented in Nature Communications.

For several years, an international team of scientists from NUST MISIS, the University of Bayreuth (Germany) and Linköping University (Sweden) has been searching for novel superhard modifications of transition metal carbides and nitrides at ultrahigh pressures. Such compounds combine high hardness with a high melting point, so they are used in heat-resistant alloys, cutting tools, high-temperature sensors, and acid- and alkali-resistant protective coatings. The creation of more advanced superhard modifications would bring the use of such materials to a fundamentally new level.

Earlier experiments had proven the ability to create modifications of transition metal nitrides that are "impossible" under Earth conditions, but these modifications "disintegrated" when the pressure decreased. The next metal exposed to ultrahigh pressure was rhenium. This turned out to be a breakthrough: the material modified at such pressure preserved its new structure and properties under ordinary "room" conditions.

To a certain extent, the complexity of such research can be compared to a game of golf in which the hole sits atop a steep hill: one needs not only to sink the ball, but also to keep it in the hole.

During the experiment, rhenium and nitrogen were placed in a diamond anvil cell. The anvil was then compressed while the sample was laser-heated above 2,000 Kelvin (>1700 °C). As a result, at pressures from 40 to 90 GPa (400,000 to 900,000 Earth atmospheres), a special single-crystal structure was obtained: rhenium nitride pernitride, which contains both single nitrogen atoms and N2 (pernitride) units.

"Rhenium is almost incompressible as such, as its bulk modulus is about 400 GPa. After the modification, it increased to 428 GPa. To compare with, the bulk modulus of diamond is 441 GPa. Moreover, thanks to nitrogen components, the hardness of rhenium pernitride increased 4 times, to 37 GPa. Normally, materials obtained at ultrahigh pressures cannot preserve their properties after extraction from the diamond anvil, but this time our colleagues were pleasantly surprised. Of course, this result required explanation, so we modeled the process on our supercomputer. The theoretical results confirmed the experimental data and allowed to explain both the unusual properties of the new material and the possibility of its synthesis not only ta extreme, but also at Earth's conditions", Igor Abrikosov, Professor, scientific advisor of Materials Modeling and Development laboratory at NUST "MISIS", Head of Theoretical Physics Division at the Department of Physics, Chemistry and Biology, Linköping University, explains.

Indeed, it is important to understand that the diamond anvil cell is suitable only for experiments, as it is very small, complex and expensive. That is why the scientists decided to develop a technology for recreating the new modification under more "ordinary" conditions. Having understood the processes occurring in the material at ultra-high pressures, they were able to design and conduct a chemical reaction with ammonium azide in a large-volume press at 33 GPa. Now that the existence of such a modification has been proved theoretically and experimentally, other ways of obtaining it can be tested, for example deposition of thin films.

Previously, scientists had already shown that one can create "forbidden" modifications of beryllium oxide, silica and a number of nitrides, and even transform insulating hematite into a conductor. All this happened at pressures hundreds of thousands (and sometimes millions) of times higher than atmospheric pressure.

Credit: 
National University of Science and Technology MISIS

New methods for optimizing vibration and shock protection systems are proposed

image: Time histories of the optimal state-feedback coefficients for a single-degree-of-freedom system.

Image: 
Lobachevsky University

Nowadays, the words "uncertainty" and "multicriteria" best characterize the relevance and complexity of modern problems in the control of diverse dynamic objects and processes.

In fact, any mathematical model describing complex controlled processes inevitably includes inaccuracies in the description of the disturbances acting on the system and of the parameters of the controlled object. Ignoring such "uncertainty" often leads to fatal errors in the functioning of real control systems. On the other hand, the various requirements imposed on a control system are usually contradictory. This leads to multi-criteria problems which, when solved successfully, make it possible at least to exclude knowingly "inefficient" solutions.

It is well known that multi-criteria control problems are very difficult to solve. These difficulties are amplified many times over by the uncertainty in the description of the acting disturbances. Thus, the development of the theory and methods for solving these problems is relevant in both theoretical and applied respects.

According to Dr. Dmitry V. Balandin, professor at the Institute of Information Technologies, Mathematics and Mechanics, the object of study is a system of ordinary differential equations or partial differential equations. It is assumed that the dynamic object is subject to an external disturbance about which it is known only that it belongs to a given class. In addition, the initial conditions of the system under consideration are also assumed to be unknown, belonging to a given set.

"The indicators characterizing the transients for the entire class of external influences and initial conditions, called maximum deviations of the system outputs, are introduced for the system under consideration. In essence, these indicators determine the maximum response of the system at the "worst" (most dangerous) external exposure and initial state» - mentioned prof. Balandin.

As a result, new methods and algorithms are proposed for the numerical synthesis of optimal control laws for dynamic objects in the form of feedback, with criteria given by the maximal deviations of the system outputs. As an application, a new class of problems of optimal vibration and shock protection of elastic objects is considered, in which the criteria are the maximum deformation of the protected elastic object and the maximum deformation of the vibration-isolating device. The problem consists in finding the feedback characterizing the vibration absorber that minimizes these criteria in the Pareto sense. To solve this class of problems, the approach is applied to the optimal control problems using the Germeier convolution and the technique of linear matrix inequalities.
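
To make the Pareto machinery concrete, here is a minimal numerical sketch (not the authors' LMI-based synthesis): a toy single-degree-of-freedom shock-isolation model whose two criteria are the peak isolator deformation and the peak acceleration transmitted to the object. The shock pulse, parameter grid and weights are all invented for illustration.

```python
# A minimal sketch (not the paper's LMI-based method): two-criteria
# Pareto optimization for a toy single-degree-of-freedom shock isolator.
# All model and parameter values here are invented for illustration.
import numpy as np

def peak_responses(zeta, w, dt=1e-3, t_end=5.0):
    """Integrate x'' + 2*zeta*w*x' + w^2*x = -a(t) for a rectangular base
    shock a(t) and return the two criteria:
    J1 = peak isolator deformation |x|,
    J2 = peak absolute acceleration transmitted to the object."""
    x = v = j1 = j2 = 0.0
    for i in range(int(t_end / dt)):
        a = 1.0 if i * dt < 0.5 else 0.0        # base shock pulse
        rel_acc = -a - 2 * zeta * w * v - w * w * x
        v += rel_acc * dt                        # semi-implicit Euler step
        x += v * dt
        j1 = max(j1, abs(x))
        j2 = max(j2, abs(rel_acc + a))           # absolute acceleration
    return j1, j2

# Evaluate a grid of candidate isolator parameters (damping, stiffness).
designs = [(z, w) for z in np.linspace(0.05, 1.0, 12)
                  for w in np.linspace(1.0, 10.0, 12)]
J = np.array([peak_responses(z, w) for z, w in designs])

# Keep the Pareto-optimal designs: those not dominated in both criteria.
pareto = [i for i in range(len(J))
          if not np.any(np.all(J <= J[i], axis=1) & np.any(J < J[i], axis=1))]

# Pick one Pareto point via the Germeier convolution: minimize over designs
# the weighted maximum of the criteria, max(lam1*J1, lam2*J2).
lam = (1.0, 1.0)
best = min(pareto, key=lambda i: max(lam[0] * J[i][0], lam[1] * J[i][1]))
print(f"{len(pareto)} Pareto-optimal designs; Germeier choice: {designs[best]}")
```

Sweeping the weights in the last step traces out the Pareto front; the methods described in the article perform this synthesis exactly, via linear matrix inequalities, rather than by grid search.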

The two-criteria problem of optimal vibration protection of a multi-storey high-rise building against seismic and wind impacts is considered in detail. A Pareto set is constructed, and the "ideal" Pareto-optimal insulator, i.e. a control device whose feedback assumes the availability of current information about all state variables of the mechanical system under consideration, is compared with optimal insulators of active and passive types having a simpler control-device structure.

The application of the developed methods for synthesizing optimal multi-criteria control laws to vibration and shock protection problems is pioneering and contributes to a significant advance in the theory and practice of vibration and shock protection.

Credit: 
Lobachevsky University

NASA finds strongest storms off-center in Tropical Storm 14W  

image: On Sept. 4 at 7:40 a.m. EDT (1130 UTC), the MODIS instrument that flies aboard NASA's Terra satellite showed strong storms (yellow) east of 14W's center where cloud top temperatures were as cold as minus 80 degrees Fahrenheit (minus 62.2 Celsius).

Image: 
NASA/NRL

NASA's Terra satellite provided an infrared view and temperature analysis of Tropical Storm 14W's cloud tops. The imagery showed that the most powerful thunderstorms in the storm were east of the center.

On Sept. 4 at 7:40 a.m. EDT (1130 UTC), the Moderate Resolution Imaging Spectroradiometer (MODIS) instrument that flies aboard NASA's Terra satellite used infrared light to analyze the strength of storms within 14W. NASA researches these storms to determine how they rapidly intensify, develop and behave.

The Joint Typhoon Warning Center noted that, "Satellite imagery shows a partially exposed low-level circulation center with an area of deep convection (thunderstorms) offset to the east of the center." Just as on Sept. 3, the low-level circulation center of the storm remained exposed to outside westerly winds so the strongest thunderstorms continue to be pushed to the eastern side of the storm.

Tropical cyclones are made up of hundreds of thunderstorms, and infrared data can show where the strongest storms are located. It can do so because infrared data provides temperature information, and the strongest thunderstorms, which reach highest into the atmosphere, have the coldest cloud top temperatures.

MODIS found those strongest storms east of the center of circulation, where cloud top temperatures were as cold as minus 80 degrees Fahrenheit (minus 62.2 degrees Celsius). NASA research has found that cloud top temperatures that cold indicate strong storms with the potential to generate heavy rainfall.

The Joint Typhoon Warning Center (JTWC) noted that on Sept. 4 at 11 a.m. EDT (1500 UTC), Tropical Storm 14W still had maximum sustained winds near 35 knots (40 mph/65 kph). 14W is far from land areas, about 1,429 nautical miles southeast of Yokosuka, Japan, and is moving to the west.
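
For readers checking the units in these bulletins, the conversions are fixed, and a short sketch makes them explicit (the sample values are the ones quoted above):

```python
# Unit conversions used in tropical-cyclone bulletins.
def knots_to_mph(kt):
    return kt * 1.15078        # 1 knot = 1.15078 statute miles per hour

def knots_to_kph(kt):
    return kt * 1.852          # 1 knot = 1.852 km/h, by definition

def f_to_c(f):
    return (f - 32) * 5 / 9    # Fahrenheit to Celsius

print(f"35 kt = {knots_to_mph(35):.0f} mph / {knots_to_kph(35):.0f} kph")
print(f"-80 F = {f_to_c(-80):.1f} C")   # the cloud-top threshold cited above
```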

JTWC said 14W will move west-northwest across the Pacific Ocean and gradually intensify to 90 knots after five days.

Credit: 
NASA/Goddard Space Flight Center

Low income cancer patients and those without insurance see fewer trial benefits

image: Unger is a health services researcher and biostatistician for SWOG Cancer Research Network, and is based at Fred Hutchinson Cancer Research Center.

Image: 
SWOG Cancer Research Network

PORTLAND, OR - When it comes to benefiting from experimental treatments offered in cancer clinical trials, your health insurance status and where you live matter, according to the results of two new research studies to be presented at the 2019 ASCO Quality Care Symposium, held September 6 and 7 in San Diego.

Cancer patients who enroll in a clinical trial and have no health insurance, or are enrolled in Medicaid, the insurance program for low-income Americans, may not get the same benefits of successful trial treatments that other cancer patients do. Results from a separate study show that patients from socioeconomically deprived areas appear to have worse outcomes than their counterparts, even when treated on clinical trials with access to guideline-based cancer care.

Both studies were conducted by SWOG Cancer Research Network, a publicly-supported cancer clinical trials network funded by the National Cancer Institute (NCI) through the National Institutes of Health. Both studies were also led by Joseph Unger, PhD, a health services researcher and biostatistician for SWOG at the Fred Hutchinson Cancer Research Center, and an expert on health disparities in cancer clinical trials.

One study, the subject of a September 6 oral presentation at the ASCO meeting, is the first to show a connection between insurance status and patient benefit in cancer clinical trials.

Gauging the influence of insurance status can be difficult because any single trial enrolls too few patients. So Unger analyzed data from 19 SWOG trials - large randomized phase III trials that together enrolled 11,026 patients. Treatments tested in the trials targeted a variety of cancers, from breast to lung to prostate, and proved to have a statistically significant overall survival benefit for participants. SWOG sites in both the NCI's Community Oncology Research Program and its National Clinical Trials Network enrolled participants in the trials.

In cancer trials, overall survival is defined as how long a participant lives after they receive treatment, a measure of the efficacy of new interventions using drugs, surgery, or radiation. Unger wanted to determine whether the treatment effects on overall survival differed by age, sex, race and ethnicity, and insurance status. His results showed a discrepancy only for insurance status. Patients with no insurance or those enrolled in Medicaid had a smaller treatment benefit.

"We tend to assume that the relative benefit of a new therapy will be largely consistent across patient groups, so these results are concerning," Unger said. "We may need to think about how we design our cancer trials so we're accounting for these differences, and not misidentifying the benefits of the treatments we're testing for important patient groups."

In a separate poster presented on September 6, Unger will share research results that drive home the idea that disparities in clinical outcomes exist even after cancer patients receive high-quality treatment. Findings are highlighted in an ASCO media tip sheet distributed for the meeting.

In the poster, Unger and his SWOG team show that cancer patients living in socioeconomically deprived areas have worse cancer outcomes. The role of socioeconomic deprivation has not been systematically examined before in clinical trial patients.

To conduct the research, the team examined survival for patients enrolled in all phase III clinical trials for all major cancers conducted by SWOG between 1985 and 2012. Using participant zip codes and the Area Deprivation Index, a measure of 17 demographic indicators based on work by the federal Health Resources & Services Administration, the team found that trial participants living in the most socioeconomically deprived areas in the U.S. had worse cancer survival rates - even after adjusting for race and insurance status.
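
For readers unfamiliar with this kind of adjustment, a Cox proportional-hazards model is the standard tool; below is a minimal sketch (not SWOG's actual analysis - the toy data, column names and effect sizes are invented for illustration) showing how a deprivation measure can be tested while controlling for other covariates.

```python
# Illustrative only: toy survival data with a deprivation covariate,
# fit with a Cox proportional-hazards model (pip install lifelines).
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(1)
n = 500
df = pd.DataFrame({
    "adi_quintile": rng.integers(1, 6, n),           # 5 = most deprived area
    "medicaid_or_uninsured": rng.integers(0, 2, n),  # insurance status flag
    "age": rng.normal(62, 10, n),
})
# Synthetic ground truth: hazard rises with area deprivation.
hazard = 0.02 * np.exp(0.15 * df["adi_quintile"])
df["T"] = rng.exponential(1 / hazard)                # follow-up time, months
df["E"] = (df["T"] < 60).astype(int)                 # event observed?
df.loc[df["T"] > 60, "T"] = 60                       # censor at 60 months

cph = CoxPHFitter()
cph.fit(df, duration_col="T", event_col="E")
cph.print_summary()   # hazard ratio per ADI quintile, adjusted for the rest
```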

Unger said more research would need to be done to explain why cancer trial participants with no or low insurance, or who are from socioeconomically deprived areas, appear to have worse outcomes in clinical trials. However, he said that both analyses suggest that low income patients may have more difficulty accessing proper supportive care after their initial cancer treatment.

Credit: 
SWOG

Can AI spot liars?

Most algorithms have probably never heard the Eagles' song, "Lyin' Eyes." Otherwise, they'd do a better job of recognizing duplicity.

Computers aren't very good at discerning misrepresentation, and that's a problem as the technologies are increasingly deployed in society to render decisions that shape public policy, business and people's lives.

It turns out that algorithms fail basic tests as truth detectors, according to researchers at the USC Institute for Creative Technologies who study facial expression and the complexities of reading emotions. The research team completed a pair of studies whose results undermine both popular psychology and AI expression-reading techniques, which assume that facial expressions reveal what people are thinking.

"Both people and so-called 'emotion reading' algorithms rely on a folk wisdom that our emotions are written on our face," said Jonathan Gratch, director for virtual human research at ICT and a professor of computer science at the USC Viterbi School of Engineering. "This is far from the truth. People smile when they are angry or upset, they mask their true feelings, and many expressions have nothing to do with inner feelings, but reflect conversational or cultural conventions."

Gratch and colleagues presented the findings today at the 8th International Conference on Affective Computing and Intelligent Interaction in Cambridge, England.

Of course, people know that people can lie with a straight face. Poker players bluff. Job applicants fake interviews. Unfaithful spouses cheat. And politicians can cheerfully utter false statements.

Yet algorithms aren't so good at catching duplicity, even as machines are increasingly deployed to read human emotions and inform life-changing decisions. For example, the Department of Homeland Security invests in such algorithms to predict potential threats. Some nations use mass surveillance to monitor communications data. Algorithms are used in focus groups and marketing campaigns, to screen loan applicants, and to hire people for jobs.

"We're trying to undermine the folk psychology view that people have that if we could recognize people's facial expressions, we could tell what they're thinking," said Gratch, who is also a professor of psychology. "Think about how people used polygraphs back in the day to see if people were lying. There were misuses of the technology then, just like misuses of facial expression technology today. We're using naïve assumptions about these techniques because there's no association between expressions and what people are really feeling based on these tests."

To prove it, Gratch and fellow researchers Su Lei and Rens Hoegen at ICT, along with Brian Parkinson and Danielle Shore at the University of Oxford, examined spontaneous facial expressions in social situations. In one study, they developed a game that 700 people played for money and then captured how people's expressions impacted their decisions and how much they earned. Next, they allowed subjects to review their behavior and provide insights into how they were using expressions to gain advantage and whether their expressions matched their feelings.

Using several novel approaches, the team examined the relationships between spontaneous facial expressions and key events during the game. They adopted a technique from psychophysiology called "event-related potentials" to address the extreme variability in facial expressions, and used computer vision techniques to analyze those expressions. To represent facial movements, they used a recently proposed method called facial factors, which captures many nuances of facial expressions without the difficulties that other modern analysis techniques present.
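
The core of an event-related analysis is simple: cut the continuous expression signal into fixed windows around each event, baseline-correct, and average, so that event-locked structure survives while unrelated variability cancels. A minimal sketch (illustrative only - the frame rate, window sizes and toy data are assumptions, not the study's actual pipeline):

```python
# Event-related averaging of a facial-expression signal (illustrative).
import numpy as np

def event_related_average(signal, event_idx, pre=30, post=90):
    """Average fixed windows of `signal` around each event index,
    baseline-corrected to the pre-event mean (assumes ~30 fps video)."""
    epochs = []
    for i in event_idx:
        if i - pre < 0 or i + post > len(signal):
            continue                         # skip events too near the edges
        ep = signal[i - pre:i + post].astype(float)
        epochs.append(ep - ep[:pre].mean())  # baseline correction
    return np.mean(epochs, axis=0)

# Toy data: a noisy smile-intensity trace with bursts after reward events.
rng = np.random.default_rng(0)
trace = rng.normal(0.0, 0.1, 3000)
events = np.array([500, 1200, 2100])
for e in events:
    trace[e:e + 60] += 0.8 * np.hanning(60)

avg = event_related_average(trace, events)
print(f"peak event-related response: {avg.max():.2f}")
```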

The scientists found that smiles were the only expressions consistently provoked, regardless of the reward or fairness of outcomes. Additionally, participants were fairly inaccurate in perceiving facial emotion and particularly poor at recognizing when expressions were regulated. The findings show people smile for lots of reasons, not just happiness - context that matters in the evaluation of facial expressions.

"These discoveries emphasize the limits of technology use to predict feelings and intentions," Gratch said. "When companies and governments claim these capabilities, the buyer should beware because often these techniques have simplistic assumptions built into them that have not been tested scientifically."

Prior research shows that people draw conclusions about others' intentions and likely actions based simply on others' expressions. While past studies have used automatic expression analysis to make inferences about states such as boredom, depression and rapport, less is known about how accurate perceptions of expression actually are. These recent findings highlight the importance of contextual information when reading others' emotions and support the view that facial expressions communicate more than we might believe.

Credit: 
University of Southern California

NASA catches the eye of Typhoon Lingling

image: On Sept. 4, 2019 at 1:20 a.m. EDT (0520 UTC) the MODIS instrument that flies aboard NASA's Terra satellite showed powerful thunderstorms circling Typhoon Lingling's visible eye.

Image: 
NASA/NRL

Typhoon Lingling continues to strengthen in the Northwestern Pacific Ocean and NASA's Terra satellite imagery revealed the eye is now visible.

On Sept. 4 at 1:20 a.m. EDT (0520 UTC), the Moderate Resolution Imaging Spectroradiometer (MODIS) instrument that flies aboard NASA's Terra satellite showed powerful thunderstorms circling Typhoon Lingling's visible, 15-nautical-mile-wide eye. The Joint Typhoon Warning Center (JTWC) noted, "Animated enhanced infrared satellite imagery depicts tightly-curved banding wrapping into a ragged eye." In addition, microwave satellite imagery showed a well-defined microwave eye feature.

At 11 a.m. EDT (1500 UTC), the JTWC said that Typhoon Lingling, known locally in the Philippines as Liwayway, had moved far enough away from the Philippines that warnings there have been dropped.

Lingling was located near 23.0 degrees north latitude and 125.4 degrees east longitude, about 247 nautical miles southwest of Kadena Air Base, Okinawa, Japan. Lingling was moving to the north-northeast, and maximum sustained winds had increased to near 80 knots (92 mph/148.2 kph).

JTWC forecasters said that Lingling is moving north and is expected to intensify to 105 knots (121 mph/194 kph) as it passes between Taiwan and Japan.

Credit: 
NASA/Goddard Space Flight Center

Sex and height might influence neck posture when viewing electronic handheld devices

image: Figure demonstrating where landmarks were placed on the head/neck so that a geometric morphometric analysis could be completed.

Image: 
Kaitlin Gallagher and Ashly Romero, University of Arkansas

Sex and height appear to influence how people flex their neck when viewing handheld devices, according to a new study by researchers at the University of Arkansas.

The study looked at neck and jaw postures when using handheld electronic devices. The results suggest that women and shorter individuals bend their necks differently than men and taller individuals, which could be related to the higher incidence of neck and jaw pain experienced by women.

As ownership of electronic handheld devices increases in the United States, new information is needed about how posture may affect the neck and jaw joint when using these devices. Some evidence shows that using these devices, such as cell phones or tablets, in certain postures may influence both the neck and jaw, eventually leading to pain in both. The study asked participants to hold and use electronic devices in five different postures while an X-ray was taken. These postures ranged from a neutral position of sitting straight up to a fully reclined position, as if the participant were leaning back in a chair.

Credit: 
University of Arkansas

The state of China's climate in 2018: More extreme events, but less loss

To provide the public with information on the previous year's climate features, meteorological disasters and climate impacts, the National Climate Center (NCC) of China has just completed a report giving an accessible and authoritative assessment of the climate in China based on the NCC operational system. The report summarizes China's climate as well as major weather and climate events during 2018, and has been published in Atmospheric and Oceanic Science Letters. The majority of the report is based on temperature and precipitation observations.

"For weather and climate events, we select several high-impact events that occurred in 2018, such as typhoons, low-temperature freezing and snow disasters, rainstorms, heatwaves, droughts, severe convective weather events, dust storms, and haze events," says Dr. Chan XIAO, director of climate services division of NCC.

According to the report, in 2018, the mean temperature in China was 0.54°C above normal, and the annual rainfall was 7% above normal. More typhoons made landfall, inflicting severe damage. Low-temperature freezing and snow disasters occurred frequently, causing extensive losses. In summer, rainstorms occurred frequently, but with limited damage. Northeast China and central East China suffered extreme heatwaves. Regional and periodic droughts resulted in slight impacts. Severe convective weather and dust storms were relatively limited, but periodic haze influenced air quality and human health.

The good news is that, in 2018, the area of affected crops, the death toll, and direct economic losses were all significantly lower than the averages of the previous five years. "Improved prediction and early-warning systems from meteorological services were obviously instrumental," said Xiao.

Credit: 
Institute of Atmospheric Physics, Chinese Academy of Sciences

Fashion brands' business practices undermining progress on ending garment worker exploitation

Top fashion companies that are pledging to end worker exploitation in their global supply chains are hampering progress through their own irresponsible sourcing practices, concludes a new report published today on working conditions in the Southern Indian garment industry powerhouse.

Short production windows, cost pressures and constant fluctuations in orders by brands and retail chains like Nike, H&M, Adidas, Primark and Walmart make it very difficult for local suppliers to comply with the standards on decent working practice that the companies say they expect.

The South Indian garment industry clustered around Tirupur accounts for 45-50% - around $3.6 billion in 2017 - of all knitwear exports from India. Suppliers in the region have improved their working conditions over the past decade. However, heightened competition from lower-cost countries like Bangladesh and Ethiopia has meant that brands can force prices down, leaving little scope for further ethical improvements.

"When we interviewed manufacturers who supply knitwear to major global brands they explained that brands are growing louder in their demands for an end to bad labour practices but they are unwilling to alter their commercial practices to support improvements," said Andrew Crane, Professor of Business and Society at the University of Bath's School of Management, one of the five authors of the study.

"Brands do try to improve working conditions in their supply chains but their own sourcing practices often prevent meaningful change from happening. The demand for fast fashion at cheap prices means that brands ramp up penalties and put pressure on suppliers to deliver at low cost in short production windows. This makes it harder for suppliers to comply with the labour standards that brands expect.

"Brands need to ensure that local businesses are supported in their efforts to pursue decent work, and are not, as is all too often the case, squeezed by buyer demands that push them towards more exploitative practices," added Professor Crane.

The research by three UK Universities (Bath, Sheffield and Royal Holloway, University of London) in Tirupur found that social audits, intended to call out exploitation, are frequently manipulated and cheated by suppliers in order to retain business with brands. Suppliers complain that such ethical certification systems are too costly and add little value.

Interviews with over 135 business leaders, workers, NGOs, unions and government agencies in the state of Tamil Nadu during 2018 uncovered considerable evidence that while top-down initiatives from brands have led to some improvements in working conditions, they have failed to eradicate labour exploitation.

"Workers told us about extensive and shocking violations of their rights including routine disregard for health and safety standards, restricted freedom of movement and verbal abuse. They also reported incidents of child and bonded labour, and told us how they suffered from gender discrimination, unfair pay, a lack of contracts, and limited freedom to speak, amongst other violations of their rights," said co-author Genevieve LeBaron, Professor of Politics at the University of Sheffield.

Researchers find some cause for optimism from businesses at the bottom of the supply chain - especially mill owners and garment factories - who are pioneering strategies to eradicate exploitation that do not simply rely on audits.

"Business owners are bringing in initiatives to upskill workers, branding and product differentiation and investment in automation and cost-saving technologies - all of which have the potential to improve labour standards," said Laura Spence, Professor of Business Ethics at Royal Holloway, University of London.

"They are changing their recruitment strategies, for example providing free transportation services to pick up and drop off workers as a strategy to avoid the risks of hostels which tend to restrict workers' freedom of movement. And they are relocating manufacturing, so that workers can remain closer to home, where they have lower living costs and support from their families and communities," added Professor Spence.

The researchers are calling for the formation of a new taskforce in Tirupur, led by an independent organisation or chair, to solve the labour issues facing the industry. They highlight three key issues for achieving decent work and economic growth - freedom of movement; health and safety; and worker-driven social responsibility - and have made 12 recommendations to address them.

View the report, Decent Work and Economic Growth in the Southern Indian Garment Industry and its recommendations, at: https://www.bath.ac.uk/publications/decent-work-and-economic-growth-in-the-south-india-garment-industry/. The research is part of the British Academy's international programme, "Tackling Slavery, Human Trafficking and Child Labour in Modern Business", funded by the British Academy in partnership with the UK Department for International Development.

The research team also included Vivek Soundararajan, Associate Professor in International Management in the School of Management at the University of Bath, and Michael Bloomfield, Lecturer in International Development in the Department of Social and Policy Sciences, also at Bath.

Credit: 
University of Bath

Using lasers to study explosions

image: An explosion is a complex event involving quickly changing temperatures, pressures and chemical concentrations. A special type of infrared laser, known as a swept-wavelength external cavity quantum cascade laser, can be used to study explosions. This versatile instrument has a broad wavelength tuning range that allows the measurement of multiple chemical substances in an explosive fireball. The ability to measure and monitor the dramatic changes during explosions could help scientists understand and even control them. This image shows how a swept-wavelength external cavity quantum cascade laser measures rapid changes in infrared light absorbed by molecules inside an explosive detonation.

Image: 
Mark C. Phillips

WASHINGTON, D.C., September 3, 2019 -- An explosion is a complex event involving quickly changing temperatures, pressures and chemical concentrations. In a paper in the Journal of Applied Physics, from AIP Publishing, a special type of infrared laser, known as a swept-wavelength external cavity quantum cascade laser (swept-ECQCL), is used to study explosions. This versatile instrument has a broad wavelength tuning range that allows the measurement of multiple chemical substances, even large molecules, in an explosive fireball.

The ability to measure and monitor the dramatic changes during explosions could help scientists understand and even control them. Measurements using rugged temperature or pressure probes placed inside an exploding fireball can provide physical data but cannot measure chemical changes that may be generated during the explosion. Sampling the end products of a detonation is possible but provides information only once the explosion is over.

In this work, molecules in the fireball are detected by monitoring the way they interact with light, especially in the infrared region. These measurements are fast and can be made from a safe distance. Since fireballs are turbulent and full of strongly absorbing substances, bright laser sources are needed to probe them.
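
The relation underlying such measurements is the standard Beer-Lambert law of absorption spectroscopy (the release does not spell it out, but it is the basis of this kind of sensing):

```latex
% Beer-Lambert law: transmitted laser intensity through the fireball
I(\lambda) = I_0(\lambda)\, e^{-\sigma(\lambda)\, N \, L}
```

Here σ(λ) is the wavelength-dependent absorption cross section of a given molecule, N is its number density and L is the optical path length through the fireball. Because each species has a characteristic σ(λ) fingerprint in the infrared, sweeping the laser wavelength lets a single instrument separate overlapping absorbers, and the temperature dependence of the line shapes yields temperature as well.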

Using a new instrument built in their lab, the investigators measured explosive events at faster speeds, at higher resolutions and for longer time periods than previously possible using infrared laser light.

"The swept-ECQCL approach enables new measurements by combining the best features of high-resolution tunable laser spectroscopy with broadband methods such as FTIR," co-author Mark Phillips explained.

The study looked at four types of high-energy explosives, all placed in a specially designed chamber to contain the fireball. A laser beam from the swept-ECQCL was directed through this chamber while rapidly varying the laser light's wavelength. The laser light transmitted through the fireball was recorded throughout each explosion to measure changes in the way infrared light was absorbed by molecules in the fireball.

The explosion produces substances such as carbon dioxide, carbon monoxide, water vapor and nitrous oxide. These can all be detected by the characteristic way each absorbs infrared light. Detailed analysis of the results provided the investigators with information about the temperature and concentrations of these substances throughout the explosive event. They were also able to measure the absorption and emission of infrared light by tiny solid particles (soot) created by the explosion.

The swept-ECQCL measurements provide a new way to study explosive detonations that could have other uses. In future studies, the investigators hope to extend the measurements to more wavelengths, faster scan rates, and higher resolutions.

Credit: 
American Institute of Physics

Remora-inspired suction disk mimics fish's adhesion ability, offers evolutionary insight

video: Demonstration of the remora-inspired disc model and lamellae functionality.

Image: 
NJIT

Remora fishes are famed hitchhikers of the marine world, possessing high-powered suction discs on the back of their heads for attaching themselves, torpedo-like, to larger hosts that can provide food and safety -- from whales and sharks to boats and divers.

Key to the remora's adhesion are the disc's well-known capabilities for generating suction, as well as the friction created by spiky bones within the disc, called lamellae, that maintain hold on the host. However, the factors driving the evolution of the remora's unique disc morphology have long eluded researchers seeking to understand -- and even to engineer new devices and adhesives that mimic -- the fish's uncanny ability to lock on to various surface types without harming its host or expending much energy, often for hours at a time under extreme oceanic forces.

In a study led at New Jersey Institute of Technology (NJIT), researchers have showcased a new biologically inspired remora disc capable of replicating the passive forces of suction and friction that power the fish's ability, demonstrating up to 60% greater hold than has been measured for live remoras attached to shark skin.

Using the disc model to explore evolutionary drivers of the remora's disc, researchers say the study's findings provide evidence that today's living species of remora have evolved a greater number of lamellae over time to enhance their holding power and ability to attach to a broader range of hosts with smoother surfaces, thereby increasing their chance for survival.

The study, featured in Bioinspiration and Biomimetics, indicates the disc model may be used to inform the design of more effective, lower-cost adhesive technologies in the future.

"The beauty behind the remora's adhesive mechanism is that biological tissues inherently do most of the work," said Brooke Flammang, professor of biological sciences at NJIT who led the study. "The most significant aspect of this research is that our robotic disc relies completely on the fundamental physics driving the adhesive mechanism in remoras, allowing us to determine biologically relevant performance and gain insight into the evolution of the remora's disc. This was previously not possible with past designs that required a human operator to control the system."

Diverging from many of their closest scavenger-like ancestors, such as cobia (Rachycentron canadum), the remora fish (of the family Echeneidae) is believed to have first begun attaching to hosts with rough surfaces, such as sharks, after having evolved its suction disc from dorsal fin spines nearly 32 million years ago. The disc of living remoras now features a fleshy, soft outer lip for suction, while the disc's interior houses many more linear rows of tissue (lamellae) with tooth-like tissue projections (spinules), which the fish raises to generate friction against various host bodies to prevent slipping while hitchhiking.

According to Flammang, while scientists have shed some light on the origins of the remora's modified fin structure, fundamental aspects of the disc's evolution have largely remained unclear.

"The evolution of the remora's disc is largely unknown," said Flammang. "There is one fossil remora, Opisthomyzon, in the fossil record that has a disc with fewer lamellae [than today's remoras] without spinules towards the back of the head."

Flammang says this raises two questions: "how" and "why."

"The 'how' is from the dorsal fin, although the intermediate evolutionary stages aren't known," explained Flammang. "If you look at a phylogeny of remoras it shows that those species that are thought to be more derived have more lamellae ... the 'why' has been assumed to be for adhesive performance, but that was never tested before this paper."

To learn more, Kaelyn Gamel, the study's first author and former graduate researcher in the Flammang lab, designed a remora-inspired disc from commercially available 3-D printed materials that could autonomously maintain attachment to various surfaces and be modified by adding and removing lamellae, enabling the team to investigate the performance of increased lamellar number on shear adhesion.

"Our disc's capability to add and remove lamellae while acting as a passive system allowed us to change the amount of friction along with the ambient pressure within the disc," said Gamel, now a Ph.D. researcher at the University of Akron. "We were able to compare the difference between no friction, some friction and a lot of friction based on the variation in lamellae number."

In collaboration with Austin Garner, a researcher at the University of Akron, the team conducted pull-off tests with their model disc underwater, varying the model's lamellar number (up to 12 lamellae) to measure the shear force and the time it took to pull the disc from silicone molds with surfaces ranging from completely smooth to those exceeding shark-skin roughness (350-grit, 180-grit and 100-grit).

Overall, the team found that the disc's adhesive performance was strongly correlated with an increase in the disc's lamellae, observing a "sweet spot" in suction power between nine and 12 lamellae. When modified to 12 lamellae and 294 spinules, the team's disc weighed just 45 grams and withstood forces of 27 N (newtons) for 50 seconds -- almost three times the force that would typically pull a remora from a shark. The tests also revealed that a minimum of six lamellae -- the number coincidentally found on the 32-million-year-old fossil Opisthomyzon -- was needed to maintain adhesion.
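
To see how the two passive forces could combine to carry such loads, here is a back-of-the-envelope sketch; the pressure differential, disc area and friction coefficient below are all invented for illustration (only the 27 N figure comes from the study):

```python
# Rough model: suction clamps the disc to the surface (normal force), and
# the raised lamellae convert that clamping force into shear-holding
# friction. All three input values are hypothetical.
delta_p = 20e3    # assumed suction pressure differential, Pa (~0.2 atm)
area = 4e-3       # assumed effective disc area, m^2 (~40 cm^2)
mu = 0.4          # assumed effective friction coefficient of the lamellae

normal_force = delta_p * area         # suction-generated clamping force
shear_capacity = mu * normal_force    # friction available against sliding
print(f"clamping force: {normal_force:.0f} N, "
      f"shear capacity: {shear_capacity:.0f} N (vs. 27 N measured)")
```

With these made-up numbers the disc would hold about 32 N in shear -- the same order as the 27 N measured -- and adding lamellae effectively raises the friction term, which is consistent with lamellar number driving shear performance.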

"What is most striking about these results is that for a given disc shape, there is an optimal range in which the friction and suction phenomena are balanced, and [as their disc size has gotten longer] remoras have evolved to maintain this sweet spot of high-performance adhesion," explained Flammang.

The team now says their remora disc model will be used in future evolutionary studies to learn whether suction or friction predominated in the attachment of the earliest remora ancestors, and how the evolution of disc shape affects adhesion. The disc may also have engineering applications in everything from medical biosensors and drug-delivery devices to geo-sensing tags for ecological studies and tracking marine life.

"One of the greatest advantages of our design is that it operates autonomously because it relies only on the physics of the system for operation," said Flammang. "This makes it easily scalable for a multitude of new technologies, both for medical and scientific purposes."

Credit: 
New Jersey Institute of Technology

Deer browsing is not stopping the densification of Eastern forests

image: This map shows units where the researchers collected data from forest inventory areas maintained and surveyed every five years by the US Department of Agriculture Forest Service, along with forest types in the eastern United States.

Image: 
Penn State

Selective browsing by white-tailed deer has been blamed by many for changing the character and composition of forest understories in the eastern U.S.; however, its impact on the forest canopy was previously unknown.

Now, a new study led by a Penn State researcher suggests that while deer browsing has impacted tree regeneration in the understory, it has not had much of an impact on forest canopies -- and in fact likely has slowed the forest densification process slightly.

"Forests in the region are becoming increasingly dense, and that is a major ecological problem," said Marc Abrams, professor of forest ecology and physiology in the College of Agricultural Sciences. "Indeed, deer can be thought of as an agent slowing down the densification problem, albeit not very effectively."

Abrams, who has spent most of his 40-year career studying how and why forests in the eastern U.S. have changed over the last few centuries, has assessed the role of increasing deer populations on reducing or eliminating tree regeneration in many forests.

"In addition to deer, a particularly important driver of forest change has been the near elimination of fire after the 1930s, attributed to the Smokey Bear campaign," said Abrams, the endowed Nancy and John Steimer Professor of Agricultural Sciences. "This has led to a densification of Eastern forests, in particular caused by red maple."

Recently, Abrams realized that there is a paradox in overstory densification occurring in the face of increasing deer pressure. So, working with Brice Hanberry, a research scientist with the U.S. Forest Service, he compared forest overstory density with deer populations over time, by county, in the major forest types of the eastern U.S. The researchers found that the eastern U.S. has only a small proportion of understocked forests, and that understocking shows no statistical relationship to deer populations.
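
The core statistical check is simple to express; here is a minimal sketch of a county-level association test (with synthetic data - not the authors' actual FIA analysis, and the rank-correlation test is an illustrative choice, not necessarily theirs):

```python
# Illustrative sketch: testing for a county-level association between
# deer density and forest understocking. The data here are synthetic;
# real inputs would come from FIA plots and deer-population estimates.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
n_counties = 800
deer_density = rng.gamma(shape=2.0, scale=5.0, size=n_counties)   # deer/km^2
pct_understocked = rng.beta(2, 18, size=n_counties) * 100         # % of plots

r, p = stats.spearmanr(deer_density, pct_understocked)
print(f"Spearman r = {r:.3f}, p = {p:.3f}")
# Independent draws give r ~ 0, mirroring the paper's finding of no
# statistical relationship between deer populations and understocking.
```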

The overstocking of forests is a major problem in the management of Eastern forests, Abrams believes.

"The densification greatly threatens the sustainability of many historically important tree species, such as oak, pine and hickory," he said. "For example, the increase in trees has been mainly from less-desirable species, particularly red maple, at the expense of oak, hickory and pine."

Red maple's success can be attributed more to its shade tolerance than it being a less-favored species for deer browsing, because its preference as food by deer varies by region. Therefore, the suppression of fire has been a more powerful driver of forest change on a landscape level than has deer browsing, Abrams explained.

"Management goals for the Eastern forests should include reducing the overstory density of undesired tree species and restoring natural fire cycles," he said. "These actions will help promote the historically dominant trees in the eastern U.S. A reduction in deer density may help promote some desirable tree species but may also exacerbate the densification problem."

Historically, forests in the region were more open and drier than those growing today. Long before the Smokey Bear period, Abrams noted, Native Americans burned large forest tracts regularly. And then, from 1870 to after 1930 -- mostly because of land clearing and logging during the clear-cut era that left a huge amount of fuel and resulted in catastrophic fires -- burning frequency increased and peaked in the 1930s.

To regenerate, desired tree species such as oak require sunlight to reach the forest floor, so with the densification problem, many of the historically dominant trees in the East no longer can regenerate well, and they are being replaced by shade-tolerant species such as red maple.

Deer browsing has not controlled tree density on a landscape level, Abrams said, because certain tree species -- whether preferred or not by deer -- generally have increased in the eastern U.S. during the past century. The exceptions are the fire-adapted species such as oak, hickory and pine, which have suffered from a lack of fire.

"If deer control tree regeneration, preferred tree species generally should decline relative to nonpreferred tree species," he said. "In Eastern forests with high deer densities, however, both decreasing and increasing tree species are favored browse for deer. Thus, deer densities do not explain compositional shifts between historical and current forests."

This research, published recently in Ecological Processes, is novel and was not possible before the advent of modern computers capable of handling huge amounts of data, Abrams pointed out. Information from more than 1,000 forest-inventory areas in the 26 states east of the Mississippi River -- maintained for many decades by the U.S. Forest Service -- was used to calculate changes in tree stocking and species. Those permanent plots are surveyed every five years.

"The finding of this research may greatly change how scientists and forest managers view the role of deer in the ecology of Eastern forests," he said. "They may realize that densification is an even larger problem."

Credit: 
Penn State

NASA finds tropical storm 14W strengthening

image: On Sept. 3, 2019 at 1:05 p.m. EDT (1505 UTC), the MODIS instrument that flies aboard NASA's Aqua satellite showed strong storms (yellow) around 14W's center where cloud top temperatures were as cold as minus 80 degrees Fahrenheit (minus 62.2 Celsius).

Image: 
NASA/NRL

Tropical Storm 14W formed as a depression a couple of days ago in the Northwestern Pacific Ocean and strengthened into a tropical storm on Sept. 2. Infrared data from NASA's Aqua satellite shows some powerful thunderstorms fueling further intensification.

On Sept. 3 at 1:05 p.m. EDT (1505 UTC), the Moderate Resolution Imaging Spectroradiometer (MODIS) instrument that flies aboard NASA's Aqua satellite used infrared light to analyze the strength of storms within 14W. NASA researches these storms to determine how they rapidly intensify, develop and behave. In the data obtained about 14W, the very strong storms found near the center indicate the storm is strengthening.

Tropical cyclones are made up of hundreds of thunderstorms, and infrared data can show where the strongest storms are located. It can do so because infrared data provides temperature information, and the strongest thunderstorms, which reach highest into the atmosphere, have the coldest cloud top temperatures.

MODIS found those strongest storms were around the center of circulation where cloud top temperatures were as cold as minus 80 degrees Fahrenheit (minus 62.2 Celsius). NASA research has found that cloud top temperatures that cold indicate strong storms with the potential to generate heavy rainfall.

The Joint Typhoon Warning Center (JTWC) noted that animated enhanced infrared satellite imagery shows the low-level circulation center of the storm is exposed to outside winds, and that the strongest thunderstorms are being pushed to the eastern side of the storm by westerly winds. A microwave image at 6:47 a.m. EDT (1047 UTC) indicated tightly curved shallow banding of thunderstorms wrapping into the center, with an isolated area of strong storms over the southeastern quadrant.

On Sept. 3 at 11 a.m. EDT (1500 UTC), Tropical Storm 14W had maximum sustained winds near 35 knots (40 mph/65 kph). 14W is far from land areas, about 1,566 nautical miles southeast of Yokosuka, Japan, and is moving to the west.

JTWC said 14W will move west-northwest across the Pacific Ocean. The JTWC expects the system will gradually intensify to 90 knots after five days and move toward Japan.

Credit: 
NASA/Goddard Space Flight Center