Tech

The many lives of charcoal

In Africa, charcoal is ubiquitous as an energy source for cooking, even in urban areas where electricity and gas are available. Yet when Catherine Nabukalu was taking courses on energy as part of her master's degree in environmental studies at Penn, she noticed charcoal was often left out of the conversation about energy sources and their contribution to global carbon emissions.

"We would systematically go through coal, then nuclear, then hydropower, then geothermal, solar, etc.," says Nabukalu, now a project coordinator at the District of Columbia Sustainable Energy Utility. "You become aware that something is missing. Just because you don't have electricity, doesn't mean you don't have energy."

Reto Gieré, a professor in the Department of Earth and Environmental Science, encouraged her to research charcoal as an energy source for her final paper in his course, Energy, Waste, and the Environment. He then helped her get funding to travel to her native Uganda to continue to pursue the topic as an independent project that they conceptualized together. Nabukalu and Gieré shared the findings in a paper published in the journal Resources.

"I'm African, and I've used charcoal personally," says Nabukalu. "It's not fun to use, cooking is often not a healthy or enjoyable experience, but it's a big part of the energy mix. It's not the only source, but it's one of them."

As a step toward better understanding the full life cycle of charcoal, from creation to consumption, Nabukalu spent time at a number of sites in Uganda where charcoal is created, traded, sold, and consumed. She observed and interviewed participants at each of these stages and reviewed the literature about charcoal production and use worldwide. Gieré and Nabukalu shared some of the key results from this research with Penn Today that shed light on an overlooked source of energy.

Producing charcoal supports a nomadic lifestyle

In the locations Nabukalu visited, people produced charcoal by felling trees, stacking their trunks, and covering them with branches and leaves. A final layer of damp soil serves to keep as much oxygen as possible out when the pile is set alight. In this way, the wood undergoes pyrolysis instead of combustion, leaving behind carbon-dense charcoal.
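
The difference between the two processes can be sketched with idealized reactions for a cellulose unit of woody biomass (a simplification of ours, not chemistry from the study; the pyrolysis line is schematic rather than balanced):

```latex
% Idealized sketch (our simplification, not the researchers' chemistry).
% Full combustion oxidizes the biomass completely, whereas oxygen-starved
% pyrolysis leaves a carbon-rich char plus volatile by-products (schematic).
\mathrm{C_6H_{10}O_5} + 6\,\mathrm{O_2} \rightarrow 6\,\mathrm{CO_2} + 5\,\mathrm{H_2O}
\qquad\text{vs.}\qquad
\mathrm{C_6H_{10}O_5} \xrightarrow{\ \text{heat, little }\mathrm{O_2}\ } \text{char} + \mathrm{CO} + \mathrm{CO_2} + \mathrm{CH_4} + \mathrm{H_2O} + \text{tars}
```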

These kiln installations are created and used only once. "In some cases, this necessitates that the people who use them are nomads," Gieré notes.

Indeed, some workers interviewed had moved to northern Uganda from the central region specifically for the opportunity to produce charcoal. "Some were entrepreneurs who moved on their own because they thought the trees were disappearing in the more southern areas," says Nabukalu. "But others were contracted by entrepreneurs that had informal companies to do this work."

Charcoal production takes a serious environmental toll and is difficult to regulate

Despite efforts by the Ugandan government to regulate charcoal production, it takes place by and large on private lands and thus is nearly impossible to suppress. "Partially, it's because the government has limited power on private land, but also there is strong demand for the product," Nabukalu says. Neighboring Tanzania attempted a ban that lasted only a couple of weeks, she found, as prices for charcoal soared on a quickly established black market.

For landowners, allowing charcoal production on their property can be a win-win. They may wish to clear land for agriculture and can invite charcoal producers onto their land, receiving a portion of the proceeds of the product's eventual sale. For the producers, it gives them a source of income without needing to own land.

While this type of small-scale charcoal production doesn't involve clearcutting, it still leads to forest degradation as the felled trees are unlikely to be replaced. Each year, Africa produces 24.5 million metric tons of charcoal, nearly 60% of the world's supply. In addition, much of the supply chain is informal, without any oversight. With traders selling charcoal far from where it is sourced, it is hard to track or quantify how much is actually being produced.

"I think a lot of reports grossly underestimate how much charcoal is being produced and used," Nabukalu says.

Access to alternative energy sources does not eliminate the use of charcoal

Nabukalu knew from personal experience that access to electricity did not obviate charcoal use. "It's a matter of home economics," she says. "When cooking, either you're going to use charcoal which you've already paid for, or you're going to turn on the gas or the electricity, but you don't know how high the bill will be."

In her research she found that Egypt, a country with a diversified mix of modern sources of energy, was still a top producer and importer of charcoal.

That's true not only in Africa but around the world.

"Take a look at Europe: Having wide and reliable access to electricity hasn't stopped usage there," she says, noting that Germany is the world's top charcoal importer. "You could ask," Gieré says, "do people in Germany really need to be using it? There it's for a Sunday barbecue as a leisure fuel, whereas in Uganda it's used for daily cooking."

Cooking with charcoal can be dangerous

Nabukalu and Gieré note the health impacts of cooking with charcoal, especially when done without proper ventilation.

"Burning biomass of any kind, whether it be firewood or charcoal, in a confined space exposes those nearby to fumes containing gases and particulates," says Gieré. "Similarly, producing the charcoal generates vapors during the pyrolysis process, including carbon monoxide, carbon dioxide, and methane, which all deteriorate air quality in the area surrounding the kilns."

In follow-up work, Hope Elliott, a student in Penn's Master of Science in Applied Geology program who is also working with Gieré, is analyzing samples of the charcoal Nabukalu collected. In her research, Elliott is determining the chemical makeup of the charcoal and, by extension, the potential health effects of burning it.

Credit: 
University of Pennsylvania

Powering the future: Smallest all-digital circuit opens doors to 5 nm next-gen semiconductor

image: The entire all-digital PLL fits in a 50 × 72 μm² region, making it the smallest PLL to date.

Image: 
Kenichi Okada

Scientists at Tokyo Institute of Technology (Tokyo Tech) and Socionext Inc. have designed the world's smallest all-digital phase-locked loop (PLL). PLLs are critical clocking circuits in virtually all digital applications, and reducing their size and improving their performance is a necessary step to enabling the development of next-generation technologies.

New or improved technologies, such as artificial intelligence, 5G cellular communications, and the Internet of Things, are expected to bring revolutionary changes in society. But for that to happen, high-performance system-on-a-chip (SoC) devices--a type of integrated circuit--are indispensable. A core building block of SoC devices is the phase-locked loop (PLL), a circuit that synchronizes with the frequency of a reference oscillation and outputs a signal at the same or a higher frequency. PLLs generate 'clocking signals', whose oscillations act as a metronome that provides a precise timing reference for the harmonious operation of digital devices.
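
As a rough illustration of that locking behaviour, the behavioural sketch below (our own, with assumed frequencies and loop gains; it models neither the ring oscillator nor the injection locking of the actual Tokyo Tech/Socionext design) shows a digitally controlled oscillator being steered until its output runs at N times a reference clock:

```python
# Minimal behavioural sketch of a phase-locked loop (illustrative only; NOT the
# Tokyo Tech / Socionext circuit). A digitally controlled oscillator (DCO) is
# divided by N, its phase is compared with a reference clock, and a
# proportional-integral loop filter steers the DCO until the output frequency
# settles at N times the reference frequency.

F_REF = 1.0e6          # reference frequency [Hz]            (assumed value)
N = 64                 # frequency multiplication factor     (assumed value)
F_FREE = 60.0e6        # DCO free-running frequency [Hz]     (assumed value)
KP, KI = 1.2e7, 6.0e5  # loop-filter gains [Hz per cycle of phase error] (assumed)

dco_freq = F_FREE      # current DCO frequency
div_phase = 0.0        # accumulated phase of the divide-by-N output [cycles]
integral = 0.0         # integral branch of the loop filter [Hz]

for edge in range(1000):                     # iterate over reference clock edges
    ref_phase = edge + 1.0                   # reference phase after this period
    div_phase += dco_freq / (N * F_REF)      # divided-DCO phase gained this period
    err = ref_phase - div_phase              # phase error [cycles]; > 0 means DCO lags
    integral += KI * err                     # integral path removes steady-state error
    dco_freq = F_FREE + KP * err + integral  # update the DCO frequency control

print(f"locked output ~ {dco_freq / 1e6:.2f} MHz (target {N * F_REF / 1e6:.0f} MHz)")
```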

For high-performance SoC devices to be realized, fabrication processes for semiconductor electronics must become more sophisticated. The smaller the area needed to implement the digital circuitry, the better the performance of the device. Manufacturers have been racing to develop ever-smaller semiconductors: 7 nm semiconductors (a massive improvement over their 10 nm predecessors) are already in production, and methods to build 5 nm ones are now under development.

However, a major bottleneck stands in the way of this endeavor: existing PLLs require analog components, which are generally bulky and whose designs are difficult to scale down.

Scientists at Tokyo Tech and Socionext Inc., led by Prof. Kenichi Okada, have addressed this issue by implementing a 'synthesizable' fractional-N PLL, which only requires digital logic gates, and no bulky analog components, making it easy to adopt in conventional miniaturized integrated circuits.

Okada and team used several techniques to decrease the required area, power consumption and jitter--unwanted time fluctuations when transmitting digital signals--of their synthesizable PLLs. To decrease area, they employed a ring oscillator, a compact oscillator that can be easily scaled down. To suppress jitter, they reduced the phase noise--random fluctuations in a signal--of this ring oscillator, using 'injection locking'--the process of synchronizing an oscillator with an external signal whose frequency (or multiple of it) is close to that of the oscillator--over a wide range of frequencies. The lower phase noise, in turn, reduced power consumption.

The design of this synthesizable PLL beats that of all other current state-of-the-art PLLs in many important aspects. It achieves the best jitter performance with the lowest power consumption and smallest area (as can be seen in Figure 1). "The core area is 0.0036 mm², and the whole PLL is implemented as one layout with a single power supply," remarks Okada. Further, it can be built using standard digital design tools, allowing for its rapid, low-effort, and low-cost production, making it commercially viable.

This synthesizable PLL can be easily integrated into the design of all-digital SoCs and is commercially viable, making it valuable for developing the much sought-after 5 nm semiconductors for cutting-edge applications, including artificial intelligence and the Internet of Things, where high performance and low power consumption are critical requirements. But the contributions of this research go beyond these possibilities. "Our work demonstrates the potential of synthesizable circuits. With the design methodology employed here, other building blocks of SoCs, such as data converters, power management circuits, and wireless transceivers, could be made synthesizable as well. This would greatly boost design productivity and considerably reduce design efforts," explains Okada. Tokyo Tech and Socionext will continue their collaboration in this field to advance the miniaturization of electronic devices, enabling the realization of newer-generation technologies.

Credit: 
Tokyo Institute of Technology

Lack of transparency in urban sustainability rankings

image: This is Lucia Saez, researcher of the UPV/EHU's Department of Financial Economics II.

Image: 
UPV/EHU

"The last two decades have seen significant growth in the spread of tools to classify and measure urban performance (rankings, indexes, etc.) across both the public and private institutions that use them, in response to different types of pressures encouraging uniformity. Naturally, all these tools are useful for guiding and assessing the policies implemented by local authorities in various fields of action, and are particularly prolific in the area of sustainability. Yet there is a lack of knowledge about the actual methodological base underpinning them and which is supposed to legitimize their use," explained Lucía Sáez-Vegas, PhD holder in the UPV/EHU's Department of Financial Economics II.

"With the aim of analysing and assessing quality and good practices in urban measuring and monitoring, and while devoting special attention to the methodological aspects, we took hundreds of measuring tools and selected a set of 21 similar rankings, indexes and tools designed to rank and monitor urban sustainability (understood in a very broad sense) so that we could study them in depth and thus adapt and apply an analysis methodology tested in another field, that of university rankings," added Dr Sáez.

The significance of methodological aspects

In each of the rankings, indexes and tools analysed, the researchers explored four main principles: the aim and target group they are geared towards; the methodology and weighting used in their design; transparency in data gathering and information processing; and, finally, the presentation of the results. As Dr Sáez explained, "of these four aspects the information on the first and the last is the most accessible, in other words, the descriptive information. That is specified by all the classifications analysed; yet that does not happen when it comes to accessing all the information on the methodological aspects, data gathering and information processing. This results in what is known as the black box, an artefact whose results are studied and disseminated without its inner workings being questioned".

In this way the researchers confirmed various methodological weaknesses in all the rankings analysed. The researcher insists that "tools of this type tend to neglect complex causalities and lack transparency with respect to data gathering, weighting and aggregation in their design; they tend to be biased and, as a result, tend to ignore badly ranked cities and to reinforce existing stereotypes".

"The possibility of ranking and comparing cities of different dimensions may help to spot those that appear to perform better in various urban aspects. That is why these tools are used on occasions by urban managers and public decision-makers to develop an action plan, even though one has to have a clear idea about how the ranking or index has been drawn up, and exercise caution when using it, above all if insufficient information is provided about the methodological aspects and the robustness of its results. We understand that these tools should be used more as a source of information and even inspiration and less as a road map for action," she added. "These rankings attract the interest of the general public because they measure concepts of a complex nature which are presented by means of a ranking, generally of a numerical type, which can be understood very easily. From our academic viewpoint, the fact that the results are presented in the form of a final ranking with a mention of the principal findings, but with little or no regard for the methodological aspects which, at the end of the day, are the ones that underpin the score or ranking, signifies a clear weakness of these tools when used to measure and monitor urban performance," said Sáez.

Credit: 
University of the Basque Country

Hidden donors play significant role in political campaigns

A new Caltech study reveals that so-called hidden donors in a political campaign--those contributors who donate less than $200--can make up a sizable fraction of a candidate's campaign funds.

The study, appearing in the Election Law Journal, looked specifically at the 2016 presidential campaign of Bernie Sanders. Unlike many other campaigns at the time, the Sanders campaign used an intermediary online fundraising service, ActBlue, which meant that its small contributions had to be reported to the Federal Election Commission (FEC). Typically, donations from a single donor that add up to $200 or less do not need to be reported, but for intermediary fundraising services the rules are stricter.

"That may seem like a small amount, but we have always wondered what it adds up to," says Mike Alvarez, professor of political science at Caltech and lead author of the new study. "Until recently, we haven't had the data to ask this question."

The study, in which the researchers sifted through and analyzed more than 100 million donation records, showed that the smaller contributions made up a total of 33 percent of all funds for the 2016 Sanders campaign. What is more, there were seven times more hidden donors than visible ones.
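
A minimal sketch of that kind of aggregation is shown below (the file name and the column names "donor_id" and "amount" are hypothetical placeholders, not the FEC schema the authors worked with): contributions are summed per donor, donors at or below the $200 itemization threshold are treated as hidden, and their share of funds and headcount is computed.

```python
# Hedged sketch of the aggregation described above; the input file and the
# column names are hypothetical, not the actual FEC data layout.
import pandas as pd

records = pd.read_csv("itemized_contributions_2016.csv")  # hypothetical export

per_donor = records.groupby("donor_id")["amount"].sum()   # total given per donor
hidden = per_donor[per_donor <= 200]    # donors below the usual reporting threshold
visible = per_donor[per_donor > 200]

print(f"hidden-donor share of funds: {hidden.sum() / per_donor.sum():.1%}")
print(f"hidden donors per visible donor: {len(hidden) / len(visible):.1f}")
```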

"What this is saying is that grassroots efforts to raise money from tens of thousands of people are an important part of a politician's campaign," says Alvarez.

Seo-young Silvia Kim, a Caltech graduate student who will soon become a professor at American University, led the data analysis, downloading chunks of individual contribution records from the FEC databases. She began by looking at many campaigns but later realized that the different rules for ActBlue donations meant that smaller contributions were being reported.

"Just in the 2016 election cycle, there were more than 1,133,000 files and more than 100 million records of individual contributions for all the campaigns," says Kim. "But to make sense of the data, you have to dig deep into the raw data, and not just its summaries. That's when I noticed how intermediary committees were reporting contributions differently than the usual committees."

"It's hard to scrape all this information together," says Alvarez. "The data are either not available or hard to obtain. Silvia is an incredibly talented data scientist. She linked the many data sets together, which was no easy task."

The results also showed that the hidden donors tended to contribute to the campaign relatively later than what is typical, and that they tended to be students, females, and racial/ethnic minorities.

The researchers plan to do similar studies for the 2020 elections, and with the increasing use of online fundraising platforms, they will be able to track small campaign contributions for not just Sanders but other candidates as well.

"Money is very important in politics, but all the previous studies about campaign finance were restricted to relatively large donors, leading to a skewed picture of this important political activity," says co-author Jonathan Katz, the Kay Sugahara Professor of Social Sciences and Statistics at Caltech. "Given changes in technology, these smaller donors are becoming both more numerous and important."

Credit: 
California Institute of Technology

Five things to know about egg freezing

Egg freezing for age-related fertility decline is becoming more common, and a short article in CMAJ (Canadian Medical Association Journal) provides quick reference points on the topic for primary care providers. http://www.cmaj.ca/lookup/doi/10.1503/cmaj.191191

1. Elective egg freezing is a way to help patients increase their chance of pregnancy at a later age
2. Patients aged 35 or younger with normal ovarian reserve have the best chance of success, although future live birth is not guaranteed
3. The treatment process usually takes 10-14 days
4. Fees for this treatment are not covered in Canada by government health insurance
5. Frozen eggs do not expire, but many clinics have age limits after which they will not proceed with transfer of embryos created from the frozen eggs.

Credit: 
Canadian Medical Association Journal

Scientists warn humanity about worldwide insect decline

image: Drivers (in red) and consequences (in blue) of insect extinctions. Note that drivers often act synergistically or through indirect effects (e.g., climate change favours many invasive species and the loss of habitat). All these consequences contribute to the loss of ecosystem services essential for humans.

Image: 
Pedro Cardoso

Engaging civil society and policy makers is essential for the future and mutual well-being both of people and insects. In addition to mitigating climate change, an important aspect of the solution involves setting aside high-quality and manageable portions of land for conservation, and transforming global agricultural practices to promote species co-existence.

Humanity is pushing many ecosystems beyond recovery. As a consequence, unquantified and unquantifiable insect extinctions are happening every day. Two scientific papers by 30 experts from around the world discuss both the perils and ways to avoid further extinctions, intending to contribute towards a necessary change of attitude for humanity's own sake.

"It is surprising how little we know about biodiversity at a global level, when only about 10 to 20 per cent of insect and other invertebrate species have been described and named. And of those with a name, we know little more than a brief morphological description, maybe a part of the genetic code and a single site where it was seen some time ago," says Pedro Cardoso, from the Finnish Museum of Natural History Luomus, University of Helsinki, Finland.

The results of recently published works make it clear that the situation is dire

Habitat loss; pollution, including harmful agricultural practices; invasive species, which respect no borders; climate change; overexploitation; and the extinction of dependent species all contribute, to varying degrees, to documented insect population declines and species extinctions.

"With species loss, we lose not only another piece of the complex puzzle that is our living world, but also biomass, essential for example to feed other animals in the living chain, unique genes and substances that might one day contribute to cure diseases, and ecosystem functions on which humanity depends," confirms Cardoso.

The ecosystem functions he mentions include pollination, on which most crops depend, and decomposition, through which insects contribute to nutrient cycling, as well as many other functions for which we have no technological or other replacement.

Practical solutions to mitigate insect apocalypse

The researchers also suggest possible practical solutions based on existing evidence gathered from around the world, which would help to avoid further insect population loss and species extinctions. These include actions such as setting aside high-quality and manageable portions of land for conservation, transforming global agricultural practices to promote species co-existence, and mitigating climate change.

Above all, communicating and engaging with civil society and policy makers is essential for the future and mutual well-being both of people and insects.

"While small groups of people can impact insect conservation locally, collective consciousness and a globally coordinated effort for species inventorying, monitoring and conservation is required for large-scale recovery" says Michael Samways, Distinguished Professor at Stellenbosch University, South Africa.

Ideas to help insects

1. Avoid mowing your garden frequently; let nature grow and feed insects.

2. Plant native plants; many insects need only these to survive.

3. Avoid pesticides; go organic, at least for your own backyard.

4. Leave old trees, stumps and dead leaves alone; they are home to countless species.

5. Build an insect hotel with small horizontal holes that can become their nests.

6. Reduce your carbon footprint; this affects insects as much as other organisms.

7. Support and volunteer in conservation organizations.

8. Do not import or release living animals or plants into the wild that could harm native species.

9. Be more aware of tiny creatures; always look on the small side of life.

Credit: 
University of Helsinki

Researchers virtually 'unwind' lithium battery for the first time

image: Reconstructed tomograms from neutron and X-ray computed tomography. Clearly visible in the X-ray images is the nickel current collecting mesh, which appears brighter than the active electrode material.

Image: 
UCL, ILL, HZB

An international team led by researchers at UCL has revealed new insights into the workings of a lithium battery by virtually "unrolling" its coil of electrode layers using an algorithm designed for papyrus scrolls.

In a study published in Nature Communications, https://www.nature.com/articles/s41467-019-13943-3 "4D imaging of Li-batteries using operando neutron and X-ray computed tomography in combination with a virtual unrolling technique," researchers combined X-ray and neutron tomography to track the processes deep within a lithium battery during discharge. They then used a mathematical model designed for ancient manuscripts too sensitive to be physically opened to "unroll" the electrode layers, so aiding analysis and revealing that different sections of the battery were operating differently.

Researchers found that using the two complementary imaging techniques and "unrolling" the electrodes while they are in normal use provides a fuller and more accurate understanding of how the battery works and how, where and why it degrades over time. Unseen trends in the spatial distribution of performance in the cells were observed.

The method paves the way for developing strategies for improving the design of cylindrical cells using a range of battery chemistries, including by informing better mathematical models of battery performance. As such the method may facilitate improvements in the range and lifetime of electric vehicles of the future.

The project was funded by the Faraday Institution, as part of its battery degradation project.

Further details

The team investigated the processes occurring during discharge of a cylindrical commercial Li-ion primary cell from Duracell using a combination of two highly complementary tomography methods. Tomography is a technique for imaging cross sections of a solid object using a penetrating wave such as ultrasound or X-rays. It is used in radiology, archaeology, atmospheric science, geophysics and oceanography, as well as in materials science.

X-rays are sensitive to the heavier elements in the battery, such as manganese and nickel, while neutrons are sensitive to lighter elements such as lithium and hydrogen. The two techniques therefore visualise different parts of the battery structure, allowing researchers to build up a more complete understanding of the processes occurring deep within the cell during discharge.

X-ray computed tomography allowed for the quantification of mechanical degradation effects, such as electrode cracking caused by the electrode bending process during cell manufacturing, whereas neutron imaging yielded information about the electrochemistry, such as lithium-ion transport and consumption or gas formation due to electrolyte decay.

A new mathematical method developed at the Zuse-Institut in Berlin then enabled researchers to virtually unwind the battery electrodes that are wound into the form of a compact cylinder. The cylindrical windings of the battery are difficult to examine quantitatively, and the cell cannot be unwound without inducing further damage that would not be present in an unwound battery.
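
The geometric idea can be illustrated with a toy model (ours, far simpler than the Zuse-Institut algorithm, and with assumed dimensions): if the wound electrode is approximated as an Archimedean spiral, each point in the tomogram can be assigned flat "unrolled" coordinates, namely the arc length along the winding and the depth within the layer.

```python
# Toy illustration of "virtual unrolling" (ours; the published method is far
# more sophisticated and works on real tomograms). The jelly-roll electrode is
# approximated as an Archimedean spiral r = R0 + PITCH * theta_total / (2*pi),
# and a point (x, y) is mapped to (arc length along the winding, depth in layer).
import numpy as np

R0 = 2.0      # inner radius of the winding [mm]   (assumed value)
PITCH = 0.2   # radial gain per full turn [mm]     (assumed value)

def unroll(x, y):
    r = np.hypot(x, y)
    theta = np.arctan2(y, x) % (2 * np.pi)
    # Index of the spiral turn this point sits on (exact for points on the spiral).
    n_turn = np.round((r - R0 - PITCH * theta / (2 * np.pi)) / PITCH)
    theta_total = theta + 2 * np.pi * n_turn
    depth = r - (R0 + PITCH * theta_total / (2 * np.pi))  # offset from the spiral
    # Arc length, approximating ds by r * dtheta for a tightly wound spiral.
    ts = np.linspace(0.0, theta_total, 2000)
    arc = np.trapz(R0 + PITCH * ts / (2 * np.pi), ts)
    return arc, depth

print(unroll(3.0, 0.5))  # -> (position along the unrolled electrode, layer depth)
```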

Credit: 
The Faraday Institution

AAAS panel focuses on roadmap to 'radical transformation of the AI research enterprise'

When Dan Lopresti and his colleagues talk about the future of artificial intelligence (AI) during their upcoming panel at the annual meeting of the American Association for the Advancement of Science (AAAS), be prepared to imagine a better world.

In this world, the full potential of AI is unleashed to benefit society: health care is personalized and accessible through a friendly robot companion; education is customized to offer individualized plans for retraining and skills-building; and businesses, large and small, operate with previously unheard-of efficiency and provide a level of customer service that can only be dreamed of today.

"The question is what are we going to see over the next ten or twenty years break loose as a result of the research, which is assuming the research gets done because of investments made," says Lopresti, a professor of computer science and engineering at Lehigh University. Lopresti is also the incoming Vice Chair of the Computing Community Consortium (CCC) Council which, along with the Association for the Advancement of Artificial Intelligence (AAAI), spearheaded the creation of "A Twenty-Year Community Roadmap for Artificial Intelligence Research in the U.S."

Lopresti will participate in a panel with the authors of the Roadmap and leaders of the initiative that led to it, Yolanda Gil (University of Southern California and President of AAAI) and Bart Selman (Cornell University and President-Elect of AAAI), on Saturday, February 15th from 8:00 am to 9:30 am at the AAAS annual meeting in Seattle.

The Roadmap lays out a case for the best use of resources to fulfill the promise of AI to benefit society. The 100-plus page report is introduced by an Executive Summary that argues that: "Achieving the full potential of AI technologies poses research challenges that require a radical transformation of the AI research enterprise, facilitated by significant and sustained investment."

The authors write that AI systems have the potential for transformative impact across all sectors of society and for substantial innovation and economic growth. The report articulates AI benefits in several specific areas: 1) boost health and quality of life, 2) provide lifelong education and training, 3) reinvent business innovation and competitiveness, 4) accelerate scientific discovery and technical innovation, 5) expand evidence-driven social opportunity and policy, and 6) transform national defense and security.

The report also recognizes the tremendous social change that will result, says Lopresti, and that this must be addressed as well. Ethics is also an important consideration across the board.

Eighteen "vignettes" bring the envisioned future of AI to life, such as:

- Vignette 1: Jane is a video game enthusiast and loves spicy food. She suffers from anxiety, has been under treatment for Type 1 diabetes since her early teens, and has a rare allergy to sesame seeds. Her health-focused personal assistant has been helping Jane manage her physical and mental health for years. It monitors Jane's vital signs and blood sugar, has access to her electronic medical records, and can draw on online health information from high-quality sources to generate recommendations and advice. It helps Jane manage her chronic illness, ensuring that the treatment is being administered correctly and has the intended effects. It stays up to date with the latest breakthroughs in diabetes treatment and reasons about how these might affect Jane...

- Vignette 11: Joe is a worker who was laid off in a company restructuring. He wants to retrain, but needs income in order to support his family and cannot afford to embark on full-time education. A free AI system helps him plan for career change--what is a feasible job he could take that would either build the skills he needs along the way or would pay the bills while giving flexibility to study and advance his career. To explore his short- and long-term career opportunities, Joe navigates to an interactive AI system and describes his skills and interests. The system visualizes a number of possible career paths for him, including both short- and long-term steps he can pursue to make progress on those paths...

- Vignette 12: Hollis runs a small online business, where she sells customized personal devices and customized robots, which she designs and builds on demand. Some objects are aesthetic, such as integrating light and motion sensors with embedded LED lighting to add responsiveness to jewelry; others are more functional, such as customized wristbands that integrate her designs with medical sensors and small displays. An interactive AI system allows Hollis to rapidly develop specialized products for her customers, enabling new business opportunities...

The report acknowledges the challenges that must be overcome to achieve these scenarios and presents a number of recommendations amounting to "a reinvention of the AI research enterprise." The recommendations fall under three broad categories: 1) create and operate a national AI infrastructure; 2) re-conceptualize and train an all-encompassing AI workforce; and 3) ensure that core programs for basic AI research are expanded and supported.

"One of the goals is to create the infrastructure needed to keep faculty in universities doing research at a high level," says Lopresti. "We also need to keep students interested in the idea -when they get their graduate degrees - to go the faculty route rather than the industry route."

As Lopresti explains, to do cutting-edge research not only does one "...need access to tremendous software and tremendous computing power, you also need access to tremendous amounts of data. A lot of machine learning is based on data. And, if you go to Facebook or Google you get the data. A lot of people who leave universities to go to Google or Facebook, it's not so much about the money or the stock options, or the free lunches or the other perks, it's because they believe they can do the best research there because they will have access to the best data. That's a huge, huge issue with keeping people in academia."

He adds: "There are a lot of things that Google does that are really cool. Yet, there are obviously commercial interests driving Google. And the whole idea is that in academia, our work shouldn't be mired in or colored by commercial interests. At a university that's not our reason to be. It's to be the independent voice, independent scientists. That's really important."

They write that a reconceptualization and training of an all-encompassing AI workforce should build upon the National AI Infrastructure. Among the elements they describe are: developing AI curricula at all levels; incentivizing emerging interdisciplinary areas; and, engaging underrepresented and underprivileged groups to bring the best talent into the AI research effort. They emphasize that AI ethics and policy must be central, and highlight the importance of incorporating ethics and related responsibility principles as central elements in the design and operation of AI systems.

The new initiatives, they write, cannot come at the expense of core programs for basic AI research which are critical, adding: "These core programs--which provide well-established, broad-based support for research progress, for training young researchers, for integrating AI research and education, and for nucleating novel interdisciplinary collaborations--are critical complements to the broader initiatives described in this Roadmap, and they too will require expanded support."

Creating the roadmap: "a marshalling of the community"

According to the Computing Community Consortium, the goal of the 20-year Roadmap initiative was to identify challenges, opportunities, and pitfalls in the AI landscape, and to create a compelling report to inform future decisions, policies, and investments in this area.

The Roadmap was based on broad community input gathered via a number of forums and communication channels: three topical workshops during the fall and winter of 2018/2019, a Town Hall at the annual meeting of the AAAI, and feedback from other groups of stakeholders in industry, government, academia, and funding agencies.

"We marshalled the community," says Lopresti. "This was an amazing effort. In a period of about a year, we got info from hundreds of computing researchers around the country. We ran a series of workshops that were very well attended and produced this Roadmap for AI Research which is this quite hefty document that looks out twenty years."

The "hefty document" paints a compelling vision of a future made better through the unleashing of AI's full potential, with an understanding that attention must also be paid to the possible negative repercussions of this revolution. It's a future, they say, that can only be realized through strategic, substantial and sustained investment and a reimagining of how AI research is done.

Credit: 
Lehigh University

A thermometer can be stretched and crumpled by water

image: The effects of ionic side chain. a) Schematics indicating difference regarding pot life and thermal stability according to chemical and physical methods. b) Scheme for P(SPMA-r-MMA)s and water solubility. c) Physical cross-linking (reversible ion cross-linking and entanglement) regarding affinity with solvent in the gels.

Image: 
Taiho Park(POSTECH)

Recent outbreaks of the novel coronavirus have emphasized the importance of quarantine and prevention more than ever. Body temperature is the first thing measured when monitoring changes in the body, so it is important to measure it accurately and promptly. To this end, a research team recently developed a stretchable and crumplable polymer ionic conductor and used it to realize a thermal sensor that can measure body temperature through simple contact, such as wearing clothes or shaking hands, as well as an actuator that can control the movements of an artificial muscle.

Prof. Taiho Park and his student Junwoo Lee, from the POSTECH Department of Chemical Engineering, working jointly with Nanyang Technological University, developed a thermally stable and flexible ionic conductor using a water solvent for the first time. Their research was published in the latest online issue of Advanced Materials, one of the most renowned journals in the field of materials science.

Various materials for ionic conductors have been developed, but each has limitations to overcome. The semiconductor devices used in most electronics suffer diminished electronic performance under the mechanical stress of stretching or contraction. Rubber filled with nano-silver particles requires a difficult fabrication process and is not transparent. Hydrogels dry out easily and lose their flexibility.

To solve these problems, the research team designed P(SPMA-r-MMA) polymers with different ratios of ionic side chains, chemically linking the ionic materials to the polymer chains. When making an ionic conductor, it is critical to have a solution process at room temperature, so the newly developed polymer ionic conductor was processed with water as the solvent and deposited as a thin film. The process was much simpler than conventional ones, used no toxic solvents, and is amenable to mass production.

The chemically linked ionic conductor was thermally stable and stretchable. It was also self-healing, able to recover its structure when ripped or broken. The research team used this ionic conductor to realize, for the first time, an actuator that is thermally stable up to 100°C and a flexible thermal sensor that can be applied to the body.

Junwoo Lee, who performed the research, said, "This is the first example of developing a polymer ionic conductor, which is used in next-generation stretchable devices, by using a water solvent instead of a toxic chemical solvent. The polymer ionic conductor that we developed is stretchable, self-healing and thermally stable. For this reason, we anticipate that our research will have a great impact on the stretchable, wearable electronic device industry."

Credit: 
Pohang University of Science & Technology (POSTECH)

Geothermal energy: Drilling a 3,000-metre-deep well

image: This is the view of the Venelle-2 well. The well was designed to sample supercritical fluids.

Image: 
© Riccardo Minetto

Although stopping climate change is challenging, it is imperative to slow it down as soon as possible by reducing greenhouse gas emissions. But how can we meet the growing energy demand while reducing our use of polluting fossil fuels? Geothermal energy is an efficient, non-polluting solution, but in certain cases geothermal operations must be handled with care. Reaching the most powerful sources of available energy means drilling deep into the layers of the Earth's crust to find geothermal fluids with high energy content (hot water and gas released by magma). Yet the deeper we drill, the greater the subsurface unknowns controlling the stability of the Earth's crust. Destabilising this precarious equilibrium with geothermal wells may reactivate geological layers, causing earthquakes. Researchers at the University of Geneva (UNIGE), Switzerland, working in collaboration with the University of Florence and the National Research Council (CNR) in Italy, have studied the seismic activity linked to a geothermal drilling operation in search of supercritical fluids. They discovered that the drilling did not cause uncontrolled seismic activity. Drilling under such critical conditions suggests that the technology is on the verge of mastering deep geothermal energy, paving the way for new sources of non-polluting heat and electricity. The results are published in the Journal of Geophysical Research.

The scientific community agrees that CO2 emissions need to drop by 45% by 2030 and that 70% of our energy must be renewable by 2050. But how can these targets be met? Geothermal power - a renewable form of energy - is part of the solution. A number of countries, including Switzerland, are already exploiting geothermal energy to produce heat from shallow wells. Down to about 1,500 metres, such technology normally presents little risk. "To generate electricity, however, we have to drill deeper, which is both a technological and a scientific challenge," points out Matteo Lupi, a professor in the Department of Earth Sciences in UNIGE's Faculty of Science. In fact, drilling deeper than 1,500 metres requires special care because the unknown factors relating to the subsurface increase. "Below these depths, the stability of the drilling site is more and more difficult to guarantee, and poor decisions could trigger an earthquake."

A first success at Larderello-Travale in Italy?

The Larderello geothermal field in Tuscany - the world's oldest - currently produces 10% of the world's total geothermal electricity supply. We know that at about 3,000 metres depth there is a geological layer marked by a seismic reflector, where it is thought that supercritical fluids may be found. Supercritical fluids could yield an enormous amount of renewable energy. The term supercritical implies an undefined phase state - neither liquid nor gaseous - with a very high energy content. "Engineers have been trying since the 1970s to drill down to this famous level at 3,000 metres in Larderello, but they still haven't succeeded," explains Riccardo Minetto, a researcher in UNIGE's Department of Earth Sciences. "What's more, we still don't know exactly what this bed is made up of: is it a transition between molten and solid rocks? Or does it consist of cooled granites releasing fluids trapped at this level?" As the technology becomes ever more sophisticated, geothermal drilling in search of supercritical conditions has been attempted once more at Larderello-Travale. The aim? To deepen a wellbore a few centimetres wide to a depth of 3,000 metres and tap these supercritical fluids. "This drilling, which formed part of the European DESCRAMBLE project, was unique because it targeted the suggested transition between rocks in a solid and molten state," continues professor Lupi.

The Geneva team set up eight seismic stations around the well within a radius of eight kilometres to measure the impact of the drilling on seismic activity. As the drilling progressed, the geophysicists collected the data and analysed each difficulty that was encountered. "The good news is that for the very first time, drilling in search of supercritical fluids caused only minimal seismic disturbance, which was a feat in such conditions and a strong sign of the technological progress that has been made", explains professor Lupi. His team used the eight seismic stations to distinguish between the natural seismic activity and the very weak events caused by the drilling. The threshold of 3,000 metres, however, was not reached. "The engineers had to stop about 250 metres from this level as a result of the extremely high temperature increase - over 500 degrees. There's still room for technical progress on this point", says Minetto.

This study indicates that the supercritical drilling went well and that the technology is close to being mastered. "Until now, anyone who had tried to sink a well in supercritical conditions did not succeed because of the high temperatures, but the results here are extremely encouraging," says professor Lupi. Switzerland itself is very active in promoting geothermal energy. If developed further, this renewable source of energy would share some of the burden carried by the country's hydropower, solar and wind power. "Geothermal energy could be one of the main sources of energy of our future, so it's only right to promote future investments to develop it further and safely," concludes the Geneva-based researcher.

Credit: 
Université de Genève

Adding sewage sludge on soils does not promote antibiotic resistance, Swedish study shows

image: This is Joakim Larsson, Professor, Sahlgrenska Academy, University of Gothenburg.

Image: 
Photo by Johan Wingborg

Adding sewage sludge on soils does not promote antibiotic resistance, a study from University of Gothenburg shows.

Some of the antibiotics we use end up in sewage sludge, together with a variety of antibiotic resistant bacteria present in feces. Therefore, there is a widespread concern that spreading sludge on farmland would contribute to the development or spread of antibiotic resistance.

In a new scientific study, researchers from the Centre for Antibiotic Resistance Research, CARe, at the University of Gothenburg investigated effects of over thirty years of regular spread of sludge to soils.

The research group, led by Professor Joakim Larsson, took advantage of an agricultural field trial in southern Sweden, where land used for growing different crops had been amended with digested sludge every four years since the early 1980s. On a large number of plots, sludge from a nearby treatment plant was spread in different doses, while on parallel plots only inorganic fertilizers were added.

Together with researchers from Umeå University and the University of Copenhagen, the research group from Gothenburg studied effects on the levels of antibiotics and other antibacterial substances in sludge and soil, on resistance genes and resistant bacteria, as well as on which species of bacteria were common in the soils.

- The overall result is that virtually nothing happens. Everything we studied looks about the same in the different soils, regardless of whether a lot of sludge, a little sludge, no sludge or only inorganic fertilizers were added. No antibiotics accumulated in the soil, nor did any resistant bacteria. The only clear thing we can see is that the nutrient supply affects which bacterial species thrive best in the soils, says Joakim Larsson.

There is a clear value in returning nutrients to farmland and thus closing the cycle. At the same time, sewage treatment plants, and the sludge produced there, are accumulation points for many of the chemicals we use in society.

- We have studied risks related to antibiotic resistance, which is only one of several pieces of the puzzle in the assessment of benefits and risks of using sludge as fertilizers. And, from a scientific point of view, small effects can never be completely excluded. In other countries, with higher antibiotic use, more resistant bacteria present in the human fecal flora, different sewage and sludge treatment procedures and higher dosing of sludge, risks can clearly not be excluded. Nevertheless, spreading sludge on farmland to the extent and in the way done in Sweden today does not seem to pose any obvious risk of driving resistance development. That is good news, concludes Joakim Larsson.

Credit: 
University of Gothenburg

Oblique electrostatic inkjet-deposited TiO2 film leads to efficient perovskite solar cells

image: (a) RS J-V characteristics of PSCs made with SC- TiO2 CL, SP- TiO2 CL, OEI- TiO2 CL-60 sec, and OEI- TiO2 CL-30+30 sec. (b) IPCE spectra of PSCs made with SC- TiO2 CL, SP- TiO2 CL, and OEI- TiO2 CL-30+30 sec.

Image: 
Kanazawa University

Kanazawa, Japan - The need to harvest solar energy efficiently for a more sustainable future is increasingly being accepted across the globe. A new family of solar cells based on perovskites--materials with a particular crystal structure--is now competing with conventional silicon materials to satisfy the demand in this area. Perovskite solar cells (PSCs) are continually being optimized to fulfill their commercial potential, and a team led by researchers from Kanazawa University has now reported a new and simple oblique electrostatic inkjet (OEI) approach for depositing a compact titanium oxide (TiO2) layer, which serves as the electron transport layer (ETL), on FTO-patterned substrates without the need for a vacuum environment, enhancing the efficiency of PSCs. The findings are published in Scientific Reports.

The PSCs comprise a stack of different component layers that all have a specific role. The ETL, which is often composed of TiO2, enables the transport of electrons--which carry charge--to the electrodes, while blocking the transport of holes--which can recombine with electrons to prevent their flow. Establishing a complete TiO2 layer with the correct thickness, which is uniform and free of flaws, is therefore critical to producing efficient solar cells.

Many of the numerous TiO2 deposition techniques reported to date have associated limitations, such as poor coverage or reproducibility, or being unsuitable for scale-up. They can also require challenging preparation conditions such as a vacuum. The researchers report a simple, low-cost OEI-method that achieves a compact layer without requiring a vacuum.

"Our technique can produce uniform electron transport layers whose thickness can be varied by controlling the deposition time." Study lead author Assistant Professor Dr. Md. Shahiduzzaman explains. "Solar cells made using our approach had power-conversion efficiencies of up to 13.19%, which, given the other advantages of our technique, is very promising for scale-up and commercialization."

The technique is based on the deposition of positively charged droplets that are attracted to a negatively charged surface. Previous reports using the same electrostatic approach achieved lower power-conversion efficiencies because the droplets formed a stack on the surface as a result of gravity. Introducing an oblique angle into the process--spraying the TiO2 precursor at 45° to the surface--eliminated the effect of gravity, leading to the deposition of a more uniform layer.
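
For reference, the power-conversion efficiency quoted above is conventionally obtained from standard current-voltage measurements; this is the textbook definition, not a formula taken from the paper itself:

```latex
% Textbook definition of power-conversion efficiency (not specific to this study):
% J_sc = short-circuit current density, V_oc = open-circuit voltage,
% FF = fill factor, P_in = incident light power density
% (100 mW/cm^2 under standard AM1.5G illumination).
\mathrm{PCE} = \frac{J_{\mathrm{sc}}\, V_{\mathrm{oc}}\, \mathrm{FF}}{P_{\mathrm{in}}}
```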

"An optimum ETL deposition method must offer a number of properties to result in a high efficiency solar cell," Dr. Shahiduzzaman explains. "The ability to control the layer thickness and achieve a uniform, reproducible layer at low cost, without the need for a vacuum, provides a unique package of advantages that has not been reported to date. We hope that these properties will lead to effective and commercially relevant scale-up that will contribute to the drive towards cleaner energy worldwide."

Credit: 
Kanazawa University

Not everything is ferromagnetic in high magnetic fields

The high field magnet at HZB generates a constant magnetic field of up to 26 Tesla. This is about 500,000 times stronger than the Earth's magnetic field. Further experiments with pulsed magnetic fields up to 45 Tesla were performed at the Helmholtz-Zentrum Dresden-Rossendorf (HZDR).

The physicists examined crystals of U2Pd2In, which form a special class of solids (Shastry-Sutherland system). The interactions between the magnetically active uranium atoms are quite complex in this structure, mainly due to the extended 5f orbitals of the outermost electrons of uranium in a solid. These 5f electrons are also carriers of the magnetic moment in the material.

Using neutron diffraction in strong fields, they found an unusually complicated non-collinear modulated magnetic structure above a critical magnetic field. The magnetic unit cell is twenty times larger than the crystallographic unit cell and contains 80 magnetic moments. Such a structure is a consequence of competition between different strong interactions and the applied field. "Our results are important for two reasons," Dr. Karel Prokes (HZB) says. "First, they show that the field-induced phase is not ferromagnetic and that the magnetization increase at high fields is probably due to a gradual rotation of U moments towards the field direction, a finding that might be of relevance for many other systems; and second, they may help to develop more precise theories dealing with 5f electron systems."

Credit: 
Helmholtz-Zentrum Berlin für Materialien und Energie

Quantum fluctuations sustain the record superconductor

image: Crystal structure of the Fm-3m phase of LaH10, where a highly symmetric hydrogen cage encloses the lanthanum atoms. At the top, a sketch of the complex classical energy landscape, in which many minima are present; at the bottom, a sketch of the completely reshaped and much simpler quantum energy landscape, in which only one minimum survives.

Image: 
Ion Errea. UPV/EHU

Reaching room-temperature superconductivity is one of the biggest dreams in physics. Its discovery would bring a technological revolution by providing electrical transport with no loss, ultra-efficient electric motors and generators, as well as the possibility of creating huge magnetic fields without cooling. The recent discoveries of superconductivity, first at 200 kelvin in hydrogen sulfide and later at 250 kelvin in LaH10, have drawn attention to these materials, bringing hopes of reaching room temperature soon. It is now clear that hydrogen-rich compounds can be high-temperature superconductors, at least at high pressures: both discoveries were made above 100 gigapascals, one million times atmospheric pressure.

The 250 kelvin (-23ºC) obtained in LaH10, roughly the temperature at which home freezers work, is the highest temperature at which superconductivity has ever been observed. The possibility of high-temperature superconductivity in LaH10, a superhydride formed by lanthanum and hydrogen, was anticipated by crystal structure predictions back in 2017. These calculations suggested that above 230 gigapascals a highly symmetric LaH10 compound (Fm-3m space group), with a hydrogen cage enclosing the lanthanum atoms (see figure), would be formed. It was calculated that this structure would distort at lower pressures, breaking the highly symmetric pattern. However, experiments performed in 2019 were able to synthesize the highly symmetric compound at much lower pressures, between 130 and 220 gigapascals, and to measure superconductivity around 250 kelvin in this pressure range. The crystal structure of the record superconductor, and thus its superconductivity, therefore remained not entirely clear.

Now, thanks to the new results published in Nature, we know that atomic quantum fluctuations "glue" the symmetric structure of LaH10 in all the pressure range in which superconductivity has been observed. In more detail, the calculations show that if atoms are treated as classical particles, that is, as simple points in space, many distortions of the structure tend to lower the energy of the system. This means that the classical energy landscape is very complex, with many minima (see figure), like a highly deformed mattress because many people are standing on it. However, when atoms are treated like quantum objects, which are described with a delocalized wave function, the energy landscape is completely reshaped: only one minimum is evident (see figure), which corresponds to the highly symmetric Fm-3m structure. Somehow, quantum effects get rid of everybody in the mattress but one person, who deforms the mattress only in one single point.
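
Schematically (our notation, not the authors' specific formalism), the distinction is between assigning each atomic configuration its bare potential energy and including the kinetic, zero-point contribution of the nuclear wave function, which can wash out shallow classical minima:

```latex
% Schematic contrast between the two landscapes (general quantum mechanics,
% not the paper's specific method): R = nuclear positions, V = potential energy,
% \hat{T} = nuclear kinetic-energy operator, \Psi = nuclear wave function.
E_{\text{classical}}(\mathbf{R}) = V(\mathbf{R}),
\qquad
E_{\text{quantum}} = \langle \Psi \,|\, \hat{T} + V \,|\, \Psi \rangle
```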

Furthermore, the estimates of the critical temperature obtained using the quantum energy landscape agree satisfactorily with the experimental evidence. This further supports the Fm-3m high-symmetry structure as the one responsible for the record superconductivity. The results are especially relevant because they demonstrate that atomic quantum fluctuations can stabilize crystal structures even at pressures more than 100 gigapascals below their classical instability pressure. In fact, this work shows that the "classical" instabilities are due to the enormous interaction between the electrons and the crystal lattice that makes this compound a record superconductor. In other words, quantum effects stabilize crystal structures with substantial superconducting temperatures that would otherwise be unstable. Consequently, new hopes are opened of discovering high-temperature superconducting hydrogen compounds at much lower pressures than expected classically, maybe even at ambient pressure.

Credit: 
University of the Basque Country

Who will lead the global surveillance of antimicrobial resistance via sewage?

For many, wastewater is simply contaminated, bacteria-filled water, but it is actually a valuable research resource. The water contains a wealth of information about, for example, the types of antimicrobial-resistant bacteria and disease-causing microorganisms present in people in the collection area.

An international team of researchers led by the Technical University of Denmark, DTU, has shown that genomic analysis of sewage from around the world can produce important information about the exact types of bacteria that abound in certain areas. As such, analyzing sewage shows great potential as a surveillance tool.

Good addition to existing initiatives

In a peer-reviewed comment in the prestigious scientific journal Science, Professor Frank Møller Aarestrup of the DTU National Food Institute and Professor Mark E. J. Woolhouse of the University of Edinburgh outline both the many benefits of using wastewater in global disease surveillance and the limitations of this approach.

In their view, analyzing sewage is a good addition to existing surveillance initiatives, which predominantly operate on a national or regional level and generate data in relation to sick people. But the question is: Who will carry on the surveillance once their current project wraps up in 2023?

According to the two professors, one model is that the World Health Organization, WHO, and the European Centre for Disease Prevention and Control, ECDC, take over the surveillance. These two actors have the mandate to head up such a programme.

In practice, this could be done once a year, with the countries of the world each collecting a few liters of sewage and sending them for analysis to a central location, such as the WHO Collaborating Centre on Antimicrobial Resistance. As more countries acquire the equipment and expertise to carry out part or all of the analysis, the responsibility for this part of the surveillance can be passed on to each country.

Credit: 
Technical University of Denmark