Tech

Could the heat of the Earth's crust become the ultimate energy source?

image: Here, the height of the building represents the energy state of electrons. Electrons in the semiconductor layer rise to a high-energy state by becoming thermally excited and then transfer to the electron transport layer. Then, they go through an external circuit and reach the counter electrode. Redox reactions take place in the electrolyte layer next to the counter electrode, providing the semiconductor with low-energy electrons. Despite continuous heating, this process eventually stops as the different copper ions in the electrolyte relocate. However, the battery can reverse this situation when the external circuit is opened for a certain duration.

Image: 
Journal of Materials Chemistry A, Sachiko Matsushita

In a world where energy consumption is on the rise, our only hope is the development of new energy-generation technologies. Although currently used renewable energy sources such as wind and solar energy have their merits, there is a gigantic, permanent, and untapped energy source quite literally under our noses: geothermal energy.

Generating electricity from geothermal energy requires devices that can somehow make use of the heat within the Earth's crust. Recently, a team of scientists at Tokyo Tech, led by Dr. Sachiko Matsushita, has made great progress in the understanding and development of sensitized thermal cells (STCs), a kind of battery that can generate electric power at 100 °C or less.

Several methods for converting heat into electric power exist; however, their large-scale application is not feasible. For example, hot-and-cold redox batteries and devices based on the Seebeck effect cannot simply be buried inside a heat source and exploited.

Dr. Matsushita's team had previously reported the use of STCs as a new method for converting heat directly into electric power, building on the design of dye-sensitized solar cells. They replaced the dye with a semiconductor to allow the system to operate using heat instead of light. Figure 1 illustrates the STC, a battery consisting of three layers sandwiched between electrodes: an electron transport layer (ETM), a semiconductor layer (germanium), and a solid electrolyte layer (copper ions). In short, electrons in the semiconductor go from a low-energy state to a high-energy state by becoming thermally excited and are then transferred naturally to the ETM. Afterwards, they leave through the electrode, go through an external circuit, pass through the counter electrode, and reach the electrolyte. Oxidation and reduction reactions involving copper ions take place at both interfaces of the electrolyte, resulting in low-energy electrons being transferred back to the semiconductor layer so that the process can begin anew, thus completing the electric circuit.

However, it was not clear at that time whether such a battery could be used as a perpetual engine or if the current would stop at some point. After testing, the team observed that electricity indeed stopped flowing after a certain time and proposed a mechanism explaining this phenomenon. Basically, current stops because the redox reactions at the electrolyte layer stop owing to the relocation of the different types of copper ions. Most importantly, and also surprisingly, they found out that the battery can reverse this situation by itself in the presence of heat by simply opening the external circuit for some time; in other words, by using a simple switch. "With such a design, heat, usually regarded as low-quality energy, would become a great renewable energy source," states Matsushita.

The team is very excited about their discovery because of its applicability, eco-friendliness, and potential for helping solve the global energy crisis. "There is no fear of radiation, no fear of expensive oil, no instability of power generation like when relying on the sun or the wind," remarks Matsushita. Further refinements to this type of battery will be the aim of future research, with the hope of one day solving humanity's energy needs without harming our planet.

Credit: 
Tokyo Institute of Technology

Tiny vibration-powered robots are the size of the world's smallest ant

image: A micro-bristle-bot is shown next to a US penny for size comparison.

Image: 
Allison Carter, Georgia Tech

Researchers have created a new type of tiny 3D-printed robot that moves by harnessing vibration from piezoelectric actuators, ultrasound sources or even tiny speakers. Swarms of these "micro-bristle-bots" might work together to sense environmental changes, move materials - or perhaps one day repair injuries inside the human body.

The prototype robots respond to different vibration frequencies depending on their configurations, allowing researchers to control individual bots by adjusting the vibration. Approximately two millimeters long - about the size of the world's smallest ant - the bots can cover four times their own length in a second despite the physical limitations of their small size.

"We are working to make the technology robust, and we have a lot of potential applications in mind," said Azadeh Ansari, an assistant professor in the School of Electrical and Computer Engineering at the Georgia Institute of Technology. "We are working at the intersection of mechanics, electronics, biology and physics. It's a very rich area and there's a lot of room for multidisciplinary concepts."

A paper describing the micro-bristle-bots has been accepted for publication in the Journal of Micromechanics and Microengineering. The research was supported by a seed grant from Georgia Tech's Institute for Electronics and Nanotechnology. In addition to Ansari, the research team includes George W. Woodruff School of Mechanical Engineering Associate Professor Jun Ueda and graduate students DeaGyu Kim and Zhijian (Chris) Hao.

The micro-bristle-bots consist of a piezoelectric actuator glued onto a polymer body that is 3D-printed using two-photon polymerization lithography (TPP). The actuator generates vibration and is powered externally because no batteries are small enough to fit onto the bot. The vibrations can also come from a piezoelectric shaker beneath the surface on which the robots move, from an ultrasound/sonar source, or even from a tiny acoustic speaker.

The vibrations move the springy legs up and down, propelling the micro-bot forward. Each robot can be designed to respond to different vibration frequencies depending on leg size, diameter, design and overall geometry. The amplitude of the vibrations controls the speed at which the micro-bots move.

"As the micro-bristle-bots move up and down, the vertical motion is translated into a directional movement by optimizing the design of the legs, which look like bristles," explained Ansari. "The legs of the micro-robot are designed with specific angles that allow them to bend and move in one direction in resonant response to the vibration."
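The frequency-selective addressing described above can be sketched with a toy resonator model. Nothing below comes from the paper: the bot names, resonant frequencies, and quality factor are invented for illustration. Each bot is modeled as a damped harmonic oscillator that responds strongly only when the drive frequency is near its own resonance.

```python
import math

# Toy model: each micro-bristle-bot behaves like a damped resonator.
# Resonant frequencies (Hz) are hypothetical illustration values.
BOTS = {"bot_a": 8_000.0, "bot_b": 12_000.0, "bot_c": 16_000.0}
Q = 20.0  # assumed quality factor; higher Q = sharper frequency selectivity

def response_amplitude(drive_hz: float, resonant_hz: float, q: float = Q) -> float:
    """Relative amplitude of a driven damped oscillator (peaks near q at resonance)."""
    r = drive_hz / resonant_hz
    return 1.0 / math.sqrt((1.0 - r**2) ** 2 + (r / q) ** 2)

def addressed_bots(drive_hz: float, threshold: float = 5.0) -> list[str]:
    """Bots whose response exceeds the threshold move; the rest stay put."""
    return [name for name, f0 in BOTS.items()
            if response_amplitude(drive_hz, f0) >= threshold]
```

Driving at 8 kHz in this toy model moves only `bot_a`, while 12 kHz moves only `bot_b`, mirroring how tuning the vibration frequency singles out individual robots in a swarm.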

The micro-bristle-bots are made in a 3D printer using the TPP process, a technique in which a focused laser polymerizes a monomer resin material. Once the portion of the resin block struck by the laser has been chemically developed, the remainder can be washed away, leaving the desired robotic structure.

"It's writing rather than traditional lithography," Ansari explained. "You are left with the structure that you write with a laser on the resin material. The process now takes quite a while, so we are looking at ways to scale it up to make hundreds or thousands of micro-bots at a time."

Some of the robots have four legs, while others have six. First author DeaGyu Kim made hundreds of the tiny structures to determine the ideal configuration.

The piezoelectric actuators, which use the material lead zirconate titanate (PZT), vibrate when electric voltage is applied to them. In reverse, they can also generate a voltage when vibrated, a capability the micro-bristle-bots could use to power onboard sensors when they are actuated by external vibrations.

Ansari and her team are working to add steering capability to the robots by joining two slightly different micro-bristle-bots together. Because each of the joined micro-bots would respond to different vibration frequencies, the combination could be steered by varying the frequencies and amplitudes. "Once you have a fully steerable micro-robot, you can imagine doing a lot of interesting things," she said.

Other researchers have worked on micro-robots that use magnetic fields to produce movement, Ansari noted. While that is useful for moving entire swarms at once, magnetic forces cannot easily be used to address individual robots within a swarm. The micro-bristle-bots created by Ansari and her team are believed to be the smallest robots powered by vibration.

The micro-bristle-bots are approximately two millimeters in length, 1.8 millimeters wide and 0.8 millimeters thick, and weigh about five milligrams. The 3D printer can produce smaller robots, but with a reduced mass, the adhesion forces between the tiny devices and a surface can get very large. Sometimes, the micro-bots cannot be separated from the tweezers used to pick them up.

Ansari and her team have built a "playground" in which multiple micro-bots can move around as the researchers learn more about what they can do. They are also interested in developing micro-bots that can jump and swim.

"We can look at the collective behavior of ants, for example, and apply what we learn from them to our little robots," she added. "These micro-bristle-bots walk nicely in a laboratory environment, but there is a lot more we will have to do before they can go out into the outside world."

Credit: 
Georgia Institute of Technology

Increased use of partial knee replacement could save the NHS £30 million per year

New research from a randomised clinical trial published today in The Lancet and funded by the National Institute for Health Research (NIHR) shows that partial knee replacements (PKR) are as good as total knee replacements (TKR), whilst being more cost effective.

Results from the TOPKAT study (Total or Partial Knee Arthroplasty Trial), led by researchers at the University of Oxford, suggest that over five years PKR has a similar, if not slightly better, clinical outcome than TKR. More importantly, the economic benefit of using PKR is substantial: it could save the NHS about £30 million per year if PKR usage increased by 31 percentage points. At present only 9% of joint replacements are PKR, yet it is thought that around 40% of patients could be suitable candidates.

Chief investigator Professor David Beard, Rosetrees/Royal College of Surgeons Director of the Surgical Interventional Research Unit at the University of Oxford, said: "Despite many previous studies and considerable data, we have never had a sufficiently large randomised clinical trial to answer this important question. TOPKAT has now definitively shown us that both operations provide benefit and are worthwhile but, given the option, PKR is probably the implant of choice - providing sufficient expertise exists to implant it."

He adds: "An important caveat is that the surgeries in the trial were performed by well trained, experienced surgeons in both groups. If any recommendation to increase the use of PKR is made it will have to be accompanied by provision of adequate training and expertise for surgeons undertaking the technically more demanding PKR procedure."

Over 300,000 knee replacements are performed in the UK each year, mainly for osteoarthritis. Surgeons and patients face a choice of which type of operation to perform or undergo for medial compartment (one area in the joint) arthritis.

TKR replaces all parts of the joint whereas a PKR replaces only the diseased area and retains as much soft tissue as possible. TKR is fully established and used most often whereas PKR is less common and has been in widespread use for a shorter period.

The multidisciplinary team from the Nuffield Department of Orthopaedics, Rheumatology and Musculoskeletal Sciences, University of Oxford, and the Health Services Research Unit at the University of Aberdeen, with the support of the British Association of Knee Surgeons (BASK), found the failure or revision rate for PKR to be identical to that of TKR (both with revision rates of 4%). These findings contrast with previous research based on national registries, which showed PKR to have higher revision rates.

The study was started in 2010 and 528 patients from 27 hospital sites were followed up at five years with the help of Trial Manager, Dr Lottie Davies. They were asked questions about pain, function, activity levels and satisfaction. Most of the clinical measures were very similar between the two types although more patients having a PKR reported they would have the operation again compared to those having a TKR. The cost effectiveness findings (patient improvement in relation to cost) were the most persuasive to show that PKR should probably be offered to more patients.

Professor Beard says: "The trial is a testament to everyone's hard work and will help patients in the future, adding to our increasing body of quality evidence for surgical procedures."

Professor Marion Campbell, co-investigator from the University of Aberdeen said, "The results have been sent to NICE and NHS commissioning bodies will now consider the findings to see how they might adjust or change the delivery of knee replacement in the UK."

Credit: 
University of Oxford

Crystalline 'artificial muscle' makes paper doll do sit-ups

image: This foil paper doll, as seen above, is able to move thanks to a new material.

Image: 
American Chemical Society

Scary movies about dolls that can move, like Annabelle and Chucky, are popular at theaters this summer. Meanwhile, a much less menacing animated doll has chemists talking. Researchers have given a foil "paper doll" the ability to move and do sit-ups with a new material called polymer covalent organic frameworks (polyCOFs). They report their results in ACS Central Science. Watch a video of the material in action here.

Scientists make conventional COFs by linking simple organic building blocks, such as carbon-containing molecules with boronic acid or aldehyde groups, with covalent bonds. The ordered, porous structures show great potential for various applications, including catalysis, gas storage and drug delivery. However, COFs typically exist as nano- or micro-sized crystalline powders that are brittle and can't be made into larger sheets or membranes that would be useful for many practical applications. Yao Chen, Shengqian Ma, Zhenjie Zhang and colleagues wondered if they could improve COFs' mechanical properties by using linear polymers as building blocks.

The researchers based their polyCOF on an existing COF structure, but during the compound's synthesis, they added polyethylene glycol (PEG) to the reactants. The PEG chains bridged the pore space of the COF, making a more compact, cohesive and stable structure. In contrast to the original COF, the polyCOF could be incorporated into flexible membranes that were repeatedly bent, twisted or stretched without damage. To demonstrate how polyCOFs could be used as an artificial muscle, the team made a doll containing the membrane as the waist and aluminum foil as its other parts. Upon exposure to ethanol vapors, the doll sat up; when the vapors were withdrawn, it lay down. The researchers repeated these actions several times, making the doll do "sit-ups." The expansion of polyCOF pores upon binding the gas likely explains the doll's calisthenics, the researchers say.

Credit: 
American Chemical Society

'Semi-synthetic' bacteria churn out unnatural proteins

image: Researchers identified this unnatural base pair as being optimal for information storage in a semi-synthetic organism.

Image: 
Adapted from <i>Journal of the American Chemical Society</i> <b>2019</b>, DOI: 10.1021/jacs.9b02075

Synthetic biologists seek to create new life with forms and functions not seen in nature. Although scientists are a long way from making a completely artificial life form, they have made semi-synthetic organisms that have an expanded genetic code, allowing them to produce never-before-seen proteins. Now, researchers reporting in the Journal of the American Chemical Society have optimized semi-synthetic bacteria to efficiently produce proteins containing unnatural amino acids.

All of Earth's natural life forms store information using a four-letter genetic code consisting of the nucleotides deoxyadenosine (dA), deoxyguanosine (dG), deoxycytidine (dC), and deoxythymidine (dT). Within the DNA double helix, dA pairs with dT, and dG with dC, to form the "rungs" of the DNA ladder. Recently, researchers have made synthetic nucleotides that can pair up with each other. When they placed these unnatural nucleotides into genes, bacteria could replicate the DNA and convert the sequences into RNA and then proteins that contained unconventional amino acids. However, bacteria often cannot use these synthetic sequences as efficiently as the natural ones. Therefore, Lingjun Li, Floyd Romesberg and colleagues wanted to optimize the unnatural base pairs to improve protein production.

The researchers tested different combinations of unnatural base pairs in E. coli and observed which ones were replicated most efficiently and produced the highest levels of a protein. Some of the synthetic base pairs had been tested before, whereas others were new variations. The team then used these optimized base pairs to demonstrate, for the first time, a semi-synthetic organism that could make a protein containing multiple unnatural amino acids.

Credit: 
American Chemical Society

Radiation in parts of Marshall Islands is higher than Chernobyl

image: The United States used the Marshall Islands as a testing ground for 67 nuclear weapon tests from 1946 to 1958, causing human and environmental catastrophes that persist to this day.

Image: 
World Future Council

Radiation levels in some regions of the Marshall Islands in the central Pacific, where the United States conducted nuclear tests during the Cold War, are far higher than in areas affected by the Chernobyl and Fukushima nuclear disasters, according to new research from Columbia University.

Three studies published July 15 in Proceedings of the National Academy of Sciences (PNAS) by a Columbia research team, led by Emlyn Hughes and Malvin Ruderman from the Columbia Center for Nuclear Studies, showed that the concentration of nuclear isotopes on some of the islands was well above the legal exposure limit established in agreements between the U.S. and Republic of the Marshall Islands. The studies measured soil samples, ocean sediment and a variety of fruit.

Nearly 70 nuclear bombs the United States detonated between 1946 and 1958 left widespread contamination on the islands, a chain of atolls halfway between Australia and Hawaii. The largest nuclear detonation, "Castle Bravo," in 1954 at Bikini Atoll, was 1,000 times more powerful than either of the bombs dropped on the Japanese cities of Hiroshima and Nagasaki.

The Marshall Islands have experienced rapid growth since the 1960s. Most of the nation's residents live on two crowded islands and are unable to return to their home islands because of nuclear contamination. Nuclear fallout from the tests is most concentrated on the Bikini, Enewetak, Rongelap and Utirik atolls.

"Based upon our results, we conclude that to ensure safe relocation to Bikini and Rongelap Atolls, further environmental remediation... appears to be necessary to avoid potentially harmful exposure to radiation," wrote the study authors, who also include Ivana Nikolic Hughes, associate professor of chemistry at Columbia.

Credit: 
Columbia University

Pokémon-like card game can help teach ecology: UBC research

image: This is a Phylo card game playing deck featuring species found in British Columbia, Canada.

Image: 
Megan Callahan

Playing a Pokémon-like card game about ecology and biodiversity can result in broader knowledge of species and a better understanding of ecosystems than traditional teaching methods, like slideshows, according to new research from the University of British Columbia.

An open-source project launched in 2010 by UBC biologist David Ng and collaborators, the Phylo Trading Card Game works similarly to Pokémon trading cards, but uses real organisms and natural events instead of imaginary characters. While the Phylo project has proven immensely popular around the world, this is the first study to have tested its efficacy as a teaching and learning tool.

Researchers examined how people who played the game retained information about species and ecosystems, and how it impacted their conservation behavior. They compared the results to people who watched an educational slideshow, and those who played a different game that did not focus on ecosystems.

"Participants who played the Phylo game weren't just remembering iconic species like the blue whale and sea otter, but things like phytoplankton, zooplankton and mycorrhizal fungi," said lead author Meggie Callahan, a PhD candidate in the Institute for Resources, Environment and Sustainability. "They would say things like, 'I really needed this card because it was the base of my ecosystem,' or, 'When my partner destroyed my phytoplankton it killed all of my chain of species.' Obviously, the game is sending a strong message that is sticking with them."

Participants in both the Phylo Game group and slideshow group improved their understanding of ecosystems and species knowledge, but those who played the Phylo Game were able to recall a greater number of species. They were also more motivated to donate the money they received to preventing negative environmental events, such as climate change and oil spills. (Study participants were rewarded with a toonie [$2] or a loonie [$1], and were given options to donate the money toward different causes.)

"The message for teachers is that we need to use all possible ways to engage the public and get them interested in and caring about the issues of species extinctions and ecosystem destructions," said Callahan. "Something as simple as a card game can be adapted to any environment, from classrooms to field-based workshops, in any location. Our study shows that this can be a really beneficial way of learning about species, and their ecosystems and environments."

Researchers used a deck created for the Beaty Biodiversity Museum that focused on British Columbia's ecosystems, but there are many other versions of the Phylo cards in circulation around the world. A global community of artists, institutions, scientists and game enthusiasts has created numerous iterations of the game--including decks featuring west coast marine life, dinosaurs, microbes, and even a Women in Science version created by Westcoast Women in Engineering, Science and Technology.

"We have 20 to 30 decks and more coming every year," said Ng. "Games have a way of enticing anybody."

All Phylo decks are open-source and can be downloaded for free from the Phylo website. The Beaty deck, used in the study, is also available at the Beaty Biodiversity Museum gift shop.

Credit: 
University of British Columbia

Fiber-optic vibration sensors could prevent train accidents

WASHINGTON -- Researchers have developed new sensors for measuring acceleration and vibration on trains. The technology could be integrated with artificial intelligence to prevent railway accidents and catastrophic train derailments.

"Each year, train accidents lead to severe injuries and even deaths," said research team leader Hwa-yaw Tam, from The Hong Kong Polytechnic University. "Our fiber accelerometers could be used for real-time monitoring of defects in the railway track or the train to pinpoint problems before an accident occurs."

The researchers describe their new accelerometers in The Optical Society (OSA) journal Optics Express. The devices can detect frequencies more than double those of traditional fiber-optic accelerometers, making them suitable for monitoring wheel-rail interactions. The durable sensors contain no moving parts and work well in the noisy and high-voltage environments found in railway applications.

"In addition to railway monitoring, these new accelerometers can be utilized in other vibration monitoring applications, for example, structural health monitoring for buildings and bridges and vibration measurements of aircraft wings," said Zhengyong Liu.

All-optical railway sensing

For more than 15 years, the researchers have been working on condition-monitoring systems that use an all-optical sensing network to continuously monitor critical railway components. These systems can help replace inefficient and costly scheduled railway maintenance routines with predictive maintenance based on actual conditions. Systems developed by the researchers have been installed in Hong Kong and Singapore.

"An all-optical sensing network has many advantages as it is immune to electromagnetic interference, has long transmission distance and the sensors don't require electricity," said Liu. "However, there is a need for fiber-optic sensors that are optimized to measure different parameters in railway systems."

The fiber-optic accelerometers typically used in condition-monitoring systems are based on fiber Bragg gratings (FBGs) and cannot be used to detect vibrations higher than 500 Hz. Although this is adequate for most railway applications, it is not sufficient to measure the wheel-rail interactions that are an important source of track wear.

To overcome this problem, the researchers designed a new fiber-optic accelerometer that uses a special optical fiber known as a polarization-maintaining photonic crystal fiber that is coiled into the shape of a disc only 15 millimeters in diameter. The coiled fiber is glued between a stainless-steel substrate and a cylindrical mass block. When a vibration occurs, the mass block will press on the coiled fiber at a frequency matching that of the vibration. This external force causes the wavelength of light in the fiber to shift in a measurable way.

"This interferometric configuration uses changes in the light inside the fiber to acquire precise information about the vibrations," said Liu. "Installing these accelerometers on the undercarriage of an in-service train allows them to monitor vibrations that would indicate defects in the track. They can also be used to detect problems in overhead lines used to power trains."
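The readout idea — vibration produces a proportional wavelength shift that can be demodulated and analyzed for the frequencies of interest — can be illustrated with a minimal signal-processing sketch. The sampling rate, vibration frequency, and noise level below are invented for illustration, not taken from the Optics Express paper.

```python
import numpy as np

def dominant_vibration_hz(wavelength_shift: np.ndarray, sample_rate_hz: float) -> float:
    """Estimate the dominant vibration frequency from a wavelength-shift trace.

    The interferometric readout converts vibration into a proportional
    wavelength shift, so the vibration spectrum can be recovered by
    taking an FFT of the demodulated signal.
    """
    x = wavelength_shift - wavelength_shift.mean()      # remove DC offset
    spectrum = np.abs(np.fft.rfft(x))
    freqs = np.fft.rfftfreq(x.size, d=1.0 / sample_rate_hz)
    return float(freqs[spectrum.argmax()])

# Synthetic example: a 1.2 kHz vibration sampled at 50 kHz with added noise
# (both values are made up for illustration, not from the paper).
np.random.seed(0)
fs = 50_000.0
t = np.arange(0, 0.1, 1.0 / fs)
signal = 0.8 * np.sin(2 * np.pi * 1_200 * t) + 0.05 * np.random.randn(t.size)
```

Running `dominant_vibration_hz(signal, fs)` on the synthetic trace recovers a peak near 1.2 kHz, the kind of spectral feature a condition-monitoring system would flag against a baseline.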

Comparison field tests

After thoroughly testing prototypes of the new accelerometer in the laboratory, the researchers carried out a field test by installing the device on an in-service train. They also installed an FBG-based accelerometer and a piezoelectric accelerometer for comparison.

They found that the new fiber accelerometer detected acceleration in a manner similar to the piezoelectric accelerometer. However, piezoelectric sensors require expensive shielded cables to reduce the effects of electromagnetic interference noise. Because the FBG-based accelerometer can't operate well at high frequencies, noise concealed some of the useful vibration information.

"Our results showed that our new accelerometers perform considerably better than existing accelerometers used for monitoring acceleration in trains," Liu said.

In this work, the researchers used a commercial polarization-maintaining photonic crystal fiber. They have since designed and fabricated a new type of fiber with smaller outer diameters, lower bending losses and higher birefringence, all of which would allow them to build a smaller accelerometer with even higher sensitivity.

"These new accelerometers could open new sensing and monitoring possibilities by providing data that supports implementation of artificial intelligence in the railways industry," said Tam. "Although railway monitoring is a good example of how fiber-optic sensing can be combined with artificial intelligence, we believe this combination is also promising for a number of other industries and applications."

Credit: 
Optica

Harvesting energy from the human knee

image: This is a diagram of the slider-crank mechanism that generates energy during the knee's motion.

Image: 
Gao et al.

WASHINGTON, D.C., July 17, 2019 -- Imagine powering your devices by walking. With technology recently developed by a group of researchers at the Chinese University of Hong Kong, that possibility might not be far out of reach.

The group describes the technology in Applied Physics Letters, from AIP Publishing. An energy harvester is attached to the wearer's knee and can generate 1.6 microwatts of power while the wearer walks without any increase in effort. The energy is enough to power small electronics like health monitoring equipment and GPS devices.

"Self-powered GPS devices will attract the attention of climbers and mountaineers," said author Wei-Hsin Liao, professor in the department of mechanical and automation engineering.

The researchers used a special smart macrofiber material, which generates energy from any sort of bending it experiences, to create a slider-crank mechanism -- similar to what drives a motor. The authors chose to attach the device to the knee due to the knee joint's large range of motion, compared to most other human joints.

"These harvesters can harvest energy directly from large deformations," Liao said.

Because of the continuous back-and-forth motion the material encounters when the wearer walks, the device bends and generates electricity every time the knee flexes. This means the harvester can "capture biomechanical energy through the natural motion of the human knee," according to Liao.

Previous wearable energy harvesters took advantage of the vibration caused in the device as a result of motion, which comes with drawbacks regarding efficiency.

"The frequency of human walking is quite slow, which significantly decreases the energy-harvesting capability," Liao said. Because the group's device uses a different method, it bypasses this limitation.

The prototype weighs only 307 grams (0.68 pounds) and was tested on human subjects walking at speeds from 2 to 6.5 kilometers per hour (about 1 to 4 miles per hour). The researchers compared the wearers' breathing patterns with and without the device and determined that the energy required to walk was unchanged, meaning that the device is generating power at no cost to the human.

The researchers note the advantages of an efficient, wearable energy harvester and look towards future commercialization of the technology.

"Self-powered equipment can enable users to get rid of the inconvenient daily charge," Liao said. "This energy harvester would promote the development of self-powered wearable devices."

Credit: 
American Institute of Physics

A new level of smart industrial robots control and management reached at FEFU

Roboticists from Far Eastern Federal University (FEFU), together with colleagues from the Far Eastern Branch of the Russian Academy of Sciences (FEB RAS), have developed a command-and-control plugin for intelligent industrial robots. The new software allows the robots to build high-quality 3D computer models of workpieces quickly, precisely, and in a fully automated mode. The related article was published in the International Journal of Mechanical Engineering and Robotics Research.

The solution developed by the FEFU scientists addresses the problem of hard-coded programming in industrial robots, which prevents them from adapting to changing working conditions. It eliminates the need for time-consuming manual re-adjustment of such robots before a production launch.

Thanks to the new software, in-process workpieces can be fixed on universal positioning devices rather than on large-scale equipment designed for rigid fixation. The plugin tolerates certain positioning aberrations and deformations while still saving time. The robot takes all spatial characteristics of a workpiece into consideration, automatically fixes any scanning errors, and obtains a high-quality model for carrying out further actions as indicated by its control programs. Fixation errors no longer prevent the robot from working correctly.

"When workpieces or large scale details are scanned in a production department, the point clouds used by the machine to build up 3D CAD models often have critical gaps. Various factors, such as flares or distortions may cause that. As a result, a machine is unable to recognize and process a workpiece correctly. We have suggested a solution to this problem allowing a robot to detect such gaps automatically and to carry out additional scanning of the missing areas. We also managed to avoid the resource-consuming numerical processing of massive datasets that 3D images are made of. Our software is fast and systemic and helps to obtain high-quality models and further process them in due order," said Alexander Zuev, Assistant Professor at the Department of Automation and Control, FEFU School of Engineering.

According to the scientist, the speed of data processing increases due to a special set of mathematical methods implemented in the software. The mathematical apparatus works with a large array of points forming a 3D image, rearranges them on a plane, and then quickly analyzes possible scanning errors.
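The article does not publish the team's implementation, but the general idea of flattening a scanned point cloud onto a plane and then scanning for holes can be sketched as follows. This is a simplified, hypothetical illustration: the function name, grid resolution, and gap test are assumptions, not the FEFU software's actual method.

```python
import numpy as np

def find_scan_gaps(points, grid_size=50):
    """Project a 3D point cloud onto its best-fit plane and flag empty
    grid cells lying between scanned regions as likely scanning gaps."""
    # Best-fit plane via PCA: the two largest principal axes span the plane.
    centered = points - points.mean(axis=0)
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    uv = centered @ vt[:2].T  # 2D coordinates of each point in the plane

    # Rasterize the planar points onto a coarse occupancy grid.
    mins, maxs = uv.min(axis=0), uv.max(axis=0)
    idx = ((uv - mins) / (maxs - mins + 1e-9) * (grid_size - 1)).astype(int)
    occupied = np.zeros((grid_size, grid_size), dtype=bool)
    occupied[idx[:, 0], idx[:, 1]] = True

    # Any empty cell between the first and last occupied cell of a row is
    # an interior hole, i.e. a candidate region for additional scanning.
    gaps = []
    for i in range(grid_size):
        cols = np.flatnonzero(occupied[i])
        if cols.size >= 2:
            for j in range(cols[0], cols[-1] + 1):
                if not occupied[i, j]:
                    gaps.append((i, j))
    return gaps
```

Working on a coarse occupancy grid rather than the raw point array is one way to avoid the "resource-consuming numerical processing of massive datasets" the researchers mention: the gap test touches only grid_size² cells regardless of how many millions of points were scanned.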

A demonstration robot working on the basis of this new technology is currently being launched in FEFU to show the representatives of the industry all processing possibilities of such devices.

The development of smart industrial robots is a major international trend. The machines are equipped with technical vision systems that allow them to understand what happens in their working zones and adjust their control programs to complete the required actions in the most efficient manner. Although industrial robots are being adopted more slowly in Russia than in the rest of the world, the team hopes that their development will help the country become a leader in this cutting-edge field.

Credit: 
Far Eastern Federal University

Protein oxidation reveals the environmental pollution level in Doñana National Park

The heart of a protected area like Doñana National Park is supposed to be clean and free of pollution compared with, say, a big city's downtown. However, it does not always work out that way, since natural environments are receiving a growing load of pollutants. In the specific case of Doñana, researchers suspect this is due to its geographical location: the park is close to Huelva's chemical park and surrounded by areas of intense agricultural activity.

Measuring the effects of toxic compounds on the organisms that inhabit the area in order to find solutions early on is the task taken on by a team from the Department of Biochemistry and Molecular Biology at the University of Cordoba, led by José Alhama and Carmen Michán.

To gauge the level of environmental pollution in Doñana, they assessed the biological effects of these pollutants on western Mediterranean mice, unprotected natural inhabitants of the park that serve as terrestrial bioindicators of the toxins present in the area.

Specifically, they studied oxidative damage to proteins, the main targets of oxidative stress. This kind of stress is one of the most important effects of pollutants and is also linked to a range of diseases.

Technicians Carlos Fuentes and Eduardo Chicano, from the Central Service for Research Support (abbreviated to SCAI in Spanish) and the Maimonides Institute of Biomedical Research (abbreviated to IMIBIC in Spanish) respectively, performed mass spectrometry analyses for redox proteomics (the study of the set of proteins damaged by oxidative stress).

A broad survey of which proteins are oxidized reveals which pollutants affect key biological processes. Examples of these processes are protein turnover, which repairs oxidative damage, and detoxification processes that take place in the liver; pollutants can damage this organ significantly, influencing how long toxic effects accumulate and persist in the organism.

A related study of key bioindicators in aquatic ecosystems, such as red crabs, had already revealed high oxidation levels in rice fields near Doñana; the new results also warn of the presence of pollutants in the heart of the park itself.

This kind of analysis allows for detecting whether pollution is present and how this affects the organism in question, in order to tackle it before it reaches higher levels of organization, a situation that could lead to greater negative or even irreversible effects.

Credit: 
University of Córdoba

Head start accountability systems may be missing how classroom quality varies within preschool centers

Main Findings:

The high-stakes accountability policies used to monitor the quality of Head Start preschool centers may miss important variation in classroom quality within centers, which could lead to incorrect representations of center quality and inaccurate decisions about which programs need to re-compete for their funding.

Federal and state accountability systems randomly select a portion of classrooms within a given center to determine center-wide quality, under the assumption that quality is consistent across a center's classrooms. However, across Head Start programs, the authors found that one third to one half of the variation in quality is due to differences between classrooms within a center, as opposed to quality differences from center to center.

The authors' analysis suggests that 37 percent of centers in their sample would have received different funding decisions from the major accountability system for Head Start, the Head Start Designation Renewal System (DRS), depending on which half of classrooms in a center were randomly included in the assessment of quality.
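The scale of this sampling problem is easy to reproduce in a toy Monte Carlo. The sketch below is purely illustrative and is not the authors' analysis: the within-center variance share, the pass/fail cutoff, and the classroom counts are assumptions. It generates centers whose classrooms vary in quality, rates each center twice using the two random halves of its classrooms, and counts how often the funding decision flips depending on which half was drawn.

```python
import numpy as np

rng = np.random.default_rng(0)

def flip_rate(n_centers=2000, n_classrooms=8, within_share=0.4, cutoff=0.0):
    """Fraction of centers whose pass/fail decision depends on which
    random half of their classrooms is observed.  `within_share` is the
    fraction of total quality variance that lies between classrooms
    *within* a center (the study estimates one third to one half)."""
    center_sd = np.sqrt(1 - within_share)   # between-center variation
    class_sd = np.sqrt(within_share)        # within-center variation
    flips = 0
    for _ in range(n_centers):
        center_mean = rng.normal(0, center_sd)
        rooms = rng.normal(center_mean, class_sd, n_classrooms)
        half = rng.permutation(n_classrooms)[: n_classrooms // 2]
        other = np.setdiff1d(np.arange(n_classrooms), half)
        # The center "fails" if its sampled mean quality falls below the cutoff.
        if (rooms[half].mean() < cutoff) != (rooms[other].mean() < cutoff):
            flips += 1
    return flips / n_centers
```

With within-center variation set to zero the flip rate is zero by construction, since any half of the classrooms then represents the center perfectly; raising the within-center share produces a substantial fraction of flipped decisions, which is the mechanism behind the paper's 37 percent figure.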

Furthermore, average center-level quality, as determined by current accountability policies, was not found to be related to measures of children's development, calling into question the common approach of averaging classroom quality within centers to represent children's experiences. Instead, differences in classroom instructional quality within a center were significantly related to differences in children's academic and social skills.

"Head Start has a long history of applying research to program improvement," said study coauthor Terri J. Sabol, an assistant professor of human development and social policy at Northwestern University. "Indeed, the major accountability system for Head Start, the Head Start Designation Renewal System, drew from best practices in research on the importance of classroom quality, including structural and process quality, for children. However, our findings suggest that there are still key issues to address in how we measure quality and use measurements to hold programs accountable and to encourage quality improvements."

"Head Start has done an impressive job investing in and improving quality over the past decades and, by and large, is meeting the needs of young children," said study coauthor Emily C. Ross, who is currently a policy fellow for the American Association for the Advancement of Science and the Society for Research in Child Development. "This study speaks specifically to concerns about how accountability systems monitor quality. It suggests the importance of ensuring that quality is measured and represented accurately so that, ultimately, all children can have a high-quality experience."

Details:

Head Start is the nation's largest publicly funded preschool program for low-income children, serving more than 1 million children each year with a federal annual budget of more than $9 billion. It is administered by the U.S. Department of Health and Human Services, which awards grants to individual agencies that operate local center-based programs. Nationwide, there are currently about 1,700 Head Start agencies, which provide program services to about 15,000 Head Start centers with more than 41,000 classrooms.

Many have argued that a way to boost and sustain the effectiveness of Head Start is by regulating quality to improve children's direct experiences with the program. Head Start programs are increasingly subject to high-stakes accountability systems.

Most policies designed to regulate and rate Head Start quality--including the DRS and state policy quality initiatives such as the Quality Rating and Improvement Systems (QRIS)--do so at the level of the agency or center.

Both Head Start DRS and QRIS randomly select a subset of classrooms within a given center (or agency) to represent quality, on the assumption that a subset of classrooms selected within a center generalizes to all classrooms within the center. Yet variation among classrooms within the same center and agency is possible under the current guidelines, which are set by Congress through the Head Start Program Performance Standards.

For their study, the authors examined variation in common indicators of classroom quality in current accountability systems--including class size, child-adult ratio, teacher education, the global environment, and instructional support--using a large, nationally representative sample of Head Start centers.

The authors assessed data on 196 centers, 596 classrooms, and 4,130 students from the Department of Health and Human Services' 2006 and 2009 cohorts of the Head Start Family and Child Experiences Survey (FACES), to determine the extent to which classroom quality varies within centers, whether current accountability practices provide an accurate or fair representation of center quality, and how classroom- and center-level quality relates to children's outcomes.

The analysis was not an exact replication of the entire Head Start DRS, but rather a simulation focused on a potential issue about quality ratings that has implications for how accountability systems are structured. Also, the issue about how classrooms are selected in centers or agencies applies only to the larger centers/agencies that do not assess quality for all classrooms but, rather, randomly select a subset of classrooms to represent quality.

"Despite decades of research on Head Start, there is surprisingly little research on the extent to which classroom quality varies within Head Start centers," said Sabol. "Our results indicate that a number of current choices in how Head Start centers are evaluated may interfere with fairness and accuracy in accountability systems."

"We recognize the cost and time burden of collecting data for all classrooms within a center," Sabol said. "However, with the high stakes of many accountability systems, it is important to get the structure of the systems correct. The next step for the research field is to figure out how many classrooms within a center should be sampled to get an accurate representation of quality."

"Teachers and classrooms matter," Sabol said. "Although it is important to select a high-quality school, our results suggest that it's also important--if not more important--to find high-quality teachers within schools. These findings apply to school administrators, as well, in terms of offering support services to ensure that all teaching within a school is of high quality."

Credit: 
American Educational Research Association

Giving a chip about masa

image: Researchers use floating density to measure kernel density, an important property in masa production.

Image: 
Mark Holmes

Products we commonly buy at the supermarket, such as tortillas and corn chips, are made from food grade corn. The corn is grown, harvested, bought by a food company, turned into masa (dough from ground corn) through a chemical process, and then made into our favorite products.

Each of these important steps has implications for the next -- and some scientists are calling for more research to make each step better to benefit both companies and consumers.

"Breeding, production, and processing of food grade corn is a massive industry," explains Candice Hirsch from University of Minnesota. "Yet, there is limited knowledge on each of these steps."

She adds that each step of this value chain spans many scientific areas. This results in the information being spread across scientists who don't regularly communicate with each other. To start tackling this problem, Hirsch and her team reviewed knowledge on making corn into food products. They used information from both universities and industry.

The researchers laid out the importance of corn quality and masa quality. Hirsch says that the breeding of food grade corn receives relatively few resources. However, this corn is made into products we eat. Better quality corn will provide a better product to consumers.

Better corn would have a higher yield, costing companies less money and possibly making the product cheaper. It could also increase the quality and consistency of the products we buy.

The hardness of the corn kernels, for example, is important. It can affect how well the corn ships and how many of the kernels crack during shipping. These cracks then affect the moisture uptake while the masa is being made.

Combined with other qualities of the kernel, such as starch levels, the amount of moisture taken up by the ground corn can impact the masa.

"The quality of grain and masa is extremely important to the final product quality," Hirsch explains. "If the consistency of the masa is not correct, there will be consequences for the texture and taste of the final products."

Hirsch and her colleagues would like to see researchers explore all of these areas to better understand how to breed and grow the best corn for making high quality masa. The work would involve plant breeders, agronomists, chemists, food scientists, production specialists, and many others.

"Ideally we would like to determine which attributes are best to allow us to breed better corn, and also come up with methods to be able to quickly test these attributes," Hirsch says. "Another application is doing screening so companies buying corn can determine if a shipment has the necessary attributes to make a high-quality product."

She adds that the collaboration between University of Minnesota, PepsiCo, and Corteva was critical in reviewing research in this area. In working together, they were able to define what was known and unknown across the value chain, and how to fill the gaps.

Additionally, the public is interested in this work because we like to know where our food comes from. The researchers' review provides a look at how corn chips are made. It also identifies factors that affect taste, texture, and nutritional aspects of chips.

"I have worked in a number of research activities that involve improving raw plant material for direct human consumption," Hirsch says. "I find it very rewarding. It is very relatable to the general public, which makes it a great way to connect with people."

Credit: 
American Society of Agronomy

High magnetic field of 10T during activated carbon production improves micropore capacity by 35%

image: (a) Polarizing microscopy image of carbonized coal tar pitch prepared in the absence and presence of magnetic fields of 2 T and 10 T. All figures show the diagonal position and extinction position. (b) Schematic drawing of a magnetic field effect on the carbonization process of coal tar pitch.

Image: 
Copyright 2019, Springer Nature, Licensed under CC BY 4.0

Carbon materials such as nanotubes, graphene, activated carbon, and graphite are in high demand, and demand is expected to keep growing because carbon materials have many beneficial uses and new applications are still being discovered. They are essential to air and water purification, electrodes for metal refining, and the manufacture of pencils and lubricants. The quality of the carbon source (coal tar pitch), the temperature, the atmosphere, and the preparation methods have a great impact on the properties of carbon materials. This is because the structure of carbon materials affects their properties, and that structure can be manipulated during production. New control parameters for production will lead to further refinement of the functionality of carbon materials.

All materials interact to some degree with magnetic fields, whether or not they are magnetic. There have been many research explorations into methods of orienting graphene and nanotubes in a magnetic field. However, there have been no reports of experiments using a high magnetic field (HMF) during the preparation of carbon materials to manipulate their structure. The current research was made possible by superconducting magnets that can create magnetic fields of 10 tesla and beyond.

The research team, led by Atom Hamasaki of the Institute of Science at Shinshu University, set out to create more efficient forms of activated carbon. Using superconducting magnets, they coaxed the activated carbon precursor, carbonized coal tar pitch, to form crystallites during its mesophase (liquid crystal) stage, a process similar to making graphite, thereby increasing the pore volume of the resulting activated carbon by 35%.

The HMF encourages crystallites to form, and more crystallites create more crevices where chemicals can come into contact with the activated carbon. Many other materials with negative magnetic susceptibility could also be manufactured with this HMF procedure to achieve better-controlled properties.

Credit: 
Shinshu University

New study works with historically disenfranchised communities to combat sudden oak death

image: Locations and study trees were selected through a collaborative process to ensure that sampling would be conducted to span the range of tanoak utilization on the Hoopa and Yurok reservations. Cuttings from canopy branches were removed in the field and transported to the laboratory on ice where they were challenged with the Phytophthora ramorum pathogen, incubated for 2 weeks in a high-humidity environment, and assessed through high-contrast image analysis.

Image: 
Richard C. Cobb, Noam Ross, Katherine J. Hayden, Catherine A. Eyre, Richard S. Dodd, Susan J. Frankel, Matteo Garbelotto, and David M. Rizzo

Science often reflects the priorities of dominant industries and ignores the needs of disenfranchised communities, resulting in the perpetuation of historical injustices. One team of scientists in Northern California studying sudden oak death, which poses a threat to the longstanding cultural heritage of several indigenous tribes, sought to chip away at this cycle through a new collaboration with these communities.

Sudden oak death has killed 50 million trees since emerging in 1995 and is an impending threat to the tanoak trees in Northern California. Tanoak, an evergreen tree, is native to the western United States and prolific along the California coast. In these regions it is revered by many indigenous tribes, including the Hoopa and Yurok, who value the tree as a source of food and medicine. The tanoak also holds an important role in their religious practices.

To develop resistance measures that represented the interests and unique perspectives of these tribes, the team of scientists worked with the tribal forest managers and community members to identify proactive ways to prevent the spread of sudden oak death. Through this partnership, the scientists were able to publish a paper bringing together indigenous land management, ecology, genetics, pathology, and epidemiology.

"I mostly work on the mathematical side," said co-author Noam Ross, who created the simulations and software used in the paper, "but it's important how that can be used to understand and manage the cultural and social effects of this forest disease, which has had significant impacts in these communities."

The results of this study, published in "Promise and Pitfalls of Endemic Resistance for Cultural Resources Threatened by Phytophthora ramorum" in Phytopathology, underscore the importance of collaboration between different interest groups. "This collaborative approach holds valuable lessons for researchers and managers working to empower a broader cross section of stakeholders," says author Richard Cobb.

More details about this study can be found in Phytopathology Volume 103, Number 3, published May 8, 2019. Phytopathology is an international journal publishing articles on fundamental research that advances understanding of the nature of plant diseases, the agents that cause them, their spread, the losses they cause, and measures used to control them.

Credit: 
American Phytopathological Society