Short stories and links shared by the scientists in our community
Mythical sea serpents were probably marine animals tangled in fishing gear
Gordon Johnson on Pixabay
Stories of sea serpents and other ocean-dwelling monsters are long-standing myths. Now, in research published in the journal Fish and Fisheries, one scientist has uncovered the culprit behind historical sea serpent sightings in the British Isles.
After poring over more than 200 reports of sea-serpent sightings made between 1809 and 2000, Robert France from Dalhousie University concluded that accounts of a “many-humped” monster lurking near the water's surface in the British Isles were actually early sightings of marine animals entangled in fishing gear.
France scoured sightings published in historical newspapers, scientific journals, natural history books, cryptozoology texts, and even legally sworn testimonials. While sightings varied substantially, there were some common threads: the serpent's body stretched for tens of meters (up to 100 m), formed many coils or humps at the surface, and frequently had hair or whiskers. Many reports described the serpents moving rapidly or thrashing at the water's surface.
But France argues these descriptions conflict with all known marine animals, living and extinct, and are more easily explained by a marine animal dragging lines of rope and buoys behind it.
Today, the synthetic materials that give fishing gear its strength and durability weave a tight cocoon around unfortunate animals tangled within their grasp. But before the advent of these materials, fishing gear was made of natural products that allowed entangled animals to keep moving relatively freely. Instead of succumbing to the more rapid deaths we associate with entanglements today, animals may simply have carried their entrapments around with them until the natural materials eventually degraded.
Beyond solving an age-old mystery that has enchanted sea-goers, France points to a more insidious narrative: marine entanglements have long been a pervasive problem, plaguing the oceans far longer than scientists expected.
Human hearts are divided into two main parts, and the right one has long been neglected by researchers
Photo by Alexandru Acea on Unsplash
The main pumping chambers of the human heart are the right and left ventricles. In the history of research on heart function and failure, the left ventricle has received the majority of the attention while the right ventricle has been severely neglected, even though right ventricular functional abnormalities have been reported in an estimated 70 million people in the United States.
The right and left ventricles work together to pump blood through our bodies, but the right ventricle differs from the left in its anatomy and physiology. For example, the left ventricle's wall is more muscular than the right ventricle's, and the left ventricle is roughly conical or bullet-shaped while the right ventricle is crescent-shaped. These differences mean that understanding and treating conditions affecting the right ventricle requires research focused specifically on it.
However, a recent study of 510 adult patients hospitalized with COVID-19, published in the Journal of the American College of Cardiology, found that irregularities in the shape and structure of the right ventricle could predict COVID-19 mortality.
Right ventricle enlargement (called dilation) and dysfunction, observed with clinical transthoracic echocardiography (a non-invasive imaging technique using ultrasound), were reported in 35% and 15% of the patients studied, respectively. Both dilation and dysfunction were associated with increased mortality risk. Taken together, the findings suggest that right ventricle remodeling is a possible predictor of death in patients hospitalized with COVID-19.
Just three percent of undergraduate physics degrees are awarded to Black students. Walker aims to change that (and a lot more!)
Photo by Guillermo Ferla on Unsplash
Last year’s Black Lives Matter protests spurred a reckoning with the United States’ unjust history and ongoing systemic racism, and science is not exempt from this revolution.
According to the American Physical Society, only three percent of undergraduate physics degrees (and two percent of physics PhDs) are awarded to Black students in the U.S.
Records from the American Institute of Physics show that 2.1 percent of all physics faculty are Black, and according to AAWiP (African American Women in Physics), there are only 22 Black women with astronomy PhDs.
Inspired by these statistics, the events unfolding across the U.S., and other movements like #BlackBirdersWeek and #BlackInIvory, Ashley Walker, an astrochemist from Chicago, started the #BlackInAstro movement last summer. Walker is the first astrochemist to earn a Bachelor's degree from Chicago State University and is an intern at NASA's Goddard Space Flight Center. Since then, #BlackInAstro has been highlighting the achievements of Black astronomers and space scientists and sharing their experiences of what it's like to be Black in the field of astronomy.
Now headed into another year of #BlackInAstro, and the start of Black History Month, I checked in with Ashley Walker to hear her thoughts on how far the movement has come, and where it’s going next. Looking back on the growth of #BlackInAstro, she’s proud of “the Black space community coming together, as well as the tremendous amount of support that came with it. I will always be surprised that it was trending on Twitter.”
Although the movement has already done so much to build this community of Black astronomers and to educate allies, #BlackInAstro is nowhere near done. Walker is determined to “continue celebrating ourselves, the past, and the future, as well as seeing what effective change is coming out of #BlackInAstro in addition to so many years of people before us fighting for equality in space sciences. We wanted to have our seat at the table, so I created a table for ALL of us.”
Disclaimer: The author of this piece is a member of the Astrobites collaboration, which has previously worked with Ashley Walker on #BlackInAstro.
Where did the hydrazine detected on Rhea come from?
One of the most exciting things about space chemistry is that it gives us a glimpse of chemistry that is difficult to study — or might not even exist — on Earth. A well-known example is the chemistry on Saturn's largest moon, Titan, which is famous for its lakes of methane. Scientists think that this Saturnian satellite has a hydrocarbon cycle much like the water cycle on Earth.
A recent article published in Science Advances shed light on the chemistry of one of Saturn's lesser-known moons: Rhea.
Using data from NASA's Cassini mission, researchers found a mysterious chemical signature in ultraviolet imaging data collected during a flyby of Rhea. They concluded that the most likely contender for this chemical feature is hydrazine, a nitrogen-containing compound typically used in manufacturing on Earth. In fact, hydrazine is one of the compounds used as a propellant for the Cassini spacecraft!
After confirming that Cassini's thrusters were shut off during the flyby of Rhea, the researchers had to consider other possible sources of the hydrazine. On Earth, small amounts of hydrazine are produced naturally by some algae and tobacco plants, but any hydrazine on Rhea couldn't come from anthropogenic or biological sources. Hydrazine could, in principle, form within the ice on Rhea's surface, but the moon's thin atmosphere leaves surface molecules vulnerable to irradiation that breaks apart the precursors needed to make it.
The hydrazine could also come from Titan. Scientists don't yet know whether hydrazine could even form on Titan, but the moon's nitrogen-rich atmosphere makes it a promising factory for hydrazine and other similar molecules.
Unfortunately, Saturn and its moons are too far away to investigate this chemistry further any time soon. We might have to wait until NASA's planned Dragonfly mission takes us back to Titan so we can better understand the chemistry there, and perhaps on Rhea too.
During their development, the DNA at the ends of their chromosomes is chopped off and destroyed
Massimo brizzi on Wikimedia Commons (CC BY-SA 4.0)
The thought of losing DNA in order to survive sounds bizarre, but many life forms do this at different stages of their lives. For example, the parasitic worm Ascaris, also called the roundworm, loses about 13-90% of its germ cell DNA as those cells develop into somatic cells, which are all of the other cells in the body.
Scientists have now looked more closely at which DNA sequences are lost and where they go, using sequencing- and imaging-based techniques.
They found that all 24 chromosomes of Ascaris germ cells harbor DNA breaks close to the chromosome ends, or telomeric regions. Although most of the DNA in and around the telomeres is lost, new telomeric DNA is added back onto the broken chromosome ends in somatic cells.
When they imaged the worm cells, they found that the broken DNA is densely packed within lipid membranes inside the nucleus. This packaged DNA is then evicted into the cytoplasm, where it is attacked by proteins the cells deploy to eat waste cellular material (a process called autophagy).
It is still not clear why Ascaris cells put in all this extra work to get rid of DNA. One possibility scientists propose is that Ascaris discards DNA that is necessary only for germ cell function and of no use to somatic cells.
The new superconducting material contains carbon, hydrogen, and sulfur
Via Wikimedia
In 1968, physicist Neil Ashcroft predicted that, under extreme pressure, pure hydrogen would condense into a metal that superconducts even at room temperature. Not many believed him, but the possibility of room-temperature superconductivity inspired a few intrepid researchers.
Attitudes changed in 2015, when physicist Mikhail Eremets discovered a compound of hydrogen and sulfur that was superconducting at temperatures up to -70 °C (-94 °F) under extreme pressure. The work inspired a wave of research on room-temperature superconductivity in hydrogen compounds.
In a recent study published in Nature, a group of physicists reported superconductivity at room temperature and extreme pressure by adding a third element — carbon — to Eremets' original compound of hydrogen and sulfur. They chose carbon because its strong bonds could help hold the material together once the pressure is released, as they do in diamond.
The researchers compressed their mix of elements between the microscopic tips of two pointy diamonds. The final result was a superconducting temperature of about 15 °C (59 °F) at 267 gigapascals, the same pressure you would experience if you traveled about three-fourths of the way to the center of the Earth.
While they knew which chemical elements made up the superconductor, the extreme pressure prevented their probes from obtaining data on the material's final molecular and crystal structure. Until that structure is determined, researchers will have difficulty developing models that explain the high superconducting temperature they measured.
Since the new superconductor requires extreme pressure, it currently lacks immediate practical applications. Yet the study suggests that a variation could prove useful, sparking new enthusiasm among researchers. Dreams of ultra-efficient energy generation, perfect energy storage, and lossless power transmission are now a step closer to reality.
Tuberculosis requires a 6-12 month treatment course, and now scientists know why
Via Library of Congress
Before the advent of antibiotics, an infected paper cut could be deadly. Now we can use antibiotics to treat bacterial meningitis, strep throat, and even tuberculosis. However, unlike most antibiotic prescriptions, tuberculosis treatment requires a regimen of three different antibiotics and takes between six months and a year. This kind of prolonged exposure to antibiotics drives the development of antibiotic resistance.
Scientists do not completely understand why such an extended course is needed to treat tuberculosis. They do know, however, that antibiotics must enter all of the bacterially-infected cells in order to be effective. Therefore, if scientists could develop antibiotics that enter host cells as efficiently as the bacteria do, the required course of treatment could be shortened, reducing the risk of antibiotic resistance developing.
Researchers at the Francis Crick Institute in the UK and the University of Western Australia tackled this problem by developing an imaging technique to see which infected lung cells the antibiotics could enter. The team infected mice with Mycobacterium tuberculosis and treated them with the antibiotic bedaquiline. They then used a new microscopy method, called CLEIMiT (correlative light, electron, and ion microscopy in tissue), to identify exactly which cells took up the antibiotic.
They found that bedaquiline was unable to enter all of the infected lung cells, meaning that while some bacteria were being killed, others managed to evade the treatment. This could explain why such a long treatment regimen for tuberculosis is required. CLEIMiT offers the possibility of characterizing the cell-specific uptake of antibiotics we currently use, as well as aiding the development of more efficient ones. This will ultimately reduce the risk of antibiotic resistance developing.
However, it does not alleviate the serious ethical concerns around genome editing of embryos
In 2018, two babies, Lulu and Nana, were born as the result of a procedure called heritable human genome editing (HHGE) performed by Dr. He Jiankui of the Southern University of Science and Technology in China. The procedure violated Chinese regulations and raised serious ethical questions. Dr. He is now in prison.
As a result of their genomes being edited, Lulu and Nana could face serious health conditions. Their DNA was modified long before they were born, when each was just a single cell, so they are at risk of mosaicism. Mosaicism occurs when an organism carries different genetic information in different cells, rather than the same genetic information in every cell.
Genetic abnormalities in an early embryo can be detected before the embryo is implanted in the mother, using genetic tests on biological samples taken from the embryo's outer layer. These tests do not reflect the genetic information of the whole embryo, however, and undetected mosaicism can affect the results.
Recently, a group of scientists demonstrated the efficiency of a new, non-invasive preimplantation genetic test. The researchers used a sample from the inner cavity of an early embryo instead of its outer layer, and the resulting test was more reliable for detecting mosaicism in the embryo. This kind of non-invasive genetic testing could help detect genetic abnormalities in embryos in assisted reproduction procedures, and detect mosaicism in HHGE experiments.
However, just because this procedure can detect mosaicism does not mean that HHGE is safe or a good idea. Unintended, potentially dangerous changes introduced by genome editing can be passed down to future generations, and significant, legitimate ethical concerns remain. Currently, the scientific community recommends against performing genome editing intended for pregnancy and calls for such experiments to be regulated.
Viral predators can help battle an antibiotic-resistant bacterium considered an "urgent threat" by the CDC
Adapted from Kukski on Wikimedia Commons (CC BY-SA 4.0)
Antibiotic-resistant bacteria are a growing threat, causing deadly infections that cannot be cured by our standard antibiotics. The development of antibiotic-resistant bacteria may be further fueled by strained resources, increased hospitalizations, and decreased surveillance during the SARS-CoV-2 pandemic.
With limited development of new drugs to treat resistant bacterial infections, new therapies are desperately needed to prevent the spread of these “superbugs”. To this end, researchers are looking to a natural predator of bacteria – bacteriophages. Bacteriophages, or “phages”, are viruses that specifically infect bacterial cells. Depending on the type of phage, this infection can ultimately kill bacteria, including those that resist antibiotic treatment. Though commercial therapies are not yet available, phage therapy is a hot topic boasting thousands of studies and several famous success stories.
While phage therapy shows great promise in the fight against resistant bacteria, these superbugs are constantly adapting and can quickly evolve to resist even phage infections. Fortunately for us, this resistance often comes at a cost. Researchers at Monash University have found a way to leverage the trade-off made by phage-resistant bacteria to make them once again susceptible to antibiotics.
Antibiotic-resistant Acinetobacter baumannii (A. baumannii) is considered an “urgent threat” by the Centers for Disease Control and Prevention. Like many disease-causing bacteria, A. baumannii forms a sugary outer capsule that can protect the bacterial cell from antibiotics and make it deadlier. However, the researchers behind this new study discovered that A. baumannii's protective layer also serves as the entry point for phages.
When the team exposed different strains of A. baumannii to phages in the lab, the bacteria quickly developed phage resistance by shedding their outer capsule to lock out the viral invaders. While the capsule-less bacteria were protected from phage infection, the researchers found that these mutated strains of A. baumannii were also re-sensitized to several antibiotics. Through experiments infecting mice with A. baumannii, they also discovered that decreased bacterial reproduction in a host is another trade-off of phage resistance. They concluded that phage therapy could be effective in treating infections with this superbug.
The CDC notes that infections from A. baumannii most often occur in healthcare settings, and people at highest risk are those who are on breathing machines (ventilators), in intensive care units, or have prolonged hospital stays. With these situations currently all too common in hospitals full of COVID-19 patients, phage therapy may provide an option where other treatments fail.
Circadian clocks help our bodies track a daily rhythm, but the reason these bacteria have them remains unclear
Photo by Juan Encalada on Unsplash
Circadian clocks, molecular timekeepers that can synchronize to 24-hour day/night cycles and thus allow cells to adapt to daily rhythms, have been characterized and studied in multicellular organisms for centuries. Their existence in single-celled organisms, on the other hand, has been questioned. In the 1980s and 90s, circadian clocks were found to regulate gene expression in photosynthetic bacteria. But what about bacteria that don't directly depend on the sun for food?
This question was answered in a recent study published in Science Advances. It identified a circadian clock in the non-photosynthetic bacterium Bacillus subtilis, which is often found in the human gut and in soil. The authors observed that biofilm-forming cultures of these bacteria could synchronize their gene expression activities to 24-hour light or temperature cycles.
A biofilm is a collection of microorganisms held together by a sticky extracellular matrix. Different parts of the biofilm can take on specialized roles; in this way, bacteria in a biofilm act like cells in a tissue, displaying behavior similar to multicellular development. It is plausible that adaptation to daily rhythms is tied to biofilm formation or maintenance, but the exact function of this newly discovered clock remains unclear. Time and further research will tell whether circadian clocks also play roles in bacteria that aren't inclined to live in biofilms.
White-nose syndrome thrives in the warm roosts that bats prefer to sleep in
USFWS via Wikimedia
For years, bats in North America have been plagued by a deadly fungal disease called white-nose syndrome. Despite measures to stop its spread, this fungus has swept across the continent, and scientists are monitoring the surviving bat populations to see if they are better equipped to avoid getting sick in the future. Unfortunately, North American bats may be stuck in an "ecological trap" that keeps them returning to the very habitats where the fungus grows best.
According to a recent study in the journal Nature Communications, little brown bats (Myotis lucifugus) in Michigan and Wisconsin choose to hibernate in roosts that stay above 8°C even when colder roosts are available. This is important because the fungal pathogen that causes white-nose syndrome grows best at 12-16°C, so hibernating in colder caves would protect bats from this deadly disease.
Conservation biologists expected that when white-nose syndrome tore through these populations, the warm-loving bats would die off, leaving only cold-loving survivors behind. And, as bats are known to be fast learners, they could potentially learn to avoid warm roosts in order to stay alive.
However, when the researchers compared the habitat preferences of bat populations before and after the fungus arrived, they found only a minor shift in preference towards colder roosts. The researchers concluded that these little brown bats are unlikely to either evolve or learn cooler roosting preferences quickly enough to protect them from the disease. Considering that these bats have evolved for millennia with the risk of wintertime freezing, it makes sense that their drive to seek out warm roosts is difficult to overcome.
While it may not be possible to change the bats’ behavior, it is possible to conserve natural roosting sites that are cold enough to protect them and to restore human-altered sites like mines and tunnels by removing the artificial barriers that insulate them. These conservation measures will be increasingly important as both habitat degradation and climate change continue to worsen.
This is great for the insects, but high-quality specimens are important for research
Daderot on Wikimedia Commons (CC0 1.0)
Community science plays a crucial role in entomology research. Scientists regularly use observations collated in databases such as iNaturalist, as well as specimens collected by community members, in their research.
A recent study by Erica Fischer and their co-authors has revealed that specimens of Lepidoptera (an order of insects that includes butterflies and moths) have mainly been collected by the community, rather than by entomologists working at universities or natural history museums. However, between 1998 and 2009, the number of specimens collected decreased by over half. At the same time, the number of observations submitted to online databases has exploded.
The increase in observations shows there is no lack of interest in butterflies and moths. So why are community scientists, as well as professionals, collecting fewer specimens?
The researchers suggest that the decline is partially caused by a lack of funding, as well as a decrease in students learning the skills needed to collect specimens. In addition, many insect collections are not easily accessible because they have not been digitized (here is an example of a digitized collection). Lastly, they point out that taking a photo of a butterfly is much easier and quicker than collecting a specimen, especially as most people carry a camera in their phone, so community scientists more often opt for this method of ‘collecting’ animals.
Despite the increase in observations, the lack of physical insect specimens could become an issue for future research, as they provide a wealth of additional information. For example, DNA can be extracted, and the morphology and internal anatomy can be studied in detail on real specimens. This is why museum collections are so important.
Although anybody can be a community scientist and contribute to entomology research, don’t go out and catch any butterfly you see, as there may be laws about what you can and cannot collect. Instead, join your local entomology society for a field trip or attend a BioBlitz to learn more!
San Francisco bobcats are also being poisoned, despite the fact that rat poison is highly regulated in California
milesz on Pixabay
Bobcats may be at the top of the food chain, but they still face threats to their survival. In urban and suburban areas, these wild cats are dying at alarmingly high rates, which could lead to local extinction. Habitat loss and human development separate bobcat populations from each other and limit their movements across the landscape. Some of this could be mitigated, however, by wildlife corridors.
A recent study published in Biological Conservation found that corridors with native vegetation help bobcats move between remaining habitat patches in the San Francisco Bay Area. To see where bobcats travel and spend time, the researchers outfitted 38 bobcats with GPS collars to record their locations.
Using these coordinates and satellite maps, the researchers saw that bobcats traveled through areas with more plants and avoided areas with conventional agriculture and high-density housing. Yet, they wondered, would protecting these bobcat highways be enough to protect the species?
The research team also performed necropsies (the animal equivalent of autopsies) on dead bobcats in the area. They found that road crossings, particularly those with high medians that the animals couldn’t easily cross, were dangerous for bobcats. More insidiously, rat poisons were a major killer of bobcats despite being highly regulated in California.
Thanks to their data, the research team was able to identify specific areas for permanent bobcat habitat protection in the Bay Area. The study, however, cautions that corridors alone cannot sustain bobcat populations: additional action must be taken to reduce car collisions and poisoning before it’s too late.
New research found that afternoon exercise provides greater health benefits for some people than a morning workout
Photo by Tim Mossholder on Unsplash
Getting in your daily exercise can be a trying task, but even 15 minutes is a great start toward improving your health. Now a new study, published in the journal Physiological Reports, has shown that exercising in the afternoon results in greater metabolic benefits than exercising in the morning.
The study tested 32 adult men who either had type 2 diabetes or were at risk of developing it. The participants were assigned to exercise either between 8 and 10 am or between 3 and 6 pm. The researchers collected data from the participants on factors such as insulin sensitivity, exercise performance, and body composition.
They found that improvements in many of these factors, including greater weight loss, were significantly larger in the group that exercised in the afternoon than in the group that exercised in the morning. The study provides crucial information on how a simple change in the timing of exercise can enhance its health benefits.
The protein, called Ki-67, has similar properties to our detergents
Photo by Andrew Wulf on Unsplash
It might be a bad idea to mix detergents with chromosomes. But new research published in the journal Nature reports how a protein that has surfactant-like properties coats our chromosomes to protect them from other cellular material.
Our chromosomes are enclosed inside the nucleus by a nuclear membrane. When cells divide, the nuclear membrane breaks apart and is only rebuilt once the chromosomes have been allotted to each daughter cell. Hence, there is a window when our chromosomes are exposed to cellular material that is normally kept outside the nucleus, in the cytoplasm. But how the chromosomes are kept clean until the nuclear membrane is rebuilt has not been clear.
Scientists found that a protein called Ki-67 has surfactant-like properties – similar to detergents – and sticks to the chromosomes as they start to segregate. Surfactants (or surface-active agents) are double-faced molecules with a water-friendly side and an oil-friendly side. This makes them efficient carriers or protectors.
While surfactants are usually made from fats combined with water-loving groups, Ki-67 is likely the first protein known to act as a surfactant molecule.
The researchers estimate that ~210 molecules of Ki-67 stick to each square micrometer of the chromosomes. They stick head-first, keeping their tails facing out to prevent any cytoplasmic material from reaching the chromosomes. Then, when the chromosomes have been divvied up and the nuclear membrane is about to be re-built, the Ki-67 molecules help keep the chromosomes clustered together.
The protein in question activates enzymes in our cells that degrade RNA
sgrunden on Pixabay
One phenomenon that has underpinned the COVID-19 pandemic is the wide range in severity of symptoms. While millions have tragically lost their lives, many people are asymptomatic. Although COVID-19 symptoms can vary by age, gender, and ethnicity, these factors alone do not explain the disparity.
Scientists have been investigating this in the hope of identifying an effective way to treat severe COVID-19. Using different methods of investigation, two pre-prints have reported that a protein called OAS1 influences COVID-19 outcomes. The studies, which have not yet undergone peer review, describe a protective variant in OAS1 that is inherited from Neanderthals.
OAS1 activates enzymes in our cells that are responsible for RNA degradation. SARS-CoV-2, the virus responsible for COVID-19, is an RNA virus, and therefore this protein could potentially serve as part of the immune response against it. The same variant was found to be protective against SARS-CoV, the virus responsible for the 2002-2004 SARS outbreak.
Different people carry different forms of proteins due to genetic variation, and the Neanderthal variant of OAS1 has greater antiviral activity than other isoforms. The variant was introduced into the European population via gene flow between Neanderthals and the ancestors of present-day humans. Neanderthal ancestry makes up 1.5-2.1% of the DNA of people outside of Africa, and many of these genes have been selected for over time.
This is not the first instance of Neanderthal variants influencing COVID-19 outcomes. A study published in Nature in September 2020 detailed a stretch of DNA on chromosome 3, identical to the Neanderthal genome, that tripled the risk of developing severe COVID-19.
This map will shed light on how other animals' brains develop, too
Why do scientists study the fruit fly brain? Although the fly brain is much simpler than a human brain, it is still capable of complex tasks like navigation, memory, and color detection.
A brain is made up of different types of neurons, which have to be connected correctly to carry out tasks. These physical connections in the fruit fly brain were recently mapped in 3D. Scientists used machine learning algorithms to analyze ultrathin brain slices imaged under an electron microscope, and identified over 25,000 neurons and 20 million connections.
Now, a recent Nature study has added another dimension to this brain map. Scientists identified which genes are turned on or off in specific neurons over the course of fly development, from early pupae to adult, during which critical neuronal connections are made.
One interesting discovery from this new ‘developmental atlas’ is that certain neurons die before the fly becomes an adult, possibly because they are needed only to establish connections between other neurons. Moreover, the researchers identified a new type of cell that is similar to a type of neuron found in humans that is essential for brain development.
Having a clearer picture of how gene activities allow neurons to connect with one another in the fly brain will also benefit researchers working on the vertebrate brain. Importantly, the resources generated from this study pave the way towards a better understanding of developmental disorders that originate from improper neuronal connections.
Mutants like B.1.1.7 and B.1.351 carrying changes in the virus spike protein appear to help the virus spread, but not to evade currently available vaccines
Hakan Nural via Unsplash
COVID-19 variants have been emerging all over the world since the pandemic started. Some mutants arising in the last few months are spreading faster than the SARS-CoV-2 virus previously seemed capable of. Understandably, this has caused concern that the vaccines from Pfizer, Moderna, AstraZeneca, and other biopharma companies, which we hope will bring the pandemic under control, won't be as effective.
The B.1.1.7 variant, which was detected in England in September 2020, carries a handful of different mutations, eight of which appear in the spike protein that allows the virus to attach to cells. One mutation, called N501Y, attracts a lot of attention because it sits at the interface between the spike protein and the cell about to be infected.
On top of that, the N501Y mutation itself — where the amino acid asparagine (in biochemical nomenclature simply called "N") at position 501 changed to tyrosine (called "Y") — is a rather significant change in terms of chemistry and size, because tyrosine has a bulky ring of carbons that asparagine lacks. Large rings like tyrosine's can change the shape of the surrounding regions of the protein. This mutation appears in other SARS-CoV-2 lineages as well. Another spike protein mutation, E484K, present in the B.1.351 lineage first detected in South Africa in October 2020, makes the virus less treatable with monoclonal antibodies.
These changes in the spike protein appear to strengthen the link between the virus and a cell. Since the Pfizer and Moderna vaccines already being distributed both target the spike protein, do these mutations alter their efficacy?
Scientists at Moderna and the NIH revisited serum drawn from either non-human primates vaccinated with Moderna's mRNA-1273 vaccine or humans from the vaccine's phase 1 clinical trial. They tested whether that serum could still effectively neutralize different SARS-CoV-2 mutants in comparison to earlier lineages, using non-infectious engineered viruses that had the genetic characteristics of the SARS-CoV-2 virus. They published their results in a preprint on bioRxiv this morning.
The B.1.1.7 virus had little-to-no ability to evade the immune response from vaccinated humans or non-human primates. However, the B.1.351 virus was more difficult to neutralize, requiring about 2 to 10 times more serum than the "original" SARS-CoV-2 virus.
However, all of the viruses and mutations tested were neutralized in these experiments. None escaped; neutralizing them simply required more of the immune response in the serum. This mirrors the success of the Pfizer vaccine, which uses mRNA technology similar to Moderna's and is effective against the B.1.1.7 mutant.
This ability comes in handy for stealing a rival's hunting spot
Ryan Hodnett on Wikimedia Commons (CC BY-SA 4.0)
The last time you walked through a sticky spider web, you might have brushed it off in annoyance. Building these webs, though, is no small feat. It requires precious time and energy that could be spent eating or mating, so spiders are quite picky about where they string their silk.
If you’re a spider, better real estate means better hunting; it’s all about location, location, location. Stealing someone else’s web in a prime spot might pay off, but only if you’re confident you’re much bigger than your opponent. Web-building spiders, however, have notoriously poor eyesight. So, how do they size up their rivals?
Like human noses, spiders’ hairy legs can pick up chemical cues from their environment. Researchers from Miami University of Ohio found that one species of cellar spider, Pholcus manueli, can tell a web builder’s size by smelling chemicals left on the silk. And this sizing-up ability might give P. manueli an edge over a closely-related competitor, the long-bodied cellar spider (Pholcus phalangioides). While both are considered invasive, P. manueli is challenging its long-bodied counterpart's century-long dominance in the midwestern United States.
To determine how the species differ in their chemical sensitivity, the team first had a set of “builder” spiders from both species construct webs, then traded the builder out for another, “focal” spider. They measured each spider’s size and recorded how quickly the focal spider invaded the builder’s web. To ensure the spiders’ reactions were due to chemical signals and not simply web design or structure, the researchers repeated their experiment, adding an ethanol “web washing” control step before introducing focal spiders.
They found that P. phalangioides invaded webs faster than P. manueli no matter the size of the builder. P. manueli were more flexible in their strategy, though, invading webs made by larger spiders more cautiously. Because this behavior disappeared after the web wash, the researchers concluded that P. manueli was indeed exploiting chemical cues.
What does this mean for upstart P. manueli? Despite P. phalangioides’ more aggressive approach, by picking their battles carefully, P. manueli just might win the war.
Some bacteria make us sick, and others keep our food safe to eat
Photo by Kenny Timmer on Unsplash
While eating fruits and veggies is a healthy thing to do, if they’re crawling with the bacterium Listeria monocytogenes (Lm), they may do more harm than good.
Lm contamination of fresh produce can stem from soil, water, and animals, among other sources (in the US, Listeria outbreaks have been traced to cantaloupe melons, Enoki mushrooms, and bean sprouts). The effects of Lm infection, or “listeriosis”, range from diarrhea and fever to brain and bloodstream infections.
While proper handling or avoidance of high-risk foods is one way to prevent listeriosis, another is to use biological control tactics to limit Lm growth on food in the first place.
Biological control involves reducing populations of one organism, often insect pests or invasive weeds, by introducing a natural predator. To this end, a recent study in Applied and Environmental Microbiology sought to identify bacteria capable of limiting Lm colonization and persistence on produce, specifically cantaloupe melons.
The scientists isolated bacteria from various types of produce, ranging from alfalfa to grapes. They then tested the ability of each of 8,736 isolates to inhibit Lm growth in the laboratory. Of seven highly effective isolates, one called Bacillus amyloliquefaciens ALB65 (BaA) was the best inhibitor.
BaA was able to grow and persist on cantaloupe melon rinds and, importantly, did not inhibit plant growth or fruit production — in fact, cantaloupes colonized by BaA grew twice as fast as cantaloupes without it. Excitingly, BaA significantly reduced Lm growth on whole cantaloupes in the greenhouse, and completely inhibited growth at post-harvest refrigerator conditions (this is key, as Lm can survive cold temperatures, making it difficult to control).
While the researchers did not determine exactly how BaA inhibits Lm, they did identify genes in BaA’s genome that likely encode compounds known to limit growth of other bacterial species; these compounds may be responsible for the observed effects on Lm. This study points to a novel, effective biological control agent for reducing Lm growth on cantaloupes, with potential applications for other produce types as well.
Over 30 percent of people over age 65 experience hearing loss
Photo by Mark Paton on Unsplash
It is easy to take our senses, like hearing, for granted – but over one-third of people over 65 experience some degree of hearing loss, whether due to the degenerative effects of aging or cumulative exposure to loud noises.
When we think about the consequences of hearing loss, we might immediately think about how it would make communication with family and friends more difficult, or how we would miss the music in songs and films. But the detrimental effects of hearing loss go beyond reduced access to sounds.
In a review published in Ear and Hearing, researchers investigated an understudied consequence of hearing loss: fatigue. Certainly, one could imagine that the constant effort of trying to understand what others are saying may eventually take a toll on our brains.
After combing through studies on hearing loss and fatigue, the researchers found overwhelming evidence that hearing loss does, in fact, result in increased fatigue. There is also some evidence that obtaining a hearing device – such as a hearing aid or a cochlear implant – may relieve some of this fatigue, but the researchers are quick to point out that more research on this topic is needed.
One limitation of current studies is that what we know today about this topic is based on subjective measures, such as questionnaires, and that each person might experience or describe fatigue differently. Future research could integrate more objective measures, such as behavioral or physiological measures, to provide stronger evidence about the link between hearing loss and fatigue. Despite these limitations, the current review is important for addressing the needs of people with hearing loss and for promoting continued research on this topic.
Atmospheric model of a Houston storm shows emissions increase rainfall intensity and occurrence
Steven Pisano / Flickr (CC BY 2.0)
Though humanity hasn’t discovered how to control the weather, we certainly can influence it. Land use changes and air pollutants have both been shown to influence weather and climate. In cities, the urban heat island effect can alter both where and how much precipitation falls. Likewise, aerosols emitted from power plants, oil refineries, and other sources cause atmospheric changes that result in more intense storms.
Many studies focus on the effects of urban land cover and air pollution individually, but fewer focus on how these two processes work together. To what extent do they affect weather events like rain? This question is especially important in Houston, Texas, which is not only one of the largest cities in the US by area, but also produces high amounts of aerosol particles due to its many oil refineries.
A new study published in Atmospheric Chemistry and Physics sought to understand how Houston’s landscape and aerosol emissions impact its rainstorms. To do this, researchers ran four simulations of a storm that occurred in the region in June 2013. The simulations either included or excluded aerosol pollution, and either kept the Houston urban area or replaced it with croplands and pastures similar to those present outside the city.
Using these models, the researchers determined that Houston’s aerosol pollution caused storms to produce 30 percent more peak rainfall and made intense rain events five times more likely. Furthermore, when this aerosol pollution occurred within the city of Houston, the two processes amplified each other, creating even more rainfall and further increasing the probability of higher intensity storms.
While this study looked at Houston specifically, its results have much broader implications. Studies like this give us a better idea of how storms will be affected as cities continue to expand to accommodate our growing urban population. In addition, this research helps weather forecasters more accurately predict when hazardous weather will occur. This can, in turn, save lives and limit the destruction these events often cause.
The mysterious behavior appears related to collaborative hunting and hints at complex emotions
Octopuses are fascinating models for understanding the evolution of complex behaviors. Two-thirds of their brain cells are spread out among their arms, meaning that each arm can operate somewhat independently. The cephalopods change their texture and shape at will — an effective trick for hunting prey or hiding from predators. Octopuses also cooperate with predatory fish when they hunt. New research reveals that the inner workings of this interspecies collaboration are not without surprises.
Many different species exhibit collaborative behavior in nature. Groupers and reef fish often hunt with octopuses to cover more ground, understanding each other's signals in ways that help the group capture prey. Now, for the first time, researchers have captured footage of octopuses punching fish, sometimes for no apparent reason.
Researchers studying cooperative hunting filmed these interactions off the coast of Egypt. While observing different Octopus cyanea hunting collaboratively, they noticed these punches: the octopus would aim at a fish and strike it with an explosive motion. The punches targeted several species of fish, suggesting the behavior serves an important function.
When an octopus punches a fish, it expends a small amount of energy while hindering that fish's hunt. The punched fish might lose its position within the hunt, miss out on a prey opportunity, or even be kicked out of the group. The octopus clearly has the better end of this bargain, as the punches let it control which individuals it hunts with. The researchers observed cases where the octopus would quickly grab the prey after punching a fish. Curiously, though, sometimes it gained no immediate advantage from these punches.
The authors hypothesize that in these cases, the octopus might be punching out of spite, to punish a hunting partner that cheated in the past. Alternatively, the aggression might serve to deter fish from non-collaborative behavior. Either way, the behavior may stem from complex cognitive or emotional pathways. This research provides another indication that the octopus brain, though drastically different from ours, is capable of complex behavior and cognition.
Fossil evidence from Spain suggests early humans may have hibernated for up to four months at a time
Photo by Frans Van Heerden from Pexels
Just like the American black bear and the European hedgehog, early humans may have hibernated to endure severe winter months.
Animals hibernate to survive cold weather and reduced food access by decreasing their metabolic rate and body temperature for months at a time. Accustomed to our modern-day central heating and abundance of supermarkets, we find the prospect of human hibernation almost like science fiction. However, 430,000 years ago, Earth was experiencing a period of extreme glaciation, otherwise known as an ice age. New research from scientists in Spain presents evidence that these brutally cold times may have driven early humans to hibernate.
Animal hibernation causes high levels of parathyroid hormone in the blood. This hormone damages bones, leaving long-lasting, tell-tale signs. In this study, paleontologists examined human specimens from the Sima de los Huesos cave (Spanish for "pit of bones") for signs of hyperparathyroidism. The burial site is home to over 7,500 human fossils from 29 individuals, between 300,000 and 600,000 years old, making it the largest and oldest collection of ancient human remains found to date.
The team studied the bones using a combination of microscopy and CT scans — the same scans used in hospitals. They discovered a plethora of lesions and bone damage indicative of disorders such as rickets, chronic kidney disease–mineral and bone disorder (CKD-MBD), and hyperparathyroidism. The authors propose that these diseases were caused by poorly tolerated hibernation.
They argue that these ancestors may have hibernated for up to four months at a time, a strategy that would have been imperative for survival during frigid, food-scarce periods such as the extreme glaciation of 430,000 years ago, when these individuals lived.
While this discovery is certainly exciting, it is not conclusive. Fossil experts will likely continue gathering evidence to determine whether early humans really did hibernate.
To look back into shark evolutionary history, paleontologists analyzed the fossilized teeth of these ancient predators
Via Wikimedia
For around 20 million years, gigantic sharks called megalodons roamed the oceans across the globe, eating whales, porpoises, and even other sharks. But while megalodons are known for being the prehistoric ocean's apex predator as adults, they first had to survive their vulnerable juvenile stage. A recent study suggests that young megalodons spent their early years in relatively safe and secluded coastal regions known as shark nurseries. According to the study authors, these nurseries may have played a key role in shark evolution.
Paleontologists compared the sizes of megalodon teeth from eight previously known sites as well as one newly discovered fossil-rich site in Spain. Of these locations, five were determined to be likely megalodon nurseries due to the prevalence of newborn and juvenile-sized teeth relative to the number of adult teeth. These five nurseries varied dramatically in age, with some dating back nearly 16 million years, while others were only three to four million years old. The researchers concluded that the megalodons’ reliance on nurseries must have been a stable characteristic throughout their long existence on Earth.
As with many modern-day sharks, megalodons likely benefitted from nurseries because of their extremely slow rate of development. Some paleontologists estimate that it may have taken over 25 years for megalodons to reach their final adult size. If juveniles lacked a safe place to live, they could easily have been eaten by the same animals that their larger relatives considered prey.
The discovery of these megalodon nurseries suggests that this strategy for juvenile protection may have been one reason why this species survived for so many millions of years. On the flip side, the widespread loss of coastal habitats due to climatic changes likely caused a spike in mortality of juvenile megalodons and may have been a critical factor driving the extinction of this iconic species.