Hey there, welcome to my blog Mufawad. In this monthly writeup, we try to unveil the latest breakthroughs and uncover tomorrow's possibilities. Whether you are a student, a professional, or simply a science enthusiast, this article will provide you with engaging and informative insights and updates. Plus, as a bonus, you will get a peek at some quirky AI images I generated for each topic.
Let’s delve into the new scientific research of the past month or so and explore the latest technologies and breakthroughs in science. In this blog, you will read about the following science events of the month:
- India lands on the Moon
- Insect memories do not survive metamorphosis
- The majority of Earth's life lives in the soil
- Scientists find a way to make cancer cells self-destruct
- LK-99 proven not to be a superconductor
- July 2023 was the hottest month ever recorded on Earth
- European Union gives up on Human Brain Project
- New oxygen isotopes O-28 & O-27 found
- Scientists discover the heaviest animal ever to have lived on Earth
- Research reveals that human genes can be controlled by electric current
- WHO holds first high-level conference on traditional medicine
- Russia's Luna 25 Moon mission crashes
- How to overcome computing problems in the 21st century
![]()
Current Science Report: August 2023, Mufawad.com
India lands on the Moon
The Indian spacecraft Chandrayaan-3 has successfully landed near the Moon's rock- and crater-strewn south pole, making India the fourth country to perform a controlled landing on the lunar surface. The achievement is a significant accomplishment for the Indian Space Research Organisation (ISRO), following the failed Chandrayaan-2 mission in 2019. ISRO established a communication link with its lander and telecast images of the ramp being deployed for the module's rover to roll onto the lunar surface.
![]()
Image generated by Mufawad using AI
The landing is an affirmation of India's growing global stature and capability in science and technology, inspiring Indian scientists to step forward and play lead roles in the global quest for Moon-based scientific and technological enterprise. It also gives ISRO the confidence to configure missions to the Moon, Mars, Venus, and possibly even asteroids.
Chandrayaan-3, India's third lunar mission, lifted off on 14 July, propelling into orbit a 3.9-tonne spacecraft carrying a 1.75-tonne landing module called Vikram, which bore a six-wheeled robotic rover called Pragyan. Once deployed, the Pragyan rover started to ramble around the landing site, and it will do so for one lunar day, equivalent to 14 Earth days.
The Moon's south pole, a challenging region to land in, has drawn interest from many nations because it may contain water ice, and the large craters near it could offer clues about the composition of the early Solar System. Several missions are heading there in the coming years; on 19 August, a Russian craft, Luna 25, crashed into the Moon just days before it was to attempt a landing at the south pole.
India's success
instils confidence in the technological competence of the country's space
industry, which could attract global investment in the Indian private space
sector and foster international collaborations and innovation at Indian
universities, laboratories, start-up companies, and research communities.
Landing at the Moon's south pole is difficult because it involves positioning
the spacecraft at a different angle from that used in previous landings, and
the area has rough terrain.
A lack of detailed data on the region's gravity and surface characteristics compounds the problem, and "moonquakes" near the area add to the complexity. Poor lighting from the Sun is another challenge: some areas are completely dark, while others are lit at extreme sun angles that wash out terrain features.
Chandrayaan-3's
success comes about a week before ISRO's next major mission, its first to study
the Sun, which is scheduled to launch in the first week of September.
Sources: Nature, The Hindu
Insect memories do not survive metamorphosis
Metamorphosis, the process by which, for example, caterpillars transform into butterflies, has been studied extensively by researchers for over half a century. More than 80% of known animal species today undergo some form of metamorphosis or have complex, multistage life cycles.
The nervous
system plays a crucial role in this process, as it must code for multiple
different identities. In a recent study published in the journal eLife,
researchers traced dozens of neurons in the brains of fruit flies going through
metamorphosis. They found that adult insects likely can't remember much of
their larval life, unlike the tormented protagonist of Franz Kafka's short
story "The Metamorphosis," who awakes one day as a monstrous insect.
![]()
Image generated by Mufawad using AI
Although many of the larval neurons in the study endured, the part of the insect brain the researchers examined was dramatically rewired. This overhaul of neural connections mirrored a similarly dramatic shift in the behavior of the insects as they changed from crawling, hungry larvae to flying, mate-seeking adults.
The findings
are "the most detailed example to date" of what happens to the brain
of an insect undergoing metamorphosis, and the results may apply to many other
species on Earth. Beyond detailing how a larval brain matures to an adult
brain, the new study provides clues to how evolution made the development of
these insects take such a wild detour.
The earliest insects emerged from eggs looking much like smaller versions of their adult selves and grew steadily closer to their adult form through "direct development," just as grasshoppers, crickets, and some other insects do today.
Complete
metamorphosis seems to have arisen in insects only around 350 million years
ago, before the dinosaurs. Most researchers now believe that metamorphosis
evolved to lessen the competition for resources between adults and their
offspring: Shunting larvae into a very different form allowed them to eat very
different foods than the adults did. Insects that started to undergo complete
metamorphosis, like beetles, flies, butterflies, bees, wasps, and ants,
exploded in number.
To really understand what's happening to the brain, researchers knew they needed to trace individual cells and circuits through the process. The nervous system of a fruit fly offered a practical opportunity to do that. Although most of the
fruit fly larva's body cells die as it transforms into an adult, many of the
neurons in its brain don't. The nervous system in all insects arises from an
array of stem cells called neuroblasts that mature into neurons. That process
is older than metamorphosis itself and not easily modified after a certain
stage of development.
To map the
brain changes in that gelatinous mass, researchers scrutinized genetically
engineered fruit fly larvae that had specific neurons that shone a fluorescent
green under the microscope. They found that this fluorescence often faded
during metamorphosis, so they used a genetic technique they had developed in
2015 to turn on a red fluorescence in the same neurons by giving the insects a
particular drug.
The mushroom body, a critical brain region for learning and memory in fruit fly larvae and adults, was the focus of the study. The region consists of neurons with long axonal tails that communicate with the rest of the brain through input and output neurons that weave in and out of the strings, creating a network of connections that allows the insect to associate odors with good or bad experiences. These networks are arranged in distinct computational compartments, like the spaces between the frets on a guitar.
When the larvae
undergo metamorphosis, only seven of their 10 neural compartments are
incorporated into the adult mushroom body. Within those seven, some neurons
die, and some are remodelled to perform new adult functions. All the
connections between the neurons in the mushroom body and their input and output
neurons are dissolved. At this transformation stage, “it’s kind of this
ultimate Buddhistic situation where you have no inputs, you have no outputs,”
Gerber said. “It’s just me, myself and I.”
The input and
output neurons in the three larval compartments that don’t get incorporated
into the adult mushroom body completely shed their old identities. They leave
the mushroom body and integrate into new brain circuits elsewhere in the adult
brain. The researchers suggest that these relocating neurons are only temporary
guests in the larval mushroom body, taking on necessary larval functions for a
while but then returning to their ancestral tasks in the adult brain.
Source: Quanta
The majority of Earth's life lives in the soil
A study has found that soil is the single most species-rich habitat on Earth, with over half of all species living in it. The research, published in the journal Proceedings of the National Academy of Sciences, found that soil is home to 90% of fungi, 85% of plants, and more than 50% of bacteria. At 3%, mammals are the group least associated with soils. Soil is likely home to 59% of life, from microbes to mammals, making it the single most biodiverse habitat on Earth.
Soil is the top
layer of the Earth's crust and is composed of a mixture of water, gases,
minerals, and organic matter. It is where 95% of the planet's food is grown,
yet it has historically been left out of wider debates about nature protections
due to our limited knowledge of it. One teaspoon of healthy soil can contain up
to a billion bacteria and more than 1 km of fungi.
![]()
Image generated by Mufawad using AI
The researchers
used a rough estimate of there being about 100 billion species in total and
used theoretical estimates and data analysis to work out what fraction of those
species were found in the soil. They defined a species as living in the soil if
it lived within it, on it, or completed part of its lifecycle in it. Other
habitats they looked at include marine, freshwater, the ocean floor, air, the
built environment, and host organisms such as humans.
There is a
large error range of 15% with the estimate, so the average prediction could in
theory be as low as 44% or as high as 74%. For some groups, the range was
large, with estimates between 22% and 89% for bacteria living in the soil.
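The arithmetic behind these headline numbers is simple to sketch. The following is a back-of-the-envelope illustration, not the authors' actual method, applying the central estimate and its error band to the rough global total the researchers started from:

```python
# Back-of-the-envelope sketch (not the study's actual methodology):
# the central estimate of the soil-dwelling fraction, plus or minus
# the reported error band.
central, error = 0.59, 0.15  # fraction of species in soil, +/- band

low, high = central - error, central + error
print(f"Central estimate: {central:.0%} of species live in soil")
print(f"Plausible range:  {low:.0%} to {high:.0%}")
```

Running this reproduces the article's 44%-74% range around the 59% central figure.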
A third of the
planet's land is severely degraded, and 24 billion tonnes of fertile soil are
lost every year through intensive farming alone. Adopting less intensive
agricultural practices, greater regulation of non-native invasive species, and
increasing habitat conservation will help increase soil biodiversity. Practices
such as soil transplantations could also restore microscopic lifeforms in soil.
Source: The
Guardian
Scientists find a way to make cancer cells self-destruct
Dr. Gerald Crabtree, a developmental biologist at Stanford, and his colleagues have developed a new approach to cancer treatment that could potentially turn cancer against itself. The concept was inspired by a walk through the redwoods near his home in the Santa Cruz mountains. The researchers designed and built molecules that hooked together two proteins: BCL6, a mutated protein that the cancer relies on to aggressively grow and survive, and a normal cell protein that switches on any genes it gets near.
The new
construction, a dumbbell-shaped molecule, is unlike anything seen in nature. BCL6
at one end of the dumbbell guides the molecule toward cell-death genes that are
part of every cell's DNA and are used to get rid of cells that are no longer
needed. When a person has diffuse large B cell lymphoma, BCL6 has turned off
those cell-death genes, making the cells essentially immortal. When the
dumbbell, guided by BCL6, gets near the cell-death genes, the normal protein on
the end of the dumbbell arms those death genes.
This new
approach could be an improvement over the difficult task of using drugs to
block all BCL6 molecules. With the dumbbell-shaped molecules, it is
sufficient to rewire just a portion of BCL6 molecules in order to kill cells.
The concept could potentially work for half of all cancers, which have known
mutations that result in proteins that drive growth.
Dr. Crabtree
explained the two areas of discovery that made the work possible: the discovery
of "driver genes" — several hundred genes that, when mutated, drive
the spread of cancer, and the discovery of death pathways in cells. The quest
was to make the pathways driving cancer cell growth communicate with silenced
pathways that drive cell death, something they would not normally do.
When the hybrid
molecule drifted to the cells' DNA, it not only turned on cell-death genes but
also did more. BCL6 guided the hybrid to other genes that the cancer had
silenced. The hybrid turned those genes on again, creating internal chaos in
the cell.
The main effect of the experimental treatment was to activate the cell-death genes, which is the therapeutic effect. The group tested its hybrid molecule in mice, where it seemed safe, though the scientists noted that humans are very different from mice.
Source: NYT Science
LK-99 proven not to be a superconductor
Researchers have found evidence that LK-99, a compound of copper, lead, phosphorus, and oxygen, is not a superconductor and has clarified its actual properties. This conclusion dashes hopes that LK-99 would prove to be the first superconductor that works at room temperature and ambient pressure. Instead, studies have shown that impurities in the material, particularly copper sulfide, were responsible for sharp drops in its electrical resistivity and a display of partial levitation over a magnet, properties similar to those exhibited by superconductors.
![]()
P.C: Wikipedia
The LK-99 saga began in late July when a team led by Sukbae Lee and Ji-Hoon Kim at the Quantum Energy Research Centre published preprints claiming that LK-99 is a superconductor at normal pressure and at temperatures up to at least 127 ºC (400 kelvin). All previously confirmed superconductors function only at very low temperatures or under extreme pressures. The extraordinary claim quickly grabbed the attention of the science-interested public and of researchers, some of whom tried to replicate LK-99.
Initial
attempts did not find signs of room-temperature superconductivity, but were not
conclusive. Now, after dozens of replication efforts, many specialists are
confidently saying that the evidence shows LK-99 is not a room-temperature
superconductor.
Key early evidence for LK-99's superconductivity was a video taken by the South Korean team showing a coin-shaped sample of silvery material wobbling over a magnet. The researchers said the sample was levitating because of the Meissner effect, a hallmark of superconductivity in which a material expels magnetic fields. Multiple unverified videos of LK-99 levitating subsequently circulated on social media, but none of the researchers who initially tried to replicate the findings observed any levitation.
The half-baked levitation also stood out to Derrick VanGennep, a former condensed-matter researcher at Harvard University in Cambridge, Massachusetts, who now works in finance but was intrigued by LK-99. He thought LK-99's properties were more likely the result of ferromagnetism, so he constructed a pellet of compressed graphite shavings with iron filings glued to it. A video made by VanGennep shows that his disc, made of non-superconducting, ferromagnetic materials, mimicked LK-99's behavior.
On 7 August,
the Peking University team reported that this “half-levitation” appeared in its
own LK-99 samples because of ferromagnetism. The pellet experiences a lifting
force, but it’s not enough for it to levitate — only for it to balance on one
end. Li and his colleagues measured their sample’s resistivity, and found no
sign of superconductivity. But they couldn’t explain the sharp resistivity drop
seen by the South Korean team.
Making
conclusive statements about LK-99’s properties is difficult, because the
material is unpredictable and samples contain varying impurities. However,
samples that are close enough to the original are sufficient for checking
whether LK-99 is a superconductor in ambient conditions.
A team at the
Max Planck Institute for Solid State Research in Stuttgart, Germany,
synthesized pure, single crystals of LK-99 using a technique called
floating-zone crystal growth. This allowed the researchers to avoid introducing
sulfur into the reaction, eliminating Cu2S impurities. The result was a
transparent purple crystal, pure LK-99, or Pb8.8Cu1.2P6O25. LK-99 is not a
superconductor but an insulator with a resistance in the millions of ohms, too
high for a standard conductivity test. It shows minor ferromagnetism and
diamagnetism, but not enough for even partial levitation. The team concluded
that the hints of superconductivity seen in LK-99 were caused by Cu2S
impurities, which are absent from their crystal.
The LK-99 saga
has been viewed as a model for reproducibility in science, but some argue that
it involved an unusually swift resolution of a high-profile puzzle. When copper
oxide superconductors were discovered back in 1986, researchers leapt to probe
their properties, but nearly four decades later, there is still debate over the
materials' superconducting mechanism. Efforts to explain LK-99 came readily,
but the detective work that wraps up all of the pieces of the original
observation is relatively rare.
Source: Nature
July 2023 was the hottest month ever recorded on Earth
July 2023 has been confirmed as the hottest month in recorded history, with the average global temperature being 1.54°C above the preindustrial average. This record-breaking increase is attributed to several factors, including a budding El Niño warming event in the equatorial Pacific Ocean and a volcanic eruption on the island of Tonga that injected water vapour, a powerful greenhouse gas, into the stratosphere. New regulations have also curbed the release of sulphur dioxide pollution from ships, which tends to have a cooling effect. However, the biggest driver by far, scientists say, is increasing greenhouse-gas concentrations in the atmosphere, which have been steadily raising average global temperatures and have loaded the dice in favour of extreme weather and climate events.
![]()
Image generated by Mufawad using AI
An analysis by
scientists at the World Weather Attribution initiative found that the heatwave
in China last month would have been expected only once every 250 years in a
world without human influence. Temperatures in southern Europe and North
America, meanwhile, would have been "virtually impossible" in the
preindustrial era. However, such extremes are becoming the norm: last month's
events can now be expected every 5-15 years, and could happen as often as every
2-5 years if global temperatures increase to 2°C above preindustrial levels,
the upper limit imposed by the 2015 Paris climate agreement.
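The shift in likelihood described above can be made concrete by converting return periods into annual probabilities. This is a rough illustration of the arithmetic, not the attribution study's methodology:

```python
# Rough illustration (not the attribution study's method): an event
# expected once every N years has an annual probability of about 1/N.
def annual_probability(return_period_years):
    return 1.0 / return_period_years

# Preindustrial world: once in 250 years
print(f"Preindustrial: {annual_probability(250):.2%} chance per year")
# Today's climate: once every 5-15 years
print(f"Today:         {annual_probability(15):.1%} to {annual_probability(5):.1%}")
# At +2 deg C:   once every 2-5 years
print(f"At +2 deg C:   {annual_probability(5):.0%} to {annual_probability(2):.0%}")
```

In other words, an extreme that once had a 0.4% chance in any given year now has roughly a 7-20% chance, and could approach a coin flip at 2°C of warming.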
Local troubles
are also a concern, as global average temperature, often measured on a rolling
ten-year basis, is a metric that scientists use to track broad trends in a
noisy, complex system. While 90% of the excess heat due to the presence of
greenhouse gases has gone into the oceans, the fact is that temperatures over
land are both warmer and rising faster than those of the ocean surface. Many
parts of the Earth's land surface have already warmed by more than 1.5°C in at
least one season, and temperatures in numerous places last month were as much
as 8°C above the average for July.
The 2021–22
assessment of the Intergovernmental Panel on Climate Change suggests that every
tenth of a degree of warming at the global level comes with additional and
often extreme impacts at the local and regional level. This is particularly
concerning as the warming shows no sign of stopping, with this year's El Niño
event just getting started and next year likely to be even warmer.
Source: Nature
European Union gives up on Human Brain Project
The Human Brain Project (HBP), one of the largest research endeavors ever funded by the European Union, is coming to an end after 10 years. The project aimed to understand the human brain by modelling it in a computer, leading to significant strides in neuroscience, such as creating detailed 3D maps of at least 200 brain regions, developing brain implants to treat blindness, and using supercomputers to model functions such as memory and consciousness.
![]()
Image generated by Mufawad using AI
However, the
HBP has drawn criticism for not achieving its goal of simulating the whole
human brain, which many scientists regarded as far-fetched in the first place.
It changed direction several times, and its scientific output became
"fragmented and mosaic-like," says HBP member Yves Frégnac, a
cognitive scientist and director of research at the French national research
agency CNRS in Paris. For him, the project has fallen short of providing a
comprehensive or original understanding of the brain.
HBP directors
hope to bring this understanding a step closer with a virtual platform called
EBRAINS, which is a suite of tools and imaging data that scientists around the
world can use to run simulations and digital experiments. Today, they have all
the tools in hand to build a real digital brain twin. However, the funding for
this offshoot is still uncertain.
The HBP was controversial from the start. When it launched in 2013, one of its key aims was to develop the tools and infrastructure required to better understand the function and organization of the brain and its diseases, alongside smaller projects in basic and clinical neuroscience.
In the first
year, the HBP ran into trouble when founder and former director, neuroscientist
Henry Markram, said that the HBP would be able to reconstruct and simulate the
human brain at a cellular level within a decade. Over time, Markram's
leadership became increasingly unpopular, and in 2014, he and the other two
members of the executive committee changed the focus of the project, cutting
out a swathe of research on cognitive neuroscience. In response, more than 150
scientists signed a protest letter urging the European Commission to reconsider
the HBP's purpose in time for the second round of funding.
The EU formed a
committee of independent specialists to look at how the project was being run
and to revise its scientific objectives. The committee recommended that the HBP
should re-evaluate and more sharply articulate its scientific goals, as well as
re-integrate cognitive and systems neurosciences into its core programme. In
February 2015, the HBP's board of directors voted to disband the three-person
executive committee and replace it with a larger board.
Over its run, the HBP developed a comprehensive atlas of brain regions, including the prefrontal cortex, that contribute to memory, language, attention, and music processing. The atlas was created using post-mortem brain data and linked to gene-expression data in the Allen Human Brain Atlas database. Researchers also developed unique algorithms that can build a full-scale scaffold model of brain regions from microscopy images, producing a detailed map of the CA1 region in the hippocampus, an area important for memory.
The HBP has
translated some findings into clinical applications, using personalized models
of the brain, or 'digital twins', to improve treatments for epilepsy and
Parkinson's disease. Digital twins are mathematical representations of a
person's brain that merge scans from an individual with a model. The EPINOV
trial, launched in June 2019, tested whether digital models built using
brain-scan data can help identify the origin of seizures and improve the
success rate of surgery for epilepsy. The EPINOV trial has recruited 356 people
across 11 French hospitals and hopes to make the imaging data from the trial
available to other researchers through EBRAINS.
The original
project plan for the HBP included the development of computing systems modelled
on the brain. HBP scientists have contributed to neural networks that can
simulate large brain-like systems, either to test ideas about how brains work
or to control other hardware, such as robots or smartphones. However, the
project's organizers and critics cite a common thread running through the HBP:
fragmentation. This is a long-standing issue in neuroscience research. In its
last three years, the HBP has tried to overcome the fragmentation of its
interdisciplinary sub-projects by knitting together their technologies into
EBRAINS. Initiatives across the HBP’s six platforms started to develop
compatible tools and shared data standards, and some groups were re-organized
to center on particular scientific challenges rather than disciplines.
For some
researchers, the fragmented scientific outcomes of the HBP stem from a lack of
focus. The HBP’s lack of prioritization and limited collaboration meant that it
failed to capitalize on its size and to really unite the neuroscience community
behind a common goal.
At the end of September, the HBP will cease to get funds. Although some endeavours that emerged from the project have already secured grants to continue their work, the future is uncertain for many researchers who have worked partly or fully with the HBP.
New oxygen isotopes O-28 & O-27 found
A study published in Nature has found that the heaviest oxygen isotope, oxygen-28, decays almost immediately, challenging physicists' understanding of nuclear forces. Isotopes are forms of an element with the same atomic number (number of protons) but different atomic weights. The most common form of oxygen, O-16, has 8 protons and 8 neutrons in its nucleus, giving an atomic weight of 16. The isotopes oxygen-28 and oxygen-27 have the same 8 protons but 20 and 19 neutrons, respectively. Both neutron-rich isotopes, oxygen-28 (28-O) and oxygen-27 (27-O), were observed decaying into oxygen-24 (8 protons, 16 neutrons) after emitting 4 and 3 neutrons, respectively, proving them to be neutron-unbound.
The most
abundant form of oxygen, having 8 protons and 8 neutrons, is considered 'doubly
magic'. However, the decay of O-28, with 8 protons and 20 neutrons, has
challenged scientists' understanding of principles governing nuclear forces.
Both O-28 and O-27 were generated when a thick liquid-hydrogen target was bombarded with intense beams of 29F (the fluorine-29 isotope), which has an unstable nucleus, from a particle accelerator at the RIKEN Radioactive Isotope (RI) Beam Factory, Japan. The researchers observed these isotopes and studied their properties
by directly detecting their decay products. The findings enhance our
understanding of nuclear structure and forces by offering new insights,
especially for extremely neutron-rich nuclei.
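The nucleon bookkeeping in these decays is easy to verify. Here is a minimal sketch of the arithmetic (an illustration only, not the experimental analysis):

```python
# Minimal nucleon bookkeeping for the reported decays (an illustration
# of the arithmetic, not the experimental analysis).
def decay(parent, neutrons_emitted):
    """Daughter isotope (protons, neutrons) after emitting free neutrons."""
    protons, neutrons = parent
    return (protons, neutrons - neutrons_emitted)

O28 = (8, 20)  # oxygen-28: 8 protons + 20 neutrons = mass number 28
O27 = (8, 19)  # oxygen-27: 8 protons + 19 neutrons = mass number 27

# Both decay to oxygen-24 (8 protons, 16 neutrons):
assert decay(O28, 4) == (8, 16)
assert decay(O27, 3) == (8, 16)
print("Both decays end at oxygen-24, mass number", sum(decay(O28, 4)))
```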
Source: The Indian Express
Scientists discover the heaviest animal ever to have lived on Earth
A newly discovered whale, Perucetus colossus, may be the heaviest animal to have ever lived, according to a study published in the journal Nature. The modern blue whale has long been considered the largest and heaviest animal ever, beating out all the giant dinosaurs of the distant past. However, Perucetus colossus, the colossal whale from Peru, may have been even heavier, according to the international team of researchers. The team's central estimate of its body mass, 180 metric tons, would not take the heavyweight title by itself.
The biggest
blue whale ever recorded weighed 190 tons, according to Guinness World Records.
However, the researchers estimated the ancient whale's weight range was between
85 and 340 tons, meaning it could have been significantly larger.
The first
fossil of the ancient whale was discovered in 2010 by paleontologist Mario
Urbina, who spent decades searching the desert on the southern coast of Peru.
The remains were presented to the public for the first time during a press
conference at the Natural History Museum in the Peruvian capital. The
researchers estimated that the animal reached about 20 meters (65 feet) in
length.
![]()
Image generated by Mufawad using AI
The new discovery indicates that cetaceans reached their peak body mass roughly 30 million years earlier than previously thought. Perucetus colossus likely had a "ridiculously small" head compared to its body, though no bones were available to confirm this. It was impossible to say for sure what it ate, but study author Amson speculated that scavenging off the seafloor was a strong possibility, partly because the animal could not swim quickly.
The researchers
were confident that the animal lived in shallow waters in coastal environments
due to the strange heaviness of its bones. Its whole skeleton was estimated to
weigh between five and seven tons, more than twice as heavy as the skeleton of
a blue whale. Perucetus colossus needed heavy bones to compensate for
the huge amount of buoyant blubber and air in its lungs, which could otherwise
send it bobbing to the surface.
Source: Phys.org
Research reveals that human genes can be controlled by electric current
Swiss scientists have developed an experimental technology called the direct current (DC)-actuated regulation technology (DART), which uses small pulses of electricity to trigger insulin production in test mice with specially designed human pancreatic tissues. This technology could be used to kick target genes into action when a helping hand is needed. The researchers believe that wearable electronic devices are playing a rapidly expanding role in the acquisition of individuals' health data for personalized medical interventions.
![]()
Image generated by Mufawad using AI
The technology,
known as DART, brings the digital tech of our gadgets and the analog tech of
our biological bodies together. The electricity generated non-toxic levels of
reactive oxygen species, which can activate cells engineered to respond to
changes in chemistry. This could potentially help with various conditions
affected by genetics.
The researchers were able to bring the blood sugar levels of diabetic mice back to the normal range through this method. While the technology is still a long way from being able to manage diabetes, it is an exciting proof of concept. DART requires very little power, and the team is confident that it can be developed and expanded to trigger more than just insulin production. In years to come, health wearables could be doing more than just reporting stats.
WHO holds first high-level conference on traditional medicine
The World Health Organization (WHO) convened the Traditional Medicine Global Summit on 17 and 18 August 2023 in Gandhinagar, Gujarat, India. Co-hosted by the Government of India, the summit focused on the role of traditional, complementary, and integrative medicine in addressing health challenges and driving progress in global health and sustainable development. High-level participants included the WHO Director-General and Regional Directors, G20 health ministers, and high-level invitees from countries across WHO’s six regions. Scientists, practitioners of traditional medicine, health workers, and members of civil society organizations also participated.
The aim of the Summit
was to explore ways to scale up scientific advances and realize the potential
of evidence-based knowledge in the use of traditional medicine for people’s
health and well-being around the world. Scientists and other experts led
technical discussions on research, evidence and learning; policy, data and
regulation; innovation and digital health; and biodiversity, equity, and
Indigenous knowledge.
![]() |
Image generated by Mufawad using AI |
Traditional
medicine has contributed to breakthrough medical discoveries and continues to
hold great promise. Research methods such as ethnopharmacology and reverse
pharmacology could help identify new, safe, and clinically effective drugs,
while the application of new technologies in health and medicine, such as
genomics, new diagnostic technologies, and artificial intelligence, could open
new frontiers of knowledge on traditional medicine. Safety, efficacy, and
quality control of traditional products and procedure-based therapies remain
important priorities for health authorities and the public.
A stronger
evidence base will enable countries to develop appropriate mechanisms and
policy guidance for regulating, ensuring quality control, and monitoring
traditional medicine practices, practitioners, and products, according to
national contexts and needs. The WHO will present emerging findings from the
third global survey on traditional medicine, which includes questions on
financing of traditional and complementary medicine, health of Indigenous
Peoples, quality assurance, traditional medicine knowledge, biodiversity,
trade, integration, patient safety, and more. The complete survey, which will
be released later in the year, first on an interactive online dashboard and
then as a report, will inform the development of WHO’s updated traditional
medicine strategy 2025-2034 as requested by the World Health Assembly in May
2023.
Standardization
of traditional medicine condition documentation and coding in routine health
information systems is a prerequisite for effective management and regulation
of traditional medicine in healthcare systems. The Summit showcased countries’
experiences, explored regional trends, and discussed best practices, including
the implementation of the traditional medicine chapter in the latest
International Classification of Diseases, the ICD-11.
Participants in
the Summit examined a global overview of policy, legal, and regulatory
landscapes; formal structures and policies to collect data and establish
systems for information management; an assessment of educational and training
programs for the development of traditional medicine workforce; and experiences
and best practices on training, accreditation, and regulation of traditional
medicine practitioners.
The Summit’s parallel
focus on sustainable biodiversity management in the face of the climate crisis
drove the identification and sharing of best practices, initiatives, and
legislative frameworks on the protection of traditional knowledge, innovation,
and access and equitable benefit-sharing by countries. Discussions focused
on the rising prospect of global economic activities related to traditional
medicine, Indigenous knowledge-based innovations in health care, application of
intellectual property laws and regulations, and the use and promotion of
indigenous and ancestral medicine through intercultural dialogues to support
community health.
Traditional
medicine has been a significant part of healthcare for centuries, with 170 of
the World Health Organization's 194 Member States reporting on the use of
herbal medicines, acupuncture, yoga, and indigenous therapies. Today,
traditional medicine is a global phenomenon, with patients seeking more
compassionate and personalized care. The WHO's traditional medicine programme
began in 1976 and is working with countries to develop standards and benchmarks
for training and practice of different systems of traditional medicine, as well
as their evidence-based integration in the International Classification of
Diseases (ICD).
The 11th
revision of ICD (ICD-11) includes a chapter on traditional medicine for dual
and optional coding, based on traditional Chinese Medicine. The WHO is developing
the next module in the traditional medicine chapter, which will include
diagnostic terms of Ayurveda, Unani, and Siddha systems of medicine.
In 2022, the
WHO Global Traditional Medicine Centre was established with the support of the
Government of India, focusing on partnership, evidence, data, biodiversity, and
innovation to optimize the contribution of traditional medicine to global
health, universal health coverage, and sustainable development. The third
global survey on traditional medicine monitors progress in the performance of
traditional and complementary systems, aligning with WHO and global monitoring
frameworks.
Sources: WHO
Russian Luna 25 mission for the Moon crashes
Russia's Luna-25 spacecraft, the country's first moon mission in 47 years, spun out of control and crashed into the moon after a problem shunting the craft into its pre-landing orbit. The state space corporation, Roscosmos, lost contact with the craft at 11:57 GMT on Saturday; a soft landing had been planned for Monday. A special inter-departmental commission was formed to investigate the reasons behind the loss of the Luna-25 craft, whose mission had raised hopes in Moscow that Russia was returning to the big-power moon race.
The failure
underscored the decline of Russia's space power since the glory days of Cold
War competition when Moscow was the first to launch a satellite to orbit the
Earth - Sputnik 1, in 1957 - and Soviet cosmonaut Yuri Gagarin
became the first man to travel into space in 1961. It also comes as Russia's $2
trillion economy faces its biggest external challenge for decades: the pressure
of both Western sanctions and fighting the biggest land war in Europe since
World War Two.
Russia has not
attempted a moon mission since Luna-24 in 1976, when Communist leader
Leonid Brezhnev ruled the Kremlin. As news of the Luna-25 failure broke, the
Indian Space Research Organisation (ISRO) posted on X, formerly Twitter, that
Chandrayaan-3 was set to land on August 23. Russian officials had hoped that
the Luna-25 mission would show Russia can compete with superpowers in space
despite its post-Soviet decline and the vast cost of the Ukraine war.
Sources: Reuters
How to overcome computing problems in 21st century
Advances in deep learning over the past decade have led to remarkable achievements, such as programs that can defeat world champions at board games and determine the three-dimensional shapes of proteins. These developments are attributed to increased financial budgets and improvements in semi-parallel hardware, such as graphics processing units (GPUs) and tensor processing units (TPUs), used to run artificial intelligence (AI) training algorithms. The computing capacity required to train the largest AI models doubled roughly every three to four months until 2019; had that trend continued, the energy required to train a single leading AI model would exceed global yearly energy expenditure by 2030.
![]() |
Image generated by Mufawad using AI |
These large
models also raise serious environmental concerns related to their carbon
footprint. Training one of the largest language models, GPT-3 in 2020, which is
the basis for ChatGPT, cost approximately US$12 million, with 90% of such
training costs going into leasing the data-centre infrastructure and 10% paying
the electricity bills to run it. As a result, model growth has fluctuated over
the last three years, and is potentially plateauing. There is also increased
interest in compressing large models to smaller sizes and improving access to
the models via low-cost infrastructure and training.
Infrastructure
costs will only increase in the future, starting with the manufacturing costs
for advanced technology nodes. Thus, we are most likely entering an era
of economics-limited computing. Progress on key computing problems will be
limited by the economics of today's computing systems. To solve crucial
problems in the twenty-first century, substantial improvements in computing
energy efficiency will be required. Therefore, new strategies in terms of
approaches to computing, energy production, and commercial computing budgets
are urgently required.
Three key
computing problems that exemplify different levels of use-case complexity are planetary-scale
weather modelling, real-time brain-scale modelling, and human evolutionary
simulation. Planetary-scale weather modelling is essential for simulating
ecological sustenance schemes, predicting natural disasters, understanding
anthropogenic climate change, and modelling planetary-scale, self-sustaining
ecosystems. Brain-like computing is potentially the next step in the
development of AI, enabling rapid evaluation of prognostic neuro-scientific
models.
Human
evolutionary simulation captures key interactions and biological processes
among groups of humans, and is important in modelling our evolutionary future.
The results of such simulations could inform the preparation of
multi-generational manned space missions and predict genetic evolution upon
continuous exposure to specific drugs.
To comparatively estimate the energy budgets needed to support each problem, we calculate the energy costs of running the associated capability over one year. For example, using published estimates of 10^28 floating point operations per second (FLOP s^-1) to run small-sized human evolutionary simulations, we estimate needing 3 × 10^35 FLOP for a year, which would require around 10^8 TWh (1 TWh = 3.6 × 10^15 J) of energy.
(Note: A supercomputer processes data at speeds measured in floating-point operations per second (FLOPS) to perform complex calculations and simulations, usually in the fields of research, artificial intelligence, and big data computing. A supercomputer is made up of many individual computers (known as nodes) that work together in parallel. A common metric for measuring the performance of these machines is petaflops, where one petaflop is one quadrillion (10^15) FLOPS.)
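As a sanity check on the arithmetic above, the conversion from a sustained FLOP rate to a yearly energy budget can be sketched in a few lines. The hardware efficiency (FLOP per joule) is an assumed figure for illustration, since the source does not state one; it is chosen so the result lands near the quoted 10^8 TWh.

```python
# Rough sketch of the energy-budget arithmetic above.
# ASSUMED_EFFICIENCY is hypothetical, not a figure from the source.

SECONDS_PER_YEAR = 3.15e7     # roughly 365 days
FLOPS_REQUIRED = 1e28         # sustained rate for the simulation (FLOP/s)
ASSUMED_EFFICIENCY = 1e12     # FLOP per joule (hypothetical hardware)
JOULES_PER_TWH = 3.6e15       # 1 TWh expressed in joules

total_flop = FLOPS_REQUIRED * SECONDS_PER_YEAR   # ~3 x 10^35 FLOP in a year
energy_joules = total_flop / ASSUMED_EFFICIENCY
energy_twh = energy_joules / JOULES_PER_TWH      # on the order of 10^8 TWh

print(f"Total compute: {total_flop:.1e} FLOP")
print(f"Energy budget: {energy_twh:.1e} TWh")
```

Changing the assumed efficiency by an order of magnitude shifts the energy budget by the same order, which is why hardware efficiency dominates these projections.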
Predictions for
energy budgets in the future are based on the trend in hardware energy
efficiency improvements over the past decade, known as Koomey's law. Another
law called Huang's law describes an improvement in compute efficiency due to a
combination of architecture, memory, transistor scaling, and algorithms,
leading to energy-efficient hardware and fewer operations to solve problems and
process complex data.
AMD and Nvidia
have experienced significant computing efficiency gains in the past decades,
attributed to hardware-software co-design. This co-design includes algorithmic
improvements, particularly in the areas of tensor operations and graph
algorithms for compiling them onto hardware. If future models are similar to
today's models, we would probably solve planetary-scale weather modelling by
around 2060, achieve reasonably detailed brain-scale models by around 2080, and
address the smallest scale of human evolutionary simulations by the end of the
century.
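The kind of extrapolation behind these dates can be sketched as follows: count how many efficiency doublings are needed to close a given gap, then multiply by the doubling period. The gap size, doubling period, and start year below are hypothetical inputs for illustration, not figures from the source.

```python
import math

# Illustrative extrapolation of when a problem becomes tractable,
# assuming hardware efficiency doubles on a fixed Koomey/Huang-style
# cadence. All inputs are hypothetical examples.

def year_problem_affordable(gap_orders_of_magnitude, doubling_years,
                            start_year=2023):
    """Year when a compute-efficiency gap of N orders of magnitude closes,
    assuming efficiency doubles every `doubling_years` years."""
    doublings_needed = gap_orders_of_magnitude * math.log2(10)
    return start_year + doublings_needed * doubling_years

# e.g. a 4-order-of-magnitude gap with a 2.6-year doubling time
print(round(year_problem_affordable(4, 2.6)))
```

The sensitivity noted later in the text is visible here: stretching the doubling period from 2.6 to 4 years pushes the same milestone out by decades.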
However,
progress in prevailing digital processors will not be able to support our
computing needs for the rest of the century. GPU-based digital computing has
its limits, as manipulation of bits leads to entropy increase by information
loss, enforcing a fundamental thermodynamic limit on computing efficiency,
often expressed as the Landauer limit (2.8 × 10^-21 J per bit manipulation).
The 'digital limit' is the minimum digital-computing energy required for each
problem, and we will encounter similar limits when component sizes reach around
1 nm — probably by 2030 — at which scale even ambient thermal noise can
manipulate bits.
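The quoted Landauer limit can be recovered from first principles as k_B · T · ln 2, the minimum energy needed to erase one bit, evaluated here at an assumed room temperature of about 300 K:

```python
import math

# Landauer limit: minimum energy to erase one bit of information,
# k_B * T * ln(2), at an assumed room temperature of ~300 K.

K_BOLTZMANN = 1.380649e-23   # J/K (exact SI value)
T_ROOM = 300                 # K, assumed ambient temperature

landauer_j_per_bit = K_BOLTZMANN * T_ROOM * math.log(2)
print(f"{landauer_j_per_bit:.2e} J per bit")  # ~2.9e-21 J, matching the text
```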
The analysis
shows that projections for when critical problems can be solved are extremely
sensitive to forecasts of computing advances; this degree of sensitivity should
be used to calibrate long-term policy decisions aimed at major societal
impacts. However, the analysis also highlights that even the best digital
computers cannot efficiently solve our big problems within this century. Urgent
action is thus required if we are to address these challenges.
Three areas of
action are highlighted:
1.
Investment in novel approaches to computing. In the near term, the lowest-hanging fruit will be aggressive hardware-algorithm co-design. For example, we are now able to run and partially train increasingly large language models on a single GPU using algorithmic tricks such as quantization into integer formats and new floating-point representations. Non-digital (analogue) in-memory computing based on
memristors has the potential to provide 100 to 100,000 times better computing
efficiency than GPUs, by offering massive parallelism in matrix
multiplications. Memristor-based computing will likely begin to commercialize within five years and could sustain the AI market for a few decades, by which time all forms of classical computing will have hit their thermodynamic limits.
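As a rough illustration of the quantization trick mentioned in point 1, here is a minimal symmetric int8 scheme in pure Python: weights are stored as 8-bit integers plus one float scale factor, cutting memory roughly fourfold versus 32-bit floats. This is a sketch of the general idea, not the API of any particular library.

```python
# Minimal symmetric int8 quantization sketch (illustrative only).

def quantize_int8(weights):
    """Map floats to integers in [-127, 127] plus one shared scale."""
    scale = max(abs(w) for w in weights) / 127
    q = [round(w / scale) for w in weights]
    return q, scale

def dequantize(q, scale):
    """Recover approximate float values from the integers."""
    return [v * scale for v in q]

weights = [0.31, -1.27, 0.05, 0.98]
q, scale = quantize_int8(weights)
restored = dequantize(q, scale)
# each restored value differs from the original by at most scale/2
```

Real deployments add per-channel scales, zero-points for asymmetric ranges, and calibration, but the storage saving comes from exactly this int-plus-scale representation.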
2. Increase
global energy production by harnessing green or highly efficient energy sources
(such as nuclear fusion),
leading to lower energy and infrastructure costs. This would have clear
societal value and an impact on computing. Such lowered costs could also
motivate the exploration of training larger models.
3. Increase
commercial computing budgets via massive corporate and inter-governmental
conglomerations. These
increased budgets would extend the current compute runway in the near term,
allowing companies to tackle computing problems that previously would have been
too expensive. As a consequence of increased sales, hardware vendors would also
have incentives for innovation in more energy-efficient computing primitives,
some of which may be post-digital.
Based on the
analysis, there is currently a ten-orders-of-magnitude difference between the
energy spent on today's leading models and the energy required to solve the
largest of the three problems. This gap needs to close in order to solve all
the big problems by the end of the century with unchanged energy cost per
model.
Source: Nature