Tuesday, November 13, 2018
It's counter-intuitive but getting healthier should make us worry
Unlike those of the past, modern public health programs have been accompanied by more poverty, unemployment and a rapid rise in chronic illnesses, says a leading GP
Confronted with challenges in our daily work, reinforced by depressing news about global politics, it is understandable that we feel like hunkering down and concentrating on local distractions, such as family and valuable friendships — not that these ‘distractions’ are uniformly free of challenges, of course.
But what is occurring globally establishes a context within which we all work, and there can be value in looking above the parapet, keeping an eye open for a sniper in the distance.
Adjunct Professor Thomas Bollyky, a senior fellow and director of the Global Health Program at the Council on Foreign Relations in Washington DC, has written a challenging essay in the latest issue of council publication Foreign Affairs. The article is both informative and deeply disturbing. Entitled ‘Health without wealth’, Bollyky addresses “the worrying paradox of modern medical miracles”.
“For the first time in recorded history,” he states, “bacteria, viruses and other infectious agents do not cause the majority of deaths or disabilities in any region of the world.” This infamy belongs to chronic illness.
Bollyky takes a historical view of how the past century’s vast improvements in health occurred in the US and Europe as a result of controlling communicable diseases.
These measures included “government-mandated measures — such as milk pasteurisation, laws against overcrowded tenements, and investments in clean water and sewage treatment systems — and better social norms around hygiene, childcare and girls’ education”. Bollyky claims that half of the improvement in life expectancy in developing countries between World War II and 1970 was due to these measures, not to antibiotics and immunisation.
These public health measures had a strong relationship with prosperity. They occurred because governments invested in water and sewers and public housing. And, in return, a healthier workforce contributed to prosperity.
Big cities, the engines of innovation and achievement, became increasingly affluent because of rising productivity and were able to complete the circle of public health investment leading to societal economic benefit.
Modern-day public health programs to address the likes of malaria, HIV and child immunisation have undoubtedly saved lives and increased life expectancy, but they have failed to increase prosperity. “The recent hard-won gains threaten to bring a host of new and destabilising problems,” Bollyky writes.
This is because such programs, which are often paid for by outside agencies rather than local governments, have not automatically led to greater productivity, more employment or the expansion of local health services. In fact, the situation has led to more poverty, unemployment and a rapid rise in chronic illnesses as a result of changes to food supply, greater availability of tobacco and housing shortages.
In this setting, chronic disorders have flourished and a new generation of peri-urban slums has developed.
In Australia, we enjoy the ability to treat these conditions and we have succeeded in pushing many of them into the senior years. But think back to the 1950s and ’60s, when our therapeutic abilities were far more limited and death from an MI or stroke was common among middle-aged men. That is how it is in many less economically developed nations now: a huge loss of productive workforce in middle age.
Bollyky states that deaths from hypertensive heart disease among people under 60 have increased by nearly 50% in sub-Saharan Africa in the past 25 years.
“In 1990, heart disease, cancer and other non-communicable diseases caused about a quarter of deaths and disabilities in poor countries,” he adds. “By 2040, that number is expected to jump as high as 80% in countries that are still quite poor.”
The remedy, Bollyky suggests, is a more comprehensive approach to international aid — ensuring that investments help countries to improve their healthcare systems, make their cities more liveable and “enable their companies to employ more people more productively”.
While our preventive approaches to chronic illness in Australia could do with more money, we can be thankful that we have healthcare that enables us to manage chronic ailments, especially through general practice.
However, in this global era of chronic illness, it would be wise for governments to remember that spending on healthcare and enjoying prosperity are two sides of the same coin.

Professor Stephen Leeder is an emeritus professor of public health and community medicine at the Menzies Centre for Health Policy and School of Public Health, University of Sydney.
Published in the Medical Observer 18 October 2018 https://bit.ly/2PwaJVK
Tuesday, October 9, 2018
A revolution is coming, warns emeritus professor
You're out of touch if you believe medicine will remain unscathed amid the rise of artificial intelligence, he insists
So you thought that My Health Record was complicated and risky? In the digital revolution, it is chicken feed.

Steady yourself and gird your loins, because artificial intelligence (AI) is the big game that’s coming to town soon, and it will be challenging.
How close are we? Moore’s law, a well-known observation in the IT world, holds that the number of transistors that can be placed on a single integrated circuit doubles about every two years.

It was named after Gordon Moore, one of the co-founders of Intel, following his 1965 paper. It means that a circuit the same size as the one you were using two years ago has doubled its capacity today.
According to Google, your smartphone has enough computing power to send a person to the moon.
How long Moore’s law will continue to hold is unknown, because the space on an integrated circuit is finite, but we do know that today’s computers have the same processing power as the human brain.
In the August issue of Foreign Affairs magazine, Kevin Drum, a 60-year-old Californian political blogger and columnist who knows a lot about Silicon Valley, credits the immense social progress of the 19th century to the Industrial Revolution.

“Without it, there’s no capitalist revolution because agrarian states don’t need one. Without it, there’s no rising middle class and no real pressure for democracy,” he wrote in an essay entitled ‘Welcome to the Digital Revolution’.
“The key drivers of this era were the steam engine, germ theory, electricity and railroads.”

And now? The computers to support AI are ready.
Their power is measured in floating-point operations per second, known in the trade as ‘flops’, which basically means that they work very fast.

The capacity of the human brain is estimated at about 10-100 petaflops. At the upper end, that is 100,000,000,000,000,000 operations per second.
According to Mr Drum: “A computer with this capacity, unfortunately, is the size of a living room, costs $200 million and generates electricity bills of about $5 million a year.”

Software development is critical, and AI experts say there is a 50% chance that AI will be able to perform all human tasks by 2060, he adds.
“The digital revolution is going to be the biggest geopolitical revolution in human history,” he says, adding that PricewaterhouseCoopers has predicted that 38% of all jobs in the US are at high risk of automation by the early 2030s.

The effects on human employment will be profound.

Within a decade, he says, long-haul truck drivers will be displaced by driverless technology, and similar technology will knock out the jobs the displaced drivers might have taken up. We need new politics.
Anyone imagining that medicine and medical practice will not be profoundly altered is out of touch.

Our eldest son Nick, a vice-president with Google, recently told me that the AI behind driverless cars is now sophisticated enough to engage in ethical reasoning.

For example, how should an AI-driven vehicle respond to an impending crash in which either the humans in that car or those in the colliding vehicle will sustain a fatal injury? With sacrifice, altruism or self-interest?

And if ethical reasoning can be used by AI for driving, then why not in medicine?
The two most important developments of the 21st century will be AI-driven mass unemployment and fossil-fuel-driven climate change, Mr Drum says.

A glimmer of hope is that AI might be able to solve climate change by scaling up wind and solar power.

But what about medicine? Now, there’s the challenge for us doctors.

At the very least, our medical education should accommodate more about the interface between practice and AI.

This must go way beyond the simplicities of how to use IT, to include debating and considering the implications for what we do as doctors in this brave new world.

What will ethical practice mean, and how will we relate to AI in this pursuit?

It’s time for a lot of serious and creative thinking.
Source: Foreign Affairs 2018, online.
Published in The Medical Observer 13 August 2018 https://bit.ly/2PnMULp