Tuesday, December 4, 2018

Doctors notching wins in a war that can't be won

It is incumbent upon us to promote the desirability of alternative solutions to conflict




At the recent Armistice Day centenary commemorations, I had the privilege of attending a splendid rendition of the concert An Australian War Requiem in Sydney’s Town Hall.
The requiem to the nation’s fallen soldiers in World War I is in three tableaux; the first concerns the horror of war, the second focuses on sons and mothers, and the third — and most dramatic — is a reflection on loss.
The event led me to reflect on the far-reaching role doctors have played throughout history in times of war, and continue to play when it comes to human horrors.
Our role is multifaceted and spans the breadth of war.
Our first contribution, of course, is to do our best to heal those who are physically and mentally injured by war, and to understand, as best we can, the awful circumstances they have endured.
One such doctor was New Zealand ENT surgeon Sir Harold Gillies. Working on the Great War’s Western Front, he witnessed attempts to repair the ravages of facial injuries and, as a result, became a pioneer in plastic surgery.1


A plaque honours Sir Harold Gillies. Photo: Simon Harriyott/Wikimedia Commons. https://bit.ly/2TBAgLu

He opened a hospital in the UK after the Battle of the Somme in 1916, where he treated thousands of cases of jaw and facial mutilation.
In World War II, Australian surgeon Sir Ernest Edward ‘Weary’ Dunlop was renowned for his leadership while being held prisoner by the Japanese.
He was hailed by other POWs in the prison camps and jungle hospitals on the Burma-Thailand railway for being “a lighthouse of sanity in a universe of madness and suffering”.2


Kanchanaburi war cemetery, where thousands of Allied POWs who died on the notorious Thailand to Burma death railway are buried.

Our second contribution comes in the aftermath of war when we look at the reasons for negative behaviour in affected servicemen and women and try to assist them if they emerge shell-shocked or, in modern parlance, with PTSD.
Research suggests that even medieval soldiers suffered from the psychological impact of war despite their training from a young age and being surrounded by death.
In 15th-century France, people believed that warfare caused a kind of madness and soldiers who went “berserk” were celebrated. However, non-combatants who were traumatised by war were pitied or ridiculed.3
Today PTSD remains a major therapeutic challenge, and not only for war veterans. A Google search of the phrase ‘Australian doctors treating PTSD’ yields a vast number of entries concerning psychiatric treatment, medication, lifestyle and other treatments.
Finally, we are bound to share our medical insight into the real cost of war with our communities.
Doctors in the US recently engaged in the modern-day version of this duty of care when, just a few days before the Armistice Day centenary, they took to Twitter to reveal the day-to-day horror of the country’s gun crime.4
Fuelled by the US gun lobby’s call for doctors to “stay in their lane” on the country’s gun control debate, dozens of emergency care medics posted photos of themselves working while covered in patients’ blood as a visual reminder of the human cost of America’s shooting epidemic.
Away from the front line, numerous doctors have also been honoured for championing a reduction in the engines of war.
SA palliative care physician Professor Ian Maddocks was president of the Medical Association for Prevention of War when it received an Australian Peace Medal, and vice-president of the International Physicians for the Prevention of Nuclear War when it received the Nobel Peace Prize in 1985.5
And, more recently, Professor Tilman Ruff, an infectious disease and public health doctor from the University of Melbourne, was chair of the International Campaign to Abolish Nuclear Weapons — the recipient of the 2017 Nobel Peace Prize.6
It is important for doctors, who see the damage, to make clear to our communities and their political representatives the absolute desirability of finding alternative solutions to conflict.
But the reality is that in a world of scarce resources, war will, unfortunately, likely remain part of the human condition. We must therefore remain ready to meet its challenges.
Requiescat in pace.

Professor Leeder is an emeritus professor of public health and community medicine at the Menzies Centre for Health Policy and School of Public Health, University of Sydney. 

Published in Medical Observer 26 November 2018
https://www.medicalobserver.com.au/views/doctors-notching-wins-war-cant-be-won


Tuesday, November 13, 2018

It's counter-intuitive but getting healthier should make us worry




Unlike the public health gains of the past, modern programs have saved lives without bringing prosperity, leaving more poverty, unemployment and a rapid rise in chronic illness in their wake, writes a leading public health physician
Confronted with challenges in our daily work, reinforced by depressing news about global politics, it is understandable that we feel like hunkering down and concentrating on local distractions, such as family and valuable friendships — not that these ‘distractions’ are uniformly free of challenges, of course.
But what is occurring globally establishes a context within which we all work and there can be value in looking above the parapet, keeping an eye open for a sniper in the distance.
Adjunct Professor Thomas Bollyky, a senior fellow and director of the Global Health Program at the Council on Foreign Relations in Washington DC, has written a challenging essay in the latest issue of the council’s publication Foreign Affairs. The article, entitled ‘Health without wealth’, is both informative and deeply disturbing: in it, Bollyky addresses “the worrying paradox of modern medical miracles”.
“For the first time in recorded history,” he states, “bacteria, viruses and other infectious agents do not cause the majority of deaths or disabilities in any region of the world.” That distinction now belongs to chronic illness.
Bollyky takes an historical view of how the vast improvements in health of the past century occurred in the US and Europe as a result of controlling communicable diseases.
These measures included “government-mandated measures — such as milk pasteurisation, laws against overcrowded tenements, and investments in clean water and sewage treatment systems — and better social norms around hygiene, childcare and girls’ education”. Half of the improvement in life expectancy in developing countries between World War II and 1970 was due to these means — and not to antibiotics and immunisation, Bollyky claims.
These public health measures had a strong relationship with prosperity. They occurred because governments invested in water and sewers and public housing. And, in return, a healthier workforce contributed to prosperity.
Big cities, the engines of innovation and achievement, became increasingly affluent because of rising productivity and were able to complete the circle of public health investment leading to societal economic benefit.


Modern-day public health programs tackling the likes of malaria and HIV, and delivering childhood immunisation, have undoubtedly saved lives and increased life expectancy, but they have failed to increase prosperity. “The recent hard-won gains threaten to bring a host of new and destabilising problems,” Bollyky writes.
This is because such programs, which are often paid for by outside agencies rather than local governments, have not automatically led to greater productivity, more employment or the expansion of local health services. In fact, the situation has led to more poverty, unemployment and a rapid rise in chronic illnesses as a result of changes to food supply, greater availability of tobacco and housing shortages.
In this setting, chronic disorders have flourished and a new generation of peri-urban slums has developed.
In Australia, we enjoy the ability to treat these conditions and have succeeded in pushing many of them into the senior years. But think back to the 1950s and ’60s, when our therapeutic abilities were far more limited and death from an MI or stroke was common among middle-aged men. That is how it is in many less economically developed nations now: a huge loss of productive workforce in middle age.
Bollyky states that deaths from hypertensive heart disease among people under 60 have increased by nearly 50% in sub-Saharan Africa in the past 25 years.
“In 1990, heart disease, cancer and other non-communicable diseases caused about a quarter of deaths and disabilities in poor countries,” he adds. “By 2040, that number is expected to jump as high as 80% in countries that are still quite poor.”
The remedy, Bollyky suggests, is a more comprehensive approach to international aid — ensuring that investments help countries to improve their healthcare systems, make their cities more liveable and “enable their companies to employ more people more productively”.
While our preventive approaches to chronic illness in Australia could do with more money, we can be thankful that we have healthcare that enables us to manage chronic ailments, especially through general practice.
However, in this global era of chronic illness, it would be wise for governments to remember that spending on healthcare and enjoying prosperity are two sides of the same coin.

Professor Stephen Leeder is an emeritus professor of public health and community medicine at the Menzies Centre for Health Policy and School of Public Health, University of Sydney.

Published in the Medical Observer 18 October 2018 https://bit.ly/2PwaJVK

Paying for Performance


Published in Australian Medicine.  15 October 2018.  https://bit.ly/2qNjEmN

Tuesday, October 9, 2018

A revolution is coming, warns emeritus professor


You're out of touch if you believe medicine will remain unscathed amid the rise of artificial intelligence, he insists
So you thought that My Health Record was complicated and risky? In the digital revolution, it is chicken feed. 
Steady yourself and gird your loins, because artificial intelligence (AI) is the big game that’s coming to town soon and it can be challenging.
How close are we? An observation from the IT world known as Moore’s law holds that the number of transistors that can be placed on a single integrated circuit doubles about every two years.
It was named after Gordon Moore, one of the co-founders of Intel, following his 1965 paper. In practice, it means that a chip the same size as the one you were using two years ago has roughly twice the capacity today.
According to Google, your smartphone has enough computing power to fire a person to the moon.
How long Moore’s law will continue to hold is unknown, because the space on an integrated circuit is finite, but today’s most powerful computers are said to match the processing power of the human brain.
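For readers who like to see the arithmetic, here is a minimal sketch of that doubling rule. The starting figure of 2,300 transistors (roughly the first commercial microprocessor) is an illustrative assumption of mine, not a figure from Drum’s essay.

    # A minimal sketch of Moore's law as described above: transistor counts
    # double roughly every two years. The starting count is illustrative only.
    def projected_transistors(initial: float, years: float, doubling_period: float = 2.0) -> float:
        """Projected transistor count after `years`, doubling every `doubling_period` years."""
        return initial * 2 ** (years / doubling_period)

    # A 2,300-transistor chip projected forward 50 years grows by a factor
    # of 2**25, or about 33 million.
    print(f"{projected_transistors(2_300, 50):,.0f}")  # ~77,175,193,600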
In the August issue of Foreign Affairs magazine, Kevin Drum, a 60-year-old Californian political blogger and columnist who knows a lot about Silicon Valley, credits the immense social progress of the 19th century to the Industrial Revolution.  
“Without it, there’s no capitalist revolution because agrarian states don’t need one. Without it, there’s no rising middle class and no real pressure for democracy,” he wrote in an essay called ‘Welcome to the Digital Revolution'.
“The key drivers of this era were the steam engine, germ theory, electricity and railroads.”
And now? The computers to support AI are ready.
Their power is measured in floating-point operations per second, known in the trade as ‘flops’, which basically means that they work very fast.
The capacity of the human brain is said to be somewhere between 10 and 100 petaflops. At the upper end, that is 100,000,000,000,000,000 operations per second.
According to Mr Drum: “A computer with this capacity, unfortunately, is the size of a living room, costs $200 million and generates electricity bills of about $5 million (a year).” 
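To put those units in perspective, here is a quick arithmetic sketch. The cost-per-petaflop figure at the end is simply derived from the $200 million, brain-scale machine Drum describes, on the assumption that it delivers about 100 petaflops.

    # Quick unit arithmetic for the figures quoted above.
    # One petaflop is 10**15 floating-point operations per second.
    PETAFLOP = 10 ** 15

    brain_estimate = 100 * PETAFLOP  # upper-end estimate of the brain's capacity
    print(f"{brain_estimate:.0e} operations per second")  # 1e+17

    # Assumed: the $200 million machine described above delivers ~100 petaflops.
    print(f"${200_000_000 / 100:,.0f} per petaflop")  # $2,000,000 per petaflop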
Software development is critical and AI experts say there is a 50% chance that AI will be able to perform all human tasks by 2060, he adds.
“The digital revolution is going to be the biggest geopolitical revolution in human history”, he says, adding that PricewaterhouseCoopers has predicted that 38% of all jobs in the US are at high risk of automation by the early 2030s. 
The effects on human employment will be profound.
Within a decade, he says, long-haul truck drivers will be displaced by driverless technology and similar technology will knock out the jobs the displaced drivers might have taken up. We need new politics.
Anyone imagining that medicine and medical practice will not be profoundly altered is out of touch. 
Our eldest son Nick, a vice-president with Google, recently told me that the AI behind driverless cars is now sophisticated enough to engage in ethical reasoning.
For example, how should an AI-driven vehicle respond to an impending crash where either the humans in that car, or the colliding vehicle, will sustain a fatal injury? With sacrifice, altruism or self-interest?
And, if ethical reasoning can be used by AI for driving, then why not in medicine?
The two most important developments for the 21st century will be AI-driven mass unemployment and fossil-fuel-driven climate change, Mr Drum says.
A glimmer of hope is that AI might be able to solve climate change by scaling up wind and solar power.  
But what about medicine? Now, there’s the challenge for us doctors. 
At the very least, our medical education should accommodate more about the interface between practice and AI.
This must go way beyond the simplicities of how to use IT to include debating and considering the implications for what we do as doctors in this brave new world. 
What will ethical practice mean and how will we relate to AI in this pursuit? 
It’s time for a lot of serious and creative thinking. 
Source: Foreign Affairs 2018, online.
Published in The Medical Observer 13 August 2018 https://bit.ly/2PnMULp

Tuesday, September 25, 2018

TB and HIV - still miles to go


Published in Australian Medicine September 17 2018 https://bit.ly/2pja5uY


Tuesday, August 14, 2018

An important reminder that we must never forget Nazi doctors


THE LAST WORD


Dr Hans Asperger (1906-1980), the Viennese academic paediatrician best known for his contributions to our understanding of autism and related conditions, has been revealed as a Nazi sympathiser. 
The revelations were contained in the results of an eight-year study of Dr Asperger published in the April edition of Molecular Autism.
The research was carried out by Dr Herwig Czech, a Holocaust scholar from the Medical University of Vienna, who concluded Dr Asperger failed to protect his young patients from the Nazis’ euthanasia program. 
In fact, Dr Asperger frequently referred children with what we now call autism and similar conditions to a Nazi clinic for children with disabilities, where those judged to be a burden to their parents and the state were caught up in the euthanasia program.
Somehow, Dr Asperger managed to sidestep criticism for his close association with the Nazi regime and continued practising as a respected clinician for decades after World War II.
However, his actions and those of other doctors who carried out medical atrocities in Nazi Germany led to a global movement among doctors to stop this from ever happening again.
The result was the establishment of the World Medical Association, which aimed to restate the ethical basis for the practice of humane medicine. It achieved this in the Declaration of Geneva published in 1948. 
The declaration provides doctors around the world with a code of ethics. They pledge not to permit “considerations of age, disease or disability, creed, ethnic origin, gender, nationality, political affiliation, race, sexual orientation, social standing or any other factor to intervene between my duty and my patient”.
The declaration also demands doctors “respect the autonomy and dignity” of their patient. But has it worked? Since it was introduced, we have not heard of anything on the scale of the human experimentation and euthanasia carried out by doctors working under the Nazis.
However, there have been cases. The so-called enhanced interrogation techniques, including waterboarding, were widely used on terrorist suspects rounded up by the CIA in the aftermath of 9/11.
While the torture methods were developed and inflicted on detainees by psychologists (contracted by the CIA) rather than doctors, groups such as Physicians For Human Rights claim doctors were complicit in what was happening by monitoring the health of those being tortured.
This included using a pulse oximeter to track the effectiveness of respiration during waterboarding. The group suggests this was a way for doctors to “calibrate physical and mental pain and suffering”.
More than a decade on, no medical professional has been held to account for their involvement in this dark chapter of American history, the group says. 
With this in mind, rather than curse the medical ethics committees that delay research, we should be grateful for these necessary checks and balances. And remind ourselves of the reasons why they came into existence.  

Published in the Medical Observer, 8 June 2018

Why we cannot allow machines to take over



The digitisation of medicine is having a negative impact by eclipsing the human side of medicine, writes Professor Stephen Leeder.


"There are times when the diagnosis announces itself as the patient walks in, because the body is, among other things, a text,” says Professor Abraham Verghese, professor for the theory and practice of medicine at Stanford University Medical School, California.
Writing in the New York Times (16 May), he adds: “I’m thinking of the icy hand, coarse dry skin, hoarse voice, puffy face, sluggish demeanour and hourglass swelling in the neck — signs of a thyroid that’s running out of gas. This afternoon the person before me in my office isn’t a patient but a young physician; still, the clinical gaze doesn’t turn off and I diagnose existential despair.”
The state of the US healthcare system, which means doctors no longer care for real patients, is the root cause of this young doctor’s despair, Professor Verghese says.
Similar cases of burnout are not uncommon in Australia where heavy workloads, long hours and administrivia are increasingly taking doctors away from the essential task of meeting and treating people, not printouts.
His essay, ‘How Tech Can Turn Doctors into Clerical Workers’, goes on to describe how patients lying in hospital beds are just “place-holders”, while the work of doctoring now occurs with virtual patients who reside inside computers.
“Old-fashioned ‘bedside’ rounds conducted by the attending physician too often take place nowhere near the bed but have become ‘card flip’ rounds (a holdover from the days when we jotted down patient details on an index card) conducted in the bunker, seated, discussing the patient’s fever, the low sodium, the abnormal liver-function tests, the low ejection fraction, the one of three blood cultures with coagulase negative staph that is most likely a contaminant, the CT scan reporting an adrenal ‘incidentaloma’ that now begets an endocrinology consult and measurements of serum cortisol,” he writes.
“The living, breathing source of the data and images we juggle, meanwhile, is in the bed and left wondering: Where is everyone? What are they doing? Hello! It’s my body, you know!”
This is how the disillusioned young doctor before him has ended up as the highest-paid clerical worker in the hospital, says Professor Verghese, adding that for every hour a doctor in the US spends with a patient, they spend nearly two with the electronic medical record. I doubt these figures are much different in Australia.
Of course, we can’t blame the rise of electronics solely for the rise in doctor burnout. There are other factors at play, such as the increasing load of older, complex patients, for whom our health system, with its strict divide between hospital and general practice, is struggling to provide integrated care.
But it’s clear the digitisation of medicine is having a negative impact by eclipsing the human side of medicine.
In a recent edition of the ABC’s Life Matters, two anaesthetists who had had cancer were interviewed about their experiences as patients.
They spoke of the shock of diagnosis, the high-quality therapy they received and their eventual return to practice.
Both identified sensitive care as the most important element in their journey to recovery. They also noted how the time pressures of modern medicine easily exclude it.
Not having time to listen and interact closely with patients can lead doctors to emotional exhaustion, cynicism and resignation.
“True clinical judgement is more than addressing the avalanche of blood work, imaging and lab tests; it is about using human skills to understand where the patient is in the trajectory of a life and the disease, what the nature of the patient’s family and social circumstances is and how much they want done,” Professor Verghese points out.
“So let’s not be shy about what we do and ought to do and must be allowed to do, about what our patients really need.”
It is more important than ever for doctors to speak out about the caring element of the profession. For, if patients come to us for technical help and care and we skimp on one because we are so pressed for time, they will eventually seek help from a different health professional.
Just look at the billions of dollars Australians spend on alternative medicine each year, which suggests that they are already seeking treatment, and care, from others while we busily attend to machines.



Published in The Medical Observer.  19 June 2018.

Integrated healthcare - Building health systems for the future


Scientia. 5 June 2018. https://bit.ly/2MpK2iY









Wednesday, January 3, 2018

A little bit of sugar may (or may not) make the weight go down.

The statistics do not support the view that there are big differences in sugar consumption between the fat and the thin.  We need to define our enemy clearly in the battle against obesity.

The Sydney Morning Herald has announced a war on sugar. Its rationale is that we need to combat obesity with all its attendant ills.  Good thinking.  Sugar might appear to be easy pickings. Beware.
It is important that individual sugar consumption not be cast as the behaviour that we must attack with all our might. That will be a waste of energy – no pun intended – and leave the real changes essential for reversing our current trend to a fatter, less healthy community untouched.
Just how crucial is sugar to obesity? A study of 132,479 individuals in the UK, published in the International Journal of Epidemiology in 2016, analysed their consumption of macronutrients (fat, protein, starch and sugar) and compared how much of the energy in the diets of obese versus non-obese individuals came from each of these categories. The participants had been assembled for the UK Biobank genetic study, and the analysis made use of the comprehensive health data collected on all of them.
Anderson and Pell, the lead authors of the study from the University of Glasgow, made the point that in this study ‘dietary intake was self-reported outside the clinic, which may encourage more truthful reporting, and was collected using a 24 hour recall questionnaire which produce more accurate results than a food frequency questionnaire (the usual approach adopted in large-scale studies)’.  Their general conclusion was ‘66.3% of men and 51.8% of women were overweight/obese.’
Anderson et al wrote: "Compared with [those participants with] normal BMI, obese participants had 11.5% higher total energy intake and 14.6%, 13.8%, 9.5% and 4.7% higher intake from fat, protein, starch and sugar, respectively." So while the fat folk were consuming more energy than the thin, the excess due to sugar intake between the two groups was quite small. ‘There is only a weak correlation between absolute energy derived from sugar and from fat. Therefore, targeting high sugar consumers will not necessarily target high consumers of fat and overall energy.’
They concluded "fat is the largest contributor to overall energy. The proportion of energy from fat in the diet, but not sugar, is higher among overweight/obese individuals. Focusing public health messages on sugar may mislead on the need to reduce fat and overall energy consumption."
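To make the comparison concrete, the excess-intake figures quoted above can be lined up side by side; the numbers below are simply those reported by Anderson et al, restated for illustration.

    # Excess intake in obese versus normal-BMI participants, as quoted above (per cent higher).
    excess = {"fat": 14.6, "protein": 13.8, "total energy": 11.5, "starch": 9.5, "sugar": 4.7}

    # Sugar shows the smallest relative excess and fat the largest -- the authors' central point.
    for source, pct in sorted(excess.items(), key=lambda kv: kv[1], reverse=True):
        print(f"{source:>12}: +{pct}%")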
Do these observations mean that sugar needs no attention in our approach to obesity? No, but it cannot be said to be the main game. Unlike tobacco, a single and inessential commodity, sugar cannot be banned completely, nor would such an approach be desirable. A sugar tax would make all sugar-containing foods and drinks more expensive, and hence less accessible to less affluent consumers, so it needs careful calibration against the criterion of equity. Anderson et al also warn of the tendency to substitute one source of energy for another; if the substitute is fat, then we are no further ahead.
The power of the sugar industry (cane, corn and beet) is immense, and it is far from squeaky clean when it comes to promoting a healthy diet. It is at the level of production and marketing that our attention needs to be focused in encouraging a healthier approach to sugar.
Encouraging individuals to lobby for less sugar in processed foods and drinks will not be easy, and blood will be spilt as that battle plays out. But it is there, and not in beating up individuals to reduce their own consumption of sugar (desirable, but neither necessary nor sufficient), that our efforts should be applied.
It is interesting that, in an international comparison of cost-effective ways of reducing obesity, McKinsey and Co, a consultancy, nominated reducing portion size as the best.  Given the nearly 12% difference in total energy intake between the obese and non-obese participants in this study, reducing the size of meals we eat by 10% would seem a wise recommendation – without worrying too much about macronutrients such as sugar.

Monday, January 1, 2018

SOCIAL CAUSES OF ILLNESS ARE NOT IMMUTABLE: THEY ARE AMENABLE TO CHANGE

Modifying our own behaviour in health-promoting directions is sensible, but for sustainable, nation-wide change we need to take action of a different kind.

Far from being a cause for despair, the insight that a lot of illness in our society derives from the environment we have created should give us enthusiasm to use the New Year to achieve better health generally.  As John Kennedy stated, “Our problems are man-made; therefore they may be solved by man”.  Or woman.

Discounting for Kennedy’s hyperbole, social determinants of illness are within our power to modify to our advantage. And while changing individual behaviour (less sugar, less salt, more exercise etc) is commendable, there are things that we can do in the community to make those individual behaviour changes easier for everyone and more likely to be sustained.

An interesting comparison between our globalised and immensely successful society and that of the Roman Empire is drawn by Kyle Harper, vice-president of the University of Oklahoma, in a recent essay entitled ‘How climate change and disease helped the fall of Rome’ (https://aeon.co/ideas/how-climate-change-and-disease-helped-the-fall-of-rome). He wrote:

The decisive factor in Rome’s biological history was the arrival of new germs capable of causing pandemic events.

The empire was rocked by three such intercontinental disease events. The Antonine plague coincided with the end of the optimal climate regime, and was probably the global debut of the smallpox virus. The empire recovered, but never regained its previous commanding dominance. Then, in the mid-third century, a mysterious affliction of unknown origin called the Plague of Cyprian sent the empire into a tailspin. Though it rebounded, the empire was profoundly altered – with a new kind of emperor, a new kind of money, a new kind of society, and soon a new religion known as Christianity. Most dramatically, in the sixth century a resurgent empire led by Justinian faced a pandemic of bubonic plague, a prelude to the medieval Black Death. The toll was unfathomable – maybe half the population was felled.

He points to the critical role of infectious disease and natural changes in climate in weakening the Roman Empire to near collapse. Plague, especially, was brought by rats from the east in boats carrying trading goods to Roman ports. But the authorities and the population were ignorant of the causes of these afflictions and powerless to control them. More generally, as urban migration surged, endemic infectious disease in the Empire’s crowded cities diminished the productivity of the nation. The average life expectancy of a Roman citizen was in the 20s.

An immense difference exists between the knowledge possessed by our technologically advanced societies and that of the Roman Empire. We have knowledge that enables prevention and therapy for many of our health problems. Our success through sanitation, immunisation and vastly improved nutrition, as well as an impressive armamentarium of medical and surgical therapies, means that our life expectancy is around four times that of Rome.

But clearly we are not problem-free. We are left with a hefty rump of deeply troubling problems: the major degenerative disorders of diabetes, heart disease and stroke, as well as cancer, physical trauma, musculoskeletal disorders, drug dependence and mental illness. Unlike the epidemics of Rome, whose origins were a mystery, we know a vast amount about the causes of these ailments. And the causes, while often complex and shrouded in the economics and behaviour of our society, are, as Kennedy suggests, soluble by humans. Three courses of action command our attention.

First, to move the settings on the dials that govern the way we live, the population needs to be convinced that the move has merit. For this to happen in relation to health, the community first needs to be clear that proposed changes in our national diet and exercise patterns, for example, make sense and are potentially beneficial.

Health messaging is needed that moves beyond recommending individual behaviour change and instead points to how such things as sugar taxes, if lobbied for effectively, will enable many people to lower their sugar consumption.  The paradox is this: if lots of people decrease their consumption of alcohol, tobacco or sugar just a bit, the likely benefits are greater than if a few people go to extremes.  Advertising agencies could assist in marketing that insight.
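That paradox is easy to check with a little arithmetic. The figures below are invented purely to illustrate the point; they are not drawn from any study.

    # Illustrative arithmetic for the 'many people, small change' paradox above.
    # All figures are hypothetical.
    population = 1_000_000
    baseline_sugar_g_per_day = 60  # assumed average intake

    # Scenario A: everyone cuts intake by 5 per cent.
    total_cut_a = population * baseline_sugar_g_per_day * 0.05

    # Scenario B: 5 per cent of people make a drastic 50 per cent cut; the rest are unchanged.
    total_cut_b = population * 0.05 * baseline_sugar_g_per_day * 0.50

    print(f"Scenario A removes {total_cut_a:,.0f} g of sugar per day across the population")  # 3,000,000
    print(f"Scenario B removes {total_cut_b:,.0f} g of sugar per day across the population")  # 1,500,000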

Second, there is a place for political leadership rather than followship.  By this I mean the kind of ‘out there’ behaviour that we saw from John Howard in relation to gun control and from Neal Blewett in regard to HIV/AIDS – pushing the agenda for change.  Politicians can only go as far as the community will permit and so this point is heavily dependent on the first. 

Third, the industrial and commercial interests that dominate our economic environment should be commended when they make moves to reduce the hazards in our environment – by offering food choices in our markets that are less injurious, cutting down on portion sizes in restaurants, and attending to equity of access to fresh food in rural, remote and Aboriginal communities.

Simply because disease is socially determined we are not rendered impotent in dealing with it.  If we take the correct messages from this insight and contribute to the large social changes needed for effective prevention, then 2018 will be an important year in preserving the health of our nation.