This is the fourth post in this series. The first three posts dealt with the 80’s, when I progressed through medical school and residency training into early practice, translating the theory of pain management into “the real world”.
Before I continue talking about pain management, however, there are a couple of contextual points to discuss, so this post will involve a bit of a detour off the main road.
I’ve said it before, and I’ll say it again.
Things were different in the 80’s.
Come the 90’s, the times, they were a-changing.
First, up to the early 90’s, family practice revolved around the one-to-one relationship between the doctor and the patient, with more of an emphasis on acute care.
In those “classic” health care interactions involving an acute illness, the patient was the one with the health problem. They knew something was wrong and they made a choice to visit the doctor. They brought the story of their current illness, together with their own unique combination of fears, strengths, values and preferences. Often, they had past experience with similar illnesses or events affecting themselves or their family. There were times when they had even more practical knowledge than the doctor.
The family doctor was the one they turned to for help. The doctor had medical knowledge, skill in making medical decisions, and access to any necessary investigations, referrals, and treatments.[1]
The doctor and the patient exchanged their different forms of knowledge. Ideally, decision-making was shared. In general, however, when dealing with an acute or even life-threatening problem, the doctor had more medical know-how at their fingertips than the patient. There would be limited time for the patient to get up to speed, so there was a tendency for the patient to defer to the doctor’s expertise. Some patients wanted to know more about their problem, some less. Doctors shared information based on their understanding of each patient and their preferences. Some decisions seemed important enough to require patient input and some did not. In the end, of course, patients decided whether or not to follow the doctor’s advice.
Through it all, the doctor was beholden to the patient, even when the government was paying the bills.
However, in the latter part of the 20th century, there was a gradual shift in family medicine away from the treatment of acute illness toward services related more to prevention and the effects of chronic diseases. Chronic diseases were becoming more common, in part due to the aging of the population, the increasing success in keeping people with chronic conditions alive,[2] and shifting definitions about what could be considered a disease.
Chronic diseases cannot be cured. They are controlled by medication and/or lifestyle modification. That being the case, there was a lot more time for the patient to get up to speed, and less of a tendency for the patient to defer to the doctor’s expertise. Patients could no longer simply be told what to do; they had to be persuaded. Patients and their doctors became partners. Less “paternalism”, more patient autonomy and shared decision making, as I mentioned in Part 3.
Where pain management was concerned, acute pain was managed symptomatically (while fixing the cause) and cancer pain palliatively (keeping the patient comfortable until they died). In the 80’s, chronic non-cancer pain (as in arthritis, for example) was a symptom, not a diagnosis. It was the result of chronic disease, and so the basic premise was “treat the disease to fix the pain”.
As the focus shifted from acute to chronic disease and the roles of physicians and patients changed, there was increased “consumerism”. Patients started to shop around. Increasingly, patients started to bring in additional information which they had obtained from the media, the internet, other providers (naturopaths, etc.), and private testing facilities.
This changed things, including pain management.
The family doctor was no longer the primary source for medical knowledge, skilled decision-making, referrals, and treatments.[3]
History-taking got harder. As patients came to the office with “helpful” preconceived notions about their diagnosis, the doctor needed to understand more than just the story of the patient’s illness. Now, they also had to determine how the patient’s story might have been changed by any research the patient had done.
The doctor’s role expanded to include correcting misinformation and then explaining why the patient’s proposed investigation and/or treatment plan might be impractical, unjust, inappropriate, potentially harmful or even futile.
Each encounter became more of a negotiation. The patients wanted autonomy and to be “satisfied”. The doctors had ethical obligations to do no harm, help the patient, and uphold standards of care.
Beyond that, health care consumers (formerly known as patients) were becoming less satisfied with the existing menu of services, which had been set up to save them from catastrophically expensive medical events. They started to ask for things that had previously been more a matter of personal choice and less of a medical condition.[4] In some cases, as with opiates in chronic pain, patients started to demand things that weren’t well researched, thinking that they might work, regardless of the risks.
While patient medical needs were increasing (due to aging, etc.) and patients were becoming more demanding (due to consumerism), family doctors were becoming increasingly beholden to others, most notably those who paid the bills. In Canada, that’s the government.
Seeking to contain costs, governments have always imposed limits on health care services, including Pharmacare. These limits can be fairly subtle, as when they modify the definition of what needs to be done during a medical encounter to justify billing a certain fee, or when they cut the hospital budget and thereby limit the number of hip replacements. Other times, the changes are more overt, as when they choose to de-insure specific services or categories of drugs. Sometimes the motivation is purely political, depending on who’s advocating for what.
The government wants patients to think they can get what they need (or want), but in truth they only get what the government is willing to pay for. Meanwhile, doctors jump through bureaucratic hoops to justify any “exceptions”.
In terms of pain management, where opiates play a role, the Nova Scotia Prescription Monitoring Program is one example of a well-intended but needlessly bureaucratic restriction on physician prescribing patterns.[5] I’ll talk about it in my next post.
Bearing all that in mind, in the 80’s the health solutions offered to each individual patient tended to be custom developed with the patient, for the patient. Family practice was mostly “custom tailoring”, not “off the rack”.
Starting in the 90’s and continuing over the years since, there’s been a steady rise in the number of outside bodies telling doctors, and sometimes patients as well, what they should do, often through clinical practice guidelines (CPGs).
First seen in the mid-80’s, CPGs have grown exponentially in number ever since. They started off as evidence-informed options you should consider, depending on the needs and values of the patient in front of you. However, when coupled with consumerism and the various changes in accountability, guidelines have become “standards of care”, things you must do to be considered a good doctor.
Family practice has become more of a “cookie cutter” discipline. The doctor no longer makes the dough;[6] all they do is pick one of a limited number of shapes, in consultation with the patient.
As we moved away from being accountable to the patient and building custom solutions, CPGs and bureaucracy created pressure to do the things that have been shown to benefit groups of people (including the patient in front of you), but not necessarily every individual patient.[7]
In patients with “mild” high blood pressure (diastolic 90-100 mmHg), for example, you need to treat 118 patients for 5 years to prevent one stroke. The person who didn’t have the stroke is happy, but you’ll never know which patient that is! If you did, they would be the only patient you needed to treat! The other 117 patients were treated for 5 years and got no real benefit, in terms of stroke prevention.
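That “118” is just the standard number-needed-to-treat (NNT) arithmetic, which is worth making explicit. The formula below is the usual epidemiological definition, not a figure from any particular trial:

$$
\mathrm{NNT} = \frac{1}{\mathrm{ARR}} \quad\Rightarrow\quad \mathrm{ARR} = \frac{1}{\mathrm{NNT}} = \frac{1}{118} \approx 0.85\%
$$

In other words, an NNT of 118 corresponds to an absolute risk reduction (ARR) of less than one percentage point: treatment lowers any one patient’s 5-year stroke risk by a sliver, and for everyone else the outcome is the same as if they’d never taken the pills.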
Treating mild hypertension makes sense for a group of patients. Whether or not it makes sense for the individual patient depends on many things, including their personal risk of a stroke or other complication of hypertension (family history, smoking habits, other health issues, etc.), how worried they are about strokes, how they feel about being labelled with a “chronic disease” (hypertension), how they feel about taking pills every day, what side effects those pills cause, whether or not they can afford the pills, etc.[8]
Beyond that, of course, when you lower the threshold for diagnosing and treating hypertension, with a stroke of the pen you’ve created a whole new group of chronically diseased people![9] As consumers, they’ll insist upon the best treatment!
When it came to pain management, a couple of other things were fundamentally different back in the early 90’s.
First, prescription opiates were seen as medications for treating pain.[10] Indeed, as per the WHO Pain Ladder (which was, in fact, an early CPG), opiates were essential in the treatment of pain once over-the-counter drugs stopped working: step one was non-opiate painkillers (like ASA or acetaminophen), step two added “weak” opiates like codeine, and step three moved to “strong” opiates like morphine. As a reminder, however, opiates were NOT considered useful in chronic non-cancer pain.
Beyond that, yes, there were opiate addicts in Nova Scotia, and opiates were also used “recreationally”. Any recreational opiate use probably involved prescription opiates, and most of the opiate addicts were addicted to pharmaceutical opiates. As I explained in the previous post, heroin was a rarity.
Opiate addiction was NOT seen as something to be treated with opiates, except for a small cadre of patients for whom other approaches had failed, in which case methadone seemed to be the only option. I venture to say that back then the average family doctor would NEVER write in their chart that they were knowingly prescribing opiates to an addict.[11] Knowingly supplying opiates to an addict, unless perhaps it was done in the context of weaning the patient off the opiates, would have been construed as a form of “malpractice” and therefore subject to sanction through the licensing bodies. Methadone prescribing was restricted to those with a special federal license, generally experts in addictions.
Second, there were only a few prescription opiates to choose from, and they had all been on the market for a very long time. The WHO’s Model List of Essential Drugs named nine opiates:
- One was “standardized opium”, which we can ignore, because I doubt that it saw much use anywhere but the Third World.
- Methadone was on the list, and it does work for pain, but it was rarely if ever used for pain in Canada in the early 90’s.
- The rest (codeine, dextropropoxyphene/Darvon, morphine, pethidine/Demerol, buprenorphine, hydromorphone, and levorphanol) were all available in various formulations from various manufacturers, sometimes in combination with ASA or acetaminophen.
I suspect that most of the opiates on that list were no longer patent protected by the early 90’s. As a result, Big Pharma really didn’t put much energy into marketing them. Other opiates were known to exist but hadn’t even hit the market.[12] Why bother? There were, after all, only so many people in pain, and pain meds were only supposed to be used for a defined period of time (acute pain or end-of-life palliation), so there seemed to be only limited room for growth in the market. This was about to change.
In summary, as we left the 80’s, family practice was starting to move away from a personalized relationship with the patient toward a retail/industrial model of care. Less bespoke. More mass-produced on an assembly line. The appearance of increased choice, without much latitude to actually make individual choices. More bureaucracy.[13]
Looking at pain management, there were various forces at work:
- As patients shopped around, they started to find new information about their chronic pain and new approaches to treatment.
- Meanwhile, government bureaucracy sought to reduce the overuse, misuse and diversion of opiates. This will be the subject of Part 5.
- Big Pharma got involved, touting opiates for the treatment of chronic non-malignant pain. I’ll talk about this in Part 6.
Reflecting the imbalance in those forces, the first wave of the contemporary opiate crisis, involving prescription opiates, started in the mid-to-late 1990s. It’s a sad story, neatly summarized in the following quote from the report of the Stanford-Lancet Commission:
The first wave of the contemporary opioid crisis involved prescription opioids and started in the mid-to-late 1990s, and occurred at a time when illicit markets in heroin were isolated and stable in much of Canada and the USA.
The second wave, which began around 2010, was fueled by the first, and was instigated by drug traffickers realizing that individuals addicted to prescription opioids were a fertile potential market for heroin. As traffickers expanded heroin markets, including in small cities and towns where they had never operated before, many people addicted to prescription opioids were drawn in by the comparatively low price of heroin. An analysis of pooled national data from 2002 to 2011—a period before any substantial controls on prescribing were introduced—calculated that 79·5% of Americans who initiated heroin use started with prescription opioids.
Once efforts began to stop the rise in prescriptions and to reduce the diversion of prescriptions to individuals other than the intended recipient, some people addicted to prescription opioids began shifting to heroin more rapidly than they otherwise might have.
1. For more detail, see my post entitled “Practicing Medicine: How patient stories inform medical knowledge”.
2. As an example, patients with Chronic Obstructive Pulmonary Disease (or COPD, including emphysema and chronic bronchitis) slowly deteriorate over time, often with repeated chest infections and/or hospitalizations. In the 80’s, all you could do was treat the flare-ups as they happened, so you tended to see these patients only when their condition flared up, not when they were between flare-ups. These days, there are treatment protocols to prevent the flare-ups, so you see them more regularly.
Another example would be Ischemic Heart Disease (IHD). In the early 80’s, if you had a non-fatal heart attack (MI), you landed in hospital for a few days. Barring complications, after a few weeks you went back to work, perhaps with some advice to quit smoking, get some exercise, and watch your blood pressure. There wasn’t much to do otherwise, so it was treated more or less as an acute event, hopefully resolved. By the late 80’s, it was known that beta-blockers prescribed after an MI reduced the risk of further cardiac events. This shifted the perspective to IHD being treated as a chronic disease, requiring regular doctor visits and prescription refills.
3. This phenomenon is known as “deprofessionalization”.
In general, professional groups (like lawyers, accountants and doctors) are granted a sort of “monopoly” on their area of expertise, with the understanding that they’ll maintain standards through education, certification, licensure and self-regulation. With medicine in general, and family medicine in particular, there’s been a steady erosion in professional stature, culminating in the current situation. People now self-diagnose and self-treat (often incorrectly). There’s no shortage of other groups who claim to be able to do whatever it is that family doctors do, including telehealth providers, pharmacists, nurse practitioners, midwives, chiropractors, naturopaths, counsellors, etc. It seems that our government agrees!
Meanwhile, unless you are Donald Trump, good luck telling your lawyer or accountant that you’ve looked things up on the internet and now know more than they do about how to manage your legal and financial affairs!
4. Think of diseases like male pattern baldness, erectile dysfunction, premenstrual dysphoria, or even obesity. In each case, as drugs appeared on the scene, a new disease was named, and/or treatment of the associated disease became a medical necessity. This phenomenon is known as medicalization and/or pharmaceuticalization.
The recent plan to set up national pharmacare in Canada takes this even further. Contraception has gone from being a personal choice to becoming the first thing the government will pay for, even before life-saving cancer drugs.
5. This is one form of “bureaucratization”: a steady increase in the controls placed on the medical profession over time.
6. Pun intended!
7. I’ll call this the “public health” model of health care, in which you are treating groups, not individuals.
8. In my later years of administration, I was working on a joint project with some geriatricians (specialists in care of the aged). They were concerned that older patients were getting treated according to the guidelines even when it no longer made sense to do so. For example, using cholesterol-lowering drugs to prevent heart attacks in 90-year-old patients with terminal dementia. The fact that you now need specialists to tell you to “deprescribe drugs” (i.e. to stop doing useless things) is testimony to the power of practice standards.
9. The thresholds for diagnosing and treating type 2 diabetes, hypertension, and all sorts of other conditions have gradually dropped over time, meaning that there are a lot more people diagnosed with these conditions, and a lot more drugs being prescribed.
10. They also saw limited use as cough suppressants. However, in most cases, using opiates to stifle a cough seemed “excessive”.
11. I did peer review back in the day, and so I read a lot of patient charts in a lot of offices. Even the doctors who DID prescribe a lot of opiates NEVER acknowledged that they knew the patient was an addict. They almost always said they were treating pain or really bad coughs.
12. Oxycodone, for example, was first synthesized in 1916, and first used on patients in 1917. It saw clinical use in Europe for a while (Adolf Hitler’s medical records show that he received more than a few doses of it!). It never caught on in North America, however, until Purdue decided to reformulate it as OxyContin.
13. And we wonder why family practice is losing its appeal as an occupation!