OPINION:
Researchers with Washington University in St. Louis conducted a bias test to see why there are far fewer females than males serving as medical surgeons — and why those who do serve earn significantly less in salary — and came up with this finding: ’Cause most everybody thinks men are better suited than women to wield the surgical knives.
Way to further the gender wars, Washington U. But seriously, the findings aren’t just general interest; they come at a time when technology is making inroads in all-things-medical.
And if one of the biggest challenges in Big Tech is to overcome inherent biases in the modeling systems — then even a simple survey showing people automatically prefer men to women for their surgical needs takes on a new light.
If doctors themselves hold biases based on sex, then how will that affect the artificial intelligence used in the medical field that comes from their own data?
In an article entitled “The 3 ways A.I. could worsen health disparities (and how providers can fight back),” published in February, the Advisory Board reported that one of the major challenges facing the medical community right now is recruiting women and minorities for research.
Another?
Developing technology that doesn’t “worsen disparities” between certain demographic groups, the Advisory Board went on to report.
Yet another?
Making sure that these biases don’t become accepted as truths and then built right into the medical world’s A.I.-driven diagnostic and testing systems.
“The risk with A.I. is that these biases become automated and invisible,” wrote Dhruv Khullar, a physician and researcher, in a New York Times opinion piece. “[The key is] being aware of the potential for bias and guarding against it.”
OK, granted: that’s all for A.I. that deals with the doctor-patient relationship — the A.I. that pertains specifically to diagnosing and treating medical conditions based on technologically advanced findings.
But if those at the forefront of making the health care decisions that the A.I. world ultimately relies upon to create this Brave New Technological Medical World are themselves inherently biased — well then, wouldn’t that just further the bias in the machine-learned systems?
Of course it would.
How could it not?
And if women aren’t seen as players equal to men in the field of surgery, the fact is, this is a bias that could ultimately shape the outcomes of surgical decisions.
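To make that point concrete: here is a minimal sketch — entirely hypothetical, using synthetic data and a toy logistic regression in Python, not anything drawn from the studies cited here — of how a model trained on biased historical decisions quietly reproduces that bias.

# Hypothetical illustration: a model trained on biased human decisions
# learns the bias. Synthetic data only; not from any cited study.
import numpy as np

rng = np.random.default_rng(0)
n = 10_000

# Synthetic candidates: ability is distributed identically for both groups.
gender = rng.integers(0, 2, n)      # 0 = male, 1 = female
skill = rng.normal(0, 1, n)         # true ability, same for everyone

# Biased historical labels: past evaluators discounted women's skill.
bias_penalty = 0.8
label = (skill - bias_penalty * gender + rng.normal(0, 0.5, n)) > 0

# Fit a simple logistic regression on (gender, skill) by gradient descent.
X = np.column_stack([np.ones(n), gender, skill])
w = np.zeros(3)
for _ in range(2000):
    p = 1 / (1 + np.exp(-X @ w))
    w -= 0.1 * X.T @ (p - label) / n

# The model has learned a negative weight on gender: equally skilled
# women now score lower than men. The bias is automated.
print(f"learned gender weight: {w[1]:.2f}")   # clearly negative

Even though the synthetic “skill” is identical across groups, the model learns a negative weight on gender, because the labels it trained on carried the evaluators’ bias — automated and invisible, exactly as Khullar warns.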
“A review of 42,991 Implicit Association Test records and a cross-sectional study of 131 surgeons provided evidence of implicit and explicit gender bias,” JAMA Network Open reported. “Data suggest that health care professionals and surgeons hold implicit and explicit biases associating men with careers and surgery and women with family and family medicine.”
According to the Association of Women Surgeons, only about 19% of U.S. surgeons in 2015 were female. And according to another JAMA report in mid-2016, of 10,241 physicians surveyed, females in the highest-paid surgical specialty — orthopedics — earned an estimated $50,000 less each year than their male colleagues.
Admittedly, it’s tough to conclude whether the gender differences and these gender-based viewpoints exist because of long-standing biases of medical professionals in teaching, training and mentoring positions of power — professionals who then pass along those biases, perhaps even unwittingly, to both their male and female trainees — or whether the gender differences are due to simple personal preferences. Female medical professionals, for instance, may indeed choose family medicine over surgery because the former offers a better work-family balance and more control over office hours.
But when A.I. is involved, it really doesn’t matter.
A bias is a bias is a bias. The end result is the same, no matter the cause.
With that in mind: Once again, caution is key.
Given this Washington U. report, it seems the medical world has yet one more bias cross to bear — yet another gender disparity to work out — before moving full-steam ahead with the A.I.-dominated diagnostics and Big Techie-type treatments.
• Cheryl Chumley can be reached at cchumley@washingtontimes.com or on Twitter, @ckchumley.