CMIO Corner: Did WWII lead to AI?

We often think of things, especially new technology, as having “just arrived” because they were not within our field of vision until they came to market. However, much of technology is years in the making before we see it.

Artificial intelligence is one example of this. Although we hear about it everywhere now, there was a long process to get to where we are today. Its beginnings actually stem from a lack of medical knowledge relating to former President Franklin Roosevelt’s healthcare issues during World War II.

When French chemist and microbiologist Dr. Louis Pasteur advanced the germ theory of disease in the mid-1860s, he was among the first scientists to challenge the common and scientifically upheld belief that disease occurs randomly. Before Dr. Pasteur’s theory, people chalked up illnesses like the plague to random chance or a case of bad luck, and certainly did not imagine a way out of them by means of a cure.

Dr. Pasteur refused to accept that diseases happened randomly. He launched experiments to determine what caused beverages like milk and wine to spoil, showed that microbes were responsible for sour wine and spoiled milk, and eventually invented the process of pasteurization to prevent it. He later applied the same concept to disease, establishing a connection between bacteria and viruses and how people get sick.

The same assumption of randomness that prevailed before Dr. Pasteur’s germ theory is often still present in today’s world, despite how advanced we may believe we are. One area where we see it is AI. The technology is built on a long history of data and medicine that began during President Roosevelt’s nearly four terms in office in the 1930s and ’40s and culminated in the Framingham study, launched in 1948.

AI’s history began while President Roosevelt was in office, during a time when doctors believed high blood pressure and heart disease occurred naturally or randomly with aging. When President Roosevelt was elected in 1932, his recorded blood pressure was 140 over 90, which today would signal the need for medical treatment. His aides shrugged off the numbers as nothing to worry about.

During World War II, President Roosevelt visited Britain’s then-Prime Minister Winston Churchill, and throughout this period he had been experiencing headaches and neck pain. Today we would consider high blood pressure a potential cause of those symptoms. Yet despite his worsening symptoms, the president was seen by an ear, nose and throat specialist, an indication of how little was known then about the grave connection between high blood pressure and cardiac health.

President Roosevelt’s death became the turning point for studying cardiac disease rather than accepting it as a random event. In 1948, President Harry Truman signed the bill that launched the Framingham Heart Study, an analysis of cardiac disease built on a large dataset of a kind never previously collected. The study was named for the town outside Boston where it was conducted.

After enrolling their first cohort of participants, the researchers gathered and compiled data, but the results did not come out until about a decade later. When the researchers published their findings, they introduced new concepts and terms to medicine, such as “risk factor,” and helped open the field of epidemiology and a large, population-based understanding of disease. These findings also marked the beginning of statistical analysis in medicine on the scale of the study’s large population and datasets.

The physicians and scientists reporting the results faced heavy criticism. Many questioned the validity of an epidemiological study, calling it “false science,” and doubted whether the data could truly be trusted for clinical purposes.

Luckily for us, the physicians and scientists did not give up. They showed the value of their work. 

When computing power increased later on, descriptive statistics snowballed into developments such as logistic regression, which served almost as a precursor to predictive analytics. With it, scientists could examine not only which individual factors contributed to cardiac disease but also how much each contributed, based on the Framingham dataset. In real life, for example, it would be nearly impossible to find two people identical in age, gender and every other factor who differed only in their blood pressure. With a large enough population, however, you can make that comparison statistically. As a result, logistic regression can help identify how strongly each factor is associated with heart disease while the other factors are held constant.
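
To make that concrete, here is a minimal sketch in Python of how logistic regression quantifies each factor’s association with an outcome. The data are synthetic, and the column names (age, male, systolic_bp, cholesterol, heart_disease) are illustrative assumptions, not the actual Framingham variables; exponentiating the fitted coefficients turns them into odds ratios, one rough answer to “how much does each factor contribute?”

```python
# A minimal sketch, assuming synthetic stand-in data rather than the real Framingham dataset.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 5000

# Hypothetical risk factors for each person
df = pd.DataFrame({
    "age": rng.integers(30, 75, n),
    "male": rng.integers(0, 2, n),
    "systolic_bp": rng.normal(130, 20, n),
    "cholesterol": rng.normal(220, 40, n),
})

# Simulate a heart-disease outcome whose odds depend on those factors
log_odds = (-8 + 0.05 * df["age"] + 0.5 * df["male"]
            + 0.03 * df["systolic_bp"] + 0.005 * df["cholesterol"])
df["heart_disease"] = (rng.random(n) < 1 / (1 + np.exp(-log_odds))).astype(int)

# Fit a logistic regression: each coefficient estimates how much that factor
# shifts the log-odds of heart disease while the other factors are held constant
X = sm.add_constant(df[["age", "male", "systolic_bp", "cholesterol"]])
result = sm.Logit(df["heart_disease"], X).fit(disp=0)

# Odds ratios, e.g. the multiplicative change in odds per 1 mmHg of systolic BP
print(np.exp(result.params))
```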

From there, these concepts evolved into predictive analytics, bootstrapping and beyond. Today, this evolution has carried into the terms we commonly use now, such as “machine learning” and “artificial, or augmented, intelligence.”

While it may seem that AI has suddenly arrived at random, history shows that things that seem random or sudden often have tremendous effort behind them, just like that pasteurized milk at the grocery store.
