Monday, October 23, 2023

Origin of Humans: Were We Created or Evolved?

 

Ever since I was in school, I have wondered about our human origins from the scientific point of view.

Long after my retirement from the Institute for Medical Research, Malaysia, I even went to the extent of attending a postdoctoral course on evolution at the University of Cambridge in 2019 in the hope of finding an answer to this mystery.

Cambridge is the university where Charles Darwin studied. His celebrated treatise "On the Origin of Species" was published on November 24, 1859, in London.

I took this opportunity to have several dialogues with experts on evolution at Cambridge, but I did not receive any convincing answers.

 

The origin and the evolution of the human species remains a mystery to me till this day.

However, I have written “A Summary on The Creation of The Universe and History of Evolution of Life on Earth.” 

This included when we first made our appearance on the surface of this earth.


https://scientificlogic.blogspot.com/2022/07/the-creation-of-universe-and-history-of.html


It may be possible for us to offer theories on the origin of microscopic life on earth, which I very briefly discussed in a post on Thursday, July 27, 2023, here in this link:

“Mysteries on the Origin of Life on Earth” here: 


https://scientificlogic.blogspot.com/search?q=origin+of+life


But it is highly improbable to offer the same for such a highly developed and complex species of life as us humans. We may attempt to link ourselves to fossils belonging to the genus Australopithecus, an ancient hominin initially thought to have lived 2 million to 2.6 million years ago, or to other primates that produced successive clades leading to the ape superfamily, which gave rise to the hominid and gibbon families. These diverged some 15–20 million years ago; the African and Asian hominids (including orangutans) diverged about 14 million years ago; hominins (including the Australopithecina and Panina subtribes) parted from the Gorillini tribe (gorillas) between 8 and 9 million years ago; and the australopithecines (including the extinct bipedal ancestors of humans) separated from the genus Pan (containing chimpanzees and bonobos) 4–7 million years ago. Even so, we are still unable to link them seamlessly, anatomically, structurally, and genetically. There are "missing links" among them, as well as between them and humans. The genus Homo is evidenced by the appearance of H. habilis over 2 million years ago, while anatomically modern humans emerged in Africa approximately 300,000 years ago.

In any case, whether we were created, or zoologically evolved, scientists have classified us humans as animals because we are not angels or heavenly beings. 

They have put us into the Animal Kingdom, and we belong here, and nowhere else:

Kingdom: Animalia
Phylum: Chordata
Subphylum: Vertebrata
Class: Mammalia
Subclass: Theria
Infraclass: Eutheria
Order: Primates
Suborder: Anthropoidea
Superfamily: Hominoidea
Family: Hominidae
Genus: Homo
Species: sapiens

 

Having said this, an hour ago, I read these links on our human evolution and also on the origin of races. Still, they didn’t tell us anything about our origin.

 

1.       https://humanorigins.si.edu/sites/default/files/transcript_pdfs/Evidence%20of%20Human%20Evolution.pdf

 

2.       https://humanorigins.si.edu/multimedia/videos/evidence-human-evolution

 

3.       https://www.yourgenome.org/stories/are-humans-still-evolving/

 

Origin of the Races:

 

https://en.wikipedia.org/wiki/Race_(human_categorization)

https://www.nytimes.com/1865/01/22/archives/the-origin-of-human-races.html

At the 2009 Cold Spring Harbor Symposium on Quantitative Biology, "Evolution – The Molecular Landscape", hundreds of papers presented were published in a massive Volume LXXXIV, in which T. D. White from the Department of Integrative Biology and Human Evolution Research Center, University of California, Berkeley, California, wrote, among many other thoughts, this:

Darwin would have been astonished and delighted to witness the 2009 Cold Spring Harbor Laboratory’s Symposium on Quantitative Biology anniversary celebrations of his birth and book. He would have recognized the many persistent themes discussed, taken satisfaction in the hundreds of mechanisms revealed, and been amazed by the broad advancing front of modern evolutionary biology. From ‘shadow’ enhancers (Heng et al, 2008) to segmental duplication (Marques-Bonet et al. 2009) and from ancient fossils to the ‘cognitive niche’ (Pinker 2003), ours is a world full of insights unavailable to Darwin in 1859.

In his historical scientist mode, Darwin was directly concerned with the paleontological, neontological, and contextual data resulting from the natural, one-time, uncontrolled experiment of life on earth. Darwin clearly understood how the rich data sources of the neontological realm were living products of that vast experiment. And the phylogenetic and functional elucidation of how extant diversity has arisen – now provided by the modern landscape of molecular biology – is truly astounding, even in the hindsight of a single decade. These revelations make it too easy to forget what Darwin clearly appreciated – that the historical record of fossils, artifacts, and context is crucial to the fullest understanding of our evolution.

Human evolution was touched upon ever so lightly in Darwin's 1859 'On the Origin of Species'. Darwin devoted detailed attention to 'Imperfections in the Geological Record', perhaps because he saw such gaps as rendering his theory vulnerable to critics (Sepkoski and Ruse 2009). His 1871 treatise on human evolution pondered what was then one of the largest imperfections of earth's historical record – the paucity of truly early hominid fossil remains (the family Hominidae here bounds genera in the human clade after the last common ancestor we shared with the chimpanzees).

Darwin on Hominids:

Living humans are obviously anatomically and physiologically uniquely different from our closest living relatives, the African apes. What was the sequence by which natural selection assembled our obvious derivations of brain expansion, canine reduction, technology, and bipedality?

Darwin infamously avoided these topics in 1859, but despite this, Origin's implications for human evolution could scarcely be concealed. Indeed, they generated even more immediate discussion and debate than did his later 1871 treatise on humans (Browne 2002). When Huxley wrote on the subject in 1865 – followed by Darwin in 1871 – the poverty of the human paleontological record was overwhelming.

Darwinian scholars had only a small, mostly European paleontological record, extracted primarily from archaeological contexts, which had been labeled everything from ancestral to pathological. Even the extant great apes were barely known. So, Darwin and Huxley turned to the extant hominoid primates to serve as the 'outgroup' for humans, as proxies for the common ancestors we once shared with these now-relict forms.

The late Stephen J. Gould famously characterized hominid paleontology as follows: "…no true consensus exists in this most contentious of all scientific professions… a field that features more minds at work than bones to study" (Gould 2002, p. 910). Hominid primates are, in general, highly variable as judged by any of their living representatives.

All workers agree that there is rampant homoplasy within the clade. Hominids have always lived fairly high on the food chain. Relative to many other mammals, they are K-selected, and therefore rare as fossils.

These factors all contribute to make the delineation of hominid species lineages very difficult… and contentious. Contention is difficult to quantify but given the literally thousands of hominid fossils…and the relatively few professionals who work to interpret them… Gould’s characterization has surely been invalid since early in the 20th century. The fossil samples are today relatively large, even though the hominid clade’s record is terrestrial and therefore still full of imperfections.

The global experiment of human evolution cannot be repeated in a laboratory. We must infer what happened from the one-time experimental results, fragmentary and scattered as they may be. The good news about understanding our behavioural evolution is that there is a 2.5-million-year archaeological record. 

The good news about our understanding of our anatomical evolution is that some of the tissues shaped by that disappeared DNA can still be recovered from unique paleontological records derived from ancient landscapes. The order in which our unique human characteristics have been assembled via evolution is susceptible to investigations, and the temporal and anatomical perspective of the fossil record will continue to be the key to its success. 

Crucial in that investigation will be the understanding of how the hard tissues we recover as fossils were formed via development. Integration will continue to be the key to better understanding human origin and evolution.

Huxley's 1882 obituary in 'Nature', on the occasion of Charles Darwin's death, said it well: "He found the great truth, trodden under foot…" A century and a half ago, Charles Darwin wrote in "Origin" (1859) only that "…light will be thrown on the origin of man and his history."

He could not have imagined the illumination already thrown on our ancestry through the integration of the ever-expanding constellations of evidence about our human evolution".

Still, none of the papers presented shed any light on the origin of Homo sapiens, on whether he was created or emerged through the agonizing process of millions of years of evolution.

I think scientists are very confused, and arrogant, in searching for the origin and evolution of the human species. Were we created, or did we evolve separately? That is my question.

It would have been far easier if they had carried their problematic academic burdens here for an instant answer:

 

“And the LORD God formed man of the dust of the ground and breathed into his nostrils the breath of life; and man became a living soul.”

(Genesis 2:7)

 

We would not need to search high and low, carrying such heavy thinking burdens in our brains about our origin, if we just entrusted all our questions to Jesus here:

 

"Come unto me, all ye that labour and are heavy laden, and I will give you rest. Take my yoke upon you and learn of me; for I am meek and lowly in heart: and ye shall find rest unto your souls."

(Matthew 11:28-29).

 

Scientists and paleoanthropologists have described fossil skulls of human-like hominids over the last decades. There have been a number of important fossil discoveries in Africa of what may be very early transitional apes/hominins, or proto-hominins. These creatures lived just after the divergence from our common hominid ancestor with chimpanzees and bonobos, during the late Miocene and early Pliocene Epochs. The fossils have been tentatively classified as members of three distinct genera: Sahelanthropus, Orrorin, and Ardipithecus.

The earliest australopithecines very likely did not evolve until 5 million years ago or shortly thereafter (during the beginning of the Pliocene Epoch) in East Africa. The primate fossil record for this crucial transitional period leading to the australopithecines is still scanty and somewhat confusing. However, by about 4.2 million years ago, unquestionable australopithecines were present. By 3 million years ago, they were common in both East and South Africa. Some dating to this period have also been found in North Central Africa.

As the australopithecines evolved, they exploited more types of environments. Their early proto-hominin ancestors had been predominantly tropical forest animals. However, African forests were progressively giving way to sparse woodlands and dry grasslands, or savannas. The australopithecines took advantage of these new conditions. In the more open environments, bipedalism would very likely have been an advantage.

My feeling about all these discoveries is that human-like hominids may have existed one after another long before the advent of modern humans beginning from Adam and Eve. These hominids were like clay models of various shapes and sizes moulded by a human potter.

The potter with his clay (soil) will initially craft out many figures and figurines that resemble humans, but not exactly like humans. He may initially play about with his craftsmanship, but not to his satisfaction. But with each figure and figurine he improves his art.

Similarly, from the anthropological angle, the Neanderthal (Homo neanderthalensis), said to be a sibling human species, the Java Man (Homo erectus), the Piltdown Man, the Taung Child (Australopithecus africanus), the Heidelberg Man (probably ancestral to Homo sapiens and Homo neanderthalensis), Homo habilis (with features intermediate between Australopithecus and Homo erectus), Lucy (Australopithecus afarensis), and Australopithecus sediba were all separate hominids, and there were missing links, anatomically and genetically, among them. They all evolved separately, like figures and figurines made separately by a potter at different times. They were not genetically seamless.

 

Then one day he decided to make a figure out of clay, which is soil, to resemble himself. In other words, it would be in his own image. Then, when he had perfected his art, he decided to make a figurine of a female to accompany the male figure, as he thought it would be better if he could craft out a pair, a figure and a figurine. So, he did just that.

But what immediately strikes me is that God may not have been satisfied with the imperfections of these hominids that looked like clay models. He would have done exactly the same with these earlier hominids made out of soil as He did when creating Adam out of soil in His own image, later creating Eve to make a pair to accompany Adam.

This analogy of mine fits the verses in Genesis exactly. It explains the various skull fossils of slightly different sizes and shapes that scientists have found in various parts of the world, just as a human potter makes different figures and figurines and places them in different parts of his workshop. Isn't that similar? Isn't this reasonable?

In other words, God had been creating many human-like creatures for thousands, if not millions, of years before He perfected the ones He named Adam and Eve.

Then why was this not given in the Bible? First, we don't expect the Bible to tell us that God had been experimenting with hominids before creating a perfect one in the same image as Him, do we? Second, the Bible is not a science book or a book on craftsmanship or on technology. It is a book about God as a Maker, similar to the potter, and His dealings with His products of creation.

The Bible is a book on salvation, which is far more important than the products of various craftsmanship. If the Bible were a book on science and technology, all the collective national libraries in this world would not be able to contain all those technical details of creation. So God had to make it exceedingly short and precise, just for you and me. Does that explain it?

"And God said, Let us make man in our image, after our likeness" (Genesis 1:26), and in verses 27–31 it says, "let them have dominion over the fish of the sea, and over the fowl of the air, and over the cattle, and over all the earth, and over every creeping thing that creepeth upon the earth.
 So God created man in his own image, in the image of God created him; male and female created them.
 And God blessed them, and God said unto them, Be fruitful, and multiply, and replenish the earth, and subdue it: and have dominion over the fish of the sea, and over the fowl of the air, and over every living thing that moveth upon the earth. And God said, Behold, I have given you every herb bearing seed, which is upon the face of all the earth, and every tree, in the which is the fruit of a tree yielding seed; to you it shall be for meat.
 And to every beast of the earth, and to every fowl of the air, and to everything that creepeth upon the earth, wherein there is life, I have given every green herb for meat: and it was so.
And God saw everything that he had made, and, behold, it was very good. And the evening and the morning were the sixth day”.

(Genesis 1:26–31)

 

It was then that Homo sapiens, bearing the same image as God Himself, came into existence. Is this theory of mine probable and acceptable?

 

Furthermore, even the theory of evolution, as we already know, holds that plants and trees were the first to come into existence.

 

They had to evolve first to provide oxygen through photosynthesis before the animals could evolve. But this is exactly the same series of events that took place in stages as described in the Creation account given in Genesis chapter 1. Does that ring a bell?

 

Does that not fit in so neatly? The millions of years of evolution in the eyes of evolutionists and scientists were just one day in the eyes of God. What may have taken 13.5 billion years in the eyes of an astronomer estimating the age of the universe was actually six days for Him.

 

Does this explanation of mine open the eyes of both the scientist and the church, and become acceptable to both parties, who have been at loggerheads with each other since the time of Charles Darwin?

 

Does that sound logical?

 

It is also clearly revealed that a thousand years to us is just a day, or a watch in the night to God.

“For a thousand years in thy sight are but as yesterday when it is past, and as a watch in the night”.

(Psalm 90:4)

 

"But, beloved, be not ignorant of this one thing, that one day is with the Lord as a thousand years, and a thousand years as one day."

(2 Peter 3:8)

 

lim ju boo

(2,860 words) 

Saturday, October 21, 2023

My Personal Experience Warded in Hospital for Surgery Two Days Ago!

 

At 10:05 am on Wednesday, October 18, 2023, I underwent VenaSeal endovenous surgery for my chronic venous stasis ulcer on the right leg.


I was operated on by two vascular surgeons; one was from Malaysia, the other a Singaporean surgeon on assignment from Singapore General Hospital to the Kuala Lumpur Hospital. The Singapore surgeon is a colleague of my niece, Clinical Associate Professor Anne Hsu Ann Ling, a Senior Consultant at the Singapore General Hospital.


The surgery was completed at 11:10 am, precisely 40 minutes later. My blood pressure then was between 92/59 and 112/66 mm Hg, which is quite normal for me. The last surgery done on me was in September 2023, using radiofrequency ablation (RFA).

That last surgery was not successful because, out of three areas that were patent, the surgeon could close only two. The third one he was unable to close because RFA depends on heat, and that area was too near a nerve; the heat might have damaged the leg nerve. So, this time, a year later, they decided to use the glue (cyanoacrylate) method.

However, during the surgery and long after it, my blood pressure was only slightly low. It hovered around 108/70 mm Hg, which has always been typical for me.

At home my blood pressure can drop to an unbelievable 78/60 mm Hg without me having syncopal attacks (fainting).

But during surgery my heart rate was between 98 and 105 beats per minute (tachycardia), and it remained at around 90 beats per minute even at 1:00 pm, two hours after surgery was completed.

I was given spinal anesthesia. Less than two minutes later, both my legs were numbed and paralyzed. 

My legs could move sideways a little, but I could not lift them at 1:46 pm.

Both legs could be lifted at 2:10 pm. I was out of the surgical recovery hall at 2:46 pm and was able to walk in the ward at 5 pm.


But I was fully conscious and was able to see the data on the monitor in the operating theatre, memorize them, and then write them down in the post-surgical recovery hall and in the ward over a period of 7 hours.

Below is a small statistical analysis from the raw data I managed to collect on my own, in the operating theatre, in the post-surgical recovery hall, and in the ward, as I was fully alert and conscious all the way.


Mean systolic blood pressure = 109 mm Hg, SD = ±4.8, variance (s²) = 22.9, n = 18

Mean diastolic blood pressure = 63.5 mm Hg, SD = ±2.4, variance (s²) = 5.6, n = 18

Mean heart rate = 93.8 beats per minute, SD = ±3.6, variance (s²) = 13.5, n = 18

Mean respiratory rate = 17.6 per minute, SD = ±4.2, variance (s²) = 17.7, n = 8

Mean oxygen saturation (SpO2) = 97.3%, SD = ±1.34, variance (s²) = 1.79, n = 10
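
For anyone who would like to reproduce this kind of simple bedside summary, here is a minimal sketch in Python. The readings in it are placeholders for illustration only, not my actual raw data; the same sample mean, standard deviation, and variance calculations apply to any of the vital signs listed above.

    import statistics

    # Illustrative placeholder readings, NOT the actual raw data recorded
    # in the operating theatre, recovery hall and ward.
    systolic_readings = [105, 108, 110, 112, 104, 115, 109, 111]  # mm Hg (hypothetical)

    n = len(systolic_readings)
    mean = statistics.mean(systolic_readings)
    sd = statistics.stdev(systolic_readings)           # sample SD (divides by n - 1)
    variance = statistics.variance(systolic_readings)  # sample variance, i.e. sd squared

    print(f"n = {n}")
    print(f"Mean systolic blood pressure = {mean:.1f} mm Hg")
    print(f"SD = ±{sd:.1f}, variance (s²) = {variance:.1f}")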

 

Due to my slightly low blood pressure during surgery, they gave me a pint of normal saline, I suppose as a volume expander, plus phenylephrine, both given intravenously to raise my blood pressure, but unfortunately my blood pressure remained on the low side.

So they pushed me into the third-class ward after two hours of monitoring in the post-surgical recovery room. The third-class ward has a surgical high-dependency unit with more nurses and doctors. There they could continue to monitor me, rather than pushing me back to my room in the first-class ward where I had been admitted the day before. In the single-room first-class ward there was no nurse in the room to monitor my vital signs.

The anaesthetic doctor had told me I could go back to the first-class ward (Ward 14 in Kuala Lumpur Hospital) at 5 pm after surgery, but this was not possible because my resting heart rate was near 90 beats per minute and my blood oxygen saturation level was 92%, when the normal should be 95 or 96%.

When I was given pure oxygen in the surgical theatre at a rate of 5 litres per minute, and even many hours after that, my SpO2 saturation measured by the oximeter was 98 to 100%. But when the oxygen was taken off, my SpO2 dropped to 92%.

Due to the slightly lower blood oxygen levels measured by the ward oximeter, they decided to take my arterial blood to measure the actual oxygen content by blood gas analysis, and the next morning sent me for a CXR (chest X-ray) to see whether my lungs were clear.

An arterial blood gas test is a blood test taken from an artery. It measures the levels of oxygen and carbon dioxide, as well as the pH of the blood. Arterial blood gas analysis is normally used in emergency medicine and in critical care. It is more accurate than a fingertip pulse oximeter, which shines light through the fingertip, making the tip appear red. By analysing the light that passes through the finger, the device estimates the percentage of haemoglobin in the red blood cells that is carrying oxygen.

In fact, it merely measures the peripheral subcutaneous oxygen saturation, after some of the oxygen has already been used up by the deeper underlying tissues. Hence, it will register slightly less than the actual arterial oxygen level. In other words, it measures the colour of the blood, not the actual oxygen content.

But a question I would like to raise is this: what happens in the case of carbon monoxide poisoning? In carbon monoxide poisoning the colour of the blood is light reddish or pink, or "cherry-red," due to carboxyhaemoglobin formation. This would make a fingertip oximeter very misleading.

However, the oxygen saturation in my arterial blood, measured twice at an interval of about 3 hours, was 98% both times, while the Kuala Lumpur Hospital electronic monitors, which have an oximeter attached, consistently showed between 92 and 93% saturation.

I thought to myself that it was unnecessary to keep me longer in the crowded and less comfortable third-class ward just because it has a surgical high-dependency unit where they could continue to monitor my slightly low blood pressure and slightly low oxygen saturation. The acceptable oxygen saturation level is taken as 95 percent for most healthy people; a level of 92 percent or lower can indicate potential hypoxemia.

But in my case, I was very alert. I did not have any signs or symptoms of cyanosis on my face, lips, or fingertips. Neither did I have dyspnoea (shortness of breath), restlessness, discomfort, or chest pain, apart from a slightly fast heart rate. In fact, I was quite stable, and they could have wheeled me back to my more comfortable first-class ward. But they decided to keep me overnight in the third-class ward for monitoring until I was discharged the next day.

There is always a difference between the oximetry and the arterial gas analysis. The arterial oxygen level is always slightly higher than the oximetry reading.

What they could do is take as many readings as possible from as many patients as possible, both by oximetry and by arterial blood gas analysis. The many readings reduce the statistical risk of a biased or chance result. We can then look at the difference between the arterial and the superficial skin oximetry readings after many studies.

Let us say the higher arterial reading is x and the lower oximetry reading is y. If the difference is consistent across several studies, then x − y = c.

We can then use c as a correction factor, adding it to y to estimate the actual arterial reading x each time an oximetry reading is taken. This would avoid the trauma of taking arterial blood, which can be far more painful than a normal venous blood draw. But this correction factor was not used.
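
As a rough illustration of this idea, here is a minimal sketch in Python. It assumes hypothetical paired readings (arterial blood gas saturation x and fingertip oximeter saturation y taken from the same patients at the same times), estimates the average difference c = x − y, and then applies c as a correction to a new oximeter reading. All the numbers are invented for illustration; they are not from any real study.

    import statistics

    # Hypothetical paired readings, for illustration only.
    arterial_x = [98, 97, 99, 98, 96, 98]   # % saturation by arterial blood gas analysis
    oximeter_y = [93, 92, 94, 93, 91, 93]   # % saturation by fingertip pulse oximeter

    differences = [x - y for x, y in zip(arterial_x, oximeter_y)]
    c = statistics.mean(differences)         # average bias, c = mean(x - y)

    def corrected_saturation(oximeter_reading):
        # Estimate the arterial saturation from an oximeter reading by adding c.
        return oximeter_reading + c

    print(f"Correction factor c = {c:.1f} percentage points")
    print(f"Oximeter 92% -> estimated arterial {corrected_saturation(92):.1f}%")

In practice, of course, such a bias would have to be estimated from a large number of patients, and its spread examined (for example, with a Bland–Altman type analysis), before anyone relied on it clinically.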

Fortunately, the doctor whom my surgeon selected, and trusted more than the rest of the Medical Officers, took my arterial blood. He told me, in front of dozens of the other doctors under him, that he trusted only a certain doctor. I don't know how the rest of the doctors felt. The selected lady doctor drew my brachial artery blood twice, almost painlessly. It was less painful than even the usual venous blood draws done on me by numerous doctors and phlebotomists in the past.

Following this letter I wrote in my WhatsApp chat, a friend of mine, Dato Dr Ong Eng Leong, asked me this question (in pink below):


“Dear Prof JB Lim, glad to know that you are progressing well and wishing you complete recovery from your surgery. With the discrepancy you described, should not a correlation be being done between oximeter readings and chemical results before the oximeter is commercialised? Now, many are relying on portable oximeters available in the market”.

Here’s my reply to Dr Ong in dark green:


I agree with you, Dr Ong, that correlation studies need to be done between chemical analysis and oximetry readings before commercializing all these products.

As I am also a qualified analytical food quality control chemist, besides being a clinician and nutritionist among other things, I know that in chemical analysis we can sometimes go down to levels of just a few parts per million (ppm), or even a few parts per billion (ppb), depending on the analytical instruments we use, especially the wide range of spectrophotometers available to us, from infra-red and far infra-red spectrometers to nuclear magnetic resonance (NMR) spectroscopy, and also on the analytical procedures we choose to adopt.

Analytical results, even when repeated, should not show more than 1%, or at most 2%, error, even when we use a recovery method to get back a known weight of a sample added into the original analytical sample.

But the error of the oximetry method can be as large as 5%, a huge difference compared with variations of just a few parts per million within the same series of measurements. For us as analytical chemists and quality controllers, how do they expect us to accept such large variations?

What these medical doctors, nurses, and laboratory technicians can do is take a series of measurements using their oximeters and match them against the far more accurate chemical analysis done in the laboratory, to work out the correction factor, as I explained in my original letter above, to be added to all oximetry readings. But this was not done.

The advantage of oximeters, even the matchbox-sized ones sold in pharmacies, is that they are extremely easy and fast for anyone to use, and they are not wet and messy at all. This is unlike wet analytical chemistry, where the analyst must also be trained and qualified under the Chemists Act 1975 before he can sign the analytical lab report, not to mention that chemical analytical procedures are far more time-consuming and far, far more expensive to conduct.

So which one do we want: the Devil or the Deep Blue Sea?

But I think clinicians should not be too fussy about objective measurements; they should also look at the patient subjectively and clinically. In other words, they must have good clinical judgement instead of relying on all these lab tests all the time!

For instance, they kept me a day longer than necessary just because the hospital's large ward vital-signs monitors with oximeters showed my SpO2 level was consistently "only" 92%, when my arterial blood oxygen saturation, taken twice within 3 hours, was 98%.

Furthermore, I was fully alert, and not cyanotic, dyspnoeic (short of breath), hypoxic, dizzy, and so on. These are the clinical judgements I would use as a clinician myself.

But they insisted on the arterial blood analysis, which took some hours, instead of seconds with an easy-to-use oximeter.

My feeling is that a good clinician is a good diagnostician and a good clinical observer of the patient's condition and wellbeing.

He should not rely solely on lab tests unless the case is complicated by other underlying morbidities where the presentations (signs and symptoms) are not so clear-cut.

In that case we need lab support for the differential diagnosis, with medical history being one of the first and most important diagnostic tools.

Maybe this is why doctors here in KL Hospital, and in all government hospitals big or small, spend most of their time writing and clerking tons and tons of clinical notes on medical history and other findings, leaving the nurses to do most of the basic clinical work, such as taking blood pressure and body temperature, taking SpO2 readings, doing the dressings, setting up and monitoring IV drips, giving injections, giving medicine regularly, cleansing the patient, doing the bedding, attending to patients' needs, and measuring and monitoring fluid input and output.

Nurses do far more of this clinical work than doctors do. So I always wonder why medical doctors, and not the nurses, are called clinicians. In some countries nurses are appropriately called 'nurse clinicians'.

Of course, there are other doctors, such as anaesthesiologists, surgeons, obstetricians, and gynaecologists, who do a lot of hands-on clinical work, while I think physicians do the least.

So we leave them as they are. That's why clinicians hardly ever win the Nobel Prize in medicine, the world's most glamorous and prestigious prize in health care, about which I wrote here:

“Nobel Prizes in Medicine: Are Clinicians out of Fashion?”

https://scientificlogic.blogspot.com/2023/

Thank you, Dr Ong, for your comment and question.

Kind regards,

Lim ju boo

 

Wednesday, October 4, 2023

Nobel Prizes in Medicine: Are Clinicians out of Fashion?


 

 BBC News - Nobel Prize goes to scientists behind mRNA Covid vaccines.

 

https://www.bbc.co.uk/news/health-66983060

 

The Nobel Prize in Physiology or Medicine this year, 2023, again goes to two scientists rather than to a clinician. A clinician is a medical doctor with a basic bachelor's degree in medicine. He or she may also be a medical specialist. But clinicians are not doctorate holders unless they have a PhD in addition to their basic bachelor's medical degree.

 

Sometimes I wonder why medical doctors are called clinicians, because they hardly do much clinical work except take the medical history, prescribe the medicine, and request laboratory tests, which are done 100% by medical scientists and laboratory technologists. Even radiologists mainly read the results of imaging and send the results back to the clinician. The term 'clinician' means anyone who does a lot of clinical work. A nurse who does most of the clinical work, such as taking the temperature and blood pressure of the patient, cleaning and dressing patients' wounds, setting up and monitoring the IV drips, giving injections and other medication, and measuring fluid intake and urine output, etc., is actually a clinician. She does far more clinical work than a medical doctor who merely takes the medical history, does a few clinical examinations, requests blood, urine, haematological, or radiological examinations, and prescribes the medicine.

 

All this is clinical work a nurse does that the medical doctor hardly does.

 

The only exceptions are the surgeons and anaesthesiologists, who do a lot of clinical work on their patients. They, together with the nurses, are the real clinicians, rather than a medical doctor who merely takes the medical history, auscultates (listens to) the lungs for crepitations, rhonchi, pleural rubs, etc., or listens to the heart for murmurs with a stethoscope.

 

We are named after the activity we do most: one who drives a car is a motorist; one who rides a bicycle, a cyclist; one who plays the violin or the piano, a violinist or a pianist; one who swims, a swimmer; and so on. So, in my opinion, a nurse who does most of this clinical work should be called a clinician, rather than a medical doctor who hardly does any of the jobs a nurse does.

 

Anyway, whatever it is, a group of clinicians from the Royal Society of Medicine in London wrote this in dismay:

 

“Nobel Prizes in Medicine: Are clinicians out of fashion?”

 

https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3164255/

 

I learned about this grievance from clinicians more than 25 years ago. They were my Fellow colleagues at the 200-year-old Royal Society of Medicine in London (RSM).

 

I was admitted as a Fellow of the Royal Society of Medicine in London in 1993, just one year before my retirement.

 

I suppose the Royal Society of Medicine admitted me as a Fellow as a consolation 'honour' just before my retirement, and as a 'gift' for retirement after toiling for 25 years in medical research as a Senior Medical Research Officer at the Institute for Medical Research in Malaysia.

 

It is very sad to learn that since 1927 most of the Nobel Prizes in Physiology or Medicine have been awarded not to clinicians but to medical scientists. Most of these medical scientists are in research; they are at the frontiers of new medical discoveries as they plough into the unknown looking for new knowledge in medicine.


Anything the clinician uses today, whether for diagnosis or for treatment, does not belong to them. For instance, X-rays were discovered by the physicist W. C. Röntgen in December 1895, after seven weeks of assiduous work during which he studied the properties of this new type of radiation, able to pass through screens of notable thickness. The ultrasound machine was the invention of the engineer Tom Brown, who gave it to Ian Donald, an obstetrician, to use for the first time. Similarly, it was Godfrey Hounsfield, a biomedical engineer, who contributed enormously to the diagnosis of neurological and other disorders with his invention of the computed axial tomography scan, for which he was awarded the Nobel Prize in 1979. It was never invented by a clinician or a medical doctor. The only medical doctor who was also a physicist was Raymond Damadian, MD. He invented and built the first magnetic resonance imaging (MRI) scanner, which revolutionized the ability to diagnose cancer and other illnesses. Unfortunately, he never got the Nobel Prize in medicine. It went to two other scientists: Paul Christian Lauterbur (May 6, 1929 – March 27, 2007), an American chemist who shared the Nobel Prize in Physiology or Medicine in 2003 with Peter Mansfield for work that made the development of magnetic resonance imaging (MRI) possible.


Similarly, Willem Einthoven (1860–1927), a Dutch physician and physiologist, invented the first practical electrocardiogram (ECG) in 1901. Einthoven began his studies of the ECG with the mercury capillary electrometer and corrected its distortion mathematically, so that he was finally able to register a good representation of the ECG before the beginning of the twentieth century. He later further improved ECG recordings with the introduction of a string galvanometer that he designed. For this, Dr Einthoven won the Nobel Prize in Physiology or Medicine in 1924.


The same is true of all the drugs clinicians use. They were not discovered or invented by them. The development of pharmaceuticals is the brainchild of pharmaceutical chemists, biochemists, biotechnologists, immunologists, microbiologists, pharmacologists, physiologists, toxicologists, cellular biologists, and other biological scientists.


After developing the drugs, they put them through various stages of animal and clinical trials before marketing them to the clinicians, who merely use them on their patients after the manufacturers teach them about their pharmacology, pharmacodynamics, pharmacokinetics, indications, dosage, and so on, and how to use them. These drugs were never invented by clinicians. The pharmaceutical companies merely sell them to the clinicians, who in turn prescribe them to their patients.

 

 

If all these products of discovery and invention by other scientists, from blood tests to imaging, were withdrawn from the clinician, then they would be at a loss as to how to diagnose and treat, beyond taking the medical history, writing clinical notes, listening to heartbeats, and doing other physical examinations. There would be nothing to give: no medicine, no drugs, no treatment.


Thus, "the proportion of clinicians receiving this award has been diminishing year on year. In the past 100 years of awards to medical scientists (excluding war periods where Nobel Prizes were not awarded to individuals); over 79% of Nobel Prizes in the first 30 years were awarded to clinicians. This contrasts significantly with the last 30 years, where only 26% of prizes have been awarded to clinicians between 1970 - 1979.

 

Before and after 1979, almost all the Nobel Prizes in Medicine or in Physiology were given to non-clinicians.

Normally clinicians do not make medical discoveries or contribute new knowledge to medicine. They normally do just routine work in hospitals and elsewhere.

 

The Nobel Prize in Medicine is arguably the most prestigious award in the world in healthcare.

 

Although clinicians do routine work, applying the results and products of research done by their colleagues who are medical scientists, my feeling is that at least some of them should be given the Nobel Prize, because clinicians have saved millions of lives throughout the world, especially in emergency medicine and in critical care, an area I know well.

 

But then there are millions of clinicians around the world, including myself, and we cannot expect the Nobel Assembly that awards the prize to bestow such a prestigious accolade on every clinician.

 

Then again, not all diseases are life-threatening and require instant life-saving intervention; most are not. So, to whom among the millions of clinicians in this world should the Nobel Prize be given?

 

Such a precious award should be given only to someone whose discoveries and contributions in medicine have benefited and saved millions of lives, and not for a few rare cases such as a heart or liver transplant that benefits only a few, that only the very rich can afford, and that enriches the surgeon at the same time. Treatments that benefit only a few do not deserve the Nobel Prize. It must be something simple, cheap, and very large-scale, such as a vaccine or a drug that could wipe malaria, or the re-emerging tuberculosis, from the surface of this earth forever. So far, no scientist has done this.

 

But I still think some clinicians, who have toiled tirelessly among thousands of poor communities without asking for remuneration, deserve this glamorous Prize in Medicine. Very few do this. This is my feeling, even though we admit that 99.99% of new diagnostic and therapeutic approaches these days are contributed by medical scientists, and not by clinicians.

 

I think one of the reasons is that modern medicine these days focuses not just on the traditional, broad aspects of medicine, using standard types of diagnosis and treatment as we see in all hospitals and health-care centres; scientists now help the clinician go down to the molecular level of medicine, for example using nanotechnology and nanoscience to manage SARS-CoV-2 and its variants, epigenetics and stem cell therapy in the management of diseases, and gene-targeted diagnostics and therapies for cancer and metabolic liver disorders.

 

Other areas of medicine that require an understanding of molecular biology are cell therapy, gene virotherapy, molecular mechanisms of the immune response, molecular mechanisms in neurodegeneration, molecular medicine in cancer treatment, molecular medicine for cardiology, and molecular microbes and disease.

 

These areas are very tough to understand, and beyond the knowledge of ordinary clinicians, who normally have only a basic bachelor's degree in medicine and surgery, unless they also hold a doctorate (PhD) in that area.

 

Another possible reason why such a prestigious Nobel Prize in medicine is not given to medical doctors is that they can earn more money by charging their patients huge fees than by going into research, which earns them nothing except that their names go eternally into the annals of medicine when they publish a paper that is cited again and again by other researchers and clinicians. That too is quite a glory and an achievement. Whereas even if a clinician sees 100,000 patients in his lifetime, it is just his routine job, and he is paid for each case he sees. In what way should he be given a Nobel Prize for this? Even if a clinician treated 100,000 cases in his life, his name will not be quoted or cited, and it will not go into the archives of medicine. Only those researchers who have published papers contributing new knowledge to medicine will find their names quoted in the archives of medicine.

 

Furthermore, for a clinician to go into research, he or she must have the kind of mind that can think critically and analytically, outside the box and beyond textbook knowledge. Additionally, they need to source research funds, which can be quite difficult outside their fixed salaries. They also need to find three to six research collaborators outside their own expertise, and collaborators willing to cooperate are hard to come by. Hence it is far more lucrative for clinicians to go into private practice and charge patients exorbitantly.

  

This may explain why most of the Nobel Prizes in medicine since before the war have not been given to clinicians, but to Western scientists who may not even have a medical degree, although most do.

 

But what is most important is that they must have a PhD. That alone suffices. They use their basic understanding of medicine as a springboard and bounce high up into new areas of diagnostics, treatment, and preventive medicine, and give the results of their work and discoveries to the clinician to apply to patients. Such a scenario will eliminate all clinicians from getting any Nobel Prize in Medicine or Physiology.


With a PhD, these scientists can think out of the box. They then plough into new areas of medicine and make significant discoveries, which earn them the much sought-after, highly esteemed, and glamorous Nobel Prize.

 

See my additional view here, published on Sunday, July 9, 202

 

lim ju boo

 
