From Chalk and Log Tables to Supercomputers: A Personal and Global Journey through the Evolution of Computing
A Personal Prelude: When Arithmetic Was Manual and the Brain Was the Only Processor
I still remember vividly the early days of my primary school education at Batu Pahat High School, in the small town in Johor where I was born and brought up, in what was then Malaya and is now Malaysia.
Long before anyone had heard of artificial intelligence or supercomputers, we relied solely on our memory and manual skills to perform even the simplest arithmetic. We were first taught addition and subtraction, then graduated to multiplication and division, all performed on paper or mentally. The multiplication table, up to twelve times twelve, was a sacred tool, always printed on the back cover of our school exercise books, a humble yet powerful guide that we often memorised by heart.
As I progressed into lower forms in secondary school, mathematics became richer and more abstract: algebra, geometry, and trigonometry entered the curriculum. We began using logarithmic tables, printed in little blue books, to simplify multiplication and division of large numbers, an early form of computational aid. These tables were our analog calculators, and we took pride in being able to manipulate them with skill and understanding.
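The trick behind those little blue books can be sketched in a few lines: since log(a·b) = log a + log b, multiplication reduces to addition followed by an antilog lookup. A minimal illustration, with modern Python standing in for the printed tables:

```python
import math

def log_table_multiply(a: float, b: float) -> float:
    """Multiply two numbers the way a log-table user would:
    look up the logarithms, add them, then take the antilog."""
    log_a = math.log10(a)          # "look up" the log of a
    log_b = math.log10(b)          # "look up" the log of b
    return 10 ** (log_a + log_b)   # antilog of the sum

# 37 x 54 computed via logarithms instead of direct multiplication
print(round(log_table_multiply(37, 54)))  # 1998
```

In practice the printed tables gave only four or five significant figures, so the final answer carried a small rounding error, which users learned to anticipate.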
In Form 5, still at Batu Pahat High School, and later in Singapore, where I did my GCE A Levels in Science, we met calculus, a discipline that demanded a deeper understanding of limits, rates of change, and areas under curves. Every equation, every integral, and every derivative was solved painstakingly by hand. There were no shortcuts, only sheer intellectual effort and perseverance.
The Slide Rule Era: Precision by Hand
During my postgraduate studies at the University of London and later the University of Reading, I relied heavily on the slide rule, a fascinating analog instrument that allowed for multiplication, division, roots, logarithms, and even trigonometric functions. To us, it was an engineer’s or scientist’s portable computer. No batteries, no buttons, just a calibrated ruler and the sharpness of the human mind. The slide rule was a marvel of its time, a symbol of logic, mathematics, and manual precision.
The Dawn of Mechanical Calculators and Room-Sized Computers
By the late 1960s, when I began working professionally, I encountered the mechanical calculating machine, a heavy, clunky device with levers, gears, and rotating drums. We would crank or punch keys to perform basic arithmetic, and it was seen as a huge time-saver compared to manual calculations.
At about the same time, the first computers arrived at workplaces and universities. But they were not the sleek machines we know today. These were massive mainframe computers, often occupying an entire room, which had to be kept cold with four or more industrial air conditioners to prevent overheating; otherwise they simply would not work. I remember we had only one computer, housed in the Computer Room, and we used it in turn, booking 'computer time' in advance. The room was so cold that we literally had to wear winter clothing while working inside it for hours. Their computing power was minuscule by today's standards, yet they were revolutionary.
Using them was not straightforward. We had to learn computer languages such as BASIC, FORTRAN, and COBOL that were totally new to me. I had to attend courses for this, run by the Institute for Medical Research where I was working. Programming required punch cards and batch processing, nothing was instantaneous. I submitted my program and waited hours, sometimes days, for results. But for all their limitations, these early computers signaled a shift in how we processed information. They were the forerunners of everything that was to come.
Pocket Revolution: The Rise of the Electronic Calculator
Soon, the world witnessed the birth of the electronic pocket calculator, a compact device that could instantly compute addition, subtraction, multiplication, and division. Initially expensive and rare, they soon became ubiquitous. Later versions could handle trigonometric functions, logarithms, complex algebra, and even calculus. Some advanced models allowed for simple programming, making them portable computational powerhouses for students, engineers, and scientists alike. I remember buying several of the first electronic calculators to reach the market, replacing each one as newer, more advanced versions appeared within short intervals: first with scientific features, later with the ability to handle complex mathematical calculations, including calculus and statistics. When programmable models became available, I bought those too and retired the older ones. I used them for mathematical calculations in astronomy and physics as a hobby, and professionally to analyze the statistical data gathered in my medical research.
This was a silent revolution. With a calculator in our pocket, many of us felt we held in our hands the equivalent of what once required entire rooms.
The Microcomputer Revolution: Personal Computers in Every Home
The 1980s and 1990s saw the arrival of personal computers (PCs). Thanks to visionaries like Steve Jobs and Bill Gates, computers became increasingly affordable and accessible. No longer the domain of large institutions, ordinary people could now own and operate a computer from their homes. These machines used user-friendly operating systems like MS-DOS and later Windows and Mac OS.
Software flourished. Word processors, spreadsheets, and early internet browsers transformed how we worked, communicated, and learned. Graphical interfaces replaced command-line programming, and the world entered the digital age.
The Rise of the Internet and Cloud Computing
The late 1990s and early 2000s brought about the internet revolution. Computers were no longer isolated machines but part of a vast global network. Information, once locked in libraries or behind closed doors, became available to anyone with a connection. Email, websites, social media, and cloud storage redefined how we interact with knowledge and one another.
Cloud computing further removed the need for physical hardware. You could now access software, processing power, and storage over the internet. Computation became decentralized and borderless.
Harnessing the Titans of Thought: Applications of Supercomputers in Modern Science and Society
Parallel to these public revolutions, another quieter but more powerful transformation unfolded, the rise of the supercomputer. These colossal machines, with thousands or even millions of processing cores, are designed to solve the most complex, data-intensive, and iterative problems across the sciences and industry.
In the realm of quantum mechanics, supercomputers are essential for solving the time-independent Schrödinger equation in multi-particle systems. Using iterative numerical methods such as density functional theory or variational techniques, these calculations simulate electronic structures and energy states with high accuracy. This enables breakthroughs in materials science, quantum chemistry, and nanotechnology.
Modern supercomputers, like Fugaku (Japan), Frontier (USA), and LUMI (Europe), operate at speeds measured in exaflops, a quintillion (10^18) operations per second. To compare, a basic calculator might perform a few thousand operations per second.
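That comparison is easy to make concrete with simple arithmetic. Assuming an illustrative figure of 1,000 operations per second for a basic calculator (the actual rate varies by model) against one exaflop:

```python
EXAFLOP = 10**18      # operations per second for an exascale machine
CALCULATOR = 1_000    # illustrative ops/sec for a basic calculator

# How long would the calculator need to match ONE SECOND of exascale work?
seconds = EXAFLOP / CALCULATOR
years = seconds / (60 * 60 * 24 * 365)
print(f"about {years / 1e6:.1f} million years")  # about 31.7 million years
```

In other words, a single second of exascale computation would keep a pocket calculator busy for tens of millions of years.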
Supercomputers vs. Ordinary Computers
| Feature | Ordinary Computer | Supercomputer |
|---|---|---|
| Processing Speed | GHz-level (billions of operations/sec) | Exaflop-level (quintillions of operations/sec) |
| Number of Cores | 4 to 32 | Hundreds of thousands to millions |
| Usage | Everyday tasks (email, documents) | Complex simulations, AI training, climate models |
| Storage | GB to TB | Petabytes |
| Accessibility | Personal, business, education | Research labs, government, elite institutions |
| Cost | Hundreds to thousands (USD) | Hundreds of millions (USD) |
In astrophysics, supercomputers are used to simulate nuclear fusion reactions at the heart of the Sun. The proton-proton chain, where hydrogen nuclei fuse into helium, involves quantum tunneling, plasma physics, and weak nuclear interactions, processes that demand enormous computing power to model precisely. These simulations help scientists understand stellar lifecycles, solar dynamics, and neutrino production. Such calculations require mathematical iterations performed at extreme speed, since the underlying data changes continuously.
Supercomputers also power climate modeling, integrating atmospheric, oceanic, and terrestrial data into massive, long-term simulations. These models project global warming trends, sea-level rise, and extreme weather events, and guide climate policy and environmental strategies.
In medicine and molecular biology, supercomputers simulate protein folding, drug-receptor binding, and enzyme kinetics using molecular dynamics and hybrid QM/MM models. This accelerates drug discovery, enabling virtual screening of compounds and optimizing them before clinical trials, saving both time and lives.
The rise of artificial intelligence has further amplified the role of supercomputers. Training deep learning models, such as those used in language processing, computer vision, and autonomous driving, involves billions of parameters and massive datasets. Supercomputers enable the iterative optimization processes that underlie machine learning and AI development.
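The "iterative optimization" behind machine learning is, at its core, gradient descent repeated billions of times over billions of parameters. A toy sketch of one such loop, fitting a single parameter, nowhere near the scale of a real model but embodying the same idea:

```python
# Fit y = w * x to data by gradient descent on mean squared error.
data = [(1.0, 2.1), (2.0, 3.9), (3.0, 6.2)]  # (x, y) pairs, roughly y = 2x

w = 0.0                    # initial parameter guess
lr = 0.01                  # learning rate
for step in range(1000):   # real training runs loops like this at vast scale
    # gradient of mean squared error with respect to w
    grad = sum(2 * (w * x - y) * x for x, y in data) / len(data)
    w -= lr * grad         # one optimization step

print(round(w, 2))  # converges close to 2.0
```

A modern language model repeats essentially this update across billions of parameters and trillions of training examples, which is exactly why it demands supercomputer-class hardware.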
In high-energy physics, supercomputers analyze experimental data from particle colliders, model quark-gluon plasmas, and simulate conditions just moments after the Big Bang. They help physicists decode the behavior of subatomic particles, contributing to the discovery of entities like the Higgs boson.
Engineers use supercomputers to run computational fluid dynamics (CFD) simulations for designing aircraft, rockets, vehicles, and buildings. These simulations predict airflow, turbulence, and structural stress, enhancing safety, efficiency, and innovation in design.
In cybersecurity and cryptography, supercomputers are employed to simulate encryption algorithms, detect vulnerabilities, and analyze massive network data for potential intrusions. They are also being used to explore quantum algorithms that could one day revolutionize digital security.
Economists and governments utilize supercomputers for macroeconomic forecasting, market simulations, and risk modeling. These systems simulate the behavior of entire economies using agent-based models, factoring in consumer behavior, trade flows, and geopolitical variables.
In genomics, supercomputers have revolutionized DNA sequencing, allowing scientists to process massive genetic datasets for understanding disease susceptibility, population migration, and personalized medicine. Epidemiologists also rely on supercomputers to simulate pandemic spread, guiding public health responses in real-time.
In every domain, from atoms to galaxies, from the genetic code to the economy, supercomputers allow us to ask and answer questions that were once far beyond our reach.
A Reflection Across Time
As I look back on my journey, from memorising multiplication tables and solving calculus problems by hand to now witnessing the dawn of AI and supercomputers, I am filled with awe. Not just at the machines themselves, but at the human mind behind them. Every generation built upon the last, combining logic, creativity, and perseverance.
From mechanical clanks to quantum entanglement, from punch cards to predictive AI, we have come a long way. And yet, we are only at the beginning.
Today, retired from medical research, I have returned to astronomy, the marvel of my childhood, gazing at the twinkling worlds up there in the dark open skies.