I am thinking of a career in Computer Programming

It was 1988 – 1989, and I was in ninth standard (ninth grade). One computer training institute under the auspices of the Archdiocese had ‘sold’ their student-computer-training-course to our school. I was one of the few students who got the money from our homes to do that introductory course. The theory session covered all the basics of computers and also the basics of GW BASIC programming language which were being taught right after school in the evenings. A handful of practical sessions were also arranged in the air-conditioned computer lab for which we had to go to their institute on holidays. I still remember the two PCs costing about Rs 100,000 each at that time. Intel 486 or 386’s ? I think it was 386s. Rs 100,000 was also the price of a new Ambassador Car at that time. So, adjusting for inflation, those PCs cost somewhere in the region of Rs 300,000 to 350,000.

We learned about Charles Babbage, the Analytical Engine, valve-based computers, LSI and VLSI, the first, second, third, and fourth generations of computers, and the impending fifth generation. But there was no mention of Ada Lovelace.

For the practicals, we started with computer games and then ran the GW BASIC programs we learned.


20 LET A = 2

30 LET B = 3

40 LET C = A * B

50 PRINT C

60 END

and then hit ‘RUN’

Voila 6!

From then onward PCs exploded onto the scene, and by 1996 computers were ubiquitous. At least it was like that for students. That was when I decided to take a course in Computerized Accounting. I went to the same institute, and this time I was a grown-up. The curriculum overlapped with that of the PGDCA (Post Graduate Diploma in Computer Applications), so for the theory classes I had to sit with the PGDCA students.

Basics of computers, the binary number system, and the like. That was when I learned that to do a subtraction in binary, you take the number to be subtracted, swap all the zeros with ones and all the ones with zeros, add 1 to that, and then simply do an addition, discarding any carry beyond the word size. The result is the same as subtracting. That is how computers handle subtraction.
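The bit-flipping trick above is called two's complement. A minimal Python sketch of it, assuming 8-bit values (the function name `subtract` is just an illustration, not a real API):

```python
# Subtraction via two's complement, assuming 8-bit values.
# To compute a - b: invert every bit of b, add 1, then add the
# result to a and discard any carry beyond 8 bits.

BITS = 8
MASK = (1 << BITS) - 1  # 0b11111111

def subtract(a, b):
    # Invert b's bits within 8 bits, then add 1: two's complement of b
    neg_b = (~b & MASK) + 1
    # Ordinary addition; keep only the low 8 bits
    return (a + neg_b) & MASK

print(subtract(49, 7))                 # 42
print(bin(subtract(0b1100, 0b0101)))   # 12 - 5 = 7, printed as 0b111
```

The same adder circuit that does addition therefore does subtraction as well, which is why the trick matters to hardware designers.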

I remember Java becoming popular. Java compiles into bytecode that runs on the JVM (Java Virtual Machine), so it is platform independent. Whether you have Windows, Mac, Linux, or any other operating system, once the JVM is installed, it will run Java programs. It is unlike the C language, which compiles into machine code, and machine code differs from computer to computer.

Notwithstanding all this, I never thought that I could become a computer programmer. I was daunted by the thought of communicating with a machine that knew nothing but machine code, a string of 0s and 1s. Then I decided to learn more about programming. The findings astonished me.

In the basic sense, computer programming is asking the computer to do what we want it to do. If we type something into a word processor and we don't like the font, we change it: we select what we have already typed, pull down the font menu, choose the new font, and press 'Enter'. The font is changed. This is using a word processor, but the idea is the same. You are asking the computer to do what you want it to do.

We need not reinvent the wheel. We all learn the multiplication table in school. 7 x 7 is 49. We learn it by heart. Whenever we encounter 49 in life, we can break it down to 7 times 7, or vice versa. This is the gist of programming. Complex programs are broken down into simpler pieces of code, and those simple pieces are assembled back into complex programs. Ultimately, it is all compiled down to binary that the computer can understand.
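The breaking-down idea can be sketched in Python: a simple piece is written once and learned "by heart", then reused inside a bigger piece (the function names here are hypothetical, chosen only for the illustration):

```python
# Simple piece, analogous to knowing 7 x 7 = 49 by heart.
def square(n):
    return n * n

# Bigger piece, built out of the simple one rather than from scratch.
def sum_of_squares(numbers):
    return sum(square(n) for n in numbers)

print(square(7))                 # 49
print(sum_of_squares([1, 2, 3])) # 1 + 4 + 9 = 14
```

Neither function knows anything about binary; the compiler and interpreter take care of that layer.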

In the above analogy, the code to make a word processor is already written. When we select what we have already typed, pull down the font menu, choose the new font, and press 'Enter', that code is executed in the background.

In the same sense, within a piece of a computer program, I can select a few lines of code that are already written and modify them to suit my needs. If I write them properly, they will run smoothly. I need not worry about all the binary code that ultimately gets processed.

This suits the nature of the human brain. We can only process a limited amount of data at a time, and deal with only a few things at once. But hundreds of people, or even one individual over many sittings, can process small chunks of a problem. When those chunks are written sequentially and logically, they become one big piece of software.

A problem-solving approach is the right way to look at programming. You need to learn to program logically, learn computer algorithms, study data structures, and the like. I am thinking about a career in computer programming.