Baby steps to eddy currents
It all started when I first began writing code. I was about six years old, and my first program drew random lines by steering a small dot, "the Turtle", across a CRT screen using LOGO. It was all fascinating and wonderful. To express the steps in my mind to solve something and then watch them carried out in front of my eyes was exhilarating.
I discovered my passion for programming when I was in 7th grade. I was already good at it and knew far more than what was taught in our textbooks. But 7th grade was when I really started enjoying making things with a programming language. And since almost all my relatives, except for my parents, had a diploma or a degree in CS, I had a nearly endless supply of reading material and endless stories to listen to.
In my journey as a CS grad, I have been influenced by many people — my uncle, whose books I used to learn Java EE and automata theory while still in high school; a person I met at a conference whose name I unfortunately don't remember (in my defense, he had a Nordic name), who was involved with the Linux project at a very early stage, contributing some of the first Ethernet drivers to the kernel; and many other great minds whose theories and algorithms we study.
Ever since I learnt to code, I was interested in making computers understand natural language, as opposed to instructions in a programming language, so I was drawn to the field of Natural Language Processing. And I was largely influenced by Dr. Christopher Manning. He is a professor of Linguistics and of Computer Science and the director of the Stanford Artificial Intelligence Laboratory (SAIL). His primary area of work is robust but linguistically sophisticated natural language understanding, and opportunities to use it in real-world domains. His publications include some ground-breaking work in deep learning for NLP, Universal Dependencies and dependency parsing, language learning through interaction, and reading comprehension.
I became aware of Dr. Manning through the Coursera NLP class (circa 2012). This was also my first introduction to UNIX tools for text processing (awk, sed, et al.). Then came his books that I studied as part of my university course — on Information Retrieval, and "Foundations of Statistical Natural Language Processing" — and all the research papers and tools I used during my undergraduate thesis, like GloVe vectors, the CoreNLP suite of NLP routines, and the Universal Dependencies specification.
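As a small aside, the kind of UNIX text processing that class introduced can be sketched in a single pipeline — a minimal word-frequency counter built from standard tools (the sample sentence here is just an illustration, not from the course):

```shell
# Count word frequencies: lowercase the text, split it into one word
# per line, then sort, count duplicates, and rank by frequency.
echo "the quick brown fox jumps over the lazy dog the fox" |
  tr 'A-Z' 'a-z' |          # normalize case
  tr -sc 'a-z' '\n' |       # turn every run of non-letters into a newline
  sort | uniq -c |          # count identical adjacent words
  sort -rn | head -2        # two most frequent: "the" (3), then "fox" (2)
```

It is striking how far a handful of composable tools like these go before any "real" NLP code is needed.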
Finding a role model in computer science is difficult. Computer Science has existed for only about half a century. In contrast, mathematics, physics, chemistry, and older disciplines of engineering like mechanical and civil engineering have far longer histories; Computer Science is the new kid on the block. Students of mathematics or physics take great pride in the scientists who made breakthrough progress in their disciplines. We have Einstein and Hawking in physics, and Ramanujan and Laplace in mathematics. These scientists have inspired generations of students by being their role models. In Computer Science, too, there have been great personalities. But alas, very few people can name them and state their contributions to the field.