But most of the time I have difficulty understanding the material in the book. The content itself is fine, yet I end up asking questions on cs.StackExchange or StackOverflow. So I thought of reading reviews of the text to find out whether I am the only one facing this problem, or whether other people also find the language hard to understand. I came across the review page for this book on Goodreads, and among the many reviews I found one that probably explains the problem with the text. Here it is:
The good thing about this book is that it is comprehensive and covers a lot of ground from different vantage points. For example, the parsing chapters cover the design of lex and yacc in addition to the basic topics, and the type-checking and runtime-environment chapters have examples from both C and ML to cover different scenarios.
Now the bad parts. The book is awfully written, and I cannot understand how a book in its 2nd (or 3rd?) edition can continue to be so bad. The writing is shoddy throughout, but the prize for the worst writing goes to these topics:
Another consistent problem is their horrible use of passive voice, which interacts really badly with their discussion of the mathematical aspects. While the use of passive voice might be excusable, its shoddy use is not.
Reading this book often felt like deciphering some ancient text. I had to translate a lot of obtuse writing into something I could understand. This kind of translation happens involuntarily in all reading, but it was alarming that I had to do it consciously for this book.
I am not sure why this book is called a "classic" computer science book. It is more like the kind of book that students are forced to study by their professors. This book is impossible to read unless you are following some other source; I followed Prof. Alex Aiken's lectures.
This book might be responsible for the unpopularity of study and research in programming languages. I agree that compilers are not easy, but this book has taken a difficult topic and made it incomprehensible, thereby doing a disservice to the whole field.
I am not criticizing the text or its authors. It is just that the text probably does not suit a self-learner like me. Could anyone suggest a compiler text to substitute for the red dragon book? A book written in plain language, such that one can easily understand what the authors mean after reading a sentence once. That would help a self-learner focus on the concepts rather than getting lost trying to work out the meaning behind the lines.
The "problem" with the dragon book is that it is so complete, intentionally so. Over its lifetime there have been tremendous advances in the theory and practice of building compilers. If you want the state of the art, then you want the dragon book, probably even a later (harder) edition.
But it is a bit much, as you have seen, for a first course or an overview of compiling. It is often used in graduate courses, for example, and even then in a follow-up course to one on Language Principles, which covers much of the underlying theory separately.
While it is hard to find and a bit expensive, one of the best books for an introduction, and separately a genuine piece of CS literature, is Brinch Hansen's On Pascal Compilers. It shows how to build a compiler, as well as a machine to execute the code, using Recursive Descent Parsing, which is easy to understand and suitable for LL languages (with a few extensions for non-LL features).
Most commercial compilers use LR parsing techniques even when the language is LL, simply for efficiency reasons, but the theory is much harder, as are the compilers. For a state-of-the-art understanding you need to delve into LR, but for a first introduction LL is enough and easy to understand.
LL compiling is also called Predictive Parsing, because it is easy to look at a language statement and choose a method for translating it, usually by its leading keyword. Languages suitable for LL parsing are also easy for a human to read and understand; if a language feature absolutely demands LR parsing, it will usually be difficult for a person to read as well.
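To make that concrete, here is a minimal Python sketch of predictive parsing by recursive descent. The toy grammar, tokens, and statement forms are my own invention for illustration, not from any particular book; the point is only that one token of lookahead, usually a keyword, picks the production.

```python
# A minimal predictive (LL(1)) recursive-descent parser for a toy grammar:
#   stmt -> "if" expr "then" stmt | "print" expr | ident "=" expr
#   expr -> term { "+" term }
#   term -> ident | number
def tokenize(src):
    return src.split()

class Parser:
    def __init__(self, tokens):
        self.tokens = tokens
        self.pos = 0

    def peek(self):
        return self.tokens[self.pos] if self.pos < len(self.tokens) else None

    def eat(self, expected=None):
        tok = self.peek()
        if tok is None or (expected is not None and tok != expected):
            raise SyntaxError(f"expected {expected!r}, got {tok!r}")
        self.pos += 1
        return tok

    def stmt(self):
        # One token of lookahead (usually a keyword) chooses the
        # production: this is what makes the grammar "predictive".
        tok = self.peek()
        if tok == "if":
            self.eat("if"); cond = self.expr(); self.eat("then")
            return ("if", cond, self.stmt())
        if tok == "print":
            self.eat("print")
            return ("print", self.expr())
        name = self.eat()   # anything else is treated as an identifier
        self.eat("=")
        return ("assign", name, self.expr())

    def expr(self):
        node = self.term()
        while self.peek() == "+":
            self.eat("+")
            node = ("+", node, self.term())
        return node

    def term(self):
        return self.eat()

print(Parser(tokenize("if x then y = y + 1")).stmt())
# -> ('if', 'x', ('assign', 'y', ('+', 'y', '1')))
```

Each nonterminal becomes one method, so the parser's structure mirrors the grammar; that is what makes recursive descent so much more approachable than table-driven LR parsing.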
But I did have the same complaints from my students. Then one of them (Hussain Hassan Mehanna; he is probably a professor in the UK now) told me that his friend at AUC used a book called "Compiler Construction: Principles and Practice" by Kenneth C. Louden. I bought it, the faculty agreed, the students liked it... it was fine.
Computational thinking, which centers around devising abstractions for problems so they can be solved with computational steps and efficient algorithms, is a concept that serves not only computer science (CS) but progressively more of science and everyday life. Since Jeannette Wing published her influential paper on computational thinking,50 the discussion of computational thinking has greatly expanded in scale to include topics such as modeling natural processes as information processing.18
At the heart of computational thinking is abstraction, the primary subject of this essay. While abstractions have always been an essential part of computer science, the modern emphasis on computational thinking highlights their importance when teaching CS to a broad audience.
Abstractions play an important role in shaping virtually every field of science. However, in computer science, abstractions are not tethered to a physical reality, so we find useful abstractions pervading the field. For example, we shall encounter in Section 4 a vital abstraction from physics: quantum mechanics. There is a derived computing abstraction called quantum circuits, which starts with the physical ideas but has led to programming languages for its simulation and to theoretical algorithms that exploit its unique capabilities and that may one day be implemented on a large-scale machine.
It may be that Abstraction 2 is sufficiently close to the machine that it has meaningful performance metrics. If so, Abstraction 1 can inherit those metrics to provide a notion of quality for algorithms written in Abstraction 1. However, we should remember that the high-level abstraction usually can be implemented in several different lower-level abstractions. Each of these abstractions may provide a radically different notion of running time or other metrics and thus imply different notions of algorithm goodness at the high level.
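As an illustration (the algorithm, block size, and numbers below are invented for this example, not drawn from the paper), consider a single high-level operation, a sequential scan, costed under two lower-level abstractions: a unit-cost RAM, where every element access costs one unit, and a secondary-storage model, where the unit of cost is the transfer of one block of B elements.

```python
import math

# The same scan, two cost models. Under the RAM abstraction the scan
# costs n element accesses; under a block-transfer abstraction it
# costs ceil(n / B) I/O operations.
def scan_cost_ram(n):
    return n                       # one unit per element access

def scan_cost_external(n, B):
    return math.ceil(n / B)        # one unit per block transfer

n, B = 1_000_000, 4096
print(scan_cost_ram(n))            # 1000000
print(scan_cost_external(n, B))    # 245
```

The two models can rank algorithms differently, which is exactly why the notion of "goodness" at the high level depends on which lower-level abstraction is chosen.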
1.2. A taxonomy for abstractions. We can identify at least four different kinds of abstractions that are distinguished by their intended purpose. In the discussions that form the body of this paper, we shall give examples of each and explore their interactions.
1.2.1. Fundamental abstractions. Like all abstractions, these consist of a data model and operations. These abstractions are often thought of as abstract data types31,46 as implemented in object-oriented programming languages. However, fundamental abstractions do not have a specific implementation for the operations or a specific data structure to represent their data. One can also liken these abstractions to interfaces as in Java,48 but unlike the interface, these abstractions have an intended meaning for their operations, not just a name for the operations.
There are actually two somewhat distinct purposes for studying fundamental abstractions. In some cases, they represent common operations that are worth studying in their own right and which have a variety of approaches to implementation. For instance, we discuss the dictionary (a set with operations insert, delete, and lookup) in Section 1.4. Other examples of this type include stacks, queues, priority queues, and a number of other abstractions that we have cataloged in Aho et al.7 and Aho and Ullman.10
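As a concrete, informal sketch (the Python code below is illustrative only and is not drawn from the cited references), the dictionary can be written as an interface whose operations carry an intended meaning, together with two of its many possible implementations:

```python
from abc import ABC, abstractmethod

# The dictionary abstraction: named operations with an intended
# meaning, but no prescribed data structure or implementation.
class Dictionary(ABC):
    @abstractmethod
    def insert(self, x): ...
    @abstractmethod
    def delete(self, x): ...
    @abstractmethod
    def lookup(self, x): ...

# One possible implementation: an unsorted list, O(n) per operation.
class ListDictionary(Dictionary):
    def __init__(self): self.items = []
    def insert(self, x):
        if x not in self.items: self.items.append(x)
    def delete(self, x):
        if x in self.items: self.items.remove(x)
    def lookup(self, x): return x in self.items

# Another: a hash table (Python's built-in set), expected O(1)
# per operation.
class HashDictionary(Dictionary):
    def __init__(self): self.items = set()
    def insert(self, x): self.items.add(x)
    def delete(self, x): self.items.discard(x)
    def lookup(self, x): return x in self.items
```

The abstraction fixes neither the data structure nor the cost; choosing among implementations such as these is precisely the algorithm-design question the abstraction exposes.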
1.2.3. Declarative abstractions. One of the most important uses of abstractions is to foster a style of programming where you say what you want done, but not how to do it. Thus, we find many different abstractions that consist of a data model and a programming language that is at a higher level than that of conventional languages; often these languages are algebras of some sort. Examples are regular expressions, to be discussed in Section 2.1, and relational algebra, which we mention in Section 3. Context-free grammars (Section 2.2) are another example of this type of abstraction, although not strictly algebraic.
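A small example of the declarative style, using regular expressions through Python's re library (the particular pattern is invented for this illustration):

```python
import re

# Declarative: the pattern states *what* strings to accept (an
# identifier, "=", then digits); the regex engine decides *how*
# to match, e.g., by compiling to an automaton.
pattern = re.compile(r"[A-Za-z_]\w*\s*=\s*\d+")

print(bool(pattern.fullmatch("count = 42")))   # True
print(bool(pattern.fullmatch("42 = count")))   # False
```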
The special characteristic of these abstractions is that their compilation requires a serious degree of optimization. Unlike optimization of conventional languages, where you are happy to speed up running time on one machine by a factor of two, for these abstractions there can be orders of magnitude difference between the running times of good and bad implementations. Another characteristic is that the programming language for a declarative abstraction is not Turing-complete. Undecidability properties of any Turing-complete language would preclude the existence of an optimizer that deals effectively and generally with what a program wants to do without being told how to do it.
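To see the orders-of-magnitude point concretely, here is a sketch (with invented data) of one declarative join evaluated in two ways; an optimizer for relational algebra must, in effect, choose between plans like these:

```python
# The same join, two evaluation strategies.
def join_nested(R, S):
    # Naive nested loops: O(|R| * |S|) comparisons.
    return [(a, b) for (k1, a) in R for (k2, b) in S if k1 == k2]

def join_hash(R, S):
    # Hash join: O(|R| + |S|) expected time.
    index = {}
    for k, a in R:
        index.setdefault(k, []).append(a)
    return [(a, b) for (k, b) in S for a in index.get(k, [])]

R = [(i, "r%d" % i) for i in range(1000)]
S = [(i, "s%d" % i) for i in range(1000)]
assert sorted(join_nested(R, S)) == sorted(join_hash(R, S))
```

For these inputs the nested-loop plan performs a million comparisons while the hash plan does roughly two thousand operations, and the gap grows with the data: an asymptotic difference, not a constant factor.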
1.2.4. Computational abstractions. In contrast to the abstract implementations, these abstractions are close to a physically implemented machine. That is, no one would build a machine for the sole purpose of implementing an abstract implementation, but one normally implements a computational abstraction or something to which it is easily translated. Thus, computational abstractions offer meaningful performance metrics, even if they are not 100% accurate.
You are probably familiar with a number of these abstractions, since they include all the common programming languages as well as machine instruction sets. Other abstractions of this type are more theoretical, such as the random-access machine (RAM) model19 or the parallel-RAM (PRAM) model.20 Here, we shall talk in Section 1.7 of a model of a conventional machine that emphasizes the role of secondary storage. We shall also discuss abstractions for parallel computation: bulk-synchronous in Section 3.5 and MapReduce in Section 3.6.
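As a preview of the MapReduce abstraction, here is a toy, single-machine simulation written for illustration (real implementations distribute the work across many machines; only the shape of the abstraction is shown here):

```python
from collections import defaultdict

# The programmer supplies map and reduce functions; the framework
# handles grouping by key (the "shuffle").
def map_fn(doc):
    for word in doc.split():
        yield (word, 1)

def reduce_fn(word, counts):
    return (word, sum(counts))

def mapreduce(inputs, map_fn, reduce_fn):
    groups = defaultdict(list)
    for item in inputs:                    # map phase
        for key, value in map_fn(item):
            groups[key].append(value)      # shuffle: group by key
    return [reduce_fn(k, v) for k, v in groups.items()]  # reduce phase

docs = ["the cat sat", "the cat ran"]
print(mapreduce(docs, map_fn, reduce_fn))
# [('the', 2), ('cat', 2), ('sat', 1), ('ran', 1)]
```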