Tuesday, May 3, 2011

SUMMARY

The machine instruction sets themselves constituted the first-generation programming languages.
Programs were conceived as sequences of machine operations, and programmers worked directly with the
hardware, often entering code in ones and zeros directly through the front panel switches. Assembly
languages, using mnemonic character strings to represent machine instructions, made up the second
generation of programming languages. Beginning with FORTRAN in 1954, third-generation languages
allowed programmers to work at a higher level, with languages that were much more independent of the
computer hardware.
Programs can be compiled or interpreted. A compiler generates machine instructions that run
directly on the computer, with no further need for the compiler itself. An interpreter, by contrast, is a
program that reads and executes source code a statement at a time. Java uses both approaches: Java
source code is compiled into machine-independent bytecode, and the Java Virtual Machine (JVM) interprets
that bytecode at execution time. Many JVM implementations today also compile bytecode to native machine
instructions on the fly.
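The compile-once-then-run versus read-and-execute distinction can be sketched with Python's own built-ins (an illustrative stand-in; Python here plays the role of any language toolchain, and the variable names are invented for the example):

```python
source = "x = 2 + 3"

# "Compile" the source once into bytecode (a code object), then run it
# without re-parsing the text -- analogous to Java's .class files.
code = compile(source, "<example>", "exec")
namespace = {}
exec(code, namespace)
print(namespace["x"])  # -> 5

# An interpreter, by contrast, re-reads the source text each time it runs:
namespace2 = {}
exec("x = 2 + 3", namespace2)
print(namespace2["x"])  # -> 5
```

Either path produces the same result; the difference is when the translation work happens and how often it must be repeated.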
Some languages are described as imperative, and of these we discussed procedural, object-oriented,
and scripting languages. Other languages are described as declarative, and of these we discussed functional
languages.
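The imperative/declarative split can be seen in a few lines of Python (a minimal sketch; the task and names are invented for illustration):

```python
# Imperative style: say *how* -- explicit steps that mutate state.
squares = []
for n in range(5):
    squares.append(n * n)

# Declarative (functional) style: say *what* -- a mapping, no mutation.
squares_fn = list(map(lambda n: n * n, range(5)))

print(squares)     # -> [0, 1, 4, 9, 16]
print(squares_fn)  # -> [0, 1, 4, 9, 16]
```

Both compute the same list; the imperative version spells out the sequence of operations, while the functional version describes the result as a transformation of its input.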
When designing a new language, computer scientists value execution efficiency, human readability, ease
of implementation, expressiveness, regularity, extensibility, standardization, hardware and operating system
independence, and security. It is not possible to achieve all virtues simultaneously, so language design means
making wise tradeoffs for the intended use.
Language processing programs like compilers and interpreters go through the phases of scanning, parsing,
and code generation. Scanning is also known as lexical analysis, and the output of the scanner is a stream of
tokens in the language (keywords, variable names, etc.). Parsing is also known as syntactic analysis; the
parser must verify that the stream of tokens conforms to the rules of the language grammar, and its output is
a parse tree. Finally, semantic analysis and code generation traverse the parse tree from the bottom up,
checking meaning and emitting the necessary machine instructions.
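The three phases can be sketched for a toy arithmetic language (a minimal, hypothetical example; the token patterns, function names, and stack-machine opcodes are invented for illustration):

```python
import re

def scan(text):
    """Lexical analysis: turn a string of characters into a stream of tokens."""
    return re.findall(r"\d+|[+*()]", text)

def parse(tokens):
    """Syntactic analysis: build a parse tree, giving * precedence over +."""
    def expr(i):
        left, i = term(i)
        while i < len(tokens) and tokens[i] == "+":
            right, i = term(i + 1)
            left = ("+", left, right)
        return left, i
    def term(i):
        left, i = factor(i)
        while i < len(tokens) and tokens[i] == "*":
            right, i = factor(i + 1)
            left = ("*", left, right)
        return left, i
    def factor(i):
        if tokens[i] == "(":
            node, i = expr(i + 1)
            return node, i + 1  # skip the closing ")"
        return ("num", int(tokens[i])), i + 1
    tree, _ = expr(0)
    return tree

def generate(node):
    """Code generation: walk the tree bottom up, emitting stack-machine ops."""
    if node[0] == "num":
        return [("PUSH", node[1])]
    op, left, right = node
    return generate(left) + generate(right) + [("ADD" if op == "+" else "MUL",)]

code = generate(parse(scan("2+3*4")))
print(code)  # -> [('PUSH', 2), ('PUSH', 3), ('PUSH', 4), ('MUL',), ('ADD',)]
```

The bottom-up traversal is visible in the output: the operands are pushed before the operations that consume them, so the multiplication happens before the addition, just as the grammar requires.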
Half a century into the computer age, the world of software encompasses a wide variety of general-purpose
and special-purpose languages based on formal definitions and grammars. Interpreters, compilers, virtual
machines, or some combination of the three support the myriad programs written in these languages. The future will probably bring
further differentiation and specialization of languages and programs as computer scientists further refine their
thinking about how best to translate human intention into machine instructions.
