Bit
(2014-11-09)
Slide version
Readings
- “American Standard Code for Information Interchange, ASA Standard X3.4-1963.” 1963. Web.
- “Characters: A Brief Introduction.” The Linux Information Project 2004. Web.
- “Digital, n. And Adj.” OED Online Mar. 2014. Web.
- Britton, Jill. “Binary Numbers: Number Representations And Conversions in Binary.” 2014. Web.
- Bryant, Randal E., and David O’Hallaron. “A Tour of Computer Systems.” Computer Systems: A Programmer’s Perspective. 2nd ed. Boston: Prentice Hall, 2011. 1–23. Web.
- ---. “Representing And Manipulating Information.” Computer Systems: A Programmer’s Perspective. 2nd ed. Boston: Prentice Hall, 2011. 25–120. Web.
- Ceruzzi, Paul E. “Introduction.” Computing: A Concise History. Cambridge, MA: MIT Press, 2012. ix–xvi. Web.
- Culkin, Jody. “Counting in Binary.” 2010. Web.
- Fischer, Eric. The Evolution of Character Codes, 1874-1968. 2000.
- Haugeland, John. “Analog And Analog.” Philosophical Topics 12.1 (1981): n. pag. Web.
- Laue, Andrea. “How The Computer Works.” A Companion to Digital Humanities. Ed. by Susan Schreibman, Ray Siemens, and John Unsworth. Oxford: Blackwell, 2004. Web.
- Lavagnino, John. “Digital And Analog Texts.” A Companion to Digital Humanities. Ed. by Susan Schreibman, Ray Siemens, and John Unsworth. Oxford: Blackwell, 2004. Web.
- Sammet, Jean E. “Symbolic Assembly Language Programming.” Programming Languages: History And Fundamentals. Englewood Cliffs, N.J.: Prentice-Hall, 1969. 2–3. Print.
- ---. “Machine Language Programming.” Programming Languages: History And Fundamentals. Englewood Cliffs, N.J.: Prentice-Hall, 1969. 2. Print.
- Searle, Steven J. “A Brief History of Character Codes in North America, Europe, And East Asia.” 2004. Web.
- Wittern, Christian. “Character Encoding.” A Companion to Digital Humanities. Ed. by Susan Schreibman, Ray Siemens, and John Unsworth. Oxford: Blackwell, 2004. Web.
Digital: etymology and history
- OED: digital, n. and adj.
- English digital < classical Latin digitalis, “measuring a finger’s breadth”
- In post-classical Latin, a broader meaning: “of or relating to the finger” (digitus, finger)
- Also used to describe the whole numbers from 1–9 (“single digits”)
- In domain of electronics and computing, used to denote representation by discrete values, usually in binary form
- In domain of electronics and computing, used in contrast with analog
- “Digital paradigm” (Ceruzzi)
- “…the notion of coding information, computation, and control in binary form, that is, a number system that uses only two symbols, 1 and 0, instead of the more familiar decimal system that human beings, with their ten fingers, have used for millennia” (Ceruzzi x)
- “It is not just the use of binary arithmetic, but also the use of binary logic to control machinery and encode instructions for devices, and of binary codes to transmit information” (Ceruzzi xi)
- Catachrestic and/or ideological usage
Binary numeration
- Binary digit or “bit”
- Find an ordinary “toggle” light switch (not a dimmer)
- Flip the switch. Is the lamp on, or is it off?
- Flip the switch again. Is the lamp on, or is it off?
- Flip the switch one more time. Is the lamp on, or is it off?
- The lamp controlled by the switch has only two states
- Illustration (navigate forward to second panel titled “On and Off”)
- A device with two states can represent a binary digit with only two values
- Counting in binary
- A binary number consists of groups of bits
- Illustration (navigate forward to third panel titled “Groups of bits”)
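The counting exercise above can be reproduced in a few lines of code. A minimal Python sketch (the language choice is an assumption; `bin()`, `int(..., 2)`, and `format()` convert between decimal and binary notation):

```python
# Count from 0 to 7 and display each value as a 3-bit binary number.
for n in range(8):
    print(n, format(n, "03b"))

# A binary number is positional notation in base 2:
# 101 (binary) = 1*4 + 0*2 + 1*1 = 5 (decimal)
assert int("101", 2) == 5
assert bin(5) == "0b101"
```

Each added bit doubles the number of values a group of bits can represent, which is why three bits suffice for counting 0 through 7.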
Bit storage: history
Binary representation
- Binary representation as “lowest layer” of abstraction
- “All information in a system — including disk files, programs stored in memory, user data stored in memory, and data transferred across a network — is represented as a bunch of bits” (Bryant and O’Hallaron)
- Bits are grouped into data objects, providing context for their interpretation (Bryant and O’Hallaron)
- Bit and byte
- A binary number has a bit length
- In computing, a standard byte is a bit string of eight bits
Memory addressing
- Virtual memory
- An abstraction of physical memory capacity in RAM, disk storage, other hardware, and OS operations
- Represented by a page table
- Byte addressing and word size
- The “smallest addressable unit of memory” is not the bit, but the byte (Bryant and O’Hallaron)
- In the same context, the word is the machine’s natural unit of data; the word size fixes the maximum size of the address space (Bryant and O’Hallaron)
- First of the transpositions of linguistic concepts that we will encounter in the history of computing
- 32-bit computer systems have a word size of 4 bytes
- 64-bit computer systems have a word size of 8 bytes
- Different data types are allocated different quantities of memory
- An ASCII alphanumeric character is allocated one byte
- Integers and floating point numbers might be allocated two or four bytes
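These allocation sizes can be inspected directly. A sketch using Python’s standard struct module (an assumption for illustration; the `=` prefix requests the standard, platform-independent sizes, since native sizes can vary by machine):

```python
import struct

# struct format codes correspond to C data types;
# calcsize reports how many bytes each is allocated.
print(struct.calcsize("=c"))  # one character: 1 byte
print(struct.calcsize("=h"))  # short integer: 2 bytes
print(struct.calcsize("=i"))  # integer: 4 bytes
print(struct.calcsize("=d"))  # double-precision float: 8 bytes
```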
Character display
- Binary representation of human writing systems
- “What makes personal computers useful to the majority of people is not that they can process numerical data […] but that they can process textual data” (Searle)
- “[T]here are many people who are unaware of the fact that to a computer textual data are also numerical data. In modern computer systems, the individual characters of the scripts that humans use to record and transmit their languages are encoded in the form of binary numerical codes, just as are the Arabic numerals used in calculation programs” (Searle)
- “This is because the circuitry of the microprocessor that lies at the heart of a modern computer system can only do two things — calculate binary arithmetic operations and perform Boolean (i.e., true or false) logic operations.” (Searle)
- Character display is not inscription
- The letter “A” is not impressed onto your screen
- Nor is it impressed into a storage medium
- “…when a personal computer records the letter ‘A’ onto a floppy disk, for instance, it does not create an image of the letter ‘A’ with tiny magnetic dots, rather it records a binary number (made up of zeroes and ones) that represents the letter ‘A’ in a character code table” (Searle)
- Character display process
- Hardware keyboard provides options for input
- Each hardware key is assigned an internal numeric key code
- That key code is assigned to a specific character in a specific character set and its international code page
- That character is displayed on the screen in a particular display font
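The character-to-code mapping in the steps above can be inspected directly. A minimal Python sketch (`ord` and `chr` expose the numeric code a character set assigns to a character):

```python
# Every character is stored as a number; "A" is code 65 in ASCII.
code = ord("A")
print(code)                  # 65
print(format(code, "08b"))   # 01000001: the bits actually stored
print(chr(code))             # and back from code to character
```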
Character encoding: Babel
As long as the processing of information from end to end occurs only in a single machine, there is no need for a standardized character encoding. Early computers up to the beginning of the 1960s thus simply used whatever ad-hoc convention to represent characters internally seemed appropriate; some distinguished upper- and lowercase letters, most did not. (Wittern)
Character encoding: ASCII
- Illustration: USASCII code chart
- A standard binary encoding of 95 written symbols in U.S. English
- Numerals 0–9
- Letters a–z and A–Z
- Basic punctuation symbols
- Plus 33 non-printing control characters
- ASCII assigns each character a seven-bit binary code, conventionally stored in one eight-bit byte
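What Searle describes above, recording a binary number rather than a letterform, can be observed with Python’s bytes type (a sketch for illustration; it shows the numeric codes held in memory, not any particular storage medium):

```python
text = "ASCII"
data = text.encode("ascii")  # the numeric codes actually stored
print(list(data))            # [65, 83, 67, 73, 73]
# Each code fits in seven bits, per the ASCII standard:
print([format(b, "07b") for b in data])
```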
Binary and hexadecimal notation
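Hexadecimal (base 16) serves as a compact shorthand for binary: each hex digit stands for exactly four bits, so one byte is written as two hex digits. A Python sketch:

```python
# One byte = 8 bits = 2 hexadecimal digits.
value = 0b01000001           # the ASCII code for "A"
print(format(value, "08b"))  # 01000001 (binary)
print(format(value, "02x"))  # 41 (hexadecimal): same value, two digits
assert int("41", 16) == int("01000001", 2) == 65
```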
Symbolic assembly language
- Another situation in which the “verbosity” of binary notation was a problem: early computer programming
- A microprocessor (hardware) instruction set is represented in a “machine language” of (here, 6-bit) binary notation:

      011011 000000 000000 000000 000001 000000
      +----+ +--------------------------------+
      instruction        data

  (Sammet 2)
- The same expression, using an alphanumeric mnemonic code for the instruction (example from Sammet, “Symbolic Assembly Language Programming”):

      CLA 000000 000000 000000 000001 000000
      +-+ +--------------------------------+
      instruction     data

  (Sammet 2)
- Systems of such mnemonic codes came to be called symbolic assembly language, the earliest forms of what we now call “programming languages”
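Sammet’s point, that mnemonic codes are mechanically translated into binary machine words, can be sketched as a toy assembler. Only the CLA opcode (011011) comes from the source; the opcode table and the 6-bit-opcode, 30-bit-address word layout are illustrative assumptions:

```python
# Hypothetical opcode table; only CLA -> 011011 is from Sammet's example.
OPCODES = {"CLA": "011011"}

def assemble(line):
    """Translate 'MNEMONIC address' into a 36-bit binary machine word."""
    mnemonic, address = line.split()
    # 6-bit opcode followed by a 30-bit binary address field.
    return OPCODES[mnemonic] + format(int(address), "030b")

word = assemble("CLA 64")
print(word)  # 011011 followed by 64 in binary, zero-padded to 30 bits
```

The programmer writes the memorable mnemonic; the assembler, not the human, produces the verbose binary form.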