What is Code?
In the broadest definition, code is any element, symbol, signal, character, or sequence of these that can be transformed, encoded, or decoded into another form. The text you are reading is an example of code in this broad sense: it has thoughts and ideas encoded in it, and when you read it, you decode those thoughts and ideas into your mind. Written language is likely the earliest code created by humans, but code existed long before humans: DNA. DNA is a molecule that contains the instructions an organism needs to develop, live, and reproduce. The information encoded within it is transcribed and translated into proteins, which carry out various functions in the body. This fundamental process happens in every cell and is essential for all life. Code dates back to at least the origin of life.
Today, in the digital age, most people likely think of programming languages when they think of code. That's fitting: each programming language is an encoding of a lower-level language, all the way down to machine code (binary ones and zeros). Programming languages also provide tools for users to encode data into various types and structures. Serialization and deserialization (converting complex data structures to and from a linear form that can be transmitted over a network or stored on disk) is another form of encoding and decoding.
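As a small illustration of serialization and deserialization (a sketch in Python, using the standard json module; the record itself is made up for the example):

```python
import json

# An in-memory data structure (a nested dict) -- the example data is made up.
record = {"name": "Ada", "languages": ["Python", "C"]}

# Serialization: encode the structure as a linear JSON string,
# suitable for sending over a network or writing to disk.
encoded = json.dumps(record)

# Deserialization: decode the string back into an equivalent structure.
decoded = json.loads(encoded)
assert decoded == record
```

The round trip recovers an equivalent structure, which is exactly the encode/decode symmetry described above.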
When a program is compiled or interpreted, it is transformed into machine code — a language that can be understood and executed by your computer. Transforming data types, structures, and schemas is one of the most common tasks for developers. When it amounts to simply moving data from one schema to another, this work is often called plumbing in the software industry, and it's an excellent starting point for new developers and new hires. Some software companies have even automated much of this work with template-based code generation. But transformations aren't limited to basic Schema A -> Schema B use cases; they occur in almost every discipline of software development and at every level of complexity.
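A minimal sketch of the Schema A -> Schema B plumbing described above (the schemas and field names here are hypothetical, invented for illustration):

```python
def to_schema_b(record_a: dict) -> dict:
    """Transform a record from a hypothetical Schema A into Schema B.

    Schema A: {"first": ..., "last": ..., "yob": ...}
    Schema B: {"full_name": ..., "birth_year": ...}
    """
    return {
        "full_name": f"{record_a['first']} {record_a['last']}",
        "birth_year": record_a["yob"],
    }

# The mapping is mechanical -- exactly the kind of code that
# template-based generators can produce automatically.
print(to_schema_b({"first": "Grace", "last": "Hopper", "yob": 1906}))
```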
Coding Theory

Coding Theory is a branch of mathematics and computer science that focuses on the reliable encoding, transmission, and storage of information expressed as code. One of its core principles is error detection and correction — mechanisms that automatically detect (and sometimes repair) errors introduced during transmission or storage. A common example of code with built-in error detection is the ISBN (International Standard Book Number). These principles are applied in many places where code exists, including the lowest levels of computers and networks, to ensure reliable and efficient transmission of data.
- These error-detecting and error-correcting principles have also been observed in the human brain.
- DNA sequencing uses error detection and correction at the many points in the process where errors can occur, for example:
  - Human errors in preparation and handling.
  - Naturally occurring biological errors.
  - Errors in interfacing biological samples with machines.
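The ISBN's built-in error detection can be sketched in a few lines (ISBN-10 shown; a Python sketch, with hyphens stripped from the input):

```python
def isbn10_is_valid(isbn: str) -> bool:
    """Validate an ISBN-10 using its built-in check digit.

    The 10 digits are weighted 10, 9, ..., 1; the ISBN is valid when
    the weighted sum is divisible by 11. The final character may be
    'X', which stands for the value 10.
    """
    digits = isbn.replace("-", "").upper()
    if len(digits) != 10:
        return False
    total = 0
    for i, ch in enumerate(digits):
        if ch == "X" and i == 9:
            value = 10
        elif ch.isdigit():
            value = int(ch)
        else:
            return False
        total += (10 - i) * value
    return total % 11 == 0
```

A single mistyped digit (or two transposed digits) changes the weighted sum, so the check fails — that is error detection at work.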
Anyone familiar with the phrase Cracking the Code will also be aware that information can be hidden by encryption. Encrypted information is referred to as code or, more specifically, ciphertext. Plaintext is encrypted into ciphertext using a cipher (or a set of ciphers combined into an algorithm); a cipher is simply a method for performing encryption or decryption. One of the simplest and most widely known encryption methods is the Caesar Cipher, which offsets each letter in a text by a fixed number of positions in the alphabet.
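The Caesar Cipher fits in a few lines — a sketch in Python, shifting only the 26 ASCII letters and leaving everything else unchanged:

```python
def caesar(text: str, shift: int) -> str:
    """Apply a Caesar cipher: encrypt with a positive shift,
    decrypt with the matching negative shift."""
    result = []
    for ch in text:
        if ch.isalpha():
            base = ord("A") if ch.isupper() else ord("a")
            # Rotate within the alphabet, wrapping around with mod 26.
            result.append(chr((ord(ch) - base + shift) % 26 + base))
        else:
            result.append(ch)  # spaces and punctuation pass through
    return "".join(result)

print(caesar("Hello", 3))   # -> "Khoor"
print(caesar("Khoor", -3))  # -> "Hello"
```

Decryption is just encryption with the opposite shift, which is also why the cipher is so easy to crack: there are only 25 shifts to try.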
Code is everywhere. You’re made of it. It exists in biology, literature, and certainly computer science.
Sam Malayek works in Vancouver for Amazon Web Services, and uses this space to fill in a few gaps. Opinions are his own.