When we talk about digitization, what we’re really talking about is storing information on computers. Digitization, at least as far as Kodak is concerned, is taking old-fashioned, analog media--home movies, pictures, and slides--and putting them into a digital format.
Digitization might seem a bit complicated, and that’s because it is.
In truth, humans have been changing the way we store information since the beginning of time. Cave people painted cave walls to depict hunts, which was basically a very primitive form of data storage. Then we invented paper, pencils, and ballpoint pens. Nowadays, most new information is created on a computer and stored on a hard drive somewhere. That’s really what digitization is all about. Because digitization and computers go hand-in-hand, the evolution of one basically mirrors the other.
The Timeline
1890 - A dude named Herman Hollerith, founder of a company that would eventually become IBM, invents a punch card system to tabulate the US Census. Converting real-life information into a form a machine can read is about the simplest idea of a computer there is.
1936 - Alan Turing describes the Turing machine, a theoretical device that can carry out any calculation you could write down as a step-by-step procedure. The idea becomes the blueprint for modern computers, and during World War II Turing helps build real machines that decipher Nazi communications.
1937 - Claude Shannon writes a thesis showing how circuits can work with binary code, a way of representing words and numbers as 1s and 0s (there’s a small sketch of the idea right after this timeline). That insight opens up the possibility of digital circuits, storage, and processors, all of which lay the groundwork for what we call computer programs today.
1943-1944 - Two University of Pennsylvania professors, John Mauchly and J. Presper Eckert, build the ENIAC, a giant machine that can calculate equations through programming. It is the first machine that lets people put data in and get new data out electronically, without gears and belts doing the work.
1953 - A lady named Grace Hopper develops one of the first programming languages (her work eventually leads to COBOL), changing the way that data is translated and stored forever.
1957 - Russell Kirsch uses the SEAC (Standards Eastern Automatic Computer) to scan the first digital image, breaking a photograph into pixels that a computer can store and redraw.
1971 - The first charge-coupled devices (CCDs) appear. These are some of the first practical instruments that can take analog data (e.g. the light that makes up a picture) and convert it into a digital format.
1974-1977 - The first home computers start hitting the market.
1975 - The first digital camera is invented by Kodak engineer Steven Sasson. This is important because it’s the first time that photos can be captured and stored in computer memory without having to go through film first.
1986 - A group of experts starts working on the JPEG format, a standard for compressing images so they take up far less data. Compression makes image files much smaller and, by extension, much cheaper to store and share.
2018 - Kodak offers a service that lets people send in old video tapes, photos, and slides and get them back converted into a digital format.
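To make Shannon’s 1s-and-0s idea a little more concrete, here’s a minimal sketch in Python. The specific choices below (ASCII codes for letters, 8-bit numbers, and the little to_bits helper) are modern conveniences picked purely for illustration, not anything 1930s hardware actually used.

```python
def to_bits(value, width=8):
    """Return the binary string for an integer, padded to a fixed width."""
    return format(value, "0{}b".format(width))

# A number is just its base-2 representation.
print(to_bits(75))            # 75  -> 01001011

# A letter is first mapped to a number (its ASCII code), then to bits.
print(to_bits(ord("K")))      # 'K' -> 75 -> 01001011

# A whole word is just those codes strung together.
word = "Kodak"
print(" ".join(to_bits(ord(ch)) for ch in word))
```

Once words and numbers can be written this way, anything that can be described with words and numbers--including a photograph--can be stored the same way.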
OK, But How Does That Fit Together?
The history of digitization is complicated. The thing to remember is that stuff in the real world has to be translated into a different form, then it has to be stored somehow until it can be translated back into a representation of the original. I know that doesn’t seem helpful, but bear with me.
When you’re looking at a picture in real life, you can see and hold it. Your eyes take in the colors and shading, and your memory can recall the scene more or less as you saw it (although we’re still not entirely sure how). Computers are different. They store and read everything as code, so when you digitize something it has to be scanned, translated into the computer’s language, and stored.

The first computers used binary code--just 1s and 0s--to keep that translation as simple as possible. For our example, a picture goes in, and every color and shade is replaced with a pattern of 1s and 0s. Over time, entire programming languages and file formats were built on top of that binary, and now computers read millions of 1s and 0s and turn them back into the picture you see on the screen.
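Here’s a minimal sketch of that picture-to-binary translation, assuming a made-up 3x3 grayscale image where each pixel is a brightness value from 0 (black) to 255 (white). A real scanner or camera does the same thing, just with millions of pixels and three color channels.

```python
# A tiny, hypothetical "photo": 3x3 pixels, each a brightness from 0 to 255.
picture = [
    [  0, 128, 255],
    [ 64, 200,  32],
    [255,   0, 100],
]

# "Digitizing": turn every brightness value into 8 bits of binary.
digitized = [
    [format(pixel, "08b") for pixel in row]
    for row in picture
]

for row in digitized:
    print(" ".join(row))
# 00000000 10000000 11111111
# 01000000 11001000 00100000
# 11111111 00000000 01100100

# "Displaying": translate the bits back into numbers a screen can show.
restored = [[int(bits, 2) for bits in row] for row in digitized]
assert restored == picture  # nothing was lost in translation
```

That round trip--real-world thing in, 1s and 0s stored, real-looking thing back out--is the whole story of digitization in miniature.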