Photography can be traced back to around 965 AD, the birth year of Ibn al-Haytham, the “father of modern optics,” who described the camera obscura. Although nitrate-based chemical photography was well entrenched throughout the 19th and early 20th centuries, a new revolution in image capture did not begin until digital imaging technology, advanced in large part by NASA’s space program, emerged. Drawing on imaging techniques developed for space probes and telescopes, and shrunk to the scale of microprocessors, the digital camera is now firmly entrenched in modern photography and will shape its future.
Digital imaging has long been used by a number of departments of the United States government, especially those concerned with intelligence gathering and espionage. The technology took a step toward wider use in 1972, when Texas Instruments patented a filmless electronic camera. Kodak subsequently developed a series of solid-state image sensors that made capturing digital pictures feasible for both home and professional use. Apple’s release of the QuickTake 100 in 1994 and Kodak’s DC40 in 1995 cemented the digital camera as a cultural mainstay.
Today, digital cameras are as ubiquitous as cellphones, and they have become the de facto tools for many aspiring photographers, visual artists, and even filmmakers. The cameras are extremely portable and easy to use, and their images can be downloaded to a computer and modified there. How far those modifications can go depends on the editing software a photographer can afford: programs like Adobe Photoshop make simple edits such as picture cropping and red-eye removal easy, but they can be very cost prohibitive.
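Cropping, the simplest of those edits, is conceptually just a rectangular slice of the image’s pixel grid. The sketch below, in plain Python with a nested list standing in for real pixel data (the `crop` function and its parameter names are illustrative, not any particular editor’s API), shows the idea:

```python
def crop(image, top, left, height, width):
    """Return the height-by-width rectangle of a row-major pixel grid
    whose upper-left corner sits at (top, left)."""
    return [row[left:left + width] for row in image[top:top + height]]

# A tiny 4x4 "image" of grayscale values, for illustration only.
image = [
    [10, 11, 12, 13],
    [20, 21, 22, 23],
    [30, 31, 32, 33],
    [40, 41, 42, 43],
]

# Keep the central 2x2 region.
print(crop(image, 1, 1, 2, 2))  # [[21, 22], [31, 32]]
```

Real editors work the same way on much larger grids of color pixels; the expensive features lie in what happens beyond such basic geometry.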
A working knowledge of photographic theory and technique is still necessary to take powerful pictures: the composition, lens aperture, and lighting of any potential shot must all be considered. While digital photography minimizes the cost of taking photographs, taking good ones still means honing the artistic eye.
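The interplay of aperture and lighting can even be quantified. In standard photographic terms, the exposure value at ISO 100 is EV = log2(N²/t), where N is the f-number of the aperture and t the shutter time in seconds. A minimal sketch of that relationship (the function name is my own, not from any camera API):

```python
import math

def exposure_value(f_number, shutter_seconds):
    """Exposure value (EV) at ISO 100: EV = log2(N^2 / t),
    where N is the f-number and t the shutter time in seconds."""
    return math.log2(f_number ** 2 / shutter_seconds)

# f/16 at 1/125 s, the classic "sunny 16" daylight setting,
# works out to roughly EV 15.
print(round(exposure_value(16.0, 1 / 125)))  # 15
```

Settings with the same EV admit the same total light, which is why a photographer can trade a wider aperture for a faster shutter without changing the exposure.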