Why didn't digital cameras show up until they did?
A different style of writing than most previous posts: I try to decipher why digital cameras didn't show up before they did, what forces made them appear, and, just for fun, take a look at what electronic cameras could have looked like in 1980.
This article is essentially a "well actually" to the completely unfounded idea that "Kodak had a digital camera in 1975 and chose not to make it because they wanted to sell film". This is of course untrue, like most other anecdotes are when you look into them. Ruin the joke with this one weird trick (knowledge).
Do note that I consider electronic cameras to be a better definition for this purpose; specifically, we know that digital stills cameras didn't show up until the early 90's, but electronic cameras were around in the 1930's as analog video cameras.
Executive executive summary
Internet. Computer.
Executive summary
Digital cameras existed earlier, but they really started becoming popular in 1995-2000. This happened because the Internet suddenly created a consumer use case for poor-quality, low-resolution images that didn't previously exist. Further, consumer grade colour printers started to show up, making home printing of images onto copy paper a thing.
Over the next decade or so the same technological advancements led to rapid improvements in image sensors, and by 2002 or so they were good enough for most purposes, assuming you had $10,000 to drop on a camera.
Background
So film cameras have been a thing since photography existed, assuming you count glass plates as film. By 1900 or so film in a recognisable form was moderately well developed and starting to become a consumer item.
Performance was limited compared to modern film, but just making the negative bigger made up for this for professional use where cost was less important. We saw a general trend in the 60's from medium format roll-film to 35 mm cassette format film for most consumers as film quality improved, lowering cost and expanding the market.
During the 1980's autofocus point & shoots were made practical by miniature xenon flash tubes and newly invented T-grain based high speed colour films.
Disposable cameras started showing up around that time as well, and were still relevant until the early 2000's.
APS showed up in 1996 and was fairly popular among consumers around 2000-2002, but was very much a product that showed up 20 years later than it should have.
Electronic imaging & Vidicons
Electronic cameras didn't really become a thing until the 1950's or 1960's when television rose to prominence. These cameras initially used a variety of more or less terrible camera tubes, but the industry ended up settling on the Vidicon and its various derivatives by the late 60's.
These Vidicons were in theory capable of decent resolution, but were in part hampered by the ever-present decision, made in the 1930's and reaffirmed in the 50's, that "480 lines ought to be good enough for everyone". I doubt the creators of television expected their "Standard Definition" format to last well into the 2010's, or they might have allocated a bit more bandwidth. Since the only real use case for imaging tubes was TV, there wasn't much need for higher resolution models.
For a long time zoom lenses were only available for TV cameras, not for cost reasons but simply because TV was the only market with standards low enough to put up with the abysmal quality of pre-computer-optimised zoom lenses. For photographers, zoom lenses didn't reach acceptable quality until the late 1980's.
Vidicons seem to have been somewhat susceptible to microphony, where horizontal bands appear in the image when exposed to high vibration or sound pressures. This is quite noticeable in the UK portion of the Live Aid recordings, for example.
CCD's
During the 1980's Vidicons were gradually replaced with CCD sensors, a now classic technology that didn't really die until the late 2000's. Typically a 3CCD arrangement was used, where a beam splitter divided the image by colour onto red-, green-, and blue-sensitive CCD's. This let broadcast grade cameras output up to 4:4:4 video in principle, which is useful for chroma keying. The CCD was a staple of prosumer "portable" video cameras by the late 1980's, sometimes 3CCD and sometimes with RGB colour dye filters.
Bayer patterns were also used in the 80's (invented 1976); I expect that performing de-bayering in mostly analog circuitry presented a bit of a challenge, as it usually requires some kind of memory to perform the 2-D interpolation over neighbouring pixels. This was likely done using clever tricks in the CCD readout circuitry.
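To make that 2-D neighbourhood requirement concrete, here's a minimal bilinear de-bayer sketch in the digital domain. This is purely my own illustration; it says nothing about how any period hardware actually did it, only why even the simplest scheme needs pixels from neighbouring lines.

```python
import numpy as np
from scipy.ndimage import convolve

def demosaic_bilinear(raw):
    """Bilinear de-bayer of an RGGB mosaic (raw: 2-D float array).

    Every interpolated sample needs pixels from the rows above and below,
    which is why even this simplest scheme needs a couple of line memories.
    """
    h, w = raw.shape
    r_mask = np.zeros((h, w)); r_mask[0::2, 0::2] = 1.0
    b_mask = np.zeros((h, w)); b_mask[1::2, 1::2] = 1.0
    g_mask = 1.0 - r_mask - b_mask

    # Kernels that average whichever neighbours of a missing sample exist.
    k_rb = np.array([[1, 2, 1], [2, 4, 2], [1, 2, 1]]) / 4.0
    k_g  = np.array([[0, 1, 0], [1, 4, 1], [0, 1, 0]]) / 4.0

    r = convolve(raw * r_mask, k_rb)
    g = convolve(raw * g_mask, k_g)
    b = convolve(raw * b_mask, k_rb)
    return np.dstack([r, g, b])
```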

Fun fact: CCD's are a sort of hybrid of analog and digital, with discrete pixels but a distinctly analog signal chain; a CCD can be configured to output an analog video signal more or less directly for this reason. You can also store an image in there for some (short) time since it works like an analog shift register, and CCD's naturally lend themselves to electronic global shutters.
Why not an electronic camera?
The idea of replacing photographic film with a still image from a video camera is basically canonical (meaning: it's an obvious thing to do). This seems like a great idea until you start looking into the performance figures for even low quality film vs. what a broadcast grade video camera could do. I'll refer to pixels here even though pixels don't technically exist in analog TV; lines are a thing though, so the vertical resolution of TV is knowable, unless you include the temporal aspect of moving objects with natively interlaced sensors.
Resolution
A bad quality 35 mm film frame from a modern disposable camera has a usable resolution somewhere around 3-10 megapixels, and a good film and camera combo can probably be expected to give you 20 megapixels (I usually scan colour film at ~10-15 mp/3200 dpi).
Your broadcast TV camera generates something like 768⨉494/752⨉582 pixels, or somewhere around 0.3 megapixels. The CCD's technically had more pixels (e.g. 811⨉508 & 795⨉596), but the extra ones were masked from external light and mostly acted as black-level reference pixels.

At full print quality (300 dpi) this corresponds to something like 5x4 cm, making enlargements bigger than this would yield a noticeably soft image.
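For the curious, the arithmetic behind that print size (my own numbers, taking the pixel counts above at face value):

```python
# 768x494 "pixels" printed at 300 dpi (my arithmetic, face-value pixel count).
w_px, h_px, dpi = 768, 494, 300
print(f"{w_px / dpi * 2.54:.1f} x {h_px / dpi * 2.54:.1f} cm")  # ~6.5 x 4.2 cm
# The analog bandwidth limit described next cuts the *effective* horizontal
# resolution well below 768 samples, which is how you end up closer to 5 x 4 cm.
```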
Do also note a peculiarity of video vs. a still image camera: digitally sampling a TV CCD to make a still image would result in a somewhat sharper image than an equivalent video-processed signal from the same sensor. This happens because, while the CCD can output discrete pixels in time with a pixel clock, analog video is a continuous-time process where the signal has to be low-pass filtered to around 4-5 MHz to fit into a broadcast TV channel's bandwidth. This has the effect of smoothing out the otherwise sharp pixels from a CCD, blurring the image horizontally. Turning it into composite colour (the "compatible colour" system) often leads to further losses of sharpness unless fairly sophisticated comb filters are used to separate the luma and chroma components. Video signals are usually sharpened to improve subjective sharpness (a basic "treble" filter), leading to bright/dark edge effects clearly visible in the gullcam around the coax lines.
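As a toy illustration of that horizontal blurring (made-up numbers on my part, not a model of any particular camera), here's a sharp row of pixels run through a crude low-pass filter standing in for the ~4-5 MHz channel limit:

```python
import numpy as np

# One scan line with a sharp bright detail a few "CCD pixels" wide.
line = np.zeros(64)
line[30:34] = 1.0

# A crude 5-tap moving average stands in for the broadcast channel's low-pass;
# the real filter is analog, but the smearing effect is the same in spirit.
kernel = np.ones(5) / 5.0
blurred = np.convolve(line, kernel, mode="same")

print(np.round(line[27:37], 2))     # hard 0 -> 1 -> 0 transitions
print(np.round(blurred[27:37], 2))  # edges smeared over several pixels
```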
Vidicon type imaging tubes can be made in several different ways, and sharpness depends on the design of the "target" (the front face). RCA made Vidicons with the imaging area made from discrete silicon photo-diodes acting more or less as pixels (used by the Lunar Rover's ground-controlled TV camera), with resolution figures matching later broadcast grade standard Vidicons.
This is a funny sort of hybrid approach, since the fundamental operating principle is, at a surface level, even more similar to much later CCD or CMOS sensors than to a normal Vidicon: an array of silicon photodiodes converts photons to electrons and stores the charge in a "bucket" capacitor; periodically the bucket is drained (by the electron beam, in the Vidicon case) and the amount of charge corresponds to the image. Of course the normal lead-oxide Vidicon (Plumbicon) also does more or less the same, storing charge on the surface of the lead oxide target.
For an example of this see RCA AN-4623. Readout of discrete sharp pixels would require very precise control of the electron beam however.
A much newer Hamamatsu Vidicon catalog lists silicon target tubes separately and the specifications for resolution are around half that of the unspecified type used otherwise—presumably lead-oxide—though they do have reduced ghosting and higher sensitivity.
I suspect silicon target Vidicons were mostly used for scientific and industrial applications. One of their key benefits was good near-IR sensitivity out to beyond 1000 nm; this would for example be useful for many military applications such as imaging Nd:YAG laser beams. Alternate photocathode materials were used to support X-ray, UV, and SWIR wavelengths: old Vidicons remain the only somewhat economical way of viewing 1-3 µm wavelengths today.
Smaller Vidicons have spatial resolution figures fairly similar to modern film stocks, but the imaging area is significantly smaller in standard models. This shouldn't come as a big surprise since the pixel pitch of a 1⁄3" TV CCD is only slightly larger than a modern CMOS sensor pixel.
Incidentally, the size of sub-APS-C sensors is often listed in fractional inches, and this is a carryover from the Vidicon days: the listed size is the outer diameter of the tube assembly, with the active area being a fair bit smaller.
Dynamic range
Another issue: the dynamic range or "latitude" of a film negative is quite high, often exceeding 10-12 stops (an exposure ratio of 1000-4000⨉), and good slide films sit happily around 8 stops. A stop is a doubling of light, and is basically analogous to bits in a linear light system (no gamma).

A Hamamatsu 8134 is a relatively late period Vidicon; operated in high dynamic range mode it has a dark current of around 5 nA and a peak listed signal current of 350 nA, indicating a dynamic range of around 70⨉ or 36 dB. I'm not sure if gamma correction should be applied to this number (if so it would be higher), but 30 dB SNR was as far as I know considered good enough for broadcast television over the air. RCA catalogs for the previously mentioned silicon target Vidicon indicate a 100:1 dynamic range is possible, but they also say the noise is so nice to look at that it really deserves even higher numbers.
The XC-75CE is a late 80's 1⁄2", 9 µm pixel pitch monochrome technical camera that listed a maximum SNR of around 56 dB, but what we actually have to look at is the ratio of maximum optical signal to minimum optical signal. Taking the specifications at their word (with gamma on), the minimum intensity is 3 lx for an IR-cut monochrome image and the peak is 400 lx, giving a ratio of 133⨉ or around 42 dB.
Kodak 5279 "VISION" film in 1996 was listed with a guaranteed dynamic range of roughly 10 stops of exposure which equates to a ratio of around 1000⨉ (60 dB), though the "usable" range of the VISION film will be a fair bit higher since a CCD sensor will clip highlights (sometimes leading to charge leaking outside of the storage capacitor, resulting in white vertical stripes in the image) whereas a negative film will progressively compress the highlights (soft clipping). Further, the D-max of the film isn't really specified so the true maximum is not directly readable from the datasheet.
Modern day Tri-X black and white film seems to have a D-max of around 3.0, and extrapolating a low-contrast development to D-max of 2.0 suggests the dynamic range could be as high as 90 dB (15 stops) though the images would probably be hard to work with.
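Since stops, linear ratios, and dB keep coming up, here's a quick conversion sketch checking the figures above (my arithmetic only):

```python
import math

def stops_to_ratio(stops):
    return 2 ** stops               # a stop is a doubling of light

def ratio_to_db(ratio):
    return 20 * math.log10(ratio)   # the convention used for video signal levels

print(stops_to_ratio(10), stops_to_ratio(12))   # ~1000x and ~4000x (negative film)
print(round(ratio_to_db(350 / 5)))              # Vidicon, 350 nA / 5 nA: ~36-37 dB
print(round(ratio_to_db(400 / 3)))              # XC-75CE, 400 lx / 3 lx: ~42 dB
print(round(ratio_to_db(stops_to_ratio(10))))   # 10 stops ~ 60 dB
print(round(ratio_to_db(stops_to_ratio(15))))   # 15 stops ~ 90 dB
```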
It's not quite fair to directly compare the sensitivity of a video camera to a stills camera, but suffice it to say that sensitivity⨉resolution was typically far better for black and white film than for any practical electronic camera made before the 2000's.
We've covered some major downsides to old electronic video cameras compared to film; it's also fair to mention that the cost of these sophisticated broadcast grade cameras likely exceeded the yearly income of most middle class families throughout most of the 70's. Meanwhile film had cheap cameras available for a long time (Brownies and Instamatics barely had anything inside them) and benefited from massive economies of scale, putting it well within reach of most middle class families from the 1930's onward. A given camera could also, within reason, be upgraded by buying new better film and in some cases better lenses.
What do?
So for the sensor you really have two choices. It's the 60's or 70's: you will use a Vidicon tube, maybe an RCA silicon intensified target tube. You'd need to make specialty tubes with ~4-8⨉ the resolution of standard video tubes, and you'd need three of them to get colour. It seems likely that existing tubes could be used in a slow-scan mode to give finer resolution.
Dynamic range would be limited compared to film, but as we saw with e.g. the Voyager probes this would certainly be possible to do. You could use a photo-multiplier tube and spot-scanner instead, which would be even more complex but could achieve great sensitivity.
It's the 80's: if you can see the future you'd develop a giant CCD. It could probably outperform the Vidicon for sharpness; you'd probably want three of them with a beam splitter for colour. These would also need to be huge compared to TV types to pack enough pixels, so yield would be low, cost would be significant, and power consumption would be fairly high during operation.
According to Wikipedia's sources the Japanese 1035i MUSE HDTV system used high resolution Vidicon-type cameras up to at least 1993 due to their still higher performance. Existing and well-understood tube technology could probably be more easily coaxed into outputting a reasonable HD signal than the comparably immature CCD's, where the maximum resolution is fixed by the pixel count. The famous 1993 New York HDTV demo footage that reappeared a few years ago is believed to have been shot on a Sony HDC-100 or later HDC-300, both 3-Saticon tube (SeAsTe, hence "S-A-T") cameras first introduced in the mid 1980's. These were likely replaced by the Sony HDC-500 shortly thereafter; this is apparently the first CCD HDTV camera, though details seem scarce. While significantly better than SDTV, the 1993 footage is considerably softer than 2000's HDTV systems, and the limited dynamic range is very visible here as well.
In 1995 the Minolta RD-175 used what I think is a fairly unique solution to the resolution problem. Presumably lacking access to their own CCD fabrication line, they used a complex optical system to split the image onto 3 video-grade CCD's as usual. However, they used one of the CCD's to record red/blue at low resolution, and then two green sensors diagonally offset to give a double resolution green image. Extensive and, at the time, time-consuming off-board digital processing was then required to interpolate up to a 1.8 mp output image (they made the image from 3⨉0.3 mp, so some "cheating" was involved as usual).
Using a video CCD with an alternate colour filter arrangement was likely a significant cost saving vs. commissioning a fully custom CCD layout. I wonder if they might not have had an easier go at it if they had split the image into 3-4 equal images and put in some standard Bayer pattern CCD's instead.
Meanwhile, in 1995 Kodak had released their first 6 mp sensor in a camera costing only $35,600. At a 9 µm pixel pitch this seemed to be a bit of a resolution wall for Kodak, one they didn't really exceed until 2009's Leica M9 at 6.8 µm pitch—6.5-6.8 µm pitch is about the size used for 1⁄3" standard definition CCD's since the 80's, by the way. I expect that massive improvements were made to cost, dynamic range, and sensitivity in this time period however.
The size of pixels isn't so much a sign of quality as it is a fact of physics: if you want a pixel to sharply resolve something then it has to be at least a couple of wavelengths across, which seems to put a practical limit at around 4 µm in most cases. Smaller pixels than this presumably require very complex designs, and probably post processing to compensate for the diffraction-limited resolution they must have, and I suspect this is part of why pixels didn't start shrinking below this size until the 2010's.
The storage problem
So let's for argument's sake say that we bump up the resolution of these electronic cameras enough to make a decent sized print. This is after all merely an engineering problem if you ignore yield: just make the sensor bigger, focus the electron beam finer, and add more silicon target diodes in the case of a Vidicon.
Where to store it? Well, tape obviously. Tape showed up in the 60's for video, and by the mid 70's there was a boom of professional and home video tape recorders; clearly there was huge demand. It's fairly obvious that you could use a tape that can store hours of 50-60 fps interlaced video to store a couple of higher resolution still images instead.
Keep in mind a general rule I've observed: storing or transmitting any kind of uncompressed digital signal onto/via a fundamentally analog medium usually requires a massive increase in bandwidth. See for example analog composite video vs. the equivalent SDI bitstream: a full TV signal with stereo sound requires around 6-8 MHz of RF bandwidth, while an SDI bitstream runs at 270 Mbit/s (for the common standard definition form) to do fundamentally the same thing. Analog voice needs 3 kHz of bandwidth for phone quality audio; digitally it needs a minimum of 64 kbit/s uncompressed.
Digital is harder to store for this reason: storing everything as ones and zeroes with sharp transitions turns out to be a very inefficient use of bandwidth in an analog medium. Packing digital data tighter requires very powerful signal processing (see e.g. OFDM/QAM) and/or computationally expensive lossy compression schemes in the case of audio and video.
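For a rough idea of where the SDI figure above comes from (my arithmetic, assuming the usual ITU-R BT.601-style sampling behind standard definition SDI):

```python
# Uncompressed SD digital video, back of the envelope (assumed figures).
luma_rate   = 13.5e6        # luma samples per second
chroma_rate = 2 * 6.75e6    # two colour-difference channels at half rate (4:2:2)
bits        = 10            # bits per sample on the SDI link

bitrate = (luma_rate + chroma_rate) * bits
print(f"{bitrate / 1e6:.0f} Mbit/s")   # 270 Mbit/s, vs ~6-8 MHz of RF for analog TV
```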
For most of this time period it would make most sense to store an electronic image as an analog video signal; this requires far fewer transistors to implement than anything digital. Kodak's famous 1975 digital camera stored its 100x100 pixel image digitally at a measly 6 bits per pixel, and it took 23 seconds to write it out to tape, during which time the image had to live in hideously expensive RAM chips. It is said this project was so secret only a handful of people knew about it; I'd say it's equally likely the invention had so little utility at the time that only a handful saw the potential.
If this Kodak digital camera were storing the image in analog form it would have taken less than 1⁄60th of a second to store it with resolution to spare on any readily available video tape recorder, but this wouldn't make for a good story since it would just be a terrible video camcorder instead of a digital camera. Indeed, the only real difference between a CCD digital stills camera and a CCD video camera is the digital storage medium.
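To put the 1975 prototype's numbers in perspective (my arithmetic, taking the figures above at face value):

```python
# Kodak's 1975 prototype: 100x100 pixels, 6 bits/pixel, 23 s to tape (figures above).
image_bits = 100 * 100 * 6
print(image_bits)               # 60,000 bits, i.e. 7.5 kB
print(round(image_bits / 23))   # ~2,600 bit/s effective write rate to the cassette
# An analog video field, for comparison, goes to tape in 1/50-1/60 of a second.
```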
Digital tape
Generally available digital tape storage for audio showed up a while after the CD, in the form of DAT (basically a miniaturised helical-scan video-style tape recorder for CD-quality audio, released in 1987), but it took until 1995 with DV before a consumer digital video format was invented. DV tape is impressive, but by 1995 it was also clear that flash storage would be the big new thing, and dealing with tape is horrible. In 1995 we're also approaching the critical mass of early digital cameras, so I doubt DV was ever seriously considered as a way of storing high resolution stills in real-time.
Digital compression?
Do keep in mind also that JPEG wasn't standardised until 1992 though products made use of it as early as 1990; JPEG was made practical as a consequence of computers becoming fast enough to make use of the fairly impressive compression it can achieve.
Before this you'd need much more digital storage for similar quality images; in the early 90's there was a brief market for JPEG accelerator cards for computers since general purpose pre-SIMD processors didn't find it particularly easy to do quickly.
One prominent example of this is the C-Cube Microsystems CL550 and CL560, launched ca. 1990 (the head engineers were JPEG committee members). Per their chip datasheets the enhanced CL560 can JPEG-compress full RGB colour images at up to 15 megapixels per second using a 35 MHz heavily pipelined processing core. This was sufficient to perform real-time JPEG compression of standard definition video (motion JPEG) and was used in some of Avid's early non-linear video editing products. Other cards were intended to make Photoshop and similar programs save files faster.
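A quick sanity check that 15 megapixels per second really does cover real-time standard definition video (my arithmetic, using the usual SD frame sizes rather than anything from the datasheet):

```python
# Pixel rates of standard definition video vs. the claimed CL560 throughput.
ntsc_rate  = 720 * 480 * 30    # ~10.4 Mpixel/s
pal_rate   = 720 * 576 * 25    # ~10.4 Mpixel/s
cl560_rate = 15e6              # claimed megapixels per second

print(ntsc_rate / 1e6, pal_rate / 1e6)   # 10.368 for both
print(cl560_rate > ntsc_rate)            # True: real time, with some headroom
```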
What do?
What are your options? So in the 60's you'd call Ampex, who would laugh at the idea of making even a man-portable recorder capable of storing your still image signal. By the early 70's you could probably make it work, as long as you were content to wheel it around. You could perhaps use a satellite uplink to live-broadcast your images to a mainframe somewhere; the satellite uplink truck would be the more convenient option since you could live in it after selling your mansion to afford the camera.
By the 80's you definitely could make a somewhat portable recorder capable of putting an analog still image onto tape, assuming you uprated something like a Betacam recorder and had a CCD that could keep the image in the charge domain long enough to avoid needing a bunch of RAM. In fact they did make this sort of thing: floppy discs had been a thing for computers for a long time, but these early cameras were TV-quality CCD's storing a single analog video frame at a time on a floppy instead of digital data. It was called a "still video floppy." These had nowhere near good enough quality for any real purpose at the time and don't seem to have achieved much.
By ca. 1990 you could fit enough RAM into a device to store a digital copy of the image, and you could in principle use JPEG to compress it. Storing an analog high def still image would be quite feasible as well, though no one seems to have tried it outside of the early 90's push for HD TV that didn't go anywhere.
In practice we saw the first floppy-storing TV- or sub-TV-quality digital stills cameras, as well as e.g. the Kodak DCS 100, which used a hard drive to store a digital image at over 1 (real, not interpolated) megapixel. That camera and some later models were used by entities like NASA for Space Shuttle missions, where sharing a high resolution image with the ground was a huge improvement over video links; 1980's shuttle missions had approximately as good a video link as the moon landings did.
This was followed by a wide array of increasingly powerful CCD based hybrids and later dedicated digital cameras. I thought Kodak killed digital to save film, so why did they lead the charge into digital cameras?
If you're wondering: the CCD division's last great successailure was the Leica M9 sensor which famously started falling apart after a few years leading to a major recall. The division was later sold to onSemi who closed it in 2019, though they still make competent industrial CMOS sensors.
Also for fun: the easiest way to record an electronic image in any of these time periods would be to use a flying spot scanner to expose a piece of film that you then develop later. This was a real practical way to record certain scientific images as late as the early 90's. It would only take a small leap to go from this to just putting this film in the focal plane of a camera and you're back to square one.
WTF do you do with an electronic image?
So the elephant in the room is that, yeah, sure, you could make an electronic stills camera even in 1970 if you had to. Now what? You're not e-mailing it to anyone, are you?
The only practical ways to view an image were on a standard TV screen (bad quality), a CRT projector (could be better than a TV), an Eidophor (if you work for NASA), a print (great quality) or as a slide (great quality). Digital picture frames were still 40 years out, and it would be at least 50 years before they had acceptable image quality.
Instant review of an image would invariably involve a CRT until the 80's; LCD's weren't really good enough until the 90's at the earliest, and in my opinion it took until the 2010's for portable LCD's to really look acceptable. Plasma displays existed but were basically monochrome.
Printing an electronic image onto paper or reversal film would certainly be possible, but at that point you do need to ask why you didn't just get a point & shoot with some film in it.
I've argued in favour of analog recording previously, but to do any sort of editing you couldn't already do in the darkroom printing process you would essentially need to make the image digital and put it into a computer. The problem is that RAM was so expensive until the 90's that storing even full quality TV stills was a huge undertaking.
What do?
So in the 60's and 70's you're not doing anything other than projecting or printing an image; there simply isn't consumer technology suitable for anything else. Of course it would be possible to make a computer capable of editing a megapixel image digitally in 1970, it's just that you'd need a city block's worth of space to keep it in.
In principle you could probably make special TV's to display the image, but printing would effectively still be a film process with RA-4 colour paper or telecineing it onto a slide. Digital image editing is impossible by most standards.
By the 1980's you could perhaps start to imagine our current future, but telecommunications and computers weren't good enough to store, transmit, or display even very low resolution pictures at a consumer accessible price.
By the 90's you had computers that could fit on a desk and actually start to be useful, with colour displays, as well as colour printers capable of printing images. Coincidentally, you also had digital cameras at the same time.
So why did the digital camera show up when it did?
As mentioned initially I believe the internet and general availability of computers with moderately high resolution screens and fast processors was a massive driving force.
By 1992-93 or so many people (well, workers at large companies) had access to high colour depth, moderately high resolution displays on their computers (assuming they had a Mac IIfx with $10,000+ worth of accessories). There was enough RAM to actually buffer up an entire screen's worth of image, and the processors had become fast enough to do useful image editing. By 1995 the situation was much more democratised, with Windows 95 capable computers (and PowerMacs) generally also being capable of showing megapixel images and low resolution video. You'd definitely want a lot of RAM in there, but it was possible if you were patient.
Further, advances in computerised printing during the 80's had led to general availability of colour printers capable of acceptable results; it was then possible to make your own prints at low quality and moderate cost.
These same advancements in semiconductor manufacturing made it simultaneously possible to make computers powerful enough to work with >1 mp images, made it possible to process and buffer these images in a camera and store them, and presumably also made it more feasible to make higher resolution CCD's with acceptable yield.
By 1995 the internet was a thing everyone was excited about, and this seems to coincide with mass marketing of more consumer priced variants of tech that had mostly existed as early as 1986 in the form of analog still video cameras. These cameras were initially basically all made with TV-grade CCD's and some kind of slightly inappropriate storage device; some even had JPEG compression to let you fit more than one image per floppy.
The reason seems obvious in hindsight: it's exciting new tech, you can review your images instantly, and most importantly: e-mail them to friends! The limited telecommunications bandwidth of the time necessarily made it impractical to share good quality images, but the low low image quality of a TV CCD is fine to downscale, compress, and send over dial-up. At around the same time journalism took its first steps onto the information superhighway, and these postage stamp sized images were about the right size for the early internet news sites.
I believe this huge interest in these low image quality cameras then drove CCD manufacturers to start serious work on high resolution CCD's instead of only catering to video cameras and NRO surveillance satellites. By 1996-1998 or so multi-megapixel cameras were available for professionals, who presumably only used them to shoot sports for quick turnaround, but film's fate was sealed as progress accelerated rapidly from there.
The storage problem wasn't really solved until the late 90's to my mind with flash storage for small to medium amounts of images, and e.g. microdrives for larger amounts for professional use.
These days, when looking at image archives, there is a hugely noticeable drop in image quality in the 1998-2004 period, when most snapshots and childhood pictures rapidly transitioned off film and onto consumer digital cameras. This reduction in technical quality would have been obvious even then, but it does indicate that while film held a very real quality edge until probably 2002 at the earliest (see the EOS 1Ds, which roughly matches 35 mm Ektachrome in many quality aspects) and probably more like 2010 for lower cost cameras, your average consumer was very interested in getting away from film and its (at least perceived) high cost per frame.
I'd wager the number of images taken rose rapidly in this period, and further I'd guess the number of usable pictures probably also rose, since your average joe plumber could take 10 shots of the same scene instead of one or two and just keep the one good frame.
Film fights back
In the early-to-mid 90's Kodak spent a fortune developing and marketing APS cameras, which were honestly pretty cool for consumers. The new cassette was much more user friendly than 35 mm ones, and the subminiature format wasn't that much smaller and made perfect sense for consumer grade cameras.
If they had introduced APS in 1986 instead of 1996 it could have owned the market. The cost of APS, plus stiff competition in CCD and CMOS image sensors in the early 2000's, is probably what killed off Kodak in my opinion.
However, the biggest advancements I can think of during the 80's and 90's are T-grain films with much higher sensitivity and finer grain, and compact mini labs that could very rapidly turn negatives into prints at low cost. This also led to a shift in consumer film usage away from primarily reversal (slide) film for projection and toward more flexible negative film.
These days we also enjoy instant review of images on screens; this was a major benefit of even early digital cameras. In the film era this niche was served by Polaroid and similar instant technologies, though their technical performance was much worse than standard film.
How could you make digital cameras in 1980?
The easiest way to accomplish this would be to retroactively change the calendar, making 1995 actually be 1980 now.
In 1980 most of the high level technology needed to make a digital camera make sense had been invented, and it could all have been put together if an infinite amount of resources and money were poured into the problem.
To make a digital camera itself you need:
- Image sensor: the CCD had been invented; in theory you could make one with sufficient resolution if your life depended on it.
- Assume 1.2 megapixel is possible, a 4⨉ bump in pixel count vs. a TV sensor
- Video-ADC to make the digital copy of the video from the CCD: technically quite possible, but very expensive and power hungry if it also had to be fast
- RAM to store the digital image: sure, but needs a lot of space and power if you want all of the image stored at once (see the back-of-envelope sketch after this list)
- Man-portable storage media: you'd probably start with a VHS or similar video tape recorder. Alternatively, a hard drive could be made to work.
- Hard drives capable of storing more than one image were typically mounted in 19" racks at the time
- Ideally you'd want image compression: not sure what the state of the art was in 1980, but the ASIC to do some compression would probably be fairly compact next to all the RAM to store the image.
- In the mid 80's semi-digital video compression systems like the Japanese MUSE system and EBU's later MAC systems were doing relatively sophisticated digital processing on live video to try to enhance the image quality of the 30's and 50's designs of analog TV
- These systems were basically failures, though MUSE gets credit for being basically HD TV 20 years before widespread consumer availability in the rest of the world while no one has ever heard of EBU MAC
- Viewfinder to review the image: not strictly required but a high-ish resolution CRT would be quite possible to manufacture, see e.g. the Sony Watchman. An SLR-design wouldn't require an electronic viewfinder.
- Designs for TV use would use an electron beam large enough to avoid visible spacing between the lines; a CRT is basically a special case of an electron microscope and so the capability to create very high resolution monochrome screens existed for a long time.
- High resolution colour monitors would present more of a challenge since shadow mask TV's were all awful and would require a very very fine mask, eliminating most of the light output.
- A colour sequential system using a DLP-type colour wheel in the optical path seems like it could be the easiest way to achieve a small, high resolution, compact colour CRT.
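A back-of-envelope for the camera sketched in the list above (my assumptions: the 1.2 megapixel sensor from the list, 8 bits per sample, and a leisurely 1/15 s readout):

```python
# Hypothetical 1980 camera: rough data-rate and memory requirements (my assumptions).
pixels       = 1.2e6
bits_per_px  = 8
readout_time = 1 / 15          # assumed slow-scan readout

frame_bits = pixels * bits_per_px
adc_rate   = pixels / readout_time

print(f"frame buffer: {frame_bits / 8 / 1e6:.1f} MB of RAM")   # ~1.2 MB per frame
print(f"ADC rate:     {adc_rate / 1e6:.0f} Msample/s")         # ~18 Msample/s
# For scale: 64 kbit DRAMs were only just arriving around 1980, so one frame
# buffer alone is on the order of 150 such chips.
```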
This all seems like something you could have made; it would probably be much easier in 1985 vs. 1980, but still doable. It would involve a backpack full of electronics to make it happen, and it would undoubtedly have very poor sensitivity and dynamic range vs. any negative film of the era. Do keep in mind that around 1980 the IBM PC and analog walkie-talkies with barely enough processing power to do automatic channel switching (pre-digital mobile phones like NMT-450) were considered state of the art.
Here's the fun part: what do you do with the image?
- Long term storage: definitely tape here, not a huge problem except it's tape so it sucks. It won't keep as well as black and white negatives but it might last longer than 70's and 80's colour negatives, which were notoriously prone to fading.
- Making prints: inkjet printers were quite possible to acquire in 1980, and flying spot-scanning an image onto RA-4 colour paper was also possible.
- Projecting it: a CRT projector could do it, or better yet use the flying spot-scanner from the printing step to expose a Kodachrome slide and project it normally
- Displaying it digitally: high resolution displays were definitely possible to make, but the amount of expensive RAM chips you need in this pipeline is getting pretty insane. Possibly made easier by using a slow scan monitor with long-decay phosphor (see e.g. old vector CAD displays).
- Storing a full colour quality 24-bit-per-pixel standard definition TV image (which is 0.3 mp) requires around 1 MB of RAM, or just under twice what an IBM PC could make use of directly (the arithmetic is sketched after this list)
- Our proposed system would likely need around 4⨉ more than this just for video-RAM to truly compete.
- Editing it: You'd need at minimum a fridge-sized mini-mainframe to usably digitally edit the image, but it would be technically possible.
- Assuming we want to keep a single copy of the 1 mp image in RAM we need around 4 MB of RAM
- An IBM System/38 the size of a large chest freezer had around 1.5 MB of RAM in 1978
- An IBM AS/400 from 1988 had a dramatic increase, apparently 64 MB wasn't unheard of.
- Sharing it with friends: get a stamp and mail them prints like people did ca. 1890-2005.
- If you want it electronically then you need to get a beefed up ISDN rolled out to the masses 15-25 years ahead of the actual schedule.
- You also have to convince the phone company to not treat data transfer as if it's a phone call to really boost adoption.
- In theory you could just print it and put it into a fax machine, but colour fax was never a mass market thing. Also you don't have a fax machine at home unless you're a stock broker.
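Worked versions of the RAM figures from the list above (my arithmetic, 24 bits per pixel throughout):

```python
# RAM needed to hold one uncompressed image at 24 bits per pixel (my arithmetic).
def image_mb(megapixels, bits_per_pixel=24):
    return megapixels * 1e6 * bits_per_pixel / 8 / 1e6

print(round(image_mb(0.3), 1))   # SD TV frame: ~0.9 MB, i.e. the "around 1 MB" above
print(round(image_mb(1.0), 1))   # a 1 mp image: ~3 MB, "around 4 MB" with headroom
```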
It bears repeating that many or all of these limitations were solved or nearly solved by 1990 or so. CCD manufacturing was technically capable of megapixel sensors (see the Kodak DCS 100), presumably at much lower cost than a decade earlier. CRT displays for computers were full colour capable, and megapixel full colour displays and associated graphics cards were quite feasible, though very expensive and mainly used for professional publishing. Putting 16 MB of RAM in a computer in 1990 was quite possible, and in theory up to 128 MB was possible but outrageously expensive. Computer storage made major leaps in the same time period: tape, solid state, and hard drives alike. A miniature floppy drive in 1990 could easily store a single uncompressed TV-quality image, and JPEG compression was right around the corner. Printers were much more capable and compact. Fax machines still hadn't grown colour, though.
As it so happens, it looks a lot like 1990 or so was the critical mass point for the technology to start emerging; the image sensor was only one part of the puzzle. The crazy speed of 80's and early 90's computer technology made the things that could technically be built (ignoring cost) in 1980 accessible to companies in 1990, to many consumers by 1995, and commodities by 2005.
Conclusion
I hope this illustrates why digital stills cameras happened when they did. Even if it could have happened a few years earlier (and attempts were made at it as early as 1986), the convergence of especially consumer-accessible digital computing power and telecommunications advancements in the form of the internet seems to have created a real consumer use case for nascent digital cameras in spite of all their limitations.
Prior to the internet there really was very little a consumer could usefully do with a TV-still-quality digital or analog camera; such a camera would have been incredibly expensive and inconvenient, with dramatically worse image quality than even the disposable cameras of the era, and with the further downside that the image would realistically have to live on a TV or computer screen afterwards.
In around the same era we also saw further enhancements to film photography, probably owed partly to improved computing power in the R&D departments. To mention two technical enhancements, I'd point to the T-grain films of the 80's, which were initially black and white (e.g. Kodak T-Max) but were rapidly applied to create much finer grained high speed colour films than were previously possible. This made it possible to build the small-aperture autofocus zooming point & shoots of the 80's and 90's, as well as disposable cameras for even lower TCO for occasional shooters and children.
Another technical/process improvement was the proliferation of the mini-lab which made film developing much easier to perform in a limited space, leading to one-hour development shops popping up all over the place.
These factors kept film relevant for probably a decade longer, though I still suspect most families at the time only handed in their film roll for development somewhat begrudgingly once a quarter at most, and were definitely ready for digital by the dawn of the new millennium.