Simply put, it is not digital. Right? But at its core a CCD or a CMOS sensor is an analogue device, transforming photons into electrical charge; only afterwards is that charge converted into a digital file. Yet we all agree that this kind of photography is so-called "digital photography" and not analogue (or analog in American English) photography. On the other hand, a large share of analogue photographs ends up converted into digital files anyway, by scanning the negatives, at least for online presentation. It's a little bit complicated.
But let's leave the philosophical matter of analogue vs. digital for another column in the future. Analogue photography is a whole universe of diversity in itself. But what is real analogue photography? Some would say that real analogue photography is when a picture is taken on some light-sensitized material, and the aperture and the time this material is exposed to light are controlled manually. Others would say: give me some film and any camera that will take it, then take the film to the local quick lab for developing and printing. This is also analogue photography. But what would you say about alternative processes? There is no ready-made film prepared in advance; you must prepare your own light-sensitive material and do the developing, and also the printing (if it's needed), on your own. Are those processes more analogue than the previous ones? What do you think? What's your way to be analogue?
Matjaž
P.S.: About the last column and which camera I took on the hike: I chose the Altix. More about this in the next column.