Discussion about this post

Michael Mejia:

Pencil, paper, and the slide rule got us from the Wright brothers to jet-propelled flight in 30 to 40 years. Home computers went from 5 MB hard drives to multi-terabyte storage and far greater speed in about the same span. Computers are now, in some sense, inventing themselves faster than the human mind could.

At a barbecue last week, many guests were denizens of the photo industry: some retired, some winding down, some who had lost jobs to computer processing, and some cranking out imagery for later processing by computers, a routine that numbs creativity. And AI? It keeps getting better as computational operations reinvent themselves faster than we can.

While I felt at the mercy of this process, I held out hope, really, that the innate human ability to appreciate essence and the moment, to perceive as only humans can and to express from that experience, would always distinguish our process from the computational one. The human process would always be superior, elegant, and unique.

2 things:

Our marketplace is being eroded by clients who only need so much quality. This is not new; at the frayed edge of our work we have often been displaced by a secretary, now an admin, who had a camera, now a cell phone, and who could produce something that was "good enough." Apps and algorithms, freely available online, can do the needed processing. You and I can see the difference, but the immediacy and economy of the processes now available are more attractive. As such, our market has shrunk.

I recently read a novel by Berkeley's Ursula K. Le Guin, The Dispossessed, in which a physicist describes a rock thrown at a wall: it first covers half the distance, then half of what remains, rinse and repeat. While it never actually reaches the wall, it gets really, really close. This is Zeno's paradox. For many purposes, that is good enough. Does AI get us halfway "there," and where will "there" be after innumerable iterations?
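(For the record, the arithmetic behind the paradox is a convergent geometric series; this gloss is mine, not the novel's:

\[
\sum_{n=1}^{\infty} \frac{1}{2^n} = \frac{1}{2} + \frac{1}{4} + \frac{1}{8} + \cdots = 1
\]

After any finite number of halvings, N, the rock has covered \(1 - 2^{-N}\) of the distance: never all of it, yet arbitrarily close, which is exactly the "good enough" territory.)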

•••

I recall back in the 90s, when computational processes were rapidly being applied to imaging, image processing, and delivery. A mentor at the end of his career said, and I paraphrase: "I won’t have to deal with this. Now it is in your hands." I think that we did it ably and sustained our craft.

I am retiring on June 30, just a few weeks away, after a long career of analogue and digital imaging and production, and years of teaching and bringing a Photo Department and its Studio to their zenith. I won’t have to deal with this. Now it is in your hands. I can only wonder.

søren k. harbel:

Really interesting. I like that you bring in Mondrian. His work came pretty much out of left field; it had never been done before, so it would never have been contemplated by AI. I view AI as linear: if 'A', then 'B', and if 'B', then 'C', and so on. What artists can do is go from 'A' to 'X' in one leap, and that is why AI will not get there. In part this is because those using AI will not accept the leap: the system is a black box, so you don't actually see how it goes from 'A' to 'B' to 'C'; you just see the logical conclusion. If 'X' is not logical, it will be rejected by the person consulting the AI, just as it will be rejected by the AI itself. That is why we must love an artist's mind, precisely because it is non-linear and can go straight to 'X'. Does that make sense?

