Pencil, paper, and the slide rule got us from the Wright brothers to jet-propelled flight in 30 to 40 years. Home computers went from 5 MB hard drives to multi-terabyte storage and far greater speed in about the same span. Computers are now, in some sense, inventing themselves faster than the human mind could.
At a barbecue last week, many guests were denizens of the photo industry: some retired, some winding down, some who had lost jobs to computer processing, and some cranking out imaging for later processing by computers, a creativity-numbing task. And AI? It keeps getting better as computational operations reinvent themselves faster than we can.
While I felt at the mercy of this process, I held out hope, really, that the innate ability of humans to appreciate essence, the moment, to perceive as only humans can and express from that experience, would always distinguish our process from the computational one. The human process would always be superior, elegant, and unique.
Two things:
Our marketplace is being eroded by clients who only need so much quality. This is not new; at the frayed edge of our work we have often been displaced by a secretary (now an admin) with a camera (now a cell phone) who could produce something that was "good enough." Apps and algorithms, freely available online, can do the needed processing. You and I can see the difference, but the immediacy and economy of the processes now available are more attractive. As such, our market has shrunk.
I recently read a novel by Berkeley's Ursula K. Le Guin, The Dispossessed, in which a physicist describes a model: a rock is thrown at a wall, but each throw covers only half the remaining distance. Thrown again, it covers half of what is left, rinse and repeat. It never actually reaches the wall, but it gets really, really close. This is Zeno's paradox. For many purposes, that is good enough. Does AI get us halfway "there," and where will "there" be after innumerable iterations?
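For the arithmetic-minded, here is a minimal sketch of that halving model in plain Python. The starting distance of one unit and the count of ten throws are arbitrary choices for illustration, not anything from Le Guin's text:

```python
# Zeno's halving model: each throw covers half the remaining distance.
# The rock never reaches the wall, but "really, really close" comes fast.
distance = 1.0  # distance to the wall, in arbitrary units
for throw in range(1, 11):
    distance /= 2
    print(f"throw {throw:2d}: {distance:.6f} units from the wall")
# After 10 throws the rock is about 0.001 units away; never "there,"
# yet good enough for many purposes.
```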
•••
I recall back in the '90s, when computational processes were rapidly being applied to imaging, image processing, and delivery, a mentor at the end of his career said, and I paraphrase: "I won't have to deal with this. Now it is in your hands." I think that we did it ably and sustained our craft.
I am retiring on June 30, just a few weeks away, after a long career of analogue and digital imaging and production, and years of teaching and bringing a Photo Department and its Studio to its zenith. I won't have to deal with this. Now it is in your hands. I can only wonder.
Hey, what a great and thoughtful comment. First of all, congratulations on your retirement!
I don't disagree with a thing you've said. What I hold out hope for, and what I realize has always been and will always be true, is that humans ARE different from machines. And while the commercial audience may be shrinking, and shrinking further with the increasing reliance on "good enough," as you brilliantly elucidated with Zeno's paradox... there's always space for excellence. What I'm hoping, based on my current thinking anyway, is that we are at least able to accurately identify what the technology does for us (not to mention to us). I think we are losing the plot, broadly speaking; all the hubbub is about how much better AI is at everything. But if it's something you really care about, at least at this point, I don't think it is. Will it be? Who knows. All those previous technologies aided humanity's ability to think. Is this one different? It certainly will be if we give ourselves over to it and shrug off the idea that "good" has any meaning anymore.
If it weren't happening to me I'd really enjoy watching how it upends entire industries. Just out of interest, I mean. As it is, I have to worry about staking my claim to... something. And I think the only thing I've got is my humanity. So there we have it, my line of demarcation.
Thank you for reading and responding. And seriously, way to go on the retirement. Enjoy it!
Consider the effects of past technologies. Remember the term "labor-saving device"? These inventions were supposed to give us more free time. Instead, that "saved" time, and the leisure it was supposed to allow, was simply figured into new models of productivity. As a result we still work five days a week and eight hours a day. We can't seem to shake the industrial model as long as the bean counters have the dais.
That said, I recall mythic stories of the invention and development of the assembly line. It went sort of like this: everything, supplies and tools, was lined up on a factory floor, and the auto frame rolled along while pieces were affixed until it was done. It was really slick; a marvel of human performance that was always there, just not conceptualized. Sorta like something for nothing.
The problem, they found, was that the process was not only repeated ad nauseam but sped up. It didn't take long to find human limits; repetitive motion broke down people's connective tissues and crippled them. The rate was slowed. It took unions to wrest control of the speed and method levers from the bean counters. In the meantime we have ergonomic chairs.
It appears that we are finding new "occupational injuries." The density of computational work is nearly infinite as computers become faster; cue the labor-saving and human-threshold models above. Cue "AI," and those folks are displaced. There are only so many paper-hat jobs available, and those are being replaced every day. Millions of jobs will be lost; this is a certainty. Photo jobs will evaporate with only the barest whimper amidst the big bang.
In the past it was feared that labor-saving devices would provide so much leisure time that there were concerns about filling it up. Games, entertainments, and travel were all suggested. All you have to do is observe what Joe Bob Briggs described as the infinitely repeatable Disneyland experience: dumpy, waddling consumers with cotton candy in their hands.
Absent the society of the workplace and the community, what is going to happen to people? How will the mechanics of fiat and purchase be sustained? How will people maintain their "sanity" without the structure of work, community responsibilities, and the social engagement of commerce, which is what seems to bind us together? Will they live on their computers? Lord knows we've seen how the chatroomiverse has gone; even now, folks steer away from assholes toward folks who share their views. Imagine a life like this.
In a world like this, what does excellence have to do with anything? I guess it matters on which side of the "excellence" threshold one resides.
Really interesting. I like that you bring in Mondrian. His work was pretty much out of left field. It had never been done before, so it would never have been contemplated by AI. I view AI as linear, as in if 'A', then 'B', and if 'B', then 'C', etc. What artists can do is go from 'A' to 'X' in one leap, and that is why AI will not get there. In part because those using AI will not accept the leap: AI is a 'black box', so you don't actually see how it goes from 'A' to 'B' to 'C'; you just see the logical conclusion. If 'X' is not logical, it will be rejected by the person consulting AI, as it will be rejected by the AI itself. That is why we must love an artist's mind, exactly for the reason that it is non-linear and can go straight to 'X'. Does that make sense?
Makes perfect sense! I like to say logic is the enemy of art, and that's a good example of why.
Thanks for the thoughtful comment!
It does. And I agree.
Faster is faster. That's it. Is a fast lunch at your desk better than sitting down with friends over a meal?
I recently photographed a conference which included a panel of hedge fund and other investors, and someone in the audience asked about AI. The consensus was that AI was very good at quickly summarizing large amounts of data, but not at all good at making decisions about that data.
Faster is faster, that's it.
Faster is really really important for a whole lot of things. But not EVERY thing. And I feel like we need to point that out every now and then.
Thanks for reading!
Scary future ahead! While I agree that AI can be used as a helpful jumping-off tool, I'm afraid most people will use whatever it spits out. Good enough is good enough as long as it's cheap and fast. The collective mind is not particularly discerning. Ain't nobody got time.
I think you are absolutely right. Gone are the days of dedicating enough resources to properly do... much of anything.
Great read, Bill. I've never understood using AI to generate ideas - I have more than enough ideas; too many, really. The challenge is getting to the hard work that starts once you commit to an idea. AI is fine for "good enough, just," which might work for some business models. But they won't be the innovators or future MBA case studies.
Re your footnote 6, on your area of expertise: I found the same thing with traditional media on an issue I had a deep (expert) understanding of - invariably their reporting was a shorthand summary from a range of sources that was often "close but not quite." Good enough, I guess.
"There was no there there" is a pretty good description of almost everyone of these kind of events I've ever been to, long before the advent of AI. I think most of the time'good enough' is all people want or expect, so if all the AI can manage is 'good enough' that will be good enough. Sad, but true.
While I do think "good enough" certainly has a stronghold (and seemingly getting stronger), in this particular instance it was notable how each of the other groups had something that stood out as "special." It's really what made me notice how plain our presentation was. We checked the boxes but had none of the magic.
Thanks for reading!
We've always been a people of speed over quality. Like you said - productivity. Capitalism. You've nailed a lot here, right on the head. Perhaps the quality will come. It's not there yet. It may or may not ever be there. I just know that AI won't bring the "look" I strive for in my photography; it won't tell the stories that are in my head the way I want to write them. Time will tell.
Your comment draws an important distinction, I think, between mass market and "art." There's always been room for "good but slower and more expensive." Maybe that safe harbor just keeps shrinking. Like you said, time will tell. Thanks for reading!