I agree the hype has become tiresome when pitted against current capabilities, but my personal experience is that AI tooling does have utility if one recognizes its limitations and works within and around them. One of the challenges there, though, is communicating best practices, particularly in the arts.
It's a fraught topic, of course. My observation is that there are a number of people—regardless of whether they understand what AI is in the first place—who are building entire platforms around the notion that the only thing AI is good for is being universally condemned. I understand and empathize with their position, but I do worry about the consequences this will have for them (and for further social division) in the medium- and long-term. It reminds me an awful lot of the folks who used to jest that they "didn't know how to computer" back in the 90s and 00s. What good did it do them to ignore what ultimately became the primary way of working in just a matter of years?
That's not to say you or Laurie, based on her comment, fit that mold. In fact, the willingness to read and engage demonstrates a commitment to remaining in the conversation, and that alone buys one so much.
You're also spot on about LCMs: there's nothing even available publicly right now that would let a consumer engage with that type of model unless that consumer happened to have a pretty tech-heavy background and an awful lot of time to experiment with it.
Your comments—which I'm very grateful for!—do make me think that perhaps a post in the near future should focus on how I've been using AI as I get back into writing creatively. I certainly don't ask it to write for me and then copy-paste into Scrivener (I agree that its default writing style is very wooden, and it'd undermine the point of writing in the first place), but I've found it to be an effective starting point for further research, and it's been a decent enough sounding board to bounce ideas off of to maintain verisimilitude.
You've given me a lot to think about as always, Bruce! Thanks for chiming in. 🙏
My goal is to always have a thoughtful conversation. Again, I cheer on guys like you even though I'm largely staying on the sidelines. I can see some value in AI for idea generation, though right now I have no shortage of ideas so looking to AI hasn't been a priority. Keep the articles coming. I'm always open to learning.
Stimulating thoughtful conversation is precisely why I write posts like these. You can count on another one each week, and I always look forward to what folks have to say once they're published!