NFTs legitimately confuse me.
"Why can't you put the whole image in an NFT?"
"It's too big"
"Why is it too big?"
"It'd take too long to generate."
"Okay, but why?"
"Because NFTs can't hold that much information."
"Okay, but why?"
"Because it'd take too long to generate."
"Okay, but why would it take too long to generate???"
"Fuck you, stop wasting my time."
"Oooookay. I really don't understand but okay, fuck you too I guess."
Does anyone know why NFTs are so small? Everything I've read says that they're fucking tiny, but nothing explains why they can't be larger, why being larger would be too slow, and so on. They honestly seem like a decent answer to the digital ownership problem of "I want to resell this game like I could 20 years ago, but I can't because it didn't come on a disc". However, I get sent in a circle whenever I try to figure out what makes NFTs so unwieldy and impractical.
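For what it's worth, the usual answer is that on chains like Ethereum the token doesn't hold the image at all, just an ID and a URL pointing at it, because on-chain storage is priced per byte in gas. Here's a rough back-of-envelope sketch; the figures (roughly 20,000 gas per fresh 32-byte storage slot, a ~30M block gas limit, a 20 gwei gas price) are approximations for illustration, not exact protocol constants:

```python
# Back-of-envelope: why putting a 1 MB image directly into an Ethereum
# contract (the usual NFT platform) is impractical.
# All figures below are approximations, not exact protocol constants.

GAS_PER_SLOT = 20_000         # rough cost of writing one fresh 32-byte slot
SLOT_BYTES = 32
BLOCK_GAS_LIMIT = 30_000_000  # approximate mainnet block gas limit
GWEI_PER_GAS = 20             # an assumed, moderate gas price
WEI_PER_ETH = 10**18

image_bytes = 1_048_576       # a 1 MB image
slots = image_bytes // SLOT_BYTES
gas = slots * GAS_PER_SLOT

eth_cost = gas * GWEI_PER_GAS * 10**9 / WEI_PER_ETH
blocks = gas / BLOCK_GAS_LIMIT

print(f"storage slots needed:  {slots}")          # 32768
print(f"total gas:             {gas:,}")          # 655,360,000
print(f"~ETH at 20 gwei:       {eth_cost:.1f}")   # ~13.1
print(f"block gas limits used: {blocks:.1f}")     # ~21.8
```

So a single 1 MB image would burn more gas than roughly twenty entire blocks can hold, which is why most NFTs sidestep the problem entirely: the token stores only a pointer (a `tokenURI`) to an image hosted on a regular server or IPFS, and the token itself stays tiny.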
(Not that I think anyone should be able to own a digital good; I pay for digital things because I want to support people, not because I think digital ownership is a legitimate concept. Imo, because digital things can be copied as many times as you want, you can't truly own a digital item, nor should anyone be allowed to try to revoke said item unless it's illegal for other reasons. However... as long as we live in a capitalist society hell-bent on applying the concept of ownership to a system that's only limited by your hardware, I think people should have the ability to actually "own" their digital goods in the traditional sense, which includes the right to not have a company take them away whenever it feels like it and the ability to sell them like any IRL good.)
I'll throw you a bone and say that, if/when AGI rolls around, I'll be more than happy to extend concepts like creativity and artistic ability to it. I'll throw you another bone and say you're technically not wrong either.
The question I've come to is less about what is "original vs remix", and instead, "sapience vs machine intelligence". If sentience is the ability for an individual to say, "I think, therefore I am", then sapience is the ability for an individual to figure out that "I think, therefore I am". Furthermore, in this context I define "machine intelligence" as something artificially created which demonstrates elements of sentience or even sapience but fails to meet all the criteria that we would consider necessary for human intelligence (basically machine intelligence is "fake" intelligence).
AI in its current state appears to be nothing more than machine intelligence. It looks cool and it can fool you pretty well, but in the end it appears to be about as conscious and self-aware as a jellyfish or a siphonophore.
Furthermore, the AI doesn't have the ability to create unique experiences. It doesn't have the ability to walk out the door, drive down the street, walk into a surf shop and buy a surfboard. Even if we say, "putting it in a robot is too hard, we'll just put it in VRChat instead", I still have strong doubts about whether or not the AI is actually experiencing anything.
I mean, it can't even learn from itself without human intervention, ffs. Unlike a human, an AI can't be trained while it's running. And unlike an AI, a human never fully shuts off until death (no, your brain doesn't turn off when you sleep; if it did, you'd literally be dead).
So you're not technically wrong, but at the same time AI brings nothing new to the table. It doesn't have new experiences it can mix in with the artwork it was trained on, nor is there evidence that it'd be able to control or shape what it experiences. While I hesitate to attach the physical act of creation to the concept of creativity (I consider creativity to be separate from artistic skill), a large part of creativity is coming up with something new based on a combination of your own experience and the experiences of others. Whether or not you act on your creativity and how well you execute your idea is immaterial.