So it takes ChatGPT ten minutes to an hour of server time and the energy equivalent of a tank or two of gas to complete a simple task the user could have done in thirty seconds with their 40W brainmeats and a couple of pudgy fingers. That's just great. Good stuff, Altman. /s
You're not wrong today. But this is exactly the basis of the critique of computers in the '50s. And you probably wrote this post on a mobile, internet-connected computer that fits in your pocket.
Okay, downvote away. Lemmy has such an ignorant hate boner against AI.
Computers were fucking trash in the '50s. Skeptics said the same shit about them that people say about AI today: computers are unreliable, create more problems than they solve, are ham-fisted solutions to problems that require human interaction, etc. Here are some of the HUGE problems computers had that got solved over the following decades.
Problem: No standard way to represent negative numbers in binary.
Solution: Two's complement became the standard.
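For the curious, here's a minimal Python sketch of the idea (the helper name is mine, nothing standard): an n-bit two's complement pattern is just the integer taken mod 2^n.

```python
# Minimal sketch: encode a signed integer as an n-bit two's complement pattern.
def twos_complement(value, bits=8):
    return value & ((1 << bits) - 1)  # masking to n bits is the same as mod 2**n

print(format(twos_complement(-5), "08b"))  # 11111011
print(format(twos_complement(5), "08b"))   # 00000101
```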
Problem: Bit errors from unreliable hardware.
Solution: Hamming codes, CRC, and other ECC methods.
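A quick sketch of the CRC idea using Python's standard zlib (a real Hamming code also *corrects* errors; a bare checksum like this only detects them):

```python
import zlib

# Sketch: a CRC-32 checksum catches a single flipped bit in transit.
payload = b"hello, world"
checksum = zlib.crc32(payload)

corrupted = bytes([payload[0] ^ 0x01]) + payload[1:]  # flip one bit
print(zlib.crc32(corrupted) == checksum)  # False: corruption detected
```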
Problem: Inconsistent and error-prone real number math.
Solution: IEEE 754 standardized floating-point formats and behavior.
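You can see both the problem and the standard from any Python prompt, since Python floats are IEEE 754 doubles:

```python
import struct

print(0.1 + 0.2 == 0.3)  # False: 0.1 has no exact binary representation
# The raw 64-bit IEEE 754 pattern behind 0.1:
(bits,) = struct.unpack("<Q", struct.pack("<d", 0.1))
print(format(bits, "064b"))
```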
Problem: Each computer had its own incompatible instruction set.
Solution: Standardized ISAs like x86 and ARM became dominant.
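To illustrate what an instruction set even is, here's a toy three-instruction stack machine sketched in Python (entirely made up, not any real ISA). A machine built around different opcodes simply couldn't run this program, which was the whole problem:

```python
# Toy sketch: a tiny made-up "instruction set" interpreted in Python.
def run(program):
    stack = []
    for op, arg in program:
        if op == "PUSH":
            stack.append(arg)
        elif op == "ADD":
            stack.append(stack.pop() + stack.pop())
        elif op == "PRINT":
            print(stack[-1])

run([("PUSH", 2), ("PUSH", 3), ("ADD", None), ("PRINT", None)])  # prints 5
```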
Problem: Memory was slow, small, and expensive.
Solution: Virtual memory, caching, and paging systems.
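Here's a toy sketch of the paging idea (least-recently-used page replacement with a handful of frames; the function and numbers are made up for illustration, not any real OS code):

```python
from collections import OrderedDict

# Toy sketch: count page faults under least-recently-used (LRU) replacement.
def simulate_lru(reference_string, frames=3):
    memory = OrderedDict()
    faults = 0
    for page in reference_string:
        if page in memory:
            memory.move_to_end(page)         # hit: mark most recently used
        else:
            faults += 1                      # miss: page fault
            if len(memory) >= frames:
                memory.popitem(last=False)   # evict least recently used
            memory[page] = True
    return faults

print(simulate_lru([1, 2, 3, 4, 1, 2, 5, 1, 2, 3, 4, 5]))  # 10 faults
```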
Problem: Basic operations like sorting were inefficient.
Solution: Research produced efficient algorithms (e.g., Quicksort, Dijkstra’s).
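Quicksort itself fits in a few lines; here's the classic textbook sketch (fine for illustration, not how production sorts are actually written):

```python
# Sketch: textbook quicksort, first element as pivot.
def quicksort(items):
    if len(items) <= 1:
        return items
    pivot, *rest = items
    smaller = [x for x in rest if x <= pivot]
    larger = [x for x in rest if x > pivot]
    return quicksort(smaller) + [pivot] + quicksort(larger)

print(quicksort([5, 2, 9, 1, 5, 6]))  # [1, 2, 5, 5, 6, 9]
```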
Problem: No formal approach to designing logic circuits.
Solution: Boolean algebra, Karnaugh maps, and FSMs standardized design.
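A finite-state machine is simple enough to sketch directly; here's a made-up example that accepts binary strings ending in "11":

```python
# Toy sketch: an FSM as a transition table.
transitions = {
    ("start", "0"): "start", ("start", "1"): "one",
    ("one",   "0"): "start", ("one",   "1"): "two",
    ("two",   "0"): "start", ("two",   "1"): "two",
}

def accepts(bits):
    state = "start"
    for b in bits:
        state = transitions[(state, b)]
    return state == "two"  # accepting state: last two bits were "11"

print(accepts("0111"))  # True
print(accepts("0110"))  # False
```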
Problem: Programs used unstructured jumps and were hard to follow.
Solution: Structured programming and control constructs (if, while, etc.).
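The before/after is easy to show: here's the spaghetti version mocked up as GOTO-era pseudocode in comments, followed by the structured equivalent in Python:

```python
# Unstructured flow, sketched as 1960s-style pseudocode:
#   10 I = 0
#   20 IF I >= 5 GOTO 60
#   30 PRINT I
#   40 I = I + 1
#   50 GOTO 20
#   60 END
#
# The same logic with a structured control construct:
for i in range(5):
    print(i)
```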
Problem: No standard way to represent letters or symbols.
Solution: ASCII and later Unicode standardized text encoding.
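You can poke at both standards directly from Python:

```python
# ASCII: each character maps to a standard code point.
print([ord(c) for c in "Hi!"])    # [72, 105, 33]
# Unicode (here encoded as UTF-8) covers text ASCII never could.
print("héllo".encode("utf-8"))    # b'h\xc3\xa9llo'
```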
Problem: Code was written in raw machine or assembly code.
Solution: High-level languages and compilers made programming more accessible.
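Python itself is a nice demo of this one: the standard dis module shows the low-level instructions a single high-level line compiles into (exact opcode names vary by Python version):

```python
import dis

def add(a, b):
    return a + b

# One readable line becomes several machine-like instructions
# (LOAD_FAST, BINARY_OP/BINARY_ADD, RETURN_VALUE, depending on version).
dis.dis(add)
```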
It's just ignorant to act like the problems we face with AI won't be sorted out just as those were with computers.
I'm genuinely curious: how often does spouting off random bullshit work for you? Nothing you listed backs up your argument that the problems around AI are a result of its infancy and first-cut implementations.
Also, half of what you say is either untrue or disingenuous as all hell. "Programs used unstructured jumps and were hard to follow"? What the fuck are you talking about? Please, find me a computer that didn't use something like a branch statement and didn't go in numerical sequence of instructions. I'll wait while you learn that this so-called "Instruction Set Standardization" of yours doesn't exist.
Of course the AI defender uses AI to argue; they don't need to understand shit as long as their AI girlfriend burns enough of their naysayers' time and energy.