So, I'm not a computer guy by trade: I own a Raspberry Pi with a Plex server, I've written a few lines of Python as a hobby, and that's it.
I feel like we are living in an AI bubble right now, and I would like to ask some professionals for clarity on the matter.
This is not meant to sound demeaning, and you are welcome to disprove it, but in my view an algorithm that says
start = int(input("type a number: "))
for number in range(start, start + 1000):
    if number % 2 == 0:
        print("even")
    else:
        print("odd")
is "AI", meaning that it leverages an infinitely more precise and methodic calculator than the human brain to perform a task.
You can make it as complicated as you want, but at the end of the day it all boils down to a program accepting an input and doing some "instantaneous" work for you. That's what we want computers to do for us, to take away the mental strain.
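Just to show what I mean, here is a toy sketch of how I picture "training" (this is purely my own illustration, with made-up numbers, not how real models actually work): instead of me typing the rule into the code, the program fits a number from a handful of examples and then uses it.

# Toy sketch of how I imagine a "trained" program (my own assumption, not real ML):
# the number it uses is fitted from examples instead of typed in by hand,
# but in the end it is still input -> output.
examples = [(1, 2), (2, 4), (3, 6), (4, 8)]  # made-up "training data"

# "training": pick the single weight that best maps inputs to outputs (least squares)
weight = sum(x * y for x, y in examples) / sum(x * x for x, _ in examples)

# "inference": same old input -> output program, just using the learned number
value = float(input("type a number: "))
print("the model predicts:", value * weight)

Even in this picture, it still looks to me like the same input → output scheme as the even/odd program, just with the number coming from data instead of from me.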
Now, how does ChatGPT differ from the even/odd example I wrote above? Didn't we already have image recognition before this? What makes this new wave of AI so different from, say, a search engine like Google? Aren't they just trained search engines?
Related to that, here is my bigger gripe with all this AI hype, and with Llama in particular: what problem do they actually solve? If they aren't predictable in their answers, what good are they?
Sorry, I don't mean to sound retrograde; I just wish I could skip ahead 20 years and see what all this really boils down to.
Thanks for your time.