Topic: ChatGPT
Percy Member Posts: 22940 From: New Hampshire Joined: Member Rating: 6.7
I'm finding this all hilarious, whether original or not.
--Percy
Percy Member Posts: 22940 From: New Hampshire Joined: Member Rating: 6.7
I gave ChatGPT this problem:
The Vicar Problem:

It calculated that the vicar is 126 years old, obviously wrong. I asked it to find the error and it came up with the same wrong answer. I told it the answer and it came up with a different wrong answer. I asked it to try again and it got the right answer, but I think it fudged the math because it got the wrong ages for the congregants. I have to leave in 5 minutes and can't check where it went wrong right now.

--Percy
AZPaul3 Member Posts: 8654 From: Phoenix Joined: Member Rating: 6.5
Rocket salad. Very good. Seems ChatGPT can somewhat simulate a sense of humor. Rocket salad is a real actual physical thing btw.
Just a moment...

Nip it. Responses are not unique. Most don't even qualify as malaprops. I give an E for effort though. It is trying to satisfy but a malaprop in reality is not a planned or requested item. It is (usually) a spontaneous misunderstanding and confusion of syntax and/or pronunciation.

ChatGPT doesn't know from syntax and pronunciation. It doesn't know from confusion. All it knows is what word its algorithms register as most probably should appear next. It will not confuse "ovation" with "ovulation" or "Sistine" with "sixteen". If the request to cite examples of malaprops is given, that's all it can do. If the request is to create original malaprops ... it is not capable of making these errors. It's algorithms select the next word based on word probability within context and is incapable of this kind of inadvertent spontaneous syntax error.

Putting two unrelated words together is not a malaprop. Such must be directed or programmed to occur but why, other than a discussion like this, would anyone want this thing to be programmed to make inadvertent spontaneous syntax errors? It already makes enough math errors.

Stop Tzar Vladimir the Condemned!
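A minimal sketch of the "pick the most probable next word" behavior described just above, assuming a toy hand-built vocabulary and invented probabilities; nothing here is OpenAI's actual model or code, it only illustrates the point:

    # Toy next-word picker illustrating "select the next word based on
    # word probability within context". The contexts and probabilities
    # below are invented for this example; a real model scores its whole
    # vocabulary with a neural network, not a lookup table.
    next_word_probs = {
        ("standing",):      {"ovation": 0.92, "room": 0.07, "ovulation": 0.0001},
        ("the", "Sistine"): {"Chapel": 0.97, "ceiling": 0.02, "sixteen": 0.0002},
    }

    def pick_next(context):
        """Greedy choice: always take the highest-probability continuation."""
        candidates = next_word_probs[tuple(context)]
        return max(candidates, key=candidates.get)

    print(pick_next(["standing"]))        # prints "ovation", never "ovulation"
    print(pick_next(["the", "Sistine"]))  # prints "Chapel", never "sixteen"

Which is the point: a spontaneous "ovation"/"ovulation" slip would require the low-probability word to win, and this kind of selection never lets it.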
Percy Member Posts: 22940 From: New Hampshire Joined: Member Rating: 6.7
I just asked ChatGPT to generate some original spoonerisms. It did, but they were terrible. It's not a shining wit.
--Percy
dwise1 Member Posts: 6076 Joined: Member Rating: 6.9
quote: It's algorithms select the next word based on word probability within context ...

How does it do with grammatical errors that humans commonly commit? Like the one that you just committed: "It's" ("it is") instead of "Its" (possessive pronoun). Or confusing "there", "they're", "their", or "your" and "you're". Or writing "with you and I" instead of "with you and me". Like one of the ways to spot the non-native speaker of a language: their use of the language is much better than that of a native speaker.

On that last, my excuse is that I was highly alienated in jr. and sr. high school, so I learned English, my native language, mainly through reading. And learning German, which teaches much more English grammar in two years than 12 years of English classes ever could. I've been accused of run-on sentences, but that's because of German: a sentence is supposed to contain a complete thought, and German packs a lot more thoughts into a sentence than English does. I'd love to discuss German's "extended adjective" (das ausgedehntes Eigenschaftswort) some time; e.g., "the originally-having-come-from-Berlin man".
AZPaul3 Member Posts: 8654 From: Phoenix Joined: Member Rating: 6.5
We will have to ask Percy, but I don't think there is a programmatic equivalent to THC that would allow my kind of error to appear in ChatGPT's work. Psychedelics of the AI kind are possible, as evidenced by its poem for Tangle, but grammatical errors of the AZPaul3 kind are beyond its structure to produce even under the influence, whereas I succumb readily.
Stop Tzar Vladimir the Condemned!
AZPaul3 Member Posts: 8654 From: Phoenix Joined: Member Rating: 6.5
quote: I asked it to find the error and it came up with the same wrong answer.

From my reading, even though it exists as thousands of computers, ChatGPT is strictly a language algorithm and does not have access to an ALU. I'm thinking it is so blind to anything but the result of the algorithm that ChatGPT cannot recognize there is a math component to the issue. It doesn't know from issue. It doesn't know from math? Numbers are just weighted words in the algorithm?

This is a glaring hole and you have to know OpenAI is working to resolve this. Must be super difficult in their present architecture or it would have been resolved. Max Tegmark cannot be happy since his math, which is the creator of all structure in the universe, is reduced to a verbose prose. And a broken one at that.

Stop Tzar Vladimir the Condemned!
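To make "numbers are just weighted words" concrete, here is a toy contrast, with invented scores, between exact arithmetic and plausibility-scored text completion; it illustrates the claim in the post above, not OpenAI's actual architecture:

    # Exact arithmetic, the kind an ALU or calculator performs.
    def alu_add(a, b):
        return a + b

    # A pure next-token predictor instead rates candidate continuations of
    # the text "67 + 59 = " by how plausible they look. The scores below are
    # made up; the point is that nothing in this procedure adds digits.
    continuation_scores = {"126": 0.41, "116": 0.33, "125": 0.15, "127": 0.11}

    def lm_complete(prompt):
        return max(continuation_scores, key=continuation_scores.get)

    print(alu_add(67, 59))            # 126, always correct
    print(lm_complete("67 + 59 = "))  # "126" only because that string happens
                                      # to score highest in this toy table

If "116" had happened to score highest, the second routine would answer "116" just as confidently, which looks a lot like the failure mode Percy described with the vicar's age.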
Percy Member Posts: 22940 From: New Hampshire Joined: Member Rating: 6.7
I asked ChatGPT to "compose a sentence about table manners that appears to be written by someone who is semi-literate." It replied:
ChatGPT:

This contains an inconsistency, first implying the person didn't know the word for "fork" when it used the term "them stick thingies" and then a few words later using the word "fork".

--Percy
AZPaul3 Member Posts: 8654 From: Phoenix Joined: Member Rating: 6.5
Percy, I think the stick things are chopsticks, the most common eating utensil in the world.
Stop Tzar Vladimir the Condemned!
AZPaul3 Member Posts: 8654 From: Phoenix Joined: Member Rating: 6.5
A follow-up on the math issues.
quote: Just a moment...

And what is this "Just a moment..." instead of the article title or url?

Stop Tzar Vladimir the Condemned!
nwr Member Posts: 6484 From: Geneva, Illinois Joined: Member Rating: 8.4
Yes, it seemed pretty obvious that "stick thingies" was referring to chopsticks. I'm not sure how Percy missed that.
Fundamentalism - the anti-American, anti-Christian branch of American Christianity
Percy Member Posts: 22940 From: New Hampshire Joined: Member Rating: 6.7
I considered chopsticks, but when I rephrased in my mind I ended up with this:
quote:

Which makes no sense. By sentence structure "stick thingies" and "forks" have to be synonyms unless you assume ChatGPT made the person not just semi-literate but also dumb and grammatically inconsistent. But I see that other people are interpreting the sentence differently. I'm the only one who assumed that "ain't supposed to" applied both to talking with your mouth full and to using stick thingies for noodles. But if it only applies to talking with your mouth full and not to the part about stick thingies then chopsticks make sense.

--Percy
nwr Member Posts: 6484 From: Geneva, Illinois Joined: Member Rating: 8.4
You did ask for something by a semi-literate person, so you shouldn't be expecting strict grammar and logic.
Fundamentalism - the anti-American, anti-Christian branch of American Christianity
AZPaul3 Member Posts: 8654 From: Phoenix Joined: Member Rating: 6.5
Percy is not used to speaking dumb hillbilly. He didn't recognise it. You and I, on the other hand ...
Stop Tzar Vladimir the Condemned!
Percy Member Posts: 22940 From: New Hampshire Joined: Member Rating: 6.7
It's been a lot more than once that I've misparsed something.
--Percy