My first encounter with ChatGPT

This morning, I created an account with OpenAI, the company behind ChatGPT. After logging in, I thought I would give it the acid test, so I entered the question, “How do I turn off a note using GPScript?” Its response was:

So, right away, I thought ChatGPT had copied this code directly from the manual, but I was wrong, because I then asked, “Can you please explain `const NOTE_OFF_STATUS = 0x80;`?” and was immediately greeted with the following answer:

At this point, I am overwhelmed and, quite frankly, a little frightened! LOL! Is ChatGPT trying to replace my friends in this community who help me so much? :slight_smile:
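For readers wondering about that constant: in the MIDI protocol, a note-off message is three bytes, beginning with a status byte of 0x80 plus the zero-based channel number, followed by the note number and a release velocity. Below is a minimal sketch in plain JavaScript (the language ChatGPT actually answered in, not GPScript); the `makeNoteOff` helper and its layout are illustrative assumptions, not anything from ChatGPT’s answer or the Gig Performer manual.

```javascript
// Illustrative only: plain JavaScript, not GPScript.
// In raw MIDI, a note-off message is three bytes:
// a status byte (0x80 plus the zero-based channel), the note number, and a release velocity.
const NOTE_OFF_STATUS = 0x80; // note-off status byte for MIDI channel 1

// Hypothetical helper that builds the raw bytes for a note-off message.
function makeNoteOff(channel, note, velocity = 0) {
  return [NOTE_OFF_STATUS | (channel & 0x0f), note & 0x7f, velocity & 0x7f];
}

console.log(makeNoteOff(0, 60)); // [ 128, 60, 0 ] turns off middle C on channel 1
```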

It didn’t give you GPScript — it produced JavaScript :stuck_out_tongue_winking_eye:

Reminds me of the horrible launch of Apple’s Maps app in 2012.

==========

Apple’s troubled Maps app could be dangerous in the wrong circumstances, Australian police have warned, after rescuing several people who were directed into the outback by mistake.

Cops in Victoria have had to fetch tourists from the huge Murray Sunset National Park, after their iPhones sent them more than 40 miles out of their way. Apple Maps plotted the entire town of Mildura in the wrong place.

“Police are extremely concerned as there is no water supply within the Park and temperatures can reach as high as 46 degrees, making this a potentially life threatening issue,” said Acting Senior Sergeant Sharon Darcy in a statement.

“Some of the motorists located by police have been stranded for up to 24 hours without food or water and have walked long distances through dangerous terrain to get phone reception.”

Apple launched its own Maps app with iOS 6 earlier this year, replacing Google’s mapping service. It was immediately clear that Apple’s effort wasn’t up to scratch, with users around the world spotting egregious errors. In the UK, Luton was plotted in Devon, and Heathrow Airport was located in Hyde Park, among many other howlers.

:rofl: … but it understood my question and gave me a good answer. Danger, Will Robinson!

It does not understand anything. It recognizes patterns in context, drawing on a huge repository of accumulated text, and it applies grammar rules that let it generate reasonable, syntactically correct sentences.

There is absolutely no guarantee that the response will be correct, and there are lots of examples where it is brilliantly wrong; a famous one had it conclude that nine women can produce a baby in one month. (That answer has probably been fixed by now.)

Does anyone remember that old language translation system where the phrase “Out of sight, out of mind” was fed through an English-Russian-English round trip and came back as “invisible insanity”?

It’s worth reading this article and being careful not to read too much into what these systems are doing.

Plot twist :joy:

I asked it about myself. It seems that I’m a very well-known gigging musician :upside_down_face:
