It’s certainly becoming easier, but I don’t like it.
We have a cool AI that’s a big problem solver, but its outputs are complex. We’ve attached a GPT onto it purely to act as a kind of translator or summariser, to save time trying to understand what the AI’s done and why. It’s great. But we definitely don’t see the GPT as offering any sort of intelligence; it’s just a reference-based algorithmic protocol bolted onto an actual AI. Protocols are, after all, a set of rules or processes to follow. The GPT isn’t offering any logic, reasoning, planning, etc., which are still the conditions of intelligence in computer science. But it certainly can give off the impression of intelligence, as it’s literally designed to impersonate it.