One limitation of the current state of artificial intelligence is that most applications still require human intelligence to update and modify them, or in technical lingo, to ‘train’ them.
This article identifies a significant problem: “The people paid to train AI are outsourcing their work… to AI” (MIT Technology Review).
Most of this human labor to train AI is done by ‘gig’ workers who use it as a ‘side hustle’ to augment their ‘day jobs.’ The article states, “The workers are poorly paid and are often expected to complete lots of tasks very quickly.”
So guess what happens next? The workers are turning to ChatGPT to help them get their work done, reportedly for as much as 33% to 46% of the training tasks.
So what’s the problem with that?
Another quote from the MIT article:
“Using AI-generated data to train AI could introduce further errors into already error-prone models. Large language models regularly present false information as fact. If they generate incorrect output that is itself used to train other AI models, the errors can be absorbed by those models and amplified over time…”
In other words, the large datasets used to train AI contain errors, which humans are paid to intervene and correct. But when those humans turn to other AI for help, which also makes errors, the errors compound!
So instead of AI getting smarter, which is what the hype promises, this is making it dumber.
Maybe this quote regarding AI in driverless trucks is prophetic.
“Just when you think this technology is almost here,” said Tom Schmitt, the chief executive of Forward Air, a trucking company that just started a test with Kodiak’s self-driving trucks, “it is still five years away.” (NY Times – October 2022)