Predictions about the potential impacts of generative AI may be hugely overblown because of "many serious, unsolved problems" with the technology, according to Gary Marcus, one of the field's leading voices.
I wonder, could AI actually “collapse”? As in, once companies and people start leaving the AI hype space, could the external input become small enough that the AI-to-AI input takes over to such a degree that all trained models become essentially useless?
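The feedback loop being described here can be sketched with a toy simulation. This is not how real LLM training works; it just illustrates the statistical mechanism: when each "generation" of a model is trained only on a finite sample of the previous generation's output, rare tokens that miss the sample vanish permanently, so the distribution narrows over time.

```python
import random
from collections import Counter

random.seed(42)

# Generation 0: "real" data, uniform over 100 distinct tokens
vocab = list(range(100))
weights = [1.0] * 100

sizes = []
for generation in range(20):
    # "Training" = estimating token frequencies from a finite sample
    sample = random.choices(vocab, weights=weights, k=200)
    counts = Counter(sample)
    # The next model can only emit tokens it has actually seen;
    # rare tokens that missed this sample are gone for good
    vocab = list(counts)
    weights = [counts[t] for t in vocab]
    sizes.append(len(vocab))

print(sizes)  # distinct tokens per generation: shrinks, never recovers
```

Each generation's vocabulary is a subset of the previous one, so diversity only ever decreases. The fresh "external input" the comment mentions is exactly what breaks this loop in practice, by reintroducing the tail of the distribution.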
I find that unlikely. AI is a subject much like space tech. It may not always be the giant it is now, but it’s baseline research that countries will keep conducting, even if only as a means to defend themselves.
I’m sure challenges like this are coming, but I’d be surprised if they cause these applications to collapse completely. However, they may force the ML companies to pay some fees for the training data they use.
No, you could just stop training on new input and revert to a previous state if that happened.