Quite a number of IT-literate people have argued that LLM development has realistically already peaked, because these models absorb, synthesize, and plagiarize human-created content. Since most existing human content has already been harvested, and new content accumulates slowly, many feel that LLMs won't advance much further from here. There are also studies suggesting that heavy LLM use erodes users' critical thinking skills, so the rate at which new human work becomes available to plagiarize may actually decline.
Your thoughts on the matter?