What advancements? All LLMs use pretty much the same architecture, and better models aren't better because they have better tech; they're just bigger (and slower, with much higher energy consumption).
The quality and amount of training done from model to model can vary substantially.
Proving my point: the training set can be improved (until it's irreversibly tainted with LLM-generated data); the tech cannot. Even with a huge dataset, LLMs will still have today's limitations.