Google employee John Mueller said that you should not use robots.txt to block parameterized URLs from indexing. According to him, this blocking method prevents Google from canonicalizing those URLs, which means "you will lose all value from links to these pages."
Instead of robots.txt, John Mueller recommends using rel=canonical, internal links that point to the preferred URLs, and similar signals.
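The rel=canonical approach Mueller describes is a link element placed in the head of the parameterized page, pointing to the preferred version. A minimal sketch, using hypothetical URLs for illustration:

```html
<!-- On the parameterized page, e.g. https://example.com/shoes?sort=price -->
<head>
  <!-- Tells Google which URL is the preferred (canonical) version,
       so link signals are consolidated onto it -->
  <link rel="canonical" href="https://example.com/shoes">
</head>
```

Because Google must be able to crawl the parameterized page to see this tag, the page must not be disallowed in robots.txt.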
He added that blocking parameterized URLs with robots.txt prevents Google from seeing their rel=canonical tags. The search engine treats URLs blocked by robots.txt like any other disallowed pages: Google cannot know what is on such a page, and may therefore index the URL without its content.
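The kind of robots.txt rule Mueller is warning against looks like the following (a hypothetical pattern, for illustration only):

```
# Disallowing every URL containing a query string means Googlebot
# never fetches these pages, so it never sees any rel=canonical
# on them -- the URL can still end up indexed, but without content.
User-agent: *
Disallow: /*?
```

Removing such a rule and relying on rel=canonical instead lets Google crawl the duplicates, understand them, and consolidate their link value onto the canonical URL.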