I can say this with some level of expertise: I have been a ham since 1986, and beyond that I am extensively familiar with radio electronics and the use of antennas.
But those antennas were not so bad after all! Sure, they're not ideal! But semiconductor and RF (radio-frequency) technologies have gotten really, really good in the last 35 years, since Magellan came out with their SINGLE RECEIVER GPS devices, back when receiving four satellites at once was a challenge.
On the RF side, exotic materials like gallium arsenide were already in use back around 1990, but RF technology has improved dramatically since then.
In the past, I have pointed out that a smartphone's potential performance could be checked by first capturing the signal with a good antenna, amplifying it, and then exposing the smartphone to that signal while varying its level. I believe this would essentially eliminate the arguably-bad-antenna problem, at least for purposes of analyzing the potential performance.
I get 34 (or more?) satellites USED with my dual-frequency-capable OnePlus Nord2 5G under a mostly open sky, and an estimated accuracy of "1.2 meters" as reported by GPSTest.
Radio communication, and especially digital communication, has been described as a cliff: if the signal degrades somewhat, you can "fall off the cliff".
But if you are far away from that cliff, things can look much better. I think the industry simply has not tried to push smartphone technology to where it could lead us.
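To make the "cliff" concrete, here is a minimal sketch (my own illustration, not from the comment above) using the textbook bit-error-rate formula for BPSK over an AWGN channel; the specific modulation is an assumption, chosen only because its closed-form BER shows how steeply digital links degrade once the signal-to-noise ratio drops a few dB:

```python
import math

def bpsk_ber(ebn0_db: float) -> float:
    """Theoretical BPSK bit-error rate over AWGN for a given Eb/N0 in dB.

    Standard result: BER = 0.5 * erfc(sqrt(Eb/N0)), with Eb/N0 linear.
    """
    ebn0_linear = 10.0 ** (ebn0_db / 10.0)
    return 0.5 * math.erfc(math.sqrt(ebn0_linear))

# BER falls off extremely steeply with SNR: losing a few dB of signal
# can take a link from essentially error-free to unusable.
for snr_db in (10, 8, 6, 4, 2, 0):
    print(f"Eb/N0 = {snr_db:2d} dB -> BER ~ {bpsk_ber(snr_db):.2e}")
```

Running this shows BER climbing by several orders of magnitude as Eb/N0 drops from 10 dB toward 0 dB, which is the cliff: well above it, errors are vanishingly rare; a few dB lower, the link collapses.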