#PredictionMarkets Prediction markets sound democratic, but there's a new problem: AI has learned to fake public opinion. So where does that leave prediction markets?
After reading this article, I realized that market manipulation has been happening all along. From the 1916 presidential election, to the 2012 Romney betting incident, to the Trump price swings on Polymarket in 2024, people have always been quietly gaming these markets. The most outrageous part is that some manipulation was completely blatant, like the 1999 case in Germany where political parties directly emailed their members telling them to buy the party's contracts. Simply absurd.
But here's an interesting point: manipulating a market is actually difficult and expensive. Only markets with extremely low liquidity can be kept mispriced for long. High-liquidity markets? Arbitrageurs pull the price back within minutes, and the manipulator just ends up losing money.
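A rough way to see why liquidity is the whole story: in Hanson's LMSR market maker (a standard prediction-market design; Polymarket itself runs an order book, so this is only an illustrative model), the capital needed to push the price from p0 to p1 works out to b * ln((1-p0)/(1-p1)), where b is the liquidity parameter. The cost scales linearly with b, so a market 1,000x deeper costs the manipulator 1,000x more, and arbitrageurs who trade against the distorted price pocket that subsidy. A minimal sketch:

```python
import math

def lmsr_cost_to_move(b: float, p0: float, p1: float) -> float:
    """Capital needed to push an LMSR market's YES price from p0 to p1.

    b is the liquidity parameter (larger b = deeper market). Derived from
    Hanson's cost function C(q) = b * ln(exp(q_yes/b) + exp(q_no/b)),
    which simplifies to b * ln((1 - p0) / (1 - p1)) for a two-outcome market.
    """
    return b * math.log((1 - p0) / (1 - p1))

# Pushing the YES price from 50% to 70%:
thin = lmsr_cost_to_move(b=100, p0=0.5, p1=0.7)      # shallow market
deep = lmsr_cost_to_move(b=100_000, p0=0.5, p1=0.7)  # deep market

print(f"thin market: ${thin:,.0f}")   # roughly $51
print(f"deep market: ${deep:,.0f}")   # roughly $51,000
# Every dollar spent holding the price away from the crowd's estimate
# is a dollar arbitrageurs can collect by trading it back.
```

The specific dollar figures depend entirely on the assumed b values, but the linear scaling is the point: manipulation cost grows in direct proportion to market depth.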
The truly frightening part isn't how effective the manipulation itself is; it's the crisis of trust in public opinion. Even if manipulation barely moves any votes, once CNN reports every day that "this might be foreign interference," the public starts sliding into conspiracy thinking. Panic, accusations, collapsing trust: these effects may be far worse than the price swings themselves.
So media and platforms need to be more careful: media should stop casually reporting prices from low-liquidity markets as if they were news, platforms need to build monitoring systems, and policymakers must clearly define manipulation as illegal. Otherwise, in the AI era, polls become unreliable, and prediction markets could end up triggering a trust crisis instead of resolving one.