They're ranking better now. We're getting better performance. So we say, "You know what? We've learned this lesson. You should remove this really low-quality text from the bottom of your category pages." But then we tried it on another site, and we saw a downside, a small one admittedly, because the text was actually helping on those specific pages.
So I think it's just telling us that we need to test these recommendations every time. We need to build testing into our core practices, and I think this trend is only going to grow and continue, because the more complex the ranking algorithms get, the more machine learning is involved and the less deterministic it is compared to how it used to be. And the more competitive the markets are, the smaller the differences between you and your competitors will be, and the greater the opportunity for something that works in one place to be negative in another place.
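To make that idea concrete, here is a minimal, hypothetical Python sketch of what an SEO split test can look like: randomly bucket a set of similar pages into control and variant groups, apply the change only to the variant group, and compare the change in organic sessions between the two groups. The page URLs, session numbers, and the `assign_buckets` and `evaluate` helpers are illustrative assumptions, not any specific testing platform's implementation.

```python
import random
import statistics

def assign_buckets(pages, seed=42):
    """Randomly split similar pages (e.g. category pages) into control and variant groups."""
    rng = random.Random(seed)
    shuffled = pages[:]
    rng.shuffle(shuffled)
    half = len(shuffled) // 2
    return shuffled[:half], shuffled[half:]

def evaluate(sessions_before, sessions_after, control, variant):
    """Compare the average change in organic sessions between the two groups.

    sessions_before / sessions_after map page URL -> organic sessions for
    comparable periods before and after the change was applied to the variant group.
    """
    def mean_delta(group):
        return statistics.mean(sessions_after[p] - sessions_before[p] for p in group)

    # Measuring the variant's change relative to the control's change helps
    # account for seasonality and site-wide trends that affect both groups.
    return mean_delta(variant) - mean_delta(control)

# Hypothetical example data: organic sessions per category page.
pages = [f"/category/{i}" for i in range(10)]
before = {p: 1000 + 50 * i for i, p in enumerate(pages)}
after = {p: before[p] + random.randint(-40, 80) for p in pages}

control, variant = assign_buckets(pages)
print(f"Estimated uplift from the change: {evaluate(before, after, control, variant):+.1f} sessions per page")
```

A real program would use far more pages and a proper significance test, and, as the results above show, it would need to be re-run on each new site rather than assuming one site's uplift carries over.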
So I hope I’ve inspired you to check out some SEO A/B testing. We’re going to link to some resources that explain how you do it, how you can do it yourself, and how you can build a program around it, as well as some of our other case studies and lessons we’ve learned. But I hope you’ve enjoyed this journey into the surprising results of SEO A/B testing.