If your organization has been around for more than a year or two, your website probably holds a mountain of published content. Much of that content no longer helps your audience or your brand because it’s outdated or irrelevant.
But it doesn’t hurt anything, right?
Wrong. Old content can hurt both your search rankings and your visitors’ experience.
For example, published content might link to articles, research reports, or other useful web pages. Over time, pages move offline for one reason or another. Those links might now go to defunct pages. That hurts your search performance because Google frowns on content with broken links. And audiences don’t love them, either.
You also might have content about topics that don’t fit your current content or business strategy. You wouldn’t want those pages to turn up in search.
Many brands hesitate to delete content, but the results are often surprisingly positive. For instance, HubSpot deleted more than 3000 outdated content pieces. In a matter of months, the company saw an improvement in its SEO results.
HubSpot’s experience doesn’t mean you can or should delete all your old content. Much of what you’ve previously posted can be improved instead.
Luckily, there are many ways to refurbish older content, and a properly updated piece can serve you just as well as a new post. According to Search Engine Journal, deleting or refurbishing old content every so often can bring your site more page traffic and better SEO. Practiced consistently, these strategies can save your brand money in the long run.
If you’re going to take some digital scissors to inaccurate or outdated content, you’ll need to find it first. That means you’ll need to audit your content. Ideally, you’ll review every piece of content on your site (blog posts, tutorials, or interview posts) to see what works and what no longer provides authority boosting or informative benefits to your target audience. If the amount of content makes that task too daunting, start with one content category (blog posts, for example), then expand from there.
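One lightweight way to start the audit described above is to read your site’s XML sitemap and flag pages whose last-modified dates are old. This is a minimal sketch in Python: the sitemap excerpt, URLs, and cutoff date are all hypothetical, and a real audit would fetch your live `/sitemap.xml` and pair the results with analytics data.

```python
import xml.etree.ElementTree as ET
from datetime import datetime

# Hypothetical sitemap excerpt; a real audit would fetch your site's /sitemap.xml
SITEMAP = """<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/blog/old-post</loc><lastmod>2016-03-01</lastmod></url>
  <url><loc>https://example.com/blog/fresh-post</loc><lastmod>2022-05-01</lastmod></url>
</urlset>"""

NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def stale_urls(sitemap_xml, cutoff):
    """Return URLs whose <lastmod> date predates the cutoff."""
    root = ET.fromstring(sitemap_xml)
    stale = []
    for url in root.findall("sm:url", NS):
        loc = url.findtext("sm:loc", namespaces=NS)
        lastmod = url.findtext("sm:lastmod", namespaces=NS)
        if lastmod and datetime.strptime(lastmod, "%Y-%m-%d") < cutoff:
            stale.append(loc)
    return stale

print(stale_urls(SITEMAP, datetime(2020, 1, 1)))
# → ['https://example.com/blog/old-post']
```

A list like this only tells you what’s old, not what’s bad; each flagged URL still needs the human review described above before you delete, redirect, or refresh it.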
Delete or redirect content that’s:
No longer relevant because of changing industry trends or changes in your business
Not of use to your current target audience
The content you decide to keep may benefit from the SEO boosting techniques below.
Update and republish good but old content
What should you do with the old blog posts and other accurate, worthwhile content pieces you want to keep? Google loves recently published content, so republish them.
But don’t repost them on your site as is – that’s a fast way to earn a duplicate-content penalty from Google’s algorithm. First, update and improve each piece.
Then republish your quality content with the current date. Databox, a popular dashboard tool for businesses, saw a 75% increase in website traffic after updating more than 20 old blog posts.
Look for the following opportunities as you work on updating older content.
Delete references to outdated stats in the content you’re keeping
Comb through reasonably good and accurate posts. Edit out any inaccurate data points or outdated references they may contain.
Say you have an excellent blog post for your small business on how to use a specific product. The article remains relevant but includes data points from five years ago. Do your brand a favor by removing those sentences.
Doing so prevents your content from suffering due to broken links. It also prevents your target audience from reading outdated information, then coming to incorrect conclusions about your brand’s authority or authenticity.
After removing those outdated references and data points, boost the SEO value of previously published pages by replacing them with current ones.
Hunt online for modern, accurate, up-to-date references to replace each removed link. This ensures your links work and point to authoritative sources.
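Checking every outbound link by hand gets tedious fast, so this kind of review is often scripted. Below is a minimal sketch: it pulls absolute links out of a post’s HTML with the standard-library parser, and includes a simple reachability check. The sample HTML and URL are illustrative, and a real link audit would respect rate limits and treat redirects more carefully.

```python
from html.parser import HTMLParser
import urllib.request

class LinkExtractor(HTMLParser):
    """Collect outbound (absolute) link targets from a post's HTML."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href") or ""
            if href.startswith("http"):
                self.links.append(href)

def outbound_links(html):
    parser = LinkExtractor()
    parser.feed(html)
    return parser.links

def looks_broken(url, timeout=5):
    # A 4xx/5xx response or any network error flags the link for review.
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return resp.status >= 400
    except Exception:
        return True

sample = '<p>See <a href="https://example.com/report">the report</a> for details.</p>'
print(outbound_links(sample))
# → ['https://example.com/report']
```

Running `looks_broken` over each extracted link gives you a shortlist of references to replace; the judgment about which replacement source is authoritative still belongs to the editor.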
Remember to review and refresh evergreen posts, too
Similarly, read through evergreen posts for references that aren’t accurate or compelling any longer. Consider the current industry, its trends, and what your target audience is searching for.
Then put on your editor’s hat and get to work. Trim the fat from old content and add sentences with new insights, all without touching the bones of the piece. You’ll spend less time than you would creating new content, while still improving your brand’s SEO.
Weave in new SEO keywords and links
Lastly, you can update old content and improve its search engine optimization value by weaving in freshly researched SEO keywords and links.
If you’re still posting new content regularly as part of your content strategy, you can take the keyword research you’ve already done and use it to spruce up old content. Swap out or add new keywords to old posts, and consider placing a few fresh, high-authority links within old blogs to make them more relevant and more authoritative than before.
Cleaning up old content and making it relevant to your target audience once again takes work. But getting rid of content that no longer works and updating what does can do wonders for your content marketing strategy.
You may improve organic traffic to your site without investing in fresh content, and you’ll better serve your audience’s needs. Try these strategies for yourself and see the results.
Over my time in search, I’ve changed my perspective on certain ranking factors. For instance, after coming to Go Fish Digital and working on internal linking initiatives, I came to appreciate the power of internal links. By implementing internal links at scale, we were able to see consistent success.
Freshness is another one of these factors. After working with a news organization and testing the learnings gained from that work on other sites, I started to see the immense power that content refreshes could produce. As a result, I think the entire SEO community has underrated this concept for quite some time. Let’s dig into why.
Reviewing news sites
This all started when we began to work with a large news publisher who was having trouble getting in Google’s Top Stories for highly competitive keywords. They were consistently finding that their content wasn’t able to get inclusion in this feature, and wanted to know why.
Inclusion in “Top stories”
We began to perform a lot of research around news outlets that seemed quite adept at getting included in Top Stories. This immediately turned our attention to CNN, the site that is by far the most skilled in acquiring coveted Top Stories positions.
By diving into their strategies, one consistent trend we noticed was that they would always create a brand new URL the day they wanted to be included in the Top Stories carousel:
As an example, here you can see that they create a unique URL for their rolling coverage of the Russia-Ukraine war. Since they know that Google will show Top Stories results daily for queries around this, they create brand new URLs every single day:
This flies in the face of traditional SEO advice, which says site owners should keep URLs consistent so that link equity isn’t diluted and keywords aren’t cannibalized. But to be eligible for Top Stories, Google needs to index a “fresh” URL before the content can qualify.
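The daily-URL pattern is simple to automate: stamp the publication date into each day’s slug so every day’s coverage lives at a brand-new, indexable path. This is a minimal sketch; the path format below is illustrative, not CNN’s exact scheme.

```python
from datetime import date

def daily_story_url(base_slug, day):
    """Build a fresh, dated URL for each day's rolling coverage.
    The /live-news/ path pattern here is an assumption for illustration."""
    return f"/live-news/{base_slug}-{day.strftime('%m-%d-%y')}/"

print(daily_story_url("russia-ukraine-war-news", date(2022, 5, 25)))
# → /live-news/russia-ukraine-war-news-05-25-22/
```

A CMS scheduled job would create the new page at this URL each morning and cross-link it to the previous days’ coverage so older pages retain their equity.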
After we started implementing the strategy of creating unique URLs every day, we saw much more consistent inclusion for this news outlet in Top Stories for their primary keywords.
However, the next question we wanted to address was not just how to get included in this feature, but also how to maintain strong ranking positions once there.
Ranking in “Top stories”
The next element that we looked at was how frequently competitors were updating their stories once in the Top Stories carousel, and were surprised at how frequently top news outlets refresh their content.
We found that competitors were aggressively updating their timestamps. For one query, when reviewing three articles over a four-hour period, we found the average time between updates for major outlets:
USA Today: Every 8 minutes
New York Times: Every 27 minutes
CNN: Every 28 minutes
For this particular query, USA Today was literally updating their page every 8 minutes and maintaining the #1 ranking position for Top Stories. Clearly, they were putting a lot of effort into the freshness of their content.
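The measurement itself is straightforward: record each time a competitor’s visible timestamp changes, then average the gaps. A minimal sketch, with made-up observation times:

```python
from datetime import datetime

def avg_update_interval(timestamps):
    """Average minutes between consecutive observed article updates."""
    ts = sorted(datetime.fromisoformat(t) for t in timestamps)
    gaps = [(b - a).total_seconds() / 60 for a, b in zip(ts, ts[1:])]
    return sum(gaps) / len(gaps)

# Hypothetical observations of one article's timestamp changes
print(avg_update_interval([
    "2022-05-25T12:00", "2022-05-25T12:08", "2022-05-25T12:16",
]))
# → 8.0
```

Running this across several competing articles over a few hours produces cadence numbers like the ones above, which you can use to benchmark how aggressively your vertical refreshes content.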
But what about the rest of us?
Of course, it’s obvious how this would apply to news sites. There is certainly no other vertical where the concept of “freshness” is going to carry more weight to the algorithm. However, this got us thinking about how valuable this concept would be to the broader web. Are other sites doing this, and would it be possible to see SEO success by updating content more frequently?
Fortunately, we were able to perform even more research in this area. Our news client also had many non-news specific sections of their site. These sections contain more “evergreen” articles where more traditional SEO norms and rules should apply. One section of their site contains more “reviews” type of content, where they find the best products for a given category.
When reviewing articles for these topics, we also noticed patterns around freshness. In general, high ranking articles in competitive product areas (electronics, bedding, appliances) would aggressively update their timestamps on a monthly (sometimes weekly) cadence.
For example, as of the date of this writing (May 25th, 2022), I can see that all of the top three articles for “best mattress” have been updated within the last 7 days.
Looking at the term “best robot vacuum”, it looks like all of the articles have been updated in the last month (as of May 2022):
Even though these articles are more “evergreen” and not tied to the news cycle, it’s obvious that these sites are placing a high emphasis on freshness with frequent article updates. This indicated to us that there might be more benefits to freshness than just news story results.
Performing a test
We decided to start testing the concept of freshness on our own blog to see what the impact of these updates could be. We had an article on automotive SEO that used to perform quite well for “automotive seo” queries. However, in recent years, this page lost a lot of organic traffic:
The article still contained evergreen information, but it hadn’t been updated since 2016:
It was the perfect candidate for our test. To perform this test, we made only three changes to the article:
Updated the content to ensure it was all current. This changed less than 5% of the text.
Added “2022” to the title tag.
Updated the timestamp.
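The second of those changes can be scripted. Below is a hypothetical helper that appends a year to a page’s `<title>` tag if it isn’t already there; the timestamp update would happen in the CMS alongside it, and the sample title is illustrative.

```python
import re

def add_year_to_title(html, year="2022"):
    """Append the year to the <title> tag if it's missing.
    Hypothetical helper; a real site would edit the template or CMS field."""
    return re.sub(
        r"<title>([^<]*)</title>",
        lambda m: m.group(0) if year in m.group(1)
        else f"<title>{m.group(1)} ({year})</title>",
        html,
        count=1,
    )

print(add_year_to_title("<title>Automotive SEO: The Complete Guide</title>"))
# → <title>Automotive SEO: The Complete Guide (2022)</title>
```

The function is idempotent: running it on a title that already contains the year leaves the page unchanged, so it’s safe to re-run each year’s refresh pass.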
Immediately, we saw rankings improve for the keyword “automotive seo”. We moved from ranking on the third page to the first page the day after we updated the content:
To verify these results, we tested this concept on another page. For this next article, we only updated the timestamp and title tag with no changes to the on-page content. While we normally wouldn’t recommend doing this, this was the only way we could isolate whether “freshness” was the driving change, and not the content adjustments.
However, after making these two updates, we could clearly see an immediate improvement to the traffic of the second page:
These two experiments combined with other tests we’ve performed are showing us that Google places value on the recency of content. This value extends beyond just articles tied to the news cycle.
Why does Google care?
Thinking about this more holistically, Google’s use of freshness makes sense in light of its E-A-T initiatives. The whole concept of E-A-T is that Google wants to rank content it can trust (written by experts, citing facts) above other search results. Google has a borderline public responsibility to ensure that the content it serves is accurate, so it’s in the search giant’s best interest to surface content it thinks it can trust.
So how does freshness play into this? Well, if Google thinks content is outdated, how is it supposed to trust that the information is accurate? If the search engine sees that your article hasn’t been updated in five years while competitors have more recent content, that might be a signal that their content is more trustworthy than yours.
For example, for the term “best camera phones”, would you want to read an article last updated two years ago? For that matter, would you even want an article last updated six months ago?
As we can see, Google is only ranking pages that have been updated within the last one or two months. That’s because the technology changes so rapidly in this space that, unless you’re updating your articles every couple of months or so, you’re dramatically behind the curve.
The concept of freshness also makes sense from a competitive perspective. One of the biggest weaknesses of an indexation engine is that it’s inherently hard to serve real-time results. To find when content changes, a search engine needs time to recrawl and reindex content. When combined with the demands of crawling the web at scale, this becomes extremely difficult.
On the other hand, social media sites like Twitter don’t have this issue and are made to serve real-time content. The platform isn’t tasked with indexing results, and engagement metrics can help quickly surface content that’s gaining traction. As a result, Twitter does a much better job of surfacing trending content.
Thinking about the web from a platform-based perspective, it makes sense that most users would choose Twitter over Google when looking for real-time information. That poses a big threat to Google: it gives users a reason to leave the ecosystem, presenting fewer opportunities to serve ads.
Recently in Top Stories, you’ll see a lot more “live blog posts”. These articles use LiveBlogPosting structured data, which signals to Google that the content is being updated in real time. While looking for real-time URLs across the entire web is daunting, this structured data type helps Google narrow in on the content it needs to crawl and index more frequently.
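To make the markup concrete, here is a sketch that assembles LiveBlogPosting JSON-LD. The property names (`coverageStartTime`, `liveBlogUpdate`) follow schema.org’s LiveBlogPosting type; the headline and timestamps are illustrative, and a real implementation would append a new update object to the array each time the story changes.

```python
import json

def live_blog_jsonld(headline, coverage_start, updates):
    """Assemble LiveBlogPosting structured data.
    Property names follow schema.org; values here are illustrative."""
    return {
        "@context": "https://schema.org",
        "@type": "LiveBlogPosting",
        "headline": headline,
        "coverageStartTime": coverage_start,
        "liveBlogUpdate": [
            {"@type": "BlogPosting", "headline": h, "datePublished": t}
            for h, t in updates
        ],
    }

doc = live_blog_jsonld(
    "Live updates: rolling coverage",
    "2022-05-25T06:00:00Z",
    [("Morning developments", "2022-05-25T06:05:00Z")],
)
print(json.dumps(doc, indent=2))
```

The serialized object would be embedded in the page inside a `<script type="application/ld+json">` tag, which is how Google discovers it at crawl time.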
Google seems to be aggressively pushing these live blogs in Top Stories as they often see strong visibility in Top Stories results:
This might be a strategic move to encourage publishers to create real-time content. The goal here could be increased adoption of content that’s updated in real-time with the end result of showcasing to users that they can get this type of content on Google, not just Twitter.
Utilizing these concepts moving forward
I think as an industry, there’s sometimes room for us to be more creative when thinking about our on-page optimizations. When looking at pages that have lost traffic and rankings over time, consider whether the content has also gone stale, and take freshness into account. Through testing and experimentation, you can see whether refreshing your content has a noticeable positive impact on rankings.
Build-A-Bear is remaking itself for the 25th anniversary of its founding this year. This means using its experience and its data to appeal to older customers and create stronger online connections.
“The goal that was stated for us was to diversify our brand, evolve our retail portfolio and build stronger relationships with our consumers,” said Ed Poppe, Build-A-Bear’s vice president of loyalty and performance marketing, in a presentation at The MarTech Conference.
That’s why they launched HeartBox, an e-commerce play which the company says will let it move into “the adult-to-adult gift-giving and gift box market which has been meaningfully expanding over the past few years.” This goes along with its new Bear Cave line of “adult” bears (in this case adult means they have alcohol in hand). The brand has also expanded through partnerships with film, entertainment and streaming TV properties like Harry Potter, Pokémon, The Matrix and the Marvel series WandaVision.
These efforts are designed to give more options to customers who buy online, and increase options for engagement. This has required integrating new teams and new sources of data.
Connecting customer data and teams
“Over half of businesses now say that they expect the majority of their revenue to come from digital channels,” said Loretta Shen, senior director, product marketing, marketing cloud intelligence for Salesforce. “To meet changing consumer behavior, marketers are adopting digital channels like video, social media and digital ads across search and paid media. But it’s not just adopting these channels, but how you use them, and in particular how you use them in tandem.”
Build-A-Bear adapted to customers’ increased digital use by adding new digital experiences while also reorganizing customer data to better understand what customers want.
“We have to understand our guests at Build-A-Bear,” said Bryce Ahrens, Build-A-Bear’s senior analyst, CRM, loyalty and performance marketing. “How do they engage with our email, our websites, our advertising and, of course, how do they engage and experience our in-store environment?”
They keep a large CRM database made up of loyalty program members, website customers, retail customers and sales prospects. Additionally, through access to the CRM, the organization is pulling together different teams: web development, analytics, marketing and also data privacy people.
These teams have to remain connected because data is coming through different systems. Build-A-Bear has a first-party data warehouse, a commerce cloud storefront, an order management system, marketing cloud, an email platform and different analytics solutions, not to mention ad platforms for campaigns.
“We need to be able to bring this information together, prioritize what we look at, and identify strategies to move quickly,” said Ahrens.
Data and digital experience come together in an ongoing Build-A-Bear effort called “Count Your Candles.”
The promotion is a special offer in which customers order a bear regularly priced at $14 but pay a dollar amount that matches their age.
The dedicated webpage for this promotion also allows customers and gift-givers to buy gift cards and become loyalty members. Additionally, there are a number of other ways that customers can celebrate birthdays, including in-store birthday parties and special birthday gift boxes that can be ordered and delivered.
These strategies came from marketers looking at the data and seeing what sparked their customers’ interests. In this case, it was birthdays.
“We’re lucky to have a team up here who wants to jump in and help drive our business forward,” said Poppe. “But it also brings us back to where it’s important to aggregate data, identify patterns, see your opportunities, and pick your path forward.”
Chris Wood draws on over 15 years of reporting experience as a B2B editor and journalist.
You’ve started a website for your local business, but with so much competition out there, you may be struggling to make your website more visible online. That lack of visibility could hinder potential customers from finding your company. To improve your visibility in search engine results, local business schema could be the tool you need.