Sunday, October 2

Google wants to continue curbing clickbait | Digital Trends Spanish

Because Google knows that everyone hates clickbait, the company will soon take steps to address the problem in Google Search results. Starting next week for searches in English, Google will aim to lower the ranking of offending websites while rewarding those that create high-quality, original content.

Clickbait is often seen in ads that make bold or even outrageous claims in the hope that you’ll be intrigued enough to click on the ad so you can learn more. Search results can also be misleading and inspire a click based on an interesting title and snippet.

Of course, finding that information, or anything else relevant to your search, on the clickbait page may be impossible. As you scroll down the page, you’ll pass several more ads, giving the unscrupulous seller and web developer exactly what they wanted. This trick, a staple of black-hat SEO, is a massive waste of time and incredibly frustrating to fall for.

Improving search results to surface more useful content sounds great, but achieving it isn’t easy, and Google has been continually refining its search engine since it first launched in 1997. After more than two decades, Google Search intends to reach the next level and return even more accurate and valuable results.

Websites that merely collect results from others, for example movie reviews from multiple sources, without adding anything new will be ranked lower. Instead, a search for information about the movie “Wakanda Forever” will deliver results that link to new information and original commentary on the upcoming blockbuster. The improvement will be most noticeable in searches related to online education, arts and entertainment, shopping, and technology, according to Google.
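Google has not published how it detects aggregation without added value, but the idea described above can be illustrated with a toy heuristic: score a page by the fraction of its sentences that do not already appear verbatim in the sources it aggregates. Everything here, including the function name `originality_score`, is a hypothetical sketch for illustration, not Google's actual method.

```python
def originality_score(page_sentences: list[str], source_sentences: list[str]) -> float:
    """Toy originality metric: share of a page's sentences not copied
    verbatim from its aggregated sources. 1.0 = fully original, 0.0 = pure copy.
    """
    # Normalize for a crude exact-match comparison.
    source_set = {s.strip().lower() for s in source_sentences}
    if not page_sentences:
        return 0.0
    original = [s for s in page_sentences if s.strip().lower() not in source_set]
    return len(original) / len(page_sentences)

# A review page that quotes one source sentence and adds one of its own
# would score 0.5; a pure aggregator would score 0.0.
```

A real system would of course use fuzzier matching (shingling, embeddings) rather than exact sentence comparison; this only sketches the ranking intuition.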

This means that aggregators like Rotten Tomatoes and Metacritic could carry less weight in search results compared to the original reviews. This update appears to be primarily targeting reviews, as well as content that “looks like it was designed to attract clicks rather than inform readers.”

The update also appears to target bots. Although Google didn’t explicitly state how it plans to address content written by bots (or copied from another website), the company says the impetus for this update is content that “might not have the information you want, or might even look like it wasn’t created for, or even by, a person.”

Google did not detail in its blog post how it discovers deceptive websites, but it has plenty of data and machine-learning resources to analyze searches and compare visits to particular websites against the amount of time spent there after clicking. How big an impact this will have remains to be seen, but any improvement is welcome.
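One plausible reading of the dwell-time signal mentioned above: if most visitors bounce off a page within seconds of clicking a search result, the page likely did not deliver what its title promised. The sketch below is purely illustrative (the `ClickRecord` type, the `short_click_rate` function, and the 10-second threshold are all assumptions, not anything Google has disclosed).

```python
from dataclasses import dataclass

@dataclass
class ClickRecord:
    url: str
    dwell_seconds: float  # time spent on the page after clicking a result

def short_click_rate(clicks: list[ClickRecord], threshold: float = 10.0) -> dict[str, float]:
    """Per-URL fraction of clicks where the visitor bounced quickly.

    A high rate is one hypothetical clickbait signal: the title earned
    the click, but the page didn't hold the reader.
    """
    totals: dict[str, int] = {}
    short: dict[str, int] = {}
    for c in clicks:
        totals[c.url] = totals.get(c.url, 0) + 1
        if c.dwell_seconds < threshold:
            short[c.url] = short.get(c.url, 0) + 1
    return {url: short.get(url, 0) / n for url, n in totals.items()}
```

A ranking system could then demote URLs whose short-click rate stays high across many queries; the real signals and thresholds are, again, not public.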
