SEO

8 Useful Python Libraries for SEO & How To Use Them


Editor’s note: As 2021 winds down, we’re celebrating with a 12 Days of Christmas Countdown of the most popular, helpful expert articles on Search Engine Journal this year.

This collection was curated by our editorial team based on each article’s performance, utility, quality, and the value created for you, our readers.

Each day until December 24th, we’ll repost one of the best columns of the year, starting at No. 12 and counting down to No. 1. Our countdown starts today with our No. 3 column, which was originally published on March 18, 2021.

Ruth Everett’s article on utilizing Python libraries for automating and accomplishing SEO tasks makes a marketer’s work so much easier. It’s very easy to read and perfect for beginners, as well as more experienced SEO professionals who want to use Python more.

Great work on this, Ruth, and we really appreciate your contributions to Search Engine Journal.

Enjoy!   


Python libraries are a fun and accessible way to get started with learning and using Python for SEO.


A Python library is a collection of useful functions and code that allow you to complete a number of tasks without needing to write the code from scratch.

There are over 100,000 libraries available in Python, covering everything from data analysis to creating video games.

In this article, you’ll find several different libraries I have used for completing SEO projects and tasks. All of them are beginner-friendly and you’ll find plenty of documentation and resources to help you get started.

Why Are Python Libraries Useful For SEO?

Each Python library contains functions and variables of all types (arrays, dictionaries, objects, etc.) which can be used to perform different tasks.

For SEO, for example, they can be used to automate certain things, predict outcomes, and provide intelligent insights.

It is possible to work with just vanilla Python, but libraries can be used to make tasks much easier and quicker to write and complete.

Python Libraries For SEO Tasks

There are a number of useful Python libraries for SEO tasks including data analysis, web scraping, and visualizing insights.


This is not an exhaustive list, but these are the libraries I find myself using the most for SEO purposes.

Pandas

Pandas is a Python library used for working with tabular data. It allows for high-level data manipulation where the key data structure is the DataFrame.

DataFrames are similar to Excel spreadsheets; however, they are not subject to Excel’s row and size limits and are also much faster and more efficient.

The best way to get started with Pandas is to take a simple CSV of data (a crawl of your website, for example) and save this within Python as a DataFrame.

Once you have this stored in Python, you can perform a number of different analysis tasks including aggregating, pivoting, and cleaning data.

For example, if I have a complete crawl of my website and want to extract only those pages that are indexable, I will use a built-in Pandas function to include only those URLs in my DataFrame.

import pandas as pd

# Load a crawl export into a DataFrame
df = pd.read_csv('/Users/rutheverett/Documents/Folder/file_name.csv')
df.head()

# Keep only the URLs marked as indexable
indexable = df[(df.indexable == True)]
indexable
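Once the crawl is filtered, the same DataFrame can also be aggregated. As a minimal sketch, assuming the crawl export also contains a status_code column (an assumption about your data), you could count indexable URLs by status code:

# 'status_code' is an assumed column name from a typical crawl export
status_counts = indexable['status_code'].value_counts()
print(status_counts)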

Requests

The next library is called Requests and is used to make HTTP requests in Python.

Requests uses different request methods such as GET and POST to make a request, with the results being stored in Python.

One example of this in action is a simple GET request to a URL, which will print out the status code of the page:

import requests

response = requests.get('https://www.deepcrawl.com')
print(response)

You can then use this result to create a decision-making function, where a 200 status code means the page is available but a 404 means the page is not found.

if response.status_code == 200:
    print('Success!')
elif response.status_code == 404:
    print('Not Found.')

You can also inspect the response headers, which display useful information about the page such as the content type or how long it took to cache the response.

headers = response.headers
print(headers)

response.headers['Content-Type']

There is also the ability to simulate a specific user agent, such as Googlebot, in order to extract the response this specific bot will see when crawling the page.

headers = {'User-Agent': 'Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)'}
ua_response = requests.get('https://www.deepcrawl.com/', headers=headers)
print(ua_response)

Beautiful Soup

Beautiful Soup is a library used to extract data from HTML and XML files.


Fun fact: The BeautifulSoup library was actually named after the poem from Alice’s Adventures in Wonderland by Lewis Carroll.

As a library, BeautifulSoup is used to make sense of web files and is most often used for web scraping, as it can transform an HTML document into different Python objects.

For example, you can take a URL and use Beautiful Soup together with the Requests library to extract the title of the page.

from bs4 import BeautifulSoup
import requests

url = "https://www.deepcrawl.com"
req = requests.get(url)
soup = BeautifulSoup(req.text, "html.parser")

title = soup.title
print(title)

[Image: Beautiful Soup Title]

Additionally, using the find_all method, BeautifulSoup enables you to extract certain elements from a page, such as all of the <a href> links on the page:


url = "https://www.deepcrawl.com/knowledge/technical-seo-library/"
req = requests.get(url)
soup = BeautifulSoup(req.text, "html.parser")

for link in soup.find_all('a'): 
    print(link.get('href'))

[Image: Beautiful Soup All Links]

Putting Them Together

These three libraries can also be used together: Requests makes the HTTP request to the page, and BeautifulSoup extracts the information from the response.

We can then transform that raw data into a Pandas DataFrame to perform further analysis.

url = 'https://www.deepcrawl.com/blog/'
req = requests.get(url)
soup = BeautifulSoup(req.text, "html.parser")

links = soup.find_all('a')

df = pd.DataFrame({'links': links})
df
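Storing the raw Tag objects works, but plain strings are easier to filter and analyze. As a small sketch building on the snippet above, you could instead store each link’s destination and anchor text as separate columns:

# Store each link's href and anchor text as plain string columns
link_data = [
    {'href': link.get('href'), 'text': link.get_text(strip=True)}
    for link in links
]
df = pd.DataFrame(link_data)
df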

Matplotlib And Seaborn

Matplotlib and Seaborn are two Python libraries used for creating visualizations.

Matplotlib allows you to create a number of different data visualizations such as bar charts, line graphs, histograms, and even heatmaps.


For example, if I wanted to take some Google Trends data to display the most popular queries over a period of 30 days, I could create a bar chart in Matplotlib to visualize these.
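As a minimal sketch, with made-up queries and interest scores standing in for a real Google Trends export:

import matplotlib.pyplot as plt

# Hypothetical Google Trends data: queries and their interest scores
queries = ['python seo', 'web scraping', 'log file analysis']
popularity = [75, 60, 42]

plt.bar(queries, popularity)
plt.xlabel('Query')
plt.ylabel('Interest over 30 days')
plt.title('Most popular queries')
plt.show()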

[Image: Matplotlib Bar Graph]

Seaborn, which is built upon Matplotlib, provides even more visualization patterns such as scatterplots, box plots, and violin plots in addition to line and bar graphs.

It differs slightly from Matplotlib in that it requires less syntax and comes with built-in default themes.


One way I’ve used Seaborn is to create line graphs in order to visualize log file hits to certain segments of a website over time.

[Image: Matplotlib Line Graph]

import seaborn as sns
import matplotlib.pyplot as plt

sns.lineplot(x="month", y="log_requests_total", hue="category", data=pivot_status)
plt.show()

This particular example takes data from a pivot table, which I was able to create in Python using the Pandas library, and is another way these libraries work together to create an easy-to-understand picture from the data.
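For reference, here is a minimal sketch of how such a pivot table might be built with Pandas; the log file export and its column names (month, category, log_requests_total) are assumptions chosen to match the snippet above:

import pandas as pd

# Hypothetical log file export with request counts per site category
logs = pd.read_csv('log_file_export.csv')

# Aggregate total log requests per month and category
pivot_status = logs.pivot_table(
    index=['month', 'category'],
    values='log_requests_total',
    aggfunc='sum'
).reset_index()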

Advertools

Advertools is a library created by Elias Dabbas that can be used to help manage, understand, and make decisions based on the data we have as SEO professionals and digital marketers.


Sitemap Analysis

This library allows you to perform a number of different tasks such as downloading, parsing, and analyzing XML Sitemaps to extract patterns or analyze how often content is added or changed.
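As a minimal sketch, the sitemap_to_df function downloads and parses a sitemap straight into a DataFrame (the sitemap URL here is just an example):

import advertools as adv

# Download and parse an XML sitemap into a DataFrame
sitemap_df = adv.sitemap_to_df('https://www.deepcrawl.com/sitemap.xml')
print(sitemap_df.head())

# If the sitemap includes lastmod dates, they show how often content changes
print(sitemap_df['lastmod'].describe())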

Robots.txt Analysis

Another interesting thing you can do with this library is to use a function to extract a website’s robots.txt into a DataFrame, in order to easily understand and analyze the rules set.

You can also run a test within the library in order to check whether a particular user-agent is able to fetch certain URLs or folder paths.
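A minimal sketch of both ideas, assuming the robots.txt URL and the test paths shown here:

import advertools as adv

# Extract the robots.txt rules into a DataFrame
robots_df = adv.robotstxt_to_df('https://www.deepcrawl.com/robots.txt')

# Test whether a given user agent can fetch certain URL paths
test_df = adv.robotstxt_test(
    'https://www.deepcrawl.com/robots.txt',
    user_agents=['Googlebot'],
    urls=['/blog/', '/admin/']
)
print(test_df)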

URL Analysis

Advertools also enables you to parse and analyze URLs in order to extract information and better understand analytics, SERP, and crawl data for certain sets of URLs.

You can also split URLs using the library to determine things such as the HTTP scheme being used, the main path, additional parameters, and query strings.
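As a sketch, the url_to_df function splits a list of URLs into their components; the URLs below are examples:

import advertools as adv

urls = [
    'https://www.deepcrawl.com/blog/?utm_source=twitter',
    'https://www.deepcrawl.com/knowledge/technical-seo-library/',
]

# Split each URL into scheme, domain, path, query string, and directory levels
url_df = adv.url_to_df(urls)
print(url_df[['scheme', 'netloc', 'path', 'query']])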

Selenium

Selenium is a Python library that is generally used for automation purposes. The most common use case is testing web applications.


One popular example of Selenium automating a flow is a script that opens a browser and performs a number of different steps in a defined sequence such as filling in forms or clicking certain buttons.

Selenium works on the same principle as the Requests library that we covered earlier.

However, it will not only send the request and wait for the response but also render the webpage that is being requested.

To get started with Selenium, you will need a WebDriver in order to make the interactions with the browser.

Each browser has its own WebDriver; Chrome has ChromeDriver and Firefox has GeckoDriver, for example.

These are easy to download and set up with your Python code. Here is a useful article explaining the setup process, with an example project.
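To illustrate, here is a minimal sketch using Chrome; it assumes Selenium 4 or later, which can locate ChromeDriver automatically via Selenium Manager:

from selenium import webdriver

# Open a real browser window and load the page
driver = webdriver.Chrome()
driver.get('https://www.deepcrawl.com')

# Unlike Requests, the page is fully rendered before we read from it
print(driver.title)

driver.quit()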

Scrapy

The final library I wanted to cover in this article is Scrapy.

While we can use the Requests module to crawl and extract internal data from a webpage, in order to parse that data and extract useful insights we also need to combine it with BeautifulSoup.


Scrapy essentially allows you to do both of these in one library.

Scrapy is also considerably faster and more powerful: it completes the crawl requests, extracts and parses the data in a set sequence, and allows you to yield the data in a structured format.

Within Scrapy, you can define a number of instructions such as the name of the domain you would like to crawl, the start URL, and certain page folders the spider is allowed or not allowed to crawl.

Scrapy can be used to extract all of the links on a certain page and store them in an output file, for example.

from scrapy.spiders import CrawlSpider


class SuperSpider(CrawlSpider):
    name = "extractor"
    allowed_domains = ['www.deepcrawl.com']
    start_urls = ['https://www.deepcrawl.com/knowledge/technical-seo-library/']
    base_url = "https://www.deepcrawl.com"

    def parse(self, response):
        # Yield the absolute URL of every in-content link on the page
        for link in response.xpath('//div/p/a'):
            yield {
                "link": self.base_url + link.xpath('.//@href').get()
            }

You can take this one step further and follow the links found on a webpage to extract information from all the pages which are being linked to from the start URL, kind of like a small-scale replication of Google finding and following links on a page.

from scrapy.spiders import CrawlSpider, Rule


class SuperSpider(CrawlSpider):
    name = "follower"
    allowed_domains = ['en.wikipedia.org']
    start_urls = ['https://en.wikipedia.org/wiki/Web_scraping']
    base_url = "https://en.wikipedia.org"

    custom_settings = {
        'DEPTH_LIMIT': 1
    }

    def parse(self, response):
        # Follow each in-content link and parse the linked page too
        for next_page in response.xpath('.//div/p/a'):
            yield response.follow(next_page, self.parse)

        # Extract the main heading of every page visited
        for quote in response.xpath('.//h1/text()'):
            yield {'quote': quote.extract()}

Learn more about these projects, among other example projects, here.

Final Thoughts

As Hamlet Batista always said, “the best way to learn is by doing.”


I hope that discovering some of the libraries available has inspired you to get started with learning Python, or to deepen your knowledge.

Python Contributions From The SEO Industry

Hamlet also loved sharing resources and projects from those in the Python SEO community. To honor his passion for encouraging others, I wanted to share some of the amazing things I have seen from the community.

As a wonderful tribute to Hamlet and the SEO Python community he helped to cultivate, Charly Wargnier has created SEO Pythonistas to collect contributions of the amazing Python projects those in the SEO community have created.

Hamlet’s priceless contributions to the SEO Community are featured.

Moshe Ma-yafit created a super cool script for log file analysis, and in this post explains how the script works. The visualizations it can display include Google Bot Hits By Device, Daily Hits by Response Code, Response Code % Total, and more.

Koray Tuğberk GÜBÜR is currently working on a Sitemap Health Checker. He also hosted a RankSense webinar with Elias Dabbas where he shared a script that records SERPs and analyzes algorithms.


It essentially records SERPs at regular intervals, and you can crawl all the landing pages, blend data, and create some correlations.

John McAlpin wrote an article detailing how you can use Python and Data Studio to spy on your competitors.

JC Chouinard wrote a complete guide to using the Reddit API. With this, you can perform things such as extracting data from Reddit and posting to a Subreddit.

Rob May is working on a new GSC analysis tool and building a few new real domain sites in Wix to measure against their higher-end WordPress competitors, documenting the process as he goes.

Masaki Okazawa also shared a script that analyzes Google Search Console Data with Python.


Featured image: jakkaje879/Shutterstock





Marketing

How Can You Improve Your Blog’s Content with a Paraphrasing Tool?


Paraphrasing tools are getting extremely popular, especially among bloggers. The reason is that these tools allow them to rewrite some of their old content with very high accuracy.

Uniqueness is the most important factor that determines the search engine ranking of your website. Most search engines determine the worth of your site by looking at the content that you post.

This is why you need to make sure the material you write in your blog contains zero plagiarism. For this purpose, you can use paraphrasing tools. These tools allow you to come up with unique ideas, words, and phrases that you incorporate into your blog to increase readability as well as reader engagement.

What is a Paraphrasing Tool?

A paraphrasing tool can be used to generate new text to explain existing ideas, concepts, or themes. These tools take minutes to convert your old text into an entirely new form having new phrases, words, and synonyms while keeping the original theme intact.

These tools improve the readability, grammar, and other key aspects of your text to make it coherent and consistent. These tools use AI technology to make your content unique and to improve the tone, style, and other features.

There are many reasons to use these tools, and in the next section we will take a look at some of them.

1. Complete Analysis of Your Content

Before rephrasing your content, these tools analyze it completely to determine a few key things. These include word count, readability, spelling and grammar mistakes, and the main theme and tone of the content.

This complete analysis allows these tools to generate highly accurate content that you can post on your blog without fearing plagiarism.

These tools are very accurate when analyzing your content, which allows you to trust them completely to perform the paraphrasing for you.

2. Changing Content Tone

The tone of your content is what separates it from others and engages your audience. Paraphrasing tools can rewrite your material while giving it a pleasant and consistent tone.

These tools can make adjustments that make your content easy to read, understand, and digest. By working on the tone of your text, these tools make it SEO-friendly which leads to better search engine ranking.

3. Better Content Flow

When writing content for SEO or your blog, you need to make it feel connected and flowing in a consistent manner. Writing about different topics at random makes your content seem all over the place, which leaves a bad impression on your readers.

Paraphrasing tools can help you improve the flow of information that you provide in your content. This makes it more concise and understandable.

Some Ways in Which Paraphrasing Tools Can Improve Your Blogs

Paraphrasing tools are really a blessing for bloggers and general content writers. These tools save time and offer very high accuracy.

Here are some of the main ways in which such tools can help you write plagiarism-free blogs:

1. Replacing Words with Synonyms

The main reason these tools are effective is that they offer a number of synonyms for every word in the content. You can use these tools to replace single words, phrases, sentences, or even paragraphs.

An online paraphrasing tool turns your entire text into something new, which makes it free from plagiarism of every type.

2. Improve Spelling and Grammar

Paraphrasing tools fix the grammatical errors and inconsistencies in your original text. These tools highlight lines that need to be changed, and you can use another tool to eliminate these errors.

These tools identify and remove spelling mistakes as well. The final content that you get from them is immaculate in every way: it is consistent with the main theme, and each sentence flows from the last.

3. Save Time and Energy

Paraphrasing without a tool can take a lot of your time and energy. You need to consult various sources to learn new words and ideas to incorporate into your text, which is very time-consuming.

Paraphrasing tools help you save a lot of time by rewriting more than 1,000 words in a matter of minutes. Doing this yourself can take several hours, which you could spend on something more important.

4. Cost-Effective

Hiring content writers to write unique content for your blog is quite expensive. You have to spend a lot if you hire someone else to rewrite content for you, and there is still no guarantee that the content will be plagiarism-free.

You can find several free paraphrasing tools online to do that for you. These tools require no registration or login which means you can just go online and convert the text instantly.

5. Creative Writing

Most paraphrasing tools can help you write creative content. These tools take your words and phrases as prompts and use AI to write creative material that you can post on your blog.

This is especially helpful for a writer suffering from writer’s block. These tools can inspire them to look at things from a different perspective. This improves their skill as a writer and enhances the quality of their content.

Final Thoughts:

So, these are some of the few ways in which paraphrasing tools can help you improve the content of your blogs. These tools can help you write better material that has zero grammatical errors and is more engaging.

Without these tools, you will only be wasting your time and money with little to show in return. Paraphrasing tools are used by both academic and non-academic writers who often find it hard to rewrite material due to a limited vocabulary and grasp of grammar.

Now you know some of the major benefits of using paraphrasing tools when writing content for your blogs. These tools can lead to better content for your blogs that is both search engine friendly and engaging.

We hope this helps, and we suggest you use these tools to improve your skills as a blog writer.



SEO

10 Key Steps To Ranking Higher In Google Maps


You’re searching for a lunch spot in an unfamiliar neighborhood, or you need a mechanic to assist with an unexpected flat tire.

Where do you look?

If you answered Google Maps, you’re not alone.

These days, many of us are turning to Google Maps to discover local businesses and make more informed buying decisions.

So how can local businesses rank higher in the place consumers are increasingly looking to purchase local products and services?

Here are ten steps to take in order to rank well, drive more traffic, and secure more customers via Google Maps.

1. Claim And Complete A Google Business Profile

The first, crucial step in establishing visibility in Google Maps is claiming and optimizing your Google Business Profile (GBP – formerly known as Google My Business or GMB).

You can do this by simply searching for your business name on Google or Google Maps and verifying your listing if you have not already done so.

Once you have a listing and are logged into your Google account, you can now edit it, even from directly within the search results.

[Image: Screenshot from Google Business Profile, June 2022]

Being a Google property, GBP provides a primary signal to Google of your business’ existence – and the information here is assumed to be accurate and up to date.

Google will cross-reference these details with those it finds on your website and in other local directories and resources; more on the importance of these in a moment.

2. Post Linked Content (Including Photos)

After you’ve claimed your GBP listing, your work is only partway done.

Google rewards active businesses with higher visibility in Google Maps, so it’s important to post regular updates to your GBP profile.

These updates may and should include special offers, hosted events, links to relevant blog posts, or general business updates.

[Image: Posting photos to Google Business Profile. Screenshot from Google Business Profile, June 2022]

Where possible, incorporating photos into your updates is also encouraged, as visuals are more likely to boost viewer engagement in terms of shares or clicks.

You should also be including links in your posts, ideally to primary product or service pages on your website.

3. Optimize Your Web Presence For Local Organic Search

If you want to rank well on Google Maps, you should ensure your web presence, including your website and external content, is optimized for your local audience.

You can start by performing a local SEO audit to identify where you need to focus your attention from a keyword, content, and linking perspective – as these are the three primary components upon which a presence is built.

Your website needs to be properly structured to enable Google to easily crawl and index your content, and the content within your site needs to be rich with relevant, locally-oriented, intent-driven keywords and logical internal and external links to the answers your audience is searching for.

Google rewards websites that lead searchers to answers in as few clicks as possible.

Websites must also load quickly and provide seamless navigation, regardless of device.

This is particularly important at a local level, as searchers increasingly begin their quests on their phones.

4. Use Local Business Schema

When it comes to structuring content, and especially business details, Google and other search engines prefer standardization – which has led to the development of schema.

Local Schema enables businesses to wrap code around their content to make it easier for Google to crawl and index.

Local business schema covers many of the same business details captured in a Google Business Profile, which Google will naturally cross-reference.

The easier it is for Google to validate your location, the more likely your business is to show up prominently in Google Maps.
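To make this concrete, here is a minimal LocalBusiness JSON-LD sketch generated with Python; every business detail shown is a placeholder, not a required value:

import json

# A minimal LocalBusiness schema; all details below are placeholder examples
schema = {
    "@context": "https://schema.org",
    "@type": "LocalBusiness",
    "name": "Example Auto Body Shop",
    "address": {
        "@type": "PostalAddress",
        "streetAddress": "123 Main St",
        "addressLocality": "Barrie",
        "addressRegion": "ON",
        "postalCode": "L4M 1A1"
    },
    "telephone": "+1-705-555-0100",
    "url": "https://www.example.com"
}

# The output belongs in a <script type="application/ld+json"> tag on your pages
print(json.dumps(schema, indent=2))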

5. Embed The Google Map On Your Contact Us Page

While it’s not explicitly stated that embedding a Google Map in your website will make a difference in terms of where you rank in Google Maps, it’s not far-fetched to assume this is Google’s preferred format.

Here again, Google is able to ensure a consistent user experience for its searchers, which should likewise be the aim of any business looking to please its customers.

6. Mine And Mind Your Reviews

Any business can create a GBP listing, ensure its basic business information is up to date, and post plenty of relevant, local content.

However, another critically important factor in determining if, and where, a local business shows up in Google Maps is customer reviews.

[Image: Reviews on Google Business Profile. Screenshot from Google Business Profile, June 2022]

Google pays close attention to both how many reviews your business obtains, and how active it is in responding to those reviews, regardless of whether they’re positive or negative.

Any business naturally wants to limit the number of negative reviews it receives and all negative reviews should be dealt with swiftly.

This can actually become a valuable way of displaying your business’ commitment to customer service.

While there are many places customers can leave reviews online, including Facebook, Yelp, and other industry-specific review sites, reviews on GBP profiles will carry more weight when it comes to Google Map rankings.

Consider proactively asking your customers for reviews soon after you’ve successfully delivered a product or service, when a presumably positive experience is still top of mind for them.

There are services available to help automate review requests (via email or text) once certain online or offline customer actions have been completed (e.g., appointment completed or invoice paid), and to manage reviews from multiple sources through a central dashboard.

Automation can save busy local businesses a lot of time, and ensure positive reviews flow in on a regular basis.

7. Update Your Local Listings/Citations With Your NAP

The three most important pieces of directional information on your GBP, website, and across the web are your Name, Address, and Phone number, or NAP.

It’s critical for both Google and your audience to have your NAP consistent and accurate across all of these sources.

These references to your business from third-party sites are also called citations.

To find and ensure your NAP is up to date, you can start by simply searching your business name and noting all of the places your business details can be found.

Check each instance and reach out to each directory or website owner to update this important contact information, as needed.

There are also free and paid automated local listings services, which will enable you to identify and update your NAP, along with other important business information like your website URL, services, or even relevant images, from one central location.

8. Build Local Backlinks

Backlinks or inbound links are effectively an extension of our NAP strategy, whereby you look to have relevant, local third-party websites link to your primary website pages.

Backlinks can validate your business from both local and product/service perspectives.

If you maintain listings with links in local directories, you will want to ensure those listings are in the proper categories, if category options are offered.

Ideally, these links to your website are “follow” links, which means Google will follow and recognize the source of the link to your content.

Most directories realize the value of “follow” links and therefore charge for inclusion, but you should also look for opportunities to secure links from other non-paid sources such as relevant partner, industry or service organization sites.

9. Engage With Your Community

Just as Google rewards GBP activity, it also pays attention to how active a business is within its community as a means to establish its local presence and authority.

Businesses noted to be engaging with local service organizations (e.g. Chambers of Commerce, charities, or sports groups), sponsoring local events, or partnering with other prominent local businesses are naturally deemed to be a thriving part of the community.

Engagement can include publishing and/or promoting linked content (e.g., event announcements or partner pages tied to these organizations) and, of course, physically participating and perhaps getting mentioned or linked in local news stories or other publications.

10. Pay Attention To The SERPs And The Long Tail

If you are going to optimize any aspect of your local web presence, you will want to monitor your progress: whether, and where, you rank within Google Maps and the regular search engine results pages (SERPs) for the keywords you hope to be found for.

You can perform your own manual Google searches (preferably in Incognito Mode and while not logged into a Google account), or you can choose from a number of rank monitoring tools, many of which enable you to specifically filter out Map rankings.

When considering which keywords to follow, be sure to include local identifiers and qualifying keywords such as “near me,” “best,” and “affordable” – e.g., “auto body shops near me,” “best auto body shop in Barrie,” or “affordable auto body work.”

Three-, four-, and five-keyword phrases like these are considered long tail, which means they may not have significant local search volume – but these volumes add up, and any local business is well advised to focus on topical groups of related keywords rather than chasing more competitive phrases.

In time, if you’ve truly established your business’ local authority, the short tail top rankings will follow.

Put Your Business On The Google Map

So now, with your laundry list in hand, be like Mike and put your local business on the map.

Establishing your authority and expertise online is not really all that different from how it’s always been in the real world, but it can take time, as any real relationship should.

Google rewards those businesses that provide the best answers to their customers’ questions, deliver solid products and services, take an active role in their local community, have their customers say nice things about them, and provide a high level of customer service at all times.

If this describes your business, get out there and do it.



Featured Image: BestForBest/Shutterstock





SEO

Google Outdoor Play Area & Outdoor Games


Here is a photo from a spot at the GooglePlex where Google has set up a nice and comfortable outdoor play area with outdoor games.

This was shared on Instagram who posted “fun times at work.”

This post is part of our daily Search Photo of the Day column, where we find fun and interesting photos related to the search industry and share them with our readers.




