In the ever-evolving digital age, SEO remains a crucial strategy for businesses to generate organic traffic, build trust, and establish brand authority. The art and science of ranking high on search engines primarily revolve around keywords. But how does one determine the potential of a keyword?
Enter SCR (Sultan’s Competition Ratio), a powerful method for identifying low-competition keywords in the same spirit as the well-known KGR (Keyword Golden Ratio). And the mastermind behind simplifying this approach using Google Colab is none other than Aamir Iqbal, an SEO expert. Today, we’ll dive deep into his strategy, and by the end of this blog post, you’ll have a grasp of:
- The logic behind SCR.
- How to implement this approach.
- Utilizing Google Colab for this analysis.
SCR (Sultan’s Competition Ratio) in SEO
The SCR, or Sultan’s Competition Ratio, is a concept introduced by Aamir Iqbal to refine keyword research in the realm of SEO. The logic behind SCR is simple yet powerful. It’s a ratio formulated to gauge the potential of a keyword by scrutinizing both its search volume and the total number of Google search results that specifically target the keyword.
The SCR Calculation:
To determine the SCR for a keyword, the following formula is applied:
SCR = Number of Google “allintitle” results / Monthly Search Volume
By using this formula, one can ascertain whether a keyword is potentially lucrative for SEO efforts based on its SCR value.
Here’s how it works:
- Number of Google “allintitle” results: This represents the number of search results that have the keyword precisely in the title. This is often a good measure of how many pages are intentionally trying to rank for the keyword.
- Monthly Search Volume: This denotes how many times the keyword is searched for on Google in a month.
By evaluating the ratio between these two factors, the SCR offers a clearer picture of the competitiveness of a keyword. The lower the SCR, the higher the chances of ranking for the keyword.
For instance, if a keyword has a monthly search volume of 1000, and there are 250 or fewer “allintitle” results for that keyword, then the SCR would be 0.25 or lower, making it a viable keyword to target.
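To make the arithmetic concrete, here is a minimal Python sketch of the calculation (the function name and sample values are illustrative, not part of Aamir’s tool):

```python
def scr(allintitle_results: int, monthly_search_volume: int) -> float:
    """Sultan's Competition Ratio: allintitle results / monthly search volume."""
    return allintitle_results / monthly_search_volume

print(scr(250, 1000))   # 0.25 -> right at the threshold, still viable
print(scr(600, 1000))   # 0.60 -> too competitive to be worth targeting
```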
Building The SCR Tool
Choosing The Right Tools
Aamir Iqbal, utilizing his profound knowledge in the field, decided to harness the power of Python and Google Colab to implement this strategy. Google Colab offers a free Jupyter notebook environment, allowing users to run Python code directly in the browser, making it a perfect fit for this task.
Step-by-Step Implementation
- Setting Up Google Colab:
- First, head over to Google Colab.
- Click on ‘New Notebook’.
- Now, you’re in a Python environment, ready to roll!
- Installing Necessary Libraries: In the Colab notebook, you need to install the required Python libraries. For our tool, we rely on `selenium` for web scraping and `requests` for API calls (as the next step explains, the final script leans on the API rather than scraping). Installing is as simple as running:

```python
!pip install selenium requests
```
- Leveraging Google Custom Search API: To avoid the challenges of web scraping, Aamir’s code uses Google’s Custom Search API. This requires setting up a custom search engine and obtaining the API key.
- Head to Google Cloud Platform.
- Create a new project and enable Custom Search API.
- Generate your API Key.
- Set up a Custom Search Engine through Google CSE and get your Search Engine ID.
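Before wiring everything together, it can help to sanity-check your credentials with a single request to the Custom Search endpoint the tool uses. This is a minimal sketch; the placeholder values are assumptions you replace with your own key and engine ID:

```python
import requests

API_KEY = 'YOUR_API_KEY'   # from Google Cloud Platform
CSE_ID = 'YOUR_CSE_ID'     # from Google CSE

response = requests.get(
    'https://www.googleapis.com/customsearch/v1',
    params={'key': API_KEY, 'cx': CSE_ID, 'q': 'allintitle: test keyword'},
)
response.raise_for_status()  # fails loudly if the key or engine ID is wrong
print(response.json().get('searchInformation', {}).get('totalResults'))
```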
- Writing The SCR Code: With the API set up, the focus shifts to writing the logic for SCR calculation. This involves:
- Fetching allintitle results using the Google API.
- Calculating SCR.
- Filtering out valuable keywords.
The code, written by Aamir Iqbal and provided in full later in this post, beautifully encapsulates this logic. By running it in Google Colab, one can effortlessly find low-competition keywords.
SCR Analysis Tool
Understanding the Code’s Anatomy
At the heart of this powerful tool, we have Python – an incredibly versatile language. The genius of Aamir Iqbal lies not just in understanding the importance of SCR but in leveraging Python to make the entire process accessible and automated.
Importing the Libraries
Python, being a high-level language, has a vast ecosystem of libraries. For our tool, the two primary ones are `requests` (for API calls) and Google Colab’s own `files` module for file handling.
```python
import requests
from google.colab import files
```
Setting Up Constants and Parameters
Before diving into the main logic, we need to set up the constants. This includes your Google Custom Search API Key and the Search Engine ID.
```python
API_KEY = 'YOUR_API_KEY'
CSE_ID = 'YOUR_CSE_ID'
MONTHLY_SEARCH_VOLUME = 1000
```
Here, `MONTHLY_SEARCH_VOLUME` is a placeholder; in a real-world scenario, you’d replace it with the actual monthly search volume of your keywords.
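One hypothetical way to handle that, not part of Aamir’s script, is to keep per-keyword volumes in a simple CSV and look them up instead of using one constant (the file name and format here are assumptions for illustration):

```python
# Hypothetical 'volumes.csv' with one 'keyword,volume' pair per line,
# e.g.: best hiking boots,1900
volumes = {}
with open('volumes.csv') as f:
    for line in f:
        keyword, _, volume = line.strip().rpartition(',')
        volumes[keyword] = int(volume)

# The SCR check would then divide by volumes[keyword]
# instead of the fixed MONTHLY_SEARCH_VOLUME.
```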
The Core Logic: Google Search Function
The `google_search` function is the core. Instead of manually searching Google for each keyword, this function does it automatically. It takes a keyword as input and returns the number of pages that have the keyword in their title (the allintitle count).
```python
def google_search(query):
    ...
    return total_results
```
Determining the Validity of Keywords
Once we have the allintitle count, the `is_keyword_valid` function calculates the SCR. If the SCR is 0.25 or lower and the allintitle count is 250 or fewer, the keyword is considered valid. With the fixed volume of 1,000 used here, a keyword with 180 allintitle results has an SCR of 0.18 and passes both checks, while one with 400 results fails both.
```python
def is_keyword_valid(keyword):
    ...
    return allintitle_count <= 250 and scr <= 0.25
```
From Notebook to User-Friendly Tool
Aamir Iqbal’s brilliance shines here. Instead of running the script in isolation, he utilized Google Colab’s capability to upload files. Users can upload a text file containing their list of keywords, and the tool will then analyze each keyword.
```python
uploaded = files.upload()
```
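The uploaded file is just plain text with one keyword per line; the keywords below are made-up examples of what such a file might contain:

```
best ergonomic keyboard under 50
how to clean suede shoes at home
quiet air purifier for small bedroom
```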
Displaying the Results
The results are then displayed in a user-friendly format. If the keywords are deemed potentially profitable (based on our SCR criteria), the tool will affirmatively declare, “Yes Sultan Meri Jan , You Can Rank These Keywords:”.
Running The Tool
After inputting the necessary constants, the user simply runs the entire notebook, uploads their file, and gets the results in a matter of seconds.
```python
import requests
from google.colab import files
import time  # used to pause between API calls

# Constants
API_KEY = 'your api key'
CSE_ID = 'SE ID'
MONTHLY_SEARCH_VOLUME = 1000


def google_search(query):
    url = 'https://www.googleapis.com/customsearch/v1'
    params = {
        'key': API_KEY,
        'cx': CSE_ID,
        'q': f"allintitle: {query}"
    }
    response = requests.get(url, params=params)

    if response.status_code == 200:
        data = response.json()
        total_results = int(data.get('searchInformation', {}).get('totalResults', 0))
        return total_results
    else:
        print(f"Error {response.status_code}: {response.text}")
        return None


def is_keyword_valid(keyword):
    allintitle_count = google_search(keyword)
    time.sleep(10)  # 10-second delay after each keyword to avoid hammering the API

    if allintitle_count is None:
        return False

    scr = allintitle_count / MONTHLY_SEARCH_VOLUME
    return allintitle_count <= 250 and scr <= 0.25


uploaded = files.upload()
file_name = list(uploaded.keys())[0]

with open(file_name, 'r') as f:
    keywords = f.read().splitlines()

valid_keywords = [keyword for keyword in keywords if is_keyword_valid(keyword)]

if valid_keywords:
    print("Yes Sultan Meri Jan , You Can Rank These Keywords:")
    for kw in valid_keywords:
        print(kw)
else:
    print("None of the keywords meet the SCR criteria.")
```
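For reference, a run against a file where two hypothetical keywords pass both thresholds would print something like:

```
Yes Sultan Meri Jan , You Can Rank These Keywords:
best ergonomic keyboard under 50
how to clean suede shoes at home
```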
Final Thoughts
This tool’s beauty lies in its simplicity and efficiency. Instead of spending hours analyzing keywords manually, Aamir Iqbal’s approach reduces it to mere minutes. Moreover, by sharing this with the community, he exemplifies the spirit of collaboration, ensuring everyone can benefit from his expertise.
Future posts will provide a more hands-on tutorial on setting up and using the tool, along with insights and tips from Aamir Iqbal himself. So, stay tuned!