This is an excellent idea for automation. Instead of generating content with AI, this script will focus on web scraping and data extraction. It's a great way to monitor competitor prices, track product information, or simply keep your data up-to-date.
Here’s a step-by-step guide on how to build this automation script using Google Apps Script.
The core idea is to automate the process of visiting a website and extracting specific data points. For each row of the sheet, the script builds a search URL from the domain and the product code, fetches the page, parses the HTML for the product name, brand, model, and prices, and writes the results back into the corresponding columns.
First, create a new Google Sheet and set up the following columns. The script relies on these exact column names and their order.
| Domain | Product Name | Code | Brand | Model | Price | Promo Price |
|--------|--------------|------|-------|-------|-------|-------------|
Important: The "Domain" column should contain the base URL of the website you want to scrape (e.g., https://example.com/). The "Code" column will hold the unique product identifier that the script will use for its search query.
Open Extensions > Apps Script from your Google Sheet and paste the following code into the Code.gs file.
This example uses regular expressions (RegEx) to parse the HTML. This is a simple approach that works well for pages with predictable markup, but you will need to adapt the regular expressions to match the specific HTML structure of each website you want to scrape.
Code.gs
// Column index configuration (starting from 1)
const COL_DOMAIN = 1;
const COL_PRODUCT_NAME = 2;
const COL_CODE = 3;
const COL_BRAND = 4;
const COL_MODEL = 5;
const COL_PRICE = 6;
const COL_PROMO_PRICE = 7;
/**
 * @description Adds a custom menu to the Google Sheets interface.
 */
function onOpen() {
  SpreadsheetApp.getUi()
    .createMenu('🔍 Price Comparison')
    .addItem('Update Prices', 'updatePrices')
    .addSeparator()
    .addItem('Clear Price Data', 'clearPriceData')
    .addToUi();
}
/**
 * @description The main function that iterates through rows and extracts product data.
 */
function updatePrices() {
  const sheet = SpreadsheetApp.getActiveSpreadsheet().getActiveSheet();
  const range = sheet.getDataRange();
  const values = range.getValues();

  // Start from the second row to skip the headers
  for (let i = 1; i < values.length; i++) {
    const row = values[i];
    const domain = row[COL_DOMAIN - 1];
    const productCode = row[COL_CODE - 1];

    // Only process rows that have both a domain and a product code
    if (domain && productCode) {
      try {
        // Strip any trailing slash from the domain and URL-encode the product code
        const searchUrl = `${String(domain).replace(/\/+$/, '')}/search?q=${encodeURIComponent(productCode)}`;
        sheet.getRange(i + 1, COL_PRODUCT_NAME).setValue('Searching...');
        SpreadsheetApp.flush();

        const productData = scrapeProductData(searchUrl, productCode);

        if (productData) {
          // Write the extracted data into the corresponding columns
          sheet.getRange(i + 1, COL_PRODUCT_NAME).setValue(productData.name);
          sheet.getRange(i + 1, COL_BRAND).setValue(productData.brand);
          sheet.getRange(i + 1, COL_MODEL).setValue(productData.model);
          sheet.getRange(i + 1, COL_PRICE).setValue(productData.price);
          sheet.getRange(i + 1, COL_PROMO_PRICE).setValue(productData.promoPrice);
        } else {
          sheet.getRange(i + 1, COL_PRODUCT_NAME).setValue('Product not found.');
          sheet.getRange(i + 1, COL_PRICE).clearContent();
          sheet.getRange(i + 1, COL_PROMO_PRICE).clearContent();
        }
      } catch (e) {
        sheet.getRange(i + 1, COL_PRODUCT_NAME).setValue(`Error: ${e.message}`);
        Logger.log(`Error on row ${i + 1}: ${e.message}`);
      }
    }
  }

  SpreadsheetApp.getUi().alert('Price update finished!');
}
/**
 * @description Clears the content of the price and product data columns.
 */
function clearPriceData() {
  const ui = SpreadsheetApp.getUi();
  const response = ui.alert(
    'Confirmation to Clear',
    'Are you sure you want to clear the content in all cells of the "Product Name", "Brand", "Model", "Price", and "Promo Price" columns? This action is irreversible.',
    ui.ButtonSet.YES_NO
  );

  if (response === ui.Button.YES) {
    const sheet = SpreadsheetApp.getActiveSpreadsheet().getActiveSheet();
    const lastRow = sheet.getLastRow();

    if (lastRow > 1) {
      // Clear "Product Name" and "Brand" through "Promo Price",
      // but leave the "Code" column (user input) untouched
      sheet.getRange(2, COL_PRODUCT_NAME, lastRow - 1, 1).clearContent();
      sheet.getRange(2, COL_BRAND, lastRow - 1, 4).clearContent();
      ui.alert('Done!', 'The price and product data have been cleared.', ui.ButtonSet.OK);
    } else {
      ui.alert('No Data', 'No rows to clear.', ui.ButtonSet.OK);
    }
  } else {
    ui.alert('Canceled', 'The clear operation has been canceled.', ui.ButtonSet.OK);
  }
}
/**
 * @description Sends an HTTP request and parses the HTML for product data.
 * @param {string} url The search URL.
 * @param {string} productCode The product code (currently unused here, available for site-specific parsers).
 * @returns {Object|null} An object with product data, or null if nothing was found.
 */
function scrapeProductData(url, productCode) {
  try {
    const response = UrlFetchApp.fetch(url, { muteHttpExceptions: true });

    // With muteHttpExceptions, HTTP errors arrive as status codes instead of exceptions
    if (response.getResponseCode() !== 200) {
      Logger.log(`Request to ${url} returned HTTP ${response.getResponseCode()}`);
      return null;
    }

    const htmlContent = response.getContentText();

    // Regular expressions to extract data.
    // You MUST ADAPT these expressions for each specific domain you want to scrape.
    const nameRegex = /<h1 class="product-name"[\s\S]*?>([\s\S]+?)<\/h1>/i;
    const priceRegex = /<span class="price">([\s\S]+?)<\/span>/i;
    const promoPriceRegex = /<span class="promo-price">([\s\S]+?)<\/span>/i;
    const brandRegex = /<span class="product-brand">([\s\S]+?)<\/span>/i;
    const modelRegex = /<span class="product-model">([\s\S]+?)<\/span>/i;

    const nameMatch = htmlContent.match(nameRegex);
    const priceMatch = htmlContent.match(priceRegex);
    const promoPriceMatch = htmlContent.match(promoPriceRegex);
    const brandMatch = htmlContent.match(brandRegex);
    const modelMatch = htmlContent.match(modelRegex);

    // A result is only considered valid if at least a name and a price were found
    if (nameMatch && priceMatch) {
      return {
        name: nameMatch[1].trim(),
        brand: brandMatch ? brandMatch[1].trim() : '',
        model: modelMatch ? modelMatch[1].trim() : '',
        price: priceMatch[1].trim(),
        promoPrice: promoPriceMatch ? promoPriceMatch[1].trim() : '',
      };
    }
    return null;
  } catch (e) {
    Logger.log(`Error parsing URL ${url}: ${e.message}`);
    return null;
  }
}
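After pasting the code, save the project and reload the spreadsheet. The onOpen function then adds the "🔍 Price Comparison" menu with the "Update Prices" and "Clear Price Data" items. The first time you run one of them, Apps Script will ask you to authorize access to your spreadsheet and to external URLs.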
This script is a generic example and requires customization. To make it work correctly for the websites you want to scrape, you must modify the regular expressions within the scrapeProductData function.
How to find the correct regular expressions:
<h1 class="product-title">Product Name</h1>, your regular expression could be:JavaScriptconst nameRegex = /<h1 class="product-title">([\s\S]+?)<\/h1>/i; The part ([\s\S]+?) captures the text between the tags.Tips for Improvement:
Tips for Improvement:
Instead of a single scrapeProductData function, you could create a different parsing function for each domain (scrapeAmazon, scrapeBestBuy, etc.) and call the correct one based on the value in the "Domain" column; a minimal sketch of this idea follows at the end of this answer.
I hope this provides a solid foundation for your new price comparison automation project. Good luck!
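Building on that last tip, here is a minimal sketch of per-domain dispatch. The domain example-shop.com, the scrapeExampleShop function, and its regular expressions are hypothetical placeholders; write one such parser per site you actually scrape, each returning the same { name, brand, model, price, promoPrice } object (or null), and call scrapeProductDataForDomain(domain, searchUrl, productCode) from updatePrices instead of scrapeProductData.

// Routes each row to a site-specific parser based on its domain.
// The domain string and scrapeExampleShop below are hypothetical examples.
function scrapeProductDataForDomain(domain, url, productCode) {
  if (String(domain).indexOf('example-shop.com') !== -1) {
    return scrapeExampleShop(url, productCode);
  }
  // Fall back to the generic regex-based parser from Code.gs
  return scrapeProductData(url, productCode);
}

// Hypothetical site-specific parser with its own regular expressions
function scrapeExampleShop(url, productCode) {
  const html = UrlFetchApp.fetch(url, { muteHttpExceptions: true }).getContentText();
  const nameMatch = html.match(/<h1 class="product-title">([\s\S]+?)<\/h1>/i);
  const priceMatch = html.match(/<span class="current-price">([\s\S]+?)<\/span>/i);
  if (!nameMatch || !priceMatch) {
    return null;
  }
  return {
    name: nameMatch[1].trim(),
    brand: '',
    model: '',
    price: priceMatch[1].trim(),
    promoPrice: '',
  };
}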