Day 4/100 of 100 Days of Code

Info Hunter

I spent most of my time making corrections because the Scraper class didn't work properly. It now works as it should, but it only finds the text that contains the keywords, not the whole page.
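To illustrate that keyword-only behaviour, here is a rough sketch of what the matching amounts to. The function name `findMatchingLines` and the line-by-line splitting are my own illustration, not the project's actual code:

```cpp
#include <sstream>
#include <string>
#include <vector>

// Return only the lines of a page's text that contain at least one
// keyword -- roughly what the Scraper does right now, instead of
// returning the whole page.
std::vector<std::string> findMatchingLines(const std::string &pageText,
                                           const std::vector<std::string> &keywords)
{
    std::vector<std::string> matches;
    std::istringstream stream(pageText);
    std::string line;
    while (std::getline(stream, line))
    {
        for (const std::string &keyword : keywords)
        {
            if (line.find(keyword) != std::string::npos)
            {
                matches.push_back(line);
                break; // one keyword hit per line is enough
            }
        }
    }
    return matches;
}
```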

It still needs a lot of work, but I am pretty happy with my progress so far. This is the code that I added:

//    Start scraping
    int keywordOffset = 0; // assumes getSettingsKeywords stores each URL's keywords back to back

    for (int amount : urlCounterHolder)
    {
        std::cout << amount << std::endl;

        // A fresh vector per URL, so keywords from previous
        // iterations don't accumulate
        std::vector<std::string> scraperKeywords;
        for (int j = 0; j < amount; j++)
        {
            std::cout << getSettingsKeywords[keywordOffset + j] << std::endl;
            scraperKeywords.push_back(getSettingsKeywords[keywordOffset + j]);
        }
        keywordOffset += amount;

        std::cout << getUrls[counter] << std::endl;

        Scraper scraper;
        scraper.SetupScraper(scraperKeywords, getUrls[counter]);
        AnalyzePages pageAnalyzer;

        // Get info from the website
        cpr::Response r = scraper.request_info(scraper.baseURL);

        // Parse the links out of it
        std::vector<std::string> urls = scraper.ParseContent(r.text,
                                                             (char *) "href",
                                                             (char *) "/");

        // Iterate through them
        for (const std::string &item : urls)
        {
            std::cout << item << std::endl;
            pageAnalyzer.analyzeEntry(item, scraperKeywords, scraper);
        }

        counter++;
    }

I still need to handle disconnections while the application is running, and add an interface for the run option that informs the user about what is going on and gives them the ability to stop the search if they want to.
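For the disconnection handling, one option I'm considering is a retry wrapper with exponential backoff around the request call. This is only a sketch: `requestWithRetry` and the `RequestFn` callback are hypothetical names, and the real version would wrap `scraper.request_info` and inspect the `cpr::Response` for a network error rather than take a plain boolean:

```cpp
#include <chrono>
#include <functional>
#include <string>
#include <thread>

// One request attempt: returns true on success, false on a network failure.
using RequestFn = std::function<bool(const std::string &url)>;

// Retry a request up to maxAttempts times, doubling the wait between
// attempts. Returns true as soon as one attempt succeeds.
bool requestWithRetry(const RequestFn &attempt,
                      const std::string &url,
                      int maxAttempts,
                      std::chrono::milliseconds initialDelay)
{
    std::chrono::milliseconds delay = initialDelay;
    for (int i = 0; i < maxAttempts; i++)
    {
        if (attempt(url))
            return true;
        if (i + 1 < maxAttempts)
        {
            std::this_thread::sleep_for(delay);
            delay *= 2; // exponential backoff before the next try
        }
    }
    return false;
}
```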
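Letting the user stop the search could be done with an atomic flag that the interface sets and the scraping loop checks between pages. The names here (`stopRequested`, `scrapePages`) are illustrative only, not part of the project yet:

```cpp
#include <atomic>
#include <string>
#include <vector>

// Set from the interface (e.g. when the user presses a stop key).
std::atomic<bool> stopRequested{false};

// Process pages until finished or until the user asks to stop.
// Returns the number of pages actually processed.
int scrapePages(const std::vector<std::string> &urls)
{
    int processed = 0;
    for (const std::string &url : urls)
    {
        if (stopRequested.load())
            break; // user cancelled the search

        // ... request and analyze the page here ...
        (void)url;
        processed++;
    }
    return processed;
}
```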