Hey, I have been banging my head against this problem for a week now and was wondering if anyone can help. I am using cURL to fetch the source code of a web page, with fairly common code to do so. If I implement the functionality directly in my main.cpp, it works perfectly fine. When I move it into a class called url_handler, it segfaults and the writer function reports absurdly large sizes. I literally copied and pasted the code from my main.cpp test into url_handler.cpp, so nothing should be different. Any help would be greatly appreciated. Here is my code:
#include <url_handler.h>

size_t url_handler::writer(char *data, size_t size, size_t nmemb, std::string *buffer)
{
    int result = 0;
    if(buffer != NULL)
    {
        buffer->append(data, size * nmemb);
        result = size * nmemb;
    }
    return result;
}

std::string url_handler::fetch(const std::string &url)
{
    CURL *curl;
    CURLcode source;
    std::string buffer;

    curl = curl_easy_init();
    if(curl)
    {
        curl_easy_setopt(curl, CURLOPT_URL, url.c_str());
        curl_easy_setopt(curl, CURLOPT_WRITEFUNCTION, &url_handler::writer);
        curl_easy_setopt(curl, CURLOPT_WRITEDATA, &buffer);
        curl_easy_perform(curl);
        // always cleanup
        curl_easy_cleanup(curl);
    }
    return buffer;
}
Thank you.