[HOWTO] large file downloads (to server) >1.2GB each
Author Summary
allw is in HOWTO
Post Body

hi,

The title says it all, really: I am looking to download a data feed using a script, and so far none of the examples I have come across using curl (and other tools) seem to work. What is the best way to download these large files? And, for later, what is the best way to import them into a MySQL server? I have heard LOAD DATA INFILE is the best option; is this true?
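On the MySQL side, LOAD DATA INFILE is indeed the usual fast path for bulk imports. A minimal sketch, assuming the feed has been decompressed to CSV first; the table name and columns here are hypothetical placeholders, not anything from the actual feed:

```sql
-- Hypothetical table matching the feed's columns
CREATE TABLE datafeed (
  id INT,
  name VARCHAR(255),
  price DECIMAL(10,2)
);

-- Bulk-load a decompressed CSV.
-- LOCAL reads the file from the client machine rather than the server host.
LOAD DATA LOCAL INFILE 'datafeed.csv'
INTO TABLE datafeed
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 LINES;
```

Note that LOCAL must be enabled on both client and server (`local_infile=1`), and the non-LOCAL form requires the file to live under the server's `secure_file_priv` directory.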

thanks reddit,

-AW
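For the download itself, command-line curl handles multi-GB files fine as long as the output goes to a file rather than stdout. A sketch, where the URL is a placeholder for the real feed address:

```shell
# -L follows redirects, -o writes to a file instead of stdout,
# -C - resumes a partially downloaded file if the transfer was interrupted
curl -L -C - -o datafeed.gz "https://example.com/feeds/datafeed.gz"
```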

EDIT: cURL is now working, but it displays the file instead of saving it. Code below:

<?php

// Output path for the downloaded feed
$path = 'datafeed.gz';

// Open the output file for writing
$fp = fopen($path, 'w');

$curl = curl_init();

curl_setopt($curl, CURLOPT_URL, $constURL);
// Stream the response body straight into the file instead of echoing it
curl_setopt($curl, CURLOPT_FILE, $fp);
curl_setopt($curl, CURLOPT_HEADER, 0);
curl_setopt($curl, CURLOPT_FOLLOWLOCATION, 1);

curl_exec($curl);

curl_close($curl);
fclose($fp);

?>

(The original version set some options on an undefined handle `$ch` instead of `$curl`, and overwrote the file handle `$fp` with the return value of `curl_exec()`, so the body was printed to the page rather than written to the file. Using CURLOPT_FILE with a single handle fixes both.)

EDIT: SOLVED


Posted: 9 years ago