[HOWTO] large file downloads (to server) >1.2GB each
Flair: HOWTO
Author: allw
Post Body
hi,
Title says it all really. I am looking to download a data feed using a script, and so far none of the examples I have come across using cURL and other tools seem to work. What is the best way to download these large files? And, for later: what is the best way to import these files into a MySQL server? I have heard LOAD DATA INFILE is the best; is this true?
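For reference, LOAD DATA INFILE is indeed generally much faster than row-by-row INSERTs for bulk loads. A minimal statement looks like the sketch below; the table name, delimiters, and file path are hypothetical, the .gz feed would need to be decompressed first, and MySQL's `secure_file_priv` setting may restrict which directories the server is allowed to read from:

```sql
-- Hypothetical table and file layout; adjust delimiters to match the feed.
LOAD DATA INFILE '/var/lib/mysql-files/datafeed.csv'
INTO TABLE datafeed
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 LINES;  -- skip a header row, if the feed has one
```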
thanks reddit,
-AW
EDIT: cURL is now working, but it displays the file instead of saving it. Code below:
<?php
// Stream the download straight to disk so the multi-GB response is
// never buffered in memory or echoed to the browser.
$path = 'datafeed.gz';
$fp = fopen($path, 'w');

$curl = curl_init();
curl_setopt($curl, CURLOPT_URL, $constURL);
// The original script set CURLOPT_RETURNTRANSFER on an undefined handle
// ($ch), so cURL fell back to its default of writing the body to stdout,
// which is why the file was displayed. CURLOPT_FILE sends it to $fp instead.
curl_setopt($curl, CURLOPT_FILE, $fp);
curl_setopt($curl, CURLOPT_HEADER, false);        // keep headers out of the file
curl_setopt($curl, CURLOPT_FOLLOWLOCATION, true); // follow redirects

curl_exec($curl);
curl_close($curl);
fclose($fp);
?>
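If the script does not have to run inside PHP, a plain command-line curl call (run manually or from cron) also streams multi-GB files straight to disk. `FEED_URL` here is a placeholder for the real data-feed URL:

```shell
# Fetch the feed to disk; fail on HTTP errors, follow redirects,
# and retry a few times on transient network failures.
# FEED_URL is a placeholder for the real data-feed URL.
curl -fSL --retry 3 -o datafeed.gz "$FEED_URL"
```

For very long downloads, curl's `-C -` option can additionally resume a partially downloaded file after an interrupted run.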
Post Details
- Posted 9 years ago
- External URL: reddit.com/r/PHPhelp/com...