I get a lot of conflicting advice about matching a power amp to speakers, so I'd like to seek some clarification.
If an amplifier comfortably puts out 40-50% more power than a speaker's peak power rating, isn't the risk of the amplifier clipping on high-crest-factor source material played at loud volume a moot point? Wouldn't the speakers be likely to fry at (or even before) any power demand that could push the amplifier into clipping?
I have speakers with a 100 W peak power rating and a sensitivity of 90 dB. I would like to power them with monoblock amps that will do 150 W comfortably with very low harmonic distortion and roughly 170 W max before clipping. I'm switching to these monoblocks from an amp that starts clipping somewhere between 220 W and 250 W. All other things being equal (that part is important), will I actually experience any difference in dynamics, such as a reduced crest factor and a consequently increased risk of speaker damage, by making the move?
I tend to think the additional headroom provided by the larger power amp in the scenario above isn't going to add anything, given that in order to realise a larger crest factor at higher listening levels I'd effectively have to damage the speakers.
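For a rough sense of scale, here's a back-of-envelope sketch in Python (assuming the 90 dB sensitivity is per 1 W at 1 m into 8 ohms; the variable names are just my illustration of the figures above):

```python
import math

def db_gain(p_high, p_low):
    """Express a power ratio in decibels."""
    return 10 * math.log10(p_high / p_low)

sensitivity = 90      # dB SPL @ 1 W / 1 m (assumed reference)
speaker_peak = 100    # W, speaker peak rating
new_amp_clip = 170    # W, approximate clipping point of the monoblocks
old_amp_clip = 220    # W, lower bound of the old amp's clipping range

# SPL one speaker would produce at its rated peak power, measured at 1 m
spl_at_peak = sensitivity + db_gain(speaker_peak, 1)
print(f"SPL at speaker peak ({speaker_peak} W): {spl_at_peak:.1f} dB")                # ~110 dB

# Headroom each amp offers beyond the speaker's rated peak
print(f"Monoblocks over speaker peak: {db_gain(new_amp_clip, speaker_peak):.2f} dB")  # ~2.3 dB
print(f"Old amp over speaker peak:    {db_gain(old_amp_clip, speaker_peak):.2f} dB")  # ~3.4 dB

# Extra headroom the old amp has over the new monoblocks at clipping onset
print(f"Old amp vs monoblocks: {db_gain(old_amp_clip, new_amp_clip):.2f} dB")         # ~1.1 dB
```

On those assumptions the old amp only buys about 1 dB of extra headroom over the monoblocks, and both amps clip well above the speakers' rated peak, which is really the crux of my question.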
EDIT: Everything into 8 ohms.
EDIT 2: I suppose my original question could be phrased more simply: in a scenario like the one I've outlined, assuming all figures are roughly accurate, are the speakers or the amplifier the limiting factor in the system, and would that change with a more powerful amplifier?