At this point I must say that Dark Shikari is generally correct (surprise?) about the benefits of Hyper-Threading (HT).
I ran a series of small trials using a 5-minute section from the start of Star Trek (2009). I first verified in the BIOS that HT was off, then re-encoded the test clip with BD-Rebuilder at its "good", "better", "high" and "highest" quality settings, all 1-pass. Then I enabled HT in the BIOS and ran the same batch again.
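For anyone who wants to reproduce the timing side of this without BD-Rebuilder, something like the rough harness below would do it. To be clear, this is just a sketch: BD-Rebuilder builds its own x264 command lines internally, and the preset names and file names here are stand-ins for its quality levels, not its actual settings.
Code:
# Rough timing harness (Python 3). Assumptions: x264 is on the PATH
# and clip.y4m is the test section. The presets below are stand-ins
# for BD-Rebuilder's quality levels, not its real command lines.
import subprocess
import time

for preset in ("veryfast", "medium", "slow", "slower"):
    start = time.time()
    subprocess.run(
        ["x264", "--preset", preset, "-o", "out.264", "clip.y4m"],
        check=True,
    )
    print("%-8s %6.0f s" % (preset, time.time() - start))
Run it once with HT off and once with HT on, and compare the timings.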
Results:
Code:
Duration   HT OFF    HT ON     HT Benefit
Good       0:02:26   0:02:47   -13%
Better     0:06:24   0:05:05   +26%
High       0:10:47   0:08:28   +27%
Highest    0:15:54   0:12:34   +27%

CPU load   HT OFF    HT ON
Good       100%      50%
Better     100%      100%
High       100%      100%
Highest    100%      100%

CPU temp   HT OFF    HT ON
Good       68°C      68°C
Better     68°C      75°C
High       68°C      75°C
Highest    68°C      75°C
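In case anyone wants to check my arithmetic: the "HT Benefit" column is the speedup relative to the HT ON time, i.e. (HT OFF time - HT ON time) / HT ON time. A quick sanity check in Python:
Code:
# How the "HT Benefit" column is computed: the time saved,
# expressed as a percentage of the HT ON time (i.e. a speedup).
def to_seconds(t):
    h, m, s = map(int, t.split(":"))
    return h * 3600 + m * 60 + s

def benefit(off, on):
    return 100.0 * (to_seconds(off) - to_seconds(on)) / to_seconds(on)

print("%+.0f%%" % benefit("0:02:26", "0:02:47"))  # -13% (good)
print("%+.0f%%" % benefit("0:06:24", "0:05:05"))  # +26% (better)
print("%+.0f%%" % benefit("0:10:47", "0:08:28"))  # +27% (high)
print("%+.0f%%" % benefit("0:15:54", "0:12:34"))  # +27% (highest)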
So in most cases I am seeing a definite benefit from HT, almost exactly as large as Dark Shikari said it would be.
But there is an actual slowdown when I encode at the "good" quality setting, and the same thing happens when I run the IntelBurnTest "torture" test: core loading drops by half and core temps fall. Maybe it is just MY processor? But there is a lot of discussion about HT on the net, with some people saying their games/apps seem to run slower with HT enabled, and others disagreeing. That sort of Jekyll/Hyde behaviour is in line with what I found above: HT helps sometimes and hinders at other times, and the 50% CPU load at "good" suggests that in that case the encode simply isn't keeping the extra logical cores busy.
Anyone else out there with an i7 who could confirm this on their machine?
And that leaves me wondering what to do. I'm perfectly happy with "good" quality encodes, and x264 encoding is, by far, the most intensive thing that I personally do on this computer. So I'm leaning towards just leaving HT off and enjoying faster encodes and lower temps. Who cares if the kids' games run a little slower (I'm pretty sure they aren't reading this forum)?
At least I know what to do if I want to start running higher quality encodes in the future.