Some starting places might be:
* upvotes from someone who reads regularly but votes irregularly count more
* upvotes from IPs that have not clicked through count less
* using a collaborative filter on upvotes to guess which stories are more likely to appeal to different readers
* randomly putting a few threads or stories out of order for each user
* users who upvote comments early that go on to be upvoted by others later are rewarded with their own upvotes counting more (with diminishing returns, f''<0, or just a ceiling on the reward, like 10 upvotes)
* a graph-based version of the above, using a PageRank-style algorithm to calculate the helpfulness of users
* upvotes from people with more karma are worth more (again with f''<0)
* mess around with sub-thread weighting. I don't know how it works right now, but it seems like a good comment deep in a lower sub-thread is less likely to be seen than a mediocre comment sitting right below the +43 top comment.
* mess around with page-placement weighting. The very top is most likely to be seen and voted on; 3/4 of the way down is very unlikely to be seen, so a vote either way means more there.
* limit the number of upvotes each user gets. Could be per time, per story, per karma....
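To make the f''<0 idea concrete, here's a toy sketch (not HN's actual algorithm, and the cap and log shape are just assumptions for illustration): weight each upvote by a concave function of the voter's karma, so extra karma has diminishing returns, with a hard ceiling like the one suggested above.

```python
import math

def vote_weight(karma, cap=10.0):
    """Concave karma weighting (f'' < 0): log-shaped, capped at `cap`."""
    return min(cap, math.log1p(max(karma, 0)))

def comment_score(voter_karmas):
    """Sum the weighted upvotes; voter_karmas is a list of voter karma values."""
    return sum(vote_weight(k) for k in voter_karmas)
```

The key property is that going from 0 to 100 karma buys a voter far more weight than going from 100 to 200, so karma hoarding has limited payoff.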
I didn't use HN a year or two ago, but it seems to me that across such social news sites the following types of content are unjustifiably upvoted:
- confidence
- lists of books
- slams (mother###ker)
- references to high-IQ stuff
- certain lengths are preferred [must be 2-3 paragraphs long to get hugely upvoted; 2-3 sentences has a higher probability of getting just a few points]
If you do some more research, perhaps you could just decide what kinds of comments are "bad" (negativity, say) and use text mining / sentiment analysis to detect them and hold back their points.
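A real system would use a proper sentiment model, but even a crude keyword score shows the shape of the "hold back points" idea. This is a toy sketch; the word list, threshold, and penalty are all made-up values:

```python
# Hypothetical negativity filter: dampen points for comments whose
# fraction of "negative" words exceeds a threshold.
NEGATIVE_WORDS = {"stupid", "idiot", "worst", "garbage", "hate"}

def negativity(text):
    """Fraction of words in the text that appear in the negative list."""
    words = text.lower().split()
    if not words:
        return 0.0
    return sum(w.strip(".,!?") in NEGATIVE_WORDS for w in words) / len(words)

def adjusted_points(points, text, threshold=0.05, penalty=0.5):
    """Hold back half the points for comments above the negativity threshold."""
    return points * penalty if negativity(text) > threshold else points
```

The interesting design question is the same one raised below: someone has to commit to what goes in that word list.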
Using any of the "-" ideas would force HN's designers to commit to a view of what actually constitutes bad content, rather than relying on social engineering (the "*" ideas).