Getting a handle on Value For Money

After reading this post on the TR forums, I figured out a great way to calculate how much value for money some of my possessions actually deliver. Cost alone is not a true indicator of value.

So after fiddling a bit with Excel, I came up with this:

What totally confirms my theory is that the stuff I always felt was really worth it really was worth it (it got good cost/day numbers), and the stuff I felt ‘meh’ about really did score ‘meh’. I fuzzed the ‘days’ figure a bit for some items, because I owned them for a long time but didn’t use them every day.

For example, I always did feel that the Seiko watch and the 1TB external hard drive were worth it, and likewise the Samsung F2380 monitor, which I used all the time. The only exceptions are the Macbook Air (which I use all the time, but which still scores badly because it was so expensive) and the PowerMac G5 (which scores badly because I used it for only a few months, though I really enjoyed my time with it). The Radeon HD 6870 should really have a better score, because I mined quite a few altcoins with it. Oh well, cost/day is not the perfect metric.
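For the curious, the whole spreadsheet boils down to a single division: purchase price over days of actual use. Here is a minimal sketch of that calculation in Python – the prices and day counts below are made-up placeholders, not my real figures:

# cost/day = purchase price / days of actual use
# (all numbers here are hypothetical placeholders)
possessions = {
    "Seiko watch": (150.0, 2000),            # (price in EUR, days of use)
    "1TB external hard drive": (80.0, 900),
    "Macbook Air": (1200.0, 400),
}

for item, (price, days) in sorted(possessions.items(),
                                  key=lambda kv: kv[1][0] / kv[1][1]):
    print("%-25s %5.2f EUR/day" % (item, price / days))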

Going on to non-technology related possessions, army surplus was always going to be a good deal, the exception being the ILBE because I had it shipped from the US, and I had to pay customs. Why so much trouble for it? Because it was a good pack – I put my entire load in it and walked all around Potsdam yesterday with it. Besides being bigger than the ALICE pack, it’s also easier to get things in and out of it (because there’s a side pocket and no big flap to deal with, and I can open the lid while wearing the pack), easier to don and doff, and less uncomfortable after a long walk.

This Excel table really helped put things into perspective. For instance, all my things cost me at most one euro per day, while food and lodging cost at least ten times that. So the reality is that although possessions seem expensive, what’s really expensive in the long run is living (because we all want to live).

With my newfound job I was thinking of buying new things, but now that I’ve seen this, I’m going to keep my PC and HTC One S a while longer just to extract maximum value out of them.

So now that we’ve seen all this, what is an expensive possession? It looks like an expensive possession is something that costs more than 1 EUR/day, assuming you keep it for a reasonable amount of time.

How to install exfat-fuse on 10.5.8 PPC

Just a few notes for the future.

MacPorts installs a lot of dependencies, and on top of that it cannot compile osxfuse (and you can’t fall back on another port anymore), because osxfuse wants Xcode 3.2, which doesn’t run on Leopard (only Snow Leopard). Use the binary provided on the official OSXFUSE website instead.

Use Tigerbrew to install Python 2.7 and SCons. Just to install SCons, MacPorts would install a lot of different crap, but at least it knows that Python 2.7 is required to get SCons working properly. SCons 2.3.4 is supposed to run on Python 2.5.1 according to the documentation, but if you try, you will end up with the error below. Apparently the ‘as’ form of catching exceptions only exists since Python 2.6:
Import failed. Unable to find SCons files in:
Traceback (most recent call last):
File "/usr/local/bin/scons", line 190, in <module>
import SCons.Script
File "/usr/local/lib/scons-2.3.4/SCons/Script/", line 76, in <module>
import SCons.Environment
File "/usr/local/lib/scons-2.3.4/SCons/", line 56, in <module>
import SCons.SConf
File "/usr/local/lib/scons-2.3.4/SCons/", line 199
except TypeError as e:
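For reference, the construct that trips up Python 2.5 is nothing exotic – it’s just the newer exception syntax. A trivial sketch (not code from SCons itself):

# SCons 2.3.4 catches exceptions with the Python 2.6+ syntax:
try:
    raise TypeError("boom")
except TypeError as e:   # Python 2.5 only accepts the old "except TypeError, e:" form
    print(e)             # prints: boom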

Once you have Python 2.7 and SCons, you can follow the build instructions on the exfat-fuse website. And it works.

Killed a Radeon HD 7950 in 2 days

I bought a Radeon HD 6870 and a 7950 to help mine altcoins, thinking they would pay for themselves within two months.

No excitement here – these are slaves, workers, brought in to do the work my trusty olde GTX470 can’t do by itself. As such, I only took pictures of the Radeon HD 6870, not the 7950. Well… it’s dead now. But I’m getting ahead of myself.

The first thing I noticed about the 7950 was that either the cooler (Sapphire Dual-X) sucked, or the chip puts out way more heat than my GTX470. I ran a bit of Metro 2033 on the thing just to satisfy my friend; unfortunately it didn’t feel noticeably smoother than on my GTX470. I also ran Crysis 2 in DX11. The added oomph from the 7950 wasn’t enough to make Crysis 2 as smooth as it is in DX9 on my GTX470. Overall, gaming-wise, I hadn’t gone anywhere. Not that I needed extra performance in games – I hardly play games anymore.

Mining, though, was a good upgrade over my overclocked GTX470. The Radeon HD 6870 didn’t want to work alongside the integrated Radeon HD 4250 in my second computer, but once I told it it was the only card for me, it set to work making LTC/FTC/CNC at 300 kH/s (at high intensity). The Radeon HD 7950 managed the same rate at low intensity settings in GUIminer-scrypt (preset: 7950 low usage), but at high intensity it could pump out 450 kH/s.

This made the desktop incredibly laggy: even with the 7950 driving only one monitor, both monitors would lag. It seems to be a Windows issue – no wonder so many headless mining rigs run Linux. So I kept all the monitors on my GTX470 instead and enjoyed smooth desktop operation while the 7950 cranked away.

All in all, 450 kH/s + 300 kH/s + 150 kH/s is not an impressive showing. When I bought the two cards, I had banked on the 6870 producing 350 kH/s and the 7950 ~650 kH/s. Turns out those were peak figures achieved by people with watercooling loops.
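For what it’s worth, the “pay off in two months” idea came from back-of-the-envelope math along these lines. A hedged Python sketch – the difficulty, coin price, and hardware cost below are placeholders rather than the actual figures from back then, and the coins-per-day formula is the usual proof-of-work estimate (hashrate × seconds ÷ (difficulty × 2^32) × block reward):

# break-even estimate for scrypt mining (all inputs are hypothetical placeholders)
hashrate_hs   = 750e3      # 450 kH/s (7950) + 300 kH/s (6870), in hashes/second
difficulty    = 250.0      # placeholder network difficulty
block_reward  = 50.0       # coins per block
coin_price    = 2.5        # EUR per coin, placeholder
hardware_cost = 450.0      # EUR for both cards, placeholder

coins_per_day = hashrate_hs * 86400 / (difficulty * 2**32) * block_reward
eur_per_day   = coins_per_day * coin_price
print("%.2f coins/day, %.2f EUR/day" % (coins_per_day, eur_per_day))
print("break-even after ~%.0f days (electricity ignored)" % (hardware_cost / eur_per_day))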

Then I found that PowerTune was throttling the 7950 down to 64% every now and then. WTF? When I buy a card with a custom cooler, I expect it to be able to run at stock clocks without throttling, no matter the workload! So I raised the power limit and overclocked it, only to find that the overclock made PowerTune throttle the GPU down to 64%… again. I figured 100% of 925 MHz was better than an intermittent 64% of 1100 MHz (which works out to barely 704 MHz), so I left everything at stock clocks but kept PowerTune at +20%. OK, it’s now at 100% and not throttling – but I’m only getting 560 kH/s, max. The Litecoin wiki said I’d get 600! (I spent some time with thread-concurrency between 21712 and 24000; nothing got me up to 600 on stock clocks.)

Nevertheless, I had other concerns. It was getting really fucking hot – the GPU was reaching 83°C – and loud. Freaking loud. The 6870 was working hard, and I couldn’t hear it over the background noise coming in from the open window. The 7950 made my PC sound like a blade server, and the sheer heat scared me enough to keep the case open, along with the window to my room. With the case closed, the heat from the 7950 made my Phenom II X6 just as hot as if it had been working at 100% too. I decided to leave the 7950 alone at 560 kH/s and not overclock it.

I put up with this racket for two days, helped immensely by a pair of green foam earplugs and copious amounts of cold tap water on my body. In the end, I decided to give the computer a rest – I was hearing a rattling fan somewhere in there. Killed the miners (I got one block of BTB! yay) and let it idle at the desktop for a while. Then I shut the system down.

Ah, some peace and quiet. The Corsair Graphite 600T is making cracking noises as it sheds the heat; damn, that was some workout. The CPU’s heatsink is hot, despite the CPU having idled the whole time. The heatplate on the 7950 burnt my finger. The GTX470 is doing just fine despite sitting right below the 7950 and running cudaminer (I don’t know how, but Gelid’s Icy Vision custom heatsink is an incredibly good performer). Really, such peace and quiet. XFX’s 6870 is working away reliably in the other computer, making a loud whooshing sound but nothing really grating. OK, it’s time to get back to work.

I press the button; my LED fans flash on for an instant and quickly die. I smell something. Fuck. Was that my mainboard? Was that… anything? I press again and again, but the computer refuses to turn on – it seems the PSU’s short-circuit protection is doing its job. I pull out the Radeon HD 7950 and the computer boots.

God damn. So much for high quality Black Diamond chokes. So much for the Dual-X cooler. So much for Powertune. So much for mining!

Lessons learned:
1. Graphics cards should be seen and not heard.
2. Slow, steady and silent is actually preferable to fast, hot and raucous (does this mean I should mine LTC instead of the more profitable FTC/CNC?)
3. Sapphire’s Dual-X cooler isn’t all that. Shitty hardware review sites say the cooler keeps the GPU at 63°C while staying silent, without mentioning that this is only because PowerTune is quietly throttling it in the background. Although Sapphire’s Dual-X heatsink is “custom”, the aftermarket Gelid Icy Vision dissipates the same amount of heat silently, without any fuss. I’m sure my GTX470 can put out more heat.
4. Most importantly: mining is for suckers. Buying hardware just for mining is for real suckers.

The 7950 is going back for a refund. It wasn’t perceptibly faster than my GTX470 anyway, despite what Anandtech bench may say.

Will reducing capture resolution reduce chroma noise from my smartphone’s camera?

So I was hoping that maybe, just maybe, telling my HTC One S to capture at 4MP or 2MP would cause it to downsample the image and reduce chroma noise. It doesn’t. Chroma noise is just as strong as ever, and I’ve got the images to prove it. The images are too large, and I’m too lazy to fiddle with HTML to make them fit, so here’s a lazy directory link.
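For the record, this is what I was hoping the phone would do internally: true downsampling averages neighbouring pixels, which averages chroma noise away with them. A quick sketch of the idea on simulated data (not my actual photos):

import numpy as np

# Simulate a flat grey patch with chroma-like noise on 3 channels
rng = np.random.default_rng(0)
patch = 128 + rng.normal(0, 8, size=(1000, 1000, 3))

# "Real" downsampling: average non-overlapping 2x2 blocks -> half the resolution
down = patch.reshape(500, 2, 500, 2, 3).mean(axis=(1, 3))

print("noise std at full resolution:", patch.std(axis=(0, 1)).round(2))
print("noise std after 2x2 averaging:", down.std(axis=(0, 1)).round(2))  # roughly halved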

Adaptec AHA-2940U on Windows 7 64bit

Just a reminder to myself of how to set this up, in case I need to do it again in the future (I hope not). I needed this to work with a Nikon LS-2000 scanner.

Firstly, download the driver package (yanked from a Windows Server 2008 64-bit installation) and unzip it somewhere. Now, Add New Hardware may or may not work (in my experience, it didn’t). What I do remember doing to get it to work is navigating to the “Have Disk” button: Update Driver -> Browse my computer for driver software -> Let me pick from a list -> Have Disk. Even if you point it at the folder directly, it may not automatically “find” the driver despite the .inf and .sys files being right there, but going through “Have Disk” works.

It isn’t digitally signed. But this is what it should look like:


This is where I heard of the “have disk” way:

Don’t feel bad Phil! I had a little trouble getting it to work too. I then left the folder on my D: drive and went through the procedure as if I were installing it as a new driver. When it wants to browse or have you tell it where the folder is located, select that you will choose from the Adaptec selections available in Windows 7. Then select “I have a disk” and direct it to drive D: or wherever you have the folder stored. It then installed the driver okay and I am running just fine. Good Luck and thanks to the person who found this fix.

More information can be found in the same thread. Especially useful is this:


There’s a lot of discussion here to this dilemma concerning Windows 7 and a driver for the Adaptec 2940 and I wanted to throw my two cents in just in case someone else finds this in a search for this problem.

Here’s my setup: Windows 7 Pro x64, Adaptec 2940au SCSI to control an Epson scanner GT-10000+. Just upgraded from XP x32 but needed Win 7 x64 for some back up issues I was having with my Home Server 2011. (long story, but all resolved now)

So anyway, I could never get the Adaptec SCSI driver to install. I started with “Adaptec_Aic_78xx” driver from the Adaptec site that I placed in the “Driver Store” which installed in the “Device Manager” but wouldn’t work (code 10). I had also found “Adaptec_7870_and_2906_and_2940_driver_for_Windows_7_-_64_Bit” which I tried to install with the “Update Driver” in the “Device Manager” with no results.

Then I saw something someone said about using the “Have Disk” in the “Update Driver” in the “Device Manager”. I used it and pointed it into my file that I had put into the “Driver Store”: “Adaptec_7870_and_2906_and_2940_driver_for_Windows_7_-_64_Bit” and bingo it started loading! It gave me the pop up window about an unsigned driver, which I clicked through and finished the installation. It was recognized in my Hamrick’s VueScan too and scanned beautifully!

So, there was how I got mine going. I hope it works for others trying to get their Adaptec SCSI card going.

Current Philosophy of Female Attraction (6 April 2013)

The following philosophy is the culmination of around 2–3 years of observing and thinking. In the years before that I had had hints, of course, but I never took them for the lessons they were.

Beauty is to women as success is to men. Success, however, is harder to quantify. Therefore women look, above all else, for indicators of success. If a tree falls in the jungle and no one is there to hear it, can it be said that the tree has fallen? In the case of women and attraction, the answer is a definite no. Such is the importance of indicators of success.

Example: when I was in primary school, I once performed piano at assembly. Wow, did I get popular with the girls fast. A girl told me she loved me. I said “ok”. I had no idea what that meant. When I transferred to a different primary school, however, I was back at square one.

Young girls are easily impressed by good clothes. As they grow older, they begin to look at personality and confidence. A lot of the time they confuse arrogance/douchebaggery with confidence. Eventually, it seems, they settle on confidence as a good yardstick. And indeed it is a good indicator of success, for to be genuinely confident, one must be successful.

But since women look for indicators of success rather than success itself, you can find many men who are successful but can’t seem to get anywhere with women. To be attractive to women is to display that you are successful. That is why, to the chagrin of women everywhere, pickup artistry works. A hot body is one way to do it, and so are good clothes. Good manners suggest an upper-class upbringing. Fame definitely helps, because fame is good friends with success. On the other hand, if you’re Caucasian, you can be a total loser in your home country but pull chicks in poor countries. It depends more on what she perceives than on what actually is. This is why women pay so much attention to their outer appearance: they learned this at a young age – what matters is not whether you are beautiful, but whether others see you as beautiful.

Because what generally attracts men is blindingly obvious, women mature faster as they learn the ways of the world: what matters, what is and what isn’t, and exactly how important appearances are. They move from reading outer appearances (clothes) to reading behaviour in order to judge a man’s worth and position in the world (social skills). All of this is driven by their desire for us.

Once you have attraction, then you can talk about love.

Thought 1

To desire only a specific woman from the very beginning is to be needy; yet once she is in love with you, she appreciates this very much. Hence do humans silently acknowledge that for the most part, we are all easily replaceable.

Striving for Simplicity

I think I must credit my Macbook Air for making me re-appreciate simplicity. It wasn’t so much the machine itself as the fact that my home directory started out completely empty, and since I didn’t want to use up the SSD’s write cycles, I thought twice every time I copied something over to it. So far I’ve used only 52 GB of the 112 GB available.

And I found that I was much more productive. Before downloading anything, I thought twice. No untagged music downloads to distract me. No unwatched videos, no new shit I have to try out. No virtual machines to update, maintain, or whose purpose I have to remember (on that note, I’m switching from Linux to FreeBSD). Just a pure computing environment. All of a sudden I got more work done on my Air than I had done on my main computer in months. Sometimes I deliberately use my Air instead of my main computer, knowing that I won’t be distracted.

Then I started to see simplicity elsewhere, in OS X, in the design of the Air, in the way things just worked, and I desperately wanted to recreate this feeling of easy simplicity everywhere else in my life. No more thinking “in Arch Linux it was like this… or was that Debian, or Gentoo?”, or “which VM should I boot to fulfill this purpose?”.

Once upon a time the web was simple. There was HTML+CSS. Then there was JavaScript. Then people started using PHP and other shit combined with databases to write HTML on the fly. And nowadays, apparently, there are frameworks that write JavaScript for you. What the fuck? Web development has become way too complex for its own good.

This must be what digital photographers feel like when they start shooting film. You think more carefully, and you make less, need less, and in the end your life is improved because of it.

Design is related to simplicity. In fact, it’s the child of simplicity. And the thing about the world is, there isn’t enough simplicity in it. Many cars are dead ugly… the Nissan GT-R especially. Lamborghini and Ferrari make good looking cars, with their simple, meaningful lines and instantly recognizable shapes. In life, properly designed products command a price premium. I once saw a really cool nail clipper – unfortunately it probably sacrifices durability for design. Still, I did contemplate buying it!

In computing, they speak of the joy of programming. Why, then, do I feel so bewildered when I’m trying to code some program for Windows or with Qt? That was not enjoyable at all. The joy is only to be found when I’m peeking and poking at a very low level of a system. Coding in assembly brings me joy – not that I do it often enough, because my needs don’t extend beyond file-management automation. Integrated development environments scare me, and really, who has the time to learn what they do behind the scenes when all you want is to get your program up and running? In all of computing, only Apple and the guys who designed Unix really get it. Why are Apple’s accessories so expensive? Because when you look at the Mac, you think: oh, it looks so beautiful – if only this thing I have connected to it were as well designed! And so you spend extra so that your well-designed world grows a little larger than just the Mac. An iPhone, perhaps. Or maybe those Bose speakers. Apple didn’t start out with design, however. It was only with the Macintosh that Jobs actually seemed to make the focus “designed computing”.

I think I’ve found what’s important in this world.

Why Macs are a great development platform

Developing on Windows isn’t that great. If you want to code something quick and dirty, or make something that actually does something rather than waste tons of code on GUI fluff, that means forgoing the GUI, and that means dealing with the horrible cmd.exe. I don’t know anything about reading exit status codes in Windows, nor will I have the slightest urge to learn, because I can’t resize cmd.exe. That’s reason enough. Cygwin? A hack, and I hate typing /cygdrive/c to change drives. Plus, pathnames in Windows are just so long. I just don’t get the feeling Windows is a good environment to code in.

Linux. Ah, Linux. I used to use it all the time. Then I found out that instead of using it, what I was actually doing was maintaining it. Just like in Windows, there’s always something to be done: USE flags to be winnowed out (and the whole system recompiled), manpages to be read (there’s a whole lot of that), X won’t start today (there’s a lot of that too) so I spend the whole day troubleshooting it, getting desktop compositing working, messing with gtk-engines to make my desktop environment look the same across programs that use different GUI toolkits (I seriously hate that; even Windows got it right), reading through urxvt’s manpage to make heads or tails of all the crap it expects you to dump into the .Xdefaults file (because the default appearance just sucks), choosing this WM or that DE or that particular terminal, configuring/compiling the kernel, and when packages get updated… great. Especially on Arch Linux. Every now and then they make some drastic change that makes pacman spit out some stupid error, and then I have to read the News and follow the instructions “very carefully or you could hose your entire system!!!”

You get my point.

OS X is different, though. My Macbook Air is a miracle sent down from heaven – if I had gotten a Thinkpad X230/X1 Carbon, that would also have been a miracle, albeit a smaller one. The small SSD makes me go light on distractions like music and videos. You can do everything with just a few touchpad gestures. I can’t live without virtual desktops and TotalTerminal. Sublime Text 2 not only looks like it’s part of OS X, it does so out of the box. I know Sublime Text is available for Linux, but there it would be the only good-looking program of the whole bunch, and that just sticks out like a sore thumb. Windows doesn’t have an aesthetic anyway, so it would stick out there too; besides, on Windows I just keep thinking of Notepad++. The whole laptop is silent and just works. JUST WORKS. I cannot emphasize how important this is. There’s always something that isn’t working in Linux. In Windows things do work, but you have to work a bit to get them working – say, running Lenovo’s crappy homemade program just so you can switch from integrated to discrete GPU. On OS X, it’s all… invisible. Magic!

It’s magic! It is well worth the money spent on it – it’s a great tool that will serve me for a long time to come. Sturdily built, no software fusses, beautifully designed, feature rich, completely silent… I’m in love.

Oh wait. This was supposed to be about development.

Yes. So Linux is great because everything’s comprehensible. You have Makefiles, and you have the configure script, and you have gcc which does its business and calls ld after that. Simple enough. And you can have all that, with the magic of OS X! That’s the beauty of OS X – it’s like Linux, only you don’t have to work so hard to get it to work/be just the way you like it. The familiarity/maturity of the Linux development environment (why not Unix? because most Unices that I’ve worked with don’t even come with a shell that supports tab-completion by default, and have a huge space wasting ugly console font… see AIX/Solaris), and the “just works”ness of OS X. It’s a winning combination.

A few thoughts on AMD’s Piledriver/Vishera

It’s obvious from HardOCP’s Bulldozer -> Piledriver IPC investigation that Piledriver is faster than Bulldozer largely because of added frequency headroom made possible by higher power efficiency. As Kanter pointed out from the very beginning, Bulldozer is a speed demon: it needs clocks to get anywhere near Intel.

Overclocking attempts haven’t been very fruitful though. Now is still not the time to buy an AMD FX processor. Over time I’m sure there will be a bit more headroom as they tweak the process, and FX processors will be able to clock higher out of the box. If we get supremely lucky, the next generation (Steamroller) might still be on AM3+!