Freelancing vs. Working

Do I want to freelance, or do I want to work a job?

I’m not sure. But I strongly suspect that freelancing is where it’s at. Why? Increased responsibility: for your books, for your brand image, for your paycheck. Most people shy away from responsibility. But whenever something had to be done on the computer, I always chose to do everything myself. The end result? I learned, grew, and simply became better.

Working at a company was fun. It was like school, except that people were anxious to perform, because otherwise they’d get expelled. On a slow day, there was always office politics to give you a chuckle. And it was the right kind of company. But after a while, my mind grew fat and complacent. There was no incentive for going above and beyond – and when I did, it went unused.

I remember when I first came to Berlin. I camped in the woods on the weekends to save on hostel fees. I had a map full of bookmarks. I attended many events to network and find job opportunities. I talked with everyone, everywhere. I even went dumpster diving with hippies (not doing that again though).

Now that I’ve started freelancing, it’s all coming back. The sweat in my palms, the pressure to get out and perform, the responsibilities that weigh upon me… I feel truly alive.

So far I’ve been doing work for contacts, but it certainly isn’t easy finding new ones. I just made an account on Upwork, but it seems that all the elite go to Toptal. Looking at their blog, you can see why. It’s one of the two companies that actually have a corporate blog with valuable content – the other one is Digital Ocean. Writing a blog post to get priority access is obviously a cover letter with integrated free promotion, but I have no qualms, because it looks like the people there are seriously smart.

They say that Toptal only accepts 3% of applicants. Looking at the GitHub accounts of some of their freelancers, it doesn’t look like I have a good chance – then again, they’re all 30+. To maximize your chances of becoming an elite, you have to join the elite… so I’m keeping my fingers crossed.

From Windows 7 to Arch Linux

So you’ve gotten quite comfortable with the home you’ve been living in for the past five years. But your life has changed so much that it doesn’t quite fit anymore, even if it keeps you warm just as well as it did before. So many files here and there from when you did this or that; you don’t even remember what they contain anymore, and you don’t have the time to go through every one of them. It’s time to rip everything out and start all over again.

So with some excitement, and maybe just a little wistfulness, you fire up ‘parted’, type ‘mklabel gpt’, and wipe your partition table clean.
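From a live environment, that one irreversible step boils down to something like this (the device name is a placeholder; double-check it, since this destroys everything on the disk):

```shell
# WARNING: wipes the disk. /dev/sdX is a placeholder for the target drive.
parted --script /dev/sdX mklabel gpt
# verify the new, empty GPT label:
parted --script /dev/sdX print
```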

This wasn’t because Linux had gotten better – oh no. In fact it’s quite the same as it was years ago, when everybody was declaring every year to be the year of the Linux desktop – flaky drivers here and there, small niggling things that require quite a bit of research to get working right, and always the possibility that a package update might break something.

But what’s the alternative? Windows 10 has improvements, but you have to flip half a dozen switches (OK, more) to get it to stop talking to Microsoft, and even then, it seems like it still does. I grew up on Windows, but it has gotten so complicated that I don’t know what’s what anymore, and actually I don’t think anyone at Microsoft knows either. How could they? The whole thing has gotten out of control: the Windows directory is at least 20GB with all the patches, you need an SSD to get it to run quickly, and putting a swanky new interface on it that isn’t even applied consistently everywhere just makes it a joke now that I’m using OS X daily.

Then there’s AVG Antivirus, which loves to remind me that I haven’t bought their product. But I don’t have a choice, because any antivirus that’s worth a shit is commercial. Wait, do I still need antivirus? What about spyware, rootkits?

Then there’s the little niggling problem that my OS X workflow doesn’t translate well to Windows, not without cygwin. See, I work a lot on the console. Cygwin on Windows is an ugly hack; Linux in a VM on Windows is better, but I still have to remember to fire it up, then check its IP to ssh into it. And I can’t use Windows programs to edit files in the VM without setting up a lot of extra plumbing.

Then there’s the fact that I’ve been keeping my pictures/music collection on a ZFS pool for the past two years to prevent bit rot, and I would like not to have to fire up a VM and wait for Samba to announce its NetBIOS hostname when I want to listen to my music anymore.

Gentoo was an obvious first choice. My first foray into Linux was with Gentoo because they had the best documentation. I was familiar with how things worked, I wanted to keep track of USE flags so as not to pull in unneeded dependencies, and it was sure to teach me a thing or two even today. Plus compiling things would make great use of my Phenom II X6 (though apparently a Haswell Core i3 can approach its level these days). While waiting for stuff to compile, I wanted to listen to some music.

I chose an EFI+GPT installation with OpenRC, but apparently the minimal install ISO doesn’t boot in EFI mode, and you can’t properly install GRUB2 in EFI mode without already being in EFI mode, because you need efibootmgr to edit the EFI variables, which aren’t accessible in BIOS mode. For some reason the Gentoo Handbook, which normally mentions small things like this that make or break the user experience, omitted this little detail. Fortunately all I had to do was temporarily rename the GRUB2 executable to EFI\boot\Shellx64.efi, which my M5A99X EVO always looks for, and I could boot into the system, whereupon I promptly installed GRUB2 properly.
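The workaround might look roughly like this; the ESP mount point and the GRUB paths are assumptions on my part, and on Gentoo of that era the installer binary may be called grub2-install rather than grub-install:

```shell
# Assuming the ESP is mounted at /boot/efi, and the firmware falls back to
# booting EFI\boot\Shellx64.efi when no boot entry exists (as this board does):
cp /boot/efi/EFI/gentoo/grubx64.efi /boot/efi/EFI/boot/Shellx64.efi

# Once booted in EFI mode, install GRUB properly (this uses efibootmgr
# under the hood to register the boot entry in the EFI variables):
grub-install --target=x86_64-efi --efi-directory=/boot/efi --bootloader-id=gentoo
```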

I pored over every config option in the kernel sources, compiling a kernel tailored to my use case and nothing else, with drivers for my hardware alone. I added discard to my /etc/fstab, -march=barcelona to /etc/portage/make.conf, and the zfs packages to /etc/portage/package.accept_keywords. I edited the USE flags and kept them in alphabetical order.
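For the record, those tweaks look something like the fragments below. The discard flag, -march=barcelona, and the zfs keywording come from my setup; the device name, filesystem, and remaining flags are illustrative:

```shell
# /etc/fstab -- enable TRIM on the SSD root (device and fs type are placeholders)
/dev/sda2  /  ext4  defaults,noatime,discard  0 1

# /etc/portage/make.conf
CFLAGS="-march=barcelona -O2 -pipe"
CXXFLAGS="${CFLAGS}"

# /etc/portage/package.accept_keywords
sys-fs/zfs ~amd64
sys-fs/zfs-kmod ~amd64
```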

Then I tried to emerge mate.

I had to add all these extra USE flags just to get it to start. It wasn’t my meticulously pruned system anymore – no, it was now a mess. I was forced to accept the reality that GUIs are extremely complex and I’m just not willing to deal with all the little details. Basically, I had to accept that I wasn’t going to have any control over what went into my system anymore.

Once I had gotten over myself and added the USE flags to make.conf, the emerge failed halfway because libgnome-keyring wasn’t pulled in when it should have been. I found a one-year-old post about this exact issue on the Gentoo forums. One year old, and it still hadn’t been fixed. Report a bug? Get real, I’m still trying to get my desktop up and running here.

Once everything was up and running, I fired up Firefox and went on Youtube to celebrate. But the audio was skipping every second or two. Ugh. I listen to music on my computer all the time, and broken audio was another thorn in my side on top of all the problems I already had – which are, admittedly, characteristically Linux. I Googled, but nobody had anything concrete. The closest I came was some guy on a forum advocating passing ‘snoop=0’ to the snd_hda_intel module when loading it. I tried it, and it just made the sound not work at all.

Around this time I started getting issues with the USB ports failing to assign an address to plugged-in USB devices. This did not happen on the ASMedia XHCI controller, only on the ports driven by the built-in AMD EHCI controller. Weird, but then the GUI wasn’t a priority yet.

Then the Arch Linux wiki, which had a larger Troubleshooting section than Gentoo’s with slightly more relevant entries, linked to the kernel documentation. Hmm. I have a Realtek ALC892 codec, an Intel HD Audio (Azalia) compatible sound system, and an Asus motherboard, so I passed model=asus-mode1 to snd_hda_intel and it worked!

It worked! I was breathing fresh air again – obviously this sort of problem shouldn’t have occurred in the first place, but that’s what Linux does – it forces you to learn the nitty gritty. I chalked it up to Linux and listened to some Carpenters to soothe my growing annoyance.

At this point I realized it was taking too long for me to get anywhere. Gentoo, which was supposed to be lean and mean through CFLAGS and judiciously applied USE flags, was not lean and mean anymore: I had to pull in lots of dependencies for the GUI, and I was probably going to pull in a lot more each time I installed graphical software, because after all, this is my main desktop, and I’m not going to type at the tty on it all the time. Plus, Arch Linux had a much better wiki. I found myself on it all the time when I had a problem, even though I was running Gentoo.

So I figured I’d wipe it and install Arch Linux. My patience worn thin, I didn’t bother with the EFI+GPT option this time and went straight for BIOS+MBR. I installed the base system; everything took less than 20 minutes because I didn’t have to compile anything. Then I booted into the new system without a hitch. Thank god, something finally working out of the box.
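For reference, a BIOS+MBR Arch base install is roughly the sequence below. Device names are placeholders, and the exact package set varies by release, so treat it as a sketch rather than a transcript of what I typed:

```shell
# partition, format, and mount (destructive; /dev/sda is a placeholder)
parted --script /dev/sda mklabel msdos mkpart primary ext4 1MiB 100%
mkfs.ext4 /dev/sda1
mount /dev/sda1 /mnt

# install the base system and generate an fstab
pacstrap /mnt base linux linux-firmware grub
genfstab -U /mnt >> /mnt/etc/fstab

# install GRUB to the MBR and generate its config
arch-chroot /mnt grub-install --target=i386-pc /dev/sda
arch-chroot /mnt grub-mkconfig -o /boot/grub/grub.cfg
```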

Within another hour I had my MATE desktop and Firefox back, a point which took me several hours to reach in Gentoo and a lot of messing with USE flags and wondering if I really wanted this in my system or not (now: who cares). But the USB problem still occurred intermittently, I couldn’t use my mouse, and the audio was skipping again. No problem, I passed ‘model=asus-mode1’ to the snd_hda_intel module. But it didn’t work.

This is Linux all over again, I sighed. What did I expect. Why didn’t I just install Windows 10. My entire weekend is gone, my eyesight probably slipping because I have to spend more time in front of the screen in addition to the time I already do at work… now that I’m older, I really don’t have patience for this shit anymore.

And on top of that, my Realtek 8111E Ethernet just decided not to send any packets, despite bringing the physical link up. And I knew it wasn’t my Linksys WRT54GL, because that thing is solid as a rock – I trust it so completely because it has earned it. So I was stuck without the internet on my main computer, unable to install new packages.

On my Macbook, I decided to read the other documentation pertaining to the HD Audio module. On this page, I found something interesting: position_fix. I had no idea what an LPIB register was, and I wasn’t about to spend time figuring it out. I put in ‘position_fix=1’ and YES! My sound worked again!
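To make a module option like that survive reboots, it goes into a modprobe config file; the filename here is arbitrary, and the values are the ones from this saga:

```shell
# /etc/modprobe.d/snd_hda_intel.conf
options snd_hda_intel position_fix=1
# the option that worked on Gentoo but not on Arch, kept for reference:
# options snd_hda_intel model=asus-mode1
```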

The networking and USB detection problems, however, were more difficult, because there was nothing I could change, no options to tweak. I kept rebooting (thankfully Arch Linux boots in less than 10 seconds), hoping each time that the network would at least work again. It never did. I read a blogpost claiming that the USB overcurrent protection might be causing it, and that if I just unplugged the computer for some time it would be right as rain. That didn’t work either, even after leaving it unplugged for a good 5 hours.

Finally in sheer frustration I reset the CMOS.

And the next time I booted, everything worked perfectly again! The network was up, the mouse was working… oh yes! Now I recalled old Mac zealots recommending ‘resetting the NVRAM’ to head off weird problems, and similar exhortations from the SGI and Sun SPARC communities. It made sense. The NVRAM in the PC world is the CMOS, and these problems rarely surface under Windows because everybody writes their BIOS to work with it.

Writing a BIOS is thankless work, so companies tend to test it against Windows and nothing else. I know Fujitsu in particular doesn’t give a fuck – the Primergy RX300 had broken ACPI tables. If a server vendor couldn’t be bothered to test its products rigorously, what about a consumer motherboard vendor that markets itself to gamers?

So now, finally, I have a wonderful setup. It automounts my ZFS pool, runs foobar2000 through WINE, can serve an rsync daemon, runs Dropbox, git, and the Python and bash scripts that I write on OS X, doesn’t add hidden desktop.ini files to folders, doesn’t automatically put up a firewall on incoming connections, and more importantly, doesn’t talk back to Microsoft. I can ssh into it when I’m not home and actually get stuff done, because you can’t get anything done in Windows with just the command line.

And when I sniff the network and find this computer sending packets to the internet, I can pinpoint what it is, because I know this OS isn’t doing anything behind my back. That it’s not doing anything behind my back is EXTREMELY important. For instance, Windows puts a firewall on the network interfaces by default. When I forward a port and don’t get a response, I always have to wonder: did I mess up configuring the router, or is something wrong somewhere else? I spent so much time getting frustrated over this before realizing that Windows has a firewall up by default. On Linux I know what’s doing what and how everything is set up. In OS X I don’t know what’s doing what most of the time, but I don’t have to, because it always works perfectly and has sane, reasonable defaults. Not so with Microsoft.

The Windows era is over. Slowly but surely Microsoft is making it worse and more bloated with each passing incarnation, and only Office is keeping people on that platform. If people aren’t switching to OS X or Linux, they’re certainly doing more of their computing on Android or iOS.

Cute Women…

There’s a Japanese magazine called Ultimate Top Beauty. So of course I had to download it.

But it’s funny: when I looked at those doe-eyed chicks, something happened.

‘Eh, she looks sweet and innocent but… is she really that sweet?’

And I used to be a real sucker for cute girls! But now I find myself swinging more to the ‘hot’ side than the ‘cute’ side… probably because the ‘cute’ implies a personality that it might not deliver on, whereas ‘hot’ is just that: hot.

Or maybe, in the distant future, I will look at a picture of a hot girl and think:

‘Eh, she looks hot… but I bet her life isn’t all that interesting’

I’m not sure if that’ll make me jaded or experienced.

My Mindset is Slowly Slipping/I Am Not Special

For about a full year now, since… oh, January 2015, I’ve had two girlfriends. And when something didn’t work out with the second, I found another girl that very same day, and since then she has been my second girlfriend.

It’s given me a quiet sense of confidence for a long time now, but ever since Julia said she doesn’t want to be a #3 I’ve found my confidence wavering a little bit.

Yes, I am lucky that two attractive women who know about each other choose to stay with me and still let me approach other women (although not without some consternation!).

So why am I so lazy nowadays, choosing NOT to approach other women? Every time I see a nice chick, my brain STILL comes up with excuses not to approach her.

I think to myself “oh, F and L wouldn’t like this, and I’m pushing things as it is with them.”

I think to myself “she’s not that hot really”

And if she is hot, I think to myself “her attitude probably sucks compared to L’s”

Having attained some measure of success, I find myself becoming gradually more afraid of being rejected. Yes, that is what happened with Julia. And my mindset is already wavering, in a vague, fundamental way.

The truth is I have two girlfriends because I am lucky. It is not because of any incredible merit. Subconsciously I know this: that I will still get rejected a lot. So I try not to try anymore with other girls, to hang on to what’s left of my illusion that I am someone so special, someone so attractive that two women choose to be with him.

I am not so special, or so attractive – my girlfriends are with me because they somehow chose to, and I was lucky to have met them. You can see this because many girls will still reject me.

And now, once again, I have nothing to lose.

So apparently the first Austin Powers movie

… had the subtitle ‘International Man of Mystery’, which dilutes my eagerness to use the same moniker.

But I swear I came up with it before I had even heard of Austin Powers. In fact, that was how I saw James Bond. A mysterious, worldly man who had the skills and the knowhow to get through anything.

Anyway, this blog will be about my several endeavours at becoming an international man of mystery.

  • First is becoming the ’10’ for women. To have everything, the looks, the charm, the grace, the power, the wealth, the wisdom, and the mystery.
  • Then wealth, which enables freedom. Not much to be said here, except that they say that getting wealthy is much harder than getting women.

That’s all. Just two endeavours. They should be enough to keep me busy for a while.

Getting a handle on Value For Money

After reading this post on the TR forums, I figured out a great way to calculate how much value for money some of my possessions deliver. Cost alone is not a true indicator of value.

So after fiddling a bit with Excel, I came up with this:
[image: Excel spreadsheet of possessions with cost, days owned, and cost/day]

What totally confirms my theory is that the stuff I always felt was really worth it really was worth it (it got good cost/day numbers). And stuff that I felt ‘meh’ about really did score ‘meh’. I fuzzed the ‘days’ column a bit on some things, because I owned some things for a long time but didn’t use them every day.
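The metric itself is trivial: cost divided by days owned. A throwaway command-line version of the spreadsheet, with made-up numbers, could be sketched like this:

```shell
# cost/day over name,cost,days records (all figures here are made up)
printf '%s\n' 'Seiko watch,150,1800' 'Macbook Air,1200,700' |
awk -F, '{ printf "%s: %.2f EUR/day\n", $1, $2/$3 }'
```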

For example, I always did feel that the Seiko watch and the 1TB external hard drive were worth it. And the Samsung F2380 monitor, which I used all the time. The only exceptions are the Macbook Air (which I use all the time, but still scores badly because it was so expensive) and the PowerMac G5 (which scores badly because I used it only for a few months, but I really enjoyed my time with it). The Radeon HD 6870 should really have a better score, because I mined quite a few altcoins with it. Oh well, cost/day is not the perfect metric.

Going on to non-technology related possessions, army surplus was always going to be a good deal, the exception being the ILBE because I had it shipped from the US, and I had to pay customs. Why so much trouble for it? Because it was a good pack – I put my entire load in it and walked all around Potsdam yesterday with it. Besides being bigger than the ALICE pack, it’s also easier to get things in and out of it (because there’s a side pocket and no big flap to deal with, and I can open the lid while wearing the pack), easier to don and doff, and less uncomfortable after a long walk.

This Excel table really helped put things into perspective. For instance, all my things cost me at most a euro per day, but food and lodging are 10x that at least. So the reality is, although possessions seem expensive, what’s really expensive in the long run is living (because we all want to live).

With my newfound job I was thinking of buying new things, but now that I’ve seen this, I’m going to keep my PC and HTC One S for a while longer just to extract maximum value out of them.

So now that we’ve seen all this, what is an expensive possession? It looks like an expensive possession is something that costs more than 1EUR/day, assuming you keep it for a reasonable amount of time.

How to install exfat-fuse on 10.5.8 PPC

Just a few notes for the future.

MacPorts installs a lot of dependencies, and it cannot compile osxfuse (so that port is unusable now) because osxfuse wants Xcode 3.2, which doesn’t run on Leopard (only Snow Leopard). Use the binary provided on the official OSXFUSE website instead.

Use tigerbrew to install Python 2.7 and scons. Just to install scons, MacPorts will install a lot of different crap, but at least it knows that Python 2.7 is required for scons to work properly. scons 2.3.4 is supposed to run on Python 2.5.1 according to the documentation, but if you try, you will end up with the error below. Apparently ‘as’ only works in except clauses since Python 2.6:
Import failed. Unable to find SCons files in:
/usr/local/bin/../engine
/usr/local/bin/scons-local-2.3.4
/usr/local/bin/scons-local
/usr/local/lib/scons-2.3.4
/System/Library/Frameworks/Python.framework/Versions/2.5/lib/scons-2.3.4
/usr/local/lib/python2.5/site-packages/scons-2.3.4
/System/Library/Frameworks/Python.framework/Versions/2.5/lib/python2.5/site-packages/scons-2.3.4
/System/Library/Frameworks/Python.framework/Versions/2.5/lib/scons-2.3.4
/usr/local/lib/scons
/System/Library/Frameworks/Python.framework/Versions/2.5/lib/scons
/usr/local/lib/python2.5/site-packages/scons
/System/Library/Frameworks/Python.framework/Versions/2.5/lib/python2.5/site-packages/scons
/System/Library/Frameworks/Python.framework/Versions/2.5/lib/scons
Traceback (most recent call last):
File "/usr/local/bin/scons", line 190, in
import SCons.Script
File "/usr/local/lib/scons-2.3.4/SCons/Script/__init__.py", line 76, in
import SCons.Environment
File "/usr/local/lib/scons-2.3.4/SCons/Environment.py", line 56, in
import SCons.SConf
File "/usr/local/lib/scons-2.3.4/SCons/SConf.py", line 199
except TypeError as e:
^

After you have python 2.7 and scons, you can follow the instructions on the exfat-fuse website. And it works.
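Condensed, the whole dance is something like the following; the tigerbrew formula names are my assumption, and the scons invocation is the usual pattern for scons-based projects rather than anything specific to exfat-fuse:

```shell
# tigerbrew (a Homebrew fork for old PPC Macs); formula names may differ
brew install python   # provides Python 2.7
brew install scons

# then, in the unpacked exfat-fuse source tree:
scons
sudo scons install
```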

Killed a Radeon HD 7950 in 2 days

I bought a Radeon HD 6870 and 7950 to help mine altcoins, thinking it would all pay off in two months.

No excitement here, these are slaves, workers, brought in to do the work my trusty olde GTX470 can’t do by itself. As such, I only took pictures of the Radeon HD 6870, but not the 7950. Well… it’s dead now. But I’m getting ahead of myself.

The first thing I noticed about the 7950 was that either the cooler sucked (Sapphire Dual-X) or the chip put out way more heat than my GTX470. I ran a bit of Metro 2033 on the thing just to satisfy my friend; unfortunately it didn’t feel noticeably smoother than on my GTX470. I also ran Crysis 2 in DX11. The added oomph from the 7950 wasn’t enough to make Crysis 2 as smooth as DX9 (on my GTX470). Overall, gaming-wise, I hadn’t gone anywhere. Not that I needed extra performance in games – I hardly play games anymore.

Mining was a good upgrade over my overclocked GTX470. The Radeon HD 6870 didn’t want to work with the integrated Radeon HD 4250 in my second computer, but once I told it it was the only card for me, it set to work making LTC/FTC/CNC at 300kH/s (this is at high intensity). The Radeon HD 7950 managed to score the same amount at low intensity settings in GUIminer-scrypt (preset: 7950 low usage) but at high intensity it could pump out 450kH/s.

This made the desktop incredibly laggy, even if it was driving only one monitor, both monitors would be laggy. Seems to be a Windows issue, no wonder so many headless mining rigs run Linux. So I kept all the monitors on my GTX470 instead, and enjoyed smooth desktop operation while the 7950 cranked away.

All in all, 450kH/s+300kH/s+150kH/s is not an impressive showing. When I bought the two cards, I had banked on the 6870 producing 350kH/s and the 7950 ~650kH/s. Turns out those were peak figures achieved by people with watercooling loops.

Then I found that Powertune was throttling the 7950 down to 64% every now and then. WTF? When I buy a card with a custom cooler, I expect it to be able to run at stock clocks without throttling, no matter the workload! So I raised the limit and overclocked it, but the overclock made Powertune throttle the GPU down to 64%… again. I figured 100% of 925MHz was better than an intermittent 64% of 1100MHz, so I left it all at stock clocks but kept the Powertune at +20%. OK, it’s now at 100% and not throttling – but I’m only getting 560kH/s, max. The Litecoin wiki said I’d get 600! (I spent some time with thread-concurrency between 21712 and 24000; nothing got me up to 600 on stock clocks.)
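GUIminer-scrypt is just a frontend for cgminer, so the same tuning can be sketched as a command line. The pool URL and credentials are placeholders, and the flags are from the cgminer 3.x scrypt era, so treat this as an illustration rather than a recipe:

```shell
# scrypt mining with explicit intensity, thread concurrency, and Powertune
cgminer --scrypt -o stratum+tcp://pool.example.com:3333 -u worker -p pass \
        -I 13 --thread-concurrency 21712 --gpu-powertune 20
```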

Nevertheless I had other concerns. It was getting really fucking hot: the GPU was reaching 83C. And loud. Freaking loud. The 6870 was working hard, yet I couldn’t hear it over the background noise coming in from the open window. The 7950 made my PC sound like a blade server, and the sheer heat scared me into keeping the case open, along with the window to my room. With the case closed, the heat from the 7950 made my Phenom II X6 just as hot as if it had been working at 100% too. I decided to leave the 7950 alone at 560kH/s and not overclock it.

I put up with this racket for two days, helped immensely by a pair of green foam earplugs and copious amounts of cold tap water on my body. In the end, I decided to give the computer a rest – I was hearing a rattling fan somewhere in there. Killed the miners (I got one block of BTB! yay) and let it idle at the desktop for a while. Then I shut the system down.

Ah, some peace and quiet. The Corsair Graphite 600T is making cracking noises as it sheds the heat; damn, that was some workout. The CPU’s heatsink, despite idling all the time, is hot. The heatplate on the 7950 burnt my finger. The GTX470 is doing just fine despite sitting right below the 7950 and running cudaminer (I don’t know how, but Gelid’s Icy Vision custom heatsink is an incredibly good performer). Really, such peace and quiet. XFX’s 6870 is working away in the other computer, reliably, making a loud whooshing sound but nothing really grating. OK, it’s time to get back to work.

I press the button, my LED fans flash on for an instant, and quickly die. I smell something. Fuck. Was that my mainboard? Was that… anything? I press again and again; the computer refuses to turn on. Seems like the PSU’s short circuit protection is working. I pull out the Radeon HD 7950 and the computer boots.

God damn. So much for high quality Black Diamond chokes. So much for the Dual-X cooler. So much for Powertune. So much for mining!

Lessons learned:
1. Graphic cards should be seen and not heard.
2. Slow, steady and silent is actually preferable to fast, hot and raucous (does this mean I should mine LTC instead of the more profitable FTC/CNC?)
3. Sapphire’s Dual-X cooler isn’t all that. Shitty hardware review sites like techpowerup.com say that the cooler keeps the GPU at 63C while being silent, without mentioning that this is all because Powertune is silently throttling it in the background. Although Sapphire’s Dual-X heatsink is “custom”, the aftermarket Gelid Icy Vision dissipates the same amount of heat silently, without any fuss. I’m sure my GTX470 can put out more heat.
4. Most importantly: mining is for suckers. Buying hardware just for mining is for real suckers.

The 7950 is going back for a refund. It wasn’t perceptibly faster than my GTX470 anyway, despite what Anandtech bench may say.

Will reducing capture resolution reduce chroma noise from my smartphone’s camera?

So I was hoping that maybe, just maybe, telling my HTC One S to capture at 4MP or 2MP would cause it to downsample the image and reduce chroma noise. It doesn’t. Chroma noise is just as loud as ever, and I’ve got the images to prove it. Images are too large, and I’m too lazy to fiddle with HTML to make them fit, so here’s a lazy directory link.
http://ritchan.dasaku.net/wp-content/uploads/2013/04/noisecompare

Adaptec AHA-2940U on Windows 7 64bit

Just a reminder to myself on how to set this up, in case I need to do it again in the future (I hope not). I needed this to work with a Nikon LS-2000 scanner.

Firstly, download AdaptecAic78xx.zip, yanked from a Windows Server 2008 64bit installation. Unzip it somewhere. Now, Add New Hardware may or may not work (in my experience, it didn’t). What I remember doing to get it to work is navigating to the “Have Disk” button (Update Driver->Browse my computer for driver software->Let me pick from a list->Have Disk); even if you provide the folder, Windows may not automatically “find” the driver despite the inf and sys files being right there.

It isn’t digitally signed. But this is what it should look like:

[image: the installed Adaptec driver in Device Manager]

This is where I heard of the “have disk” way: social.technet.com

autopilot7
Don’t feel bad Phil! I had a little trouble getting it to work too. I then left the folder on my D: drive and went through the procedure as if I were installing it as a new driver. When it wants to browse or have you tell it where the folder is located, select that you will choose from the Adaptec selections available in Windows 7. Then select “I have a disk” and direct it to drive D: or wherever you have the folder stored. It then installed the driver okay and I am running just fine. Good Luck and thanks to the person who found this fix.

More information on technet.microsoft.com. Especially useful is this:

strych-9

There’s a lot of discussion here to this dilemma concerning Windows 7 and a driver for the Adaptec 2940 and I wanted to throw my two cents in just in case someone else finds this in a search for this problem.

Here’s my setup: Windows 7 Pro x64, Adaptec 2940au SCSI to control an Epson scanner GT-10000+. Just upgraded from XP x32 but needed Win 7 x64 for some back up issues I was having with my Home Server 2011. (long story, but all resolved now)

So anyway, I could never get the Adaptec SCSI driver to install. I started with “Adaptec_Aic_78xx” driver from the Adaptec site that I placed in the “Driver Store” which installed in the “Device Manager” but wouldn’t work (code 10). I had also found “Adaptec_7870_and_2906_and_2940_driver_for_Windows_7_-_64_Bit” which I tried to install with the “Update Driver” in the “Device Manager” with no results.

Then I saw something someone said about using the “Have Disk” in the “Update Driver” in the “Device Manager”. I used it and pointed it into my file that I had put into the “Driver Store”: “Adaptec_7870_and_2906_and_2940_driver_for_Windows_7_-_64_Bit” and bingo it started loading! It gave me the pop up window about an unsigned driver, which I clicked through and finished the installation. It was recognized in my Hamrick’s VueScan too and scanned beautifully!

So, there was how I got mine going. I hope it works for others trying to get their Adaptec SCSI card going.