
Wednesday, January 25, 2006


Routing for multiple uplinks/providers

I've been researching the topic in the subject line for a little while now. My plan is to use my old desktop box as a router for my home LAN. I have DSL and cable at home (large house, lots of roommates), so I will (finally) be hammering out a way to use a single box to provide DHCP and other LAN services as well as outbound routing across BOTH ISPs.
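The basic plumbing for that is straightforward. Here's a rough sketch of the forwarding and NAT setup (interface names are placeholders for whatever the box ends up with, eth1 for DSL and eth2 for cable):

```
# let the kernel forward packets between interfaces
echo 1 > /proc/sys/net/ipv4/ip_forward

# masquerade LAN traffic leaving on either uplink
iptables -t nat -A POSTROUTING -o eth1 -j MASQUERADE   # DSL
iptables -t nat -A POSTROUTING -o eth2 -j MASQUERADE   # cable
```

The interesting part is deciding which uplink a given connection goes out, which is where the routing configuration comes in.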

This should be fun!

Anyways, I have here a couple of links that address this:
Routing for multiple uplinks/providers
Multiple Connections to the Internet

They seem a bit stale, but I will try to post back with my progress.

I've already tried out pfSense, but found that it disliked my hardware and that its web interface was very sluggish. pfSense's multi-WAN features also turned out to be very unpolished and a bit confusing. So I figured that if I was going to be confused anyway, I might as well teach myself something useful.

I will probably begin with Gentoo, since it is what I have on my laptop, though I may go back to a more streamlined distro once I have things "figgered". Gentoo is also appealing because some of the baselayout changes in the latest version are designed to help out multi-homed hosts (particularly laptops, but I figure I can take advantage of them for my own goals).

A few extra goals, beyond simply load-balancing/aggregating over both ISP connections, are to make it fairly easy to direct specific traffic across one ISP or the other (provided that ISP is up, of course), and to have my DHCP server hand out DNS servers from BOTH ISP connections (which are both DHCP... *g*). So a little bit of scripting beyond just setting up a multipath route.
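The LARTC material linked above covers exactly this kind of setup with iproute2. A minimal sketch, with placeholder gateway and local addresses (and assuming entries "1 dsl" and "2 cable" have been added to /etc/iproute2/rt_tables):

```
# one routing table per ISP
ip route add default via 10.0.1.1 dev eth1 table dsl
ip route add default via 10.0.2.1 dev eth2 table cable

# traffic sourced from an uplink's own address goes back out that uplink
ip rule add from 10.0.1.2 table dsl
ip rule add from 10.0.2.2 table cable

# load-balance new outbound connections across both uplinks
ip route add default scope global \
    nexthop via 10.0.1.1 dev eth1 weight 1 \
    nexthop via 10.0.2.1 dev eth2 weight 1

# direct specific traffic (here, outbound SMTP) across one ISP
iptables -t mangle -A PREROUTING -p tcp --dport 25 -j MARK --set-mark 1
ip rule add fwmark 1 table dsl
```

The fwmark trick in the last two lines is what should make "direct specific traffic across one ISP" easy: any iptables match can set the mark, and the rule steers marked packets into that ISP's table. The "provided that ISP is up" part will take extra scripting to pull the rule when the link dies.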

I guess an alternative DNS approach would be to run a DNS server on the router box itself, with routing rules that prevent lookups from going out on the wrong interface.
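One way to do that (a sketch assuming dnsmasq; the resolver addresses are placeholders for whatever each ISP's DHCP hands out) is to pin each upstream server to its own uplink and point LAN clients at the router:

```
# /etc/dnsmasq.conf (fragment)
# forward queries to each ISP's resolver via the matching interface
server=10.0.1.53@eth1    # DSL ISP's DNS server
server=10.0.2.53@eth2    # cable ISP's DNS server

# hand the router's own LAN address to DHCP clients as their DNS server
dhcp-option=option:dns-server,192.168.0.1
```

This also gets local caching for free, and since the ISP resolvers arrive via DHCP anyway, a dhclient hook could rewrite these lines when they change.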

I'll try to keep things updated -- my intended starting time for this project is this weekend. Any additional information that might be of use is, of course, appreciated.

Thursday, November 17, 2005


Opinion: New Linux study suggests fundamental Microsoft credibility problems

Wait, this isn't news, is it???



I've gone and read some of the study in question now. Not too impressive.

First is all the commentary about "out of support" packages on SLES 8, while SLES 9 is available (and includes those very packages that are out of support on SLES 8!). Hmm! Maybe the applications that required those out-of-support packages were written for distributions of SLES 9's vintage. Rather than upgrade one or two "out of support" packages, I would expect a competent administrator to upgrade to SLES 9 or find an application compatible with SLES 8.

Instead, third-party extensions/products were chosen that ran on both Linux and Windows. Of course, the specific application isn't disclosed, nor are the versions of those third-party products discussed.

Second, again with the "out of support" packages. The report indicates that two of the three SLES (*ahem*) administrators completely broke their servers while trying to upgrade glibc. I have to question the competence of the admins they selected. No backups? No knowledge of how to deal with this type of issue at all? And why were "third party extensions" to their "custom" ecommerce platform selected that weren't compatible with the existing platform?

The footnote at the bottom of page 3 is telling in this regard. Tallying non-version-specific "Windows" experience against non-version-specific "Linux" experience, while also imposing some version-specific requirements, doesn't quite add up. What would the Windows administrators have been getting experience in, if not Windows 2000 or 2003?

Third, there is a mention of one of the Microsoft admins calling "Third party product support". Didn't the SLES administrators have similar options available?

Fourth, the number of patches installed, and the conveniently "monthly" patch cycle. Convenient, as Microsoft has a monthly patch release cycle. And the "number" of patches is questionable, as every patch not rated "optional" was installed, regardless of whether the patched component was actually in use. SLES ships with all services disabled by default, and needn't automatically launch GUI software either.

SLES's patches included upgrades and patches to things like Acrobat Reader, which, given the methodology, would not have been patched on MS's enterprise solutions. The number of patches was higher for SLES because the vendor supports virtually all of the software on the machine.

Given the way modern Linux distributions handle patch management, this isn't much of a burden. But requiring that _all_ patches be applied, presumably even to packages that were installed only so they would need patching, isn't a sensible approach. If it isn't needed, don't install it. If it isn't installed, don't patch it.

Some interesting charts, from Secunia:

[Secunia vulnerability charts for Windows 2000 and Windows 2003]


Tuesday, November 01, 2005


Sony's DRM rootkit

If you play some of Sony's new copy-protected CDs on your computer, you get a nice present: some "stealth" processes and services, putatively there to implement DRM. This doesn't sound like fun.

Mark's Sysinternals Blog: Sony, Rootkits and Digital Rights Management Gone Too Far

Sunday, August 14, 2005


Microsoft's "monkeys" find first zero-day exploit

Microsoft's "monkeys" find first zero-day exploit
Robert Lemos, SecurityFocus 2005-08-08

Microsoft's experimental Honeymonkey project has found almost 750 Web pages that attempt to load malicious code onto visitors' computers and detected an attack using a vulnerability that had not been publicly disclosed, the software giant said in a paper released this month.

Microsoft's "monkeys" find first zero-day exploit

Thursday, May 26, 2005


Google Translator: The Universal Language

Here is some really interesting machine learning. Google took all of the UN's translated documents and fed them to a statistical engine of some sort. Now that engine can translate between all of the UN languages with fairly high accuracy.

Google Translator: The Universal Language

I am curious how this would handle colloquial use of language. My guess is: not terribly well, unless it has enough input data to chew on. I can imagine this type of database being grown organically, though, with people translating the occasional document or snippet in their spare time, or as needed. How much statistical strength such piecemeal (and potentially low-quality) translations would carry is debatable, though. It may be that a rigorous, repeated approach, such as the one the UN has to take with all of its documents, is needed.

Wednesday, May 25, 2005



Proof that people will find a way to abuse any technology given half a chance.

Computer users anxious about viruses and identity theft have a new reason to worry: hackers have found a way to lock up the electronic documents on your computer and then demand a ransom for their return.

Hacker holds computer files hostage

Tuesday, May 24, 2005


It always helps if the damn thing is plugged in.

Yeah. It might seem obvious, but time and again, I find myself banging my head because I didn't think to check that first.