Do You Trust Microsoft?

I try to give things, ideas, people, etc. a fair shake in most cases, and this even applies to the Microsoft Windows operating system (specifically Windows XP). Yeah, I use Linux and go out of my way to avoid using Windows, and yes, I sometimes _do_ feel like I need a shot of penicillin after touching a Windows box (which is an infrequent event itself), but I try to acknowledge the contributions and advancements Microsoft has made, when they have happened. Some things are even admirable, like the time I plugged in a Firewire camcorder, and XP just popped open a program (Media Player, I think) and captured the video, no questions asked. Pretty cool, and I have not seen this happen on Linux (not to say that it can’t – I just haven’t seen it).

Two recent events (among many) stand out, however. I recently helped local school kids (6th grade and up, about ten of them) learn how to load Linux (Mandriva 2007) on some old donated Dell 3 GHz machines (512 MB RAM, Intel on-board graphics, 80 GB hard drives), and the installs went smoothly. They then got to take home their computers, free. We did not have much time to show them how to use Linux once KDE was up and running, and I honestly expect seven or eight of them to have reloaded their free computers with Windows by now, even though all of them seemed impressed by Linux and KDE. This doesn’t bother me – most of them game, and reloading seems to be an ingrained way of troubleshooting for home users of Microsoft products. At least two seemed likely to stick with Linux, and that is fine with me.

I ran into one of the (likely-to-load-Windows) kids a couple of weeks later at a store, and asked him how things were going. He then told me a tale of woe and sorrow that surprised me – his computer was now extremely slow and hard to use, and he was doing nothing but RELOADING DRIVERS. This surprised me, since the drivers were already loaded, and I mentioned this. He then clarified, saying that while he liked Linux and it had worked fine, he had gone out and gotten another hard drive, installed it, and proceeded to load Windows XP on it so he could play his games. The Linux hard drive had been pulled out, since he did not want Windows overwriting his boot loader – something I had warned the kids would happen if Windows was loaded after Linux. Now, no matter how many times he reloaded drivers and patches, the system crawled and was unstable – and he couldn’t figure out what was wrong. I smiled, nodded, and told him that this was one of many reasons I was glad I no longer used Windows at home. I wished him luck and offered my help if he ever decided to try Linux again.

The other event was when we were troubleshooting an odd error with svchost.exe (application error) that was showing up on some desktops. After Googling (the event logs were pretty useless – again), we found multiple identical fixes, posted from different sources, and similar explanations. Apparently, the error was suspected of being caused by a corrupted update pushed from Microsoft. (Anyone remember the bad old days when such an event could thrash a Windows network?) The fix was pretty involved – the typical home user would either put up with the error, or reload. We decided to make sure our images were clean and reimage to save time.

Quality software? These are just two of many examples that, over the years, seem to point to a pattern of poor quality that cannot be defended or excused. There have been many, many times I have been troubleshooting weird errors in Windows workstations and servers at work in which I have found cryptic error messages in the event logs, looked them up on various Microsoft resources (including TechNet), and discovered absolutely NOTHING useful. Google has many times only provided links to others who have had identical results – but no answers. Another time, I found a workstation that was so boned it would only let me, and no one else, log in. Apparently, someone had power-cycled it in the middle of an update, essentially busting it quite nicely – which is what a quick Google revealed was the expected behavior. The fix, of course, was to reimage the machine. Nothing else would do. This is tip-of-the-iceberg stuff…

From the Windows side, rebooting, reloading, formatting, fdisking, destroying and losing data, starting over from scratch – these are acceptable methods of troubleshooting and problem-solving. People are used to not being able to find out why, or how. It is an annual event (sometimes more often than that) to rebuild because the machine has become slow and unpredictable – it is like Spring Cleaning.

From the ‘nix side – this is unacceptable. Heresy. Sacrilegious, even. Instead, you can read the logs, and they mostly make sense with only a little familiarity with Linux. You can Google and get real answers. There are lots of forums, chat rooms, and channels one can participate in to get answers – help is so easy to find that I have almost never had to ask a question online. I have had many problems with Windows that have had no solution other than to reload. I have never, ever had that happen with Linux (certainly, if one tries hard enough, such a problem can be induced on Linux, or any other OS, of course).

Linux has its share of faults, especially with printer management (I am sure you can think of other things), but it is free. No cost other than your time to set it up. It can happily coexist with other operating systems. It can happily use hardware long since abandoned by Microsoft. Forced upgrades in order to get Linux patches and security updates are possible in some extreme cases, I suppose (only because I try not to rule anything out), but I have never actually seen this in practice. Major updates and upgrades almost always seem to yield impressive results, making the effort feel very worthwhile. In most cases, stuff just works, and once you get things working, they tend to stay working.

Windows? Forced updates for hardware and software, if you want to keep getting patches. Pay expensive support fees to get security patches for older versions of Windows. Pay to get anti-virus subscriptions from a third party to protect your PC from harmful software that exploits flaws in Windows. Pay someone else to provide software that protects you from flaws in the Windows OS you already paid for. Pay for cleaners, spyware-busters, registry sweepers. Then get updates from Microsoft that break Windows. Then reload and start over when your computer, for no good reason other than enough time has passed, becomes slow and stupid. I won’t go into the reactivation schemes if you change hardware. I won’t go into DRM. I won’t go into UAC-nagging and phone-home-to-Microsoft features. These are things you PAY MONEY for. Oh, but I did get a patch to update the Microsoft Genuine Advantage program, to ensure I was using a real version of Windows, even though I had already done that previously. Sure did. Yup. Good thing THAT was free. Don’t forget upgrades that required you to relearn how to drive around the desktop and applications. Happened with 95 (good), happened with XP (sorta good), and happened with Vista (now I gotta wonder). Windows supporters grouse about Linux requiring the user to relearn the GUI, but the same has happened with Windows and Office before, and sometimes, you just cannot see why it had to be that way.

Windows does a lot of useful stuff for folks, and that is fine. Most people won’t actually *pay* for their copy, since they will just get a new PC in order to meet the increasing hardware requirements, and it will come with the newest offering from Microsoft, with all the drivers, 30-day anti-malware services, subscription discounts, some lightweight productivity software, some games, etc. And paying for useful software is hardly criminal. But how much money has to be sucked from your wallet to make the OS you bought with the new computer safe enough to stay connected to the Internet? How is it right to fund a third-party industry that was built around protecting Windows from itself – without complaint? How is it right that there are still known holes in widely-used Microsoft products that remain unpatched, products someone paid for?

Would you expect the coffee pot you just paid for to have a hole? Would you next obediently pay someone else to patch it, or would you return it and demand your money back? Simplistic, sure, but come on – this is an operating system that drives entire groups of industries. Hundreds of billions of dollars move around because of Windows. It is astounding that so many have become so accepting of such shoddy quality. I am not addressing applications not written by Microsoft – I am addressing the operating system and Microsoft applications like Office and Internet Explorer.

All software has bugs, holes, flaws, and over time, it is expected that old ones will be patched, new ones will be found. But isn’t there a systemic problem when the anti-virus industry *grows*? When the anti-spyware industry *grows*? When the security-cleaner-defragger-performance-tweaker industry *grows*? If things were getting better, shouldn’t they be shrinking or at least *not* growing, since there would be fewer holes to exploit? What does this mean? Do you doubt that without these industries and their tools, your hardworking OS is in danger of being exploited or damaged just because you connected to the Internet?

Why are the holes that allow viruses and worms and keyloggers and trojans not fixed, when others are with patches? There is a long list of viruses that are quite old that can still infect Windows XP, even with SP2 applied. There is a lot of spyware out there that can still get in. New versions seem to spring up weekly, and some are just minor tweaks to older versions that were blocked by a patch. There have been Microsoft updates that can break software and countless patches to fix problems introduced by patches. Doesn’t anyone at Microsoft know their own software well enough to at least avoid that scenario?

My guess – some flaws in Windows are so deeply rooted they cannot be baked out of Windows without severe, drastic changes. Imagine a set of holes and cracks in a dam that cannot be fixed without gutting the dam and rebuilding it. So someone else installs a set of protective drains and diverters and pipes and valves that all need constant vigilance and repair to keep the holes from growing and the cracks from spreading – because it is cheaper than rebuilding and affecting communities downstream. I suspect that the business realities of commercial software work against quality in a similar fashion.

  • Programmers have deadlines and deliverables.
  • Bosses and managers have progress reports to pass upwards, and cost-cutting measures to pass downwards.
  • Executives have shareholders and the media to massage and seduce so stock prices go up, not down.
  • Teams work in isolation so no one can know too many trade secrets.
  • Everyone has a job to protect and a promotion to work towards, and maybe rocking the boat gets in the way of some of that.
  • Code gets rushed, and sloppy code gets reused rather than rewritten to get things out the door on time.

I said I wouldn’t mention DRM, but quite a lot of effort went into it on Vista, and it seems to work quite well, restricting how users play their media on their hardware. Why couldn’t that same level of effort go into better security patches and better overall quality in Windows? Because you need to upgrade to Vista to be more secure, and Vista will sell beefier computers. Making XP better and safer to use works against those goals. Vista DRM helps establish future revenue sources. Security fixes do not generate income. The reasons go on, but they all boil down to profits first, everything else last.

It makes me wonder how much room is left at Microsoft for quality, except for the times when a large enough event forces a change, such as the revamped IE7. Why did this happen after Microsoft declared that IE6 was the last stand-alone release? I am sure the success of Firefox had something to do with it. IE7 and Vista sound like progress has been made in fixing holes and providing a more secure OS, but look at past releases – all prior versions have been the “most secure Microsoft operating system ever” (what else would they be?), yet all have rapidly been shown to be quite a bit less secure than hyped. Even now, many are advising users to wait until Service Pack 1 for Vista is released before upgrading. Any reason to think Vista, Office 2007, IE7, etc., will be any different? Are you getting better quality software, or just different looking software that essentially does most of the same things, and adds a few things you probably wouldn’t miss?

I lost my trust in Microsoft a long time ago, after security updates broke machines, after zero-day exploits slagged networks, after viruses repeatedly smoked corporate networks and slowed the Internet to a crawl in many places around the world. We paid good money for that software, and we paid more to secure it. We paid money to troubleshoot it, to learn it and understand it. After enough bad news, you cannot help but start wondering what you are investing in…. And being a convicted monopolist sure didn’t help Microsoft, either.

I have yet to lose faith in Linux. I have yet to see any event that affected a huge community of Linux users in a common fashion (and yes, there really are enough Linux machines out there to qualify as a huge community). I have yet to see a zero-day on Linux that flogged the Internet. I have seen at least one distribution-specific update that borked a major system component, Xorg, but fixes were quick and better tested, and the problem update only affected the one distribution, not all of them.

I had put up with a lot before I finally gave up on Microsoft – Linux still has a long way to go before I do the same with it. In fact, it is my experience with Linux, and the exposure to the level of quality of the OS and its major component applications, that has made me more keenly aware of and less tolerant of the quality Microsoft puts into its software. So I ask – what will it take for Microsoft to improve their software, to make it safer for the end users at home, and to make it easier for admins to troubleshoot? And how much more trust will users and admins be willing to place in Microsoft?

Finally, some related good reading, if you stuck it out this far:

“A Cost Analysis of Windows Vista Content Protection”

“The Missing Microsoft Patches”

“Latest Ubuntu xorg-core update breaks X – this is quite old news”

Old article on a Microsoft patch break…

Googling on svchost.exe issues…

Microsoft phones home…

Remember – this is an opinion piece about trusting the quality of software you pay for and depend on from Microsoft. These links support my point of view – I am sure you can find plenty of links to support the opposite if you want. In the end, your personal experiences are going to drive you one way or the other – and mine have definitely made me question everything Microsoft does or does not do and say.

Update: Basic file operation problems in Vista…

How I Upgraded My Custom Kernel on Mandriva

I like to tweak.  I really really do.  This should help you understand why I insist on running the Cooker version of Mandriva.  BUT…due to certain structural changes in the kernel config (I assume) between the 2.6.18 and 2.6.19 versions, I have been unable to successfully take my vanilla (but customized) kernel past 2.6.18.8.  Success means mostly everything works (I can sacrifice VMware Server, since I really only use Windows for Turbo Tax [see previous post if you're really that interested]).  The firewall, Xorg, networking, everything else, should work.  I should mention now that every kernel I have run up to 2.6.18.8 has been heavily customized (by me) to strip out all the cruft I do not need.  You know, token ring, ATM drivers, FDDI, etc.  All gone.  Took a while the first time, so thank God for “make oldconfig”.  This worked right up until the 2.6.19 kernel, when I had lotsa problems.

Since 2.6.18 was running well enough, I just kept going with minor version patches up to 2.6.18.8.  Attempts to go beyond 2.6.18.x ended badly, with me always crawling back to the “Old Faithful” kernel, regardless of which machine (mine, my son’s, or work) I tried it on.  Something always failed to work as planned, with serious enough errors to be beyond my skill level to fix (which is not actually that high, to be frank).  Tonight, however, I resolved to upgrade beyond the 2.6.18 kernel, the easy way.

Sorta.

I could have simply looked at error logs, dmesg output, and (on the stable version) kept reconfiguring and recompiling until I got it right.  Instead, I let Mandriva do the work for me by installing the Mandriva kernels via urpmi.  I almost never use the GUI for installs anymore, since the command line is so much faster and simpler.  I tried "urpmi kernel", "urpmi 2.6.19", "urpmi 2.6.20", and "urpmi 2.6.21", which listed all installable packages with that text in their names, until I found the kernel-linus- packages.  Then I installed the 2.6.21-0.rc3.4mdv kernel and matching source package, rebooted to the 2.6.21-0.rc3.4mdv kernel ("lilo -R 2.6.21-0.rc3.4mdv" and reboot), and quickly determined that the latest NVidia drivers would not install.  I next used lynx to download the 2.6.20 kernel and the 2.6.21-rc4 patch, extracted the 2.6.20 tar.bz2 file into /usr/src, bunzipped the patch file, and patched the 2.6.20 kernel up to 2.6.21-rc4 ("patch -p1 < [patch file]").  I also checked my /boot partition to compare sizes – my custom 2.6.18.8 kernel was about 1.3 MB – the new 2.6.21 kernel was 1.5 MB.  Oh yeah – it is really important that the base kernel version (the "2.6.20" version) be used when patching to an RC version, so do not try this with 2.6.20.x, or attempts to patch will fail.
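If "patch -p1" is new to you, a throwaway demo shows the mechanics (the files and paths here are invented for illustration – they are not the real kernel tree, which works the same way but is much bigger):

```shell
# Build two tiny "source trees" that differ by one line, diff them,
# then apply the diff with patch -p1, just as you would a kernel rc patch.
rm -rf /tmp/patchdemo
mkdir -p /tmp/patchdemo/a/src /tmp/patchdemo/b/src
echo "VERSION = 2.6.20" > /tmp/patchdemo/a/src/Makefile
echo "VERSION = 2.6.21-rc4" > /tmp/patchdemo/b/src/Makefile
cd /tmp/patchdemo
diff -ru a b > demo.patch || true   # diff exits non-zero when files differ
cd a
patch -p1 < ../demo.patch           # -p1 strips the leading a/ or b/ component
cat src/Makefile                    # → VERSION = 2.6.21-rc4
```

This is also why the base version matters: the patch records paths and contents relative to the pristine 2.6.20 tree, so applying it against a 2.6.20.x tree will hit mismatched hunks and fail.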

Anyway, after changing the soft link for Linux over to the new Linux source directory for 2.6.20 (“rm -f linux” and “ln -s linux-2.6.20 linux”), I cd’d into the linux directory, ran “make oldconfig”, “make && make modules && make modules_install && make install”, and waited.  Once it finished (no errors), I rebooted on the new kernel, and all was well.  It booted fine, and I was able to install the latest NVidia driver.  I was unable to get VMware Server (even the newest 1.0.2 version) to install, but this just means I won’t upgrade at work for a while.  Everything else seems fine.  I will try “make menuconfig” later on and dig through the kernel to see what cruft I can take out.

So basically, rather than troubleshoot this, I let Cooker fix the kernel issues I was having with my custom kernel, by giving me a fatter distro kernel config that I later used as a basis to install a new, non-custom vanilla kernel.  This also kinda goes against my past post about upgrading vanilla kernels, but I probably would not have had so many problems if I had not tweaked my kernel as much as I did.  Again, I’ll need to run through the new kernel and trim the fat, but at least I am on the newer version, which *should* make future upgrades go more smoothly, at least until big enough changes happen again <smirk>.  The price of progress, I guess…

What keeps me using Windows at home?

It could be any of several things, but it isn’t.

Games?  No.  I just don’t play games anymore, and I don’t feel like I’ve really missed anything, even if they are really cool once in a while.  I am just not a gamer.

Office apps, like Outlook?  No.  I barely check e-mail at home, and it’s all web-based anyway.

Any Active Directory stuff?  No, although I figure I could get Samba working with it well enough if I had to – sounds like an interesting day project.

No, what keeps me tied, ever so tenuously, to Windows is TurboTax.  I have looked everywhere, but cannot find a viable equivalent.  Due to my more complex filing needs, I cannot use the online version – I have to get the boxed retail premium version.  I do not use any other software from Intuit.  And, in their defense, I have to say that despite being forced to use TurboTax due to a seeming lack of alternatives, the software itself is a real pleasure to work with – the user-interface is extremely friendly and intuitive.  At least it is not built like Windows…

It would be nice if Intuit would port TurboTax to Linux, or if there were a Linux alternative to TurboTax that was feature-rich enough to handle more substantial filing needs.  I already pay – I would happily continue to pay for a Linux version (provided the price was comparable to that of the Windows version).

If this were the case, I could scrap my Windows VMware virtual machine – loading TurboTax on it is not what takes time.  Patching it is what takes time – I use it so infrequently that it gets seriously out of date, and bringing it current usually means rebooting the VM at least half a dozen times.  Add to that the frustration of having it blue-screen because an update borked it, forcing me to roll it back, and you can see why I feel as I do.  Honestly, TurboTax is one of those critical must-have applications for my home, even if I only use it once a year.

It is the single thing left for which I keep Windows around.

More Good Beryl Info….

I found this blog, and since most of the visitors here are looking for information on Beryl, I thought I would post a link to his archived Beryl articles. They look quite useful and informative, as does the rest of his site.

Check it out at:

The Nameless One’s Beryl Archive…

Bugatti Veyron At Top Speed…

Alright, this has next-to-nothing to do with Linux.

But gawd, is it cool!

http://www.dailymotion.com/video/x157l2_bugatti-veyron-at-top-speed

Interesting Case Modding Site….

My brother sent this to me. I especially liked the Humidor cluster….

Slipperyskip Computers

Web-filtering with Squid, SquidGuard, and Dansguardian

(Note: this is a modified repost of a reply to this thread (Child-Proofing Linux) on Tek-Tips Forums.)

(UPDATE – This post is superseded by this one: HOWTO – Child-Proofing Internet Access on Kubuntu – use it instead.)

I did this successfully, with some digging around on Google. I use Mandriva 2007 for both home PCs (one is for the kids), and I use transparent redirection in iptables so there is no browser preference modification needed (and it works on all browsers, including text-only). I installed everything from source tarballs – it was simpler to tie it all together this way. The end result – per-user proxy restrictions, so I am exempt but the kids are not, and they are time-limited to between 7am and 9pm for web access. I also get emails of blocked attempts. They do not use IM, so this only applies to web access. There were several false positives, so a little tweaking of the blacklist files might be needed… I posted a write-up on this earlier here, but I think this one goes into better detail and is a little easier to follow. Here are the steps I took:

1. Download the following (there may be newer versions, but you definitely need db-2.7.7):

2. Unpack the downloaded files:

  • tar xvfz db-2.7.7.tar.gz
  • tar xvfj squid-2.6.STABLE5-20061110.tar.bz2
  • tar xvfz dansguardian-2.9.8.0.tar.gz
  • tar xvfz squidGuard-1.2.0.tar.gz

3. Make user, group, and firewall rules (iptables commands may appear wrapped in two lines):

  • groupadd -r squid
  • useradd -g squid -d /var/spool/squid -s /bin/false -r squid
  • iptables -t nat -A OUTPUT -p tcp --dport 80 -m owner --uid-owner squid -j ACCEPT
  • iptables -t nat -A OUTPUT -p tcp --dport 3128 -m owner --uid-owner squid -j ACCEPT
  • iptables -t nat -A OUTPUT -p tcp --dport 80 -m owner --uid-owner exemptuser -j ACCEPT (change exemptuser)
  • iptables -t nat -A OUTPUT -p tcp --dport 80 -j REDIRECT --to-ports 8080
  • iptables -t nat -A OUTPUT -p tcp --dport 3128 -j REDIRECT --to-ports 8080

4. Make Berkeley DB – it must be the 2.x version, not newer, not older:

  • cd db-2.7.7/dist && ./configure && make && make install

5. Make squid v.2-6:

  • cd squid-2.6.STABLE5-20061110
  • ./configure --enable-icmp --enable-delay-pools --enable-useragent-log --enable-referer-log --enable-kill-parent-hack --enable-cachemgr-hostname=hostname --enable-arp-acl --enable-htcp --enable-ssl --enable-forw-via-db --enable-cache-digests --enable-default-err-language=English --enable-err-languages=English --enable-linux-netfilter --disable-ident-lookups --disable-internal-dns && make && make install (this is one long wrapped command from ./configure to make install)

6. Make squidGuard v.1.2:

  • cd squidGuard-1.2.0 && ./configure && make && make install

7. Make dansguardian v.2.9.8:

  • cd dansguardian-2.9.8.0
  • mkdir /usr/local/dansguardian
  • ./configure --prefix=/usr/local/dansguardian --with-proxyuser=squid --with-proxygroup=squid --enable-email=yes && make && make install (./configure command is wrapped)

8. Make and configure squid directories:

  • mkdir /usr/local/squid/var/cache
  • chown -R squid:squid /usr/local/squid/var
  • chmod 0770 /usr/local/squid/var/cache
  • chmod 0770 /usr/local/squid/var/logs

9. Make and configure squidGuard directories:

  • mkdir /usr/local/squidGuard
  • mkdir /usr/local/squidGuard/log
  • chown -R squid:squid /usr/local/squidGuard/log
  • chmod 0770 /usr/local/squidGuard/log
  • mkdir /var/log/squidguard
  • touch /var/log/squidguard/squidGuard.log
  • touch /var/log/squidguard/ads.log
  • touch /var/log/squidguard/stopped.log
  • chown -R squid:squid /var/log/squidguard
  • mkdir /var/lib/squidguard
  • mkdir /var/lib/squidguard/db
  • mkdir /var/lib/squidguard/db/blacklists
  • mkdir /var/lib/squidguard/db/blacklists/ok
  • chown -R squid:squid /var/lib/squidguard

10. Make and configure dansguardian directories:

  • chown -R squid:squid /usr/local/dansguardian/var/log

11. Edit and copy configs from respective source directories:

  • cp squid.conf /usr/local/squid/etc/squid.conf
  • sample squid.conf settings:
    • http_port 127.0.0.1:3128 transparent
    • icp_port 0
    • htcp_port 0
    • redirect_program /usr/local/bin/squidGuard
    • cache_effective_user squid
    • cache_effective_group squid
    • acl all src 0.0.0.0/0.0.0.0
    • acl manager proto cache_object
    • acl localhost src 127.0.0.1/255.255.255.255
    • acl to_localhost dst 127.0.0.0/8
    • acl allowed_hosts src 192.168.12.0/255.255.255.0
    • acl SSL_ports port 443
    • acl Safe_ports port 80 21 443 # http ftp https
    • ##acl Safe_ports port 21 # ftp
    • ##acl Safe_ports port 443 # https
    • ##acl Safe_ports port 1025-65535 # unregistered ports
    • acl CONNECT method CONNECT
    • acl NUMCONN maxconn 5
    • acl ACLTIME time SMTWHFA 7:00-21:00
    • deny_info ERR_ACCESS_DENIED_TIME ACLTIME
    • #http_access allow manager localhost
    • #http_access deny manager
    • http_access deny manager all
    • http_access deny !Safe_ports
    • http_access deny CONNECT !SSL_ports
    • http_access allow localhost ACLTIME
    • http_access deny NUMCONN localhost
    • #http_access allow allowed_hosts
    • http_access deny to_localhost
    • http_access deny all
    • http_reply_access allow all
    • #icp_access allow allowed_hosts
    • #icp_access allow all
    • icp_access deny all
    • visible_hostname localhost
  • cp squidGuard.conf /usr/local/squidGuard/squidGuard.conf
    • change ip gateway address in squidGuard.conf
  • cp dansguardia*.conf /usr/local/dansguardian/etc/dansguardian/
  • sample dansguardian.conf settings:
  • sample dansguardianf1.conf settings:
    • groupmode = 1
  • cp getlists.sh file to /usr/local/bin
  • cp etc-shorewall-start /etc/shorewall/start (change user name)
  • cp etc-shorewall-stop /etc/shorewall/stop (change user name)
  • cp etc-rc.local /etc/rc.local

12. Start or restart services as needed:

  • chkconfig iptables on
  • chkconfig shorewall on
  • service iptables restart
  • service shorewall restart
  • /usr/local/squid/sbin/squid -z (first-time config)
  • /usr/local/squid/sbin/squid -N -d 1 -D (test squid, kill when working fine)
  • /usr/local/squid/sbin/squid (this also runs squidGuard from "/usr/local/bin/squidGuard")
  • /usr/local/dansguardian/sbin/dansguardian
  • /usr/local/bin/getlists.sh (takes a very long time, and may need to be killed and run a couple of times)
  • /usr/local/squid/sbin/squid -k reconfigure
  • /usr/local/dansguardian/sbin/dansguardian -Q

13. Post-install testing and tweaking:

  • test with browser – should be transparent proxy surfing now, works with lynx as well
  • set up a mailer for notifications:
  • I used postfix, pointed at my ISP’s mail server (mailserver.isp.domain)
  • postfix needs /etc/postfix/transport and /etc/postfix/generic
  • dansguardian.conf calls it with ‘sendmail -t’ command
  • for non-authenticated use, do not set ‘by user = on’ in dansguardianf1.conf
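For the postfix relay setup in the step above, the fragments below sketch what the two lookup tables might look like. The hostnames and addresses here are placeholders – substitute your own ISP relay and account:

```
# /etc/postfix/transport — route all outbound mail through the ISP's relay
# (replace mailserver.isp.domain with your actual smarthost)
*    smtp:[mailserver.isp.domain]

# /etc/postfix/generic — rewrite the local sender to an address the relay accepts
squid@localhost.localdomain    you@your-isp-account.example
```

After editing, point main.cf at them (transport_maps = hash:/etc/postfix/transport and smtp_generic_maps = hash:/etc/postfix/generic), run “postmap” on each file, and “postfix reload”. Dansguardian then only needs its ‘sendmail -t’ call to work locally.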

14. Edit squid.conf and set up time-based access, to prevent late-night surfing (add the following lines if they are not already present from the sample above):

  • acl ACLTIME time SMTWHFA 7:00-21:00 (add to the ACL section)
  • http_access allow localhost ACLTIME (add to the http_access section)
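Squid evaluates the time ACL itself; the letters SMTWHFA cover all seven days, and 7:00-21:00 is the allowed window. As a quick illustration of the semantics (this helper is hypothetical, purely to show the logic – it is not part of squid):

```shell
# Mirror the ACLTIME check: allow only hours 7 through 20 (i.e. before 21:00).
within_window() {
  hour=$1
  [ "$hour" -ge 7 ] && [ "$hour" -lt 21 ]
}
within_window 12 && echo allowed || echo blocked   # → allowed
within_window 22 && echo allowed || echo blocked   # → blocked
```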

Final notes: This probably will not work exactly as posted, especially if you use newer versions than I posted, so be prepared to tweak. Read through the squid.conf, squidGuard.conf, dansguardian.conf, and dansguardianf1.conf files for other options and file locations, and refer to the University of Google for further help with options and error messages. I had to play around with configure options for a while before I could get squid to compile, so be ready to do the same, depending on your setup. This all runs on a local box, which is not used to proxy any other computers – instead, I just do not allow them to use the main computer. I sincerely hope this helps someone secure their kids’ computers. I have set this up on a friend’s home PC as well, and they are very happy with the results. Good luck!
