Wednesday, February 24, 2010

My favorite online comic - xkcd

I don't read it every day, but I'll check in and read backwards to catch up. While going through the recent comic strips, I found one that applies well to Linux. Enjoy.

Tuesday, February 23, 2010

I'd like to share my annoyance with Ubuntu

Ubuntu is a superb replacement for Windows. The hardest part about using a new system is getting used to its quirks. Windows has a ton of quirks, but we've all used it for so long that we're accustomed to them. I use Ubuntu on a daily basis and I'm used to its quirks; there aren't that many. When I am forced to use Windows (at work), the quirks I once accepted now annoy me more than anything Ubuntu puts me through.

So, to say the least, I'm not giving up Ubuntu; I'm just sharing some of what to expect.

I use an internal USB card reader to get the digital files off my camera and onto the computer. A few days ago I had several memory cards to copy. I got through the first one, right-clicked on the card and chose Eject. On to the next one; when finished, I ejected it too. On the last card my mouse slipped and I clicked 'Safely Remove Hardware'. Wow, this 50-pixel slip caused the entire card reader to be removed from the system. The only way to recover after this is a REBOOT.

It seems that this 'Safely Remove' Option was added here:

Now it's causing more bug reports to show up:

I don't see the point in allowing the entire reader to be disconnected. There is no data to lose on a reader. The original change seems to be about making iPod and Kindle users happy. The developers also seem focused on determining whether the drive/reader is internal or external as a solution. Even if it were external, I wouldn't want to unplug and replug the reader every time I switched cards.

A suggested solution (perhaps easier said than done):
Have a single Eject option. Don't completely remove the hardware/reader from the USB bus. However, allow devices in the USB manager or Nautilus to be flagged as 'Remove Completely' on a user-preference basis. Automatically flag Kindles and iPods, but still allow the user to unselect this option for those devices later. Any flagged device would then be completely removed when 'Ejected'. The majority of devices would simply be unmounted without causing the user a lot of pain.

Well, that's my Ubuntu pain of the month. Surprisingly, while trying to find the solution, I came across Windows users with a similar problem. It surprised me to find Windows problems on the first page of Google results even though I included 'Ubuntu' as a keyword.

Monday, February 22, 2010

Slow times. Or Not.

I have not posted in a week due to so much happening in my life at the moment. I am hoping to get right back into the daily blog posts to keep both of my readers informed :).

Coming up in the Linux/Ubuntu world, I plan on discussing:
  • MP4ize script which easily converts videos to MP4 files. This is great if you have an Apple device.
  • Media Server solutions and what I ended up with.
  • Nightly FTP Off site backup solution
  • My efforts with the Ubuntu MD team and our upcoming free workshops to introduce the locals to open source software.

For photography I am mostly going to show some of my results from recent shoots.
  • Cub Scouts team shots at a local elementary school
  • Modeling Shots at A. Salon in DC
  • Children in my home studio
  • And perhaps some pregnancy shots in my home studio

So stay tuned, there is a lot to come. I'm open to suggestions too.

Wednesday, February 17, 2010

Introduction Of Guest Writer: Dr. Watson

As a guest writer on Techorator, I hope to add value to this blog by incorporating my own flavor and style of writing.  Like John, I am a huge tech geek who loves to try new things and share the experience with others.  So as not to bore you with more useless background information, I will get right to the meat:  I have decided to spend my time reviewing Ubuntu 10.04 Lucid Lynx.  As 10.04 is a work in progress (currently Alpha 2), I will share my journey into this release, exposing the changes as they are made and hopefully enlightening followers, guests, and passersby along the way.

Monday, February 15, 2010

How to clean up Google Chrome on Ubuntu 9.10

If you saw my post yesterday, you will know I wasn't happy about the 39 new packages Google Chrome decided to install. I have a command that will clean the mess up on most Ubuntu desktop systems.

sudo apt-get autoremove bsd-mailx g++-4.4 dpkg-dev

  • dpkg-dev gets rid of the bulk (30+ packages) including alien, rpm and QT.
  • bsd-mailx gets rid of the mail server stuff like postfix
  • g++-4.4 cleans up one or two odds and ends. 

Make sure you don't actually use any of these packages. If you do, adjust the command to just get rid of what you don't need.

I went through all the packages that were installed and this command takes them all out plus Google Chrome itself.

No more open ports for my desktop system. Just the way I like it.

Sunday, February 14, 2010

No thank you, Google! Chrome automatically installs a mail server.

I just updated my Ubuntu desktop system and I didn't read the list of updates first. I swear this is the first time I haven't read the list, and boy has it bitten me. I currently have the Google Chrome repo ( stable main) on my system, and it has really done it now.

Chrome has installed the following new packages on my system:
  • alien (8.78)
  • bsd-mailx (8.1.2-0.20081101cvs-2ubuntu1)
  • build-essential (11.4)
  • cvs (1:1.12.13-12ubuntu1)
  • debhelper (7.3.15ubuntu3)
  • dpkg-dev (1.15.4ubuntu2)
  • g++ (4:4.4.1-1ubuntu2)
  • g++-4.4 (4.4.1-4ubuntu9)
  • gettext (0.17-8ubuntu2)
  • html2text (1.3.2a-14)
  • intltool-debian (0.35.0+20060710.1)
  • libmail-sendmail-perl (0.79.16-1)
  • libqt4-assistant (4.5.3really4.5.2-0ubuntu1)
  • libqt4-dbus (4.5.3really4.5.2-0ubuntu1)
  • libqt4-designer (4.5.3really4.5.2-0ubuntu1)
  • libqt4-gui (4.5.3really4.5.2-0ubuntu1)
  • libqt4-opengl (4.5.3really4.5.2-0ubuntu1)
  • libqt4-script (4.5.3really4.5.2-0ubuntu1)
  • libqt4-sql (4.5.3really4.5.2-0ubuntu1)
  • libqt4-sql-sqlite (4.5.3really4.5.2-0ubuntu1)
  • libqt4-svg (4.5.3really4.5.2-0ubuntu1)
  • libqt4-xml (4.5.3really4.5.2-0ubuntu1)
  • librpm0 (4.7.0-9)
  • librpmbuild0 (4.7.0-9)
  • librpmio0 (4.7.0-9)
  • libstdc++6-4.4-dev (4.4.1-4ubuntu9)
  • libsys-hostname-long-perl (1.4-2)
  • lsb (4.0-0ubuntu5)
  • lsb-core (4.0-0ubuntu5)
  • lsb-cxx (4.0-0ubuntu5)
  • lsb-desktop (4.0-0ubuntu5)
  • lsb-graphics (4.0-0ubuntu5)
  • m4 (1.4.13-2)
  • mailx (1:20081101-2ubuntu1)
  • ncurses-term (5.7+20090803-2ubuntu2)
  • pax (1:20090728-1)
  • po-debconf (1.0.16)
  • postfix (2.6.5-3)
  • rpm (4.7.0-9)

WOW, that is a lot of junk I didn't want. The most concerning to me is the unwelcome mail server. Yes, anyone who updated their desktop system while using the Chrome repo is now running a mail server, complete with port 25 open.

Arggg. Now I have to rip this stuff out of my system, hope I don't break anything and say good-bye to Chrome.

Friday, February 12, 2010

Wow. Now this would be a great mini server!

I'm a little slow catching some news, but looking at the specs of the Fit-PC2i has me amazed. This tiny computer (4"x4.5"x1") packs a 1.6 GHz Atom processor, two LAN ports, Wi-Fi, a 2.5" drive bay and more. It consumes only 6 watts of power and has everything you need to set up your own firewall, router, or an Ubuntu FTP server ;). You are going to have to wait a little unless you can settle for the Fit-PC2. The older model doesn't have dual LAN ports, but is nearly the same machine.

Thursday, February 11, 2010

Quick Studio Update

I bought a dowel rod and two wooden curtain holders from Home Depot, and now I have hung my seamless. The seamless had to be cut to fit the room, which I accomplished with a coping saw.

Wednesday, February 10, 2010

My studio is almost ready.

I've been working on clearing space in my house for a studio. I've finally done it, and I can say there is something nice about having a more permanent space for studio work. Here is a shot of the studio setup. I'd shoot it wider, but I'm not quite done, so it's tight right now.

What I still need is a way to hang backdrops in a more accessible manner. Right now I'm just using my backdrop stand but it is limited. I'm thinking of getting one of these instead and keeping several backdrops ready to go.

The other part of a studio I want is a place for a roll of seamless paper. Unfortunately I need a way to hang it and I need a special size. I don't have 107" in width. I plan on buying a roll and cutting it to the size I need.

Finally I am going to place a hook or something on the ceiling to hang a hairlight. Ohh when it is completed it will be awesome.

Ahh, just for fun, I took a picture of myself. Yes, f/5.6 at arm's length is not enough.

Then I turned off a light and put on a gelled background light for my son.


Photo tip of the day: when shooting kids, always wipe their faces. I don't know if it's a rule or what, but every kid has a Kool-Aid smile or chip crumbs on their face.

Tuesday, February 9, 2010

Mini-server follow-up.

I got my mini server together and everything is working right. I noticed the motherboard had a capacitor that had obviously struck the heatsink, and I worried the board might be DOA. I decided to try it anyway, and luckily it all worked.

The Rosewill case I ordered from Newegg is nice. It is very cramped after getting everything in there, but it fits. I realized Rosewill designed the case to easily accept a 2.5" hard drive as well, and I would recommend that instead of the 3.5" WD Blue. The Blue drive is wonderful and cool, but it makes this case that much tighter. The only gripe I have about the case is that the CD-ROM bay is uncovered. If you don't install a CD drive, pushing the front latch greets you with a big hole in your case.

Power. One of the reasons I wanted the smaller computer was to save power over the old one. The old computer was a Dell Dimension E520 (Intel Core 2 Duo, etc). The Dell is nice except it is huge and it draws some power. According to my Kill A Watt, the Dell drew 90 watts at peak and 65 watts idle. The new computer draws a consistent 30-35 watts. I was actually surprised by the Dell's low wattage, but I still come out ahead on power, and I save a lot of space!

Thanks for reading.

Monday, February 8, 2010

TLS support for Pure-FTP Server

Here are the final steps in my series on setting up Pure-FTP in Ubuntu.

Again, connect to your server via putty or open a terminal and switch to root.

Then follow these commands below.
echo 1 > /etc/pure-ftpd/conf/TLS
You can set this to 0 for off, 1 for optional, and 2 for required. My goal is to reach 2 but my clients aren't ready for that.

Install OpenSSL package.
apt-get install openssl

The command below creates an encryption key and certificate for your FTP server. You will be asked several questions.
openssl req -x509 -nodes -newkey rsa:1024 -keyout /etc/ssl/private/pure-ftpd.pem -out /etc/ssl/private/pure-ftpd.pem

Here are example choices for a Key.
Country: US

State: Maryland

Locality: Columbia

Organization Name: ftp

Organization Unit: blank

Common Name: ftp

Email Address: blank
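If you'd rather not answer the prompts interactively, the same answers can be supplied on the command line with -subj. This is a sketch, not the guide's exact step: it writes under /tmp for illustration (the real destination is /etc/ssl/private/pure-ftpd.pem) and builds the combined key-plus-certificate file explicitly.

```shell
# Non-interactive version of the certificate step above. -subj carries
# the example answers from this guide (US / Maryland / Columbia / ftp).
openssl req -x509 -nodes -newkey rsa:1024 \
  -keyout /tmp/ftpd-key.pem -out /tmp/ftpd-cert.pem \
  -subj "/C=US/ST=Maryland/L=Columbia/O=ftp/CN=ftp"

# Pure-FTPd expects the private key and certificate in a single file.
cat /tmp/ftpd-key.pem /tmp/ftpd-cert.pem > /tmp/pure-ftpd.pem
chmod 600 /tmp/pure-ftpd.pem
```

Move the combined file into /etc/ssl/private/pure-ftpd.pem and you can skip the question-and-answer session entirely.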

Lock the key file from other users.
chmod 600 /etc/ssl/private/pure-ftpd.pem

Reboot your server, and you should now be able to connect with encryption on. This is an easy step if everything goes well. Your remaining choice is between 1 and 2 for the TLS option. If you use 2, you must know who your users are so you can guide them through setting up their clients. If this is a personal FTP server, that's no problem.

Well I hope you are enjoying your FTP server now. Good Luck.

Sunday, February 7, 2010

Creatively using mount to handle SMB shares for Pure-FTP

This portion will walk you through mounting SMB shares. I do some things differently for a more complex layout, but in the end it's worth it. I also use credentials files to protect my share user accounts. Again, get onto your server as root.

Install smbfs so you can mount these network shares:
apt-get install smbfs

Go to the /mnt folder. Some may prefer /media, but I like to reserve that for less permanent shares.
cd /mnt

Create a folder for each network share:

mkdir music
mkdir video
mkdir work

Give ownership of your share folders to ftpgroup. This shouldn't strictly matter, because these folders will end up with the permissions given by the mount settings.

chown -R nobody:ftpgroup .
Note: the trailing period is intentional; it refers to the current directory (/mnt).

Create a credentials file for each share to protect the mounted accounts. The fstab entries below expect these files in /etc, so create them there. The files are sensitive to extra spaces and line feeds, so enter the values exactly.

nano /etc/music.cred

Save and exit.

nano /etc/video.cred

Save and exit. If you mount the work share, create /etc/work.cred the same way.

Restrict the credentials files. You need to make it so only the root account can read them.

chmod 600 /etc/*.cred
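For reference, the file contents follow the standard cifs credentials layout: one username= line and one password= line, nothing else. A sketch with made-up placeholder values (substitute your share account's real login):

```shell
# Example credentials file. "mediauser" and "secret123" are
# placeholders, not values from this guide.
cat > music.cred <<'EOF'
username=mediauser
password=secret123
EOF
```

The same two-line format goes into each of the other credentials files.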

Now let's permanently mount some of these shares:

nano /etc/fstab

Add the following to the bottom of the file. The noperm option (no permission checking) on the first line is recommended for any share the user will upload or write into; read-only (ro) shares shouldn't need it.

//worldbook-work/work /mnt/work cifs credentials=/etc/work.cred,rw,uid=65534,gid=2001,noperm 0 0

//worldbook-media/music /mnt/music cifs credentials=/etc/music.cred,ro,uid=65534,gid=2001 0 0

//worldbook-media/video /mnt/video cifs credentials=/etc/video.cred,ro,uid=65534,gid=2001 0 0

# These following lines bind the mounts to our FTP folders.
/mnt/music /var/ftp/media/music none bind 0 0

/mnt/video /var/ftp/media/video none bind 0 0

/mnt/work /var/ftp/work none bind 0 0

/var/ftp/work /var/ftp/workmedia/work none bind 0 0

/var/ftp/media /var/ftp/workmedia/media none rbind,_netdev,noauto 0 0
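For reference, each of the cifs lines above follows the standard six-field fstab layout. Here is the work entry broken apart for annotation (in the real file all of this sits on one line):

```
//worldbook-work/work    # 1: remote share  (//server/sharename)
/mnt/work                # 2: local mount point
cifs                     # 3: filesystem type
credentials=/etc/work.cred,rw,uid=65534,gid=2001,noperm
                         # 4: options -- uid 65534 is "nobody",
                         #    gid 2001 is ftpgroup, noperm skips
                         #    client-side permission checks
0 0                      # 5-6: dump flag and fsck order (unused)
```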

Because /var/ftp/workmedia/media relies on another nested mount, the timing of the mounts at boot prevents it from mounting right away. To solve this I added some lines to rc.local to force mounting again at boot.

nano /etc/rc.local

before exit 0 add these lines.

sleep 10

/bin/mount -a -t cifs

sleep 5

/bin/mount /var/ftp/workmedia/media

Exit and Save.
Reboot, then go to /var/ftp (or your other mount folder) and see if it worked.

You may notice that I bind my shares into the FTP folders instead of mounting them there directly. This is because my FTP folder is for FTP, while the mount folder is for any purpose. If I decide my FTP server should also act as a DLNA media server, there is nothing left to do but point the DLNA server at my mount folder.

Saturday, February 6, 2010

HOWTO: Setup Virtual Users with shared folder access on Pure-FTP in Ubuntu 9.10

This post assumes you have already set up Pure-FTP on Ubuntu Server 9.10. If you have not, go read my earlier posts. Start by logging into the server and switching to root.

If you haven't already, make sure Pure-FTP reads the database file for virtual users:
ln -s /etc/pure-ftpd/conf/PureDB /etc/pure-ftpd/auth/50pure

Restart your FTP server process or just reboot.

Create an FTP group and two shared system accounts. Individual users will get virtual accounts mapped onto these.
groupadd -g 2001 ftpgroup

useradd -u 2010 -s /bin/false -d /dev/null -c "pureftp limited access user" -g ftpgroup ftplimited
useradd -u 2011 -s /bin/false -d /dev/null -c "pureftp full access user" -g ftpgroup ftpfull

Create a set of directories for the FTP files and to serve as chroot access points. I create a mess of folders for my purposes. You can read about my goals in an earlier post.
cd /var
mkdir ftp
cd ftp
mkdir dropbox
mkdir work
mkdir media
cd media
mkdir music
mkdir video
cd /var/ftp

mkdir workmedia
cd workmedia
mkdir work
mkdir media
chown -R ftpfull:ftpgroup /var/ftp
chmod -R 755 /var/ftp
chmod 735 /var/ftp/dropbox
Mode 735 lets dropbox users (who connect as ftpgroup members) enter and upload, but not list the folder's contents.
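As an aside, the same tree can be created in one shot with mkdir -p. Sketched here under a scratch directory (/tmp/ftp-demo is a placeholder of mine, not the guide's path) so you can experiment without touching /var/ftp:

```shell
# Build the full FTP directory tree in one command.
# BASE is a scratch location for illustration only.
BASE=/tmp/ftp-demo
mkdir -p "$BASE/dropbox" "$BASE/work" \
         "$BASE/media/music" "$BASE/media/video" \
         "$BASE/workmedia/work" "$BASE/workmedia/media"

# List what was created.
ls "$BASE"
```

mkdir -p creates missing parent directories and never complains if a directory already exists, which makes scripted setups repeatable.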

Let's add some users:

Joey - dropbox user, SLOW download (he shouldn't need to download anyway)
pure-pw useradd joey -u ftplimited -g ftpgroup -d /var/ftp/dropbox -t 1
Make up a password for joey at the next prompt.

Pam - Dropbox admin, normal download, but slow upload
pure-pw useradd pam -u ftpfull -g ftpgroup -d /var/ftp/dropbox -t 200 -T 1

Mr. Smith - Work user, normal download speed, unlimited upload
pure-pw useradd smith -u ftpfull -g ftpgroup -d /var/ftp/work -t 250

Dave - Media user, normal download speed, unlimited upload
pure-pw useradd dave -u ftplimited -g ftpgroup -d /var/ftp/media -t 250

ME. No speed limits.
pure-pw useradd john -u ftpfull -g ftpgroup -d /var/ftp/workmedia

After adding or modifying users you must issue this command to commit the changes.
pure-pw mkdb

Some other helpful commands:
List all user accounts
pure-pw list
Show a particular account details
pure-pw show dave
List all active FTP sessions (this one is a separate tool)
pure-ftpwho
Reset User Password
pure-pw passwd dave
Delete User
pure-pw userdel dave
Modify User (give Dave an upload speed limit of 20)

pure-pw usermod dave -T 20

You should now be able to log in as any of these users. However, since we have not mounted the shares yet, no files will be visible. Some users will be able to upload files; delete those test files before mounting your shares.
Next I'll go through my mounting techniques, where I also use binding to simplify things.

Friday, February 5, 2010

HOWTO: Setup Pure-FTP on Ubuntu 9.10 with passive NAT support.

Please read my earlier posts on what I'm trying to accomplish. They will guide you through setting up an Ubuntu 9.10 Server and give you an idea of my conventions. This assumes you are using Ubuntu Server 9.10; other flavors of Linux may behave differently. The majority of my guidance came from an Ubuntu howto. My real work came from putting the other pieces together to meet my particular needs, namely the other five posts in this series.

Log in and switch to root with sudo su again:

apt-get install pure-ftpd

Now you should be able to connect to your FTP server on port 21 at the server's IP using your username and password. We are not ready to connect from the outside yet.

Now set up the Pure-FTP settings. These settings files work differently than the Pure-FTPd documentation indicates; this may just be an Ubuntu packaging choice. In the end, the files are 'translated' into command-line switches.
Each line below simply writes a value into a file. You can do the same by opening the file with nano and typing the value on the first line. Some settings files already exist and should remain, as they contain other important settings.

cd /etc/pure-ftpd/conf

Security Related Settings:

echo 99 > MaxDiskUsage
You can lower this, but keep some limit here or someone could crash your server by filling the disk.

echo no > PAMAuthentication

echo 20 > MaxClientsNumber

You can change this, but don't go too high. Be realistic about your bandwidth and server power.

echo 4 > MaxClientsPerIP

echo yes > ChrootEveryone

echo yes > NoChmod

echo yes > ProhibitDotFilesRead

echo yes > ProhibitDotFilesWrite

Network Settings:

echo YOUR.EXTERNAL.IP > ForcePassiveIP
Replace YOUR.EXTERNAL.IP with your external IP address, assuming you are behind a NAT router.

echo ,3421 > Bind
A port for your FTP server. Port 21 is the standard, but I change mine to keep them guessing. The leading comma is intentional.

echo 60000 60100 > PassivePortRange
For NAT forwarding issues you need to set this. Make sure your range holds at least twice MaxClientsNumber ports. This and the Bind setting will be used on your router, so write them down.
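A quick sanity check on the range, using the values assumed in this guide (20 clients, ports 60000-60100):

```shell
# Values from this guide; adjust to your own settings.
low=60000
high=60100
max_clients=20

ports=$((high - low + 1))     # ports available in the passive range
needed=$((2 * max_clients))   # rule of thumb: 2 x MaxClientsNumber

echo "range holds $ports ports, need at least $needed"
[ "$ports" -ge "$needed" ] && echo "OK"
```

With 101 ports for a 40-port requirement, the range above has plenty of headroom.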

Misc Setting:

echo yes > BrokenClientsCompatibility

Finally restart your FTP server:
/etc/init.d/pure-ftpd restart

Here is my output from the command above.

Restarting ftp server: Running: /usr/sbin/pure-ftpd -l puredb:/etc/pure-ftpd/pureftpd.pdb -x -O clf:/var/log/pure-ftpd/transfer.log -u 1000 -S ,3421 -c 20 -k 99 -C 4 -P -X -b -8 UTF-8 -Y 1 -A -E -R -p 60000:60100 -B

You can now go into your router and forward the main port (3421) and the passive port range (60000-60100) to your server's IP.

Test your system from outside of your network if possible.

In the next post I will show you how to setup some virtual users with shared folder access.

Thursday, February 4, 2010

HOWTO: Set up Ubuntu 9.10 Server with SSH and Static IP

If you read my earlier post, you know I have strange requirements for my FTP needs. While I'm still working out the kinks, I have solved many problems with Samba, shared access, groups and more using Pure-FTP and Ubuntu Server. All of this now runs on the new mini server I built last week for $219 shipped.

I'm not going to cover any particular step in too much detail (it would make a book), but I will note a few things. Other steps are just listed.

Install Ubuntu Server edition.
Start the server install and, for most of the prompts, just choose the logical or default choice. However, here are a few that some may find confusing:

  • For disk partitioning, choose Guided - Use Entire Disk. Do not choose the default LVM option unless you know what it is; LVM disks are harder to clone, and some disk utilities do not work with them.
  • For proxy settings, leave this blank unless you know yours for sure.
  • For automatic updates, choose manual updates.
  • For packages, choose SSH Server and no others. You can always add more later, but get this working first.

After installation is finished reboot the system.

Logging In via SSH

I chose to keep the server headless (no monitor, keyboard, mouse, etc.), so all that is plugged into the computer is a network cable and power. The computer can boot faster if you go into the BIOS and turn off booting from devices other than the primary drive; it also improves physical security a little. Running this computer headless means we need to connect via SSH. To do this, you can download PuTTY for free. If you have Ubuntu as your desktop, you can find PuTTY in Add/Remove Programs.

The next step is to find out the server's IP. I simply go to my router and it lists it.
In PuTTY, just type in the IP of the server, accept the host key, and log in with the name and password you created during the install.

Commands/text that should be typed will be italicized. Perform all actions in sequence.

Switch to root/administrator:  
sudo su

Upgrade your system
You will want to do this whenever you log in via PuTTY and see that packages are ready to be installed. Refresh the package lists first, then upgrade.
apt-get update
apt-get upgrade
If you see that some packages were held back, then you also need to use:
apt-get dist-upgrade

I recommend rebooting if you used apt-get dist-upgrade. There are other ways to apply certain settings without rebooting, but I just default to a reboot.
reboot now

Don't forget to switch to root if you rebooted
sudo su

Change to a static IP. I have only one network interface; yours may be different, and your network's IP settings may be different too.
nano /etc/network/interfaces

Change dhcp to static on the last line. Mine now reads iface eth0 inet static

Then add the following lines to the end.
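A typical static stanza for a 192.168.1.0/24 home network looks like the following; the addresses here are placeholders of mine, so substitute the values for your own network:

```
address 192.168.1.50
netmask 255.255.255.0
gateway 192.168.1.1
```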

Then press control-o to save and press enter to keep the current name.
Then press control-x to exit
From now on it is assumed you will save and exit nano on your own.

nano /etc/hosts
Change the IP on the second line to the static address you used above. My hosts file starts like this (atom is the name of the computer):
127.0.0.1   localhost
<your static IP>   atom
At the bottom, add a line for each file server or other network resource you need to access: the IP followed by the system name, one entry per line. Mine lists worldbook-work and worldbook-media.

save and exit.

/etc/init.d/networking restart

At this point your PuTTY session should stop responding. Close PuTTY and start a new session to the new IP. You will need to accept the host key again.

Congrats. You now have a server setup and ready to work. Now you need to give it something to do. In this case I'm going to make mine serve FTP. Check out the next post where I install and configure Pure FTP with security settings and passive NAT support.

Wednesday, February 3, 2010

HOWTO: Set up a versatile FTP server with Pure-FTP and Ubuntu 9.10 Server

Over the next several posts I am going to give you the steps I used to set up my Pure-FTP server. I have some requirements that others may share, and I feel you could benefit from parts of this too. Here are the broad details, with hypothetical names and purposes to make explaining the situation easier.

I have two network storage devices called worldbook-work and worldbook-media; I will refer to these as work and media respectively. Work is used by my company as an information store; users need to read and write to it. Media has two separate shares on it, one for music and one for videos. I set this up to allow my family remote access. There is no need to write to media remotely, so the share account on the device only allows reading, as another layer of security.

Finally, I need to support a dropbox for a photography group I am in. The photographers drop their pictures off; the administrator logs in, downloads the pictures, puts them on another website, and deletes the files once retrieved. Photographers should not be able to see or modify another photographer's files. I am using local server disk for this, since the files are transitory and the original photographer can send another copy if a problem arises.
To summarize: work users need full access to work; family needs read access to media; I need full access to work and read access to media; and some users need the dropbox as described above. All while most shares sit on an SMB mount.

Here are the coming posts to show you what I did.
  • Setting up Ubuntu Server with Static IP, network device aliases, and SSH support.
  • Setting up Pure-FTP server with Passive NAT support, and proper security precautions
  • Setting up a directory structure with permissions and virtual users with bandwidth control
  • Setting up mounts with credentials files for added security, and binding the share mounts among many folders.
  • Setting up TLS security to enable encrypted FTP. Referred to as FTPS.

To do/desires:
  • Force TLS for all users except dropbox users.
  • send an email after a file is sent to dropbox, but no other share
    • (Not possible without a separate server process from what I can tell.)
  • Refine directory management and permissions. Perhaps have a virtual layer of permissions on top of existing directory permissions.
    • (Not possible without a software change. Though it should be possible to code with dot files in folders.)
  • Allow each virtual user to have a virtual private folder.
    • (I have no clue or even a suggestion for this)
Check out the coming posts in this series to see the way I solved this task.

Tuesday, February 2, 2010

Jailbreaking the iPhone

I love my iPhone, and the last thing I thought I would ever do is jailbreak it. I wasn't even sure what that really meant. However, my computer lost my iTunes library for the second time in a row, which left me totally unprepared, without any kind of backup (it's a long story). I am so tired of rebuilding my iTunes library and fiddling with things that would be so much easier without all this 'protection'. I'm certain I have lost several games and music purchases. But what really gets me is losing my app data. All those hours spent on game accomplishments I'm certain I could never repeat are very important to me. Oh yeah, I've also had to re-enter my data into mSecure a couple of times, and I sure don't want to do that again either.

Well, it turns out there is an easy way to back up your app data, but it's only available in jailbreak land. It's called Chronos. Why Apple doesn't provide some means of backing up just app data is beyond me. A full restore can be destructive to certain things when all you need is to back up your app data, sync with a new iTunes library, and restore the app data. There are other great improvements available for jailbroken iPhones which I may explore in a future post.

The amazing thing about jailbreaking the iPhone is that it is surprisingly easy and very low risk. Simply download the utility, plug in your iPhone, and click one button. As for risk, jailbreaking doesn't modify the phone; it just gives you a means to install new applications. The applications you choose to install bear the risk, much like installing applications on your computer. In the worst-case scenario you would have to restore the iPhone, but very few people have had that issue.

If you want to learn more, post a comment and I'll try to write about it.

Monday, February 1, 2010

Ten Tips for Keeping Windows Fast and Secure (Part 3 of 3)

If you missed the first two parts, please go read them now.

8. Use an active virus scanner and spyware blocker.
    There isn't much explanation needed for this one. In the Windows world it is required. On Ubuntu, virus and spyware scanners are not needed, thanks to other measures in place that help prevent infections. For Windows, ensure the definitions are updated daily and don't let your scanners expire. You need to stay up to date.

9. Don't click links in emails even if you know the source.
    This is true for any OS, but even more so with Windows. In every OS, links in emails can trick you into revealing your passwords and other private data to bad guys. In Windows, they can also infect your computer easily. Also watch out for attachments. If you get a file from someone you know, it can still be bad. If it's a video or a picture, ask them to put it on a popular site like YouTube, Flickr, or Facebook, then go view it there. All it takes is a new virus that infects that type of file to get into their system, and then it will be in your system. Using a known third party doesn't eliminate the risk completely, but it reduces it.

10. Don't be an administrator.
    Many of us share a computer with family. It can be a pain in the butt, but to help keep the bad guys out, give each family member their own account and make sure it is not an administrator account. In fact, it's better if you create a separate non-administrator account for yourself too. When you need to install a new application, switch to the administrator account and run the install. Some applications won't play nicely with this and expect administrator privileges. Really, if they can't get these security requirements right, do you expect them to get other things right?

If you enjoy this blog, please click the follow button!