Wednesday, 13 October 2010
Vulcan Bomber Spotted
Today I went out at lunchtime to watch a few work colleagues fly their model planes, which were quite nice. There were two regular-looking planes, one of which was a P51 Mustang (Cadillac of the sky), and also a really great looking bi-plane. They were all quite large scale models.
Whilst watching, we were quite amazed at the odd sight of a real-life bi-plane flying over, which seemed an odd coincidence, as it's not every day you see one, especially not whilst flying a model one!
But then, to add to the bizarre aeronautical experience, lo and behold, in the not too far distance, a bloody great Vulcan Bomber came flying by! I couldn't believe what I was seeing! It was far enough away to be rather quiet, but that just added to the impression of how large it is, as it looked huge even in the distance.
It was flying very low and pretty slow and was accompanied by a smaller plane, which just looked minuscule in comparison. Unfortunately it didn't come overhead but made a long circle around us.
Even more bizarrely coincidental was the fact that I had been standing under one of these the previous weekend in the RAF Museum, London (an excellent place to visit). When you're under one, it is absolutely massive!
Apparently this is the only one left flying now, and looking it up today, it seems it's not likely to be flying for much longer. They need to raise a lot of money to keep it airworthy. Check out the appeal website.
I feel privileged to have seen it.
Friday, 24 September 2010
The Moon, Jupiter and Galileoscope
Whilst looking at the moon, Jupiter was VERY visible to the lower right of the moon, less than a fist's width away! It was an awesome sight. Jupiter has just (in the last few days) passed the point where it was at its closest to Earth for some years and is just past opposition.
I couldn't get over how bright Jupiter was, so I did what any good geek would do and got my binoculars out. Whilst collecting my binoculars I thought, hmm, may as well grab my Galileoscope as well.
I bought the Galileoscope last year when they first started releasing them. I have had a few plays with it, but not a lot. I only have a camera tripod (which is better than no tripod!) and keeping it steady and navigating the sky can be tricky. But I have seen Mars through it and also had a nice view of the Orion Nebula. The Orion Nebula was definitely recognisable as a cloudy splurge, which was enough to get me excited. Mars was still just a tiny round dot, but it was nevertheless a disc as opposed to just a point of light. It was also a bit 'salmony' pink in colour, making it definitely more than just a star (by star I just mean point of light).
But last night, I looked at Jupiter first with my binos, and even though it was hard to hold them steady, I could definitely detect it was a disc, not a point of light. And I could also, for the first time, distinguish what looked like a moon of Jupiter to its side!! I was amazed that I could make that out with just my binos!
So I got the Galileoscope quickly set up and pointed it at Jupiter. This was tricky: firstly actually aiming it at Jupiter, and secondly finely adjusting the direction it pointed, which was all but impossible with my camera tripod. It was a case of squeezing it in the direction I wanted to move, overshooting, and knowing it would spring back to where I wanted. Not very clever, but all I could manage. Also, I had to ensure no part of me touched the scope or tripod, as it just wobbled like mad.
But it was worth it! I got a view of Jupiter through the scope and was able to definitely distinguish three of the four Galilean (quite fitting with my scope) moons! I was again amazed. Not only could I make out the moons, but I could also just about make out a band roughly around Jupiter's equator!!! It was mostly a bright white disc, but definitely with a slightly darker band around the middle.
I was chuffed! I can easily see why people get the bug and want bigger and better telescopes all the time. I will definitely, when life/finances allow, get myself a decent telescope one day.
Monday, 23 August 2010
Backing Up My Life With Amazon Web Services
S3 seems to be the solution I have been looking for, for ages: a simple, flexible, no-contract, reliable and easily accessible storage facility that resides entirely in the cloud. Its cost is based on exactly what you use; rather than renting a fixed amount of space on some server, you just pay AWS for what you use of their services. So if you upload N amount of data, you will be charged for storing N amount of data per month. If you access that data with COPY, GET or PUT commands, then you will be charged per request. So you can use it as little or as much as you like.
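To give a feel for the pay-per-use model, here is a back-of-the-envelope sketch; the rates in it are made-up placeholders purely for illustration (check the AWS site for the real prices):
#!/bin/bash
# Rough monthly S3-style cost estimate. The rates below are NOT
# Amazon's real prices, just illustrative placeholders.
GB_STORED=20
PUT_REQUESTS=5000
GET_REQUESTS=10000
echo "$GB_STORED $PUT_REQUESTS $GET_REQUESTS" | awk '{
    # e.g. $0.15 per GB-month, $0.01 per 1000 PUTs, $0.01 per 10000 GETs
    cost = $1 * 0.15 + ($2 / 1000) * 0.01 + ($3 / 10000) * 0.01
    printf "Estimated monthly cost: $%.2f\n", cost
}'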
There is a perfect little application called s3sync, a small Ruby program which syncs data you specify to S3. It works very much like rsync. This is ideal, as it is a command line program, so I can run it on my web server, which contains a lot of data I wouldn't like to lose, such as family photos/videos.
I altered my backup script to use this new method, based on a great tutorial I found on the web, which is dated 2006, so this approach has been around for a long time.
#!/bin/bash
#Backup script for server
#set variable of date for labelling
date=`date +%F`
cd /home/user/backupdata/
#remove oldest mysql backup
rm `ls -t *mysql* | tail -n 1`
#Dump mysql databases
mysqldump --all-databases > /home/user/backupdata/${date}_mysql_backup
cd /home/user/bin/s3sync/
export AWS_ACCESS_KEY_ID="My Amazon Access Key ID"
export AWS_SECRET_ACCESS_KEY="My Amazon Secret Access Key"
export SSL_CERT_DIR=/home/user/bin/s3sync/certs
export AWS_CALLING_FORMAT=SUBDOMAIN
ruby s3sync.rb -r -v --ssl --exclude="/home/user/some/dir" --delete /home/user mybucket:
ruby s3sync.rb -r -v --ssl --delete /var/www mybucket:
I have this set up as a cron job once a week.
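For reference, the crontab entry for this looks something like the following (the script path and the 2am Monday schedule are just examples; edit with 'crontab -e'):
# minute hour day-of-month month day-of-week command
0 2 * * 1 /home/user/bin/backup.sh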
Friday, 20 August 2010
Upgrade Drupal Core and Restore from Mysql Backup
I tried to upgrade my Drupal core once before, but it didn't work. I went 'more or less' through the process described in UPGRADE.txt, which accompanies the download, but afterwards the admin pages still said it was out of date, as if I hadn't done anything, which was odd as I had deleted almost everything and replaced it with the new version.
Anyway, I came back to it recently and decided to sort it out. I downloaded the latest core and followed the instructions to the letter. But once again, my admin pages still showed me at 6.12, as if I had done nothing!
This was obviously a bit annoying as I had very carefully followed the documentation. So I put my problem out in the #drupal-support IRC room on irc.freenode.net and started getting some helpful hints to start tracking down the problem.
The very best bit of information I got was a suggestion to use drush. drush is a command line program that you run in a terminal on the web server, within the site root. It is a simple little program that can perform such functions as checking the current status of the site, checking for module updates, installing those updates, clearing the cache, running update.php, etc. The beauty is the feedback it gives you, highlighting any problems with error messages to help track them down.
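To give an idea, the sort of commands I mean look like this (names as in the drush of that era; check 'drush help', as they vary between versions):
drush status     # show Drupal version, database connection, paths
drush pm-update  # check for and apply module/core updates
drush updatedb   # run pending database updates (like update.php)
drush cc all     # clear all caches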
The first problem I came across when trying to update all modules was a duplication error, where one 'piece of code' was conflicting with another of the same name. This was preventing modules from updating. So after tracking down and removing the duplicate files, all modules (except core) updated seamlessly.
Now every time I ran the command
drush pm-update
or drush up drupal
to try and get the core module, drupal, to update from 6.12 to 6.19, it would run through the apparent process with successful messages, but if I ran them again, it would keep saying it was at 6.12 and needed updating. So the update was failing. The trouble was that drush wasn't reporting any error messages, and even
drush core-status
would confusingly say it was at 6.19, but everything else said 6.12. I tried to track down any duplications in my site's directory, but there was nothing. So I finally decided to start afresh: wipe everything out and install 6.19 as new.
To ensure I would not lose all my site content (nodes, images, modules, theme, functionality), I needed to back up everything I could possibly need.
I made a complete copy of my site's root directory, in this case /var/www/. I also made an up-to-date dump of the mysql database; very important, as this is where everything is contained.
To make a backup of the mysql database:
mysqldump drupal > drupal_dump.sql
Obviously ensure you back up the right database. In this case my database was called 'drupal', but it could have been called anything. If you need to add user and password options, add
-u username -ppassword
after mysqldump (note there is no space after -p; use just -p on its own to be prompted for the password instead). This will create a text file containing the whole database. I wasn't sure how I would use it yet, but I just knew backing it up was the thing to do if I stood any chance of putting all my data back into the new site. Next was to make the bold move and empty /var/www. I then dropped the database in mysql.
To drop a database, enter the mysql environment by running 'mysql' with admin privileges, then simply enter the command:
DROP DATABASE drupal;
I then re-created the database so it was new and empty, but using the same name:
CREATE DATABASE drupal;
I believe, perhaps naively, that I have to use a unique user for each mysql database I have, so I issued the following command to grant privileges to my chosen user on the drupal database:
GRANT ALL PRIVILEGES ON drupal.* TO myuser@localhost IDENTIFIED BY 'password';
Next I downloaded and unpacked the latest drupal core:
cd /var/www/
wget http://ftp.drupal.org/files/projects/drupal-6.19.tar.gz
tar xzvf drupal-6.19.tar.gz
I then went to my root page in a web browser and ran the standard install script to get a new, empty up to date drupal installation.
Next was to get all my existing data back into my site from the mysql backup, to try and make it appear as if everything was just as it was before.
The key to getting all content and settings back is to write the mysql dump back into the newly created, albeit identically named, drupal database. But before this will work, you have to put back all the modules that were present when the mysqldump was taken, so the database doesn't get confused, thinking it has modules installed that are not present.
To find out which modules were installed, I created a temporary database in mysql and wrote the backup into that, just so I could examine it. To write a mysqldump into a new database:
mysql drupal_tempdb < drupal_dump.sql
I then wanted to view what modules were installed in this database backup. To do this I viewed what tables were in the database in the mysql environment:
USE drupal_tempdb;
SELECT * FROM system WHERE type = 'module';
This showed me all the entries in the system table for modules. I could then examine this list to see which modules were installed and, importantly, which version of each. This is where the backup of /var/www/ comes in handy, as all the modules that were installed are contained in there, and the list in mysql also tells me the location of each module. So I just copied each module back from the backup into the right location in the new installation.
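For anyone wanting to script that copying, a rough sketch of the idea (the backup path is hypothetical, and core modules are skipped since they come with the fresh install):
#!/bin/bash
BACKUP=/home/user/www-backup  # copy of the old /var/www
# contrib modules live under sites/, core ones ship with the new install
mysql -N -e "SELECT filename FROM system WHERE type='module' AND filename LIKE 'sites/%';" drupal_tempdb |
while read f; do
    dir=$(dirname "$f")  # e.g. sites/all/modules/views
    mkdir -p "/var/www/$(dirname "$dir")"
    cp -a "$BACKUP/$dir" "/var/www/$dir"
done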
Once I was happy that I had replaced all the existing modules, I wrote my drupal database dump into the new database and ran update.php... and it was mostly back!
All my themes were back (after copying the theme dir from backup), and all my nodes and views were working. There were just some files missing, such as uploaded images, but again, copying these back in from the backup solved this.
Saturday, 7 August 2010
Seesmic Desktop on Ubuntu
I have been using Seesmic (#seesmic @seesmic) on my Android phone (HTC Desire) for a while now and must say I like it a lot. It just works nicely and looks good. Plus support for Facebook is coming soon, and I like the 'all-in-one' type of approach.
I had a look at the Seesmic web app and it looks great; it works anywhere you have a web browser. I also noted they have a desktop client, which runs using Adobe AIR. Adobe AIR is available to install from the repos in Ubuntu, so I gave it a go.
First:
sudo apt-get install adobeair
Once installed, this added Adobe Air Application Installer to the Applications > Accessories menu.
Download the .air file from the Seesmic website.
Open the installer from the menu and it prompts you to browse for an .air file to install; simply follow the on-screen instructions to install the app. You will be prompted for your admin password.
Once installation is complete follow the simple set up wizard to add Twitter and Facebook accounts.
Great app; check it out.
Thursday, 15 July 2010
Predators
Sunday, 11 July 2010
My Simple Ubuntu Home Server
I had wanted to set up a simple home server properly for a long time. Just something that would serve up my media files and give me somewhere to permanently set up some samba shares, where any computer can access and store data. I also wanted to continue to be able to back up my (this) website at home. This I was doing using a simple NAS drive connected to my router: I could just FTP into it from the web server and mirror the server as a backup.
The things that bothered me about the current backup strategy were a) FTP is not very secure, and b) I had only one NAS drive and no permanent, automatic way of duplicating that drive in case it failed. I had a spare USB enclosure and HDD which I would use as the backup of the backup.
I had recently acquired an old Dell Latitude C640 which would do the job just nicely. I planned to do the following to start:
- Install Ubuntu Server Edition
- Set up SSH
- Set up the old NAS drive as simply an external USB drive to store my data on, mounted permanently (hereafter known as the media drive).
- Set up second USB HDD to act as a back up of the main data drive (hereafter known as the backup drive).
- Set up new method of backing up website server to media drive using rsync.
- Set up Samba shares of certain directories on the media drive for other computers to be able to mount and access.
- Set up Music Player Daemon (MPD) so that I could connect the server to my home music system and finally have a means of playing my music collection through a home stereo system rather than just through computer speakers.
- Set up Rhythmbox to use the samba share as its music library
Installing the Operating System
I only had an 8.04 Hardy Server Edition CD to hand, so I installed that. The laptop does not boot off USB, otherwise I would have just downloaded and created a bootable 10.04 USB drive. I also did not have a CD burner to hand.
I won't go through all the installation procedure as that's all pretty self explanatory. Once I had the base system installed I needed to get from 8.04 to 10.04 to get the latest version.
First I did:
sudo apt-get update
sudo apt-get upgrade
to get everything up to date. I then had to edit the update-manager config:
sudo vi /etc/update-manager/release-upgrades
Then find the line containing
prompt=lts
and change it to prompt=normal (or script the change; see the sed one-liner below). Save and close, then run:
sudo do-release-upgrade
This upgraded me to the next version, in my case 8.10. Obviously a reasonably good broadband connection will be required to download the c.300-500 MB of upgrades between each release. I repeated this until I got to the latest version, 10.04, Lucid Lynx. This may seem tedious, but it is the correct and neatest way to do it.
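As an aside, that one-line config change could equally be scripted; a quick sed sketch:
sudo sed -i 's/^prompt=lts/prompt=normal/' /etc/update-manager/release-upgrades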
Once I got to 10.04, I ran
sudo apt-get clean
to remove old release installation files.
Setting up SSH
I wanted to set up SSH to be secure, using public keys.
I wanted to remove password authentication to the server and only allow non-root, key access.
I edited /etc/ssh/sshd_config and checked the following lines:
PermitRootLogin no
PasswordAuthentication no
UsePAM no
I then created a key pair using:
ssh-keygen -t rsa
This creates a pair of keys, one public and one private, in /home/user/.ssh (ensure that directory exists before starting). To use keys to log in via ssh, the server needs a copy of your public key in its /home/user/.ssh/authorized_keys2, so before you lock yourself out by disabling password authentication, scp the public key to the server, then append the contents of the key to ~/.ssh/authorized_keys2 with, for example,
cat id_rsa.pub >> ~/.ssh/authorized_keys2
Be careful to use '>>' and not '>': '>>' will append to the file, whereas '>' will replace everything in the file, i.e. losing any other keys already in there. Once the public key is in authorized_keys2, restart the ssh service:
sudo /etc/init.d/ssh restart
Then try to log into the server again; this time it should prompt you for your key 'passphrase'. If you still have password authentication enabled and you forget or get the passphrase wrong, it will prompt you for your regular password. More information on setting up ssh keys can be found here.
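Putting those steps together, copying the public key over might look something like this (assuming the default key name from ssh-keygen):
# on the client: copy the public key to the server...
scp ~/.ssh/id_rsa.pub user@server:/tmp/
# ...then append it ('>>', not '>') and tidy up
ssh user@server 'cat /tmp/id_rsa.pub >> ~/.ssh/authorized_keys2 && rm /tmp/id_rsa.pub'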
Whilst on ssh, I also set up keys to access the server from my webserver, using the same process again as for regular ssh access. But I also wanted to set up a second key, which would just be used for backing up my webserver using rsync over ssh. The second key would have no passphrase, so it could be used in a cron job to back up my webserver without the need to enter a passphrase.
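Creating a key with no passphrase is just a matter of the -f and -N options to ssh-keygen; a sketch, assuming the key name backupkey used in the backup script further down:
# -f names the key files; -N "" sets an empty passphrase
ssh-keygen -t rsa -f ~/.ssh/backupkey -N ""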
This sounds a bit insecure, but I tightened it up by adding a rule in the authorized_keys2 file on my home server.
Looking at the contents of authorized_keys2, you will see entries for the public keys on separate lines. Something like this:
ssh-rsa AKSIDEJE...[long string of random characters]...== user@domain
I added a rule in front of the key that is to be used by the backup script:
from="ip.addy.of.webserver" ssh-rsa AKSIDEJE...[long string of random characters]...== user@domain
This means that this key will only be allowed access if the request comes from the specified IP address. So the only way to use it is to log onto my web server first, which itself requires a key and passphrase. So hopefully secure enough.
Setting Up External USB Hard Drives
The laptop's internal HDD is pretty small by today's standards at 20 GB, and I knew I had at least c.140 GB of data to store, so my only option was to use external HDDs. Well, of course I could have replaced the internal HDD with a bigger one and just had one external USB HDD, but that would mean buying a new drive, and I already had a couple of spare 250 GB 3.5" HDDs, which would do the job for now.
The NAS could become a simple USB enclosure, and I just needed to buy an empty USB HDD enclosure for the second drive.
My problem with this route of external USB HDDs was that they would really need USB 2.0 ports on the laptop to be of any use, and the laptop only had one USB 1.1 port. So I bought a PCMCIA USB 2.0 adaptor to provide the USB 2.0 ports.
Something worth noting here: the first PCMCIA adaptor I bought was some unknown, cheap, made-in-China copy, and it simply did not work. It would allow me to mount the USB HDDs and browse them, but as soon as I started transferring data to the drives, the computer would crash. Every time I tried to rsync (or FTP) data from anywhere (remote or from the internal HDD) through the USB adaptor, the computer would just lock up, with a hard reset the only option.
For a while I thought it just wasn't going to work via a PCMCIA adaptor, but I tried another adaptor in case I had a dodgy one. So I bought a better-known branded one, a D-Link, and thankfully this worked. The lesson: the cheapest option sometimes just doesn't work.
Before setting up the permanent mount of the two drives, the old NAS drive had to be reformatted to ext3. It was currently FAT (or NTFS, I can't remember now) and I didn't want that. It also had all my data on it, so I manually mounted both drives temporarily and used rsync to copy all the data from the old NAS drive to the other, which was already ext3.
I then repartitioned the old NAS drive using fdisk:
sudo fdisk /dev/sdb  # logical name of the freecom drive
d  # delete the existing partition
n  # new partition
p  # primary partition
1  # partition number
1  # start cylinder, when asked 'from 1 to max'
   # press enter to accept the default maximum cylinder
w  # write the changes and exit fdisk
I then used mkfs to format the new partition to ext3:
sudo mkfs -t ext3 /dev/sdb1
I then used rsync to copy all the data back to this drive, as this was now the primary 'media drive'. I now had two copies of my data on two drives.
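The copy-back itself was just rsync between the two mount points; something like this, with hypothetical temporary mount points:
# trailing slash on the source means 'contents of', not the directory itself
rsync -av /mnt/backup-drive/ /mnt/media-drive/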
Mounting External USB HDDs Permanently
To mount the drives automatically every time the server boots, you just need to put some simple entries into /etc/fstab. The best way to refer to the drives is by their UUIDs, so there is no confusion with any other drive that may arrive on your system bearing the same label.
To find out the UUIDs of the two drives, mount them somewhere temporarily and then:
ls -l /dev/disk/by-uuid/
This should produce an output similar to this:
lrwxrwxrwx 1 root root 10 2010-06-08 19:28 2cfda077-e676-4832-80b2-aad33963136b -> ../../sda1
lrwxrwxrwx 1 root root 10 2010-06-08 19:26 5b3df31b-5e75-47d7-886d-dc35722189b2 -> ../../sda5
lrwxrwxrwx 1 root root 10 2010-06-08 19:26 8a7d8e21-4def-4b00-8595-e1baae916b54 -> ../../sdb1
lrwxrwxrwx 1 root root 10 2010-06-08 19:26 a85c00fc-abc3-405f-bd84-6447b8b094ce -> ../../sdc1
sda* is the internal HDD, sdb* and sdc* are the two USB HDDs. You can double check with
sudo fdisk -l
to help be sure which drive is which. Next, set up mount points: I created a directory in home called 'media', where I would mount the media drive, and another directory in /mnt called 'backup250gb' for the backup drive. Then edit /etc/fstab and add the following lines:
UUID=8a7d8e21-4def-4b00-8595-e1baae916b54 /home/jonr/media ext3 defaults 0 0
UUID=a85c00fc-abc3-405f-bd84-6447b8b094ce /mnt/backup250gb ext3 defaults 0 0
The drives are now mounted where specified every time the server boots.
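A quick way to test the new entries without rebooting, in case it's useful:
sudo mkdir -p /home/jonr/media /mnt/backup250gb  # mount points must exist
sudo mount -a  # mounts everything listed in fstab
df -h          # confirm both drives appear where expected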
Automatic Web Server Backup
In a previous blog post, I explained how I was using lftp to mirror my web server to a NAS drive at home, as I could only access the NAS drive via FTP.
Of course, now I have a small server running I can use better (more secure) methods. I wanted to set up rsync as the means to perform the backup.
After a few trial runs, using the
--dry-run
option of rsync to ensure everything was working, I edited my backup script like so:
#!/bin/bash
#Backup script for server
#set variable of date for labelling
date=`date +%F`
cd /home/jonr/backupdata/
#remove oldest mysql backup
rm `ls -t *mysql* | tail -n 1`
#Dump mysql databases
mysqldump --all-databases > /home/jonr/backupdata/${date}_mysql_backup
rsync -ave 'ssh -i /home/jonr/.ssh/backupkey' --delete --exclude g2data/tmp /home/jonr/ addy.home.server:/home/jonr/media/.jcrdev/home-backup
rsync -ave 'ssh -i /home/jonr/.ssh/backupkey' --delete /var/www/ addy.home.server:/home/jonr/media/.jcrdev/varwww-backup
The '-i' option for ssh specifies which key to use.
The rsync options:
-a for archive mode
-v for verbose output
-e to specify ssh command
So this rsyncs the parts of my web server I want to back up directly to the mounted media drive on my home server. I run it as a cron job once a week.
Backing up the Media Drive
To add redundancy to the system, in case the media drive ever fails, the backup drive is there to simply be a mirror of the media drive, so once a week I run another simple script on the server using rsync:
#!/bin/bash
#Back up whole media external USB HDD to secondary external USB HDD
rsync -av /home/jonr/media/ /mnt/backup250gb
Simple.
Setting up Samba
I simply edited /etc/samba/smb.conf and added:
interfaces = 127.0.0.1, 192.168.0.2
bind interfaces only = yes
hosts allow = 127.0.0.1, 192.168.0.6 #plus any other IP addresses I want to allow access
hosts deny = 0.0.0.0/0
security = user
[Music]
comment = Music Share
path = /home/jonr/media/music
read only = no
browseable = yes
This set-up allowed only my laptop on 192.168.0.6 to connect; any other host will be denied.
security = user
means only users with an account on the server can access it. Browsing to 'Network' in Nautilus showed me my server, and when accessing it, I was prompted for my username; then I could see the share.
The other way I could access this share is by mounting it with an fstab entry similar to this:
//192.168.0.2/music /home/jonr/Music cifs credentials=/root/.smbcredentials,iocharset=utf8,file_mode=0777,dir_mode=0777 0 0
Where
credentials=/root/.smbcredentials
is a file containing the username and password for the share. This mounts the music share onto the Music folder in my home directory. I set up a few other directories on the server to be mountable via smb by just replicating the section in smb.conf with different directory settings.
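For completeness, the equivalent one-off mount command, plus the format of the credentials file (the username and password are obviously placeholders):
sudo mount -t cifs //192.168.0.2/music /home/jonr/Music -o credentials=/root/.smbcredentials,iocharset=utf8,file_mode=0777,dir_mode=0777
The /root/.smbcredentials file itself just contains two lines (and should be chmod 600):
username=myuser
password=mypassword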
Setting up MPD
What I wanted was to be able to remote control the server to play music, that would be output from the physical server, which I would then plug into a home stereo system, thereby having my music collection available to play through the home sound system, rather than through computer speakers. I wanted it to be a web based thing, so that any computer could just open a web browser and play music.
This was much simpler than I expected.
sudo apt-get install mpd
Then I edited /etc/mpd.conf and adjusted the following lines to suit my needs:
music_directory "/home/jonr/media/music"
playlist_directory "/home/jonr/media/music/playlists"
bind_to_address "192.168.0.2"
I had to chmod 777 the playlist directory so that the music player of my choice could write playlists to it.
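After changing the config, the daemon needs a restart and the music database a rescan; something like this should do it (mpc is a separate little command line client, so installing it is an extra, optional step):
sudo /etc/init.d/mpd restart
sudo apt-get install mpc
MPD_HOST=192.168.0.2 mpc update  # rescan the music directory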
Browser MPD Player
There are many different clients available to control an MPD daemon, but I specifically wanted a web based one, so after a bit of looking around through the many alternatives, I settled on trying phpmpreloaded.
Installation was as simple as unpacking the downloaded tar into the directory of your choice, probably /var/www/, so you can just access it from ip.address.of.server/9099 as default. This presents you with a choice of players that come with the installation. I prefer the first one.
It is very basic, but this is good as it works well from my phone too. I can very simply browse by file or tag, search easily, and then build and save playlists.
Volume
When it came to testing the player out, I struggled for a while to get any sound. After a lot of fiddling with wires in different configurations, I eventually discovered that the headphone socket (yes, headphones out only, unfortunately) wasn't putting out any sound, even when connected to headphones.
After a bit of digging around, I installed a simple little command line program called 'aumix'. This allowed me to turn on the (currently off) volume, and we had MUSIC!
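For reference, it was roughly this (the volume level is whatever suits you):
sudo apt-get install aumix
aumix -v 85  # set the main volume to 85%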
It's very pleasing that I can now just flip to a certain input on the home cinema, browse to the web address with any device, and control the music!
Setting up Rhythmbox Library
One more music-related thing I needed to set up was getting Rhythmbox on my netbook to use the music samba share as its library source. The main reason was so that I could use Rhythmbox to sync music from the collection to my wife's iPhone.
I found that mounting the music samba share using Nautilus or the Places menu's 'Connect to Server' would not allow Rhythmbox to use the share as its library, but using the mount entry described in the samba section above, it worked. So I can just mount the music share whenever I want, with a simple script containing that mount command, or put it into my netbook's fstab to mount it automatically every time.
Ubuntu 10.10 claims iPhone support, and I can confirm it: yes it does! I just plugged the phone into a USB port and it opened up a Nautilus window so I could browse the disk as with any USB storage device. Opening Rhythmbox, I could see the iPhone under the Devices list on the left.
Syncing music is as simple as finding the music in the library and drag-n-dropping it onto the iPhone device. Progress is shown in the status bar at the bottom. Simple!
Friday, 25 June 2010
HTC Desire
After surviving for a few months with an old Motorola V70e, I am back in the world of 3G and smartphones. My 1 year old ended the life of the screen on my HTC Touch HD just a few months before I was due an upgrade. Gotta love kids...
Thanks to the folks at Phones4u for getting me an HTC Desire upgrade, not only for free (if I had gone direct through Orange it would have cost me 50 GBP to upgrade to the same phone), but also with cash back for trading in the Motorola, which meant my wife was able to buy herself out of her old contract and get herself a new iPhone.
First Impressions
This is my first Android phone. I have been using Linux for years now, so have been wanting an Android device for some time, but was stuck with the HTC Touch HD due to the contract. The Touch ran Windows Mobile 6.5 and was a nice phone, but in comparison to the Desire, much slower: slower in the time it takes to open programs and to smoothly scroll around the screen (where you could). Of course the Desire has a Snapdragon 1GHz processor compared to the HD's 525MHz one, so it's bound to be better.
The screen is just gorgeous. The touch screen is so nicely set up; it needs very gentle pressure, almost to the point where it reacts before I make physical contact. It's very responsive and always does as I intend. Sweeping between the seven home screens is very smooth and user friendly. And of course the ability to multi-touch is awesome and very well calibrated. On the HTC home screen, one can pinch two fingers and it zooms out to show all seven home screens as thumbnails for quick selection. It's all very well thought out. The colours are rich and bright and I just love it!
I haven't used an Android device before, so am not sure what a raw Android interface is like, say on a Nexus One. The Desire comes with HTC's own front end, HTC Sense. To my naive eyes, this just seems to add a menu bar at the bottom, with a permanent 'phone' button, a button to access your apps, and a button for adding stuff to the home screen (widgets, apps etc). How much this differs from raw Android I am not sure, but it seems to work well for me.
The camera is pretty good, as far as small-lensed phone cameras go. It may be the same camera as in the Touch HD, as it is also 5MP, but it performs a lot better. The Touch HD camera was my biggest complaint: in anything other than broad daylight, it was painfully slow, taking a good 4 seconds from pressing the button to the actual photo being taken. This was no good for me trying to capture moments with my toddler, who never stayed still. In daylight it was fine.
The Desire has a flash which of course helps no end in low light conditions, even night scenes, although I find the quality of pictures taken with the flash not very good, to be honest.
But overall it has improved a lot, taking pictures quickly. It has touch focus, which I love for setting the point of focus for arty pictures. It also has face recognition, which seems OK, although I have yet to find it of real use. Another thing it lets you do is press the button fully to quickly take a shot; the HD would not allow this, which contributed to its slow capture in low light.
Here is a sample photo taken to demonstrate the touch focus. Obviously I touched on the left hand side to make it focus on the object in the very near foreground.
I'll talk about the apps I have discovered and am using in another post, as there are probably going to be a few. I love the concept of the Marketplace being contributed to constantly by software developers, both professional and amateur. There seems to be an app for everything. It's such a great concept and the Market software works pretty well.
As expected, the setup out of the box was heavily oriented towards promoting Orange, with custom Orange apps. I soon got rid of them and started setting my home screens up from scratch, which seems to be something that constantly evolves, as today I am still constantly rearranging things. But I think that's part of the beauty of it: it's not a phone with a fixed menu, you can just keep changing it at whim.
What I like is how well things seem to just flow. Everything that seems to be the obvious way to do something is there. It's very intuitive. I find myself experimenting with different ways to control things and finding that most of the time it does work like that. It really gives a sense that a lot of different people have given input into how it should work and it's all been implemented, leading to such a usable device.
My wife has an iPhone. I find it utterly boring as far as the phone and its OS go. You can play with apps, but the phone is nothing. The Desire I can play with all day, exploring how it works and finding new ways to do things.
They say only nerds have Android phones, and I can see why.
Created with Drupal Editor
Tuesday, 6 April 2010
phpgedview Genealogy
I thought about starting a family tree a while back and finally got round to starting something this week. One of my criteria was that I wanted it to be web based, so that I could access it from anywhere, keeping it in the 'cloud' rather than tied to just one computer in my home, and also so I could give other people the ability to log in and edit the database.
I thought my parents might like to contribute what they know towards it, so the web based option will work well. I can create them a log in and they can edit it online from where they are.
I am not quite sure yet of the best way to trace your ancestors, but I am sure that will come as I get into it and start asking questions, as obviously many people do it. I am not sure how big my family gets, but I know my wife's side is very big, and has origins in rulers of Ethiopia, so that will be interesting to discover.
I know that some of my wife's family have already done a lot of this work, but I am not sure at the moment if I want to just ask them for info or unearth it myself. Seems pointless just filling it in from someone else's hard work.
Anyway, the software I am trying out is phpgedview. This installed VERY easily on my Ubuntu 8.10 web server; it was just a case of extracting the zip into your chosen directory and running the config page.
I have no experience with genealogy software, but I like the way this works, with the ability to add a lot of detail about each individual, and it is very intuitive as to how all these people are linked. It kind of forces you to only add people who can be linked to existing individuals in the database, avoiding floating individuals who are not connected to anyone and are therefore not part of the 'tree'. As an admin, though, one can add unlinked individuals if necessary.
I have had little time to play with it yet, so I am not sure how it can be customised, if at all, but the good thing is that it is built on a GEDCOM database, which is a standard for most genealogical software. So my database could be exported into another genealogy package.
It allows you to view various charts and download reports of individuals, families etc in PDF or HTML format for easy viewing or printing.
So far so good, hope to build up a comprehensive database.
My set-up can be viewed here, but I think the viewing of details is limited for non-members. I have still to work out the details of the privacy levels.
Tuesday, 23 March 2010
My Brain
I was listening to the Are We Alone podcast this morning on the way to work (on an old iriver instead of my HTC Touch HD, but that's another story) and it was all about how your memory works. One thing I always think about, and complain to others about, is what I perceive to be the state of my memory.
I really think I have problems remembering things. I spend a lot of time listening to podcasts (science, linux, etc), reading literature/blogs and watching documentaries, and I am quite annoyed that I seem to really struggle to retain all the interesting facts I am reading/listening to/watching. I could read a book on something, thoroughly enjoy it, then forget most of it. The subject could come up in conversation and I always feel that I should be able to speak up, because I have read about it, but I usually go quite blank!
It's really beginning to bother me, as I imagine how much I would know if I remembered more of what I took in, as I do feel like I make an effort to keep myself reasonably well read. It's worse when I hear someone talking about something that I know I have learnt about, and they are saying it wrong, but I am not able to recall the facts to put them right.
One of the interesting things in the podcast was that two main things are detrimental to whether you remember something or not: 1) interruption and 2) concentration (I actually can't remember from this morning if that's the right word, but it fits for what I remember I was going to write about).
Interruption is obvious and an external factor, but concentration is really interesting. When I heard this, I realised that that is exactly what happens to me. I find, all the time, when reading or listening to someone, that my mind has actually wandered onto something else and I have to kinda snap myself out of it.
I actually did this whilst listening to the podcast, as I was thinking about the above paragraph, so missed the next few minutes! And then when I realised that's what I had done, I started thinking about that too, missing even more minutes! It's ridiculous.
I don't think I have real problems; I just want to have a better memory, as I have interests in many things and my mind is always going at a hundred miles an hour on different topics.
So I guess if I really do think it is a problem, I will have to find ways to train my memory... can only be a good thing... right?
Monday, 8 March 2010
Samsung N130
Firstly I booted it up into XP to ensure it worked. I was a bit put off by the Samsung recovery facility, which has a partition of its own, plus it also wanted to split the XP partition further into a C: and D: and run a backup. It took ages before I could actually use the machine. The XP install is about 6GB! So I didn't look around that much.
So I quickly set to installing Ubuntu 9.10 on it from a live USB stick, setting up a dual boot. I kept XP seeing as it was already paid for and has its occasional use, but I am very tempted to wipe out everything, including all the restore nonsense, and just have 100% Ubuntu. But I know I would probably regret it. It would be nice to clean it up to just a Windows partition and an Ubuntu partition, but I don't know how to get rid of all the Samsung stuff. Suggestions welcome!
So, as usual, Ubuntu installed, booted and worked wonderfully and quickly. So, so pain free and simple compared even just to the XP initial setup, let alone a fresh XP install. My only hold-up was wireless, as is often the case, but a quick bit of googling found a solution here. Sorted, all working! I updated and installed a few essentials to get started.
On a side note, I was a bit pissed off with the insurance company. I made it clear from the beginning that I wanted the old netbook back if possible. The screen was smashed, but I thought I could use it as a little server. They told me on multiple occasions that getting it back would be no problem.
So I sent it off to be inspected, a very time consuming process, as one glance would show how beyond repair it was, and got a call back offering the Samsung. I agreed on the Samsung and asked for the old one to be sent back. 'That'll cost you £100 please.'
'Eh?!'
Apparently they had failed to mention that as soon as I sent it off it became their property, and I only had the right to buy it back! Maybe this is how all insurance companies work, I don't know; it was my first encounter. So I ended up just paying £15 for the HDD! Gits!