I just converted my last Ubuntu machine to ArchLinux.
Everything went fairly smoothly (with the exception of the ever-changing order of HDDs, which is now fixed without the Live CD thanks to UUIDs), but I couldn’t figure out how to re-create my RAID1 mdadm array.
Googling away actually didn’t return what I wanted! I was starting to panic, as I hadn’t thought out my backup routine as well as I should have (i.e. don’t back up to a software RAID array!), so I sorta needed to get it restored!
Thankfully I figured it out:
mdadm --assemble /dev/md5 /dev/sdc1 /dev/sdd1
Back in business!
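Once the array is assembled, it’s worth making it stick across reboots. A quick sketch, assuming the same /dev/md5 array as above (the mdadm.conf path is the usual default, but check your distro’s):

```shell
# Check the array actually came up clean
cat /proc/mdstat
mdadm --detail /dev/md5

# Record the array in mdadm.conf so it assembles automatically at boot
mdadm --detail --scan >> /etc/mdadm.conf
```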
As with all my software, I like my server to do the work where possible. I have my torrents, calendar/PIM, knowledge base and now my RSS feeds all accessible via the web from my webserver.
TT-RSS is a great piece of software. It’s AJAXified PHP, so the interface is smooth and nicely responsive.
It has a couple of great features: it can run single- or multi-user, and there is a grouping plus tagging/label function. Labels involve setting SQL match strings, but I haven’t used that yet. TT-RSS can be set to fetch feeds in the background or when you load the page.
I imported my OPML from Akregator and now I’m up and running; I just needed to sort out the categories, as TT-RSS only supports one level of grouping as opposed to Akregator’s nested folders.
I recommend giving it a go!
EDIT: Added links, for some reason I didn’t before .. weird!
My file server has a 500GB XFS volume on LVM spanned across 3 HDDs, so I thought I’d check how fragmented my system was. Here was the result:
actual 352480, ideal 58355, fragmentation factor 83.44%
83.44%! WOW! I’m amazed … It does contain a lot of files I download through TorrentVolve that usually get extracted then moved around, so I wasn’t surprised it was fragmented. I’ve left the defrag running now; I wonder how long it’ll take?
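For anyone wanting to reproduce this, the numbers above come from xfs_db’s frag report, and the defrag itself is xfs_fsr; the device and mount point here are placeholders for my setup:

```shell
# Report file fragmentation ("actual ... ideal ... fragmentation factor")
sudo xfs_db -c frag -r /dev/vg0/data

# Defragment the mounted filesystem in place; -v for verbose progress
sudo xfs_fsr -v /data
```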
And of course – thanks for the tip Micke!
I’m currently in the process of trying to find a piece of software that I can use as a web-based knowledge base.
To date, I’ve been using MediaWiki, but I find the navigation of wikis painful. If you want everything on one page, it’s great, but once you start getting sub-pages, often cross-referencing, I find it becomes cumbersome to navigate around.
Here’s what I want:
- Ability to search the entire knowledge base for a word/string/phrase contained within a document.
- Easy to view and then edit the data.
- Ability to reference and cross-reference documents in a tree structure.
- Ability to enter information as easily as in either the WordPress WYSIWYG editor or its code editor.
- Is compatible with Opera! (You’ll see why in a sec)
- Is web-based and runs on apache2 and if necessary, php5 and mysql. No postgresql.
- GPL (or equally ‘free’) software.
The one I like most at the moment and have been entering data into is KnowledgebasePublisher, but it’s not quite working out how I hoped.
The structure of the information is exactly what I want, and the search feature is good too, but viewing then editing information is painful. There are separate public and admin views on different parts of the site: in the public view you can’t edit, and in the admin view you can’t view! There is a WYSIWYG editor, but it’s just plain bad. Switching between (to use WordPress definitions) Code and Visual adds all its own formatting, often overwriting what I’ve put in! I have to manually type out all the HTML tags to get the formatting I want. This is where MediaWiki excelled. The last annoying aspect is that, by default, admin mode only shows the last ten posts ordered by original date!
I used to send myself “Daily” emails with various links and stuff I found on the web during my lunch breaks, often researching my next ‘pet’ project. But work monitors emails, so I wanted to stop this; now I can just stick all the links in a text document and put them on my site at the end of the day, with minimal hits on my web server through work’s proxy. Instead I wanted to create a loooong thread where I can edit/add/delete links as I find and deal with them. If they are good, I stick them in another topic-related thread, e.g. I am building one for debugging my current Postfix issues.
Anyway, KBP is now starting to bug me (after 3 days and 35 articles built up), so I want to find something else. I’ve been scouring freshmeat.net under document storage and information management. I haven’t found anything that crash hot, except one which I then realised was proprietary at something like $200USD p.a.! No way Jose, I want GPL! If I really like it then I’ll donate. But the search continues.
I’m now contemplating expanding my search into a CMS type system. I’ll let you know how it goes.
This was one of my posts from my privately hosted WordPress blog. I decided not to expose it to the world as 1. my connection only has 256kb upload and 2. I keep private docs on that same PC, so it’s a security issue for me. Anyway, here were my original experiences, posted 18 February 2007.
My adventure into the World of Linux (WoL) started off with me building a server for file serving and basic web hosting (the site you are currently on now). It’s a PC in my linen cupboard and it does some funky stuff for me!
Setting up Linux as a server was, I have to say, fairly painless. It’s been many years since I last used Linux; I set up a Slackware 9 box to play with and I think I had it about 2-3 weeks before going .. what now? Linux has come a long way since then in terms of distros that work ‘out of the box’ .. <in steps Ubuntu>.
So, I want a server (PC care of my dad) … Ubuntu is at the top of DistroWatch, I go check it out and they have a server edition, cool! Download, install … using their help site. Wow, I’m up and running .. how easy was that!! I was impressed. Since then I broke that server and reinstalled from scratch, but basically set it up the same again.
In steps another PC (care of the father-in-law) with higher specs (SCSI card & tape backup care of the brother-in-law) and a nice fast SCSI HDD ($20 from the Discount Computer company I used to work for), a stop-gap until I can get a dual-processor rackmount server I can attach to the bottom of one of the shelves! [Now’s a good time to mention the first PC was running IDE at ATA33!! woot, fast </sarcasm>]
I decided to do more research, to see if another distro was more suitable. After hours of reading, I came to the conclusion that Ubuntu Server is a fairly decent server product. It installs very little to start with, so you can pick and choose. The Ubuntu repositories have up-to-date software, at least everything I needed: the latest MythTV, Wine etc.
So, I installed it all from scratch again and it’s working sweet; in fact it’s running this site and the associated gallery, plus a few other web-based services that I use for personal stuff, and it hardly gets above 50-60% CPU utilisation.
My only problem now is I keep running out of hard drive space. The SCSI is only 18GB and the IDE I’ve put in is 40GB; that will tide me over until I can get something bigger in it!
Next story is my wonder story of installing Linux on my desktop with Enlightenment E17 (mmmm)!
I’ve actually been getting a lot of hits (OK fine – my only hits) about this, and upon re-reading I realised I made a few assumptions about getting VMware in the first place.
So this is part zero, the bit before what I previously wrote.
- Head on over to VMWare Server page found here
- Click the download link for VMware Server 1.0.2
- Download the tar.gz of VMware Server for Linux (BTW it’s a little over 100MB)
- Whilst that’s downloading, go register here
- Once it finishes downloading, which depending on your link speed could take a while, let’s unpack it.
tar -zxvf VMware-server-1.0.2-39867.tar.gz
(Remember, just type tar -zxvf VM then hit tab to complete)
- Run the installer from inside the extracted vmware-server-distrib directory:
sudo ./vmware-install.pl
- When it asks:
Before running VMware Server for the first time, you need to configure it by invoking the following command: “/usr/bin/vmware-config.pl”. Do you want this program to invoke the command for you now? [yes]
- Now do what I mentioned in my previous post
And once it all finishes you should have a working VMWare Server under Feisty!
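Putting part zero together, the whole sequence looks something like this (version numbers per the download above; vmware-server-distrib is the directory the tarball extracts to):

```shell
# Unpack the downloaded tarball
tar -zxvf VMware-server-1.0.2-39867.tar.gz

# Run the installer from the extracted directory
cd vmware-server-distrib
sudo ./vmware-install.pl
# When prompted, let it invoke /usr/bin/vmware-config.pl for you
```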
My wife & I have a 7-week-old newborn and we have been taking photos like crazy, thanks to a big 1GB memory card. Very, very quickly we are amassing a lot of photos, another 2GB+ in the first 7 weeks, and being digital, it’s so easy to lose them: through HDD failure [unlikely but possible], accidental deletion [wife learning the Linux desktop – very possible!]; or my worst fear, fire or theft – someone picking up my PC and walking out the door, or my house burning down!
I have almost a terabyte of hard drive space (actually 948GB) scattered across my 3 machines. 750GB of it is in my file server in LVM and only 400GB-ish is currently used. I realised very quickly that I’m never going to back it all up, but I do have at my disposal, thanks to a kind brother-in-law, a DDS20/40 tape drive. On top of that, I don’t want/need a full enterprise backup solution. If any of the PCs’ OS dies or gets severely corrupted, I’ll reinstall from scratch. It’s an inconvenience, but nothing I have running is ‘mission critical’.
So I’ve had to decide what I’m going to back up. Basically, all I care about is my documents (about 1GB worth) and my photos (now 8GB), both kept on my desktop PC, plus the config files of my server (/etc – about 200MB, I think). All my music and videos will have to be sacrificed if the worst happens.
I’ve decided to implement a two-fold backup strategy. I keep my originals on my desktop PC, regularly copy my docs/photos to a specific backup share on my file server, then less regularly palm them off onto tape. At the moment it’s all done manually: I manually copy the files and manually run the tape script (synbak).
To the point – I wanted to do it automatically!
Firstly, I wanted to setup an automated way of copying my local docs/photos to the file server. After a bit of Googling I found the article Backup for the Home Network from Linux Gazette. I wanted a script that did incremental backups and could handle doing it over the network. This fulfilled both requirements! This is how I set it up:
- I grabbed their backup script, run-backup, and saved it to my script directory, ~/scripts. This is a directory under my home drive where I put all my scripts. Depending on what a script does, sometimes I’ll create a symlink to /usr/local/bin, but in this case it’s a personal backup so I’ll skip that step. **
- Make the script executable, just for me though!
chmod 700 run-backup
- I ensure my file server’s backup share, /data/backups, is mounted on boot-up via /etc/fstab. See the section below about mounting with Samba.
- Tweak the run-backup script with my settings.
- I needed to create the last-full directory so:
sudo mkdir /data/backups/last-full
- Double-check tar is in the place the script expects:
which tar
- I changed the full-backup day to Monday, i.e. changed $DOW = "Sun" to $DOW = "Mon". We tend to take most of our photos on the weekend, so this was a better choice; plus, being a desktop PC, it gets used more on the weekends and I don’t want it slowing down while it’s backing up.
- Next I set up a cronjob which runs this on a daily basis at 4pm. My home PC is almost always on, at least it’s always on when I’m awake. I did this as my user, not a root cronjob:
- I added:
0 16 * * * /home/user/scripts/run-backup
Just to make sure the script is working, I ran it manually. Depending on the number of files it could take a while; my 9GB worth takes about 25-30 min.
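For the curious, the core idea of the run-backup script (a weekly full plus daily incrementals) can be sketched with GNU tar’s --listed-incremental option. This is my own minimal illustration, not the Gazette script itself; the paths are placeholders:

```shell
# backup_once SRC DEST: full backup if the state file is absent,
# incremental (changed files only) otherwise.
backup_once() {
    src=$1
    dest=$2
    snar="$dest/last-full/state.snar"    # tar's incremental state file
    mkdir -p "$dest/last-full"
    # On the full-backup day (Mon in my setup) you would rm -f "$snar"
    # first, forcing everything to be re-archived.
    tar -czf "$dest/backup-$(date +%Y%m%d-%H%M%S).tar.gz" \
        --listed-incremental="$snar" -C "$src" .
}
```

The first run produces a full archive and writes the state file; subsequent runs only pick up files changed since the state file was last updated.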
Problems with Samba!
When I was testing, the script kept dying at about 2GB. On closer examination, it died at exactly 2147483647 bytes (2^31 − 1) each time, for 3 runs. At the time I didn’t have time to examine what was wrong, but during my lunch break I was reading my Linux news and luckily turned up the article Using external file devices in Linux: Climbing the “mount” command.
At the end of the article, the author, dcroxton, had the same problem: there is a limitation in Samba whereby it only handles files up to 2GB by default. I’d found the solution without even looking for it!
With a quick addition of ,lfs to the end of the mount options for my backup share in /etc/fstab, I was back in business! I ran the script and all 9GB went across fine. Just to be sure, I extracted the files out again and checked them, and they were all fine and dandy!
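For reference, the relevant /etc/fstab line ends up looking something like this (the server name, share and credentials file are placeholders; the important bit is the lfs option tacked on the end):

```
//fileserver/backups  /data/backups  smbfs  credentials=/etc/samba/cred,uid=1000,lfs  0  0
```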
The next article is going to be on the synbak script and backing it all up to tape.
** The reason for this is that I’ve broken a few installations. As /home is a separate partition, when I remount it, all my scripts are back! I’ll post what I have in there soon; there are some nice little ‘helper’ scripts.
EDIT: I noticed that I made a few early assumptions on getting to this stage. For completeness I have gone back and described how to get VMware Server. Check it out here.
EDIT2: Updated for new vmware-any-any-update109. Thanks to Monku for letting me know.
VMWare Server is available for free from their website. All you need to do is ‘register’ and get a key. The key is reusable if you ever need to reinstall (which I’ve done a few times now!); mind you, there are different keys for Linux and Windows.
When installing vmware-server (and I believe workstation but I don’t use this) on Feisty I was getting errors around compiling vmmon.ko and it would bomb out.
When running 6.10 Edgy I had this same problem; however, I was using a vanilla kernel.org kernel – 2.6.20, the latest at the time. Feisty, however, ships 2.6.20 as standard, so this problem will (I imagine) occur for everyone.
There is a great discussion about this on the VMWare forums, and it took me a little while to digest all the info. Here is a description of what the patch does, but I’ll run through how to install it.
Go to http://ftp.cvut.cz/vmware/ and download the latest vmware-any-any-updateXXX.tar.gz update (update109 as of writing), then unpack it:
tar zxvf vmware-any-any-update109.tar.gz
Make sure you have what you need to compile
wajig install build-essential **
Enter the directory:
cd vmware-any-any-update109
Run the script:
sudo ./runme.pl
It should prompt you to run vmware-config.pl as well, but if it doesn’t, enter:
sudo vmware-config.pl
Go through and answer all the VMware questions and you are up and running! To run the server console, the command is vmware.
** wajig is a simplified command-line replacement for the package management tools that come with Debian/Ubuntu. It combines the various apt-get/apt-<everything else> functions into one tool and saves a heap of key presses over time. It doesn’t come as standard, so get it with:
sudo apt-get install wajig
(Make sure you have universe enabled in your sources.) Otherwise, just use the boring sudo apt-get install XXXX.
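To give a flavour of the keystroke savings, here are a few wajig subcommands next to their apt equivalents (shown as comments):

```shell
wajig update            # apt-get update
wajig upgrade           # apt-get upgrade
wajig install mythtv    # apt-get install mythtv
wajig search rss        # apt-cache search rss
```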