Sunday, 15 December 2013

MySQL Replication

MySQL replication is very flexible for running multiple servers. Most MySQL administrators should already have a copy of "High Performance MySQL"; if you don't, get this book, as it is top notch and will guide you through most configurations imaginable. Rather than repeat what is already written, here are a few things I've found that can help make things simpler.

There are a few things with replication you need to be attentive to:
  • Set a unique "server-id" for each server
  • Keep your binary logs (and relay logs) with your databases
  • Initialize your slave cleanly
  • Run your slave "read-only" 
  • Use "skip-slave-start" if the slave is used as a backup

Figure out a scheme to create a unique "server-id". If your database hosts all have a unique number in their name, you could use that. I've been using their IP addresses converted to decimal (IP Decimal Converter). Even if you aren't thinking about master-master or multi-slave configurations today, setting this from the start will save you the trouble of having to reinitialize everything later.
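If you want to script the IP-to-decimal scheme rather than use a web converter, a small shell function does it; this is my own sketch of the idea (the function name and example addresses are illustrative), packing the four octets into one 32-bit number, which fits comfortably in server-id's unsigned 32-bit range:

```shell
# Convert a dotted-quad IPv4 address to a single decimal number,
# suitable for use as a unique MySQL server-id.
ip_to_decimal() {
  # Split the address into its four octets on "."
  oldifs=$IFS
  IFS=.
  set -- $1
  IFS=$oldifs
  # Pack the octets into one 32-bit decimal value.
  echo $(( ($1 << 24) + ($2 << 16) + ($3 << 8) + $4 ))
}

ip_to_decimal 192.168.1.10   # prints 3232235786
```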

Unlike best practice on other platforms such as SQL Server, with MySQL it is simpler to keep the binary logs with the databases. For example, if you initialize slaves with LVM snapshots, capturing the binary logs and the databases in one snapshot is ideal, and moving binary logs later is a challenge. So set "log-bin" and "relay-log" to a file name only, not a full path ("mysql-bin" and "<hostname>-relay-bin").

Create a script to (re)initialize a slave. If your environment is on the messy side and people are doing a lot of strange things in the databases, you are going to find your slave is at risk of getting out of sync with the master. I would suggest re-initializing the slave(s) regularly, say monthly. Even if your environment is pristine, you will inevitably make changes from time to time and will want a consistent tool for initializing slaves. Again, "High Performance MySQL" has several good ways of doing this; my scripts use LVM snapshots and rsync. The Percona Toolkit (including the former Maatkit) also has some good tools for you.

Running a slave "read-only" ensures only "root" can make changes, which helps protect the slave's integrity. If you want a writable slave, i.e. one out of sync with the master, why use replication at all? Load data as needed and skip the whole replication thing. Even so, very bad schemas (like tables with no unique keys) are still susceptible to falling out of sync, as replicated transactions may not behave consistently. There may be other reasons for slaves to fall out of sync, but this has been the cause every time in my environment, so I can't attribute out-of-sync issues to anything else.

For a server that is intended to be the backup of the primary, "skip-slave-start" is essential for the times you actually use the backup server. It means that every time you restart MySQL you have to manually issue "start slave", which prevents the backup server from automatically pulling transactions from the primary at the wrong moment, such as after you have made a cut-over and are trying to restore the primary.
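Pulling the points above together, a slave's my.cnf might contain a fragment like this; it's an illustrative sketch, and the server-id value and relay-log hostname are placeholders, not values from any real configuration:

```ini
# my.cnf (slave) - illustrative fragment
[mysqld]
server-id        = 3232235786     # e.g. the host's IP address converted to decimal
log-bin          = mysql-bin      # file name only, so logs live with the databases
relay-log        = db2-relay-bin  # likewise, no full path
read-only        = 1              # only privileged users (e.g. root) can write
skip-slave-start                  # replication starts only on an explicit START SLAVE
```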

Wednesday, 27 November 2013

Monitoring Network Traffic at Home

Since last week's news about LG "smart" TVs ignoring privacy settings and sending all your viewing and media information to LG (BBC News: "LG investigates Smart TV 'unauthorised spying' claim"), I started looking at increasing the monitoring of network activity at home to see what my Wii, Sony Blu-ray player, and other devices are up to.

Virtually any router allows you to enable SNMP, which is enough to collect aggregate interface traffic. I have been using Cacti to record traffic for years.

What I've found is that DD-WRT has something called "rflow" to send live packet information to a monitoring server - an equivalent of Cisco NetFlow. The Network Traffic Analysis With Netflow And Ntop guide is very good, and on Ubuntu the ntop server is readily available. This gives a live view of what systems are connecting to what, how much traffic is passing on different protocols and top users. Great if there's any question of who is hogging all the tubes.

But that's not enough to tell if your "smart" DVD player is reporting to Sony that you enjoy "midget porn" in the privacy of your own home (I'm not judging; that was the example in the BBC article). For that, I will need to look at some bigger iron like Snort to really go whole hog.

Wednesday, 17 July 2013

Simple web page for generating passwords

I find it annoying not to have APG handy when I want to create a password. I also don't like using random websites for this, because I can't trust that they aren't logging their output. So I put up a simple little form that uses PHP to invoke APG to create passwords. My form right now is very simple and doesn't support all of the APG options, but it will do:

https://alia.thenibble.org/passwords/

Here's the simple code I'm using:

https://docs.google.com/document/d/12CViQAEW6q2moZ4GtkJllsD0-8pazpVMfeYjlrBz-4s/pub
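The linked document has the actual form code. For the command-line flavour of the same idea, here is a sketch of my own (not the code from the link) that generates a random password straight from /dev/urandom, as a stand-in when apg isn't installed; apg itself offers pronounceable passwords and many more options:

```shell
# Generate a random alphanumeric password of a given length (default 16)
# by filtering /dev/urandom. An illustrative stand-in for apg.
gen_password() {
  len=${1:-16}
  LC_ALL=C tr -dc 'A-Za-z0-9' < /dev/urandom | head -c "$len"
  echo
}

gen_password 20
```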

Monday, 15 July 2013

Use owncloud for RSS feeds

Sorry hosted services, I will be using my owncloud with the "news" plugin for my feeds.

Whipping Up a Quick Site

Recently I helped set up a quick proof-of-concept for a community-type site. We decided to use Google Sites, and we were very well rewarded with the richness of what is readily available once we got a bit used to how everything is put together.

I feel like there is almost no technical knowledge required; it just takes a bit of getting used to. Not much, but some. When you are first trying to lay out your site, you have to poke around for a bit to figure out which things need to be changed on the page, the page layout, the page template, or the site layout. I don't want to do a mock-up, and the site we put together will get taken down too soon to be used as a reference, but as long as you have some idea of the layout you want, you can make it in Google Sites. Sketching it out on paper works just fine. There are a couple of places where some HTML knowledge is useful (not necessary, but useful) if you want something to align or look a very specific way.

Now for the cool stuff.

Site templates - there is a bounty of free site templates, everything from a generic Pokemon theme to a complete Strata community site with calendars, council meeting minutes, and so on.

And if you can't readily find a site template that helps get you started, it is super easy to put a site together even from scratch. Managing the site layout and using page templates makes building your site very fast. You get the look and feel you want for your pages quickly so you can work on the content. A handy trick with page templates: you can set a default "parent" page, which lets you create many pages in one section quickly.

In your site layout, you can have a main navigation bar. You put the nav bar together manually, adding your main pages and sections. When you put sub-pages in the main nav bar, it makes nice little pull-down menus. It sure beats the stock left-hand nav section, which lists all your pages in alphabetical order - get rid of that ugly thing.

And then there's the widgets. The best integration on Google Sites is obviously Google's other products: Calendar and Docs. An event or other calendar can be dropped right in your site and then regular sharing rules apply.

One of the features of Google Docs (or is it Drive now?) that is particularly useful for your site is "forms". You can create contact forms, collect polling data, and probably do a lot more than I've seen in my quick tour.

The contact form is interesting because you can configure the form responses to send notifications whenever the form is filled out. This gives you a stock contact form so you don't have to put an email address on the site, and it only takes a minute or two to set up.

Any information you're gathering, like from a poll, comes with rich analytics out of the box. The form responses have full reports on response selection and trending, which you can publish or not as appropriate.

Lastly, I will mention that you can of course use your own domain for the site. Anonymous visitors to your site will see the custom domain. Users logged in to Sites will instead be redirected to sites.google.com/a/sitename/page/blah/blah/blah...

Super quick and easy. 

Google Sites is "free" so remember "If you're not paying for the product, you are the product."

... And even if you pay for the product, you may still be the product.

</rant>

Saturday, 15 June 2013

Get your ownCloud

I've recently moved some of my "cloud" files to ownCloud. There are some files we all store in the cloud that we probably shouldn't: password files, financial information including taxes, and maybe some dirty laundry too. For these, you don't need "anywhere access"; you probably just need convenient access between your own computers, and the redundancy of having the files stored on both your desktop and your laptop is probably backup enough.

ownCloud is your own cloud: you install it on some machine to be the file repository. A workstation, an Ubuntu server, whatever you have. The stock package in Ubuntu 12.04 is the much older 3.0.x. You can use this with the older sync clients, but it doesn't have file versioning and other features of the newer ownCloud versions, so you may want to add the ownCloud software repository from their site. I assume that setting up ownCloud on Windows is a bunch of double-clicking; that seemed like too much effort, so I didn't bother.

There isn't much configuration: once it's up, your first visit to the web page does the setup, which includes creating an administrator account, ... That's it. After logging in there are more options, like requiring SSL if you want.

There are also a lot of plugins / apps for enabling calendar / contact sync, for integrating authentication with LDAP, external storage, and more than you can imagine.

Since you likely have a hard drive at least 500GB in size, if not multiple TB, this is automatically a much larger repository than you'll get without a monthly bill from cloud services. It's pretty cheap too: a new drive should be under $0.10/GB these days, whereas Dropbox charges about that per GB per month.

Like any of the cloud services, you can access your files via the web interface.

There is a "sync client" for PCs and that includes Windows, Mac, and the major GNU/Linux distributions (CentOS/RHEL, Fedora, openSUSE, Ubuntu, Debian). There are also mobile clients but I haven't tried them.

There's ownCloud for you. You can totally replace the big corporate-run cloud services if you want, or just complement them.

Thursday, 11 April 2013

Cron scheduling for first Sunday of the month

Everyone who uses cron is familiar with the job schedule form:

min hr day-of-month month day-of-week <command>

A problem with cron job scheduling is when you want to run something, like backups or updates, on "the first Sunday of the month". The job spec "0 0 1-7 * Sun" will run every Sunday and on every day from the 1st to the 7th of the month, because cron treats the day-of-month and day-of-week fields as an OR when both are restricted.

The way to work around this is to schedule the job for the possible days and then, as part of the command, check the date before running it. I've just seen what is The Best format for this:

0 9 1-7 * * [ "$(date '+\%a')" = "Sun" ] && <path>/script.sh

(Note the "\%": the "%" character is special in crontab entries and must be escaped. Also, use "=" rather than the bash-only "==", since cron runs the command with /bin/sh.)

This solution comes from LinuxQuestions.org user kakapo in the post here:

http://www.linuxquestions.org/questions/linux-software-2/scheduling-a-cron-job-to-run-on-the-first-sunday-of-the-month-524720/#post4533619

Up until now I used a slightly different form of this, putting the day of the week in the cron schedule and then testing "date +%d" against the day of the month. But the above form is far clearer and easier to schedule jobs with.
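Both forms of the date check can be sketched as shell functions and verified against a known date. This is my own illustration (function names are made up, and it assumes GNU date for the -d flag); 1 December 2013 was the first Sunday of that month:

```shell
# kakapo's form: cron restricts day-of-month to 1-7, and the
# command itself checks whether today is Sunday.
is_first_sunday_v1() {
  [ "$(LC_ALL=C date -d "$1" '+%a')" = "Sun" ]
}

# The older form: cron restricts the schedule to Sundays, and the
# command checks that the day of the month is 7 or less.
is_first_sunday_v2() {
  [ "$(date -d "$1" '+%d')" -le 7 ]
}

is_first_sunday_v1 2013-12-01 && echo "run"   # prints "run"
```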

So props to kakapo for sharing that form and until cron changes how the day-of-the-month and the day-of-the-week fields are used, this will be the best way to schedule a job on the first Sunday of the month.
