Linux CRON Scheduling and Setup for MySQL Backups

Backups are a must-have for any site or product that hopes to be successful. Back up as often as you can, as much as you can. In this short post I'm going to quickly outline Linux CRON scheduling with MySQL to back up your databases.

Once logged into your server run:
crontab -e

This opens your CRON configuration file and shows all the scheduled tasks you currently have set up on your system. Here is an example of what part of my config file looks like:

# m h dom mon dow command

# db Backups - Every day at 4AM. - Write to _backups folder.
0 4 * * * /usr/bin/mysqldump -u'root' -p'[password]' dbname1 | /bin/gzip -9 > /path/to/backups/dbname1-`date +\%m-\%d-\%Y`.gz
0 4 * * * /usr/bin/mysqldump -u'root' -p'[password]' dbname2 | /bin/gzip -9 > /path/to/backups/dbname2-`date +\%m-\%d-\%Y`.gz
0 4 * * * /usr/bin/mysqldump -u'root' -p'[password]' dbname3 | /bin/gzip -9 > /path/to/backups/dbname3-`date +\%m-\%d-\%Y`.gz

# db backup cleanup - Every day at 3AM. - Remove backups that are 30+ days old (based on the last time the file was modified)
0 3 * * * /usr/bin/find /var/www/html/_backups/ -mtime +30 -exec rm {} \;
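A side note on the cleanup line: find's -mtime test counts in whole 24-hour periods, and since the comment says "30+ days old" the test should be -mtime +30 (a bare 30 matches only files aged exactly 30 days). Here's a quick sketch you can run to see which files would be selected (the file names are made up, and touch -d assumes GNU coreutils):

```shell
# Create a scratch directory with one fresh file and one back-dated file,
# then ask find for anything older than 30 days.
workdir=$(mktemp -d)
touch "$workdir/new-backup.gz"
touch -d "40 days ago" "$workdir/old-backup.gz"   # GNU touch: back-date the mtime

old_files=$(find "$workdir" -type f -mtime +30)
echo "$old_files"   # only the back-dated old-backup.gz is listed
```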

These are some pretty basic setups, so I’ll explain the very first line:

0 4 * * * /usr/bin/mysqldump -u'root' -p'[password]' dbname1 | /bin/gzip -9 > /path/to/backups/dbname1-`date +\%m-\%d-\%Y`.gz

In this cron schedule I am backing up a database named "dbname1" using the mysqldump executable. Then I am getting a little fancy and compressing the dump (to save space) using the gzip executable. The compressed file is written to a backups folder of my choice, and I'm naming each file with the day's date, since these run daily.

You may be puzzled by now at what the "0 4 * * *" does. There are explanations out there that cover it more thoroughly, but here is a quick one. The five fields are minute, hour, day of month, month, and day of week. The first field (0) is the minute, 0-59. Note that a single value like 5 * * * * means "at minute 5 of every hour," not "every 5 minutes"; for every 5 minutes you would use the step syntax */5 * * * *. The second field (4) is the hour of the day, 0-23, on the 24-hour clock, so 4 is 0400 or 4AM. In my case, the task runs at 0 minutes past 4AM, every day.. or simply 4AM daily. The third field is the day of the month, 1-31. If you wanted the task to run at 4AM on the 15th of every month, you would do 0 4 15 * *. The fourth field is the month, 1-12. If I wanted to run a task on the 15th of October at 4AM each year, I would do 0 4 15 10 *. Lastly, the fifth field is the day of the week, 0-7, where both 0 and 7 mean Sunday. Using this format, if I wanted a scheduled task to run at 4AM every Tuesday, I would do 0 4 * * 2.
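As a quick cheat sheet, here are those schedules as they would appear in a crontab (the script path is a placeholder):

```
# m    h  dom mon dow  command
*/5    *  *   *   *    /path/to/task.sh   # every 5 minutes
0      4  *   *   *    /path/to/task.sh   # every day at 4AM
0      4  15  *   *    /path/to/task.sh   # 4AM on the 15th of every month
0      4  15  10  *    /path/to/task.sh   # 4AM every October 15th
0      4  *   *   2    /path/to/task.sh   # 4AM every Tuesday
```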

I hope this helps explain the CRON scheduling tab a little more to those just getting started. One flaw to note: whenever you are making backups you want, if at all possible, to store them off-site instead of on the same server. What good is a backup if the server goes down and you can't access it? I'll write a part 2 on how to move backups off-site after you've done this much leg-work to make them happen!

If you find any errors in this post please let me know.

Universal Geneve Medico-Compax ca. 1947 cal 285

Universal Geneve Medico-Compax, with a rare 100% original dial and hands, unpolished case, etc. I put an ostrich strap on it, but this watch is amazing in person. UGs are very hot in the vintage collecting market and only going to increase in price as time goes by.

Honeybee Hive – The Extraction

Since I live on a farm I get the joys of multiple garages, barns, and storage areas. Going out to one of the old horse barns on a Saturday morning, I noticed an increased volume of bugs, bees, and other insects flying around. We have plenty of carpenter bees constantly eating up the wood beams and whatnot, but this time I noticed smaller bees flying in and out of what seemed like a concentrated spot. What does anyone do when they see this… WHERE'S THE CAN O' WHOOPASS.. or RAID, or whatever you have that kills wasps, hornets, etc.

Luckily I'm a terrible shot, and these buggers were hard to get to.. so I missed most of them and only killed maybe 1 or 2 bees. Upon further inspection I noticed they weren't attacking me or anything else, so I looked at a dead bee, did some research, and what do ya know: it's honey bees. From the entrance it seemed like they were camping out under my floorboards–a very tight space with really no way in or out.. besides the little hole they made. Perfect for a bee hive.. cozy in the winter, protected from predators (for the most part), and easy to defend an entrance in such a small area.

Fast forward a bit (2 months) and my friend Jim Leether stops by some Saturday with beekeeping equipment, hive, and everything needed to move these guys to a safer area where they can thrive and produce honey and wax that can be harvested! Oh.. and I would get to use my workshop again, inside the horse barn, without disturbing them :). Without further ado… Pictures!

16610 Submariner ca. 1989

Every watch enthusiast has a starting point, whether it be a $50 Seiko and a handful of quartzes, or inheriting a handful of vintage mechanical chronos. Either way, over time a collector or anyone interested in watches eventually finds out what they enjoy the most, and they start homing in their purchases on the specific types that attract them. Usually the first few purchases are a "tax" paid to learn this. That's how I started too, but I'm starting to zero in my collection on watches with significant meaning to me and in the history of watches.

I recently sold one of my first “diver” watches, a Tag Heuer Aquaracer, which was a beautiful watch. Why did I sell it? To help fund this beauty that you are about to see.

One of my “grail” watches, which has evolved over the years, has been a birth-year submariner. It used to be a “hulk” submariner, then a sea-dweller, and then back-and-forth between the three. After quite a while of thought and decision making, I decided the birth-year submariner sung to me the most, and had the most meaning, something I could keep until I die, and hopefully pass on to my kids some day.

I went with a 16610 Submariner, which includes the date window and is COSC certified. I found it at the right price and the right time and decided to pull the trigger! I couldn't be happier with my purchase, and although my watch collection is still quite small, this piece makes quite an impression and will forever be one of the most memorable purchases of my life.

(More photos to come as I plan to interchange a few worn leather straps and potentially NATO straps to create different flavors for different occasions!)


Write VERY large record-sets to CSV, JSON with ColdFusion

So I was tasked to build something that writes a very large data set to CSV and JSON files. I was running into memory issues with ColdFusion because I was trying to store the entire CSV string in memory before writing it to a file on the server. For a 1 million+ row recordset in SQL Server, this would usually hit the 2-hour request timeout and simply never finish. By using the underlying Java objects we can write 1 million+ records to a CSV or JSON file in about 8-10 seconds. Credit for finding these Java objects goes to my co-worker Sean Louisin.

The fix for this, which worked for me, is using the native Java functions that ColdFusion has access to. My process and full function below:

1) Instantiate the Java objects we need:

oFileWriter = CreateObject("java","java.io.FileWriter").init(local.FileName, JavaCast("boolean","true"));
oBufferedWriter = CreateObject("java","java.io.BufferedWriter").init(oFileWriter);

This creates our File and prepares the Buffer for writing to that file.

2) Create headers for the CSV file (Query Columns)

oBufferedWriter.write( local.qCSV.columnList & chr(13) & chr(10) ); // Write Headers First

3) Loop query and write directly to the file.

   for (row in local.qCSV) {
      local.rowArray = arrayNew(1);
      local.columnCntr = 1;

      aCol = listToArray(local.qCSV.ColumnList);
      for (fieldName in aCol) {
         local.rowArray[local.columnCntr] = PrepareCSVValue(local.qCSV[fieldName][local.qCSV.currentRow]);
         local.columnCntr += 1;
      }

      oBufferedWriter.write(arrayToList(local.rowArray) & chr(13) & chr(10)); // Write fields, followed by a CRLF newline.
   }


Here we are using a generic function to "prep" each CSV value by escaping double quotes, etc. Then, after we process the fields for the current row, we write them directly to the file using arrayToList().. so that they are comma separated for the CSV file. By doing it this way, we do not have to store the whole string in memory and risk out-of-memory errors; we write each row directly to the file and move on to the next. It's easy to see why holding everything in memory is a bad idea when working with datasets of millions of records.
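The post references PrepareCSVValue() without showing it, so here's a hedged sketch of what such a helper might look like, assuming standard CSV escaping rules (quote every value and double any embedded double quotes):

```cfscript
// Hypothetical sketch of PrepareCSVValue -- the original implementation is not
// shown in this post. Wraps each value in double quotes and escapes any
// embedded double quotes by doubling them, per common CSV convention.
public string function PrepareCSVValue(any value) {
   var stringValue = toString(arguments.value);
   return '"' & replace(stringValue, '"', '""', "all") & '"';
}
```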

Here's the full function for query-to-CSV file writing:

public any function exportToCSVFile(required query q, required string fileName) {
   var local = {};
   local.FileName = arguments.fileName;
   local.qCSV = arguments.q; // The query object which we will export

   oFileWriter = CreateObject("java","java.io.FileWriter").init(local.FileName, JavaCast("boolean","true"));
   oBufferedWriter = CreateObject("java","java.io.BufferedWriter").init(oFileWriter);

   oBufferedWriter.write( local.qCSV.columnList & chr(13) & chr(10) ); // Write headers first

   for (row in local.qCSV) {
      local.rowArray = arrayNew(1);
      local.columnCntr = 1;

      aCol = listToArray(local.qCSV.ColumnList);
      for (fieldName in aCol) {
         local.rowArray[local.columnCntr] = PrepareCSVValue(local.qCSV[fieldName][local.qCSV.currentRow]);
         local.columnCntr += 1;
      }

      oBufferedWriter.write(arrayToList(local.rowArray) & chr(13) & chr(10)); // Write fields, followed by a CRLF newline.
   }

   oBufferedWriter.close(); // Close the buffered file writer.
}


Writing a query to a JSON file is much easier, and I won't explain the objects again. One caveat: SerializeJSON() still builds the full JSON string in memory before it's written, so extremely large queries can still be an issue here. Here is a query-to-JSON-file function:

public any function exportToJSONFile(required query q, required string fileName) {
   var local = {};
   local.FileName = arguments.fileName;

   oFileWriter = CreateObject("java","java.io.FileWriter").init(local.FileName, JavaCast("boolean","true"));
   oBufferedWriter = CreateObject("java","java.io.BufferedWriter").init(oFileWriter);
   oBufferedWriter.write( SerializeJSON(arguments.q) ); // Write the serialized query
   oBufferedWriter.close(); // Close the buffered file writer.
}
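For completeness, a minimal usage sketch of both functions. The query, table, and file paths here are made up for illustration, and this assumes a CF engine with queryExecute() available:

```cfscript
// Hypothetical example: export a large table to CSV and JSON files.
qUsers = queryExecute("SELECT id, name, email FROM users");
exportToCSVFile(qUsers, "/path/to/exports/users.csv");
exportToJSONFile(qUsers, "/path/to/exports/users.json");
```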

Hamilton Men’s Khaki Field Automatic

It’s that time of year again.. Yep my birthday! My wonderful girlfriend decided to get me one of the best birthday presents yet – a new watch. No matter what, she can’t go wrong getting me a watch. I even got to pick it out. What did I choose? Well I wanted something that came original with a leather strap. Something that was a nice casual, everyday watch.. yet not too fancy or eye-catching.

Hamilton Watches

This seemed like the perfect brand for what I was looking for. Yea sure, they have some crazy looking watches like the "Code Breaker" or the "Classic Jazzmaster". I dove into the Khaki Field collection.. it just seemed perfect. I like the simple face, no chrono, just a date, as well as the numbered dial and sweeping seconds. The black dial watch was calling my name. The others seemed too catchy, albeit very nice watches.


Three things I love about this watch.

  • Automatic Movement
  • Leather Strap
  • Black Dial

Along with the other usual must-haves of a sapphire crystal glass, sweeping seconds hand, and a smooth outer casing that weighs and feels great.

Would I recommend this watch?

If you're looking for a subtle, everyday, yet casual watch.. and want to keep the price under $1,000.. this is it. It has a great feel, though as with all leather straps it will take a few days of wearing for the leather to hug your wrist a little better. But that's a small price to pay for such a nice piece.


My Trading Story

I started investing and getting into the stock market back in October of 2011. The markets were just bottoming from a relatively large correction at that time and I had bought stock in Ford (F), Verizon (VZ), Paychex (PAYX), Freeport McMoran (FCX), and Apple (AAPL). I had under $10K invested at the time and was doing alright. By mid 2012 I had pulled a decent 10-20% profit on each investment. This was to be expected, since if you bought anything in October 2011 and held it for a few months you’d instantly be up as the bull market continued.

Then I started options trading.

After making profits on my investments, I wanted to do more, make more.. build up my account more. I began learning how to trade options after I realized their powerful return potential. I first bought options in AAPL, some monthly options for an earnings play. I just figured they would beat earnings, so I took around $1,600 in options risk on that play. After earnings, the stock opened just about neutral and my options, already decaying badly, were reduced to close to zero. This was the first loss I ever took playing options, and I was devastated to say the least. How could $1,600 be lost so fast? I had nearly wiped out all my gains from stock trading within a day or two.

And I continued trading options.

After that loss, I decided I had to make it back. So I traded larger, riskier plays in order to have a potentially large payout. I ended up putting more money into my account, bringing the total I had put into it to around $30,000. By the end of spring 2013 I had gambled away about half of it, sitting at $15,000. I was losing money very fast and didn't know what to do. At the end of the week, Friday September 20th 2013, I put just about all I had left into AAPL options. During that day AAPL was hit hard, down quite a bit, and ended the day at $467 per share. I went into the weekend holding 20 call contracts, with whatever money I had left in more calls.. This was one of the worst weekends I can remember; every minute I was checking Twitter and news sources for anything AAPL related to see if I was going to be hosed come Monday morning. I did the stupidest thing.. essentially I put everything on black.

The worst thing happened.

Over the weekend, AAPL announced it had sold 9 million iPhones in the opening launch of the 5S and 5C. This was 'well ahead' of expectations. The stock opened at $496 a share, almost $30 higher. I sold all my calls as fast as I could at the open. I made a boat-load of money. This was great, right? I made back everything I had lost, and then some. My account ended that day at $52,000. I was ecstatic. Then I did the smartest thing out of all of this.. I took a 2-week break from trading. Eventually I came back.. made some more, lost some.. etc. I ended 2013 with $72,000. Doubling my money in one year, not a bad year eh?

January, incoming.

I continued my trading style, and during January I played AAPL many times, against the tide.. breaking all the rules of trading.. I was adding to losing positions, doubling, tripling down.. I wasn't cutting losses, but letting them grow and grow as AAPL's stock price continued to fade. I played earnings again and lost. I ended the month of January down $30,000. I knew something needed to change, or my account would soon be $0. I knew that my earlier trading had been lucky and that those gains had been doing more harm than good–to my trading psychology, that is. You see, I didn't think I could lose a lot of money, and I thought my trading was sustainable. Boy was I wrong.

What's happened since then?

Since then, I've changed my trading strategy. I trade NOT to get rich quick, but to pursue consistency while mastering a sustainable trading strategy that can work with any amount of capital. On every trade, no matter what, I risk 1% of my capital.. meaning my losses are capped at that amount. Anything worse would be detrimental to my trading psyche and to my trading account.

Learn from my naivety.

Just because I made good money one year doesn't mean that my trading strategy worked. I was throwing money at the market, and it just so happened it was paying me back pretty well. Over time, that strategy will crumble to the ground. By risking 1% of your capital on any given trade, you're protecting your account. In this game it's all about surviving. You don't have to trade a lot, you don't even have to win a lot. You just have to protect your account from losses, and the winners will come. Cutting losers has been the hardest thing for me to grasp and learn; however, once I understood it and put my pride aside, my account started to grow again. Take it from me–have a plan when you trade, have an entry, an exit, and a good reason, all while managing your position size so that you can't lose more than 1% of your account. Trading shouldn't be gambling.. you should create an edge for yourself and then manage that edge so that you can win.

Helpful resources to learn more

WatchHimTrade – Live Options trading.
New Trader U – Trend trading with technical analysis.
Kay Kim (2 Traders Club) – Trend trading with technical analysis.
Train Wreck Trader – What can happen when you don’t trade properly.

Introducing: Lets Monitor


Back in February of this year, a colleague at my current job recommended an awesome site-monitoring mobile app. This of course was a normal day for us, talking about new software or apps that we randomly find. After looking into this mobile monitoring app, I liked it and it seemed very useful; however, it wasn't as user friendly as I had hoped, nor did it offer the experience I was expecting. In fact, none of the site monitoring tools I had ever looked at gave me that sense of satisfaction and enjoyment. Each time I was left with the feeling of "I just want this space to be updated to today's experiences, but maintain the nerdy back-end quality". Through the hard work of my colleagues and long-time friends @MichaelDick & @EvanDudla, I hope we've scratched that itch with:


Simple Website Monitoring

Website monitoring doesn't have to be loaded down with technical terms or complicated options (unless you're into that stuff, then have at it!); instead, monitoring tools should be simple, reliable, and actionable. Lets Monitor hits each of those points by bringing you an experience from today's age, reliability backed by our colleagues at the NSA, and actionable alerts to make sure you're the first one that knows.

Why is Lets Monitor different?

The first thing we notice with each of our new customers is the ease of use that we provide. We get comments such as "Wow, that was easy!", "Really? That's all I have to do and I'm set?", and "Where are all of my technical options? Don't I have to set those first?". These reactions speak to what the average user has been "trained" to expect from other services out there. After they realize that it really is as simple as adding your site and walking away, they are almost left feeling like they are lacking something–the array of options or technical details they've come to expect. We're different because we say… add your sites, walk away. Leave the gory techy details to us, we've got your back.


Crowded Space

So after reading this far, you're probably thinking "oh, just another site monitoring tool, there are too many of these already".. and we can't dispute that. What we can say is that we're not competing with any of the enterprise tools out there. There are far more complicated monitoring tools and experiences that we just don't care about. And yes, some companies and enterprise customers salivate over those tools, which provide them with exactly what they need.. and that's great. Who we're here to disrupt for and bring value to is, as I like to call them, "the little guys".. the engineers, the entrepreneurs, the small businesses, and those who can't afford to pay $20+ a month just to monitor their sites, yet still need to stay on top of their online properties. Guys like me, guys like you; we're here to provide a product that helps us all and doesn't break the bank doing so.

Go forth, never miss another outage!

This isn't ground-breaking new technology, nor is it going to change the world; however, it may change the way a few of us look at site monitoring and hopefully set a new expectation for how monitoring should be done. Removing complication from everyday tasks is a goal many new products aim for, because no one wants to further complicate something so simple. Our lives are busy enough! Sign up for Lets Monitor's 14-day free trial.. check it out, and then if you think it provides you value, upgrade to premium and we'll love you for it. Take a vacation, you deserve it :)