Lightspeed Trader – Trading software review

Like many retail traders, you start out with E*Trade, TD Ameritrade, Schwab, etc., learn their trading software, dabble with some money, and then, if you really get into it and do well, you wonder if there is better software out there. I did just that.

Enter Lightspeed Trader. This trading software checked a couple of boxes for me. For one, it had a Mac OS X version, which many leading trading platforms did not offer. This was nice to have but not a deal-breaker, since I could always run Boot Camp or Parallels to use Windows software. I was looking for the fastest and best executions, along with Level 2 and book data. Charting was secondary, as I really enjoy Thinkorswim’s charting platform for most of my needs anyway.

Lightspeed Trader is one of the many platforms in the Lightspeed suite of products, all cleared through Wedbush Securities. At first, the platform takes quite a bit of learning to understand and customize so that you can get the most out of it.


Charting I would give a 6/10; it is probably the weakest area of Lightspeed Trader. The charts work and you can do what you need with them, but they are nowhere near the grade and standard Thinkorswim has set. Maneuvering the charts and adding studies and drawings is a bit cumbersome, although possible.


As for executions, Lightspeed is a direct market access broker, so you do NOT have the broker working against you when you trade, and you can get much better fills. This is very important for high-volume traders who depend on great fills and executions. This broker lives up to its name; it is lightning fast, to say the least. Executions get a 10/10, none better that I know of.


Trading options and stock on Lightspeed Trader is very nice once you get the hang of it. You can set up hotkeys to really speed up the process, and they also offer Hot Buttons, which are the same idea as hotkeys except the predefined actions are triggered by on-screen buttons. The one downfall of the platform (for me) is that it does not support futures trading; that requires opening a separate account (with a $25K minimum). This was a bit of a turnoff, since I enjoy trading futures as well. Nonetheless, the platform is very solid and proven for trading stocks and options, and if you’re a day trader it would be at the top of my list. 9/10 on trading.

Fees & Commission

As with many direct market access brokers, there are fees for market data at the speed and in the real-time manner you receive it. Lightspeed charges a $100 platform fee per month; however, commissions are deducted from that, so if you spend $100+ in commissions the platform fee is waived. Extra book and Level 2 data is priced accordingly on their website. I chose the per-share fee structure, which charges $0.004 per share with a $1 minimum if you’re trading fewer than 100 shares in a transaction. For high-volume traders this is a huge value with cheap commissions. You also get paid for adding liquidity, which further cuts down on your commission cost. Overall, you can't ask for much better fees and commissions at this level of broker, so I give them a 9/10 in this area as well.

Wrap Up

Overall, there are a couple of trading platforms to consider in this space, depending on your goals as a trader. My top three would be 1) Lightspeed, 2) Tradestation, and 3) DAS Trader. All three have comparable fee structures, and all are direct market access brokers. Each requires the $25K minimum for pattern day trading, as all brokers are required to. Tradestation has multiple plans and pricing structures that are NOT outlined very well on their website, so I recommend calling them for a full explanation. Overall, Lightspeed Trader got the green light for me, and I’ll be actively updating the blog with any new things I find.

Respecting your STOP loss in a trade

If you’re trading the stock market, whether day trading or swing trading, respecting your stops is crucial to longevity in the game. Without it, you have the potential to blow up your account and lose all of the profits you’ve made in a single trade. I was recently reminded of that (in a good way; no, I did not lose all my money): I actually RESPECTED my stop and stopped out of a trade even though I did not want to. I will outline it below.

The stock was TSRO. That morning it was gapping down on some news; I really don’t care what or why, but rather what the chart and price action were telling me. It appeared on my scanners, so I looked into it. The daily, weekly, and monthly charts were all somewhat bearish, with the stock moving lower. This gave me a short bias for the morning. I looked at some key levels, took my entry short around 55.70, and let the trade work from there, with a stop above 57 or so if price held up there.

The stock opened and shot up a bit, offering an even better entry than the one I got, and I should have added, given my thesis. The flush came shortly after the open and I got my move lower, covering 50% of my position into 54.50 to take some early green on the day. The stock continued lower, respecting VWAP the whole way. This gave me confidence to add to my short, so I did around 53.75, picking up another 50% to bring my position back to a full 100%. TSRO immediately flushed lower and dropped on big volume into the 53-52 area, where I had thought we might find some support. With the luck of the Irish, I covered 50% of my position once again at the very bottom of the day, 52.20.

Besides being ecstatic about hitting that bottom before the reversal, I decided to move my stop down to the 54 area. Contemplating where I wanted the stop (first 54.50, then 53.75), I settled on 54 based on price action and chart analysis from the weeks prior. I told myself that if this thing jumped back above $54 and held, I probably didn’t want to be in it anyway: if it were truly bearish and going to continue lower, it would most certainly keep trending, and any pops up should get sold off immediately.

Well, it hit my stop at 54, and I was a bit bummed, as I was still bearish on the day, but I knew I had to respect my stop. The stop triggered and the market order filled at 53.92.. an even better fill than I had hoped for! I watched the stock on and off the rest of the day, and as you can see, once it broke 54, that level immediately became support and the stock just ground sideways the rest of the day, bleeding out all the shorts still in their morning positions. Had you not worked your core position or stopped out, you could have had a decent morning entry but ended the day flat.. or worse, red.

If you’re shorting the stock long term, this obviously doesn’t apply to you. But for the day traders out there: if you haven’t already, do yourself a favor and add this rule to your playbook: RESPECT MY STOP LOSS. Trust me, it will save you a lot of $$ and headache in the long run. It might suck taking that loss initially, but the statistics are in your favor, and if you want to survive in this game you need to be consistent and keep your edge.

Linux CRON Scheduling and Setup for MySQL Backups

Backups are useful, and a must-have for anyone’s site or product if you are going to be successful. Back up as often as you can, as much as you can. In this short post I’m going to quickly outline cron scheduling on Linux for backing up your MySQL databases.

Once logged into your server run:
crontab -e

This opens your cron configuration file (your crontab) and shows all scheduled tasks currently set up for your user. Here is an example of what part of my config file looks like:

# m h dom mon dow command

# db Backups - Every day at 4AM. - Write to _backups folder.
0 4 * * * /usr/bin/mysqldump -u'root' -p'[password]' dbname1 | /bin/gzip -9 > /path/to/backups/dbname1-`date +\%m-\%d-\%Y`.gz
0 4 * * * /usr/bin/mysqldump -u'root' -p'[password]' dbname2 | /bin/gzip -9 > /path/to/backups/dbname2-`date +\%m-\%d-\%Y`.gz
0 4 * * * /usr/bin/mysqldump -u'root' -p'[password]' dbname3 | /bin/gzip -9 > /path/to/backups/dbname3-`date +\%m-\%d-\%Y`.gz

# db backup cleanup - Every day at 3AM. - Remove backups that are 30+ days old (based on the last time the file was modified)
0 3 * * * /usr/bin/find /var/www/html/_backups/ -mtime +30 -exec rm {} \;

These are some pretty basic setups, so I’ll explain the very first line:

0 4 * * * /usr/bin/mysqldump -u'root' -p'[password]' dbname1 | /bin/gzip -9 > /path/to/backups/dbname1-`date +\%m-\%d-\%Y`.gz

In this cron schedule I am backing up a database named “dbname1” using the mysqldump executable. Then I am getting a little fancy and compressing the dump (to save space) using gzip. The compressed file is written to a backups folder of my choice, and the filename includes the day’s date, since these run daily. (Note the date format is written \%m-\%d-\%Y: cron treats an unescaped % as a newline, so it has to be escaped inside a crontab.)
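
If you are backing up several databases, the three nearly identical entries can be collapsed into one small script that cron runs once. This is just a sketch using the same placeholder paths and credentials as the entries above; `DRY_RUN=1` makes it print the commands instead of running them:

```shell
# Collapse the per-database cron entries into one script.
# DRY_RUN=1 prints each command instead of executing it.
DBS="dbname1 dbname2 dbname3"
BACKUP_DIR="/path/to/backups"
STAMP=$(date +%m-%d-%Y)   # no \% escaping needed outside the crontab
DRY_RUN=1

for DB in $DBS; do
    CMD="/usr/bin/mysqldump -u'root' -p'[password]' $DB | /bin/gzip -9 > $BACKUP_DIR/$DB-$STAMP.gz"
    if [ -n "$DRY_RUN" ]; then
        echo "$CMD"
    else
        eval "$CMD"
    fi
done
```

Save it somewhere like /usr/local/bin/db-backup.sh (a hypothetical path), make it executable, and point a single `0 4 * * *` crontab entry at it.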

You may be puzzled by what the “0 4 * * *” means. There are more thorough explanations out there, but here is a quick one. The first field is minutes (0-59); my 0 means the task fires at minute 0 of the hour. If you wanted the task to run every 5 minutes instead, you would use */5 * * * *. The second field is the hour of the day (0-23, using the 24-hour clock), so 4 is 0400, or 4AM. In my case the task runs at 0 minutes past 4AM, every day.. or simply 4AM daily. The third field is the day of the month (1-31); if you wanted the task to run on the 15th of every month at 4AM, you would use 0 4 15 * *. The fourth field is the month (1-12); if I wanted to run a task on the 15th of October at 4AM each year, I would use 0 4 15 10 *. Lastly, the fifth field is the day of the week (0-7, where both 0 and 7 mean Sunday); to run a scheduled task at 4AM every Tuesday, I would use 0 4 * * 2.
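
The examples above, collected into one crontab-style cheat sheet (/path/to/task is a placeholder):

```shell
# m   h  dom mon dow  command
*/5   *  *   *   *    /path/to/task   # every 5 minutes
0     4  *   *   *    /path/to/task   # 4:00 AM, every day
0     4  15  *   *    /path/to/task   # 4:00 AM on the 15th of each month
0     4  15  10  *    /path/to/task   # 4:00 AM every October 15th
0     4  *   *   2    /path/to/task   # 4:00 AM every Tuesday
```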

I hope this helps explain cron scheduling a little more for those just getting started. One flaw to note: whenever you are making backups, you want, if at all possible, to store them off-site rather than on the same server. What good is a backup if the server goes down and you can’t access it? I’ll write a part 2 on how to move backups off-site after you’ve done this much leg-work to make them happen!

If you find any errors in this post please let me know.

Universal Geneve Medico-Compax ca. 1947 cal 285

Universal Geneve Medico-Compax, with a rare 100% original dial and hands, unpolished case, etc. I put an ostrich strap on it; this watch is amazing in person. UGs are very hot in the vintage collecting market and only going to increase in price as time goes by.


Honeybee Hive – The Extraction

Since I live on a farm, I get the joys of multiple garages, barns, and storage areas. Going out to one of the old horse barns on a Saturday morning, I noticed an increased volume of bugs, bees, and other insects flying around. We have plenty of carpenter bees constantly eating up the wood beams and whatnot, but this time I noticed smaller bees flying in and out of what seemed like one concentrated spot. What does anyone do when they see this… WHERE’S THE CAN O’ WHOOPASS.. or RAID, or whatever you have that kills wasps, hornets, etc.

Luckily I’m a terrible shot, and these buggers were hard to get to.. so I missed most of it and only killed maybe 1 or 2 bees. Upon further inspection I noticed they weren’t attacking me or anything else, so I looked at a dead bee, did some research, and what do ya know: honey bees. From the entrance it seemed like they were camping out under my floorboards, a very tight space with really no way in or out.. besides the little hole they made. Perfect for a bee hive.. cozy in the winter, protected from predators (for the most part), and with such a small entrance, easy to defend.

Fast forward a bit (two months) and my friend Jim Leether stopped by one Saturday with beekeeping equipment, a hive, and everything needed to move these guys to a safer area where they can thrive and produce honey and wax that can be harvested! Oh.. and I would get to use my workshop inside the horse barn again without disturbing them :). Without further ado… pictures!


16610 Submariner ca. 1989

Every watch enthusiast has a starting point, whether it’s a $50 Seiko and a handful of quartzes, or inheriting a handful of vintage mechanical chronos. Either way, over time a collector (or anyone interested in watches) figures out what they enjoy most and starts honing their purchases toward the specific type of pieces that attract them. Usually the first few purchases are a “tax” to learn this. That’s how I started too, but I’m beginning to zero in my collection on watches with significant meaning, both to me and in the history of watchmaking.

I recently sold one of my first “diver” watches, a Tag Heuer Aquaracer, which was a beautiful watch. Why did I sell it? To help fund this beauty that you are about to see.

One of my “grail” watches, which has evolved over the years, has been a birth-year Submariner. It used to be a “Hulk” Submariner, then a Sea-Dweller, then back-and-forth between the three. After quite a while of thought, I decided the birth-year Submariner sang to me the most and had the most meaning: something I could keep until I die and hopefully pass on to my kids some day.

I went with a 16610 Submariner, which includes the date window and is COSC certified. I found it at the right price at the right time and decided to pull the trigger! I couldn’t be happier with my purchase, and although my watch collection is still quite small, this will make a huge impression and forever be one of the most memorable purchases of my life.

(More photos to come as I plan to interchange a few worn leather straps and potentially NATO straps to create different flavors for different occasions!)


Write VERY large record-sets to CSV, JSON with ColdFusion

So I was tasked to build something that writes a very large data set to CSV and JSON files. I was running into memory issues with ColdFusion because I was trying to build the entire CSV string in memory before writing it to a file on the server. For a 1 million+ row recordset from SQL Server, the request would hit the two-hour timeout and simply never finish. By using the underlying Java objects we can write 1 million+ records to a CSV or JSON file in about 8-10 seconds. Credit for finding these Java objects goes to my co-worker Sean Louisin.

The fix that worked for me uses the native Java classes that ColdFusion has access to, java.io.FileWriter and java.io.BufferedWriter. My process and the full function are below:

1) Instantiate the Java objects we need:

oFileWriter = CreateObject("java","java.io.FileWriter").init(local.FileName, JavaCast("boolean","true"));
oBufferedWriter = CreateObject("java","java.io.BufferedWriter").init(oFileWriter);

This creates our file (the boolean “true” opens it in append mode) and wraps it in a buffer for efficient writing.

2) Create headers for the CSV file (Query Columns)

oBufferedWriter.write( local.qCSV.columnList & chr(13) & chr(10) ); // Write Headers First

3) Loop query and write directly to the file.

   local.rowArray = arrayNew(1);
   for (row in local.qCSV) {
      local.rowArray = arrayNew(1);
      local.columnCntr = 1;

      aCol = listToArray(local.qCSV.columnList);
      for (fieldName in aCol) {
         local.rowArray[local.columnCntr] = PrepareCSVValue(local.qCSV[fieldName][local.qCSV.currentRow]);
         local.columnCntr += 1;
      }

      oBufferedWriter.write(arrayToList(local.rowArray) & chr(13) & chr(10)); // Write the row's fields, followed by CRLF.
   }


Here we use a generic helper, PrepareCSVValue (not shown), to “prep” each CSV value by escaping double quotes, etc. After we process the fields for the current row, we write them directly to the file using arrayToList, so they are comma-separated for the CSV file. Done this way, we do not have to hold the whole string in memory and risk out-of-memory errors: we write each row directly to the file and move on to the next. It’s easy to see how keeping millions of records in memory at once would be a problem.

Here’s the full function for query-to-CSV-file writing:

public any function exportToCSVFile(required query q, required string fileName) {
   var local = {};
   local.FileName = arguments.fileName;
   local.qCSV = arguments.q; // The query object we will export.

   oFileWriter = CreateObject("java","java.io.FileWriter").init(local.FileName, JavaCast("boolean","true"));
   oBufferedWriter = CreateObject("java","java.io.BufferedWriter").init(oFileWriter);

   oBufferedWriter.write( local.qCSV.columnList & chr(13) & chr(10) ); // Write headers first.

   aCol = listToArray(local.qCSV.columnList);
   for (row in local.qCSV) {
      local.rowArray = arrayNew(1);
      local.columnCntr = 1;

      for (fieldName in aCol) {
         local.rowArray[local.columnCntr] = PrepareCSVValue(local.qCSV[fieldName][local.qCSV.currentRow]);
         local.columnCntr += 1;
      }

      oBufferedWriter.write(arrayToList(local.rowArray) & chr(13) & chr(10)); // Write the row's fields, followed by CRLF.
   }

   oBufferedWriter.close(); // Close the buffered writer (flushes any remaining output).
}


Writing a query to a JSON file is much easier; I won’t explain the objects again, but here is a query-to-JSON-file function:

public any function exportToJSONFile(required query q, required string fileName) {
   var local = {};
   local.FileName = arguments.fileName;

   oFileWriter = CreateObject("java","java.io.FileWriter").init(local.FileName, JavaCast("boolean","true"));
   oBufferedWriter = CreateObject("java","java.io.BufferedWriter").init(oFileWriter);
   oBufferedWriter.write( SerializeJSON(arguments.q) ); // Write the serialized query.
   oBufferedWriter.close(); // Close the buffered writer.
}

Hamilton Men’s Khaki Field Automatic

It’s that time of year again.. yep, my birthday! My wonderful girlfriend decided to get me one of the best birthday presents yet: a new watch. No matter what, she can’t go wrong getting me a watch. I even got to pick it out. What did I choose? Well, I wanted something that came with a leather strap from the factory. Something casual and everyday.. yet not too fancy or eye-catching.

Hamilton Watches

Hamilton seemed like the perfect brand for what I was looking for. Yeah, sure, they have some crazy-looking watches like the “Code Breaker” or the “Classic Jazzmaster”, but I dove into the Khaki Field collection.. it just seemed perfect. I like the simple face: no chrono, just a date, along with the numbered dial and sweeping seconds. The black-dial watch was calling my name. The others seemed too catchy, albeit very nice watches.


Three things I love about this watch.

  • Automatic Movement
  • Leather Strap
  • Black Dial

Along with the other usual must-haves: a sapphire crystal, a sweeping seconds hand, and a smooth case that weighs and feels great.

Would I recommend this watch?

If you’re looking for a subtle, casual, everyday watch.. and want to keep the price under $1,000.. this is it. It has a great feel, though as with all leather straps it will take a few days of wear for the strap to hug your wrist a little better. A small price to pay for such a nice piece.


My Trading Story

I started investing and getting into the stock market back in October of 2011. The markets were just bottoming from a relatively large correction, and I bought stock in Ford (F), Verizon (VZ), Paychex (PAYX), Freeport-McMoRan (FCX), and Apple (AAPL). I had under $10K invested at the time and was doing alright. By mid-2012 I had pulled a decent 10-20% profit on each investment. This was to be expected: if you bought almost anything in October 2011 and held it for a few months, you’d instantly be up as the bull market continued.

Then I started options trading.

After making profits on my investments, I wanted to do more, make more.. build up my account more. I began learning to trade options after I realized their powerful return potential. I first bought options in AAPL, some monthly options for an earnings play. I just figured they would beat earnings, so I took around $1,600 of options risk on that play. After earnings, the stock opened just about flat, and my options, already decaying badly, were reduced to close to zero. This was the first loss I ever took playing options, and I was devastated, to say the least. How could $1,600 be lost so fast? I had nearly wiped out all my gains from stock trading in a day or two.

And I continued trading options.

After that loss, I decided I had to make it back. So I traded larger, riskier plays with potentially larger payouts. I put more money into my account, bringing the total I had deposited to around $30,000. By the end of spring 2013 I had gambled away about half of it, sitting at $15,000. I was losing money very fast and didn’t know what to do. At the end of the week, Friday, September 20th, 2013, I put just about everything I had left into AAPL options. That day AAPL was hit hard, down quite a bit, and ended at $467 per share. I went into the weekend holding 20 call contracts, with whatever money remained in more calls.. It was one of the worst weekends I can remember: every minute I was checking Twitter and news sources for anything AAPL-related to see if I was going to be hosed come Monday morning. I did the stupidest thing.. essentially, I put everything on black.

The worst thing happened.

Over the weekend, AAPL announced it had sold 9 million iPhones in the opening launch of the 5S and 5C, ‘well ahead’ of expectations. The stock opened at $496 a share, almost $30 higher. I sold all my calls as fast as I could at the open and made a boat-load of money. This was great, right? I made back everything I had lost, and then some. My account ended that day at $52,000. I was ecstatic. Then I did the smartest thing of all.. I took a two-week break from trading. Eventually I came back.. made some more, lost some.. etc. I ended 2013 with $72,000. Doubling my money in one year; not a bad year, eh?

January, incoming.

I continued my trading style, and during January I played AAPL many times, against the tide.. breaking all the rules of trading. I was adding to losing positions, doubling and tripling down.. I wasn’t cutting losses; I let them grow and grow as AAPL continued to fade. I played earnings again and lost. I ended January down $30,000. I knew something needed to change or my account would soon be $0. I knew that the way I had traded was lucky, and that my gains had been doing more harm than good, at least to my trading psychology. You see, I didn’t think I could lose a lot of money, and I thought my trading was sustainable. Boy, was I wrong.

What’s happened since then?

Since then, I’ve changed my trading strategy. I trade NOT to get rich quick, but to pursue consistency while mastering a sustainable strategy that can work with any amount of capital. On every trade, no matter what, I risk 1% of my capital, meaning my losses are capped at that amount. Anything worse would be detrimental to my trading psych and to my account.
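
To make the 1% rule concrete, here’s a quick sketch with made-up numbers (the account size is hypothetical; the entry and stop echo the TSRO levels from the earlier post):

```shell
# Position sizing under a 1%-risk-per-trade rule (hypothetical numbers).
ACCOUNT=25000      # account size in dollars
RISK_PCT=1         # max risk per trade, as a percent of the account
ENTRY=55.70        # short entry price
STOP=57.00         # stop-loss price (risk per share = STOP - ENTRY)

RISK_DOLLARS=$(awk "BEGIN { printf \"%.2f\", $ACCOUNT * $RISK_PCT / 100 }")
# Truncate (round down) so the position can never risk more than 1%.
SHARES=$(awk "BEGIN { printf \"%d\", ($ACCOUNT * $RISK_PCT / 100) / ($STOP - $ENTRY) }")
echo "Max risk \$$RISK_DOLLARS -> short at most $SHARES shares"
```

With these numbers the max risk is $250, which at $1.30 of risk per share allows about 192 shares; if the 57 stop is hit, the loss is capped near 1% of the account.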

Learn from my naivety.

Just because I made good money one year doesn’t mean my trading strategy worked. I was throwing money at the market, and it just so happened the market was paying me back pretty well. Over time, that strategy will crumble. By risking only 1% of your capital on any given trade, you’re protecting your account. In this game, it’s all about surviving. You don’t have to trade a lot; you don’t even have to win a lot. You just have to protect your account from losses, and the winners will come. Cutting losers has been the hardest thing for me to grasp, but once I understood it and put my pride aside, my account started to grow again. Take it from me: have a plan when you trade, with an entry, an exit, and a good reason, all while managing your position size so that you can’t lose more than 1% of your account. Trading shouldn’t be gambling.. you should create an edge for yourself and then manage that edge so that you can win.

Helpful resources to learn more

WatchHimTrade – Live Options trading.
New Trader U – Trend trading with technical analysis.
Kay Kim (2 Traders Club) – Trend trading with technical analysis.
Train Wreck Trader – What can happen when you don’t trade properly.