This may seem like either a “duh” title or a very obnoxiously pretentious title. It’s neither. It’s simply a reminder that what I’m going to talk about (the “meat” mentioned in the previous article) revolves around looking at what charts, trends and patterns tell us, and how those, when thought about rationally, make me wonder about the research companies’ ratings a lot of the time. Don’t misunderstand, I know that these companies do good work, spend a lot of time doing it, and that a lot of people put a lot of faith in their reports…I’m just not one of those people, anymore. Here’s why…
Remember Wayne Gretzky’s horrible refrigerator art? There are some charts out there that I wouldn’t call the most gorgeous either. I’m going to use my experience with some nasty refrigerator art as an example of pre- and post-Weinstein approaches. What do I mean? Well, read on.
The one that will prove to be the “d’oh!” of the early buys in my history, and among the shortest-lived of all my holdings, ever, is Stamps.com (STMP). This was before I really understood how to protect my investment, which is what stop-limits, trailing stops and the like are all about. So…
Stamps.com is an example of why being green and not understanding trends was a debacle for me from the beginning. First, when I purchased (late October 2011), it was in the middle of a massive single-day surge, going from just over $26/share to just over $32/share. I bought in at $32.13. Yeah…I saw a “leap” and I leapt, not understanding that was probably the stupidest thing I could have done. The next day it peaked at $33.73. Then the weekend happened and something cooled off, because that Monday, it closed at $32.56. The following day, it closed at $29.48. The close after that was $28.67. I sold at $27.71 before it hit its low of $27.31. This is a classic case of a day late and many dollars short.
If I had looked at the rumblings – the trends – instead of listening to the hype and trusting a very poor instinct, I would have purchased at Thursday’s close, which was $25.91. The trend was that the volume was increasing at an odd rate – think exponential, though it wasn’t really. The anomaly, to me, though, was the gap on Tuesday: coming off a previous close of $25.41 and a surprising drop in volume (from 401K to 170K), the price dropped $0.51 on an increased volume of 409K. Now, this makes sense, to a point.
To a point?! Yes, it makes sense to a point. The problem was that the Tuesday volume was perfectly in line with what had been happening the previous weeks. So, to me (now), the indicators were that it was trying to explode, but hadn’t quite made it yet. Enter Thursday.
The jump on Thursday didn’t make sense to me then as much as it does now, increasing $1.90 on a much-increased volume of 631K. Presently, this tells me that the stock was getting over its shyness. However, here’s where n00b Phil stepped in: I saw the flash mid-day Friday and bought. The buy signal, in my eyes now, came when volume tried and failed to match the previous day (Tuesday vs. Monday), but then succeeded on Wednesday and surpassed it on Thursday. It’s a micro-trend, but, like I said, that now tells me a stock is getting over some shyness and is probably going to have a good day. What I know now that I didn’t know then, however, is that the aforementioned good day was just that – a good day.
Friday exploded, jumping $6.82/share on a strong surge where volume hit 2.1M.
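That volume micro-trend – each day’s volume surpassing the last few – can be sketched in a few lines. To be clear, this is my own toy formalization of the idea described above, not a rule from Weinstein or anyone else, and the numbers are only loosely based on the STMP week in question:

```python
# Flag any day whose volume exceeds every volume in the preceding window --
# a rough stand-in for the "getting over its shyness" pattern.
def rising_volume_signal(volumes, lookback=3):
    """Return indices of days whose volume tops all of the prior `lookback` days."""
    signals = []
    for i in range(lookback, len(volumes)):
        if all(volumes[i] > v for v in volumes[i - lookback:i]):
            signals.append(i)
    return signals

# Illustrative daily volumes in thousands of shares (Mon..Fri), not exact figures
vols = [401, 170, 409, 631, 2100]
print(rising_volume_signal(vols, lookback=2))  # → [2, 3, 4]
```

Wednesday, Thursday and Friday all get flagged; the trouble, as the rest of the story shows, is that the signal says nothing about how long the good days last.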
Here’s the thing…remember that refrigerator art? Remember going to where the pass will be rather than where it is, presently? Yeah…I broke all those rules. I saw this surge as a sign that this was going to take OFF! …and it did…for 2 days. Monday saw a slight reduction in price, down $0.16 on almost half the volume, 1M. Not horrible, right? Well, here’s where it ALL comes together. Tuesday saw the price drop to $29.48 on increased volume. That’s a $2+ decrease on increased volume, which, in the grand scheme of things, is a bit of a warning. So I set a stop-limit of $28.95/$28.45, which may seem kind of tight, but when it looks like things are going to drop, I tend to play things tight…I use hard-earned tiny dollars and I like to keep them.
The selloffs continued, though not catastrophically, with volume slowly falling back to 637K on Friday after a total drop for the week of $4.34, giving back a good chunk of the gains from the surge the previous Friday. Because of the stop-limit in place, I jettisoned the stock at $27.74, locking in a loss of $4.39/share – well above what should have been my threshold, as that’s nearly a 14% drop, and most sane investors would have decided that 8% is quite enough, thanks.
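For anyone fuzzy on how that stop-limit pair behaves, here’s a minimal sketch using the $28.95/$28.45 numbers from above. This is a simplified model of order mechanics, not anything broker-specific – real handling (partial fills, gapping straight through the limit, session rules) is messier:

```python
# Sell stop-limit sketch: once price trades at or below the stop, a limit
# order to sell at no less than `limit` becomes active.
def stop_limit_fill(prices, stop=28.95, limit=28.45):
    """Return the fill price for a sequence of trade prices, or None if no fill."""
    armed = False
    for p in prices:
        if not armed and p <= stop:
            armed = True          # stop touched: the limit order goes live
        if armed and p >= limit:
            return p              # fills at or above the limit price
    return None

# A slide through the stop, with made-up intraday ticks
print(stop_limit_fill([29.48, 28.90, 28.60]))  # fills at 28.90
```

Note the failure mode baked into the model: if the price gaps below the limit before the order can fill, you get no fill at all, which is exactly why a tight stop/limit pair can bite you.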
Now, Stamps.com did maintain itself for a while before doing something that all overly cautious investors hate. After coming down a bit and setting a nice support just under $24 for the next few months, it ramped up again from mid-January until mid-February – at which point I should, again, have had a solid stop-limit in place, because a stage 3 top formed and then went *poof* into another mini free-fall. The difference with the January–February surge, to me, was that it happened on sane volume levels with no extraordinary leaps one way or the other…that, and the relative strength and momentum were actually moving up, as opposed to November, when the move was contraindicated by those two metrics.
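Since I keep throwing “stage 1/3/4” around: Weinstein keys his stages to a 30-week moving average. Here’s a deliberately crude sketch of that idea – and I’ll flag loudly that this is my simplification, not the book’s full method, which also weighs volume, relative strength, and what phase preceded the current one (my basing-vs-topping tiebreak below is purely illustrative):

```python
# Crude Weinstein-style stage guess from weekly closes and a 30-week MA.
def sma(xs, n):
    """Simple moving average of the last n values."""
    return sum(xs[-n:]) / n

def classify_stage(weekly_closes, n=30):
    """Return 1-4 (basing/advancing/topping/declining) or None if too little data."""
    if len(weekly_closes) < n + 1:
        return None
    ma_now = sma(weekly_closes, n)
    ma_prev = sma(weekly_closes[:-1], n)   # MA one week earlier
    price = weekly_closes[-1]
    rising = ma_now > ma_prev
    if price > ma_now and rising:
        return 2   # advancing: price above a rising MA
    if price < ma_now and not rising:
        return 4   # declining: price below a falling MA
    # otherwise the MA is flattening; call it basing or topping
    return 1 if price < ma_now else 3
```

Feed it forty weeks of steadily rising closes and it says stage 2; feed it the mirror image and it says stage 4 – which matches the eyeball read of the STMP chart in both directions.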
Here’s the thing, and why these are titled “Everyone’s Lost But Me”: the Monday after the surge, STMP was upgraded from Buy to Strong Buy. It then dropped $7 over the course of the next month, at which point it was downgraded to Buy. It then dropped another $3 and, over the course of the next month, set a stage 1 support floor another $2 below that while still rated Buy. It then did its fun thing from January through March, going from the $25 support to a high of just under $33 and back down to form another support-looking trend – except that when you look at it after it dropped to $23.39 and jumped back up to $28.52, you see it has trended more toward forming a stage 3 top, which means that at some point, fairly soon, it’s going to start sliding again. At that point, Sabrient changed its rating from Buy to Hold. Honestly, this is the first one that makes sense to me, since it hadn’t started dropping yet, but it wasn’t really gaining, either. Here’s what made my jaw drop.
With the stock closing at $27.69 or so, the following day the rating was changed from Hold to Sell. Makes sense, right? Sort of. There was only a slight dip in relative strength, and the volume was consistent. Was it any surprise, then, when the stock jumped up to $30.24? Not really. What kills me, though, is the “whipsaw” ratings change from Sell to Strong Buy with no in-between. It then tried to break resistance twice and failed, and on the ensuing stage 4 drop, the rating was cut from Strong Buy to, simply, Buy. It staged a micro-rally ($0.80) and was then upgraded to Strong Buy again. Over the next 5 or so days, the stock dropped just under $4. It was reduced to Buy.
What is my point? Simple – if you had followed the Sabrient ratings, you’d be down a chunk of change. If you’d paid attention to the numbers on the page and lines on the graph, you could have come out a bit ahead. And, finally, if you do what I did, you’ll lose money, as well.
Please don’t misunderstand…I’m by no means an expert at any of this. I just know now things that make my brain twitch when I see what I consider ratings changes that are counterintuitive or contraindicative when you look at the graph, the trends, the “where you’re going” not “where you are.”
“So, Mr. Smartypants, what would YOU have done differently that’s so different from what the ratings say and come out ahead?!?”
That’s a whole ‘nother article…stay tuned, for next time when I bring you “Everyone’s Lost But Me, Part IV: Making Stamps.com Work Phil-style” or something…that doesn’t sound like a very pithy title…
So…we’ve established that I’m a bit green when it comes to the financial world and not the good kind of green. So, what’s this “Everyone’s Lost But Me” hooey? What’s this “What Is versus What Is To Be” nonsense? Read on.
“Everyone’s Lost But Me”
This refers to the line in Indiana Jones and the Last Crusade. It also refers to how I feel, sometimes, when I read the reports from the ratings companies that rate stocks based on their myriad metrics and come up with something completely counterintuitive. Now, I know – I’m neither an economist nor a seasoned trader. I also know that math and I have a very guarded relationship. But I know, too, that after reading the book I pimped in the last entry (Stan Weinstein’s treatise on making money regardless of the market), when I look at MY math and their assessments, there’s usually a discernible difference of opinion. I know…who knows best? While I can’t tell you exactly, the subtitle of this entry will help.
“What Is vs. What Is To Be”
I know…”OK, Yoda…” However, think about it this way: when you’re defending against the pass in hockey or basketball or football, do you want to move to where the ball *IS* or to where the pass is going? You can’t intercept a pass if you’re rushing the quarterback while the ball goes whizzing over your head, or if you’re shadowing the forward who sends a beautiful saucer pass right behind you. You would snap your back trying to get back to the ball/puck, and you still wouldn’t get it. Why? You were too busy focusing on the ball/puck in its current state and not thinking ahead to its destination in order to meet it there. Have we beaten the sports analogy into the ground, yet? No?
Wayne Gretzky, as a young boy, before he set every NHL scoring record known to man, used to sit in front of the television with a pad of paper and a marker. With the ice layout already drawn on the paper, he would trace where the puck went over the course of the entire game. While some people would look at the paper afterward and declare it the worst refrigerator art ever, Wayne used it to see patterns: where the puck went, where it ended up, and what path it took to the net.
Where is this leading? Well…you’ll have to wait for next time to get to the meat of the subject, but the point, here, is that by looking at trends rather than points on a graph, your life will be easier and, like me, you’ll wonder what the ratings companies drink while coming up with their metrics…
June 27, 2012
Yes, Indy, everyone’s lost but you. I’m picking this as a title because I’m at a loss. I think part of it is that I’m fairly new to the world of finance, and part of it is that after reading even 100 pages of “Stan Weinstein’s Secrets For Profiting in Bull and Bear Markets,” it made more sense to me than everything I’d read on other sites for the previous year or so. If you’ve not read the book and “play” the stock market, read it. If you’ve not read it and you’re serious about making money in the market, read it. If you’ve read it and are reading this, prepare, as a choir, to be preached to…as it were. Actually, I’m finished pimping the book. What I’m not finished doing is wondering how companies like Sabrient, The Research Team, Standard & Poor’s or TheStreet Ratings manage to make the ratings they do and why, really, people buy into their evaluations.
I’m a n00b. Really. I’ve lost a lot of money – well, not a lot…a lot to me – in the stock market. For reference, as a contractor, I didn’t get the luxury of a 401K, so I decided to stab into the dark financial abyss on my own. Not necessarily a bad idea, but compared with the experience that saved me thousands right after 9/11, this is a completely different beast. My previous experience was with bonds and with manipulating an existing 401K to optimize distribution and so on and so forth. I would do things like decide I needed to read over prospectuses and so throw everything into a money market until I figured out what I “really” wanted to do. That strategy seems really silly to me now; however, I did exactly that on 9/11/2001 at 7:25am and, well, as history played out, I’m exceedingly glad I did. While friends and coworkers saw four-digit-plus losses, I made $8. Yes, $8.
So, thinking that this experience somehow translated to savvy, I entered the market $100 at a time with literally no idea what I was doing. I remember thinking, “this looks like a strong stock and the reports all say ‘Strong Buy’ so it HAS to be good.” After a month or so, it had lost a lot of steam and was dropping like the proverbial sack of potatoes. I didn’t get out. I didn’t drop it like the hot potato it was in anything resembling a timely fashion. A perfect example of “n00b”-ist thinking was the mindset that “it’s a strong company with a good product, so it HAS to improve.” A week goes by at –5%, then the next week at –2.5%, then the next at –4%, which, compounded, is beyond my math, but it’s still several points more of a drop than I should have been satisfied to withstand. I had no idea how to use a stop-loss/limit sell.
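For what it’s worth, the “beyond my math” part isn’t that bad once you remember that successive weekly returns compound multiplicatively rather than adding. A quick sketch with the three weekly drops above:

```python
# Three down weeks of -5%, -2.5% and -4%: each week's return applies to
# what's left after the previous week, not to the original stake.
weekly = [-0.05, -0.025, -0.04]
total = 1.0
for r in weekly:
    total *= 1 + r
loss = 1 - total
print(f"total drop: {loss:.1%}")  # ~11.1%, vs. 11.5% from naively adding them
```

Either way you slice it, that’s well past an 8% uncle point.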
So, I kept trying, finally setting my mind to scouting IPOs. I’ve actually had some decent luck with the IPOs I selected – none of them being the Facebook IPO, which even *I* saw as a recipe for fiscal disaster. I’m still riding two of them and just jettisoned one this past week after it crested and without much fanfare hit a nice downward spiral. The other two have actually posted into the positive even without having large share totals (I’m not a high power trader, mind you…I deal in ones and tens, not tens of thousands…).
So, I think this might end up being a series of blog entries. Maybe, “Phil’s First-Hand Guide to The Stock Market: A N00b’s-Eye View” or something equally pithy. Next up? What Is versus What Is To be …
April 16, 2012
Talking with a friend about recording some music and rocking out, I had to pause and think about all the hoops that used to be involved with the process and how spoiled we, as recording musicians, are. Now, this isn’t going to be a “when I was your age!” rant or anything close. This is merely a “holy cow!” retrospective where I kind of trace my path of how I got here and look at how technology has advanced.
Begin at the Beginning
I wanted to be in a band. Gene and I had talked, and Sonic Repercussions was a sure thing – as soon as we had instruments. He got his drum kit about 8 months before I got my guitar. While it feels a little Bill and Ted-ish, this is how it was: no instruments, no training, just the desire to get us to the first milestone. I put a black Kramer something-or-other on layaway at Alpha Music. I think I made 2 payments on it before I snapped my knee, thus hampering my ability to mow lawns and, well, pay for the guitar. On as tight a budget as I knew our family teetered, the guitar was waiting for me after school not long thereafter. My dad had paid for the rest of it. That’s something I’ll never forget – it still kicks my butt to think about it. Sentimentality aside – I still couldn’t play for anything in the world. It was strung righty. Guess what I’m not…
So, let’s move from this to the beginning of recording. It was not what I would call the most amazing rig ever. It basically consisted of the guitar, the cord, a blank tape and a Sanyo component stereo system with cassette. I believe I was using an Ibanez LA Metal pedal to produce excruciating noise and discomfort to my parents. Now, this wasn’t a chump tape deck by any means – it utilized Dolby B and C noise reduction and had stereo inputs with independent input level knobs (well, one knob, but cool split-knob capability) and headphone out. The headphones helped cut down on the pain inflicted on those around. So, we arrive at the real first milestone: stereo recording.
That was 1987. Later that year, I believe, my folks got a Tandy 1000TL. This is important in the timeline because it had built-in earphone, mic and line-in jacks. This was the first time I had ever encountered such a thing and, with the Tandy desktop “Digitize” software, I did some *very* rudimentary guitar recording. Bear in mind that it could digitize at 22KHz. Also bear in mind that, even with the proprietary compression on the sound files, there really is only so much you can fit on either a 360K or 720K floppy. So, with a ¼”-to-1/8” plug, I was able to do some very simple digitization. I don’t think anything ever came from this. Still, in ’87, this was just too cool for a 14-year-old.
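To put numbers on the floppy problem: here’s a back-of-the-envelope calculation assuming uncompressed 8-bit mono samples – an assumption on my part, since the Tandy software used its own compressed format, so the real figures would differ somewhat:

```python
# Seconds of raw audio that fit on a floppy at a given sample rate.
# Assumes 8-bit mono (1 byte/sample) with no compression -- an assumption.
def seconds_of_audio(disk_kb, sample_rate=22000, bytes_per_sample=1):
    return disk_kb * 1024 / (sample_rate * bytes_per_sample)

for kb in (360, 720):
    print(f"{kb}K floppy: ~{seconds_of_audio(kb):.0f} seconds")
```

Call it roughly half a minute per disk at best – enough for a riff, nowhere near enough for a song.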
Even in 1991, when I got the PerfectSound 44KHz digitizer for my Amiga500P, it was still stereo, though the quality had increased, greatly. Still limited by the 880K floppy, there was only so much one could record at a time. I believe my first, real, production “project” was to make a radio-safe version of Pantera’s “F@#$@@! Hostile,” which didn’t turn out too badly, but was still…stereo. I had also traded my Kramer in for this weird candy apple red Explorer-ish custom with scalloped frets and hideously noisy electronics. I still did the vast majority of my recording from the line-outs of my amp into the left- and right-channels on the stereo cassette recorder. I didn’t do much by way of recording in the digital realm. From an effects standpoint, I was still using pedals, though my choices had expanded to a TurboRAT, a Boss CH-1 Super Chorus, DD-3 Digital Delay and a Dunlop Crybaby.
Very little progressed for a while, recording-wise. It was all two-channel, and boring. So, we’ll fast-forward to 1994, when I spied, and bought, a Fostex FX-18 cassette 4-track, into which I plugged my Alesis QuadraverbGT rack effects processor, accompanied by a DOD FX17 Digital Wah/Volume pedal. So now I had multitrack capability, an upgraded effects processor and a new rig – I purchased a Jackson Stealth EX lefty, which I still use today, though with a couple of modifications. The four-track and I had a love-hate relationship as I learned how to multi-track, bounce, overdub and mix down. It was awesome and it was analog. By this time, my Amiga had died an undignified death and I had a PC with an AdlibGold 12-bit audio card that I didn’t use, at all, for recording of any sort.
I graduated into the world of the working class in 1995 and, at that point, needed to upgrade my system’s sound card by virtue of the need for a CDRom (4x, baby!) and so ended up with a SoundBlaster/CDRom combo. The following February, at a multimedia conference in Orlando, I met the developer of the filters for CoolEdit. I had used CoolEdit, lightly, at work for some multimedia presentation work, but was still doing all my recording on the 4-track. After talking with Filter Guy, I decided to give the ¼-to-1/8” plug a whirl with SB and CoolEdit 1.2. In conjunction with the Fostex, it brought about the ability to, then, run a line out from the sound card back into the 4-track for some really fun tracking capabilities. Using splitters and … combiners(?), the possibility of recording two tracks to digital and two to analog alongside a “click track” became reality. I was a happy man. Then, I wanted more.
The MIDI on the SB was…passable. Replacing the old 486SX/25 with a shiny Pentium 100 and, later, replacing the factory weirdness that was the “Sound144M or what-have-you” with the Creative SB AWE64 was better. CoolEdit released a newer version and, at this point, it was simply a better version of the previous rig, using line-in and the 4-track piping from the Marshall Micro-stack. Cakewalk was starting to make splashes in the music world as well, adding the capability for basic drum tracking using staff notation. I went on my first date with my now-wife, Lara, and the following day wrote and recorded the rhythm and solo guitar parts and the drum line for the song “Cornrows” (about a serial killer – Lara’s still not impressed by that fact…the lyrics were already written, hon, I swear!) in just under 8 hours. With the AWE64’s softbank-based MIDI, the drums sounded phenomenal. The whole song sounded awesome and, somewhere, buried in the mists of history and assorted “stuff,” the recordings of this song can be found. Due to then-stellar and now-unusable technology, the original digital recordings are lost forever. For reference, when you hear your ZIP disk “click,” copy EVERYTHING off of it as soon as you possibly can and pray it doesn’t die before you finish. Mine died before I finished.
At this point, I was doing a great deal of line-in recording and eagerly awaiting the day a multi-track audio editing program would grace the market. Ping-ponging tracks back and forth between tape and computer got old, but I still managed to create some decent-sounding things. Getting married, I backed off the 24/7 recording frenzies that would sometimes occur and still lived mainly in the analog world. Entering the present century, I was still recording on the four-track and then digitizing to the computer, CoolEdit96 still being the program of choice. Along the way, there were some replacements – the FX17 digital wah/volume basically had a bit of an internal fire, and the old, crackly Crybaby that replaced it eventually needed to be replaced with a shiny new Crybaby. The MIDI footpedal for the QuadraverbGT had always had issues and finally just decided random was the way to go…so, it had to go. The QVGT still remained the line-in effects processor of choice for a number of years, until its power supply went to the great power-supply beyond. When I discovered that the cost of a new power supply was roughly the equivalent of a new effects processor, a new era of effects and recording was ushered in.
2000s to Present
I went the way of eBay, trying a couple of different effects boards before settling. There was a BOSS ME-30 (I think) pedal with a nice short between the first and third footswitches, so that pressing one might just get you something completely unexpected. There was also a DigiTech RP100, which provided a lot of nice tone for the price and worked well with the line-in recording method. In conjunction with CoolEdit2000 and its multitrack recording capability, it made for a decent tandem that saw quite a few “songs” recorded. This sufficed for a while. Then things got wiggy and there was a bit of moving around. Eventually, an RP250 was added to the fold, which allowed USB-direct-in recording. In conjunction with CoolEdit, this made for much faster multi-track recording. Then the RP250 disappeared.
My current rig brings us to the present, where the RP250 is replaced with an RP350 unit. Rather than using its straight USB-in to record into either Audition or Audacity, I’m using an M-Audio MobilePre to interface between the computer and the effects unit, and it works really nicely, especially where latency is involved. The biggest addition has been ProTools MP, which is tied inexorably to the MobilePre. It has opened up a lot of multi-track recording options, and while I can’t say it has brought my playing to the next level, it’s helped my recording grow up a bit. The additional control over tone, output level – everything – is amazing, and I haven’t even scratched the surface with this tool. Using it in conjunction with Audacity, final mixes are a breeze, and in figuring out how to use PTools properly, I’ve learned new methods by which to output the mixes. The final piece in this puzzle is the replacement of the 18-year-old (yet still pretty solid) passive Jackson humbucker with a Seymour Duncan Blackouts AHB-1 active pickup in the bridge. My tone is now chunkewy (that would be chunky-chewy for those uninitiated in Phil-speak) and I’m very happy with it despite having “done it wrong.” By way of clarification, apparently I shouldn’t have left the two remaining pickups passive…”It’s too much of a pain.” Having replaced the 250KΩ tone knob with the Seymour 25KΩ, I get solid tone in four of the five positions on the knife switch. The fifth position is temperamental, and I’m sure that if I were to sit down with the basic wiring diagrams, it could be rewired, slightly, to produce a tone everyone would like.
So, that brings us to now — this very second. I say, again, “Holy cow!” I think back to the “beginning” and compare it with “now” and just marvel. It’s been quite a trip and it’s really only been 25 years. I think I’ll be set for a while…
July 1, 2011
I’ve done things that I’ve considered stupid. I think everyone has. That said, there’s not much in the world of computing that I’ve encountered that I couldn’t fix. It’s just how it is — either I have the knowledge myself, or I know pretty much right off the hop where on the internet to look for answers. That hubris was challenged this week/end.
My son plays LOTRO — The Lord of the Rings Online. By virtue of this fact, I, too, play some LOTRO. There’s something inherently geeky about meeting up with your 12-year-old son to “hunt some orc” online. He’s loved The Lord of the Rings since seeing The Fellowship of the Ring at the ripe age of 4 1/2. Yes, I know, I’m going to hell. Since then, he devours anything LOTR-related: books, games, you name it. So, when the Mines of Moria expansion came out, it was a foregone conclusion that he would want it installed as soon as possible. Well, I run a linux system — in case you couldn’t tell from the rest of the blog entries on here — so getting things set up on my system to do test installations, so I don’t hose anything up on his Win7 laptop, seemed like the obvious first move.
First off, let’s get this out of the way — LOTRO is a large pain in the rear to get running *well* on a linux system if you don’t know what you’re doing. I do, for the most part, so it’s only a small-ish pain in the rear. There are things out there to minimize the rear-pain, and I’d used one of them to get the standard installation of LOTRO up and running on my machine. In wanting to set up the Mines of Moria expansion, I thought I would set it up in a different “bottle” or $WINEPREFIX, as it were. This, as it turns out, is a good idea; however, the way I set about doing it was not. I chose the “easy way.” Oops.
The easy way is to use CodeWeavers’ CrossOffice/Games proggie and simply
install the game into its own little bottle (compartment, really…).
This, as it turns out, was dumb. I don’t really know *why* it was dumb,
from a “how the heck did THAT happen?” standpoint, but I know *why*
inasmuch as “because it was.” Now, I’ve used CrossOffice in the past
over the years and have run Office2000, Office2003 and so on with it
with great success. Having shaken the shackles of MSOffice in favor of
more open source apps, I have had little use for CrossOffice since what
I do run through Wine does so quite nicely, even LOTRO. So what on earth
possessed me to color outside the lines and add another layer of “what
could go wrong?” to this equation? Foolish hope that it would, indeed,
simplify a fairly complicated installation procedure.
So, I installed CrossGames. I then created a new bottle and installed LOTRO: Mines of Moria. It never worked. It would freak out while going through the server checks in the native client and blow up with a partial black screen and an obscured error message that told of doom and asked if I had changed my resolution. Not good. The exact error, for all the search-engine hits, is “err:xrandr:X11DRV_XRandR_SetCurrentMode Resolution change not successful — perhaps display has changed?” The answer to this, of course, is no. However, there’s an underlying error message hidden behind the quartered-off screen, and that’s “Could not initialize Direct3D. Must have DirectX 9.0 or higher.”
Sadly, this isn’t going to tell you how to fix those two errors, *really,* though what I did works, mostly. I know — such definitiveness. Here’s what I *did*: I removed CrossGames, completely. I removed Wine, completely. I deleted both the .wine and .wine-lotro directories (after backing up some things, of course). I reinstalled the most recent stable versions of Wine and Winetricks. I also installed PyLotro — which is nice…when it works. For the record, PyLotro worked PERFECTLY until this whole CrossGames fiasco. Also for the record, I’ve *never* had to go *this* scorched-earth with any linux installation before, and that includes some pretty festive go-rounds with DB2, Oracle8i and, well, just about any IDE prior to 2004 that would lead one down a trail of dependencies. I digress…
I also did something *smart* this time around. After looking for roughly 5 days through all the forums, from Turbine to WineHQ to just about every MMO-related board out there, I came “back to my roots” — back to the script that made it work the last time. Why had I not looked into this before? Um…because I forgot? That’s about the only real excuse I have. So — what magic has made LOTRO work again? Go here:
Follow the How-To. Really. It errors out at the end but, as far as I can tell, everything’s working AND, inexplicably, I’m running with high detail and very little slowdown…something even my previous, working installation couldn’t boast. The downside? PyLotro still blows up with the Direct3D error. However, the command-line script (lotrolauncher.sh) works beautifully and, as I’m not averse to command lines in the least, I’m up and running again.
So, why the title “The Lord of the Direct3D”? Well, it’s simple: Direct3D is still the force of evil out there that makes you go insane after a bit. Additionally, once you get it working, simply imagine curling up around the proper files for DX9 and mumbling, “my own…my precious…” Trust me, it could happen.
June 9, 2011
So, there’s that part of me that always fears straying too far from the warm comfort of Microsoft’s “operating systems.” I don’t want to give up my PhotoShop. I don’t want to give up my Visual Basic. I don’t want to give up — and this is an important one — my NHL09. What do I want to give up? I want to give up the endless drive grinding. I want to shed the shackles of random crashing applications that, despite “compartmentalized memory spaces,” take down the entire OS. I want to shed bloated OSs that grind away in an infinite loop of virtual memory swapping because the OS, itself, takes almost half of the 3GB RAM available to it, thus crippling PhotoShop and FireFox, should I decide to have my normal 5-10 22M photographs open in PS and 22 tabs open in FireFox… The bottom line is that I want to shed an operating system that hinders more than it helps.
With that in mind, it was with a large amount of trepidation that I installed Ubuntu linux on my XP box, Frankenstein. For reference, Frank had been a rock-solid XP Pro box for close to 6 years. It garnered the name Frankenstein because it hasn’t had a case for close to, well, 6 years, and because it has 4 hard drives, a DVD-RW drive, 2 external card readers, and 2 4-port USB hubs to which are attached a printer, a scanner, three external HDs, my iPod cable, my RP350 effects processor and, occasionally, a TI calculator, which usually supplants the iPod. With close to 600GB of storage, 3GB RAM and a still quasi-zippy 2GHz AMD processor, it was solid, reliable, and my main machine for all things software-development-related. Then, one morning, it wouldn’t boot. It couldn’t find the NTLDR. That’s bad, for reference. I tried several repair methods, and I mean several…5?
Since I started this writing, Franky has died. A power supply upchuck and a toasted motherboard really made the OS moot. So, Arkham has replaced it, and after a wiggy power supply on IT, Arkham II is plugging along perfectly happily running Linux Mint 10 and never having to worry about dual booting…all the Windows I need live in very compartmentalized virtual machines and that’s how it’s going to stay.
I’ve been running Linux Mint for close to 18 months, now, and as the sole OS for close to a year. There’s no going back to Microsoft land…there just isn’t.
October 6, 2010
Not quite the video game of the same name. Sometimes it *feels* like a video game, but it’s not. That would be too easy, and what fun would that be? No, this is referring to the adventure surrounding my hard drives, fstab and many hours of copying sizable amounts of data back and forth.
I set out with the single, laughable, purpose of getting my drives to mount consistently at chosen mount points. Certain programs — Rhythmbox, Amarok, VirtualBox, and so on — get crabby about drives showing up at different mount points each time. More precisely, I get crabby about drives showing up at different mount points, especially when their labels are the ultra-helpful variety: both 250GB drives mounted with the label “250GB External Filesystem.” “Duh.” Since they were NTFS, it was more than a simple pain to change the label on them, even with the ntfsprogs tools.
Because I’ve worked with Linux in one form or another since 1995 (dual-booting between WFW3.11 and Slackware 2.3…on 850,000 3.5″ floppies…), I thought I’d be OK editing my fstab in Webmin — with which I’ve worked since 1998 or so with little to no problem — and, for whatever reason, boy was I wrong. Going in through the Filesystems module, I happily added mount points to each of the filesystem entries, as I’d done before, granted a good while before…I remember, fondly, finally moving to a 1.x version… At any rate, I think something got buggered up along the way, because when I hopped into a shell and ran mount -a, everything went to hell. Nothing mounted properly, and the root filesystem was remounted read-only, which caused the fascinating problem of exactly NO application or system function having the privileges to run. I could not even shut down. How sad is that?!
For starters, I simply wanted the two 250GB external drives to show up at consistent mount points. At the time, they were SDE1 and SDF1 (insert Robotech joke here…), so I thought I was safe in creating two happy directories in the /media directory — Dagon and Azathoth, respectively — and then having them mount there via the fstab. I was wrong. They merely mounted under their respective hardware IDs to the directories disk and disk_1. At this point, I figured, eh…fine. So, I went into the programs that needed to know where data was and felt safe, then, in pointing them to the drives. Apparently, that was a bad idea. The next time I needed to reboot, they mounted precisely opposite…not bloody useful.
It was at this point, I was sick of dealing with Kate or GEdit, so decided to go in through Webmin, and select precisely the mount points I wanted. This would be where the trouble hit. Something went bing, as detailed above.
Paring down fstab to its single /dev/sda1 / line, I set about to remedy this thing once and for all. However, I didn’t know how complicated it would be. No…it’s not that hard, and the actual “taking care of it” was done in short order. It was the decision-making process up to that point that was more than a little drawn out. The biggest question facing me was which filesystem to use. I was sick of the NTFS-to-Linux latency and wanted everything happily native. That said, when it comes to native filesystems, there isn’t a shortage. I could have chosen EXT2, EXT3, EXT4, ReiserFS, Reiser4, JFS, XFS and, if I had put more effort into it, ZFS-FUSE.
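For what it’s worth, the fix I eventually leaned on — labels — can be wired straight into fstab, so the mount point stops depending on which /dev/sdX the kernel hands out on a given boot. A minimal sketch (assuming the labels Dagon and Azathoth, and NTFS as the drives were at the time; UUID= works the same way):

```
# /etc/fstab sketch, not my literal file.
# Mounting by label means enumeration order (sde1 vs. sdf1) no longer matters.
LABEL=Dagon     /media/Dagon     ntfs-3g  defaults  0  0
LABEL=Azathoth  /media/Azathoth  ntfs-3g  defaults  0  0
```

The labels themselves get set with the per-filesystem tools: `ntfslabel` for NTFS, `e2label` for the EXT family, `xfs_admin -L` for XFS.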
I initially decided to format one drive to EXT4 and go from there. First, it should be noted that the throughput between the two external drives via USB 2.0, when both were NTFS, was a paltry 3.5MB/s. Let’s just say it took a while to copy all 117GB from one to the other. Once everything was copied over, GParted was kind enough to get the drive where I wanted it in a matter of a few minutes, including setting the label to ‘Azathoth’ so as to give Gnome, at least, a reference for consistent mount locations. This will become important in a minute…
One thing that struck me, though, was how much ruddy space EXT4 thinks it needs to chew up just by virtue of formatting. A completely blank drive, it formatted under NTFS to 232GB. In EXT4, I ended up with ~220GB free. GParted’s display shows that it’s formatted to 232.88GB. Cool, right? It also shows that 161.94GB are unused, with 70.94GB used. For those of us who have to use a calculator, that brings us to 232.88GB. Again, cool, right?
Well, one would think…but then there’s a little discrepancy: the file manager reports 150.3GB of free space, as opposed to the 161.94GB of unused space GParted shows. I guess my question is: where did the other 11.64GB go?
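With hindsight (and the caveat that I’m inferring here, not quoting mkfs output), that 11.64GB lines up suspiciously well with EXT4’s default behavior of reserving 5% of the blocks for root: 5% of 232.88GB is almost exactly 11.64GB.

```shell
# ext4 (via mke2fs) reserves 5% of blocks for the superuser by default;
# checking whether that accounts for the "missing" space:
awk 'BEGIN { printf "5%% of 232.88GB = %.2fGB\n", 232.88 * 0.05 }'
```

The reserve can be inspected with `tune2fs -l /dev/sdX1 | grep -i reserved` and shrunk with `tune2fs -m 1 /dev/sdX1` (both need root; sdX1 is a placeholder, not my actual device).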
So, then, it was necessary to think long and hard about the filesystem choice. I figured, for Azathoth, we were going to be happy with EXT4 long enough to copy over all the data from Dagon in order to format Dagon to a native format…but which one?! I weighed the respective merits via numerous blogs, several technical articles, a bunch of forum posts and the obligatory flame-war or two, to see which one seemed to have the fewest flaws. I finally settled on XFS for Dagon’s new format, and thus set about getting it ready to be formatted and be happy. The throughput from NTFS to EXT4 was consistently around 11.0MB/s, which was a dramatic improvement over the NTFS-to-NTFS transfer rate (3.5MB/s, if you recall). So, once everything was copied over, I hopped into GParted and converted the ol’ fish god to a faster, native filesystem.
In a stunning display of “what it says you’re going to get is what you actually get,” the XFS numbers seem to be consistent with the data… It formatted to 232.88GB, as it should, and when you do the maths (148.20GB used plus 84.69GB unused), you end up with 232.89GB. When loaded in Nautilus, it shows that Dagon has, indeed, 84.7GB free. That’s more like what I expected to see. There was only mild disappointment in the throughput speeds when I copied the files back to Dagon from Azathoth, averaging right around 9.5MB/s. It could have been a lot worse — it could have been the NTFS transfer speed. Of course, it didn’t help that I was copying close to 950K files, a large contingent of which were smaller files, which are not XFS’s strong suit. That said, this can be tweaked through mount options and, really, it’s not that big of a speed hit, and I get the full reported size of the disk. That’s all I could ask for.
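For the curious, the mount-option tweaks I had in mind are the usual metadata-throughput suspects. Treat this as a sketch of options worth benchmarking, not a recommendation (the label and mount point assume my setup above):

```
# /etc/fstab sketch: possible XFS tuning for lots-of-small-files workloads.
# logbufs/logbsize increase in-memory journal buffering; noatime skips
# the access-time write that otherwise happens on every read.
LABEL=Dagon  /media/Dagon  xfs  noatime,logbufs=8,logbsize=256k  0  0
```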
So, the upshot, to me, is: we’ll see how the overall performance compares between the two drives, as that may determine who becomes what, should I feel the need to wipe drives again. While I am currently enjoying the idea of having access to all the precious GB that XFS seems to offer, and that EXT4 seems to cordon off for who knows what, I am also all about discerning which of the two is really better for what I need. Virtual machines and so on will help me along with that decision…I’ll keep you posted.