Netbook – Part 3

Thirteen months ago I wrote Part 1 about my experience with choosing a netbook to take on the very first SQLCruise. Part 2 continued with setting up the netbook for SQL Server development and education use.

Thoughts After a Year of Use

So after more than a year of use, what’s the verdict? Easy answer: success, no regrets.

The netbook went not only on the very first SQLCruise from Miami to Cozumel, but also on the most recent SQLCruise to Alaska. On both trips I used it in the classroom as well as to VPN into the office to do work. Same for last year’s PASS Summit and some SQL Saturday events. For a year I carried it to the office so that I could work away from my desk for an hour or so each day. The netbook cost around $300, throws off very little heat compared to a laptop, and weighs only 2.8 pounds. Most surprising: the estimate of 14 hours of use on battery power was not much of an overstatement (turning on wi-fi zaps it down a bit, turning on Bluetooth zaps it down a lot).

So is that the end of this post? Am I going to end with “All is well, tune in next year when I will let you know if the netbook is still running” or something like that?

No.

Let us push on and take the netbook to a new place. A wonderful place with everyone’s favorite creature of the Arctic waters, the narwhal. In this particular case, a Natty Narwhal.

It’s Ubuntu Time

Ubuntu is a Linux distribution. Unlike the olden days, you can avoid the disk partitioning, the boot loader configuration, the command lines, and having to babysit yet another operating system install in case you need to type in some information and hit the Enter key. More on that later.

But first, why would I want Linux on my netbook if it already works fine? Because sometimes I just want to grab my netbook and do some web browsing, but I don’t want to wait for Windows 7 to boot. So I wondered if a minimal Ubuntu install would boot faster on my netbook. No need for suspense… the boot time is about the same, but what I do like is that the Ubuntu user interface works very nicely on smaller, netbook-size screens. The interface seems a bit awkward at first, but it doesn’t take long before you begin to appreciate it.

One way to get Ubuntu running would have been a virtualization solution such as VMware Player. But that wouldn’t work in this case, because I’d have to boot into Windows first and then start Ubuntu. That’s not much of a time-saver. Also, my netbook doesn’t have much memory or CPU horsepower for pushing virtual machines. So it was looking like I would be setting up a dual-boot configuration.

At this point, I decided to try the option of running Ubuntu within Windows (the Wubi installer). This turned out to be very simple, and so far I’ve been quite satisfied with the result. With this installation option, Ubuntu installs into a folder in your Windows file system, and the experience is similar to installing a Windows application, so you don’t have to deal with partitioning your hard disk for a separate operating system. The installer uses your existing Windows installation to figure out the settings to use, so your interaction during the installation process is minimal.

If you want to try out Ubuntu then you can follow the instructions here, plus this page has more detailed instructions on installation as well as how to uninstall (you uninstall from the Control Panel just like you’d uninstall a typical Windows application).

Once you have Ubuntu installed, fire up the web browser and do some reading. A couple of links I’d recommend would be

So my netbook has become even more useful. When I start it up, I can choose to boot into Windows to do some SQL Server work, or boot into Ubuntu for some Linux goodness.

Posted in Hardware | Comments Off

T-SQL Tuesday #021 – Inelegant Yet Educational

August’s T-SQL Tuesday is brought to us by Adam Machanic (Blog|Twitter). Actually, we have Adam to thank for every T-SQL Tuesday because he created the event in the first place; this month he’s hosting it as well. He’s also decided to have the event take place on a Wednesday rather than a Tuesday, which seems appropriate given this month’s topic: revealing our “crap code.”

This seems easy enough because I’ve got plenty of examples. The first one that came to mind was when I was told to replace all cursors with temp tables and WHILE loops, but I already discussed that in item #4 of last month’s post. No need to revisit it. So here are two other crap code examples: one that died quickly and another that lives on.

Quickly Archived But Not Forgotten

I was directed to implement a soft-coded values solution that shoehorned generic objects into a relational model, with GUIDs as clustered keys and values stored in sql_variant columns. By the way, the data architect in this situation admitted to knowing nothing about 1) the solution’s industry and 2) SQL Server.
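
If you haven’t run into this pattern before, here’s a minimal, hypothetical sketch of what such a soft-coded values table can look like. Every name in it is mine, invented for illustration, not the actual schema from this story:

```sql
-- Hypothetical sketch of a soft-coded values (entity-attribute-value style)
-- table; all names here are illustrative.
CREATE TABLE dbo.ObjectAttributeValue
(
    ObjectId       uniqueidentifier NOT NULL DEFAULT NEWID(),
    AttributeId    uniqueidentifier NOT NULL,
    AttributeValue sql_variant      NULL,  -- any attribute, any type, few constraints
    CONSTRAINT PK_ObjectAttributeValue
        PRIMARY KEY CLUSTERED (ObjectId, AttributeId)  -- random GUIDs as the clustered key
);
```

Every attribute of every object becomes a row, the random GUIDs fragment the clustered index on every insert, and sql_variant gives up type safety and useful constraints in exchange for generality.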

I implemented the design. Now, you might ask, why write this code even though I knew it would be crap? Well, because I had already been labeled a naysayer by the new management team for raising objections on two other issues. I had also been shown negative emails that managers were circulating about me. So I decided to be a team player while preparing to get a pink slip, and along the way I’d learn about this type of design.

Guess what? I did learn a lot about soft-coded values design (e.g. when it’s appropriate, when it’s not, how far you can optimize it, and non-relational approaches). But in the end, a daily load of just one feed took 22 hours, and almost 100 feeds needed to be processed each day. Every consultant brought in to review my code said it was perfect. According to them, the fix was to buy much, much more expensive hardware.

New hardware was not an option, and the situation was desperate enough that this “naysayer” was asked for ideas. I demonstrated a redesigned solution that could process a feed in several minutes. With that, the soft-coded values solution was replaced and sent off to the archive repository. Years later, my solution is still used to process large volumes of data. Even though the soft-coded values solution was only used for several months many years ago, I continue to be asked questions related to that experience.

Lesson: Data matters. Relying on technology or technical skill to justify abstracting or over-generalizing your data might not be viable. I was able to design a superior solution in this situation because of my experience with data in this particular industry. This is why I like Buck Woody’s (Blog|Twitter) job title of Data Professional – it keeps me focused on what I consider the most important part of my work.

It Lives On

It was over eight years ago that I was asked for this particular hack. A new solution had been developed without considering that certain identifier values from the old solution needed to be maintained.

The manager of development asked if I could hack something together quickly to maintain these identifiers. It went something like “You can? Two days? Great! And, oh yeah, also create new identifiers that won’t collide with the old solution. This will be a temporary hack, don’t worry. In a couple of months a permanent solution will be built. Two, three months. Okay, maybe four or five since some people might go on vacation. That’s it. Probably. Well, maybe six months. Tops.”

So I created my hack solution. Driven by nothing but the finest .bat scripts. Plus several tables, stored procedures and user-defined functions. Yes, I usually avoid user-defined functions, but this was just a temporary hack that I needed to throw together quickly, right? Since this was a temporary solution, I decided to stick these objects in their own database that could just be dropped when the hack was no longer needed.

I asked the DBA for a database, which he named DB_CorpCo (substitute the actual client’s name for “CorpCo”). I said this was a terrible name because the database wasn’t tied to a specific client; it performed a particular function. The DBA said that for now only one client, CorpCo, would use it, so it was a fine name. I said that sometimes temporary hacks become permanent, so this database might end up being used for other clients such as AirCo, FruitCo, StoreCo, etc., and the name would be confusing. It should not contain a client name. However, my argument was going nowhere, and I had a lot of hacking to do, so DB_CorpCo was created. Over eight years ago.

DB_CorpCo worked great. Six months came and went. Then most of the development team was laid off. DB_CorpCo was working fine and there were no resources to replace it, so it kept going. Pretty soon, its use was expanded to include clients other than CorpCo. I got to explain over and over that, yes, it’s called DB_CorpCo, but all clients use it.

I left the company for a while, and when I returned I found that, despite all the comments and documentation warning that no new functionality should be added to DB_CorpCo, its tables had been altered and its stored procedures modified to take on new, expanded duties that belonged elsewhere. Like a beast in a horror film, DB_CorpCo, with all its temporary, hacked-together abominations, was now intertwined with new and old solutions serving multiple clients. And at the top of each source code file is the comment “Original version created by Noel McKinney.”

To this day, in a data center far, far away, on a lonely server in a rack, a database named “DB_CorpCo” is being updated and queried daily. It Lives (cue the horror music now).

Lesson: Temporary hacks can become permanent, unless you want to argue that eight years is not “permanent” when it comes to computer solutions.

I’m looking forward to reading the contributions to this month’s T-SQL Tuesday Wednesday and the examples that others were generous enough to share. Thanks to Adam Machanic for creating this monthly blog event and for hosting it this month!

Posted in T-SQL Tuesday | Comments Off

SQLCruise Book Review Number 1

During SQLCruise Alaska last month, Buck Woody (Blog|Twitter) challenged us to pick twelve books to read that would help us with our professional development goals in the year ahead. Almost a month ago, I posted my book list.

I started by reading the last book on my list. That book is Yes We Did! An inside look at how social media built the Obama brand by Rahaf Harfoush (Blog|Twitter).

Rahaf worked on the President’s 2008 campaign. Her book provides an in-depth look at how technology was used during that campaign.

This book was very helpful in preparing for my new job.

And with that, I have a disclaimer. This review is not exactly unbiased.

Because…

I am on the Technology Team at Obama For America.

A lot of you had a hand in this. From the bottom of my heart, Thank You!

Posted in Books | 13 Comments

T-SQL Tuesday #020 – T-SQL Best Practices

July’s T-SQL Tuesday is brought to us by Amit Banerjee (Blog|Twitter), and he has chosen T-SQL Best Practices as the topic for this month’s T-SQL Tuesday. Amit discusses best practices in light of their impact on performance and states that “Sometimes, the most obvious things are the easiest to overlook!”

I’m not going to try for a comprehensive list. Instead I’ll put down the first three best practices that came to mind when I saw this topic. Just for fun, I added a fourth issue that might stir things up a bit.

  1. Put your T-SQL scripts in a version control solution. I’ve mentioned this before but I’ll repeat it. If you’re a DBA and don’t know where to start with this, talk with your developers. It will be a bonding moment with them. I know that’s a frightening thought, but it could save your bacon someday.
  2. Go out of your way to make T-SQL scripts re-runnable in an elegant and graceful manner. Seriously, these are the words I use when discussing this. Ask yourself how you can make scripts more like a ballet dance and less like a freeway pileup. Consider what can happen when running your script. Does your script drop a database? Check for users in that database and handle them appropriately, e.g. exit the script with a message that the database was not dropped because it had users. Going to create a schema? Check for its existence first, and if it already exists then don’t create it; just output a message saying the schema already exists and didn’t have to be recreated. Since CREATE SCHEMA must be the first statement in its batch, you’ll need Dynamic SQL to do this conditionally. Fine, do that. Your output messages should include the object names and a timestamp so that you’ll know when the script was run. If you use the OBJECT_ID function to check for existence, include the optional object_type parameter so that you aren’t relying on the name alone (even if you use naming conventions based on object type, you can’t assume that everyone always followed that convention everywhere). Take foreign keys and other dependency relationships into account, and never assume that your scripts will be executed in the proper order. Sure, you could say “It’s not my job to make sure others use my scripts correctly.” But you know what? You could make things easier for someone else, even if they don’t notice it. And if someone else does notice, they will think highly of you. Yes, it’s extra work. But you will learn things. And it will make you a better database professional. I guarantee it (disclaimer: this is not an actual guarantee). See the first sketch after this list for what these checks can look like.
  3. Ask if the functionality even belongs in T-SQL. Sure, I like stored procedures and the SQL language. But there are other layers to most solutions than just the database. Don’t be afraid to ponder the possibility that something belongs in another layer. For example, when I encounter a user-defined function that’s used to format a number to appear a certain way, my first question is “where the heck is this being used?” Same thing with stored procedures that are doing a heavy amount of aggregation. Maybe that’s not a bad thing, but it could be. Situations are different all the time, so I think the easiest general rule is to be open to the possibility that a thing might belong elsewhere, even if it has already been implemented in T-SQL.
  4. I’m going to step on some toes and say that not all cursors are evil abominations to be completely and totally disallowed. Some are different than others. While they shouldn’t be the first solution considered in SQL Server, they can work just fine. Once upon a time, I was told to go through all of a company’s stored procedures and remove all the cursors. When I mentioned how long this could take, the DBA said “It’s easy. Just do what the cursor does, but put everything in a temp table, then iterate through the temp table with a WHILE loop.” I demonstrated that declaring the cursors as LOCAL and FAST_FORWARD (see the second sketch after this list) consistently beat the performance of his temp table and loop solution. But it didn’t matter. He explained that some day he would be a manager and would fire anyone who used cursors. Interesting. There might be absolutes in the world, but I don’t think making “never use cursors” a best practice is one of them.
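
Here’s a hedged sketch of the guard-and-report pattern from item 2. Everything in it is illustrative: the Staging schema and the Staging.FeedImport table are names I made up for the example, and a real deployment script would apply the same pattern to riskier steps such as dropping a database.

```sql
-- A minimal re-runnable script fragment, assuming made-up object names.
-- CREATE SCHEMA must be the first statement in its batch, hence the dynamic SQL.
IF SCHEMA_ID(N'Staging') IS NULL
BEGIN
    EXEC (N'CREATE SCHEMA Staging;');
    PRINT CONVERT(varchar(23), GETDATE(), 121) + N' Created schema Staging.';
END
ELSE
    PRINT CONVERT(varchar(23), GETDATE(), 121) + N' Schema Staging already exists; nothing done.';

-- Check by name AND object type (N'U' = user table) instead of trusting a
-- naming convention alone.
IF OBJECT_ID(N'Staging.FeedImport', N'U') IS NULL
BEGIN
    CREATE TABLE Staging.FeedImport
    (
        FeedImportId int      IDENTITY(1,1) NOT NULL PRIMARY KEY,
        LoadedAt     datetime NOT NULL DEFAULT GETDATE()
    );
    PRINT CONVERT(varchar(23), GETDATE(), 121) + N' Created table Staging.FeedImport.';
END
ELSE
    PRINT CONVERT(varchar(23), GETDATE(), 121) + N' Table Staging.FeedImport already exists; nothing done.';
```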

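And here’s a hedged sketch of the cursor declaration from item 4; the query and the per-row PRINT are placeholders for whatever real work the cursor drives.

```sql
-- A LOCAL FAST_FORWARD cursor: scoped to this batch, forward-only, read-only.
DECLARE @dbname sysname;

DECLARE db_cur CURSOR LOCAL FAST_FORWARD FOR
    SELECT name
    FROM sys.databases
    WHERE database_id > 4;  -- skip the system databases

OPEN db_cur;
FETCH NEXT FROM db_cur INTO @dbname;

WHILE @@FETCH_STATUS = 0
BEGIN
    PRINT @dbname;          -- stand-in for the per-row work
    FETCH NEXT FROM db_cur INTO @dbname;
END

CLOSE db_cur;
DEALLOCATE db_cur;
```
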
I’m looking forward to reading the contributions to this month’s T-SQL Tuesday and the best practices discussions of others. Thanks to Amit Banerjee for hosting T-SQL Tuesday #20, and thanks to Adam Machanic (Blog|Twitter) for creating this monthly blog event!

Posted in T-SQL Tuesday | 2 Comments

TRUNCATE TABLE Regrets

“Good things, sometimes, take time.”

- Character of Lou Mannheim in Wall Street (1987 film)

I don’t like regrets, and try to resist saying “If only I had done blah blah…” and so forth. Even when I do, my wife reminds me of what wonderful things I would have missed out on if I hadn’t taken the alternative, and she’s always completely correct.

In January we were talking about the year ahead, how fortunate we are, how much fun experiencing life has been, etc. But then I let an “If only” slip out.

I said “I shouldn’t have left Boston in 1996. It was supposed to be just for a few years, then I was going to return. It will be fifteen years this year since I left. I wouldn’t have moved if I had thought it would be this many years later and I’m still not back.”

For a moment I was in a bit of shock that I had expressed a regret like that. Sure, I had been feeling it for a long time, but I’d never let it slip out like that. However, my wife then surprised me with her reply. She agreed with me. She had nothing to counter my statement. The things I’ve done since leaving Boston could have just as easily been done there. I was on a defined path there and had plans and everything was in motion.

And I was so very happy. I loved living in Boston. I felt at home there immediately. My whole life I had wanted to live there, and once I arrived, everyone thought I’d never leave.

But an opportunity came along that required returning to the Midwest, and somehow I felt I should take it. I told myself it was only going to be for a few years, then I’d return, plus I could always visit on weekends and vacations.

Things didn’t turn out that way. And that was fine. Really. I’ve had a great time during the last fifteen years. Lived in a few places, had several jobs, met lots of fantastic people. Great experiences.

But I had a problem. In short, I couldn’t figure out what I was supposed to be doing. Twenty years ago I knew what I was supposed to do. Thirteen years ago I adjusted those goals and jumped on another path with a defined purpose that really excited me. However, about ten years ago I wandered off that path. Then while working on the goals and resolutions posts around the New Year, it began to sink in… I’ve been busy and engaged and learning and experiencing, but that overall goal or purpose was missing.

I was in a funk… now you see why I don’t like New Year’s resolutions :-) And I couldn’t find something to point to and say “That’s why I left Boston. This could not have happened if I had stayed there.” Fifteen years. I couldn’t find it.

Then it happened. Suddenly an opportunity appeared, a result of meeting the right people, working at the right places at the right times, having the right skills, being in the right city. The opportunity turned into an offer for a job that I’m starting this Monday morning.

When I received the job offer, my wife and I were jumping around for joy, and I suddenly looked at her and said “This is why I left Boston fifteen years ago!” She agreed with a definite “There is no way this would have happened if you had stayed in Boston. Not possible.” She and I continued on listing all the things I’ve done and how they led to this job offer. I’m going to be doing what I am supposed to be doing, and it’s consistent with the path I started on twenty years ago. Only better.

“I shouldn’t have left Boston in July 1996” has become “Thank goodness I left when I did!”

Posted in Professional Development | 7 Comments