Crucial M4 SSD in MacBook Pro (Early 2011)

I recently bought a Crucial M4 128GB solid state drive and installed it in my 2011 MacBook Pro. The results have been good and, as a first-time SSD user, I can honestly say I'm impressed. Startup times for everything from the OS to applications are much, much snappier. I also appreciate no longer hearing the mechanical sounds of an HDD, even though they weren’t too loud in this case. If I ever get around to running benchmarks, I’ll attach them here for your perusal.

Here’s a screenshot of BlackMagic’s Disk Speed Test taken 8/12/2012, over a year after I installed the drive:

BlackMagic Disk Speed Test Benchmarks

These benchmarks are actually a little better than the “out of the box” numbers, thanks to slight performance improvements delivered by firmware upgrades. Either way, these are precisely in line with Crucial’s advertised specs (note: the 256GB and 512GB M4 have faster write speeds than the 128GB — about 250 MB/s).

The M4 supports TRIM but, as many people no doubt know by this point, OS X 10.6.6 and versions beyond (including Lion and Mountain Lion) only support TRIM when using one of the Apple SSDs (Samsung/Toshiba). Personally, I think this is ridiculous. After all, Windows 7 provided TRIM support out of the box in October 2009, and the Linux kernel has supported TRIM (on certain filesystems) for nearly the same length of time. So it’s weird to find something so backward-facing in a product that the manufacturer touts as “the world’s most advanced desktop operating system.” Don’t get me wrong: I love my Mac and OS X is a good operating system, but sometimes they leave things out that just make no sense.

Not willing to go quietly, I used the TRIM Support Enabler utility to enable TRIM support for this drive. There’s always a bit of risk when using a system hack of this nature, but it seemed like a no-brainer when considering the potential benefit to the performance and life expectancy of a piece of technology that’s still relatively costly. Just remember to make a backup of your system before proceeding and you should be fine. I haven’t had any problems that I can attribute to enabling TRIM support for my drive.

With respect to the Crucial M4 specifically, it should be noted that there have been a significant number of user complaints about this drive when paired with the 2011 MBP, ranging from excessive “beach balls” to complete failure. Normally, I’d approach these complaints with some skepticism, but Crucial support began issuing responses to many of these users implying they were aware of the issues: first they stopped listing the Crucial M4 as compatible with the 2011 MacBook Pro, and later they stated that their engineers were looking into it. Crucial has had a good name amongst Mac parts suppliers for years (namely RAM); let’s hope they issue a firmware update before losing too many SSD customers. Until then, I’d recommend that MacBook Pro users exercise caution when shopping for an SSD.

For the record, I too have had some lengthy “beach balls,” namely when using my Win7 VM with Parallels. Bummer.

UPDATE: Ever since the earliest firmware upgrade (0003, I think), the beach ball issue has become practically non-existent. I mentioned as much in the comments but realized I’m doing my readers and Crucial a disservice by not mentioning this explicitly in the review. Apologies.

The Crucial M4 SSD rocks. That said, SSDs are still somewhat expensive (ranging often from $1.50-$2.00 per GB), so be prepared to open your wallet, especially if you want a SATA III drive and/or lots of storage space. If only Apple would polish up OS X’s SSD support so that I don’t have to perform witchcraft to get the full functionality of my drive, then I’d be perfectly satisfied.

128GB is much less than I’d ideally like to have, but a 512GB SSD still costs nearly $1000. I’m not sure I could justify that for a storage drive on a personal machine (unless that machine was being used to generate income in some way, which mine is clearly not). I think/hope we’ll see pricing gradually decrease over the next year or two and hopefully SSDs will replace HDDs as the standard shortly thereafter.


MacBook Pro 2011 15″ Review

Pros: 2nd-Gen (Sandy Bridge) i7 Quad-Core processors, Unibody design is still as good as it gets, battery life is a cut above the competition.

Cons: Price, no USB 3.0, no HDMI out, generally runs a tad warm, discrete graphics card runs hot and slurps power when in use.

Summary: In past years, I always passed on Apple’s MacBook Pro despite my considerable interest because I couldn’t reconcile the premium price tag with specs that were anything but cutting edge. This year, however, Apple made a significant stride towards making the MacBook Pro more competitive with what Engadget has accurately described as “one of the more aggressive refreshes in the machine’s history.”

The most prominent improvement is the inclusion of Intel’s Sandy Bridge quad-core CPUs. Apple, while known for being an innovator and trendsetter in the mobile device market, has never been an early adopter when it comes to its Mac products. So it was a big surprise when the MacBook Pro started offering Sandy Bridge less than 60 days after its debut (and was, in fact, the first to make the CPU available en masse following the Cougar Point chipset fiasco). These processors have been the key component in a package that has yielded some of the most impressive benchmarks of any notebook PC to date (though it’s only fair to note that this gap is likely to narrow significantly once other manufacturers start offering their own Sandy Bridge models). This powerhouse CPU is packed inside Apple’s sturdy and sexy unibody design, complete with Apple’s industry-leading, marvelous glass trackpad (and gesture support) and comfortable keyboard. Even after 3 years on the market, this is still the design to beat (HP’s Envy seems to be catching up but, for the time being, the unibody design has no equal).

One of the more subtle improvements is an upgrade of the iSight camera to an HD webcam which allows for HD FaceTime calls (strangely, PhotoBooth and Skype for OS X still take VGA shots only, but I’d expect that an update is forthcoming).

The display is a big selling point for true professional users. Apple is one of the only portable PC makers to still offer 16:10 displays (most have opted to follow the TV-inspired trend of 16:9). As someone who does a fair amount of software development and web design, I can attest to the usefulness of the extra vertical pixels. The LED-backlit panel itself is above average [in relation to other laptop displays] in practically every measure of quality. The 15″ MacBook Pro allows customers to choose from two very usable resolutions (1440×900 and 1680×1050). This might not seem like much, but most other manufacturers don’t offer customers more than one resolution and, with most 15″ notebooks, that resolution is a mere 1366×768. All praise aside, one has to wonder when Apple, reputed for making aesthetically-pleasing devices, will start offering RGB LED displays in their MacBook Pro line.

Battery life has been one of the MacBook Pro’s “tent-pole” features in recent years. Interestingly, Apple’s published rating of up to 7 hours for the new battery is a slight decrease from last year’s model. Apple has explained this as the result of using a more realistic testing approach. So how long does this battery really last? It should go without saying that it largely depends on how you’re using your MacBook. For casual usage with some video playback, most users will see this machine top out at around 5 and 1/2 hours (give or take 30 minutes). And while that figure is shy of the 7 hours claimed by Apple, it’s head and shoulders above most of the competition. If your use case involves frequent use of the discrete GPU, lengthy encoding/decoding sessions, or other tasks with a heavy processing payload, then the battery life will decrease precipitously. And since the battery isn’t replaceable in an on-the-go fashion, some professionals might be well served by adding a portable external power source to their arsenal (photography professionals working on-location, for example). See the “Windows Performance” section below for information on how the battery performs when booting into Windows 7 using Boot Camp.

The new Thunderbolt I/O port is an interesting prospect but isn’t really a draw at this point as no Thunderbolt-enabled peripherals have hit the market. In fact, I’d say we’re at least 12 months out from knowing whether this technology has a reasonable shot at success/proliferation. Personally, I expect the first few batches of Thunderbolt-enabled devices to be priced considerably beyond the average consumer’s budget. The silver lining, as confirmed by Intel, is that there’s no exclusivity on Thunderbolt which means that other PC makers are free to implement the technology when they feel the time is right.

The 2011 MacBook Pro does lack a few things that a PC labeled as “Pro” and sold in 2011 probably should have — GPU is good, but could be better (though I assume the nVIDIA/Intel debacle played a role in that), no USB 3.0, still only 2 USB Ports, 8GB RAM option is cheaper than last year but still overpriced, no HDMI port — but nothing about it is so archaic as to cause the price point to be downright offensive as it was in years past. One thing a number of reviewers have criticized is Apple’s consistent resistance towards Blu-Ray implementation. I’m on the fence on this matter as I find myself sipping on the Kool-Aid of those who are clamoring for the outright removal of the optical drive. If we could see an affordable SuperDrive with Thunderbolt and Blu-Ray support down the road, then I think that would be a fantastic compromise (though the latter seems highly unlikely as Steve Jobs himself has made his position against Blu-Ray clearly known).

After being on the fence about buying my first Mac for a few years, this refresh convinced me to finally pull the trigger. Yes, the price is still higher than it perhaps ought to be and some of the hardware isn’t quite cutting edge, but the new processor lineup coupled with the fantastic design made for a package that totally won me over. A recent negative experience with a different notebook also played into this decision (feel free to skip the next paragraph if you’re not worried about the story):

In early January I was in need of a notebook and was itching to try out the just-released Sandy Bridge processors. This was days before the Cougar Point issues were brought to light. I bought a 15″ Pavilion dv6t Quad Edition from HP. I customized a fully specced-out rig (i7-2820QM, 8GB RAM, 1GB AMD Radeon 6570). Through a discount program and a coupon, I made my purchase for a very reasonable price. It was fast and ran every Windows task I threw at it capably but, in every other regard, it was the worst notebook I’ve ever used. The screen (which ran at a paltry 1366×768) had horrendous contrast and a frustratingly small arc of optimal viewing angles. The trackpad was extremely stiff and poorly designed (right-clicking was an arthritis-inducing nightmare). The standard 6-cell battery was only good for 2.5 hours of light-to-moderate use. I sent it back in less than two weeks. Given the vast difference in quality of user experience, components, and design, the significant price difference between that PC and the MacBook Pro seems infinitely less confounding.

OS X Performance: As a first-time Mac user, I can’t accurately compare the OS X performance of this machine to previous Macs, but I can say that just about every task I’ve attempted has been quick and fluid. There’s no doubt in my mind that this rig will meet and/or exceed OS X performance expectations at nearly every turn. Casual users will be pleased to know that this year’s MacBook Pro continues to handle with ease the more mundane tasks of web browsing, light-to-medium business productivity (iWork or Office), and correspondence. Professional users will likely find this machine adequate in most areas, though some concerns arise in situations that require the constant attention of the discrete GPU, as the heat output is considerable and the battery life dips well into the paltry ranges occupied by the rest of the industry.

UPDATE — March 17th, 2011: As a newcomer to OS X, I find myself wowed by the wide array of features and built-in tools, but disappointed by the general lack of stability. OS X has crashed on me more times in one month than three versions of Windows have (XP, Vista, Win7) in nearly a decade. This factor will likely warrant consideration the next time I look to purchase a PC.

Windows Performance: I know a lot of Mac users won’t care much about this, but it seems worth commenting on since so many people out there still use Windows at home (and it’s a subject with which I have considerable experience). I’ve been running Windows 7 through Boot Camp from time to time and put the system through some considerable paces (I’m a software developer, so mainly large computations and heavy RDBMS queries). From a performance standpoint, the MacBook Pro seems to handle everything swimmingly (though not quite as well as the aforementioned HP notebook in some regards). And Windows has never looked as good as it does on the MacBook Pro’s lovely screen. That said, there are a couple of downsides to the experience. Apple hasn’t included graphics-switching drivers to support Windows installations through Boot Camp, so the discrete GPU runs constantly when booting into Windows. As a result, the machine runs much hotter in Windows (LubbosFanControl set for a minimum of 4000rpm seems to keep things running in the mid-to-upper forties [Celsius] range for most non-gaming situations). Also, the splendid battery life enjoyed in OS X is cut in half when running Windows. It’s a bit of a bummer for someone who does a fair amount of work in Visual Studio and SQL Server and might be a nuisance to hardcore gamers (though I honestly wouldn’t recommend this system to a hardcore gamer anyway), but I imagine a lot of Apple’s core Mac user base won’t have a big problem with it. I’m eager to try Parallels or Fusion at some point and compare the experience to Boot Camp.

Recommendation: If you’re in need of a notebook, the 2011 MacBook Pro is a tempting package, particularly if you’re an avid Mac user. It’s not perfect for everyone — hardcore gamers, Windows enthusiasts/exclusivists, and shoppers on a budget should stay away — but it’s a fast and powerful machine with an unmatched design and a fantastic user experience. That said, I’m not sure I’d rush out to the store just yet. If you’re on the fence and can wait until early 2012, then I’d recommend doing so for the following reasons:

  • Apple’s next MacBook Pro refresh is likely to include Intel’s forthcoming CPU bump, codenamed Ivy Bridge (scheduled to debut in Q4 2011), which, if Intel meets their goals, will deliver 20% better performance, use slightly less power, and have a considerably more favorable TDP rating (less thermal output).
  • Intel’s integrated graphics will no doubt improve somewhat. I doubt it will be enough of an advancement to merit the outright removal of the discrete GPU from the higher-end MacBook Pros, but it could be enough to raise the cross-over threshold where the discrete GPU takes over, which would have a positive impact on battery life and net thermal output in several use cases.
  • OS X 10.7, codenamed Lion, will have been released, as likely will the inevitable first service release that resolves some of Lion’s must-fix RTM bugs. Somewhere in this mix will likely exist improved graphics drivers (performance on the AMD Radeon 6570M is reportedly underwhelming in comparison to the nVIDIA 330M in the 2010 MBP line in some areas, primarily gaming).
  • Maturing SSD technology may finally put SSD upgrades at a point of mainstream accessibility. Right now, the bang is too small for the considerably large buck. Apple may also include TRIM support for non-Apple-supplied SSDs when Lion ships.
  • There will be a much clearer picture of where Thunderbolt stands.
  • From a driver perspective, it appears Apple doesn’t quite have a handle on some of the new hardware. Some issues were fixed by the 10.6.7 update released on March 21st, but the user community is still abuzz with stories of random freezes (apps and system), application incompatibilities, heat spikes/fan spin-ups, and so on.

A Note about Heat: I’ve criticized the heat output of the 2011 MacBook Pro a few times in this review. On one hand, it’s only reasonable to expect that packing cutting-edge CPUs into the MacBook Pro’s slim form factor is going to tax the computer’s cooling resources. Power and cooling have long played a sort of cat-and-mouse game, and this time power clearly has the upper hand.

On the other hand, I still think the 2011 MBP gets hotter faster than it ought to. For comparison, my last laptop was a Dell Inspiron E1705, purchased in November of 2006 and outfitted with a Core 2 Duo @ 2.2GHz and 4GB of RAM. I used that computer for nearly 4 full years before it was stolen during a break-in in October of 2010. For most of the time I used that laptop, it rarely reached 75C and only surpassed that mark when full-on gaming (and that was with modest fan speeds). The 2011 MacBook Pro seems to eclipse 75C regularly for even the most trivial of tasks (Web pages with RIA plug-ins such as Flash & Silverlight, YouTube/Hulu/NetFlix, non-GPU intense gaming, Xcode development) and passes 90C within moments of embarking on true “heavy load” operations (video encoding, large rendering jobs, moderate-to-heavy VM activity, 720p or 1080p streaming from some sources).

Recommended Reading:

  • iFixit Teardown
  • Engadget Review

My Specs: 2.3 GHz i7 Sandy Bridge quad-core CPU, 8GB PC10600 DDR3 1333MHz RAM (ordered 4GB from Apple and performed an aftermarket upgrade), AMD Radeon 6570 GPU w/1GB DDR5 VRAM, 1680×1050 glossy display

A workaround for using the :active pseudo-class with anchors in IE7

NOTE: This post is intended for developers frustrated with IE7’s lack of support for the :active pseudo-class. The information contained within has been noted in other articles, such as Jeff Starr’s Unobtrusive JavaScript: 5 Ways to Remove Unwanted Focus Outlines or Oscar Alexander’s How to Make Sexy Buttons with CSS, but I don’t think I’ve seen a post solely dedicated to this specific issue, so I figured I’d take the time to create one.

The Background

A little while ago, I set out to create a couple of image buttons. I wanted something that was reusable and had the potential to be used with ASP.NET with the least possible extra work/maintenance. I liked the design concept that the guys over at Wufoo have been using and decided to see if I could code something that would emulate that look and feel, particularly in regards to the way they used the :hover and :active pseudo-classes to add a touch of interaction.

So here’s an example of what I came up with:


Here’s what that button looks like when the mouse is hovering over it (:hover):


And here’s what it looks like when the user clicks on it:


None of this is groundbreaking or terribly fancy, but it does the job nicely.

The Problem

The problem I noticed is that, in IE7, the :active pseudo-class seems to behave more like the :focus pseudo-class. That is, the CSS instructions provided in the :active pseudo-class should be presented only for the duration of the mouse click, but instead persist for as long as the anchor element has focus (focus is given to the anchor onclick). This may not be a problem in every scenario, particularly if clicking on the button results in a refresh or a redirect to another page. If the button in question is to be reused between page refreshes, however, then your users may end up wondering if it’s safe to continue when the button has not returned to its original state. In any case, it mildly degrades the user experience and, as such, it probably should be addressed.

The workaround I employed is quite simple. To each anchor tag, I added an onclick handler roughly equivalent to the following:

<a class="button green" onclick="this.blur();" href="#">Link Text</a>

That seemed to fix the problem. It also gave the added bonus in some browsers of eliminating the dotted border indicative of focus that appears post-click (but that may not be news to most people).

If you were paying attention at the beginning of my post, you may recall that one of my goals was to create a solution that could be easily reused in ASP.NET. To do that, I’d simply use the LinkButton control in a manner similar to the following:

<asp:LinkButton ID="myButton" runat="server" CssClass="button green">
    <img alt="" src="img/myImage.png" />
    View Report
</asp:LinkButton>
From there, I’d probably just use some unobtrusive JavaScript to wire up the onclick event of the generated links to perform the “blur” functionality as specified above.
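To illustrate, here’s a rough sketch of what that unobtrusive wiring might look like. The function name (wireUpBlurHandlers) and the class-matching logic are my own placeholders, not output generated by ASP.NET:

```javascript
// A hedged sketch: given a collection of anchor-like elements, attach an
// onclick handler that blurs each "button"-classed element after a click.
// wireUpBlurHandlers is a hypothetical name; adapt it to your own code.
function wireUpBlurHandlers(anchors) {
    for (var i = 0; i < anchors.length; i++) {
        if (/\bbutton\b/.test(anchors[i].className)) {
            anchors[i].onclick = function () {
                this.blur(); // remove focus so IE7 drops the :active styling
            };
        }
    }
}

// In the page, you might call it once the DOM has loaded:
// wireUpBlurHandlers(document.getElementsByTagName("a"));
```

Since the handler is attached from script rather than inline markup, the HTML stays clean and the behavior can be changed in one place.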

I’m sure a User Control could be created for even swifter and more concise ASP.NET implementation, but I haven’t taken the time to do that yet. If I do, I’ll be sure to share it here.

If you need some additional guidance on making stylish image buttons with anchor tags and CSS, I’d recommend reading the aforementioned article, How to Make Sexy Buttons with CSS, by Oscar Alexander.

Let me know if you’d like me to post the CSS/XHTML code for this example.

Comments and suggestions are always appreciated.


Order By Case and Derived Columns

Not too long ago, I learned how to combine SQL Server’s ORDER BY with CASE expressions. I was dazzled by my newfound ability to apply custom sorting instructions to my queries. It wasn’t too long afterwards that I attempted to use this technique to order the results of a query containing a derived column. That’s when the error messages started pouring in. It didn’t seem to make sense, as it’s perfectly possible to use ORDER BY with derived columns when CASE isn’t involved. Well, after searching far and wide, I can tell you with a comfortable degree of certainty that it cannot be done (I’ve tested it in SQL Server 2000, 2005, and 2008 and the result is the same in all three).

There’s good news, however: A workaround exists.

All you have to do is create a second query that selects from the query containing the derived column and implement the ORDER BY for the second query. The result is slightly less elegant but it does the trick. I’ve used the Northwind database to create the following example:

How to use Order By and Case with Derived Columns
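In case the screenshot is unavailable, here’s a hedged reconstruction of the pattern against Northwind’s Products table. The derived column (StockLevel) and the sort rules are illustrative assumptions; I’m sketching the technique, not reproducing the exact query from the image:

```sql
-- Inner query produces the derived column; outer query owns the ORDER BY.
-- The column names come from Northwind's Products table; the CASE logic
-- is an illustrative assumption, not the query from the screenshot.
SELECT ProductName, StockLevel
FROM (
    SELECT ProductName,
           CASE
               WHEN UnitsInStock = 0 THEN 'Out of Stock'
               WHEN UnitsInStock < ReorderLevel THEN 'Low'
               ELSE 'OK'
           END AS StockLevel
    FROM Products
) AS StockReport
ORDER BY CASE StockLevel
             WHEN 'Out of Stock' THEN 1
             WHEN 'Low' THEN 2
             ELSE 3
         END;
```

Because the outer query sees StockLevel as an ordinary column of the derived table, the ORDER BY CASE no longer trips over the derived-column restriction.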

The extra SELECT statement will probably add a slight performance cost, but any effect on the end-user experience should be nominal.

So, that’s how it’s done. Hopefully I’ve saved you some time and energy.

If you know of any other approaches to this problem that I haven’t mentioned, or if you have any additional information on the issue itself, I’d love to hear from you.

Comments and feedback are always welcomed!


Using Nested CASE Expressions in SQL Server

Note: This article is intended for developers who have a working knowledge of SQL Server’s CASE expression. If you do not, I recommend reading this article from Microsoft; it gives a pretty good overview of the basics.

Applications are loaded with conditional logic. So it’s not a big surprise when you stumble across a scenario where one half of a conditional expression is the result of an entirely separate conditional expression. If you’re running into this on the backend of your application, then a nested CASE expression might be of use to you. Let’s take a look at a real world example.

Let’s say you’ve been asked to write a procedure to generate a report for a business unit within your company. Let’s assume that this unit has both a Status property and a Recommendation property, both of which are stored in the database and related through foreign keys. The requestor gives you the following matrix to use for determining what output to display on the report:

Status           Recommendation   Report Output
In Production    Any              In Production
In Test          Add              Add to Production Queue
In Test          Remove           Return to Development Queue
In Test          Remain           Remain in Test Queue
In Development   Add              Add to Testing Queue
In Development   Remain           Remain in Development Queue
Other            Any              None

Now that we have our problem, here’s an example of how to use a nested case expression to create a solution:

Using a Nested Case Expression
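In case the screenshot is unavailable, here’s a hedged sketch of what such a nested CASE might look like. The table and column names (BusinessUnits, UnitName, Status, Recommendation) are hypothetical placeholders that match the matrix above:

```sql
-- The outer CASE switches on Status; the inner CASEs switch on
-- Recommendation. The column alias appears only after the outermost END.
SELECT UnitName,
       CASE Status
           WHEN 'In Production' THEN 'In Production'
           WHEN 'In Test' THEN
               CASE Recommendation
                   WHEN 'Add' THEN 'Add to Production Queue'
                   WHEN 'Remove' THEN 'Return to Development Queue'
                   WHEN 'Remain' THEN 'Remain in Test Queue'
               END
           WHEN 'In Development' THEN
               CASE Recommendation
                   WHEN 'Add' THEN 'Add to Testing Queue'
                   WHEN 'Remain' THEN 'Remain in Development Queue'
               END
           ELSE 'None'
       END AS ReportOutput
FROM BusinessUnits;
```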

And there you have it. This example is pretty quick and dirty but it should give you a good grasp of the basic concepts needed to use this technique.

A couple quick additional tips:

  1. Do not attempt to derive a column name for any of the nested case statements as this will cause an error. If you wish to use a derived column name for the entire expression, do so at the end of the outermost case statement (as I have done in the example).
  2. You may nest deeper than two levels but, if you find the need arising, you may want to look at alternative approaches for maintainability’s sake before proceeding with a nesting nightmare.

Happy coding!


Preloading Images with JavaScript or CSS

Preloading Images on your Web Page with JavaScript or CSS

Note: This article is intended to be a tutorial for novice web developers and a refresher for web developers with more advanced skill sets.

Web developers are always striving to create the best possible user experience. While it is often more convenient to deliver content with a minimal amount of images, the richest functionality is hardly ever the most convenient. And the richer the functionality, the heavier the payload that functionality requires. So after you’ve optimized your JavaScript, reduced the bit-depth of your GIF files, and trimmed the fat from your HTML code, what else can you do to facilitate a more fluid and expedient delivery of your content?

Preloading images can be a valuable technique for the developer to eliminate the subtle lag that separates a good page from a great page — if it is done thoughtfully and correctly. First, let’s examine a real world situation where preloaded images might come in handy.

Imagine you have an element on your page where you wish to employ a rollover effect. This is frequently used on form buttons or navigation bars as a way of illustrating a mouseover state or lack thereof. While the “Sliding Doors” technique is becoming the most frequently used approach for this type of effect, there may be situations where something else is needed. Let’s say that you have a photo gallery page. Each gallery item is displayed as a photo negative by default. When the user’s mouse enters the area of the image, you’d like the fully-developed photo to display. The problem should become somewhat evident if the photo files are of appreciable size. If the developed photo has not been preloaded, the user will have to wait for that process to occur when he/she moves the mouse over the photo negative. If the file is big enough (or the connection is slow enough), then this may have a substantial negative impact on the user’s experience.

So, any pre-emptive loading that you can effectively manage will generally translate into good results for your users.

Here, I will present two quick and easy methods for preloading images for your web pages.

Preloading Images with JavaScript

This technique has been around since forever, but, in case you haven’t needed it, or just haven’t used it in a while and could use a refresher, here’s how it works:

You can implement the script containing the preload statements using an external .js file or as inline JavaScript. The important thing is that you place the script in the HEAD tag so that the desired images load when the rest of the page loads. You’ll want something like the following:

function preloadImages() {
    if (document.images) {
        // Declare image variables
        var myImage1 = new Image();
        var myImage2 = new Image();
        // Assign image paths to variables
        // Make sure the paths are correct
        myImage1.src = "imagePath/fileName1.png";
        myImage2.src = "imagePath/fileName2.png";
    }
}

And that’s all there is to it.

You might also want to create a function that allows you to loop through a group of image URLs passed as an array. Here’s an example of that approach:

function preloadImages(imageURLs) {
    if (document.images) {
        for (var i = 0; i < imageURLs.length; i++) {
            var img = new Image();
            img.src = imageURLs[i];
        }
    }
}

// Usage: preloadImages(["imagePath/fileName1.png", "imagePath/fileName2.png"]);

One noteworthy benefit of preloading images with JavaScript is that all preloaded images will be cached according to the settings of the user’s browser, just like any other image. Therefore, subsequent pages that use any of the same images may load faster.

Preloading Images with CSS

With the proliferation of AJAX and other JavaScript-driven UI frameworks (such as jQuery), I wish I could say that it’s safe to assume your end users are browsing with JavaScript enabled. Unfortunately, just as the old proverb warns us that we should never say “never,” software developers know that we should never say “always” either. So if you’re coding for a demographic that can’t be expected to take advantage of JavaScript but you still wish to preload images for some reason, you still have the option to do so. To do this, place an IMG tag that points to the image you want to preload and use CSS to hide the image. You can accomplish this using a style sheet or inline styles, but a style sheet is generally considered best practice, so that’s what we’ll use for our example.

First, create a class in your style sheet that you will use to hide the images:

.hidden { display: none; }

An added bonus is that this class is quite generic and can be reused for any other page elements you wish to hide.

Next, create a wrapper element for the preloaded images using the “hidden” class we just created:

<div class="hidden">
</div>

Next, place an IMG tag in the body of the wrapper element we just created for each image you wish to preload, using markup like the following:

<img src="imagePath/fileName1.png" alt="Some Caption" width="1" height="1" />

Make sure you position these elements at the bottom of the page. Otherwise, the rest of the page may not load/display until the images are finished loading. I’ve added the 1×1 dimensions to each image tag in case the user’s browser doesn’t support CSS; without them, the bottom of the page would be littered with a bunch of seemingly random images. Like the JavaScript method, images preloaded with this method will be cached and will therefore cut down the load time of subsequent pages that use them.
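Putting the pieces together, the bottom of your page might look something like this (the file names are the same placeholders used in the JavaScript example above):

```html
<!-- Hidden preload container; keep it at the very bottom of the page,
     just before the closing body tag -->
<div class="hidden">
    <img src="imagePath/fileName1.png" alt="" width="1" height="1" />
    <img src="imagePath/fileName2.png" alt="" width="1" height="1" />
</div>
```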

So there you have it.

Unless you’re targeting a JavaScript-free environment, I’d advocate using the JavaScript method when possible. The CSS method is equally valid, but it results in a messier and less semantic HTML document structure.

I’d like to give credit and thanks to Jeff Starr and his article A Way to Preload Images Without Javascript That is So Much Better for some inspiration on the CSS-only approach.

Your comments and feedback are greatly appreciated!