Misleading Marketing

I subscribed to a magazine the other day, and the confirmation screen contained this.

Can you spot the problem? (The default state for the checkboxes is unchecked.)


The first five checkboxes are “opt-out,” i.e., you must initiate an action to stop something.

Checkbox number six is “opt-in,” i.e., you must initiate an action to start something.

This is bad interface design.

Most people will not read the explanatory copy closely, and leave the boxes unchecked.

People who take the time to read the copy will begin checking the boxes, and might tick the sixth box on the assumption that the logic is the same.


Is the magazine publisher trying to trick you into subscribing to their email blast?

Probably not, according to Hanlon’s Razor: “Never attribute to malice that which is adequately explained by stupidity.”

Yet in this case we suspect the publisher is limited not by their brains, but by their systems. Their postal and telesales system probably assumes “opt-in,” and the email system probably assumes “opt-out.” Combine the two little form widgets onto one page and you get dissonance.

The Downside

Is it worth fixing? The worst case is that hundreds of people, annoyed at receiving unwanted offers in their inbox, begin marking those emails as spam. Email providers like Google, Microsoft and Yahoo eventually learn from this aggregated customer behavior and preemptively treat all of these offers as spam. Delivery rates thus decline and the promotional channel is tainted.

Is that likely? It depends on the volume of subscriptions through this page, and we can only guess at this point.

Naturally, the best course of action would have been to build the forms with consistent logic from the start.


Hooray for crude solutions

The good folks at GigaOm take note of a study that shows how web pages have increased in size:

…The HTTP Archive charted the growth of the average web page and found that average web pages have grown from 726 KB a year ago to 965 KB now…

And a graph:


GigaOm’s main point is that this growth hurts people visiting sites via mobile: they bump into their data caps sooner, so you should build a mobile-friendly site. Fair enough.

My own take is that if your pages are getting fatter, they will be slower to download in all cases. This annoys visitors and hurts your search engine optimization, since Google penalizes slow sites.

The elegant way to address this problem is to clean up your design, tidy page markup, consolidate style sheets, etc.

A cheaper solution may be to throw hardware at the problem. Two weeks ago, I upgraded the server that hosts a number of client websites. Some sites got remarkably more responsive. Here’s the average page load time of one (lower is faster and thus better):


My incremental expenditure was $50 per month. That’s cheaper than just about any page optimization effort.

Making Website Tweaks Easy

A recurring theme for 2010 is the increasing availability of web tools that make effective online marketing more accessible. Hot categories include A/B testing, mobile website maintenance, content management systems, and the interoperability of social media tools.

Yesterday I found another modest, but elegant, service that fits right in.

Sweaver is a new add-on for the Drupal CMS that dramatically eases the pain of making visual changes to your website. (Content changes, by contrast, should already be easy; if they aren’t, you need a new site.)

Watch a minute of the developer’s video – begin at 0:45 or so.

Granted, Sweaver isn’t an original idea. There are other visual website builders around, and it’s quite similar to the CSS editor in Drupal Gardens. But Sweaver is built for one of the top three content management systems, and it’s in good functional shape. It’s also free and appears to have the active support of a stable contributor. I love it.

The important point here is that visual tweaks are becoming less costly in money and time. Either they can now be done by a less expensive person, or in less time by your existing web person.

Transcribing Services

My favorite transcriber, an ex-IBM programmer in Rhode Island, retired not long ago. This left me without a proven way to turn customer telephone interviews into text.

Then, while reading the SEOMoz blog, I noticed that each of their Whiteboard Friday instructional videos has an accompanying transcription. (This is wonderful on its own – scan the text before deciding whether to watch the video. And it helps a little bit with on-page SEO.)

Anyhow, there is a credits link to speechpad.com…



The Speechpad website is schizophrenic. On the one hand it has easy registration and several ways to send them your audio.

It almost seems like this is a slick new web app.

On the other hand, it does a half-baked job of converting visitors into buyers. Visit their site and the guessing begins. Who are they? What does it cost? What is the turnaround time? Where is the FAQ? What are they writing on their blog? (“Coming Soon.”) What guarantees do they offer? None of this is present.

It is a very mysterious experience.


Despite that, I took a flyer and voice-recorded two minutes of From Poverty to Prosperity (by Arnold Kling and Nick Schulz of AEI; recommended!). I sent it over to Speechpad and within a day I had a very accurate transcription. Cost? $0.00. They treated it as a spec job, I gather.

A followup email from them said their rates were flexible, and asked that for future transcriptions I email them regarding budget and turnaround.

One also learns that SpeechInk is the company behind the Speechpad service. The SpeechInk website has all the missing pieces, most notably rates. Simple transcriptions begin at $1.75 a minute, a rate which goes up with the complexity of the recording and down with the volume of transcriptions. This is somewhat less than what I paid the ex-programmer in Rhode Island.

Speechpad seems worth a go with future transcriptions — I just wonder what they were thinking when they set up a half-finished brand and website.

How SEO can screw up your…

Some Search Engine Optimization (SEO) tactics can give you a marginal boost in rankings, but hurt you on a net basis.

The main culprit is usually an overweening preoccupation with on-page keyword optimization, but link-building can hurt too.

This article is not about black-hat SEO practices, which are covered in plenty of detail elsewhere. Instead, we’ll look at “good” SEO practices when they’re overdone.

Let’s see what gets hurt.

Muddied Product Positioning
Pretend you sell software to corporate accountants. Corporate accounting software is obviously your #1 target phrase, but you naturally care about capturing related searches. So your SEO advisor says, “Hey, let’s target some other key phrases too.” Next thing you know, they’ve added pages on your site titled Corporate Controller Software and Shareholder Accounting Software and Corporate Asset Management Software, in part because those are phrases that the keyword suggestion tools returned. And then the SEO advisor includes links to those pages within your left-hand navigation, because that helps the new pages rank higher. Your problem reveals itself when a prospect visits your site and sees all those links. The prospect then thinks, “Well, this company’s product does a whole lot of different things!” And that may not be the market positioning you want.

Underperforming Website Copy
There is an allegedly optimal level of “keyword density” that makes a given page rank well. For example, if you have “organic tea” occurring six times on the page, that’s better than two occurrences. Visitors read differently than search engine spiders, however, and they still respond to effective copy. Stuffing pages with key phrases, then, may reduce the conversion power of a page. So your tradeoff is higher traffic versus page effectiveness. Would you rather have a page that gets 100 natural search referrals and one conversion, or 30 referrals and two conversions? It’s possible to have both, but how are you going to get there? It’s much easier to test and change copy (and design) for improved conversion than to test keyword optimization. We think there is more upside with conversion-building efforts, and other SEO tactics. Landing pages for pay-per-click ads are a great place to test copy and design.

Cluttered Website Navigation
The left nav, footer, and site map page get heavy weight with search algorithms, since they are treated as pointers to a site’s most important pages. SEO advisors often will load the navigation with links to search-engine-friendly pages. This distracts people from your intended sales funnel. Excessive intrasite linking means people go in circles, and get frustrated. Repetition of keywords and phrases in links will lead to guesswork by the prospect: “Is the thing I want behind this link or that one?” Every site visitor has a finite number of clicks they’re willing to spend at your site – don’t make them click more than they have to.

Noise in Measurement
Your SEO scorecard should be visually simple, focusing on the biggest business drivers. Don’t let it get cluttered with ranks of long-tail search terms in second-tier engines. Doing so obscures the high-value Google terms, and Google – like it or not – is where the money is. Track infrequent terms on a second worksheet if you must, and present an aggregate performance metric for the lot of them on the scorecard. Also, don’t track search terms that bring unqualified traffic.

Confusing Site Maintenance
This is a minor factor that can be avoided, but we’ll mention it anyway. Regular content changes to web pages are a healthy thing for SEO. For sites not built with a content management system, one byproduct of changing content is an accumulation of defunct and unlinked files on your webserver: the “retired” versions of live pages. You might even have new and old style sheets. Over time this can clutter up your webspace, to the point where it costs your webmaster time. You should have a clean, organized system for saving old files (in case you need to revert to them, or audit them for troubleshooting). Moving them to a hidden directory is one way. Giving them a unique extension like .defunct is another, since sorting by type lets the webmaster visually ignore the retired files.
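Both ideas can be combined in a few lines of shell. This is just a sketch – the filenames here are hypothetical stand-ins for whatever retired pages and style sheets you have:

```shell
# Park retired versions of live pages in a hidden archive directory,
# tagging them with a .defunct extension so they stay available for
# audits but sort apart from the live files.
touch index-2009.html styles-old.css   # stand-ins for retired files

mkdir -p .archive                      # hidden directory for retired files
for f in index-2009.html styles-old.css; do
  mv "$f" ".archive/$f.defunct"        # rename with the .defunct extension
done

ls .archive                            # verify the files were parked
```

A nightly cron job running something like this keeps the webspace tidy without ever deleting anything you might need to revert to.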

Repetitive Blog Entry Titles
If you’ve got a business blog, resist the temptation to consistently pack your entry titles with keywords. Include keywords in the file basenames instead, and save the key phrases for your marquee blog content.

Meta Descriptions
In the head section of your web page, there is usually a field for a description of the page. It looks like this:
<meta name="description" content="The leading purveyor of premium tea">
Google often uses the copy after content= as the blurb under your listing in a search engine results page. This is a marketing opportunity, a chance to help position your product effectively before the click. Enthusiastic SEO advisors may instead see the META description field as a chance to further pack your page with more keywords. Yet doing so may mean a less-well-performing blurb, which means poorer conversion from the search results page.
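To make the tradeoff concrete, here are two hypothetical descriptions for the tea page above: the first packed with keywords, the second written to earn the click.

```html
<!-- Keyword-stuffed: may please a spider, but reads poorly as a blurb -->
<meta name="description" content="organic tea, premium tea, buy tea online, green tea, loose leaf tea, tea shop">

<!-- Marketing copy: positions the product before the click -->
<meta name="description" content="The leading purveyor of premium organic tea, shipped fresh to your door.">
```

A searcher scanning the results page sees one of these under your listing; only the second one gives them a reason to choose you.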

Blackened Wikipedia Reputation
Even though a link from Wikipedia to your site hasn’t counted in the search algorithms since Wikipedia applied nofollow to its external links in 2007, it’s still a potential source of inbound traffic. Don’t go adding links to your site within Wikipedia content, however, unless your content has the right patina of independence. Wikipedia editors are quite mindful of commercially motivated edits, and squash them with a Puritan zeal.